WRITING RESEARCH 
PROPOSALS & PUBLICATIONS 
CILIP IL Group and LIRG
Getting started 
Dr Alison Brettle
Aims 
• To help LIS practitioners get started in research or 
evaluation projects (and then write them up) 
– What is research? What is evaluation? 
– What are the challenges and practical considerations? 
– When to use quantitative and qualitative approaches 
– How to plan a project 
• No ..isms or ..ologies or paradigms!
4 D’s 
• Defining – which questions are 
suitable? 
• Designing – what methods can 
I use? 
• Doing – how do I do it? 
• Describing – how do I tell 
people about it?
Context: why research? 
10 good reasons to engage in research?
Aims of research ... 
• Discover causes 
• Understand activities 
• Understand behaviours 
• Explore perceptions 
• Predict future trends 
• ‘Blue skies’ – exploratory research 
• Testing hypotheses 
• Evaluating impact
Research or evaluation? 
Research:
• “The systematic investigation into and study of materials and sources in order to establish facts and reach new conclusions” (Oxford Dictionaries, http://oxforddictionaries.com/definition/english/research)
• “An enquiry” (Robson, 2002)
Evaluation:
• “A study with a distinctive purpose; it is not a new or a different research strategy”
• “Often to assess the effects and effectiveness of something” (Robson, 2002)
Characteristics of research 
Successful:
• Activity and involvement
• Convergence
• Intuition
• Theory
• Real world value
Unsuccessful:
• Expedience
• Method or technique
• Motivation by publication, money or funding
• Lack of theory
(Robson, 2002 – adapted from Campbell, 1982)
Generic categories of research purpose 
• Exploratory 
• Descriptive 
• Evaluative 
• Predictive 
• Explanatory 
• Developmental
Some purposes of evaluation 
• To find out if user needs are met – Are we reaching the right group? Are we providing what users need?
• To improve a service – What should the focus be? How can we make the service better?
• To assess the outcomes of the service – Does the service meet its planned goals? What happens to users as a result?
• To find out how a service is operating – What happens in the service? Is the service operating as planned?
• To assess the efficiency of a service – How do the costs compare with the benefits? Is it more or less efficient than other services?
• To understand why a service works (or doesn’t) – Is it worth continuing?
(Adapted from Robson, 2000)
What is involved? 
• Deciding the focus 
• Developing the questions 
• Choosing a strategy 
• Selecting the method(s) 
• Arranging the practicalities 
• Collecting the data 
• Analysing the findings 
• Reporting what you have found
What skills do I need?
Everyday life skills for research 
• Reading
• Listening
• Watching
• Choosing
• Questioning
• Summarising
• Organising
• Writing
• Presenting
• Reflecting
A note on terminology 
• Research methodology – the approach or 
perspective taken to do the research 
• Research design – how you do the research 
• Research methods – the tools you use to do 
the research
Getting started……Defining
Why do you need 
a research question? 
• Offers direction throughout the 
study 
• Guides the search strategy and 
choice of data collection method 
• Suggests the format of likely 
answers
Start with a topic 
• What are you interested in? 
• What problems are there at work? 
• What issues are topical? 
• What do we know little about? 
• What do people disagree about? 
• What do influential people want to 
know about? 
• What did your mum tell you to do?
Can the topic be researched? 
• What is pain? 
• Why are people anxious? 
• Is Coke better than Pepsi? 
• Why do girls get pregnant? 
• Should we return to the moon? 
• Can teenagers live on only noodles and beer? 
• Why do patients not take their drugs properly?
Back to your topic, then… 
Is it too broad (or too narrow)? 
Is it researchable? 
Is it worthwhile? 
Does it still grab you?
Focus of questions – and studies 
• What? (What is happening?) Exploratory 
• What? (What has happened?) Evaluative 
• What? (What will happen?) Predictive 
• What? (What caused it?) Experimental 
• Why? (Why is this happening?) Explanatory 
• How? (How could things be different?) Action research (AR) 
• How? (How many?) Survey
Defining! 
• Explanatory – Do doctors find 
things quicker after being 
taught search skills? 
• Experimental – What works 
best – face to face or online 
teaching? 
• Exploratory – What are 
students’ experiences of 
information literacy training? 
• Descriptive – What are users’ 
perceptions of information 
literacy training?
Components of a research question 
• Concise and direct 
• Understandable 
• A researchable problem 
• Focused on central issues 
• Multiple questions? 
• Sub-questions?
Activity 1 
• In small groups 
• What are the key issues about 
information literacy? 
• What questions could be asked? 
• …are they what, why or how 
questions? 
– …what sorts of answers might they 
prompt? 
• It may help to think of a problem
Using the literature 
• Why do a literature review?
Why… 
• Demonstrates your subject knowledge 
• Improves your writing skills 
• Contextualises your research 
• Helps direct your research 
• Helps formulate your research questions 
• Can’t reinvent wheels – justify your original contribution 
• Provides material for comparison in later discussion sections
The literature can help justify/discuss 
• Whether your findings confirm those of other studies 
• Whether your findings extend other studies 
• Whether your findings break new ground 
• Whether your work raises issues about the methodological 
choices used by other studies 
• Whether your work challenges existing theoretical 
approaches to your subject
Your literature review should be 
• A coherent synthesis of existing research which 
– Demonstrates the context of your work 
– Involves thematic lines of argument round the research question 
– Demonstrates trends in how the topic has been treated by other 
researchers 
– Makes links to the themes of your study 
– Shows a clear gap where your study fits in
What does this involve? 
• Assessing the value of the literature at a number of levels 
– Individual papers – eg significant material 
– Collections or groupings of papers 
• Emphasising the limitations of existing knowledge 
– Identifying the gaps – promote the value of your research 
– Justify the contribution of your study
What doesn’t it involve? 
• Lists of references 
• Long descriptions/summaries of other studies 
• Inaccurate citations 
• Illogical flow 
• Overuse of quotations and mere paraphrasing of 
others’ work 
• etc
Another way of thinking about it 
• Literature review as a map of the 
field and its debates 
• Purposes of a literature review: 
-> identify and 
summarize key 
paradigms and arguments 
in your field 
-> position yourself in 
relation to these 
arguments to set up your 
own argument
Can learn from (or do a) Systematic 
Review 
A review of all the literature on a particular topic, which has been systematically identified, appraised and summarised to give a summary answer.
What is a systematic review? 
• An overview of primary research 
studies conducted according to 
explicit and reproducible 
methodology 
• A rigorous method of summarising 
research evidence 
• Shows what we know and don’t 
know about a topic area 
• Provides evidence of effectiveness 
(or not) by summarising and 
appraising relevant evidence
Systematic review process 
• Define/focus the question 
• Develop a protocol 
• Search the literature (possibly in 2 stages: scoping and actual 
searches) 
• Refine the inclusion/exclusion criteria 
• Assess the studies (data extraction tools, objective manner) 
• Combine the results of the studies to produce conclusion 
• Place findings in context – quality and heterogeneity of studies, 
applicability of findings
Your protocol 
• Plan what you are going to do in 
the review 
• Set out the background and 
objectives 
• Outline the resources you will use 
• Establish inclusion/exclusion 
criteria 
• How will data be extracted (and what 
will be extracted)? 
• How will you synthesise the literature? 
• Keeps you on track and focussed
How will you refine the inclusion/exclusion 
criteria? 
• The tighter the criteria, the fewer 
papers to review 
• BUT 
– will your review draw any 
meaningful conclusions? 
– Will it cover all relevant 
perspectives? 
• Focussing the question v quality 
of studies
Know what you want to find out 
• Think about your structure 
before you start writing 
• Use a tool so that you record the 
same information about each 
study 
• Make sure it captures the 
elements that you wish to write 
about in your final report 
• Don’t just cherry pick the bits you 
like 
• Quality of studies – what 
evidence are you going to 
include? How are you going to 
assess the quality?
Critical appraisal and SRs in LIS 
• LIS specific or adapt one from healthcare 
– http://nettingtheevidence.pbworks.com/w/page/11403006/Critical%20Appraisal%20Checklists 
– HCPRDU tools 
• http://usir.salford.ac.uk/13070/ 
• LIS specific systematic reviews 
– http://lis-systematic-reviews.wikispaces.com/Welcome
Quantitative v Qualitative?
What is quantitative research? 
• Objective, neutral approach 
• Scientific? Experimental or non-experimental? 
– Experimental: introduces a change and 
collects data about the effects 
– Non-experimental: specifies a design and 
collects data about the effects 
• Fixed (Robson) 
– Set out what you are going to do and 
how you are going to do it 
– Follows well established procedures 
• Samples, variables, 
measurement, control, 
confounders
What is quantitative research? 
• …allows you to count things 
• …may try to prove things 
• Answers “what” or “how” questions 
– Questionnaires (to collect numerical 
data), 
– Usage figures 
– Web logs
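As a purely illustrative sketch of the counting this slide describes (the figures are invented, and Python stands in for whatever tool you actually use), a year of database usage figures can be summarised with simple descriptive statistics:

```python
# Hypothetical example: summarising monthly database login counts
# with simple descriptive statistics (all numbers are invented).
import statistics

monthly_logins = [120, 135, 98, 160, 142, 110, 155, 130, 125, 148, 138, 152]

mean_logins = statistics.mean(monthly_logins)      # average monthly logins
median_logins = statistics.median(monthly_logins)  # middle value
spread = statistics.stdev(monthly_logins)          # variation around the mean

print(f"Mean: {mean_logins:.1f}")
print(f"Median: {median_logins:.1f}")
print(f"Std dev: {spread:.1f}")
```

Even this much answers a “how many?” question and gives a first sense of variability before any more formal analysis.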
What is qualitative research? 
• Understanding behaviour 
or perceptions or views 
• Less structured 
• Obtaining meaning 
• “Why” questions, or how 
things are perceived 
• In reality – mixed methods 
are often used
• QUANTITATIVE 
• Counting/measuring 
• Large samples (often) 
• Questionnaires 
• Analysis of statistics, 
weblogs 
• QUALITATIVE 
• Understanding of 
behaviour 
• Perceptions 
• Exploratory research 
• Interviews, focus 
groups 
• Observation 
What? Why? How?
Some Worked Examples 
• What database is best for searching on the topic 
of severe mental illness? (Counting – 
Quantitative, simple descriptive statistics) 
• Teaching online is as good as teaching face to 
face (Proving – Quantitative, quasi experimental 
or experimental) 
• Which is best? Mediated searches or teaching 
users to find information? (Mixed – simple 
descriptive statistics, inferential statistics and 
thematic analysis of users’ views, confirmed with 
focus groups)
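To make the inferential statistics in the worked examples concrete, here is a hedged sketch of comparing two teaching groups’ test scores with a Welch t statistic. All scores are invented; a real study would use a statistics package (e.g. SPSS, as mentioned later) and check the test’s assumptions before drawing conclusions.

```python
# Hypothetical sketch: comparing test scores from two teaching groups
# (online vs face-to-face) using Welch's t statistic. Scores invented.
import math
import statistics

online = [62, 70, 68, 75, 66, 71, 69, 73]
face_to_face = [60, 65, 58, 64, 61, 67, 59, 63]

mean_o, mean_f = statistics.mean(online), statistics.mean(face_to_face)
var_o, var_f = statistics.variance(online), statistics.variance(face_to_face)
n_o, n_f = len(online), len(face_to_face)

# Welch's t: difference in means divided by its standard error,
# which does not assume equal variances in the two groups.
t = (mean_o - mean_f) / math.sqrt(var_o / n_o + var_f / n_f)

print(f"Online mean {mean_o:.1f}, face-to-face mean {mean_f:.1f}, t = {t:.2f}")
```

The t statistic would then be compared against the appropriate distribution to judge whether the difference is likely to be more than chance.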
What are doctors’ experiences of searching? 
• Do doctors receive training in searching? 
• Do doctors use the methods they were taught in training? 
• Do doctors find relevant material? 
• How often do doctors need to search using online 
databases? 
• How frequently do doctors use online databases? 
• What do doctors think about the training they receive? 
• Do doctors perceive that the training they receive equips them to 
be evidence based practitioners? 
• Does the training enable doctors to be better evidence based 
practitioners?
Defining Tips! 
• Make sure your question isn’t 
too big/broad/narrow 
• Make sure your objectives are 
smaller than your aims 
• Think – what exactly is it that 
you wish to find out. Does 
your question really reflect 
this? 
• Don’t build in assumptions 
– What are the benefits of 
teaching doctors to search? 
• The clearer the question – the 
easier it is to find the answer!
Activity 2: Defining 
• Think of a question that 
you could answer by a 
research or evaluation 
study 
• It may help to think of a 
problem! 
• You may want to break it 
down into objectives that 
will allow you to collect 
data to answer the 
question 
• Does it lend itself to 
quantitative, qualitative 
or mixed methods
Designing 
• What methods are you going to use to collect the data to 
answer your question? 
– Involves asking questions of people, systems or texts 
– Involves testing? A hypothesis? 
– Could use questionnaires or data you already collect or other 
measure or test or “experiment” 
• Who (what) is your population? 
• Sampling – random, representative, purposive, 
theoretical, snowball? 
• Validity/trustworthiness – does it do what it says on the tin? 
• Reliability – does it do it consistently/accurately? 
• Ethics – is your approach ethical? 
• Bias and confounders - can you avoid them, 
or account for them?
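The sampling options listed above can be sketched in code; this is an illustration under invented assumptions (a made-up population of 200 users and a made-up inclusion criterion), not a prescribed method:

```python
# Illustrative sketch of random vs purposive sampling from a
# hypothetical population of library users (names are invented).
import random

population = [f"user_{i:03d}" for i in range(1, 201)]  # 200 users

# Simple random sample: every user has an equal chance of selection.
random.seed(42)  # fixed seed only so this example is reproducible
random_sample = random.sample(population, k=20)

# Purposive sample: deliberately select users who meet a criterion,
# e.g. (hypothetically) a first-year cohort numbered 001-050.
purposive_sample = [u for u in population if int(u.split("_")[1]) <= 50][:20]

print(len(random_sample), len(purposive_sample))
```

The random sample supports statistical generalisation; the purposive sample trades that for targeted insight into a group of interest.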
Research ethics 
• Need to be considered at all stages: 
– Formulating questions 
– Gathering data (sampling, informed consent) 
– Analysing data (anonymity, confidentiality) 
– Writing up (reliability, accuracy) 
– Dissemination 
• Working with particular groups 
– ‘hard to reach’ 
– Children 
– NHS ethics
An experimental study of information literacy 
training to pre-registration nurses (testing online 
training v traditional teaching) 
• As part of the degree course nurses need to 
learn IL skills – to help them through 
assessments (obtain their degree) and to 
ensure competency in professional practice 
(evidence based) 
• What are the ethical issues involved?
An experimental study of information literacy training 
to pre-registration nurses (testing online training v 
traditional teaching) 
• Training session(s) linked to assessed work: two training 
sessions, four tests 
– Ensure that both methods cover the learning outcomes in the same 
way 
– Ensure both groups receive the same information at the same time 
– Ensure that the cohort wasn’t being tested for anything else 
– Ensure that the test didn’t affect their assessment or its results 
– Students were able to opt out of intervention group and/or not have 
their “test” results included in the study 
– University ethics procedure/school procedure 
– Student names/numbers not used (analysed by group not individual) 
– Crossover design
Design – Tips! 
• Look for examples of similar 
studies – can you use/adapt 
the approach 
• Has someone else developed 
a tool you can use? 
• Make it feasible and 
manageable 
• Be pragmatic - be as rigorous 
as you can whilst being aware 
of the limitations 
• Make sure it is appropriate for 
the question 
• Write a research proposal
Activity 4: Designing 
• Go back to your question 
• How would you design 
your study to answer it? 
– What are your aims and 
objectives? 
– What data do you need to 
collect? 
– How are you going to 
collect it? 
– Who are you going to 
collect it from? 
– What 
tools/methods/approaches 
are you going to use?
Doing 
• How am I going to collect the 
data/information? 
– e.g. online or paper questionnaires, 
interviews, observations, recording, 
transcribing 
• How am I going to analyse the data 
I collect? 
– Excel, SPSS, descriptive statistics, 
inferential statistics, content analysis 
• Do I need any help? (You may 
want to seek this at the design 
stage) 
• Do I have the right skills? 
• Do I have enough resources?
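As a hedged illustration of the content analysis mentioned above (the questionnaire responses are invented), even a basic analysis can start from word-frequency counts of free-text answers:

```python
# Hypothetical sketch: a very basic content analysis of free-text
# questionnaire responses using word-frequency counts.
from collections import Counter
import re

responses = [
    "The training was useful but too short",
    "Useful session, would like more hands-on practice",
    "Too short and too fast, but the handouts were useful",
]

words = []
for response in responses:
    # Lower-case and split each response into alphabetic tokens.
    words.extend(re.findall(r"[a-z]+", response.lower()))

# Ignore very common function words (a minimal, hand-picked stop list).
stop_words = {"the", "was", "but", "and", "would", "like", "were", "too", "to", "a"}
counts = Counter(w for w in words if w not in stop_words)

print(counts.most_common(3))
```

Frequency counts are only a starting point; a full thematic analysis would read the responses in context rather than relying on counts alone.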
Describing 
• What should I write about and how should I do 
it? 
– Need to explain what you have done and how 
you did it. 
– Need to present your study in a way that is 
meaningful for the particular audience. 
– Need to think about style 
• Where should I write about it? 
– Project report 
– Newsletters 
– Journals – Evidence Based Library and 
Information Practice, Library and Information 
Research, subject specific 
– Posters – conferences or study days 
– Presentations – internal, conferences 
• If you are not going to do anything with it – 
why do the research?
Describing - Tips 
• Think – what are the key messages for 
this audience 
• What is the best way of presenting the 
data? 
• Can the audience understand what I’ve 
done? 
• Can the audience work out if it is valid? 
Reliable? 
• Have I explained and addressed the 
limitations? 
• Don’t “hide” results 
• Have I “answered” my 
objectives/research question?