Chapter 20
The Research Critique
Research Critique
• A critical appraisal of a piece of completed
research
• Identifies both the strengths & limitations
• Usually 3-4 pages long
4 Stages of the Critique Process
• Stage 1: Understanding the purpose & problem & determining whether the design & methodology are consistent with the study purpose
• Stage 2: Is the methodology applied properly?
• Stage 3: Are the outcomes & conclusions believable & supported by the findings?
• Stage 4: Overall quality, strengths & limitations, contributions to knowledge, & suggestions for improvement in the study
Critiquing Quantitative Studies
• See p. 613 (Box 20.1) for the Stages &
Components for critiquing a quantitative
research report
Criteria for Quantitative Critique
1. Clear statement of purpose
2. Clear link between purpose and problem
statement/question/hypothesis
3. Literature review
4. Theoretical framework
5. Congruence of purpose, design, and
method
Criteria Cont.
6. Appropriate sampling procedures (size &
type)
7. Statistical procedures appropriate
8. Adequate reliability & validity checks to
accept findings and generalize to
appropriate populations
9. Significance of study for nursing is
discussed
Key Questions: Purpose/Problem
Statement
• Are the study problem and purpose statement clearly
articulated?
• Are reasons for conducting the study stated?
• Is the study’s potential contribution to nursing
knowledge stated?
• Are the research objectives or research questions or
hypotheses stated clearly and researchable (answerable
through the collection of empirical data)?
• Are terms defined conceptually and operationally?
Literature Review
• Does the literature review or theoretical framework
provide evidence that the researcher has synthesized the
classic and current literature and placed the research
question in the proper context?
• Does the literature review identify gaps in knowledge,
suggest how the current study extends the knowledge
base in this area, and point out contradictions in the
current knowledge base?
• Does the researcher summarize the literature review,
provide a rationale for the current study, and show how this
study will extend previous research?
Design
• Is the study design specified, including its
advantages and limitations for the research
problem?
• Is there evidence that a pilot study was conducted
and that its findings were used to enhance the
design?
• Is the design (overall plan of research)
appropriate to the research purpose and capable
of answering the research question?
• How does the design control for extraneous
variables?
Sample
• Was probability or non-probability sampling used, and is the reason for the
choice specified?
• Is the population to whom results will be generalized described?
• Are precautions taken to avoid collecting a biased sample (see Chapter 9)
that would limit generalizability of findings?
• Are the demographic characteristics of the sample described?
• Is the sample representative of the population?
• Are inclusion and exclusion criteria identified?
• Is the sample size appropriate to meet the assumptions of the statistical tests (see the sketch after this list)?
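One quantitative way to examine the sample-size question above is a power analysis. Below is a minimal sketch in Python, assuming a two-group comparison and the statsmodels library; the effect size, alpha, and power values are illustrative conventions, not figures from the text.

```python
# Minimal sketch of an a priori sample-size check for a two-group comparison.
# Effect size, alpha, and power below are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

analysis = TTestIndPower()

# Sample size per group needed to detect a medium effect (d = 0.5)
# at alpha = 0.05 with 80% power.
n_per_group = analysis.solve_power(effect_size=0.5, alpha=0.05, power=0.80)
print(f"Required n per group: {n_per_group:.0f}")  # about 64 per group

# Conversely, the power achieved by a reported sample of 40 per group.
achieved_power = analysis.solve_power(effect_size=0.5, alpha=0.05, nobs1=40)
print(f"Power with n = 40 per group: {achieved_power:.2f}")
```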
Data Collection
• Are data collection methods appropriate to meet
the study purpose and answer the
questions/hypotheses?
• What evidence is provided that data collection
procedures are valid and reliable (see the reliability
sketch after this list)?
• Are adaptations to data collection tools
described?
• Are data collection instruments described in
sufficient detail to enable the reader to ascertain
method of scoring, range of values, and what a
particular score means?
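One common form of evidence for instrument reliability is an internal-consistency coefficient such as Cronbach's alpha. The sketch below computes it from hypothetical Likert-item scores with NumPy; the data and the commonly cited 0.70 benchmark are assumptions for illustration only.

```python
# Minimal sketch: Cronbach's alpha from item-level scores
# (rows = respondents, columns = instrument items). All data are hypothetical.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    k = items.shape[1]                          # number of items
    item_vars = items.var(axis=0, ddof=1)       # variance of each item
    total_var = items.sum(axis=1).var(ddof=1)   # variance of the summed scale score
    return (k / (k - 1)) * (1 - item_vars.sum() / total_var)

scores = np.array([          # 6 participants x 4 Likert items (1-5)
    [4, 5, 4, 5],
    [3, 3, 4, 3],
    [5, 5, 5, 4],
    [2, 2, 3, 2],
    [4, 4, 4, 5],
    [3, 4, 3, 3],
])
print(f"Cronbach's alpha = {cronbach_alpha(scores):.2f}")  # ~0.92; >= 0.70 often cited as acceptable
```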
Data Analysis
• Are data analysis procedures described?
• Are the statistical techniques appropriate for the
study methodology (that is, the type of data
collected and the analysis performed)?
• Do the statistical tests answer the research
questions, and is the level of significance specified
(see the sketch after this list)?
• If results are nonsignificant, is a power analysis
conducted to explore nonsignificant findings
(see Chapter 15)?
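As a minimal sketch of matching the test to the data and stating the level of significance, the example below runs an independent-samples t-test on hypothetical interval-level scores with SciPy; the group sizes, means, and alpha are illustrative assumptions.

```python
# Minimal sketch: independent-samples t-test with a pre-stated alpha.
# All scores are simulated for illustration.
import numpy as np
from scipy import stats

alpha = 0.05  # level of significance stated before testing

rng = np.random.default_rng(42)
treatment = rng.normal(loc=72, scale=10, size=30)  # hypothetical outcome scores
control = rng.normal(loc=68, scale=10, size=30)

t_stat, p_value = stats.ttest_ind(treatment, control)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")

if p_value >= alpha:
    # Nonsignificant result: a power analysis (see Chapter 15) helps judge
    # whether the study was simply too small to detect a meaningful difference.
    print("Nonsignificant at alpha = 0.05; consider a power analysis.")
```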
Human Rights
• How are rights of research participants
protected?
• Are ethical issues anticipated and handled
appropriately?
Procedures
• Are techniques used to ensure that there is consistency in
the data collection process?
• What procedures were used to keep research conditions
the same for all participants?
• Are there strategies to limit errors in data collection,
recording, and analysis?
• Did any unplanned circumstances influence the results?
• In experimental designs, is there evidence of
manipulation of the independent variables, randomization in
selection of the sample and in assignment to experimental and
control groups (see the sketch after this list), and control of
extraneous variables (Wilson, 1993)?
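For the randomization question above, the sketch below shows one simple, auditable way to randomly assign a recruited sample to experimental and control groups in Python; the participant IDs and seed are hypothetical.

```python
# Minimal sketch: random assignment to experimental and control groups.
import random

participants = [f"P{i:02d}" for i in range(1, 21)]  # hypothetical participant IDs

random.seed(7)               # seed recorded so the assignment can be audited
random.shuffle(participants)

half = len(participants) // 2
experimental = participants[:half]
control = participants[half:]

print("Experimental:", experimental)
print("Control:     ", control)
```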
Findings
• Are the findings presented clearly, correctly, and related
to the theoretical framework?
• Is there a clear statement of whether or not the data
support the hypotheses or answer each research question?
• Are tables and graphs clearly labeled, easy to
comprehend, and congruent with results presented in text
form (see Chapter 19 for table construction and the
plotting sketch after this list)?
• Are findings presented in an unbiased manner (see
Chapter 9)?
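As an illustration of what "clearly labeled" can mean for a graph, the sketch below plots hypothetical group means with a title, axis labels, units, and a legend using matplotlib; every value shown is invented for the example.

```python
# Minimal sketch: a clearly labeled line graph of hypothetical group means.
import matplotlib.pyplot as plt

weeks = [1, 2, 3, 4, 5, 6]
treatment_means = [62, 65, 69, 72, 74, 77]  # hypothetical mean scores
control_means = [61, 62, 63, 63, 64, 65]

plt.plot(weeks, treatment_means, marker="o", label="Experimental group (n = 30)")
plt.plot(weeks, control_means, marker="s", label="Control group (n = 30)")
plt.title("Mean self-care knowledge score by week")
plt.xlabel("Week of intervention")
plt.ylabel("Mean knowledge score (0-100)")
plt.legend()
plt.tight_layout()
plt.show()
```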
Discussion
• Are alternative explanations offered for the
findings?
• Does the researcher discuss both clinical and
statistical significance of findings?
• Does the researcher overgeneralize the findings
beyond the appropriate population?
• Are limitations of the study, such as sample size,
inadequate instruments, and sources of bias,
identified and their implications discussed?
Implications & Conclusions
• Does the researcher identify important
implications of the study for practice,
education, or research (if appropriate)?
• How do the findings of the study advance
nursing knowledge?
• Do new research questions emerge from
the study?
Overall Quality
• What are the major strengths of the study?
• What are the major limitations of the study?
• Was the study described in sufficient detail to
facilitate a replication study?
• What are the major contributions of this study to
knowledge development in nursing?
• What suggestions might enhance the study and
correct the limitations?
Criteria for Appraising
Qualitative Research
• Key Question: “Did the investigator get it
right?”
• Did the investigator publish an accurate and
vivid description of the research phenomenon
that is recognized easily by those who
experienced it?
Read the research report in its
entirety to get a sense of the study
and its contribution to knowledge
development, then read again paying
attention to the questions
appropriate to each stage of the
critiquing process.
Stage 1 Questions for Critiquing
Qualitative Designs
• Are the research purpose & statement of the
phenomenon of interest clearly stated?
• Is the rationale for the qualitative approach stated?
• Is the philosophy of the tradition stated?
• Is a statement of self-understanding included?
• Is there a single broad research question? Are
there subquestions?
• Does the initial question become more focused
as data are analyzed?
Stage 1 Questions for Critiquing
Qualitative Designs
• Does the method used require a literature review
before data collection?
• Does the researcher have expertise in this area,
know what gaps exist, & show how this study
will address those gaps?
• Is there evidence that a review of the literature (ROL)
is done after data collection, if that is appropriate to the design?
• Is a framework appropriate? If so, is it presented
clearly?
Stage 1 Questions for Critiquing
Qualitative Designs
• Is the design appropriate to the research purpose?
• Is there congruency between the methodology & the
research question?
• Is the context for the study adequately described?
• Is the researcher-participant relationship
understood?
Stage 1 - Sample
• Is purposive sampling used?
• Are informants able to inform the study?
• Does sample selection allow for saturation
of data?
Stage 1 - Data Collection
• Are data collection strategies described in
sufficient detail?
• Is data collection congruent with study purpose,
question, tradition?
• Are prolonged engagement & observation in the
field used to build trust & ensure validity of data
collection?
Stage 1 - Data Analysis
• Are procedures clearly described & appropriate to
the research tradition?
• Are data collection & analysis concurrent &
ongoing?
• Is there evidence of decision rules for analyzing
data & does the researcher remain true to the rules?
• Are codes narrowed as categories are
systematically discarded when not supported by
the data?
• Is there evidence of “theoretical saturation”?
Stage 2: Critical Appraisal of the
Conduct of the Research
• Are the rights of participants protected?
• Are ethical issues anticipated and handled
appropriately?
Stage 2 - Procedures
• Does the research meet the criteria for rigor?
• Creswell (1998) identifies 8 procedures for testing
truth value
• A study should use at least 2 of the 8
Testing Truth Value (Creswell, 1998)
• Prolonged engagement in the field
• Triangulation
• Peer review of research process
• Negative case analysis
• Articulation of researcher bias
• Member check re findings & conclusions
• External audit using consultant
• Rich, in-depth description of participants & setting
so transferability of findings to other settings can
be judged
Stage 3: Critical Appraisal of Outcomes
• Are findings contextualized?
• Do readers vicariously experience the
phenomenon? (descriptive vividness)
• Are the findings true to the data?
• Do themes present a meaningful picture of the
phenomenon?
• Are the findings compatible with the field of
nursing?
• Are reasons for incompatible findings explored?
Stage 3: Discussion/Implications
• Are implications of the study for nursing
identified?
• Is a context provided in which to use the
findings?
• Do implications follow logically from the
findings?
• Do new research questions emerge from the
findings?
Stage 4: Appraisal of Overall Quality of
Study
• What are the major strengths?
• What are the major limitations?
• What are the major contributions of the study?
• What suggestions might enhance the study &
correct limitations?
