Reporting the Review
  • Reporting the Review
  • Learning Objectives This module has been prepared to help authors of comparative effectiveness reviews ensure that their reports are of the highest possible quality. When authors do not provide sufficient details concerning the conduct of their study, readers are left with an incomplete picture of what was done. As such, they are not able to judge the reliability of the results or interpret them. There are two learning objectives for this module. The first one is to describe the various elements that need to be reported upon completion of the review. The second one is to distinguish examples of reporting that are adequate from those that are inadequate. The examples in the subsequent slides are of adequate reporting.
  • Systematic Review Process Overview This slide illustrates the steps in the systematic review process. This module focuses on reporting the review, which is more than simply presenting the findings. Reporting the results of a systematic review includes, for example, information on the number of databases searched, the number of records screened, and the complete methods and results.
  • Writing the Report Authors should use a standard template (guidance) for the overall report. The purpose of using a standard template is to make the report easier for readers to read and understand. The subsections in the report may vary. Using the PICOTS framework (population, intervention, comparator, outcome, timing, and study design or setting) can help guide the construction of the report. Subsequent slides will provide more details about the framework’s use. Note: In this module, “PICOTS” is used for the purposes of consistency. The basic framework, PICO, is used in all five steps of conducting a systematic review: preparing the topic, searching for and selecting studies, abstracting data, presenting findings, and reporting the review. In order to understand the review topic or question, the Evidence-based Practice Center investigators must identify the relevant Population, the Intervention, the Comparison, and the Outcome of interest. PICOTS, or PICO(TS), includes two other elements — timing and setting — that are sometimes included as part of the other study criteria or are called out specifically to bring attention to these items. For example, the timing of the outcome assessment may be included with the outcome (PICOT); but if timing is not specified clearly, investigators may forget to define it. The “S” in PICOS or PICOTS usually represents setting but occasionally (as in the modules on quantitative synthesis and the reporting of the review) it is used to refer to the study design. The setting encompasses issues of both the population (hospital populations have different characteristics than those in outpatient clinics) and the intervention (particularly for behavioral or surgical interventions, in which service delivery may vary greatly in different settings). In preparing their report, authors should use the framework they employed during the first four steps of their systematic review process.
  • Systematic Review Report Structure (I) This slide shows the first three of the five major sections of the standard template used in a systematic review report. The structure of the review report, particularly for comparative effectiveness reviews, will include an abstract and an executive summary. Some readers will read only the abstract and executive summary, so particular focus and attention should be given to this section.
  • Systematic Review Report Structure (II) This slide shows the last two of the five major sections of the standard template used in a systematic review report.
  • Using Formal Guidelines To Improve the Reporting of Systematic Reviews International guidelines have been developed to help improve the quality of research reporting. The PRISMA Statement is an example of reporting guidelines developed specifically to help authors report systematic reviews. These guidelines consist of a checklist and a flow diagram that can be downloaded from the PRISMA Web site. References: PRISMA Statement Web site. Home page. Available at: http://www.prisma-statement.org/index.htm. Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. Ann Intern Med 2009;151:W65-94. http://www.ncbi.nlm.nih.gov/pubmed/19622512 Moher D, Liberati A, Tetzlaff J, Altman DG, for the PRISMA Group. Preferred reporting items for systematic reviews and meta-analyses: the PRISMA statement. BMJ 2009;339:b2535. http://www.ncbi.nlm.nih.gov/pubmed/19622552
  • Title of the Review Report In the title of the report, it is important to identify the type of review — a systematic review, a meta-analysis, or a comparative effectiveness review — that was conducted. It is also important to remember that the title needs to be understandable to readers of the report. Consider these examples. The first example, “Comparative Effectiveness of Lipid-Modifying Agents,” does not provide much detail. It is simple and gives a higher-level overview of what the review is about. Very often, this type of title is sufficient for the audience. The second example, “Mortality in Randomized Trials of Antioxidant Supplements for Primary and Secondary Prevention: Systematic Review and Meta-analysis,” is more detailed and provides more information for readers. Both examples are reasonable. When developing the title, as in many other aspects of the report, the PICOTS framework can be used as a guide.
  • Executive Summary (I) Although the review team may work hard to develop a 100-page report, not everyone will have the time to read it. Because the executive summary may be used for decisionmaking by individuals who do not have time to read the full report, it is important that it be well-organized, as illustrated on this slide. It should include the background to the review, the review objectives, the key research questions, and the methods, which also might include the data sources, the eligibility criteria, the study appraisal, and the synthesis methods. The results, limitations, and conclusions, including the implications of key findings, should also be included.
  • Executive Summary (II) The executive summary is meant to be a distillation of the entire review report and should not include study-by-study results. It should provide an overall summary of the review and describe the evidence that supports the summary statements. It is important to describe the strength of the evidence, as was categorized in the evidence review.
  • Introduction: Purpose and Scope In the introductory section of the document, it is important to share with the reader the purpose and scope of the review, including the clinical decisional dilemma, the current literature, and the current state of practice. Providing this information gives the reader context, which is extremely important. The next step is to provide information about the objectives and the key questions that the evidence review will address. It might be helpful at this point to use the PICOTS framework to describe the objectives and state the key research questions.
  • Methods Overview Ideally, research methods are detailed in a preexisting protocol (and any subsequent amendments), which is typically prepared by Evidence-based Practice Center investigators in advance of a systematic review. In this report, the Methods overview should provide information on literature search strategy and data sources, eligibility criteria, and data collection processes and data items. The Methods overview should also provide information on quality assessment and synthesis of results, grading of evidence and additional analysis methods. The information should be provided in as clear and transparent a manner as possible. Using the PICOTS framework might be helpful in considering what to include. A point to be noted on writing style: this section of the review should be written in the past tense.
  • Methods: Literature Search All systematic reviews conducted by Evidence-based Practice Centers must present the complete electronic search strategy in the Appendix. The purpose of including the entire search strategy is to ensure transparency and to permit replication of the review. Reference: Alejandria MM, Lansang MA, Dans LF, et al. Intravenous immunoglobulin for treating sepsis and septic shock. Cochrane Database Syst Rev 2002;(1):CD001090. http://www.ncbi.nlm.nih.gov/pubmed/11869591
  • Methods: Information Sources All information sources used in the literature search — including the databases used, dates covered in each database, and any contacts with authors — should be reported. This slide shows an example of how this information is reported. Reference: Bolen S, Wilson L, Vassy J, et al. Comparative Effectiveness and Safety of Oral Diabetes Medications for Adults With Type 2 Diabetes. Comparative Effectiveness Review No. 8 (Prepared by The Johns Hopkins University Evidence-based Practice Center under Contract No. 290-02-0018). Rockville, MD: Agency for Healthcare Research and Quality, July 2007. AHRQ Publication No. 07-EHC010-EF. Available at: http://www.effectivehealthcare.ahrq.gov/ehc/products/6/39/OralFullReport.pdf.
  • Methods: Eligibility Criteria (I) Authors of systematic reviews should report the processes by which they selected studies for analysis. Specifically, they should explain how the studies were screened for inclusion or exclusion and which eligibility criteria were applied. The brief example on this slide shows the level of detail that authors should provide in a report. This level of detail is needed to ensure transparency and to permit replication of the review. Reference: Shah MR, Hasselblad V, Stevenson LW, et al. Impact of the pulmonary artery catheter in critically ill patients: meta-analysis of randomized clinical trials. JAMA 2005;294:1664-70. http://www.ncbi.nlm.nih.gov/pubmed/16204666
  • Methods: Eligibility Criteria (II) This slide offers examples of how studies and participants were described in the methods section of a published report. The PICOTS (population, intervention, comparator, outcome, timing, and study design or setting) framework can serve as a guide for reporting the eligibility criteria of the review. Specifically, the type of study, the types of participants, the type of intervention, and the type of outcome should be reported. This ensures that readers, such as clinicians, can determine whether the review results can be generalized to their patient population. It is also important to provide this information clearly so that anyone who would like to replicate the review is able to do so. Reference: Schroth RJ, Hitchon CA, Uhanova J, et al. Hepatitis B vaccination for patients with chronic renal failure. Cochrane Database Syst Rev 2004;(3):CD003775. http://www.ncbi.nlm.nih.gov/pubmed/15266500
  • Methods: Eligibility Criteria (III) This slide offers examples of how interventions and outcomes were described in the methods section of a published report. The PICOTS (population, intervention, comparator, outcome, timing, and study design or setting) framework can serve as a guide for reporting the eligibility criteria of the review. This ensures that readers, such as clinicians, can determine whether the review results can be generalized to their patient population. It is also important to provide this information clearly so that anyone who would like to replicate the review is able to do so. Reference: Schroth RJ, Hitchon CA, Uhanova J, et al. Hepatitis B vaccination for patients with chronic renal failure. Cochrane Database Syst Rev 2004;(3):CD003775. http://www.ncbi.nlm.nih.gov/pubmed/15266500
  • Methods: Data Extraction Authors should describe the method used for extracting data from the studies selected for the review. The level of detail that should be included in the description of the data collection process is illustrated in the example on this slide. Authors should also describe any processes they followed to obtain and confirm any information they received from the authors of the primary studies. Once again, the goal is to ensure transparency and to permit replication of the review. Reference: Mistiaen P, Poot E. Telephone follow-up, initiated by a hospital-based health professional, for postdischarge problems in patients discharged from hospital to home. Cochrane Database Syst Rev 2006;(4):CD004510. http://www.ncbi.nlm.nih.gov/pubmed/17054207
  • Methods: Data Items Authors should list and define all the variables for which data were sought. The PICOTS (population, intervention, comparator, outcome, timing, and study design or setting) framework can be used as a guide. In addition, authors should describe any assumptions and simplifications that were made in defining the variables. Following this approach will make the report clear and ensure transparency. Reference: Allen C, et al. Non-steroidal anti-inflammatory drugs for pain in women with endometriosis. Cochrane Database Syst Rev 2005;(4):CD004753. http://www.ncbi.nlm.nih.gov/pubmed/16235379
  • Methods: Quality Assessment Authors should describe the methods and criteria they used to assess the quality (risk of bias) of the studies selected for the review. As noted in the previous slides on methods, authors should provide a certain level of detail about the approaches they used for quality assessment. This information helps to ensure transparency and to permit replication of the review. References: Tracz MJ, Sideras K, Boloña ER, et al. Testosterone use in men and its effects on bone health. A systematic review and meta-analysis of randomized placebo-controlled trials. J Clin Endocrinol Metab 2006;91:2011-6. http://www.ncbi.nlm.nih.gov/pubmed/16720668 Bucher HC, Hengstler P, Schindler C, et al. Percutaneous transluminal coronary angioplasty versus medical treatment for non-acute coronary heart disease: meta-analysis of randomised controlled trials. BMJ 2000;321:73-7. http://www.ncbi.nlm.nih.gov/pubmed/10884254
  • Methods: Synthesis of Results Authors should describe the methods they used to handle the data and to combine the results of studies. If measures of consistency, such as an I-squared test, were used, the results should be reported for each meta-analysis. This level of detail will also be helpful to ensure transparency and to permit replication of the review. This slide shows two examples of descriptions of methods used for synthesizing data. References: Jones M, Schenkel B, Just J, et al. Epoetin alfa improves quality of life in patients with cancer: results of metaanalysis. Cancer 2004;101:1720-32. http://www.ncbi.nlm.nih.gov/pubmed/15386341 Briel M, Studer M, Glass T, et al. Effects of statins on stroke prevention in patients with and without coronary heart disease: a meta-analysis of randomized controlled trials. Am J Med 2004;117:596-606. http://www.ncbi.nlm.nih.gov/pubmed/15465509
  • Methods: Grading Strength of Evidence Authors should report any assessments they made of risk of bias that may affect the cumulative evidence. One example of such bias is study-level publication bias. The example in the slide gives the level of detail that authors should report. Reference: Hróbjartsson A, Gøtzsche PC. Placebo interventions for all clinical conditions. Cochrane Database Syst Rev 2004;(1):CD003974. http://www.ncbi.nlm.nih.gov/pubmed/20091554
  • Methods: Additional Analyses Additional analyses, such as subgroup analyses or meta-regression, should be reported in the same detail as the other methods used in the review. This slide shows an example of additional analyses conducted as part of a review. Reference: Briel M, Schwartz GG, Thompson PL, et al. Effects of early treatment with statins on short-term clinical outcomes in acute coronary syndromes: a meta-analysis of randomized controlled trials. JAMA 2006;295:2046-56. http://www.ncbi.nlm.nih.gov/pubmed/16670413
  • Results Overview The next several slides describe how results should be reported, including information on the study selection, the study characteristics, and quality assessment.
  • Results: Study Selection In reporting the study selection, authors should provide information on the numbers of studies screened, assessed for eligibility, and actually included in the review. Authors should also provide the rationale for the exclusion of studies at each of these stages. This can be done in two ways. In the example in this slide, authors provide the rationale in a narrative format. The authors report the starting total of 584 records (usually titles and/or abstracts) initially identified in searches of electronic databases and then describe how studies were excluded until only 5 studies remained that met the inclusion criteria for the systematic review. The other way to provide this information is to use a flow diagram, an example of which is presented on the next slide. One common mistake in reporting study selection is to put the information in the Methods section, for example, under eligibility criteria. Study selection should be the first part of the Results section. Reference: Uitterhoeve RJ, Vernooy M, Litjens M, et al. Psychosocial interventions for patients with advanced cancer — a systematic review of the literature. Br J Cancer 2004;91:1050-62. http://www.ncbi.nlm.nih.gov/pubmed/15316564
  • Examples of Flow Diagrams The flow diagram is a good option to consider when reporting the steps used to select studies for the systematic review. Such diagrams can be included within the report or in the appendix. There are many different ways to design flow diagrams. This slide shows two examples. In the example labeled Figure 2, the authors have laid out the flow diagram in an easy-to-comprehend manner. Notice too that the bottom three boxes contain numerical data, a level of detail that is important to give when using flow diagrams. Reference: Sharma M, Ansari MT, Abou-Setta AM, et al. Systematic review: comparative effectiveness and harms of combination therapy and monotherapy for dyslipidemia. Ann Intern Med 2009;151:622-30. http://www.ncbi.nlm.nih.gov/pubmed/19884623 Fuccio L, Minardi ME, Zagari RM, et al. Meta-analysis: duration of first-line proton-pump inhibitor based triple therapy for Helicobacter pylori eradication. Ann Intern Med 2007;147:553-62. http://www.ncbi.nlm.nih.gov/pubmed/17938394
  • Results: Study Characteristics Authors should report the characteristics of the studies that were included in the review. This can be adequately achieved within the text or by using tables and figures. The example shows how to report study characteristics in text and the level of detail required. The Presentation of Findings module offers additional guidance on how to present study characteristics. Reference: Curioni C, André C. Rimonabant for overweight or obesity. Cochrane Database Syst Rev 2006;(4):CD006162. http://www.ncbi.nlm.nih.gov/pubmed/17054276
  • Results: Quality Assessment Authors should provide quality assessment information on each of the studies included in the review. In this example, the columns represent the quality criteria used, and the rows represent the individual studies included in the review. The table is populated with the quality assessment results. Refer to the Assessing the Quality of Individual Studies module for additional guidance. References: Devereaux PJ, Beattie WS, Choi PT, et al. How strong is the evidence for the use of perioperative beta blockers in non-cardiac surgery? Systematic review and meta-analysis of randomised controlled trials. BMJ 2005;331:313-21. http://www.ncbi.nlm.nih.gov/pubmed/15996966 Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. Ann Intern Med 2009;151:W65-94. http://www.ncbi.nlm.nih.gov/pubmed/19622512
  • Results: Individual Studies When reporting the results of individual studies included in a review, authors should report all outcomes, including benefits and harms, that were assessed. For each study, authors should present simple summary data for each intervention group and effect estimates and confidence intervals with a forest plot, assuming that a meta-analysis has been done. Ideally, authors should also report the results of individual studies in evidence tables.
  • Results: Synthesis of Results There are at least three ways to present syntheses of results. It can be textual, as in the detailed example in this slide. Summary tables and evidence maps are also useful. More detailed information on the use of tables can be found in the Presentation of Findings module. Reference: Duffett M, Choong K, Ng V, Randolph A, et al. Surfactant therapy for acute respiratory failure in children: a systematic review and meta-analysis. Crit Care 2007;11:R66. http://www.ncbi.nlm.nih.gov/pubmed/17573963
  • Results: Risk of Bias Across Studies There should be synergy between what authors report about their methods and what they report about their results. For example, in reporting methods, authors should also report on the risk of bias across studies (see the slide “Methods: Grading Strength of Evidence” in this talk for additional information). The results of such an assessment need to be included in the results section as well. For example, in providing information about how the strength of a body of evidence was graded, authors should discuss risk of bias, directness, consistency, and precision. Review the Grading the Strength of Evidence module for additional guidance. Reference: Ip S, Terasawa T, Balk EM, et al. Comparative Effectiveness of Radiofrequency Catheter Ablation for Atrial Fibrillation. Comparative Effectiveness Review No. 15 (Prepared by Tufts-New England Medical Center Evidence-based Practice Center under Contract No. 290-02-0022). Rockville, MD: Agency for Healthcare Research and Quality, July 2009. AHRQ Publication No. 09-EHC015-EF. Available at: http://www.effectivehealthcare.ahrq.gov/ehc/products/51/114/2009_0623RadiofrequencyFinal.pdf.
  • Results: Additional Sensitivity Analyses Authors should report sensitivity analyses, which are usually methodological (e.g., high quality vs. low quality; low risk of bias vs. high risk of bias), separately from subgroup analyses in the report. The example in this slide provides the type of information authors should include in reporting any additional sensitivity analyses. Reference: Reichenbach S, Sterchi R, Scherer M, et al. Meta-analysis: chondroitin for osteoarthritis of the knee or hip. Ann Intern Med 2007;146:580-90. http://www.ncbi.nlm.nih.gov/pubmed/17438317
  • Results: Additional Subgroup Analyses It is important to separate subgroup analyses, which are usually clinical (e.g., men vs. women; young vs. old), from sensitivity analyses when reporting results. The example in this slide provides the type of information authors should include in reporting any additional subgroup analyses. Reference: Hodson EM, Craig JC, Strippoli GF, et al. Antiviral medications for preventing cytomegalovirus disease in solid organ transplant recipients. Cochrane Database Syst Rev 2008;(2):CD003774. http://www.ncbi.nlm.nih.gov/pubmed/18425894
  • Discussion: Summary of Evidence Authors should summarize the main findings, including the strength of evidence for each of the main outcomes. Review the Assessing Applicability and Grading of Strength of Evidence modules for helpful information.
  • Discussion: Summary of Evidence This slide illustrates a well-written example of a summary of evidence. Reference: Wilt TJ, Shamliyan T, Taylor B, et al. Comparative Effectiveness of Therapies for Clinically Localized Prostate Cancer . Comparative Effectiveness Review No. 13 (Prepared by Minnesota Evidence-based Practice Center under Contract No. 290-02-0009). Rockville, MD: Agency for Healthcare Research and Quality, February 2008. AHRQ Publication No. 08-EHC010-EF. Available at: http://www.effectivehealthcare.ahrq.gov/ehc/products/9/80/2008_0204ProstateCancerFinal.pdf.
  • Discussion: Limitations Authors should feel comfortable reporting on the limitations of the review, because all reviews have limitations. Some of the limitations will be at the study level and some will be at the outcome level. The example illustrates the sort of information that should be included in the discussion section of the document. Reference: Liberati A, D'Amico R, Pifferi S, et al. Antibiotic prophylaxis to reduce respiratory tract infections and mortality in adults receiving intensive care. Cochrane Database Syst Rev 2004;(1):CD000022. http://www.ncbi.nlm.nih.gov/pubmed/14973945
  • Discussion: Conclusions Authors should provide a general interpretation of the results in the context of other evidence and the implications for future research. Reference: Sharma M, Ansari MT, Soares-Weiser K, et al. Comparative Effectiveness of Lipid-Modifying Agents . Comparative Effectiveness Review No. 16 (Prepared by the University of Ottawa Evidence-based Practice Center under contract No. 290-02-0021). Rockville, MD: Agency for Healthcare Research and Quality, September 2009. AHRQ Publication No. 09-EHC024-EF. Available at: http://www.effectivehealthcare.ahrq.gov/ehc/products/11/171/reptbodyfin-typofixed4-12-2010.pdf.
  • Key Messages Reporting the findings of a systematic review is the final step in the continuum of the review process. The report should convey in a transparent manner what was done and what was found. It is especially important to be transparent in reporting the methods and the results. Doing so helps all readers—including consumers, clinicians, and policymakers—better understand the review process. Inadequate reporting makes it more difficult to judge the validity of the methods and the results, thereby making the review less useful as a tool for effective health care.
  • References (I)
  • References (II)
  • References (III)
  • References (IV)
  • References (V)
  • References (VI)
  • References (VII)
  • Author

Presentation Transcript

  • Reporting the Review Prepared for: The Agency for Healthcare Research and Quality (AHRQ) Training Modules for Systematic Reviews Methods Guide www.ahrq.gov
    • To describe the various elements that need to be reported upon completion of a systematic review
    • To distinguish examples of reporting that are adequate from those that are inadequate
    Learning Objectives
  • Systematic Review Process Overview
    • Follow a standard template for the overall report:
      • Abstract and Executive Summary
      • Chapter 1. Introduction
      • Chapter 2. Methods
      • Chapter 3. Results
      • Chapter 4. Discussion
    • Ordering of subsections may vary but:
      • Should adhere to principles of clarity
      • Should be consistent with key questions
      • May be guided by PICOTS
    Writing the Report PICOT(S) = population, intervention, comparator, outcome, time frame, and study design or setting
    • Abstract and Executive Summary
    • Chapter 1: Introduction
      • The purpose of this chapter is to define the project, the purpose and scope of the review, the key research questions, the analytic framework, et cetera.
    • Chapter 2: Methods
      • The purpose of this chapter is to explain the methods used in the review, including the experts involved, the literature search strategy used, the inclusion and exclusion criteria applied, how the evidence tables were developed, the approach used to assess the quality of studies, and data abstraction and data synthesis methods.
      • This chapter should not present any “results” (e.g., tables), but rather should serve as a guide for how the study information was collected and the evidence tables were created so that the research can be replicated.
    Systematic Review Report Structure (I)
    • Chapter 3: Results
      • The purpose of this chapter is to report the results of the data analyses, which should be broken down according to the key research questions.
      • Subsections should be used when applicable.
    • Chapter 4: Discussion
      • The purpose of this chapter is to discuss the strength of the literature and evidence, the principal findings (broken down by key questions) of the review, areas of future research, and any conclusions that can be drawn.
    Systematic Review Report Structure (II)
    • The PRISMA (Preferred Reporting Items for Systematic Reviews and Meta-Analyses) Statement is a guideline that was developed to help improve the quality of review reports.
      • The guideline consists of a 27-item checklist and a flow diagram.
      • Investigators can access the guideline online (http://www.prisma-statement.org/index.htm).
    Using Formal Guidelines To Improve the Reporting of Systematic Reviews
    • Identify the report as a systematic review, meta-analysis, or comparative effectiveness review.
    • Use the PICOTS framework to guide construction of the title.
    • Make the title as succinct as possible yet keep it informative.
      • Example of a short title:
      • Comparative Effectiveness of Lipid-Modifying Agents
      • Example of a longer but more informative title:
      • Mortality in Randomized Trials of Antioxidant Supplements for Primary and Secondary Prevention: Systematic Review and Meta-analysis
    Title of the Review Report PICOT(S) = population, intervention, comparator, outcome, time frame, and study design or setting
    • Should be structured as follows:
      • Background
      • Objectives
      • Key questions
      • Methods
        • Data sources
        • Eligibility criteria
        • Study appraisal
        • Data synthesis
      • Results
      • Limitations
      • Conclusions
        • Implications of key findings
    Executive Summary (I)
    • Should be a distillation of the entire report
    • Should exclude study-by-study results
    • Should describe the evidence that supports all summary statements
    • Should describe the strength of the evidence as categorized in the evidence review
    Executive Summary (II)
    • Should state the purpose and scope of the review:
      • Identify the clinical decisional dilemma.
      • Identify the current literature and state of practice.
      • Give readers the context in which the review was conducted.
    • Should include these important components:
      • Objective(s) and key question(s)
        • Provide an explicit statement of the research questions being addressed with reference to the PICOTS framework.
      • Analytic framework
        • Use such a framework to model existing evidence (refer to the Analytic Framework module for details).
    Introduction PICOTS = population, intervention, comparator, outcome, time frame, and study design or setting
      • Should provide the following information in a clear and transparent manner:
        • Literature Search Strategy and Data Sources
        • Eligibility Criteria
          • Specific to study characteristics (e.g., PICOTS, length of followup)
          • Specific to report characteristics (e.g., years considered, language, publication status)
        • Data Extraction and Data Items (e.g., variables for which data were sought, assumptions and simplifications)
        • Quality Assessment
        • Synthesis of Results
        • Grading Strength of Evidence
        • Additional Analyses
    Methods Overview
    • Present the complete electronic search strategy — including any limits used — in the Appendix of the report.
      • Brief example:
      • We used the following search terms to search all trials registers and databases: immunoglobulin; IVIG [intravenous immunoglobulin]; sepsis; septic shock; septicaemia; and septicemia.
    • The purpose of including the entire search strategy is to ensure transparency and to permit replication of the review.
    Methods: Literature Search Strategy Alejandria MM, et al. Cochrane Database Syst Rev 2002;(1):CD001090.
    • Describe all information sources used in the literature search:
      • Databases with dates of coverage
      • Contacts with study authors to identify additional studies
      • Date of the last search
      • Example:
      • We searched the following databases for primary studies for the periods in parentheses: MEDLINE® (1966 to January 2006), EMBASE® (1974 to January 2006), and the Cochrane Central Register of Controlled Trials (1966 to January 2006). We also searched for systematic reviews until November 2005.
    Methods: Data Sources Bolen S, et al. AHRQ Comparative Effectiveness Review No. 8. Available at: http://www.effectivehealthcare.ahrq.gov/ehc/products/6/39/OralFullReport.pdf.
    • State the processes used to select studies for review:
      • Screening
      • Eligibility assessment
      • Inclusion/exclusion criteria applied for the systematic review and, if applicable, the meta-analysis
      • Example:
      • We included trials if the randomization scheme included groups that assigned patients to treatment guided by the PAC [pulmonary artery catheter] or treatment without the PAC. We only included trials if they reported death and number of days hospitalized or the number of days in the ICU as outcome measures. Studies were excluded if the randomization scheme did not specify groups as PAC or no PAC, if patients were not randomized to a conventional PAC, if investigators combined randomized and nonrandomized groups when reporting outcomes, or if there were no outcome data on death or hospitalizations.
    Methods: Eligibility Criteria (l) Shah MR, et al. JAMA 2005;294:1664-70.
    • Type of studies:
      • Example:
      • Randomised clinical trials studying the administration of hepatitis B vaccine to CRF [chronic renal failure] patients, with or without dialysis. No language, publication date, or publication status restrictions were imposed.
    • Types of participants:
      • Example:
      • Participants of any age with CRF or receiving dialysis (haemodialysis or peritoneal dialysis) were considered.… Renal transplant patients were excluded from this review.
    Methods: Eligibility Criteria (II) Schroth RJ, et al. Cochrane Database Syst Rev 2004;(3):CD003775.
    • Types of interventions:
      • Example:
      • Trials comparing the beneficial and harmful effects of hepatitis B vaccines with adjuvant or cytokine co-interventions [and] trials comparing the beneficial and harmful effects of immunoglobulin prophylaxis.… Hepatitis B vaccines (plasma or recombinant [yeast] derived) of all types, dose, and regimens versus placebo, control vaccine, or no vaccine.
    • Types of outcomes:
      • Example:
      • Primary outcome measures: Seroconversion, [that is], proportion of patients with adequate anti-HBs response (10 IU/L or Sample Ratio Units).…Secondary outcome measures: Adverse events of hepatitis B vaccinations…[and] mortality.
    Methods: Eligibility Criteria (III) Schroth RJ, et al. Cochrane Database Syst Rev 2004;(3):CD003775.
    • Describe the method of data extraction.
      • Example:
      • We developed a data extraction sheet[,]…pilot-tested it on ten randomly-selected included studies, and refined it accordingly. One review author extracted the…data…and the second author checked the extracted data. …Disagreements were resolved by discussion between the two review authors.
    • Describe any processes used to obtain and confirm data from other investigators.
    Methods: Data Extraction Mistiaen P, Poot E. Cochrane Database Syst Rev 2006;(4):CD004510.
    • List and define all variables for which data were sought, using the PICOTS framework as a guide.
    • List any assumptions and simplifications that were made in defining the variables.
      • Example:
      • Information was extracted from each included trial on: (1) characteristics of trial participants…and the trial’s inclusion and exclusion criteria; (2) type of intervention…(versus placebo or versus the type, dose, duration and frequency of another NSAID [nonsteroidal antiinflammatory drug]; or versus another pain management drug; or versus no treatment); (3) type of outcome measure.
    Methods: Data Items Allen C, et al. Cochrane Database Syst Rev 2005;(4):CD004753. PICOT(S) = population, intervention, comparator, outcome, timing, and study design or setting
    • Describe the methods and criteria used to assess the quality (risk of bias) of individual studies.
      • Specify whether or not the assessment was carried out at the study or outcome level, or both.
      • Describe how this information is to be used in any data synthesis.
      • Examples from two separate studies:
      • Pairs of reviewers…determined the adequacy of randomization and concealment of allocation, blinding of patients, health care providers, data collectors, and outcome assessors; and extent of loss to follow-up.
      • To explore variability in study results (heterogeneity) we specified the following hypotheses…
    Methods: Quality Assessment Tracz MJ, et al. J Clin Endocrinol Metab 2006;91:2011-6; Bucher HC, et al. BMJ 2000;321:73-7.
    • For each meta-analysis:
      • Describe the methods used to handle the data and to combine the results of studies.
      • Describe measures of consistency (e.g., I-squared).
      • Examples from two separate studies:
      • In very few instances, estimates of baseline mean or mean QOL [quality of life] responses were obtained without corresponding estimates of variance (standard deviation [SD] or standard error). In these instances, an SD was imputed from the mean of the known SDs. In a number of cases, the response data available were the mean and variance in a prestudy condition and after therapy.…
      • We tested for heterogeneity with the Breslow-Day test, and used the method proposed by Higgins et al. to measure inconsistency.
    Methods: Synthesis of Results Jones M, et al. Cancer 2004;101:1720-32; Briel M, et al. Am J Med 2004;117:596-606.
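    A note on the I-squared measure mentioned above (general background, not drawn from the cited reviews): I² is conventionally defined from Cochran's Q statistic as
    $$ Q = \sum_{i=1}^{k} w_i \left(\hat{\theta}_i - \hat{\theta}\right)^2, \qquad I^2 = \max\!\left(0,\ \frac{Q - (k-1)}{Q}\right) \times 100\%, $$
    where k is the number of pooled studies, $\hat{\theta}_i$ and $w_i$ are each study's effect estimate and inverse-variance weight, and $\hat{\theta}$ is the weighted (fixed-effect) pooled estimate. Values near 0% suggest little between-study inconsistency.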
    • Detail any assessment of risk of bias that may affect the cumulative evidence.
      • Publication bias
      • Selective reporting within studies
      • Example:
      • For each trial we plotted the effect by the inverse of its standard error. The symmetry of such “funnel plots” was assessed both visually, and formally with Egger’s test, to see if the effect decreased with increasing sample size.
    Methods: Grading Strength of Evidence Hróbjartsson A, Gøtzsche PC. Cochrane Database Syst Rev 2004;(1):CD003974.
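    A note on Egger's test named in the example above (general background, not drawn from the cited review): the test fits the regression
    $$ \frac{\hat{\theta}_i}{SE_i} = \beta_0 + \beta_1 \cdot \frac{1}{SE_i} + \varepsilon_i, $$
    where $\hat{\theta}_i$ and $SE_i$ are the effect estimate and standard error of trial i; an intercept $\beta_0$ that differs significantly from zero is taken as evidence of funnel-plot (small-study) asymmetry.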
    • Describe the methods of additional analyses (e.g., sensitivity or subgroup analyses, meta-regression).
    • Indicate which of these analyses were prespecified.
      • Example:
      • Sensitivity analyses were pre-specified. The treatment effects were examined according to quality components (concealed treatment allocation, blinding of patients and caregivers, blinded outcome assessment), time to initiation of statins, and the type of statin. One post-hoc sensitivity analysis was conducted including unpublished data from a trial using cerivastatin.
    Methods: Additional Analyses Briel M, et al. JAMA 2006;295:2046-56.
    • The results section of the review should contain the following subsections:
      • Study Selection
      • Study Characteristics
      • Quality Assessment
      • Individual Studies
      • Synthesis of Results
      • Grading Strength of Evidence
      • Additional Analyses
        • Sensitivity
        • Subgroup
    Results Overview
    • Place at the beginning of the Results section, not in the Methods section.
    • Give numbers of studies screened, assessed for eligibility, and included in the review.
    • Give the reasons for exclusions at each stage of the assessment, ideally illustrated with a flow diagram.
      • Example:
      • The search of Medline, PsycInfo and Cinahl [sic] databases provided a total of 584 citations. After adjusting for duplicates 509 remained. Of these, 479 studies were discarded because…. Three additional studies…were discarded because…. The full text of the remaining 27 citations was examined in more detail. It appeared that 22 studies did not meet the inclusion criteria as described. Five studies…met the inclusion criteria and were included in the systematic review.
    Results: Study Selection Uitterhoeve RJ, et al. Br J Cancer. 2004;91:1050-62.
  • Examples of Flow Diagrams Fuccio L, et al. Ann Intern Med 2007;147:553-62. Reprinted with permission from the American College of Physicians. Sharma M, et al. Ann Intern Med 2009;151:622-30. Reprinted with permission from the American College of Physicians.
    • Study characteristics can be presented:
      • Within text
      • In summary tables and graphs
      • Example:
      • All four studies finally selected for the review were randomised controlled trials published in English. The duration of the intervention was 24 months for the RIO-North America and 12 months for the RIO-Diabetes, RIO-Lipids and RIO-Europe study. Although the last two described a period of 24 months during which they were conducted, only the first 12-months results are provided. All trials had a run-in, as a single blind period before the randomisation.
    Results: Study Characteristics Curioni C, André C. Cochrane Database Syst Rev 2006;(4):CD006162.
    • Present data on risk of bias for each study analyzed.
    • Present the results of outcome-level assessments, if available.
      • Example:
    Results: Quality Assessment Devereaux PJ, et al. BMJ 2005;331:313-21, as adapted in Liberati A, et al. Ann Intern Med 2009;151:W65-94.
    • For all outcomes considered (benefits and harms), present the following for each study:
      • simple summary data for each intervention group, and
      • effect estimates and confidence intervals (ideally with a forest plot).
    • Present the results of individual studies in evidence tables and summary tables and not in the text.
    • Refer to the Presentation of Findings module to see examples of the tables and graphs used to present summaries of individual studies.
    Results: Individual Studies* * This may appear in the appendix for Evidence-based Practice Center reports.
    • Present synthesized results in text, summary tables, or evidence maps.
      • Text example:
      • Mortality data were available for all six trials, randomizing 311 patients and reporting data for 305 patients. There were no deaths reported in the three respiratory syncytial virus/severe bronchiolitis trials; thus our estimate is based on three trials randomizing 232 patients, 64 of whom died. In the pooled analysis, surfactant was associated with significantly lower mortality (relative risk = 0.7, 95% confidence interval = 0.4 – 0.97, P = 0.04). There was no evidence of heterogeneity (I² = 0%).
      • Summary tables: refer to the Presentation of Findings module
      • Evidence maps: refer to the Presentation of Findings module
    Results: Synthesis of Results Duffett M, et al. Crit Care 2007;11:R66.
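    A note on the pooled relative risk quoted above (general background, not drawn from the cited review): relative risks are typically combined, and their confidence intervals formed, on the log scale,
    $$ 95\%\ \mathrm{CI} = \exp\!\left( \ln\widehat{RR} \pm 1.96 \times SE\!\left(\ln\widehat{RR}\right) \right), $$
    so an interval that excludes 1, as in the example (0.4 to 0.97), is consistent with the reported P value below 0.05.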
    • Include risk of bias, directness, consistency, and precision in reporting how evidence was graded.
      • Example:
      • There is a low level of evidence…that RFA [radiofrequency catheter ablation] improves quality of life more than medical treatment. Three RCTs [randomized controlled trials] and one observational study reported more improvement in the general or physical functioning…in patients who underwent RFA.…However, these studies assessed the results at nonuniform time points and therefore the findings may be difficult to interpret.
    Results: Grading Strength of Evidence Ip S, et al. AHRQ Comparative Effectiveness Review No. 15. Available at: http://www.effectivehealthcare.ahrq.gov/ehc/products/51/114/2009_0623RadiofrequencyFinal.pdf.
    • Give the results of additional analyses, such as sensitivity or subgroup analyses and meta-regressions.
    • Include the results of additional analyses to facilitate a better understanding of heterogeneity.
      • Example 1:
      • [B]enefits of chondroitin were smaller in trials with adequate concealment of allocation compared with trials with unclear concealment (P for interaction = 0.050), in trials with an intention-to-treat analysis compared with those that had excluded patients from the analysis (P for interaction = 0.017), and in large compared with small trials (P for interaction = 0.022).
    Results: Additional Sensitivity Analyses Reichenbach S, et al. Ann Intern Med 2007;146:580-90.
    • Example 2:
    • Subgroup analyses according to antibody status, antiviral medications, organ transplanted, treatment duration, use of antilymphocyte therapy, time to outcome assessment, study quality and other aspects of study design did not demonstrate any differences in treatment effects. Multivariate meta-regression showed no significant difference in CMV [cytomegalovirus] disease after allowing for potential confounding or effect-modification by prophylactic drug used, organ transplanted or recipient serostatus in CMV positive recipients and CMV negative recipients of CMV positive donors.
    Results: Additional Subgroup Analyses Hodson EM, et al. Cochrane Database Syst Rev 2008;(2):CD003774.
    • Summarize the main findings, including the strength of evidence for each main outcome.
    • Consider the applicability of findings to key groups (e.g., health care providers, users, and policymakers).
    • Refer to the Assessing Applicability and Grading Strength of Evidence modules for additional guidance.
    Discussion: Summary of Evidence
    • Example:
    • Compared with men who used watchful waiting, men with clinically localized prostate cancer detected by methods other than PSA [prostate-specific antigen] testing and treated with radical prostatectomy experienced fewer deaths from prostate cancer, marginally fewer deaths from any cause, and fewer distant metastases. The greater benefit of RP on cancer-specific and overall mortality appears to be limited to men under 65 years of age but is not dependent on baseline PSA level or histologic grade.
    Discussion: Summary of Evidence Wilt TJ, et al. AHRQ Comparative Effectiveness Review No. 13. Available at: http://www.effectivehealthcare.ahrq.gov/ehc/products/9/80/2008_0204ProstateCancerFinal.pdf.
    • Discuss the limitations of the review at different levels:
      • Study level (e.g., risk of bias)
      • Outcome level (e.g., benefits or harms)
      • Review level (e.g., incomplete retrieval of identified research; reporting bias)
      • Example:
      • The meta-analysis reported here combines data across studies in order to estimate treatment effects with more precision than is possible in a single study. The main limitation of this meta-analysis, as with any overview, is that the patient population, the antibiotic regimen and the outcome definitions are not the same across studies.
    Discussion: Limitations Liberati A, et al. Cochrane Database Syst Rev 2004;(1):CD000022.
    • Provide a general interpretation of the results in the context of other evidence and the implications for future research.
      • Example:
      • [T]he available clinical trial evidence supporting the use of combination therapies over higher dose statin therapy is insufficient to guide clinical decisions. The long term clinical benefits and risks of combination therapies have yet to be demonstrated. There are some instances, such as failure to reach targets in spite of maximal statin therapy, and populations with elevated triglycerides who need to achieve secondary goals, in which clinicians may choose combinations pending definitive evidence.
    Discussion: Conclusions Sharma M, et al. AHRQ Comparative Effectiveness Review No. 16. Available at: http://www.effectivehealthcare.ahrq.gov/ehc/products/11/171/reptbodyfin-typofixed4-12-2010.pdf.
    • Reporting a systematic review is the final step of the review process.
    • The report should convey in a transparent manner the methods and results to readers, including consumers, clinicians, and policymakers.
    • Inadequate reporting makes it more difficult to judge the validity of the methods and results.
    Key Messages
    • Alejandria MM, Lansang MA, Dans LF, et al. Intravenous immunoglobulin for treating sepsis and septic shock. Cochrane Database Syst Rev 2002;(1):CD001090.
    • Allen C, et al. Non-steroidal anti-inflammatory drugs for pain in women with endometriosis. Cochrane Database Syst Rev 2005;(4):CD004753.
    • Bolen S, Wilson L, Vassy J, et al. Comparative Effectiveness and Safety of Oral Diabetes Medications for Adults With Type 2 Diabetes. Comparative Effectiveness Review No. 8 (Prepared by The Johns Hopkins University Evidence-based Practice Center under Contract No. 290-02-0018). Rockville, MD: Agency for Healthcare Research and Quality, July 2007. AHRQ Publication No. 07-EHC010-EF.
    • Briel M, Studer M, Glass T, et al. Effects of statins on stroke prevention in patients with and without coronary heart disease: a meta-analysis of randomized controlled trials. Am J Med 2004;117:596-606.
    References (I)
    • Briel M, Schwartz GG, Thompson PL, et al. Effects of early treatment with statins on short-term clinical outcomes in acute coronary syndromes: a meta-analysis of randomized controlled trials. JAMA 2006;295:2046-56.
    • Bucher HC, Hengstler P, Schindler C, et al. Percutaneous transluminal coronary angioplasty versus medical treatment for non-acute coronary heart disease: meta-analysis of randomised controlled trials. BMJ 2000;321:73-7.
    • Curioni C, André C. Rimonabant for overweight or obesity. Cochrane Database Syst Rev 2006;(4):CD006162.
    • Devereaux PJ, Beattie WS, Choi PT, et al. How strong is the evidence for the use of perioperative beta blockers in non-cardiac surgery? Systematic review and meta-analysis of randomised controlled trials. BMJ 2005;331:313-21.
    References (II)
    • Duffett M, Choong K, Ng V, Randolph A, et al. Surfactant therapy for acute respiratory failure in children: a systematic review and meta-analysis. Crit Care 2007;11:R66.
    • Fuccio L, Minardi ME, Zagari RM, et al. Meta-analysis: duration of first-line proton-pump inhibitor based triple therapy for Helicobacter pylori eradication. Ann Intern Med 2007;147:553-62.
    • Hodson EM, Craig JC, Strippoli GF, et al. Antiviral medications for preventing cytomegalovirus disease in solid organ transplant recipients. Cochrane Database Syst Rev 2008;(2):CD003774.
    • Hróbjartsson A, Gøtzsche PC. Placebo interventions for all clinical conditions. Cochrane Database Syst Rev 2004;(1):CD003974.
    References (III)
    • Ip S, Terasawa T, Balk EM, et al. Comparative Effectiveness of Radiofrequency Catheter Ablation for Atrial Fibrillation. Comparative Effectiveness Review No. 15 (Prepared by Tufts-New England Medical Center Evidence-based Practice Center under Contract No. 290-02-0022). Rockville, MD: Agency for Healthcare Research and Quality, July 2009. AHRQ Publication No. 09-EHC015-EF.
    • Jones M, Schenkel B, Just J, et al. Epoetin alfa improves quality of life in patients with cancer: results of metaanalysis. Cancer 2004;101:1720-32.
    • Lakhdar R, Al-Mallah MH, Lanfear DE. Safety and tolerability of angiotensin-converting enzyme inhibitor versus the combination of angiotensin-converting enzyme inhibitor and angiotensin receptor blocker in patients with left ventricular dysfunction: a systematic review and meta-analysis of randomized controlled trials. J Card Fail 2008;14:181-8.
    • Liberati A, Altman DG, Tetzlaff J, et al. The PRISMA statement for reporting systematic reviews and meta-analyses of studies that evaluate health care interventions: explanation and elaboration. Ann Intern Med 2009;151:W65-94.
    References (IV)
    • Liberati A, D'Amico R, Pifferi S, et al. Antibiotic prophylaxis to reduce respiratory tract infections and mortality in adults receiving intensive care. Cochrane Database Syst Rev 2004;(1):CD000022.
    • Mistiaen P, Poot E. Telephone follow-up, initiated by a hospital-based health professional, for postdischarge problems in patients discharged from hospital to home. Cochrane Database Syst Rev 2006;(4):CD004510.
    • PRISMA Statement Web site. Home page. Available at: http://www.prisma-statement.org/index.htm.
    • Reichenbach S, Sterchi R, Scherer M, et al. Meta-analysis: chondroitin for osteoarthritis of the knee or hip. Ann Intern Med 2007;146:580-90.
    • Schroth RJ, Hitchon CA, Uhanova J, et al. Hepatitis B vaccination for patients with chronic renal failure. Cochrane Database Syst Rev 2004;(3):CD003775.
    References (V)
    • Shah MR, Hasselblad V, Stevenson LW, et al. Impact of the pulmonary artery catheter in critically ill patients: meta-analysis of randomized clinical trials. JAMA 2005;294:1664-70.
    • Sharma M, Ansari MT, Soares-Weiser K, et al. Comparative Effectiveness of Lipid-Modifying Agents. Comparative Effectiveness Review No. 16 (Prepared by the University of Ottawa Evidence-based Practice Center under Contract No. 290-02-0021). Rockville, MD: Agency for Healthcare Research and Quality, September 2009. AHRQ Publication No. 09-EHC024-EF.
    • Sharma M, Ansari MT, Abou-Setta AM, et al. Systematic review: comparative effectiveness and harms of combination therapy and monotherapy for dyslipidemia. Ann Intern Med 2009;151:622-30.
    References (VI)
    • Tracz MJ, Sideras K, Boloña ER, et al. Testosterone use in men and its effects on bone health. A systematic review and meta-analysis of randomized placebo-controlled trials. J Clin Endocrinol Metab 2006;91:2011-6.
    • Uitterhoeve RJ, Vernooy M, Litjens M, et al. Psychosocial interventions for patients with advanced cancer — a systematic review of the literature. Br J Cancer 2004;91:1050-62.
    • Wilt TJ, Shamliyan T, Taylor B, et al. Comparative Effectiveness of Therapies for Clinically Localized Prostate Cancer. Comparative Effectiveness Review No. 13 (Prepared by Minnesota Evidence-based Practice Center under Contract No. 290-02-0009). Rockville, MD: Agency for Healthcare Research and Quality, February 2008. AHRQ Publication No. 08-EHC010-EF.
    References (VII)
    • This presentation was prepared by David Moher, Ph.D., director, University of Ottawa Evidence-based Practice Center.
    • Many of the examples in the presentations are taken from the PRISMA Statement for Reporting Systematic Reviews and Meta-Analyses of Studies That Evaluate Health Care Interventions (Liberati A, et al. PLoS Med 2009;6(7): e1000100).
    Author