Systematic review: teams, processes, experiences
Systematic review: teams, processes, experiences


A presentation given for the School of Information course on SI 653: Evidence-Based Health Information Practice taught by Tiffany Veinot.

Usage Rights

CC Attribution-NonCommercial-ShareAlike License


Presentation Transcript

  • Systematic Review Processes, Teams,& Experiences PF Anderson, University of Michigan March 29, 2012
  • Overview
    ● About Systematic Reviews: Purpose; Uses; Role of the Librarian
    ● Process & Methodology: FRIAR/SECT; MEMORABLE; Sentinels & MeSH; Publishing; Discovering/Sharing Strategies; Data Abstraction / Extraction; Reporting
  • About Systematic Reviews
  • Evidence-based or Systematic Review, What's the Difference?
    ● Evidence-based -> clinically integrated
    ● Systematic review -> research methodology
    http://www.flickr.com/photos/rosefirerising/1175879764/in/set-72157604660150389/
  • Clinical
  • What is a Systematic Review?
    ● Scientific & Unbiased: “A systematic review involves the application of scientific strategies, in ways that limit bias, to the assembly, critical appraisal, and synthesis of all relevant studies that address a specific clinical question.”
    ● Summary: “A meta-analysis is a type of systematic review that uses statistical methods to combine and summarize the results of several primary studies.”
    ● Clearly Reported: “Because the review process itself (like any other type of research) is subject to bias, a useful review requires clear reporting of information obtained using rigorous methods.”
    ● Cook DJ, Mulrow CD, Haynes RB. Systematic Reviews: Synthesis of Best Evidence for Clinical Decisions. Annals of Internal Medicine 1997 126(5):376-380.
  • Cochrane Collaboration ◆ http://cochrane.org/consumers/docs/01Cochrane5min.ppt
  • Cochrane Review Teams
    ● Clinical expert: Initiates, defines, selects topic.
    ● Clinical expert: Partners in above process, and collaborates in review to prevent bias.
    ● Statistician: Provides methodological oversight, ensures process quality for entire project.
    ● Librarian: Provides methodological oversight, ensures process quality for the information search process.
    ● Healthcare Consumer: Provides insight into the priorities for research; information conduit for relating priorities and findings between consumers and clinicians.
  • What is Evidence-Based Healthcare?
    ● According to the ADA policy statement on EBD, the term "best evidence" "refers to information obtained from randomized controlled clinical trials, nonrandomized controlled clinical trials, cohort studies, case-control studies, crossover studies, cross-sectional studies, case studies or, in the absence of scientific evidence, the consensus opinion of experts in the appropriate fields of research or clinical practice. The strength of the evidence follows the order of the studies or opinions listed above."
    ● Ismail AI, Bader JD. Evidence-based dentistry in clinical practice. JADA 2004 135(1):78-83.
  • What is Evidence-Based Healthcare?
    ● Short version: ‘Make your [clinical] decisions based on the best evidence available, integrated with your clinical judgment. That’s all it means. The best evidence, whatever that is.’
    ● Paraphrased from Dr. Ismail in conversation, circa 2003.
  • Levels of Evidence, in Context
  • Process & Methodology
  • Process & Methodology, Overview
    ● Team meets: Define topic, overview literature base, suggest inclusion/exclusion criteria, discuss methodology & timeline.
    ● Librarian: Generates data for the team (FRIAR/MEMORABLE/SECT); topic experts collaborate.
    ● Topic experts: Review data at 3-4 levels (title, abstract, article, [request additional information]), achieve consensus; handsearching (librarian generates list, experts implement); determine level of evidence for remaining research; generate review tables.
    ● Share findings (Publication): Strength of evidence available (strong, weak, inadequate); suggest directions for future research to fill gaps in the research base.
  • Expectations of the Librarian Role
    ● Search strategy: Background research of already published similar search strategies and of systematic reviews on related topics; suggest appropriate terms & concepts for review by clinical experts; clear communications with team about relationship of search to question and methodology; sensitivity / specificity; validated, revised, adhering to standards / guidelines / best practices; publication-ready copy of strategy; variant strategies for other databases.
    ● Data set: In appropriate format; data set management support.
    ● Methodology oversight: Write / revise methodology & results as appropriate; assure replicability of methods.
  • Review Search Terms
    ● Disease / clinical topic: anatomical area; earlier terms; alternate terms; related concepts
    ● Methodologies
    ● Investigative terms
    ● Publication types
  • Special Issues
    ● Authorship: justifying co-authorship; criteria
    ● Avoiding plagiarism, self-plagiarism, and other questionable writing practices: A guide to ethical writing: http://ori.dhhs.gov/education/products/plagiarism/index.shtml
    ● Establishing authorship: http://ori.dhhs.gov/education/products/plagiarism/33.shtml
    ● Co-authorship versus fee-for-services-rendered (or both)
    ● When do you NOT want to be a co-author?
  • FRIAR/SECT
    FRIAR:
    ● F – Frame
    ● R – Rank by Relevance
    ● I – Irrelevant Search Concepts
    ● A – Alternates/Aliases (Term Generation)
    ● R – Review, Revise, Repeat
    SECT:
    ● S – Search
    ● E – Evaluate
    ● C – Cite
    ● T – Test/Try Again
  • Frame = Question (PIO/PICO/PICOT)
    ● P = Patient
    ● I = Intervention
    ● C = Control group or comparison
      ● NOTE: In very small research domains, this portion may not be included. A systematic review would not reach clinical significance, but would focus on levels of evidence available and directions for future research.
    ● O = Outcome desired
    ● T = Time factor
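The PICO facets above map naturally onto a boolean search: synonyms within a facet are ORed together, and the facets are ANDed. A minimal Python sketch of that assembly, using made-up placeholder terms rather than a validated strategy:

```python
# Sketch: assembling a boolean search string from PICO components.
# All terms below are illustrative placeholders, not a validated strategy.

def pico_to_query(p_terms, i_terms, c_terms=None, o_terms=None):
    """OR together the synonyms within each PICO facet, then AND the facets.
    C and O are optional: small domains may omit the comparison facet."""
    groups = []
    for terms in (p_terms, i_terms, c_terms, o_terms):
        if terms:  # skip omitted facets
            groups.append("(" + " OR ".join(f'"{t}"' for t in terms) + ")")
    return " AND ".join(groups)

query = pico_to_query(
    p_terms=["dental caries", "tooth decay"],          # Patient/problem
    i_terms=["fluoride varnish", "topical fluoride"],  # Intervention
    o_terms=["caries increment"],                      # Outcome
)
print(query)
# ("dental caries" OR "tooth decay") AND ("fluoride varnish" OR "topical fluoride") AND ("caries increment")
```

In practice each facet's synonym list comes out of the term-generation step below (Alternates/Aliases), and the real strategy also mixes in controlled vocabulary and field tags specific to the database.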
  • R = Relevance
    ● Searches usually have 2-4 topics that are relevant and searchable.
    ● Rank relevant searchable topics by importance.
    ● TIP: Structure the search to place the greatest attention, focus, and richness of term generation on the most important topic.
  • I = Irrelevant
    ● Too broad
    ● Too specific
    ● Unsearchable
    ● NOTE: Irrelevant topics are not irrelevant to the question or clinical decision, and are important to making the final decisions. They are irrelevant to the search process.
  • A = Alternates/Aliases
    ● Term generation process might include: alternate terms, spellings (UK), archaic terms; acronyms & what they stand for; anatomical area, symptoms, diagnostic criteria; products, chemicals, microorganisms, registry numbers, etc.
    ● NOTE: After asking the question, this is the most important part of the process.
    ● TIP: Have the team brainstorm terms, then search for more, then have the team review the added terms.
  • R = Review, Revise, Repeat
    ● Did the search retrieve what it should? (True positive) If not, why not?
    ● Did the search retrieve what it shouldn't? (False positive) Are there patterns that would allow you to exclude the false positives without losing the ones you want?
  • MEMORABLE, A Medline Search Strategy Development Tool
  • Sentinel Articles
    ● Number of sentinels desired: 3-5. You can have more or fewer, but this range tends to work best. Verify appropriateness of selected sentinels.
    ● Neither very recent (current year) nor old (before 1985): articles old enough to have MeSH assigned, new enough to have complete indexing; on topic, not broader or narrower; well-indexed with appropriate terms.
    ● Representative of citations that would be retrieved by a well-done search.
    ● Remember: the purpose is validating the search, not proving you know the best articles on the topic.
    ● Each sentinel article must represent ALL desired concepts in the search, and must meet all inclusion and exclusion criteria.
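Because sentinels exist to validate the search, the check itself is mechanical: every sentinel PMID should appear in the retrieved set; any that do not are false negatives that send you back to Review, Revise, Repeat. A small sketch of that check, with made-up placeholder PMIDs:

```python
# Sketch: validating a search against sentinel articles by PMID.
# The PMIDs below are made-up placeholders.

def check_sentinels(sentinel_pmids, retrieved_pmids):
    """Every sentinel should appear in the retrieved set (true positives);
    any that do not signal a gap in the strategy (false negatives)."""
    retrieved = set(retrieved_pmids)
    found = sorted(p for p in sentinel_pmids if p in retrieved)
    missed = sorted(p for p in sentinel_pmids if p not in retrieved)
    return found, missed

sentinels = ["11111111", "22222222", "33333333"]
results = ["11111111", "33333333", "44444444", "55555555"]
found, missed = check_sentinels(sentinels, results)
print(f"retrieved {len(found)}/{len(sentinels)} sentinels; missed: {missed}")
# retrieved 2/3 sentinels; missed: ['22222222']
```

A missed sentinel is worth diagnosing individually: look at its indexing to see which concept group in the strategy failed to match it.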
  • MeSH Tip
    ● Earlier term mappings, prior to assignment of a MeSH term, are often: presenting symptom or diagnosis; anatomical area.
    ● TMJD Example: TMJD = temporomandibular joint disorder = Temporomandibular joint [anatomical area] + ("myofacial pain" OR "Bone Diseases") [presenting symptom OR diagnosis]
    Image: Frank Gaillard. Normal anatomy of the Temporomandibular joint. 14 Jan 2009. http://commons.wikimedia.org/wiki/File:Temporomandibular_joint.png
  • MeSH & Sentinels 1 & 2
    ● Verify sentinel citations in MEDLINE
    ● Save file with full citations (abstract, MeSH headings, everything)
    ● Make duplicate file to process
    Process citations:
    ● strip out everything except MeSH terms
    ● sort alphabetically
    ● strip irrelevant MeSH terms
    ● dedupe
    ● note frequency of repetition of terms to aid in determining weight of term
    ● note distribution and frequency of subheadings
  • MeSH & Sentinels 3
    ● Analyse MeSH terms: retain topical terms; retain methodology terms; retain non-MeSH terms such as publication type and registry numbers (but separate from core concept terms).
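The processing steps above (strip to MeSH terms, dedupe, count frequency of headings and subheadings) can be sketched in a few lines of Python against citations saved in MEDLINE (.nbib) format, where each assigned term sits on an "MH  - Heading/subheading" line and an asterisk marks a major topic. The sample records are invented for illustration:

```python
# Sketch: tallying MeSH main headings and subheadings across sentinel
# citations saved in MEDLINE format ("MH  - Heading/*subheading" lines).
from collections import Counter

def tally_mesh(medline_text):
    headings, subheadings = Counter(), Counter()
    for line in medline_text.splitlines():
        if line.startswith("MH  - "):
            parts = line[6:].split("/")
            headings[parts[0].lstrip("*")] += 1  # main heading, star stripped
            for sub in parts[1:]:                # any attached subheadings
                subheadings[sub.lstrip("*")] += 1
    return headings, subheadings

sample = """\
MH  - Dental Caries/*therapy
MH  - Fluorides, Topical/administration & dosage
MH  - Dental Caries/prevention & control
"""
mh, sh = tally_mesh(sample)
print(mh.most_common(1))  # [('Dental Caries', 2)]
print(sorted(sh))         # ['administration & dosage', 'prevention & control', 'therapy']
```

The frequency counts feed directly into the weighting decision: headings shared by most or all sentinels are strong candidates for the core concept groups of the strategy.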
  • Systematic Review Search Strategy Example
    ● Torabinejad M, Anderson P, Bader J, Brown LJ, Chen LH, Goodacre CJ, Kattadiyil MT, Kutsenko D, Lozada J, Patel R, Petersen F, Puterman I, White SN. Outcomes of root canal treatment and restoration, implant-supported single crowns, fixed partial dentures, and extraction without replacement: a systematic review. J Prosthet Dent. 2007 Oct;98(4):285-311. PMID: 17936128
  • Sources of Search Strategies
    ● Lingo: filters vs. hedges
    ● Search the Methods sections of existing systematic reviews.
    ● Warning: many articles published as systematic reviews may have modified the process; many may not include a replicable search methodology; some may not actually be systematic reviews.
  • Sources of Search Strategies
    ● NIH Consensus Development Conference on Dental Caries: http://www.lib.umich.edu/health-sciences-libraries/nih-consensus-development-conference-diagnosis-and-management-dental-carie
    ● EBHC Strategies Wiki: http://ebhcstrategies.wetpaint.com/
  • Data Abstraction / Extraction
    ● "Symbolic expression always arises through abstraction and simplification, and can only reflect a small part of reality."
    ● Cajal, Advice to a Young Investigator, p. 76
  • Cochrane: Data Extraction
  • Evidence Table Example
    ● Levels of evidence
    ● Participant characteristics
    ● Study characteristics
    ● Intervention and outcome measurements
    ● Results
    ● Study limitations
    ● Inclusion/Exclusion criteria
    http://www.aota.org/DocumentVault/AJOT/Template.aspx?FT=.pdf
  • Data abstraction / extraction samples
    ● Cochrane:
      ● Forms: http://endoc.cochrane.org/data-extraction-sheets
      ● Elements: http://www.cochrane.org/handbook/table-73a-checklist-items-consider-data-collection-or-data-extraction
      ● Cochrane CFGD November 2004 (DOC): http://cfgd.cochrane.org/sites/cfgd.cochrane.org/files/uploads/Study%20selection%20and%20%20extraction%20form.doc
      ● Data Extraction Template, 2011 (XLS): http://www.latrobe.edu.au/chcp/assets/downloads/DET_2011.xls
      ● Overview: http://www.cochrane-pro-mg.org/Documents/Reviewsheet-1-4-08.pdf
    ● Other:
      ● CDC Data Abstraction Form: http://www.nccmt.ca/uploads/registry/CDC%20Tool.pdf
      ● Social science example (suicide): http://www.cmaj.ca/content/suppl/2009/01/29/180.3.291.DC2/ssri-barbui-3-at.pdf
  • Clearly Stating the Evidence (PRISMA)
    ● NOTE: PRISMA-P coming later this year
    http://www.prisma-statement.org/2.1.4%20-%20PRISMA%20Flow%202009%20Diagram.pdf
  • Guidelines for Assessing/Reporting Results
    ● CONSORT (Consolidated Standards of Reporting Trials) 2001, 2010
    ● ASSERT (A Standard for the Scientific & Ethical Review of Trials) 2001
      ○ EQUATOR (Enhancing the QUAlity & Transparency of health Research) 2008
      ○ SPIRIT (Standard Protocol Items for Randomized Trials) 2008
    ● QUOROM (Quality of Reporting of Meta-analyses) 1999, 2009
    ● MOOSE (Meta-analysis of Observational Studies in Epidemiology) 2000
    ● STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) 2007, STROBE-ME 2011
    ● Others: CENT, COREQ, GNOSIS, ORION, RECORD, REFLECT, REMARK, REMINDE, SQUIRE, STARD, STREGA, TREND, WIDER, ...
  • Evolution of Guidelines for Assessing/Reporting Results
    ● CONSORT 2001 —> 2010 —> PRISMA
    ● QUOROM 1999 —> 2009 —> PRISMA
    ● ASSERT 2001 —> SPIRIT 2008 —> EQUATOR 2008
    ● STROBE 2007 —> STROBE-ME 2011
    ● More on fragmentation & history: Jan P. Vandenbroucke. Journal of Clinical Epidemiology 62 (2009):594-596. http://www.veteditors.org/Publication%20Guidelines/JOURNAL%20VARIATION%20on%20Guidelines%202009.pdf
    ● CURRENT LIST: http://www.equator-network.org/resource-centre/library-of-health-research-reporting/
  • Standard Team/Process vs Reality: Case Studies
    ● Elements: team size and configuration; domain size; librarian role and number; clerical support; software / tech support; handsearching / experts; outcomes
    ● Projects & Examples: NIH Consensus Development Conference on Dental Caries; Torabinejad: Root Canal Treatment; Cochrane: Herpes Simplex (Oral Health, Skin); Delta Dental; Diabetes; Articaine vs. Lidocaine; Healthcare Social Media
    ● Most common? Underestimating time/labor requirements; first-time findings of "insufficient evidence"
  • Slides at: http://slideshare.net/umhealthscienceslibraries/
  • Contact: pfa@umich.edu