Systematic review: teams, processes, experiences

A presentation given for the School of Information course on SI 653: Evidence-Based Health Information Practice taught by Tiffany Veinot.

Transcript of "Systematic review: teams, processes, experiences"

1. Systematic Review Processes, Teams, & Experiences. PF Anderson, University of Michigan. March 29, 2012.
2. Overview
   ● About systematic reviews
     ● Purpose
     ● Uses
     ● Role of the librarian
   ● Process & methodology
     ● FRIAR/SECT
     ● MEMORABLE
     ● Sentinels & MeSH
     ● Publishing
     ● Discovering/sharing strategies
     ● Data abstraction / extraction
     ● Reporting
3. About Systematic Reviews
4. Evidence-Based or Systematic Review: What's the Difference?
   ● Evidence-based -> clinically integrated
   ● Systematic review -> research methodology
   Image: http://www.flickr.com/photos/rosefirerising/1175879764/in/set-72157604660150389/
5. Clinical
6. What Is a Systematic Review?
   ● Scientific & unbiased: "A systematic review involves the application of scientific strategies, in ways that limit bias, to the assembly, critical appraisal, and synthesis of all relevant studies that address a specific clinical question."
   ● Summary: "A meta-analysis is a type of systematic review that uses statistical methods to combine and summarize the results of several primary studies."
   ● Clearly reported: "Because the review process itself (like any other type of research) is subject to bias, a useful review requires clear reporting of information obtained using rigorous methods."
   ● Cook DJ, Mulrow CD, Haynes RB. Systematic reviews: synthesis of best evidence for clinical decisions. Annals of Internal Medicine 1997;126(5):376-380.
7. Cochrane Collaboration
   ◆ http://cochrane.org/consumers/docs/01Cochrane5min.ppt
8. Cochrane Review Teams
   ● Clinical expert: initiates, defines, selects topic.
   ● Clinical expert: partners in the above process, and collaborates in the review to prevent bias.
   ● Statistician: provides methodological oversight, ensures process quality for the entire project.
   ● Librarian: provides methodological oversight, ensures process quality for the information search process.
   ● Healthcare consumer: provides insight into the priorities for research; information conduit for relating priorities and findings between consumers and clinicians.
9. What Is Evidence-Based Healthcare?
   ● According to the ADA policy statement on EBD, the term "best evidence" "refers to information obtained from randomized controlled clinical trials, nonrandomized controlled clinical trials, cohort studies, case-control studies, crossover studies, cross-sectional studies, case studies or, in the absence of scientific evidence, the consensus opinion of experts in the appropriate fields of research or clinical practice. The strength of the evidence follows the order of the studies or opinions listed above."
   ● Ismail AI, Bader JD. Evidence-based dentistry in clinical practice. JADA 2004;135(1):78-83.
10. What Is Evidence-Based Healthcare? (Short Version)
   ● "Make your [clinical] decisions based on the best evidence available, integrated with your clinical judgment. That's all it means. The best evidence, whatever that is."
   ● Paraphrased from Dr. Ismail in conversation, circa 2003.
11. Levels of Evidence, in Context
12. Process & Methodology
13. Process & Methodology, Overview
   ● Team meets
     ● Define topic, overview the literature base, suggest inclusion/exclusion criteria, discuss methodology & timeline.
   ● Librarian
     ● Generates data for the team (FRIAR/MEMORABLE/SECT)
     ● Topic experts collaborate
   ● Topic experts
     ● Review data at 3-4 levels (title, abstract, article, [request additional information]), achieve consensus
     ● Handsearching (librarian generates list, experts implement)
     ● Determine level of evidence for remaining research
     ● Generate review tables
   ● Share findings (publication)
     ● Strength of evidence available (strong, weak, inadequate); suggest directions for future research to fill gaps in the research base.
14. Expectations of the Librarian Role
   ● Search strategy
     ● Background research of already published similar search strategies, systematic reviews on related topics
     ● Suggest appropriate terms & concepts for review by clinical experts
     ● Clear communications with the team about the relationship of search to question and methodology
     ● Sensitivity / specificity
     ● Validated, revised, adhering to standards / guidelines / best practices
     ● Publication-ready copy of the strategy
     ● Variant strategies for other databases
   ● Data set
     ● In appropriate format
     ● Data set management support
   ● Methodology oversight
     ● Write / revise methodology & results as appropriate
     ● Assure replicability of methods
15. Review Search Terms
   ● Disease / clinical topic
     ● Anatomical area
     ● Earlier terms
     ● Alternate terms
     ● Related concepts
   ● Methodologies
   ● Investigative terms
   ● Publication types
16. Special Issues: Authorship
   ● Justifying co-authorship
   ● Criteria
     ● Avoiding plagiarism, self-plagiarism, and other questionable writing practices: A guide to ethical writing: http://ori.dhhs.gov/education/products/plagiarism/index.shtml
     ● Establishing authorship: http://ori.dhhs.gov/education/products/plagiarism/33.shtml
   ● Co-authorship versus fee-for-services-rendered (or both)
   ● When do you NOT want to be a co-author?
17. FRIAR/SECT
   ● FRIAR:
     ● F - Frame
     ● R - Rank by Relevance
     ● I - Irrelevant Search Concepts
     ● A - Alternates/Aliases (Term Generation)
     ● R - Review, Revise, Repeat
   ● SECT:
     ● S - Search
     ● E - Evaluate
     ● C - Cite
     ● T - Test/Try Again
18. Frame = Question (PIO/PICO/PICOT)
   ● P = Patient
   ● I = Intervention
   ● C = Control group or comparison
     ● NOTE: In very small research domains, this portion may not be included. A systematic review would not reach clinical significance, but would focus on levels of evidence available and directions for future research.
   ● O = Outcome desired
   ● T = Time factor
   (A search-string sketch of this framing follows below.)
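The framing maps directly onto search structure: each PICO element becomes a group of OR'd synonyms, and the groups are AND'd together. Here is a minimal sketch of that mapping; the concept labels and terms are illustrative placeholders (the TMJD terms echo the MeSH tip later in the deck), not a validated strategy.

```python
# Minimal sketch: turn a framed PICO question into a boolean search string.
# Terms below are illustrative placeholders, not a validated strategy.

def build_query(concepts: dict) -> str:
    """OR the synonyms within each concept group, then AND the groups together."""
    groups = []
    for terms in concepts.values():
        ored = " OR ".join(f'"{t}"' if " " in t else t for t in terms)
        groups.append(f"({ored})")
    return " AND ".join(groups)

pico = {
    "P (patient)":      ["temporomandibular joint disorders", "TMJD"],
    "I (intervention)": ["occlusal splints", "splint therapy"],
    "O (outcome)":      ["pain measurement", "pain relief"],
}

print(build_query(pico))
# ("temporomandibular joint disorders" OR TMJD) AND
# ("occlusal splints" OR "splint therapy") AND
# ("pain measurement" OR "pain relief")
```

Note the C group is omitted here, matching the slide's point that small research domains may drop the comparison.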
19. R = Relevance
   ● Searches usually have 2-4 topics that are relevant and searchable.
   ● Rank relevant searchable topics by importance.
   ● TIP: Structure the search to place the greatest attention, focus, and richness of term generation on the most important topic.
20. I = Irrelevant
   ● Too broad
   ● Too specific
   ● Unsearchable
   ● NOTE: Irrelevant topics are not irrelevant to the question or clinical decision, and are important to making the final decisions. They are irrelevant to the search process.
21. A = Alternates/Aliases
   ● Term generation process might include:
     ● Alternate terms, spellings (UK), archaic terms
     ● Acronyms & what they stand for
     ● Anatomical area, symptoms, diagnostic criteria
     ● Products, chemicals, microorganisms, registry numbers, etc.
   ● NOTE: After asking the question, this is the most important part of the process.
   ● TIP: Have the team brainstorm terms, then search for more, then have the team review the added terms.
22. R = Review, Revise, Repeat
   ● Did the search retrieve what it should? (True positives)
     ● If not, why not?
   ● Did the search retrieve what it shouldn't? (False positives)
     ● Are there patterns that would allow you to exclude the false positives, without losing the ones you want?
   (See the metrics sketch below.)
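These two review questions correspond to retrieval metrics: sensitivity (recall), and, in place of specificity (whose true-negative count is impractical for database searches), precision. A quick sketch with invented counts:

```python
# Sketch: the two review questions as retrieval metrics (counts are invented).
true_pos  = 42    # relevant articles the search retrieved
false_neg = 3     # relevant articles the search missed
false_pos = 310   # retrieved articles that are not relevant

sensitivity = true_pos / (true_pos + false_neg)   # "did it retrieve what it should?"
precision   = true_pos / (true_pos + false_pos)   # "did it retrieve what it shouldn't?"

print(f"Sensitivity (recall): {sensitivity:.2%}")
print(f"Precision: {precision:.2%}")
```

Systematic review searches deliberately favor sensitivity over precision: a missed relevant study biases the synthesis, while false positives only cost screening time.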
23. MEMORABLE, a Medline Search Strategy Development Tool
24. Sentinel Articles
   ● Number of sentinels desired: 3-5. Can have more or fewer, but this tends to work best. Verify the appropriateness of selected sentinels.
   ● Neither very recent (current year) nor old (before 1985)
     ● Articles old enough to have MeSH assigned, new enough to have complete indexing
     ● On topic, not broader or narrower
     ● Well-indexed with appropriate terms
   ● Representative of citations that would be retrieved by a well-done search
   ● Remember: the purpose is validating the search, not proving you know the best articles on the topic
   ● Each sentinel article must represent ALL desired concepts in the search
     ● Articles selected must meet all inclusion and exclusion criteria.
   (See the set-comparison sketch below.)
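One way to operationalize sentinel validation is a simple set comparison between the sentinel PMIDs and the PMIDs a candidate strategy retrieves. A sketch, with all PMIDs invented for illustration:

```python
# Sketch: validate a candidate search strategy against sentinel articles by PMID.
# All PMIDs here are invented placeholders.

sentinels = {"11111111", "22222222", "33333333", "44444444"}   # 3-5 known-good articles
retrieved = {"11111111", "22222222", "44444444", "55555555", "66666666"}

found  = sentinels & retrieved   # sentinels the strategy retrieved
missed = sentinels - retrieved   # sentinels the strategy failed to retrieve

print(f"Recall on sentinels: {len(found)}/{len(sentinels)}")
for pmid in sorted(missed):
    print(f"Missed sentinel {pmid}: revisit its indexing to see why.")
```

A sentinel the strategy misses is the cue to loop back to Review, Revise, Repeat.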
25. MeSH Tip
   ● Earlier term mappings, prior to assignment of a MeSH term, are often:
     ● presenting symptom or diagnosis
     ● anatomical area
   ● TMJD example:
     ● TMJD = temporomandibular joint disorder
     ● = Temporomandibular joint [anatomical area] + ("myofacial pain" OR "Bone Diseases") [presenting symptom OR diagnosis]
   Image: Frank Gaillard. Normal anatomy of the temporomandibular joint. 14 Jan 2009. http://commons.wikimedia.org/wiki/File:Temporomandibular_joint.png
26. MeSH & Sentinels, Steps 1 & 2
   ● Verify sentinel citations in MEDLINE
   ● Save file with full citations (abstract, MeSH headings, everything)
   ● Make a duplicate file to process
   ● Process citations:
     ● strip out everything except MeSH terms
     ● sort alphabetically
     ● strip irrelevant MeSH terms
     ● dedupe
     ● note frequency of repetition of terms to aid in determining the weight of each term
     ● note distribution and frequency of subheadings
   (A scripted sketch of these steps follows below.)
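The processing steps above are easy to script rather than do by hand. A minimal sketch, assuming the verified sentinel citations were saved from PubMed in MEDLINE text format (where each MeSH heading sits on an "MH  - " line); "sentinels.txt" is a hypothetical filename:

```python
# Sketch of MeSH & Sentinels steps 1-2: keep only the MeSH terms and count
# term and subheading frequencies across the sentinel set.
# Assumes a MEDLINE-format text export from PubMed, where headings look like:
#   MH  - Temporomandibular Joint Disorders/*therapy
from collections import Counter

terms, subheadings = Counter(), Counter()

with open("sentinels.txt", encoding="utf-8") as f:   # hypothetical filename
    for line in f:
        if line.startswith("MH  - "):                # MeSH heading lines only
            heading = line[6:].strip()
            term, _, subs = heading.partition("/")
            terms[term.lstrip("*")] += 1             # "*" marks a major topic
            for sub in subs.split("/"):
                if sub:
                    subheadings[sub.lstrip("*")] += 1

# Frequency of repetition suggests the weight a term should carry in the draft.
for term, n in terms.most_common():
    print(f"{n:3d}  {term}")
print("\nSubheading distribution:")
for sub, n in subheadings.most_common():
    print(f"{n:3d}  {sub}")
```

Counting each heading gives both the deduplicated term list and the repetition frequencies the slide calls for in one pass.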
27. MeSH & Sentinels, Step 3
   ● Analyse MeSH terms:
     ● retain topical terms
     ● retain methodology terms
     ● retain non-MeSH terms such as publication type and registry numbers (but separate from core concept terms)
28. Systematic Review Search Strategy Example
   ● Torabinejad M, Anderson P, Bader J, Brown LJ, Chen LH, Goodacre CJ, Kattadiyil MT, Kutsenko D, Lozada J, Patel R, Petersen F, Puterman I, White SN. Outcomes of root canal treatment and restoration, implant-supported single crowns, fixed partial dentures, and extraction without replacement: a systematic review. J Prosthet Dent. 2007 Oct;98(4):285-311. PMID: 17936128
29. Sources of Search Strategies
   ● Lingo: filters vs. hedges
   ● Search the Methods of existing systematic reviews.
   ● Warning:
     ● Many articles published as systematic reviews may have modified the process.
     ● Many articles published as systematic reviews may not include a replicable search methodology.
     ● Some articles published as systematic reviews may not actually be systematic reviews.
30. Sources of Search Strategies
   ● NIH Consensus Development Conference on Dental Caries:
     ● http://www.lib.umich.edu/health-sciences-libraries/nih-consensus-development-conference-diagnosis-and-management-dental-carie
   ● EBHC Strategies Wiki:
     ● http://ebhcstrategies.wetpaint.com/
31. Data Abstraction / Extraction
   ● "Symbolic expression always arises through abstraction and simplification, and can only reflect a small part of reality."
     ● Cajal, Advice to a Young Investigator, p. 76
32. Cochrane: Data Extraction
33. Evidence Table Example
   ● Levels of evidence
   ● Participant characteristics
   ● Study characteristics
   ● Intervention and outcome measurements
   ● Results
   ● Study limitations
   ● Inclusion/exclusion criteria
   http://www.aota.org/DocumentVault/AJOT/Template.aspx?FT=.pdf
34. Data Abstraction / Extraction Samples
   ● Cochrane:
     ● Forms: http://endoc.cochrane.org/data-extraction-sheets
     ● Elements: http://www.cochrane.org/handbook/table-73a-checklist-items-consider-data-collection-or-data-extraction
     ● Cochrane CFGD November 2004 * (DOC): http://cfgd.cochrane.org/sites/cfgd.cochrane.org/files/uploads/Study%20selection%20and%20%20extraction%20form.doc
     ● Data Extraction Template, 2011 (XLS): http://www.latrobe.edu.au/chcp/assets/downloads/DET_2011.xls
     ● Overview: http://www.cochrane-pro-mg.org/Documents/Reviewsheet-1-4-08.pdf
   ● Other:
     ● CDC Data Abstraction Form *: http://www.nccmt.ca/uploads/registry/CDC%20Tool.pdf
     ● Social science example (suicide): http://www.cmaj.ca/content/suppl/2009/01/29/180.3.291.DC2/ssri-barbui-3-at.pdf
35. Clearly Stating the Evidence (PRISMA)
   ● NOTE: PRISMA-P coming later this year
   http://www.prisma-statement.org/2.1.4%20-%20PRISMA%20Flow%202009%20Diagram.pdf
36. Guidelines for Assessing/Reporting Results
   ● CONSORT (Consolidated Standards of Reporting Trials) 2001, 2010
   ● ASSERT (A Standard for the Scientific & Ethical Review of Trials) 2001
     ○ EQUATOR (Enhancing the QUAlity & Transparency of health Research) 2008
     ○ SPIRIT (Standard Protocol Items for Randomized Trials) 2008
   ● QUOROM (Quality of Reporting of Meta-analyses) 1999, 2009
   ● MOOSE (Meta-analysis of Observational Studies in Epidemiology) 2000
   ● STROBE (Strengthening the Reporting of Observational Studies in Epidemiology) 2007, STROBE-ME 2011
   ● Others: CENT, COREQ, GNOSIS, ORION, RECORD, REFLECT, REMARK, REMINDE, SQUIRE, STARD, STREGA, TREND, WIDER, ...
37. Evolution of Guidelines for Assessing/Reporting Results
   ● CONSORT 2001 -> 2010 -> PRISMA
   ● QUOROM 1999 -> 2009 -> PRISMA
   ● ASSERT 2001 -> SPIRIT 2008 -> EQUATOR 2008
   ● STROBE 2007 -> STROBE-ME 2011
   ● More on fragmentation & history: Jan P. Vandenbroucke. Journal of Clinical Epidemiology 62 (2009):594-596. http://www.veteditors.org/Publication%20Guidelines/JOURNAL%20VARIATION%20on%20Guidelines%202009.pdf
   ● CURRENT LIST: http://www.equator-network.org/resource-centre/library-of-health-research-reporting/
38. Standard Team/Process vs. Reality: Case Studies
   ● Elements:
     ● Team: size, configuration
     ● Domain size
     ● Librarian role, #
     ● Clerical support
     ● Software / tech support
     ● Handsearching / experts
     ● Outcomes
   ● Projects & examples:
     ● NIH Consensus Development Conference on Dental Caries
     ● Torabinejad: Root Canal Treatment
     ● Cochrane: Herpes Simplex (Oral Health, Skin)
     ● Delta Dental
     ● Diabetes
     ● Articaine vs. Lidocaine
     ● Healthcare Social Media
   ● Most common?
     ● Underestimating time/labor requirements
     ● First-time findings: "insufficient evidence"
39.
   ● Slides at: http://slideshare.net/umhealthscienceslibraries/
   ● Contact: pfa@umich.edu