Meta-Analysis



  1. A Workshop on the Basics of Systematic Review & Meta-Analysis
     Philip C. Abrami, Robert M. Bernard, C. Anne Wade, Evgueni Borokhovski, Rana Tamim, Gretchen Lowerison & Mike Surkes
     Centre for the Study of Learning and Performance and CanKnow, Concordia University
  2. What is a Systematic Review?
     - A review of a clearly formulated question that uses systematic and explicit methods to identify, select and critically appraise relevant research, and to collect and analyze data from the studies that are included in the review.
     - Statistical methods (meta-analysis) may or may not be used to analyze and summarize the results of the included studies.
     - Other examples: narrative review, qualitative review, vote count, meta-synthesis.
  3. What is Meta-Analysis?
     - Meta-analysis is a set of quantitative research synthesis techniques and procedures.
     - Meta-analysis uses effect size as a metric for judging the magnitude of the standardized difference between a treatment and a control condition.
  4. (Image-only slide; no transcript text)
  5. Purpose: Explaining Variability in Effect Size
     (Diagram linking Effect Sizes and Study Features, with labels: Shared Variability, Unique Variability, Prediction)
  6. 10 Steps in Planning and Conducting a Systematic Review/Meta-Analysis
     1. Determine the research question
     2. Develop terms and definitions related to the question
     3. Develop a search strategy for identification of relevant studies
     4. Establish criteria for inclusion and exclusion of studies
     5. Select studies based on abstract review (agreement)
     6. Select studies based on full-text review (agreement)
     7. Extract effect sizes (agreement)
     8. Develop a codebook of study features
     9. Code studies (agreement)
     10. Conduct statistical analysis and interpretation
  7. (Image-only slide; no transcript text)
  8. 10 Steps in a Meta-Analysis
     1. Determine the research question
     The “big question” that guides the research. It usually involves asking about the difference between two conditions (i.e., usually treatment and control) or the relationship between two measures.
  9. Questions the Researcher Should Ask
     - Does the question have theoretical or practical relevance (i.e., aids in practice and/or policy-making decisions)?
     - Is the literature of a type that can answer the question?
     - Is there a sufficient quantitative research literature?
     - Do the studies lend themselves to meta-analysis?
     - Is the literature too large given the resources available?
  10. Example: Critical Thinking
     Research Question: What instructional interventions, to what extent, and under what particular circumstances, impact the development and effective use of learners’ critical thinking skills and dispositions?
  11. 10 Steps in a Meta-Analysis
     2. Develop terms and definitions related to the question
     This helps refine the research question and informs the search strategies.
  12. 10 Steps in a Meta-Analysis
     3. Develop a search strategy for the identification of relevant studies
     This involves planning and implementing the search for and retrieval of primary studies (e.g., electronic databases, branching).
  13. Information Retrieval: A Continuous Process
     - Preliminary Searches
       - Support the beginning steps: definition of key concepts and the research question
       - Use of standard reference tools and broad searches for review articles and key primary studies
     - Main Searches
       - Identification of primary studies through searches of online databases, printed indices, the Internet, branching, and hand-searches
       - Most difficult stage, given a number of challenges
     - Final Searches
       - Occur towards the end of the review process
       - Refine search terms and update the original searches
  14. Preliminary Searches: Reference Sources
     Purpose: To obtain definitions for the terms: creativity, critical thinking, decision making, divergent thinking, intelligence, problem solving, reasoning, thinking.
     Sources:
     - Bailin, S. (1998). Critical Thinking: Philosophical Issues. [CD-ROM]. Education: The Complete Encyclopedia. Elsevier Science, Ltd.
     - Barrow, R., & Milburn, G. (1990). A critical dictionary of educational concepts: An appraisal of selected ideas and issues in educational theory and practice (2nd ed.). Hertfordshire, UK: Harvester Wheatsheaf.
     - Colman (2001). Dictionary of Psychology (complete reference to be obtained).
     - Corsini, R. J. (1999). The dictionary of psychology. Philadelphia, PA: Brunner/Mazel.
     - Dejnoka, E. L., & Kapel, D. E. (1991). American educator’s encyclopedia. Westport, CT: Greenwood Press.
     ... (see handout)
  15. Main Searches: Decisions
     Selection of Primary Information Retrieval Tools
     - Scope of search: Which fields should be searched (including all related fields)?
     - Availability of indexing tools: Which tools do we have access to at our institution? Are there others who can perform searches for us?
     - Format of indexing tools: What format are they in (e.g., online, print, web-based)?
     - Date: How far back does the indexing go for each tool?
     - Language: What is the language of the material that is indexed? How can we locate non-English material?
     - Unpublished work: How can we access dissertations, reports, and other grey literature?
  16. Examples of Databases
     - Education: ERIC, British Education Index, Australian Education Index, Chinese ERIC, CBCA Education, Education Index, Education: A SAGE Full-Text Collection
     - Psychology: PsycINFO, PubMed (Medline), Psychology: A SAGE Full-Text Collection
     - Sociology: Sociological Abstracts, Contemporary Women’s Issues, Sociology: A SAGE Full-Text Collection
     - Multidisciplinary: EBSCO Academic Search Premier, ProQuest Dissertations and Theses Fulltext, FRANCIS, Social Sciences Index, SCOPUS, Web of Science
  17. Example: Critical Thinking
     To date, the following databases have been searched:
     - AACE Digital Library (now known as EdITLib)
     - ABI/Inform Business
     - EBSCO Academic Search Premier
     - ERIC
     - EconLit
     - PAIS International
     - ProQuest Dissertations and Theses Fulltext
     - PsycINFO
     - Social Science Index
     - Sociological Abstracts
  18. Main Searches: More Decisions
     Preparation of Search Strategies
     - What are the key concepts to be searched?
     - How are these represented in each discipline?
     - What are their related terms?
     - How are these key concepts represented in the controlled vocabulary within each database to be searched? (See handout)
     - Note: these decisions need to be made for each indexing tool used.
  19. Main Searches: Yet More Decisions
     Construction of the Search Statements
     - What terms should be searched as descriptors or as “free text”?
     - What Boolean operators should be used?
     - Where should truncation characters be used? (e.g., parent* will retrieve parent, parents, parental)
     - What limiting features are available to narrow results (e.g., use of Publication Type codes)?
     - What time period should be searched?
  20. Example: ERIC
     Combining keywords/descriptors using Boolean operators.
     Searches and records below from the ERIC database:
     #5  #3 and #4 (1520 records)
     #4  DTC = 142 or DTC = 143 or control group* (322893 records)
     #3  #1 or #2 (7718 records)
     #2  critical thinking in DE,ID (7562 records)
     #1  thinking skills in DE and critical thinking (1269 records)
  21. Documenting Your Searches
     Example from our codebook: ERIC (Date: September 21, 2003; AW)
     Purpose: To retrieve the first set of abstracts to be reviewed by the team according to the current inclusion/exclusion criteria.
     Result: Hit rate of 514/1520
     Source code: ERIC1
     Searches and records below from the ERIC database (1966-2003, June):
     #5  #3 and #4 (1520 records)
     #4  DTC = 142 or DTC = 143 or control group* (322893 records)
     #3  #1 or #2 (7718 records)
     #2  critical thinking in DE,ID (7562 records)
     #1  thinking skills in DE and critical thinking (1269 records)
  22. Next Steps
     Repeat these steps for each database to be searched. (See handout)
  23. Main Searches: Yet Still More Decisions
     Secondary Retrieval Strategies
     - Locating the grey (unpublished) literature: using the web and Dissertation Abstracts
     - Branching: scanning the reference sections of review articles
     - Hand searches: scanning the tables of contents of key journals and conference proceedings
     - Personal contacts: contacting key researchers in the field
  24. Information Retrieval: Wrap-Up
     “Shoestring-budget information retrieval is likely to introduce bias, and should be avoided.” (IR Policy Brief, 2004)
     - Importance of the information retrieval process
       - Not a “one-shot” deal
       - Requires expertise in the planning and implementation of searches
       - Library personnel are important members of the team
     - Use of bibliographic management software
       - Reference Manager, EndNote, RefWorks
     - Ability to replicate the review
       - Documentation of the entire process, including search strategies used for each database, decisions taken, etc.
  25. 10 Steps in a Meta-Analysis
     4. Establish criteria for inclusion and exclusion of studies
     These are the criteria that guide the search for literature and ultimately determine which studies are in and out of the review.
  26. Inclusion/Exclusion: Questions
     - What characteristics of studies will be used to determine whether a particular effort was relevant to the research question?
     - What characteristics of studies will lead to inclusion? Exclusion?
     - Will relevance decisions be based on a reading of report titles? Abstracts? Full reports?
     - Who will make the relevance decisions?
     - How will the reliability of relevance decisions be assessed?
  27. 10 Steps in a Meta-Analysis
     5. Select studies based on abstract review
     This is the initial decision as to which studies will be retrieved as full-text documents.
  28. 10 Steps in a Meta-Analysis
     6. Select studies based on full-text review
     This is the second decision as to which studies will be included in the review.
  29. 10 Steps in a Meta-Analysis
     7. Extract effect sizes
     Effect size extraction involves converting descriptive or other statistical information contained in studies into a standard metric by which studies can be compared.
  30. What is an Effect Size?
     - A descriptive metric that characterizes the standardized difference (in SD units) between the mean of a control group and the mean of a treatment group (educational intervention)
     - Can also be calculated from correlational data derived from pre-experimental designs or from repeated measures designs
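In symbols, and following the sources cited later in the deck (Glass, McGaw & Smith, 1981; Hedges & Olkin, 1985), the standardized mean difference and its small-sample-corrected form are conventionally written as below. This is the standard formulation, not necessarily the exact notation used on the workshop slides.

```latex
% Cohen's d: standardized difference between treatment and control means
d = \frac{\bar{X}_T - \bar{X}_C}{SD_{pooled}},
\qquad
SD_{pooled} = \sqrt{\frac{(n_T - 1)s_T^{2} + (n_C - 1)s_C^{2}}{n_T + n_C - 2}}

% Hedges' g: d with a small-sample bias correction (Hedges & Olkin, 1985)
g = d\left(1 - \frac{3}{4(n_T + n_C) - 9}\right)
```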
  31. Characteristics of Effect Sizes
     - Can be positive or negative
     - Interpreted as a z-score, in SD units, although individual effect sizes are not part of a z-score distribution
     - Can be aggregated with other effect sizes and subjected to other statistical procedures such as ANOVA and multiple regression
     - Magnitude interpretation: ≤ 0.20 is a small effect size, 0.50 is a moderate effect size, and ≥ 0.80 is a large effect size (Cohen, 1992)
  32. Effect Size Extraction
     - Effect size extraction is the process of identifying relevant statistical data in a study and calculating an effect size based on those data
     - All effect sizes should be extracted by two coders, working independently
     - Coders’ results should be compared and a measure of inter-coder agreement calculated and recorded
     - In cases of disagreement, coders should resolve the discrepancy in collaboration
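The slides call for a measure of inter-coder agreement but do not name one. Below is a minimal sketch of two common choices, percent agreement and Cohen's kappa, applied to hypothetical include/exclude decisions; the data and function names are illustrative, not taken from the workshop.

```python
from collections import Counter

def percent_agreement(coder_a, coder_b):
    """Proportion of items on which the two coders made the same decision."""
    return sum(a == b for a, b in zip(coder_a, coder_b)) / len(coder_a)

def cohens_kappa(coder_a, coder_b):
    """Agreement corrected for the agreement expected by chance."""
    n = len(coder_a)
    p_o = percent_agreement(coder_a, coder_b)
    counts_a, counts_b = Counter(coder_a), Counter(coder_b)
    p_e = sum((counts_a[c] / n) * (counts_b[c] / n)
              for c in set(coder_a) | set(coder_b))
    return (p_o - p_e) / (1 - p_e)

# Hypothetical abstract-screening decisions from two independent coders
coder_1 = ["include", "exclude", "include", "include", "exclude"]
coder_2 = ["include", "exclude", "exclude", "include", "exclude"]
print(percent_agreement(coder_1, coder_2))  # 0.80
print(cohens_kappa(coder_1, coder_2))       # ~0.62
```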
  33. (Image-only slide; no transcript text)
  34. (Image-only slide; no transcript text)
  35. Example of ES Extraction with Descriptive Statistics
     Study reports:
     Treatment: mean = 42.8, SD = 8.6, n = 26
     Control: mean = 32.5, SD = 7.4, n = 31
     Procedure: Calculate SD pooled, then calculate d and g.
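A sketch of the calculation the slide asks for, using the reported values; the Hedges' g correction follows Hedges & Olkin (1985), and the printed values are approximate.

```python
from math import sqrt

# Descriptive statistics reported on the slide
m_t, sd_t, n_t = 42.8, 8.6, 26   # treatment group
m_c, sd_c, n_c = 32.5, 7.4, 31   # control group

# Pooled standard deviation, weighting each group's variance by its df
sd_pooled = sqrt(((n_t - 1) * sd_t**2 + (n_c - 1) * sd_c**2) / (n_t + n_c - 2))

# Cohen's d, then Hedges' g with the small-sample bias correction
d = (m_t - m_c) / sd_pooled
g = d * (1 - 3 / (4 * (n_t + n_c) - 9))

print(f"SD_pooled = {sd_pooled:.2f}, d = {d:.2f}, g = {g:.2f}")
# Roughly: SD_pooled ~ 7.97, d ~ 1.29, g ~ 1.27
```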
  36. Extracting Effect Sizes in the Absence of Descriptive Statistics
     - Inferential statistics (t-test, ANOVA, ANCOVA, etc.) when the exact statistics are provided
     - Levels of significance, such as p < .05, when the exact statistics are not given (t can be set at the conservative t = 1.96) (Glass, McGaw & Smith, 1981; Hedges, Shymansky & Woodworth, 1989)
     - Studies not reporting sample sizes for control and experimental groups should be considered for exclusion
  37. Examples of Alternative Methods of ES Extraction
     - Study reports: t(63) = 2.56, p < .05
     - Study reports: F(1, 63) = 2.56, p < .05. Convert F to t and apply the above equation.
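The slide's conversion equations appear only as images in this export. The standard conversions (Glass, McGaw & Smith, 1981) are shown below; the worked number assumes roughly equal group sizes, which the slide does not state.

```latex
% From an independent-groups t-test:
d = t\sqrt{\frac{1}{n_T} + \frac{1}{n_C}}
  \;\approx\; \frac{2t}{\sqrt{df}} \quad \text{(when } n_T \approx n_C\text{)}

% From a one-degree-of-freedom F: take t = \sqrt{F}, with the sign given by
% the direction of the means, then apply the equation above.

% Slide example, t(63) = 2.56, assuming equal n:
d \approx \frac{2(2.56)}{\sqrt{63}} \approx 0.65
```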
  38. Zero Effect Size
     (Figure: control and treatment distributions overlap completely; ES = 0.00)
  39. Moderate Effect Size
     (Figure: control and treatment distributions; ES = 0.40)
  40. Large Effect Size
     (Figure: control and treatment distributions; ES = 0.85)
  41. Mean and Variability
     (Figure: distribution of effect sizes, labelled “Variability” and “ES+”. Note: results from Bernard, Abrami, Lou, et al., 2004, RER)
  42. 10 Steps in a Meta-Analysis
     8. Develop a codebook
     Study feature coding involves describing the relevant characteristics of each study (e.g., research methodology, publication source). The codebook details the study feature categories and their levels.
  43. Examining Study Features
     - Purpose: to attempt to explain variability in effect size
     - Any nominal-, ordinal- or interval-coded study feature can be investigated
     - In addition to the mean effect size, variability should be investigated
     - Study features with small ks (i.e., based on few outcomes) may be unstable
  44. Examples of Study Features
     - Research methodology
     - Type and nature of measures
     - Direction of the statistical test
     - Publication data
     - Relevant aspects of the treatment
     - Relevant aspects of the control condition
  45. 10 Steps in a Meta-Analysis
     9. Code studies for study features
     Coding study features is perhaps the most time-consuming and onerous aspect of conducting a meta-analysis. However, it is arguably the most important step because it makes it possible to explain variability in effect sizes.
  46. 10 Steps in a Meta-Analysis
     10. Analysis and interpretation
     Analysis involves invoking a range of standard statistical tests to examine average effect sizes, variability, and the relationship between study features and effect size. Interpretation is drawing conclusions from these analyses.
  47. Questions: Statistical Analysis
     - What techniques will be used to combine the results of separate tests?
     - What techniques will be used to assess and then analyze the variability in findings across studies?
     - What sensitivity analyses (i.e., tests of the impact of review decisions on the results) will be carried out, and how?
     - What statistical procedures will be used to test relationships between study features and effect sizes (e.g., meta-regression)?
  48. Homogeneity vs. Heterogeneity of Effect Size
     - If homogeneity of effect size is established, then the studies in the meta-analysis can be thought of as sharing the same effect size (i.e., the mean)
     - If homogeneity of effect size is violated (heterogeneity of effect size), then no single effect size is representative of the collection of studies (i.e., the “true” mean effect size remains unknown)
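A minimal sketch of how homogeneity is usually tested under a fixed-effect model (Hedges & Olkin, 1985): compute a weighted mean effect size g+ and the Q statistic, then compare Q to a chi-square distribution with k - 1 degrees of freedom. The study values below are invented for illustration; software such as Comprehensive Meta-Analysis reports the same quantities.

```python
# Hypothetical (g, n_treatment, n_control) triples for k = 4 studies
studies = [(0.45, 30, 28), (0.10, 55, 60), (0.62, 20, 22), (0.25, 40, 41)]

weights, gs = [], []
for g, n_t, n_c in studies:
    # Sampling variance of Hedges' g (Hedges & Olkin, 1985)
    v = (n_t + n_c) / (n_t * n_c) + g**2 / (2 * (n_t + n_c))
    weights.append(1 / v)  # fixed-effect weight = inverse variance
    gs.append(g)

# Weighted mean effect size and the homogeneity statistic Q
g_plus = sum(w * g for w, g in zip(weights, gs)) / sum(weights)
Q = sum(w * (g - g_plus)**2 for w, g in zip(weights, gs))

k = len(studies)
print(f"g+ = {g_plus:.2f}, Q = {Q:.2f} on {k - 1} df")
# A Q larger than the chi-square critical value for k - 1 df means
# homogeneity is violated: the studies do not share one effect size.
```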
  49. (Image-only slide; no transcript text)
  50. Statistics in Comprehensive Meta-Analysis™
     Comprehensive Meta-Analysis 2.0 is a trademark of BioStat®.
     Interpretation: Moderate ES for all outcomes (g+ = 0.34) in favor of the intervention condition. Homogeneity of ES is violated; the Q-value is significant (i.e., there is too much variability for g+ to represent a true average in the population).
  51. Examining the Study Feature “Type of Research Design”
     (Figure: overall effect g+ = +0.34, broken down by Pre-Post, Post-Only, and Quasi-Experimental Designs)
  52. Tests of Levels of “Type of Research Design”
     Interpretation: Small to moderate ESs for all categories in favor of the intervention condition. Homogeneity of ES is violated; the Q-value is significant for all categories (i.e., type of research design does not explain enough variability to reach homogeneity).
  53. Sensitivity Analysis
     - Tests the robustness of the findings
     - Asks the question: Will these results stand up when potentially distorting or deceptive elements, such as outliers, are removed?
     - Particularly important to examine the robustness of the effect sizes of study features, as these are usually based on smaller numbers of outcomes
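One simple way to operationalize this, offered only as a sketch (the workshop does not specify its sensitivity procedure), is a leave-one-out check: drop each outcome in turn, recompute the weighted mean effect size, and see whether the conclusion shifts. The weights and g values below are the illustrative ones from the homogeneity sketch above, rounded.

```python
# Illustrative inverse-variance weights and Hedges' g values (not real data)
weights = [14.1, 28.7, 10.0, 20.1]
gs = [0.45, 0.10, 0.62, 0.25]

def weighted_mean(ws, xs):
    return sum(w * x for w, x in zip(ws, xs)) / sum(ws)

print(f"All outcomes: g+ = {weighted_mean(weights, gs):.2f}")

# Leave-one-out: a result that swings a lot when a single outcome (e.g., an
# outlier) is removed is not robust.
for i in range(len(gs)):
    w_rest = weights[:i] + weights[i + 1:]
    g_rest = gs[:i] + gs[i + 1:]
    print(f"Without outcome {i + 1}: g+ = {weighted_mean(w_rest, g_rest):.2f}")
```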
  54. Selected References
     - Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., Wallet, P. A., Fiset, M., & Huang, B. (2004). How does distance education compare to classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439.
     - Glass, G. V., McGaw, B., & Smith, M. L. (1981). Meta-analysis in social research. Beverly Hills, CA: Sage.
     - Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. Orlando, FL: Academic Press.
     - Hedges, L. V., Shymansky, J. A., & Woodworth, G. (1989). A practical guide to modern methods of meta-analysis. [ERIC Document Reproduction Service No. ED 309 952].