A Workshop on the Basics of Systematic Review & Meta-Analysis
Philip C. Abrami, Robert M. Bernard, C. Anne Wade, Evgueni Borokhovski, Rana Tamim, Gretchen Lowerison & Mike Surkes
Centre for the Study of Learning and Performance and CanKnow, Concordia University
02/25/11 What is a Systematic Review? A review of a  clearly formulated question  that uses  systematic and explicit methods  to identify, select and critically appraise relevant research, and to  collect and analyze data  from the studies that are included in the review. Statistical methods (meta-analysis) may or may not be used  to analyze and summarize the results of the included  studies. Other examples: Narrative review, qualitative review, vote count, meta-synthesis.
What is Meta-Analysis?
Meta-analysis is a set of quantitative research synthesis techniques and procedures.
Meta-analysis uses effect size as a metric for judging the magnitude of the standardized difference between a treatment and a control condition.
Purpose: Explaining Variability in Effect Size
[Diagram: overlapping circles for Effect Sizes and Study Features; their shared variability is the basis of prediction, with unique variability remaining in each.]
10 Steps in Planning and Conducting a Systematic Review/Meta-Analysis
1. Determine the research question
2. Develop terms and definitions related to the question
3. Develop a search strategy for identification of relevant studies
4. Establish criteria for inclusion and exclusion of studies
5. Select studies based on abstract review (agreement)
6. Select studies based on full-text review (agreement)
7. Extract effect sizes (agreement)
8. Develop codebook of study features
9. Code studies (agreement)
10. Conduct statistical analysis and interpretation
10 Steps in a Meta-Analysis
1. Determine the research question
The "big question" that guides the research. It usually involves asking about the difference between two conditions (i.e., usually treatment and control) or the relationship between two measures.
Questions the Researcher Should Ask
Does the question have theoretical or practical relevance (i.e., aids in practice and/or policy-making decisions)?
Is the literature of a type that can answer the question?
Is there a sufficient quantitative research literature?
Do the studies lend themselves to meta-analysis?
Is the literature too large given the resources available?
Example: Critical Thinking
Research Question: What instructional interventions, to what extent, and under what particular circumstances, affect the development and effective use of learners' critical thinking skills and dispositions?
10 Steps in a Meta-Analysis
2. Develop terms and definitions related to the question
This helps refine the research question and inform the search strategies.
10 Steps in a Meta-Analysis
3. Develop a search strategy for the identification of relevant studies
This involves the planning/implementation of search and retrieval for primary studies (e.g., electronic databases, branching).
Information Retrieval: A Continuous Process
Preliminary searches: support the beginning steps, including definition of key concepts and the research question; use of standard reference tools and broad searches for review articles and key primary studies.
Main searches: identification of primary studies through searches of online databases, printed indices, the Internet, branching, and hand-searches; the most difficult stage, given a number of challenges.
Final searches: occur towards the end of the review process; refine search terms and update original searches.
Preliminary Searches: Reference Sources
Purpose: To obtain definitions for the terms creativity, critical thinking, decision making, divergent thinking, intelligence, problem solving, reasoning, and thinking.
Sources:
Bailin, S. (1998). Critical thinking: Philosophical issues. [CD-ROM] Education: The complete encyclopedia. Elsevier Science, Ltd.
Barrow, R., & Milburn, G. (1990). A critical dictionary of educational concepts: An appraisal of selected ideas and issues in educational theory and practice (2nd ed.). Hertfordshire, UK: Harvester Wheatsheaf.
Colman (2001). Dictionary of psychology (complete reference to be obtained).
Corsini, R. J. (1999). The dictionary of psychology. Philadelphia, PA: Brunner/Mazel.
Dejnozka, E. L., & Kapel, D. E. (1991). American educators' encyclopedia. Westport, CT: Greenwood Press.
…… (see handout)
Main Searches: Decisions
Selection of primary information retrieval tools:
Scope of search: Which fields should be searched (including all related fields)?
Availability of indexing tools: Which tools do we have access to at our institution? Are there others who can perform searches for us?
Format of indexing tools: What format are they in (e.g., online, print, web-based)?
Date: How far back does the indexing go for each tool?
Language: What is the language of the material that is indexed? How can we locate non-English material?
Unpublished work: How can we access dissertations, reports, and other grey literature?
Examples of Databases
Education: ERIC, British Education Index, Australian Education Index, Chinese ERIC, CBCA Education, Education Index, Education: A SAGE Full-Text Collection
Psychology: PsycINFO, PubMed (MEDLINE), Psychology: A SAGE Full-Text Collection
Sociology: Sociological Abstracts, Contemporary Women's Issues, Sociology: A SAGE Full-Text Collection
Multidisciplinary: EBSCO Academic Search Premier, ProQuest Dissertations and Theses Fulltext, FRANCIS, Social Sciences Index, SCOPUS, Web of Science
Example: Critical Thinking
To date, the following databases have been searched:
AACE Digital Library (now known as EdITLib)
ABI/Inform Business
EBSCO Academic Search Premier
ERIC
EconLit
PAIS International
ProQuest Dissertations and Theses Fulltext
PsycINFO
Social Science Index
Sociological Abstracts
Main Searches: More Decisions
Preparation of search strategies:
What are the key concepts to be searched? How are these represented in each discipline? What are their related terms?
How are these key concepts represented in the controlled vocabulary within each database to be searched? (See handout)
Note: these decisions need to be made for each indexing tool used.
Main Searches: Yet More Decisions
Construction of the search statements:
What terms should be searched as descriptors or as "free text"?
What Boolean operators should be used?
Where should truncation characters be used (e.g., parent* will retrieve parent, parents, parental)?
What limiting features are available to narrow results (e.g., use of Publication Type codes)?
What time period should be searched?
Example: ERIC
Combining keywords/descriptors using Boolean operators. Searches and records below from the ERIC database:
#5  #3 and #4 (1520 records)
#4  DTC = 142 or DTC = 143 or control group* (322893 records)
#3  #1 or #2 (7718 records)
#2  critical thinking in DE,ID (7562 records)
#1  thinking skills in DE and critical thinking (1269 records)
Documenting Your Searches
Example from our codebook: ERIC (Date: September 21, 2003; AW)
Purpose: To retrieve the first set of abstracts to be reviewed by the team according to the current inclusion/exclusion criteria.
Result: Hit rate of 514/1520
Source code: ERIC1
Searches and records below from: The ERIC Database (1966-2003, June)
#5  #3 and #4 (1520 records)
#4  DTC = 142 or DTC = 143 or control group* (322893 records)
#3  #1 or #2 (7718 records)
#2  critical thinking in DE,ID (7562 records)
#1  thinking skills in DE and critical thinking (1269 records)
Next Steps
Repeat these steps for each database to be searched. (See handout)
Main Searches: Yet Still More Decisions
Secondary retrieval strategies:
Locating the grey (unpublished) literature: using the web and Dissertation Abstracts
Branching: scanning the reference sections of review articles
Hand searches: scanning the tables of contents of key journals and conference proceedings
Personal contacts: contacting key researchers in the field
Information Retrieval: Wrap Up
"Shoestring-budget information retrieval is likely to introduce bias, and should be avoided." (IR Policy Brief, 2004)
Importance of the information retrieval process: not a "one-shot" deal; requires expertise in the planning and implementation of searches; library personnel are important members of the team.
Use of bibliographic management software: Reference Manager, EndNote, RefWorks.
Ability to replicate the review: documentation of the entire process, including search strategies used for each database, decisions taken, etc.
10 Steps in a Meta-Analysis
4. Establish criteria for inclusion and exclusion of studies
These are the criteria that guide the search for literature and ultimately determine which studies are in and out of the review.
Inclusion/Exclusion: Questions
What characteristics of studies will be used to determine whether a particular effort was relevant to the research question?
What characteristics of studies will lead to inclusion? Exclusion?
Will relevance decisions be based on a reading of report titles? Abstracts? Full reports?
Who will make the relevance decisions?
How will the reliability of relevance decisions be assessed?
10 Steps in a Meta-Analysis
5. Select studies based on abstract review
This is the initial decision as to which studies will be retrieved as full-text documents.
10 Steps in a Meta-Analysis
6. Select studies based on full-text review
This is the second decision as to which studies will be included in the review.
10 Steps in a Meta-Analysis
7. Extract effect sizes
Effect size extraction involves converting descriptive or other statistical information contained in studies into a standard metric by which studies can be compared.
What is an Effect Size?
A descriptive metric that characterizes the standardized difference (in SD units) between the mean of a control group and the mean of a treatment group (educational intervention).
Can also be calculated from correlational data derived from pre-experimental designs or from repeated measures designs.
Characteristics of Effect Sizes
Can be positive or negative.
Interpreted as a z-score, in SD units, although individual effect sizes are not part of a z-score distribution.
Can be aggregated with other effect sizes and subjected to other statistical procedures such as ANOVA and multiple regression.
Magnitude interpretation: roughly 0.20 is a small effect size, 0.50 is a moderate effect size, and 0.80 or more is a large effect size (Cohen, 1992).
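To make the benchmarks concrete, here is a small Python helper (hypothetical; the function name and the exact band boundaries are our reading of Cohen's anchors, not part of the workshop materials):

```python
def classify_effect_size(es):
    """Label an effect size using Cohen's (1992) conventional anchors.

    The 0.20 / 0.50 / 0.80 values are rough conventions, not strict
    rules; here |es| < 0.5 is treated as the small range and
    |es| < 0.8 as the moderate range.
    """
    es = abs(es)  # sign indicates direction, not magnitude
    if es < 0.5:
        return "small"
    if es < 0.8:
        return "moderate"
    return "large"
```

For instance, `classify_effect_size(0.34)` returns `"small"` under this coding.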
Effect Size Extraction
Effect size extraction is the process of identifying relevant statistical data in a study and calculating an effect size based on those data.
All effect sizes should be extracted by two coders, working independently.
Coders' results should be compared and a measure of inter-coder agreement calculated and recorded.
In cases of disagreement, coders should resolve the discrepancy collaboratively.
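Inter-coder agreement on categorical decisions is often quantified with Cohen's kappa, which corrects raw percent agreement for chance. A minimal sketch (the function name and the example data are illustrative):

```python
from collections import Counter

def cohens_kappa(coder_a, coder_b):
    """Cohen's kappa for two coders' categorical decisions."""
    n = len(coder_a)
    # observed proportion of agreement
    p_observed = sum(a == b for a, b in zip(coder_a, coder_b)) / n
    # chance-expected agreement from each coder's marginal frequencies
    freq_a, freq_b = Counter(coder_a), Counter(coder_b)
    p_expected = sum(freq_a[c] * freq_b.get(c, 0) for c in freq_a) / n ** 2
    return (p_observed - p_expected) / (1 - p_expected)

# Two coders' include/exclude decisions on five abstracts
kappa = cohens_kappa(["in", "in", "out", "in", "out"],
                     ["in", "out", "out", "in", "out"])
```

Here the coders agree on 4 of 5 abstracts (80%), but kappa is lower (about 0.62) because some agreement is expected by chance.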
Example of ES Extraction with Descriptive Statistics
Study reports:
Treatment mean = 42.8, Treatment SD = 8.6, n = 26
Control mean = 32.5, Control SD = 7.4, n = 31
Procedure: Calculate the pooled SD, then calculate d and g.
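The standard formulas for this procedure pool the two SDs, divide the mean difference by the pooled SD to obtain d, and apply a small-sample correction to obtain Hedges' g (Hedges & Olkin, 1985). A sketch in Python (function names are ours), using the numbers above:

```python
import math

def pooled_sd(sd_t, n_t, sd_c, n_c):
    """Pooled standard deviation of treatment and control groups."""
    return math.sqrt(((n_t - 1) * sd_t ** 2 + (n_c - 1) * sd_c ** 2)
                     / (n_t + n_c - 2))

def cohens_d(m_t, m_c, sd_t, n_t, sd_c, n_c):
    """Standardized mean difference (treatment minus control)."""
    return (m_t - m_c) / pooled_sd(sd_t, n_t, sd_c, n_c)

def hedges_g(d, n_t, n_c):
    """Small-sample bias-corrected effect size (Hedges & Olkin, 1985)."""
    df = n_t + n_c - 2
    return d * (1 - 3 / (4 * df - 1))

d = cohens_d(42.8, 32.5, 8.6, 26, 7.4, 31)  # ≈ 1.29
g = hedges_g(d, 26, 31)                     # ≈ 1.27
```

With these sample sizes the correction is small; with very small studies g can be noticeably smaller than d.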
Extracting Effect Sizes in the Absence of Descriptive Statistics
Inferential statistics (t-test, ANOVA, ANCOVA, etc.) when the exact statistics are provided.
Levels of significance, such as p < .05, when the exact statistics are not given (t can be set at the conservative t = 1.96). (Glass, McGaw & Smith, 1981; Hedges, Shymansky & Woodworth, 1989)
Studies not reporting sample sizes for control and experimental groups should be considered for exclusion.
Examples of Alternative Methods of ES Extraction
Study reports t(63) = 2.56, p < .05: convert t directly to an effect size.
Study reports F(1, 63) = 2.56, p < .05: convert F to t (for a one-df F, t = √F) and apply the same conversion.
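The conversions behind these two examples are standard (Glass, McGaw & Smith, 1981). When group sizes are unknown and only the degrees of freedom are reported, a common approximation assumes equal group ns. A sketch (function names are ours):

```python
import math

def d_from_t(t, n1, n2):
    """Effect size from an independent-samples t when group sizes are known."""
    return t * math.sqrt(1 / n1 + 1 / n2)

def d_from_t_df(t, df):
    """Approximation when only the error df is reported (assumes equal ns)."""
    return 2 * t / math.sqrt(df)

def d_from_f(f_value, df_error):
    """One-df F: t = sqrt(F); the sign must be taken from the group means."""
    return d_from_t_df(math.sqrt(f_value), df_error)

d1 = d_from_t_df(2.56, 63)  # t(63) = 2.56     ->  d ≈ 0.65
d2 = d_from_f(2.56, 63)     # F(1, 63) = 2.56  ->  d ≈ 0.40
```

Note the equal-n assumption is conservative only when the groups are in fact close in size; badly unbalanced groups distort the estimate.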
[Figure: Zero effect size (ES = 0.00): fully overlapping control and treatment distributions.]
[Figure: Moderate effect size (ES = 0.40): partially separated control and treatment distributions.]
[Figure: Large effect size (ES = 0.85): largely separated control and treatment distributions.]
Mean and Variability
[Figure: distribution of effect sizes, showing the mean (ES+) and its variability. Results from Bernard, Abrami, Lou, et al. (2004), RER.]
10 Steps in a Meta-Analysis
8. Develop a codebook
Study feature coding involves describing the relevant characteristics for each study (e.g., research methodology, publication source). The codebook details the study feature categories and their levels.
Examining Study Features
Purpose: to attempt to explain variability in effect size.
Any nominal, ordinal, or interval coded study feature can be investigated.
In addition to mean effect size, variability should be investigated.
Study features with small k (few outcomes per level) may be unstable.
Examples of Study Features
Research methodology
Type and nature of measures
Direction of the statistical test
Publication data
Relevant aspects of the treatment
Relevant aspects of the control condition
10 Steps in a Meta-Analysis
9. Code studies for study features
Coding study features is perhaps the most time-consuming and onerous aspect of conducting a meta-analysis. However, it is arguably the most important step because it provides the possibility of explaining variability in effect sizes.
10 Steps in a Meta-Analysis
10. Analysis and interpretation
Analysis involves invoking a range of standard statistical tests to examine average effect sizes, variability, and the relationship between study features and effect size. Interpretation is drawing conclusions from these analyses.
Questions: Statistical Analysis
What techniques will be used to combine results of separate tests?
What techniques will be used to assess and then analyze the variability in findings across studies?
What sensitivity analyses (i.e., tests of the impact of such decisions on the results of the review) will be carried out, and how?
What statistical procedures will be used to test relationships between study features and effect sizes (e.g., meta-regression)?
Homogeneity vs. Heterogeneity of Effect Size
If homogeneity of effect size is established, then the studies in the meta-analysis can be thought of as sharing the same effect size (i.e., the mean).
If homogeneity of effect size is violated (heterogeneity of effect size), then no single effect size is representative of the collection of studies (i.e., the "true" mean effect size remains unknown).
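The usual homogeneity test is Cochran's Q: under a fixed-effect model, each study is weighted by the inverse of its variance, and Q sums the weighted squared deviations from the pooled mean. Q is then compared against a chi-square distribution with k - 1 df; a significant Q indicates heterogeneity. A minimal sketch (function name and data are illustrative):

```python
def fixed_effect_summary(effects, variances):
    """Inverse-variance weighted mean effect and Cochran's Q.

    Q is referred to a chi-square distribution with k - 1 df;
    a significant Q indicates heterogeneity.
    """
    weights = [1 / v for v in variances]
    mean = sum(w * g for w, g in zip(weights, effects)) / sum(weights)
    q = sum(w * (g - mean) ** 2 for w, g in zip(weights, effects))
    return mean, q, len(effects) - 1

# Three studies with equal sampling variances
mean, q, df = fixed_effect_summary([0.2, 0.5, 0.8], [0.04, 0.04, 0.04])
```

With equal variances the weighted mean reduces to the simple mean (0.5 here), and Q grows with the spread of the individual effects.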
Statistics in Comprehensive Meta-Analysis™
Comprehensive Meta-Analysis 2.0 is a trademark of BioStat®.
Interpretation: Moderate ES for all outcomes (g+ = 0.34) in favor of the intervention condition. Homogeneity of ES is violated; the Q-value is significant (i.e., there is too much variability for g+ to represent a true average in the population).
Examining the Study Feature "Type of Research Design"
[Figure: effect sizes by design type relative to the overall effect (g+ = +0.34): pre-post designs, post-only designs, and quasi-experimental designs.]
Tests of Levels of "Type of Research Design"
Interpretation: Small to moderate ESs for all categories in favor of the intervention condition. Homogeneity of ES is violated; the Q-value is significant for all categories (i.e., type of research design does not explain enough variability to reach homogeneity).
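Moderator tests of this kind partition the total Q into within-level and between-level components; a large between-level Q indicates the study feature explains variability, while significant within-level Qs (as here) mean residual heterogeneity remains. A sketch under a fixed-effect model (function names are ours):

```python
def q_and_mean(effects, variances):
    """Cochran's Q and the inverse-variance weighted mean."""
    w = [1 / v for v in variances]
    mean = sum(wi * g for wi, g in zip(w, effects)) / sum(w)
    return sum(wi * (g - mean) ** 2 for wi, g in zip(w, effects)), mean

def q_between(groups):
    """Q_between = Q_total minus the sum of Q_within over the levels.

    groups maps level name -> (effects, variances).
    """
    all_es = [e for es, _ in groups.values() for e in es]
    all_vs = [v for _, vs in groups.values() for v in vs]
    q_total, _ = q_and_mean(all_es, all_vs)
    q_within = sum(q_and_mean(es, vs)[0] for es, vs in groups.values())
    return q_total - q_within

# Two design types whose effects differ sharply between levels
qb = q_between({"pre-post": ([0.2, 0.2], [0.04, 0.04]),
                "post-only": ([0.8, 0.8], [0.04, 0.04])})
```

In this toy example all heterogeneity lies between the two levels (each Q_within is zero), the opposite of the pattern reported on the slide.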
Sensitivity Analysis
Tests the robustness of the findings.
Asks the question: Will these results stand up when potentially distorting or deceptive elements, such as outliers, are removed?
Particularly important to examine the robustness of the effect sizes of study features, as these are usually based on smaller numbers of outcomes.
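One simple sensitivity analysis is leave-one-out: recompute the summary effect with each study omitted in turn and check whether any single study (e.g., an outlier) drives the result. A fixed-effect sketch (the function name is ours):

```python
def leave_one_out(effects, variances):
    """Weighted mean effect recomputed with each study omitted in turn."""
    means = []
    for i in range(len(effects)):
        es = effects[:i] + effects[i + 1:]
        vs = variances[:i] + variances[i + 1:]
        w = [1 / v for v in vs]
        means.append(sum(wi * g for wi, g in zip(w, es)) / sum(w))
    return means

# If dropping one study moves the mean sharply, that study deserves scrutiny
loo = leave_one_out([0.2, 0.5, 0.8], [0.04, 0.04, 0.04])
```

With three equally weighted studies, dropping each in turn yields means of 0.65, 0.5, and 0.35, showing how much each study pulls the summary.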
Selected References
Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., Wallet, P. A., Fiset, M., & Huang, B. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439.
Glass, G. V., McGaw, B., & Smith, M. L. (1981). Meta-analysis in social research. Beverly Hills, CA: Sage.
Hedges, L. V., & Olkin, I. (1985). Statistical methods for meta-analysis. Orlando, FL: Academic Press.
Hedges, L. V., Shymansky, J. A., & Woodworth, G. (1989). A practical guide to modern methods of meta-analysis. [ERIC Document Reproduction Service No. ED 309 952].
