Where Are We in Mixed Methods Evaluation?

Washington Evaluators Brown Bag
by Donna Mertens and Mika Yoder Yamashita
June 11, 2012

This presentation provides an introduction to mixed methods evaluation. The presenters survey current discussions in the field, covering definitions of mixed methods evaluation and the different levels at which methods can be mixed, including philosophical assumptions, designs, and data collection techniques. They review the American Evaluation Association (AEA) annual conference presentations hosted by the Mixed Methods Evaluation Topical Interest Group (TIG) over the past three years to identify the areas most discussed among AEA members, and then suggest possible areas for further inquiry.

Donna Mertens is a past AEA president and an editor of the Journal of Mixed Methods Research (Sage). She has authored a number of articles and books on mixed methods and evaluation, including Program Evaluation Theory and Practice: A Comprehensive Guide (with Amy Wilson, New York: Guilford, 2012), Research and Evaluation in Education and Psychology: Integrating Diversity with Qualitative, Quantitative, and Mixed Methods (3rd ed., Sage, 2010), and Transformative Research and Evaluation (Guilford, 2009). She has also consulted on evaluation projects internationally. She is a professor at Gallaudet University, where she teaches research and evaluation methods, and serves as co-chair of the Mixed Methods Evaluation TIG.

Mika Yoder Yamashita is a research associate at the Center for Education Policy and Practice, FHI360. As a contractor on government-funded projects, her work includes program evaluation of college access programs for low-income students and of institutional capacity-building programs at community colleges. She serves as program chair of the Mixed Methods Evaluation TIG.



Presentation Transcript

    • Donna Mertens, Ph.D., Gallaudet University and an Editor of the Journal of Mixed Methods Research, and Mika Yoder Yamashita, Ph.D., FHI360. June 11, 2012, George Washington University
    • Get a quick overview of mixed methods evaluation
    • Hear from evaluators who have been working with mixed methods evaluation
    • Greene, Caracelli & Graham (1989). “Toward a conceptual framework for mixed-method evaluation designs.” EEPA
    • Greene & Caracelli (1997). “NDE: Advances in mixed-method evaluation.”
    • NSF (1997). “User-friendly handbook for mixed method evaluations”
    • Tashakkori & Teddlie (2003). “Handbook of mixed methods in social and behavioral research”
    • Journal of Mixed Methods Research launched (2007)
    • Tashakkori & Teddlie (2010). “Handbook of mixed methods in social and behavioral research” (2nd edition)
    • NIH (2011). “Best practices for mixed methods research in the health sciences.”
    • More books, articles…
    • InterAction’s “Impact evaluation guidelines” include mixed methods research.
    • Areas of ongoing discussion listed by Tashakkori & Teddlie. In the 2003 Handbook:
      1) Nomenclature and definitions used in mixed methods research
      2) Utility of mixed methods research
      3) The paradigmatic foundations for mixed methods research
      4) Design issues in mixed methods research
      5) Issues in drawing inferences in mixed methods research
      6) Logistics of conducting mixed methods research
      Added in the 2010 Handbook:
      7) The conceptual/methodological/methods interface in mixed methods research
      8) Research questions and research problems in mixed methods research
      9) Analysis issues in mixed methods research
      10) Cross-disciplinary and cross-cultural issues in mixed methods research
    • The TIG petition was submitted in 2010: “TIG will examine the use of mixed methods in evaluation through reflective analysis of the philosophy, theory and methodology that is developing in the field of mixed methods.” The “TIG would contribute to the improvement of evaluation practices, method and use” because “it (TIG) will focus on the contributions that a better understanding of mixed methods has to offer.”
    • Johnson & Onwuegbuzie (2007), “Toward a Definition of Mixed Methods Research”: 36 researchers provided 19 definitions. Some differences across researchers are:
      • Should two methods be used in one study, one question, or related studies?
      • What is mixed? (e.g., quantitative methods and qualitative methods, quantitative methods and quantitative methods, or paradigmatic standpoints?)
      • When should “mixing” occur? What types of mixing are included? (e.g., is converting quantitative data to qualitative description considered to be “mixing”?)
      “Mixed methods research is the class of research where the researcher mixes or combines quantitative and qualitative research techniques, methods, approaches, concepts or language into a single study or set of related studies” (Johnson & Onwuegbuzie, 2007, p. 120).
      The Mixed Methods Evaluation TIG’s definition: “mixed methods is viewed as the combination of more than one methodological standpoint in the same study”; “mixing can occur at the level of inquiry purpose, philosophical assumptions, methodological design, and/or specific data gathering technique” (Petition, 2010).
    • Sequential Design: qualitative followed by quantitative
    • Sequential Design: quantitative followed by qualitative
    • Concurrent Design: quantitative and qualitative methods used together
    • An example of mixed methods research: Ivankova, Creswell & Stick (2006), a sequential explanatory design.
      Research question: What factors predict students’ persistence in a distance education Ph.D. program, and how do these factors contribute to persistence in the program?
      Quantitative data collection (n=278): cross-sectional web-based survey.
      Quantitative data analysis: factor analysis, frequencies, discriminant function analysis. Four groups of students: a) beginning of the program, b) middle of the program, c) completed Ph.D., d) dropped out.
      “Mixing”: purposefully selecting one participant from each group (n=4) based on typical response and the maximal variation principle.
      QUALITATIVE data collection: in-depth phone interviews with the 4 participants; e-mail follow-up; documents.
      QUALITATIVE data analysis: coding, thematic analysis; within-case and cross-case theme development; cross-thematic analysis.
      “Mixing”: integration of the quantitative and qualitative results; interpretation and explanation of the quantitative and qualitative results.
    • Stances in the paradigm debate (anti-paradigm war): pragmatic, constructivism, dialectical, postpositivism, transformative
    • Mertens & Wilson (2012)
    • Paradigms and evaluation branches (Mertens & Wilson, 2012):
      Post-positivist paradigm | Methods branch
      Constructivist paradigm | Values branch
      Transformative paradigm | Social Justice branch
      Pragmatic paradigm | Use branch
    • [Diagram: a four-stage mixed methods design. Stage 1 (Qual): assemble team; read documents; engage in dialogues; identify contextual factors. Stage 2 (Concurrent): pilot/preliminary studies on youth, gender, disability, tribe; observations, interviews, surveys; demographic information; incidence data. Stage 3 (Sequential): intervention; pretest on knowledge, attitude, behavior; post-tests (quant, qual). Stage 4 (Concurrent): process evaluation; behavior and policy change; transfer to other contexts.]
    • 1) Validity within the quantitative and qualitative methods.
      2) Validity issues that derive from using two methods, for example, “Is this design and sampling adequate for answering this question?”
      3) Validity of inference.
    • How well can a specific evaluation or research design meet the purpose of mixing (purposes from Greene, Caracelli & Graham, 1989)?
      a) Triangulation
      b) Complementarity
      c) Development
      d) Initiation
      e) Expansion
      Onwuegbuzie & Johnson (2006) introduced the following mixed-methods-specific validity (“legitimation”) criteria:
      1) Sample integration legitimation
      2) Inside-outside legitimation
      3) Weakness minimizing legitimation
      4) Sequential legitimation
      5) Conversion legitimation
      6) Paradigmatic mixing legitimation
      7) Commensurability legitimation
      8) Multiple validities
      9) Political legitimation
    • Presentations at the AEA conferences, by main focus:

      Main focus of presentations | 2010 | 2011 | 2012
      Evaluation findings (reflection on methods, e.g. data collection, data management, analysis, or inference process) | 14 (1) | 16 (5) | 9 (3)
      Logistics | 2 (1: how to involve stakeholders; 1: how team members worked) | 2 (1: how a client’s request changed evaluation design and process; 1: how team members worked) | 0
      Quality of mixed methods evaluation | 1 (comparison of designs) | 3 (2: quality in relation to relevance to stakeholders; 1: quality criteria) | 2 (1: quality in relation to drawing evaluative conclusions; 1: quality criteria)
      Discussion covering the relationship between paradigm, methodology, or methods | 0 | 2 | 2
      Other | 1 | 0 | 0
      Total | 18 | 23 | 13
    • Questions and Discussions
    • Creswell, J. W., Klassen, A. C., Plano Clark, V. L., & Smith, K. C., for the Office of Behavioral and Social Sciences Research (2011, August). Best practices for mixed methods research in the health sciences. National Institutes of Health. http://obssr.od.nih.gov/mixed_methods_research
      Creswell, J. W., Plano Clark, V. L., Gutmann, M. L., & Hanson, W. E. (2003). Advanced mixed methods research designs. In A. Tashakkori & C. Teddlie (Eds.), Handbook of mixed methods in social and behavioral research (pp. 209-240). Thousand Oaks, CA: Sage.
      Greene, J. C. (2006). Toward a methodology of mixed methods social inquiry. Research in the Schools, 13(1), 93-98. http://www.msera.org/Rits_131/Greene_131.pdf
      Greene, J. C., & Caracelli, V. J. (1997). Defining and describing the paradigm issue in mixed-method evaluation. New Directions for Evaluation, 74, 5-17.
      Greene, J. C., Caracelli, V. J., & Graham, W. D. (1989). Toward a conceptual framework for mixed-method evaluation designs. Educational Evaluation and Policy Analysis, 11(3), 255-274. http://xa.yimg.com/kq/groups/18751725/9997427/name/Toward+a+Conceptual+Framework+for+Mixed-Method+Evaluation+Designs_Greene_1989.pdf (This article discusses the categorization of mixed methods evaluation by focusing on reasons for using quantitative and qualitative methods in one evaluation study.)
      Johnson, R. B., Onwuegbuzie, A. J., & Turner, L. A. (2007). Toward a definition of mixed methods research. Journal of Mixed Methods Research, 1(1), 112-133. http://drupal.coe.unt.edu/sites/default/files/24/59/Johnson,%20Burke%20Mixed%20Methods%20Research.pdf
      Mertens, D. M., & Wilson, A. T. (2012). Program evaluation theory and practice: A comprehensive guide. New York, NY: Guilford Press.
      Ivankova, N. V., Creswell, J. W., & Stick, S. L. (2006). Designing mixed-methods sequential explanatory design: From theory to practice. Field Methods, 18(3), 3-20. http://wtgrantmixedmethods.com/pdf_files/Ivankova%20etal_2006_mixed%20methods%20sequential%20design.pdf
    • National Science Foundation (1997). User-friendly handbook for mixed method evaluations. NSF Report No. nsf97153. http://www.nsf.gov/pubs/1997/nsf97153/start.htm
      Onwuegbuzie, A. J., & Collins, K. (2007). A typology of mixed methods sampling designs in social science research. The Qualitative Report, 12(2), 281-316. http://wtgrantmixedmethods.com/pdf_files/Onwuegbuzie_Collins_2007_Typology%20of%20MM%20Sampling%20Designs.pdf
      Onwuegbuzie, A. J., & Johnson, R. B. (2006). The validity issue in mixed research. Research in the Schools, 13(1), 48-63. http://carbon.videolectures.net/v005/e1/4gi2nosqk7a4u3rhmb6f4yl2huqff7a5.pdf
      Tashakkori, A., & Teddlie, C. (2003). Handbook of mixed methods in social and behavioral research. Thousand Oaks, CA: Sage.
      Tashakkori, A., & Teddlie, C. (2010). Handbook of mixed methods in social and behavioral research (2nd ed.). Thousand Oaks, CA: Sage.
      Teddlie, C., & Tashakkori, A. (2006). A general typology of research designs featuring mixed methods. Research in the Schools, 13(1), 12-28. http://www.msera.org/Rits_131/Teddlie_Tashakkori_131.pdf
    • American Evaluation Association (2011). Public statement on cultural competence in evaluation. AEA.
      Bledsoe, K. L., & Graham, J. A. (2005). Using multiple evaluation approaches in program evaluation. American Journal of Evaluation, 26, 302-319.
      Bledsoe, K. L., & Hopson, R. H. (2009). Conducting ethical research in underserved communities. In D. M. Mertens & P. Ginsberg (Eds.), Handbook of ethics for research in the social sciences. Thousand Oaks, CA: Sage.
      Greene, J. C. (2006). Toward a methodology of mixed methods social inquiry. Research in the Schools, Special Issue: New Directions in Mixed Methods Research, 13(1), 93-99.
      Hopson, R. K., Kirkhart, K., & Bledsoe, K. L. (2012). Decolonizing evaluation in a developing world: Implications and cautions for equity-focused evaluation (EFE). In UNICEF’s How to design and manage equity-focused evaluations. New York: UNICEF.
      Hood, S., Hopson, R. H., & Frierson, H. T. (Eds.) (2005). The role of culture and cultural context: A mandate for inclusion, the discovery of truth and understanding in evaluative theory and practice. Greenwich, CT: Information Age Publishing.
      Kirkhart, K. E. (2005). Through a cultural lens: Reflections on validity and theory in evaluation. In S. Hood, R. K. Hopson, & H. T. Frierson (Eds.), The role of culture and cultural context: A mandate for inclusion, the discovery of truth and understanding in evaluative theory and practice. Greenwich, CT: Information Age Publishing.
      Lee, S. A., & Farrell, M. (2006, February). Is cultural competency a backdoor to racism? Anthropology News. American Anthropological Association.
      Lincoln, Y. S. (2009). Ethical practices in qualitative research. In D. M. Mertens & P. Ginsberg (Eds.), Handbook of ethics for research in the social sciences. Thousand Oaks, CA: Sage.
      Mertens, D. M., & Wilson, A. T. (2012). Program evaluation theory and practice: A comprehensive guide. New York, NY: Guilford.
      Mertens, D. M. (2010). Research methods in education & psychology: Integrating diversity with quantitative, qualitative, and mixed methods (3rd ed.). Thousand Oaks, CA: Sage.
    • Mertens, D. M. (2009). Transformative research and evaluation. New York, NY: Guilford Press.
      Mertens, D. M., & Ginsberg, P. (Eds.) (2009). The handbook of social research ethics. Thousand Oaks, CA: Sage.
      Mertens, D. M., Sullivan, M., & Stace, H. (2009). Transformative research with the disability community. In N. Denzin & Y. S. Lincoln (Eds.), Handbook of qualitative research (5th ed.). Thousand Oaks, CA: Sage.
      Mertens, D. M., Bledsoe, K. L., Sullivan, M., & Wilson, A. (2010). Utilization of mixed methods for transformative purposes. In C. Teddlie & A. Tashakkori (Eds.), Handbook of mixed methods research (2nd ed.). Thousand Oaks, CA: Sage.
      Pon, G. (2009). Cultural competency as new racism: An ontology of forgetting. Journal of Progressive Human Services, 20, 59-71.
      Symonette, H. (2009). Cultivating self as responsive instrument: Working the boundaries and borderlands for ethical border crossings. In D. M. Mertens & P. Ginsberg (Eds.), Handbook of ethics for research in the social sciences. Thousand Oaks, CA: Sage.
      Thomas, V. (2009). Critical Race Theory: Ethics and dimensions of diversity in research. In D. M. Mertens & P. Ginsberg (Eds.), Handbook of ethics for research in the social sciences. Thousand Oaks, CA: Sage.