Qualitative Methods Course: Moving from Afterthought to Forethought

Presented during a February 23, 2018, webinar.

  1. Qualitative Methods Course: Moving from Afterthought to Forethought
     Phyllis Dako-Gyeke, PhD, School of Public Health, University of Ghana
     Pilar Torres, MA, National Institute of Public Health, Mexico
     Jessica Fehringer, PhD, MHS, MEASURE Evaluation, University of North Carolina, USA
     Carolina Mejia, PhD, MPH, MEASURE Evaluation, University of North Carolina, USA
     February 22, 2018
  2. Acknowledgments
     Coauthors: Liz Archer, Sunil George, Hemali Kulatilaka, Liz Millar, Emily Bobrow
     Special thanks go to Jen Curran, who assisted in initial course development, and to Daijah Charnelle Street and Daniel Gluck for assisting with this webinar.
  3. Session Objectives
     1. Provide an overview of a short course on qualitative evaluations and share innovative content examples
     2. Describe key challenges encountered in developing the course, with examples from the case studies and group activities
     3. Generate a discussion with the audience on how evaluators can improve the teaching of qualitative courses
  4. Innovative Content: Qualitative Methods Course for Rigorous Evaluation
  5. Rigorous Evaluations: Definition
     • Follow a clearly specified protocol
     • Adhere to recognized scientific standards
     • Include formative evaluations, process evaluations, outcome evaluations, and impact evaluations
  6. Qualitative Evaluations
     • Fulfill an important role in the rigorous evaluation of programs
     • May be used to complement quantitative data or to answer a question not accessible quantitatively—the “why” behind program successes or challenges
     • Illuminate the uniquely human side of health programming and bring to light important contextual factors
  7. Rationale
     The need for a course emerged from:
     • Feedback from MEASURE Evaluation’s impact evaluation course
     • MEASURE Evaluation’s field experience and familiarity with the literature
       o Both showed limited focus on the quality of qualitative methods: an “afterthought” or “add-on”
     • A demand analysis by the Global Evaluation and Monitoring Network for Health (GEMNet-Health)
  8. About the Course
     • Aim: enhance participants’ capacity to conceptualize, design, develop, govern, and manage qualitative methods in evaluation and to use the information generated for improved public health practice and service delivery
     • The course contextualizes qualitative methods within rigorous evaluation, rather than offering the basics of a qualitative approach.
  9. Audience and Prerequisites
     • Designed for participants who have basic knowledge of program evaluation and qualitative methods
     • Intended audience: monitoring and evaluation professionals in the health and development fields
     • Prior experience with qualitative methods and public health program evaluation is strongly encouraged.
  10. Course Competencies: Categories
      1. Concepts, approaches, and purposes of qualitative methods in evaluation
      2. Creating and conceptualizing evaluation questions
      3. Troubleshooting selected qualitative methods for evaluation
      4. Choosing an appropriate qualitative method
      5. Developing data collection tools
      6. Qualitative data analysis techniques
      7. Fieldwork considerations
      8. Presentation and dissemination of data
      9. Quality standards for qualitative inquiry/trustworthiness
      10. Ethical principles for qualitative evaluation
  11. Course Content
      • Eleven sessions covering the key aspects of rigorous qualitative evaluation
      • Original course is 40 hours: seven days of in-person instruction, including time for practical application
        o One day to be added based on pilot feedback, bringing the total to about 56 hours
      • Content tailored to address issues faced by evaluators in low- and middle-income countries
  12. Sessions (1)
      1. Introduction to Qualitative Methods in Evaluation: Discussion and Use of Paradigms in Study Design and the Emergent Nature of Qualitative Evaluation
      2. Creating and Conceptualizing Qualitative Evaluation Questions
      3. Troubleshooting in Select Qualitative Methods
      4. Developing Data Collection Tools
      5. Sampling Strategies and Saturation
      6. Qualitative Data Analysis: Techniques and Planning
  13. Sessions (2)
      7. Qualitative Data Analysis: Hands-on
      8. Quality Evaluation Standards for Qualitative Inquiry
      9. Developing a Fieldwork Plan for Qualitative Evaluation
      10. Data Presentation and Dissemination
      11. Key Ethical Principles and Gender Integration in Qualitative Evaluation
      Note: Gender is also integrated throughout the course.
  14. Teaching Methods
      • Course delivery is based on adult learning principles.
      • Each session includes varied teaching approaches for its activities.
      • Teaching methods include facilitated discussion, presentations, storytelling, groupwork, debates, thematic analysis, and case studies from the relevant region of the world (based on workshop location).
  15. Course Activities
      • Debates (paradigm debate)
      • Case study used across all sessions
      • Trustworthiness
      • Audience matters (role play as presenter for different audiences)
      • Groupwork on preselected projects (development of a short protocol)
  16. Activity Examples (1): The Third Wave
      The four major paradigms: Positivist; Constructivist/Interpretivist; Critical/Emancipatory; Pragmatist
  17. Activity Examples (2): Group Activity
      • Split into four groups, each representing one of the four major paradigms
      • Design an evaluation project around the topic (20 minutes)
        o Develop a particular evaluation question and expand on the context
        o Develop your group’s evaluation concept
      • Present your plan to the class (5 minutes each)
      • Class discussion (a combined 20 minutes after all groups present)
  18. Activity Examples (3): Putting Quality First
      • Split into three groups
      • Use the template provided and indicate (30 minutes):
        o How you would address the different components of trustworthiness
        o Practical implementation (e.g., how would you conduct member checks?)
      • Present your plan to the class (10 minutes each)
  19. Activity Examples (4): Trustworthiness Template
      Dependability and Confirmability
      • Aspects addressed: evaluation process; methodology; analysis
      • Application (real-world operationalization):
        o Audit trail: storing and cataloging raw data to be useful in the future
        o Careful documentation of the analytic and interpretation process, including code/theme definitions
        o Keeping “field diaries” to note the theoretical or philosophical approach of evaluators, which may affect the evaluation
        o Piloting and refining data collection tools so they are appropriate to the study population
      Credibility
      • Aspects addressed: study design; analysis; confidence in the study outcomes; value of the findings
      • Application (real-world operationalization):
        o Appropriate selection of interviewers (e.g., female interviewers for women’s focus group discussions [FGDs])
        o Field notes (e.g., was anyone else present during interviews/FGDs?)
        o Consistency between the data presented and the findings of the study
        o Working to establish intercoder reliability during analysis (a simple illustration follows this slide)
        o Having participants provide feedback on preliminary findings (e.g., bringing findings to women’s and men’s community group meetings)
      Transferability
      • Aspects addressed: sampling; context; methodology
      • Application (real-world operationalization):
        o Using maximum variation sampling to capture different tribal and religious backgrounds in communities
        o Culturally appropriate approaches to recruiting participants
        o Data analysis that captures varying perspectives among the sample population
        o Using illustrative quotes in reports and presentations to capture participant voices and illustrate themes
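      Note: The intercoder reliability check listed under credibility can be quantified with an agreement statistic such as Cohen’s kappa. The Python sketch below is illustrative only and is not part of the course materials; the code labels and coder data are hypothetical.

          from collections import Counter

          def cohen_kappa(coder_a, coder_b):
              """Cohen's kappa for two coders who each assign one code per excerpt."""
              n = len(coder_a)
              # Observed agreement: fraction of excerpts where the two codes match
              p_o = sum(a == b for a, b in zip(coder_a, coder_b)) / n
              # Chance agreement, from each coder's marginal code frequencies
              freq_a, freq_b = Counter(coder_a), Counter(coder_b)
              p_e = sum((freq_a[c] / n) * (freq_b[c] / n) for c in set(freq_a) | set(freq_b))
              return (p_o - p_e) / (1 - p_e)

          # Hypothetical codes assigned to the same eight interview excerpts
          coder_a = ["stigma", "access", "stigma", "cost", "access", "stigma", "cost", "access"]
          coder_b = ["stigma", "access", "access", "cost", "access", "stigma", "cost", "stigma"]
          print(f"Cohen's kappa: {cohen_kappa(coder_a, coder_b):.2f}")  # ~0.62

      A kappa close to 1 indicates agreement well beyond chance; a low value is a signal to refine code definitions with the coding team before full analysis proceeds.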
  20. Activity Examples (5): Audience Matters
      Role-play audiences: the local community; Cambridge University; the World Health Organization
  21. Evaluation Methods (1): Measuring Success
      Student evaluation
      • Pretest and posttest covering all 11 sessions
      • Assessment of the final group project
      Course evaluation
      • Daily participant evaluation form for facilitators to review, covering the following:
        o Was the content clear?
        o Were the facilitators prepared and organized in conducting the session?
        o Overall impression of the day (on a scale)
  22. Evaluation Methods (2): Measuring Success
      Final course evaluation, stressing the following:
      • Overall impressions
      • Comments on specific module presentations
      • Group comments and ranking
      • What worked best and what did not
      • Suggestions for improvement (general and specific)
  23. Learning as Evaluators and Trainers: The Development of a Short Course in Intermediate Qualitative Methods in Evaluation
  24. Curriculum Development
      • Formed a curriculum advisory committee (CAC) comprising experts in the qualitative evaluation of health, nominated by GEMNet-Health institutions
      • CAC member institutions:
        o Public Health Foundation of India
        o University of Pretoria, South Africa
        o University of Ghana, Ghana
        o National Institute of Public Health, Mexico
        o MEASURE Evaluation, a USAID-funded project at the University of North Carolina, United States
  25. Curriculum Development: Competencies Example
      Competency: Discuss major concepts, approaches, and types of qualitative methods in evaluation, including the purpose of using qualitative methods in evaluation, as well as the use of mixed methods.
      Learning objectives:
      • LO1: Understand and compare the four major paradigms of evaluation
      • LO2: Compare and contrast the use of qualitative methods for evaluation with other approaches
      • LO3: Establish the appropriateness of the use of mixed methods of evaluation
  26. Curriculum Development: Review and Piloting
      • Training of trainers and curriculum review meeting in February 2017, with GEMNet-Health faculty
      • First full pilot workshop in October 2017 in Ghana, with 28 participants from 10 countries
  27. Participant Selection
      • Mix of locations, to offer opportunities broadly
      • Prioritizing academic applicants who can pass knowledge on to others
      • Funding
  28. Practical Component
      • Specific program evaluation proposals were submitted; five were selected.
      • Each group works on one real evaluation, developing a protocol over the next six days.
        o Groupwork takes place in the last session of each day.
        o Each day, groups answer questions relevant to the topics presented that day.
      • Protocols are presented on Day 7 for feedback.
  29. Practical Component
  30. Case Study
      • Originally used multiple case studies for session activities, for topic and contextual variety
      • TOT*/review meeting feedback asked for one case study throughout
      • Developed a gender-based violence evaluation case study set in Tanzania
      *TOT—training of trainers
  31. Challenges (1): Balance
      • Making the course affordable for participants in developing countries, while also providing high-quality teaching and accommodation
        o Limiting length vs. what can be covered, and in what depth, in 7 days
        o Limiting hotel costs vs. comfort
      • Balancing theory and practical instruction
  32. Challenges (2)
      • Teaching participants with a variety of qualitative skills and experiences
      • Groupwork with participants of different skill levels, cultures, and varying personal investment (e.g., if the program under evaluation was submitted by you)
  33. Challenges (3)
      Software! Software! Software!
  34. Challenges (4)
      • Integrating gender throughout and having it “stick”
        o Rather than having a stand-alone session, we tried to integrate gender throughout
        o At the end, retention was minimal
        o A stand-alone session is being added in the revision
  35. Pilot Evaluation Results (1)
      Strengths: content and facilitation
      • Participants appreciated the single case study used throughout
      • Participants at all levels left with something
  36. Pilot Evaluation Results (2)
      Needs improvement:
      • Timeline (too rushed)
      • Hotel/food
      Want more:
      • Data analysis (software!)
      • Ongoing mentorship
  37. Next Steps
      • Small revisions based on participant feedback
      • Complete the facilitator’s notes so that external trainers who want to teach the course can do so
      • Post course content online in the coming months to make it available to a wide audience
        o For use in teaching; not designed as a self-taught course
      • Another workshop later this year in Africa
  38. Any Questions/Input?
      For additional information, contact:
      Jessica Fehringer, PhD, MPH: fehringe@email.unc.edu
      Carolina Mejia, PhD, MPH: cmejia@unc.edu
      Hemali Kulatilaka: hkulatil@email.unc.edu
  39. This presentation was produced with the support of the United States Agency for International Development (USAID) under the terms of MEASURE Evaluation cooperative agreement AID-OAA-L-14-00004. MEASURE Evaluation is implemented by the Carolina Population Center, University of North Carolina at Chapel Hill, in partnership with ICF International; John Snow, Inc.; Management Sciences for Health; Palladium; and Tulane University. Views expressed are not necessarily those of USAID or the United States government. www.measureevaluation.org
