
Measuring Impact Qualitatively


Webinar presentation by Susan Pietrzyk. Access the webinar recording at http://www.measureevaluation.org/resources/webinars/measuring-impact-qualitatively


  1. 1. Susan Pietrzyk, PhD MEASURE Evaluation ICF International Email: susan.pietrzyk@icfi.com Twitter: @susanpietrzyk October 29, 2015 Measuring Impact Qualitatively
  2. 2. Aim of the webinar To elicit discussion and share insights regarding evaluative efforts to understand and measure impact and the role of qualitative methods in these efforts
  3. 3. Common ground Impact evaluations are trendy Qualitative work is important to how impact is examined and measured USAID evaluations have long been qualitatively oriented Manuals and toolkits about qualitative research are plentiful The interest here is in reflection on the trend
  4. 4. About the team ICF International Research and Evaluation Staff Susan Pietrzyk, PhD Reeti Hobson, MPH Lwendo Moonzwe, PhD Debra Prosnitz, MPH
  5. 5. Set-up of the study Observation & Investigation • Projects, articles, social media, RfPs, webinars, reports, proposals, presentations, etc. Document Review & Bearings • USAID HIV/AIDS-related evaluation reports • Development Experience Clearinghouse (DEC) • 2003-present, selected 32 documents • USAID policy statements • 3ie impact evaluation repository Conversations & Mapping • Within ICF, MEASURE partners, friends • Talks, networking, meeting new people • Who’s conducted USAID HIV/AIDS evaluations
  6. 6. Overview of the webinar Why Take a step back and assess the trend Reflection, both critical and practical Context Shifts over time and the evaluation landscape Understand the past to plan for the future Document Review Insights and concrete ideas, not a singular method Tips for impact and qualitative evaluative work Questions Your ideas and thoughts Develop some momentum
  7. 7. Why The motivation for the Measuring Impact Qualitatively study has been twofold: • Impact evaluations are increasingly of interest among international development policymakers and practitioners • Although the term “impact evaluation” signifies to many a specifically rigorous quantitative exercise and use of a counterfactual, impact is also both an old and a subjective concept
  8. 8. Of interest ≠ existent Document review selection: 224 reports narrowed to 168, then 144, then 104; at 104, split between impact evaluation, inclusive of a counterfactual (23), and qualitative evaluation, often a performance evaluation (81); final selection of 32 (15 and 17 across the two types)
  9. 9. Impact = an old concept It is now recognized that social progress is essential to economic development, and that without improvements in education, both general and technical, health, sanitation, land utilization, tax structures, and social justice, foreign development assistance programs would be extremely limited in their impact on the economic development of underdeveloped countries It is striking in fact that, while the United States has engaged in assistance programs of various types for decades, no systematic analysis of this type has been made on such problems as, for example, the impact of certain technological developments on the economies of nations, or the transfer of technology from one society to another
  10. 10. Impact = an old concept It is now recognized that social progress is essential to economic development, and that without improvements in education, both general and technical, health, sanitation, land utilization, tax structures, and social justice, foreign development assistance programs would be extremely limited in their impact on the economic development of underdeveloped countries -Wallace J. Campbell and David D. Lloyd, Report on the Eighth Annual National Conference on International Economic and Social Development June 1961 It is striking in fact that, while the United States has engaged in assistance programs of various types for decades, no systematic analysis of this type has been made on such problems as, for example, the impact of certain technological developments on the economies of nations, or the transfer of technology from one society to another -Henry R. Labouisse, Director, President’s Task Force on Foreign Economic Assistance. The Act for International Development: A Program for the Decade of Development June 1961
  11. 11. Impact = subjective concept 32 HIV/AIDS-related evaluation reports reviewed (total, average): Number of pages 3,383, 106; Uses of the word impact 765, 24; Uses of impact-related words (specifically: achievements, effectiveness, outcome, performance, success, sustainability) 6,105, 191
  12. 12. Impact = subjective concept 765 + 6,105 = 6,870 words (32 reports) These are all words that work as part of understanding and assessing impact
  13. 13. Impact = subjective concept 765 + 6,105 = 6,870 words (32 reports) These are all words that work as part of understanding and assessing impact • Is the meaning always the same? • Does location change the meaning? • Might women and men understand the words differently? • Can we know if the experience was always the same?
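The tallies above (765 uses of "impact" plus 6,105 impact-related words across 32 reports) amount to a simple word count per report. A minimal Python sketch of that tally, assuming report text is available as a plain string; the sample sentence is invented, not from any reviewed report:

```python
import re
from collections import Counter

# The six impact-related terms named on the slide
IMPACT_RELATED = {"achievements", "effectiveness", "outcome",
                  "performance", "success", "sustainability"}

def tally_terms(text):
    """Count uses of 'impact' and of the impact-related terms in one report."""
    words = re.findall(r"[a-z]+", text.lower())
    counts = Counter(words)
    impact = counts["impact"]
    related = sum(counts[term] for term in IMPACT_RELATED)
    return impact, related

# Invented sample text
sample = "The project's impact on outcome measures shows success."
print(tally_terms(sample))  # (1, 2)
```

A real replication would also need to handle plurals and inflected forms (e.g. "impacts", "effective"), which is exactly where the subjectivity the slide points to begins.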
  14. 14. Document review Objectives Undertake a practical and critical reflection of the details surrounding the use of qualitative methods in USAID HIV/AIDS-related evaluations Productively use this critical reflection to establish insights, ideas, and promising practices around ways to understand and assess impact in qualitative terms
  15. 15. Document review • Search for impact evaluations (inclusive of a counterfactual) and impact-oriented qualitative evaluations (often, performance evaluations) • Development Experience Clearinghouse (DEC) • Keyword search on evaluation, impact evaluation, assessment, HIV, AIDS o Three people searching o 2003-2008, 2009-2013, 2014-present • Yield in the 1,000s, narrow based on document title o Exclude mid-term evaluations, conference proceedings, trip reports, project quarterly reports Methodology
  16. 16. Document review • Down to 224 evaluation reports o Read the abstract, scan the report o Specifically looking for impact evaluations o In reality, found more impact oriented qualitative evaluations o Check-in going from 224 to 168 to 144 to 104 • At 104 divide between impact evaluation (23) and impact oriented qualitative evaluation (81) • From 104 to 32, HIV/AIDS focus, strive for balance • Map the selected documents, maintain anonymity o 21 countries, 25 evaluating organizations, 26 implementers o Years: 2004, 2005, 2006 (2), 2007, 2008 (3), 2009 (2), 2010, 2011 (3), 2012 (9), 2013 (4), 2014 (4), 2015 • Read all 32, use ATLAS.ti to code Methodology (cont.)
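The title-based screening step described above can be sketched as a simple filter. A minimal sketch with invented records and the exclusion terms taken from the slides; the real selection also involved reading abstracts and scanning the reports themselves:

```python
# Hypothetical records; real DEC search results carry different metadata
reports = [
    {"title": "Impact Evaluation of an HIV Prevention Program", "year": 2012},
    {"title": "Mid-Term Evaluation of a Health Project", "year": 2010},
    {"title": "Trip Report: Site Visits", "year": 2013},
    {"title": "Performance Evaluation of AIDS Care Services", "year": 2014},
]

# Document types screened out by title during the first narrowing pass
EXCLUDE_TERMS = ("mid-term", "conference", "trip report", "quarterly")

def keep(report):
    """Keep 2003-present reports whose titles avoid the excluded types."""
    title = report["title"].lower()
    return report["year"] >= 2003 and not any(t in title for t in EXCLUDE_TERMS)

shortlist = [r for r in reports if keep(r)]
print(len(shortlist))  # 2
```

The later passes (224 to 168 to 144 to 104 to 32) relied on human judgment, so only this first mechanical cut lends itself to code.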
  17. 17. Evaluation / data type, instances (n = 32): Mixed Methods Evaluation (23); Primarily Qualitative Evaluation (9); Evaluations Inclusive of a Control Group (13); Primary Data Collection, Quantitative (15); Primary Data Collection, Qualitative (31)
Type of primary qualitative data collection: Key Informant Interview (23); Focus Group Discussion (19); In-Depth Interview (8)
Characteristics of the evaluation reports, including annexes (total, average, median, range): Pages 3,383, 106, 93, 21-339; Uses of the word impact 765, 24, 22, 3-91; Uses of impact-related words 6,105, 191, 146, 14-794; Uses of the word gender 1,027, 32, 10, 1-369
Countries represented across the sample (21): Cote D'Ivoire (2), Ethiopia (3), Ghana, Honduras, India (3), Kenya (2), Malawi (2), Multiple Country Evaluations (3), Namibia, Nepal, Rwanda, South Africa (4), Swaziland, Tanzania, Uganda (3), Zambia (2) and Zimbabwe
Year of the evaluation report across the sample: 2004, 2005, 2006 (2), 2007, 2008 (3), 2009 (2), 2010, 2011 (3), 2012 (9), 2013 (4), 2014 (4) and 2015
Evaluating organizations represented across the sample (25): Business Enterprise, University of Pretoria, CAMRIS, Care India, CHANGES2, Clacherty & Associates Education and Social Developments (Pty) Ltd, Development & Training Services (dTS) (3), Engender Health, FARST Africa, Feedback Research and Analytics, Inc., HORIZONS (2), Impact Consulting, John Snow International (JSI), Management Sciences for Health (MSH), Management Systems International (MSI) (2), MEASURE Evaluation (2), MELA PLC, Mendez England & Associates (ME&A), MIDEGO, Inc., The Mitchell Group (2), QED Group, LLC (3), Save the Children, Social Impact, Inc., Social Scientific Systems, Synergy and USAID
Project implementers represented across the sample (26): Care India, CHANGES2, Chemonics, Childline Mpumalanga, Children in Distress Network, Development Alternatives, Inc. (DAI), Elizabeth Glaser Pediatric AIDS Foundation (EGPAF) (2), Engender Health (2), FHI 360, GOAL, HORIZONS (2), Humana People to People, IntraHealth, John Snow International (JSI) (2), John Snow International Research & Training Institute, Inc., Johns Hopkins University Center for Communications Programs, Management Sciences for Health (MSH), National Association of Child Care Workers, Government of Ghana National HIV Prevention Program, Pact, Pathfinder, Population Services International (PSI), Save the Children (2), USAID and Zambia VCT Partnership
  18. 18. Insights and ideas Document Review What struck me. What I discovered Choosing not to use terms such as findings or results The goal with these insights and ideas is not to produce a manual about the use of qualitative methods. There are enough of these already There is no single qualitative method or approach. There are many methods, many families of methods Embrace the diversity. Own it. Say what you mean
  19. 19. Method specificity 41 qualitative data collection methods terms: capacity assessment tool, case story, case study, case study documentations, client interaction, client interviews, community-level group discussion, comprehensive analysis session, detailed discussions, detailed narratives, direct observations, discussions, email survey, facility checklist, facility-level group discussion, field visits, fieldwork, focus group discussion, focus group interview, guided group discussion, held discussions, in-depth discussion, in-depth interview, insightful occurrences, interactions with stakeholders, interpretive phenomenology, key actor interview, key informant interview, observational checklists, observations, open-ended interviews, opportunistic group discussion, organization interviews, plenary group dialogue meetings, regular group analysis, site visits, structured group discussions, structured interviews, success story, telephone discussions, triangulation
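One way to see the "many families of methods" point in this list is to group the terms by their head noun. A toy sketch over a handful of the 41 terms; the grouping rule is a hypothetical normalization for illustration, not the study's own taxonomy:

```python
from collections import defaultdict

# A handful of the 41 terms from the slide
terms = ["case story", "case study", "focus group discussion",
         "focus group interview", "key informant interview",
         "in-depth interview", "in-depth discussion"]

# Group variants by their final word to surface families of methods
families = defaultdict(list)
for term in terms:
    families[term.split()[-1]].append(term)

print(sorted(families))  # ['discussion', 'interview', 'story', 'study']
```

Even this crude grouping shows how quickly 41 surface terms collapse into a smaller set of families, which is the slide's argument for naming the method you actually used.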
  20. 20. The limits of listing The Team performed an intensive desk review of the documents, including those provided by USAID, and data encountered through Internet searches, site visits and discussions with local counterparts (Appendix B). A Team Planning Meeting was held on XXXX to draft the evaluation framework, followed by an orientation meeting with USAID on XXXX. The framework was subsequently approved by USAID in an email dated XXXX. Following that meeting, the Team met in XXXX with the implementing agencies and other stakeholders as identified by USAID (Appendix B). The Team finalized the evaluation framework in accordance with USAID feedback and guidance, and further tested and refined the discussion guides (Appendix D). (P24)
  21. 21. Activities ≠ method 1) a team planning meeting between the team and USAID/XXXX; 2) extensive desk review of all project-related documents; 3) interviews with key informants; 4) field visits to three of the seven XXXX districts in XXXX, one XXXX district, and one XXXX district (control districts), where team members met select NGOs and community members and visited XXXX Clinics; 5) meetings with the XXXX and XXXX in XXXX and XXXX; 6) client briefings with XXXX and USAID through in-person meetings and teleconferences; and 7) presentations and discussion of findings with members of the XXXX team, USAID and XXXX. To enhance the quantitative rigor of the evaluation, the team undertook a separate epidemiological study to analyze health outcomes. The evaluation team conducted field visits to verify data collection and to inform subsequent findings qualitatively. Key informant interviews further enhanced the findings of the XXXX Documentation team and provided additional insight (P4) The methodology included
  22. 22. Method Two overall method statements The evaluation team used both qualitative and quantitative methods to collect and analyze information relevant to the objectives, the four outcomes of the development hypothesis, and the research questions outlined in the Scope of Work. (P3) A sequential mixed methods design was used to combine quantitative (survey design) and interpretive qualitative aspects. (P2)
  23. 23. Method A subtle but telling variation The evaluation team used both qualitative and quantitative methods to collect and analyze information relevant to the objectives, the four outcomes of the development hypothesis, and the research questions outlined in the Scope of Work. (P3) A sequential mixed methods design was used to combine quantitative (survey design) and interpretive qualitative aspects. (P2)
  24. 24. Mixed method Denzin, Norman (2010), Qualitative Inquiry The article traces the history surrounding the rise of mixed methods. "Persons who are less familiar with the rich traditions of qualitative inquiry are telling others with the same lack of experience how to do qualitative work" (2010: 420) This type of instruction exists, Denzin suggests, because the call for mixed method work has largely come from those with quantitative expertise, not those with qualitative expertise.
  25. 25. Mapping Evidence for Denzin’s assertion Article: Tracing publications, tracing history, a compelling analysis of paradigms and when, why, how they shift Document review: Very rare that the bios of the evaluators are included; thus, we can’t know the expertise of the evaluation team members Observationally: Often an approach to qualitative evaluations relies on mapping quantitative methods and words onto a qualitative study. Potentially an indication of quantitative experts trying to map their expertise onto an area (qualitative work) where they have less expertise
  26. 26. Unit of analysis Conceptually, the team chose to focus on both the provision of services, the “supply side” or referred to in the PMP as “Access”, as well as the “demand side”, mainly the utilization of those services by beneficiaries. Three main units of analysis were: first, national level leaders who had received training and other inputs from XXXX; second, the district level, including the XX country unit of local government and local level civil society organizations, and third, HIV/AIDS affected Households, as identified from People with HIV/AIDS (PHA) registration lists. (P17)
  27. 27. Impact Key informant interview = 23 What is the unit of analysis for a key informant interview? Focus group discussion = 19 What is the unit of analysis for a focus group discussion? In-depth interview = 8 What is the unit of analysis for an in-depth interview?
  28. 28. Impact Key informant interview = 23 Did the key informant experience the impact? Focus group discussion = 19 Did the focus group experience the impact? In-depth interview = 8 Did the in-depth interviewee experience the impact?
  29. 29. Summary & tips Impact & Qualitative Evaluative Work Consider that impact is a subjective concept Tailor the methods, descriptively title interview types Say what you mean, state method specifically and in detail Think before you list, methods deserve full sentences Flesh things out, activities are not methods Both and mixed are not synonyms Mixed method requires quantitative and qualitative expertise When to / when not to map quantitative onto qualitative Unit of analysis and its relevance to understanding impact Experience and its relevance to feeling impact
  30. 30. MEASURE Evaluation is funded by the U.S. Agency for International Development (USAID) under terms of Cooperative Agreement AID-OAA-L-14-00004 and implemented by the Carolina Population Center, University of North Carolina at Chapel Hill in partnership with ICF International, John Snow, Inc., Management Sciences for Health, Palladium Group, and Tulane University. The views expressed in this presentation do not necessarily reflect the views of USAID or the United States government. www.measureevaluation.org
