Lessons Learned from OVC Evaluations for Future Public Health Evaluations
Transcript

  • 1. Lessons Learned from OVC Evaluations for Future Public Health Evaluations. Siân Curtis, PhD. OVC Evaluation Dissemination Meeting, September 3rd, 2009, Washington, DC
  • 2. Growing Emphasis on Evaluation
    • IOM PEPFAR evaluation report and reauthorization legislation
    • Global Fund 5 year impact evaluation and OR initiative
    • CGD "When Will We Ever Learn?" report
    • 3IE Initiative
    • IHP M&E Working Group – common evaluation framework initiative
    • USAID evaluation revitalization efforts
  • 3. Ideal Impact Assessment
  • 4. Challenges to Implementing Rigorous Impact Evaluation
    • Need to think about evaluation at the beginning, not at the end, but it is hard to attract attention at that point
    • Timing – projects are already underway and it is hard to incorporate a strong evaluation design
    • Scale – many projects are too small to expect to demonstrate impact
    • Pressure for rapid results to inform programs now
    • Expectations of multiple stakeholders – scope, competing objectives, multiple/unclear research questions
    • Political will – need someone in a position of authority to buy in and advocate for evaluation
  • 5. Methodological Constraints to Rigorous Impact Evaluation
    • Non-random placement of programs - intervention areas and control areas often not comparable
    • Suitable control areas may not exist – other programs in control areas or cross-over of interventions to control areas
    • Need, and ability, to control for other factors beyond the program that might affect outcomes (a minimal regression-adjustment sketch follows this slide)
    • (Victora, Black, and Bryce 2009)
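The slides list these constraints without an analytic example. Purely as an illustration (not part of the original presentation), the sketch below shows one common way to "control for other factors" in a post-test-only comparison: regress the outcome on a program indicator plus observed covariates. All data, variable names (treated, child_age, hh_assets, school_enrolled), and effect sizes are simulated placeholders.

```python
# Minimal sketch (not from the original slides): adjusting a post-test-only
# comparison for observed differences between intervention and control areas.
# All data, variable names, and effect sizes are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 5000
ovc = pd.DataFrame({
    "child_age": rng.integers(6, 18, n),
    "hh_assets": rng.normal(0, 1, n),   # household wealth index
})
# Non-random placement: the program operates mainly in poorer areas.
p_area = 1 / (1 + np.exp(-(0.3 - 0.8 * ovc["hh_assets"])))
ovc["treated"] = rng.binomial(1, p_area)
# Outcome (school enrollment) depends on wealth and, weakly, on the program.
p_out = (0.45 + 0.05 * ovc["treated"] + 0.08 * ovc["hh_assets"]).clip(0.05, 0.95)
ovc["school_enrolled"] = rng.binomial(1, p_out)

# Unadjusted comparison: raw difference in enrollment between areas.
unadjusted = smf.ols("school_enrolled ~ treated", data=ovc).fit()
# Adjusted comparison: control for observed factors beyond the program.
adjusted = smf.ols("school_enrolled ~ treated + child_age + hh_assets", data=ovc).fit()

print("unadjusted effect:", round(unadjusted.params["treated"], 3))
print("adjusted effect:  ", round(adjusted.params["treated"], 3))
```

Comparing the unadjusted and adjusted "treated" coefficients indicates how much of the raw difference between areas reflects observed household characteristics rather than the program; differences on unobserved factors remain unaddressed.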
  • 6. OVC Evaluation Experience
    • Timing
      • Programs already underway – no baseline; post-test-only design
      • Length and intensity of exposure – short duration of exposure (i.e., less than 2 years) but impact likely to be longer term
    • Scale
      • Coverage low in intervention areas – quality of beneficiary lists a concern
      • Some programs small
  • 7. OVC Evaluation Experience
    • Pressure for rapid results
      • post-test-only design; short program exposure
    • Multiple stakeholders
      • Supports data use
      • Managing expectations regarding scope and coverage of study
    • Political will/leadership – needs to be strong to facilitate buy-in from all stakeholders
  • 8. OVC Evaluation Experience
    • Program participation was non-random
      • purposive selection of intervention areas
      • self-selection into (some) programs
      • controls different from beneficiaries (a minimal propensity-score sketch follows this slide)
    • Control areas – contamination between program and control areas, i.e., some children in control areas reported receiving interventions
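The slides do not prescribe a fix for self-selection. Purely as an illustration (not part of the original presentation), the sketch below shows one common option when beneficiaries and controls differ on observed characteristics: estimate each child's probability of participation and reweight controls to resemble beneficiaries (propensity-score weighting). As before, all data, variable names, and effect sizes are simulated placeholders, and contamination of control areas is not addressed by this approach.

```python
# Minimal sketch (not from the original slides): propensity-score weighting
# when participation is self-selected and controls differ from beneficiaries.
# All data, variable names, and effect sizes are simulated placeholders.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(1)
n = 5000
ovc = pd.DataFrame({
    "child_age": rng.integers(6, 18, n),
    "hh_assets": rng.normal(0, 1, n),
})
# Self-selection: poorer households are more likely to enrol in the program.
p_enrol = 1 / (1 + np.exp(-(0.3 - 1.0 * ovc["hh_assets"])))
ovc["treated"] = rng.binomial(1, p_enrol)
p_out = (0.45 + 0.05 * ovc["treated"] + 0.08 * ovc["hh_assets"]).clip(0.05, 0.95)
ovc["school_enrolled"] = rng.binomial(1, p_out)

# 1. Model the probability of participation from observed characteristics.
ps_model = smf.logit("treated ~ child_age + hh_assets", data=ovc).fit(disp=0)
ovc["pscore"] = ps_model.predict(ovc)

# 2. Inverse-probability weights (ATT form: beneficiaries keep weight 1,
#    controls are reweighted to resemble beneficiaries).
ovc["w"] = np.where(ovc["treated"] == 1, 1.0, ovc["pscore"] / (1 - ovc["pscore"]))

# 3. Weighted comparison between beneficiaries and reweighted controls.
weighted = smf.wls("school_enrolled ~ treated", data=ovc, weights=ovc["w"]).fit()
print("weighted effect:", round(weighted.params["treated"], 3))
```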
  • 9. Additional Issues for OVC Evaluation
    • Multiple outcome domains – what to focus on?
    • Measurement tools for outcome domains vary in how widely tested they are and how well they work
    • Measurement and comparison of cost-effectiveness across multiple domains is new
    • Lack of standardized interventions/variable intensity and quality
      • Wide variation in combination of interventions offered and way the programs are implemented
  • 10. Data Use
    • Critical to think about data use throughout the evaluation process, not just at the end
    • Engagement of stakeholders critical to understanding the evaluation questions from different perspectives and creating ownership and demand
    • Proactive and explicit data use activities will help stakeholders understand and apply findings – recommendations coming from stakeholders carry more weight than those from the research team
  • 11. Conclusions
    • Continuing challenge to develop pragmatic evaluation designs that meet rigorous scientific standards within field realities – ongoing area of research
    • Recognize the long term benefits of evaluations for future programs – “public good”
    • It takes time for programs to scale up and to have an effect – evaluations need to be ongoing
    • More work needed to test measures of OVC outcomes – often multi-dimensional
    • Attention to data use (both short and long term) is needed throughout the process
  • 12. MEASURE Evaluation is funded by the U.S. Agency for International Development through Cooperative Agreement GHA-A-00-08-00003-00 and is implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill, in partnership with Futures Group International, ICF Macro, John Snow, Inc., Management Sciences for Health, and Tulane University. The views expressed in this presentation do not necessarily reflect the views of USAID or the United States government. Visit us online at http://www.cpc.unc.edu/measure.