Dissemination and Use of Results from OVC Programs
Speaker notes
  • Pre-visits to program sites – starting a relationship. Case studies for each program evaluated were disseminated to program staff and in-country national OVC stakeholders (IPGs). Preliminary results were shared with each participating program for validation, to get first reactions, and to solicit feedback on drafts. Dissemination and information-use workshops on the final findings were conducted to come up with a data use plan (recommendations). Follow-up – to ensure recommendations are implemented.
  • National OVC stakeholders – service providers, donors, etc.
  • Transcript

    • 1. Dissemination and Use of Results from OVC Program Evaluations Florence Nyangara, PhD MEASURE Evaluation/Futures Group Dissemination Meeting, September 3rd, 2009, Washington, DC
    • 2. Primary Objective of OVC Program Evaluations
      • Provide evidence to guide program decisions, such as:
        • Scaling up best practices (models, strategies), and
        • Modifying and improving interventions to make them more effective
      • Therefore, we needed to collect high-quality, relevant data, analyze them, and use the results to guide OVC programs
          • Monitoring and Evaluation to Assess and Use Results (MEASURE) Evaluation project
    • 3. Overall Data Use Strategy
      • Employed a comprehensive data use strategy:
      • Diverse stakeholders were involved throughout the study
      • Ensured that only relevant and useful data were collected, through continuous consultation between researchers and practitioners
      • Data is packaged to meet the needs of target audiences
      • Results are used to improve programs
    • 4. Comprehensive Data Use Strategy
      • Stakeholder engagement: to ensure support, ownership, relevance, and sustainability.
        • Diverse stakeholders were involved (e.g., beneficiaries, programs)
          • Capture different perspectives and information needs
        • Stakeholders were involved throughout the study
          • Get buy-in and promote ownership (consultation meetings)
          • Continuous communication between researchers and various stakeholders for updates (feedback).
    • 5. Comprehensive Data Use Strategy
      • Ensured that only relevant data were collected, by holding consultation meetings with donors, program implementers, communities, and beneficiaries. This helped to:
          • Identify key OVC program issues and information needs for service, program, and policy decisions
          • Identify program models for evaluation
          • Inform questionnaire development
    • 6. Examples
      • Example 1 - Dissemination of Case Studies:
        • The first feedback – to share information on program descriptions, implementation challenges, and opportunities
        • Involved - program staff and in-country stakeholders
        • Discussed and identified issues to consider for outcome evaluations
      • Example 2 - Dissemination of preliminary outcome evaluation results:
        • Consultations with each program & key in-country stakeholders to validate preliminary findings
        • Presentations at international conferences and other forums helped interpret findings
    • 7. Packaging Data for and Reaching Various Audiences
      • Packaging information in various formats for diverse audiences
      • Publications: http://www.cpc.unc.edu/measure/ovc
        • Six case study reports for each program evaluated [1]
        • Five briefing papers specific to each of the programs evaluated
        • Two summary papers on the four CORE-funded programs
          • Overarching paper on key findings
          • Cost-effectiveness analysis paper
        • One summary paper of key findings from the three studies in Tanzania
        • Program-specific summaries of key findings (1-page) – TSA
        • Dissemination meetings
        • Program sites
        • In-country – national level
        • International level
      • [1] Jali Watoto was a mini-case study
    • 8. Use of Results Workshops: Tanzania Example
      • Facilitated two workshops with OVC stakeholders in Tanzania
        • TSA program staff (field staff and managers)
        • National OVC stakeholders (implementing partners, government, donors, bilateral agencies, etc.)
    • 9. Objectives of Results Use Workshops
        • Present and discuss the key findings
        • Develop actionable recommendations based on the results
        • Develop a data use action plan to implement each of the recommendations
        • Develop and agree on a mechanism to monitor the data use action plan
        • Follow-up plan
    • 10. Use of Results: Program Staff
      • Findings were presented to the Salvation Army – Mama Mkubwa program staff (field supervisors, program managers, M&E staff) from all regions
      • Discussions of how TSA findings could be used to inform program improvement and the well-being of OVC.
      • Program-specific recommendations
      • Developed a data use action plan
      • Developed and agreed on how to monitor the plan
    • 11. Use of Results: National Stakeholders
      • Results were presented to national OVC stakeholders – service providers, policy-makers, and donors in Tanzania
      • Discussions of how findings from three program evaluations could be used for decision-making.
      • National OVC program actionable recommendations
      • Developed a data use action plan
      • Developed and agreed on how to monitor the plan.
    • 12. Example of a Recommendation
      • Researchers’ proposed recommendations were challenged, and participants came up with their own, e.g.:
        • Researchers: need to review and restructure Kids’ Clubs and home-visit activities to make them more effective.
        • TSA staff: volunteer motivation – through incentives.
        • National stakeholders: more government involvement to develop guidelines that will allow volunteer growth, recognition, and ensure sustainability.
    • 13. Information Use Bulletin
      • Contains:
      • Reactions to the findings (surprises)
      • Data use actionable recommendations plan (see National Tanzania)
          • Responsibilities are assigned (joint plans), showing who will do what – those implicated take action
      • Follow-up plans to assess whether the actionable recommendations are implemented according to plan
          • This formalizes the follow-up, a step that is often forgotten after dissemination
          • Qualitative follow-up (January/February 2010)
      • Process effect: increased demand for more data for decision-making – more programs want to conduct simple evaluations of this nature to find out whether their key program components (IGA) are making a difference.
    • 14. MEASURE Evaluation is funded by the U.S. Agency for International Development through Cooperative Agreement GHA-A-00-08-00003-00 and is implemented by the Carolina Population Center at the University of North Carolina at Chapel Hill, in partnership with Futures Group International, ICF Macro, John Snow, Inc., Management Sciences for Health, and Tulane University. The views expressed in this presentation do not necessarily reflect the views of USAID or the United States government. Visit us online at http://www.cpc.unc.edu/measure.