  • Examples:
    • Formative: identifies who the target population is, where they live, their lifestyles, culture, etc. Anything that gives insight into the lives of the target audience helps us design an appropriate intervention.
    • Process: we look at how the implementation was actually carried out. If we planned to recruit 20 peer educators, did we manage that, how did we do the recruitment, and what were the challenges?
    • Outcome: did knowledge of HIV/AIDS increase because of our intervention, are more people using condoms, do people have fewer casual partners, etc.? You have to have a baseline: if a KAPB survey was conducted, outcome evaluation means repeating it and comparing results from before and after the intervention.
    • Impact: did the program reduce the number of new infections, have we stopped or reduced the number of STIs, has the health of the target population improved because of our program? These measurements require biological data, which is not easy to collect; the required data may also need an experiment or a long study. It is scientifically verified.
  • Examples:
    • Inputs: things we put into the program, such as the money to run it, human resources, training, equipment, and any other resources that make the intervention possible.
    • Outputs: the activities or services we provided. How many peer educators did we recruit and train, how many meetings did we conduct, how many condoms did we distribute, how many people did we reach, how many information sessions did we hold, what pamphlets or posters did we give out, etc.?
    • Outcomes: have people changed their behaviors? Are people now using condoms, do they have fewer casual partners, are they abstaining, can they now insist on condom use, has their knowledge improved compared to before the program?
    • Impact: did the program reduce the number of new infections, have we stopped or reduced the number of STIs, has the health of the target population improved because of our program? These measurements require biological data, which is not easy to collect; the required data may also need an experiment or a long study. It is scientifically verified.
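The before/after comparison described for outcome evaluation can be sketched in a few lines of Python. All figures below are made up for illustration, assuming the same KAPB question is asked at baseline and at follow-up:

```python
# Hypothetical outcome-evaluation arithmetic: compare a KAPB indicator
# at baseline and after the intervention. All figures are illustrative.

baseline_users, baseline_n = 120, 400   # respondents reporting condom use, sample size
followup_users, followup_n = 220, 400

baseline_rate = baseline_users / baseline_n
followup_rate = followup_users / followup_n
change = followup_rate - baseline_rate

print(f"Condom use: {baseline_rate:.0%} at baseline, "
      f"{followup_rate:.0%} at follow-up ({change:+.0%} change)")
```

The point of the sketch is simply that an outcome claim needs two measurements of the same indicator, not one.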

Transcript

  • 1. IFC Against AIDS Protecting People and Profitability
  • 2. Monitoring and Evaluation (M&E) What is monitoring?
    • Day-to-day follow-up of activities during implementation to measure progress and identify deviations
    • Routine follow-up to ensure activities are proceeding as planned and are on schedule
    • Routine assessment of activities and results
    • Answers the question, “What are we doing?”
  • 3. Monitoring and Evaluation (M&E) Why monitor activities?
    • Tracks inputs and outputs and compares them to plan
    • Identifies and addresses problems
    • Ensures effective use of resources
    • Ensures quality and learning to improve activities and services
    • Strengthens accountability
    • Program management tool
  • 4. Monitoring and Evaluation (M&E) What is evaluation?
    • Episodic assessment of overall achievement and impacts
    • Systematic way of learning from experience to IMPROVE current activities and promote better planning for future action
    • Designed specifically with the intention to attribute changes to the intervention itself
    • Answers the question, “What have we achieved and what impact have we made?”
  • 5. Monitoring and Evaluation (M&E) Why evaluate activities?
    • Determines program effectiveness
    • Shows impact
    • Strengthens financial responses and accountability
    • Promotes a learning culture focused on service improvement
    • Promotes replication of successful interventions
  • 6. Monitoring and Evaluation (M&E) Types of evaluation
    • Formative: initial assessment of the target populations and contextual environment; determines concept and design
    • Process: seeks to identify the extent to which planned activities have been achieved and assesses the quality of the activities/services
    • Outcome: examines specific program outcomes and accomplishments: what changes were observed, what do they mean, and are the changes a result of the interventions?
    • Impact: gauges the program’s overall impact and effectiveness; aims to strengthen design and replication of effective programs and strategies
  • 7. Monitoring and Evaluation (M&E) Monitoring vs evaluation
    • Monitoring: continuous, day-to-day; documents progress; focuses on inputs and outputs; alerts managers to problems; a self-assessment
    • Evaluation: periodic, at important milestones; in-depth analysis of achievements; focuses on outcomes and impacts; provides managers with strategy and policy options; an external analysis
  • 8. Monitoring and Evaluation (M&E) M&E Framework
    • Inputs: resources that are put into the project; lead to the achievement of the outputs (measured continuously)
    • Outputs: activities or services that the project is providing; outputs lead to outcomes (measured quarterly)
    • Outcomes: changes in behaviors or skills as a result of the implemented project; outcomes are anticipated to lead to impacts (measured over 2-3 years, short to medium term)
    • Impacts: measurable changes in health status, particularly reduced STI/HIV transmission and reduced AIDS impact; impact results are effects of the intervention (measured over 3-5 years, long term)
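The monitoring side of this framework, comparing achieved outputs against the plan, can be sketched as a small script. The indicator names and figures below are hypothetical examples, not IFC data:

```python
# Minimal sketch of output-level monitoring: compare achieved outputs
# against planned targets. Indicators and numbers are illustrative.

planned = {"peer educators trained": 20, "condoms distributed": 5000}
achieved = {"peer educators trained": 16, "condoms distributed": 5500}

for indicator, target in planned.items():
    actual = achieved.get(indicator, 0)
    pct = 100 * actual / target
    status = "on track" if pct >= 100 else "below plan"
    print(f"{indicator}: {actual}/{target} ({pct:.0f}%, {status})")
```

This is the same logic as the "planned vs achieved" comparison monitoring is described as performing, just made explicit: each output indicator gets a target at design time and an actual value from the monitoring forms.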
  • 9. Monitoring and Evaluation (M&E) How IFC Against AIDS integrates M&E
    • IFC Against AIDS
      • Internal client satisfaction survey
      • External client satisfaction survey
      • Database with all client data
    • Client
      • Client engagement process
      • M&E guide with tools attached
      • Train clients/service providers on M&E
      • Database for client monitoring data
  • 10. IFC Against AIDS Client Engagement and M&E Process
    • Pre-engagement
      • Client identification: nature of deal, sector, country, internal stakeholders
      • Mapping out IFC team and value-add
    • Months 0-6: Foundations of an HIV/AIDS program (infrastructure and assessment)
      • Assessment of programmatic conditions
      • Senior management buy-in
      • Partnership with service providers
      • Monitoring & Evaluation:
        • Meetings with senior managers
        • AIDS committees set up
        • Focal persons identified
        • Policy
        • Implementation plan
        • Budgets for programs
        • MoUs with service providers
        • This is tracked through program monitoring forms
    • Months 6-12: Implementation and partnering (program execution and consolidation of partnership)
      • Client and NGO implement the program
      • Eventually IFC Against AIDS develops supervisory role
      • Feedback to management
      • Monitoring & Evaluation:
        • KAP survey
        • Choose appropriate intervention and monitor using M&E guide and indicators
        • Contract service provider
        • Monthly reports to management
        • Quarterly reports to IFC for database
        • Participate in IFC workshops
        • This is tracked through program monitoring forms
    • Months 12-18: Sustainability and maintenance (M&E and program transition)
      • Consolidation of lessons learned
      • M&E and program evaluation
      • Bridge plan of action
      • Feedback to management
      • Monitoring & Evaluation:
        • Client: analyze monitoring data; conduct evaluation; re-plan using results of both
        • IFC: analyze monitoring database; analyze client evaluation reports; client satisfaction survey; re-plan for new year
        • This is tracked through program monitoring and evaluation forms
  • 11. Monitoring and Evaluation (M&E) Conclusion: Why M&E?
    • M&E should be part of the design of a program
    • Ensures systematic reporting
    • Communicates results and accountability
    • Measures efficiency and effectiveness
    • Provides information for improved decision making
    • Ensures effective allocation of resources
    • Promotes continuous learning and improvement
  • 12. Monitoring and Evaluation (M&E) Practical exercise on IFC Against AIDS M&E tools
    • Monitoring
      • Appendix 1 Coordinator’s Workplace Peer Educator Training Form (M1)
      • Appendix 2 Coordinator’s Community Peer Educator Training Form (M2)
      • Appendix 3 Workplace Peer Educator Activity Form (M3)
      • Appendix 4 Workplace Peer Educator Community Activity Form (M4)
      • Appendix 5 Community Peer Educator Activity Form (M5)
      • Appendix 6 Workplace Peer Educator Monthly Activity Summary (M6)
      • Appendix 7 Workplace Peer Educator Monthly Community Summary (M7)
      • Appendix 8 Community Peer Educator Monthly Activity Summary (M8)
      • Appendix 9 Coordinator’s Monthly Activity Summary (M9)
      • Appendix 10 Planned vs Achieved Matrix (E1)
    • Evaluation
      • Appendix 11 Process Evaluation: Structured Interview (E2)
      • Appendix 12 Process Evaluation: Focus Group Discussion Guide (E3)
      • Appendix 13 Outcome Evaluation: KAPB Survey Questionnaire (E4)
      • Appendix 14 Outcome Evaluation: KAPB Focus Group Discussion Guide (E5)
  • 13. IFC Against AIDS http://www.ifc.org/ifcagainstaids
    • Sabine Durier, Program Leader. Tel: +1-202-473-4176, Email: SDurier@ifc.org
    • Gillette Conner, Program Officer. Tel: +1-202-473-4040, Email: GConner@ifc.org
    • Tish Enslin, Program Officer (Johannesburg). Tel: +27-11-731-3062, Email: LEnslin@ifc.org
    • Noleen Dube, Program Officer (Johannesburg). Tel: +27-11-731-3059, Email: NDube@ifc.org
    • Martin Lutalo, Program Analyst. Tel: +1-202-458-1406, Email: MLutalo@ifc.org
    • Ilídio Silva, Consultant (Maputo). Tel: +1-258-84-3070-360, Email: IDasilva1@ifc.org