Monitoring and Evaluating Scale-Up: Methodological and Programmatic Challenges

Presenter notes
  • For program management and decision making; also for understanding scale-up (SU) processes
  • Qualitative and quantitative tools: research questions and hypotheses; logic model, indicators and benchmarks; scale up indicators and benchmarks and the Access data base; household, clinic and community-based provider interview instruments; facility assessment; in-depth stakeholder interview guides
  • For IRH, we see many benefits of this particular model…
  • Some people call a logic model their “roadmap”. The INPUTS in this case are all the resources we have available – competent staff, partners (most importantly here, the MOH), funds (provided by USAID, as well as leveraged funds from other sources) and CycleBeads – the visual tool that helps women learn and use the method. PROCESS relates to what we do: capacity building, advocacy, supportive supervision. OUTPUTS are the activities a program undertakes. OUTCOMES are the changes or benefits that result from our program activities. “What gets measured, gets done” (Osborne and Gaebler, 1992). (A small illustrative sketch of this logic model as a data structure appears at the end of these notes.)
  • Operationalizing scale up indicators so that scale up can be evaluated/researched
  • Ongoing assessment of knowledge, attitudes and behaviors related to fertility and FP use. Assess scale up progress and share with partners for intervention and advocacy; monitor quality of services and improve training/supervision as needed; monitor understanding of staff roles in SU.
  • Maybe we don’t need this much detail. (This and the next slide)
  • Overview of the different kinds of data being collected to inform scale up in the different countries. Multiple sources provide ways to validate information as well as to inform program planning and to monitor scale up progress.
  • Information from policy makers/program managers, from stakeholder interviews in Guatemala, teasing out factors influencing scale up of the SDM at central and other levels. Questions reflect elements of the ExpandNet scaling up model, looking at system capability, political factors and resource factors.
  • Scale up barriers/successes seen at the service delivery level – provider interviews and facility assessments – Rwanda example
  • This is one of the key data sources we’re using for monitoring purposes – the process tracking tool. It is in addition to service statistics, training reports, follow-up visits with a sample of users, supportive supervision, etc. The tool helps us keep track of events that reflect both progress (like signing an MOU with the Government of Jharkhand, or the fact that HLL became a licensed manufacturer of CycleBeads) and setbacks, like a change of government that requires renewed advocacy. (A minimal sketch of such an event log follows these notes.)
  • These are indicators collected through our semiannual reporting, with Mali as an example. It shows year 2 of a five-year plan, and that scale up is not really measured by yes/no. (A sketch of how such indicators can be summarized also follows these notes.)
  • The total population of Jharkhand is 22,284,991 and the IRH intervention districts cover 9,018,050 people (census 2001), roughly 40% of the state.
  • Where we are in the scale up process for several focus countries. Easy first wins in SU of the SDM appear to be norms/procedures and training/supply distribution, but there are many challenges to sustainability. Advocacy efforts continue to be needed, but in different areas of institutionalization.
  • Horizontal SU challenges: resources for scaling up – there are additional costs! Much of the work to date has been in areas supported by USAID bilaterals. This highlights the importance of partners (resource team partners like the bilaterals, and user organizations alike) taking on SU responsibilities, and the need to search for creative ways to work with less well-resourced partners – a slower pace of SU, but no less important when thinking of access/equity issues.
  • How do we measure when our intensive TA is no longer needed?
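The sketch below (referenced from the logic model note above) shows one way the SDM scale-up logic model from slide 10 could be written down as a simple data structure. It is an illustration only; the names and layout are our own, not part of IRH's actual tool kit.

```python
# Illustrative sketch only: the SDM scale-up logic model (slide 10) as a plain
# dictionary. Categories follow the INPUTS -> PROCESS -> OUTPUTS -> OUTCOMES chain;
# the entries are taken from the slide text.
sdm_logic_model = {
    "problem": "Gap in availability & access to SDM services",
    "inputs": ["staff", "partners (incl. MOH)", "funds (USAID and leveraged)", "CycleBeads"],
    "process": ["TA for systems adjustment", "advocacy", "capacity building",
                "QA - monitoring & supervision"],
    "outputs": ["providers trained", "clinics offering SDM", "demand-oriented IEC",
                "supportive policies", "systems harmonization"],
    "outcomes": ["provider competency", "awareness and use",
                 "availability of quality services", "supportive partners/stakeholders"],
    "impact": "increased, sustained availability of SDM",
}

def describe(model: dict) -> None:
    """Print the logic model chain in order, one category per line."""
    print(f"PROBLEM: {model['problem']}")
    for category in ("inputs", "process", "outputs", "outcomes"):
        print(f"{category.upper()}: {'; '.join(model[category])}")
    print(f"IMPACT: {model['impact']}")

describe(sdm_logic_model)
```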
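As a companion to the process tracking note above, here is a minimal sketch of how an event log of scale-up progress and setbacks could be kept. The record fields and the example dates are placeholders for illustration, not the project's actual tool.

```python
# Minimal illustrative event-tracking log; field names and dates are placeholders.
from dataclasses import dataclass
from datetime import date

@dataclass
class ScaleUpEvent:
    when: date          # when the event occurred (placeholder dates below)
    description: str    # what happened
    kind: str           # "progress" or "setback"

events = [
    ScaleUpEvent(date(2009, 1, 1), "MOU signed with the Government of Jharkhand", "progress"),
    ScaleUpEvent(date(2009, 6, 1), "HLL licensed as a manufacturer of CycleBeads", "progress"),
    ScaleUpEvent(date(2010, 3, 1), "Change of government; renewed advocacy needed", "setback"),
]

# The timeline view is simply the events sorted by date.
for e in sorted(events, key=lambda ev: ev.when):
    print(f"{e.when:%b %Y}  [{e.kind}]  {e.description}")
```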

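Finally, a rough sketch of the idea behind the semiannual benchmark indicators (slides 13 and 27): each indicator is stored as progress toward a target rather than as a yes/no flag, and a simple country summary can be generated from it. The class, the figures and the report format below are hypothetical stand-ins for the Access data base and its automated reports.

```python
# Hedged illustration of benchmark monitoring: indicators as achieved/target pairs.
# All names and figures below are hypothetical.
from dataclasses import dataclass

@dataclass
class BenchmarkIndicator:
    name: str
    achieved: int
    target: int

    @property
    def percent(self) -> float:
        return 100.0 * self.achieved / self.target if self.target else 0.0

def country_report(country: str, indicators: list) -> None:
    """Print a simple country-level summary, in the spirit of the automated reports."""
    print(f"Scale-up benchmark report: {country}")
    for ind in indicators:
        print(f"  {ind.name}: {ind.achieved}/{ind.target} ({ind.percent:.0f}%)")

country_report("Mali (hypothetical figures)", [
    BenchmarkIndicator("SDPs with SDM in method mix", 120, 200),
    BenchmarkIndicator("Providers trained", 450, 900),
    BenchmarkIndicator("Resource organizations engaged", 3, 8),
])
```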
    1. Monitoring and Evaluating Scale-Up: Methodological and Programmatic Challenges (EXPANDING FAMILY PLANNING OPTIONS)
    2. BACKGROUND
    3. Studying SDM Scale Up (2007-2012)
       • 5-year prospective, multi-site, comparative study of the process and outcomes of scaling up a FP innovation (SDM)
       • Uses the ExpandNet/WHO model for planning, monitoring and research
    4. Scaling up SDM®
       • Democratic Republic of Congo
       • Guatemala
       • India (Jharkhand)
       • Mali
       • Rwanda
    5. SDM: From Research to Practice
       • Method Concept & Efficacy Trial, 1999-2002
       • Pilot Studies, 2000-2004
       • Operations Research, 2003-2005
       • Integration Case Studies, 2005-2007
       • Scale-Up Studies, 2007-2012
    6. Why monitor and evaluate SDM scale up?
       • Guide scale-up process
       • Maintain stakeholder momentum and accountability
       • Assess whether scale up is achieved (outcome/impact)
       • Contribute to the growing evidence base on scaling up, with a focus on M&E
       [Diagram: 2x2 matrix of evidence (+/-) by practice (+/-)]
    7. Beyond SDM… Rigorous monitoring and evaluation of scale up: theory-based methods and tool kit to study scale up process and outcomes, including:
       • Research questions and hypotheses
       • Logic model, indicators, benchmarks
       • Access data base/reporting forms
       • Baseline/endline instruments
       • Quality assurance tools
    8. MONITORING & EVALUATION OF SCALE UP: DESIGN, DATA SOURCES AND INDICATORS
    9. WHO/ExpandNet Scale-up Framework
       • Ensures that ‘systems’ are not forgotten
       • Evidence to guide strategic choices and adjustments
       • Encourages participatory approaches with multiple stakeholders
       • Creates consciousness of rights and equity issues
       • Offers common scale up language
    10. SDM Scale Up Logic Model: Scaling Up Strategy
       Problem: Gap in availability & access to SDM services
       • INPUTS: staff; partners; funds; CycleBeads
       • PROCESS: TA for systems adjustment; advocacy; capacity building; QA – monitoring & supervision
       • OUTPUTS: providers trained; clinics offering SDM; demand-oriented IEC; supportive policies; systems harmonization
       • OUTCOMES: provider competency; awareness and use; availability of quality services; supportive partners/stakeholders
       Impact: increased, sustained availability of SDM
    11. Operationalizing Scale Up (“Begin with the end in mind”)
       1. Iterative, participatory process with stakeholders to select indicators
       2. Set baselines and targets based on indicators
    12. Defining success in scale up
       • Availability of quality SDM services at national, sub-national and organizational levels
       • Availability of quality SDM services at SDPs
       • Provider capacity
    13. Monitoring benchmark scale-up indicators: develop scale up indicators → develop Access data base → automated country-level and donor reports for program management
    14. SDM scale-up monitoring data base, Microsoft Access 2007
    15. Data sources for M&E and case study
       • Event tracking (timelines)
       • Guided discussions with staff (quarterly)
       • Semi-annual benchmark monitoring
       • Individual interviews with stakeholders (1-3 times)
       • Community surveys & facility assessments (1-2 times)
       • Most Significant Change (MSC) story collection (1-2 times)
    16. M&E Tool Kit for SDM Scale Up
       Monitoring & supervision tools:
       • Benchmark tables
       • FP service statistics
       • Access data base
       • Staff discussion guides
       • Event tracking (timelines)
       • Knowledge Improvement Tool
       • Client follow-up interviews
       Evaluation tools:
       • Household survey instruments
       • Facility Assessment tool
       • Provider interview guide
       • Most Significant Change (MSC) story collection
    17. Quality Assurance Tools: Provider Refresher; Client Follow Up
    18. Most Significant Change stories… start with a question: “Looking back over the last year, what do you think was the most significant change you have experienced as a result of SDM being offered in your community?” And ask why.
    19. Most Significant Change (MSC) provides a different type of information to document and improve scale-up:
       • Scale-up process and outcomes not detected by quantitative monitoring
       • Unanticipated processes/effects
       • Meanings of scale-up process and outcomes to partners, stakeholders, communities
       • Intangible aspects of scale up (advocacy, leadership, gender equity, informed choice)
       [Diagram: project activities → stories of scale up → learning → action]
    20. Sample Evaluation Questions: Scale-Up Outcomes
       • Client: What is the experience of women and men with SDM when scaled-up? (Knowledge, attitudes and use)
       • Service provision: Is SDM offered correctly by providers? How does SDM introduction influence quality, availability and use of overall family planning services?
       • System integration: To what extent has SDM been integrated into training, IEC, procurement and distribution, and HMIS? Is it included in norms, protocols and guidelines?
       • Resource mobilization: What is the level of resources dedicated to SDM?
    21. Sample Evaluation Questions: Scale-up Process
       • Resource team: Do user organizations assume the roles, responsibilities and ownership of the resource team during the scale-up process?
       • Advocacy/dissemination: What is the role of SDM champions? What strategies work best?
       • Organizational choices: Has SDM been offered outside traditional public sector service delivery?
    22. MONITORING & EVALUATION "RESULTS" TO DATE
    23. Baseline Stakeholder Interviews: Health/FP program managers and policymakers in Guatemala (n=20)
       • Political commitment to SDM scale up: Yes, SDM already integrated (norms, training, materials)
       • Political factors in SDM scale up: Some are not convinced a natural method can be modern and effective, or that demand is sufficient; FBOs and community-based NGO networks are strong supporters
       • SDM knowledge/attitudes: Aware of SDM (but lack specifics, esp. efficacy)
       • Ability of MOH to manage SDM scale up: Within their mandate; if there is demand, they will support it
       • Integration of SDM into annual planning/budgeting processes: Not yet; if high SDM demand is proved, it would be integrated
    24. Baseline Provider Interviews/Facility Assessments in Rwanda (n=155 providers, n=109 facilities)
       • SDM integration into norms, guidelines, policies: 2/3 of providers have seen protocols; most are unfamiliar with norms (newly introduced in Rwanda)
       • Status of SDM services (providers): 60% of providers have offered SDM (42% in the last 3 months); 70% have been offering SDM between 1-5 years
       • Correctness of SDM info: Most providers offer SDM competently and do not find SDM counseling difficult
       • Service delivery environment: Providers only have 4-10 minutes for counseling on FP – not enough
       • Status of SDM services (facilities): 91% of visited facilities offering FP offered SDM; CycleBeads found in most; only 17% of facilities displayed FP info (SDM/LAM are integrated into IEC)
    25. Process Tracking Tool: Events Timeline (Rwanda, March 2007 – June 2010)
       [Timeline of key events, including: FAM project begins and Rwanda is picked as a focus country; SDM included in MIS family planning registers, client cards and report templates; training of trainers for SDM; SDM included in the performance-based finance mechanism; project extended in the UNFPA zone; full DHS 2010 includes SDM; PSI Rwanda community-based distribution starts; SDM included in mini-DHS; pre-service training activities begun; national SDM training of trainers with the MOH (1 trainer/2 districts)]
    26. Jharkhand: Snapshot of Progress Toward Benchmarks
       • SDPs that include FAM as part of the method mix: 1,250 (60%)
       • Public or private orgs including FAM in in-service training: 4 (67%)
       • SDM & LAM in IEC activities, materials & mass media: 5 (100%)
    27. Performance Benchmarks: Jharkhand, Selected Indicators as of Jan 2011
       • Proportion of SDPs with SDM in method mix: 1,250/2,100 (60%)
       • Providers trained: 6,700/15,000 (47%)
       • No. of resource orgs: 3/8 (38%)
       • SDM included in key policies, norms, protocols: 2/3 (67%)
       • SDM in pre-service training: initiated
       • Public or private training orgs including SDM in in-service training: 4/6 (67%)
       • Commodities in logistics & procurement systems: in progress
       • SDM in IE&C materials: 5/5 (100%)
       • SDM in HMIS: 1/2
       • SDM in surveys (DHS): under discussion
       • Funds leveraged for SDM: $196,000 (est’d)
    28. Availability of SDM: FP Service Statistics, Jharkhand
       [Chart: number of users by method (Tubectomy, NSV, IUD (CT), Oral Pill, Condom, SDM, LAM), April 2008 – Oct 2010, shown for the first 3 districts, six districts, and 3 additional districts; no data available for Gumla. Jharkhand service data through Jan 2011.]
    29. Monitoring SDM Uptake during Scale Up, Jharkhand
       [Chart: number of SDM and LAM users, April 2008 – Dec 2010, shown for the first 3 districts, six districts, and 3 additional districts. Jharkhand service data through Jan 2011.]
    30. SDM availability: Phased scale up, Jharkhand, India
       • Phase 1, started Jan 2008 (pop: 3,765,983)
       • Phase 2, started Feb 2010 (pop: 2,755,023)
       • Phase 3, started Nov 2010 (pop: 5,520,869)
    31. SDM availability: Phased scale up, Democratic Republic of Congo
       [Map of health zones in the DRC, shaded by when SDM became available: 2003-2008, 2008-2010, or none]
    32. SDM Integration Progress: Policy Environment – Vertical Scale Up (June 2011)
       [Table of progress for DRC, Mali, Rwanda, India and Guatemala on: norms & procedures; training curricula; supervision; health information systems; supply/distribution of CycleBeads; budget line]
    33. SDM Integration Progress: Service Coverage – Horizontal Scale Up (June 2011). Percent of SDPs offering SDM (Jan 2011) vs. the 5-year goal (% of country):
       • DRC: 93% (goal: 75%)
       • Mali: 84% (goal: 90%)
       • Rwanda: 84% (goal: 95%)
       • India: 60% (goal: 50% of Jharkhand’s 22 million population)
       • Guatemala: 48% (3 demonstration departments, 1/6 of country)
    34. Proposed indicators for “graduation” from technical assistance
       • Accomplishment of benchmarks
       • Complete transfer of responsibility to resource organizations for all vertical and horizontal elements
       • Sufficient level of ownership within and across key FP actors/champions and key subsystems
    35. Rebecka Lundgren, Institute for Reproductive Health, Georgetown University, lundgrer@georgetown.edu, www.irh.org
