Annual Results and Impact Evaluation Workshop for RBF - Day One - Verification of Results Findings and Recommendations from a Cross-Case Analysis


A presentation from the 2014 Annual Results and Impact Evaluation Workshop for RBF, held in Buenos Aires, Argentina.


1. Verification of Results: Findings and Recommendations from a Cross-Case Analysis
Petra Vergeer, Tawab Hashemi, Martin Sabignoso, Olivier Basenya, Catherine Mugeni, Eubert Vushoma, Chenjerai Sisimayi
2. Verification: Essential element of implementation (not to be confused with M&E!)
• Relatively new function
• Verification is the first-order substantiation of results paid for in RBF, including coverage rates or quantities of patients seen, quality of services provided, and patient satisfaction.
• Counter-verification is the second-order substantiation of the above, i.e., it requires that a first-order verification has been carried out and verifies its accuracy.
• Donors and governments are acutely sensitive to the potential for "over-payments" driven by inflated service reporting.
• Avoid actual or apparent conflict of interest: the contracted party has an incentive to over-report, so a separate actor must verify reporting.
3. Different methods used
• Process for ensuring the consistency of routine reporting on the volume (i.e., quantity) of purchased services provided (recount of data)
• Process for confirming with patients the provision of purchased services (patient tracing)
• Direct observation of the conditions of service delivery and of actual care to assess quality
• Assessment of patient satisfaction
• RBF mechanisms often include multiple approaches
4. Objectives of Cross-Case Analysis
• Expand knowledge about verification processes and practices to address the design and implementation needs of RBF projects.
• Add to available knowledge by comparing the characteristics of verification strategies as well as available data on costs (using level of effort as a proxy), savings, and verification results to date in six countries: Afghanistan, Argentina, Burundi, Panama, Rwanda, and the UK.
• Country cases were written with a common outline to describe the major characteristics of the verification method, the verification results, the use of those results, costs, and key lessons and recommendations.
5. Overview of Verification Structures and Interactive Quiz
6. Overview of Verification Structures: Afghanistan, Argentina, Burundi, Rwanda (community RBF)
7. Afghanistan: Verification structure
There are four types of verification activities:
• Quantity of services verified in facilities, conducted quarterly, ex-ante, by a third party, covering 25% of providers;
• Patient tracing, conducted quarterly, ex-ante, by a third party, covering 25% of providers;
• Quality of services assessed by the Provincial Health Office (PHO) jointly with the NGO;
• Counter-verification of quality through health facility assessment by a third party, on a sample basis.
8. Argentina: Verification structure
There are three types of verification activities:
• Beneficiary enrollment verification, conducted monthly, ex-ante, internally at the national level, with electronic data validation of all records and no field visits;
• Beneficiary enrollment counter-verification, conducted every two months, ex-post, by a third party, with electronic data validation of all records and a sample checked to ensure the existence of the enrollment form;
• Tracer indicator verification, conducted every four months, ex-post, by a third party, with data validation of all records and a risk-based sample of health facilities (primarily facilities with higher numbers of patients).
9. Burundi: Verification structure
There are four types of verification activities in Burundi:
• Quantity verification, conducted monthly, ex-ante, jointly by verifiers from the MOH and civil society organizations (public-private partnership), at all sites;
• Technical quality verification, conducted quarterly, ex-ante, internally with civil society engagement, at all sites;
• Patient tracing (including patient satisfaction) as part of the quality score, conducted bi-annually, ex-ante, by a third party, at each facility, on a sample basis (n.b. for quarters with no patient tracing, the previous quarter's score is used in the calculation);
• Counter-verification of quantity, quality, and patient tracing, conducted quarterly, ex-post, by a third party, on a sample basis.
10. Rwanda: Verification structure for community RBF
There are four types of verification activities:
• Verification by the health facility of the quantity of referrals, based on information in submitted reports, conducted monthly, ex-ante, internally, for all indicators and all CHWs. The number of referrals is cross-checked against health center records.
• Quarterly counter-verification by a sector steering committee (mostly comprised of health center staff, but with some community members, and considered independent).
11. Rwanda: Community RBF verification structure (cont.)
• All cooperatives' reports are assessed each quarter for data completeness and timeliness of report submission (internal). The evaluation of cooperative management is carried out by the district hospital; each quarter, 100% of cooperatives are evaluated.
• Verification of the demand-side scheme is not systematic and is integrated into the monitoring of the health center by the district hospital.
12. Interactive Quiz
13. How long does it take to observe improvements in RBF data due to quantity verification?
1. One (1) year
2. Two (2) years
3. Five (5) years
14. Findings: Level of agreement between HMIS and facility data increases over time in Afghanistan
• Structure
o Verification of the quantity of services in facilities is conducted quarterly, ex-ante, by a third party. 25% of providers are sampled each quarter.
• Findings
o Error rates in quantity verification declined from 17% to 8% between 2010 and 2013.
Figure: Trends in the level of agreement (%) between HMIS and facility-level verification data for the quantity of services delivered
15. Findings: Error rates in beneficiary enrollment decline over time in Argentina
• Structure
Beneficiary enrollment:
o Verification through electronic data validation of all records (no field visits), monthly, ex-ante, internally at the national level.
o Counter-verification, every two months, ex-post, by a third party, with electronic data validation of all records and a sample checked to ensure the existence of the enrollment form.
• Findings
o Error rates in beneficiary enrollment declined from 20% to less than 1% in 2 years.
Figure: Counter-verification of beneficiary enrollment (% of records rejected) compared to records submitted by provinces in Argentina, 2004-2012 (Phase 1 and Phase 2 provinces)
16. Findings: Counter-verification shows <10% errors in Burundi after 2 years
• Structure
o Quantity verification, conducted monthly, ex-ante, jointly by verifiers from the MOH and civil society organizations (public-private partnership), at all sites;
o Quantity counter-verification, conducted quarterly, ex-post, by a third party, on a sample basis.
• Findings
o Internal verification found that 31% of declarations for health centers and 38% for hospitals were reported with error. The average error size was 5% for health centers and 4% for hospitals.
o Differences between verification and third-party counter-verification of quantity after 2 years of nationwide PBF implementation were small (<1%) for health centers. Differences for hospitals were substantially larger (9%) but can be explained in part by a lack of standardized registers among hospitals.
17. Findings: Decline in size of indicator over-reporting in Rwanda
• Structure
o Verification by the health facility of the quantity of referrals, based on information in submitted reports, is done on a monthly basis, ex-ante, by an internal verifier. All indicators and all CHWs are verified. The number of referrals is cross-checked against health center records.
• Findings
o The percentage of service indicators that contained error did not change dramatically, with 49% of indicator reports inaccurate in Q4 2012 (23% over-reported and 26% under-reported). However, the size of the error for over-reporting declined substantially (from over 140% to around 7%) in two years' time.
o This refers to the discrepancies between performance as self-assessed by CHWs at the cell level and performance after the verification process is complete.
18. After 2 years of implementation, what was the average percentage of traced patients who could not be found?
1. 20%-30% of all patients traced could not be found
2. 10% of patients traced from health centers and 15% of patients traced from hospitals could not be found
3. <10% of patients traced from health centers and 15% of patients traced from hospitals could not be found
19. Findings: Percentage of missing patients reduced over time in Afghanistan
• Structure
o Patient tracing is conducted quarterly, ex-ante, by a third party. 25% of providers are sampled.
• Findings
o "Missing patients" reduced from 33% to 7% between 2010 and 2013.
Figure: Trends in the level of agreement (%) between HMIS and community-level verification data for the quantity of services delivered
20. Findings: Most patients traced from health centers but more difficult in hospitals in Burundi
• Findings
o In Q1 2012, 7.4% of patients traced from health centres (see graph below) and 15.4% of patients traced from hospitals were not found.
o More than 98% of those found (for both health centers and hospitals) confirmed receiving the services recorded.
o Counter-verification of patient tracing is performed, but with a newly drawn sample of patients, so no comparison can be made between verification and counter-verification.
• Structure
o Patient tracing (including patient satisfaction), conducted bi-annually, ex-ante, by a third party, at each facility, on a sample basis;
o Counter-verification of patient tracing, conducted quarterly, ex-post, by a third party, on a sample basis.
Figure: Health centre patient tracing results, 2011-2012
21. Findings: National study in Rwanda identifies most patients in community
• Structure
o A national study, to be distinguished from the regular patient tracing used in Burundi and Afghanistan, was conducted by the MoH in 2012 and included, among other things, patient tracing in the community.
• Findings
o 97% of the patients could be identified in the community.
o Of those found, 97% confirmed having been treated at the facility for the services for which the CHW referred them.
o In addition, 98% of eligible women confirmed having received in-kind incentives.
22. What are the links between patient confidentiality and verification?
1. There are no concerns about patient confidentiality in verification
2. Concerns about protecting patient confidentiality result in the exclusion of certain indicators from verification (e.g., family planning)
3. Electronic checks of records help to protect patient confidentiality because patient data are de-identified
4. 2 and 3 are correct
23. Findings: Indicators excluded because of patient confidentiality concerns in Burundi
• Findings – Burundi
o Only 9 of 22 indicators at the health center level and 8 of 24 at the hospital level are verified as part of patient tracing, for confidentiality reasons. Indicators on HIV, tuberculosis, and family planning are excluded.
o As a result, the existence of "phantom patients" for almost half of the health centre indicators, and one third of hospital indicators, is never assessed.
24. What types of indicators have high error rates?
1. Indicators with high patient volume
2. Indicators with complex definitions
3. Indicators with registration difficulties
4. Indicators with a high incentive attached
5. 1 and 2 are correct
6. 1, 2, and 3 are correct
25. Findings: Indicators with a high rate of occurrence and with complex definitions have higher error rates in Burundi
• Findings
o Indicators with a high rate of occurrence have the highest level of error. These are indicators where the risk of errors when counting may be greater.
o Indicators with complicated definitions also have higher error rates: facilities have a greater chance of counting a service that does not match the definition and that will not be validated by the verification team.
o Indicators with high incentives do not have high error rates.
Table: 10 indicators with the highest error rates identified during verification in health centers (January-August 2012)
Indicator | % accuracy of declared data | n
Consultation (child) | 22% | 4,085
Consultation (adult) | 30% | 4,085
Observation day (child) | 44% | 3,802
Consultation (pregnant woman) | 44% | 4,062
Small surgery | 53% | 3,987
Family planning | 58% | 3,458
Completely vaccinated child | 67% | 4,065
Anti-tetanus vaccination (TT2-TT5) | 68% | 4,055
Observation day (adult) | 69% | 2,896
Prenatal consultation | 67% | 4,074
26. Findings: Indicators with a high rate of occurrence have higher error rates in Rwanda
• Findings
o Malnutrition monitoring had the highest error rate in Rwanda, which may have been due to the large number of children involved.
Table: Percentage of inaccurate reports detected by the health centers (comparison between cell and sector reports) and by the sector steering committees (comparison between sector reports and the national database) in the 4 sectors visited, for 8 paid indicators, during Q4 2010, Q4 2011, and Q4 2012
Indicator | Health center: % inaccurate indicators | No. of reports | Steering committee: % inaccurate indicators | No. of reports
Woman accompanied for delivery | 51% | 35 | 14% | 35
Woman accompanied for antenatal care | 43% | 35 | 14% | 35
Patients accompanied for VCT | 49% | 35 | 20% | 35
Children monitored for nutrition status | 59% | 34 | 29% | 34
Family planning users referred | 23% | 35 | 23% | 35
TB cases followed per month | 23% | 35 | 37% | 35
TB suspects referred | 37% | 35 | 41% | 34
Women referred for PMTCT | 52% | 33 | 36% | 33
Total | 42% | 277 | 27% | 276
27. Findings: Higher error rates are associated with registration difficulties in Argentina
• Structure
o Tracer indicator verification is conducted every four months, ex-post, by a third party. Data validation is conducted for all records, and a risk-based sample of health facilities is selected (primarily facilities with higher numbers of patients).
• Findings
o Higher error rates are associated with:
- Lack of registration tools (tracer VII)
- Weak adherence to registration norms (tracer IX)
- More than one source of data needing to be used (tracer IV)
Table: Error rates >20% and >40% of declared results, by tracer indicator, identified through counter-verification, 2008-2012
Tracer | % of results with error rates >20% of declared | % of results with error rates >40% of declared
I Early detection of pregnant women | 23% | 3%
II Effectiveness of childbirth and neonatal care | 6% | 1%
III Effectiveness of prenatal care and prevention of prematurity | 7% | 1%
IV Effectiveness of prenatal and delivery care | 19% | 5%
V Case assessments in child and maternal deaths out of all child and maternal deaths | 5% | 3%
VI Immunization coverage | 10% | 1%
VII Sexual and reproductive care | 18% | 6%
VIII Tracking healthy child up to 1 year | 11% | 3%
IX Tracking healthy children between 1 and 6 years | 24% | 7%
X Inclusion of the indigenous population | 6% | 2%
28. Why is there a large difference between verification and counter-verification of quality?
1. Time delay between verification and counter-verification
2. Potential conflict of interest between those assessing quality and those contracted to provide services
3. Objectivity of the measuring tool is compromised
4. Sanctions for discrepancies between verification and counter-verification are not applied
5. All of the above are correct
29. Findings: Afghanistan quality verification
• The National Monitoring Checklist (NMC) is used for quality verification.
o Interviewees understand that payment is linked to quality.
o However, they are not necessarily clear about which specific indicators are linked to the quality payment (e.g., clinic infrastructure, facility health information system (HMIS) data, and essential drugs from the NMC are all part of the indicators making up the payment for quality).
• NGO and MoPH-SM supervisors fill in the NMC checklist as part of their routine supervision visits to health facilities.
o However, PPHOs often do not join the health facility visits, which can lead to conflict of interest.
• The BSC is used for verification at provincial-level hospitals.
o Due to delays in the implementation of the BSC, bonus payments to hospitals were also delayed.
• The BSC was also intended as a way of triangulating the NMC results at the provincial level.
o Due to delays in the implementation of the BSC, this has not been operationalized.
30. Findings: Burundi quality verification
• There is a systematic difference between quality verification and counter-verification (in 79% of health centers and 84% of hospitals).
• Technical quality was overestimated (by 11% in health centres and by 17% in hospitals).
• Overestimation can be explained by three factors:
o Time lag between verification and counter-verification
o The counter-verification team is more rigorous
o Possible conflict of interest, as peer hospitals review other hospitals and provincial health teams verify their own health facilities
• Sanctions were not applied for discrepancies found during counter-verification.
Table: Difference between the technical quality assessment performed by the BPS, BDS, or peers and counter-verification by HDP in all hospitals and health centres counter-verified during the 8 counter-verification rounds, 2010-2012
| Average difference | % with over-estimation | Average over-estimation | % with under-estimation | Average under-estimation | n
Health centres | -11% | 79% | -20% | 21% | 24% | 101
Hospitals | -17% | 84% | -24% | 16% | 20% | 32
31. Key Recommendations
1. Consider context to determine whether merging functions is appropriate (be mindful of conflict of interest)
2. Analyze and use the data available from verification and counter-verification
3. Verification strategies should be dynamic, not static, and use a risk-based approach
32. Factors influencing verification: a conceptual framework
The framework links context, RBF characteristics, and verification characteristics to their impact on the accuracy, cost, and sustainability of a chosen verification approach.
• Context: political environment; governance culture
• RBF characteristics:
o Rationale for RBF: improving health outcomes/HSS vs. financial accountability/cost control
o Contract type: relational vs. classic
o Use of RBF results: payment and improving performance vs. transparency (naming and shaming)
o Payment frequency: monthly vs. annual
• Verification characteristics:
o Frequency: monthly vs. annual
o Allowable error margin: large vs. small
o Sample size: whole universe vs. risk-based approach
o Institutional setup: internal vs. third party
o Advance warning: yes vs. no
• Verification results and their use: learning and error correction vs. cost recovery and sanction
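As a rough illustration only, the framework's verification dimensions could be captured for a given scheme as a small data structure. The sketch below is a hypothetical Python encoding; the field names, enums, and example values are assumptions for illustration and are not part of the framework or any country tooling.

```python
from dataclasses import dataclass
from enum import Enum

class Setup(Enum):
    INTERNAL = "internal"
    THIRD_PARTY = "third party"

class SampleSize(Enum):
    WHOLE_UNIVERSE = "whole universe"
    RISK_BASED = "risk-based approach"

@dataclass
class VerificationCharacteristics:
    """One possible encoding of the framework's verification dimensions."""
    frequency_months: int          # e.g., 1 = monthly, 12 = annual
    allowable_error_margin: float  # e.g., 0.05 for a 5% tolerance
    sample_size: SampleSize
    institutional_setup: Setup
    advance_warning: bool

# Hypothetical example: quarterly third-party verification with a
# risk-based sample, a 5% tolerance, and no advance warning.
example = VerificationCharacteristics(
    frequency_months=3,
    allowable_error_margin=0.05,
    sample_size=SampleSize.RISK_BASED,
    institutional_setup=Setup.THIRD_PARTY,
    advance_warning=False,
)
print(example)
```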
33. When and how to change your verification strategy: examine quantity verification error rates
Step 1: Examine quantity verification error rates for two kinds of patterns, each pointing to different activities to explore.
• Facility-level patterns:
o Generalized across all contracted parties
o Localized to specific facilities (by geographic area)
o Localized to specific types of facilities
• Indicator-level patterns:
o Indicators with complex compliance criteria
o Indicators that are rewarded more frequently and/or have higher patient volume
o Indicators rewarded at a higher level
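To make the idea of scanning for facility-level versus indicator-level error patterns concrete, here is a minimal Python sketch. The record format, threshold, and helper names are assumptions for illustration, not part of the case-study tooling.

```python
from collections import defaultdict

# Hypothetical verification results:
# (facility, indicator, declared_count, verified_count)
records = [
    ("HF1", "consultation_child", 120, 100),
    ("HF1", "prenatal_consultation", 80, 78),
    ("HF2", "consultation_child", 95, 90),
    ("HF2", "prenatal_consultation", 60, 59),
]

def error_rate(declared, verified):
    """Relative over- or under-reporting versus the verified count."""
    return (declared - verified) / verified if verified else 0.0

def mean_abs_error_by(key_index):
    """Average absolute error rate grouped by facility (0) or indicator (1)."""
    groups = defaultdict(list)
    for rec in records:
        groups[rec[key_index]].append(abs(error_rate(rec[2], rec[3])))
    return {key: sum(vals) / len(vals) for key, vals in groups.items()}

THRESHOLD = 0.10  # flag groups whose average absolute error exceeds 10% (assumed cutoff)

by_facility = mean_abs_error_by(0)
by_indicator = mean_abs_error_by(1)
print("Facilities to explore:", [f for f, e in by_facility.items() if e > THRESHOLD])
print("Indicators to explore:", [i for i, e in by_indicator.items() if e > THRESHOLD])
```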
34. Risk-based sampling for verification
• Using a risk-based sampling approach is likely more cost-effective.
• Sample contracted parties (e.g., facilities) with selection criteria such as:
o Higher volume (as in Argentina)
o Outliers in performance relative to provincial or national averages (as in the UK)
• Sample indicators with selection criteria such as:
o Higher volume (possibly more prone to error, as in the Burundi and Rwanda examples)
o More complex (possibly more prone to error, as in the Argentina example)
o Higher $ value
• Always ensure a credible threat of verification remains for all contracted parties.
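As a rough sketch of how such criteria could be combined, the Python snippet below scores hypothetical facilities on volume and on deviation from a national average, samples the highest-risk ones, and keeps a small random draw so every facility still faces a credible threat of verification. The facility names, weights, and threshold are illustrative assumptions, not figures from the country cases.

```python
import random

# Hypothetical facility data: reported volume and a performance score.
facilities = {
    "HF1": {"volume": 4100, "performance": 0.92},
    "HF2": {"volume": 950,  "performance": 0.61},
    "HF3": {"volume": 3200, "performance": 0.88},
    "HF4": {"volume": 700,  "performance": 0.95},
}

NATIONAL_AVG = 0.85  # assumed national average performance

def risk_score(info, max_volume):
    """Combine relative volume with distance from the national average (equal weights assumed)."""
    volume_term = info["volume"] / max_volume
    outlier_term = abs(info["performance"] - NATIONAL_AVG)
    return 0.5 * volume_term + 0.5 * outlier_term

max_volume = max(f["volume"] for f in facilities.values())
ranked = sorted(facilities,
                key=lambda name: risk_score(facilities[name], max_volume),
                reverse=True)

targeted = ranked[:2]                                   # highest-risk facilities
remaining = [f for f in facilities if f not in targeted]
random_check = random.sample(remaining, 1)              # credible threat for the rest
print("Verify this round:", targeted + random_check)
```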
35. Risk-based verification: Zimbabwe model
• Model based on three risk levels
• Comparison between declared and verified values for 6-month totals
• Green category (difference within 5%): verified on a quarterly basis
• Amber category (difference above 5% but below or equal to 10%): verified bi-monthly, in 2 randomly selected months
• Red category (difference above 10%): verified on a monthly basis; also incorporates new facilities
Figure: Difference between declared and verified 6-month totals for sample facilities (HF1-HF6)
36. Example at district level
• Districts have a mix of facilities at different risk levels.
Table: Sample facilities (Mutoko District)
Health Facility | Total Declared | Total Verified | % Difference
Hoyuyu 1 | 1011 | 1016 | 0%
Matedza | 344 | 325 | 6%
Kawazva | 327 | 417 | -28%
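A minimal sketch of the arithmetic behind this table and the Zimbabwe risk bands above: the percentage difference is computed here as (declared - verified) / declared, an assumption that reproduces the rounded values shown (0%, 6%, -28%); the function names and the exact rounding rule are also assumptions.

```python
def pct_difference(declared, verified):
    """Percentage difference, computed relative to the declared total (assumed convention)."""
    return (declared - verified) / declared * 100

def risk_category(declared, verified):
    """Zimbabwe-style banding on the absolute difference: Green <=5%, Amber <=10%, Red >10%."""
    diff = abs(pct_difference(declared, verified))
    if diff <= 5:
        return "Green"   # verified quarterly
    if diff <= 10:
        return "Amber"   # verified in 2 randomly selected months
    return "Red"         # verified monthly

# Figures from the Mutoko District example above.
sample = {"Hoyuyu 1": (1011, 1016), "Matedza": (344, 325), "Kawazva": (327, 417)}
for facility, (declared, verified) in sample.items():
    print(facility, round(pct_difference(declared, verified)), "%",
          risk_category(declared, verified))
# Expected output (rounded): 0% Green, 6% Amber, -28% Red
```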
37. Total facilities by risk category
Figure: Number of facilities by category (Red: 85, Amber: 63, Green: 244)
38. Areas for further research in verification
• Costs, savings, and cost-effectiveness of verification and counter-verification
• Application of technology for verification and potential for cost savings
• Ensuring patient confidentiality is protected
• Patient tracing
• Measuring quality
39. Conclusions
• The conceptual framework is intended to assist RBF implementers and policymakers in their deliberations about the consequences of various verification characteristics for the accuracy, cost, and sustainability of a chosen approach.
• Verification strategies are not static but should be a dynamic process; the pathway tool can provide guidance on how to adapt verification strategies.
• While there is no single verification method optimal for all settings, the recommendations provided are useful to consider in different contexts.
40. References
• Cashin, Cheryl and Lisa Fleisher. Verification of performance in results-based financing: the case of Afghanistan. World Bank: Washington, DC. Forthcoming.
• Cashin, Cheryl and Petra Vergeer. (2013). Verification in results-based financing: the case of the United Kingdom. World Bank: Washington, DC. https://openknowledge.worldbank.org/handle/10986/13567
• Perazzo, Alfredo. Verification of performance in results-based financing: the case of Panama. World Bank: Washington, DC. Forthcoming.
• Perazzo, Alfredo and Erik Josephson. Verification of performance in results-based financing: the case of Argentina. World Bank: Washington, DC. Forthcoming.
• Renaud, Adrien. Verification of performance in results-based financing: the case of Burundi. World Bank: Washington, DC. Forthcoming.
• Renaud, Adrien and Jean-Paul Semasaka. Verification of performance in results-based financing: the case of the Rwanda community RBF interventions: community PBF and demand-side scheme. World Bank: Washington, DC. Forthcoming.
• Vergeer, Petra, Anna Heard, Erik Josephson, and Lisa Fleisher. Verification in results-based financing for health: findings and recommendations from a cross-case analysis. World Bank: Washington, DC. Forthcoming.
41. Thank You
