MEASURE Evaluation Data Quality Assessment Methodology and Tools

Presented by David Boone on July 19, 2012.

Slide notes
  • Another way to assess data quality is through a data quality audit.
  • This chart shows visually that the trace and verify protocol starts with data at the level of service delivery (either in a health facility or a community based program) and how the data are “traced” to the “intermediate aggregation level” (in this case a district) and then to the M&E Unit Level (in this case the national level).
  • These charts show the summary statistics that are automatically generated from the trace and verify protocol of the data quality audit tool. For the reporting period and indicator audited, they show: the verification factor (how the recounted data compare to the reported data), the availability of reports, the timeliness of reports, and the completeness of reports.
  • MEASURE Evaluation Data Quality Assessment Methodology and Tools

    1. 1. MEASURE Evaluation Data Quality Assessment Methodology and Tools. USAID, Washington D.C., July 19, 2012
    2. 2. Background ■ National Programs and donor-funded projects are working towards achieving ambitious goals in the fight against HIV, TB and malaria. ■ Measuring success and improving management of these initiatives is predicated on strong M&E systems that produce quality data regarding program implementation. ■ In the spirit of the “Three Ones”, the “Stop TB Strategy” and the “RBM Global Strategic Plan”, a multi-partner project* was launched in 2006 to develop a joint Data Quality Audit (DQA) Tool. ■ The objective of this initiative is to provide a common approach for assessing and improving data quality. A single tool ensures that standards are harmonized and allows for joint implementation (between partners and with National Programs). *Partners most directly involved include PEPFAR, USAID, WHO, Stop TB, the Global Fund and MEASURE.
    3. 3. Outline
        Auditing – the DQA: evolution; methods and tools; how and where applied.
        Routine data quality assurance and capacity building – the RDQA: purpose; methods and tools; how and where applied; integration into routine practice.
    4. 4. Objective of the Data Quality Audit (DQA) Tool. The Data Quality Audit (DQA) Tool is designed to: verify the quality of reported data for key indicators at selected sites; and assess the ability of data-management systems to collect, manage and report quality data.
    5. 5. Conceptual Framework. Generally, the quality of reported data is dependent on the underlying data management and reporting systems; stronger systems should produce better quality data.
        Dimensions of quality data: accuracy, completeness, reliability, timeliness, confidentiality, precision, integrity.
        Reporting levels: M&E Unit; intermediate aggregation levels (e.g. districts, regions); service points.
        Functional components of a data-management and reporting system needed to ensure data quality, including: M&E structure, functions and capabilities; indicator definitions and reporting guidelines; data-collection and reporting forms / tools; data management processes; data quality mechanisms and controls; links with the national reporting system.
    6. 6. DQA Methodology – 2 Protocols ■ The methodology for the DQA includes two (2) protocols:
        Protocol 1 – Assessment of Data Management Systems: qualitative assessment of the strengths and weaknesses of the data-collection and reporting system.
        Protocol 2 – Data Verifications: quantitative comparison of recounted to reported data and review of the timeliness, completeness and availability of reports.
    7. 7. Typical Timeline ■ The DQA is implemented chronologically in six phases: Phase 1 – Preparation and Initiation; Phase 2 – M&E Management Unit; Phase 3 – Service Delivery Sites / Organizations (multiple locations); Phase 4 – Intermediate Aggregation Levels (e.g. district, region; multiple locations); Phase 5 – M&E Management Unit; Phase 6 – Completion.
        Steps: 1. Select indicators and reporting period. 2. Notify program and conduct desk review of program documentation. 3. Assess data management and reporting systems (Part 1). 4. Select/confirm service delivery points to be visited. 5. Trace and verify reported results (Part 2). 6. Draft initial findings and conduct close-out meeting. 7. Draft and discuss audit report. 8. Initiate follow-up of recommended actions.
        ■ Assessments and verifications take place at every stage of the reporting system: the M&E Management Unit, the Intermediate Aggregation Levels (districts, regions) and the Service Delivery Sites.
    8. 8. PROTOCOL 1: Assessment of Data Management Systems
    9. 9. PROTOCOL 1: Assessment of Data Management Systems (step 3 – assess data management and reporting systems, Part 1). ■ PURPOSE: identify potential risks to data quality created by the data-management and reporting systems at: the M&E Management Unit; the Service Delivery Points; and any Intermediate Aggregation Level (district or region). ■ The DQA assesses both (1) the design and (2) the implementation of the data-management and reporting systems. ■ The assessment covers five functional areas (HR, training, data management processes, etc.).
    10. 10. Systems Assessment Questions by Functional Area
        I. M&E Structure, Functions and Capabilities:
          1. Are key M&E and data-management staff identified with clearly assigned responsibilities?
          2. Have the majority of key M&E and data-management staff received the required training?
        II. Indicator Definitions and Reporting Guidelines:
          3. Has the Program/Project clearly documented (in writing) what is reported to whom, and how and when reporting is required?
          4. Are there operational indicator definitions meeting relevant standards, and are they consistently followed by all service points?
        III. Data-collection and Reporting Forms and Tools:
          5. Are there standard data-collection and reporting forms that are systematically used?
          6. Are data recorded with sufficient precision/detail to measure relevant indicators?
          7. Are data maintained in accordance with international or national confidentiality guidelines?
          8. Are source documents kept and made available in accordance with a written policy?
    11. 11. Systems Assessment Questions by Functional Area (continued)
        IV. Data Management Processes:
          9. Does clear documentation of collection, aggregation and manipulation steps exist?
          10. Are data quality challenges identified and are mechanisms in place for addressing them?
          11. Are there clearly defined and followed procedures to identify and reconcile discrepancies in reports?
          12. Are there clearly defined and followed procedures to periodically verify source data?
        V. Links with National Reporting System:
          13. Does the data collection and reporting system of the Program/project link to the national reporting system?
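    To make the scoring concrete, here is a minimal Python sketch (not taken from the tool) of how the 13 summary questions could be grouped by functional area and averaged into per-area scores like those shown on the dashboards later in the deck. The 0-3 rating scale and the equal weighting of questions are assumptions; the actual DQA/RDQA workbooks encode their own scoring rules.

        # Sketch: scoring the systems-assessment summary questions by functional area.
        # Assumes each question is rated on a 0-3 scale (the dashboards later in the
        # deck plot functional-area scores on a 0-3 axis); the real Excel tool may
        # weight or combine responses differently.
        from statistics import mean

        FUNCTIONAL_AREAS = {
            "I - M&E Structure, Functions and Capabilities": [1, 2],
            "II - Indicator Definitions and Reporting Guidelines": [3, 4],
            "III - Data-collection and Reporting Forms and Tools": [5, 6, 7, 8],
            "IV - Data Management Processes": [9, 10, 11, 12],
            "V - Links with National Reporting System": [13],
        }

        def score_by_area(responses):
            """Average the 0-3 ratings of the summary questions within each area."""
            return {
                area: round(mean(responses[q] for q in questions), 2)
                for area, questions in FUNCTIONAL_AREAS.items()
            }

        # Hypothetical ratings for questions 1-13 at one site
        ratings = {1: 3, 2: 2, 3: 3, 4: 2, 5: 3, 6: 2, 7: 2, 8: 3,
                   9: 2, 10: 1, 11: 2, 12: 2, 13: 3}
        print(score_by_area(ratings))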
    12. 12. PROTOCOL 2: Data Verifications
    13. 13. PROTOCOL 2: Data Verifications (step 5 – trace and verify reported results, Part 2). ■ PURPOSE: assess on a limited scale whether Service Delivery Points and Intermediate Aggregation Sites are collecting and reporting data accurately and on time. ■ The trace and verification exercise takes place in two stages: in-depth verifications at the Service Delivery Points; and follow-up verifications at the Intermediate Aggregation Levels (districts, regions) and at the M&E Unit.
    14. 14. DQA Data Verifications – Illustration. Reported ARV numbers are traced up the reporting pyramid:
        M&E Unit / national monthly report: District 1 – 65; District 2 – 45; District 3 – 75; District 4 – 250; TOTAL – 435.
        District monthly reports: District 1 (SDS 1 – 45, SDS 2 – 20, total 65); District 2 (SDS 3 – 45, total 45); District 3 (SDS 4 – 75, total 75); District 4 (SDP 5 – 50, SDP 6 – 200, total 250).
        Service delivery site monthly reports (ARV number), each backed by a source document: Site 1 – 45; Site 2 – 20; Site 3 – 45; Site 4 – 75; Site 5 – 50; Site 6 – 200.
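    The trace-and-verify logic in this illustration can be expressed as a short consistency check: recount at the sites, compare each district report to the sum of its sites, and compare the national report to the sum of the districts. The sketch below uses the numbers from the slide; the verification-factor calculation (recounted divided by reported) follows the definition given later for the RDQA outputs, and the code itself is illustrative rather than part of the tool.

        # Sketch of the trace-and-verify logic using the illustration's numbers:
        # each district report should equal the sum of its service-site reports,
        # and the M&E Unit/national report should equal the sum of the districts.
        site_reports = {              # ARV numbers reported by each service delivery site
            "SDS 1": 45, "SDS 2": 20, "SDS 3": 45,
            "SDS 4": 75, "SDP 5": 50, "SDP 6": 200,
        }
        district_reports = {"District 1": 65, "District 2": 45,
                            "District 3": 75, "District 4": 250}
        district_sites = {"District 1": ["SDS 1", "SDS 2"], "District 2": ["SDS 3"],
                          "District 3": ["SDS 4"], "District 4": ["SDP 5", "SDP 6"]}
        national_report = 435

        for district, sites in district_sites.items():
            recounted = sum(site_reports[s] for s in sites)
            reported = district_reports[district]
            # Verification factor = recounted / reported (1.00 means perfect agreement)
            print(f"{district}: reported {reported}, recounted {recounted}, "
                  f"VF = {recounted / reported:.2f}")

        recounted_national = sum(district_reports.values())
        print(f"National: reported {national_report}, recounted {recounted_national}, "
              f"VF = {recounted_national / national_report:.2f}")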
    15. 15. Service Delivery Points – Data Verification. Five types of data verifications are performed at the service delivery point:
        Verification 1 – Description (in all cases): describe the connection between the delivery of services/commodities and the completion of the source document that records that service delivery.
        Verification 2 – Documentation Review (in all cases): review the availability and completeness of all indicator source documents for the selected reporting period.
        Verification 3 – Trace and Verification (in all cases): trace and verify reported numbers: (1) recount the reported numbers from available source documents; (2) compare the verified numbers to the site-reported number; (3) identify reasons for any differences.
        Verification 4 – Cross-checks (if feasible): perform “cross-checks” of the verified report totals with other data sources (e.g. inventory records, laboratory reports, etc.).
        Verification 5 – Spot checks (if feasible): perform “spot checks” to verify the actual delivery of services or commodities to the target populations.
    16. 16. Summary Statistics of Data Verifications. [Charts: total and adjusted district verification factors from the DQA, and the percentages of available, on-time and complete reports from the DQA, plotted for each intermediate aggregation level (IAL #1–#4) and the M&E Unit.]
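    A rough sketch of how these four summary statistics could be computed for one reporting level is shown below. The record fields and the choice of denominators (expected versus received reports) are assumptions made for illustration; the DQA tool generates these figures automatically from its trace-and-verify worksheets.

        # Sketch: computing the four summary statistics for one reporting level.
        # The report list and field names are illustrative only.
        from dataclasses import dataclass

        @dataclass
        class Report:
            received: bool      # was the report available at the next level up?
            on_time: bool       # received by the reporting deadline?
            complete: bool      # all required data elements filled in?
            reported: int       # value reported for the audited indicator
            recounted: int      # value recounted from source documents

        reports = [
            Report(True,  True,  True,  65, 60),
            Report(True,  False, True,  45, 45),
            Report(False, False, False,  0,  0),   # missing report
            Report(True,  True,  True, 250, 245),
        ]

        expected = len(reports)
        available = [r for r in reports if r.received]

        verification_factor = (sum(r.recounted for r in available)
                               / sum(r.reported for r in available))
        pct_available = len(available) / expected
        pct_on_time = sum(r.on_time for r in reports) / expected
        pct_complete = sum(r.complete for r in reports) / expected

        print(f"Verification factor: {verification_factor:.2f}")
        print(f"Available: {pct_available:.0%}  On time: {pct_on_time:.0%}  "
              f"Complete: {pct_complete:.0%}")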
    17. 17. Outputs of the DQA Implementation ■ Completed protocols and templates related to: Protocol 1 – System Assessment; and Protocol 2 – Data Verification. ■ Write-ups of observations, interviews, and conversations with key staff. ■ Preliminary findings and recommendation notes. ■ Final DQA Report, including: a summary of the evidence collected by the DQA team; specific findings or gaps related to that evidence; and recommendations to improve data-management systems and overall data quality. The Final DQA Report should also include summary statistics on: 1. strengths and weaknesses of data-collection and reporting systems; 2. the recounting exercise performed on primary records and aggregated reports; 3. available, on-time and complete reports.
    18. 18. Use of the DQA since 2008
    19. 19. Use of DQA (continued). The biggest user is the Global Fund: part of the grant renewal process; random and purposive selection of grants; external audit teams – DQA IQC (1st IQC 2008–2010, 2nd IQC 2011–2012); standardized output; GF reorganization.
    20. 20. Global Fund DQAs 2008–2012: countries audited across HIV/AIDS, TB and malaria grants, by year and quarter:
        2008 Q4: Mali, Belarus, Philippines, Rwanda, China, LAC multi-country (Americas – Andean Region)
        2009 Q1: Comoros Islands, China, Yemen
        2009 Q2: Haiti, Burundi, Niger, Ghana, Indonesia
        2009 Q3: Mozambique, Dominican Republic, Zambia, Gambia, Tanzania, Vietnam, Sri Lanka, Nigeria
        2009 Q4: Peru, Swaziland, South Africa, Thailand, Guyana, Tajikistan, Kenya, Cote d’Ivoire, Madagascar
        2010 Q1: Ethiopia, Paraguay, Uganda, Cameroon, Uzbekistan, Malawi, C.A.R.
        2010 Q2: Papua New Guinea, Russia, Guinea-Conakry, Eritrea, Pakistan, Paraguay, South Sudan, Madagascar
        2011 Q2: Zambia, Kyrgyzstan, North Sudan
        2011 Q4: Angola, Cambodia, Namibia, Guyana, DRC, Senegal, South Sudan, India, North Korea, Myanmar, Ukraine
    21. 21. GF Audits by Indicator (2008–2010): number of audits by disease and program area:
        Malaria – Malaria Treatment: 13
        HIV – Current on ART: 12
        Malaria – LLIN: 10
        HIV – C&T HIV: 9
        HIV – Care and support: 6
        TB – TB Detection: 5
        HIV – PMTCT (pregnant women on prophylaxis): 5
        Malaria – Malaria Training: 4
        TB – TB Treatment: 4
        HIV – HIV Education: 4
        Malaria – IPT: 3
        HIV – Condoms: 3
        Malaria – Malaria Treatment (community based): 2
        Malaria – Malaria Testing: 2
        HIV – OVC: 2
        HIV – HIV Training: 1
        TB – MDR TB Treatment: 1
        HIV – TB/HIV: 1
        Total: 87
    22. 22. Use of DQA (continued). PEPFAR countries: 2009 Country Operational Plan guidance; Mozambique, Ukraine, DRC, Uganda, Nigeria, Cote d’Ivoire, Dominican Republic. Ad hoc use by partners and other agencies: UNICEF in India. Adaptation for local use: South Africa (ESI).
    23. 23. New DQA Dashboards
    24. 24. DQA Performance Analysis – national-level dashboard. Quantitative values are provided automatically; qualitative system-assessment scores are entered by the audit team. Performance scores: Sample Accuracy 77%; OSDV Accuracy 92%; Availability 74%; Timeliness 52%; Completeness 84%; Cross-Checks 79%; Service Site Documentation Review 65%. Qualitative system-assessment scores: M&E Structure, Functions and Capabilities 2.40; Indicator Definitions and Reporting Guidelines 2.50; Data-collection and Reporting Forms / Tools 2.30; Data Management Processes 1.90; Links with National Reporting System 2.80. Weighted final audit performance score: 76.5% (Excellent = PS 91.0–100.0; Good = PS 81.0–90.0; Fair = PS 71.0–80.0; Poor = PS 61.0–70.0; Very Poor = PS < 60.0).
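    The dashboard rolls the quantitative verification results and the qualitative system-assessment scores into a single weighted performance score (76.5% in this example, in the "Fair" band on the slide's scale). The slide does not show the weighting scheme, so the sketch below uses equal weights purely as an illustrative assumption of how such a composite score could be formed; it is not the tool's actual formula.

        # Sketch: combining component scores into a single weighted performance score.
        # The weights are illustrative assumptions only; the slide shows the component
        # values and the final score but not the weighting scheme built into the tool.
        quantitative = {            # data-verification components (0-1)
            "sample_accuracy": 0.77, "osdv_accuracy": 0.92, "availability": 0.74,
            "timeliness": 0.52, "completeness": 0.84, "cross_checks": 0.79,
            "documentation_review": 0.65,
        }
        qualitative = {             # systems-assessment scores, assumed on a 0-3 scale
            "me_structure": 2.40, "indicator_definitions": 2.50,
            "forms_and_tools": 2.30, "data_management": 1.90, "national_links": 2.80,
        }

        components = dict(quantitative)
        components.update({k: v / 3.0 for k, v in qualitative.items()})  # rescale 0-3 to 0-1

        weights = {name: 1.0 for name in components}   # equal weights (assumption)
        score = sum(weights[k] * components[k] for k in components) / sum(weights.values())
        print(f"Weighted performance score: {score:.1%}")   # 'Fair' band is 71.0-80.0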
    25. 25. Evolution of the RDQA tool. A need was expressed for a tool that programs/projects can use for self-assessment, improved data quality, capacity building and preparation for external audits. The result is a new, generic, simplified tool adapted to the country, program and/or project and intended for routine use: quantify and track data quality performance over time; identify and respond to data quality problems.
    26. 26. Use of the Auditing and Self-Assessment/Routine Versions.
        Auditing version: assessment by the funding agency; standardized approach to auditing; conducted by an external audit team; relatively expensive; limited opportunities for capacity building.
        Capacity-building version: simplified from the auditing version; flexible use by the program; self-assessment; focus is on identifying weaknesses for system strengthening; should be used routinely, with data quality performance indicators monitored over time.
    27. 27. RDQA Modifications: no implied sampling methodology; no aggregate sample statistics; quantitative and qualitative assessments merged into the same tool; dashboards by site and level to facilitate data use; easy extraction of data to link with data systems; action plans for system strengthening.
    28. 28. Objectives of the RDQA. VERIFY: the quality of reported data for key indicators at selected sites, and the ability of data-management systems to collect, manage and report quality data. IDENTIFY: weaknesses in the data management system and interventions for system strengthening. MONITOR: data quality performance over time and capacity to produce good quality data.
    29. 29. Uses of the RDQA. The RDQA is designed to be flexible in use and can serve multiple purposes: routine data quality checks as part of ongoing supervision; initial and follow-up assessments of data management and reporting systems to measure performance improvement over time; strengthening program staff’s capacity in data management and reporting; preparation for a formal data quality audit; external assessment by partners and other stakeholders.
    30. 30. RDQA Outputs: 1. Strength of the M&E system: an evaluation based on a review of the program/project’s data management and reporting system, including responses to overall summary questions on how well the system is designed and implemented. 2. Verification factors: generated from the trace-and-verify recounting exercise performed on primary records and/or aggregated reports (i.e. the ratio of the recounted value of the indicator to the reported value). 3. Available, on-time and complete reports: percentages calculated at the intermediate aggregation levels and the M&E Unit. 4. Action plan for system strengthening for each site and level assessed.
    31. 31. RDQA Dashboards
    32. 32. RDQA Summary Statistics – level-specific dashboard (example: National Level – M&E Unit). [Charts: data management assessment radar across the five functional areas; data verification factors for the four audited indicators (ANC Registered, ANC Counseled and Tested, ANC Received Results, ANC Tested Positive), ranging from 94% to 106%; reporting performance: 92% of reports available, 41% on time, 82% complete.]
    33. 33. RDQA Summary Statistics – global dashboard. [Charts: system-assessment results by level of the reporting system and overall average across the five functional areas; verification factors by indicator (overall averages between 90% and 124% for the four ANC indicators) and by level of the reporting system (service site, district, regional and national M&E Unit – Cotonou averages); reporting performance overall (availability 86%, timeliness 51%, completeness 132%) and by reporting level.]
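    As an illustration of how the global dashboard's by-level figures could be derived, the sketch below averages verification factors up from individual results to service-site, district, regional and national levels. The site names, level assignments and values are hypothetical; the RDQA workbook performs this roll-up itself.

        # Sketch: rolling verification factors up to the by-level averages shown on
        # the global dashboard. Names, levels and values are hypothetical.
        from collections import defaultdict
        from statistics import mean

        results = [                 # (level, site, verification factor for one indicator)
            ("Service Site", "CS Parakou", 1.10),
            ("Service Site", "Hopital St. Louis de Kalale", 0.95),
            ("District", "Parakou District", 1.05),
            ("District", "Kalale District", 0.90),
            ("Regional", "Borgou Region", 1.12),
            ("National M&E Unit", "Cotonou", 1.02),
        ]

        by_level = defaultdict(list)
        for level, _site, vf in results:
            by_level[level].append(vf)

        for level, factors in by_level.items():
            print(f"{level} average verification factor: {mean(factors):.2f}")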
    34. 34. RDQA Summary Statistics – site-specific qualitative results
    35. 35. RDQA Output – System Strengthening Action Plan
    36. 36. Data Export tab – easily integrate results from different workbooks
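    One possible way to combine the export tabs from several completed workbooks is sketched below using pandas; the worksheet name "Data Export", the file locations and the column layout are placeholders rather than the RDQA workbook's actual structure.

        # Sketch: combining the export tab from several completed RDQA workbooks.
        # The sheet name, folder and file names are placeholders; the actual tab
        # name and columns in the RDQA workbook may differ.
        import glob
        import pandas as pd

        frames = []
        for path in glob.glob("rdqa_workbooks/*.xlsx"):
            df = pd.read_excel(path, sheet_name="Data Export")
            df["source_workbook"] = path          # keep track of where each row came from
            frames.append(df)

        combined = pd.concat(frames, ignore_index=True)
        combined.to_csv("rdqa_combined_results.csv", index=False)
        print(f"Combined {len(frames)} workbooks into {len(combined)} rows")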
    37. 37. Growing family of RDQA tools:
        RDQA traditional version: one indicator per workbook, cross-sectional.
        RDQA multi-indicator version: up to four indicators for the same program area, cross-sectional.
        RDQA longitudinal version: one indicator for up to four points in time.
        RDQA training materials.
        PMTCT and TB RDQA.
        [Chart: Data Verifications – Service Site Summary: trend in verification ratio by indicator across quarters Q1–Q4 for service sites including CS Parakou, Maternite de bonne saveur Parakou, CS Immacule Conception de Kalale, Hopital St. Louis de Kalale, Hopital Regionale de Savalou, CS de Zongo – Savalou, Centre Hopitalier de Grand Quelquun d’Abomey and CS Behanzin de Abomey.]
    38. 38. Countries where RDQA has been used or is currently being implemented: Nigeria, Kenya, Tanzania, Cote d’Ivoire, Guyana, Botswana, Haiti, Mozambique, Zimbabwe, Uganda, Dominican Republic, Peru, Paraguay, South Africa, Lesotho, Swaziland, DRC, India. Also: Global Fund On-Site Data Verification (OSDV) by LFAs in many countries.
    39. 39. Now we know the quality of the data – now what? Standardization of indicators, data collection and reporting tools, and data management and indicator compilation practices. Capacity building to address identified needs (MEASURE Evaluation data quality assurance curriculum). Formalization of RDQA into routine practice. Monitor data quality performance indicators over time: what is the trend?
    40. 40. Integrating Routine Data Quality Assurance into Standard Operating Procedures. Explicitly state: Who will do what? When will they do it? What is the output? Where will it happen? What is the required follow-up? Who will verify that it is happening? Formalize in official program documentation. Incorporate into pre-service/in-service training.
    41. 41. Integration of RDQA into SOPs Example from Botswana
    42. 42. Key Success Factors for Data Quality: 1. Functioning information systems. 2. Clear definition of indicators, consistently used at all levels. 3. Description of roles and responsibilities at all levels. 4. Specific reporting timelines. 5. Standard/compatible data-collection and reporting forms/tools with clear instructions. 6. Documented data review procedures to be performed at all levels. 7. Steps for addressing data quality challenges (missing data, double-counting, lost to follow-up, etc.). 8. Storage policy and filing practices that allow retrieval of documents for auditing purposes (leaving an audit trail).
    43. 43. MEASURE Evaluation is funded by the U.S. Agency for International Development (USAID) through Cooperative Agreement GPO-A-00-03-00003-00 and is implemented by the Carolina Population Center at the University of North Carolina in partnership with Futures Group, ICF International, John Snow, Inc., Management Sciences for Health, and Tulane University. Visit us online at http://www.cpc.unc.edu/measure.
