This document provides instructions for conducting a Routine Data Quality Assessment (RDQA) to evaluate program or project monitoring and evaluation systems. The RDQA involves selecting levels of the data reporting system to review, identifying indicators and time periods to assess, conducting site visits using checklists, and developing an action plan to strengthen areas of weakness. Site visits include verifying reported data against source documents, assessing data management processes, and providing recommendations. Dashboards then summarize findings to prioritize improvements to data quality.
Improving Data for Decision-Making: Leveraging Data Quality Audits in Haryana... (HFG Project)
Resource Type: Report
Authors: Gajinder Pal Singh, Jordan Tuchman, and Michael P. Rodriguez
Published: May 31, 2014
Resource Description:
The Government of India has prioritized 184 of the 640 districts in the country for focused maternal and child health interventions under an integrated program called the Reproductive, Maternal, Neonatal, Child and Adolescent Health (RMNCH+A) initiative. A key factor in the success of this initiative is the ability of the government to effectively track health outcomes through the routine collection of data from service delivery points across the high-priority districts.
The National Rural Health Mission (NRHM) is responsible for monitoring RMNCH+A indicators across the country and has leveraged the rollout of a web-based national health management information system (HMIS) for this purpose. In several states, another information system – the web-based District Health Information System (DHIS) 2.0 – is used. A number of reviews of the data produced through the national HMIS have indicated that there are data quality issues. However, few reviews of data quality take place across NRHM facilities and no systematic assessment mechanism is currently in place.
To address these issues, the Haryana State NRHM partnered with the HFG Project to conduct a data quality audit (DQA) across four of the state’s high-priority districts. The DQA exercise took place in December 2013, beginning with a presentation of the methodology to a cadre of Haryana NRHM Program Managers, HMIS Officers, and District Monitoring and Evaluation (M&E) Officers. Presented here is a brief summary of the findings and recommendations, sorted by the five domains evaluated in the DQA exercise.
1. Routine Data Quality Assessment (RDQA)
Checklist to Assess Program/Project Data Quality
Number of Regional Aggregation Sites: 1
Number of District Aggregation Sites: 1
Number of Service Delivery Sites: 1
Version: Jan 2010
Important notes for the use of this spreadsheet:
1. In order to use the Routine Data Quality Assessment tool you will need to ensure that your 'macro security' is set to something less than 'high'. With the spreadsheet open, go to the 'Tools' pull-down menu and select 'Macro', then 'Security'. Select 'medium'. Close Excel and re-open the file. The next time you open the file you will have to select 'Enable Macros' for the application to work as designed.
2. On the START Page (this page), please select the number of intermediate aggregation sites (IAS) and Service Delivery Points (SDPs) that you plan to review from the dropdown lists above. IAS are typically the district-level health units of the Ministry of Health.
2. B – INSTRUCTIONS FOR USE OF THE RDQA
1. Determine Purpose
The RDQA checklist can be used for:
- Initial assessment of M&E systems established by new implementing partners (or in decentralized systems) to collect, manage and report data.
- Routine supervision of data management and reporting systems and data quality at various levels. For example, routine supervision visits may include checking on a certain time period's worth of data (e.g. one day, one week or one month) at the service site level, whereas periodic assessments (e.g. quarterly, biannually or annually) could be carried out at all levels to assess the functioning of the entire program/project's M&E system.
- Periodic assessment by donors of the quality of data being provided to them (this use of the DQA could be more frequent and more streamlined than official data quality audits that use the DQA for Auditing, but less frequent than routine monitoring of data).
- Preparation for a formal data quality audit.
The RDQA is flexible for all of these uses. Countries and programs are encouraged to adapt the checklist to fit local program contexts.
2. Level/Site Selection
Select levels and sites to be included (depending on the purpose and resources available). Once the purpose has been determined, the second step in the RDQA is to decide what levels of the data-collection and reporting system will be included in the assessment - service sites, intermediate aggregation levels, and/or the central M&E unit. The levels should be determined once the appropriate reporting levels have been identified and "mapped" (e.g., there are 100 sites providing the services in 10 districts; reports from sites are sent to districts, which then send aggregated reports to the M&E Unit). In some cases, the data flow will include more than one intermediate level (e.g. regions, provinces or states, or multiple levels of program organizations). A sketch of one such mapping follows.
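For illustration only, here is a minimal Python sketch of such a mapping, reduced to a handful of sites; the site and district names and report counts are invented, not part of the tool.

```python
# Hypothetical reporting hierarchy for planning an RDQA.
# Names and counts are invented for illustration.
reporting_map = {
    "District A": ["Site 1", "Site 2"],
    "District B": ["Site 3", "Site 4"],
}
site_reports = {"Site 1": 40, "Site 2": 25, "Site 3": 30, "Site 4": 5}

# Reports flow upward: each district aggregates its sites' reports and
# forwards a single aggregated report to the central M&E Unit.
district_totals = {
    district: sum(site_reports[site] for site in sites)
    for district, sites in reporting_map.items()
}
central_total = sum(district_totals.values())
print(district_totals)   # {'District A': 65, 'District B': 35}
print(central_total)     # 100
```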
3. Identify indicators, data sources and reporting period.
The RDQA is designed to assess the quality of data and underlying systems related to indicators that are reported to programs or donors. It is necessary to select one or more indicators - or at least program areas - to serve as the subject of the RDQA. This choice will be based on the list of reported indicators. For example, a program focusing on treatment for HIV may report indicators of the number of people on ART. Another program may focus on meeting the needs of orphans and vulnerable children, so the indicators for that program would be drawn from the OVC program area. A malaria program might focus on providing insecticide-treated bed nets (ITNs), on treating people for malaria, or on both of those activities.
4. Conduct site visits. During the site visits, the relevant sections of the appropriate checklists in the Excel file are filled out (e.g. the service site checklist at service sites). These checklists are completed following interviews of relevant staff and reviews of site documentation. Using the drop-down lists on the HEADER page of this workbook, select the appropriate number of Intermediate Aggregation Levels (IALs) and Service Delivery Points (SDPs) to be reviewed. The appropriate number of worksheets will automatically appear in the RDQA workbook (up to 12 SDPs and 4 IALs).
5. Review outputs and findings. The RDQA outputs need to be reviewed for each site visited. Site-specific summary findings, in the form of recommendations, are noted at each site visited. The RDQA checklists exist in MS Excel format and responses can be entered directly into the spreadsheets on the computer. Alternatively, the checklists can be printed and completed by hand. When completed electronically, a dashboard produces graphics of summary statistics for each site and level of the reporting system.
The dashboard displays two (2) graphs for each site visited:
- A spider graph displays qualitative data generated from the assessment of the data-collection and reporting system and can be used to prioritize areas for improvement.
- A bar chart shows the quantitative data generated from the data verifications; these can be used to plan for data quality improvement.
In addition, a 'Global Dashboard' shows statistics aggregated across and within levels to highlight overall strengths and weaknesses in the reporting system. The Global Dashboard shows a spider graph for qualitative assessments and a bar chart for quantitative assessments, as above. In addition, strengths and weaknesses of the reporting system are displayed as dimensions of data quality in a 100% stacked bar chart. For this analysis, questions are grouped by the applicable dimension of data quality (e.g. accuracy or reliability) and the number of responses by type of response (e.g. 'Yes - completely', 'Partly', etc.) are plotted as a percentage of all responses, as sketched below. A table of survey questions and their associated dimensions of data quality can be found on the 'Dimensions of data quality' tab in this workbook.
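To make the stacked-bar logic concrete, here is a minimal, hypothetical Python sketch of grouping responses by data quality dimension and expressing each response type as a share of that dimension's responses. The dimension assignments and answers below are invented; the authoritative groupings are on the 'Dimensions of data quality' tab.

```python
from collections import Counter

# Hypothetical (dimension, answer) pairs; real question-to-dimension
# groupings come from the 'Dimensions of data quality' tab.
responses = [
    ("Accuracy", "Yes - completely"), ("Accuracy", "Partly"),
    ("Accuracy", "Yes - completely"), ("Reliability", "No - not at all"),
    ("Reliability", "Partly"), ("Reliability", "Partly"),
]

# Tally answers per dimension, then express each answer type as a share
# of that dimension's responses (the 100% stacked bar chart logic).
by_dimension: dict[str, Counter] = {}
for dimension, answer in responses:
    by_dimension.setdefault(dimension, Counter())[answer] += 1

for dimension, counts in by_dimension.items():
    total = sum(counts.values())
    shares = {answer: f"{100 * n / total:.0f}%" for answer, n in counts.items()}
    print(dimension, shares)
# Accuracy {'Yes - completely': '67%', 'Partly': '33%'}
# Reliability {'No - not at all': '33%', 'Partly': '67%'}
```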
6. Develop a systems strengthening plan, including follow-up actions. The final output of the RDQA is an action plan for improving data quality which describes the identified strengthening measures, the staff responsible, the timeline for completion, the resources required and follow-up. Using the graphics and the detailed comments for each question, weakly performing functional areas of the reporting system can be identified. Program staff can then outline strengthening measures (e.g. training, data reviews), assign responsibilities and timelines, and identify resources using the Action Plan tab in this workbook.
3. C – BACKGROUND INFORMATION – RDQA
Country:
Name of Program/project:
Indicator Reviewed:
Reporting Period Verified:
Assessment Team: Name | Title | Email
Primary contact:
M&E Management Unit at Central Level
Name of Site | Facility Code | Date (mm/dd/yy)
1-
Regional Level Aggregation Sites
Name of Site | Facility Code | Region | Region Code | Date (mm/dd/yy)
1
District Level Aggregation Sites
Name of Site | Facility Code | District | District Code | Region | Region Code | Date (mm/dd/yy)
1
Service Delivery Points (SDPs)
Name of Site | Facility Code | District | District Code | Region | Region Code | Date (mm/dd/yy)
1
4. Data Verification and System Assessment Sheet - Service Delivery Point (Service Point 1)
Service Delivery Point/Organization: -
Region and District: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -
Answer Codes: Yes - completely / Partly / No - not at all / N/A
Columns: Component of the M&E System | Answer | Reviewer Comments (Please provide detail for each response not coded "Yes - completely". Detailed responses will help guide strengthening measures.)
Part 1: Data Verifications
A - Documentation Review: Review availability and completeness of all indicator source documents for the selected reporting period.
1. Review available source documents for the reporting period being verified. Is there any indication that source documents are missing? If yes, determine how this might have affected reported numbers.
2. Are all available source documents complete? If no, determine how this might have affected reported numbers.
3. Review the dates on the source documents. Do all dates fall within the reporting period? If no, determine how this might have affected reported numbers.
B - Recounting Reported Results: Recount results from source documents, compare the verified numbers to the site-reported numbers and explain discrepancies (if any).
4. Recount the number of people, cases or events during the reporting period by reviewing the source documents. [A]
5. Enter the number of people, cases or events reported by the site during the reporting period from the site summary report. [B]
6. Calculate the ratio of recounted to reported numbers. [A/B] (a worked sketch follows item 10 below)
7. What are the reasons for the discrepancy (if any) observed (e.g., data entry errors, arithmetic errors, missing source documents, other)?
C - Cross-check Reported Results with Other Data Sources: Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test kits or ITNs purchased and delivered during the reporting period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying whether these patients were recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the Register and from the Register to Patient Treatment Cards).
8. List the documents used for performing the cross-checks.
9. Describe the cross-checks performed.
10. What are the reasons for the discrepancy (if any) observed?
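As a worked illustration of items 4-6 and the Section C cross-checks, the Python sketch below uses invented figures and patient IDs; the workbook itself performs these calculations in the spreadsheet.

```python
# Item 4 [A]: result recounted from the source documents (invented figure).
recounted = 95
# Item 5 [B]: result reported by the site in its summary report (invented).
reported = 100

# Item 6: verification factor = A / B. A value below 100% means the site
# reported more than the source documents support (over-reporting); above
# 100% suggests under-reporting.
print(f"Verification factor: {recounted / reported:.0%}")  # 95%

# Section C: cross-check in both directions between two data sources,
# e.g. sampled Patient Treatment Cards versus the facility Register.
patient_cards = {"P001", "P002", "P003"}   # invented patient IDs
register = {"P002", "P003", "P004"}
print("On a card but missing from the register:", patient_cards - register)
print("In the register but without a card:", register - patient_cards)
```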
5. Part 2: Systems Assessment
I - M&E Structure, Functions and Capabilities
1. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
2. The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
3. All relevant staff have received training on the data management processes and tools.
II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4. … what they are supposed to report on.
5. … how (e.g., in what specific format) reports are to be submitted.
6. … to whom the reports should be submitted.
7. … when the reports are due.
III - Data-collection and Reporting Forms and Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.
10. … The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).
12. The data collected on the source document have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if the indicator specifies disaggregation by these characteristics).
IV - Data Management Processes
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification, etc.).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. … If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, a person registered as receiving the same service in two different locations, etc.).
18. The reporting system enables the identification and recording of a "drop out", a person "lost to follow-up" and a person who died.
V - Links with National Reporting System
19. When available, the relevant national forms/tools are used for data collection and reporting.
20. When applicable, data are reported through a single channel of the national information system.
21. The system records information about where the service is delivered (i.e., region, district, ward, etc.).
22. … If yes, place names are recorded using standardized naming conventions.
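The spider graph on the dashboard (Part 4, below) summarizes the five components above on a 0.00-3.00 scale. As a rough, hypothetical sketch of how such component averages could be computed, the snippet below assumes a mapping of answer codes to scores (Yes - completely = 3, Partly = 2, No - not at all = 1, N/A excluded); the mapping actually used by the workbook's macros is an assumption here, not taken from the source.

```python
# Hypothetical scoring sketch for the Part 2 systems assessment.
# ASSUMPTION: answer codes map to 3/2/1 on the dashboard's 0-3 scale;
# the workbook's real macro logic may differ. N/A answers are excluded.
SCORES = {"Yes - completely": 3, "Partly": 2, "No - not at all": 1}

# Invented answers, grouped by system component (questions 1-22).
answers_by_component = {
    "I - M&E Structure, Functions and Capabilities":
        ["Yes - completely", "Partly", "Yes - completely"],
    "II - Indicator Definitions and Reporting Guidelines":
        ["Partly", "Partly", "No - not at all", "N/A"],
}

# One spider-graph axis value per component: the mean of its scored answers.
for component, answers in answers_by_component.items():
    scored = [SCORES[a] for a in answers if a in SCORES]
    mean = sum(scored) / len(scored) if scored else 0.0
    print(f"{component}: {mean:.2f}")
# I - M&E Structure, Functions and Capabilities: 2.67
# II - Indicator Definitions and Reporting Guidelines: 1.67
```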
6. Part 3: Recommendations for the Service Site
Based on the findings of the systems review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.
Identified Weaknesses | Description of Action Point | Responsible(s) | Timeline
1
2
3
4
Part 4: DASHBOARD: Service Delivery Point
[Dashboard charts: a spider graph, "Data Management Assessment - Service Delivery Point", plots the average score (0.00-3.00) for each of the five system components (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data-collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System); a bar chart, "Data and Reporting Verifications - Service Delivery Point", plots the Verification Factor on a percentage axis (0%-1200%).]
7. Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization: -
Region and District: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -
Answer Codes:
Yes - completely REVIEWER COMMENTS
Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed
No - not at all responses will help guide strengthening measures. )
N/A
Part 1: Data Verifications
A - Documentation Review:
Review availability and completeness of all indicator source documents for
the selected reporting period.
Review available source documents for the reporting period being verified. Is
there any indication that source documents are missing?
1
If yes, determine how this might have affected reported numbers.
Are all available source documents complete?
2
If no, determine how this might have affected reported numbers.
Review the dates on the source documents. Do all dates fall within the
reporting period?
3
If no, determine how this might have affected reported numbers.
B - Recounting reported Results:
Recount results from source documents, compare the verified numbers to the
site reported numbers and explain discrepancies (if any).
Recount the number of people, cases or events during the reporting period by
4
reviewing the source documents. [A]
Enter the number of people, cases or events reported by the site during the
5
reporting period from the site summary report. [B]
6 Calculate the ratio of recounted to reported numbers. [A/B] -
What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
errors, arithmetic errors, missing source documents, other)?
C - Cross-check reported results with other data sources:
Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).
8 List the documents used for performing the cross-checks.
9 Describe the cross-checks performed?
10 What are the reasons for the discrepancy (if any) observed?
Service Point 2 Page 7
8. Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
There are designated staff responsible for reviewing aggregated numbers
1 prior to submission to the next level (e.g., to districts, to regional offices, to
the central M&E Unit).
The responsibility for recording the delivery of services on source documents
2
is clearly assigned to the relevant staff.
All relevant staff have received training on the data management processes
3
and tools.
II- Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4 ,,, what they are supposed to report on.
5 … how (e.g., in what specific format) reports are to be submitted.
6 … to whom the reports should be submitted.
7 … when the reports are due.
III - Data-collection and Reporting Forms and Tools
Clear instructions have been provided by the M&E Unit on how to complete
8
the data collection and reporting forms/tools.
The M&E Unit has identified standard reporting forms/tools to be used by all
9
reporting levels
10 ….The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the
11 indicator(s) are available for auditing purposes (including dated print-outs in
case of computerized system).
The data collected on the source document has sufficient precision to
12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
the indicator specifies desegregation by these characteristics).
IV- Data Management Processes
If applicable, there are quality controls in place for when data from paper-
13 based forms are entered into a computer (e.g., double entry, post-data entry
verification, etc).
If applicable, there is a written back-up procedure for when data entry or data
14
processing is computerized.
….if yes, the latest date of back-up is appropriate given the frequency of
15
update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international
16
confidentiality guidelines.
The recording and reporting system avoids double counting people within and
across Service Delivery Points (e.g., a person receiving the same service
17
twice in a reporting period, a person registered as receiving the same service
in two different locations, etc).
The reporting system enables the identification and recording of a "drop out",
18
a person "lost to follow-up" and a person who died.
V - Links with National Reporting System
When available, the relevant national forms/tools are used for data-collection
19
and reporting.
When applicable, data are reported through a single channel of the national
20
information systems.
The system records information about where the service is delivered (i.e.
21
region, district, ward, etc.)
22 ….if yes, place names are recorded using standarized naming conventions.
Service Point 2 Page 8
9. Part 3: Recommendations for the Service Site
Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.
Identified Weaknesses Description of Action Point Responsible(s) Time Line
1
2
3
4
Part 4: DASHBOARD: Service Delivery Point
Data Management Assessment - Service Delivery Point Data and Reporting Verifications -
Service Delivery Point
1200%
I - M&E Structure,
Functions and Capabilities
3.00 1000%
2.00
II- Indicator 800%
V - Links with
Definitions and
National
Reporting 1.00
Reporting System
Guidelines
600%
0.00
400%
200%
III - Data-collection IV- Data Management
and Reporting Forms Processes
and Tools
0%
Verification Factor
Service Point 2 Page 9
10. Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization: -
Region and District: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -
Answer Codes:
Yes - completely REVIEWER COMMENTS
Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed
No - not at all responses will help guide strengthening measures. )
N/A
Part 1: Data Verifications
A - Documentation Review:
Review availability and completeness of all indicator source documents for
the selected reporting period.
Review available source documents for the reporting period being verified. Is
there any indication that source documents are missing?
1
If yes, determine how this might have affected reported numbers.
Are all available source documents complete?
2
If no, determine how this might have affected reported numbers.
Review the dates on the source documents. Do all dates fall within the
reporting period?
3
If no, determine how this might have affected reported numbers.
B - Recounting reported Results:
Recount results from source documents, compare the verified numbers to the
site reported numbers and explain discrepancies (if any).
Recount the number of people, cases or events during the reporting period by
4
reviewing the source documents. [A]
Enter the number of people, cases or events reported by the site during the
5
reporting period from the site summary report. [B]
6 Calculate the ratio of recounted to reported numbers. [A/B] -
What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
errors, arithmetic errors, missing source documents, other)?
C - Cross-check reported results with other data sources:
Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).
8 List the documents used for performing the cross-checks.
9 Describe the cross-checks performed?
10 What are the reasons for the discrepancy (if any) observed?
Service Point 3 Page 10
11. Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
There are designated staff responsible for reviewing aggregated numbers
1 prior to submission to the next level (e.g., to districts, to regional offices, to
the central M&E Unit).
The responsibility for recording the delivery of services on source documents
2
is clearly assigned to the relevant staff.
All relevant staff have received training on the data management processes
3
and tools.
II- Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4 ,,, what they are supposed to report on.
5 … how (e.g., in what specific format) reports are to be submitted.
6 … to whom the reports should be submitted.
7 … when the reports are due.
III - Data-collection and Reporting Forms and Tools
Clear instructions have been provided by the M&E Unit on how to complete
8
the data collection and reporting forms/tools.
The M&E Unit has identified standard reporting forms/tools to be used by all
9
reporting levels
10 ….The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the
11 indicator(s) are available for auditing purposes (including dated print-outs in
case of computerized system).
The data collected on the source document has sufficient precision to
12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
the indicator specifies desegregation by these characteristics).
IV- Data Management Processes
If applicable, there are quality controls in place for when data from paper-
13 based forms are entered into a computer (e.g., double entry, post-data entry
verification, etc).
If applicable, there is a written back-up procedure for when data entry or data
14
processing is computerized.
….if yes, the latest date of back-up is appropriate given the frequency of
15
update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international
16
confidentiality guidelines.
The recording and reporting system avoids double counting people within and
across Service Delivery Points (e.g., a person receiving the same service
17
twice in a reporting period, a person registered as receiving the same service
in two different locations, etc).
The reporting system enables the identification and recording of a "drop out",
18
a person "lost to follow-up" and a person who died.
V - Links with National Reporting System
When available, the relevant national forms/tools are used for data-collection
19
and reporting.
When applicable, data are reported through a single channel of the national
20
information systems.
The system records information about where the service is delivered (i.e.
21
region, district, ward, etc.)
22 ….if yes, place names are recorded using standarized naming conventions.
Service Point 3 Page 11
12. Part 3: Recommendations for the Service Site
Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.
Identified Weaknesses Description of Action Point Responsible(s) Time Line
1
2
3
4
Part 4: DASHBOARD: Service Delivery Point
Data Management Assessment - Service Delivery Point Data and Reporting Verifications -
Service Delivery Point
1200%
I - M&E Structure,
Functions and Capabilities
3.00 1000%
2.00
II- Indicator 800%
V - Links with
Definitions and
National
Reporting 1.00
Reporting System
Guidelines
600%
0.00
400%
200%
III - Data-collection IV- Data Management
and Reporting Forms Processes
and Tools
0%
Verification Factor
Service Point 3 Page 12
13. Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization: -
Region and District: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -
Answer Codes:
Yes - completely REVIEWER COMMENTS
Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed
No - not at all responses will help guide strengthening measures. )
N/A
Part 1: Data Verifications
A - Documentation Review:
Review availability and completeness of all indicator source documents for
the selected reporting period.
Review available source documents for the reporting period being verified. Is
there any indication that source documents are missing?
1
If yes, determine how this might have affected reported numbers.
Are all available source documents complete?
2
If no, determine how this might have affected reported numbers.
Review the dates on the source documents. Do all dates fall within the
reporting period?
3
If no, determine how this might have affected reported numbers.
B - Recounting reported Results:
Recount results from source documents, compare the verified numbers to the
site reported numbers and explain discrepancies (if any).
Recount the number of people, cases or events during the reporting period by
4
reviewing the source documents. [A]
Enter the number of people, cases or events reported by the site during the
5
reporting period from the site summary report. [B]
6 Calculate the ratio of recounted to reported numbers. [A/B] -
What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
errors, arithmetic errors, missing source documents, other)?
C - Cross-check reported results with other data sources:
Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).
8 List the documents used for performing the cross-checks.
9 Describe the cross-checks performed?
10 What are the reasons for the discrepancy (if any) observed?
Service Point 4 Page 13
14. Part 2. Systems Assessment
I - M&E Structure, Functions and Capabilities
There are designated staff responsible for reviewing aggregated numbers
1 prior to submission to the next level (e.g., to districts, to regional offices, to
the central M&E Unit).
The responsibility for recording the delivery of services on source documents
2
is clearly assigned to the relevant staff.
All relevant staff have received training on the data management processes
3
and tools.
II- Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4 ,,, what they are supposed to report on.
5 … how (e.g., in what specific format) reports are to be submitted.
6 … to whom the reports should be submitted.
7 … when the reports are due.
III - Data-collection and Reporting Forms and Tools
Clear instructions have been provided by the M&E Unit on how to complete
8
the data collection and reporting forms/tools.
The M&E Unit has identified standard reporting forms/tools to be used by all
9
reporting levels
10 ….The standard forms/tools are consistently used by the Service Delivery Site.
All source documents and reporting forms relevant for measuring the
11 indicator(s) are available for auditing purposes (including dated print-outs in
case of computerized system).
The data collected on the source document has sufficient precision to
12 measure the indicator(s) (i.e., relevant data are collected by sex, age, etc. if
the indicator specifies desegregation by these characteristics).
IV- Data Management Processes
If applicable, there are quality controls in place for when data from paper-
13 based forms are entered into a computer (e.g., double entry, post-data entry
verification, etc).
If applicable, there is a written back-up procedure for when data entry or data
14
processing is computerized.
….if yes, the latest date of back-up is appropriate given the frequency of
15
update of the computerized system (e.g., back-ups are weekly or monthly).
Relevant personal data are maintained according to national or international
16
confidentiality guidelines.
The recording and reporting system avoids double counting people within and
across Service Delivery Points (e.g., a person receiving the same service
17
twice in a reporting period, a person registered as receiving the same service
in two different locations, etc).
The reporting system enables the identification and recording of a "drop out",
18
a person "lost to follow-up" and a person who died.
V - Links with National Reporting System
When available, the relevant national forms/tools are used for data-collection
19
and reporting.
When applicable, data are reported through a single channel of the national
20
information systems.
The system records information about where the service is delivered (i.e.
21
region, district, ward, etc.)
22 ….if yes, place names are recorded using standarized naming conventions.
Service Point 4 Page 14
15. Part 3: Recommendations for the Service Site
Based on the findings of the systems’ review and data verification at the service site, please describe any challenges to data quality identified and recommended strengthening
measures, with an estimate of the length of time the improvement measure could take. These will be discussed with the Program.
Identified Weaknesses Description of Action Point Responsible(s) Time Line
1
2
3
4
Part 4: DASHBOARD: Service Delivery Point
Data Management Assessment - Service Delivery Point Data and Reporting Verifications -
Service Delivery Point
1200%
I - M&E Structure,
Functions and Capabilities
3.00 1000%
2.00
II- Indicator 800%
V - Links with
Definitions and
National
Reporting 1.00
Reporting System
Guidelines
600%
0.00
400%
200%
III - Data-collection IV- Data Management
and Reporting Forms Processes
and Tools
0%
Verification Factor
Service Point 4 Page 15
16. Data Verification and System Assessment Sheet - Service Delivery Point
Service Delivery Point/Organization: -
Region and District: -
Indicator Reviewed: -
Date of Review: -
Reporting Period Verified: -
Answer Codes:
Yes - completely REVIEWER COMMENTS
Component of the M&E System Partly (Please provide detail for each response not coded "Yes - Completely". Detailed
No - not at all responses will help guide strengthening measures. )
N/A
Part 1: Data Verifications
A - Documentation Review:
Review availability and completeness of all indicator source documents for
the selected reporting period.
Review available source documents for the reporting period being verified. Is
there any indication that source documents are missing?
1
If yes, determine how this might have affected reported numbers.
Are all available source documents complete?
2
If no, determine how this might have affected reported numbers.
Review the dates on the source documents. Do all dates fall within the
reporting period?
3
If no, determine how this might have affected reported numbers.
B - Recounting reported Results:
Recount results from source documents, compare the verified numbers to the
site reported numbers and explain discrepancies (if any).
Recount the number of people, cases or events during the reporting period by
4
reviewing the source documents. [A]
Enter the number of people, cases or events reported by the site during the
5
reporting period from the site summary report. [B]
6 Calculate the ratio of recounted to reported numbers. [A/B] -
What are the reasons for the discrepancy (if any) observed (i.e., data entry
7
errors, arithmetic errors, missing source documents, other)?
C - Cross-check reported results with other data sources:
Cross-checks can be performed by examining separate inventory records documenting the quantities of treatment drugs, test-kits or ITNs purchased and delivered during the reporting
period to see if these numbers corroborate the reported results. Other cross-checks could include, for example, randomly selecting 20 patient cards and verifying if these patients were
recorded in the unit, laboratory or pharmacy registers. To the extent relevant, the cross-checks should be performed in both directions (for example, from Patient Treatment Cards to the
Register and from Register to Patient Treatment Cards).
8 List the documents used for performing the cross-checks.
9 Describe the cross-checks performed?
10 What are the reasons for the discrepancy (if any) observed?
Service Point 5 Page 16
Part 2: Systems Assessment
I - M&E Structure, Functions and Capabilities
1. There are designated staff responsible for reviewing aggregated numbers prior to submission to the next level (e.g., to districts, to regional offices, to the central M&E Unit).
2. The responsibility for recording the delivery of services on source documents is clearly assigned to the relevant staff.
3. All relevant staff have received training on the data management processes and tools.
II - Indicator Definitions and Reporting Guidelines
The M&E Unit has provided written guidelines to each sub-reporting level on …
4. … what they are supposed to report on.
5. … how (e.g., in what specific format) reports are to be submitted.
6. … to whom the reports should be submitted.
7. … when the reports are due.
III - Data Collection and Reporting Forms and Tools
8. Clear instructions have been provided by the M&E Unit on how to complete the data collection and reporting forms/tools.
9. The M&E Unit has identified standard reporting forms/tools to be used by all reporting levels.
10. … The standard forms/tools are consistently used by the Service Delivery Site.
11. All source documents and reporting forms relevant for measuring the indicator(s) are available for auditing purposes (including dated print-outs in the case of a computerized system).
12. The data collected on the source documents have sufficient precision to measure the indicator(s) (i.e., relevant data are collected by sex, age, etc., if the indicator specifies disaggregation by these characteristics).
IV - Data Management Processes
13. If applicable, there are quality controls in place for when data from paper-based forms are entered into a computer (e.g., double entry, post-data-entry verification; see the sketch after this section).
14. If applicable, there is a written back-up procedure for when data entry or data processing is computerized.
15. … If yes, the latest date of back-up is appropriate given the frequency of update of the computerized system (e.g., back-ups are weekly or monthly).
16. Relevant personal data are maintained according to national or international confidentiality guidelines.
17. The recording and reporting system avoids double counting people within and across Service Delivery Points (e.g., a person receiving the same service twice in a reporting period, or a person registered as receiving the same service in two different locations).
18. The reporting system enables the identification and recording of a "drop-out", a person "lost to follow-up", and a person who died.
V - Links with National Reporting System
19. When available, the relevant national forms/tools are used for data collection and reporting.
20. When applicable, data are reported through a single channel of the national information system.
21. The system records information about where the service is delivered (e.g., region, district, ward).
22. … If yes, place names are recorded using standardized naming conventions.
Part 3: Recommendations for the Service Site
Based on the findings of the systems review and data verification at the service site, please describe any identified challenges to data quality and recommended strengthening measures, with an estimate of the length of time each improvement measure could take. These will be discussed with the Program.
Identified Weaknesses | Description of Action Point | Responsible(s) | Time Line
(Rows 1-4 are blank, to be completed during the site visit.)
Part 4: DASHBOARD: Service Delivery Point
[Two summary charts appear here. Left, "Data Management Assessment - Service Delivery Point": a radar chart plotting the average score (axis 0.00-3.00) for each of the five systems-assessment domains (I - M&E Structure, Functions and Capabilities; II - Indicator Definitions and Reporting Guidelines; III - Data Collection and Reporting Forms and Tools; IV - Data Management Processes; V - Links with National Reporting System). Right, "Data and Reporting Verifications - Service Delivery Point": a bar chart plotting the Verification Factor on a percentage axis (0%-1200%).]
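The radar-chart values are derived by coding each Part 2 answer and averaging within each domain; the bar chart shows the Part 1 item-6 ratio as a percentage. A minimal sketch of the domain average, under an assumed coding of Yes = 3, Partly = 2, No = 1 (the sheet does not state the numeric coding, so treat this as illustrative):

```python
# Assumed coding (not stated on the sheet): Yes = 3, Partly = 2, No = 1;
# "N/A" answers are excluded from the average.
CODES = {"Yes - completely": 3, "Partly": 2, "No - not at all": 1}

def domain_score(answers):
    """Average coded score for one domain, on the radar chart's 0.00-3.00 axis."""
    coded = [CODES[a] for a in answers if a in CODES]
    return round(sum(coded) / len(coded), 2) if coded else None

# Example: domain IV (items 13-18) for a hypothetical service point.
print(domain_score(["Yes - completely", "N/A", "Partly",
                    "Yes - completely", "No - not at all", "Partly"]))  # -> 2.2
```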
The Data Verification and System Assessment Sheet above (header fields, Answer Codes, Parts 1 through 4, and the dashboard) is repeated verbatim for each remaining site in the sample, Service Points 6 through 9, with one copy completed per Service Delivery Point.