Data Quality
Review (DQR)
Framework
Review of quality of
health facility data
2
Why is health facility data important?
 For many indicators, it is the only
continuous/frequent source of data
 It is often the only data source available at the
subnational level -- important for equity
 For many key indicators, it is the sole source of
data -- for example, PMTCT, ART, TB treatment
outcomes, TB notification, confirmed malaria
cases, and causes of death
3
Quality of health facility data – why do we care?
 High-quality data provide evidence to providers
and managers to optimize healthcare coverage,
quality, and services.
 High-quality data help:
― Form an accurate picture of health needs, programs, and
services in specific areas
― Inform appropriate planning and decision making
― Inform effective and efficient allocation of resources
― Support ongoing monitoring, by identifying best practices
and areas where support and corrective measures are
needed
4
Most common problems affecting data quality
 Lack of guidelines for filling out the main data
sources and reporting forms
 Personnel not adequately trained
 Misunderstanding of how to compile data,
use tally sheets, and prepare reports
 Non-standardized source documents and reporting
forms
 Arithmetic errors during data compilation
 Lack of a review process before reports are
submitted to the next level
5
Many tools have been used to address data quality
 GAVI DQA
 WHO Immunization Data Quality Self-assessment
(DQS)
 Global Fund/MEASURE Evaluation DQA
 RDQA - Self assessment version of DQA (with in-
country adaptations)
 Global Fund OSDV
 PRISM
 WHO Data Quality Report Card (DQRC)
6
DQR - harmonized approach to assessing and
improving data quality
 The DQR is a multi-pronged, multi-partner framework for
country-led data quality assurance that proposes a
harmonized approach to assessing data quality
 It builds on the earlier program-specific quality tools and
methods while proposing the examination of data quality
in a more systemic way that can meet the needs of
multiple stakeholders
 It also includes the examination of existing facility data
(requiring no additional data collection), an element
missing from earlier tools
 It provides valuable information on fitness for purpose to
support the Health Sector Strategic Planning Cycle (e.g.
health sector or program reviews)
7
Why are we recommending a harmonized
approach to data quality?
 Data quality is a systems issue - multiple assessments for
different diseases/programs are inefficient and burdensome
for the health system
 Can we satisfy the needs in data quality assurance of all
stakeholders with one holistic data quality assessment?
 The application of a standard framework to evaluate data
quality enables the understanding of the adequacy of
routine data used for health sector planning – can we link
data quality assessment to health planning efforts?
 Permits stakeholders to know that the routine data have
undergone a known minimum level of scrutiny which lends
credibility and confidence in the data
8
DQR Framework
9
Recommended Core Program Indicators
Program Area | Indicator Name | Full Indicator
Maternal Health | Antenatal care 1st visit (ANC1) | Number (%) of pregnant women who attended ANC at least once during their pregnancy
Immunization | DTP3/Penta3 | Number (%) of children < 1 year receiving three doses of DTP/Penta vaccine
HIV/AIDS | ART coverage | Number (%) of people living with HIV who are currently receiving ART
TB | Notified cases of all forms of TB | Number (%) of all forms of TB cases (i.e. bacteriologically confirmed plus clinically diagnosed) reported to the national health authority in the past year (new and relapse)
Malaria | Confirmed malaria cases | Number (%) of all suspected malaria cases that were confirmed by microscopy or RDT
10
DQR methodology
(Diagram: DQR components)
11
Domains of Data Quality
Data Quality
Review (DQR)
Desk review
13
 Completeness and timeliness
― Completeness of reports
― Completeness of data
― Timeliness of reports
 Internal consistency
― Accuracy
― Outliers
― Trends
― Consistency between indicators
 External consistency
― Data triangulation
― Comparison with survey data
― Consistency of population trends
 External comparisons (population denominators)
Metrics for Data Quality Performance
14
Completeness and Timeliness of Data
This examines the extent to which:
― Data reported through the system are available and adequate for the intended purpose
― All entities that are supposed to report are actually reporting
― Data elements in submitted reports are complete
― Reports are submitted/received on time through the levels of the information system data flow
15
Completeness and Timeliness of Data
• Completeness of reports (%) =
(# total reports available or received) /
(# total reports expected) × 100
• Completeness of indicator data (%) =
(# indicator values entered (not missing) in the report) /
(# total expected indicator values) × 100
• Timeliness (%) =
(# reports submitted or received on time) /
(# total reports available or received) × 100
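The three metrics above can be sketched in code. This is an illustrative computation, not the official DQR tool; the record fields (`received`, `on_time`, `values_entered`) are assumed names for the example.

```python
# Illustrative sketch of the three completeness/timeliness metrics,
# computed from simple per-report records (field names are assumptions).

def pct(numerator, denominator):
    """Percentage, guarding against a zero denominator."""
    return 100.0 * numerator / denominator if denominator else 0.0

def completeness_metrics(reports, expected_reports, expected_values_per_report):
    """reports: list of dicts like
    {"received": True, "on_time": True, "values_entered": 11}."""
    received = [r for r in reports if r["received"]]
    # Completeness of reports: received vs. expected
    report_completeness = pct(len(received), expected_reports)
    # Timeliness: on-time reports vs. reports received
    timeliness = pct(sum(1 for r in received if r["on_time"]), len(received))
    # Completeness of indicator data: values entered vs. values expected
    data_completeness = pct(
        sum(r["values_entered"] for r in received),
        expected_reports * expected_values_per_report,
    )
    return report_completeness, timeliness, data_completeness

reports = [
    {"received": True, "on_time": True, "values_entered": 12},
    {"received": True, "on_time": False, "values_entered": 10},
    {"received": False, "on_time": False, "values_entered": 0},
]
print(completeness_metrics(reports, expected_reports=3, expected_values_per_report=12))
```

Note that timeliness, as defined on the slide, is computed against reports *received*, not reports expected, so a facility that never reports does not lower the timeliness rate (only the completeness rate).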
16
Internal Consistency of Reported Data
This dimension examines:
― The accuracy of reporting of selected indicators, by reviewing source documents
― Whether data are free of outliers (within bounds), by assessing whether specific reported values within the selected period (such as monthly) are extreme relative to the other values reported
― Trends in reporting over time, to identify extreme or implausible year-to-year values
― Program indicators compared with other indicators with which they have a predictable relationship, to determine whether the expected relationship exists between the two indicators
17
Internal Consistency: Outliers
Metric: Outliers (analyze each indicator separately)
Extreme outliers (at least 3 standard deviations from the mean):
― National level: % of monthly subnational unit values that are extreme outliers
― Subnational level: # (%) of subnational units in which ≥1 of the monthly subnational unit values over the course of 1 year is an extreme outlier
Moderate outliers (between 2 and 3 standard deviations from the mean, or >3.5 on the modified Z-score method):
― National level: % of subnational unit values that are moderate outliers
― Subnational level: # (%) of subnational units in which ≥2 of the monthly subnational unit values over the course of 1 year are moderate outliers
18
Example: Outliers in a Given Year
Dist  | Month: 1   2    3    4    5    6    7    8    9    10   11   12  | Total Outliers | % Outliers
A     | 2543 2482 2492 2574 3012 2709 3019 2750 3127 2841 2725 2103 | 1 | 8.3%
B     | 1184 1118 1195 1228 1601 1324 1322  711 1160 1178 1084 1112 | 2 | 16.7%
C     |  776  541  515  527  857  782  735  694  687  628  596  543 | 0 | 0%
D     | 3114 2931 2956 4637 6288 4340 3788 3939 3708 4035 3738 3606 | 1 | 8.3%
E     | 1382 1379 1134 1378 1417 1302 1415 1169 1369 1184 1207 1079 | 0 | 0%
Nat'l |    0    0    0    0    2    0    0    1    0    0    0    1 | 4 | 6.7%
Months with at least one moderate outlier on the district monthly reports are shown in red.
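The outlier rules above can be sketched as follows. This is an illustrative implementation of the two flagging rules (standard-deviation distance from the mean, and the modified Z-score with its conventional 0.6745 scaling factor), not the DQR tool itself; district D's monthly series from the example table is used as input.

```python
# Illustrative sketch of the DQR outlier checks on one district's
# 12 monthly values.
from statistics import mean, pstdev, median

def sd_outliers(values, threshold):
    """Indices of values at least `threshold` standard deviations
    from the mean of the series."""
    m, sd = mean(values), pstdev(values)
    if sd == 0:
        return []
    return [i for i, v in enumerate(values) if abs(v - m) / sd >= threshold]

def modified_z_outliers(values, threshold=3.5):
    """Indices flagged by the modified Z-score,
    0.6745 * (x - median) / MAD, where MAD is the median
    absolute deviation from the median."""
    med = median(values)
    mad = median(abs(v - med) for v in values)
    if mad == 0:
        return []
    return [i for i, v in enumerate(values)
            if abs(0.6745 * (v - med) / mad) > threshold]

district_d = [3114, 2931, 2956, 4637, 6288, 4340, 3788, 3939, 3708, 4035, 3738, 3606]
print(sd_outliers(district_d, 3))        # extreme outliers (≥3 SD)
print(sd_outliers(district_d, 2))        # moderate and extreme (≥2 SD)
print(modified_z_outliers(district_d))   # modified Z-score rule
```

On this series the month-5 value (6288) is flagged as a moderate but not an extreme outlier, matching the one moderate outlier shown for district D in the example table; the modified Z-score rule flags the same value.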
19
Internal Consistency: Trends Over Time
Metric: Trends/consistency over time (analyze each indicator separately)
National level: conduct one of the following, based on the indicator's expected trend:
• Indicators or programs with expected growth: compare the current year to the value predicted from the trend in the 3 preceding years
• Indicators or programs expected to remain constant: compare the current year to the average of the 3 preceding years
Then depict the trend graphically to determine plausibility based on programmatic knowledge.
Subnational level: # (%) of districts whose ratio of current year to predicted value (or current year to average of the preceding 3 years) differs from the national ratio by at least 33%
20
Example: Trends over Time
District  | 2010  2011  2012  2013   | Mean of Preceding 3 Years (2010-2012) | Ratio of 2013 to Mean of 2010-2012 | % Difference between National and District Ratios
A         | 30242 29543 26848 32377  | 28878 | 1.12 | 0.03
B         | 19343 17322 16232 18819  | 17632 | 1.07 | 0.08
C         |  7512  7701  7403  7881  |  7539 | 1.05 | 0.09
D         | 15355 15047 14788 25123  | 15063 | 1.67 | 0.44
E         | 25998 23965 24023 24259  | 24662 | 0.98 | 0.16
National  | 98450 93578 89294 108459 | 93774 | 1.16 |
Consistency trend: comparison of district ratios to the national ratio.
Any difference between district and national ratios that is ≥33% is highlighted in red.
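The district-versus-national comparison above can be sketched in a few lines. This is an illustrative computation (not the DQR tool) for an indicator expected to remain constant, so each current-year value is compared to the mean of the 3 preceding years; districts whose ratio diverges from the national ratio by at least 33% are flagged.

```python
# Illustrative sketch of the trend-consistency check, using the
# example table's values (4 years per district: 3 preceding + current).

def trend_flags(history, threshold=0.33):
    """history: {district: [y1, y2, y3, current]};
    returns {district: fractional divergence} for flagged districts."""
    # National series: sum across districts for each year
    nat = [sum(vals[i] for vals in history.values()) for i in range(4)]
    nat_ratio = nat[3] / (sum(nat[:3]) / 3)
    flagged = {}
    for district, vals in history.items():
        ratio = vals[3] / (sum(vals[:3]) / 3)
        diff = abs(ratio - nat_ratio) / nat_ratio
        if diff >= threshold:
            flagged[district] = round(diff, 2)
    return flagged

history = {
    "A": [30242, 29543, 26848, 32377],
    "B": [19343, 17322, 16232, 18819],
    "C": [7512, 7701, 7403, 7881],
    "D": [15355, 15047, 14788, 25123],
    "E": [25998, 23965, 24023, 24259],
}
print(trend_flags(history))  # only district D diverges by ≥33%
```

As in the example table, district D (ratio 1.67 against a national ratio of about 1.16) is the only district flagged.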
21
Internal Consistency: Comparing Related Indicators
Metric: Consistency among related indicators
― Maternal Health: ANC1 vs. IPT1 or TT1 (should be roughly equal). Subnational: # (%) of subnational units where there is an extreme difference (≥ ±10%)
― Immunization: DTP3 dropout rate = (DTP1 - DTP3)/DTP1 (should not be negative). Subnational: # (%) of subnational units with # of DTP3 immunizations > DTP1 immunizations (negative dropout)
― HIV/AIDS: ART coverage vs. HIV coverage (ratio should be <1). Subnational: # (%) of subnational units where there is an extreme difference (≥ ±10%)
― TB: TB cases notified vs. TB cases on treatment (should be roughly equal). Subnational: # (%) of subnational units where there is an extreme difference (≥ ±10%)
― Malaria: # confirmed malaria cases reported vs. cases testing positive (should be roughly equal). Subnational: # (%) of subnational units where there is an extreme difference (≥ ±10%)
22
Example: Internal Consistency
District  | ANC1  | IPT1  | Ratio of ANC1 to IPT1 | % Difference between National & District Ratios
A         | 20995 | 18080 | 1.16 | 0.02
B         | 18923 | 16422 | 1.15 | 0.02
C         |  7682 |  6978 | 1.10 | 0.07
D         | 12663 |  9577 | 1.32 | 0.12
E         | 18214 | 15491 | 1.18 | 0
National  | 78477 | 66548 | 1.18 |
% difference between district and national ANC1:IPT1 ratios.
Districts with % difference ≥10% are flagged in red.
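The related-indicator check can be sketched the same way. This is an illustrative computation (not the DQR tool): each district's ANC1:IPT1 ratio is compared to the national ratio, and districts differing by at least 10% are flagged.

```python
# Illustrative sketch of the related-indicator consistency check,
# using the ANC1/IPT1 counts from the example table.

def related_indicator_flags(anc1, ipt1, threshold=0.10):
    """anc1, ipt1: {district: count};
    returns {district: fractional divergence from the national ratio}."""
    nat_ratio = sum(anc1.values()) / sum(ipt1.values())
    flagged = {}
    for d in anc1:
        diff = abs(anc1[d] / ipt1[d] - nat_ratio) / nat_ratio
        if diff >= threshold:
            flagged[d] = round(diff, 2)
    return flagged

anc1 = {"A": 20995, "B": 18923, "C": 7682, "D": 12663, "E": 18214}
ipt1 = {"A": 18080, "B": 16422, "C": 6978, "D": 9577, "E": 15491}
print(related_indicator_flags(anc1, ipt1))  # flags district D
```

As in the example table, only district D (ratio 1.32 against a national ratio of about 1.18) exceeds the 10% threshold.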
23
External Consistency with Other Data Sources
This dimension examines the level of agreement
between two sources of data measuring the same
health indicator.
The two most common sources of data are:
The routinely collected and reported data from the
health management information system (HMIS) or program-
specific information system
A periodic population-based survey
24
External Consistency: Compare with Survey Results
Example indicator: ANC 1st visit
― National level: ratio of facility ANC1 coverage rates to survey ANC1 coverage rates
― Subnational level: # (%) of aggregation units used for the most recent population-based survey (such as province/state/region) whose ANC1 facility-based coverage rates and survey coverage rates differ by at least 33%
Example indicator: 3rd dose DTP3 vaccine
― National level: ratio of DTP3 coverage rates from routine data to survey DTP3 coverage rates
― Subnational level: # (%) of aggregation units used for the most recent population-based survey (such as province/state/region) whose DTP3 facility-based coverage rates and survey coverage rates differ by at least 33%
25
Example: External Consistency
District  | Facility Coverage Rate | Survey Coverage Rate | Ratio of Facility to Survey Rates | % Difference between Facility and Survey Rates
A         | 1.05 | 0.95 | 1.10 | 10%
B         | 0.93 | 0.98 | 0.96 | 4%
C         | 1.39 | 0.90 | 1.54 | 54%
D         | 1.38 | 0.92 | 1.50 | 50%
E         | 0.76 | 0.95 | 0.80 | 20%
National  | 1.10 | 0.94 | 1.17 | 17%
Comparison of HMIS and survey coverage rates for ANC1.
Differences ≥33% are highlighted in red.
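The external-consistency check can be sketched as a ratio test. This is an illustrative computation (not the DQR tool): for each unit, the ratio of facility-based to survey coverage is computed, and units where the two differ by at least 33% are flagged.

```python
# Illustrative sketch of the routine-vs-survey consistency check,
# using the coverage rates from the example table.

def survey_consistency(facility, survey, threshold=0.33):
    """facility, survey: {unit: coverage rate};
    returns {unit: facility/survey ratio} for units whose rates
    differ by at least `threshold` (as a fraction of the survey rate)."""
    flagged = {}
    for unit in facility:
        ratio = facility[unit] / survey[unit]
        if abs(ratio - 1.0) >= threshold:
            flagged[unit] = round(ratio, 2)
    return flagged

facility = {"A": 1.05, "B": 0.93, "C": 1.39, "D": 1.38, "E": 0.76}
survey = {"A": 0.95, "B": 0.98, "C": 1.39 / 1.54, "D": 0.92, "E": 0.95}
survey["C"] = 0.90  # survey rate for C from the example table
print(survey_consistency(facility, survey))  # flags districts C and D
```

Districts C and D, whose facility rates exceed their survey rates by 54% and 50%, are flagged, matching the rows highlighted in the example table.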
26
External Comparison of Population Data
This dimension examines two points:
― The adequacy of the population data used in the calculation of health indicators
― The comparison of two different sources of population estimates (for which the values are calculated differently) to see the level of congruence between the two sources
27
External Comparison of Population Data
Metric: Consistency of population projections
― National level: ratio of the population projection of live births from the country census bureau/bureau of statistics to a United Nations live-births projection for the country
― Subnational level: NA
Metric: Consistency of denominator between program data & official government population statistics
― National level: ratio of the population projection for select indicator(s) from the census to the values used by programs
― Subnational level: # (%) of subnational units where there is an extreme difference (e.g., ±10%) between the 2 denominators
28
External Comparisons of Population Denominators
District  | Official Government Estimate for Live Births | Health Program Estimate for Live Births | Ratio of Official Government to Health Program Estimates
A         |  29855 |  29351 | 1.02
B         |  25023 |  30141 | 0.83
C         |   6893 |   7420 | 0.93
D         |  14556 |  14960 | 0.97
E         |  25233 |  25283 | 1.00
National  | 101560 | 107155 | 0.95
Comparison of official government and health program live-birth estimates at the national and subnational levels.
Administrative units with differences ≥ ±10% are highlighted in red.
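The denominator comparison follows the same ratio pattern. This is an illustrative computation (not the DQR tool): the official government live-births estimate is divided by the health program estimate, and units where the two differ by more than 10% are flagged.

```python
# Illustrative sketch of the population-denominator comparison,
# using the live-births estimates from the example table.

def denominator_flags(official, program, threshold=0.10):
    """official, program: {unit: live-births estimate};
    returns {unit: official/program ratio} for units where the two
    denominators differ by more than `threshold`."""
    flagged = {}
    for unit in official:
        ratio = official[unit] / program[unit]
        if abs(ratio - 1.0) > threshold:
            flagged[unit] = round(ratio, 2)
    return flagged

official = {"A": 29855, "B": 25023, "C": 6893, "D": 14556, "E": 25233}
program = {"A": 29351, "B": 30141, "C": 7420, "D": 14960, "E": 25283}
print(denominator_flags(official, program))  # flags district B
```

Only district B (ratio 0.83, a 17% difference) exceeds the ±10% band, matching the example table.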
29
How do we do the desk review?
 A data quality app has been created for DHIS; users can download it and apply it to their country DHIS databases
 For those who do not have DHIS, an Excel tool has been developed to support data quality analysis
 The principles of the desk review can be applied in any software a country has; they are not limited to the tools presented
Data Quality
Review (DQR)
Data verification and system assessment
31
Facility Survey Component of DQR
There are 2 components:
― Data verification: examines the accuracy of reporting of selected indicators by reviewing source documents
― System assessment: reviews the adequacy of the system to collect, compile, transmit, analyze, and use HMIS & program data
Survey at 2 levels:
― Health facility
― District
32
Accuracy: Data Verification
Quantitative: compares recounted to reported data.
Assess on a limited scale whether sites are collecting and reporting data accurately and on time.
Implement in 2 stages:
― In-depth verifications at the service delivery sites
― Follow-up verifications at the intermediate and central levels
33
Data Verification Following Data Flow
34
Accuracy: Verification Factor
Verification factor = recounted data / reported data
 Over-reporting: VF < 100%
 Under-reporting: VF > 100%
Suggested range of acceptability: 100% ± 10% (90%–110%)
35
Verification factor
 Weighted mean of verification ratios
 Summarizes information on the reliability of the data reporting system
 Indicates the degree of over-/under-reporting in the system
― e.g. VF = 0.80 indicates that of the total reported number of events, approximately 80% could be verified in source documents → over-reporting
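The verification factor can be sketched as follows. This is an illustrative computation, not the DQR tool; weighting the mean by reported volume (i.e., taking the ratio of summed recounts to summed reports) is one common choice of weighting, assumed here for the example.

```python
# Illustrative sketch of the verification factor (VF) and the
# 100% ± 10% acceptability check, using district recounted/reported
# pairs similar to the example slide.

def verification_factor(recounted, reported):
    """VF for a single site: recounted / reported."""
    return recounted / reported

def weighted_mean_vf(sites):
    """sites: list of (recounted, reported) pairs.
    Weighted by reported volume: sum of recounts / sum of reports."""
    total_recounted = sum(r for r, _ in sites)
    total_reported = sum(p for _, p in sites)
    return total_recounted / total_reported

def acceptable(vf, tolerance=0.10):
    """True if VF falls within 100% ± tolerance (default 90%-110%)."""
    return abs(vf - 1.0) <= tolerance

sites = [(1212, 1065), (1486, 1276), (357, 387), (2987, 3849), (4356, 4509)]
vf = weighted_mean_vf(sites)
print(round(vf, 2), acceptable(vf))
```

A weighted mean below 1 indicates net over-reporting: fewer events could be verified in source documents than were reported upward.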
36
Verification Factor Example
District | Indicator 1: Recounted, Reported, VF | Indicator 2: Recounted, Reported, VF
A | 1212, 1065, 1.14 | 4009, 4157, 0.96
B | 1486, 1276, 1.16 | 3518, 3686, 0.95
C |  357,  387, 0.92 |  672,  779, 0.86
D | 2987, 3849, 0.78 | 1361, 1088, 1.25
E | 4356, 4509, 0.97 | 4254, 3970, 1.07
Data accuracy by district. Verification factors outside 100% ± 10% are flagged in red.
37
Verification Factors Plotted Graphically
(Chart: district verification factors plotted against the 100% line; values below 100% indicate over-reporting, values above 100% indicate under-reporting)
38
Data verification
 A maximum of 5 indicators is recommended for review
—ANC1, DTP3/Penta3, ART coverage, TB cases, confirmed malaria cases
 Select a time period for the verification (3 months)
— e.g. July, August, September 2016
 For each indicator:
—Review the source documents and reports
—Recount the number of events
—Compare the recount to the reported events
—Determine reasons for any discrepancies
39
System Assessment Indicators
Indicator | Facility | District
Presence of trained staff | X | X
Presence of guidelines | X | X
No recent stock-out of data collection tools | X | X
Recently received supervision and written feedback | X | X
Evidence of analysis and use of data | X | X
40
If the sampling permits it, system assessment findings can be disaggregated by strata
41
Are there tools to support data verification and system assessment?
 A paper-based questionnaire is available that can be adapted to the country situation
 A data collection program has been developed in CSPro for tablets
 An analysis tool has been developed in Excel to support the analysis of the data collected during this exercise
Data Quality
Review (DQR)
Desk review
Excel Tool
43
Opening Screen
44
Input Basic Information Tab – Parameters of the
analysis
45
Input Administrative Units for the Analysis
46
Input Program Areas and Indicators
47
Input Quality Thresholds
Quality thresholds are the values that set the limits of acceptable error in data reporting. The analyses in the DQR compare results to these thresholds to judge the quality of the data. Recommended values are included for each metric in column 1; user-defined thresholds can be entered in column 2 and take precedence over the values in column 1.

Domain 1: Completeness and Consistency of Reporting/Indicator Data
1a: Completeness and timeliness of reporting from health facilities and aggregation levels (district, region, province). Recommended threshold: 75% for completeness and for timeliness of reporting at each level (health facility, district, region, province).
1b: Completeness of indicator reporting (% of data elements that are non-zero values; % of data elements that are non-missing values), with recommended thresholds by program area:
― Maternal_Health, ANC 1st Visit: 90%
― Immunization, 3rd dose DPT-containing vaccine: 67%
― HIV_AIDS, Number of HIV+ persons currently on ART: 90%
― Malaria, Number of confirmed malaria cases reported: 90%
― TB, Number of notified TB cases (all forms of TB): 75%
48
Input Information on Completeness and Timeliness
49
Input Population Data
50
Input Data on Indicator Trends
51
Input Indicator Data
52
Summary Dashboard
BURUNDI - ANNUAL DATA QUALITY REVIEW: RESULTS, 2016
DOMAIN 1: COMPLETENESS OF REPORTING
Indicator 1: Completeness and timeliness of reporting
No. | Indicator | Definition | National Score (%) | # (%) of districts not attaining quality threshold
1a | Completeness of District Reporting | National district reporting completeness rate and districts with poor completeness of reporting | 99.1% | 1 (2.1%)
1b | Timeliness of District Reporting | National district reporting timeliness rate and districts with poor timeliness of reporting | 90.3% | 3 (6.4%)
1c | Completeness of Facility Reporting | National facility reporting completeness rate and districts with poor facility reporting completeness | 96.1% | 8 (17.0%)
1d | Timeliness of Facility Reporting | National facility reporting timeliness rate and districts with poor facility reporting timeliness | 89.6% | 11 (23.4%)
Indicator 1e: Completeness of indicator data - presence of missing and zero values
1e.1 Completeness of indicator data (missing values):
― Maternal_Health - ANC 1st Visit | 98.9% | 1 (2.1%)
― Immunization - 3rd dose DPT-containing vaccine | 99.3%
― HIV_AIDS - Number of HIV+ persons in palliative care | 99.8%
― Malaria - Number of confirmed malaria cases reported | 99.8%
― Immunization - OPV3 | 98.9% | 1 (2.1%)
― Multi-program - Penta 1st doses | 99.1%
1e.2 Completeness of indicator data (zero values):
― Maternal_Health - ANC 1st Visit | 98.9% | 2 (4.3%)
― Immunization - 3rd dose DPT-containing vaccine | 99.5%
― HIV_AIDS - Number of HIV+ persons in palliative care | 100.0%
― Malaria - Number of confirmed malaria cases reported | 99.5% | 1 (2.1%)
― Immunization - OPV3 | 100.0%
― Multi-program - Penta 1st doses | 100.0%
Indicator 1f: Consistency of reporting completeness over time
1f.1 | Consistency of Reporting Completeness - District Reporting | Consistency of district reporting completeness and districts deviating from the expected trend | 105.8%
1f.2 | Consistency of Reporting Completeness - Facility Reporting | Consistency of facility reporting completeness and districts deviating from the expected trend | 103.5%
DOMAIN 2: INTERNAL CONSISTENCY OF REPORTED DATA
53
Domain 1: Completeness of Reporting
54
Domain 1: Completeness of Indicator Data
Indicator 1e: Completeness of Indicator Reporting - Presence of Missing and Zero Values (2016)
Districts listed are those with more than the user-defined % of zero or missing values.
Maternal_Health - ANC 1st Visit (quality threshold <= 90%):
― Missing: 98.9%; 1 district (2.1%): District 44
― Zero: 98.9%; 2 districts (4.3%): District 17, District 29
Immunization - 3rd dose DPT-containing vaccine (<= 90%):
― Missing: 99.3%; 1 district (2.1%): District 19
― Zero: 99.5%; 1 district (2.1%): District 17
HIV_AIDS - Number of HIV+ persons in palliative care (<= 90%):
― Missing: 99.8%
― Zero: 100.0%
Malaria - Number of confirmed malaria cases reported (<= 90%):
― Missing: 99.8%
― Zero: 99.5%; 1 district (2.1%)
Immunization - OPV3 (<= 90%):
― Missing: 98.9%; 1 district (2.1%): District 7
― Zero: 100.0%
Multi-program - Penta 1st doses (<= 90%):
― Missing: 99.1%; 1 district (2.1%): District 21
― Zero: 100.0%
Total (all indicators combined):
― Missing: 99.3%; 4 districts (8.5%)
― Zero: 99.6%; 4 districts (8.5%)
55
Domain 2: Internal Consistency - Outliers
Indicator 2a: Identification of Outliers
Indicator 2a.1: Extreme Outliers (>3 SD from the mean), 2016
Districts listed are those with extreme outliers relative to the mean.
Maternal_Health - ANC 1st Visit: 0.2%; 1 district (2.1%): District 39
Immunization - 3rd dose DPT-containing vaccine: 0.0%
HIV_AIDS - Number of HIV+ persons in palliative care: 0.5%; 3 districts (6.4%): District 9, District 38, District 41
Malaria - Number of confirmed malaria cases reported: 0.0%
Immunization - OPV3: 0.4%; 2 districts (4.3%): District 31, District 39
Multi-program - Penta 1st doses: 0.0%
Total (all indicators combined): 0.2%
56
Domain 2: Consistency over time
2b3: Consistency of 'General_Service_Statistics - OPD Total Visits' over time
Year: 2014
Expected trend: increasing
Compare districts to: expected result
Quality threshold: 20%
National score (%): 100%
Number of districts with divergent scores: 5
Percent of districts with divergent scores: 38.5%
Names of districts with divergent scores: District 6, District 7, District 8, District 9, District 11
(Charts: OPD total visits for the year of analysis plotted against the value forecast from the preceding years (3 years max); trend of OPD total visits, 2011-2014)
Interpretation of results - Indicator 2b3:
• The indicator is increasing over time (outpatient visits are increasing, something we were expecting given social mobilization for public health services).
• Comparison with the expected result (that the forecast value equals the actual value for 2014) yields 5 districts with ratios that exceed the quality threshold of 20%; 3 are below the threshold and 2 are above.
• Errors are not systematic (e.g. all in one direction). Review district outpatient registers in the affected districts to confirm reported values.
57
Domain 2: Consistency between related indicators
Indicator 2c: Internal Consistency - Consistency Between Related Indicators
Consistency between related indicators: ratio of two related indicators, and districts with ratios significantly different from the national ratio
2c1: Maternal Health Comparison: ANC 1st Visit : IPT 1st Dose
Year: 2014
Expected relationship: equal
Quality threshold: 10%
Compare districts with: national rate
National score (%): 114%
Number of districts with divergent scores: 2
Percent of districts with divergent scores: 15.4%
Names of districts with divergent scores: District 5, District 6
(Chart: scatter plot of ANC 1st visit against IPT 1st dose events for the year of analysis, districts compared to the national rate)
Interpretation of results - Indicator 2c1:
• Data seem fairly good; only District 5 has a largely discrepant value
• IPT seems consistently lower than ANC1; more pregnant women should be receiving IPT
• A stock-out of Fansidar in Region 2 could explain the low number of IPT doses in District 5. Call the DHIO in this district to investigate
• The national rate is 114%, and most districts are close to this value. District 6 is performing well relative to the other districts but is 'discrepant' relative to the national rate; no follow-up needed.
58
Domain 3: External Consistency – Consistency
with survey values
59
Domain 4: Consistency of population data - Comparison of denominators in use in-country
Indicator 4b: Consistency of denominator between program data and official government population statistics
4b1: Comparing the official live-births denominator to a program denominator, if applicable
Year: 2014
Quality threshold: 10%
National score (%): 106%
Number of districts with divergent scores: 4
Percent of districts with divergent scores: 30.8%
Names of districts with divergent scores: District 1, District 5, District 7, District 12
(Chart: program denominator for live births plotted against the official government denominator for live births)
Interpretation of results - Indicator 4b1:
• The program denominators in Districts 1, 7, and 12 seem too large, and too small in District 5. Review the growth rates used by the program to estimate intercensal yearly values for live births.
Data Quality
Review (DQR)
Data verification and system assessment
Excel Chartbook
61
Opening Screen
62
CSPro Data Entry Application
63
Create DQR Indicators in CSPro
64
Export Data From CSPro and Paste into the
Chartbook
65
Input Analysis Disaggregations
66
General Facility Information
67
Availability of Source Documents and
Reports
68
Timeliness and Completeness
69
Verification Factors
70
Program Specific Verification Factors
71
Program Specific – Reasons for
Discrepancies
72
System Assessment Results
73
System Assessment – Subnational Results
Scientific Sessions 2015: HIV estimations and projections 2015Scientific Sessions 2015: HIV estimations and projections 2015
Scientific Sessions 2015: HIV estimations and projections 2015
 
Dr Rajeev Bijalwan
Dr Rajeev BijalwanDr Rajeev Bijalwan
Dr Rajeev Bijalwan
 
Clinical data management
Clinical data management Clinical data management
Clinical data management
 
RHINO Forum: How can RHIS improve the delivery of HIV/AIDS services?
RHINO Forum: How can RHIS improve the delivery of HIV/AIDS services?RHINO Forum: How can RHIS improve the delivery of HIV/AIDS services?
RHINO Forum: How can RHIS improve the delivery of HIV/AIDS services?
 

RHINO Forum Presentation on DQR Framework

  • 1. Data Quality Review (DQR) Framework Review of quality of health facility data
  • 2. Why is health facility data important?  For many indicators it is the only continuous/frequent source of data  It is most often the only data source available at the subnational level, which is important for equity  For many key indicators it is the sole source of data: for example, PMTCT, ART, TB treatment outcomes, TB notification, confirmed malaria cases, and causes of death
  • 3. Quality of health facility data: why do we care?  High-quality data provide evidence to providers and managers to optimize healthcare coverage, quality, and services.  High-quality data help: ― Form an accurate picture of health needs, programs, and services in specific areas ― Inform appropriate planning and decision making ― Inform effective and efficient allocation of resources ― Support ongoing monitoring by identifying best practices and areas where support and corrective measures are needed
  • 4. Most common problems affecting data quality  Lack of guidelines for filling out the main data sources and reporting forms  Personnel not adequately trained  Misunderstanding about how to compile data, use tally sheets, and prepare reports  Non-standardized source documents and reporting forms  Arithmetic errors during data compilation  Lack of a review process before report submission to the next level
  • 5. Many tools have been used to address data quality  GAVI DQA  WHO Immunization Data Quality Self-assessment (DQS)  Global Fund/MEASURE Evaluation DQA  RDQA: self-assessment version of the DQA (with in-country adaptations)  Global Fund OSDV  PRISM  WHO Data Quality Report Card (DQRC)
  • 6. DQR: a harmonized approach to assessing and improving data quality  The DQR is a multi-pronged, multi-partner framework for country-led data quality assurance that proposes a harmonized approach to assessing data quality  It builds on the earlier program-specific quality tools and methods while examining data quality in a more systemic way that can meet the needs of multiple stakeholders  It also includes the examination of existing facility data (requiring no additional data collection), which was missing from earlier tools  It provides valuable information on fitness for purpose to support the Health Sector Strategic Planning Cycle (e.g. health sector or program reviews)
  • 7. Why are we recommending a harmonized approach to data quality?  Data quality is a systems issue: multiple assessments for different diseases/programs are inefficient and burdensome for the health system  Can we satisfy the data quality assurance needs of all stakeholders with one holistic data quality assessment?  Applying a standard framework to evaluate data quality enables an understanding of the adequacy of routine data used for health sector planning: can we link data quality assessment to health planning efforts?  It assures stakeholders that the routine data have undergone a known minimum level of scrutiny, which lends credibility and confidence to the data
  • 9. Recommended Core Program Indicators (program area, indicator name, full indicator):  Maternal Health: Antenatal care 1st visit (ANC1), the number (%) of pregnant women who attended at least once during their pregnancy  Immunization: DTP3/Penta3, the number (%) of children < 1 year receiving three doses of DTP/Penta vaccine  HIV/AIDS: ART coverage, the number (%) of people living with HIV who are currently receiving ART  TB: Notified cases of all forms of TB, the number (%) of all forms of TB cases (i.e. bacteriologically confirmed plus clinically diagnosed) reported to the national health authority in the past year (new and relapse)  Malaria: Confirmed malaria cases, the number (%) of all suspected malaria cases that were confirmed by microscopy or RDT
  • 11. | 11 Domains of Data Quality
  • 13. 13 Metrics for Data Quality Performance  Completeness and timeliness ― Completeness of reports ― Completeness of data ― Timeliness of reports  Internal consistency ― Accuracy ― Outliers ― Trends ― Consistency between indicators  External consistency ― Data triangulation ― Comparison with survey data ― Consistency of population trends  External comparisons (population denominators)
  • 14. 14 Completeness and Timeliness of Data This examines the extent to which: Data reported through the system are available and adequate for the intended purpose All entities that are supposed to report are actually reporting Data elements in submitted reports are complete Reports are submitted/received on time through the levels of the information system data flow
  • 15. 15 Completeness and Timeliness of Data
  • Completeness of reports (%) = (# total reports available or received) / (# total reports expected) × 100
  • Completeness of indicator data (%) = (# indicator values entered, i.e. not missing, in the report) / (# total expected indicator values) × 100
  • Timeliness (%) = (# reports submitted or received on time) / (# total reports available or received) × 100
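The three reporting-performance metrics above are simple ratios. They can be sketched in Python for illustration (the DQR tools themselves are implemented in Excel, DHIS, and CSPro; the function names below are invented for this example):

```python
# Illustrative helpers for the three reporting-performance metrics.
# Names and signatures are invented for this sketch, not taken from the DQR tools.

def completeness_of_reports(received: int, expected: int) -> float:
    """% of expected reports that were actually received."""
    return 100.0 * received / expected

def completeness_of_indicator_data(entered: int, expected: int) -> float:
    """% of expected indicator values that were entered (not missing)."""
    return 100.0 * entered / expected

def timeliness(on_time: int, received: int) -> float:
    """% of received reports that were submitted/received on time."""
    return 100.0 * on_time / received

# Hypothetical district: 47 of 50 expected reports received, 44 of them on time
print(completeness_of_reports(47, 50))  # 94.0
print(round(timeliness(44, 47), 1))     # 93.6
```

Note that timeliness is calculated against reports actually received, not reports expected, matching the denominator in the slide above.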
  • 16. 16 Internal Consistency of Reported Data This dimension examines: The accuracy of reporting of selected indicators, by reviewing source documents Whether data are free of outliers (within bounds), by assessing whether specific reported values within the selected period (such as monthly) are extreme relative to the other values reported Trends in reporting over time, to identify extreme or implausible values year-to-year Program indicators compared to other indicators with which they have a predictable relationship, to determine whether the expected relationship exists between the two indicators
  • 17. 17 Internal Consistency: Outliers (analyze each indicator separately)
  Extreme (at least 3 standard deviations from the mean) - National level: % of monthly subnational unit values that are extreme outliers; Subnational level: # (%) of subnational units in which ≥1 of the monthly subnational unit values over the course of 1 year is an extreme outlier value
  Moderate (between 2 and 3 standard deviations from the mean, or >3.5 on the modified Z-score method) - National level: % of subnational unit values that are moderate outliers; Subnational level: # (%) of subnational units in which ≥2 of the monthly subnational unit values over the course of 1 year are moderate outliers
  • 18. 18 Example: Outliers in a Given Year (monthly values, months 1-12; months with at least one moderate outlier on the district monthly reports are shown in red)
  A: 2543 2482 2492 2574 3012 2709 3019 2750 3127 2841 2725 2103 | 1 outlier | 8.3%
  B: 1184 1118 1195 1228 1601 1324 1322 711 1160 1178 1084 1112 | 2 outliers | 16.7%
  C: 776 541 515 527 857 782 735 694 687 628 596 543 | 0 outliers | 0%
  D: 3114 2931 2956 4637 6288 4340 3788 3939 3708 4035 3738 3606 | 1 outlier | 8.3%
  E: 1382 1379 1134 1378 1417 1302 1415 1169 1369 1184 1207 1079 | 0 outliers | 0%
  Nat'l (outliers per month): 0 0 0 0 2 0 0 1 0 0 0 1 | 4 outliers | 6.7%
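The moderate-outlier rule (2 to 3 standard deviations from the mean) behind the table above can be sketched in Python. This is a minimal illustration, not the DQR tool's implementation; it uses the sample standard deviation and omits the modified Z-score variant mentioned on the previous slide:

```python
import statistics

def classify_outliers(monthly_values):
    """Classify each monthly value as 'extreme' (>=3 SD from the mean),
    'moderate' (2-3 SD), or None, per the DQR thresholds.
    A minimal sketch using the sample SD; the modified Z-score
    variant is omitted for brevity."""
    mean = statistics.mean(monthly_values)
    sd = statistics.stdev(monthly_values)
    flags = []
    for value in monthly_values:
        z = abs(value - mean) / sd
        if z >= 3:
            flags.append("extreme")
        elif z >= 2:
            flags.append("moderate")
        else:
            flags.append(None)
    return flags

# District B from the table: months 5 (1601) and 8 (711) come out as moderate,
# matching the 2 outliers shown for that district
district_b = [1184, 1118, 1195, 1228, 1601, 1324, 1322, 711,
              1160, 1178, 1084, 1112]
print(classify_outliers(district_b))
```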
  • 19. 19 Internal Consistency: Trends Over Time (analyze each indicator separately)
  Definition: conduct one of the following, based on the indicator's expected trend:
  • Indicators or programs with expected growth: compare the current year to the value predicted from the trend in the 3 preceding years
  • Indicators or programs expected to remain constant: compare the current year to the average of the 3 preceding years
  National level: # (%) of districts whose ratio of current year to predicted value (or current year to average of the preceding 3 years) differs from the national ratio by at least ±33%
  Subnational level: graphic depiction of the trend to determine plausibility based on programmatic knowledge
  • 20. 20 Example: Trends over Time (consistency trend: comparison of district ratios to the national ratio)
  District | 2010 | 2011 | 2012 | 2013 | Mean of preceding 3 years (2010-2012) | Ratio of 2013 to mean of 2010-2012 | % difference between national and district ratios
  A | 30242 | 29543 | 26848 | 32377 | 28878 | 1.12 | 0.03
  B | 19343 | 17322 | 16232 | 18819 | 17632 | 1.07 | 0.08
  C | 7512 | 7701 | 7403 | 7881 | 7539 | 1.05 | 0.09
  D | 15355 | 15047 | 14788 | 25123 | 15063 | 1.67 | 0.44
  E | 25998 | 23965 | 24023 | 24259 | 24662 | 0.98 | 0.16
  National | 98450 | 93578 | 89294 | 108459 | 93774 | 1.16 | -
  Any difference between a district ratio and the national ratio that is ≥33% is highlighted in red.
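The district-versus-national comparison in this example can be reproduced with a short Python sketch (function names are illustrative, not from the DQR tools):

```python
def trend_ratio(current_year, preceding_years):
    """Ratio of the current year's total to the mean of the preceding years
    (for indicators expected to remain roughly constant)."""
    return current_year / (sum(preceding_years) / len(preceding_years))

def pct_difference(district_ratio, national_ratio):
    """Relative difference between a district ratio and the national ratio."""
    return abs(district_ratio - national_ratio) / national_ratio

# Values from the example table
national = trend_ratio(108459, [98450, 93578, 89294])    # ~1.16
district_d = trend_ratio(25123, [15355, 15047, 14788])   # ~1.67
print(round(pct_difference(district_d, national), 2))    # 0.44 -> flagged (>=33%)
```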
  • 21. 21 Internal Consistency: Comparing Related Indicators
  Maternal Health: ANC1 vs IPT1 or TT1 (should be roughly equal) - # (%) of subnational units where there is an extreme difference (≥ ±10%)
  Immunization: DTP3 dropout rate = (DTP1 - DTP3)/DTP1 (should not be negative) - # (%) of subnational units with # of DTP3 immunizations > DTP1 immunizations (negative dropout)
  HIV/AIDS: ART coverage vs HIV coverage (ratio should be <1) - # (%) of subnational units where there is an extreme difference (≥ ±10%)
  TB: TB cases notified vs TB cases on treatment (should be roughly equal) - # (%) of subnational units where there is an extreme difference (≥ ±10%)
  Malaria: # confirmed malaria cases reported vs cases testing positive (should be roughly equal) - # (%) of subnational units where there is an extreme difference (≥ ±10%)
  • 22. 22 Example: Internal Consistency (% difference between ANC1 and IPT1, by district)
  District | ANC1 | IPT1 | Ratio of ANC1 to IPT1 | % difference between national & district ratios
  A | 20995 | 18080 | 1.16 | 0.02
  B | 18923 | 16422 | 1.15 | 0.02
  C | 7682 | 6978 | 1.10 | 0.07
  D | 12663 | 9577 | 1.32 | 0.12
  E | 18214 | 15491 | 1.18 | 0
  National | 78477 | 66548 | 1.18 | -
  Districts with % difference ≥10% are flagged in red.
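The flagging logic for this related-indicator check can be sketched as follows (a minimal Python illustration using the table's data; the function name is invented for the example):

```python
def flag_discrepant_districts(districts, national_ratio, threshold=0.10):
    """Return districts whose ANC1:IPT1 ratio differs from the national
    ratio by at least `threshold` (10% in the slide above).
    `districts` is a list of (name, anc1, ipt1) tuples."""
    flagged = []
    for name, anc1, ipt1 in districts:
        ratio = anc1 / ipt1
        if abs(ratio - national_ratio) / national_ratio >= threshold:
            flagged.append(name)
    return flagged

# Data from the example table
data = [("A", 20995, 18080), ("B", 18923, 16422), ("C", 7682, 6978),
        ("D", 12663, 9577), ("E", 18214, 15491)]
national_ratio = 78477 / 66548  # ~1.18
print(flag_discrepant_districts(data, national_ratio))  # ['D']
```

Only District D exceeds the 10% threshold (a 12% difference), matching the red flag in the table.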
  • 23. 23 External Consistency with Other Data Sources This dimension examines the level of agreement between two sources of data measuring the same health indicator. The two most common sources of data are: (1) the routinely collected and reported data from the health management information system (HMIS) or program-specific information system; (2) a periodic population-based survey
  • 24. 24 External Consistency: Compare with Survey Results
  ANC 1st visit: ratio of facility ANC1 coverage rates to survey ANC1 coverage rates - # (%) of aggregation units used for the most recent population-based survey (such as province/state/region) whose ANC1 facility-based coverage rates and survey coverage rates differ by at least 33%
  3rd dose of DTP vaccine (DTP3): ratio of DTP3 coverage rates from routine data to survey DTP3 coverage rates - # (%) of aggregation units used for the most recent population-based survey whose DTP3 facility-based coverage rates and survey coverage rates differ by at least 33%
  • 25. 25 Example: External Consistency (comparison of HMIS and survey coverage rates for ANC1)
  District | Facility coverage rate | Survey coverage rate | Ratio of facility to survey rates | % difference between facility and survey rates
  A | 1.05 | 0.95 | 1.10 | 10%
  B | 0.93 | 0.98 | 0.96 | 4%
  C | 1.39 | 0.90 | 1.54 | 54%
  D | 1.38 | 0.92 | 1.50 | 50%
  E | 0.76 | 0.95 | 0.80 | 20%
  National | 1.10 | 0.94 | 1.17 | 17%
  Differences ≥33% are highlighted in red.
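The facility-to-survey comparison can be sketched in Python under the slide's definitions (illustrative names, not the DQR tool's code):

```python
def facility_to_survey_ratio(facility_rate, survey_rate):
    """Ratio of routine (HMIS) coverage to survey coverage for the same
    indicator; 1.0 means perfect agreement."""
    return facility_rate / survey_rate

def is_discrepant(facility_rate, survey_rate, threshold=0.33):
    """True when the two rates differ by at least the threshold (33%)."""
    return abs(facility_rate / survey_rate - 1) >= threshold

# District C from the table: facility 1.39 vs survey 0.90
print(round(facility_to_survey_ratio(1.39, 0.90), 2))  # 1.54
print(is_discrepant(1.39, 0.90))                       # True -> flagged
print(is_discrepant(0.93, 0.98))                       # False (district B)
```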
  • 26. 26 External Comparison of Population Data This dimension examines two points: The adequacy of the population data used in the calculation of health indicators The comparison of two different sources of population estimates (for which the values are calculated differently) to see the level of congruence between the two sources
  • 27. 27 External Comparison of Population Data
  Consistency of population projections: ratio of the population projection of live births from the country census bureau/bureau of statistics to a United Nations live-births projection for the country (national level; not applicable subnationally)
  Consistency of denominator between program data & official government population statistics: ratio of the census population projection for selected indicator(s) to the values used by programs - # (%) of subnational units where there is an extreme difference (e.g. ±10%) between the 2 denominators
  • 28. 28 External Comparisons of Population Denominators (comparison of official government and health program estimates of live births, by administrative unit)
  District | Official government estimate for live births | Health program estimate for live births | Ratio of official government to health program estimates
  A | 29855 | 29351 | 1.02
  B | 25023 | 30141 | 0.83
  C | 6893 | 7420 | 0.93
  D | 14556 | 14960 | 0.97
  E | 25233 | 25283 | 1.00
  National | 101560 | 107155 | 0.95
  Administrative units with differences ≥ ±10% are highlighted in red.
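The denominator-consistency check can be sketched as a one-pass comparison (a Python illustration using the table's data; the function name is invented):

```python
def flag_divergent_denominators(estimates, threshold=0.10):
    """Flag units whose official-to-program live-birth ratio departs from 1
    by at least `threshold` (10% in the slide above). `estimates` maps a
    unit name to an (official, program) pair."""
    return [name for name, (official, program) in estimates.items()
            if abs(official / program - 1) >= threshold]

# Data from the example table
estimates = {"A": (29855, 29351), "B": (25023, 30141), "C": (6893, 7420),
             "D": (14556, 14960), "E": (25233, 25283)}
print(flag_divergent_denominators(estimates))  # ['B'] (ratio 0.83, ~17% apart)
```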
  • 29. 29 How do we do the desk review? A data quality app has been created for DHIS and can be downloaded by users and applied to their country DHIS databases For countries that do not use DHIS, an Excel tool has been developed to support the data quality analysis The principles of the desk review can be applied in any software a country has and are not limited to the tools presented
  • 30. Data Quality Review (DQR) Data verification and system assessment
  • 31. 31 Facility Survey Component of DQR There are 2 components: ― Data verification - examines the accuracy of reporting of selected indicators, by reviewing source documents ― System assessment - reviews the adequacy of the system to collect, compile, transmit, analyze, and use HMIS & program data  Survey at 2 levels ― Health facility ― District
  • 32. 32 Accuracy: Data Verification Quantitative: compares recounted to reported data  Implement in 2 stages: (1) in-depth verifications at the service delivery sites, to assess on a limited scale whether sites are collecting and reporting data accurately and on time; (2) follow-up verifications at the intermediate and central levels
  • 34. 34 Accuracy: Verification Factor  Numerator: recounted data  Denominator: reported data  Over-reporting: VF <100%  Under-reporting: VF >100%  Suggested range of acceptability: 100% ± 10% (90%-110%)
  • 35. 35 Verification factor  Weighted mean of verification ratios  Summarizes information on the reliability of reporting of the data reporting system  Indicates the degree of over-/under-reporting in the system ― e.g. VF = 0.80 indicates that of the total reported number of events, approximately 80% could be verified in source documents -> over-reporting
  • 36. 36 Verification Factor Example (data accuracy by district)
  District | Indicator 1: Recounted, Reported, VF | Indicator 2: Recounted, Reported, VF
  A | 1212, 1065, 1.14 | 4009, 4157, 0.96
  B | 1486, 1276, 1.16 | 3518, 3686, 0.95
  C | 357, 387, 0.92 | 672, 779, 0.86
  D | 2987, 3849, 0.78 | 1361, 1088, 1.25
  E | 4356, 4509, 0.97 | 4254, 3970, 1.07
  Verification factors flagged in red differ from 1 by ±10% or more.
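The verification-factor arithmetic from the table can be sketched in Python (a minimal illustration under the slides' definitions, not the survey tool itself):

```python
def verification_factor(recounted, reported):
    """VF = recounted / reported. VF < 1 suggests over-reporting,
    VF > 1 under-reporting."""
    return recounted / reported

def within_acceptable_range(vf, tolerance=0.10):
    """Suggested acceptability range from the earlier slide: 1 +/- 10%."""
    return abs(vf - 1) <= tolerance

# District D, indicator 1 from the table
vf = verification_factor(2987, 3849)
print(round(vf, 2))                 # 0.78 -> over-reporting
print(within_acceptable_range(vf))  # False, outside 0.90-1.10
```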
  • 37. 37 Verification Factors Plotted Graphically [chart: verification factors for several indicators, with over-reported (<100%) and under-reported (>100%) regions marked]
  • 38. 38 Data verification A maximum of 5 indicators is recommended for review: ANC1, DTP3/Penta3, ART coverage, TB cases, confirmed malaria cases Select a time period for the verification (3 months), e.g. July, August, September 2016 For each indicator: review the source documents and reports; recount the number of events; compare the recount to the reported events; determine reasons for any discrepancies
  • 39. 39 System Assessment Indicators (assessed at both facility and district levels): presence of trained staff; presence of guidelines; no recent stock-out of data collection tools; recently received supervision and written feedback; evidence of data analysis and use
  • 40. 40 If the sampling permits it, system assessment findings can be disaggregated by strata
  • 41. 41 Are there tools to support data verification and system assessment?  A paper-based questionnaire is available that can be adapted to a country situation  A data collection program has been developed in CSPro for tablets  An analysis tool has been developed in Excel to support the analysis of the data collected during this exercise
  • 42. Data Quality Review (DQR) Desk review Excel Tool
  • 44. 44 Input Basic Information Tab – Parameters of the analysis
  • 45. 45 Input Administrative Units for the Analysis
  • 46. 46 Input Program Areas and Indicators
  • 47. 47 Input Quality Thresholds
  Quality thresholds are the values that set the limits of acceptable error in data reporting. The analyses in the DQR compare results to these thresholds to judge the quality of the data. Recommended values are included for each metric in column 1; user-defined thresholds can be input into column 2 and take precedence over the values in column 1.
  Domain 1: Completeness and consistency of reporting/indicator data
  1a - Completeness and timeliness of reporting from health facilities and aggregation levels (district, region, province): recommended threshold 75% for both completeness and timeliness at each level (health facility, district, region, province)
  1b - Completeness of indicator reporting (% of data elements that are non-zero values; % of data elements that are non-missing values), recommended thresholds by program area: Maternal_Health - ANC 1st Visit: 90%; Immunization - 3rd dose DPT-containing vaccine: 67%; HIV_AIDS - Number of HIV+ persons currently on ART: 90%; Malaria - Number of confirmed malaria cases reported: 90%; TB - Number of notified TB cases (all forms of TB): 75%
  • 48. 48 Input Information on Completeness and Timeliness
  • 50. 50 Input Data on Indicator Trends
  • 52. 52 Summary Dashboard (BURUNDI - ANNUAL DATA QUALITY REVIEW: RESULTS, 2016)
  DOMAIN 1: COMPLETENESS OF REPORTING
  1a Completeness of district reporting (national district reporting completeness rate and districts with poor completeness of reporting): 99.1%; 1 district (2.1%) below threshold
  1b Timeliness of district reporting: 90.3%; 3 districts (6.4%)
  1c Completeness of facility reporting: 96.1%; 8 districts (17.0%)
  1d Timeliness of facility reporting: 89.6%; 11 districts (23.4%)
  1e.1 Completeness of indicator data (missing values): Maternal_Health - ANC 1st Visit 98.9% (1 district, 2.1%); Immunization - 3rd dose DPT-containing vaccine 99.3%; HIV_AIDS - Number of HIV+ persons in palliative care 99.8%; Malaria - Number of confirmed malaria cases reported 99.8%; Immunization - OPV3 98.9% (1 district, 2.1%); Multi-program - Penta 1st doses 99.1%
  1e.2 Completeness of indicator data (zero values): Maternal_Health - ANC 1st Visit 98.9% (2 districts, 4.3%); Immunization - 3rd dose DPT-containing vaccine 99.5%; HIV_AIDS - Number of HIV+ persons in palliative care 100.0%; Malaria - Number of confirmed malaria cases reported 99.5% (1 district, 2.1%); Immunization - OPV3 100.0%; Multi-program - Penta 1st doses 100.0%
  1f.1 Consistency of reporting completeness over time - district reporting (districts deviating from the expected trend): 105.8%
  1f.2 Consistency of reporting completeness over time - facility reporting: 103.5%
  DOMAIN 2: INTERNAL CONSISTENCY OF REPORTED DATA
  • 54. 54 Domain 1: Completeness of Indicator Data - Indicator 1e: Completeness of Indicator Reporting, Presence of Missing and Zero Values, 2016 (quality threshold <= 90% for each indicator; districts exceeding the user-defined % of zero or missing values are listed)
  Maternal_Health - ANC 1st Visit: missing 98.9% (1 district, 2.1%: District 44); zero 98.9% (2 districts, 4.3%: Districts 17, 29)
  Immunization - 3rd dose DPT-containing vaccine: missing 99.3% (1 district, 2.1%: District 19); zero 99.5% (1 district, 2.1%: District 17)
  HIV_AIDS - Number of HIV+ persons in palliative care: missing 99.8%; zero 100.0%
  Malaria - Number of confirmed malaria cases reported: missing 99.8%; zero 99.5% (1 district, 2.1%)
  Immunization - OPV3: missing 98.9% (1 district, 2.1%: District 7); zero 100.0%
  Multi-program - Penta 1st doses: missing 99.1% (1 district, 2.1%: District 21); zero 100.0%
  Total (all indicators combined): missing 99.3% (4 districts, 8.5%); zero 99.6% (4 districts, 8.5%)
  Interpretation of results: Indicator 1e
  • 55. 55 Domain 2: Internal Consistency - Outliers. Indicator 2a: Identification of Outliers; Indicator 2a.1: Extreme Outliers (>3 SD from the mean), 2016. Districts with extreme outliers relative to the mean, by indicator (ANC 1st Visit, 3rd dose DPT-containing vaccine, HIV+ persons in palliative care, confirmed malaria cases reported, OPV3, Penta 1st doses): national scores range from 0.0% to 0.5% of values; flagged districts include District 39 (1 district, 2.1%), Districts 31 and 39 (2 districts, 4.3%), and Districts 9, 38, and 41 (3 districts, 6.4%); total (all indicators combined): 0.2%. Interpretation of results - Indicator 2a.1:
  • 56. 56 Domain 2: Consistency over Time - 2b3: Consistency of 'General_Service_Statistics - OPD Total Visits' over time, year 2014. Quality threshold: 20%; expected trend: increasing; districts compared to: expected result. National score: 100%; districts with divergent scores: 5 (38.5%): District 6, District 7, District 8, District 9, District 11. [charts: forecasted OPD Total Visits value for the current year based on the preceding years (3 years max); trend over time, 2011-2014] Interpretation of results: This indicator is increasing over time (outpatient visits are increasing, something we were expecting given social mobilization for public health services). Comparison with the expected result (that the forecasted value equals the actual value for 2014) yields 5 districts with ratios that exceed the quality threshold of 20%; 3 fall below the expected value and 2 above. Errors are not systematic (e.g. all in one direction). Review district outpatient registers in affected districts to confirm reported values.
  • 57. 57 Domain 2: Consistency between Related Indicators - Indicator 2c: Internal Consistency - Consistency Between Related Indicators (ratio of two related indicators and districts with ratios significantly different from the national ratio). 2c1: Maternal Health comparison: ANC 1st Visit : IPT 1st Dose, year 2014. Expected relationship: equal; quality threshold: 10%; districts compared with: national rate. National score: 114%; districts with divergent scores: 2 (15.4%): District 5, District 6. [scatter plot: ANC 1st Visit vs IPT 1st Dose events, districts compared to the national rate] Interpretation of results - Indicator 2c1: Data seem fairly good; only District 5 has a largely discrepant value. IPT seems consistently lower than ANC1; more pregnant women should be receiving IPT. A stock-out of Fansidar in Region 2 could explain the low number of IPT doses in District 5; call the DHIO in this district to investigate. The national rate is 114% and most districts are close to this value; District 6 is performing well relative to the other districts but is 'discrepant' relative to the national rate, so no follow-up is needed.
  • 58. 58 Domain 3: External Consistency – Consistency with survey values
  • 59. 59 Domain 4: Consistency of population data - Comparison of denominators in use in-country. Indicator 4b: Consistency of denominator between program data and official government population statistics. Indicator 4b1: Comparing the official live births denominator to a program denominator, if applicable; year 2014; quality threshold: 10%. National score: 106%; districts with divergent scores: 4 (30.8%): District 1, District 5, District 7, District 12. [scatter plot: program denominator vs official government denominator for live births] Interpretation of results - Indicator 4b1: The program denominators in Districts 1, 7, and 12 seem too large, and too small in District 5. Review the growth rates used by the program to estimate intercensal yearly values for live births.
  • 60. Data Quality Review (DQR) Data verification and system assessment Excel Chartbook
  • 62. 62 CSPro Data Entry Application
  • 64. 64 Export Data From CSPro and Paste into the Chartbook
  • 67. 67 Availability of Source Documents and Reports
  • 71. 71 Program Specific – Reasons for Discrepancies
  • 73. 73 System Assessment – Subnational Results

Editor's Notes

  1. Poor quality of facility data affects monitoring
  2. Some tools are still in existence while others are no longer in use. And now, yet another tool/framework is being thrown at the audience. What is the DQR offering that these other tools did not offer?
  3. What is the DQR
  4. The DQR Framework includes 3 components: The data quality review process should start with regular, routine, monthly review of data quality, with feedback, by each level of the health system. This process can be integrated, reviewing the quality of data from multiple programs. As shown in the previous slide, there should be an annual review of data quality at national level preceding the Annual Review. This annual review can also be integrated. Periodically, perhaps every 3 to 5 years, there should be in-depth reviews of the quality of data for particular programs such as immunization, MCH, HIV, and malaria.
  5. Four domains of data quality are defined: Completeness and timeliness Internal consistency - do the routine data agree with each other? External consistency - do the routine data agree with survey findings? External comparison of population data - are the denominator estimates consistent with one another?
  6. Accuracy: Measured against a reference and found to be correct Completeness: Present, available, and usable Timeliness: Up-to-date and available on time
  7. This slide shows how to measure reporting performance to determine the extent to which data reports are appropriately available, complete, and timely.
  8. Outliers = Deviation from the mean
  9. The table shows moderate outliers for a given indicator. There are four identified moderate outliers. They are highlighted in red. Three of the districts have at least one occurrence of a monthly value that is a moderate outlier. Nationally, this indicator is the percentage of values that are moderate outliers for the indicator. The numerator for the equation is the number of outliers across all administrative units [in this case, 4]. The denominator is the total number of expected reported values for the indicator for all the administrative units. That value is calculated by multiplying the total number of units (in the selected administrative unit level) by the expected number of reported values for one indicator for one administrative unit. In this case, we have 5 districts and 12 expected monthly reported values per district for one indicator, so the denominator is 60 [5 × 12]. Thus, about 6.7% are moderate outliers [4/60 = 0.0667 × 100, or 6.7%].   Subnationally, see if you can calculate the number of outliers for each district. Count the districts where there are two or more moderate outliers among the monthly values for the district [1]. Divide by the total number of administrative units [1/5 = 0.20 × 100 = 20%].
  10. Mean of preceding three years (2010, 2011, and 2012) is 93,774 [(98,450 + 93,578 + 89,294)/3] Ratio of current year to the mean of the past three years is 1.16 [108,459/93,774 ≈ 1.16].   The average ratio of 1.16 shows that there is an overall 16% increase in the service outputs for 2013 when compared to the average service outputs for the preceding three years of the indicator. Subnationally, try to evaluate each district, by calculating the ratio of the current year (2013) to the average of the previous three years (2010, 2011, and 2012). For example, the ratio for District A is 1.12 [32,377/28,878].   Then calculate the % difference between the national and district ratios for each district. For example, for District A:   |(District A ratio − National ratio)/(National ratio)| = |(1.12 − 1.16)/1.16| = 0.03 = 3.0%   The difference between the district ratio and the national ratio for District A is less than 33%. However, there is a difference of approximately 44% for District D between the deliveries ratio and the national ratio.   To calculate this indicator subnationally, all administrative units whose ratios are different from the country's ratio by ±33% or more are counted. In this example, only District D has a difference greater than ±33%. Therefore, 1 out of 5 districts (20%) has a ratio that is more than 33% different from the national ratio.
  11. The annual number of pregnant women started on antenatal care each year (ANC1) should be roughly equal to the number of pregnant women who receive intermittent preventive therapy for malaria (IPT1) in ANC, because all pregnant women should receive this prophylaxis. First, we will calculate the ratio of ANC1 to IPT1 for the national level, and then for each district. At the national level, the ratio of ANC1 to IPT1 is about 1.18 [78,477/66,548]. At the subnational level, we can calculate the ratio of ANC1 to IPT1 and the % difference between the national and district ratios.   We see that there is one district (D) where ANC1 exceeds IPT1 by more than 20% (a ratio of 1.32). We also see that the % difference between the national and district ratios for District D is more than 10%.
  12. Population-based surveys: Demographic and Health Survey (DHS); MICS, etc. Indicator values are based on recall, referring to period before the survey (such as 5 years) Sampling error: confidence intervals
  13. If the HMIS is accurately detecting all ANC visits in the country (not just those limited to the public sector), and the denominators are accurate, the coverage rate for ANC1 derived from the HMIS should be very similar to the ANC1 coverage rate derived from population surveys. However, HMIS coverage rates are often different from survey coverage rates for the same indicator. At the national level: The coverage rate from HMIS is 110%. The coverage rate from the most recent population-based survey is 94%. The ratio of the two coverage rates is: 1.17 [110%/94%]. If the ratio is 1, it means that the two coverage rates are exactly the same. If the ratio is >1, it means that the HMIS coverage is higher than the survey coverage rate. If the ratio is <1, it means that the survey coverage rate is higher than the HMIS coverage rate.   The ratio of 1.17 shows that the two coverage rates are fairly different, and there is about a 17% difference between the two values.   At the subnational level, the ratio of coverage rates is calculated for each administrative unit. Districts with at least a 33% difference between their two coverage rates are flagged. Districts C and D have more than a 33% difference between their two ratios.
  14. This slide shows the ratio of the number of live births from official government statistics nationally for the year of analysis to the value used by the selected health program.   Calculate the ratio of subnational administrative unit 2014 live births to the value used by the selected health program; district B has a difference of 0.17 or 17%.
  15. This chart shows that the trace-and-verify protocol starts with data at the level of service delivery. Data are then “traced” to the “intermediate aggregation level” (in this case a district), and then to the central level.
  16. At the heart of the data quality (DQ) process are two important components: data verification (DV) and report performance. Data verification is derived through quantitative comparison of recounted data and reported data. The verification factor (VF) is calculated by dividing the recounted number by the reported number, giving a percentage. *Facilitators: Ask participants, “What would 85% mean? How about 125%?” Refer to Module 9: RHIS Performance Assessment, concerning the PRISM tools. The diagnostic tool also measures data quality for district and health facility levels.
  17. Recounted data/reported data = VF (verification factor) When VF is ≥ ±10%, data are considered inaccurate.
  18. This graph shows the verification factors for four indicators First, we can see that there is a wide variation in the accuracy of these indicators. The area marked with red horizontal lines shows a margin of acceptability: plus or minus 10% of 100%, the global standard. However, individual programs can select their own ranges of acceptability, as deemed appropriate. We also can see that, of the four indicators, three are outside the acceptable margins. [Ask participants:] What would you say about indicators 1 and 4? [After someone answers, CLICK here to animate.] What about indicator 2? [After someone answers, CLICK here to animate.] Ideally, we would see no under-reporting or over-reporting of data, with indicators as close to 100% as possible. [CLICK here to animate.]
  19. Objective To assess the adequacy of the information system to collect, compile, transmit, analyze, and use HMIS & program data
  20. On this tab you will enter the parameters for the analysis. Data flow model, periodicity of reporting and the specific levels of the health system selected for the analysis. 1. Select Country: The Country selected will automatically be included in dashboards of results, as well as being used to calculate the UN population projection for Live Births. 2. Select year: This is the year of analysis, the year for which data will be obtained and analyzed. 3. Complete the data flow model for the Country HMIS (or Programme, depending on the scope of the DQR). Include all levels of the reporting system where data are collected, aggregated, and forwarded to the next higher level. The last box should indicate the 'National' level. 4. Select the level of the reporting system for which you are conducting your analysis, that is, the level for which metrics are calculated and compared. This is usually the level for which data are input, such as the district level. 5. Select the periodicity (i.e. how often the data are reported) for the level of analysis selected. This selection will configure the indicator data entry pages for the periodicity selected. Remember also to select the first period of the reporting year (input 10). The selection of the periodicity of reporting for the level of analysis will populate the drop down list in input 10. 6. Input the periodicity of reporting from health facilities. This is used in the evaluation of reporting performance from facilities (domain 1 - completeness and timeliness of reporting). 7. Input the periodicity of reporting from the 1st level of aggregation (usually the district). This is used in the evaluation of reporting performance from the 1st level of aggregation (domain 1 - completeness and timeliness of reporting). 8. Input the level of the reporting system for which you are inputting service output data. These are the indicator values by month or quarter. 
These data can be facility level (only rarely in the event that facility level data are entered into the computer), or district, or regional level depending on what aggregate level data are available at national level. 9. Input the level of the most recent population-based survey. In domain 3 - External comparison, routinely reported data values will be compared with survey values. The routine data will need to be aggregated to the level of the survey (typically the regional level) so that the values are comparable. 10. Enter the first period of the year of analysis. Depending on the periodicity of reporting from the level selected for analysis (in #5) the drop down list will provide the range of options. Select the first period (e.g. 1st quarter, the month of January etc.) from the drop down. 11. Enter the nature of facility reporting, either integrated (e.g. on the monthly form HMIS) or program-specific. Integrated reporting means the results from different health programs are all reported on the same form, and only that form is forwarded to the next level to satisfy reporting requirements for all health programs. Program-specific reporting means that health programs report to the next level separately, each program with its own set of reporting forms. Since it may be the case that reporting from health facilities is only partially integrated, selection of the type of reporting on the Input_basic_info tab will only hide or reveal the program-specific reporting data entry and results areas. The integrated reporting tab and result areas will always be available to enter information on reporting for the HMIS in general.
  21. Input the relevant administrative units in column 2 on the Input_admin_units tab. These are the administrative units for the level of analysis selected on the Basic Information tab.
  22. The DQR has a standard set of indicators from across program areas intended to provide a cross-cutting assessment of data quality. These are: ▪ Maternal health - Antenatal care 1st visit (ANC1) ▪ Immunization - DTP3/Penta3 ▪ HIV/AIDS - ART coverage ▪ TB - Notified cases of all forms of TB ▪ Malaria - Confirmed malaria cases   However, the DQR is designed to accommodate any program areas and indicators. On the 'Program Areas and Indicators' tab, select program areas and their associated indicators using the drop-down lists provided. One primary indicator should be selected for each program area; the primary indicator is listed as #1 in the two spaces provided for each program area. The secondary indicator (#2) is used only for the internal consistency metric 'Comparison between related indicators'. Drop-down lists for program areas and indicators include the standard indicators used for the recommended implementation of the DQR, as well as a supplemental list of alternative indicators for each program area. Information on the core and alternative indicators can be found in the DQR Technical Guide (Module 3: Review of data quality through a health facility survey; Annex 1 - Recommended Indicators). It is also possible to include user-defined program areas and indicators by selecting 'Other_specify' from the drop-down list; another field will appear in which the user-defined program area and/or indicator can be entered. Once entered, the program area and indicator names auto-populate the dashboards of results in the DQR. Finally, a section is included for selecting the indicator type, either cumulative or current. A cumulative indicator is one for which monthly values are added to the previous month's value to derive a running total (e.g. number counseled and tested for HIV).
A current indicator is one for which the current month's value updates or replaces the previous month's value (e.g. currently on ART, where those lost, stopped, transferred out, or deceased are subtracted from the total, new patients are added, and those counted this month were most likely also counted last month). The default value is cumulative, since most indicators are cumulative.
  23. To judge the quality of data using the metrics in the DQR, it is necessary to define benchmarks of quality against which to compare the results. WHO has recommended thresholds for each metric, which can be found on the 'Quality Thresholds' tab. Global standards are often not relevant in a given country if the information system is immature or undergoing reform. In cases where the recommended thresholds are inappropriate, user-defined thresholds can be supplied by entering the values in column 2 on the 'Quality Thresholds' tab; these override the recommended thresholds.
  24. On the 'Input_reports_received' tab, enter the information required on completeness and timeliness of reporting from subnational units. Depending on the data flow model input on the Basic Information tab, you will need to enter the number of reports received for each level, including historically (the 3 prior years). Also required is the number of reports received by the reporting deadline for the year of analysis. Be sure to select the appropriate periodicity of reporting on the Basic Information tab for facility reporting and for the next higher level (input 7), so the DQR tool knows the number of expected reports when calculating completeness of reporting.
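The DQR tool performs this calculation in Excel, but the underlying arithmetic is simple: completeness is reports received over reports expected, and timeliness is on-time reports over reports expected. A minimal sketch, with hypothetical counts (12 monthly reports expected from each of 50 facilities):

```python
# Illustrative sketch of completeness/timeliness of reporting; the counts
# below are invented, not taken from the DQR itself.

def reporting_rates(expected, received, received_on_time):
    """Return (completeness %, timeliness %) for a reporting period."""
    completeness = 100.0 * received / expected
    timeliness = 100.0 * received_on_time / expected
    return completeness, timeliness

expected = 12 * 50  # 600 monthly reports expected over the year
completeness, timeliness = reporting_rates(expected, received=540,
                                           received_on_time=480)
print(completeness, timeliness)  # 90.0 80.0
```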
  25. The DQR evaluates the adequacy of population data (i.e. denominators) used to calculate coverage rates for performance monitoring in 'Domain 4 - Consistency of population data'. Denominator data are also required to compute rates for comparisons of routine data with population-based survey data ('Domain 3 - External consistency'). There are two tabs in which population data are input, one for each domain. On the 'Input_Standard_Populations' tab (Figure 12), enter the populations from official government statistics (e.g. from the National Statistics Bureau), by the level selected for analysis (e.g. district), for live births, expected pregnancies, and children < 1 year of age (columns F-H). These denominators will be compared with the populations used by health programs, if applicable. If health programs use their own estimates of these populations, enter those values, by the level selected for analysis, into the appropriate cells (columns I-K).
  26. To evaluate 'Internal consistency - Consistency of indicator data over time' (Domain 2), you will need to enter annual values, for the level selected for analysis, for the DQR primary indicators (selected on the 'Program Areas and Indicators' tab). Annual values are required for the three years prior to the analysis year. The prior years' annual values must be pasted into the appropriate columns for each indicator, while the values for the year of analysis are aggregated automatically by the DQR tool once the monthly values have been input into the indicator data tabs.
  27. Paste monthly (or quarterly) data, by the level selected for analysis, into the indicator data tabs. The indicator names appear automatically at the top of each indicator data tab once the indicators are selected on the 'Program Areas and Indicators' tab. The indicator data tabs are named according to the following logic: PA1 is Program Area #1, and Ind1 is the primary indicator for that program area. Each program area selected on the 'Program Areas and Indicators' tab has two indicators, a primary and a secondary indicator. The primary indicator is the one for which DQR metrics are calculated; the secondary indicator is used only for the 'Domain 2 - Internal consistency' evaluation of the consistency between related indicators. Likewise, PA2 is Program Area #2, with its own Ind1 and Ind2, and so on. Please ensure that the periodicity of reporting for the level of analysis is indicated in input 5 on the Basic Information tab; this selection configures the indicator data tabs with 12 columns for monthly reporting or 4 columns for quarterly reporting.   In 'Domain 2 - Internal consistency of reported data', extreme and moderate subnational-unit values are identified for monthly (or quarterly) reporting. These values are highlighted on the indicator data tabs by color coding: moderate outliers are shaded gray with a stippling pattern, while extreme outliers are shaded pink (Figure 16). These values are summarized, and the subnational units where they occur are identified, in the summary tabs for Domain 2.
  28. The 'Summary_dashboard' tab displays results for all DQR domains and metrics in summary form, without detail or graphics. The standard form of a result is the value of the metric plus the number and percent of subnational units that do not attain the established benchmark for that metric. The subnational units that do not attain the standard are listed on the domain-specific dashboards.
  29. Add interpretations in the text box to facilitate action planning based on the assessment results.
  30. Completeness of indicator data - measures the percentage of values reported from subnational units that are missing or zero.
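As an illustration of this metric (outside the DQR tool itself), missing values can be represented as `None` and counted together with zeros; the monthly series below is hypothetical:

```python
# Sketch: % of monthly values from a subnational unit that are missing
# (None) or zero. Data are invented for illustration.

def pct_missing_or_zero(values):
    flagged = sum(1 for v in values if v is None or v == 0)
    return 100.0 * flagged / len(values)

monthly = [120, 0, None, 135, 128, 0, 140, 133, None, 126, 131, 138]
print(round(pct_missing_or_zero(monthly), 1))  # 33.3
```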
  31. ▪ Identification of extreme outliers - monthly (or quarterly) values entered for the subnational units selected as the level of analysis are examined for the presence of extreme outliers, i.e. values ≥ 3 standard deviations from the mean of the monthly (or quarterly) values entered for subnational units. For each primary indicator entered on the 'Program Areas and Indicators' tab, the number and percentage of values that are extreme outliers is calculated and the subnational units are identified.
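The ≥ 3 standard deviation rule can be sketched as follows; this is not the DQR tool's own code, and the use of the population standard deviation and the sample data are assumptions for illustration:

```python
# Sketch of the extreme-outlier rule: flag values >= `threshold` standard
# deviations from the mean of the series.
from statistics import mean, pstdev

def outliers(values, threshold=3.0):
    """Return the indices of values >= `threshold` SDs from the mean."""
    mu, sd = mean(values), pstdev(values)
    if sd == 0:
        return []  # a constant series has no outliers
    return [i for i, v in enumerate(values) if abs(v - mu) / sd >= threshold]

# Hypothetical monthly series with a likely data-entry error in month 7:
monthly = [100] * 6 + [1000] + [100] * 5
print(outliers(monthly))  # [6]
```

Lowering `threshold` (e.g. to 2) would correspond to the moderate-outlier shading described above.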
  32. Consistency over time - the plausibility of reported results for selected programme indicators is examined in terms of the history of reporting of those indicators. Trends are evaluated to determine whether reported values are extreme in relation to other values reported during the year or over several years (Figure 22). For this metric, the annual value of each primary indicator for the year of analysis (aggregated from the monthly or quarterly values entered for subnational units) is compared to the mean of the annual values for the three years preceding the year of analysis. Subnational units whose ratio of the annual value for the year of analysis to the mean of the three preceding annual values diverges from the expected ratio (or national ratio) by more than the recommended (or user-defined) quality threshold are identified, and the number and percent of such subnational units is calculated.
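The ratio test described above can be sketched for a single subnational unit; the national ratio, the ±0.33 threshold, and the annual counts are illustrative assumptions, not DQR defaults:

```python
# Sketch: ratio of the current annual value to the mean of the 3 preceding
# annual values, flagged if it diverges from the national ratio by more
# than the quality threshold. All numbers are hypothetical.

def consistency_over_time(history, current, national_ratio, threshold=0.33):
    """history: annual values for the 3 preceding years."""
    ratio = current / (sum(history) / len(history))
    flagged = abs(ratio - national_ratio) > threshold
    return ratio, flagged

# A district doubles its reported cases while the national trend is flat:
ratio, flagged = consistency_over_time([900, 950, 1000], current=1900,
                                       national_ratio=1.05)
print(ratio, flagged)  # 2.0 True
```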
  33. Consistency between related indicators - programme indicators that have a predictable relationship are examined to determine whether, in fact, the expected relationship exists between them. In other words, this process examines whether the relationship observed between the indicators, as depicted in the reported data, is the one expected (Figure 23). For this metric, annual aggregate values for primary indicators are compared to annual aggregate values for the secondary indicators input into the program-area-specific indicator data tabs. The ratio of the primary indicator to the secondary indicator is calculated and compared to the national ratio of the same two indicators, or to the expected value of the ratio. The expected value is the value of the ratio when the two indicators are equal, i.e. 1.
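As a sketch of this check, consider a hypothetical primary/secondary pair expected to be roughly equal (the indicator pair, counts, and 10% tolerance below are assumptions for illustration):

```python
# Sketch: compare the primary/secondary indicator ratio to its expected
# value (1 when the indicators should be equal). Numbers are invented.

def related_indicator_check(primary, secondary, expected=1.0, threshold=0.1):
    ratio = primary / secondary
    flagged = abs(ratio - expected) > threshold
    return ratio, flagged

# A district reports 1200 of the primary indicator but only 800 of the
# related secondary indicator:
ratio, flagged = related_indicator_check(1200, 800)
print(ratio, flagged)  # 1.5 True
```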
  34. Data for recent population-based surveys are entered on the 'External_Data_Sources' tab. Routine data entered for primary indicators are aggregated to the administrative units of the survey, as indicated on the 'Survey_Mapping' tab. The routine data value for each survey administrative unit is then divided by the population value, also aggregated to the survey administrative unit, to derive a rate comparable to the survey value for the same administrative unit. The ratio of the routine value to the survey value is then calculated. Subnational units with a ratio greater than 1 + the recommended (or user-defined) quality threshold, or less than 1 - the quality threshold, are flagged as potential data quality problems. In the graphics on the indicator-specific dashboards and the 'Domain 3 - External consistency' dashboard, the routine values are depicted as bars, while the survey values are depicted as points (triangles) with error bars, based on the standard error of the survey estimate (entered on the 'External_Data_Sources' tab), depicting the range of acceptable difference between the survey and routine values.
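A sketch of the routine-versus-survey comparison for one administrative unit; the ANC1 example, the counts, and the ±0.33 threshold are illustrative assumptions:

```python
# Sketch: routine coverage rate vs. survey coverage rate, flagged when the
# ratio falls outside 1 ± threshold. All figures are hypothetical.

def survey_comparison(routine_count, population, survey_rate, threshold=0.33):
    routine_rate = routine_count / population
    ratio = routine_rate / survey_rate
    flagged = ratio > 1 + threshold or ratio < 1 - threshold
    return ratio, flagged

# A region reports 45,000 ANC1 visits against 50,000 expected pregnancies
# (90% routine coverage), while the survey measured 60% coverage:
ratio, flagged = survey_comparison(45_000, 50_000, survey_rate=0.60)
print(round(ratio, 2), flagged)  # 1.5 True
```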
  35. The level of congruence between the denominators from official government sources and those used by health programs is evaluated by calculating the ratio between the two values for subnational units. Subnational units with a ratio greater than 1 + the recommended (or user-defined) quality threshold, or less than 1 - the quality threshold, are flagged as potential data quality problems. The denominator-specific dashboards in the 'Domain 4 - Consistency of population data' dashboard provide a scatter plot depicting the relationship between the subnational-unit values of the two denominators (Figure 26). Points falling outside the dashed gray lines indicate values that exceed the quality threshold.
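The same ratio-threshold logic applies to denominator congruence; here is a minimal sketch, assuming a 10% threshold and invented live-birth denominators:

```python
# Sketch: program denominator vs. official government denominator for one
# subnational unit. The threshold and counts are hypothetical.

def denominator_check(official, program, threshold=0.1):
    ratio = program / official
    flagged = ratio > 1 + threshold or ratio < 1 - threshold
    return ratio, flagged

# Official statistics estimate 10,000 live births; the health program
# uses its own estimate of 12,500:
ratio, flagged = denominator_check(10_000, 12_500)
print(ratio, flagged)  # 1.25 True
```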
  36. Enter Country and Year in the Yellow boxes
  37. By design, the data are collected with the standard CSPro data verification application, which can be used for data capture in the field on tablet computers, or for entering paper-based results on a desktop back in the office after the assessment.
  38. Once the data are in the database and cleaned, a ‘batch file’ is available to compile the relevant indicators for the analysis. Run the batch file in CSPro to compile the indicators, then export the data as a text file. From there the data can be pasted into the Excel tool.
  39. Paste the data from CSPro (open the text file in Excel) into the DV Chartbook on the Indicators tab. NB: there are facility- and district-level versions of the CSPro data entry applications and batch files, and facility- and district-level versions of the Chartbook.
  40. Enter the country specific information on stratifiers – subnational units, facility types, management authority and urban/rural.
  41. Results are presented for all program-level indicators and for each program indicator individually. The general facility information page gives information on the availability of services in the facilities surveyed. Results are provided at the national level (that is, all subnational units taken together) and also broken down by subnational unit (e.g. region or district).
  42. There is more detail on data quality metrics in the indicator-specific tabs (e.g. % of facilities over/under-reporting by >10%).
  43. System assessment indicators (qualitative indicators) are color coded for ease of interpretation. There are also 'national' and 'subnational' results tabs.