Presentation for the American Public Health Association Annual Meeting & Expo in Atlanta, GA, November 2017:
Ensuring that quality data are collected and reported to the Ministry of Health (MOH) is a priority in Mozambique, as such data are the foundation for the provision of quality health services. Since 2014, the Mozambique Strategic Information Project (M-SIP) has provided technical assistance to the MOH to conduct annual rounds of data quality assessments (DQA) in each province. Seven indicators were selected as part of the national DQA strategy. Each DQA has a quantitative and a systems assessment component. The quantitative component includes tracing and verification of reported data, in which recounted data are compared to data reported at three levels: health facility (HF), district, and province. M-SIP conducted all DQAs using the same methodology, making the results comparable. After three consecutive national rounds, there is a clear trend of improvement, although deviations remain high. The regular, reinforcing nature of this activity and the consistency of HF recommendations have had a positive impact on data quality and on the results of the assessments. For example, the overall national deviation of the "patients active in ART" indicator decreased from 37% to 22% over the three-year period. The successful implementation of the DQA activity, as well as its unique, inclusive approach to promoting MOH ownership, has resulted in MOH recognition, at all levels, that DQA activities are crucial to future success. The M-SIP and MOH teams are now developing a more methodological approach to MOH staff empowerment, enabling fully independent MOH implementation of this activity while continuing to improve the quality of data.
Clinical Data Quality in Mozambique: A Comparative Exercise
1. CLINICAL DATA QUALITY IN MOZAMBIQUE
A COMPARATIVE EXERCISE AFTER FOUR ROUNDS
Mozambique Strategic Information Project (MSIP)
JSI Research & Training Institute, Inc. (JSI)
Prepared by:
Dália Monteiro Traça,
Chief of Party
dtraca@mz.jsi.com
November 7, 2017
3. Objectives of the DQA Strategy
• Assess the quality of reported data
• Use results to inform quality improvement
• Build capacity of national health information systems
4. Strategic Approach
• Create a sustainable Data Quality Assessment system that is affordable, accepted, owned, and scalable by the MOH
• Prioritize the inclusion of MOH staff in all steps of the development, piloting, and implementation of the DQA strategy
• Promote the alignment of the existing reporting systems (PEPFAR and DHIS2)
5. DQA Objectives
• To assess the quality of data registered in primary sources and data reported to the upper levels, verifying the following sources:
  – Daily registers vs. monthly reports (Health Facility)
  – National database DHIS2 (District)
  – DHIS2 (Province)
  – DHIS2 (Central level)
• To assess the data management and reporting systems at the HF and district levels.
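The tracing-and-verification step above compares recounted source data with the values reported upward at each level. As a minimal sketch of how such a level-by-level comparison might be computed — assuming the convention deviation = |recounted − reported| / recounted, since the slides do not give the exact MOH formula, and using made-up counts rather than actual DQA results:

```python
def deviation(recounted: int, reported: int) -> float:
    """Relative deviation of a reported value from the recounted value."""
    if recounted == 0:
        return 0.0 if reported == 0 else float("inf")
    return abs(recounted - reported) / recounted

# Hypothetical counts for one indicator (e.g. TARV) at one site;
# these numbers are illustrative only.
recounted = 812            # recounted from daily registers at the HF
reported = {
    "health facility": 790,   # monthly report at the HF
    "district": 760,          # value in DHIS2 at district level
    "province": 760,          # value in DHIS2 at province level
}

for level, value in reported.items():
    print(f"{level}: {deviation(recounted, value):.1%} deviation")
```

A growing deviation as data move up the chain would point to transcription or aggregation errors above the HF, whereas a deviation already present at the HF level points to the register-to-report step — which is where the slides note most discrepancies occur.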
6. Assessed Indicators (by area, with abbreviation)

Treatment and Care:
– Number of HIV-positive individuals active on ART (TARV)
– Number of HIV-positive individuals who are eligible for Cotrimoxazole (CTX) and receive CTX (CTX)

Prevention of Mother-to-Child Transmission:
– Number of HIV-positive pregnant women who received ARV medication/prophylaxis to reduce the risk of mother-to-child transmission during the prenatal consult (CPN)
– Number of HIV-positive pregnant women who received ARV medication/prophylaxis to reduce the risk of mother-to-child transmission during labor and delivery (MAT)
– Number of children exposed to HIV who received a PCR test at <8 weeks (PCR)

Counseling and Testing:
– Number of people who were tested for HIV and received their results in a clinical environment (UATS)

Voluntary Medical Male Circumcision:
– Number of men circumcised as part of the voluntary package of male circumcision for HIV prevention (CM)
7. Overall DQA Implementation Methodology
1. Calendar of DQA implementation agreed with the MOH (including site selection)
2. MOH informs the Provincial Health Departments (DPS) of DQA implementation dates and facilities
3. DPS inform the District Health Directorates (DDS) and Health Facilities (HF) of DQA implementation and dates
4. Training of MOH central staff (prior to departure to the provinces)
5. Training for DPS and Implementing Partner (IP) staff at the province
6. DQA implementation (with debrief at the HF level)
7. DQA debrief at the province level for DPS and IP
8. National debrief at MOH central level
16. Key Aspects
• Country-wide, annual reporting by province
• Effective MOH ownership, with actual involvement from the beginning: selection of indicators and health facilities, definition of calendars and team composition, and budget allocation to the activity
• Methodological approach with immediate written feedback at the level where most discrepancies occur (the health facility) – the innovative factor
• No new technology – uses only technology already available in the public system
• Sites that received more than one visit show improvements, confirming that the inclusion of HF staff and the immediate feedback have a strong positive impact on future improvement actions
18. Key Findings
• Ownership of the activity by the MOH and leadership at the province level are essential for real impact and change at the HF level
• The regular, periodic nature of the activity and the consistency of feedback and recommendations had a very positive impact on the results
• The "non-punitive" nature of the activity made HF staff more receptive and more willing to implement changes
20. Moving Forward
• "From QA to QI" – a clear gap exists between the DQA activity and the implementation of the changes that need to happen; use the momentum created by the DQA to generate change
• TA delivery to the MOH for DQA implementation
• Roll-out of the methodology to other program areas – DQA methodology with the Malaria, TB, and MCH programs (2016/17)
• Inclusion of the DQA activity in the official MOH budget