CLINICAL DATA QUALITY IN MOZAMBIQUE
A COMPARATIVE EXERCISE AFTER FOUR ROUNDS
Mozambique Strategic Information Project (MSIP)
JSI Research & Training Institute, Inc. (JSI)
Prepared by:
Dália Monteiro Traça,
Chief of Party
dtraca@mz.jsi.com
November 7, 2017
No relationships to disclose.
Objectives of the DQA Strategy
• Assess the quality of reported data
• Use results to inform quality improvement
• Build capacity of national health information systems
Strategic Approach
Create a sustainable Data Quality Assessment system that is
affordable, accepted, owned and scalable by the MOH
Prioritize the inclusion of MOH staff in all steps of
the development, piloting and implementation
of the DQA strategy
Promote the alignment of the existing reporting systems
(PEPFAR and DHIS2)
DQA Objectives
• To assess the quality of data registered in primary sources
and data reported to higher levels, verifying the following
sources:
– Daily registers vs. monthly reports (Health Facility),
– National database DHIS2 (District),
– DHIS2 (Province),
– DHIS2 (Central level)
• To assess the data management and reporting systems at
the HF and District levels.
Assessed Indicators
Treatment and Care
– Number of HIV-positive individuals active on ART (TARV)
– Number of HIV-positive individuals who are eligible for
Cotrimoxazole (CTX) and receive CTX (CTX)
Prevention of Mother-to-Child Transmission
– Number of HIV-positive pregnant women who received ARV
medication/prophylaxis to reduce the risk of mother-to-child
transmission during the prenatal consultation (CPN)
– Number of HIV-positive pregnant women who received ARV
medication/prophylaxis to reduce the risk of mother-to-child
transmission during labor and delivery (MAT)
– Number of HIV-exposed children who received a PCR test
at <8 weeks (PCR)
Counseling and Testing
– Number of people tested for HIV who received their results
in a clinical setting (UATS)
Voluntary Medical Male Circumcision
– Number of men circumcised as part of the voluntary male
circumcision package for HIV prevention (CM)
Overall DQA Implementation Methodology
1. Calendar of DQA implementation with MOH (including site
selection)
2. MOH informs Provincial Health Departments (DPS) of DQA
implementation dates and facilities
3. DPS informs District Health Directorates (DDS) and Health
Facility (HF) of DQA implementation and dates
4. Training of MOH central staff (prior to departure to provinces)
5. Training for DPS and Implementing Partner (IP) staff at
province
6. DQA implementation (with debrief at HF level)
7. DQA debrief at province level for DPS and IP
8. National debrief at the MOH central level
DQA RESULTS
Rounds 2014, 2015, 2016 & 2017
Deviation:
Good data quality = <10%
Moderate data quality = 10-20%
Poor data quality = >20%
Indicator ART: Deviation in Reporting
[Bar chart: national ART reporting deviation by round – 2014: 37%, 2015: 28%, 2016: 22%, 2017: 20%]
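The quality bands above can be expressed as a small check. Note the deviation formula (absolute difference between the recounted register value and the reported value, relative to the recount) is an assumption for illustration; the slides give only the thresholds, not the exact computation.

```python
def percent_deviation(recounted, reported):
    """Absolute percent deviation of the reported value from the
    recounted (register) value. The formula is an assumption;
    the slides state only the quality thresholds."""
    if recounted == 0:
        raise ValueError("recounted value must be non-zero")
    return abs(reported - recounted) / recounted * 100

def quality_band(deviation):
    """Map a percent deviation to the bands used in the DQA:
    <10% good, 10-20% moderate, >20% poor."""
    if deviation < 10:
        return "good"
    elif deviation <= 20:
        return "moderate"
    return "poor"

# Hypothetical facility: register recount of 250 ART patients vs. 300 reported.
dev = percent_deviation(250, 300)  # 20.0
print(quality_band(dev))           # moderate
```

Under this reading, the national ART deviations shown (37%, 28%, 22%, 20%) moved from the "poor" band in 2014 to the edge of "moderate" by 2017.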
Indicator ART: Deviation in Reporting (Repeat Sites), 2016
Indicator ART: Systems Assessment Scores
Indicator ART: Registers
Indicator ART: Patient Files (before and after)
National Level Deviations by Indicator, DQA 2014-2017
Discussion
Key Aspects
• Country-wide, annual reporting by province
• Effective MOH ownership, with real involvement from the beginning:
selection of indicators and health facilities, definition of calendars and
team composition, and budget allocation for the activity
• Methodological approach with immediate written feedback at the level
where most discrepancies occur (Health Facility) - the innovative factor
• No new technology: uses only technology already available in the public
system
• Sites that received more than one visit show improvements, confirming
that the inclusion of HF staff and the immediate feedback have a strong
positive impact on subsequent improvement actions
Key Findings
 Ownership of the activity by MOH and leadership at
province level are essential for real impact and change at HF
level
 The regular, periodic nature of the activity and consistency of
feedback and recommendations had a very positive impact on
the results
 The “non-punitive” nature of the activity made HF staff more
welcoming of it and more willing to implement changes
Moving Forward
 “From QA to QI” – a clear gap exists between the DQA activity
and the implementation of the changes that need to happen;
 Use the momentum created by the DQA to generate change;
 Deliver TA to the MOH for DQA implementation;
 Roll out the methodology to other program areas – DQA
methodology with Malaria, TB and MCH programs (2016/17);
 Include the DQA activity in the official MOH budget.
OBRIGADA! (Thank you!)
Dália Monteiro Traça
Chief of Party, MSIP
Maputo, Mozambique
dtraca@mz.jsi.com
