The document describes an IMCI Bulletin tool used in Rwanda to monitor and improve the quality of community-based integrated management of childhood illness (C-IMCI). The tool was developed within a quality assurance framework to provide an overarching view of district and health center C-IMCI performance. It enables decision-makers to assess progress in real-time across critical indicators and standards. The tool collects data monthly from registers, reports, and supervisions at various levels. It analyzes the data and provides automated feedback to trigger quality improvement discussions. While the tool provided benefits like improved quality and sustainability, challenges remain around duplication with the health information system. Adaptation would be needed for use in other contexts.
Using and Improving Indicators for CCM of Sick Children_Landegger_5.3.12
1. Using Indicators for Quality Improvement:
Overview of the IMCI Bulletin Tool
CORE Spring Meeting May 2012
2. Overview
• Context of tool development
• Methodology of use
- Tool
- Information produced
• Tool platform and components
• Benefits
• Challenges
• Relevance in other contexts
3. Tool Context
• Expanded Impact Project (EIP), 2006–2011, focused on CCM scale-up,
quality of care and community mobilization
• Designed within a systems strengthening and quality assurance
framework to provide an overarching view of district and health
center C-IMCI performance
• Enables decision makers to assess progress in real-time across
critical indicators/standards to trigger quality improvement
4. C-IMCI Quality Improvement Landscape in Rwanda
[Diagram: information and supervision flows across levels]
• District — Hospital: Medical Director, CHW supervisor; Bulletin analysis and feedback meetings (1–2 hours from the health center by motorcycle or car)
• Sector — Health Center: Data Manager, HC CHW supervisor, cell coordinators; data compiling, reporting and feedback meetings
• Cell — Cell coordinator (1 hour to 1 day from the villages on foot, by bicycle or motorcycle)
• Village — CHWs and CHW groups: Community Case Management and Health Promotion Activities
5. Methodology
Who?
• Data managers (HC)
• Cell coordinators
• CHW supervisors (HC)
• CHW supervisors (hospital)
• EIP QA and M&E staff (HC)
• District planners
• District authorities
6. Methodology
What?
• Tool explores elements of quality
improvement, with corresponding
indicators and performance standards
• Designed to collect and analyze data
from multiple sources
[Data sources feeding the Excel database: CHW Patient and Drug Register, CHW Compiled Report, CCM Health Facility Report, Supervision Checklist]
• Based on existing CCM data flow
• And more…
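The monthly step from CHW registers to a compiled report can be sketched as a simple aggregation — a minimal illustration only, with hypothetical field names (the actual tool was an Excel workbook, not code):

```python
from collections import Counter

def compile_monthly_report(register_rows):
    """Aggregate CHW patient-register rows into the counts a compiled
    report would carry up to the health center (fields are hypothetical)."""
    totals = Counter()
    for row in register_rows:
        totals["cases_seen"] += 1
        totals["cases_" + row["illness"]] += 1
        if row["treated_correctly"]:
            totals["correctly_treated"] += 1
    return dict(totals)

rows = [
    {"illness": "diarrhea", "treated_correctly": True},
    {"illness": "malaria", "treated_correctly": True},
    {"illness": "diarrhea", "treated_correctly": False},
]
report = compile_monthly_report(rows)
# report["cases_seen"] == 3; report["cases_diarrhea"] == 2
```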
7. Methodology
Where and When?
• Community, ongoing data
collection
• Implementation of corrective
measures
• Health Center, monthly data
aggregation and quality checks
• Analysis and feedback
• Hospital and District, quarterly
data consolidation
• Results dissemination, feedback
and planning
• National, six-monthly data review
8. Methodology
How are the tool and its
information useful?
1. Real-time review of automated
results
2. Feedback and planning with
stakeholders and decision makers
3. Best performing HC per district
publicly acknowledged
4. Analysis at community health desk
and technical working group
5. Barriers to quality improvement
identified, workplans
adapted, new targets developed
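The "real-time review of automated results" above amounts to checking each achievement against its agreed standard — a hedged sketch, with invented indicator names and cut-offs rather than the Bulletin's actual 17:

```python
def flag_indicators(results, standards):
    """Compare each indicator's achievement to its agreed standard and
    return a simple signal of the kind the Bulletin surfaced to trigger
    quality improvement discussion (names and targets are illustrative)."""
    return {
        name: ("meets standard" if results.get(name, 0) >= target
               else "below standard")
        for name, target in standards.items()
    }

standards = {"reports_on_time_pct": 90, "stockout_free_months_pct": 95}
results = {"reports_on_time_pct": 96, "stockout_free_months_pct": 80}
flags = flag_indicators(results, standards)
# flags["stockout_free_months_pct"] == "below standard"
```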
9. Methodology
Why incorporate?
• To identify strengths and
weaknesses by comparing
achievements to agreed norms
• To provide a practical guide
for further planning and
decision making
• To share strengths, replicate
best practices and move quickly
to improve on weaknesses
11. Finally…
• 2008: district scorecard pilot in one HC
• 2010: scorecard revised into IMCI
Bulletin allowing all HCs (across six
districts) to collect data monthly
• Six districts' data consolidated every six
months (June 2010, January 2011, and
June 2011)
• Final tool: home page, help page, data
entry platform, automated analysis
22. Tool Components
• Design
• Indicators
• Standards
Indicators based on –
• Global C-IMCI technical
reference materials
• MoH guidance
• Performance-based
financing (PBF)
• Data simplicity and
accessibility
Standards based on –
• Evidence (expected
incidence…)
• Consensus (feasible
targets)
23. 17 indicators and standards
across six focus areas –
• Utilization of services
• Medication stock management
• Community participation
• Community case management
• Human resources for health
• Reporting systems
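The indicator set mixes yes/no standards with numeric ones across the six focus areas; one way to picture such a record is sketched below, with invented example indicators rather than the actual 17:

```python
from dataclasses import dataclass
from typing import Union

@dataclass
class Indicator:
    """One Bulletin-style indicator with its agreed standard.
    Some standards are yes/no, others are figures; the examples
    below are hypothetical, not the actual indicator set."""
    focus_area: str
    name: str
    standard: Union[bool, float]

indicators = [
    Indicator("Reporting systems", "monthly report submitted on time", True),
    Indicator("Utilization of services", "sick children seen by a CHW (%)", 80.0),
    Indicator("Medication stock management", "no ORS stockout this month", True),
]

# Group indicators by focus area for review at feedback meetings
by_area = {}
for ind in indicators:
    by_area.setdefault(ind.focus_area, []).append(ind)
```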
24. Bulletin Standards and
Indicators
vs.
CCM Benchmarks and
Indicators
Component 3: Targeted CHWs
providing CCM
Component 4: Medicine and
diagnostic availability
Component 5: Treatment
coverage
Component 7: Routine
supervision coverage
Component 7: Correct case
management practice
25. Bulletin Benefits to EIP
• Did quality improve?
– Contributed to quality of care
– HC quality of monitoring evolved and
improved
• Handwritten data displays on the wall
• Automated data tables
• Auto-generated analysis tools: bar charts
and trend lines
– Increased episodes of information
feedback, ‘flowing downstream’
• Supplied reliable data to C-IMCI
stakeholders
26. Bulletin Benefits to EIP
• Is it sustainable?
– Better inclusion of C-IMCI data in HIS
• Secondment of staff
• Bulletin indicators integrated into national HIS
– “I am certain that the bulletin will continue. It
will be led by the health center Data
Manager, and I will ensure this during my
supervision visits to the health center, as well
as the Data Manager at the hospital level.”
27. Bulletin Benefits to EIP
• Contribute as advertised?
Served technical + motivational purposes
Provided a reasonable set of
standards, comparable over time and
between areas
Serves as "evolving learning tool" on
producing and using information to
support performance and quality of care
Met critical need for timely information
usable at the local level
But…was not as 'stand-alone' as intended
28. Challenges
• Clear need for further improvement
– Tool logic is okay
– Redesign necessary for use at national scale
• Duplication of HIS
– Is that outweighed by the rapid feedback of usable information
at the local level?
29. EIP Final Evaluation Findings
Strengths
• Establishes local standards
• Information locally relevant:
– Utilization
– Drug supply
– Human resources
– Coverage
– Quality of treatment
– Reporting
• "Real-time" automated signals
• Basis for advancing culture of quality
• "Mirror" of performance referred to by MOH
• Developed at low cost
Weaknesses
• Duplication with HIS outweighing complementarity?
• Replication needs adaptation and re-linking to HIS
30. Applicable for other contexts?
• Relevance
– Where there are champions for improving the quality of
C-IMCI data and monitoring systems
– Culture of learning at many levels
• Incentives
– One size/format does not fit all
• Manageability
– Considerable demands on HC staff
• MoH and/or NGO training and ongoing
technical support
– Hardware and electricity
• Adaptability
– User-friendly format
– Low cost
– Start small, then scale
31. Relevance to CCM Benchmarks
Advocacy and Planning | Pilot and Early Implementation | Expansion and Scale-Up
1. Coordination and Policy Setting
2. Financing
3. Human Resources
4. Supply Chain Management
5. Service Delivery and Referral
6. Communication and Social
Mobilization
7. Supervision and Performance
Quality Assurance
8. M&E and Health Information
Systems
Pre-CCM Benchmarks! Six districts in Rwanda. Responded to a need for the ability to assess progress.
Overview of the CCM system in Rwanda: CHWs recognized as the frontline of the MoH and receive stipends, as do the supervisors. The cell coordinator is like an empowered peer supervisor. Illustration of information and supervision flows – very busy. EIP aimed to strengthen existing data reporting mechanisms – really focusing on the feedback, downward data flow. NB: context of a revised HIS, with PBF, simultaneously being developed (data managers).
Throughout this methodology section, it's important to discern what happens with the tool and what happens with the information produced from the tool: input data; support data entry, data checks and analysis; use the analysis for decision-making.
1) What does it do / is it supposed to do? 2) What data goes into it?
Comm – collection and then action implementation post feedback meetings. HC – data consolidation, analysis, discussion (feedback meetings), dissemination. Hosp + district – coordination and feedback meetings and public results dissemination. National – TWG.
Main objectives
Long design process with significant evolution over time. Tool imagined during Mar 2008 technical visit. MoH involved in design, especially at the district level, but not enough. Inadequate and not visually pleasing.
Bulletin allowed information to be produced faster, with data entry and analysis done at HC level!
Four components
Criteria definition tab is incomplete…attempted to define on this page when you click on the yellow boxes. NB: some are yes/no, some are figures.
Aggregated
Automatically generated graphs and charts to support analysis
Diarrhea incidence: If each child under 5 has 3 episodes/year covered at 100%, this would mean 25% of children covered every quarter…but, it’s not realistic to think coverage would be 100%...
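The incidence arithmetic in this note can be made explicit — a minimal sketch using the note's assumed 3 episodes per child per year, with a placeholder coverage rate and population, since the note itself flags that 100% coverage is unrealistic:

```python
def expected_quarterly_cases(children_u5, episodes_per_child_year, coverage):
    """Episodes you'd expect CHWs to treat in one quarter, given an
    assumed incidence and coverage rate (all figures illustrative)."""
    return children_u5 * episodes_per_child_year / 4 * coverage

# e.g. 1,000 under-fives, 3 diarrhea episodes/child/year, 60% coverage
cases = expected_quarterly_cases(1000, 3, 0.6)
# cases == 450.0
```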
Similar to the database design process…agreeing on indicators took some time…
Again, pre-formal Benchmarks…but many of the data quality elements are the same as the CCM Benchmarks
Feedback: dual information flow
Though the MoH was not very engaged in the beginning, there was appreciation at the end; we're working to integrate the indicators and the parts of the tool platform liked by the MoH. We got buy-in along the way. The Bulletin prompted progress on the integration of IMCI data into the HIS: indicators on treatment, supervision, stock management and CHW meetings.
Remember: assess progress in real-time across critical indicators/standards to trigger quality improvement. Comparable: HC and districts.
A couple of key points to consider… The 2011 HFA found 2.6–4.2 computers/facility; a per-district range of 37%–80% of facilities with reliable power.
Using the same slide as Serge's presentation last year… We are hoping to move to the scale-up level on "quality assurance and performance, feedback used for problem solving and coaching…"