OVC_HIVSTAT and Linkages to Care for Strengthened Collection, Analysis, and Use of Routine Health Data
This webinar focused on explaining the HIV Risk Assessment cascade and how it is related to OVC_HIVSTAT disaggregates. The presenters also provided guidance for how OVC_HIVSTAT data can be analyzed to enhance program outcomes.
1. OVC_HIVSTAT and
Linkages to Care
for Strengthened Collection, Analysis,
and Use of Routine Health Data
Jenny Mwanza, MPH
Kristen Brugh, PhD
Lisa Parker, PhD
MEASURE Evaluation
Erin Schelar, MPH, RN
Amy Aberra, MPH
USAID Washington
March 14, 2018
Global Webinar for PEPFAR OVC
Programs
2. Global, five-year, $232M cooperative agreement
6 partners, led by the University of North Carolina at Chapel Hill
Strategic objective:
Strengthen capacity in developing countries to gather, interpret, and
use data to improve health
MEASURE Evaluation Overview
3. Local Partners and Capacity
Building Are Key
Prime: UNC-CH and partners:
ICF
John Snow, Inc.
Management Sciences for Health
Palladium
Tulane University
MEASURE Evaluation works with more than 72 local sub-awardees in over 27 countries
Over 26 percent of project funding goes back to local sub-awardees
7. Activity Objectives
1. Strengthen data collection and management for
reporting on OVC_HIVSTAT
2. Improve tracking of the OVC platform’s
contributions to 95-95-95
8. Activity Deliverables
1. Webinar for all global OVC programs to improve reporting of
OVC_HIVSTAT data for FY18 Q2
2. Study report to expand on the webinar, including examples of best practices
3. Technical assistance for implementing partners that participated in the study to strengthen their M&E systems and improve collection, analysis, and use of OVC_HIVSTAT data
4. HIV Risk Assessment Prototype to provide a structure for the
data collection tool that enables high-quality data
collection on risk behaviors for children and adolescents*
*The prototype will not provide guidance on the types of questions to include
9. Webinar Objectives
1. Clarify rationale for collecting OVC_HIVSTAT data
2. Explain the HIV Risk Assessment cascade and how it is
related to OVC_HIVSTAT disaggregates
3. Demonstrate how OVC_HIVSTAT data can be
analyzed to enhance program outcomes
4. Provide recommendations on how to revise and
implement HIV Risk Assessment
10. Background
• OVC_HIVSTAT data was requested by PEPFAR in FY2017 Quarter 2
with the aim of strengthening the role of OVC programs to identify
children at risk for HIV infection, ensure they are tested, and link
them to care and treatment.
• Performance, data quality, and contextual factors were considered
to select three countries (South Africa, Côte d’Ivoire, and
Zimbabwe) for in-depth study.
• A total of six implementing partners across three countries were
visited between November 2017 and February 2018; 32
qualitative interviews were conducted and over 60 community
volunteers participated in workshops.
• HIV risk assessments, indicator reference sheets, and standard
operating procedures were collected from each implementing
partner.
11. Rationale for OVC_HIVSTAT
1. Implementing partners should assess the HIV risk of OVC
enrolled in their programs in order to focus HIV
counseling and testing services on those OVC determined
to be most at risk for HIV infection.
2. Implementing partners should track whether OVC who
report being HIV positive are successfully linked to and
retained in treatment and care.
13. DATIM
DSD: OVC_HIVSTAT: Total (auto-calculated): Number of OVC with HIV status reported to the implementing partner (including status not reported). The numerator will auto-calculate from the Status Type disaggregate.
Numerator: required, disaggregated by status type
• Reported HIV positive to IP (includes tested in the reporting period and known positive)
  – Of those positive: currently receiving ART
  – Of those positive: not currently receiving ART
• Reported HIV negative to IP
• No HIV status reported to the implementing partner
  – Of those not reported: test not indicated
  – Of those not reported: other reasons
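The auto-calculation can be illustrated with a short Python sketch. The counts and variable names below are invented for illustration; the point is simply that the total is the sum of the status-type disaggregates and is never entered independently of them.

```python
# Minimal sketch with dummy counts and hypothetical variable names:
# the OVC_HIVSTAT numerator auto-calculates from the disaggregates.

positive_on_art = 120             # Reported HIV positive to IP, currently receiving ART
positive_not_on_art = 15          # Reported HIV positive to IP, not currently receiving ART
negative = 830                    # Reported HIV negative to IP
unknown_test_not_indicated = 400  # No status reported: test not indicated
unknown_other_reasons = 95        # No status reported: other reasons

reported_positive = positive_on_art + positive_not_on_art                 # 135
no_status_reported = unknown_test_not_indicated + unknown_other_reasons   # 495
ovc_hivstat_total = reported_positive + negative + no_status_reported

print(ovc_hivstat_total)  # 1460
```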
14. Assessment Cascade
1. Register OVC & elicit HIV status
2. Conduct HIV Risk Assessment
3. Refer at-risk children to testing
4. Elicit test result & document
15. HIV Unknown – Other Reasons
[Flow diagram of the assessment cascade: 1. Register OVC & elicit HIV status; 2. Conduct HIV Risk Assessment; 3. Refer at-risk children to testing; 4. Elicit test result & document. At each step, children branch into HIV positive (on ART or not on ART), HIV negative, HIV unknown, at risk, not at risk, refused assessment, referral made, referral completed, or refused to report a result. Six of these end states are highlighted in red as boxes A–F.]
At the end of a reporting period, any OVC recorded in one of the red boxes (A + B + C + D + E + F) in the MIS database should be reported in DATIM as "HIV Unknown – Other Reasons."
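This reporting rule can be sketched in code. The function below is a simplified illustration, not the official DATIM logic: the field names and values are hypothetical, and it assumes that a "not at risk" outcome maps to "test not indicated" while any unresolved state (at risk but untested, referral not completed, or a refusal) falls through to "other reasons."

```python
# Hedged sketch with hypothetical field names: map an end-of-period MIS record
# to a DATIM status category, per the rule on the slide above.

def datim_status(record: dict) -> str:
    status = record.get("hiv_status")     # e.g. "positive", "negative", or None
    risk = record.get("risk_assessment")  # "at_risk", "not_at_risk", "refused", or None
    if status == "positive":
        return ("HIV positive: currently receiving ART" if record.get("on_art")
                else "HIV positive: not currently receiving ART")
    if status == "negative":
        return "HIV negative"
    if risk == "not_at_risk":
        return "HIV unknown - test not indicated"
    # At risk but untested, referral incomplete, refused assessment,
    # or refused to report a result.
    return "HIV unknown - other reasons"

# Example: an at-risk child whose referral was not completed by period end.
print(datim_status({"hiv_status": None, "risk_assessment": "at_risk"}))
# HIV unknown - other reasons
```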
16. MIS Database Fields
Required
HIV Assessment:
1. HIV positive
2. HIV positive on ART
3. HIV positive not on ART
4. HIV negative
5. HIV unknown
6. HIV unknown – test not indicated
7. HIV unknown – other reasons
Suggested
HIV Risk Assessment outcome:
1. At risk
2. Not at risk
HIV Test Referral:
1. Referral made
2. Referral completed
3. Refuse to self-report HIV status
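One way to carry these fields in an MIS record is sketched below. The class and field names are hypothetical; only the value lists come from the slide.

```python
# Hedged sketch with hypothetical names: a record holding the required status
# field plus the suggested risk-assessment and referral fields.
from dataclasses import dataclass
from typing import Optional

HIV_STATUS_VALUES = (
    "HIV positive",
    "HIV positive on ART",
    "HIV positive not on ART",
    "HIV negative",
    "HIV unknown",
    "HIV unknown - test not indicated",
    "HIV unknown - other reasons",
)

@dataclass
class OvcRecord:
    hiv_status: str                        # required: one of HIV_STATUS_VALUES
    risk_assessment: Optional[str] = None  # suggested: "At risk" or "Not at risk"
    test_referral: Optional[str] = None    # suggested: "Referral made",
                                           # "Referral completed", or
                                           # "Refuse to self-report HIV status"
```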
17. Challenges
Data collection tool
• Risk assessment questions can be confusing
• Outcomes of “At Risk” and “Not at Risk” are not clearly labeled
• Next steps related to HIV test referral are often missing
Record of new test results
• Community workers often hesitate to record HIV-positive test results
• HIV-negative test results are often not recorded at all
• Inconsistent linkage between the paper forms and MIS database
Update of HIV treatment status
• Despite strong documentation of ART treatment status at enrollment, there
was weak documentation of HIV treatment status at 6-month intervals
18. Data Quality Controls
1. Community volunteers apply the HIV Risk Assessment with guardians to determine if the child displays HIV risk factors.
2. These risk factors are recorded on a paper data collection tool; the results should be clearly documented.
3. Data entry clerks record "HIV Unknown – At Risk" or "HIV Unknown – Test Not Indicated."
4. In a data quality audit, we must be able to observe these outcomes on the individual child's form.
5. In a contact trace and verify exercise, the guardian may be re-interviewed to determine if the child displays the same risk factors.
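A simple cross-check between the paper forms and the database supports these controls. The sketch below is illustrative only, with hypothetical data structures; it flags children whose recorded outcome differs between the two sources so they can be followed up during an audit.

```python
# Hedged sketch with hypothetical structures: flag children whose outcome in
# the MIS database does not match the outcome documented on their paper form.

def find_mismatches(paper_forms: dict, mis_records: dict) -> list:
    """Both arguments map child_id -> recorded risk/status outcome."""
    mismatches = []
    for child_id, paper_outcome in paper_forms.items():
        mis_outcome = mis_records.get(child_id)
        if mis_outcome != paper_outcome:
            mismatches.append((child_id, paper_outcome, mis_outcome))
    return mismatches

# Example: child "A102" was assessed "At risk" on paper but entered as
# "Not at risk" in the database, so the record is flagged.
print(find_mismatches({"A102": "At risk"}, {"A102": "Not at risk"}))
```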
21. OVC_HIVSTAT Logic Model
Input:
• High-quality collection forms
• Robust database
• Technical capacity
• Standard operating procedures
Process:
• % of unknown who have been assessed
• % of "at risk" who have been referred
• % of referrals completed
• % of HIV positive with updated Tx
Outcome:
• % of OVC for whom HIV status is known or test is not indicated
• % of HIV-positive OVC currently on ART (1)
Impact:
• 95% of all people living with HIV will know their HIV status
• 95% of all HIV-positive people on ART (2)
(1) OVC programs measure self-reported ART treatment status
(2) Point-of-care data measure actual ART adherence
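The four process indicators can be computed directly from MIS counts. The sketch below uses dummy counts purely for illustration.

```python
# Hedged sketch with dummy counts: the four process indicators from the logic
# model, computed for a single reporting period.

unknown_status = 500        # OVC with no HIV status reported
assessed = 450              # of those, completed an HIV Risk Assessment
at_risk = 120               # assessed as "at risk"
referred = 100              # at-risk children referred for HIV testing
referrals_completed = 80    # referrals known to have been completed
hiv_positive = 60           # OVC reported HIV positive
tx_status_updated = 45      # HIV-positive OVC with an updated treatment status

pct_assessed = 100 * assessed / unknown_status                   # 90.0
pct_referred = 100 * referred / at_risk                          # ~83.3
pct_referrals_completed = 100 * referrals_completed / referred   # 80.0
pct_tx_updated = 100 * tx_status_updated / hiv_positive          # 75.0
```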
22. Assessment Cascade Overview
*These disaggregates are not reported via DATIM; these dummy data are presented to demonstrate how MIS data can be used to strengthen internal performance management of IPs and the narrative sections of reporting to USAID
28. Process Indicators
• We notice that children are "lost" at each stage of the assessment cascade.
• Therefore, we recommend that all MIS databases collect this information for internal performance monitoring, to target supervision and conduct ongoing training.
• Although process indicator data will not be reported via DATIM, we suggest that there is a direct link between process indicators and outcome indicators.
• IPs should share feedback on process indicators with both sub-recipients in the field and Mission focal points/HQ in the narrative sections of their biannual reports.
34. Global OVC Programs
HIV Positive, HIV Negative, HIV Unknown – Test
Not Indicated & OVC_SERV
35. Global OVC Programs
Close-up of Kenya and Zimbabwe
• Kenya shows strong performance because it has tested a large proportion of its OVC_SERV population.
• Zimbabwe, on the other hand, shows large numbers of children who have been assessed and determined "test not indicated."
36. Global OVC Programs
89% HIV-positive OVC on ART (FY17Q4)*
*These data probably best measure linkage to treatment
and exclude self-reported adherence
38. Recommendations (1): to revise the HIV Risk Assessment
1. Include risk factors that warrant testing; if any one risk factor is met, the child is referred for HIV testing.
2. Track the outcomes of "At Risk" and "Not at Risk" on the form
3. Track testing referral and completion for "At Risk" children on the form
4. Track new HIV test results on the form (or elsewhere)
5. Ensure new HIV test results are entered into the MIS database
6. Ensure ART treatment status is collected regularly on the form (or elsewhere)
(MEASURE Evaluation will publish an HIV Risk Assessment Prototype to strengthen the M&E aspects of data collection)
39. Recommendations (2): to implement the HIV Risk Assessment
1. Integrate the HIV Risk Assessment within household visits to ensure sustainability
2. Use the paper data collection tool during the interview to improve reliability of the data
3. Organize a training specifically on how to conduct the HIV Risk Assessment
• Discuss each of the HIV risk factors
• Identify probing questions that are culturally appropriate
• Do role-plays to explore different scenarios
40. Recommendations (3): to use OVC_HIVSTAT data
1. Implementing partners to provide regular feedback on process indicators to sub-recipients
2. Implementing partners to establish internal targets for outcome indicators
3. Sub-recipients to organize quarterly data analysis meetings to review progress
4. Sub-recipients to identify districts with weak performance and provide supportive supervision and enhanced training
5. Through regular analysis of data, improve the linkage between risk assessment, testing, and treatment
Process indicators:
• % assessed of unknown
• % referred of at risk
• % referrals completed
• % HIV Tx status updated
Outcome indicators (see the worked sketch below):
• % of OVC with known status or Test Not Indicated
• % of HIV-positive OVC currently on ART
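The sketch below shows how the two outcome indicators could be computed. The counts are dummy values, and the denominators are assumptions (OVC_SERV for the first indicator, all OVC reported HIV positive for the second), not an official definition.

```python
# Hedged sketch with dummy counts and assumed denominators: the two outcome
# indicators recommended for internal targets.

ovc_serv = 2000                # OVC served in the reporting period (assumed denominator)
known_or_not_indicated = 1460  # known HIV status or "test not indicated"
hiv_positive = 135             # OVC reported HIV positive
on_art = 120                   # of those, currently receiving ART

pct_known_or_not_indicated = 100 * known_or_not_indicated / ovc_serv  # 73.0
pct_positive_on_art = 100 * on_art / hiv_positive                     # ~88.9
```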
41. Thank you!
To the tireless community
volunteers, M&E officers,
and leaders of
implementing partners
who continually strive to
improve linkages between
OVC populations and HIV
testing and treatment
from HIVSA, PACT, REVE,
MAVAMBO, HOSPAZ,
and FACT.
42. Acknowledgments
MEASURE Evaluation would like to thank Christine Fu, Amy
Aberra, and Erin Schelar of USAID/Washington, for their
continued support and insights into improving HIV Risk
Assessment of Orphans and Vulnerable Children in PEPFAR
priority countries.
USAID Health Teams in South Africa, Côte d’Ivoire, and
Zimbabwe—Ambereen Jaffer, Anita Sampson, Brilliant
Nkomo, Collen Marawanyika, David Chikoka, Kathryn
Reichert, Lauren Murphy, Lucie Dagri, Mavis Boateng,
Naletsana Masango, Natalie Kruse-Levy, Samson Chidiya —
and Brenda Yamba from the Regional Office, provided
invaluable guidance and access to local implementing
partners.
43. This presentation was produced with the support of the United States Agency for
International Development (USAID) under the terms of MEASURE Evaluation
cooperative agreement AID-OAA-L-14-00004. MEASURE Evaluation is
implemented by the Carolina Population Center, University of North Carolina at
Chapel Hill in partnership with ICF International; John Snow, Inc.; Management
Sciences for Health; Palladium; and Tulane University. Views expressed are not
necessarily those of USAID or the United States government.
www.measureevaluation.org