Addressing Complexity in the Impact Evaluation of the Cross-Border Health Integrated Partnership Project in East Africa


Presented by Grace Mulholland at the 2016 AEA conference.

  1. 1. Addressing Complexity in the Impact Evaluation of the Cross-Border Health Integrated Partnership Project in East Africa Grace Mulholland, MSPH MEASURE Evaluation University of North Carolina at Chapel Hill October 27, 2016 American Evaluation Association 2016 Conference
  2. 2. Acknowledgements • Jess Edwards, MEASURE Evaluation • Milissa Markiewicz, MEASURE Evaluation • Freddie Ssengooba, Makerere University • Sharon Weir, MEASURE Evaluation • Sian Curtis, MEASURE Evaluation • Peter Arimi, USAID/East Africa
  3. 3. Outline 1. CB-HIPP overview 2. Evaluation overview 3. Sources of complexity 4. Addressing complexity
  4. 4. CB-HIPP Overview The purpose of the Cross-Border Health Integrated Partnership Project (CB-HIPP) is to improve access to quality health services and improve health outcomes in cross-border areas.
  5. 5. CB-HIPP Overview Implementation of CB-HIPP is planned for 4 land border sites and 3 wet border sites in Uganda, Kenya, Tanzania, Rwanda, and Burundi.
  6. 6. Evaluation Overview The purpose of this evaluation is to quantify the impact of CB-HIPP on its intended health outcomes.
  7. 7. Evaluation Overview 1. Identify comparison sites 2. Measure health outcomes in intervention and comparison sites at baseline and over time 3. Compare differences in health outcomes over time in intervention and comparison sites Steps
  8. 8. Evaluation Overview Method [Graph: outcome over time, from program start to program midpoint or end, comparing the trajectory with the program to the trajectory without the program; the gap between the two is the program impact, after accounting for the effect of other factors] We want to compare outcomes in CB-HIPP intervention sites with outcomes that would have occurred without CB-HIPP.
  9. 9. Evaluation Overview Sites
  10. 10. Evaluation Overview Measurement of health outcomes involves a mixed-methods approach. • Quantitative cross-sectional survey (via the “PLACE” method) • Medical records review • Qualitative and quantitative interviews of personnel at health facilities Measurement
  11. 11. Sources of Complexity • Number of outcomes • Number of populations • Sensitive outcomes • Varied contexts • Trends in health outcomes unrelated to the intervention
  12. 12. Addressing Complexity Various stakeholders were consulted for input, including: • Project implementers • EAC delegates • Local government representatives • NGO & civil society organization representatives • Health officials and health management teams Strategy: Seek multiple perspectives.
  13. 13. Addressing Complexity MEASURE Evaluation staff visited 7 potential intervention and comparison sites. Findings from the scoping visits informed the evaluation questions. Strategy: Include scoping visits.
  14. 14. Addressing Complexity Data sources: • Mapping readiness assessment • Bio-behavioral survey • Medical records review • Interviews of health facility personnel Local data collectors are recruited and trained at each cross-border site. Strategy: Collect data from multiple sources and recruit local data collectors.
  15. 15. Addressing Complexity • HIV/AIDS • Prevention • Care and treatment • Prevention of mother-to-child transmission • Tuberculosis • Antenatal care • Maternal and child health • Reproductive and sexual health Strategy: Measure a broad range of outcomes.
  16. 16. Addressing Complexity In the bio-behavioral survey: e.g., self-reported sexual behaviors In interviews at health facilities: e.g., reported coordination between health facilities Strategy: Collect data for leading indicators.
  17. 17. Addressing Complexity Account for background effects. In a difference-in-differences approach, select appropriate comparison sites to serve as counterfactuals for the intervention sites. Strategy: Choose appropriate analytic methods.
  18. 18. Addressing Complexity • Qualitative interviews at health facilities • Frequent communication with supervisors • Supervisor comments on fieldwork summary forms Strategy: Create opportunities for unexpected findings.
  19. 19. This presentation was produced with the support of the United States Agency for International Development (USAID) under the terms of MEASURE Evaluation cooperative agreement AID-OAA-L-14-00004. MEASURE Evaluation is implemented by the Carolina Population Center, University of North Carolina at Chapel Hill in partnership with ICF International; John Snow, Inc.; Management Sciences for Health; Palladium; and Tulane University. Views expressed are not necessarily those of USAID or the United States government. www.measureevaluation.org
  20. 20. Analysis [Graph: outcome over time in the intervention sites, from A at baseline to B at follow-up; change = B − A]
  21. 21. Analysis [The same graph with comparison sites added, from C at baseline to D at follow-up; change = D − C]
  22. 22. Analysis [The same graph showing the difference-in-differences estimate: Impact = (B − A) − (D − C); a worked numeric example appears after this transcript]
  23. 23. Analysis For each outcome of interest Y, let Y1(t) represent the potential outcome under the CB-HIPP program at time t, where t ∈ {0, 0.5, 1}, and let Y0(t) represent the potential outcome at time t without the CB-HIPP program. Let A be an indicator of inclusion of the site in the CB-HIPP program. Difference-in-differences
  24. 24. Analysis The difference in outcomes between follow-up and baseline is Y1(1) − Y1(0) under the CB-HIPP program and Y0(1) − Y0(0) without the CB-HIPP program. The parameter of interest, the difference in differences, can be written as δ_DD = [Y1(1) − Y1(0)] − [Y0(1) − Y0(0)]. The expected value of δ_DD is the true effect of the intervention. Difference-in-differences
  25. 25. Analysis Assuming the trends observed at the comparison sites are the same as the trends that would have been observed at the intervention sites in the absence of the program, one can estimate E[Y0(1) − Y0(0)] as E[Y(1) − Y(0) | A = 0] and E[Y1(1) − Y1(0)] as E[Y(1) − Y(0) | A = 1]. δ_DD can then be estimated using a regression model for Y: E(Y | A, t) = β0 + β1·A + β2·t + β3·(A × t), where β3 is the impact of the intervention on outcome Y (a regression sketch in Python appears after this transcript). Difference-in-differences
  26. 26. Conceptual Framework [Diagram linking community prevalence at baseline, unmet need for treatment, engagement in the treatment program, loss to follow-up, treatment regimen changes, and community viral load to the goals of reducing transmission and improving survival]
  27. 27. CB-HIPP Targets [The same framework diagram, indicating the elements that CB-HIPP targets]
  28. 28. Impact Evaluation Questions [The same framework diagram, indicating the elements addressed by the impact evaluation questions]
  29. 29. Intermediate Evaluation Questions [The same framework diagram, indicating the elements addressed by the intermediate evaluation questions]
  30. 30. Motivations for Mobility People are also motivated to cross borders by differences in policies and in the costs of services and, when seeking health services, by stigma. At wet borders, those involved in the fishing industry are particularly mobile because of the seasonal movement of fish.
  31. 31. Challenges to Health Service Delivery Challenges at cross-border sites include limited or no contact between health officials on opposite sides of a border. Contact tracing and surveillance also do not extend across borders.
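To make the difference-in-differences calculation on slide 22 concrete, consider a worked example with purely hypothetical numbers (not CB-HIPP data): if an outcome moves from A = 40% at baseline to B = 55% at follow-up in the intervention sites, and from C = 42% to D = 50% in the comparison sites, the estimated impact is (B − A) − (D − C) = (55 − 40) − (50 − 42) = 15 − 8 = 7 percentage points.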
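The regression model on slide 25 can be estimated with standard statistical software. Below is a minimal sketch in Python, assuming hypothetical site-level data and the pandas and statsmodels libraries; the variable names and values are illustrative assumptions, not CB-HIPP data or results.

```python
# Minimal sketch of the difference-in-differences regression described on slide 25,
# using hypothetical site-level data; the numbers and variable names are illustrative
# assumptions, not CB-HIPP results.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical measurements: A = 1 for intervention sites and 0 for comparison sites;
# t = 0 at baseline and 1 at follow-up; y is the outcome of interest (e.g., a proportion).
data = pd.DataFrame({
    "A": [1, 1, 1, 1, 0, 0, 0, 0],
    "t": [0, 0, 1, 1, 0, 0, 1, 1],
    "y": [0.40, 0.42, 0.55, 0.57, 0.41, 0.43, 0.49, 0.51],
})

# E(Y | A, t) = b0 + b1*A + b2*t + b3*(A x t); the coefficient on the A:t
# interaction estimates the difference-in-differences impact (beta_3 on the slide).
model = smf.ols("y ~ A + t + A:t", data=data).fit()
print(model.summary())
print("Estimated impact (beta_3):", model.params["A:t"])
```

In practice, standard errors would need to account for clustering of repeated observations within sites; the sketch omits this for brevity.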
