
Evaluating National Malaria Programs’ Impact in Moderate- and Low-Transmission Settings: Technical Approaches


Presented at ASTMH 2019.


Debra Prosnitz, MPH1; Ruth Ashton, PhD2; Andrew Andrada, MSPH1; Yazoumé Yé, PhD1
1 MEASURE Evaluation, ICF; 2 MEASURE Evaluation, Tulane University

Background
New evaluation approaches that are adaptive and flexible are needed to accurately measure the progress and achievements of malaria programs in low-, moderate-, and heterogeneous-transmission settings. These settings present unique challenges because of the level of granularity required, the mosaic of intervention packages, and the number of implementing partners focusing on different transmission zones in heterogeneous settings. The timing of an impact evaluation relative to the maturity of the interventions is another challenge, because achieving high population coverage of interventions may be less of an issue in these settings. In these contexts, evaluation approaches should include both process and impact evaluations.

Refined Evaluation Design
The refined, adaptive, and flexible evaluation design includes both process and impact evaluations (Figure 1). Process evaluations give programs evidence for corrective action to improve program performance (coverage, intensity, and quality of malaria interventions) and also help determine the timing of an impact evaluation, which measures the reduction in malaria burden. For impact evaluations, confirmed malaria incidence is recommended as the primary impact indicator in low- or heterogeneous-transmission settings because of its sensitivity to detect changes. Methods to measure impact include interrupted time series, district-level dose-response analyses, and constructed controls, supported by triangulation of multiple data sources to strengthen the plausibility argument (an illustrative sketch follows the Summary).

Access the full Framework for Evaluating National Malaria Programs in Moderate- and Low-Transmission Settings.

Limitations
The following challenges are priority areas for further research:
• Benchmarking “adequate quality” routine health information system data
• Defining intervention maturity of malaria programs
• Setting thresholds for implementation strength
• Fully accounting for endogeneity
• Determining at what level of program coverage measurable impact is expected

Summary
The framework for evaluating national malaria programs highlights the importance of routine surveillance data for evaluation and the use of confirmed malaria incidence to measure impact in low-, moderate-, and heterogeneous-transmission settings. In many low-transmission settings in which the program has already achieved scale-up of key interventions, evaluation activities are likely to take the form of continuous process evaluation, complemented by impact evaluation when a substantial change in policy, intervention, or strategy has taken place. The framework emphasizes a continuous cycle along the causal pathway, linking process evaluation to impact evaluation and then to programmatic decision making. It provides practical guidance on evaluation design, analysis, and interpretation to ensure that the evaluation responds to national malaria program priority questions and guides decision making at national and subnational levels.
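Illustrative sketch. To make the interrupted time series method named above concrete, the following is a minimal segmented-regression example in Python using statsmodels. It is a sketch only: the simulated data, the intervention date, and all variable names are hypothetical assumptions for illustration and are not drawn from the poster or the framework.

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2019)

months = np.arange(48)                      # four years of monthly data (hypothetical)
intervention_month = 24                     # hypothetical policy/strategy change
post = (months >= intervention_month).astype(int)
time_since = np.where(post == 1, months - intervention_month, 0)

# Simulated confirmed malaria incidence (cases per 1,000), for illustration only:
# a gentle pre-intervention decline, a level drop and steeper decline afterwards,
# plus seasonality and noise.
incidence = (
    20.0
    - 0.05 * months
    - 3.0 * post
    - 0.15 * time_since
    + 2.0 * np.sin(2 * np.pi * months / 12)
    + rng.normal(0.0, 1.0, size=months.size)
)

df = pd.DataFrame({
    "incidence": incidence,
    "time": months,
    "post": post,
    "time_since": time_since,
    "month_of_year": months % 12,
})

# Segmented regression: `post` captures the level change at the intervention,
# `time_since` the slope change; calendar-month dummies absorb seasonality.
# HAC (Newey-West) standard errors account for residual autocorrelation.
model = smf.ols(
    "incidence ~ time + post + time_since + C(month_of_year)",
    data=df,
).fit(cov_type="HAC", cov_kwds={"maxlags": 3})

print(model.params[["time", "post", "time_since"]])
print(model.pvalues[["post", "time_since"]])

In practice, a model of this form could be extended with district fixed effects, population offsets, or implementation-strength covariates to support the district-level dose-response and constructed-control analyses mentioned above; those extensions would depend on the country data available and are not prescribed here.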
Definitions: NMSP = national malaria strategic plan; ITNs = insecticide-treated nets; IRS = indoor residual spraying; IPTp = intermittent preventive treatment in pregnancy; SMC = seasonal malaria chemoprevention; SME = surveillance, monitoring, and evaluation; SBC = social behavior change; HIS = health information system; MDA/MSAT/FSAT = mass drug administration/mass screening and treatment/focal screening and treatment

Figure 1. Theory of change for national malaria programs across the transmission spectrum (high-, moderate-, and low-transmission settings)

The figure traces the causal pathway from inputs to impact, with setting-specific activities at each step:
1. Inputs: funding, health workforce, infrastructure, HIS, commodities, governance, policy and guidelines; identification of high-risk groups/subnational areas
2. Processes: SBC, case management, supervision, supply chain management, training, SME and research, diagnostics quality assurance; delivery of prevention activities (ITNs, IRS, larviciding, IPTp, SMC, MDA/MSAT/FSAT, varying by setting); surveillance system development, strengthening, and adaptation
3. Outputs: diagnostics and treatment delivered, ITNs distributed, IRS conducted, supervision, coordinated implementation of activities, utilization of services, research generated and reviewed; workforce trained; surveillance and M&E data generated, reviewed, and acted on; diagnostic tests retested; chemoprevention delivered
4. Outcomes: diagnostics and treatment coverage, quality data used for decision making, diagnostic proficiency, functioning and responsive surveillance system; uptake of interventions (ITNs, IRS, IPTp, SMC, MDA/MSAT/FSAT, varying by setting); routine or case-based surveillance system
5. Impact: decrease malaria transmission, decrease malaria incidence and prevent outbreaks, decrease malaria-attributable mortality, decrease all-cause child mortality; interrupt malaria transmission (low-transmission settings)

Evaluation questions
• Process evaluation (all settings): Is the NMSP being implemented? Is the program managed and implemented well? Are the interventions in the NMSP implemented as expected and of good quality? (Low transmission): Are the right interventions being implemented in the right places? Is the surveillance system functioning well?
• Outcome and impact evaluation (moderate or high transmission): Has coverage reached high enough levels? Are interventions equitable? (Low transmission): Is coverage sustained at high enough levels? Are interventions reaching the right people? (All settings): Did the program have an impact? Do alternative explanations exist for the observed impact?

Contextual factors
• Enabling environment: health system (e.g., government expenditure on health, health facility infrastructure and resources); macro socioeconomic (e.g., political situation, GDP per capita, transport, gender, and communication infrastructure); environmental (e.g., extreme weather events, other disease outbreaks)
• Factors influencing outcomes: health system (e.g., access to and use of health services, availability of essential drugs and commodities); macro and micro sociocultural and economic (e.g., poverty, migration, household wealth, sleeping habits, gender)
• Factors influencing impact: health system (other health interventions); environmental (e.g., climate, temperature); epidemiological (e.g., malnutrition, disease prevalence, population movement)

Note: Intervention monitoring may include net durability monitoring and IRS application quality monitoring using cone bioassay. WHO provides specific guidance on entomological surveillance and intervention monitoring and evaluation, available at http://www.who.int/malaria/areas/vector_control/entomological_surveillance/en/

Acknowledgments—This poster presents work initiated by the Evaluation Taskforce of the Roll Back Malaria Monitoring and Evaluation Reference Group. This research has been supported by the President’s Malaria Initiative (PMI) through the United States Agency for International Development (USAID) under the terms of MEASURE Evaluation cooperative agreement AID-OAA-L-14-00004. The opinions expressed are those of the authors and do not necessarily reflect the views of USAID or the United States Government.

For information, contact: Debra.Prosnitz@icf.com | https://www.measureevaluation.org/
