MEASURE Evaluation works to improve the collection, analysis, and presentation of data to promote better use of data in planning, policymaking, managing, monitoring, and evaluating population, health, and nutrition programs.
This Data for Impact webinar took place October 29, 2020. Learn more at https://www.data4impactproject.org/resources/webinars/use-of-routine-data-for-economic-evaluations/
1. Use of Routine Data for Economic Evaluations
Anna Krivelyova, Data for Impact
Webinar
October 29, 2020
2. Webinar goals
• Why measure costs?
• Types of economic evaluations
• Design considerations
• Sources of routine data/data quality
• Factors affecting feasibility
[Slide graphic labeled "Theory" and "Practice"]
3. Why invest in cost measurement?
• Scaling and sustaining programs
and interventions
• Improving efficiency
• Improving value for money
Cost measurement is a process of collecting,
processing, analyzing, and reporting on the costs
of interventions.
4. Types of economic evaluations
• Cost analysis/“costing”: effects not measured; need to measure costs only
• Cost-effectiveness analysis: effects measured in natural units; need to measure costs and effects
• Cost-utility analysis: effects measured in QALYs/DALYs; need to measure costs and effects
• Cost-benefit analysis: effects measured in monetary terms; need to measure costs and effects
QALY: Quality Adjusted Life Year
DALY: Disability Adjusted Life Year
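One standard way the measured costs and effects are combined in a cost-effectiveness analysis is the incremental cost-effectiveness ratio (ICER): the difference in costs between two alternatives divided by the difference in their effects. The ICER itself is not discussed on this slide; the sketch below is a minimal illustration with hypothetical figures.

    # Hypothetical example: incremental cost-effectiveness ratio (ICER)
    # between a new intervention and the standard of care.
    # All figures are illustrative, not from the webinar.
    cost_new, cost_standard = 120_000.0, 80_000.0    # total program costs (USD)
    effect_new, effect_standard = 950.0, 800.0       # effects in natural units (e.g., clients retained)

    icer = (cost_new - cost_standard) / (effect_new - effect_standard)
    print(f"ICER: {icer:,.2f} USD per additional unit of effect")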
5. Types of economic evaluations (cont.)
• Cost-utility and cost-benefit analyses normally require a longer-term follow-up period, modelling, and the collection and analysis of secondary population-level epidemiological, demographic, and financial data
• The level of detail and the design of the cost analysis depend on the type of economic evaluation
6. Cost analysis: Research questions
• What is the total cost of the program/
intervention?
• What is the cost of the program per client?
• How do program costs differ for different
client groups?
• What are the major cost drivers?
• How would the program cost per client change if we scale up?
• What is the allocation of costs across various
program activities?
Use of routine data only
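Several of the questions above reduce to simple arithmetic once program costs have been compiled. A minimal sketch, with hypothetical cost categories and figures:

    # Minimal sketch: total cost, cost per client, and cost drivers
    # from compiled program costs. Categories and figures are hypothetical.
    costs_by_category = {
        "Personnel": 60_000.0,
        "Drugs": 45_000.0,
        "Laboratory": 15_000.0,
        "Above site (M&E, supervision)": 10_000.0,
    }
    clients_served = 1_300

    total_cost = sum(costs_by_category.values())
    print(f"Total cost: {total_cost:,.2f} USD")
    print(f"Cost per client: {total_cost / clients_served:,.2f} USD")
    # Share of each category in the total identifies the major cost drivers.
    for category, cost in sorted(costs_by_category.items(), key=lambda kv: -kv[1]):
        print(f"{category}: {cost / total_cost:.0%} of total")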
7. Design considerations
• Perspective
• Population and focus
• Unit of observation
• Included and excluded costs
• Time period
• Retrospective/prospective (retrospective easier)
• Routine data only or routine data plus some
primary data collection
8. Analysis perspective
Perspective | Examples of costs
Provider/program | Personnel, supplies, equipment
Client/patient | User fees, travel, time
Societal | Lost production/taxes
(The slide also marks, for each perspective, whether routine data alone can be used.)
9. Challenges with measuring costs
• Routine data exists but often in many different
places and formats
• Resources are used by one entity and paid for
by another
• Requires detailed understanding of
implementation
• Benchmarks rarely exist because costs are highly context-specific
10. HIV TX program: Cost centers
[Slide diagram of cost centers; one legible label reads "Above site: central/subnational/technical assistance"]
11. HIV TX program: Examples of resources
Other resources:
• Facility level: utilities and contracted services
• Above site: monitoring and evaluation (M&E) software costs, supervisory/training visits
• Laboratory: sample transport, testing
• Supply chain: warehouse and transport costs for drugs and supplies
• Community level: community health workers (CHWs) for retention and adherence
13. Data components needed
Total cost = Quantity x Unit Cost
1) Quantities
2) Unit costs
3) Allocations
Sources of routine data often differ by component, and for each component there may be several different sources.
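A minimal sketch of how the three components combine, using hypothetical line items: quantities are multiplied by unit costs, and an allocation share is applied to resources only partly attributable to the program.

    # Minimal sketch: total cost = sum of quantity * unit cost * allocation share.
    # Item names, quantities, prices, and shares are hypothetical.
    line_items = [
        # (item, quantity, unit_cost_usd, share_allocated_to_program)
        ("Abacavir 300mg, box of 100", 120, 25.00, 1.0),   # fully attributable
        ("Nurse salary, months",        12, 900.00, 0.4),  # 40% of time on the program
        ("Utilities, monthly bill",     12, 300.00, 0.4),  # allocated by floor space, say
    ]

    total_cost = sum(qty * unit_cost * share for _, qty, unit_cost, share in line_items)
    print(f"Total program cost: {total_cost:,.2f} USD")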
14. Drugs: Sources of routine data
Type | Source | Where
Quantity | Electronic health record (EHR)/patient service record | Facility
Quantity | Pharmacy dispensing | Facility
Quantity | Stock registers/cards | Facility
Quantity/price | Invoices | Facility
Quantity/price | Orders/delivery logs | Warehouse
Price | Country-level procurement price lists | Central level/MOH
Price | Global/funder procurement price lists | Global databases
18. Supplies, equipment, utilities, contracted services: Sources of routine data
Type | Source | Where
Quantity | Stock registers/cards | Facility
Quantity/price | Invoices | Facility/district
Quantity/price | Orders/delivery logs | Warehouse/IPs
Price | Country-level procurement price lists | Central level/MOH
Price | IP accounting records/grant reporting | IP
Price | Market prices | Country level/global
IP: implementing partner
20. Merging prices and quantities
Prices and quantities often come from different sources, so item descriptions and unit definitions must match before the two can be merged.
• Item description:
Abacavir 300mg vs. Abacavir 600mg
Red top tube vs. grey top tube
• Unit definition:
Box of 100 vs. box of 1,000
Monthly bill vs. annual bill
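As one possible illustration of this reconciliation step, the sketch below converts pack-level prices into per-tablet prices before combining them with dispensing quantities; item names, pack sizes, and prices are hypothetical.

    # Minimal sketch: reconcile unit definitions before merging prices with quantities.
    # All item names and figures are hypothetical.
    price_list = {
        # (item description, pack definition) -> price per pack (USD)
        ("Abacavir 300mg", "box of 100"): 25.00,
        ("Abacavir 600mg", "box of 100"): 48.00,
    }
    pack_sizes = {"box of 100": 100, "box of 1,000": 1000}

    def price_per_tablet(item: str, pack: str) -> float:
        """Convert a pack-level price into a per-tablet price."""
        return price_list[(item, pack)] / pack_sizes[pack]

    # Dispensing records count tablets, so quantities and prices now share a unit.
    tablets_dispensed = {"Abacavir 300mg": 4200, "Abacavir 600mg": 1800}
    drug_cost = sum(
        qty * price_per_tablet(item, "box of 100")
        for item, qty in tablets_dispensed.items()
    )
    print(f"Drug cost: {drug_cost:,.2f} USD")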
21. Allocations
Use of routine data only
Service-delivery settings, ordered from simplest to hardest to allocate:
• Stand-alone HIV TX clinic
• Stand-alone HIV TX and testing clinic
• HIV clinic within a larger facility
• PMTCT services: multiple entry points within a facility
• Large hospital: HIV TX is completely integrated with primary care
Patient mix, ordered from simplest to hardest to allocate:
• Every client/patient receives the exact same services
• Few well-defined patient groups
• Many well-defined patient groups
• Many patient groups, not well defined
• Every patient receives completely different services
22. Allocations: Some options
• Allocating based on some measure (e.g., number
of patients, visits, tests, space, distance)
• Normative assumptions (e.g., using TX guidelines)
• Adding limited primary data collection
(e.g., personnel questionnaire)
• Excluding certain costs from the analysis
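A minimal sketch of the first option, allocating a shared cost in proportion to a measure such as patient counts; program names and figures are hypothetical.

    # Minimal sketch: allocate a shared cost (e.g., facility utilities) across
    # programs in proportion to an allocation measure such as patient counts.
    # Program names and figures are hypothetical.
    shared_cost = 12_000.0                       # annual utilities bill (USD)
    patients = {"HIV TX": 1500, "PMTCT": 400, "Primary care": 3100}

    total_patients = sum(patients.values())
    allocated = {
        program: shared_cost * count / total_patients
        for program, count in patients.items()
    }
    for program, cost in allocated.items():
        print(f"{program}: {cost:,.2f} USD")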
23. Key lessons
• Understanding of implementation/service model
• Complexity and heterogeneity of service delivery
• Understanding of resource types, cost centers,
and payers
• Hypothesis on major cost drivers (e.g., don’t
spend a month counting cotton balls)
24. Key lessons (cont.)
• Routine data in “analyzable” format
• Reasonable allocation is feasible
• Prospective data collection may be easier (can we add a simple field to the existing data system?)
• Routine data may need to be supplemented
26. This presentation was produced with the support of the United States
Agency for International Development (USAID) under the terms of the
Data for Impact (D4I) associate award 7200AA18LA00008, which is
implemented by the Carolina Population Center at the University of
North Carolina at Chapel Hill, in partnership with Palladium
International, LLC; ICF Macro, Inc.; John Snow, Inc.; and Tulane
University. The views expressed in this publication do not necessarily
reflect the views of USAID or the United States government.
www.data4impactproject.org