Randomization and Its Discontents

MEASURE Evaluation
MEASURE Evaluation works to improve the collection, analysis, and presentation of data to promote better use of data in planning, policymaking, managing, monitoring, and evaluating population, health, and nutrition programs.
Peter M. Lance, PhD
MEASURE Evaluation
University of North Carolina at Chapel Hill
June 29, 2016
Randomization and Its Discontents
Project overview
Global, five-year, $180M cooperative agreement
Strategic objective: to strengthen health information systems – the capacity to gather, interpret, and use data – so countries can make better decisions and sustain good health outcomes over time.
Phase IV Results Framework
Improved country capacity to manage health information systems, resources, and staff
Strengthened collection, analysis, and use of routine health data
Methods, tools, and approaches improved and applied to address health information challenges and gaps
Increased capacity for rigorous evaluation
Global footprint (more than 25 countries)
• The program impact evaluation challenge
• Randomization
• Selection on observables
• Within estimators
• Instrumental variables
Did the program make a difference?
Did the program cause a change in an outcome of interest Y? (Causality)
The (causal) program impact for individual $i$ is the difference between what happens if the individual participates and what happens if the individual does not participate:

$$Y_i^1 - Y_i^0 = \text{Program impact}$$
Treatment effects

Average treatment effect (ATE):
$$ATE = E\left(Y^1 - Y^0\right)$$

Average effect of treatment on the treated (ATT):
$$ATT = E\left(Y^1 - Y^0 \mid P = 1\right)$$
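To make the ATE/ATT distinction concrete, here is a minimal simulation sketch (hypothetical numbers, not from the presentation; Python with NumPy assumed) in which both potential outcomes are generated for every individual, so both quantities can be computed directly:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Hypothetical potential outcomes: Y0 without the program, Y1 with it.
y0 = rng.normal(10.0, 2.0, n)
gain = rng.normal(1.0, 0.5, n)            # heterogeneous individual impact
y1 = y0 + gain

# Hypothetical participation rule: people who stand to gain more are
# more likely to participate (self-selection on the gain).
p = (gain + rng.normal(0.0, 0.5, n) > 1.0).astype(int)

ate = np.mean(y1 - y0)                    # E(Y1 - Y0)
att = np.mean((y1 - y0)[p == 1])          # E(Y1 - Y0 | P = 1)
print(f"ATE = {ate:.3f}  ATT = {att:.3f}")
# ATT > ATE here because participants are those with larger gains;
# with a constant impact the two would coincide.
```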
Fundamental identification problem of program impact evaluation

For each individual $i$ there are two potential outcomes, $Y_i^1$ (with participation) and $Y_i^0$ (without), but only one of them is ever observed; the other is counterfactual.
Impact evaluation

$$E\left(Y^1 - Y^0 \mid P = 1\right) = E\left(Y^1 \mid P = 1\right) - E\left(Y^0 \mid P = 1\right)$$

The first term on the right can be estimated by the average Y across a sample of participants; the temptation is to estimate the second term by the average Y across a sample of non-participants.
That naive comparison identifies the ATT only if

$$E\left(Y^0 \mid P = 0\right) = E\left(Y^0 \mid P = 1\right)$$

In general, when individuals select into the program, we should expect

$$E\left(Y^0 \mid P = 0\right) \neq E\left(Y^0\right), \qquad E\left(Y^0 \mid P = 1\right) \neq E\left(Y^0\right), \qquad E\left(Y^0 \mid P = 0\right) \neq E\left(Y^0 \mid P = 1\right)$$
[Causal diagram: background characteristics X drive both program participation P and the outcome Y.]

The observed outcome is

$$Y = P \cdot Y^1 + (1 - P) \cdot Y^0$$
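Only this observed Y is ever available in practice. A small hypothetical sketch (same caveats and assumptions as the simulation above) shows how self-selection on a confounder X breaks the naive participant/non-participant comparison:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 100_000

# Hypothetical confounder X (say, motivation) raises both potential
# outcomes and the chance of participating.
x = rng.normal(0.0, 1.0, n)
y0 = 10.0 + 2.0 * x + rng.normal(0.0, 1.0, n)
y1 = y0 + 1.0                              # true impact of exactly 1
p = (x + rng.normal(0.0, 1.0, n) > 0.0).astype(int)

y = p * y1 + (1 - p) * y0                  # only this is ever observed

naive = y[p == 1].mean() - y[p == 0].mean()
att = np.mean((y1 - y0)[p == 1])
print(f"naive difference = {naive:.2f}  true ATT = {att:.2f}")
# The naive difference mixes the program impact with the fact that
# E(Y0 | P = 1) != E(Y0 | P = 0); here it badly overstates the impact.
```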
The big idea (part un)
The basic idea of the randomization/experimental approach to impact evaluation is that by randomizing program participation we ensure that participants and non-participants are alike, on average, in terms of their characteristics.

The big idea (part deux)
If this is the case, then any difference in average outcomes between the two groups can be ascribed to the one way in which they do differ: program participation.
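Here is a minimal sketch of what "alike on average" means for a hypothetical background characteristic X, contrasting randomized with self-selected participation:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 100_000

x = rng.normal(0.0, 1.0, n)                        # hypothetical characteristic

p_random = rng.integers(0, 2, n)                   # coin-flip assignment
p_selected = (x + rng.normal(0.0, 1.0, n) > 0.0).astype(int)  # self-selection

print(x[p_random == 1].mean() - x[p_random == 0].mean())      # approx. 0
print(x[p_selected == 1].mean() - x[p_selected == 0].mean())  # far from 0
# Randomization delivers E(X | P = 1) = E(X | P = 0) on average;
# self-selection does not.
```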
The experimental ideal rests on four conditions:
1. Participants in the experiment are randomly selected from the population of interest and randomly assigned to their program participation status;
2. All participants in the trial comply with the program participation status to which they are assigned;
3. The experiment lasts long enough to replicate the program under consideration and to influence outcomes;
4. There are no social interactions that may make a full-scale program inherently different from a smaller-scale intervention.
Under randomization,

$$E\left(Y^1 - Y^0\right) = E\left(Y^1\right) - E\left(Y^0\right)$$

with the observed outcome still given by $Y = P \cdot Y^1 + (1 - P) \cdot Y^0$. Each term on the right can now be estimated straightforwardly:

(Average Y across the sample of participants) − (Average Y across the sample of non-participants)
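The same hypothetical setup as in the selection-bias sketch above, but with participation assigned by a coin flip, illustrates why the simple difference in group means now estimates the ATE:

```python
import numpy as np

rng = np.random.default_rng(3)
n = 100_000

# Same hypothetical outcome process as in the selection-bias sketch ...
x = rng.normal(0.0, 1.0, n)
y0 = 10.0 + 2.0 * x + rng.normal(0.0, 1.0, n)
y1 = y0 + 1.0                              # true impact of exactly 1

# ... but participation is now randomized, independent of X, Y0, Y1.
p = rng.integers(0, 2, n)

y = p * y1 + (1 - p) * y0
estimate = y[p == 1].mean() - y[p == 0].mean()
print(f"difference in group means = {estimate:.2f}")   # approx. 1.00
# With P independent of the potential outcomes, the group means estimate
# E(Y1) and E(Y0), so their difference estimates E(Y1 - Y0).
```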
[Causal diagram: without randomization, background characteristics X differ between participants and non-participants.]

$$E\left(X \mid P = 1\right) \neq E\left(X \mid P = 0\right)$$

With positive selection into the program,

$$E\left(Y^1 \mid P = 1\right) > E\left(Y^1\right) > E\left(Y^1 \mid P = 0\right)$$
$$E\left(Y^0 \mid P = 0\right) < E\left(Y^0\right) < E\left(Y^0 \mid P = 1\right)$$
$$E\left(Y^1 - Y^0\right) = \underbrace{E\left(Y^1\right)}_{\text{(overestimated)}} - \underbrace{E\left(Y^0\right)}_{\text{(underestimated)}}$$

so the naive participant/non-participant comparison overstates program impact.
Refusal rates by plan (RAND Health Insurance Experiment)

  Plan                         Refusal rate (%)
  Free                          8
  25% and 50% coinsurance      11
  95% coinsurance              25

Source: Newhouse et al. (1993)
The implicit promise of any quasi-experimental approach takes the form: if such-and-such is true of the real-world processes that gave rise to program participation and outcomes in our observed non-experimental sample, then the estimates of program impact generated by this quasi-experimental estimator provide the causal impact of program participation on outcomes of interest.
LaLonde's critique of non-experimental estimators
(LaLonde, 1986, showed that common non-experimental estimators failed to reproduce the experimentally estimated impact of the National Supported Work demonstration.)
Discontents of randomization:
- Individuals cannot be forced to participate in a program
- Individuals cannot be forced to accept their random experimental assignment (see the non-compliance sketch after this list)
- Individuals assigned to the control/non-participant group cannot be prevented from seeking alternatives
- Individuals cannot be forced to stay in an experiment
- Experiments/RCTs cannot estimate many important parameters of interest
- Experiments/RCTs carried out in a limited or pilot setting can mislead about the impact of the "scaled up" program
- Randomization is often not straightforward
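As flagged in the list above, non-compliance is one such discontent. In a small hypothetical sketch, when some of those assigned to the program do not take it up and some controls obtain it anyway, the comparison of assigned groups estimates the effect of assignment (intent to treat), which is diluted relative to the effect of actually participating:

```python
import numpy as np

rng = np.random.default_rng(4)
n = 100_000

y0 = rng.normal(10.0, 2.0, n)
y1 = y0 + 1.0                              # true impact of participating = 1

z = rng.integers(0, 2, n)                  # random assignment
# Hypothetical compliance: 80% of those assigned take up the program,
# while 10% of controls manage to obtain it anyway.
takeup = np.where(z == 1, rng.random(n) < 0.8, rng.random(n) < 0.1).astype(int)

y = takeup * y1 + (1 - takeup) * y0
itt = y[z == 1].mean() - y[z == 0].mean()
print(f"intent-to-treat estimate = {itt:.2f}")
# Roughly 1 * (0.8 - 0.1) = 0.7: the assigned-group contrast is diluted
# by non-compliance and control crossover.
```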
$$E\left(Y^1 - Y^0\right) = E\left(Y^1\right) - E\left(Y^0\right)$$

but

$$\operatorname{median}\left(Y^1 - Y^0\right) \neq \operatorname{median}\left(Y^1\right) - \operatorname{median}\left(Y^0\right)$$

Randomization identifies the average of the individual-level impacts, but not, for example, their median: the expectation passes through the difference, while the median does not.
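A tiny numeric sketch (made-up values) of why the mean passes through the difference but the median need not:

```python
import numpy as np

# Three hypothetical individuals with both potential outcomes known.
y1 = np.array([1.0, 2.0, 9.0])
y0 = np.array([0.0, 3.0, 4.0])

print(np.mean(y1 - y0), np.mean(y1) - np.mean(y0))         # 1.67 and 1.67
print(np.median(y1 - y0), np.median(y1) - np.median(y0))   # 1.0 vs -1.0
# The mean of the individual impacts equals the difference of the means;
# the median of the impacts need not equal the difference of the medians.
```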
[Figure: approval/acceptance plotted against time, shown phase by phase (Phases 1–5).]
Conclusion
Links:
The manual: http://www.measureevaluation.org/resources/publications/ms-14-87-en
The webinar introducing the manual: http://www.measureevaluation.org/resources/webinars/methods-for-program-impact-evaluation
My email: pmlance@email.unc.edu
MEASURE Evaluation is funded by the U.S. Agency for International Development (USAID) under terms of Cooperative Agreement AID-OAA-L-14-00004 and implemented by the Carolina Population Center, University of North Carolina at Chapel Hill, in partnership with ICF International, John Snow, Inc., Management Sciences for Health, Palladium Group, and Tulane University. The views expressed in this presentation do not necessarily reflect the views of USAID or the United States government.
www.measureevaluation.org