4. GETTING STARTED

IMPACT
• Aim is to determine whether changes have taken place in the VC or among VC actors, and to what degree those changes can be attributed to CARE's work.
• Designed to answer what would have happened if CARE had not intervened.
• Answered via a statistically valid counterfactual using control groups and experimental or quasi-experimental designs.

PERFORMANCE
• Also aims to determine whether changes have taken place in the VC or among VC actors, and to what degree CARE's interventions contributed to those changes.
• Uses non-experimental designs; typically only gathers and analyzes data from those directly engaged or impacted.
• Lower cost, but less rigorous.

PROCESS
• Aims to assess whether and to what degree projects have been implemented in line with the initial plan.
• Does not consider results directly, but how the initiative is managed.
• Typically internal; assesses timeliness and quality of performance.
• Looks to identify areas for improvement to enhance the implementation process.

May 22, 2012
7. STEP-BY-STEP GUIDE
1. Determine the Purpose for the Evaluation
2. Determine the Financial Resources Available for the Evaluation
3. Identify Research Team and Partners
4. Identify Research Questions
5. Choose a Research Methodology
6. Determine the Other Details of the Research Design
7. Implement the Impact Evaluation
8. STEP-BY-STEP GUIDE: STEP 1

FIRST! Ask yourself: what is the purpose of this evaluation, and who is it for?

Not asking this can result in a methodology that poorly matches donor requirements and can cost precious time, money and energy.
9. STEP-BY-STEP GUIDE: STEP 1

Other Angles to Consider
• Typically, there are three M&E clients that might want an impact evaluation.
• Consider the motivations for conducting an impact evaluation.
• Impact evaluations are not always the right choice.
11. STEP-BY-STEP GUIDE: STEP 2

Evaluation Costs
• More rigorous evaluations = more $$
• Attributable evidence is expensive
• Cost depends on:
  • Sample size
  • Number of research rounds
  • Survey length
  • Sampling methodology
  • Geographic dispersion of respondents
  • International evaluation experts
  • Price of local research talent
12. STEP-BY-STEP GUIDE: STEP 2

WARNING! Best-practice evaluation standards strongly recommend outsourcing impact evaluations.
14. STEP-BY-STEP GUIDE: STEP 3

Activities and Responsibilities for External Research Partners
• Refining evaluation design
• Sharpening research questions
• Translating research instruments
• Pilot testing research instruments
• Developing the research instruments
• Training survey enumerators
• Managing the field data collection
• Entering results into data shell
• Cleaning the data set
• Transcripts of interviews
• Data analysis
• Final reports
15. STEP-BY-STEP GUIDE: STEP 3

Proposals from Potential Research Partners
• Receiving proposals
• Evaluating proposals
• Evaluation and selection criteria
• World Bank guide
17. STEP-BY-STEP GUIDE: STEP 4

How to Identify Questions
• Benefit from the research partner's knowledge and experience
• Questions should measure critical links and associated key performance indicators
• The goal is to verify results
• Involve the team and M&E clients early
• Check USAID publications
19. STEP-BY-STEP GUIDE: STEP 5

Creating a Comparison Group
• Must be a group of farmers, entrepreneurs, business owners, etc. as similar as possible to the actual project beneficiaries
• AKA control group vs. treatment group
• Isolates the project's impact from other influences
20. STEP-BY-STEP GUIDE: STEP 5

Two Sources of Selection Bias

OBSERVABLE CHARACTERISTICS
• Include things that can be seen or tangibly measured
• Sex, education, age, location, etc.
• Educated vs. uneducated; urban vs. rural
• If the treatment group is 90% male / 10% female and the control group is 40% male / 60% female, you will come up with invalid conclusions.

UNOBSERVABLE CHARACTERISTICS
• Aspects of an individual's personality that play a large role in determining success
• Personal initiative, entrepreneurial spirit, risk orientation, persistence, self-confidence, optimism, etc.
• Those who volunteer for VC projects will have more of these qualities than others
• Comparing a group of new-seed adopters to a group of non-adopters would not allow us to know to what extent any observed differences in farming outcomes are the result of the project or the result of pre-existing personality differences among the groups

May 22, 2012
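The observable-characteristics problem above can be checked numerically before drawing conclusions. A minimal sketch (all data hypothetical; the 90/10 vs. 40/60 male/female split mirrors the example on this slide):

```python
# Sketch: flag imbalance in an observable characteristic between
# treatment and control groups (hypothetical survey records).

def share(records, key, value):
    """Fraction of records whose `key` field equals `value`."""
    return sum(1 for r in records if r[key] == value) / len(records)

# Hypothetical rosters mirroring the slide's 90%/10% vs 40%/60% example.
treatment = [{"sex": "m"}] * 90 + [{"sex": "f"}] * 10
control   = [{"sex": "m"}] * 40 + [{"sex": "f"}] * 60

gap = abs(share(treatment, "sex", "m") - share(control, "sex", "m"))
balanced = gap < 0.10  # arbitrary 10-point threshold for this sketch

print(f"male share gap: {gap:.2f}, balanced: {balanced}")
```

With a 50-point gap in male share, the sketch correctly flags that any outcome comparison between these two groups would be confounded by sex composition.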
21. STEP-BY-STEP GUIDE: STEP 5

Experimental Methods of Evaluation
• Follows the same basic approach as a placebo experiment
• Of a selected group of maize farmers, some receive project assistance while others do not
• Theoretically eliminates all sources of selection bias
• Also referred to as randomized controlled trials (RCTs)

May 22, 2012
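The core of the experimental approach, random assignment, can be sketched in a few lines. A minimal illustration (farmer IDs and the 50/50 split are hypothetical; the shuffle is seeded only so results are reproducible):

```python
import random

# Sketch: randomly split a roster of eligible maize farmers into a
# treatment group (receives project assistance) and a control group.

def randomize(units, seed=42):
    """Return (treatment, control) halves of a shuffled roster."""
    rng = random.Random(seed)
    pool = list(units)
    rng.shuffle(pool)            # chance, not choice, decides assignment
    half = len(pool) // 2
    return pool[:half], pool[half:]

farmers = [f"farmer_{i:03d}" for i in range(200)]  # hypothetical roster
treatment, control = randomize(farmers)

print(len(treatment), len(control))  # 100 100
```

Because assignment depends only on chance, observable and unobservable characteristics are (in expectation) balanced across the two groups, which is what lets the design eliminate selection bias.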
22. STEP-BY-STEP GUIDE: STEP 5

Downsides of the Experimental Method
• Randomization protocols can be complicated, time-consuming and operationally burdensome
• May be perceived as unethical
• Not ideal for projects with small numbers of beneficiaries, impromptu projects, specified locations or groups of people, or projects with no available control group (e.g., broad-based policy reform)
• VC projects are flexible and easily changed, while this methodology requires consistent variables
• Difficult for evaluation designers to reasonably 'control for' changes in the environment that were not influenced by the project

May 22, 2012
23. STEP-BY-STEP GUIDE: STEP 5

Quasi-Experimental Methods
• Do not randomly assign subjects into treatment and control groups
• Instead, compare pre-existing groups via a matching process
• Treatment groups are selected via random sampling
• Control groups are selected by identifying areas and communities with matching observable characteristics and then randomly sampling the relevant population living in those areas and communities
• But quasi-experimental methods are less rigorous than experimental ones

May 22, 2012
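The matching process can be sketched as a nearest-neighbor search over observable characteristics. A simplified illustration (community names, the `pop` and `market_km` fields, and the absolute-difference distance are all assumptions for this sketch, not the deck's method):

```python
# Sketch: match each treatment community to the most similar candidate
# community on observable characteristics (hypothetical fields).

def distance(a, b, keys):
    """Sum of absolute differences over the chosen observables."""
    return sum(abs(a[k] - b[k]) for k in keys)

def match(treated, candidates, keys):
    """Pair each treated unit with its nearest unmatched candidate."""
    pool = list(candidates)
    pairs = []
    for t in treated:
        best = min(pool, key=lambda c: distance(t, c, keys))
        pool.remove(best)          # match without replacement
        pairs.append((t["name"], best["name"]))
    return pairs

treated = [{"name": "A", "pop": 1200, "market_km": 5}]
candidates = [
    {"name": "X", "pop": 4000, "market_km": 30},
    {"name": "Y", "pop": 1300, "market_km": 6},
]
pairs = match(treated, candidates, ["pop", "market_km"])
print(pairs)  # [('A', 'Y')]
```

In practice matching uses more careful distance metrics (e.g., propensity scores) and many observables, but the logic is the same: the comparison group is built to resemble the treatment group on everything that can be measured.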
24. STEP-BY-STEP GUIDE: STEP 5

In choosing your method, ask the following:
• Will our M&E system clients be less well served if we opt for a quasi-experimental design over an experimental design?
• Is our project amenable to random assignment?
• Is random assignment operationally feasible?
• Can we manage/overcome the anticipated opposition from our project staff and external stakeholders?
• Is the tradeoff of an increased operational burden worth the improvement we get in statistical credibility?

If 'Yes' to each, then experimental. If 'No' to any, then quasi-experimental.

May 22, 2012
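The decision rule above is simple enough to state as code. A sketch (the checklist keys are paraphrases of the slide's questions, named here only for illustration):

```python
# Sketch: the slide's decision rule — experimental only if every
# checklist question is answered 'yes', otherwise quasi-experimental.

CHECKLIST = [
    "clients_less_well_served_by_quasi",
    "amenable_to_random_assignment",
    "random_assignment_feasible",
    "opposition_manageable",
    "burden_worth_credibility",
]

def choose_design(answers):
    """answers: dict mapping each checklist question to True/False."""
    return ("experimental" if all(answers[q] for q in CHECKLIST)
            else "quasi-experimental")

all_yes = {q: True for q in CHECKLIST}
print(choose_design(all_yes))                                   # experimental
print(choose_design({**all_yes,
                     "random_assignment_feasible": False}))     # quasi-experimental
```

Note the asymmetry the slide implies: a single 'no' is enough to rule out the experimental design.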
26. STEP-BY-STEP GUIDE: STEP 6

Other Considerations
• Sample size and composition
• Trend study vs. panel study
• Single method vs. mixed methods
• Early vs. delayed baseline data collection

May 22, 2012
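Sample size is where Step 2's cost drivers meet the research design. The deck does not give a formula, but a standard two-arm approximation for detecting a difference in means (5% two-sided significance, 80% power) illustrates why smaller expected effects demand much larger, and costlier, samples:

```python
import math

# Sketch: standard two-arm sample-size approximation for a difference
# in means — not from the slides, shown only to illustrate the tradeoff.

Z_ALPHA = 1.96  # two-sided 5% significance level (approximate z-score)
Z_BETA  = 0.84  # 80% power (approximate z-score)

def n_per_arm(effect_size):
    """Respondents needed per group to detect a standardized effect."""
    return math.ceil(2 * (Z_ALPHA + Z_BETA) ** 2 / effect_size ** 2)

# Halving the detectable effect roughly quadruples the sample:
print(n_per_arm(0.5))  # 63 respondents per arm
print(n_per_arm(0.3))  # 175 respondents per arm
```

This is exactly why the guide recommends seeking advice on sampling from qualified technical experts (see the common pitfalls slide): an underpowered survey cannot detect the project's effect no matter how well it is run.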
28. STEP-BY-STEP GUIDE: STEP 7

The Final Act
• Implement the evaluation
• Work closely with the local research firm, project staff and (as relevant) implementing partners and local authorities/community leaders
• Assign one person the specific task of monitoring the research firm's performance at every stage

May 22, 2012
29. RESOURCES

Impact Evaluation Resources

Donor Organizations
• International Program for Development Evaluation Training
• United Nations Evaluation Group
• USAID Private Sector Development Impact Assessment Initiative
• World Bank Development Impact Evaluation Initiative
• World Bank Independent Evaluation Group

Evaluation Firms
• Abdul Latif Jameel Poverty Action Lab
• Innovations for Poverty Action
• International Food Policy Research Institute (IFPRI) Impact Assessment Program

Associations and Networks
• American Evaluation Association
• Donor Committee for Enterprise Development
• InterAction Monitoring & Evaluation
• International Initiative for Impact Evaluation
• Network of Networks on Impact Evaluation

Web Resources
• Evaluation Portal
• Evaluation Virtual Library
• Free Resource for Program Evaluation and Social Research Methods

May 22, 2012
30. COMMON PITFALLS

• Teams do not conduct appropriate due diligence about their evaluation options
• Teams implement the baseline data collection too soon
• Teams implement a trend study when a panel study would have been both preferable and possible
• Teams load up the impact survey with excess questions
• Teams do not monitor the local research firm's adherence to the TOR
• Teams do not budget or plan for mixed-methods evaluations
• Teams do not seek advice on sampling from qualified technical experts
• Teams inappropriately attribute evaluation findings
• Teams attempt to implement the impact evaluation using project staff
• Donors demand rigorous evaluations but do not allocate sufficient funding
• Projects make compromises to the evaluation methodology
• Evaluation reports do not fully disclose the tradeoffs made
• Projects do not closely monitor the performance of external research firms

May 22, 2012
32. WANT TO LEARN MORE?

Multiple ways to continue the discussion and continue learning:
• Initiate a monthly session on the M&E guide and case studies from across CARE. Contact cpennotti@care.org
• Join the Market Engagement Community of Practice on LinkedIn.
• Join a task force to review and refine the universal indicators. Contact nardi@careinternational.org

May 22, 2012