Data for Impact: Lessons Learned in Using the Ripple Effects Mapping Method
Presented during a February 2020 webinar from Data for Impact.
MEASURE Evaluation works to improve the collection, analysis, and presentation of data to promote better use of data in planning, policymaking, managing, monitoring, and evaluating population, health, and nutrition programs.
1. Data for Impact: Lessons Learned in Using
the Ripple Effects Mapping Method
February 20, 2020
2. • Introduction to ripple effects mapping (REM)
• Tanzania’s REM experience
• Botswana’s REM experience
• Lessons learned
Webinar structure
3. What is the REM method?
• Ripples are tiny waves generated when someone
drops a stone into water.
• REM is a participatory method to understand the
different effects (or ripples) a complex program
has on a community or beneficiary.
• REM engages program stakeholders to discuss
and visually map these ripples.
Introduction
4. REM’s origins
REM was developed by the University of Minnesota Evaluation Studies Institute (MESI) to conduct an impact analysis of the Horizons program, an 18-month community-based program in the US delivered to strengthen leadership to reduce poverty.
Introduction (2)
5. Why REM?
• Well-suited for complex evaluations
• Engages participants and stakeholders and
creates positive energy for further action
• Ground-truths a program's theory of change
• Can uncover unanticipated consequences
• Part of a larger evaluation toolkit
REM in evaluation
6. What does REM require?
• A combination of facilitation and evaluation skills
Facilitation skills: engaging in meaningful discussion, creating open communication, organizing time
Evaluation skills: group interviewing, rapid qualitative data analysis, developing thematic categories, identifying causal pathways
Ripple effects mapping combines both skill sets.
REM in evaluation
8. Peer-to-peer interviews
• Participants divide into pairs for peer-to-peer interviews
and ask a set of standard questions
• As a facilitator, ensure participants:
Use a standard set of questions to elicit program outcomes
Do not deviate from the interview (except to ask
follow-up questions)
Use active listening skills
Take notes
Conducting REM: Appreciative Inquiry interviews
9. Group reflection
• The facilitator should ask each pair to offer one story
(only one at a time so everyone has an opportunity
to share)
• Facilitator should probe participants:
Then what happened?
Who was involved?
What skill, approach, or tool, if any, was involved?
What are people doing differently?
How have relationships changed as a result?
Conducting REM: Group reflection
and mapping
10. Conducting REM: Mapping
[Diagram: the program sits at the center of the map; each reported effect branches outward, and the facilitator's repeated prompt “Then what happened?” extends each effect into further ripples.]
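Conceptually, the map that emerges is a tree: the program at the root, each effect as a branch, and each answer to “Then what happened?” as a further node. The following minimal sketch in Python is illustrative only and is not part of the webinar materials; the class, method, and field names are assumptions.

```python
from dataclasses import dataclass, field
from typing import List


@dataclass
class Node:
    """One item on a ripple map: the program, an effect, or a downstream ripple."""
    label: str
    theme: str = ""  # thematic category assigned later, during analysis
    ripples: List["Node"] = field(default_factory=list)

    def then_what_happened(self, label: str, theme: str = "") -> "Node":
        """Extend this branch with the answer to 'Then what happened?'."""
        child = Node(label, theme)
        self.ripples.append(child)
        return child


# Illustrative map: program at the center, effects rippling outward.
program = Node("Councilor training")
effect = program.then_what_happened("Learned how to manage finances", "finance")
effect.then_what_happened("Developed a budget with less guidance", "finance")
```

In practice the teams drew the maps live on paper and only later entered them into mind-mapping software, as described in the analysis slides below.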
11. Group reflection—after collective mapping
• Ask the group to reflect on the map
as a whole
• Open participatory discussion
Conducting REM: Preliminary analysis
12. This presentation was produced with the support of the United States
Agency for International Development (USAID) under the terms of the
Data for Impact (D4I) associate award 7200AA18LA00008, which is
implemented by the Carolina Population Center at the University of
North Carolina at Chapel Hill, in partnership with Palladium
International, LLC; ICF Macro, Inc.; John Snow, Inc.; and Tulane
University. The views expressed in this publication do not necessarily
reflect the views of USAID or the United States government.
www.data4impactproject.org
13. Lessons Learned in Using the Ripple Effects Mapping Method: Tanzania and Botswana Case Studies
15. Evaluation question
What are the perceptions of program implementers, community members, and other stakeholders of the performance and influence of the Public Service Systems Strengthening project (PS3) on uptake of health services; finance and human resources; and community engagement and governance?
16. How and why REM?
REM was just one method used in the qualitative component: it was used only to solicit outcomes of the local government authority (LGA) councilor trainings.
It is a more participatory method than a focus group discussion, and we thought the visual component would help to elicit more interaction, jog memories, etc.
17. Structure of REMs
Focus on PS3 priority regions (Eastern Lake,
Central, Eastern Southern Highlands)
for sampling
Six local government authorities—1 REM per LGA
• Participants: local government councilors;
8 per LGA
• Selected based on participation in PS3
trainings and to represent range of local
vulnerabilities (e.g., higher income ward vs.
lower income; urban vs. rural, etc.)
Three facilitators
• Main facilitator
• Mapper
• Notetaker/quote recorder
18. Training
1. Reviewed content of LGA councilor training
and example statements the program
had collected from LGA councilors
about training experience
2. Reviewed the field guide to ripple effects mapping
3. PowerPoint training session
4. Demonstration with MEASURE
Evaluation staff acting as
facilitator and mapper
19. Demonstration: Let’s practice!
We will practice each step, using a model
“program” – Your Bachelor’s Program
1. Informed consent and introduction
2. Ground rules
3. Appreciative inquiry (AI) interviews
Tell a story about how you have used the skills/
knowledge you gained during your Bachelor’s
degree program
What new or deepened collaborations with others
have you made as a result of your Bachelor’s training?
What unexpected things have happened as a result
of your Bachelor’s training?
20. Training (2)
4. Mock session with the facilitators
practicing and other researchers playing
roles of LGA councilor participants
Assigned acting roles to researchers
(e.g., rural area councilor who is very quiet,
urban area councilor who is gregarious and
opinionated and talks over others, etc.)
Went through entire process from AI to
mapping and reflection
5. Pilot with LGA councilors in the field
21. Implementation: AI excerpt
1. Interview each other; share your experiences with
PS3 training and mentoring using these questions:
Tell me a story about how you have used the
information and/or tools received through PS3
councilor training and mentoring.
What is an achievement or success you had based
on your experience with/learning through PS3
training and mentoring—what made it possible?
What did this achievement lead to?
What new or deepened collaborations with
others (individuals, community organizations,
government, etc.) have you made as a result of
these efforts? What did these connections lead to?
22. Implementation: Mapping excerpt
1. Ask each pair to offer one story, then ripple it out
(draw out some of the details)
Probing questions can include:
• Then what happened?
• What skill, approach,
or tool from Government
of Tanzania/PS3 training,
if any, was involved?
• What are people
doing differently?
• How have relationships changed as a result?
24. Discussion of the map
Ask the group to identify the
most significant change(s)
on the map.
What other effects or impacts would you
like there to be? What do you wish would
happen? (Are there things from your
action plan that have not yet happened
that you think are important?)
What should we do next?
Implementation: Reflections excerpt
25. Analysis
Mapping data was entered into
MindManager software
Identified additional connections and commonalities
Compared with themes from other
qualitative methods
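The Tanzania team used MindManager for this step. As an illustration only, and not a description of their actual workflow, maps stored as nested nodes could be tallied by analyst-assigned theme to surface commonalities across REM groups; the export format, node labels, and theme names below are hypothetical.

```python
from collections import Counter

# Hypothetical export of two REM maps as nested dicts:
# each node has a label, an analyst-assigned theme, and child ripples.
maps = [
    {"label": "Councilor training", "theme": "", "ripples": [
        {"label": "Learned to manage finances", "theme": "finance", "ripples": [
            {"label": "Developed a budget with less guidance", "theme": "finance", "ripples": []},
        ]},
    ]},
    {"label": "Councilor training", "theme": "", "ripples": [
        {"label": "Held community meeting on the budget", "theme": "citizen engagement", "ripples": []},
    ]},
]


def count_themes(node, counter):
    """Walk a map depth-first and tally the themes of its outcome nodes."""
    if node["theme"]:
        counter[node["theme"]] += 1
    for child in node["ripples"]:
        count_themes(child, counter)


totals = Counter()
for m in maps:
    count_themes(m, totals)

print(totals.most_common())  # themes shared across maps rise to the top
```

A tally like this is one way to line the map themes up against the themes from the other qualitative methods.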
27. Evaluation objective
To qualitatively examine how factors at the
personal, family, school, community, and
service delivery levels influence the education,
economic, and health trajectories and related
outcomes of orphans and vulnerable children (OVC)
28. Structure of REMs
Total of 4 REM groups
• 2 HIV-negative
• 2 HIV-positive (participants part of youth
support group)
• ~ 8 participants per group
• Youth ages 16–19 years old
• Received USG services
for orphans and vulnerable
children (OVC)
Two facilitators
• Main facilitator
• Mapper
29. Training
The two facilitators reviewed
a training PowerPoint and the field guide to ripple effects mapping
A mock REM group was
conducted with data collectors
• Data collectors played the role of youth beneficiaries
A pilot REM was conducted in the
capital city
30. Appreciative inquiry
Conversations between pairs of
participants (audio-recorded):
Tell me a story about how you have
used the information you received,
or skills learned through the project.
Has the project helped maintain or improve
your health?
Discuss an achievement or a success you
had based on your learning from the
project — what made it possible?
31. Mapping
In the pilot REM group, the participants were
distracted by the live mapping
• The mapper moved to the side of the room and
mapped the conversation and then brought it back
to the group to react to at the end of the session
Some youth were hesitant to share stories initially
due to the sensitivity of the information being shared
• Facilitators ensured confidentiality
• Youth came up with creative pseudonyms
Challenging to document the intricacies and
intersections between the different sector areas
in “real time”
32. Reflections
Facilitators presented the map back to
participants at the end and explored
whether the beneficiaries thought the map:
• Was a true reflection of their discussion; and/or
• Had any gaps
Participants used the time to reflect on what
was the most significant change identified
on the map
More concrete questions (e.g., most significant
change) were easier to grasp than questions
about “what was interesting” about the map
33. Analysis
The “live” maps were photographed and validated against the audio recordings of the session
• Additional details were added and vague
concepts clarified
These maps were then replicated in Xmind
mapping software
The maps and written transcripts were
reviewed to identify:
• Emerging themes,
• Perceived results of the project, and
• The perceived impact of those results
38. Advantages of REM
Post-data-collection analysis time is reduced compared with that of typical focus group discussions (FGDs).
The map is a useful tool to instigate further
discussion and details during the REM
session and afterwards.
Final maps create a sense of pride and
accomplishment among respondents.
39. Lessons learned
Facilitation
Facilitators should be experienced
and well-trained in group facilitation
and management.
Mappers should be well-versed in the
program components and theory of
change, and need analytical skills
to distill key aspects and examples
for the evaluation.
40. Lessons learned
Facilitation
[Example map excerpt: “Improved school performance” linked to two specific examples of increased self-confidence related to studies/school.]
41. Lessons learned
Training: Practice, practice, practice
Practice session with facilitators-in-
training acting as participants using
a real, common experience they share.
Mock REM session of the study topic,
with facilitators-in-training playing
various study participant roles.
Conduct 1–2 pilots to ensure facilitator
and mapper are well prepared.
42. Training
Clarify relationship between
facilitator, mapper, and notetaker
• For example, facilitator checks
if mapper is capturing outcomes;
mapper seeks clarity when necessary;
notetaker documents key quotes, etc.
Ask REM trainees to give examples of possible
stories from program beneficiaries to practice
mapping; encourage facilitators to probe
for details
Clarify that outcomes, not activities or stories,
should be mapped
Lessons learned
43. Lessons learned
Tailoring
How will the population
see the concept
of mapping?
Adults versus youth
May need to ask
questions differently
Youth may not understand some
concepts the same way as adults
44. Lessons learned
Tailoring (cont.)
Complex programs may need
to divide into categories
of discussion
Tanzania had just one topic (governance and citizen engagement) and a single intervention
Botswana had three different topic areas and layered interventions; the questions had to be divided up
45. Lessons learned
Implementation
Budget time to introduce and explain the
purpose and process to groups unfamiliar
with participatory research
Clarify that you want as many details as possible
Outcomes further along the causal chain than outputs
Concrete examples
Facilitators probe for those examples
• Participant: “We learned how to manage finances.”
• Facilitator: “Ok, so then what?”
• Participant: “I developed a budget and needed
less guidance than I normally do.”
46. Lessons learned
Implementation (cont.)
Audio recorders can work, depending on space
Botswana—worked well
Tanzania—spaces were expected to be small
and noisy; did not use
When not using an audio recorder, hold a
quality check team meeting right after session
• Clarify notes
• Check all quotes
• Make sure map captures all relevant
outcomes and ripples
47. Conclusions
REM is an engaging and interactive method to visualize outcomes that may not be captured otherwise.
Maps can be used to compare an
intervention’s a priori theory of change
to participants’ lived experiences.
REM can be used alone or in combination
with other evaluation methods.
REM can help further the effects of
an intervention.
48. Contributors
Tanzania
MEASURE Evaluation
Jessica Fehringer
Brittany Iskarpatyoti
Health and Development
International Consultants
(HDIC)
Mathew Senga
Egidius Kamanyi
Botswana
MEASURE Evaluation
Lisa Parker
Mahua Mandal
Abby Cannon
Elizabeth Millar
Research 4 Results
Iris Halldorsdottir
Sedilame Bagane
Graphic Harvest
Sonja Niederhumer
50. Resources
A Field Guide to Ripple Effects Mapping
https://conservancy.umn.edu/handle/11299/190639
Experiences and Lessons Learned: Implementing the Ripple Effects Mapping Method
https://www.measureevaluation.org/resources/publications/fs-20-423
Evaluation of Services for Orphans and Vulnerable Youth in Botswana
https://www.measureevaluation.org/resources/evaluation-of-services-for-orphans-and-vulnerable-youth-in-botswana
Midline Evaluation of the Tanzania Public Sector System Strengthening Program – Final Report
https://www.measureevaluation.org/resources/publications/tre-19-26
51. This presentation was produced with the support of the United
States Agency for International Development (USAID) under
the terms of MEASURE Evaluation cooperative agreement
AID-OAA-L-14-00004. MEASURE Evaluation is implemented by
the Carolina Population Center, University of North Carolina
at Chapel Hill in partnership with ICF International; John Snow,
Inc.; Management Sciences for Health; Palladium; and Tulane
University. Views expressed are not necessarily those of USAID
or the United States government.
www.measureevaluation.org