In the final episode of the 'Cup of Tea' webinar series, John Young, Head of ODI's RAPID programme, shared his reflections on the monitoring and evaluation of research uptake. The final webinar took place on Thursday 25 May.
2. Housekeeping
• 5 minute introduction by Megan Lloyd-Laney
• 15 minute discussion with John Young
• 10 minute Q&A with the audience
• When you have a question, please type it into the question box and we will share it with the panel and participants.
• If you do not have a chance to ask your question, please send it in via email or social media using the hashtag #R2AWebinar.
3. Poll
What do you find MOST challenging about M+E of research uptake?
A) It’s hard to measure
B) Research uptake can take a long time
C) Research uptake can take unexpected forms
D) M+E often left to the end of a project
E) Inadequate tools for the job
F) Tools need to be adapted to different contexts
4. Levels of uptake
• Discursive: Client-focused services
• Attitudinal: Farmers have good ideas
• Procedural: Participatory approaches to service development
• Content: UU20, UU25. New guidelines
• Behavioural: Approach being applied in practice
http://pubs.iied.org/pdfs/G02016.pdf
5. A systematic approach
1. Strategy and direction – are you doing the right thing?
2. Management – are you doing what you planned to do?
3. Outputs – are the outputs appropriate for the audience?
4. Uptake – are people aware of your work?
5. Outcomes and impacts – are you having any impact?
6. Monitor the context – what else might be influencing the changes you observe?
6. 1. Strategy and direction – Log frames; Theories of change; Impact pathways
2. Management – Quality audits; Horizontal evaluation; After-action reviews
3. Outputs – Peer review; Evaluating websites; Evaluating networks
4. Uptake – Impact logs; New areas for citation analysis; User surveys
5. Outcomes and impacts – Stories of change; MSC; Episode studies; Performance stories
7. A systematic approach
1. Strategy and direction
2. Management
3. Outputs
4. Uptake
5. Outcomes and impacts
6. Monitor the context – Bellwether surveys, media monitoring, timelines
8. R2A’s next webinar series
• The ‘Cup of Tea’ webinar series is now over. You can re-watch the series on R2A’s Vimeo channel or read the full text summaries on the website.
• Stay tuned for details of the next series of ‘R2A Roundtables’, starting in July.
9. Follow R2A on social media
• Twitter @Research2Action
• Facebook @Research2Action
• Instagram @Research2Action
• LinkedIn http://ow.ly/67Vg308bxP2
• Vimeo http://ow.ly/8v2N308bxTg
Editor's Notes
Tools need to be adapted to different contexts
Many people have a very narrow definition of policy – essentially focusing only on legislation, regulations, policies and strategies.
But these are only the explicit descriptions of policy.
Remember we define policy as a purposive course of action. Making that happen involves a much wider set of changes.
We often describe 5 levels of policy:
Discursive – this is about ideas and how they evolve
Attitudinal – before behaviour can change, the attitudes of the different stakeholders need to change
Procedural – that can lead to different ways in which issues are discussed and decisions are taken about policy solutions
Content – this is the actual legislation etc.
Behavioural – but there will only be real policy change if the behaviour of the people the policy is supposed to affect actually changes.
Here are some examples of what that looks like in practice, from a livestock service reform project I was working on in the eastern regions of Indonesia.
Projects seeking to use research-based evidence to influence policy can, and probably need to, work at all of these levels.
When you are developing your strategy, it is very important to be clear which levels you are aiming at.
You can find out more about that project at this website.
We have tried to apply the 5-level approach across the whole of ODI’s work, so that we do something at each level for all types of intervention from small projects, through larger projects, to programmes, and finally at the organisational level.
So the “strategic” level in very small projects might be only that there is a clear rationale for the project in the terms of reference, and that this is discussed with the client. The “management” level might just be that there is a clear sequence of activities with a timeline, and that this is monitored. The quality of “outputs” at this level is largely judged by the client, and “uptake” and “impact” are tracked through informal logs.
More is done in large projects. For example, there may be an inception phase at the end of which it is necessary to get a theory of change or logframe approved by an advisory group. There may be a more formal management process including PRINCE2, a formal process of peer review for outputs, and an evaluation at the end of the project.
Even more is done at programme and organisational level. For example, some programmes have formal steering groups, a three- or five-yearly strategic review, and a formal external evaluation.
At organisational level, the Board of Trustees is responsible for approving the strategic plan, and we periodically have a peer review of the whole organisation by a group of other think tank heads.
The important thing is to keep it simple, and as far as possible tied in with routine project activities.