Power Of In Between M&E Session

Identifying outcomes and impact: monitoring and evaluation of research brokering and intermediation

Presentation by Anna Downie, Strategic Learning Initiative, IDS, UK, at the Locating the Power of the In-between conference.

1. Evaluating research brokers and intermediaries: What is success? And how do we measure it?
Anna Downie, Information Department, Institute of Development Studies, UK
Power of In-Between Conference, South Africa (July 2008)
2. Evaluation: why?
- Research brokers and intermediaries are more than just a by-product of research
- Accountability: like all programmes that receive aid money, we need to show what difference we make
- Most importantly, to learn and improve
3. Introductions
What challenges do you face in evaluation in this area?
4. Research communication evaluation is challenging ...
- Information, communication, knowledge, learning and change are all difficult concepts to understand and evaluate
- Decision making and policy making are complex processes to disentangle
- Research communication evaluation typically focuses on single pieces of research, or on a specific policy process
- "Although there is little consensus on what research 'use' refers to exactly, there seems to be broad agreement on the fact that research evidence rarely has a direct impact on decision making" (Schryer-Roy 2005)
5. ... But evaluating research brokers and intermediaries is even more challenging!
- Intermediaries are involved with a variety of research in multiple processes
- They are not advocating for a single piece of research or a single process
- Trying to change the use of information and the policy environment?
6. Challenges
- M&E needs to be tailored
- Intermediaries have different drivers, and therefore different concepts of outcomes
- Services are often set up from the 'supply' side, but we want to evaluate whether they meet a need or demand
- We address multiple problems, and therefore have multiple outcomes
7. IDS Knowledge Services external evaluation: key recommendations
- Outcomes: make clear what 'use' of information means
- Pathways: make explicit our theory of change
- Targeting: identify more specific groups of target users
- Methods: don't just focus on collecting 'success' stories
8. What is success?
For different stakeholders:
- What outcomes are we looking for? Are they measurable?
- What are indicators of success?
9. Some framings of success:
- Knowledge dissemination
- Knowledge building
- Sharing ideas
- To provide wider access to the knowledge base of resources relevant to tackling poverty and injustice
- Information access
- The results of research are more widely used
- Increased access to and engagement with research findings
- Supporting and enabling evidence-based pro-poor policy and practice
- To seek solutions to knowledge and communications challenges
- Enhance the communication and use of evidence-based development research
- Awareness and improved understanding among policymakers
- Improved links and relationships between researchers, practitioners and policy makers
- Bridge the gap between ideas and policy
- Bridging the gaps between research, policy and practice with accessible and diverse information
- Changing communication patterns
- Increased debate about research findings
10. What do we mean by use of research?
Information · Knowledge · Learning · Action / Decisions · Awareness · Access · Development outcomes?
11. How is research used?
- To change what people do (behaviours, policies)
- To change how people think (different debates, different voices)
- To change how people feel
12. Mapping out pathways
- Can help you prioritise and plan
- Makes explicit your assumptions and values
- Allows you to compare your 'ideal' with reality and helps you to evaluate
- Needs to be simple enough for everyone to understand, but meaningful enough to be able to test it
13. Mapping out pathways: some approaches
- Logframes (logical frameworks)
- Outcome mapping
- Social network analysis (a minimal sketch follows this list)
- Theory of change
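To make one of these approaches concrete, here is a minimal social network analysis sketch in Python using the networkx library. The actors and links are hypothetical examples, not IDS data; betweenness centrality is a standard measure of how often an actor lies on the paths between others, which is one way to quantify the 'in-between' brokerage role this presentation is about.

```python
# Minimal social network analysis sketch (hypothetical data).
import networkx as nx

G = nx.Graph()
# Each edge is an observed communication link between a researcher,
# an intermediary, or a policy actor. These examples are invented.
G.add_edges_from([
    ("researcher_A", "intermediary"),
    ("researcher_B", "intermediary"),
    ("researcher_C", "intermediary"),
    ("intermediary", "policy_maker_1"),
    ("intermediary", "practitioner_1"),
    ("policy_maker_1", "practitioner_1"),
])

# Betweenness centrality: how often an actor sits on the shortest
# paths between other actors -- a rough proxy for brokerage.
for actor, score in sorted(
    nx.betweenness_centrality(G).items(), key=lambda kv: -kv[1]
):
    print(f"{actor}: {score:.2f}")
```

In a real evaluation the edges might come from citation data, event attendance or interview-reported contacts, and change in the network over time could be one indicator of improved links and relationships between researchers, practitioners and policy makers.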
14. The IDS Knowledge Services pathway
Outputs: the things which IDS Knowledge Services will produce
Access and debate:
- Desire and capacity to use information
- Access to relevant, diverse and credible information
- Sharing, discussion and debate
- An enabling environment for intermediaries
Understanding and influence:
- More understanding of poverty and injustice
- Increased capacity to build the understanding of others
- Increased capacity to influence the behaviours and actions of others
Action:
- Changing or supporting a development intervention
- Framings of issues and new agendas
- Spaces to negotiate power relations
- Wider awareness of development issues and public debate
Goal: information contributes to more enabling development processes and interventions which are pro-poor, inclusive and equitable
Vision: a world in which poverty does not exist, social justice prevails and the voices of all people are heard in national and international institutions
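One way to make a pathway like this testable is to pair each stage with an evaluation question. The sketch below encodes the stages from the slide above; the questions are hypothetical illustrations, not part of the original presentation.

```python
# Pathway stages from the slide, each paired with a hypothetical
# evaluation question for checking progress at that stage.
pathway = [
    ("Outputs", "Are the planned products actually being produced?"),
    ("Access and debate", "Who accesses and discusses them, and how?"),
    ("Understanding and influence", "Whose understanding or capacity has changed?"),
    ("Action", "Which decisions or interventions have changed?"),
    ("Goal", "Are development processes becoming more pro-poor and inclusive?"),
    ("Vision", "Long-term societal change, beyond direct attribution"),
]

for stage, question in pathway:
    print(f"{stage}: {question}")
```

Writing the stages down this explicitly is what lets you compare your 'ideal' with reality, as the earlier Mapping out pathways slide recommends.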
15. Identifying target groups and building relationships with stakeholders
Who are our target groups?
- Policy makers and practitioners? (Who are they?!)
- Whoever uses us?
- Or can we be more specific?
Developing relationships with stakeholders:
- Requires energy and commitment
- Who do we prioritise?
16. A few evaluation methods
Things we've tried:
- Web statistics (a minimal sketch follows this list)
- Usability testing
- Questionnaires
- Interviews and case studies
- Research into information and communication
- Advisory groups and editorial panels
Things we'd like to do:
- Counterfactuals (comparison with non-users)
- Tracker studies
- Network analysis
- Follow-up studies
- Benchmarking???
- Readers' panels
- Most significant change approach
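To give a feel for the simplest method on this list, here is a minimal web-statistics sketch in Python using pandas. The file name and its columns (date, page, visitor_id) are hypothetical; real analytics exports vary, so treat this as an illustration of the kind of aggregation involved rather than a definitive recipe.

```python
# Minimal web-statistics sketch (hypothetical export file and columns).
import pandas as pd

# One row per page view, e.g. exported from a web analytics tool.
views = pd.read_csv("page_views.csv", parse_dates=["date"])

# Page views and unique visitors per month: a crude measure of reach.
monthly = views.groupby(views["date"].dt.to_period("M")).agg(
    page_views=("page", "size"),
    unique_visitors=("visitor_id", "nunique"),
)
print(monthly)

# Most-viewed pages: which resources attract the most attention?
print(views["page"].value_counts().head(10))
```

Counts like these measure reach rather than use: they say nothing about whether anyone's thinking or behaviour changed, which is why the list above pairs them with interviews, case studies and follow-up studies.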
17. What evaluation methods do you use?
What do you find easy to measure/evaluate?
What do you find harder to measure/evaluate?
What does success mean to you?
18. Final thoughts
- Expect the unexpected
- Build relationships with users
- Evaluation is challenging, but worth it!
- Aim to learn and improve: be questioning, inquisitive and reflective
19. Thanks!
