Evaluating research brokers and intermediaries
What is success? And how do we measure it?
Anna Downie, Information Department, Institute of Development Studies, UK
Power of In-Between Conference, South Africa (July 2008)
Evaluation: why?
Research brokers and intermediaries are more than just a by-product of research
Accountability: like all programmes which receive aid money, we need to show what difference we make
Most importantly, to learn and improve
Introductions What challenges do you face in evaluation in this area?
Research communication evaluation is challenging …
Information, communication, knowledge, learning and change are all difficult concepts to understand and evaluate
Decision making and policy making are complex processes to disentangle
Research communications evaluation: single pieces of research, or a specific policy process
“Although there is little consensus on what research ‘use’ refers to exactly, there seems to be broad agreement on the fact that research evidence rarely has a direct impact on decision making” (Schryer-Roy 2005)
… But evaluating research brokers and intermediaries is even more challenging!
Intermediaries are involved with a variety of research in multiple processes
Not advocating for a single piece of research or a single process
Trying to change the use of information and the policy environment?
Challenges
M&E needs to be tailored
Different drivers for intermediaries, so different concepts of outcomes
Often set up from the ‘supply’ side, but we want to evaluate whether it meets a need/demand
We address multiple problems, therefore we have multiple outcomes
IDS Knowledge Services external evaluation: key recommendations
Outcomes: make clear what ‘use’ of information means
Pathways: make explicit our theory of change
Targeting: identify more specific groups of target users
Methods: don’t just focus on collecting ‘success’ stories
What is success? For different stakeholders:
What outcomes are we looking for? Are they measurable?
What are indicators of success?
Knowledge dissemination
Knowledge building
Sharing ideas
To provide wider access to the knowledge base of resources relevant to tackling poverty and injustice
Information access
The results of research are more widely used
Increased access to and engagement with research findings
Supporting and enabling evidence-based, pro-poor policy and practice
To seek solutions to knowledge and communications challenges
Enhance the communication and use of evidence-based development research
Awareness and improved understanding among policymakers
Improved links and relationships between researchers, practitioners and policy makers
Bridge the gap between ideas and policy
Bridging the gaps between research, policy and practice with accessible and diverse information
Changing communication patterns
Increased debate about research findings
What do we mean by use of research?
Information
Knowledge
Learning
Action / Decisions
Awareness
Access
Development outcomes?
How is research used?
To change what people do (behaviours, policies)
To change how people think (different debates, different voices)
To change how people feel
Mapping out pathways
Can help you prioritise and plan
Makes explicit your assumptions and values
Allows you to compare your ‘ideal’ with reality and helps you to evaluate
Needs to be simple enough for everyone to understand, but meaningful enough to be able to test it
Mapping out pathways: some approaches
Logframes (logical frameworks)
Outcome mapping
Social network analysis
Theory of change
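As an aside, the kind of social network analysis named above can be sketched very simply. The network below is invented for illustration (the slides name no actual data): actors and links between researchers, an intermediary and research users, with degree centrality as a crude measure of who brokers the most relationships.

```python
# Illustrative sketch only (not from the slides): a minimal social network
# analysis using just the standard library. All nodes and links are
# hypothetical examples of a broker network.
from collections import defaultdict

# Undirected links: who shares research with whom (invented data)
links = [
    ("researcher_A", "intermediary"),
    ("researcher_B", "intermediary"),
    ("intermediary", "policymaker_X"),
    ("intermediary", "practitioner_Y"),
    ("researcher_A", "practitioner_Y"),
]

neighbours = defaultdict(set)
for a, b in links:
    neighbours[a].add(b)
    neighbours[b].add(a)

n = len(neighbours)
# Degree centrality: the share of other actors each actor is directly linked to
centrality = {node: len(adj) / (n - 1) for node, adj in neighbours.items()}

# The intermediary sits between researchers and users, so it scores highest
most_central = max(centrality, key=centrality.get)
print(most_central)  # intermediary
```

Even a toy version like this makes the intermediary's "in-between" position visible and countable, which is the point of using network analysis for pathway mapping.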
Outputs: the things which IDS Knowledge Services will produce
Access and debate: desire and capacity to use information; access to relevant, diverse and credible information; sharing, discussion and debate; an enabling environment for intermediaries
Understanding and influence: more understanding of poverty and injustice; increased capacity to build the understanding of others; increased capacity to influence the behaviours and actions of others
Action: changing or supporting a development intervention; framings of issues and new agendas; spaces to negotiate power relations; wider awareness of development issues and public debate
Goal: information contributes to more enabling development processes and interventions which are pro-poor, inclusive and equitable
Vision: a world in which poverty does not exist, social justice prevails and the voices of all the people are heard in national and international institutions
Identifying target groups and building relationships with stakeholders
Who are our target groups? Policy makers and practitioners? (who are they?!) Whoever uses us? Or can we be more specific?
Developing relationships with stakeholders requires energy and commitment
Who do we prioritise?
A few evaluation methods
Things we’ve tried: web statistics; usability testing; questionnaires; interviews and case studies; research into information and communication; advisory groups and editorial panels
Things we’d like to do: counterfactuals (comparison with non-users); tracker studies; network analysis; follow-up studies; benchmarking???; readers’ panels; most significant change approach
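Of the methods tried, web statistics are the most mechanical to produce. A minimal sketch, with invented log lines rather than anything from IDS's actual setup, shows why they are easy to collect and also why they only measure access, not use:

```python
# Illustrative sketch only: basic web statistics from server log lines.
# The log format and page paths are invented for the example.
from collections import Counter

log_lines = [
    "10.0.0.1 GET /research/health-policy 200",
    "10.0.0.2 GET /research/health-policy 200",
    "10.0.0.3 GET /about 200",
]

# Third whitespace-separated field is the requested page path
views = Counter(line.split()[2] for line in log_lines)

top_page, hits = views.most_common(1)[0]
print(top_page, hits)  # /research/health-policy 2
```

A hit count is a proxy for access at best; it says nothing about whether the information changed what anyone did, thought or felt, which is why the harder methods on the wish list matter.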
What evaluation methods do you use? What do you find easy to measure/evaluate? What do you find harder to measure/evaluate? What does success mean to you?
Final thoughts
Expect the unexpected
Build relationships with users
Evaluation is challenging, but worth it!
Aim to learn and improve: be questioning, inquisitive and reflective
Thanks!


Editor's Notes

  • #2 Introduce me. Conference objective is to understand how brokers and intermediaries add value. How do we evaluate that value? This session will share some of the experiences of evaluation in the IDS Knowledge Services and will give participants a chance to discuss some key questions. So I promise I won’t be talking for the next hour and a half; you’ll be getting involved.