The value of engagement John Young [email_address] Laura Harper  l.harper@wellcome.ac.uk
Introduction Buzz: “What do you think would constitute good evidence of the value of public engagement?” Presentation: Some approaches to measuring, learning and sharing knowledge. Groups: What do we need to learn and share about engagement, and how would we like to do it?
Why worry about it? Because WT is. Because all other research donors are. Because we all think it’s important, but don’t really know what works: How (exactly) does research influence policy? Should we invest energy engaging with legislators, bureaucrats, the media, the public, or schoolchildren? Are we even achieving what we’re aiming to achieve?
Evidence Discuss with your neighbours: “What do you think would be convincing evidence of the value of public engagement?” Write down whatever you come up with.
The DELIVERI Programme Developing, testing and promoting new forms of animal health services in Indonesia: Pilot projects with farmers & field staff Training and capacity development for all Institutional development Quality management Communication & advocacy “The DELIVERI programme has developed some useful models of institutional change in the context of decentralisation, making a government service more responsive to the needs of local people”¹ ¹ DFID Country Strategy Paper for Indonesia, Sept. 2000
ODI and RAPID UK’s leading development Think Tank. RAPID: Promoting greater use of research-based evidence in development policy & practice Research / Advice / Information and Capacity Development Working with all stakeholders Case studies, frameworks, toolkits evidence-based policy in development network (ebpdn) www.odi.org.uk / www.odi.org.uk/rapid
Forms of “engagement” With policy-makers, practitioners and communities to: Identify problems & issues for research Develop research projects and methodologies Undertake the research Feed back, discuss and validate the results of research Formulate solutions – policies and programmes Implement the solutions – training and capacity development Evaluate their effectiveness
Forms of “engagement” ...and we do it through: (Literature reviews) Telephone calls, e-mail (and video conferences) Face-to-face 1:1 meetings & field trips Meetings, workshops & seminars Collaborative work / projects Sharing draft outputs for comment Web 2.0 – blogs, wikis, discussion groups Print & web publications The media
Policy processes are complex [Diagram: the policy cycle – Agenda Setting, Policy Formulation, Decision Making, Policy Implementation, Monitoring and Evaluation – with actors: Civil Society, Donors, Cabinet, Parliament, Ministries, Private Sector]
Chronic Poverty in Uganda Kate Bird et al,  Fracture Points in Social Policies for Chronic Poverty Reduction, ODI WP242, 2004  (http://www.odi.org.uk/publications/working_papers/wp242.pdf)
Factors influencing uptake External Influences – socio-economic and cultural influences, donor policies etc. The political context – political and economic structures and processes, culture, institutional pressures, incremental vs radical change. The evidence – credibility, the degree to which it challenges received wisdom, research approaches and methodology, simplicity of the message, how it is packaged. The links between policy and research communities – networks, relationships, power, competing discourses, trust, knowledge etc. www.odi.org.uk/RAPID/Tools/Toolkits/RAPID_Framework.html
A Practical Framework [Diagram: three overlapping spheres – political context, evidence, links – set within External Influences; labelled Politics and Policymaking, Research, learning & thinking, Media, Advocacy, Networking, Scientific information exchange & validation, Policy analysis & research, Campaigning, Lobbying] www.odi.org.uk/RAPID/Tools/Toolkits/Policy_Impact/Framework_qus.html
Health Care in Tanzania “The results of household disease surveys informed processes of health service reform which contributed to a 43 and 46 per cent reduction in infant mortality between 2000 and 2003 in two districts in rural Tanzania.” TEHIP Project, Tanzania: www.idrc.ca/tehip
What should you measure? It depends what you’re trying to do… “If you don't know where you are going, any road will get you there”
Whatever you measure should be: Specific, Measurable, Achievable, Realistic, Time-bound (and Objective)
Change takes a long time ...and many projects fail when the inputs cease. [Diagram: results chain – Inputs, Activities, Outputs, Outcomes, Impact – with Project Effort, Other Actors and Behaviour Change]
Focusing on change www.odi.org.uk/RAPID/Tools/Toolkits/KM/Outcome_mapping.html OUTCOME MAPPING: Building Learning and Reflection into Development Programs Sarah Earl, Fred Carden, and Terry Smutylo www.idrc.ca/en/ev-9330-201-1-DO_TOPIC.html
Emphasis on “learning” “…every time we do something again, we should do it better than the last time…” Learn before / Learn during / Learn after, drawing on external networks, colleagues, information assets and our own knowledge, linking Goals, Activities and Results. www.odi.org.uk/RAPID/Tools/Toolkits/KM/Index.html
Learning before: Peer Assist Starts with the attitude that “someone has probably already done what I am about to do. I wonder who?” www.odi.org.uk/RAPID/Tools/Toolkits/KM/Peer_assists.html
Learning During: Stories What was the situation? What was the challenge? What was done? What was the result? What lessons can be drawn? www.odi.org.uk/RAPID/Tools/Toolkits/KM/Stories.html www.mande.co.uk/docs/MSCGuide.pdf Most significant change Best stories at each level Synthesis Stories of change
Horizontal evaluation Peer review Choose the moment Choose your peers Limited criteria e.g. ODI Peer Review Appreciative enquiry Self-evaluation CGIAR/CIAT Workshop
Learning after: AAR An after action review is a 15-minute team debrief, conducted in a “rank-free” environment, asking 4 simple questions: What was supposed to happen? What actually happened? Why was there a difference? What can we learn from it? www.odi.org.uk/RAPID/Tools/Toolkits/KM/AAR.html
Case & Episode Studies Classical case studies: how did evidence shape policy decisions? e.g. IFPRI & IDRC Overestimate the role of research www.idrc.ca/en/ev-26606-201-1-DO_TOPIC.html www.ifpri.org/impact/impact.htm www.odi.org.uk/RAPID/Publications/BRP_ITDG.html www.odi.org.uk/RAPID/Projects/PPA0104/Index.html www.gdnet.org/middle.php?oid=175 Episode studies: retrospective tracking back from policy change e.g. PRSPs, SL, AHC Underestimate the role of research
RAPID Outcome Mapping www.odi.org.uk/RAPID/Publications/RAPID_WP_266.html
Social Network Analysis www.odi.org.uk/RAPID/Tools/Toolkits/KM/Social_network_analysis.html
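At its simplest, social network analysis means mapping who is linked to whom and counting each actor's direct connections (degree). A minimal sketch, assuming nothing beyond the Python standard library; the partner names and links below are invented purely for illustration:

```python
# Hypothetical sketch of degree-based social network analysis.
# All organisation names and links are invented for illustration.
from collections import Counter

# Each pair records a direct collaboration between two partners.
links = [
    ("ODI", "Wellcome Trust"), ("ODI", "IDRC"),
    ("Wellcome Trust", "School A"), ("Wellcome Trust", "Museum B"),
    ("IDRC", "TEHIP"),
]

# Degree: how many direct links each actor has.
degree = Counter()
for a, b in links:
    degree[a] += 1
    degree[b] += 1

# The best-connected actor is a candidate "broker" in the network.
broker, n_links = degree.most_common(1)[0]
print(broker, n_links)  # → Wellcome Trust 3
```

Richer measures (betweenness, clustering) need a dedicated library, but even raw degree counts can reveal who holds a network together.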
Other approaches: Public Citations, webstats, media logs etc. Surveys: quantitative and qualitative. Distribution lists and attendance records. Meeting evaluations. Logs: the expected, the unexpected, how you have changed. Evaluation: Practical Guidelines, Research Councils UK, 2002: www.rcuk.ac.uk/cmsweb/downloads/rcuk/publications/evaluationguide.pdf
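One of the lighter-weight measures above, meeting evaluations, takes only a few lines to summarise. A sketch with invented ratings, purely to show the shape of the analysis:

```python
# Illustrative sketch: averaging 1-5 meeting evaluation scores.
# The questions and ratings are invented example data.
from statistics import mean

scores = {
    "Relevance": [5, 4, 4, 5, 3],
    "Clarity": [4, 4, 3, 4, 4],
    "Would recommend": [5, 5, 4, 4, 5],
}

# Report the mean rating and sample size per question.
for question, ratings in scores.items():
    print(f"{question}: {mean(ratings):.1f} (n={len(ratings)})")
```

Reporting the sample size alongside the mean matters: an average of 4.6 from five forms is much weaker evidence than the same average from five hundred.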
Other approaches: Policy Strategy and direction: Logframes; Social Network Analysis; Impact Pathways; Modular Matrices. Management: ‘Fit for Purpose’ Reviews; ‘Lighter Touch’ Quality Audits; Horizontal Evaluation; Appreciative Inquiry. Outputs: Evaluating academic articles and research reports; Evaluating policy and briefing papers; Evaluating websites; Evaluating networks; After Action Reviews. Uptake: Impact Logs; New Areas for Citation Analysis; User Surveys. Outcomes and impacts: Outcome Mapping; RAPID Outcome Assessment; Most Significant Change; Innovation Histories; Episode Studies. www.odi.org.uk/RAPID/Publications/RAPID_WP_281.html
Learning from each other Face-to-face meetings Establishing a network  E-mail Websites Collaborative work Publications D-groups So...you already are a sort of network
Network Functions Facilitators / learners Community builders Investor / providers Convenors Filters Amplifiers Support Agency
Keys to Success Clear governance. Strength in numbers. Representativeness. Quality of evidence. Packaging of evidence. Persistence. Membership of key individuals. Making use of informal links. Complementing official structures. Good use of ICTs. www.odi.org.uk/RAPID/Publications/RAPID_WP_276.html Or contact Enrique Mendizabal – [email_address]
Community of practice “a group of individuals participating in communal activity, and experiencing/continuously creating their shared identity through engaging in and contributing to the practices of their communities” Wenger, Etienne (1998), Communities of Practice: Learning, Meaning, and Identity, Cambridge: Cambridge University Press, ISBN 978-0-521-66363-2
evidence based policy in development network To promote greater use of research-based evidence in development policy & practice Research Consultations with CSOs and other stakeholders Capacity-development Collaborative action-research Joint projects Mutual learning www.ebpdn.org
evidence based policy in development network To promote greater use of research-based evidence in development policy & practice Research Consultations with CSOs and other stakeholders Capacity-development Collaborative action-research Joint projects Mutual learning www.ebpdn.org Meet annually Interactive map of members
Further information MandE News by Rick Davies:  www.mande.co.uk   Psci-com (practical guides section) by the Wellcome Trust:  http://www.intute.ac.uk/healthandlifesciences/pscicom/ Wellcome Trust Researcher Support Links:  www.wellcome.ac.uk/Professional-resources/Researcher-support/WTD026043.htm RAPID Website:  www.odi.org.uk/rapid John Young:  [email_address]
Group work – 4 Questions: What sort of evidence do you need in your own project(s) to make sure you are on track (and how to collect it)? What sort of evidence would you like to have about other public engagement projects? How would you like to get that evidence (and share your own)? Who should do what (you, other projects, 3rd party, Wellcome)?
Process 4 groups, each + facilitator & rapporteur Split into 2 sub-groups: Own evidence Evidence from others Whole group: How to get it Who should do what Max. 3 responses to each question, highlighting the most important. Report this back to plenary. “Discussion” on D-Groups
Focus and location Policy  – in the Auditorium Fac: Michelle Jimenez Rap: Greer Van Zyl Community  – in the Boardroom Fac: Bella Starling Rap: Monica Bonaccorso Media  – in the canteen Fac: Craig Brierly Rap: Katrina Nevin Ridley Creative/Other  – on the Terrace Fac: Laura Harper Rap: Marina Joubert
