SCC2011 - Evaluation: Facing the tricky questions



The presentation from the 'Evaluation: Facing the Tricky Questions' session at the 2011 Science Communication Conference


  • What long-term outcomes can we meaningfully measure, and for whom? How do we link these outcomes to public engagement work? What types of programmes are appropriate to evaluate in the long term? What other evaluation work is necessary to lay the foundations for a successful long-term study? Can the findings of such work usefully influence public engagement practice?
  • Important to note: these long-term outcomes are about the PE and engineering communities, NOT the public, who are covered in the shorter-term outcomes, are much more problematic to find, and had less deep involvement. Having been involved in project delivery, we felt there would likely be more pronounced impacts on engineers/PE people that we could capture through the evaluation. Also, these goals probably overlap with the goals of many other programmes and schemes. What we did have was anecdotal evidence of collaborations and activities that had developed following other grant schemes, so we had an idea where to start looking, but were very open to unexpected outcomes, or no outcomes at all. Include comment from Lesley's colleague: "What do you want to evaluate long-term impact for? There might not be any."
  • Will mention findings verbally as they are not yet published… Evidence that the PE community were more likely to drive the work following the funded period (but they were usually the 'delivery' partners). Many had maintained links. The strongest outcome for engineers was skills; it is unclear whether they would have gained these anyway, but the impact could be around removing or reducing barriers to engineers gaining PE skills and experience. Some evidence in a small proportion of cases that engineering had been considered in greater depth, and of longer-term influence on the PE community: "Engineering was practically invisible in our programme prior to the Award." Also picked up many people who were doing this work anyway, so Ingenious helped them continue their work rather than change it. Is this success?
  • i.e. we discussed long-term impacts, and who these would be on, in the Theory of Change (ToC) work; here we identified that the desired long-term changes were for the PE and engineering communities. We then felt it would be possible to evaluate these in the longer term. NOT: "we can't reach the people we really want the long-term impacts to be on… so let's just evaluate what we can with whoever we can get."
  • We hope to address the questions about attribution in the qualitative work

    2. Evaluation: Facing the Tricky Questions
       Mark Dyball, Director, People Science and Policy Ltd
       Diane Warburton, Evaluation Manager, Sciencewise-ERC
       Laura Grant, Director, Laura Grant Associates Ltd
    3. Evaluation: facing the tricky questions
       Assessing Impact on Policy
       Mark Dyball, Director
       People Science & Policy Ltd
    4. Basic Questions
       Can science communication have an impact on policy-making?
       What sort of science communication projects might impact on policy-making?
       Is it feasible to assess the impact of a science communication project on policy-making?
       Is it appropriate to devote resources to finding out whether there has been an impact on policy?
    5. Policy-makers…
       are not just politicians and civil servants
       include managers in:
       • Universities
       • Charities
       • Businesses
       are people who have the power to make decisions and set policies within their organisations
    6. What influences policy-makers?
       Evidence
       Dogma/belief
    7. Can science communication have an impact on policy-making?
       Yes, but:
       • science communication is a broad term;
       • different projects have different goals; and
       • they may influence different policy-makers.
       Was there an intention to influence policy-makers?
       If so, which ones?
    8. What sort of projects might impact on policy-making?
       Research-based dialogue:
       • understanding stakeholder attitudes and values
       • bringing stakeholders and policy-makers together
       e.g. HSE public dialogue on train protection
       Often commissioned by the relevant policy-maker
    9. What sort of projects might impact on policy-making?
       Communication of science:
       • human cloning
       • stem cells
    10. What sort of projects might impact on policy-making?
        Communication and evaluation
        Fosters understanding of:
        • audiences
        • activities that "work"
        • who the effective communicators are
        Evaluation can be used to influence the policies of:
        • programme managers (funders)
        • project managers (communicators)
    11. Resources?
        It is worth investing time and money in evaluating impact on policy-makers if:
        • it was a key goal of the project (e.g. Beacons), or
        • there is clear evidence that influence on one or more types of policy-maker was an unintended consequence.
        In many cases it will not be worth this investment.
    12. How do you know if you had an influence?
        • Interview policy-makers
        • Follow document trails
        • Observe relevant events
        The co-operation of the relevant policy-makers makes life easier for the evaluator
        e.g. EPSRC nanotechnology in healthcare
    13. EPSRC: Nanotechnology in healthcare
        Large-scale public dialogue (BMRB)
        Ongoing evaluation
        The quality of the project:
        • observation
        • questionnaires
        • interviews
        The impact of the project:
        • documentation
        • interviews
    14. A Final Thought
        Which policy-makers would I most like to see influenced?
        Those who have commissioned the project/evaluation.
        If you have paid for intelligence, then use it.
        This includes you!
    15. Evaluation: facing the tricky questions
        Assessing value for money
        Diane Warburton, Evaluation Manager
        Sciencewise-ERC
    16. Question:
        When is a public engagement project good value for money?
    17. Why we have assessed value
        Sciencewise has developed ways to measure and demonstrate the value of public dialogue because:
        • budgets are reduced and scrutiny has increased
        • we need to maintain support as well as funding
        • it is useful to review the costs of design decisions
        This became a priority for Sciencewise project evaluations over the last couple of years.
    18. Practical and other problems
        • "Everything that can be counted does not necessarily count; everything that counts cannot necessarily be counted" (Albert Einstein)
        • A cynic is "a man who knows the price of everything and the value of nothing" (Oscar Wilde)
        • Measuring can be expensive in time as well as money
        • It is very hard to get detailed, accurate cost data
        We decided that measuring value needs to be relevant and proportionate. And it is not always appropriate.
    19. Looked at all sorts of models
        • Classic VFM: economy, efficiency, effectiveness
        • SROI: Cabinet Office approach to social value
        • Classic cost-benefit analysis
        We found these were not appropriate because, as models, they:
        • were too complex
        • were too detailed
        • made too many assumptions
        • focused too much on 'monetising value'
        We learnt from these but took a different approach.
    20. Assessing impacts
        Rather than looking at benefits, we focused our evaluations on assessing four types of impacts:
        • impacts on policy and policy making
        • impacts on policy makers
        • impacts on public participants
        • impacts on scientists and others involved
        This approach allowed us to identify short-term as well as long-term impacts, on people and policy.
    21. Six questions on costs
        1. What was the basic budget? e.g. Nanodialogues £240,000; Drugsfutures £300,000
        2. What were public participants' and stakeholders' perceptions of whether it was 'money well spent'? e.g. Drugsfutures public participant: "yes, if our views are listened to"
        3. Could costs have been reduced without losing quality? e.g. geoengineering dialogue: the public access events, and the very detailed evaluation, did not add enough value to match their costs
    22. Six questions on costs, continued
        4. Could a small additional investment have achieved significant extra benefits? e.g. Synthetic biology dialogue: commissioning follow-on work to maximise the impact of project reports; Big Energy Shift: additional public participants at the workshop with policy makers
        5. What costs could be saved later by having had good public engagement? No examples yet.
    23. Six questions on costs, continued
        6. What are the costs of engagement compared to overall programme budgets? e.g.
        • Geoengineering: cost £155,000; fed into an EPSRC/NERC sandpit which alone allocated £2.5 million
        • Synthetic biology: cost £360,000; the budget for synthetic biology research in the UK 2005-10 was £18-£33 million
        • Nanodialogues: cost £240,000; the value of nano research in 2007 was estimated at $12 billion
        • Stem cell dialogue: cost £300,000; the industry is valued at more than £500 million per year
        "If you think dialogue is expensive, try conflict" (Andrew Acland)
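The scale of these comparisons can be made concrete with a quick back-of-the-envelope calculation. This is only a sketch using the figures quoted on the slide; where the slide quotes a range, the midpoint used below is my own assumption for illustration.

```python
# Rough sketch: dialogue cost as a share of the wider programme budget.
# Figures are from the slide; the synthetic biology programme figure
# (£25.5m, midpoint of the quoted £18-33m range) is an assumption.
comparisons = {
    "Geoengineering": (155_000, 2_500_000),      # vs EPSRC/NERC sandpit allocation
    "Synthetic biology": (360_000, 25_500_000),  # vs UK research budget 2005-10 (midpoint)
}

for name, (dialogue_cost, programme_budget) in comparisons.items():
    share = dialogue_cost / programme_budget * 100
    print(f"{name}: {share:.1f}% of the programme budget")
```

On these figures the dialogues cost only a few percent of the budgets they fed into, which is the point the slide is making.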
    24. Assessing value of engagement
        • Costs are only part of the story, but an important part and one usually missed out
        • The overall balance of costs and benefits almost always depends on understanding longer-term impacts, especially on policy
        • Different audiences perceive value differently. We have found:
          • policy makers value robust evidence and advice from the public, raising public awareness, or testing ideas to manage risk
          • public participants value being listened to and having influence, or just the fun of it
          • experts and scientists value taking their work to new audiences, and learning new communication skills
    25. Final thoughts
        We have found:
        • Evidence of value is vital. Numbers are always powerful, but hard to pin down
        • Different audiences want different evidence and so need different messages
        • Collaboration and sharing experience helps development, but…
        • Can it continue in the current climate?
    26. Many thanks
        Diane Warburton, Evaluation Manager
        Sciencewise-ERC
    27. Facing the tricky questions: Evaluating long-term impacts
        Dr Laura Grant
    28. The problem(s)
        It's hard to track people who have been involved in activities
        Impacts might diminish over time, making them harder to measure
        Other factors have more time to intervene, making attribution (more) difficult
        Who will do, and fund, this work?
    29. Case study: Ingenious long-term tracking study
        Ingenious is the Royal Academy of Engineering's public engagement grant scheme, now in its fifth round
        Evaluation approach:
        • Used Theory of Change to think about what success looked like (including long term)
        • Supported grant holders in self-evaluation of short-term outcomes
        • Long-term tracking study to explore impacts after two years
    30. Desired impacts (long term)
        Changing practice (engineering):
        • PE is embedded in engineering practice and policy making
        • To create a community of engineers who are able to communicate their work to many groups and understand how their work impacts on society
        • For it to become standard practice to engage the public at an early enough stage to influence the development of technology
        • For more engineers to be willing and able to work with the media
        Changing practice (public engagement):
        • A proportionate amount of engineering in the PE landscape (and for it to be labelled as such)
        • For the 'development' funding model to become established
        • Promote meaningful longer-term evaluation of programmes, such as the tracking study
    31. The study
        Initial e-survey:
        • Round 1: n=21, 22% response rate
        • Round 2: n=40, 63% response rate
        Potential to follow up with interviews
        May combine data from Rounds 1 and 2
        Need to take into consideration the way the grant scheme has developed
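The respondent counts and response rates reported above can be turned back into approximate cohort sizes, which helps when judging how representative the rounds are. A minimal sketch; the figures come from the slide, but the back-calculation (and rounding to whole people) is my own:

```python
# Back-calculate the approximate number of people contacted in each survey
# round from the respondent count (n) and response rate reported on the slide.
rounds = {
    "Round 1": (21, 0.22),  # n=21 respondents, 22% response rate
    "Round 2": (40, 0.63),  # n=40 respondents, 63% response rate
}

for name, (respondents, rate) in rounds.items():
    contacted = round(respondents / rate)
    print(f"{name}: roughly {contacted} people contacted")
```

This suggests Round 2 reached a smaller pool but converted far more of it, consistent with the slide's note that the grant scheme (and contact-detail collection) developed between rounds.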
    32. What worked well
        • Thinking through 'what success looked like' for the programme alongside the evaluation approach from the start
        • The overall approach, which means the evaluator and grant holders meet at workshops where the tracking study can be discussed
        • People were interested enough to respond!
    33. Challenges
        • Collecting contact details
        • Difficulty in benchmarking long-term impacts
        • Very difficult to know if respondents could have achieved the impacts another way
        • Attribution or contribution?
    34. Final thoughts
        • Lay the foundations for long-term work by defining success and evaluating short-term outcomes.
        • Long-term evaluation can make a difference to programmes, though obviously this will not happen overnight.
        • Think about who the impacts will be on and whether it's possible for your evaluation to measure them.
        • As a community, we have research questions around long-term impact that evaluations of individual projects/programmes are not best placed to answer. Who faces these questions?
    35. Discussion
        In groups…
        • Say hello and introduce yourselves; nominate four roles:
          • Timekeeper
          • Reporter
          • Scribe
          • Facilitator
        Was there an impact on policy? Was the activity good value for money? What are the long-term impacts?
        Points for discussion:
        • Have you come across these tricky questions in your own work? Where?
        • How did you deal with them? What did you learn?
        • What refined or new tricky question would you like to see future evaluations address?
        Note your comments on the sheets provided. Please be clear, as we will be writing these up after the session.
        The reporter from each group will be asked to very briefly feed back one 'tricky question' at the end of the session.
    42. Next steps
        Please hand the notes from your table in at the front of the room.
        We will write up the notes and questions and share them on the British Science Association website.