Online evaluation for Local Government Information Unit


  • Evaluating Public Engagement TCI/Involve 2008

    1. Introduction
       • About me
       • About Involve
       • About evaluation... we will look at:
       • Why we might evaluate
       • How we might evaluate
       • What we might measure
    2. “Not everything that can be counted counts, and not everything that counts can be counted.”
       Photo: Albert Einstein’s desk at Princeton University
    3. “Evaluation of online participation ... is still just an evaluation of participation!”*
       Photo: Alice Casey’s desk at the Involve offices, London
       * Less catchy, but also true
    4. Back to basics... why evaluate?
       • Clarifying your objectives and purpose
       • Ensuring goals are achievable and measurable
       • Improving ongoing management
       • Improving public accountability and value data
       • Improving future work – learning!
    5. A choice of purpose
       • Evidence generator – collecting the evidence of success
       • Learning tool – planning our future work more effectively (very important for emerging online methods)
    6. It’s never too soon...
       There’s a role for the evaluator at all stages:
       • Scoping
       • Planning
       • Delivery
       • Reporting back
       ... so, begin with the end in mind.
    7. External or internal evaluation?
       Internal:
       • Develops in-house learning
       • Self-assessment
       • Less defensive reaction
       • Reduces the risks of innovation
       • Lower costs
       External:
       • An outside perspective
       • Perceived independence
       • Credibility
       • Therapeutic
       • Lets others focus on doing
    8. Evaluation of online participation. So, if “Not everything that can be counted counts, and not everything that counts can be counted”... then how on earth do we know what to count, and how to count it?
    9. We’re still learning... but here are some ideas...
    10. How do we measure?
       • Use online and offline methods
       • Online surveys – timing is important
       • Pop-ups vs. email
       • Consider incentivising
       • Ongoing panel of participants
       • Telephone... Skype?
       • Targeted publications
       • Peer interviews (include ‘people missed’)
       • Focus groups (include ‘people missed’)
    11. Involve your stakeholders...
       • Target participant groups provide insight
       • Ensure they understand why you are evaluating
       • Consult them about methodology
       • Consult them about indicators
       • Include them on the evaluation team
       • Invite them to be peer researchers
    12. A step further... participatory evaluation
       • What role should members of the public and/or service users play in evaluating projects?
       • What are the benefits and downsides to having them involved?
       • If you’re interested, read more here: Putting People Into Public Services, National Consumer Council (2008)
    13. What do we measure?
    14. Some useful definitions
       • Outputs: tangible products (reports, meetings, etc.)
       • Outcomes: overall direct results (increased skills, reduced crime, etc.)
       • Impact: longer-term results (may include unintended effects)
       • Quantitative: objective, measurable, predetermined response options
       • Qualitative: subjective, interpretative, open-ended response options
    15. Web stats
       (Tip: check out Google Analytics)
       1. Unique visitors to the site
       2. Length of time spent on a page
       3. Downloads of certain resources or materials
       [Slide graphic: a scale from ‘thin’ engagement to deeper engagement]
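If an analytics tool is not to hand, the ‘thin’ metrics above can also be pulled from raw server logs. A minimal Python sketch under stated assumptions: the log format here is a simplified `<ip> <path>` per line, and the `web_stats` function name and `download_exts` parameter are illustrative, not from the slides.

```python
def web_stats(log_lines, download_exts=(".pdf", ".doc")):
    """Count unique visitors (distinct IPs) and resource downloads
    from simplified access-log lines of the form "<ip> <path>"."""
    ips = set()
    downloads = 0
    for line in log_lines:
        ip, path = line.split()
        ips.add(ip)  # one entry per distinct visitor
        if path.endswith(download_exts):  # count downloads of resources
            downloads += 1
    return {"unique_visitors": len(ips), "downloads": downloads}


log = [
    "10.0.0.1 /index.html",
    "10.0.0.1 /report.pdf",
    "10.0.0.2 /index.html",
]
print(web_stats(log))  # {'unique_visitors': 2, 'downloads': 1}
```

Length of time spent on a page cannot be derived this simply from logs, which is one reason the slide points to Google Analytics for the fuller picture.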
    16. Interaction quality
       e.g. online forums, blogs, wikis.
       1. Number of comments
       2. Number of user-instigated threads or topics
       3. Length of comments
       4. Number and length of chains of comments (‘discussions’)
       5. Variety of participants, including representation of decision makers
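Once forum or blog comments are exported, counts 1–3 and 5 above are easy to automate. A hedged Python sketch, assuming each comment is a dict with hypothetical `author`, `text`, and `parent` keys (`parent` is `None` for a user-instigated thread); real platforms will expose different field names.

```python
def interaction_metrics(comments):
    """Summarise interaction quality for an exported comment list.
    Assumes each comment is a dict with hypothetical keys:
    'author', 'text', 'parent' (None = a user-instigated thread)."""
    total = len(comments)
    threads = sum(1 for c in comments if c["parent"] is None)
    avg_len = sum(len(c["text"]) for c in comments) / total if total else 0.0
    participants = {c["author"] for c in comments}
    return {
        "comments": total,                    # number of comments
        "user_threads": threads,              # user-instigated topics
        "avg_comment_length": avg_len,        # in characters
        "distinct_participants": len(participants),
    }


comments = [
    {"author": "alice", "text": "Great idea", "parent": None},
    {"author": "bob", "text": "I agree, but...", "parent": 1},
    {"author": "alice", "text": "Fair point", "parent": 2},
]
print(interaction_metrics(comments))
```

Chain length (count 4) would need the `parent` links followed recursively, and the ‘variety’ in count 5 still requires knowing which authors are decision makers, so these raw numbers are a starting point, not the whole measure.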
    17. Motivations and perceptions: understanding the experience
       • Did participants feel they had influence?
       • Did participants feel that they understood what they were being asked to contribute to, and why?
       • How did the moderator’s role seem to affect the engagement exercise?
    18. Outcomes and impact: what actually changed?
       • Did participants have any demonstrable influence on the process?
       • Did policy or decision making change as a result in the medium to long term?
       • How did the working practices of decision makers change, if at all? How did the participants’ attitudes or views change?
    19. Contact
       22-25 Finsbury Square, EC2A 1DX
       tel: +44 (0) 207 9206470
       email: [email_address]
       twitter: @cased
       Creative commons thanks for photos go to: plindberg, memotions