Evaluating Online Participation Web 2.0 Engagement


Evaluating Online Participation : An introduction to evaluating an online participation process, web forum, user group, blog.

  • Evaluating Public Engagement TCI/Involve 2008

    1. Introduction
       • About me
       • About Involve
       • About evaluation...we will look at:
         • Why we might evaluate
         • How we might evaluate
         • What we might measure
    2. "Not everything that can be counted counts, and not everything that counts can be counted."
       Albert Einstein’s desk at Princeton University
    3. “Evaluation of online participation ... is still just an evaluation of participation!” *
       Alice Casey’s desk at the Involve offices, London
       * Less catchy, but also true
    4. Back to basics...why evaluate?
       • Clarifying your objectives and purpose
       • Ensuring goals are achievable and measurable
       • Improving ongoing management
       • Improving public accountability and value data
       • Improving future work – learning!
    5. A choice of purpose
       • Evidence generator – collecting the evidence of success
       • Learning tool – planning our future work more effectively
       Very important for emerging online methods.
    6. It’s never too soon... There’s a role for the evaluator at all stages:
       • Scoping
       • Planning
       • Delivery
       • Reporting back
       ...so, begin with the end in mind.
    7. External or internal evaluation?
       Internal:
       • Develops in-house learning
       • Self-assessment
       • Less defensive reaction
       • Reduces risks of innovation
       • Lower costs
       External:
       • An outside perspective
       • Perceived independence
       • Credibility
       • Therapeutic
       • Lets others focus on doing
    8. Evaluation of online participation
       So, if "Not everything that can be counted counts, and not everything that counts can be counted."...
       ...then how on earth do we know what to count and how to count it?
    9. We’re still learning... but here are some ideas...
    10. How do we measure? Use online and offline methods:
        • Online surveys – timing is important
        • Pop-ups vs. email
        • Consider incentivising
        • Ongoing panel of participants
        • Telephone... Skype?
        • Targeted publications
        • Peer interviews (include ‘people missed’)
        • Focus groups (include ‘people missed’)
    11. Involve your stakeholders... Target participant groups provide insight:
        • Ensure they understand why you are evaluating
        • Consult them about methodology
        • Consult them about indicators
        • Include them on the evaluation team
        • Invite them to be peer researchers
    12. A step further... Participatory Evaluation
        • What role should members of the public and/or service users play in evaluating projects?
        • What are the benefits and downsides to having them involved?
        • If you’re interested, read more here: Putting People Into Public Services, National Consumer Council (2008)
    13. What do we measure?
    14. Some useful definitions
        Term          Definition
        Outputs       Tangible products (reports, meetings etc.)
        Outcomes      Overall direct results (increased skills, reduced crime etc.)
        Impact        Longer-term results (may include unintended effects)
        Quantitative  Objective, measurable, predetermined response options
        Qualitative   Subjective, interpretative, open-ended response options
    15. Web stats (tip: check out Google Analytics)
        ‘Thin’ engagement
        1. Unique visitors to the site
        2. Length of time spent on a page
        3. Downloads of certain resources or materials
        Deeper engagement
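The three 'thin' engagement counts on this slide can be computed from any page-view log. A minimal sketch, assuming a hypothetical log format (the field names and records are illustrative, not Google Analytics output):

```python
# Sketch: computing 'thin' engagement metrics from a hypothetical page-view log.
from collections import defaultdict

# Each record: (visitor_id, page, seconds_on_page, is_download) -- assumed format
log = [
    ("v1", "/consultation", 120, False),
    ("v1", "/report.pdf", 5, True),
    ("v2", "/consultation", 45, False),
    ("v2", "/consultation", 80, False),
    ("v3", "/report.pdf", 3, True),
]

# 1. Unique visitors to the site
unique_visitors = len({visitor for visitor, _, _, _ in log})

# 2. Average length of time spent on each page
time_per_page = defaultdict(list)
for _, page, seconds, _ in log:
    time_per_page[page].append(seconds)
avg_time = {page: sum(t) / len(t) for page, t in time_per_page.items()}

# 3. Downloads of certain resources or materials
downloads = sum(1 for _, _, _, is_dl in log if is_dl)

print(unique_visitors)  # 3
print(downloads)        # 2
```

In practice Google Analytics reports these directly; the point is only that each metric is a simple count or average over visit records.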
    16. Interaction quality (e.g. online forums, blogs, wikis)
        4. Number of comments
        5. Number of user-instigated threads or topics
        6. Length of comments
        7. Number and length of chains of comments (‘discussions’)
        8. Variety of participants, including representation of decision makers
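These interaction-quality counts are equally mechanical once you have a forum export. A sketch under assumed data (the comment records and the `official_` naming convention for decision makers are hypothetical):

```python
# Sketch: interaction-quality metrics from a hypothetical forum export.
from collections import Counter

# Each comment: (thread_id, author, text, started_by_user) -- assumed format
comments = [
    ("t1", "alice", "Great idea, here is why...", True),
    ("t1", "bob", "I disagree because...", False),
    ("t1", "alice", "Fair point.", False),
    ("t2", "carol", "What about accessibility?", True),
    ("t2", "official_dm", "We will look into this.", False),
]

# 4. Number of comments
n_comments = len(comments)

# 5. Number of user-instigated threads or topics
n_user_threads = len({t for t, _, _, started in comments if started})

# 6. Length of comments (mean words per comment)
mean_len = sum(len(text.split()) for _, _, text, _ in comments) / n_comments

# 7. Number and length of chains of comments ('discussions')
chain_lengths = Counter(t for t, _, _, _ in comments)

# 8. Variety of participants, incl. representation of decision makers
authors = {a for _, a, _, _ in comments}
has_decision_maker = any(a.startswith("official_") for a in authors)

print(n_comments, n_user_threads)  # 5 2
```

The counts are easy; the evaluative work is deciding what thresholds count as healthy interaction, which the following slides address.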
    17. Motivations and perceptions: understanding the experience
        • Did participants feel they had influence?
        • Did participants feel that they understood what they were being asked to contribute to and why?
        • How did the moderator’s role seem to affect the engagement exercise?
    18. Outcomes and impact: what actually changed?
        • Did participants have any demonstrable influence on the process?
        • Did policy or decision making change as a result in the medium to long term?
        • How did the working practices of decision makers change, if at all? How did the participants’ attitudes or views change?
    19. www.involve.org.uk
        22-25 Finsbury Square, EC2A 1DX
        tel: +44 (0) 207 9206470
        email: alice@involve.org.uk
        twitter: @cased
        Creative commons thanks for photos go to: plindberg, memotions
