Conference evaluation at a glance
Short presentation on conference evaluation presented to the Geneva Evaluation Network by Laetitia Lienart of IAS and Glenn O'Neil of Owl RE on 16 March 2011

Published in Education
Transcript

  • 1. GENEVA EVALUATION NETWORK WORKSHOP: CONFERENCE EVALUATION
    Organized by Laetitia Lienart & Glenn O'Neil, Geneva, 16 March 2011
  • 2. WHY IS IT IMPORTANT TO EVALUATE CONFERENCES?
    Evaluation is essential for:
    - Institutional memory
    - Accountability
    - Continuous learning/improvement
  • 3. WHAT ARE THE OBJECTIVES OF CONFERENCE EVALUATIONS?
    Assess:
    - Process: governance, programme, logistics & onsite support, information/communication
    - Immediate outcomes: reactions, learnings/benefits, applications
    - Impacts (lasting changes): at individual, organizational and country level
  • 4. RELEVANT DATA COLLECTION METHODS
    - Face-to-face or phone individual interviews (structured & semi-structured)
    - Focus group interviews
    - Online surveys
    - Printed surveys
    - Structured observations of key sessions and conference areas
    - Review of conference programme and online resources
    - Review of statistical data on conference registration, scholarship recipients, abstracts, etc.
    - Review of statistical data and evaluation findings from previous conferences to allow comparison over time
  • 5. RELEVANT DATA COLLECTION METHODS (cont.)
    - Use of rapporteurs to follow sessions addressing key topics; their feedback can also be used to measure some indicators (e.g. number of sessions presenting new findings)
    - Use of conference "instant" feedback systems
    - Use of network analysis and mapping
    - Analysis of the conference media coverage
    - Review of posts and comments left by delegates and non-attendees on the conference blog, Facebook page and Twitter
  • 6. FOCUS ON IMPACT ASSESSMENT
    Assessing conference impact(s) is feasible but needs to be planned and budgeted for at the planning stage (incl. in ToRs).
    Methods: follow-up survey (online/face-to-face), action plans.
    Example: AIDS 2008 follow-up survey (1.5 years after the conference):
    - 1,195 AIDS 2008 delegates completed the survey
    - About 2/3 had learnt something new and had changed some aspects of their work practice thanks to the new knowledge gained at the conference
    - Almost half reported that AIDS 2008 had directly influenced their organizations' HIV work
    - Almost 4 in 10 were aware of AIDS 2008's influence on HIV work, policies or advocacy in their countries
    - 75% had kept in contact with at least one person met at AIDS 2008, mainly to exchange knowledge, lessons learnt and/or suggested solutions (86%)
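The follow-up indicators above are simple proportions of survey respondents. A minimal sketch of how such responses could be tabulated, assuming a purely illustrative dataset and field names (not the actual AIDS 2008 survey data):

```python
# Hypothetical follow-up survey records: (respondent_id, learnt_something_new,
# changed_work_practice). The data below is illustrative only.
responses = [
    ("r1", True, True),
    ("r2", True, False),
    ("r3", False, False),
    ("r4", True, True),
]

total = len(responses)
learnt = sum(1 for _, learnt_new, _ in responses if learnt_new)
changed = sum(1 for _, _, changed_practice in responses if changed_practice)

# Report each indicator as a share of completed surveys.
print(f"Completed surveys: {total}")
print(f"Learnt something new: {learnt / total:.0%}")
print(f"Changed work practice: {changed / total:.0%}")
```

In a real follow-up survey the same counts would be taken per question over the full respondent file, and (as slide 8 notes) disaggregated by respondent group to make the results more accurate and useful.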
  • 7. USE OF EVALUATION FINDINGS
    - Evaluation findings should be "very usable", as conferences are often repeated annually or bi-annually
    - Importance of "buy-in" from conference organizers
    - Share the evaluation plan with conference organizers and committees/working groups
    - Evaluation reports: the quality of content and format is crucial to attract readers and convince them that evaluation results are reliable and usable
    - Disseminate evaluation results in a timely way, using a variety of channels depending on the target audience
    - Use follow-up mechanisms with conference organizers and relevant stakeholders
    - Review progress on evaluation findings in the lead-up to the next conference
  • 8. KEY LESSONS LEARNT
    1. Over-positive feedback (a new strategy to be tested in 2011)
    2. Evaluation reports are used more as an accountability & marketing tool than for learning purposes
    3. Unwillingness of conference organizers to devote adequate human & financial resources to evaluation
    4. Impact assessment remains a challenge (it is difficult to measure the extent to which changes are attributable to the conference, especially for policies, norms & guidelines)
    5. Data disaggregation is important to make evaluation results more accurate and useful
    6. Conference evaluation provides a unique opportunity to see how findings are integrated (or not) into future conferences
  • 9. FURTHER INFORMATION
    Proceedings (slides & handouts) of a one-day workshop on conference evaluation held in November 2010 are available on request (email Laetitia.Lienart@iasociety.org).
    Feel free to join the Conference Evaluation Google Group:
    http://groups.google.com/group/conference_evaluation
    Glenn's blog has more resources on conference evaluation; see the category "event evaluation":
    http://intelligentmeasurement.wordpress.com/category/event-evaluation/