Evaluation training for Wellcome Trust, 15th May

  • Sophie: Image [Wriggly Rangoli Project – Manchester Science Festival: The project was a collaboration between Manchester Development Education Project, UoM researchers and Inspired Sisters (a group of Asian women and their children from Longsight) to raise awareness of parasitic infections and global poverty. A workshop with scientists and a group of Asian women informed them of the science and discussed their experiences. The science then inspired designs which were translated into large-scale public art (Rangoli) in Longsight and Manchester Museum.]
  • Introduction activity – when people arrive and register, ask them to take part in this activity
  • Suzanne: Introduce the evaluation plan as part of the project plan – i.e. you need to think about it at the start. Develop your evaluation plan alongside your event/activity plan. This will help you plan your project, as thinking about aims and objectives is clearly part of developing your project plan anyway. It does not have to be long, e.g. 1-2 sides of A4. It helps keep you focused and clear – what you want to know; how you will collect the data; what data you need to collect; etc.
  • Talk through the slide – details of all the things you need in an evaluation plan. Point out there is one in the participant handbook – and a copy will be available on the Ning site for download.
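The components of the plan can be sketched as a simple data structure. This is purely illustrative – the field names and the example values below are assumptions for this sketch, not the handbook's template:

```python
# A minimal sketch of an evaluation plan as a data structure.
# Field names and example values are illustrative, not taken from
# the participant handbook.
from dataclasses import dataclass

@dataclass
class EvaluationPlan:
    aim: str                       # what you want to achieve (big picture)
    objectives: list[str]          # what you need to do to achieve the aim
    questions: list[str]           # what you want to know
    methodology: str               # what strategy you will use
    data_collection: list[str]     # techniques for collecting evidence
    data_analysis: str             # how you will analyse the data
    reporting_audience: list[str]  # who will be reading the report

plan = EvaluationPlan(
    aim="Raise public awareness of corrosion science",
    objectives=["Run 4 table-top activities at the festival"],
    questions=["Did visitors' understanding of corrosion improve?"],
    methodology="Short pre/post questionnaires plus observation",
    data_collection=["exit questionnaire", "observation notes"],
    data_analysis="Compare pre/post responses; code open comments",
    reporting_audience=["team", "funders"],
)
print(plan.aim)
```

Even at this level of detail the point of the exercise holds: the plan fits on 1-2 sides of A4 and forces the aim, questions and data to be decided together.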
  • 5 mins
  • This is a very simplified version of a logic model. It assumes that there is a linear and singular relationship between the various stages. We will look at some more complex versions later. There is often some confusion between outputs and outcomes and I like the Treasury’s definitions that they use in their guide to Evaluating policy impact, although like many Civil Service definitions, they can be a little circular
  • 10 mins
  • Once you have an understanding of how what your project does links with what you are hoping it will change, it becomes easier to plan the evaluation. It also helps you see how there are different aspects that may be evaluated.
  • An example of a more complicated logic model on the control of Striga (a weed that infests maize crops in Africa) with multiple causal links
  • Each of these steps has a set of assumptions and it is often these assumptions that are the key to the evaluation that you want to do.
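One way to make those assumptions explicit is to attach one to each link in the chain. A hypothetical sketch (the stage names follow the job-training example used later in the deck; the assumptions themselves are made up for illustration):

```python
# Sketch: a linear logic model as an ordered list of stages, with the
# assumption behind each link written down explicitly. The assumptions
# here are illustrative, not from the HM Treasury example.
stages = [
    "Obtain placements and undertake training",
    "Improve qualifications and workplace skills of attendees",
    "Obtain interviews and job offers",
    "Increase in jobs and incomes",
]
assumptions = [
    "attendees complete the training they start",
    "the skills taught match what employers want",
    "better qualifications actually lead to more interviews",
]

# Each link (stage i -> stage i+1) rests on one assumption, and each
# assumption suggests an evaluation question worth asking.
for (before, after), why in zip(zip(stages, stages[1:]), assumptions):
    print(f"{before} -> {after}\n  assumes: {why}")
```

A chain of n stages has n-1 links, so writing it out this way immediately shows how many assumptions the project is quietly relying on.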
  • Take answers (5 mins)
  • Introduction activity – when people arrive and register, ask them to take part in this activity
  • This exercise is an opportunity for participants to explore what they already know about the key audiences for their evaluation. Each group is given one of the key audiences: you and your team; senior managers; funders; stakeholders (which could include participants in the activity). Feedback onto a grid – with key things people want to know from the evaluation, and four columns which we can tick to say which of the audiences each is relevant to.
  • Unless you know the purpose and audiences for your evaluation, it is impossible to come up with a good plan. For example, if you are not prepared to learn from the evaluation in order to inform your own practice, then you may want to reconsider whether to evaluate your activity at all. If the funder expects certain things from your evaluation, then you need to ensure your plan enables you to collect the relevant data to address the questions the funder has. Finally, an understanding of the limits of budget and resources means that when you put your funding proposal together, you don't overclaim what the evaluation can show you. This is particularly pertinent if you are claiming impacts that are measurable over a long time period and have no plan to evaluate after the end of the project. So how do you go about putting an evaluation plan together? This is a quick overview of what we cover in our day course, a beginners' guide to evaluation. We won't have much time to dwell on this, but it is important framing for the rest of the day.
  • See strategies from the case study doc.
  • Use the Corrosion Summer Ball to quickly group-brainstorm and then define outputs, outcomes and impact. Possible answers could include:
    Outputs = Number of people who took part. Type of people who took part, e.g. families. How the activity could be improved. Any unexpected outputs.
    Outcomes = Public knowledge of corrosion improved. People developed a better understanding of how corrosion affects their everyday lives. People enjoyed the experience. Young people were exposed to an area of science – corrosion. Any unexpected outcomes.
    Impact = Was there an impact? If so, what type of impact? Could the impact have been greater?
  • Using the quotes – which are good examples of impact? What else would you need to know? In groups, come up with the top challenges of measuring impact. These could include:
    - Difficulty of proving or measuring causality
    - Resources needed to properly measure long-term impacts (i.e. longitudinal studies are resource intensive)
    - Attrition
    - Variety of factors involved, many of which are nothing to do with you
    - Lots of things outside of your control
    - Many impacts are unexpected and therefore difficult to set up a system to measure
    - Difficulty of setting up a control group – to ensure differences in outcomes are not attributed to the intervention if they are in fact the result of changes in the 'state of the world'
    A useful tool for thinking about the methodology of an evaluation is the Kirkpatrick model. This model is helpful for thinking about how much evaluation to undertake for a particular initiative. There are four levels of potential impact of an initiative according to the Kirkpatrick model:
    a. Reaction – the initial response to participation (e.g. immediate feedback on the initiative, including things like enjoyment, facilities, best and worst aspects of the initiative)
    b. Learning – changes in people's understanding, or raising their awareness of an issue (this might include a comparison group to measure how things have changed as a result of the initiative, or use a baseline to establish changes). For instance, comparing a group of participating pupils with an otherwise similar set of non-participating pupils. This is further detailed in the RCUK 'Evaluation: Practical Guidelines' document mentioned above.
    c. Behaviour – whether people substantially modify what they do (a longer-term assessment of changes and measurement of the extent of any change in behaviour)
    d. Results – to track longer-term impacts of the initiative on measurable outcomes, e.g. exam results (longer-term, more complex analysis – it might be difficult to separate the effects of an initiative from other things which have an impact on the relevant results)
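As a rough illustration, the four Kirkpatrick levels can be used to tag candidate evaluation questions by the depth of impact they probe. The questions in this sketch are invented for the example, not drawn from the workshop materials:

```python
# Illustrative mapping of example evaluation questions onto the four
# Kirkpatrick levels. The questions themselves are made up.
KIRKPATRICK_LEVELS = ["Reaction", "Learning", "Behaviour", "Results"]

questions = {
    "Did participants enjoy the session?": "Reaction",
    "Can participants now explain what corrosion is?": "Learning",
    "Have teachers changed how they run practicals?": "Behaviour",
    "Did exam results improve over the following year?": "Results",
}

# Deeper levels are harder (and costlier) to measure, which is exactly
# the trade-off the model makes visible when deciding how much
# evaluation to undertake.
for question, level in questions.items():
    depth = KIRKPATRICK_LEVELS.index(level) + 1
    print(f"[level {depth}: {level}] {question}")
```

Tagging questions this way makes it easy to spot a plan that only ever measures Reaction, or one that promises Results it has no resources to track.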
  • Making use of evaluation – what are the key ways we make use of it
  • How can you use evaluation to improve your own practice? To maximise the benefits of your evaluation you need to critically reflect on your project. It is about learning from experience.
    Activity: Brainstorm – what makes up critical reflection? Questioning; seeking alternatives; keeping an open mind; comparing and contrasting; viewing from various perspectives; asking "what if...?"; asking for others' ideas and viewpoints; considering consequences; hypothesising; synthesising and testing; seeking, identifying, and resolving problems. [Source: Roth, R. A. "Preparing the Reflective Practitioner: Transforming the Apprentice through the Dialectic." Journal of Teacher Education 40, no. 2 (March-April 1989): 31-35.]
    A well-used reflection model is based on three questions: WHAT? (what happened?); SO WHAT? (what did you learn?); AND WHAT? (what will you do as a result of the experience?)
    When reflecting yourself, or facilitating others to reflect, remember to select a style that suits your audience and situation; be creative; cater for different learning styles; and give people time to think!
    Activity: Think of an activity/event or project that you have been involved in. Summarise what your activity was in one sentence (WHAT). Draw a picture that illustrates what you learnt (SO WHAT). State one thing you will do/have done as a result of the activity (AND WHAT). Ask people to share in groups or pairs.
    To summarise, the KEY QUESTIONS to ask are: what worked well? why? what did not work well? why not? what will I do the same next time? what will I do differently next time?
    Consider using different creative methods to facilitate reflection, e.g. drawing, images or objects; posing questions, interviewing each other, video; written diaries, logs, stories, journals; scrapbooks, graffiti walls.
  • In the final report it is important to consider strengths and weaknesses and lessons learned for the future. There is no point collecting data unless you are going to make use of it and share it with colleagues/stakeholders. Producing a written report of the evaluation process can be very useful – even if it is a short summary.
    WHAT DO YOU NEED TO THINK ABOUT WHEN WRITING YOUR REPORT? Ask participants to suggest things needed in a report.
    Audience – who will be reading your report?
    Structure – should be structured around the evaluation questions/objectives your evaluation set out to address and include: the context of the evaluation; aim, objectives and evaluation questions; description of the activity/event; methodology; summary of evidence (the data itself may form an appendix); overview of the activity/event; conclusions and recommendations.
    Layout – standard report including an exec summary; case study approach.
    Critical reflection – reflect on what you have learned from the experience. What changes will you make next time?
    Public – if possible remember to feed back findings to those involved, value their contribution and thank them.
    Next steps – make sure the findings are acted upon.
  • Think about your audiences (potential participants and audiences for your evaluation).
    Develop your evaluation plan at the beginning.
    Don't collect data you can't use.
    Beware of misrepresenting your data.
    Back up qualitative data with quantitative data.
    Don't hide mistakes – learn from them.
    Reflect on what you would do differently next time.
    Recognise the challenges of measuring impact – but don't let this put you off. Be realistic about what you can measure with your evaluation – and don't overpromise.
    Remember the value of using evaluation during the project, to create more effective activities and events.
    Share what you have learnt with others and use it to make a case for the future.
  • Resources – Ning site. Thank you for attending...

    1. Evaluation – Bruce Etherington, 15th May 2013
    2. This presentation is developed from a number of presentations originally created by the National Coordinating Centre for Public Engagement and the Beacons for Public Engagement through the HE STEM Programme. http://www.publicengagement.ac.uk/evaluating-stem-outreach As such, this presentation is released under the same Creative Commons licence of Attribution-ShareAlike 3.0 Unported: http://creativecommons.org/licenses/by-sa/3.0/
    3. Aims of the day:
       • To help develop a shared set of approaches to evaluating engagement across Wellcome Trust Centres
       • To develop the skills of participants to develop high quality evaluation strategies
       • To help participants to make strong cases for engagement with research and for the evaluation of this activity
    4. Timetable:
       9.00  Arrive (tea and coffee)
       9.15  Introductions
       9.20  Why evaluate?
       9.40  How do I know what I am evaluating?
       10.30 Break
       10.45 How do I know what I am evaluating? (cont)
       11.45 Who is the evaluation for?
       12.30 Lunch
       13.30 Making the case for engagement (and evaluation of engagement)
       15.00 End
    5. Introductions:
       • Who you are
       • What experience you have in evaluation
       • What you are hoping to get from the day
    6. WHY EVALUATE?
    7. 1. Why evaluate? – Beginner's Guide to Evaluation
    8. Why and Who of evaluation: Post up as many reasons as you can think of why we evaluate our activities on the WHY flipchart.
    9. HOW DO I KNOW WHAT I AM EVALUATING?
    10. Source: Ingenious evaluations: A guide for grant holders, The Royal Academy of Engineering. 2a: What are you aiming to do? – Beginner's Guide to Evaluation
    11. What goes in an evaluation plan? – Beginner's Guide to Evaluation
        1. Aim (what do you want to achieve? Big picture!)
        2. Objectives (what you need to do to achieve your aim)
        3. Evaluation questions (what do you want to know?)
        4. Methodology (what strategy will you use?)
        5. Data collection (what techniques will you use to collect your evidence?)
        6. Data analysis (how will you analyse your data?)
        7. Reporting (who will be reading your report?)
    12. Activity:
        • Pick an activity that you know well
        • Pair up with someone you do not know and explain your activity to each other
          – Why you do the activity
          – What you hope to achieve by doing the activity
    13. Basic Logic Model: Inputs → Outputs → Outcomes
    14. HM Treasury Definitions (p22):
        • Inputs – Public sector resources required to achieve the policy objective. Example: resources used to deliver the policy.
        • Activities – What is delivered on behalf of the public sector to the recipient. Example: provision of seminars, training events, consultations etc.
        • Outputs – What the recipient does with the resources, advice/training received, or intervention relevant to them. Example: the number of training courses completed.
        • Intermediate Outcomes – The intermediate outcomes of the policy produced by the recipient. Example: jobs created, turnover, reduced costs or training opportunities provided.
        • Impacts – Wider societal and economic outcomes. Example: the change in personal incomes and ultimately wellbeing.
    15. Your activity:
        • Pick a project you are familiar with and start to work out the steps of a logic model for it
        • Consider:
          – Inputs – resources used
          – Activities – what the project did/does
          – Outputs – what the participants did/do
          – Intermediate Outcomes – what changed in the participants
          – Impact – wider societal effects
    16. Logic Models & Evaluation: Understanding the theory of the change you are aiming for improves evaluation –
        • You can see what you need to evaluate
        • You can see what you do not need to evaluate
        • You can see the assumptions you may be making
    17. Job Training scheme example (HM Treasury, Magenta Book, p23): Pool of long-term unemployed who lack skills → Obtain placements and undertake training → Improve qualifications and workplace skills of attendees → Obtain interviews and job offers → Increase in jobs and incomes → Lower overall unemployment
    18. Job Training scheme example – What evaluation questions might you want to ask about this project?
        • Are we promoting it sufficiently to the target audience?
        • Are the training courses at the right level?
        • Are they improving the skills and qualifications of attendees?
        • Are they getting more interviews? If not, why not?
        • Etc.
    19. Other Templates (Nick Temple/School for Social Entrepreneurs): Activities → Immediate Effects → Medium-term outcomes → Long-term impact, with assumptions attached to each link.
    20. Other Templates – Logical Framework:
        1. Analysis of the project's Context
        2. Stakeholder Analysis
        3. Problem Analysis/Situation Analysis
        4. Objectives Analysis
        5. Plan of Activities
        6. Resource Planning
        7. Indicators/Measurements of Objectives
        8. Risk Analysis and Risk Management
        9. Analysis of the Assumptions
    21. Other Templates – Outcomes Mapping:
        • A participatory approach to evaluation
        • Looks to understand the contribution of a project to changes in the practice of stakeholders
        • Needs skilled facilitation and a budget
    22. Other Templates:
        • "So That" chains
        • UNDP template: identify the desired change; identify the agents of change; identify the assumptions; pathways to change; indicators of change
        • Theory U (www.presencing.com)
    23. References:
        • HM Treasury, The Magenta Book: Guidance for evaluation (2011)
        • Annie E. Casey Foundation, Theory of Change: A Practical Tool for Action, Results and Learning (2004) http://www.aecf.org/upload/publicationfiles/cc2977k440.pdf
        • www.outcomemapping.ca
        • www.theoryofchange.org
    24. WHO IS THE EVALUATION FOR?
    25. Why and Who of evaluation: Post up as many audiences as you can think of for our evaluation work on the WHO flipchart.
    26. What do these audiences want? In groups consider the following questions:
        • Why is this audience interested in your evaluation?
        • What are the top three things they would want to know?
        • What are the things you, as the organiser, would like them to know?
    27. 2. Making sure your evaluation is fit for purpose
    28. Activity: Look at the example evaluation strategies.
        • Is the approach suitable for all the potential audiences of evaluation?
        • What else could the organisers do to help improve their evaluation plan?
        • Read the feedback – do you agree/disagree with the suggestions?
    29. Outputs, outcomes and impact – Corrosion Summer Ball: A family activity during the Manchester Science Festival with 4 table-top interactive experiences related to corrosion science. The aims of the activity were to:
        • inspire the general public with an introduction to corrosion
        • communicate that corrosion is interesting and relevant to people's daily lives
        • provide an exciting and memorable learning experience
        • make universities more accessible to the general public
        What could be the possible outputs, outcomes and impact of this activity? http://www.publicengagement.ac.uk/how/case-studies/corrosion-summer-ball
    30. Challenges of measuring impact – What are the key challenges to measuring impact? [Kirkpatrick pyramid: Reaction, Learning, Behaviour, Results]
    31. MAKING THE CASE FOR ENGAGEMENT AND EVALUATION
    32. 3. Making an impact with your evaluation – How can you make use of evaluation?
        • Self reflection
        • Reports – case studies and other formats, e.g. presentations/video/audio etc.
        • Making a case for future funding/support
    33. • What worked well? Why?
        • What did not work well? Why not?
        • What will I do the same next time?
        • What will I do differently next time?
    34. Reports – What are the key things you need to include in a report?
    35. Other ways of reporting – case studies/video etc. What are the pros and cons of using case studies as a way of reporting on your evaluation?
    36. Making a case – things you can evidence:
        • History of evaluative practice informing the development of activities
        • Learning (self/team reflection)
        • Approach is informed by target audiences
        • Effective practice
        • Commitment to future evaluation to inform activity
    37. Beyond the report – What are the opportunities for sharing your evaluation with others?
        • On your website
        • With funders
        • With partners
        • With others, e.g. NCCPE; Collective Memory
    38. Top tips:
        • Think about your audience
        • Develop your evaluation plan at the beginning
        • Don't collect data you can't use
        • Beware of misrepresenting your data
        • Back up qualitative data with quantitative data
        • Don't hide mistakes – learn from them
        • Reflect on what you would do differently next time
        • Recognise the challenges of measuring impact
        • Be realistic about what you can measure
        • Remember the value of using evaluation during the project
        • Share what you have learnt
    39. Evaluation can always have an impact... if you let it
    40. Useful Resources:
        • NCCPE http://www.publicengagement.ac.uk/how/guides/evaluation/resources
        • Manchester Beacon Evaluation Guide http://www.manchesterbeacon.org/about/
        • UCL Evaluation Toolkit http://www.ucl.ac.uk/public-engagement/research/toolkits/Event_Evaluation
        • RCUK Evaluation Guide http://www.rcuk.ac.uk/documents/publications/evaluationguide.pdf
        • HE STEM http://www.hestem.ac.uk/evaluation
        • Inspiring Learning for All http://www.inspiringlearningforall.gov.uk/toolstemplates/
