Evaluation introduction


  1. What is ‘Evaluation’? Nathan Loynes
  2. In this presentation:
     1. Definitions and disagreements about evaluation.
     2. Logic Models.
     3. Outcomes, Indicators and Targets.
     4. Measuring Outcomes.
     5. Mark Friedman: Outcomes Based Accountability.
  3. Working Definition of Programme Evaluation: The practice of evaluation involves thoughtful, systematic collection and analysis of information about the activities, characteristics, and outcomes of programmes, for use by specific people, to reduce uncertainties, improve effectiveness, and make decisions.
  4. Scott & Morrison (2005): Evaluation focuses on:
     • Value and worth
     • Education or social programmes
     • Activities, characteristics and outcomes
     • Policy implications (What should happen next?)
  5. Pawson & Tilley, 1997 (in Scott & Morrison): Realistic Evaluation
     1. Take into account the ‘institutional’ nature of programmes.
     2. Evaluation should be scientific.
     3. Evaluation should not be self-serving.
  6. Chen (1996) (in Scott and Morrison, 2005): four types of evaluation:
     1. Process-improvement
     2. Process-assessment
     3. Outcome-improvement
     4. Outcome-assessment
  7. Working Definition of Programme Evaluation: The practice of evaluation involves thoughtful, systematic collection and analysis of information about the activities, characteristics, and outcomes of programmes, for use by specific people, to reduce uncertainties, improve effectiveness, and make decisions.
  8. Evaluation Strategy Clarification: All evaluations are:
     • Partly social
     • Partly political
     • Partly technical
     Both qualitative and quantitative data can be collected and used, and both are valuable. There are multiple ways to address most evaluation needs; different evaluation needs call for different designs, types of data and data collection strategies.
  9. Purposes of Evaluation: Evaluations are conducted to:
     • Render judgment
     • Facilitate improvements
     • Generate knowledge
     Evaluation purpose must be specified at the earliest stages of evaluation planning, with input from multiple stakeholders.
  10. What are Logic Models?
  11. To construct a logic model you must describe:
      Inputs: resources, money, staff/time, facilities, etc.
      Outputs: how a programme uses inputs to fulfil its mission – the specific strategies and service delivery.
      Outcomes: changes to individuals or populations during or after participation.
      (Diagram: Inputs → Outputs → Outcomes)
  12. Here is an illustration that will help you create your own logic model.
      Contextual analysis: identify the major conditions and the reasons why you are doing the work in your community.
      Inputs: resources dedicated to or consumed by the programme, e.g. money; staff and staff time; volunteers and volunteer time; facilities; equipment and supplies.
      Outputs: what the programme does with the inputs to fulfil its mission, e.g. provide x number of classes to x participants; provide weekly counselling sessions; educate the public about signs of child abuse by distributing educational materials to all agencies that serve families; identify 20 mentors to work with youth and opportunities for them to meet monthly for one year.
      Outcomes: benefits for participants during and after programme activities, e.g. new knowledge; increased skills; changed attitudes; modified behaviour; improved condition; altered status.
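The inputs → outputs → outcomes chain above can be sketched as a simple data structure. This is a hypothetical illustration, not part of the original slides; the class and field names are assumptions, and the example values are taken from the slide's mentoring illustration.

```python
from dataclasses import dataclass, field


@dataclass
class LogicModel:
    """Minimal sketch of a programme logic model (hypothetical names)."""
    inputs: list = field(default_factory=list)    # resources dedicated to or consumed by the programme
    outputs: list = field(default_factory=list)   # what the programme does with the inputs
    outcomes: list = field(default_factory=list)  # benefits for participants during/after activities


# Example drawn from the slide's mentoring illustration
mentoring = LogicModel(
    inputs=["20 volunteer mentors", "staff time", "meeting facilities"],
    outputs=["monthly mentor/youth meetings for one year"],
    outcomes=["increased skills", "changed attitudes"],
)
print(mentoring.outcomes)  # ['increased skills', 'changed attitudes']
```

Keeping the three lists separate makes the later distinction between outputs (activities delivered) and outcomes (changes in participants) explicit.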
  13. Outcomes, Indicators, Targets
  14. What is the difference between outcomes, indicators, and targets?
      Outcomes are changes in behaviour, skills, knowledge, attitudes, condition or status. Outcomes are related to the core business of the programme; they are realistic and attainable, within the programme’s sphere of influence, and appropriate. Outcomes are what a programme is held accountable for.
  15. What is the difference between outcomes, indicators, and targets?
      Indicators are specific characteristics or changes that represent achievement of an outcome. Indicators are directly related to the outcome and help define it. Indicators are measurable and observable (they can be seen, heard or read) and make sense in relation to the outcome whose achievement they signal.
  16. What is the difference between outcomes, indicators, and targets?
      Targets specify the amount or level of outcome attainment that is expected, hoped for or required.
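Taken together, the three terms nest naturally: an outcome is defined by its indicators, and each indicator carries a target level. The sketch below is an assumption-laden illustration of that relationship, not something from the slides; all names and numbers are invented.

```python
from dataclasses import dataclass


@dataclass
class Indicator:
    """An observable characteristic that signals achievement of an outcome."""
    description: str
    target: float    # level of attainment expected, hoped for or required
    measured: float  # value actually observed

    def met(self) -> bool:
        # hypothetical convention: higher measured values mean attainment
        return self.measured >= self.target


@dataclass
class Outcome:
    """A change in behaviour, skills, knowledge, attitudes, condition or status."""
    description: str
    indicators: list

    def achieved(self) -> bool:
        # an outcome counts as achieved when every indicator meets its target
        return all(ind.met() for ind in self.indicators)


skills = Outcome(
    "Participants gain increased job skills",
    [Indicator("skills-assessment score (0-100)", target=70, measured=78)],
)
print(skills.achieved())  # True
```

The nesting mirrors the slides' definitions: the indicator "helps define" the outcome, while the target attaches a required level to the indicator.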
  17. Why measure outcomes?
      • To see if your programme is really making a difference in the lives of your clients.
      • To confirm that your programme is on the right track.
      • To be able to communicate to others what you’re doing and how it’s making a difference.
      • To get information that will help you improve your programme.
  18. Use Caution When Identifying Outcomes: there is no right number of outcomes. Be sure to think about when to expect outcomes:
      1. Initial outcomes: the first benefits/changes participants experience.
      2. Intermediate outcomes: link initial outcomes to longer-term outcomes.
      3. Longer-term outcomes: the ultimate outcomes desired for programme participants.
  19. How do you identify indicators? Indicators are specific characteristics or changes that represent achievement of an outcome. Indicators are directly related to the outcome and help define it. Indicators are measurable and observable (they can be seen, heard or read) and make sense in relation to the outcome whose achievement they signal. Ask the questions shown on the following slide.
  20. Questions to Ask When Identifying Indicators:
      1. What does this outcome look like when it occurs?
      2. What would tell us it has happened?
      3. What could we count, measure or weigh?
      4. Can you observe it?
      5. Does it tell you whether the outcome has been achieved?
  21. The BIG question is: what evidence do we need to see to be convinced that things are changing or improving?
      The “I’ll know it (outcome) when I see it (indicator)” rule in action, with an example: I’ll know that retention has increased among home health aides involved in a career ladder programme when I see a reduction in the employee turnover rate among aides involved in the programme, and when I see survey results that indicate that aides are experiencing increased job satisfaction.
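The “I’ll know it when I see it” rule above amounts to comparing measured indicator values against targets, with the direction of improvement made explicit: turnover should fall, satisfaction should rise. The sketch below is a hypothetical illustration of that comparison; the function name, targets and measured values are all invented.

```python
def indicator_met(measured: float, target: float, lower_is_better: bool = False) -> bool:
    """True when the measured value reaches the target in the desired direction."""
    return measured <= target if lower_is_better else measured >= target


# Outcome: increased retention among home health aides in the career ladder programme.
# Indicator 1: employee turnover rate among aides (a reduction signals the outcome).
turnover_ok = indicator_met(measured=0.12, target=0.15, lower_is_better=True)
# Indicator 2: job-satisfaction survey score (an increase signals the outcome).
satisfaction_ok = indicator_met(measured=7.8, target=7.0)

# The outcome counts as achieved when both indicators meet their targets.
print(turnover_ok and satisfaction_ok)  # True
```

Making the direction explicit matters: a turnover rate below target is good news, while a satisfaction score below target is not, and conflating the two is an easy mistake when aggregating indicators.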
  22. Mark Friedman (2005): Outcomes Based Accountability
      • Frustrated by social programmes: ‘all talk; no action’.
      • The need for a common language.
      • The need for accurate data.
      • The need for baselines.
      • Differentiate between inputs, outcomes and outputs.
  23. Summary
      • Evaluation is a systematic process.
      • Evaluation considers inputs, outputs, and outcomes.
      • Evaluation involves making qualitative and quantitative judgements.
      • Effective evaluation requires that you are clear about what it is that you are measuring/judging.
