1. Outcome-Based Measurement: From Theory to Implementation
Ghebray Consulting
647-823-5402
ghebrayconsulting@rogers.com
2. Workshop Objectives
1. Examine the process of developing & implementing outcome-based measurement
2. Reflect on the value of embedding Theory of Change in developing a program theory or logic model
3. Explore how the process of developing & implementing outcome-based measurement can be a vehicle for building evaluative culture
3. Agenda
1. Presentation: An overview of measurement & evaluation (15 minutes)
2. Discussion (10 minutes)
3. Presentation: Walking through the program planning cycle - embedding theory of change (15 minutes)
4. Break (10 minutes)
5. Small group discussion (50 minutes)
6. Presentation: Measurement plan (15 minutes)
7. Facilitated discussion: Building evaluative culture (25 minutes)
8. Final thoughts & reflections (10 minutes)
4. Measurement vs. Evaluation
5. Measurement
• Measurement is the process of systematically and regularly collecting data on program quality, outputs, and outcomes for program participants. It is not intended to establish causal relationships.
Evaluation
• Evaluation is a research method: an in-depth, rigorous effort to measure program impact that uses scientific research methods to compare outcomes with what would have happened in the absence of the strategies or intervention(s).
6. INPUTS → STRATEGIES → OUTCOMES
• Inputs: What was the quantity & quality of resources used to implement strategies? Which resources were most important for providing a "high quality" strategy?
• Strategies: What was the quantity & quality of the strategies provided? Which strategies were most important for achieving the desired outcomes?
• Outcomes: Did participants change? If so, how much and in what ways?
7. INPUTS → OUTPUTS → OUTCOMES → IMPACT
• Monitoring (inputs): What is the quantity & quality of the resources we have invested in this program? Are they aligned with program goals?
• Performance measurement (outputs): What is the quantity and quality of the program or services we have delivered?
• Outcome measurement (outcomes): What are the results for participants and other stakeholders?
• Impact study (impact): What are the impacts that can be attributed to the program or services?
8. Monitoring Questions
• How many clients are in the program?
• What are the socio-demographic characteristics of those in the program?
• Are program participants part of the intended or target population?
• How are program funds being used?
9. Performance Measurement Questions
• How well was the program marketed or promoted?
• Did the program offer an adequate, high-quality strategy or intervention?
• How do clients perceive the program?
• Are clients satisfied with the service and their encounters with service providers?
10. Outcome Measurement Questions
• Did clients attain the intended outcomes?
• What did clients learn?
Impact Study Questions
• Has clients' quality of life improved following the program experience?
• Is client participation in the program "responsible" for their improvement?
11. INPUTS → OUTPUTS → OUTCOMES → IMPACT
• Monitoring, performance measurement & outcome measurement: tracking investment and documenting progress; some evaluation experience & expertise required; low/reasonable cost; shorter time.
• Impact study: researching causality; high evaluation experience & expertise required; high cost; longer time.
12. Questions, Thoughts, Discussions…
13. Before Implementing Outcome-Based Measurement…
1. A thorough program description
• Program rationale, definition of the problem & feasibility of strategies or interventions
• Clarity of causal assumptions between resources, strategies or interventions, and expected outcomes, as well as of performance measures or indicators
2. Capacity & experience of program leaders to facilitate and act on findings & lessons learned
14. The program planning cycle, with "Engage Key Stakeholders" at its centre:
Community Asset & Needs Assessment → Design Program → Design M & E Plan → Develop Data Collection Tools → Implement Program → Collect Data → Data Analysis → Decision Making → Share Findings & Lessons Learned → (back to assessment)
15. Logic model components:
• Contextual analysis: the reason behind the need for the program
• Assumptions: the implicit theory or belief about why the program will work
• Inputs: resources dedicated to the program
• Strategies: what the program does with the inputs
• Outputs: direct products of program strategies
• Outcomes: benefits for clients during & after the program
16. Employment Program Example
• Contextual analysis: People in our community have few job skills, bad or no job histories, and limited jobs and few job training & job placement opportunities.
• Assumptions: The absence of a job history perpetuates unemployment; job search skills training helps with job readiness; job mentorship can enhance on-the-job learning; job placement helps establish a job history.
• Inputs: 2 F.T.E. staff; program & counseling space; 8 months of supervised job training & mentoring; program supplies & equipment.
• Strategies: Weekly soft skills classes; matching clients & mentors; job training & mentorship sessions.
• Outputs: # of classes & mentorship sessions offered; # & type of clients served.
• Outcomes: Clients learn soft skills & job search strategies; clients establish supportive professional relationships with mentors.
• Impact: Clients find jobs, remain employed & establish job histories.
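One way to make the logic model concrete is to capture it as a structured record. Below is a minimal sketch in Python, populated from the employment program example above; the class and field names are hypothetical illustrations, not part of the original material.

```python
from dataclasses import dataclass

@dataclass
class LogicModel:
    # The six components from slide 15; field names are illustrative.
    contextual_analysis: list[str]  # the reason behind the need for the program
    assumptions: list[str]          # why we believe the program will work
    inputs: list[str]               # resources dedicated to the program
    strategies: list[str]           # what the program does with the inputs
    outputs: list[str]              # direct products of program strategies
    outcomes: list[str]             # benefits for clients during & after the program

employment_program = LogicModel(
    contextual_analysis=[
        "People in our community have few job skills and bad or no job histories",
    ],
    assumptions=[
        "The absence of a job history perpetuates unemployment",
        "Job search skills training helps with job readiness",
        "Job placement helps establish a job history",
    ],
    inputs=["2 F.T.E. staff", "Program & counseling space",
            "8 months of supervised job training & mentoring"],
    strategies=["Weekly soft skills classes", "Match clients & mentors"],
    outputs=["# of classes & mentorship sessions offered",
             "# & type of clients served"],
    outcomes=["Clients learn soft skills & job search strategies",
              "Clients establish supportive relationships with mentors"],
)
```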
17. Small Group Discussion…
18. Using Program Theory to Plan and Implement Outcome-Based Measurement
19. What does "success" look like?
According to Mark Friedman (2005), three basic questions can help us assess program success:
1. How much did we do?
2. How well did we do it?
3. Is anyone better off?
20. Outcome Indicators (Is anyone better off?)
• Outcome indicators measure how much progress was made towards achieving the outcome. For the outcome "Newcomers are job-ready to compete in the Canadian job market," outcome indicators include:
• Rate of successful training & job placement completion (e.g., ability to prepare a resume, positive feedback on mock interviews & job placement)
• Maintaining a professional relationship with mentors 6 months after the end of the program
21. Process Indicators (How well did we do it?)
• Process indicators measure the ways in which program services are delivered. For "Provide high quality service," process indicators include:
• Qualifications & experience of mentors
• Suitability of client/mentor matches & job placements
• Rate of client satisfaction
22. Output Indicators (How much did we do?)
• Output indicators measure the quantity or volume of services produced. For "Provide adequate support to the targeted population," output indicators include:
• # & types of services offered (e.g., learning sessions, employment counseling, mentorship)
• # and characteristics of clients served
• # of mentors recruited & matched
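Most of the indicators on slides 20 through 22 reduce to a count or a rate, i.e. a numerator over a base population. A minimal sketch of that calculation, with a hypothetical function name and made-up example figures:

```python
def indicator_rate(count: int, base: int) -> float:
    """Express an indicator as a percentage of its base population."""
    if base <= 0:
        raise ValueError("indicator base must be positive")
    return 100.0 * count / base

# Hypothetical readings for the employment program example:
print(indicator_rate(30, 40))   # output: 75.0  - e.g. 30 of 40 mentors matched
print(indicator_rate(52, 65))   # process: 80.0 - e.g. 52 of 65 clients satisfied
print(indicator_rate(18, 60))   # outcome: 30.0 - e.g. 18 of 60 clients job-ready
```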
23. Measuring Program "Success" - # (% of outreach target)
1. Meet outreach target: 150 (100%)
2. Attendance during orientation, intake & level of registration in program (# & type of clients served): 75 (50%)
3. High attendance, participation & rate of satisfaction: 65 (43%)
4. Knowledge of job search strategies, Canadian labor market & workplace culture: 50 (33%)
5. Successful completion of program (including job placement): 45 (30%)
6. # employed in a related profession (at $17.00/hour): 26 (17%)
7. Favorable job assessment (tracked quarterly): 20 (13%)
8. Job retention (for at least one year): 15 (10%)
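Each percentage above is simply that stage's count taken as a share of the 150-person outreach target, rounded to a whole percent. A quick check, sketched in Python:

```python
outreach_target = 150  # stage 1 count: the base for every percentage
stage_counts = [150, 75, 65, 50, 45, 26, 20, 15]

for i, count in enumerate(stage_counts, start=1):
    print(f"Stage {i}: {count} clients -> {count / outreach_target:.0%}")
# Prints 100%, 50%, 43%, 33%, 30%, 17%, 13%, 10% - matching the slide.
```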
24. The Measurement & Evaluation Plan
Plan columns: Program Component to be Measured | Indicators | Data Sources | Data Collection Methods | Who Collects Data | When to Collect Data | Who Needs the Findings & Lessons Learned
Program components (rows), one per Friedman question:
• How much did we do? Scope & reach
• How well did we do it? Program or service quality
• Is anyone better off? Outcomes for participants
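As a sketch of how the plan's rows could be kept in machine-readable form: the keys below mirror the plan's seven columns, while every concrete value (sources, methods, schedules, audiences) is a hypothetical illustration for the employment program example, not prescribed by the original.

```python
# One dict per plan row; keys mirror the plan's seven columns.
me_plan = [
    {
        "component": "Scope & reach (How much did we do?)",
        "indicators": ["# & type of clients served"],
        "data_sources": ["intake forms"],                  # hypothetical
        "collection_methods": ["registration database"],   # hypothetical
        "who_collects": "program staff",
        "when": "at intake",
        "who_needs_findings": ["program manager", "funder"],
    },
    {
        "component": "Program or service quality (How well did we do it?)",
        "indicators": ["rate of client satisfaction"],
        "data_sources": ["client surveys"],                # hypothetical
        "collection_methods": ["quarterly survey"],        # hypothetical
        "who_collects": "evaluation lead",
        "when": "quarterly",
        "who_needs_findings": ["board", "staff"],
    },
    {
        "component": "Outcomes for participants (Is anyone better off?)",
        "indicators": ["rate of successful training & job placement completion"],
        "data_sources": ["placement records"],             # hypothetical
        "collection_methods": ["6-month follow-up calls"], # hypothetical
        "who_collects": "program staff",
        "when": "at exit and 6 months after program end",
        "who_needs_findings": ["funder", "community partners"],
    },
]
```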
25. Thoughts, Reflections, Questions…
26. Building Evaluative Culture
1. How do you create a shared understanding of measurement and evaluation? How do you make measurement & evaluation everyone's business?
2. What do measurement & evaluation mean to your board, staff, etc.?
3. How do you balance learning and accountability? How do you manage the conversation about measurement & evaluation with funders and staff? How do you negotiate what can and cannot be delivered?
27. References
• Crawford, P., & Bryce, P. (2002). Project monitoring and evaluation: A method for enhancing the efficiency and effectiveness of aid project implementation.
• Friedman, M. (2005). Trying hard is not good enough: How to produce measurable improvements for customers and communities.
• Penna, R., & Phillips, W. (2004). Outcome frameworks: An overview for practitioners.
• Smith, M. (2010). Handbook of program evaluation for social work and health professionals.
• www.tccgrp.com