Basics of Extension Evaluations

Presenter's note: Although the title indicates only measurement of program outcomes and impacts, I will also discuss needs assessment and process evaluation in this presentation because they are precursors to outcomes and impacts.

1. Measuring Program Outcomes and Impacts
   Ayanava Majumdar, Extension Entomologist, ACES
   Gulf Coast Research and Extension Center, 8300 State Hwy 104, Fairhope, AL 36532
   Email: [email_address] | Cell phone: 251-331-8416 | Fax: 251-990-8912

2. Objectives
   - Why measure outcomes and impacts?
   - Discuss the timing of evaluations (logic model)
   - Discuss 12 major evaluation techniques
   - How can you make evaluations useful?
   - Provide critical sources of information
   Note: This presentation contains information unique to the presenter and should be taken as suggestions. You may modify its content to suit your situation.

3. A few critical sources of information
   - Developing a Logic Model (Taylor-Powell & Renner, 2000, UWEX)
   - Logic Model Development Guide (W.K. Kellogg Foundation, 1998)
   - Kirkpatrick's Four Levels of Evaluation (1994), Encyclopedia of Education

4. Conventional measurements (old-school evaluations)
   - Participant reaction: usefulness of the program
   - Teaching & facilitation: suggestions for improvement
   - Outcomes: what did you learn today?
   - Future programming: what do you want to learn more about?
   These measures focus on outputs and immediate effects (learning), provide no information about action and conditions, and in effect amount to a push strategy (linear TOT). (Taylor-Powell & Renner, 2000)

5. Welcome to the Accountability Era!
   - What gets measured gets done!
   - If you don't measure results, you can't differentiate success from failure.
   - If you demonstrate results, you can win public support.
   (Osborne & Gaebler, 1992)

6. Accountability & evolution of concepts (Taylor-Powell & Renner, 2000)
   - Outputs: the activities, products, and participation generated through the investment of resources; goods and services delivered.
   - Outcomes: results or changes from the program, such as changes in knowledge, awareness, skills, attitudes, opinions, aspirations, behavior, practice, decision-making, policies, social action, condition, or status.
   - Impact: the social, economic, civic, and/or environmental consequences of the program. Impacts tend to be longer-term and so may be equated with goals.

7. Revisiting the LOGIC MODEL (Taylor-Powell & Renner, 2000)
   Remember:
   - It is not a theory, and it is not reality.
   - It is only a MODEL: a framework for visualizing relationships (see the sketch below).

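To make the framework concrete, here is a minimal Python sketch (not part of the presentation) that lays the logic model's stages out as an ordered mapping; every entry in it is a hypothetical placeholder, not the presenter's program.

```python
# A logic model represented as an ordered mapping of stages.
# All entries are hypothetical placeholders for illustration only.
logic_model = {
    "inputs":   ["specialist time", "travel funds", "survey materials"],
    "outputs":  ["3 field days held", "250 growers attended"],
    "outcomes": ["growers report new pest-scouting knowledge",
                 "growers adopt scouting practices"],
    "impact":   ["reduced crop losses across the region"],
}

# Reading the chain forward shows the assumed path from resources to change;
# planning it backward (impact >> inputs) helps allocate resources (slide 22).
for stage, items in logic_model.items():
    print(f"{stage}: {'; '.join(items)}")
```
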
8. Measuring success is complicated!
   - Measuring outcomes, by itself, will need resources!
   - Assumptions and external factors create variations in outcomes.

9. Types of evaluation
   Four basic types: needs assessment, process evaluation, outcome evaluation, and impact evaluation. First: needs assessment.
   When to conduct?
   - The first thing we should be doing!
   - Establishes priorities
   What questions to ask?
   - Characteristics of the audience
   - Needs of the audience (prioritize)
   - Where do they find information?
   - What is the best learning method?
   - Find barriers to knowledge adoption
   Tip: Don't forget your camera, writing instruments, and printed surveys, and give participants time to respond.

10. Types of evaluation (contd.): process evaluation
    When to conduct?
    - During program implementation
    - E.g., quality survey, satisfaction survey, future-needs survey
    What questions to ask?
    - Were you satisfied with the delivery methods?
    - Was there too much information?
    - Are you reaching the targeted audience?

11. Types of evaluation (contd.): outcome evaluation, part 1
    When to conduct?
    - Measure learning
    - During on-site programs: workshops, field days, etc.
    What questions to ask?
    - Short-term change: key words in questions are "awareness," "knowledge," "opinion," "motivation"
    - Document who is not benefiting (analyze the sample and understand biases)

12. Types of evaluation (contd.): outcome evaluation, part 2
    When to conduct?
    - Measure behavioral changes
    - During one-on-one visits, farm visits, or by telephone, mail, or email... repeat the surveys!
    What questions to ask?
    - Medium-term changes: key words in questions are "behavior," "practices," "decision," "action"
    - Are you meeting goals? Any unintended outcomes?

13. Types of evaluation (contd.): impact evaluation
    When to conduct?
    - You should have partially achieved this if you did the previous steps right.
    What questions to ask?
    - Long-term changes: change in "condition"
    - Separate real impact from "background noise"
    - Try to document final consequences: new products, innovations, services, community changes, motivation to act in the absence of the program

14. Determine some INDICATORS
    Indicators can be qualitative or quantitative.
    [Figure: indicators arranged on a scale from relatively EASY to relatively DIFFICULT to measure]

15. Evaluation techniques (Taylor-Powell, 2002)
    - Survey: collect standardized information; may be mailed, done on-site, or conducted as structured interviews (N, P). A sketch of summarizing survey responses follows this list.
    - Case study: in-depth examination of particular groups or individuals (O, I)
    - Interviews: face-to-face interaction, conversational, one-on-one or in small groups (P, I)
    - Observation: collecting information by seeing or listening; structured or unstructured (I)
    Key: N = needs assessment, P = process evaluation, O = outcome evaluation, I = impact evaluation.

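As an illustration of what "standardized information" makes possible, here is a minimal Python sketch (not from the presentation) that tallies responses to a single on-site survey question; the 5-point scale and the response data are hypothetical.

```python
from collections import Counter

# Hypothetical responses to one survey question on a 5-point scale,
# e.g., "This field day improved my knowledge of pest scouting."
SCALE = ["strongly disagree", "disagree", "neutral", "agree", "strongly agree"]
responses = ["agree", "agree", "strongly agree", "neutral", "agree",
             "strongly agree", "disagree", "agree", "strongly agree", "agree"]

counts = Counter(responses)
n = len(responses)

# Report each scale point as a count and a percentage of respondents.
for level in SCALE:
    count = counts.get(level, 0)
    print(f"{level:>17}: {count:2d} ({100 * count / n:.0f}%)")
```
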
16. Evaluation techniques (contd.) (Taylor-Powell, 2002)
    - Group assessment: use of nominal techniques such as focus groups, brainstorming, or a community forum (N)
    - Expert/peer review: examination by a review committee; Delphi method ("indicator", I)
    - Portfolio reviews: collection and presentation of materials and samples of work that indicate the breadth of the program ("indicator")
    - Testimonials: individual statements by people indicating personal reactions; household drop-off (O, I)
    (Abbreviations as on slide 15.)

17. Evaluation techniques (contd.) (Taylor-Powell, 2002)
    - Tests: assess knowledge, skills, or performance, e.g., pre-test & post-test (P, O); a worked example follows below
    - Photos, videos: group or one-on-one interviews (I)
    - Success or problem stories: narrative accounts by participants about adoption of new practices ("indicator", N, I)
    - Unobtrusive methods: gathering information without making participants aware, e.g., indirect measures, content analysis (N, P, O)
    (Abbreviations as on slide 15.)

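To show how pre-test & post-test scores can document short-term learning, here is a minimal, hypothetical Python sketch (not from the presentation); the paired scores and the percent-gain summary are illustrative assumptions.

```python
# Hypothetical paired quiz scores (0-10) for the same eight participants,
# collected before and after a workshop.
pre_scores  = [4, 5, 3, 6, 5, 4, 7, 5]
post_scores = [7, 8, 6, 8, 7, 6, 9, 8]

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]

mean_pre  = sum(pre_scores) / len(pre_scores)
mean_post = sum(post_scores) / len(post_scores)
mean_gain = sum(gains) / len(gains)

# Percent gain over the pre-test mean: one simple learning indicator.
print(f"Mean pre-test:  {mean_pre:.1f}")
print(f"Mean post-test: {mean_post:.1f}")
print(f"Mean gain:      {mean_gain:.1f} ({100 * mean_gain / mean_pre:.0f}% over pre-test)")
```
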
18. Four evaluation criteria (Boyd, 2002)
    Utility:
    - Goal: how useful is your program evaluation to you and your audience?
    - Know the following:
      - State the purpose clearly
      - Consider your audience
      - Communicate the findings and their relevance

19. Boyd's evaluation criteria (contd.) (Boyd, 2002)
    Feasibility:
    - Goal: how practical is your assessment technique?
    - Know the following:
      - Keep the evaluation practical and nondisruptive
      - Calculate the cost:benefit ratio (a small worked example follows)
      - Use appropriate evaluation technique(s)

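As a hypothetical illustration of the cost:benefit calculation, here is a short Python sketch; the cost categories and dollar figures are assumptions for the example, not numbers from the presentation.

```python
# Hypothetical costs of conducting the evaluation itself.
staff_time = 1200.00   # dollars
printing   = 300.00
postage    = 150.00
evaluation_cost = staff_time + printing + postage

# Hypothetical estimated benefit, e.g., program funding that the
# documented results helped secure.
estimated_benefit = 9000.00

ratio = estimated_benefit / evaluation_cost
print(f"Benefit:cost ratio = {ratio:.1f} : 1")  # 5.5 : 1 in this example
```
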
20. Boyd's evaluation criteria (contd.) (Boyd, 2002)
    Appropriateness:
    - Goal: how appropriate is your program evaluation for those involved?
    - Know the following:
      - Respect people and their rights
      - Use appropriate choice statements
      - Disclose findings properly

21. Boyd's evaluation criteria (contd.) (Boyd, 2002)
    Accuracy:
    - Goal: how accurate is your program evaluation for you and your audience?
    - Know the following:
      - Design repeatable surveys
      - Use appropriate analyses
      - Draw justifiable conclusions

22. Final tips on program evaluations
    - Consult a specialist in the planning phase
    - Think backwards through the LOGIC model (impact >> output >> input) and allocate resources
    - Think about "indicators" of success
    - If you conduct surveys, allow time for responses (don't rush)
    - Publicize your programs and use multiple channels >> create a "pull" system >> more success

23. Thank you for listening patiently!
    QUESTIONS FOR AYANAVA?
