Evaluation: Planning, Implementation, Presenting Results

  1. Evaluation: A Very Practical Guide
     Shari Holland, President, Morningside Research and Consulting
     February 3, 2012
  2. Purpose
     •  To make evaluation accessible to your program/organization
     •  To help you plan for a successful evaluation
  3. Definition
     •  Did my program meet its goals?
     •  Did my program have an impact?
  4. Evaluation Spectrum
     •  Policy analysis/program review
     •  Descriptive statistics
     •  Predictive statistics
        –  Regression analysis
        –  Double-blind studies
        –  Randomized controlled studies
  5. Why Do An Evaluation?
     •  To get funding (needs assessment)
     •  Future planning
     •  Funders require it
     •  Advocacy
     •  Legitimacy/reputation
     •  Improve your organization – it’s good to know if resources are being used wisely
  6. Evaluation Planning
     •  Very important
     •  Logic model – inputs, outputs, outcomes (see the sketch below)
     •  Data collection tools
     •  Data collection timeline
     •  Timing of evaluation activities
     •  Better to think about an evaluation before implementation, not after the program has concluded
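A logic model does not need to be elaborate. Here is a minimal sketch of one written as a small Python structure, using a hypothetical walk-to-school program as the example; the three categories follow the slide, but every entry is an illustrative assumption, not something taken from the presentation.

    # Minimal logic-model sketch for a hypothetical walk-to-school program.
    # The inputs/outputs/outcomes categories follow the slide; the entries
    # themselves are made up for illustration.
    logic_model = {
        "inputs": ["staff time", "crossing guards", "outreach materials"],
        "outputs": ["walking events held", "students reached by outreach"],
        "outcomes": ["more students walking to school", "change in parent attitudes"],
    }

    for category, items in logic_model.items():
        print(f"{category}:")
        for item in items:
            print(f"  - {item}")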
  7. Evaluation Planning
     •  What impact will your program have?
     •  What data will you need to measure results?
     •  Where/how are you going to get the data?
     •  Evaluators can be a resource at any time, but help with planning is important
  8. Data Collection Timeline
     •  Baseline data is very important
     •  Mid-point, after implementation
     •  Conclusion of program
     •  Follow-up
  9. Data Collection Considerations
     •  Consistency, accuracy, and timeliness
     •  Objectivity
     •  Minimize demands on participants
     •  Ethical considerations
  10. Simple Evaluation
     •  Count and record on paper or computer (outputs) – see the counting sketch below
     •  Use a camera to record progress perceptions
     •  Get testimonials
     •  Simple, short survey
     •  Observations
     •  Each by itself is not an evaluation, but each contributes to the evaluation
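Counting and recording outputs can also be done in a few lines of code once the records live in a spreadsheet. A minimal sketch, assuming outputs are exported one per row to a CSV file; the file name and column name are hypothetical.

    import csv
    from collections import Counter

    # Tally outputs recorded one per row, e.g. one row per participant contact.
    # "activity_log.csv" and its "activity" column are hypothetical names.
    counts = Counter()
    with open("activity_log.csv", newline="") as f:
        for row in csv.DictReader(f):
            counts[row["activity"]] += 1

    for activity, n in counts.most_common():
        print(f"{activity}: {n}")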
  11. Video Examples
     •  Needs Assessment: Anderson Lane (4:06)
        http://www.youtube.com/watch?v=BTWlz95BQGg
     •  Outputs: Garden (4:08)
        http://www.youtube.com/watch?v=_mFN4CoW4B8&feature=player_embedded
     •  Testimonials: SRTS (Austin SRTS) (3:28)
        http://www.youtube.com/watch?v=3xgOuVvWFtw
     •  Outcome: SRTS (Netherlands biking) (4:45)
        http://www.youtube.com/watch?v=qZmpVy068bo&feature=player_embedded
  12. Needs Assessment: Anderson Lane
  13. Outputs: Garden
      http://www.youtube.com/watch?v=_mFN4CoW4B8&feature=player_embedded
  14. Outcomes: SRTS
  15. Outcomes: SRTS
  16. Short, Simple Survey
     •  Don’t be afraid to ask about your program
     •  Simple is good – e.g., 5-10 questions, one page
     •  Don’t ask (or JUST ask) if they like your program
     •  Ask how their behavior will change:
        –  Would you come back?
        –  How likely are you to tell your friends about us?
        –  Are you going to use what you learn?
  17. Survey Considerations
     •  Length
     •  Clarity
     •  Simplicity
     •  Questions – scaled vs. open-ended (see the tally sketch below)
     •  Scale
     •  Test your survey
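For scaled questions, the analysis can stay very simple: count each response option and report an average and a distribution. A minimal sketch, assuming answers are recorded as whole numbers on a 1-5 scale; the responses below are made up for illustration.

    from collections import Counter
    from statistics import mean

    # Hypothetical answers to one scaled question
    # (1 = very unlikely ... 5 = very likely).
    responses = [5, 4, 4, 3, 5, 2, 4, 5, 3, 4]

    print(f"Responses: {len(responses)}")
    print(f"Average rating: {mean(responses):.1f}")
    for rating, count in sorted(Counter(responses).items()):
        print(f"  {rating}: {count} ({count / len(responses):.0%})")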
  18. Surveys
     •  Anything more than a few questions and a few people becomes more complicated, and there are more considerations:
        –  Representative sample
        –  Bias
        –  Response rate (see the sketch below)
        –  Validity/reliability
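Response rate and a rough margin of error are quick to check once the surveys are back. A minimal sketch, assuming a simple random sample and using the conservative 50/50 assumption for the margin; the counts are illustrative.

    import math

    # Illustrative counts: 400 surveys distributed, 120 returned.
    surveys_sent = 400
    surveys_returned = 120

    response_rate = surveys_returned / surveys_sent
    print(f"Response rate: {response_rate:.0%}")

    # Rough 95% margin of error for a proportion, assuming a simple random
    # sample and the conservative p = 0.5.
    p = 0.5
    margin = 1.96 * math.sqrt(p * (1 - p) / surveys_returned)
    print(f"Approximate margin of error: +/- {margin:.1%}")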
  19. Survey Examples
  20. Interviews
     •  Anecdotes vs. testimonials vs. interviews
     •  Better to survey 100 people than interview 25
     •  Consistency
     •  Ask about positives AND negatives
  21. Presenting Results
     •  Show it works
     •  Use visuals – charts and graphs (see the chart sketch below)
     •  Show the number of people affected
     •  Show changes in attitude
     •  Know your audience
     •  Keep it as simple as possible
     •  Neat, spell-checked, etc.
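A before-and-after bar chart is often the clearest visual for a funder. A minimal sketch using matplotlib; the walking percentages are made-up numbers, not results from the SRTS evaluation.

    import matplotlib.pyplot as plt

    # Made-up figures for illustration: share of students walking to school
    # at baseline and at follow-up.
    labels = ["Baseline", "Follow-up"]
    pct_walking = [18, 27]

    fig, ax = plt.subplots()
    ax.bar(labels, pct_walking)
    ax.set_ylabel("Students walking to school (%)")
    ax.set_title("Change in walking rates")
    for i, value in enumerate(pct_walking):
        ax.text(i, value + 0.5, f"{value}%", ha="center")
    fig.tight_layout()
    fig.savefig("walking_rates.png")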
  22. External Evaluation
     •  How important is it to have an external evaluation?
     •  Objectivity and lack of conflict of interest
     •  Internal is better than nothing
     •  How to choose an external evaluator
     •  Evaluation software
  23. Cost of an Evaluation
     •  5-15% of program budget is a VERY rough guide – for example, roughly $5,000 to $15,000 for a $100,000 program
     •  Cost is really based on what you want to know and how difficult it is to collect the data
  24. Cost of an Evaluation
     •  How to keep costs down:
        –  Hire a graduate student in the field or get an intern
        –  Partner with a professor in the field
        –  Find an organization willing to mentor your organization
        –  Hire an evaluator for planning and conduct the evaluation internally
  25. SRTS Evaluation Results
     •  Walking did increase
     •  Geography of the school had the most impact
     •  Older students walk more
     •  Results by race/ethnicity were mixed
     •  Kids asking their parents is the most effective outreach
  26. SRTS Evaluation Results
     •  Evaluation Challenges
        –  Citation data
        –  Parent survey thrown out
        –  Lack of middle school data
        –  What to do about kindergartners?
  27. Your Examples
     •  What would you like to evaluate?
     •  What has been easy with your evaluation?
     •  What has been difficult?
     •  Questions?
  28. Session Evaluation
     •  What do I want to know?
     •  Why did I choose the questions I did?
  29. Contact Information
     Shari Holland, President
     Morningside Research and Consulting
     (512) 302 4416
     sholland@morningsideresearch.com
