Research Short Course: Perspectives on Evaluation

Short presentation by Phyllis Fleming, PhD, Evaluation Director of the Center for Health Promotion and Disease Prevention at the University of North Carolina, given during the short course titled "Institutional Research".

Published in: Education, Technology
Transcript

  • 1. Perspective on Evaluation
    - We don’t have the evaluation recipe.
    - There is no one right answer – life is ambiguous.
    - Evaluation is a paradigm of choices.
    - Stakeholder involvement is helpful for acceptance and usefulness.
    - Things will not always go according to plan.
    - Four primary characteristics of good evaluation: utility, feasibility, propriety, accuracy.
  • 2. The National Farm to School Network Evaluation: Questions, Perspectives, Lessons Learned
    Phyllis Fleming, PhD, Evaluation Director, Center for Health Promotion and Disease Prevention, University of North Carolina
    Taking Root: National Farm to Cafeteria Conference, May 17, 2010
  • 3. Evaluation Questions
    - Effectiveness of the National Farm to School Network related to five priorities:
      - Marketing & media
      - Training and technical assistance
      - Networking
      - Information development and dissemination
      - Policy
    - Impact of four local programs on consumption of fruits and vegetables, children’s attitudes, school food service operations, farmers, and the community
  • 4. Multi-Method Design
    - Logic models
    - Methods
      - Q1 – Web-based template administered quarterly
      - Q2
        - School Lunch Recall (paper/pencil) – 2 sites
        - Fruit/Vegetable Neophobia Scale (paper/pencil) – 2 sites
        - Knowledge Scale – 1 site
        - Interviews with farmers, school food service personnel, teachers, school district leadership, grassroots organizers, and farm to school advocates – 4 sites
        - School food service data – 1 site
  • 5. Lessons Learned
    - Evaluation – a necessary evil?
      - Help program people understand their role and yours.
    - Written approval to do evaluations (IRB).
    - Formal commitment from those who will provide data.
    - Working your way into the school day.
      - Be clear and specific up front about what you need.
    - Cooperation and response rates.
    - Networking.
  • 6. Lessons Learned
    - Validation of tools.
    - Minimizing the data collection burden.
    - When you learn things you wish you had not.
    - Nice to know or need to know.
      - Have plans to use all of the data that you collect.
    - When things don’t go as planned.
      - Be flexible.
    - As evaluator, you may not be the most popular person in the room.
      - Be clear about what your role is.
