Program evaluation

A presentation on the different types of program evaluation

Program Evaluation
Alex, Ann, Mark, Lisa, Nick, Ethan

Evaluation Video
http://vimeo.com/6765206

Evaluation
Alternative Views of Evaluation:
• Diverse conceptions: definition? purpose?
• Philosophical & Ideological Differences
• Objectivism & Subjectivism
• Utilitarian & Intuitionist-Pluralist Evaluation

Evaluation Methodologies
• Methodological Preferences
• Qualitative & Quantitative Methods

Metaphors for Evaluation
• Investigative journalism
• Photography
• Literary criticism
• Industrial production
• Sports

Objective-Oriented Evaluation
• Tyler, Metfessel & Michael, Provus, Hammond
• Use: achievement of objectives determines success or failure
• Taxonomy of objectives and measurement instruments
• Assessments: objectives-/criterion-referenced testing
• NAEP, Accountability Systems, NCLB
• Strengths & Limitations: simplicity, at the risk of oversimplification
• Goal-Free Evaluation

Management-Oriented Evaluation
• Context, Input, Process, Product Model (CIPP)
• UCLA Model
• Strengths & Limitations:
  - Rational & orderly
  - Informs high-level decision makers
  - Costly & complex
  - Stability vs. need for adjustment

Consumer-Oriented Approach
• Typically a summative evaluation approach
• Advocates consumer education and independent reviews of products
• Scriven's contributions grew out of the groundswell of federally funded educational programs in the 1960s
• Differentiation between formative and summative evaluation

What is a consumer-oriented evaluation approach?
• Independent agencies, governmental agencies, and individuals compile information on education or other human services products for the consumer
• Goal: to help consumers become more knowledgeable about products

For what purposes is it applied?
• Typically applied to educational products and programs
• Governmental agencies
• Independent consumer groups
• Educational Products Information Exchange
• To represent the voice and concerns of consumers

How is it generally applied?
• Creating and using stringent checklists and criteria
• Michael Scriven
• Educational Products Information Exchange
• U.S. Dept of Education
• Program Effectiveness Panel
  - Processes
  - Content
  - Transportability
  - Effectiveness

Consumer Oriented Checklist
• Need
• Market
• Performance
• True field trials [tests in a "real" setting]
• True consumer tests [tests with real users]
• Critical comparisons [comparative data]
• Long term [effects over the long term]
• Side effects [unintended outcomes]
• Process [product use fits its descriptions]
• Causation [experimental study]
• Statistical significance [supports product effectiveness; see the sketch below]
• Educational significance

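The last two checklist items can be made concrete with a small worked example. The sketch below is illustrative and not from the slides: it assumes hypothetical post-test scores from a field trial with a product group and a comparison group, and uses a two-sample t-test, one standard way to check whether an observed difference is statistically significant.

# Illustrative sketch (invented data): checking the "statistical
# significance" item on the consumer-oriented checklist.
from scipy import stats

# Hypothetical post-test scores from a true field trial:
product_group = [78, 85, 81, 90, 74, 88, 83, 79, 86, 82]     # used the product
comparison_group = [72, 80, 75, 83, 70, 78, 76, 74, 79, 77]  # did not

t_stat, p_value = stats.ttest_ind(product_group, comparison_group)
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")

# A small p-value supports product effectiveness statistically; the checklist
# still asks separately about educational significance, i.e., whether the
# difference is large enough to matter in practice.
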
Strengths…
• Has made evaluations of products and programs available to consumers who may not have had the time or resources to do the evaluation themselves
• Increases consumers' knowledge about using criteria and standards to objectively and effectively evaluate educational and human services products
• Consumers have become more aware of market strategies

…and Weaknesses
• Increases product costs for the consumer
• Product testing involves time and money, typically passed on to the consumer
• Stringent criteria and standards may curb creativity in product creation
• Concern about growing dependency on outside products and consumer services rather than on local initiative development

Consumer Oriented Promotions
• Goals: encourage repurchases by rewarding current users, boost sales of complementary products, and increase impulse purchases
• Coupons: the most widely used form of sales promotion
• Refunds or rebates: help packaged-goods companies increase purchase rates, promote multiple purchases, and reward product users

More Consumer Oriented Promotions
• Samples, bonus packs, and premiums: a "try it, you'll like it" approach
• Contests: require entrants to complete a task such as solving a puzzle or answering questions in a trivia quiz
• Sweepstakes: choose winners by chance; no product purchase is necessary
• Specialty advertising: a sales promotion technique that places the advertiser's name, address, and advertising message on useful articles that are then distributed to target consumers

Consumer Oriented Type Questions
• What educational products do you use?
• How are purchasing decisions made?
• What criteria seem to be most important in the selection process?
• What other criteria for selection does this approach suggest to you?

Expertise Oriented Evaluation
Expertise-oriented approaches rely on the direct application of professional expertise to judge the quality of educational endeavors, especially the resources and the processes.

Approaches
• Formal Professional Review System
• Informal Professional Review System
• Ad Hoc Panel Review
• Ad Hoc Individual Review
• Educational Connoisseurship and Criticism

Formal Professional Review System
• Structure or organization established to conduct periodic reviews of educational endeavors
• Published standards
• Pre-specified schedule
• Opinions of several experts
• Impact on status of that which is reviewed

Informal Professional Review System
• State review of district funding programs
• Review of professors for determining rank advancement or tenure status
• Graduate student's supervisory committee
• Existing structure, no standards, infrequent schedule, experts, status usually affected

Other Approaches
• Ad Hoc Panel Reviews (journal reviews)
  - Funding agency review panels
  - Blue-ribbon panels
  - Multiple opinions, status sometimes affected
• Ad Hoc Individual Reviews (consultant)
  - Status sometimes affected
• Educational Connoisseurship and Criticism
  - Theater, art, and literary critics

Uses
• Institutional accreditation
• Specialized or program accreditation
• Doctoral exams, board reviews, accreditation, reappointment/tenure reviews, etc.

Strengths and Weaknesses
• Strengths: those well-versed in the field make the decisions; standards are set; self-study encourages improvement
• Weaknesses: whose standards? (personal bias); expertise credentials; can this approach be used with issues of classroom life, texts, and other evaluation objects, or only with the bigger institutional questions?

Expertise Type Questions
• What outsiders review your program or organization?
• How expert are they in your program's context, process, and outcomes?
• What are the characteristics of the most/least helpful reviewers?

Participant Oriented Evaluation
• Heretofore, the human element was missing from program evaluation
• This approach involves all relevant interests in the evaluation
• This approach encourages support for representation of marginalized, oppressed, and/or powerless parties

Participant Oriented Characteristics
• Depends on inductive reasoning [observe, discover, understand]
• Uses multiple data sources [subjective, objective, quantitative, qualitative]
• Does not follow a standard plan [the process evolves as participants gain experience in the activity]
• Records multiple rather than single realities [e.g., focus groups]

Participant Oriented Examples
• Stake's Countenance Framework
  - Description and judgment
• Responsive Evaluation
  - Addressing stakeholders' concerns/issues
  - Case studies describe participants' behaviors
• Naturalistic Evaluation
  - Extensive observations, interviews, documents, and unobtrusive measures serve as both data and reporting techniques
  - Credibility vs. internal validity (cross-checking, triangulation)
  - Applicability vs. external validity (thick descriptions)
  - Auditability vs. reliability (consistency of results)
  - Confirmability vs. objectivity (neutrality of evaluation)

Participant Oriented Examples (continued)
• Participatory Evaluation
  - Collaboration between evaluators & key organizational personnel for practical problem solving
• Utilization-Focused Evaluation
  - Base all decisions on how everything will affect use
• Empowerment Evaluation
  - Advocates for society's disenfranchised, voiceless minorities
  - Advantages: training, facilitation, advocacy, illumination, liberation
  - Unclear how this approach is a unique participant-oriented approach
  - Some in the evaluation field argue that it is not even 'evaluation'

Strengths and Weaknesses
• Strengths: emphasizes the human element; yields new insights and theories; flexibility; attention to contextual variables; encourages multiple data collection methods; provides rich, persuasive information; establishes dialogue with, and empowers, quiet or powerless stakeholders
• Weaknesses: too complex for practitioners (more for theorists); political element; subjective, "loose" evaluations; labor intensive, which limits the number of cases studied; cost; potential for evaluators to lose objectivity

Participant Oriented Questions
• What current program are you involved in that could benefit from this type of evaluation?
• Who are the stakeholders?

What's going on in the field?
• Educational Preparation: http://www.duq.edu/program-evaluation/
• TEA: http://www.tea.state.tx.us/index2.aspx?id=2934&menu_id=949

What's going on in the field?
• Rockwood School District
• Clear Creek ISD
• Educational link posted by Austin ISD
• Houston ISD
• Austin ISD

http://www.uncg.edu/

http://ceee.gwu.edu/

District Initiatives
• Houston ISD "Real Men Read": http://www.houstonisd.org/portal/site/ResearchAccountability/menuitem.b977c784200de597c2dd5010e041f76a/?vgnextoid=159920bb4375a210VgnVCM10000028147fa6RCRD&vgnextchannel=297a1d3c1f9ef010VgnVCM10000028147fa6RCRD
• Alvin ISD "MHS (Manvel HS) Reading Initiative Program": http://www.alvinisd.net/education/staff/staff.php?sectionid=245

What does the research say?
"Rossman and Salzman (1995) have proposed a classification system for organizing and comparing evaluations of inclusive school programs. They suggest that evaluations be described according to their program features (purpose, complexity, scope, target population, and duration) and features of the evaluation (design, methods, instrumentation, and sample)."
Dymond, S. (2001). A Participatory Action Research Approach to Evaluating Inclusive School Programs. Focus on Autism & Other Developmental Disabilities, 16, 54-63.

What does the research say?
"Twenty-eight school counselors from a large Southwestern school district participated in a program evaluation training workshop designed to help them develop evaluation skills necessary for demonstrating program accountability. The majority of participants expressed high levels of interest in evaluating their programs but believed they needed more training in evaluation procedures."
Astramovich, R., Coker, J., & Hoskins, W. (2005).

What does the research say?
Group Interview Questions
"Graduate research assistants conducted group interviews in Grades 2-5 during the final weeks of the school year. We obtained parent permission by asking teachers to distribute informed consent forms to students in their classes, which invited the students to participate in the group interviews. We received informed consent forms from at least 3 students (the criterion number for a group interview at the school) for 21 schools (66% participation rate). If participation rates were high enough, the research assistants conducted separate interviews for Grades 2-3 and 4-5; the assistants conducted 23 interviews. The research assistants tape recorded all interviews, which averaged about 25 min, for data analysis. The interviewer encouraged responses from all group members. Four questions guided the group interviews."
Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.

What does the research say?
Survey Collection
"A survey collected teachers' self-reports of the frequency with which they implemented selected literacy activities and the amount of time in minutes that they used the literacy activities. Teachers also reported their level of satisfaction with the literacy resources available to them."
Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.

Tips on making a survey:
• Keep the survey response time to around 20 minutes
• Make the survey easy to answer
  - "What changes should the school board make in its policies regarding the placement of computers in elementary school?" Is this question effective or vague?
• Survey questions should clarify the time period
  - "During the past year, has computer use by the average child in your classroom increased, decreased, or stayed the same?"
• Avoid double- (or triple-, or quadruple-) barreled questions and responses (see the sketch after this list)
  - "My classroom aide performed his/her tasks carefully, impartially, thoroughly, and on time."
Langbein, L. (2006). Public Program Evaluation: A Statistical Guide. New York: M.E. Sharpe, Inc.

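These tips lend themselves to a simple automated screen of draft items. The sketch below is illustrative only and is not from the slides or from Langbein (2006); the item texts are the examples above, while the heuristics, function names, and thresholds are hypothetical.

# Illustrative sketch (hypothetical heuristics): a rough lint pass over
# draft survey items, based on the tips above.
DOUBLE_BARREL_HINTS = (" and ", ", ")
TIME_PERIOD_PHRASES = ("past year", "past month", "past week", "this semester")

def lint_item(item):
    warnings = []
    text = item.lower()
    # Tip: avoid double-barreled items. Three or more separators suggest
    # several qualities packed into a single question.
    if sum(text.count(h) for h in DOUBLE_BARREL_HINTS) >= 3:
        warnings.append("possibly double-barreled: split into one item per quality")
    # Tip: make the time period the question covers explicit.
    if not any(p in text for p in TIME_PERIOD_PHRASES):
        warnings.append("no explicit time period")
    return warnings

items = [
    "My classroom aide performed his/her tasks carefully, impartially, "
    "thoroughly, and on time.",
    "During the past year, has computer use by the average child in your "
    "classroom increased, decreased, or stayed the same?",
]
for item in items:
    print(lint_item(item) or "ok", "<-", item[:50] + "...")
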
Examples of Evaluation Methods
One study uses a mixed-methods approach combining objective-oriented, expertise-oriented, and participant-oriented approaches. Evaluations were based on the models provided in Program Evaluation: Alternative Approaches and Practical Guidelines.

Cont.
The purpose of this report is to illustrate the procedures necessary to complete an evaluation of the Naval Aviation Survival Training Program (NASTP). It was written by Anthony R. Artino Jr., a program manager and instructor within the NASTP for eight years.
"In very few instances have we adhered to any particular 'model' of evaluation. Rather, we find we can ensure a better fit by snipping and sewing bits and pieces off the more traditional ready-made approaches and even weaving a bit of homespun, if necessary, rather than by pulling any existing approach off the shelf. Tailoring works" (Worthen, Sanders, & Fitzpatrick, p. 183).

Cont.
Objective-oriented evaluation: this approach was used because (a) the NASTP has a number of well-written objectives; (b) it would be relatively easy to measure student attainment of those objectives using pre- and post-assessments; and (c) the program sponsor, the CNO, would be very interested to know whether the objectives that he approves are in fact being met.

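Point (b) can be sketched concretely. The following is a hypothetical illustration rather than the NASTP's actual procedure: it assumes per-student proportion-correct scores before and after training on one objective, plus an invented mastery cutoff, in the spirit of the criterion-referenced testing mentioned under the objective-oriented approach.

# Hypothetical sketch of criterion-referenced scoring for an
# objectives-oriented evaluation; the data and cutoff are invented.
MASTERY_CUTOFF = 0.80  # assumed criterion: 80% of items correct

# Proportion-correct scores per student, before and after training,
# for one hypothetical objective:
pre_scores = [0.45, 0.60, 0.55, 0.70, 0.50]
post_scores = [0.85, 0.90, 0.75, 0.95, 0.80]

def mastery_rate(scores):
    # Fraction of students at or above the criterion.
    return sum(s >= MASTERY_CUTOFF for s in scores) / len(scores)

print(f"mastery before training: {mastery_rate(pre_scores):.0%}")
print(f"mastery after training:  {mastery_rate(post_scores):.0%}")
# The objective is judged "met" if the post-training mastery rate reaches
# the level the program's written objective specifies.
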
Cont.
Expertise-oriented evaluation: an outside opinion was obtained from an aviation survival training subject matter expert, someone very familiar with the topics being taught and with the current research literature in survival training.

Cont.
Participant-oriented evaluation: it was important for the evaluators and the SME to be totally immersed in the training environment. This included a focus on audience concerns and issues (i.e., managers, instructors, and students) and an examination of the program "in situ," without any attempt to manipulate or control it (Worthen, Sanders, & Fitzpatrick, 1997).

References
Astramovich, R., Coker, J., & Hoskins, W. (2005). Professional School Counseling, 9, 49-54.
Dymond, S. (2001). A Participatory Action Research Approach to Evaluating Inclusive School Programs. Focus on Autism & Other Developmental Disabilities, 16, 54-63.
Frey, B., Lee, S., Massengill, D., Pass, L., & Tollefson, N. (2005). Balanced Literacy in an Urban School District. The Journal of Educational Research, 98, 272-280.
Langbein, L. (2006). Public Program Evaluation: A Statistical Guide. New York: M.E. Sharpe, Inc.

References
http://ceee.gwu.edu/
http://www.alvinisd.net/education/staff/staff.php?sectionid=245
http://www.austinisd.org/inside/accountability/evaluation/
http://www.duq.edu/program-evaluation/
http://www.eval.org/Publications/GuidingPrinciples.asp
http://www.houstonisd.org/portal/site/ResearchAccountability
http://www.houstonisd.org/portal/site/ResearchAccountability/menuitem.b977c784200de597c2dd5010e041f76a/?vgnextoid=159920bb4375a210VgnVCM10000028147fa6RCRD&vgnextchannel=297a1d3c1f9ef010VgnVCM10000028147fa6RCRD
http://www.mcrel.org/topics/Assessment/services/231/
http://www.rockwood.k12.mo.us/dataquality/Pages/ProgramEvaluations.aspx
http://www.tea.state.tx.us/index2.aspx?id=2934&menu_id=949
http://www.uncg.edu/
http://www2.ccisd.net/Departments/ResearchAccountability/ProgramEvaluation.aspx
