Program evaluation 20121016



  1. Program Evaluation: Methods and Case Studies
     Emil J. Posavac and Raymond G. Carey
     7th Edition. 2007. New Jersey: Pearson, Prentice Hall.
     Aung Thu Nyein
     DA-8020 Policy Studies
  2. Content
     • About the authors
     • Chapter 1: Program Evaluation: An Overview
     • Chapter 3: Selecting Criteria and Setting Standards
  3. About the authors
     • Emil J. Posavac: Ph.D., University of Illinois; Professor Emeritus of Psychology
       at Loyola University of Chicago; Director of the applied social psychology
       graduate program; awarded the Myrdal Award by the American Evaluation Association.
     • Raymond G. Carey: Ph.D., Loyola University of Chicago; principal of R. G. Carey
       Associates; widely published in the field of health services and quality assurance.
  4. An Overview
     • Evaluation is a natural, routine activity.
     • “Program evaluation is a collection of methods, skills, and sensitivities
       necessary to determine whether a human service is needed and likely to be used,
       whether the service is sufficiently intensive to meet the unmet needs identified,
       whether the service is offered as planned, and whether the service actually does
       help people in need at a reasonable cost without unacceptable side effects.”
  5. An Overview (contd.)
     Program evaluation differs from natural, automatic evaluation:
     • First, organizational efforts are carried out by teams. This specialization means
       that responsibility for program evaluation is diffused among many people.
     • Second, most programs attempt to achieve objectives that can only be observed some
       time in the future rather than in a matter of minutes, which raises the question
       of which criteria to choose.
     • Third, when evaluating our own ongoing work, a single individual fills many
       roles: worker, evaluator, beneficiary, recipient of the feedback, etc.
     • Last, programs are usually paid for by parties other than the clients of the
       program.
  6. Evaluation tasks that need to be done
     Program evaluation is designed to assist some audience to assess a program’s merit
     or worth:
     • Verify that resources would be devoted to meeting unmet needs
     • Verify that implemented programs do provide services
     • Examine the outcomes
     • Determine which programs produce the most favorable outcomes
     • Select the programs that offer the most needed types of services
     • Provide information to maintain and improve quality
     • Watch for unplanned side effects
  7. Common types of program evaluation
     • Assess the needs of the program participants
       - Identify and measure the level of unmet needs
       - Consider some alternatives
     • Examine the process of meeting the needs
       - Extent of the implementation
       - The nature of the people being served
       - The degree to which the program operates as planned
     • Measure the outcomes of the program
       - Who received what?
       - Did the program’s services change things for the better?
       - Do people hold different opinions on the outcome?
     • Integrate the needs, costs, and outcomes
       - Cost-effectiveness
  8. Activities often confused with program evaluation
     • Basic research
     • Individual assessment
     • Program audit
     Although these activities are valuable, program evaluation is different and more
     difficult to carry out.
  9. Different types of evaluations for different kinds of programs
     • No “one size fits all” approach.
     • Organizations needing program evaluations
       - Health care
       - Criminal justice
       - Business and industry
       - Government
     • Time frame of needs
       - Short-term needs
       - Long-term needs
       - Potential needs
  10. Extensiveness of the programs
     • Some programs are offered to a small group of people with similar needs, but
       others are developed for use at many sites throughout the country.
     • Complexities are involved.
  11. Purpose of program evaluation
     • The overall purpose of program evaluation is to contribute to the provision of
       quality services to people in need.
     • Feedback mechanism: formative evaluations, summative evaluations, or evaluation
       for knowledge.
     • A feedback loop
  12. The roles of evaluators
     • A variety of work settings
       - Internal evaluators
       - External evaluators: of governmental or regulatory agencies
       - Private research firms
  13. Comparison of internal and external evaluators
     • Factors related to competence
       - Access and advantages
       - Technical expertise
     • Personal qualities
       - The evaluator should be objective, fair, and trustworthy.
     • Factors related to the purpose of an evaluation
       - Formative, summative, or quality assurance evaluation?
  14. Evaluation and service
     • The role of the social scientist: concerned with theory, the design of research,
       and the analysis of data.
     • The role of practitioners: dealing with people in need.
  15. Evaluation and related activities of organizations
     • Research
     • Education and staff development
     • Auditing
     • Planning
     • Human resources
  16. Chapter 3: Selecting Criteria and Setting Standards
  17. Useful criteria and standards
     Research design is important, but so are criteria and standards.
     • Criteria that reflect a program’s purposes
       - Immediate short-term effects, but only marginal long-term ones.
     • Criteria that the staff can influence
       - An evaluation could meet with resistance if the program staff feel that their
         program will be judged on criteria that they cannot affect.
     • Criteria that can be measured reliably and validly
       - Repeated observations should give the same values.
     • Criteria that stakeholders participate in selecting
       - Chosen in consultation between the evaluator and stakeholders.
  18. Developing goals and objectives
     • How much agreement on goals is needed?
       - A number of issues need to be addressed.
     • Different types of goals
       - Implementation goals
       - Intermediate goals
       - Outcome goals
     • Goals that apply to all programs
       - Treating the subjects with respect
       - Personal exposure to the program
       - Depending on surveys and records to provide evaluations, etc.
  19. Evaluation criteria and evaluation questions
     • Does the program or plan match the values of the stakeholders?
     • Does the program or plan match the needs of the people to be served?
     • Does the program as implemented fulfill the plans?
     • Do the outcomes achieved match the goals?
  20. Using program theory
     • Why is a program theory helpful?
     • How is a program theory developed?
     • Implausible program theories
     Every program embodies a conception of the structure, functions, and procedures
     appropriate to attain its goals. This conception constitutes the “logic” or plan of
     the program, which is called “program theory”.
     Source: Peter H. Rossi, Howard E. Freeman & Mark W. Lipsey. 1998. Evaluation: A
     Systematic Approach, 6th Ed., London: SAGE Publications, Inc.
  21. Assessing program theory
     A framework for assessing program theory:
     • Assessment in relation to social needs
     • Assessment of logic and plausibility
       - Are the program goals and objectives well defined?
       - Are the program goals and objectives feasible?
       - Is the change process presumed in the program theory plausible?
       - Are the program procedures for identifying members of the target population,
         delivering service to them, and sustaining that service through completion
         well defined and sufficient?
       - Are the constituent components, activities, and functions of the program well
         defined and sufficient?
       - Are the resources allocated to the program and its various components and
         activities adequate?
     • Assessment through comparison with research and practice
     • Assessment via preliminary observation
  22. Assessing program theory (contd.)
     • Program theory can be assessed in relation to the support for critical
       assumptions found in research or documented program practice elsewhere.
     • Sometimes findings are available for similar programs.
     • Assessment of program theory yields findings that can help improve the
       conceptualization of a program or affirm its basic design.
     Source: Peter H. Rossi, Howard E. Freeman & Mark W. Lipsey. 1998. Evaluation: A
     Systematic Approach, 6th Ed., London: SAGE Publications, Inc.
  23. More questions
     • Is the program accepted?
     • Are the resources devoted to the program being expended appropriately?
       - Using program costs in the planning phase
       - Is offering the program fair to all stakeholders?
       - Is this the way the funds are supposed to be spent?
       - Do the outcomes justify the resources spent?
       - Has the evaluation plan allowed for the development of criteria that are
         sensitive to undesirable side effects?
  24. Example: Program Theory
  25. Example: Program Theory
  26. Example: Program Theory and theory failure
  27. Example: Theory failure
  28. Some practical limitations in selecting evaluation criteria
     • Evaluation budget: evaluation is not free.
     • Time available for the project
     • Criteria that are credible to the stakeholders
  29. Overlap in terminology in program evaluation, by Jane T. Bertrand
     Source: Bertrand, Jane T. May 2005. Understanding the Overlap in Programme
     Evaluation Terminology. The Communication Initiative Network.
  30. Thanks for your attention.