Management-Oriented Evaluation Approaches


A presentation for my Ed.D. degree program on program evaluation models, covering: developers of the management-oriented evaluation approach and their contributions; how the approach has been used; its strengths and limitations; other references; and questions for discussion.


  1. Chapter Five: Management-Oriented Evaluation Approaches. Presented by: Iva Angelova & Larry Weas. ETR 531 Program Evaluation, Northern Illinois University, Education, Technology & Research
  2. Introduction <ul><li>Developers of the Management-Oriented Evaluation Approach and their Contributions </li></ul><ul><li>How the Management-Oriented Evaluation Approach Has Been Used </li></ul><ul><li>Strengths and Limitations of the Management-Oriented Evaluation Approach </li></ul><ul><li>Other References </li></ul><ul><li>Questions for Discussion </li></ul>
  3. The CIPP Evaluation Model (Stufflebeam, 1971) <ul><li>Context Evaluation serves Planning Decisions </li></ul><ul><li>Input Evaluation serves Structuring Decisions </li></ul><ul><li>Process Evaluation serves Implementing Decisions </li></ul><ul><li>Product Evaluation serves Recycling Decisions </li></ul>
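For readers who think in code, the pairing above can be sketched as a simple lookup table. This is our own illustrative shorthand, not part of Stufflebeam's model; the names are hypothetical:

```python
# Sketch of the CIPP model's pairing of evaluation types with the
# class of decisions each is meant to inform (Stufflebeam, 1971).
# The dictionary and function names are our own illustration.
CIPP_DECISIONS = {
    "context": "planning",      # which objectives should be pursued?
    "input": "structuring",     # which strategy and design should be used?
    "process": "implementing",  # is the design being carried out as intended?
    "product": "recycling",     # continue, modify, terminate, or install?
}

def decisions_served(evaluation_type: str) -> str:
    """Return the class of decisions a given CIPP evaluation informs."""
    return CIPP_DECISIONS[evaluation_type.lower()]
```

For example, `decisions_served("Context")` returns `"planning"`, mirroring the first row of the model.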
  4. Logical Structure for Designing (Stufflebeam, 1973a) <ul><li>Step 1: Focusing the Evaluation </li></ul><ul><li>Step 2: Collection of Information </li></ul><ul><li>Step 3: Organization of Information </li></ul><ul><li>Step 4: Analysis of Information </li></ul><ul><li>Step 5: Reporting of Information </li></ul><ul><li>Step 6: Administration of the Evaluation </li></ul>
  5. Step 1: Further Detail <ul><li>Focusing the Evaluation </li></ul><ul><ul><li>Identify the major level(s) of decision making to be served, for example local, state, or national. </li></ul></ul><ul><ul><li>For each level of decision making, project the decision situations to be served and describe each one in terms of its locus, focus, criticality, timing, and composition of alternatives. </li></ul></ul><ul><ul><li>Define criteria for each decision situation by specifying variables for measurement and standards for use in judging alternatives. </li></ul></ul><ul><ul><li>Define policies within which the evaluator must operate. </li></ul></ul>
  6. Step 2: Further Detail <ul><li>Collection of Information </li></ul><ul><ul><li>Specify the source of the information to be collected. </li></ul></ul><ul><ul><li>Specify the instruments and methods for collecting the needed information. </li></ul></ul><ul><ul><li>Specify the sampling procedure to be employed. </li></ul></ul><ul><ul><li>Specify the conditions and schedule for information collection. </li></ul></ul>
  7. Step 3: Further Detail <ul><li>Organization of Information </li></ul><ul><ul><li>Provide a format for the information that is to be collected. </li></ul></ul><ul><ul><li>Designate a means for coding, organizing, storing, and retrieving the information. </li></ul></ul>
  8. Step 4: Further Detail <ul><li>Analysis of Information </li></ul><ul><ul><li>Select the analytical procedures to be employed. </li></ul></ul><ul><ul><li>Designate a means of performing the analysis. </li></ul></ul>
  9. Step 5: Further Detail <ul><li>Reporting of Information </li></ul><ul><ul><li>Define the audiences for the evaluation reports. </li></ul></ul><ul><ul><li>Specify means for providing information to the audiences. </li></ul></ul><ul><ul><li>Specify the format for evaluation reports and/or reporting sessions. </li></ul></ul><ul><ul><li>Schedule the reporting of information. </li></ul></ul>
  10. Step 6: Further Detail <ul><li>Administration of the Evaluation </li></ul><ul><ul><li>Summarize the evaluation schedule. </li></ul></ul><ul><ul><li>Define staff and resource requirements and plans for meeting these requirements. </li></ul></ul><ul><ul><li>Specify means for meeting policy requirements. </li></ul></ul><ul><ul><li>Evaluate the potential of the evaluation design for providing information that is valid, reliable, credible, timely, and pervasive (i.e., will reach all relevant stakeholders). </li></ul></ul><ul><ul><li>Specify and schedule means for periodic updating of the evaluation design. </li></ul></ul><ul><ul><li>Provide a budget for the total evaluation program. </li></ul></ul>
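The six-step design structure above can be modeled as a simple checklist data structure, useful for tracking which parts of an evaluation design still need work. This is a minimal sketch under our own naming assumptions; the field names are shorthand for Stufflebeam's (1973a) steps, not terminology from the text:

```python
# Sketch: Stufflebeam's (1973a) six-step logical structure for
# designing an evaluation, modeled as a checklist. Each field holds
# the tasks recorded so far for that step.
from dataclasses import dataclass, field

@dataclass
class EvaluationDesign:
    focusing: list = field(default_factory=list)        # Step 1
    collection: list = field(default_factory=list)      # Step 2
    organization: list = field(default_factory=list)    # Step 3
    analysis: list = field(default_factory=list)        # Step 4
    reporting: list = field(default_factory=list)       # Step 5
    administration: list = field(default_factory=list)  # Step 6

    def incomplete_steps(self) -> list:
        """Names of steps with no tasks recorded yet."""
        return [name for name, tasks in vars(self).items() if not tasks]

# A design with only Step 1 started still has five incomplete steps.
design = EvaluationDesign(focusing=["identify decision levels to be served"])
```

Here `design.incomplete_steps()` would report every step except `focusing`, reflecting the model's expectation that all six steps be addressed before the design is complete.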
  11. Four Types of Evaluation (Stufflebeam & Shinkfield, 1985) <ul><li>Context Evaluation </li></ul><ul><li>Input Evaluation </li></ul><ul><li>Process Evaluation </li></ul><ul><li>Product Evaluation </li></ul>
  12. The UCLA Evaluation Model (Alkin, 1969), compared to CIPP <ul><li>Systems Assessment (C): to provide information about the state of the system </li></ul><ul><li>Program Planning (I): to assist in the selection of particular programs likely to be effective in meeting specific educational needs </li></ul><ul><li>Program Implementation: to provide information about whether a program was introduced to the appropriate group in the manner intended </li></ul><ul><li>Program Improvement (P): to provide information about how a program is functioning, whether interim objectives are being achieved, and whether unanticipated outcomes are appearing </li></ul><ul><li>Program Certification (P): to provide information about the value of the program and its potential for use elsewhere </li></ul>
  13. The UCLA Evaluation Model: Four Assumptions (Alkin, 1991) <ul><li>Evaluation is a process of gathering information. </li></ul><ul><li>The information collected in an evaluation will be used mainly to make decisions about alternative courses of action. </li></ul><ul><li>Evaluation information should be presented to the decision maker in a form that he or she can use effectively and that is designed to help rather than confuse or mislead. </li></ul><ul><li>Different kinds of decisions require different kinds of evaluation procedures. </li></ul>
  14. Growth & Development of the Early Models <ul><li>The CIPP and UCLA models appear to be linear and sequential. </li></ul><ul><li>Evaluators may undertake “retrospective” evaluations. </li></ul><ul><li>Process evaluation can be done without having specific decisions in mind. </li></ul><ul><li>Cycling back through another type of evaluation is in the nature of management-oriented evaluation. </li></ul>
  15. Guides Produced Using the CIPP Model <ul><li>Context Evaluation: Stufflebeam (1977) advanced the procedure for conducting a context evaluation with his guidelines for designing a needs assessment for an educational program or activity. </li></ul><ul><li>Input Evaluation: Reinhard (1972) developed a guide for use in input evaluation called the advocate team technique. It is used when acceptable alternatives for designing a new program are not available or obvious. </li></ul><ul><li>Process Evaluation: Cronbach (1963) proposed procedures that provide useful suggestions for the conduct of process evaluation. </li></ul><ul><li>Product Evaluation: techniques discussed in Chapter 6 provide information useful in conducting product evaluation. </li></ul>
  16. How the Management-Oriented Evaluation Approach Is Used <ul><li>Context: Decision making (formative orientation): guidance for choice of objectives and assignment of priorities. Accountability (summative orientation): record of objectives and bases for their choice, along with a record of needs, opportunities, and problems. </li></ul><ul><li>Input: Decision making: guidance for choice of program strategy; input for specification of procedural design. Accountability: record of chosen strategy and design, and reasons for their choice over other alternatives. </li></ul><ul><li>Process: Decision making: guidance for implementation. Accountability: record of the actual process. </li></ul><ul><li>Product: Decision making: guidance for termination, continuation, modification, or installation. Accountability: record of attainments and recycling decisions. </li></ul>
  17. Strengths & Limitations: Strengths <ul><li>Has proved appealing to evaluators. </li></ul><ul><li>Gives focus to the evaluation. </li></ul><ul><li>Stresses the importance of the utility of information. </li></ul><ul><li>Has been instrumental in showing evaluators and program managers that they need not wait until an activity or program has run its course before evaluating it. </li></ul><ul><li>Is the preferred choice in the eyes of most managers and boards. </li></ul><ul><li>The CIPP model is a useful and simple heuristic tool that helps the evaluator generate potentially important questions to be addressed in an evaluation. </li></ul><ul><li>Using the CIPP model, the user can identify a number of questions about an undertaking. </li></ul><ul><li>Supports evaluation of every component of a program as it operates, grows, or changes. </li></ul><ul><li>Stresses the timely use of feedback. </li></ul>
  18. Strengths & Limitations: Limitations <ul><li>The evaluator may occasionally be unable to respond to questions or issues that are significant. </li></ul><ul><li>If given over to top management, the evaluator can become a “hired gun,” and the evaluation can become unfair and undemocratic. </li></ul><ul><li>The evaluation is directly affected by the policy-shaping community. </li></ul><ul><li>If followed in its entirety, the approach can result in costly and complex evaluations. </li></ul><ul><li>It assumes that the important decisions can be clearly identified in advance; thus, frequent adjustments may be needed in the original evaluation plan if this approach is to work well. </li></ul>
  19. Major Concepts, Theories & Summary <ul><li>The management-oriented evaluation approach informs decision makers about the inputs, processes, and outputs of the program under evaluation. </li></ul><ul><li>Stufflebeam’s CIPP evaluation model incorporates four separate evaluations into one framework to better serve managers and decision makers. </li></ul><ul><li>In the CIPP model, </li></ul><ul><ul><li>Context Evaluation helps define objectives </li></ul></ul><ul><ul><li>Input Evaluation informs decisions about the strategies and designs used to pursue those objectives </li></ul></ul><ul><ul><li>Process Evaluation is used to determine how well a program is being implemented </li></ul></ul><ul><ul><li>Product Evaluation is used to provide information on what program results were obtained, how well needs were reduced, and what should be done once the program has ended </li></ul></ul><ul><li>Alkin’s UCLA model is similar to the CIPP model in that it provides decision makers information on the context, inputs, implementations, processes, and products of the program under evaluation. </li></ul>
  20. References <ul><li>Alkin, M. C. (1991). Evaluation theory development: II. In M. W. McLaughlin & D. C. Phillips (Eds.), Evaluation and education: At quarter century (Ninetieth Yearbook of the National Society for the Study of Education, Part II). Chicago: University of Chicago Press. </li></ul><ul><li>Stufflebeam, D. L. (2000). The CIPP model for evaluation. In D. L. Stufflebeam, G. F. Madaus, & T. Kellaghan (Eds.), Evaluation models: Viewpoints on educational and human services evaluation (2nd ed., pp. 274-317). </li></ul><ul><li>Stufflebeam, D. L., & Shinkfield, A. J. (1985). Systematic evaluation. Boston: Kluwer-Nijhoff. </li></ul>
  21. Questions for Leading a Discussion <ul><li>(Larry’s questions): </li></ul><ul><ul><li>Based on the two models (CIPP & UCLA) we have discussed, what are some decisions you may face in your own organization? </li></ul></ul><ul><ul><li>Could one of these two models be used in evaluating your organization’s decisions? Why? </li></ul></ul><ul><ul><li>What are some limitations you may encounter when evaluating your organization? </li></ul></ul>