Management-Oriented Evaluation Approaches
 



A presentation for my Ed.D. degree program relating to program evaluation models: Developers of the Management-Oriented Evaluation Approach and their Contributions; How the Management-Oriented Evaluation Approach Has Been Used; Strengths and Limitations of the Management-Oriented Evaluation Approach; Other References; Questions for Discussion.



    Presentation Transcript

    • Chapter Five: Management-Oriented Evaluation Approaches. Presented by Iva Angelova & Larry Weas. ETR 531 Program Evaluation, Northern Illinois University, Education, Technology & Research.
    • Introduction
      • Developers of the Management-Oriented Evaluation Approach and their Contributions
      • How the Management-Oriented Evaluation Approach Has Been Used
      • Strengths and Limitations of the Management-Oriented Evaluation Approach
      • Other References
      • Questions for Discussion
    • The CIPP Evaluation Model (Stufflebeam, 1971) pairs each type of evaluation with the kind of decision it serves (a minimal code sketch follows this list):
      • Context Evaluation → Planning Decisions
      • Input Evaluation → Structuring Decisions
      • Process Evaluation → Implementing Decisions
      • Product Evaluation → Recycling Decisions
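    As an illustration only, this pairing can be written as a small lookup table. The sketch below is Python; the dictionary contents are transcribed from the slide above, while the function and variable names are hypothetical.

        # Sketch: CIPP evaluation types and the decisions each one serves.
        # Pairings transcribed from the slide above; names are illustrative.
        CIPP_DECISIONS = {
            "context": "planning decisions",
            "input": "structuring decisions",
            "process": "implementing decisions",
            "product": "recycling decisions",
        }

        def decision_served(evaluation_type):
            """Return the kind of decision a given CIPP evaluation informs."""
            return CIPP_DECISIONS[evaluation_type.lower()]

        print(decision_served("Context"))  # -> planning decisions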
    • Logical Structure for Designing
      • Focusing the Evaluation
      • Collection of Information
      • Organization of Information
      • Analysis of Information
      • Reporting of Information
      • Administration of the Evaluation
      (Stufflebeam, 1973a). A checklist sketch of these six steps appears after the Step 6 detail below.
    • Step 1: Further Detail
      • Focusing the Evaluation
        • Identify the major level(s) of decision making to be served, for example, local, state, or national.
        • For each level of decision making, project the decision situations to be served and describe each one in terms of its locus, focus, criticality, timing, and composition of alternatives.
        • Define criteria for each decision situation by specific variables for measurement and standards for use in judgment of alternatives.
        • Define policies within which the evaluator must operate.
    • Step 2: Further Detail
      • Collection of Information
        • Specify the source of the information to be collected.
        • Specify the instruments and methods for collecting the needed information.
        • Specify the sampling procedure to be employed.
        • Specify the conditions and schedule for information collection.
    • Step 3: Further Detail
      • Organization of Information
        • Provide a format for the information that is to be collected.
        • Designate a means for coding, organizing, storing, and retrieving the information.
    • Step 4: Further Detail
      • Analysis of Information
        • Select the analytical procedures to be employed.
        • Designate a means of performing the analysis.
    • Step 5: Further Detail
      • Reporting of Information
        • Define the audiences for the evaluation reports.
        • Specify means for providing information to the audience.
        • Specify the format for evaluation reports and/or reporting sessions.
        • Schedule the reporting of information.
    • Step 6: Further Detail
      • Administration of the Evaluation
        • Summarize the evaluation schedule.
        • Define staff and resource requirements and plans for meeting these requirements.
        • Specify means for meeting policy requirements.
        • Evaluate the potential of the evaluation design for providing information that is valid, reliable, credible, timely, and pervasive (i.e., it will reach all relevant stakeholders).
        • Specify and schedule means for periodic updating of the evaluation design.
        • Provide a budget for the total evaluation program.
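    Taken together, Steps 1 through 6 amount to a design checklist. As a minimal sketch (assuming hypothetical field names; this data structure is not part of Stufflebeam's text), the checklist can be modeled in Python so that a draft evaluation design can be recorded and checked for missing pieces:

        from dataclasses import dataclass, field

        # Sketch of an evaluation design per the six-step structure above.
        # All field names are illustrative, not Stufflebeam's terminology.
        @dataclass
        class EvaluationDesign:
            # Step 1: Focusing the evaluation
            decision_levels: list = field(default_factory=list)    # e.g. local, state, national
            decision_criteria: dict = field(default_factory=dict)  # variables and standards per decision
            # Step 2: Collection of information
            sources: list = field(default_factory=list)
            instruments: list = field(default_factory=list)
            sampling_procedure: str = ""
            # Step 3: Organization of information
            data_format: str = ""
            # Step 4: Analysis of information
            analytical_procedures: list = field(default_factory=list)
            # Step 5: Reporting of information
            audiences: list = field(default_factory=list)
            reporting_schedule: str = ""
            # Step 6: Administration of the evaluation
            staff_requirements: list = field(default_factory=list)
            budget: float = 0.0

            def missing_pieces(self):
                """Return the names of fields still left empty in this draft."""
                return [name for name, value in vars(self).items() if not value]

        # Usage: record a draft design, then see what still needs attention.
        design = EvaluationDesign(
            decision_levels=["local"],
            sources=["teacher surveys", "student records"],
            audiences=["district administrators", "school board"],
        )
        print(design.missing_pieces())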
    • Four Types of Evaluation (Stufflebeam & Shinkfield, 1985)
      • Context Evaluation
      • Input Evaluation
      • Process Evaluation
      • Product Evaluation
    • The UCLA Evaluation Model (Alkin, 1969), with each stage's purpose and its rough CIPP counterpart in parentheses:
      • Systems Assessment (C): to provide information about the state of the system
      • Program Planning (I): to assist in the selection of particular programs likely to be effective in meeting specific educational needs
      • Program Implementation: to provide information about whether a program was introduced to the appropriate group in the manner intended
      • Program Improvement (P): to provide information about how a program is functioning, whether interim objectives are being achieved, and whether unanticipated outcomes are appearing
      • Program Certification (P): to provide information about the value of the program and its potential for use elsewhere
    • The UCLA Evaluation Model
      Four assumptions (Alkin, 1991):
      • Evaluation is a process of gathering information.
      • The information collected in an evaluation will be used mainly to make decisions about alternative courses of action.
      • Evaluation information should be presented to the decision maker in a form that he or she can use effectively and that is designed to help rather than confuse or mislead.
      • Different kinds of decisions require different kinds of evaluation procedures.
    • Growth & Development of the Early Models
      • The CIPP and UCLA models appear to be linear and sequential.
      • Evaluators may undertake "retrospective" evaluations.
      • Process evaluation can be done without having specific decisions to serve.
      • Cycling through another type of evaluation is in the nature of management-oriented evaluation.
    • Guides Produced Using the CIPP Model
      • Context Evaluation: Stufflebeam (1977) advanced the procedure for conducting a context evaluation with his guidelines for designing a needs assessment for an educational program or activity.
      • Input Evaluation: Reinhard (1972) developed a guide for use in input evaluation called the advocate team technique. It is used when acceptable alternatives for designing a new program are not available or obvious.
      • Process Evaluation: Cronbach (1963) proposed procedures that offer useful suggestions for the conduct of process evaluation.
      • Product Evaluation: Techniques discussed in Chapter 6 provide information useful in conducting product evaluation.
    • How the Management-Oriented Evaluation Approach Has Been Used
      • Context
        • Decision making (formative orientation): guidance for choice of objectives and assignment of priorities
        • Accountability (summative orientation): record of objectives and the bases for their choice, along with a record of needs, opportunities, and problems
      • Input
        • Decision making: guidance for choice of program strategy; input for specification of procedural design
        • Accountability: record of the chosen strategy and design and the reasons for their choice over other alternatives
      • Process
        • Decision making: guidance for implementation
        • Accountability: record of the actual process
      • Product
        • Decision making: guidance for termination, continuation, modification, or installation
        • Accountability: record of attainments and recycling decisions
    • Strengths & Limitations
      Strengths:
      • Proved appealing to evaluators
      • Gives focus to the evaluation
      • Stresses the importance of the utility of information
      • Instrumental in showing evaluators and program managers that they need not wait until an activity or program has run its course before evaluating it
      • Preferred choice in the eyes of most managers and boards
      • The CIPP model is a useful and simple heuristic tool that helps the evaluator generate potentially important questions to be addressed in an evaluation
      • Using the CIPP model, the user can identify a number of questions about an undertaking
      • Supports evaluation of every component of a program as it operates, grows, or changes
      • Stresses the timely use of feedback
    • Strengths & Limitations
      Limitations:
      • Evaluator's occasional inability to respond to questions or issues that may be significant
      • An evaluation given over to top management can become the "hired gun," and thus unfair and undemocratic
      • Direct effect of the policy-shaping community
      • If followed in its entirety, can result in costly and complex evaluations
      • Assumes that the important decisions can be clearly identified in advance
      • Thus, frequent adjustments may be needed in the original evaluation plan if this approach is to work well
    • Major Concepts, Theories & Summary
      • The management-oriented evaluation approach informs decision makers about the inputs, processes, and outputs of the program under evaluation
      • Stufflebeam's CIPP evaluation model incorporates four separate evaluations into one framework to better serve managers and decision makers
      • In the CIPP model,
        • the Context Evaluation helps define objectives
        • the Input Evaluation provides guidance for choosing a program strategy and specifying a procedural design
        • the Process Evaluation is used to determine how well a program is being implemented
        • the Product Evaluation is used to provide information on what program results were obtained, how well needs were reduced, and what should be done once the program has ended
      • Alkin's UCLA model is similar to the CIPP model in that it provides decision makers with information on the context, inputs, implementations, processes, and products of the program under evaluation
    • References
      • Alkin, M. C. (1991). Evaluation theory development: II. In M. W. McLaughlin & D. C. Phillips (Eds.), Evaluation and education: At quarter century (Ninetieth Yearbook of the National Society for the Study of Education, Part II). Chicago: University of Chicago Press.
      • Stufflebeam, D. L. (2000). The CIPP model for evaluation. In D. L. Stufflebeam, G. F. Madaus, & T. Kellaghan (Eds.), Evaluation models: Viewpoints on educational and human services evaluation (2nd ed., pp. 274-317).
      • Stufflebeam, D. L., & Shinkfield, A. J. (1985). Systematic evaluation. Boston: Kluwer-Nijhoff.
    • Questions for Leading a Discussion (Larry's questions):
      • Based on the two models (CIPP & UCLA) we have discussed, what are some decisions you may have in your own organization?
      • Could one of these two models be used in evaluating your organization's decisions? Why?
      • What are some limitations you may incur when evaluating your organization?