MANAGEMENT-ORIENTED EVALUATION APPROACHES
This approach is meant to serve decision makers.
RATIONALE
- Evaluation information is an essential part of good decision making, and the evaluator can be most effective by serving administrators, managers, policy makers, boards, practitioners, and others who need good evaluation information.
DEVELOPERS
Developers of this approach have relied on a systems approach to evaluation in which decisions are made about program inputs, processes, and outputs, drawing on program models and program theory.
DECISION MAKER
The audience to whom a management-oriented evaluation is directed. The decision maker’s concerns, informational needs, and criteria for effectiveness provide direction to the study.
DEVELOPERS OF THE MANAGEMENT-ORIENTED EVALUATION
APPROACH AND THEIR CONTRIBUTION
STUFFLEBEAM (1968)
Recognized the shortcomings of available evaluation approaches and worked to expand and systematize thinking about administrative studies and educational decision making.
HENRY BARNARD, HORACE MANN, WILLIAM TORREY HARRIS, CARLETON WASHBURNE
(1960s and 1970s)
They drew from management theory.
STUFFLEBEAM (1968)
Made the decisions of program managers the pivotal organizer for the evaluation, rather than program objectives.
ALKIN (1969)
The evaluator, working closely with administrators, identifies the decisions the administrators must make and then collects sufficient information about the relative advantages and disadvantages of each decision alternative to allow a fair judgment based on specified criteria.
The success of the evaluation rests on the quality of teamwork between evaluators and decision makers.
CIPP EVALUATION MODEL
Stufflebeam (1971, 2000; Stufflebeam & Shinkfield, 1985) is a proponent of decision-oriented evaluation approaches structured to help administrators make good decisions.
He views evaluation as “the process of delineating, obtaining, and providing useful information for judging decision alternatives.”
He developed an evaluation framework to serve managers and administrators facing different kinds of decisions.
1. Context Evaluation (to serve planning decisions): determining
– What needs are to be addressed by a program?
– What programs already exist? Knowing this helps in defining objectives for the program.
2. Input Evaluation (to serve structuring decisions):
– What resources are available?
– What alternative strategies for the program should be considered?
– What plan seems to have the best potential for meeting needs? Answering this facilitates the design of program procedures.
3. Process Evaluation (to serve implementing decisions):
– How well is the plan being implemented?
– What barriers threaten its success?
– What revisions are needed?
Once these questions are answered, procedures can be monitored, controlled, and refined.
4. Product Evaluation (to serve recycling decisions):
– What results were obtained?
– How well were needs reduced?
– What should be done with the program after it has run its course?
These questions are important in judging program attainments. The four types and their guiding questions are summarized in the sketch below.
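Because the four evaluation types, the decisions they serve, and their guiding questions form a simple structure, they can be captured in a small data sketch. The following Python snippet is purely an illustrative study aid: the dictionary layout and function are our own, and the questions are paraphrased from the slides above, not part of Stufflebeam's formal model.

```python
# Illustrative study aid only: the four CIPP evaluation types, the kind of
# decision each serves, and guiding questions paraphrased from the slides.
CIPP = {
    "Context": ("planning decisions", [
        "What needs are to be addressed by the program?",
        "What programs already exist?",
    ]),
    "Input": ("structuring decisions", [
        "What resources are available?",
        "What alternative strategies should be considered?",
        "What plan has the best potential for meeting needs?",
    ]),
    "Process": ("implementing decisions", [
        "How well is the plan being implemented?",
        "What barriers threaten its success?",
        "What revisions are needed?",
    ]),
    "Product": ("recycling decisions", [
        "What results were obtained?",
        "How well were needs reduced?",
        "What should be done with the program after it has run its course?",
    ]),
}

def evaluation_agenda(stage: str) -> str:
    """Format the decisions served and guiding questions for one CIPP stage."""
    serves, questions = CIPP[stage]
    lines = [f"{stage} evaluation serves {serves}:"]
    lines += [f"  - {q}" for q in questions]
    return "\n".join(lines)

if __name__ == "__main__":
    for stage in CIPP:  # Context, Input, Process, Product, in order
        print(evaluation_agenda(stage))
```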
Stufflebeam’s Evaluation Model
CIPP is an acronym for Context, Input, Process, and Product.
General steps of a Stufflebeam-proposed evaluation:
A. Focusing the Evaluation
1. Identify the major levels of decision making to be served (e.g., local, state, national).
2. For each level of decision making, project the decisions to be served and describe each one in terms of its focus, critical timing, and composition of alternatives.
3. Define criteria for each decision situation by specifying the variables for measurement and the standards for use in the judgment of alternatives.
4. Define the policies within which the evaluation must operate.
B. Collection of Information
1. Specify the sources of the information to be collected.
2. Specify the instruments and methods for collecting the needed information.
3. Specify the sampling procedure to be employed.
4. Specify the conditions and schedule for the information to be collected.
C. Organization of Information
1. Provide a format for the information that is to be collected.
2. Designate a means for coding, organizing, storing, and retrieving the information.
D. Analysis of Information
1. Select the analytical procedures to be employed.
2. Designate a means for performing the analysis.
E. Reporting of Information
1. Define the audience for the evaluation reports.
2. Specify means for providing information to the audiences.
3. Specify the format for evaluation reports.
4. Schedule the reporting of information.
F. Administration of the Evaluation
1. Summarize the evaluation schedule.
2. Define staff and resource requirements and plans for meeting these requirements.
3. Specify means for meeting policy requirements for the conduct of the evaluation.
4. Evaluate the potential of the evaluation design for providing information that is valid, reliable, credible, timely, and pervasive (i.e., reaching all relevant stakeholders).
These six steps can be treated as a planning checklist, as sketched below.
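A minimal sketch, assuming nothing beyond the six steps listed above: the evaluation plan can be modeled as a checklist whose fields mirror steps A through F. The class and field names below are our own shorthand, not Stufflebeam's terminology.

```python
# Sketch of Stufflebeam's six general steps (A-F above) as a planning
# checklist. Field names are our own shorthand, not official terminology.
from dataclasses import dataclass, field

@dataclass
class EvaluationPlan:
    focusing: list[str] = field(default_factory=list)        # A: decision levels, criteria, policies
    collection: list[str] = field(default_factory=list)      # B: sources, instruments, sampling, schedule
    organization: list[str] = field(default_factory=list)    # C: format, coding and storage
    analysis: list[str] = field(default_factory=list)        # D: analytical procedures and means
    reporting: list[str] = field(default_factory=list)       # E: audiences, means, format, schedule
    administration: list[str] = field(default_factory=list)  # F: schedule, staffing, policy, design review

    def incomplete_steps(self) -> list[str]:
        """Return the names of steps that have no planning entries yet."""
        return [name for name, items in vars(self).items() if not items]

# Example: a plan with only step A drafted; five steps remain to be planned.
plan = EvaluationPlan(focusing=["Serve district-level curriculum decisions"])
print(plan.incomplete_steps())
# ['collection', 'organization', 'analysis', 'reporting', 'administration']
```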
FOUR TYPES OF EVALUATION
CONTEXT EVALUATION
Objectives
– To define the institutional context, identify the target population, and assess its needs
– To identify opportunities for addressing the needs
– To diagnose problems underlying the needs
– To judge whether proposed objectives are sufficiently responsive to the assessed needs
Method
– By using such methods as systems analysis, surveys, document reviews, hearings, interviews, diagnostic tests, and the Delphi technique
Relation to decision making in the change process
– For deciding on the setting to be served, the goals associated with meeting needs or using
opportunities, and the objectives associated with solving problems.
INPUT EVALUATION
Objective
– To identify and assess system capabilities, alternative program strategies, procedural designs for implementing the strategies, budgets, and schedules
Method
– By inventorying and analyzing available human and material resources, solution strategies, and procedural designs for relevance, feasibility, and economy, and by using such methods as literature searches, visits to exemplary programs, advocate teams, and pilot trials
Relation to decision making in the change process
– For selecting sources of support, solution strategies, and procedural designs – that is, for structuring
change activities – and to provide a basis for judging implementation
PROCESS EVALUATION
Objectives
– To identify or predict, while the program is in process, defects in the procedural design or its implementation; to provide information for preprogrammed decisions; and to record and judge procedural events and activities
Method
– By monitoring the activity’s potential procedural barriers and remaining alert to unanticipated ones, by obtaining specified information for programmed decisions, by describing the actual process, and by continually interacting with and observing the activities of project staff
Relation to decision making in the change process
– For implementing and refining the program design and procedure – that is, for effecting
process control – and to provide a log of the actual process for later use in interpreting
outcomes
PRODUCT EVALUATION
Objective
– To collect descriptions and judgments of outcomes, to relate them to objectives and to context, input, and process information, and to interpret their worth and merit
Method
– By operationally defining and measuring outcome criteria, by collecting judgments of outcomes from stakeholders, and by performing both qualitative and quantitative analyses
Relation to decision making in the change process
– For deciding to continue, terminate, modify, or refocus a change activity, and to present a
clear record of effects (intended and unintended, positive and negative)
The UCLA Evaluation Model
Alkin (1969) developed an evaluation framework that closely parallels the CIPP model. He defined evaluation as “the process of ascertaining the decision areas of concern, selecting appropriate information, and collecting and analyzing information in order to report summary data useful to decision-makers in selecting among alternatives.” Alkin’s model included the following five types of evaluation:
1. Systems assessment, to provide information about the state of the system (similar to context evaluation in the CIPP model).
2. Program planning, to assist in the selection of particular programs likely to be effective in meeting specific educational needs (similar to input evaluation).
3. Program implementation, to provide information about whether a program was introduced to the appropriate group in the manner intended.
4. Program improvement, to provide information about how a program is functioning, whether interim objectives are being achieved, and whether unanticipated outcomes are appearing (similar to process evaluation).
5. Program certification, to provide information about the value of the program and its potential for use elsewhere (similar to product evaluation).
These correspondences are summarized in the sketch below.
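The parallels Alkin drew can be recorded in a small lookup table. A minimal, illustrative Python sketch follows; the mapping restates only what the list above says, and notes that program implementation is the one type given no CIPP analogue.

```python
# Illustrative only: rough correspondence between Alkin's five UCLA
# evaluation types and the CIPP types, restating the list above.
UCLA_TO_CIPP = {
    "Systems assessment": "Context evaluation",
    "Program planning": "Input evaluation",
    "Program implementation": None,  # no CIPP analogue named in the text
    "Program improvement": "Process evaluation",
    "Program certification": "Product evaluation",
}

for ucla_type, cipp_type in UCLA_TO_CIPP.items():
    print(f"{ucla_type} ~ {cipp_type or 'no direct CIPP analogue named'}")
```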
Four assumptions about evaluation:
1. Evaluation is a process of gathering information
2. The information collected in an evaluation will
be used mainly to make decisions about
alternative courses of action.
3. Evaluation information should be presented to
the decision maker in a form that he can use
effectively and that is designed to help rather
than confuse or mislead him.
4. Different kinds of decisions require different
kinds of evaluation procedures.
Growth and Development of the Early Models
The CIPP and UCLA frameworks for evaluation appear to be linear and sequential, but the developers have stressed that this is not the case. For example, the evaluator would not have to complete an input evaluation or a systems assessment in order to undertake one of the other types of evaluation listed in the framework. Evaluators may undertake “retrospective” evaluations (such as a context evaluation or a systems assessment) in preparation for a process or program improvement study, in the belief that the approach is cumulative, linear, and sequential; however, such steps are not always necessary. A process evaluation can be done without having completed context or input evaluation studies, and the evaluator may cycle into another type of evaluation if some decisions suggest that earlier decisions should be reviewed. Such is the nature of management-oriented evaluation.
The CIPP model has produced guides for the types of evaluation included in the framework. For example, Stufflebeam (1977) advanced the procedure for conducting a context evaluation with his guidelines for designing a needs assessment for an education program or activity.
An approach to input evaluation was developed by Reinhard (1972), called the advocate team technique. It is used when acceptable alternatives for designing a new program are not available or obvious. The technique creates alternative new designs that are then evaluated and selected, adapted, or combined to create the most viable alternative design for a new program. This technique has been used successfully by the federal government (Reinhard, 1972) and by school districts (Sanders, 1982) to generate options and guide the final design of educational programs. Procedures proposed by Cronbach (1963) provided useful suggestions for the conduct of process evaluation.
OTHER MANAGEMENT-ORIENTED EVALUATION APPROACHES
Provus’s Discrepancy Evaluation Model was described earlier as an objectives-oriented evaluation model, but some aspects of that model are also directed toward serving the information needs of educational program managers. It is system-oriented, focusing on input, process, and output at each of five stages of evaluation: program definition, program installation, program process, program products, and cost-benefit analysis. These stages resemble the CIPP and UCLA evaluation models in their sensitivity to the various decisions managers need to make at each stage of program development. Likewise, the systems approaches of logic models and program theory focus on inputs, processes, and outcomes, as do the management-oriented evaluation approaches.
The utilization-focused evaluation approach of Patton (1986, 1996) could also be
viewed as a decision-making approach in one respect. He stressed that the process of
identifying and organizing relevant decision makers and information users is the first
step in evaluation. In his view, the use of evaluation findings requires that decision
makers determine what information is needed by various people and arrange for that
information to be collected and provided to them.
Wholey (1983, 1994) could also be considered a proponent of management-oriented evaluation, given his focus on working with managers. His writings have concentrated on the practical uses of evaluation in public administration settings.
How the Management-Oriented Evaluation Approach Has Been Used
The CIPP model has been used in school districts and state and federal
government agencies. The Dallas (Texas) Independent School District,
for example, established an evaluation office organized around the
four types of evaluation in the model.
The management-oriented approach to evaluation has guided program managers through program planning, operation, and review. Program staff have found this approach a useful guide to program improvement.
The approach has also been used for accountability purposes. It provides a record-keeping framework that facilitates public review of client needs, objectives, plans, activities, and outcomes.
Administrators and boards have found this approach useful in meeting public demands for information. Stufflebeam and Shinkfield (1985) described these two uses for the CIPP model, as shown in the table below.
The Relevance of the Four Evaluation Types to Decision Making and Accountability
Context
– Decision making (formative orientation): guidance for choice of objectives and assignment of priorities
– Accountability (summative orientation): record of objectives and bases for their choice, along with a record of needs, opportunities, and problems
Input
– Decision making (formative orientation): guidance for choice of program strategy; input for specification of procedural design
– Accountability (summative orientation): record of the chosen strategy and design and the reasons for their choice over other alternatives
Process
– Decision making (formative orientation): guidance for implementation
– Accountability (summative orientation): record of the actual process
Product
– Decision making (formative orientation): guidance for termination, continuation, modification, or installation
– Accountability (summative orientation): record of attainments and recycling decisions
STRENGTHS AND LIMITATIONS OF THE MANAGEMENT-ORIENTED EVALUATION APPROACH
This approach has proved appealing to many evaluators and program managers, particularly those at home with the rational and orderly systems approach, to which it is clearly related. Its greatest strength is that it gives focus to the evaluation. Experienced evaluators know how tempting it is simply to cast a wide net, collecting an enormous amount of information, only to discard much of it later because it is not directly relevant to the key issues or questions the evaluation must address. Focusing on the informational needs and pending decisions of managers limits the range of relevant data and brings the evaluation into sharp focus. This approach also stresses the utility of information: connecting decision making and evaluation underscores the very purpose of evaluation. Focusing an evaluation on the decisions managers must make also prevents the evaluator from pursuing unfruitful lines of inquiry that are of no interest to the decision makers.
The management-oriented approach to evaluation was instrumental in showing evaluators and program managers that they need not wait until an activity or program has run its course before evaluating it. Educators can begin evaluating even when ideas for programs are first discussed. Because of lost opportunities and heavy resource investment, evaluation is generally least effective at the end of a developing program.
The management-oriented evaluation approach is probably the preferred choice in the eyes of most managers and boards because of the emphasis it places on providing information for decision makers. It addressed one of the biggest criticisms of evaluation in the 1960s: that it did not provide useful information.
The CIPP model is a useful and simple heuristic tool that helps the evaluator generate potentially important questions to be addressed in an evaluation. For each of its four components (context, input, process, and product), the evaluator can identify a number of questions about an undertaking. The model and the questions it generates also make the evaluation easy to explain to lay audiences.
The management-oriented approach to evaluation supports evaluation of every
component of a program as it operates, grows, or changes. It stresses the
timely use of feedback by decision makers so that the program is not left to
flounder or proceed unaffected by updated knowledge about needs,
resources, new developments, the realities of day-to-day operations, or the
consequences of program interventions.
A potential weakness of this approach is the evaluator’s occasional inability to
respond to questions or issues that may be significant – even critical – but
that clash with or at least do not match the concerns and questions of the
decision maker who, essentially, controls the evaluation. In addition,
programs that lack decisive leadership are not likely to benefit from this
approach to evaluation.
Another potential weakness of management-oriented evaluation is the preference it seems to give to top management: the evaluator can become the “hired gun” of the manager and the program establishment. A related weakness is the possibility that the evaluation can become unfair, and possibly even undemocratic, toward stakeholders who have less power and fewer resources (House & Howe, 1999).
The policy-shaping community includes:
• Public servants, such as responsible officials at the policy and program levels and the actual operating personnel.
• The public, consisting not only of constituents but also of influential persons such as commentators, academic social scientists, philosophers, gadflies, and even novelists or dramatists.
Few policy studies have been found to have a direct effect on the policy-shaping community, but evaluations can and do influence these audiences over time. Policy, as a reflection of public values, may be seen as never-ending in that it continues to be molded or revised as issues, reforms, social causes, and social values change or come to the forefront of attention. One important role of the evaluator is to illuminate, not to dictate, the decision. Helping clients understand the complexity of issues, rather than giving simple answers to narrow questions, is a role of evaluation.
A further limitation is that, if followed in its entirety, the management-oriented approach can result in costly and complex evaluations. If priorities are not carefully set and followed, the many questions to be addressed using a management-oriented approach can clamor for attention, leading to an evaluation system as large as the program itself and diverting resources from program activities. In planning evaluation procedures, management-oriented evaluators need to consider the resources and time available. If the management-oriented approach requires more time or resources than are available, another approach may have to be considered.
As a case in point, consider the program manager who has to make a decision about next week’s production schedule. This manager may be able to use the CIPP or UCLA models only informally, as an armchair aid. The management-oriented evaluator needs to be realistic about what work is possible and must not promise more than can be delivered.
This evaluation approach assumes that the important decisions can be clearly identified in advance, that clear decision alternatives can be specified, and that the decisions to be served remain reasonably stable while the evaluation is being done. All of these assumptions about the orderliness and predictability of the decision-making process are suspect and frequently unwarranted. Frequent adjustments may be needed in the original evaluation plan if this approach is to work well.
MAJOR CONCEPTS AND THEORIES
1. The major impetus behind the management-oriented approach to evaluation is to inform decision makers about the inputs, processes, and outputs of the program under evaluation. This approach considers the decision maker’s concerns, information needs, and criteria for effectiveness when developing the evaluation.
2. Stufflebeam’s CIPP evaluation model incorporates four separate evaluations (i.e., context, input, process, and product) into one framework to better serve managers and decision makers. Each of these evaluations collects data to serve different decisions (e.g., context evaluation serves planning decisions) by progressing through a series of evaluation steps that provide structure to the evaluation.
3. In the CIPP model, a context evaluation helps define objectives for the program under evaluation.
4. To facilitate the design of program procedures, the CIPP model’s input evaluation provides information on what resources are available, what alternative strategies for the program should be considered, and what plans will best meet the needs of the program.
5. A process evaluation is used in the CIPP model to determine how well a program is being
implemented, what barriers to success exist, and what program revisions may be needed.
6. Product evaluation is used in the CIPP model to provide information on what program results
were obtained, how well needs were reduced, and what should be done once the program has
ended.
7. Alkin’s UCLA model is similar to the CIPP model in that it provides decision makers with information on the context, inputs, implementation, processes, and products of the program under evaluation.
THANK YOU!
More Related Content

What's hot

CIPP Evaluation Model
CIPP Evaluation ModelCIPP Evaluation Model
CIPP Evaluation Model
Ct Hajar
 
Program evaluation part 2
Program evaluation part 2Program evaluation part 2
Program evaluation part 2
sourav goswami
 
Program evaluation
Program evaluationProgram evaluation
Program evaluation
aneez103
 
Curriculum constrction sem i evaluation models
Curriculum constrction sem i   evaluation modelsCurriculum constrction sem i   evaluation models
Curriculum constrction sem i evaluation models
Raj Kumar
 
Assessment for learning
Assessment for learningAssessment for learning
Assessment for learningCarlo Magno
 
Types of Assessment
Types of AssessmentTypes of Assessment
Types of Assessment
Cinderella Banares
 
Smith & ragan instructional design theory
Smith & ragan instructional design theorySmith & ragan instructional design theory
Smith & ragan instructional design theoryGurmin Hans
 
Implementing the curriculum
Implementing the curriculumImplementing the curriculum
Implementing the curriculum
Kim Gerard Mandocdoc
 
The Naturalistic Evaluation
The Naturalistic EvaluationThe Naturalistic Evaluation
The Naturalistic Evaluationmrborup
 
Meta Evaluation
Meta EvaluationMeta Evaluation
Meta Evaluation
Mohsen Sharifirad
 
Purpose and planning of evaluation (ps)
Purpose and planning of evaluation (ps)Purpose and planning of evaluation (ps)
Purpose and planning of evaluation (ps)
Puja Shrivastav
 
Descriptive research design
Descriptive research designDescriptive research design
Descriptive research design
Prateek Kakkar
 
Ppt.oral questioning & peer appraisal
Ppt.oral questioning & peer appraisalPpt.oral questioning & peer appraisal
Ppt.oral questioning & peer appraisal
saira kazim
 
Evaluation models
Evaluation modelsEvaluation models
Evaluation models
Maarriyyaa
 
Curriculum: Organizing Knowledge for the Classroom. Section 6
Curriculum: Organizing Knowledge for the Classroom. Section 6Curriculum: Organizing Knowledge for the Classroom. Section 6
Curriculum: Organizing Knowledge for the Classroom. Section 6
Saide OER Africa
 
Curriculum evaluation
Curriculum evaluationCurriculum evaluation
Curriculum evaluation
HennaAnsari
 
cipp model
 cipp model cipp model
cipp model
Orly Abellanosa
 
Curriculum Theory
Curriculum Theory Curriculum Theory
Curriculum Theory
IER, University of the Punjab
 

What's hot (20)

CIPP Evaluation Model
CIPP Evaluation ModelCIPP Evaluation Model
CIPP Evaluation Model
 
Ch03 ppt-theory
Ch03 ppt-theoryCh03 ppt-theory
Ch03 ppt-theory
 
Program evaluation part 2
Program evaluation part 2Program evaluation part 2
Program evaluation part 2
 
Program evaluation
Program evaluationProgram evaluation
Program evaluation
 
Curriculum constrction sem i evaluation models
Curriculum constrction sem i   evaluation modelsCurriculum constrction sem i   evaluation models
Curriculum constrction sem i evaluation models
 
History of supervision
History of supervisionHistory of supervision
History of supervision
 
Assessment for learning
Assessment for learningAssessment for learning
Assessment for learning
 
Types of Assessment
Types of AssessmentTypes of Assessment
Types of Assessment
 
Smith & ragan instructional design theory
Smith & ragan instructional design theorySmith & ragan instructional design theory
Smith & ragan instructional design theory
 
Implementing the curriculum
Implementing the curriculumImplementing the curriculum
Implementing the curriculum
 
The Naturalistic Evaluation
The Naturalistic EvaluationThe Naturalistic Evaluation
The Naturalistic Evaluation
 
Meta Evaluation
Meta EvaluationMeta Evaluation
Meta Evaluation
 
Purpose and planning of evaluation (ps)
Purpose and planning of evaluation (ps)Purpose and planning of evaluation (ps)
Purpose and planning of evaluation (ps)
 
Descriptive research design
Descriptive research designDescriptive research design
Descriptive research design
 
Ppt.oral questioning & peer appraisal
Ppt.oral questioning & peer appraisalPpt.oral questioning & peer appraisal
Ppt.oral questioning & peer appraisal
 
Evaluation models
Evaluation modelsEvaluation models
Evaluation models
 
Curriculum: Organizing Knowledge for the Classroom. Section 6
Curriculum: Organizing Knowledge for the Classroom. Section 6Curriculum: Organizing Knowledge for the Classroom. Section 6
Curriculum: Organizing Knowledge for the Classroom. Section 6
 
Curriculum evaluation
Curriculum evaluationCurriculum evaluation
Curriculum evaluation
 
cipp model
 cipp model cipp model
cipp model
 
Curriculum Theory
Curriculum Theory Curriculum Theory
Curriculum Theory
 

Similar to Management oriented evaluation approaches

Program evaluation
Program evaluationProgram evaluation
Program evaluation
Yen Bunsoy
 
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...Institute of Development Studies
 
Program Evaluation 1
Program Evaluation 1Program Evaluation 1
Program Evaluation 1
sourav goswami
 
Administration and Supervision in Evaluation
Administration and Supervision in EvaluationAdministration and Supervision in Evaluation
Administration and Supervision in Evaluation
Sharon Geroquia
 
Training on Evaluation.pptx
Training on Evaluation.pptxTraining on Evaluation.pptx
Training on Evaluation.pptx
ssusere0ee1d
 
Ot5101 005 week 5
Ot5101 005 week 5Ot5101 005 week 5
Ot5101 005 week 5
stanbridge
 
Evaluating Systems Change
Evaluating Systems ChangeEvaluating Systems Change
Evaluating Systems Change
Noel Hatch
 
COMMUNITY EVALUATION 2023.pptx
COMMUNITY  EVALUATION 2023.pptxCOMMUNITY  EVALUATION 2023.pptx
COMMUNITY EVALUATION 2023.pptx
gggadiel
 
Evaluation research-resty-samosa
Evaluation research-resty-samosaEvaluation research-resty-samosa
Evaluation research-resty-samosa
Resty Samosa
 
planning process and decesion making techniques
planning process and decesion making techniquesplanning process and decesion making techniques
planning process and decesion making techniques
ChelJo
 
Community engagement - what constitutes success
Community engagement - what constitutes successCommunity engagement - what constitutes success
Community engagement - what constitutes success
contentli
 
Monitoring and evaluation
Monitoring and evaluationMonitoring and evaluation
Monitoring and evaluation
Maxwell Ranasinghe
 
Conducting Programme Evaluation
Conducting Programme EvaluationConducting Programme Evaluation
Conducting Programme Evaluation
Puja Shrivastav
 
Dr Odejayi Abosede Mary
Dr Odejayi Abosede MaryDr Odejayi Abosede Mary
Dr Odejayi Abosede Mary
Dr Odejayi Mary Abosede
 
Planning and Designing Evaluation
Planning and Designing EvaluationPlanning and Designing Evaluation
Planning and Designing EvaluationMarlin Dwinastiti
 
ITFT Strategic Management
ITFT Strategic ManagementITFT Strategic Management
ITFT Strategic Management
archan26
 
Unit -9 evaluating management system
Unit -9 evaluating management systemUnit -9 evaluating management system
Unit -9 evaluating management system
Asima shahzadi
 
June 20 2010 bsi christie
June 20 2010 bsi christieJune 20 2010 bsi christie
June 20 2010 bsi christieharrindl
 
Monitoring R&D functions
Monitoring R&D functionsMonitoring R&D functions
Monitoring R&D functions
Nandita Das
 

Similar to Management oriented evaluation approaches (20)

Program evaluation
Program evaluationProgram evaluation
Program evaluation
 
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
IDS Impact, Innovation and Learning Workshop March 2013: Day 2, Keynote 2 Pat...
 
Program Evaluation 1
Program Evaluation 1Program Evaluation 1
Program Evaluation 1
 
Administration and Supervision in Evaluation
Administration and Supervision in EvaluationAdministration and Supervision in Evaluation
Administration and Supervision in Evaluation
 
Training on Evaluation.pptx
Training on Evaluation.pptxTraining on Evaluation.pptx
Training on Evaluation.pptx
 
Ot5101 005 week 5
Ot5101 005 week 5Ot5101 005 week 5
Ot5101 005 week 5
 
Evaluating Systems Change
Evaluating Systems ChangeEvaluating Systems Change
Evaluating Systems Change
 
COMMUNITY EVALUATION 2023.pptx
COMMUNITY  EVALUATION 2023.pptxCOMMUNITY  EVALUATION 2023.pptx
COMMUNITY EVALUATION 2023.pptx
 
Evaluation research-resty-samosa
Evaluation research-resty-samosaEvaluation research-resty-samosa
Evaluation research-resty-samosa
 
planning process and decesion making techniques
planning process and decesion making techniquesplanning process and decesion making techniques
planning process and decesion making techniques
 
Community engagement - what constitutes success
Community engagement - what constitutes successCommunity engagement - what constitutes success
Community engagement - what constitutes success
 
Monitoring and evaluation
Monitoring and evaluationMonitoring and evaluation
Monitoring and evaluation
 
Conducting Programme Evaluation
Conducting Programme EvaluationConducting Programme Evaluation
Conducting Programme Evaluation
 
Dr Odejayi Abosede Mary
Dr Odejayi Abosede MaryDr Odejayi Abosede Mary
Dr Odejayi Abosede Mary
 
Planning and Designing Evaluation
Planning and Designing EvaluationPlanning and Designing Evaluation
Planning and Designing Evaluation
 
Monitoring process
Monitoring processMonitoring process
Monitoring process
 
ITFT Strategic Management
ITFT Strategic ManagementITFT Strategic Management
ITFT Strategic Management
 
Unit -9 evaluating management system
Unit -9 evaluating management systemUnit -9 evaluating management system
Unit -9 evaluating management system
 
June 20 2010 bsi christie
June 20 2010 bsi christieJune 20 2010 bsi christie
June 20 2010 bsi christie
 
Monitoring R&D functions
Monitoring R&D functionsMonitoring R&D functions
Monitoring R&D functions
 

More from Jessica Bernardino

Labadeee
LabadeeeLabadeee
Seven seas
Seven seasSeven seas
Seven seas
Jessica Bernardino
 
Pcic Ambulong blue water experience
Pcic Ambulong blue water experiencePcic Ambulong blue water experience
Pcic Ambulong blue water experience
Jessica Bernardino
 
Air pollution
Air pollutionAir pollution
Air pollution
Jessica Bernardino
 
Development plannig and management
Development plannig and managementDevelopment plannig and management
Development plannig and management
Jessica Bernardino
 
Tourism at San Jose, Occidental Mindoro
Tourism at San Jose, Occidental MindoroTourism at San Jose, Occidental Mindoro
Tourism at San Jose, Occidental Mindoro
Jessica Bernardino
 
Fundamental principle of curriculum development and instruction
Fundamental principle of curriculum development and instructionFundamental principle of curriculum development and instruction
Fundamental principle of curriculum development and instruction
Jessica Bernardino
 
Assessment
AssessmentAssessment
Assessment
Jessica Bernardino
 
Strategic planning
Strategic planningStrategic planning
Strategic planning
Jessica Bernardino
 

More from Jessica Bernardino (11)

Palawan
PalawanPalawan
Palawan
 
Labadeee
LabadeeeLabadeee
Labadeee
 
Disney
DisneyDisney
Disney
 
Seven seas
Seven seasSeven seas
Seven seas
 
Pcic Ambulong blue water experience
Pcic Ambulong blue water experiencePcic Ambulong blue water experience
Pcic Ambulong blue water experience
 
Air pollution
Air pollutionAir pollution
Air pollution
 
Development plannig and management
Development plannig and managementDevelopment plannig and management
Development plannig and management
 
Tourism at San Jose, Occidental Mindoro
Tourism at San Jose, Occidental MindoroTourism at San Jose, Occidental Mindoro
Tourism at San Jose, Occidental Mindoro
 
Fundamental principle of curriculum development and instruction
Fundamental principle of curriculum development and instructionFundamental principle of curriculum development and instruction
Fundamental principle of curriculum development and instruction
 
Assessment
AssessmentAssessment
Assessment
 
Strategic planning
Strategic planningStrategic planning
Strategic planning
 

Recently uploaded

Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345
beazzy04
 
Unit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdfUnit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdf
Thiyagu K
 
1.4 modern child centered education - mahatma gandhi-2.pptx
1.4 modern child centered education - mahatma gandhi-2.pptx1.4 modern child centered education - mahatma gandhi-2.pptx
1.4 modern child centered education - mahatma gandhi-2.pptx
JosvitaDsouza2
 
MARUTI SUZUKI- A Successful Joint Venture in India.pptx
MARUTI SUZUKI- A Successful Joint Venture in India.pptxMARUTI SUZUKI- A Successful Joint Venture in India.pptx
MARUTI SUZUKI- A Successful Joint Venture in India.pptx
bennyroshan06
 
How to Split Bills in the Odoo 17 POS Module
How to Split Bills in the Odoo 17 POS ModuleHow to Split Bills in the Odoo 17 POS Module
How to Split Bills in the Odoo 17 POS Module
Celine George
 
Thesis Statement for students diagnonsed withADHD.ppt
Thesis Statement for students diagnonsed withADHD.pptThesis Statement for students diagnonsed withADHD.ppt
Thesis Statement for students diagnonsed withADHD.ppt
EverAndrsGuerraGuerr
 
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
siemaillard
 
The geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideasThe geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideas
GeoBlogs
 
The Art Pastor's Guide to Sabbath | Steve Thomason
The Art Pastor's Guide to Sabbath | Steve ThomasonThe Art Pastor's Guide to Sabbath | Steve Thomason
The Art Pastor's Guide to Sabbath | Steve Thomason
Steve Thomason
 
Fish and Chips - have they had their chips
Fish and Chips - have they had their chipsFish and Chips - have they had their chips
Fish and Chips - have they had their chips
GeoBlogs
 
Chapter 3 - Islamic Banking Products and Services.pptx
Chapter 3 - Islamic Banking Products and Services.pptxChapter 3 - Islamic Banking Products and Services.pptx
Chapter 3 - Islamic Banking Products and Services.pptx
Mohd Adib Abd Muin, Senior Lecturer at Universiti Utara Malaysia
 
ESC Beyond Borders _From EU to You_ InfoPack general.pdf
ESC Beyond Borders _From EU to You_ InfoPack general.pdfESC Beyond Borders _From EU to You_ InfoPack general.pdf
ESC Beyond Borders _From EU to You_ InfoPack general.pdf
Fundacja Rozwoju Społeczeństwa Przedsiębiorczego
 
Sectors of the Indian Economy - Class 10 Study Notes pdf
Sectors of the Indian Economy - Class 10 Study Notes pdfSectors of the Indian Economy - Class 10 Study Notes pdf
Sectors of the Indian Economy - Class 10 Study Notes pdf
Vivekanand Anglo Vedic Academy
 
Students, digital devices and success - Andreas Schleicher - 27 May 2024..pptx
Students, digital devices and success - Andreas Schleicher - 27 May 2024..pptxStudents, digital devices and success - Andreas Schleicher - 27 May 2024..pptx
Students, digital devices and success - Andreas Schleicher - 27 May 2024..pptx
EduSkills OECD
 
Model Attribute Check Company Auto Property
Model Attribute  Check Company Auto PropertyModel Attribute  Check Company Auto Property
Model Attribute Check Company Auto Property
Celine George
 
How to Create Map Views in the Odoo 17 ERP
How to Create Map Views in the Odoo 17 ERPHow to Create Map Views in the Odoo 17 ERP
How to Create Map Views in the Odoo 17 ERP
Celine George
 
Palestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptxPalestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptx
RaedMohamed3
 
How to Break the cycle of negative Thoughts
How to Break the cycle of negative ThoughtsHow to Break the cycle of negative Thoughts
How to Break the cycle of negative Thoughts
Col Mukteshwar Prasad
 
Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)
rosedainty
 
special B.ed 2nd year old paper_20240531.pdf
special B.ed 2nd year old paper_20240531.pdfspecial B.ed 2nd year old paper_20240531.pdf
special B.ed 2nd year old paper_20240531.pdf
Special education needs
 

Recently uploaded (20)

Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345Sha'Carri Richardson Presentation 202345
Sha'Carri Richardson Presentation 202345
 
Unit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdfUnit 2- Research Aptitude (UGC NET Paper I).pdf
Unit 2- Research Aptitude (UGC NET Paper I).pdf
 
1.4 modern child centered education - mahatma gandhi-2.pptx
1.4 modern child centered education - mahatma gandhi-2.pptx1.4 modern child centered education - mahatma gandhi-2.pptx
1.4 modern child centered education - mahatma gandhi-2.pptx
 
MARUTI SUZUKI- A Successful Joint Venture in India.pptx
MARUTI SUZUKI- A Successful Joint Venture in India.pptxMARUTI SUZUKI- A Successful Joint Venture in India.pptx
MARUTI SUZUKI- A Successful Joint Venture in India.pptx
 
How to Split Bills in the Odoo 17 POS Module
How to Split Bills in the Odoo 17 POS ModuleHow to Split Bills in the Odoo 17 POS Module
How to Split Bills in the Odoo 17 POS Module
 
Thesis Statement for students diagnonsed withADHD.ppt
Thesis Statement for students diagnonsed withADHD.pptThesis Statement for students diagnonsed withADHD.ppt
Thesis Statement for students diagnonsed withADHD.ppt
 
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
aaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaaa
 
The geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideasThe geography of Taylor Swift - some ideas
The geography of Taylor Swift - some ideas
 
The Art Pastor's Guide to Sabbath | Steve Thomason
The Art Pastor's Guide to Sabbath | Steve ThomasonThe Art Pastor's Guide to Sabbath | Steve Thomason
The Art Pastor's Guide to Sabbath | Steve Thomason
 
Fish and Chips - have they had their chips
Fish and Chips - have they had their chipsFish and Chips - have they had their chips
Fish and Chips - have they had their chips
 
Chapter 3 - Islamic Banking Products and Services.pptx
Chapter 3 - Islamic Banking Products and Services.pptxChapter 3 - Islamic Banking Products and Services.pptx
Chapter 3 - Islamic Banking Products and Services.pptx
 
ESC Beyond Borders _From EU to You_ InfoPack general.pdf
ESC Beyond Borders _From EU to You_ InfoPack general.pdfESC Beyond Borders _From EU to You_ InfoPack general.pdf
ESC Beyond Borders _From EU to You_ InfoPack general.pdf
 
Sectors of the Indian Economy - Class 10 Study Notes pdf
Sectors of the Indian Economy - Class 10 Study Notes pdfSectors of the Indian Economy - Class 10 Study Notes pdf
Sectors of the Indian Economy - Class 10 Study Notes pdf
 
Students, digital devices and success - Andreas Schleicher - 27 May 2024..pptx
Students, digital devices and success - Andreas Schleicher - 27 May 2024..pptxStudents, digital devices and success - Andreas Schleicher - 27 May 2024..pptx
Students, digital devices and success - Andreas Schleicher - 27 May 2024..pptx
 
Model Attribute Check Company Auto Property
Model Attribute  Check Company Auto PropertyModel Attribute  Check Company Auto Property
Model Attribute Check Company Auto Property
 
How to Create Map Views in the Odoo 17 ERP
How to Create Map Views in the Odoo 17 ERPHow to Create Map Views in the Odoo 17 ERP
How to Create Map Views in the Odoo 17 ERP
 
Palestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptxPalestine last event orientationfvgnh .pptx
Palestine last event orientationfvgnh .pptx
 
How to Break the cycle of negative Thoughts
How to Break the cycle of negative ThoughtsHow to Break the cycle of negative Thoughts
How to Break the cycle of negative Thoughts
 
Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)Template Jadual Bertugas Kelas (Boleh Edit)
Template Jadual Bertugas Kelas (Boleh Edit)
 
special B.ed 2nd year old paper_20240531.pdf
special B.ed 2nd year old paper_20240531.pdfspecial B.ed 2nd year old paper_20240531.pdf
special B.ed 2nd year old paper_20240531.pdf
 

Management oriented evaluation approaches

  • 2. MANAGEMENT-ORIENTED EVALUATION APPROACHES Is meant to serve decision makers. RATIONALE - Evaluation information is an essential part of good decision making is essential part of good decision making and that the evaluator can be most effective by serving administrators managers, policy makers boards, practioners, and others who need good evaluation information. DEVELOPERS Have relied on system approach to evaluation in which decision are made inputs, process and output program model and program theory. DECISION MAKER Audience to whom a management – oriented evaluation is directed. Concerns informational needs and criteria for effectiveness pride to direction to the study.
  • 3. DEVELOPERS OF THE MANAGEMENT-ORIENTED EVALUATION APPROACH AND THEIR CONTRIBUTION STUFFLE LOAN (1968) A recognized the short comings of available approaches. Working to expand and systematized thinking about administrative studies and educational decision making. HENRY BERNARD, HORACE MANN, WILLIAM TOREY HARIS, CARLITON WASHBURNE (1960’s and 1970’s) They drew from management theory. STUFFLEBEAM (1968) Made the decisions of program managers the priotal organizer for the evaluation rather than program objectives. ALKINS (1969) Evaluation, working closely with the administrators identifies the decision the administrators must make and them collects sufficient information about the relative advantages, disadvantages of each decision alternative to allow a fair judgment based on specified criteria. Success of evaluation rest on the quality of teamwork and between evaluation decision makers.
  • 4. CIPP EVALUATIONMODEL Stuffbeam (1971, 2000) Shinkfield, 1985 proponent of decision-oriented evaluation approaches structured to help administrators make good decisions. View Evaluation as “the process of deliberating, obtaining, and providing useful information for judging decision alternatives. He develop evaluation framework to serve managers and administrators forcing from different kinds of decisions. 1. Context Evaluation to serve planning decision: determining: – What needs are to be addressed by a program. – What the programs already exist helps in defining objectives for the program. 2. Input Evaluation – to serve structuring decision. – What resources are available – What alternatives strategies for the program should be considered. – What plan seems to have to best potential for meetings needs foliates design of program procedures. 2. Process Evaluation (serve implementing decisions) – How well is the plan being implemented? – What barriers threaten its success? – What revision are needed?
  • 5. 4. Once these questions are answered, procedures can be monitored, controlled and refined. – Product Evaluation (to serve recycling decision) – What results were obtained? – How well were needs reduced? – What should be done with the program after it has run its course? – These question are important in judging program attainments. Stufflebeam’s Evaluation Model CIPP – Acronym – Context, Input, Process and Product. General Steps of Stufflebeam – proposed evaluation.
  • 6. A. Focusing the Evaluation 1. Identify the major level of decision to be served. 2. Ex. Local, state, rational 3. Each level of decision making project the decision to be served and describe each on in terms of its focus, critical timing and composition of alternatives. 4. Define criteria for each decision situation by specifying variables for measurement and standard for use in its judgment of alternatives. 5. Define policies within which the evaluation must operate. B. Collection of Information 1. Specify the source of information to be collected. 2. Specify the instruments and method for collecting the needed information. 3. Specify the sampling procedure to be employed – Specify the conditions and schedule for the information collected C. Organization of Information 1. Provide a format for the information that is to be collected. 2. Designate a means for performing the analysis. D. Analysis of Information 1. Select the analytical procedures to be employed. 2. Designate a means for performing the analysis. E. Reporting of Information 1. Define the audience for the evaluation reports. 2. Specify means for providing information to the audiences. 3. Specify the format for evaluation reports, 4. Schedule the reporting F. Administration of the Evaluation. 1. Summarize the evaluation schedule. 2. Define Staff and resource requirements and plans for meeting these requirements. 3. Specify means for meeting policy requirements for conduct of the evaluation. 4. Evaluate the potential of evaluation design for providing information that is valid reliable, credible, timely, and persuasive (all stake-holder)
  • 7. FOUR TYPES OF EVALUATION CONTEXT EVALUATION Objectives – To define the institutional context identify the target population and assess its needs – To identify opportunities for addressing the needs – To diagnose problems underlying the needs – To judge whether prosed objectives are sufficiently responsive – To assessed needs Method – By using methods as system analysis, survey, document review, hearings, interviews, diagnostic tests, and the Delphi technique Relation to decision making in the change process – For deciding on the setting to be served, the goals associated with meeting needs or using opportunities, and the objectives associated with solving problems. INPUT EVALUATION Objective – To identify ad assess system capabilities, alternative program strategies, procedural designs for implementing the strategies, budgets, and schedules. Method – By inventorying and analyzing available human and material resources, solution strategies, and procedural designs for relevance, feasibility and economy, and by using such method as a literature search, visits to exemplary programs, advocate teams, and pilot trials Relation to decision making in the change process – For selecting sources of support, solution strategies, and procedural designs – that is, for structuring change activities – and to provide a basis for judging implementation
  • 8. PROCESS EVALUATION Objectives – To identify or predict in process defects in the procedural design or its implementation, to provide information for the preprogrammed decision, and to record and judge procedural events and activities Method – By monitoring the activity’s potential procedural barriers and remaining alert to unanticipated ones, by obtaining specified information for programmed decisions, by describing the actual process and y continually interacting with and observing the activities of project staff. Relation to decision making in the change process – For implementing and refining the program design and procedure – that is, for effecting process control – and to provide a log of the actual process for later use in interpreting outcomes PRODUCT EVALUATION Objective – To collection descriptions and judgments of outcomes and to relate them to objectives and to context, input and process information and to interpret their worth and merit. Method – By defining operationally and measuring outcome criteria, by collecting judgments of outcomes form stakeholders, and by performing both qualitative and quantitative analyses. Relation to decision making in the change process – For deciding to continue, terminate, modify, or refocus a change activity, and to present a clear record of effects (intended and unintended, positive and negative)
  • 9. The UCLA EvaluationModel CIPP Model. Alkin defined evaluation as “the process of ascertaining the decision areas of concern, selecting appropriate information, and collecting and analyzing information in order to report summary data useful to decision-makers in selecting among alternatives”. Alkin’s model included the following five types of evaluation: 1. Systems Assessments, to provide information bout the state of the system (similar to context evaluation I the CIPP model) 2. Program planning, to assist in the selection of particular program likely to be effective in meeting specific educational needs (similar to input evaluation) 3. Program implementation, to provide information about whether a program was introduced to the appropriate group in the manner intended. 4. Program improvement, to provide information about how a program is functioning, whether interim objectives are being achieve, and whether unanticipated outcomes, are appearing (similar to process evaluation). 5. Program certification, to provide information about the value of the program and its potential for use elsewhere (similar to product evaluation)
  • 10. Four assumptions about evaluation: 1. Evaluation is a process of gathering information 2. The information collected in an evaluation will be used mainly to make decisions about alternative courses of action. 3. Evaluation information should be presented to the decision maker in a form that he can use effectively and that is designed to help rather than confuse or mislead him. 4. Different kinds of decisions require different kinds of evaluation procedures.
  • 11. Growth and Development of the Early Models CIP and UCLA frameworks for evaluation appear to be linear and sequential, but the developers have stressed that such is not the case. For example, the evaluator would not have to complete an input evaluation or a systems assessment in order to undertake one of the other types of evaluation listed in the framework. Often evaluators may undertake “retrospective” evaluations (such a context evaluation or a systems assessment) in preparation for a process or program improvement evaluation study, believing this evaluation approach is cumulative, linear, and sequential; however, such steps are not always necessary. A process evaluation can be done without having completed context or input evaluation studies. The evaluator my cycle into another types of evaluation if some decisions suggest that earlier decision should be reviewed. Such is the nature of management-oriented evaluation. CIPP model has produced guides for types of evaluation included in the framework. For example, Stuffbeam (1977) advance the procedure for conducting a context evaluation with his guidelines for designing a needs assessment for an education program or activity. In input evaluation was developed by REinhard (1972). The input evaluatin approach that she developed is called the advocate team technique. It is used when acceptable alternatives for designing a new program are not available or obvious. The technique creates alternative new dsigns that are then evaluated she selected, adapted, or combine to create the most viable alternative design for a new program. This technique has been used successfully by the federal government (Reinhard, 1972) and by school districts (Sanders, 1982) to generate options and guide the final design of educational programs. Procedures proposed by Cronbach (1963) provided useful suggestions for the conduct of process evaluation.
  • 12. OTHERMANGEMENT-ORIENTED EVALUATION APPROACHES Provus’s Discrepancy Evaluation Model was described as an objectives-oriented evaluation model. Some aspects of that model are also directed toward serving the information needs of educational program managers. It is system-oriented, focusing on input, process, and output at each of five stages of evaluation: program definition, program installation, program process, program products, and cost-benefit analysis. UCLA evaluation model with respect to their sensitivity to the various decisions managers need to make at each stage of program development. Likewise, the systems approached of logic models and program theory focus on inputs, processes, and outcomes, as do the management-oriented evaluation approaches. The utilization-focused evaluation approach of Patton (1986, 1996) could also be viewed as a decision-making approach in one respect. He stressed that the process of identifying and organizing relevant decision makers and information users is the first step in evaluation. In his view, the use of evaluation findings requires that decision makers determine what information is needed by various people and arrange for that information to be collected and provided to them. Wholey (1983, 1994) could also be considered a proponent of management-oriented evaluation, given his focus on working with managements. His writings have concentrated on the practical uses of evaluation in pubic administration settings.
  • 13. How theManagement-OrientedEvaluation Approach Has Been used CIPP model has been used in school districts and state and federal government agencies. The Dallas (Texas) Independent School District, for example, established an evaluation office organized around the four types of evaluation in the model. The management-oriented approach to evaluation has guided program managers through program planning, operation, and the review. Program staff has found this approach a useful guide to program improvement. Evaluation approach has also been used for accountability purposes. It provides a record-keeping framework that facilitates public review of client needs, objectives, plans, activities, and outcomes. Administrators and boards have found this approach useful in meeting public demands for information. Stufflebeam and Shinkfield (1985) described these two uses for the CIPP model as shown in Figure.
  • 14. The Relevance of Four EvaluationTypes of Decision Making and Accountability Decision Making (Formative Orientation) Accountability (Summative Orientation) Context Input Process Product Guidance for choice of objectives and assignment of priorities Guidance for choice of program strategy: input for specification of procedural design Guidance for implementation Guidance for termination, continuation, modification, or installation Record of objectives and bases for their choice along with a record of needs, opportunities and problems Record of chosen strategy and design and reason for their choice over other alternatives Record of the actual process Record of attainments and recycling decisions
  • 15. STRENGHTS AND LIMITATIONS OF THE MANAGEMENT ORIENTED EVALUATION APPROACH This approach has proved appealing to many evaluators and program managers, particularly those at home with the rational and orderly systems approach, to which it is clearly related. Its greatest strength is that it gives focus to the evaluation. Experienced evaluators know hoe tempting it is simply to cast a wide net, collecting an enormous amount of information, only later to discard much of it because it is not directly relevant to the key issues or questions the evaluation must address. Focusing on the informational needs and pending decisions of managers limits the range of relevant data and brings the evaluation into sharp focus. This evaluation approach also stresses the importance of the utility of information. Connecting decision making and evaluation underscore the very purpose of evaluation. Also focusing an evaluation on the decisions managers must make prevents the evaluator form pursuing unfruitful lines of inquiry that are not of interest to the decision makers. The management-oriented approach to evaluation was instrumental in showing evaluators and program managers that they need not wait until an activity or program has run its course before evaluating it. Educators can begin evaluating even when ideas from programs are first discussed. Because of lost opportunities and heavy resource investment, evaluation is generally least affective at the end o f a developing program.
• 16. The management-oriented evaluation approach is probably the preferred choice in the eyes of most managers and boards, because it places a premium on information for decision makers. It addressed one of the biggest criticisms of evaluation in the 1960s: that it did not provide useful information. The CIPP model is a useful and simple heuristic tool that helps the evaluator generate potentially important questions to be addressed in an evaluation. By considering context, input, process, and product, the evaluator can identify a number of questions about an undertaking, and the model and the questions it generates also make the evaluation easy to explain to lay audiences. The management-oriented approach supports evaluation of every component of a program as it operates, grows, or changes. It stresses the timely use of feedback by decision makers so that the program is not left to flounder or proceed unaffected by updated knowledge about needs, resources, new developments, the realities of day-to-day operations, or the consequences of program interventions. A potential weakness of this approach is the evaluator's occasional inability to respond to questions or issues that may be significant, even critical, but that clash with or at least do not match the concerns and questions of the decision maker who, essentially, controls the evaluation. In addition, programs that lack decisive leadership are not likely to benefit from this approach to evaluation.
• 17. Another potential weakness of management-oriented evaluation is the preference it seems to give to top management: the evaluator can become the "hired gun" of the manager and the program establishment. A related risk is that the evaluation can become unfair, and possibly even undemocratic, toward stakeholders who have less power and fewer resources (House and Howe, 1999). Decisions about programs are shaped not only by managers but by a broader policy-shaping community, which includes:
• Public servants, such as responsible officials at the policy and program levels and the actual operating personnel.
• The public, consisting not only of constituents but also influential persons such as commentators, academic social scientists, philosophers, gadflies, and even novelists or dramatists.
• 18. Few policy studies have been found to have a direct effect on the policy-shaping community, but evaluations can and do influence these audiences over time. Policy, as a reflection of public values, may be seen as never-ending in that it continues to be molded or revised as issues, reforms, social causes, and social values change or come to the forefront of attention. One important role of the evaluator is to illuminate, not to dictate, the decision; helping clients understand the complexity of issues, rather than giving simple answers to narrow questions, is a role of evaluation. Another limitation is that, if followed in its entirety, the management-oriented approach can result in costly and complex evaluations. If priorities are not carefully set and followed, the many questions to be addressed using a management-oriented approach can clamor for attention, leading to an evaluation system as large as the program itself and diverting resources from program activities. In planning evaluation procedures, management-oriented evaluators need to consider the resources and time available. If the management-oriented approach requires more time or resources than are available, another approach may have to be considered.
• 19. As a case in point, consider the program manager who has to make decisions about next week's production schedule. This manager may be able to use the CIPP or UCLA models only informally, as an armchair aid. The management-oriented evaluator needs to be realistic about what work is possible and must not promise more than can be delivered. This evaluation approach assumes that the important decisions can be clearly identified in advance, that clear decision alternatives can be specified, and that the decisions to be served remain reasonably stable while the evaluation is being done. All of these assumptions about the orderliness and predictability of the decision-making process are suspect and frequently unwarranted. Frequent adjustments may be needed in the original evaluation plan if this approach is to work well.
• 20. MAJOR CONCEPTS AND THEORIES
1. The major impetus behind the management-oriented approach to evaluation is to inform decision makers about the inputs, processes, and outputs of the program under evaluation. This approach considers the decision maker's concerns, information needs, and criteria for effectiveness when developing the evaluation.
2. Stufflebeam's CIPP evaluation model incorporates four separate evaluations (i.e., context, input, process, and product) into one framework to better serve managers and decision makers. Each of these evaluations collects data to serve different decisions (e.g., a context evaluation serves planning decisions) by progressing through a series of evaluation steps that provide structure to the evaluation.
3. In the CIPP model, a context evaluation helps define objectives for the program under evaluation.
4. To facilitate the design of program procedures, the CIPP model's input evaluation provides information on what resources are available, what alternative strategies for the program should be considered, and what plans will best meet the program's needs.
5. A process evaluation is used in the CIPP model to determine how well a program is being implemented, what barriers to success exist, and what program revisions may be needed.
6. A product evaluation is used in the CIPP model to provide information on what program results were obtained, how well needs were reduced, and what should be done once the program has ended. (The guiding questions behind points 3 through 6 are gathered into the sketch that follows this list.)
7. Alkin's UCLA model is similar to the CIPP model in that it provides decision makers with information on the context, inputs, implementation, processes, and products of the program under evaluation.
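To make the heuristic character of the CIPP model concrete, the guiding questions summarized in points 3 through 6 above can be turned into a simple planning checklist. The sketch below is illustrative only; the names CIPP_QUESTIONS and cipp_checklist are hypothetical, and this is not an implementation drawn from the evaluation literature.

```python
# A minimal sketch: rendering the CIPP guiding questions summarized above
# as a planning checklist. Names and structure are illustrative.

CIPP_QUESTIONS = {
    "context": [
        "What objectives should the program pursue?",
    ],
    "input": [
        "What resources are available?",
        "What alternative strategies for the program should be considered?",
        "What plan will best meet the program's needs?",
    ],
    "process": [
        "How well is the program being implemented?",
        "What barriers to success exist?",
        "What program revisions may be needed?",
    ],
    "product": [
        "What results were obtained?",
        "How well were needs reduced?",
        "What should be done once the program has ended?",
    ],
}

def cipp_checklist(stages=("context", "input", "process", "product")) -> str:
    """Render the guiding questions for the requested CIPP stages as a checklist."""
    lines = []
    for stage in stages:
        lines.append(f"{stage.upper()} EVALUATION")
        lines.extend(f"  [ ] {question}" for question in CIPP_QUESTIONS[stage])
    return "\n".join(lines)

# Example: an evaluator planning only the formative stages of a new program
# might print the context, input, and process questions.
print(cipp_checklist(stages=("context", "input", "process")))
```

Generating the checklist stage by stage reflects how the model is meant to be used: each evaluation type is consulted only when the corresponding decisions (planning, structuring, implementing, or recycling) are actually pending.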