Programme Evaluation
Conducting the Programme Evaluation
Puja Shrivastav
JRF (UGC)
If the Goal of Evaluation is…
… to improve a program
Then no evaluation is good unless
findings are used to make a difference
2Puja Shrivastav
Programme Evaluation
 Any evaluation that examines and
assesses the implementation and
effectiveness of specific instructional
activities, in order to make adjustments
or changes in those activities, is often
labeled "process" or "programme" evaluation.
 The focus of process evaluation
includes a description and assessment
of the curriculum, teaching methods
used, staff experience and performance,
in-service training, and adequacy of
equipment and facilities.
When to Conduct Evaluation?
 The stage of program development
influences the reason for program
evaluation.
o The design stage.
o The start-up stage.
o While the programme is in progress.
o After the programme wraps up.
o Long after the programme finishes.
Steps of conducting
evaluation
1. Planning for evaluation - Identify the
problem and review program goals.
2. Identify stakeholders and their needs -
Identify and contact evaluation
stakeholders.
3. Determine the evaluation purpose -
Revisit the purpose/objectives of the evaluation.
4. Decide who will evaluate - Decide if the
evaluation will be in-house or contracted
out.
5. Report results.
6. Justify conclusions.
1. Planning for Evaluation
Identify the problem and review the goals.
The mission and objectives of the
instructional program should be clearly stated.
Include information about the program's
purpose, expected effects, available
resources, stage of
development, and instructional context.
These descriptions set the frame of reference for all
subsequent planning decisions in the
evaluation.
Planning for Evaluation
o Determine data-collection methods,
o Create the data-collection instrument,
o Test the data-collection instrument,
o Evaluate the collected data,
o Summarize and analyze the data, and
prepare reports for stakeholders.
Planning for Evaluation
Gather data
 Data gathering focuses on collecting
information that conveys a holistic
picture of the instructional program.
 Data gathering includes consideration
of what indicators, data
sources, and methods to use;
the quality and quantity of the
information; human-subject protections;
and the context in which the data
gathering occurs.
Create an evaluation plan
 The evaluation plan outlines how to
implement the evaluation including:
i. Identification of the sponsor and
resources available for implementing
the plan,
ii. What information is to be gathered,
iii. The research method(s) to be used,
iv. A description of the roles and
responsibilities of
sponsors and evaluators, and
v. A timeline for accomplishing tasks.
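The five elements above can be sketched as a simple checklist structure. This is a hypothetical illustration only; the sponsor, sources, and timeline values are invented, not from the original slides:

```python
# Hypothetical sketch: an evaluation plan as a checklist of the five elements.
evaluation_plan = {
    "sponsor": "school board (invented example)",
    "resources": ["budget", "staff time"],
    "information_to_gather": ["test scores", "teacher interviews"],
    "methods": ["survey", "classroom observation"],
    "roles": {"sponsor": "funds and reviews", "evaluator": "collects and analyzes"},
    "timeline": {"design": "month 1", "data gathering": "months 2-4", "reporting": "month 5"},
}

def plan_is_complete(plan):
    """Check that every element named on the slide is present and non-empty."""
    required = ["sponsor", "resources", "information_to_gather",
                "methods", "roles", "timeline"]
    return all(plan.get(key) for key in required)

print(plan_is_complete(evaluation_plan))  # → True
```

A structure like this makes it easy to verify, before data gathering starts, that no element of the plan has been left blank.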
2. Identify stakeholders and their
needs
 Stakeholders are the individuals and
organizations involved in program
operations, those served or affected
by the program, and the intended
users of the assessment or evaluation.
 Stakeholder needs generally reflect the
central questions which they have about
the instructional activity, innovation, or
program.
 Determining stakeholder needs helps to
focus the evaluation process so that the
results are of the greatest utility.
Three principal groups of
stakeholders
 Persons Involved in Program Operations
◦ Staff and Partners
 Persons affected or served by the
program
◦ Clients, their families and social networks,
providers and community groups
 Intended users of the evaluation findings
◦ Policy makers, managers, administrators,
advocates, funders, and others
3. Determining the evaluation
purpose
Four general purposes for instructional
evaluations are:
a. Gain Insight -
o Assess needs and wants of community
members
o Identify barriers to use of the program
o Learn how to best describe and measure
program activities
b. Change Practice - to improve the quality,
effectiveness, or efficiency of instructional
activities.
Determining the evaluation purpose
◦ Refine plans for introducing a new practice
◦ Determine the extent to which plans were
implemented
◦ Improve educational materials
◦ Enhance cultural competence
◦ Verify that participants' rights are protected
◦ Set priorities for staff training
◦ Make mid-course adjustments
◦ Clarify communication
◦ Determine if client satisfaction can be improved
◦ Compare costs to benefits
◦ Find out which participants benefit most from the
program
◦ Mobilize community support for the program
Determining the evaluation
purpose
c. Measure effects of the program – to examine
the relationship between instructional
activities and observed consequences
◦ Assess skills development by program
participants
◦ Compare changes in behavior over time
◦ Decide where to allocate new resources
◦ Document the level of success in
accomplishing objectives
◦ Demonstrate that accountability requirements
are fulfilled
◦ Use information from multiple evaluations to
predict the likely effects of similar programs
Determining the evaluation purpose
d. Affect participants -
◦ Empower program participants (for example,
being part of an evaluation can increase
community members' sense of control over the
program);
◦ Supplement the program (for example, using a
follow-up questionnaire can reinforce the main
messages of the program);
◦ Promote staff development (for example, by
teaching staff how to collect, analyze, and
interpret evidence); or
◦ Contribute to organizational growth (for
example, the evaluation may clarify how the
program relates to the organization's mission).
Determining the evaluation purpose
◦ Reinforce messages of the program
◦ Stimulate dialogue and raise awareness
about community issues
◦ Broaden consensus among partners
about program goals
◦ Teach evaluation skills to staff and other
stakeholders
◦ Gather success stories
◦ Support organizational change and
improvement
Identify intended uses
 Intended uses are the specific ways
evaluation results will be applied.
 They are the underlying goals of the
evaluation, and are linked to the
central questions of the study that
identify the specific aspects of the
instructional program to be examined.
 The purpose, uses, and central
questions of an evaluation are all
closely related.
4. Decide who will evaluate
 Decide if the evaluation will be
in-house or contracted out.
In-house – the principal, teachers,
students, or parents.
Contracted out – agencies can be hired to
help, or retired professionals in the same
field can also evaluate the
program.
5. Reporting Results
Analyze data
 Data analysis involves identifying
patterns in the data, either by isolating
important findings (analysis) or by
combining sources of information to
reach a larger understanding (synthesis),
and
 Making decisions about how to
organize, classify, interrelate, compare,
and display information.
 These decisions are guided by the
questions being asked, the types of data
available, and by input from
stakeholders.
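The analysis/synthesis distinction above can be illustrated with a small sketch. All numbers here are invented for illustration; they are not data from any real programme:

```python
# Hypothetical sketch of the two moves in data analysis:
# analysis  = isolating an important finding from one data source;
# synthesis = combining sources to reach a larger understanding.
pre_scores = [52, 61, 48, 70, 55]   # invented pre-test scores
post_scores = [68, 72, 59, 81, 70]  # invented post-test scores

def mean(values):
    return sum(values) / len(values)

# Analysis: isolate one finding - the average gain on the test.
gain = mean(post_scores) - mean(pre_scores)

# Synthesis: combine the test data with a second source
# (an invented 1-5 participant-satisfaction survey).
satisfaction = [4, 5, 3, 5, 4]
summary = {
    "average_gain": round(gain, 1),
    "average_satisfaction": round(mean(satisfaction), 1),
}
print(summary)  # → {'average_gain': 12.8, 'average_satisfaction': 4.2}
```

How the summary is organized, classified, and displayed would, as the slide notes, be driven by the evaluation questions and by stakeholder input.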
Report results
 Factors to consider when reporting results, or
dissemination, include tailoring report content
for a specific audience, explaining the focus
of the study and its limitations, and listing
both the strengths and weaknesses of the
study.
 It may also include the reporting of active
follow-up and interim findings.
 Reporting interim findings is sometimes
useful to instructors or staff in making
immediate instructional adjustments.
Report results
 Describe the accomplishments of the
program, identifying those instructional
elements that were the most effective;
 Describe instructional elements that
were ineffective and problematic as
well as areas that need modifications
in the future; and
 Describe the outcomes or the impact
of the instructional unit on students.
Report results
 Complete documentation will make
the report useful for making decisions
about improving curriculum and
instructional strategies.
 In other words, the evaluation report
is a tool supporting decision making,
program improvement, accountability,
and quality control in curriculum.
 This will help in reframing the
curriculum, if needed.
Make conclusions and
recommendations
 Conclusions are linked to the evidence
gathered and judged against agreed-
upon standards set by stakeholders.
 Recommendations are actions for
consideration that are based on
conclusions but go beyond
simple judgments about efficacy
or interpretation of the evidence
gathered.
Justify the Conclusions
 Conclusions become justified when
they are linked to the evidence
gathered and judged against agreed-
upon values set by the stakeholders.
 Stakeholders must agree that
conclusions are justified in order to
use the evaluation results with
confidence.
The principal elements involved in
justifying conclusions based on
evidence are described below.
Justify the Conclusions
 Standards- Standards reflect the values
held by stakeholders about the program.
They provide the basis to make program
judgments.
 Analysis and synthesis- Analysis and
synthesis are methods to discover and
summarize an evaluation's findings.
 Interpretation- Interpretation is the effort to
figure out what the findings mean,
going beyond simply uncovering facts
about a program's performance.
 Judgements- Judgments are statements
about the merit, worth, or significance of the
program.
 Recommendations- Recommendations are
actions for consideration that follow from
the conclusions.
Standards for Effective Evaluation
[Figure: the evaluation cycle — six steps surrounding four standards]
Steps: Engage stakeholders; Describe the program; Focus the
evaluation design; Gather credible evidence; Justify conclusions;
Ensure use and share lessons learned.
Standards: Utility; Feasibility; Propriety; Accuracy.
The Four Standards
 Utility: Who needs the information
and what information do they need?
 Feasibility: How much money, time,
and effort can we put into this?
 Propriety: What steps need to be
taken for the evaluation to be ethical?
 Accuracy: What design will lead to
accurate information?
Standard: Utility
Ensures that the information needs of
intended users are met.
 Who needs the evaluation findings?
 What do the users of the evaluation
need?
 Will the evaluation provide relevant
(useful) information in a timely
manner?
Standard: Feasibility
Ensures that evaluation is realistic,
prudent, diplomatic, and frugal.
 Are the planned evaluation activities
realistic given the time, resources, and
expertise at hand?
Standard: Propriety
Ensures the evaluation is conducted
legally, ethically, and with due regard
for the welfare of those involved and
those affected.
 Does the evaluation protect the rights of
individuals and protect the welfare of
those involved?
 Does it engage those most directly
affected by the program and by changes
in the program, such as participants or
the surrounding community?
Standard: Accuracy
 Ensures that the evaluation reveals
and conveys technically accurate
information.
Will the evaluation produce findings
that are valid and reliable, given the
needs of those who will use the
results?
Utilizing the Evaluation Result
 The evaluator records the actions, features,
and experiences of students, teachers, and
administrators. People who read the report will
be able to visualise what the setting looks like and
the processes taking place. Thus the reader will
understand the areas requiring
improvement.
 The evaluator interprets and explains the meaning
of the events reported by putting them in context. For
example, why academically weak students were
motivated to ask questions; why reading
comprehension skills improved; why enthusiasm
for doing science experiments increased; and so
forth.
Utilization of Evaluation
Result
 Use of available resources -
organization of staff for learning, and
administrative and physical
conditions.
 Decision areas of the teacher -
identifying objectives and selecting
the teaching-learning process.
 Communication - done properly with
the stakeholders.
Utilization of Evaluation
Result
 The results show whether or not the
information needs of intended users
are met. If not, the further
recommendations made by the
evaluator can be acted upon.
 The feedback obtained can be used
to revise and improve instruction, or
to decide whether or not to adopt the
programme before full implementation.
 Development of the overall programme or
curriculum.
Editor's Notes

  • #27  Slide 16 Objective: Provide an overview of the standards for program evaluation. Speaker: Good evaluations are diligently conducted under the guidance of the standards for program evaluation provided by the Joint Committee on Standards for Educational Evaluation. There are four standards for program evaluation: Utility, Feasibility, Propriety, and Accuracy. These standards guide your decision-making process at each step of the framework to ensure that the evaluation stays focused and balanced. In the next couple of slides, we will go over each standard in depth.