Prescriptive Evaluation Model
Kirkpatrick Evaluation Model
Schuman: Experimental Evaluation Model
Stufflebeam: CIPP
The prescriptive evaluation model involves providing recommendations, or prescriptions, for improvement based on the findings of an evaluation. It goes beyond simply identifying strengths and weaknesses to offering specific actions or strategies for enhancing the effectiveness, efficiency, or impact of a program, intervention, or process.
In prescriptive evaluation, evaluators not only
assess the current state of the program but
also offer guidance on how to address any
identified issues or capitalize on strengths.
This may involve suggesting changes to
program design, implementation methods,
resource allocation, or monitoring and
evaluation strategies.
Prescriptive evaluation typically follows a structured process that includes:
Analysis of Findings:
Reviewing the results of the evaluation
to identify key findings, trends, and
areas for improvement.
Identification of Needs:
Determining the specific needs or areas
where changes or interventions are
required to enhance program
effectiveness.
Development of Recommendations:
Formulating actionable recommendations
based on the evaluation findings and needs
analysis. These recommendations are
tailored to address identified weaknesses
or capitalize on strengths.
Consultation and Collaboration:
Engaging stakeholders, program staff,
and other relevant parties in the
development of recommendations to
ensure buy-in and feasibility.
Prioritization and Implementation
Planning:
Prioritizing recommendations based on
their potential impact and feasibility,
and developing plans for implementing
them effectively.
Monitoring and Follow-Up:
Tracking the implementation of
recommendations over time and assessing
their impact on program outcomes.
Adjustments may be made as needed based
on ongoing monitoring and feedback.
The prescriptive evaluation model is valuable for organizations and program managers seeking actionable insights to improve the performance and outcomes of their initiatives. It helps ensure that evaluation findings lead to meaningful changes and enhancements that contribute to the overall success of the program or intervention.
Kirkpatrick's Four-Level Training Evaluation Model
“Capacity Building Interventions: How do we know what difference we are
making?”
Capacity Building via Training
Participant Training is:
The transfer of knowledge, skills, or attitudes (KSAs), as well as ideas and
sector context, through structured learning and follow-up activities to
solve job performance problems or fill identified performance gaps.
ADS Chapter 253
Evaluating Training
Evaluating training is
recommended as a
best practice, and it
aligns with USAID’s
policy on evidence-
based decision-making.
ADS Chapter 253
If you deliver training, then you probably know how important it is to measure its effectiveness. After all, you don't want to spend time or money on training that doesn't provide a good return.
Kirkpatrick's Four-Level Training Evaluation Model
The four levels are: Reaction, Learning, Behavior, and Results. Each level is important and has an impact on the next level. As you move from one level to the next, the evaluation becomes more rigorous and time-consuming, but it also provides more valuable information about the training's effectiveness.
Level 1: Reaction
This level measures how your trainees (the people being trained) reacted to the training. Obviously, you want them to feel that the training was a valuable experience, and you want them to feel good about the instructor, the topic, the material, its presentation, and the venue.
Why?
• Gives us valuable feedback that helps us to evaluate the program.
• Tells trainees that the trainers are there to help them do their job better and that
they need feedback to determine how effective they are.
• Provides trainers with quantitative information that can be used to establish
standards of performance for future programs.
How?
• Satisfaction Survey
Kirkpatrick's Four-Level Training Evaluation Model
Level 1: Reaction measures:
• CUSTOMER SATISFACTION: measures participants' satisfaction with the training.
• “Taking this program was worth my time.”
• ENGAGEMENT: measures the involvement and contribution of participants.
• “My learning was enhanced by the facilitator.”
• RELEVANCE: measures participants' opportunity to apply what they learned in training on the job.
• “What I learned in this class will help me on the job.”
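As an illustration only (not from the original deck), here is a minimal Python sketch of how such rating-scale responses might be summarized; the item names and data are hypothetical:

```python
# Minimal sketch: summarizing Level 1 (Reaction) survey results.
# Assumes each response is a dict of 5-point ratings (1 = strongly disagree,
# 5 = strongly agree) for hypothetical items: satisfaction, engagement, relevance.
from statistics import mean

responses = [
    {"satisfaction": 5, "engagement": 4, "relevance": 5},
    {"satisfaction": 4, "engagement": 4, "relevance": 3},
    {"satisfaction": 3, "engagement": 5, "relevance": 4},
]

def summarize(responses):
    """Return the average rating per item across all participants."""
    items = responses[0].keys()
    return {item: round(mean(r[item] for r in responses), 2) for item in items}

print(summarize(responses))  # average rating per survey item
```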
Level 2: Learning
At level 2, you measure what your trainees have learned. How much has their
knowledge increased as a result of the training?
When?
• After the training has been conducted.
How?
• Evaluate both before and after the training program.
• Before training commences, test trainees to determine their knowledge, skills, and attitudes.
• After training is completed, test trainees a second time to determine whether there is any improvement.
• By comparing both results, you can determine whether learning was successful (a minimal sketch follows below).
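A minimal sketch of that before-and-after comparison (illustration only; the scores are hypothetical):

```python
# Minimal sketch: Level 2 (Learning) pre/post comparison.
# Hypothetical pre- and post-test scores (percent correct) for the same
# trainees, listed in the same order.
pre_scores = [55, 60, 48, 70, 65]
post_scores = [72, 78, 66, 85, 80]

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
average_gain = sum(gains) / len(gains)

print(f"Individual gains: {gains}")
print(f"Average gain: {average_gain:.1f} points")
# A consistently positive gain suggests that learning took place; a formal
# test (e.g. a paired t-test) could confirm the improvement is not due to chance.
```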
Level 2: Learning measures:
• Knowledge (“I know it.”): measured primarily with formative exercises during the session or a quiz near the end.
• Skills (“I can do it right now.”): measured with activities and demonstrations during the session that show that participants can perform the skill.
• Attitude (“I believe this will be worthwhile to do on the job.”): measured with rating-scale questions.
• Confidence (“I think I can do it on the job.”): measured with rating-scale questions.
• Commitment (“I intend to do it on the job.”): measured with rating-scale questions.
Level 3: Behavior
At this level, you evaluate how far your trainees have changed their behavior, based on
the training they received. Specifically, this looks at how trainees apply the information.
How?
• Use a control group if practical (see the sketch below).
• Evaluate both before and after the program.
• Survey and/or interview one or more of the following:
• Trainees,
• Immediate supervisors,
• Others who often observe their behavior.
• Repeat the evaluation at appropriate times.
• Consider costs versus benefits.
Allow time for behavior change to take place.
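For illustration only, a minimal sketch of the control-group comparison suggested above, using hypothetical supervisor ratings of on-the-job behavior collected before and several months after the training:

```python
# Minimal sketch: Level 3 (Behavior) comparison against a control group.
# Hypothetical supervisor ratings (1-5) of on-the-job behavior, before and
# after the training period, for a trained group and an untrained control group.
def average_change(before, after):
    """Average change in behavior ratings for one group."""
    return sum(a - b for b, a in zip(before, after)) / len(before)

trained_before, trained_after = [2.8, 3.0, 2.5, 3.2], [3.9, 4.1, 3.4, 4.0]
control_before, control_after = [2.9, 3.1, 2.6, 3.0], [3.0, 3.2, 2.7, 3.1]

trained_change = average_change(trained_before, trained_after)
control_change = average_change(control_before, control_after)

print(f"Trained group change: {trained_change:.2f}")
print(f"Control group change: {control_change:.2f}")
# The gap between the two changes is a rough estimate of the behavior
# change attributable to the training itself.
print(f"Estimated training effect: {trained_change - control_change:.2f}")
```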
Level 3: Behavior
Example interview questions:
• Did the trainees put any of their learning to use?
• Are trainees able to teach their new knowledge, skills, or attitudes to other people?
• Are trainees aware that they've changed their behavior?
Kirkpatrick's Four-Level Training Evaluation Model
Level 4: Results
At this level, you analyze the final results of your training. This includes outcomes that
you or your organization have determined to be good for business, good for the
employees, or good for the bottom line.
When?
If your programs aim at tangible results rather
than teaching management concepts, theories,
and principles, then it is desirable to evaluate in
terms of results.
How?
• Search for evidence.
Level 4: Results
Example interview questions:
• What results have you seen since attending this training?
• Please give an example of the success you have achieved since attending this training.
Schuman: Experimental Evaluation Model
The Schuman Experimental Evaluation Model, developed by William Schuman, is a framework used in social science research to assess the effectiveness of interventions or programs.
This model typically involves several stages:
Design:
This phase involves planning the intervention or
program and designing the evaluation process.
Researchers need to define clear objectives,
identify the target population, and select
appropriate methodologies for data collection
and analysis.
Implementation:
During this stage, the intervention or program
is put into action according to the design plan.
It's essential to follow the implementation plan
closely to ensure consistency and fidelity to
the intended intervention.
Data Collection:
Researchers gather data on various aspects
of the intervention, such as its impact on
participants, changes in behavior or attitudes,
and any other relevant outcomes. This often
involves using a combination of qualitative
and quantitative methods, such as surveys,
interviews, observations, or experiments.
Analysis:
In this phase, researchers analyze the
collected data to assess the effectiveness
of the intervention. Statistical techniques
are commonly used to determine whether
any observed changes are statistically
significant and to identify patterns or
trends in the data.
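For illustration, a minimal sketch of such a significance test, assuming hypothetical outcome scores for an intervention group and a comparison group (and that SciPy is available):

```python
# Minimal sketch: testing whether an observed difference between groups is
# statistically significant, as described for the Analysis stage.
# The outcome scores below are hypothetical.
from scipy import stats

intervention = [68, 74, 71, 80, 77, 73, 79, 75]
comparison = [65, 66, 70, 64, 69, 67, 71, 66]

t_stat, p_value = stats.ttest_ind(intervention, comparison)

print(f"t = {t_stat:.2f}, p = {p_value:.4f}")
if p_value < 0.05:
    print("The observed difference is statistically significant at the 5% level.")
else:
    print("The observed difference is not statistically significant at the 5% level.")
```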
Interpretation:
Researchers interpret the findings of the
evaluation, considering the implications
for theory, practice, and policy. They may
also assess the strengths and limitations
of the intervention and offer
recommendations for future
improvements or research.
Reporting:
Finally, the results of the evaluation are
communicated to stakeholders, such as
policymakers, practitioners, and the
general public. Clear and transparent
reporting is crucial to ensure that the
findings are understood and can inform
decision-making effectively.
The Schuman Experimental
Evaluation Model provides a
systematic approach to evaluating
interventions or programs, helping
researchers to generate reliable
evidence about their effectiveness
and impact.
Stufflebeam: CIPP
The CIPP Model, developed by Daniel
Stufflebeam, is a comprehensive
framework used for evaluating
programs or interventions. The
acronym stands for Context, Input,
Process, and Product.
The CIPP components include:
Context Evaluation:
This involves understanding the
environment in which the program operates.
It examines factors such as the needs of the
target population, the resources available,
and any external influences that may affect
the program.
Input Evaluation:
Input evaluation focuses on the resources
invested in the program, including personnel,
funding, materials, and technology. It aims to
assess whether these resources are
adequate and appropriate for achieving the
program's objectives.
Process Evaluation:
Process evaluation looks at how the program
is implemented. It examines the activities,
procedures, and interactions involved in
delivering the program to determine whether
they are being carried out as planned and
whether they are effective in achieving the
desired outcomes.
Product Evaluation:
Product evaluation assesses the
outcomes or results of the program. This
includes both intended and unintended
outcomes, as well as the overall impact
of the program on the target population
or the broader community.
By addressing these four components,
the CIPP Model provides a
comprehensive framework for evaluating
programs at various stages of
development and implementation,
helping stakeholders make informed
decisions about program improvement
and future planning.
Proverbs 1:5
“Let the wise hear and increase in learning, and the one who understands obtain guidance”