Kirkpatrick
The Four Levels
Reaction
Learning
Behavior
Results
All about Kirkpatrick
In 1959, Kirkpatrick wrote four articles describing
the four levels for evaluating training programs.
He was working on his dissertation for a Ph.D.
when he came up with the idea of defining
evaluation.
According to Kirkpatrick, evaluation seems to
have multiple meanings for training and
development professionals. Some think
evaluation is a change in behavior; others, the
determination of the final results.
All about Kirkpatrick
(continued)
Kirkpatrick says they are all right, and
yet all wrong. All four levels are
important in understanding the basic
concepts in training. There are
exceptions, however.
Kirkpatrick: Evaluating
Training Programs
“What is quality training?”
“How do you measure it?”
“How do you improve it?”
Evaluating
“The reason for evaluating is to
determine the effectiveness of a
training program.” (Kirkpatrick,
1994, pg. 3)
The Ten Factors of Developing
a Training Program
1. Determine needs
2. Set objectives
3. Determine subject content
4. Select qualified participants
5. Determine the best schedule
The Ten Factors of Developing
a Training Program
6. Select appropriate facilities
7. Select qualified instructors
8. Select and prepare audiovisual
aids
9. Coordinate the program
10. Evaluate the program
Reasons for Evaluating
Kirkpatrick gives three reasons why
training needs to be evaluated:
1.“To justify the existence of the
training department by showing how
it contributes to the organizations’
objectives and goals.”
Reasons for Evaluating
2. “To decide whether to continue or
discontinue training programs.”
3. “To gain information on how to
improve future training programs.”
(Kirkpatrick, 1994, pg. 18)
The Four Levels
Reaction
Learning
Behavior
Results
“The Four Levels represent a
sequence of ways to evaluate
(training) programs….As you move
from one level to the next, the
process becomes more difficult and
time-consuming, but it also provides
more valuable information.”
(Kirkpatrick, 1994, pg. 21)
Reaction:
is a measure of how the participants
react to the training program.
is “a measure of customer
satisfaction.” (Kirkpatrick, 1994,
pg. 21)
Learning:
is the change in the participants’
attitudes, the increase in their
knowledge, or the improvement in
their skills that results from
participating in the program.
Learning
Measuring learning in any training
program means answering at least one
of these questions:
Did the attitudes change positively?
Is the knowledge acquired related and
helpful to the task?
Is the skill acquired related and helpful to
the task?
Behavior
Level 3 attempts to evaluate how
much transfer of knowledge, skills,
and attitude occurs after the
training.
The four conditions Kirkpatrick
identifies for changes to occur:
Desire to change
Knowledge of what to do and
how to do it
Work in the right climate
Reward for (positive) change
When all conditions are met,
the employee must:
Realize an opportunity to use the
behavioral changes.
Make the decision to use the
behavioral changes.
Decide whether or not to continue
using the behavioral changes.
When evaluating change in
behavior, decide:
When to evaluate
How often to evaluate
How to evaluate
Guidelines for evaluating
behavior:
Use a control group
Allow time for change to occur
Evaluate before and after
Survey/interview observers
Get 100% response or sampling
Repeat evaluation, as appropriate
Consider cost versus benefits
Results
Level 4, determining the final results
after training, is the most important
and most difficult of all.
Evaluation Questions:
Increased production?
Improved quality?
Decreased costs?
Improved safety numbers?
Increased sales?
Reduced turnover?
Higher profits?
Guidelines for evaluating
results:
Use a control group.
Allow time for results to be achieved.
Measure before and after the program.
Repeat the measurements, as needed.
Consider cost versus benefits.
Be satisfied with evidence if proof is not
possible.
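Taken together, the control-group and before/after guidelines amount to a difference-in-differences comparison: subtract the control group's change from the trained group's change to estimate the effect of the training itself. A minimal sketch, using invented numbers (not from Kirkpatrick):

```python
# Hypothetical before/after metrics (e.g., units produced per week)
# for a trained group and an untrained control group.
# The numbers are illustrative only.
trained_before, trained_after = [52, 48, 50, 55], [60, 58, 57, 63]
control_before, control_after = [51, 49, 53, 50], [53, 50, 54, 52]

def mean(xs):
    return sum(xs) / len(xs)

# Change within each group...
trained_gain = mean(trained_after) - mean(trained_before)
control_gain = mean(control_after) - mean(control_before)

# ...and the difference between those changes: the part of the
# improvement plausibly attributable to the training itself.
training_effect = trained_gain - control_gain
print(f"trained gain:    {trained_gain:+.2f}")
print(f"control gain:    {control_gain:+.2f}")
print(f"training effect: {training_effect:+.2f}")
```

If the control group improves nearly as much as the trained group, the training effect is small, no matter how large the raw before/after gain looks.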
Case Study #1
INTEL CORPORATION
Intel’s Compromise
of the Kirkpatrick Model
Intel uses the four-level model as
an analysis instrument to determine
initial training needs and the design
of its training program, as well as
for evaluation.
Intel’s Compromise
of the Kirkpatrick Model
What is unique about Intel’s use of
the model is that the designers of
the training program worked backwards
through the analysis, starting with
Level Four.
The Model
This implementation of the
Kirkpatrick Model stands as vivid
testimony to the versatility of the
model, both as a training tool and
as an aid in developing fledgling
training programs.
The Model
It also reflects the open-mindedness
of the senior executives at Intel in
making such extensive use of the
model and of Kirkpatrick’s insights.
How Intel applies the analysis
to their training program
Level Four. “Determine the
organizations’ structure and future
needs.”
Level Three. Change the
environmental conditions and
employee conditions to improve
business indicators.
How Intel applies the analysis
to their training program
Level Two. “Design a training
program that would ensure a transfer
of deficient skills and knowledge.”
Level One. Use a questionnaire,
matched to participants’ skill level,
that would instruct and inspire them.
How Intel applies evaluation
to their training program
Level One - Questionnaire.
Level Two - Demonstrate competency,
create action plans through group
simulations.
Level Three - Follow-up to determine if
action plans were met (specific steps to
implement concepts of what was learned).
Level Four - Ongoing process of tracking
business indicators.
Case Study #2
ST. LUKE’S HOSPITAL
St. Luke’s is unique -
Evaluation of an outdoor-based
training program, not a classroom one.
Results analyzed statistically to
determine the significance of any
change.
Evaluation led to recommendations
for future programs.
The New Questionnaire
Used before attendance in the program.
Used 3 months after completion of the
program.
Used again 6 months after completion of
the program.
(Communication showed statistically significant
improvement, and Group Effectiveness showed
statistically significant change.)
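A before/after questionnaire like St. Luke’s is typically tested for statistical significance with a paired t-test on each scale. A standard-library-only sketch with invented scores (the hospital’s actual data is not reproduced here; 2.365 is the two-tailed 5% critical value for 7 degrees of freedom):

```python
import statistics

# Hypothetical pre-program and 3-month follow-up scores for one
# questionnaire scale (say, Communication), one pair per participant.
pre  = [3.1, 2.8, 3.5, 3.0, 2.6, 3.2, 2.9, 3.4]
post = [3.6, 3.1, 3.9, 3.5, 2.9, 3.8, 3.3, 3.7]

# A paired t-test works on the per-person differences.
diffs = [b - a for a, b in zip(pre, post)]
n = len(diffs)
mean_d = statistics.mean(diffs)    # average improvement
sd_d = statistics.stdev(diffs)     # sample standard deviation
t_stat = mean_d / (sd_d / n ** 0.5)

# With n - 1 = 7 degrees of freedom, |t| > 2.365 means p < 0.05.
significant = abs(t_stat) > 2.365
print(f"mean improvement = {mean_d:.3f}, t = {t_stat:.2f}, "
      f"significant = {significant}")
```

The same test is simply repeated at each follow-up point (3 months, 6 months) and for each scale, which is how a result like “Communication improved significantly” is reached.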
Kirkpatrick’s 4 Levels of
Evaluation are:
Level 1 - Reaction: how participants reacted
to the program.
Level 2 - Learning: what participants
learned from the program.
Level 3 - Behavior: whether what was
learned is being applied on the job.
Level 4 - Results: whether that application
is achieving results.
Post-test Questions
(1) Name three ways evaluation results can
be measured.
(2) Do all 4 Levels have to be used?
(3) Do they have to be used in 1,2,3,4
order?
(4) Is Kirkpatrick’s method of evaluation
summative or formative?
(5) Which developmental “view” does
Kirkpatrick use? (discrepancy,
“IF YOU THINK TRAINING IS
EXPENSIVE, TRY IGNORANCE.”
and, remember, the definition of
ignorance is
repeating the same behavior, over and
over, and expecting different results!

More Related Content

What's hot

Simplistic approach of krik patrick
Simplistic approach of krik patrickSimplistic approach of krik patrick
Simplistic approach of krik patrick
rhimycrajan
 
Kirkpatrick4 levels
Kirkpatrick4 levelsKirkpatrick4 levels
Kirkpatrick4 levels
Nicole
 
Kirkpatrick's Four-Level Training Evaluation Model
Kirkpatrick's Four-Level Training Evaluation ModelKirkpatrick's Four-Level Training Evaluation Model
Kirkpatrick's Four-Level Training Evaluation Model
Maram Barqawi
 
Levels 1-4 Evaluation
Levels 1-4 EvaluationLevels 1-4 Evaluation
Levels 1-4 Evaluation
Dawn Drake, Ph.D.
 
Kirkpatricks Levels Presentation
Kirkpatricks Levels PresentationKirkpatricks Levels Presentation
Kirkpatricks Levels Presentation
Larry Weas
 
Kirkpatrick's Levels of Training Evaluation - Training and Development
Kirkpatrick's Levels of Training Evaluation - Training and DevelopmentKirkpatrick's Levels of Training Evaluation - Training and Development
Kirkpatrick's Levels of Training Evaluation - Training and Development
Manu Melwin Joy
 
Interrogating evaluation 2015 inductionb
Interrogating evaluation 2015 inductionbInterrogating evaluation 2015 inductionb
Interrogating evaluation 2015 inductionb
Rita Ndagire Kizito
 
Kirkpatrick training module_2016
Kirkpatrick training module_2016Kirkpatrick training module_2016
Kirkpatrick training module_2016
Darshna P. Choudhury
 
The Benefits Of Utilizing Kirkpatrick’S Four Levels Of
The Benefits Of Utilizing Kirkpatrick’S Four Levels OfThe Benefits Of Utilizing Kirkpatrick’S Four Levels Of
The Benefits Of Utilizing Kirkpatrick’S Four Levels Of
wendystein
 
Kirkpatrick's model
Kirkpatrick's modelKirkpatrick's model
Kirkpatrick's model
John Samuel Thomas
 
Kirkpatrick's model
Kirkpatrick's modelKirkpatrick's model
Kirkpatrick's model
cindyyew
 
Kirkpatrick model
Kirkpatrick modelKirkpatrick model
Kirkpatrick model
SanK6
 
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder TulsianiFour Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Ravinder Tulsiani
 
KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)
KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)
KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)
vina serevina
 
Learner Experience (model for training evaluation)
Learner Experience (model for training evaluation)Learner Experience (model for training evaluation)
Learner Experience (model for training evaluation)
Shahla Khan
 
Training evaluation
Training evaluationTraining evaluation
Training evaluation
Eyad Al-Samman
 
4 Quadrant Approach
4 Quadrant Approach4 Quadrant Approach
4 Quadrant Approach
Prateek Malik
 
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...
Lambda Solutions
 
Kaufman’s five levels of evaluation of trainning
Kaufman’s five levels of evaluation of trainningKaufman’s five levels of evaluation of trainning
Kaufman’s five levels of evaluation of trainning
Saranya Dhanesh Kumar
 
Module 8: Assessment and Evaluation
Module 8: Assessment and EvaluationModule 8: Assessment and Evaluation
Module 8: Assessment and Evaluation
Cardet1
 

What's hot (20)

Simplistic approach of krik patrick
Simplistic approach of krik patrickSimplistic approach of krik patrick
Simplistic approach of krik patrick
 
Kirkpatrick4 levels
Kirkpatrick4 levelsKirkpatrick4 levels
Kirkpatrick4 levels
 
Kirkpatrick's Four-Level Training Evaluation Model
Kirkpatrick's Four-Level Training Evaluation ModelKirkpatrick's Four-Level Training Evaluation Model
Kirkpatrick's Four-Level Training Evaluation Model
 
Levels 1-4 Evaluation
Levels 1-4 EvaluationLevels 1-4 Evaluation
Levels 1-4 Evaluation
 
Kirkpatricks Levels Presentation
Kirkpatricks Levels PresentationKirkpatricks Levels Presentation
Kirkpatricks Levels Presentation
 
Kirkpatrick's Levels of Training Evaluation - Training and Development
Kirkpatrick's Levels of Training Evaluation - Training and DevelopmentKirkpatrick's Levels of Training Evaluation - Training and Development
Kirkpatrick's Levels of Training Evaluation - Training and Development
 
Interrogating evaluation 2015 inductionb
Interrogating evaluation 2015 inductionbInterrogating evaluation 2015 inductionb
Interrogating evaluation 2015 inductionb
 
Kirkpatrick training module_2016
Kirkpatrick training module_2016Kirkpatrick training module_2016
Kirkpatrick training module_2016
 
The Benefits Of Utilizing Kirkpatrick’S Four Levels Of
The Benefits Of Utilizing Kirkpatrick’S Four Levels OfThe Benefits Of Utilizing Kirkpatrick’S Four Levels Of
The Benefits Of Utilizing Kirkpatrick’S Four Levels Of
 
Kirkpatrick's model
Kirkpatrick's modelKirkpatrick's model
Kirkpatrick's model
 
Kirkpatrick's model
Kirkpatrick's modelKirkpatrick's model
Kirkpatrick's model
 
Kirkpatrick model
Kirkpatrick modelKirkpatrick model
Kirkpatrick model
 
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder TulsianiFour Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
 
KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)
KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)
KIRKPATRICK MODEL OF EVALUATION (LEO CHANDRA)
 
Learner Experience (model for training evaluation)
Learner Experience (model for training evaluation)Learner Experience (model for training evaluation)
Learner Experience (model for training evaluation)
 
Training evaluation
Training evaluationTraining evaluation
Training evaluation
 
4 Quadrant Approach
4 Quadrant Approach4 Quadrant Approach
4 Quadrant Approach
 
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu...
 
Kaufman’s five levels of evaluation of trainning
Kaufman’s five levels of evaluation of trainningKaufman’s five levels of evaluation of trainning
Kaufman’s five levels of evaluation of trainning
 
Module 8: Assessment and Evaluation
Module 8: Assessment and EvaluationModule 8: Assessment and Evaluation
Module 8: Assessment and Evaluation
 

Similar to Kirkpatric

Krickpatrick basic level of evaluation
Krickpatrick basic level of evaluationKrickpatrick basic level of evaluation
Krickpatrick basic level of evaluation
Sajan Ks
 
Training evaluation models
Training evaluation modelsTraining evaluation models
Training evaluation models
Megha Anilkumar
 
Review of literature
Review of  literatureReview of  literature
Review of literature
dhanarajnaik
 
Kirkpatricks Foul Levels Evaluation.pptx
Kirkpatricks Foul Levels Evaluation.pptxKirkpatricks Foul Levels Evaluation.pptx
Kirkpatricks Foul Levels Evaluation.pptx
JohnnyGGalla
 
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptxPRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
JOHNNYGALLA2
 
AkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdj
AkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdjAkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdj
AkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdj
RagaviS16
 
Chapter 11 Training Evaluation.ppt
Chapter 11 Training Evaluation.pptChapter 11 Training Evaluation.ppt
Chapter 11 Training Evaluation.ppt
Dr. Nazrul Islam
 
Measuring Learning Impact
Measuring Learning ImpactMeasuring Learning Impact
Measuring Learning Impact
Iskandar Noor
 
MED07_joycepagkatipunan.pdf
MED07_joycepagkatipunan.pdfMED07_joycepagkatipunan.pdf
MED07_joycepagkatipunan.pdf
JOYCEPAGKATIPUNAN
 
Training Evaluation
Training EvaluationTraining Evaluation
Training Evaluation
Preeti Bhaskar
 
G.training evaluation by jyoti k
G.training evaluation by jyoti kG.training evaluation by jyoti k
G.training evaluation by jyoti k
jyoti karvande
 
Training evaluation
Training evaluationTraining evaluation
Training evaluation
Nancy Raj
 
Evaluation models
Evaluation modelsEvaluation models
Evaluation models
Maarriyyaa
 
Unit 5- training evalutaion pptx
Unit 5- training evalutaion  pptxUnit 5- training evalutaion  pptx
Unit 5- training evalutaion pptx
Manoj Kumar
 
Training Evaluation Model.pptx
Training Evaluation Model.pptxTraining Evaluation Model.pptx
Training Evaluation Model.pptx
HitkarshSethi2
 
Evaluation models by dr.shazia zamir by
Evaluation models by dr.shazia zamir by Evaluation models by dr.shazia zamir by
Evaluation models by dr.shazia zamir by
Dr.Shazia Zamir
 
Kirkspatrick model
Kirkspatrick modelKirkspatrick model
Kirkspatrick model
Jellyfab February
 
evaluation of program.pptx
evaluation of program.pptxevaluation of program.pptx
evaluation of program.pptx
MahwishBukhari3
 
Level1trainingevaluation
Level1trainingevaluationLevel1trainingevaluation
Level1trainingevaluation
Sandy Clare
 
OBJEKTIF PPT MODEL.pptx
OBJEKTIF PPT MODEL.pptxOBJEKTIF PPT MODEL.pptx
OBJEKTIF PPT MODEL.pptx
SitiHafidah1
 

Similar to Kirkpatric (20)

Krickpatrick basic level of evaluation
Krickpatrick basic level of evaluationKrickpatrick basic level of evaluation
Krickpatrick basic level of evaluation
 
Training evaluation models
Training evaluation modelsTraining evaluation models
Training evaluation models
 
Review of literature
Review of  literatureReview of  literature
Review of literature
 
Kirkpatricks Foul Levels Evaluation.pptx
Kirkpatricks Foul Levels Evaluation.pptxKirkpatricks Foul Levels Evaluation.pptx
Kirkpatricks Foul Levels Evaluation.pptx
 
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptxPRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
 
AkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdj
AkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdjAkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdj
AkdjcijdjcBusiness card 8 Dec 2023.pdfhjdhbcjkewhcjkejckljecjdj
 
Chapter 11 Training Evaluation.ppt
Chapter 11 Training Evaluation.pptChapter 11 Training Evaluation.ppt
Chapter 11 Training Evaluation.ppt
 
Measuring Learning Impact
Measuring Learning ImpactMeasuring Learning Impact
Measuring Learning Impact
 
MED07_joycepagkatipunan.pdf
MED07_joycepagkatipunan.pdfMED07_joycepagkatipunan.pdf
MED07_joycepagkatipunan.pdf
 
Training Evaluation
Training EvaluationTraining Evaluation
Training Evaluation
 
G.training evaluation by jyoti k
G.training evaluation by jyoti kG.training evaluation by jyoti k
G.training evaluation by jyoti k
 
Training evaluation
Training evaluationTraining evaluation
Training evaluation
 
Evaluation models
Evaluation modelsEvaluation models
Evaluation models
 
Unit 5- training evalutaion pptx
Unit 5- training evalutaion  pptxUnit 5- training evalutaion  pptx
Unit 5- training evalutaion pptx
 
Training Evaluation Model.pptx
Training Evaluation Model.pptxTraining Evaluation Model.pptx
Training Evaluation Model.pptx
 
Evaluation models by dr.shazia zamir by
Evaluation models by dr.shazia zamir by Evaluation models by dr.shazia zamir by
Evaluation models by dr.shazia zamir by
 
Kirkspatrick model
Kirkspatrick modelKirkspatrick model
Kirkspatrick model
 
evaluation of program.pptx
evaluation of program.pptxevaluation of program.pptx
evaluation of program.pptx
 
Level1trainingevaluation
Level1trainingevaluationLevel1trainingevaluation
Level1trainingevaluation
 
OBJEKTIF PPT MODEL.pptx
OBJEKTIF PPT MODEL.pptxOBJEKTIF PPT MODEL.pptx
OBJEKTIF PPT MODEL.pptx
 

Recently uploaded

5th LF Energy Power Grid Model Meet-up Slides
5th LF Energy Power Grid Model Meet-up Slides5th LF Energy Power Grid Model Meet-up Slides
5th LF Energy Power Grid Model Meet-up Slides
DanBrown980551
 
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAUHCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
panagenda
 
Finale of the Year: Apply for Next One!
Finale of the Year: Apply for Next One!Finale of the Year: Apply for Next One!
Finale of the Year: Apply for Next One!
GDSC PJATK
 
AWS Cloud Cost Optimization Presentation.pptx
AWS Cloud Cost Optimization Presentation.pptxAWS Cloud Cost Optimization Presentation.pptx
AWS Cloud Cost Optimization Presentation.pptx
HarisZaheer8
 
Ocean lotus Threat actors project by John Sitima 2024 (1).pptx
Ocean lotus Threat actors project by John Sitima 2024 (1).pptxOcean lotus Threat actors project by John Sitima 2024 (1).pptx
Ocean lotus Threat actors project by John Sitima 2024 (1).pptx
SitimaJohn
 
Deep Dive: Getting Funded with Jason Jason Lemkin Founder & CEO @ SaaStr
Deep Dive: Getting Funded with Jason Jason Lemkin Founder & CEO @ SaaStrDeep Dive: Getting Funded with Jason Jason Lemkin Founder & CEO @ SaaStr
Deep Dive: Getting Funded with Jason Jason Lemkin Founder & CEO @ SaaStr
saastr
 
Nunit vs XUnit vs MSTest Differences Between These Unit Testing Frameworks.pdf
Nunit vs XUnit vs MSTest Differences Between These Unit Testing Frameworks.pdfNunit vs XUnit vs MSTest Differences Between These Unit Testing Frameworks.pdf
Nunit vs XUnit vs MSTest Differences Between These Unit Testing Frameworks.pdf
flufftailshop
 
Operating System Used by Users in day-to-day life.pptx
Operating System Used by Users in day-to-day life.pptxOperating System Used by Users in day-to-day life.pptx
Operating System Used by Users in day-to-day life.pptx
Pravash Chandra Das
 
Choosing The Best AWS Service For Your Website + API.pptx
Choosing The Best AWS Service For Your Website + API.pptxChoosing The Best AWS Service For Your Website + API.pptx
Choosing The Best AWS Service For Your Website + API.pptx
Brandon Minnick, MBA
 
Energy Efficient Video Encoding for Cloud and Edge Computing Instances
Energy Efficient Video Encoding for Cloud and Edge Computing InstancesEnergy Efficient Video Encoding for Cloud and Edge Computing Instances
Energy Efficient Video Encoding for Cloud and Edge Computing Instances
Alpen-Adria-Universität
 
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdfHow to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf
Chart Kalyan
 
Letter and Document Automation for Bonterra Impact Management (fka Social Sol...
Letter and Document Automation for Bonterra Impact Management (fka Social Sol...Letter and Document Automation for Bonterra Impact Management (fka Social Sol...
Letter and Document Automation for Bonterra Impact Management (fka Social Sol...
Jeffrey Haguewood
 
UI5 Controls simplified - UI5con2024 presentation
UI5 Controls simplified - UI5con2024 presentationUI5 Controls simplified - UI5con2024 presentation
UI5 Controls simplified - UI5con2024 presentation
Wouter Lemaire
 
HCL Notes and Domino License Cost Reduction in the World of DLAU
HCL Notes and Domino License Cost Reduction in the World of DLAUHCL Notes and Domino License Cost Reduction in the World of DLAU
HCL Notes and Domino License Cost Reduction in the World of DLAU
panagenda
 
Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...
Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...
Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...
saastr
 
Introduction of Cybersecurity with OSS at Code Europe 2024
Introduction of Cybersecurity with OSS  at Code Europe 2024Introduction of Cybersecurity with OSS  at Code Europe 2024
Introduction of Cybersecurity with OSS at Code Europe 2024
Hiroshi SHIBATA
 
Artificial Intelligence for XMLDevelopment
Artificial Intelligence for XMLDevelopmentArtificial Intelligence for XMLDevelopment
Artificial Intelligence for XMLDevelopment
Octavian Nadolu
 
System Design Case Study: Building a Scalable E-Commerce Platform - Hiike
System Design Case Study: Building a Scalable E-Commerce Platform - HiikeSystem Design Case Study: Building a Scalable E-Commerce Platform - Hiike
System Design Case Study: Building a Scalable E-Commerce Platform - Hiike
Hiike
 
GraphRAG for Life Science to increase LLM accuracy
GraphRAG for Life Science to increase LLM accuracyGraphRAG for Life Science to increase LLM accuracy
GraphRAG for Life Science to increase LLM accuracy
Tomaz Bratanic
 
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfUnlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
Malak Abu Hammad
 

Recently uploaded (20)

5th LF Energy Power Grid Model Meet-up Slides
5th LF Energy Power Grid Model Meet-up Slides5th LF Energy Power Grid Model Meet-up Slides
5th LF Energy Power Grid Model Meet-up Slides
 
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAUHCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAU
 
Finale of the Year: Apply for Next One!
Finale of the Year: Apply for Next One!Finale of the Year: Apply for Next One!
Finale of the Year: Apply for Next One!
 
AWS Cloud Cost Optimization Presentation.pptx
AWS Cloud Cost Optimization Presentation.pptxAWS Cloud Cost Optimization Presentation.pptx
AWS Cloud Cost Optimization Presentation.pptx
 
Ocean lotus Threat actors project by John Sitima 2024 (1).pptx
Ocean lotus Threat actors project by John Sitima 2024 (1).pptxOcean lotus Threat actors project by John Sitima 2024 (1).pptx
Ocean lotus Threat actors project by John Sitima 2024 (1).pptx
 
Deep Dive: Getting Funded with Jason Jason Lemkin Founder & CEO @ SaaStr
Deep Dive: Getting Funded with Jason Jason Lemkin Founder & CEO @ SaaStrDeep Dive: Getting Funded with Jason Jason Lemkin Founder & CEO @ SaaStr
Deep Dive: Getting Funded with Jason Jason Lemkin Founder & CEO @ SaaStr
 
Nunit vs XUnit vs MSTest Differences Between These Unit Testing Frameworks.pdf
Nunit vs XUnit vs MSTest Differences Between These Unit Testing Frameworks.pdfNunit vs XUnit vs MSTest Differences Between These Unit Testing Frameworks.pdf
Nunit vs XUnit vs MSTest Differences Between These Unit Testing Frameworks.pdf
 
Operating System Used by Users in day-to-day life.pptx
Operating System Used by Users in day-to-day life.pptxOperating System Used by Users in day-to-day life.pptx
Operating System Used by Users in day-to-day life.pptx
 
Choosing The Best AWS Service For Your Website + API.pptx
Choosing The Best AWS Service For Your Website + API.pptxChoosing The Best AWS Service For Your Website + API.pptx
Choosing The Best AWS Service For Your Website + API.pptx
 
Energy Efficient Video Encoding for Cloud and Edge Computing Instances
Energy Efficient Video Encoding for Cloud and Edge Computing InstancesEnergy Efficient Video Encoding for Cloud and Edge Computing Instances
Energy Efficient Video Encoding for Cloud and Edge Computing Instances
 
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdfHow to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf
How to Interpret Trends in the Kalyan Rajdhani Mix Chart.pdf
 
Letter and Document Automation for Bonterra Impact Management (fka Social Sol...
Letter and Document Automation for Bonterra Impact Management (fka Social Sol...Letter and Document Automation for Bonterra Impact Management (fka Social Sol...
Letter and Document Automation for Bonterra Impact Management (fka Social Sol...
 
UI5 Controls simplified - UI5con2024 presentation
UI5 Controls simplified - UI5con2024 presentationUI5 Controls simplified - UI5con2024 presentation
UI5 Controls simplified - UI5con2024 presentation
 
HCL Notes and Domino License Cost Reduction in the World of DLAU
HCL Notes and Domino License Cost Reduction in the World of DLAUHCL Notes and Domino License Cost Reduction in the World of DLAU
HCL Notes and Domino License Cost Reduction in the World of DLAU
 
Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...
Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...
Overcoming the PLG Trap: Lessons from Canva's Head of Sales & Head of EMEA Da...
 
Introduction of Cybersecurity with OSS at Code Europe 2024
Introduction of Cybersecurity with OSS  at Code Europe 2024Introduction of Cybersecurity with OSS  at Code Europe 2024
Introduction of Cybersecurity with OSS at Code Europe 2024
 
Artificial Intelligence for XMLDevelopment
Artificial Intelligence for XMLDevelopmentArtificial Intelligence for XMLDevelopment
Artificial Intelligence for XMLDevelopment
 
System Design Case Study: Building a Scalable E-Commerce Platform - Hiike
System Design Case Study: Building a Scalable E-Commerce Platform - HiikeSystem Design Case Study: Building a Scalable E-Commerce Platform - Hiike
System Design Case Study: Building a Scalable E-Commerce Platform - Hiike
 
GraphRAG for Life Science to increase LLM accuracy
GraphRAG for Life Science to increase LLM accuracyGraphRAG for Life Science to increase LLM accuracy
GraphRAG for Life Science to increase LLM accuracy
 
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdfUnlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
Unlock the Future of Search with MongoDB Atlas_ Vector Search Unleashed.pdf
 

Kirkpatric

  • 3. All about Kirkpatrick In 1959, Kirkpatrick wrote four articles describing the four levels for evaluating training programs. He was working on his dissertation for a Ph.D. when he came up with the idea of defining evaluation. Evaluation, as according to Kirkpatrick, seems to have multiple meanings to training and developmental professionals. Some think evaluation is a change in behavior, or the determination of the final results.
  • 4. All about Kirkpatrick (continued) Kirkpatrick says they are all right, and yet all wrong. All four levels are important in understanding the basic concepts in training. There are exceptions, however.
  • 5. Kirkpatrick: Evaluating Training Programs “What is quality training?” “How do you measure it?” “How do you improve it?”
  • 6. Evaluating “The reason for evaluating is to determine the effectiveness of a training program.” (Kirkpatrick, 1994, pg. 3)
  • 7. The Ten Factors of Developing a Training Program 1. Determine needs 2. Set objectives 3. Determine subject content 4. Select qualified applicants 5. Determine the best schedule
  • 8. The Ten Factors of Developing a Training Program 6. Select appropriate facilities 7. Select qualified instructors 8. Select and prepare audiovisual aids 9. Co-ordinate the program 10. Evaluate the program
  • 9. Reasons for Evaluating Kirkpatrick gives three reasons ‘why’ there is a need to evaluate training: 1.“To justify the existence of the training department by showing how it contributes to the organizations’ objectives and goals.”
  • 10. Reasons for Evaluating 2. “To decide whether to continue or discontinue training programs.” 3. “To gain information on how to improve future training programs.” (Kirkpatrick, 1994, pg. 18)
  • 12. “The Four Levels represent a sequence of ways to evaluate (training) programs….As you move from one level to the next, the process becomes more difficult and time-consuming, but it also provides more valuable information.” (Kirkpatrick, 1994, pg. 21)
  • 13. Reaction: is the measuring of the reaction of the participants in the training program. is “a measure of customer satisfaction.” (Kirkpatrick, 1994, pg. 21)
  • 14. Learning: is the change in the participants’ attitudes, or an increase in knowledge, or greater skills received, as a result of the participation of the program.
  • 15. Learning The measuring of learning in any training program is the determination of at least one of these measuring parameters: Did the attitudes change positively? Is the knowledge acquired related and helpful to the task? Is the skill acquired related and helpful to the task?
  • 16. Behavior Level 3 attempts to evaluate how much transfer of knowledge, skills, and attitude occurs after the training.
  • 17. The four conditions Kirkpatrick identifies for changes to occur: Desire to change Knowledge of what to do and how to do it Work in the right climate Reward for (positive) change
  • 18. When all conditions are met, the employee must: Realize an opportunity to use the behavioral changes. Make the decision to use the behavioral changes. Decide whether or not to continue using the behavioral changes.
  • 19. When evaluating change in behavior, decide: When to evaluate How often to evaluate How to evaluate
  • 20. Guidelines for evaluating behavior: Use a control group Allow time for change to occur Evaluate before and after Survey/interview observers Get 100% response or sampling Repeat evaluation, as appropriate Consider cost versus benefits
  • 21. Results Level 4 is the most important and difficult of all - determining final results after training.
  • 22. Evaluation Questions: Increased production? Improved quality? Decreased costs? Improved safety numbers? Increased sales? Reduced turnover? Higher profits?
  • 23. Guidelines for evaluating results: Use a control group. Allow time for results to be achieved. Measure before and after the program. Repeat the measurements, as needed. Consider cost versus benefits. Be satisfied with evidence if proof is not possible.
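The control-group and before/after guidelines above boil down to simple arithmetic: subtract the control group’s change from the trained group’s change to estimate the effect attributable to training. A minimal Python sketch, using hypothetical units-per-shift production figures (not from the source):

```python
from statistics import mean

def training_effect(trained_before, trained_after, control_before, control_after):
    """Change in the trained group minus change in the control group.
    The control group absorbs improvement that would have happened anyway
    (seasonality, new equipment), isolating the effect of the training."""
    trained_change = mean(trained_after) - mean(trained_before)
    control_change = mean(control_after) - mean(control_before)
    return trained_change - control_change

# Hypothetical measurements for four trained and four control employees:
effect = training_effect(
    trained_before=[40, 42, 38, 41], trained_after=[47, 49, 45, 48],
    control_before=[39, 41, 40, 42], control_after=[41, 43, 42, 44],
)
print(effect)  # trained group gained 7.0, controls gained 2.0, so effect = 5.0
```

Whether those extra units per shift justify the program is then a straight cost-versus-benefit comparison, as the slide advises.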
  • 24. Case Study #1 INTEL CORPORATION
  • 25. Intel’s Compromise of the Kirkpatrick Model Intel uses the four-level model not only for evaluation but also as an analysis instrument to determine the initial training needs and the design of its training program.
  • 26. Intel’s Compromise of the Kirkpatrick Model What makes Intel’s use of the model unique is that the designers of the training program worked backwards through the analysis, starting with Level Four.
  • 27. The Model This implementation of the Kirkpatrick Model stands as vivid testimony to the versatility of the model, both as a training tool and as an aid in developing fledgling training programs.
  • 28. The Model It also reflects the open-mindedness of Intel’s senior executives in adapting the model and building on Kirkpatrick’s vision.
  • 29. How Intel applies the analysis to their training program Level Four: “Determine the organization’s structure and future needs.” Level Three: Change the environmental and employee conditions to improve business indicators.
  • 30. How Intel applies the analysis to their training program Level Two: “Design a training program that would ensure a transfer of deficient skills and knowledge.” Level One: Use a questionnaire, matched to participants’ skill levels, to instruct and inspire them.
  • 31. How Intel applies evaluation to their training program Level One - Questionnaire. Level Two - Demonstrate competency, create action plans through group simulations. Level Three - Follow-up to determine if action plans were met (specific steps to implement concepts of what was learned). Level Four - Ongoing process of tracking business indicators.
  • 32. Case Study #2 ST. LUKE’S HOSPITAL
  • 33. St. Luke’s is unique - It evaluated an outdoor-based training program, not a classroom one. Results were analyzed statistically to determine the significance of any change. The evaluation led to recommendations for future programs.
  • 34. The New Questionnaire Used before attendance in the program. Used 3 months after completion of the program. Used again 6 months after completion of the program. (Communication showed statistically significant improvement, and Group Effectiveness showed statistically significant change.)
  • 35. Kirkpatrick’s 4 Levels of Evaluation are: Level 1 - Reaction: how participants reacted to the program. Level 2 - Learning: what participants learned from the program. Level 3 - Behavior: whether what was learned is being applied on the job. Level 4 - Results: whether that application is achieving results.
  • 36. Post-test Questions (1) Name three ways evaluation results can be measured. (2) Do all 4 Levels have to be used? (3) Do they have to be used in 1,2,3,4 order? (4) Is Kirkpatrick’s method of evaluation summative or formative? (5) Which developmental “view” does Kirkpatrick use? (discrepancy,
  • 37. “IF YOU THINK TRAINING IS EXPENSIVE, TRY IGNORANCE.” And remember the classic definition of insanity: repeating the same behavior over and over and expecting different results!

Editor's Notes

  1. Kirkpatrick developed his Four-Level Model to clarify the meaning and process of ‘evaluation’ in a training program. If there is no change in behavior, but there is a change in skills, knowledge, or attitudes, then using only part of the model (not all levels) is acceptable. If the purpose of the training program is to change behavior, then all four levels apply. Other authors on the evaluation of training programs have proposed various strategies, but Kirkpatrick is credited with developing the Four-Level Model. Kirkpatrick focuses the model on executives and middle management; however, it works well in most other training areas.
  2. These are questions HRD coordinators ask about training performance, the starting criteria, and the expectations for the resulting training program. Business training operations need quantitative as well as qualitative measures; a happy medium between the two is the ideal position from which to understand training needs and carry out development. Quantitative - the research methodology in which the investigator’s “values, interpretations, feelings, and musings have no place in the positivist’s view of the scientific inquiry.” (Borg and Gall, 1989)
  3. The end results of an evaluation are, hopefully, positive for both upper management and the program coordinators.
  4. 1. Ask participants, bosses, or others who are familiar with the needs or objectives, or use testing. Some examples are surveys or interviews. 2. a. What results are you trying to achieve? b. What behaviors do you want the participants to have at the end of the training program? c. What knowledge, skills, and/or attitudes do you want your pupils to demonstrate at the end of the training program? 3. Determine subject content to meet needs and objectives. 4. Four decisions: a. Who is best suited to receive the training? b. Are the training programs required by law (affirmative action)? c. Voluntary or required? d. Should hourly and salaried employees be included in the same class or be segregated? 5. A solid week or intermittent days? How often should breaks be taken? Should lunch be brought in, or should participants be allowed to leave for an hour?
  5. 6. Facilities should be comfortable, convenient, and appropriate. 7. a. In-house or outside contractors? b. Do instructors need to be ‘tailored’ to the special needs of the training program? 8. Two purposes: a. Maintain interest. b. Help communicate ideas and transfer skills. Both purposes can be accomplished by using single, special-interest video cassettes or some type of packaged program. 9. Two scenarios: a. Frustration, and b. Needs of the instructor. 10. The effectiveness of a training program is determined by its planning and implementation.
  6. 1. If and when downsizing occurs, this statement will have more meaning than ever for some unlucky people. Upper management often regards HRD departments as overhead, not as contributing directly to production.
  7. 2. Pilot courses may be implemented to see if the participants have the necessary knowledge, or skills, or behavioral changes to make the program work. 3. Kirkpatrick uses eight factors on how to improve the effectiveness of a training program. These eight factors closely follow the Ten Factors of Developing a Training Program. This is a feedback statement spinning off of the Ten Factors.
  8. All of these levels are important. However, in later examples of this model, you will see that large corporations have taken the Kirkpatrick Model and used all of it or only part of it, and some have even reversed the order of the levels.
  9. The reactions of the participants must be positive for the program to survive, grow, and improve. Reactions reach back to bosses and subordinates alike; this word-of-mouth reaction can make or break the program. Here ‘customer’ refers to the participants in the training program.
  10. A training program must accomplish at least one of these three learning traits in order to be effective for a participant. The best-case scenario is an improvement in all three. However, according to Kirkpatrick, one learning trait is all it takes for a training program to be effective.
  11. Guidelines for measuring Learning: 1. Use a control group along with an experimental group to provide a comparison, 2. Have a pre-test and a post-test, then measure the difference, 3. Try to get an honest 100% response to any interviews, surveys, or tests. 4. A test that measures participant learning is an effective evaluation for participant and instructor alike; however, it is not conclusive on its own - other factors may be involved. Results must be measured across the spectrum of the Ten Factors of Development.
  12. Level 3 asks the question “What changes in behavior occurred because people attended the training?” This Level is a more difficult evaluation than Levels 1 and 2.
  13. The employee must want to make the change. The training must provide the what and the how. The employee must return to a work environment that allows and/or encourages the change. There should be rewards - Intrinsic - inner feelings of pride and achievement. Extrinsic - such as pay increases or praise.
  14. The employee may - Like the new behavior and continue using it. Not like the new behavior and return to doing things the “old way”. Like the change, but be restrained by outside forces that prevent his continuing to use it.
  15. With Reaction and Learning, evaluation should be immediate. But evaluating change in Behavior involves some decision-making.
  16. Use a control group only if applicable. Be aware that this task can be very difficult and maybe even impossible. Allow time for behavioral changes. This could be immediate, as in the case of diversity training, or it can take longer, such as using training for administration of performance appraisals. For some programs 2-3 months is appropriate. For others, 6 months is more realistic. Evaluate before and after, if time and budgets allow. Conduct interviews and surveys. Decide who is qualified for questioning, and, of those qualified, whose answers would be most reliable, who is available, and, of the choices, should any not be used. Attempt to get 100% response. Repeat the evaluation. Not all employees will make the changes at the same time. Consider cost vs. benefit. This cost can be internal staff time or an outside expert hired to do the evaluation. The greater the possible benefits, the greater the number of dollars that can be justified. If the program will be repeated, the evaluation can be used for future program improvements.
  17. Many of these questions do not get answered. Why? First, trainers don’t know how to measure results in comparison to the cost of the training. Second, the results may not be clear proof that the training caused them, unless there is a direct relationship between the training and the results (e.g., sales training and resulting sales dollars).
  18. Use a control group, again, if applicable, to prove the training caused the change. Allow time for results, different for different programs, different for each individual. Measure before and after. This is easier than measuring behavior because figures are usually available - hard data, such as production numbers or absenteeism. Repeat the measurement. You must decide when and how often to evaluate. Consider cost vs. benefit. Here, the amount of money spent on evaluation should be determined by - cost of training, potential results to be achieved, and how often the training will be repeated. And last, be happy with evidence of training success, because you may not get proof!
  19. This case study used Level 1, Reaction, and Level 3, Behavior: St. Luke’s needed to improve efficiency and cost control and was looking for ways to improve management training. Outdoor-based programs have been effective in improving interdepartmental communications, increasing employee trust, and reducing boundaries between departments, thereby empowering employees. How many of you have taken part in such a program? There is an entire course of “rope and ladder” activities in the woods, some at ground level and some at higher elevations. The goal of these activities is to build trust and encourage openness and sharing.
  20. St. Luke’s program consisted of three 1-day sessions on such a course. Phase I was directed at getting acquainted: in the morning “low rope” activities and in the afternoon, “high rope” elements. Phase II was focused on building trust within the group with harder, more challenging activities. Phase III focused on individual development and increased group support. The group traveled together and had team slogans and T-shirts. Previous participants were given a questionnaire to describe what they had personally gotten from the program and how it had changed their behavior. The results were used to design a new questionnaire for future participants.
  21. Evaluation of this program showed that some of the goals were achieved and were long-lasting. Also, it showed that participants had a positive reaction to the program, which can be linked to results on the job.
  22. (1) For ways results can be measured, refer to Slide 21. (2) All four Levels do not have to be used. Case study on St. Luke’s Hospital only used Levels 1 and 3. (3) The Levels do not have to be used in 1,2,3,4 order. Intel started 4,3,2,1 in designing their program. (4) What is your opinion on Kirkpatrick’s method being summative or formative? Is it a combination? (5) Which developmental view do you think Kirkpatrick uses? Defend your opinion.
  23. FOOD FOR THOUGHT!