Evaluating Training Programs
The Four Levels
Presented by:
M.Pavithra Sai
Mani
Prathiksha.L
1
AMITY GLOBAL BUSINESS SCHOOL, CHENNAI
Objectives
▪ Upon completion of this presentation you will be able to:
▪ State why evaluation of programs is critical to you and your organization.
▪ Apply Kirkpatrick’s four levels of evaluation to your programs.
▪ Use guidelines for developing evaluations.
▪ Implement various forms and approaches to evaluation.
2
Why Evaluate?
➢ Determine the effectiveness of the program design:
▪ How the program was received by the participants
▪ How learners fared on assessment of their learning
➢ Determine what instructional strategies work:
▪ presentation mode
▪ presentation methods
▪ learning activities
▪ desired level of learning
➢ Program improvement
3
Why Evaluate?
➢ Should the program be continued?
➢ How do you justify your existence?
➢ How do you determine the return on investment for the program?
▪ human capital
▪ individual competence
▪ social/economic benefit
4
Four Levels of Evaluation (Kirkpatrick)
▪ During-program evaluation:
▪ Level One: Reaction
▪ Level Two: Learning
▪ Post-program evaluation:
▪ Level Three: Behavior
▪ Level Four: Results
5
Reaction Level
▪ A customer-satisfaction measure
▪ Were the participants pleased with the program?
▪ Perception of whether they learned anything
▪ Likelihood of applying the content
▪ Effectiveness of particular strategies
▪ Effectiveness of the packaging of the course
6
Examples of Level One
▪ Your Opinion, Please: In a word, how would you describe this workshop?
▪ Intent: Solicit feedback about the course. Can also assess whether respondents transposed the numeric scales.
7
Example of Level One
▪ Using a number, how would you describe this program? (circle a number)
Terrible      Average      Outstanding
   1      2      3      4      5
▪ Intent: Provides quantitative feedback to determine average responses (descriptive data). Watch scale sets!
8
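Once forms are collected, the circled numbers can be turned into the descriptive data the slide mentions. A minimal sketch in Python; the ratings below are hypothetical:

```python
from statistics import mean
from collections import Counter

def summarize_reactions(responses):
    """Summarize 1-5 reaction ratings: count, average, and distribution."""
    return {
        "n": len(responses),
        "mean": round(mean(responses), 2),
        "distribution": dict(sorted(Counter(responses).items())),
    }

ratings = [4, 5, 3, 4, 4, 2, 5]  # hypothetical circled numbers from one workshop
print(summarize_reactions(ratings))
```

The distribution matters as much as the mean: a mix of 1s and 5s averages to the same number as a stack of 3s but tells a very different story.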
Example of Level One
▪ How much did you know about this subject before taking this workshop?
Nothing      Some      A lot
   1      2      3      4      5
▪ How much do you know about this subject after participating in this workshop?
Nothing      Some      A lot
   1      2      3      4      5
▪ Intent: The question does not assess actual learning; it assesses perceived learning.
9
Example of Level One
▪ How likely are you to use some or all of the skills taught in this workshop in your work/community/family?
Not Likely      Likely      Very Likely
   1      2      3      4      5
▪ Intent: Determine learners' perceived relevance of the material. May correlate with the satisfaction learners feel.
10
Example of Level One
▪ The best part of this program was…
▪ The one thing that could be improved most…
▪ Intent: Qualitative feedback on the course that helps prioritize work in a revision. Develop themes on exercises, pace of the course, etc.
11
Guidelines for Evaluating Reaction
➢ Decide what you want to find out.
➢ Design a form that will quantify reactions.
➢ Encourage written comments.
➢ Get 100% immediate response.
➢ Get honest responses.
➢ If desirable, get delayed reactions.
➢ Determine acceptable standards.
➢ Measure future reactions against the standard.
12
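The last two guidelines (set an acceptable standard, then measure future reactions against it) can be sketched as a simple check. The 4.0 cutoff and the sample ratings are made up for illustration:

```python
def meets_standard(responses, standard=4.0):
    """Return the average reaction score and whether it meets the standard."""
    avg = sum(responses) / len(responses)
    return avg, avg >= standard

avg, ok = meets_standard([5, 4, 4, 3, 5], standard=4.0)
print(f"average {avg:.2f}, meets standard: {ok}")
```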
Learning Level
▪ What did the participants learn in the program?
▪ The extent to which participants change attitudes, increase knowledge, and/or increase skill.
▪ What exactly did the participant learn and not learn?
▪ Pretest / Posttest
13
Learning Level
▪ Requires developing specific learning objectives to be evaluated.
▪ Learning measures should be objective and quantifiable.
▪ Paper-and-pencil tests, performance on skills tests, simulations, role-plays, case studies, etc.
14
Level Two Examples
▪ Develop a written exam based on the desired learning objectives.
▪ Use the exam as a pretest.
▪ Provide participants with a worksheet/activity sheet that allows for “tracking” during the session.
▪ Emphasize and repeat key learning points during the session.
▪ Use the pretest exam as a posttest exam.
▪ Compute the posttest-pretest gain on the exam.
15
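The final step above, computing the posttest-pretest gain, might look like this; the scores are hypothetical:

```python
def learning_gain(pretest, posttest):
    """Per-participant posttest-pretest gain, plus the class average gain."""
    gains = [post - pre for pre, post in zip(pretest, posttest)]
    return gains, sum(gains) / len(gains)

pre = [40, 55, 60, 35]    # hypothetical exam scores out of 100, before the session
post = [75, 80, 85, 70]   # the same participants, after the session
gains, avg_gain = learning_gain(pre, post)
print(gains, avg_gain)
```

Keeping the per-participant gains (not just the average) lets the instructor spot learners who did not improve.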
What Makes a Good Test?
▪ The only valid test questions emerge from the objectives.
▪ Consider writing main objectives and supporting objectives.
▪ Test questions usually test supporting objectives.
▪ Ask more than one question on each objective.
16
Level Two Strategies
▪ Consider using scenarios, case studies, sample project evaluations, etc., rather than test questions. Develop a rubric of desired responses.
▪ Develop between 3 and 10 questions or scenarios for each main objective.
17
Level Two Strategies
▪ Provide instructor feedback during the learning activities.
▪ Requires the instructor to actively monitor participants' discussion, practice activities, and engagement. Provide learners feedback.
▪ Ask participants open-ended questions (congruent with the learning objectives) during activities to test participant understanding.
18
Example
Which of the following should be considered when evaluating at the Reaction Level? (more than one answer possible)
___ Evaluate only the lesson content
___ Obtain both subjective and objective responses
___ Get 100% response from participants
___ Honest responses are important
___ Only the course instructor should review results
19
Example
▪ Match the following to the choices below:
___ Reaction Level
___ Learning Level
A. Changes in performance at work
B. Participant satisfaction
C. Organizational improvement
D. What the participant learned in class
20
Scenario Example
▪ An instructor would like to know the effectiveness of the course design and how much a participant has learned in a seminar. The instructor would like to achieve at least Level Two evaluation.
▪ What techniques could the instructor use to achieve Level Two evaluation?
▪ Should the instructor also consider doing a Level One evaluation? Why or why not?
21
Rubric for Scenario Question
Directions to instructor: Use the following topic checklist to determine the completeness of the participant's response:
___ Learner demonstrated an accurate understanding of what Level Two is: the learning level.
___ Learner provided at least two specific examples: pretest-posttest, performance rubrics, scenarios, case studies, hands-on practice.
___ Learner demonstrated an accurate understanding of what Level One evaluation is: the reaction level.
___ Learner provided at least three specific examples of why Level One is valuable: assess satisfaction, learning activities, course packaging, learning strategies, likelihood of applying learning.
22
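If the instructor wants to record checklist results across many responses, a tiny sketch suffices; the item labels below are shortened paraphrases of the rubric, not part of the original:

```python
def score_rubric(checks):
    """Count how many checklist items a response satisfied."""
    met = sum(checks.values())  # True counts as 1, False as 0
    return met, len(checks)

response_checks = {
    "understands level two (learning)": True,
    "gave two level-two examples": True,
    "understands level one (reaction)": True,
    "gave three reasons level one is valuable": False,
}
met, total = score_rubric(response_checks)
print(f"{met}/{total} checklist items met")
```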
Behavior Level
▪ How the training affects performance.
▪ The extent to which change in behavior occurred.
▪ Was the learning transferred from the classroom to the real world?
▪ Transfer – Transfer – Transfer
23
Conditions Necessary to Change
▪ The person must:
▪ have a desire to change.
▪ know what to do and how to do it.
▪ work in the right climate.
▪ be rewarded for changing.
24
Types of Climates
▪ Preventing – forbidden to use the learning.
▪ Discouraging – changes in the current way of doing things are not desired.
▪ Neutral – learning is ignored.
▪ Encouraging – receptive to applying new learning.
▪ Requiring – change in behavior is mandated.
25
Guidelines for Evaluating Behavior
▪ Measure on a before/after basis.
▪ Allow time for behavior change (adaptation) to take place.
▪ Survey or interview one or more people who are in the best position to see change:
▪ The participant/learner
▪ The supervisor/mentor
▪ Subordinates or peers
▪ Others familiar with the participant's actions
26
Guidelines for Evaluating Behavior
▪ Get 100% response or a sample? Depends on the size of the group; the more the better.
▪ Repeat at appropriate times.
▪ Remember that other factors can influence behavior over time.
▪ Use a control group if practical.
▪ Consider cost vs. benefits of the evaluation.
27
Level Three Examples
▪ Observation
▪ Survey or interview (participant and/or others)
▪ Performance benchmarks (before and after; control group)
▪ Evidence or portfolio
28
Survey or Patterned Interview
1. Explain the purpose of the survey/interview.
2. Review program objectives and content.
3. Ask the program participant to what extent performance was improved as a result of the program: __ Large extent __ Some extent __ Not at all. If “Large extent” or “Some extent”, ask them to please explain.
4. If “Not at all”, indicate why not:
___ Program content wasn’t practical
___ No opportunity to use what I learned
___ My supervisor prevented or discouraged me from changing
___ Other higher priorities
___ Other reason (please explain)
5. Ask, “In the future, to what extent do you plan to change your behavior?” ___ Large extent ___ Some extent ___ Not at all. Ask them to please explain.
29
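Answers to step 3 can be tallied across interviewees to estimate what share reported any improvement at all. A sketch with made-up answers:

```python
from collections import Counter

def tally_extent(answers):
    """Tally 'Large extent' / 'Some extent' / 'Not at all' interview answers
    and return the counts plus the share reporting any improvement."""
    counts = Counter(answers)
    improved = counts["Large extent"] + counts["Some extent"]
    return counts, improved / len(answers)

answers = ["Large extent", "Some extent", "Not at all", "Some extent"]
counts, improved_rate = tally_extent(answers)
print(counts, improved_rate)
```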
Evidence and Portfolio
▪ Thank you for participating. I am very interested in how the evaluation skills you have learned are used in your work.
▪ Please send me a copy of at least one of the following:
▪ a Level Three evaluation that you have designed.
▪ a copy of Level Two evaluations that use more than one method of evaluating participant learning.
▪ a copy of a Level One evaluation that you have modified, and tell me how it influenced program improvement.
▪ (Indicate if you would like my critique of any of the evaluations.)
▪ If I do not hear from you before January 30, I will give you a call – no pressure – I'd just love to learn what you are doing.
30
Results Level
▪ The impact of education and training on the organization or community.
▪ The final results that occurred as a result of training.
▪ The ROI for training.
31
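The deck does not give an ROI formula; one common convention (an assumption here, not stated in the slides) is net program benefits divided by program costs, expressed as a percentage:

```python
def training_roi(benefits, costs):
    """ROI (%) = (program benefits - program costs) / program costs * 100."""
    return (benefits - costs) / costs * 100

# hypothetical figures: $60,000 in measured benefits against a $25,000 program cost
print(f"ROI: {training_roi(60_000, 25_000):.0f}%")
```

The hard part is not the arithmetic but isolating benefits attributable to the training, which is exactly what the control-group guidelines address.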
Examples of Level Four
➢ How did the training save costs?
➢ Did work output increase?
➢ Was there a change in the quality of work?
➢ Did the social condition improve?
➢ Did the individual create an impact on the community?
➢ Is there evidence that the organization or community has changed?
32
How Do You Know If Your Outcomes Are Good?
▪ Good training outcomes need to be:
▪ Relevant
▪ Reliable
▪ Discriminating
▪ Practical
33
Good Outcomes: Relevance
▪ Criterion relevance
▪ Criterion contamination
▪ Criterion deficiency
▪ Reliability
▪ Discrimination
▪ Practicality
34
Criterion Deficiency, Relevance, and Contamination
35
Evaluation Designs: Threats to Validity
▪ Ensuring internal validity means you can be more certain that your intervention or program did cause the observed effect and that the effect is not due to other causes. If there is a threat to external validity, you might be wrong in generalizing your findings.
36
Threats to Validity
1. Threats to internal validity:
▪ Company
▪ Persons
▪ Outcome measures
2. Threats to external validity:
▪ Reaction to pretest
▪ Reaction to evaluation
▪ Interaction of selection and training
▪ Interaction of methods
37
Methods to Control for Threats to Validity
▪ Pre- and post-tests
▪ Use of comparison groups
▪ Random assignment
38
Types of Evaluation Design
▪ Posttest-only
▪ Pretest/posttest
▪ Posttest-only with comparison group
▪ Pretest/posttest with comparison group
39
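For the strongest design above (pretest/posttest with a comparison group), the training effect is often estimated as the trained group's average gain minus the comparison group's average gain, so that maturation and other shared influences cancel out. A sketch with hypothetical scores:

```python
def avg(xs):
    return sum(xs) / len(xs)

def training_effect(train_pre, train_post, comp_pre, comp_post):
    """Difference-in-differences: the trained group's average gain
    minus the comparison group's average gain."""
    trained_gain = avg(train_post) - avg(train_pre)
    comparison_gain = avg(comp_post) - avg(comp_pre)
    return trained_gain - comparison_gain

effect = training_effect(
    [50, 55], [80, 85],   # trained group: pretest, posttest
    [52, 54], [60, 62],   # comparison group: pretest, posttest
)
print(effect)
```

Without the comparison group, the trained group's raw gain would overstate the effect by whatever improvement would have happened anyway.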
Factors that Influence the Type of Evaluation Design
40
Guidelines for Evaluating Results
▪ Measure before and after.
▪ Allow time for change to take place.
▪ Repeat at appropriate times.
▪ Use a control group if practical.
▪ Consider cost vs. benefits of doing Level Four.
▪ Remember, other factors can affect results.
▪ Be satisfied with evidence if proof is not possible.
41

More Related Content

What's hot

CHARACTERISTICS OF A GOOD INSTRUMENT
CHARACTERISTICS OF A GOOD INSTRUMENTCHARACTERISTICS OF A GOOD INSTRUMENT
CHARACTERISTICS OF A GOOD INSTRUMENTMusfera Nara Vadia
 
4. qualities of good measuring instrument
4. qualities of good measuring instrument4. qualities of good measuring instrument
4. qualities of good measuring instrumentJohn Paul Hablado
 
Assessment of Learning
Assessment of Learning Assessment of Learning
Assessment of Learning Zephie Andrada
 
Questionnaire design
Questionnaire designQuestionnaire design
Questionnaire designWai-Kwok Wong
 
Attitude Measurement Scales - Likert‘s Scale, Semantic Differential Scale, Th...
Attitude Measurement Scales - Likert‘s Scale, Semantic Differential Scale, Th...Attitude Measurement Scales - Likert‘s Scale, Semantic Differential Scale, Th...
Attitude Measurement Scales - Likert‘s Scale, Semantic Differential Scale, Th...Sundar B N
 
Curriculum evaluation
Curriculum evaluationCurriculum evaluation
Curriculum evaluationHennaAnsari
 
Norm referenced and Criterion Referenced Test
Norm referenced and Criterion Referenced TestNorm referenced and Criterion Referenced Test
Norm referenced and Criterion Referenced TestDrSindhuAlmas
 
Criterion vs norm referenced testing
Criterion vs norm referenced testingCriterion vs norm referenced testing
Criterion vs norm referenced testingSaidBaalla
 
Waugh norman65
Waugh norman65Waugh norman65
Waugh norman65Idol Chan
 
Overview of Instructional Analysis (Conduct Instructional Analysis)
Overview of Instructional Analysis (Conduct Instructional Analysis)Overview of Instructional Analysis (Conduct Instructional Analysis)
Overview of Instructional Analysis (Conduct Instructional Analysis)Malyn Singson
 
BASIC OF MEASUREMENT & EVALUATION
BASIC OF MEASUREMENT & EVALUATION BASIC OF MEASUREMENT & EVALUATION
BASIC OF MEASUREMENT & EVALUATION suresh kumar
 
Non probability sampling
Non  probability samplingNon  probability sampling
Non probability samplingcorayu13
 
Validity and reliability in assessment.
Validity and reliability in assessment. Validity and reliability in assessment.
Validity and reliability in assessment. Tarek Tawfik Amin
 

What's hot (20)

CHARACTERISTICS OF A GOOD INSTRUMENT
CHARACTERISTICS OF A GOOD INSTRUMENTCHARACTERISTICS OF A GOOD INSTRUMENT
CHARACTERISTICS OF A GOOD INSTRUMENT
 
4. qualities of good measuring instrument
4. qualities of good measuring instrument4. qualities of good measuring instrument
4. qualities of good measuring instrument
 
Conjoint Analysis
Conjoint AnalysisConjoint Analysis
Conjoint Analysis
 
Interpreting test results
Interpreting test resultsInterpreting test results
Interpreting test results
 
Assessment of Learning
Assessment of Learning Assessment of Learning
Assessment of Learning
 
Questionnaire design
Questionnaire designQuestionnaire design
Questionnaire design
 
Attitude Measurement Scales - Likert‘s Scale, Semantic Differential Scale, Th...
Attitude Measurement Scales - Likert‘s Scale, Semantic Differential Scale, Th...Attitude Measurement Scales - Likert‘s Scale, Semantic Differential Scale, Th...
Attitude Measurement Scales - Likert‘s Scale, Semantic Differential Scale, Th...
 
Constructing subjective test items
Constructing subjective test itemsConstructing subjective test items
Constructing subjective test items
 
Goal free model
Goal free modelGoal free model
Goal free model
 
Item analysis
Item analysisItem analysis
Item analysis
 
Reliability and validity
Reliability and validityReliability and validity
Reliability and validity
 
Curriculum evaluation
Curriculum evaluationCurriculum evaluation
Curriculum evaluation
 
Norm referenced and Criterion Referenced Test
Norm referenced and Criterion Referenced TestNorm referenced and Criterion Referenced Test
Norm referenced and Criterion Referenced Test
 
Criterion vs norm referenced testing
Criterion vs norm referenced testingCriterion vs norm referenced testing
Criterion vs norm referenced testing
 
Waugh norman65
Waugh norman65Waugh norman65
Waugh norman65
 
Overview of Instructional Analysis (Conduct Instructional Analysis)
Overview of Instructional Analysis (Conduct Instructional Analysis)Overview of Instructional Analysis (Conduct Instructional Analysis)
Overview of Instructional Analysis (Conduct Instructional Analysis)
 
BASIC OF MEASUREMENT & EVALUATION
BASIC OF MEASUREMENT & EVALUATION BASIC OF MEASUREMENT & EVALUATION
BASIC OF MEASUREMENT & EVALUATION
 
Item analysis ppt
Item analysis pptItem analysis ppt
Item analysis ppt
 
Non probability sampling
Non  probability samplingNon  probability sampling
Non probability sampling
 
Validity and reliability in assessment.
Validity and reliability in assessment. Validity and reliability in assessment.
Validity and reliability in assessment.
 

Similar to Evaluating the training programs PPT: The Four levels ( Kirkpatrick's Four-Level Training Evaluation Mode)

Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder TulsianiFour Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder TulsianiRavinder Tulsiani
 
Module 8: Assessment and Evaluation
Module 8: Assessment and EvaluationModule 8: Assessment and Evaluation
Module 8: Assessment and EvaluationCardet1
 
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptxPRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptxJOHNNYGALLA2
 
Assessment_ Know the Ropes, Learn the Ropes!
Assessment_ Know the Ropes, Learn the Ropes! Assessment_ Know the Ropes, Learn the Ropes!
Assessment_ Know the Ropes, Learn the Ropes! sathyakarunakaran5
 
Silvana Richardson: Weighing the Pig Doesn't Make it Fatter or Does it
Silvana Richardson: Weighing the Pig Doesn't Make it Fatter or Does itSilvana Richardson: Weighing the Pig Doesn't Make it Fatter or Does it
Silvana Richardson: Weighing the Pig Doesn't Make it Fatter or Does iteaquals
 
evaluation of program.pptx
evaluation of program.pptxevaluation of program.pptx
evaluation of program.pptxMahwishBukhari3
 
Training &n Development Studies & Evaluation
Training &n Development Studies & EvaluationTraining &n Development Studies & Evaluation
Training &n Development Studies & EvaluationElsaCherian1
 
Assessment cpd
Assessment cpdAssessment cpd
Assessment cpdMrsMcGinty
 
Kirkpatrick 4 level evaluation model
Kirkpatrick 4 level evaluation modelKirkpatrick 4 level evaluation model
Kirkpatrick 4 level evaluation modelzhumin
 
Quick, Cheap and Dirty Training Evaluation
Quick, Cheap and Dirty Training EvaluationQuick, Cheap and Dirty Training Evaluation
Quick, Cheap and Dirty Training EvaluationITCILO
 
Evaluation of training based on Kirkpatrick
Evaluation of training based on KirkpatrickEvaluation of training based on Kirkpatrick
Evaluation of training based on KirkpatrickAna Zulianingrum
 
EFFECTIVE COACHING AND TRAINING ESP.pptx
EFFECTIVE COACHING AND TRAINING ESP.pptxEFFECTIVE COACHING AND TRAINING ESP.pptx
EFFECTIVE COACHING AND TRAINING ESP.pptxoforijulius77
 
Appraisal and Performance Management in Schools - A practical approach
Appraisal and Performance Management in Schools - A practical approachAppraisal and Performance Management in Schools - A practical approach
Appraisal and Performance Management in Schools - A practical approachMark S. Steed
 
Marking staff meeting 12.5.15
Marking staff meeting 12.5.15Marking staff meeting 12.5.15
Marking staff meeting 12.5.15MrsMcGinty
 
Kirk patrick's simplistic approach
Kirk patrick's simplistic approachKirk patrick's simplistic approach
Kirk patrick's simplistic approachrhimycrajan
 
Tqf day 2 - assessment and feedback
Tqf   day 2 - assessment and feedbackTqf   day 2 - assessment and feedback
Tqf day 2 - assessment and feedbackRMIT
 

Similar to Evaluating the training programs PPT: The Four levels ( Kirkpatrick's Four-Level Training Evaluation Mode) (20)

Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder TulsianiFour Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
Four Levels Of Evaluation (Kirkpatrick Model) By Ravinder Tulsiani
 
Module 8: Assessment and Evaluation
Module 8: Assessment and EvaluationModule 8: Assessment and Evaluation
Module 8: Assessment and Evaluation
 
Training climate
Training climateTraining climate
Training climate
 
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptxPRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
PRESCRIPTIVE-EVALUATION-GALLA-JOHNNY-G..pptx
 
Assessment_ Know the Ropes, Learn the Ropes!
Assessment_ Know the Ropes, Learn the Ropes! Assessment_ Know the Ropes, Learn the Ropes!
Assessment_ Know the Ropes, Learn the Ropes!
 
Silvana Richardson: Weighing the Pig Doesn't Make it Fatter or Does it
Silvana Richardson: Weighing the Pig Doesn't Make it Fatter or Does itSilvana Richardson: Weighing the Pig Doesn't Make it Fatter or Does it
Silvana Richardson: Weighing the Pig Doesn't Make it Fatter or Does it
 
evaluation of program.pptx
evaluation of program.pptxevaluation of program.pptx
evaluation of program.pptx
 
Training &n Development Studies & Evaluation
Training &n Development Studies & EvaluationTraining &n Development Studies & Evaluation
Training &n Development Studies & Evaluation
 
Assessment cpd
Assessment cpdAssessment cpd
Assessment cpd
 
Kirkpatrick 4 level evaluation model
Kirkpatrick 4 level evaluation modelKirkpatrick 4 level evaluation model
Kirkpatrick 4 level evaluation model
 
Quick, Cheap and Dirty Training Evaluation
Quick, Cheap and Dirty Training EvaluationQuick, Cheap and Dirty Training Evaluation
Quick, Cheap and Dirty Training Evaluation
 
Evaluation of training based on Kirkpatrick
Evaluation of training based on KirkpatrickEvaluation of training based on Kirkpatrick
Evaluation of training based on Kirkpatrick
 
Evaluation
EvaluationEvaluation
Evaluation
 
Weac
WeacWeac
Weac
 
EFFECTIVE COACHING AND TRAINING ESP.pptx
EFFECTIVE COACHING AND TRAINING ESP.pptxEFFECTIVE COACHING AND TRAINING ESP.pptx
EFFECTIVE COACHING AND TRAINING ESP.pptx
 
Appraisal and Performance Management in Schools - A practical approach
Appraisal and Performance Management in Schools - A practical approachAppraisal and Performance Management in Schools - A practical approach
Appraisal and Performance Management in Schools - A practical approach
 
Marking staff meeting 12.5.15
Marking staff meeting 12.5.15Marking staff meeting 12.5.15
Marking staff meeting 12.5.15
 
Kirk patrick's simplistic approach
Kirk patrick's simplistic approachKirk patrick's simplistic approach
Kirk patrick's simplistic approach
 
Tqf day 2 - assessment and feedback
Tqf   day 2 - assessment and feedbackTqf   day 2 - assessment and feedback
Tqf day 2 - assessment and feedback
 
Af l (rp5)
Af l (rp5)Af l (rp5)
Af l (rp5)
 

Recently uploaded

Separation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesSeparation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesFatimaKhan178732
 
Class 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfClass 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfakmcokerachita
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxpboyjonauth
 
MENTAL STATUS EXAMINATION format.docx
MENTAL     STATUS EXAMINATION format.docxMENTAL     STATUS EXAMINATION format.docx
MENTAL STATUS EXAMINATION format.docxPoojaSen20
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13Steve Thomason
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfUmakantAnnand
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...Marc Dusseiller Dusjagr
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxSayali Powar
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Educationpboyjonauth
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfSumit Tiwari
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...EduSkills OECD
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionSafetyChain Software
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application ) Sakshi Ghasle
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdfssuser54595a
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting DataJhengPantaleon
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)eniolaolutunde
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfsanyamsingh5019
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxiammrhaywood
 

Recently uploaded (20)

Separation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesSeparation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and Actinides
 
Class 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdfClass 11 Legal Studies Ch-1 Concept of State .pdf
Class 11 Legal Studies Ch-1 Concept of State .pdf
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
MENTAL STATUS EXAMINATION format.docx
MENTAL     STATUS EXAMINATION format.docxMENTAL     STATUS EXAMINATION format.docx
MENTAL STATUS EXAMINATION format.docx
 
The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13The Most Excellent Way | 1 Corinthians 13
The Most Excellent Way | 1 Corinthians 13
 
Concept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.CompdfConcept of Vouching. B.Com(Hons) /B.Compdf
Concept of Vouching. B.Com(Hons) /B.Compdf
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
 
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptxPOINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
POINT- BIOCHEMISTRY SEM 2 ENZYMES UNIT 5.pptx
 
Introduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher EducationIntroduction to ArtificiaI Intelligence in Higher Education
Introduction to ArtificiaI Intelligence in Higher Education
 
9953330565 Low Rate Call Girls In Rohini Delhi NCR
9953330565 Low Rate Call Girls In Rohini  Delhi NCR9953330565 Low Rate Call Girls In Rohini  Delhi NCR
9953330565 Low Rate Call Girls In Rohini Delhi NCR
 
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdfEnzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
Enzyme, Pharmaceutical Aids, Miscellaneous Last Part of Chapter no 5th.pdf
 
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
Presentation by Andreas Schleicher Tackling the School Absenteeism Crisis 30 ...
 
Mastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory InspectionMastering the Unannounced Regulatory Inspection
Mastering the Unannounced Regulatory Inspection
 
Hybridoma Technology ( Production , Purification , and Application )
Hybridoma Technology  ( Production , Purification , and Application  ) Hybridoma Technology  ( Production , Purification , and Application  )
Hybridoma Technology ( Production , Purification , and Application )
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data_Math 4-Q4 Week 5.pptx Steps in Collecting Data
_Math 4-Q4 Week 5.pptx Steps in Collecting Data
 
Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)Software Engineering Methodologies (overview)
Software Engineering Methodologies (overview)
 
Sanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdfSanyam Choudhary Chemistry practical.pdf
Sanyam Choudhary Chemistry practical.pdf
 
Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1Código Creativo y Arte de Software | Unidad 1
Código Creativo y Arte de Software | Unidad 1
 
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptxSOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
SOCIAL AND HISTORICAL CONTEXT - LFTVD.pptx
 

Evaluating the training programs PPT: The Four levels ( Kirkpatrick's Four-Level Training Evaluation Mode)

  • 1. EvaluatingTrainingPrograms The FourLevels Presented by : M.Pavithra Sai Mani Prathiksha.L 1 AMITY GLOBAL BUSINESS SCHOOL, CHENNAI
  • 2. Objectives ▪Upon completion of this presentation you will be able to: ▪State why evaluation of programs is critical to you and your organization. ▪Apply Kirkpatrick’s four levels of evaluation to your programs. ▪Use guidelines for developing evaluations. ▪Implement various forms and approaches to evaluation 2
  • 3. Why Evaluate? ➢Determine the effectiveness of the program design How the program was received by the participants How learners fared on assessment of their learning Determine what instructional strategies work presentation mode presentation methods. learning activities desired level of learning ➢Program improvement 3
  • 4. Why Evaluate? ➢Should the program be continued? ➢How do you justify your existence? ➢How do you determine the return on investment for the program? human capital individual competence social/economic benefit 4
  • 5. FourLevels of Evaluation Kirkpatrick ▪ During program evaluation ▪ Level One Reaction ▪ Level Two Learning ▪ Post program evaluation ▪ Level Three Behavior ▪ Level Four Results 5
  • 6. ReactionLevel ▪ A customer satisfaction measure ▪ Were the participants pleased with the program ▪ Perception if they learned anything ▪ Likelihood of applying the content ▪ Effectiveness of particular strategies ▪ Effectiveness of the packaging of the course 6
  • 7. Examples of Level One ▪ Your Opinion, Please: In a word, how would you describe this workshop? ▪ Intent: Solicit feedback about the course. Can also assess whether respondents transposed the numeric scales.
  • 8. Example of Level One ▪ Using a number, how would you describe this program? (circle a number) 1 (Terrible) 2 3 (Average) 4 5 (Outstanding) ▪ Intent: Provides quantitative feedback to determine average responses (descriptive data). Watch scale sets!
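The "descriptive data" this slide mentions takes only a few lines to compute. A minimal sketch of averaging circled 1-5 responses; the response values here are invented for illustration:

```python
from statistics import mean
from collections import Counter

# Hypothetical Level One responses on the 1-5 scale above
# (1 = Terrible, 3 = Average, 5 = Outstanding).
responses = [4, 5, 3, 4, 2, 5, 4, 3, 5, 4]

average = mean(responses)          # average reaction score
distribution = Counter(responses)  # how many participants circled each number

print(f"Average rating: {average:.1f}")
print(f"Distribution: {dict(sorted(distribution.items()))}")
```

Reporting the distribution alongside the mean helps catch transposed scales: a cluster of 1s next to a cluster of 5s suggests some respondents read the scale backwards.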
  • 9. Example of Level One ▪ How much did you know about this subject before taking this workshop? 1 (Nothing) 2 3 (Some) 4 5 (A lot) ▪ How much do you know about this subject after participating in this workshop? 1 (Nothing) 2 3 (Some) 4 5 (A lot) ▪ Intent: The question does not assess actual learning; it assesses perceived learning.
  • 10. Example of Level One ▪ How likely are you to use some or all of the skills taught in this workshop in your work/community/family? 1 (Not Likely) 2 3 (Likely) 4 5 (Very Likely) ▪ Intent: Determine learners' perceived relevance of the material. May correlate with the satisfaction learners feel.
  • 11. Example of Level One ▪ The best part of this program was… ▪ The one thing that could be improved most is… ▪ Intent: Qualitative feedback on the course; helps prioritize work in a revision. Develop themes on exercises, pace of the course, etc.
  • 12. Guidelines for Evaluating Reaction ➢ Decide what you want to find out. ➢ Design a form that will quantify reactions. ➢ Encourage written comments. ➢ Get 100% immediate response. ➢ Get honest responses. ➢ If desirable, get delayed reactions. ➢ Determine acceptable standards. ➢ Measure future reactions against the standard.
  • 13. Learning Level ▪ What did the participants learn in the program? ▪ The extent to which participants change attitudes, increase knowledge, and/or increase skill. ▪ What exactly did the participant learn and not learn? ▪ Pretest / Posttest
  • 14. Learning Level ▪ Requires developing specific learning objectives to be evaluated. ▪ Learning measures should be objective and quantifiable. ▪ Paper-and-pencil tests, performance on skills tests, simulations, role-plays, case studies, etc.
  • 15. Level Two Examples ▪ Develop a written exam based on the desired learning objectives. ▪ Use the exam as a pretest. ▪ Provide participants with a worksheet/activity sheet that will allow for “tracking” during the session. ▪ Emphasize and repeat key learning points during the session. ▪ Use the pretest exam as a posttest exam. ▪ Compute the posttest-pretest gain on the exam.
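The final step above, computing the posttest-pretest gain, can be sketched in a few lines. All scores below are invented for illustration:

```python
# Hypothetical pretest and posttest scores (percent correct),
# keyed by participant identifier.
pretest = {"A": 40, "B": 55, "C": 60, "D": 35}
posttest = {"A": 75, "B": 80, "C": 85, "D": 70}

# Gain per participant, then the average gain across the group.
gains = {p: posttest[p] - pretest[p] for p in pretest}
average_gain = sum(gains.values()) / len(gains)

print(f"Per-participant gains: {gains}")
print(f"Average posttest-pretest gain: {average_gain:.1f} points")
```

Looking at per-participant gains, not just the group average, shows whether the program lifted everyone or only a few learners.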
  • 16. What makes a good test? ▪ The only valid test questions emerge from the objectives. ▪ Consider writing main objectives and supporting objectives. ▪ Test questions usually test supporting objectives. ▪ Ask more than one question on each objective. 16
  • 17. Level Two Strategies ▪ Consider using scenarios, case studies, sample project evaluations, etc., rather than test questions. Develop a rubric of desired responses. ▪ Develop between 3 and 10 questions or scenarios for each main objective.
  • 18. Level Two Strategies ▪ Provide instructor feedback during the learning activities. ▪ Requires the instructor to actively monitor participants' discussion, practice activities, and engagement, and to provide learners with feedback. ▪ Ask participants open-ended questions (congruent with the learning objectives) during activities to test participant understanding.
  • 19. Example Which of the following should be considered when evaluating at the Reaction Level? (more than one answer possible) ___Evaluate only the lesson content ___Obtain both subjective and objective responses ___Get 100% response from participants ___Honest responses are important ___Only the course instructor should review results. 19
  • 20. Example ▪ Match the following to the choices below: ___ Reaction Level ___ Learning Level A. Changes in performance at work B. Participant satisfaction C. Organizational improvement D. What the participant learned in class
  • 21. Scenario Example ▪ An instructor would like to know the effectiveness of the course design and how much a participant has learned in a seminar. The instructor would like to achieve at least Level Two evaluation. ▪ What techniques could the instructor use to achieve Level Two evaluation? ▪ Should the instructor also consider doing a Level One evaluation? Why or why not?
  • 22. Rubric for Scenario Question Directions to instructor: Use the following topic checklist to determine the completeness of the participant's response: ___ Learner demonstrated an accurate understanding of what Level Two is: learning level. ___ Learner provided at least two specific examples: pretest-posttest, performance rubrics, scenarios, case studies, hands-on practice. ___ Learner demonstrated an accurate understanding of what Level One evaluation is: reaction level. ___ The learner provided at least three specific examples of why Level One is valuable: assess satisfaction, learning activities, course packaging, learning strategies, likelihood of applying learning.
  • 23. Behavior Level ▪ How the training affects performance. ▪ The extent to which change in behavior occurred. ▪ Was the learning transferred from the classroom to the real world? ▪ Transfer - Transfer - Transfer
  • 24. Conditions Necessary to Change ▪ The person must: ▪ have a desire to change. ▪ know what to do and how to do it. ▪ work in the right climate. ▪ be rewarded for changing.
  • 25. Types of Climates ▪ Preventing – forbidden to use the learning. ▪ Discouraging – changes to the current way of doing things are not desired. ▪ Neutral – learning is ignored. ▪ Encouraging – receptive to applying new learning. ▪ Requiring – change in behavior is mandated.
  • 26. Guidelines for Evaluating Behavior ▪ Measure on a before/after basis. ▪ Allow time for behavior change (adaptation) to take place. ▪ Survey or interview one or more people who are in the best position to see change: ▪ the participant/learner ▪ the supervisor/mentor ▪ subordinates or peers ▪ others familiar with the participant's actions.
  • 27. Guidelines for Evaluating Behavior ▪ Get 100% response or a sample? Depends on the size of the group; the more the better. ▪ Repeat at appropriate times. ▪ Remember that other factors can influence behavior over time. ▪ Use a control group if practical. ▪ Consider cost vs. benefits of the evaluation.
  • 28. Level Three Examples ▪ Observation ▪ Survey or Interview ▪ Participant and/or others ▪ Performance benchmarks ▪ Before and after ▪ Control group ▪ Evidence or Portfolio 28
  • 29. Survey or Patterned Interview 1. Explain the purpose of the survey/interview. 2. Review program objectives and content. 3. Ask the program participant to what extent performance was improved as a result of the program: __ Large extent __ Some extent __ Not at all. If “Large extent” or “Some extent”, ask them to please explain. 4. If “Not at all”, indicate why not: ___ Program content wasn't practical ___ No opportunity to use what I learned ___ My supervisor prevented or discouraged me from changing ___ Other higher priorities ___ Other reason (please explain) 5. Ask, “In the future, to what extent do you plan to change your behavior?” ___ Large extent ___ Some extent ___ Not at all. Ask them to please explain.
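Responses gathered with a patterned interview like the one above can be tallied to see at a glance how much transfer the group reports. A minimal sketch; the answers below are invented for illustration:

```python
from collections import Counter

# Hypothetical answers to question 3 ("to what extent was
# performance improved?") from six interviewed participants.
answers = ["Large extent", "Some extent", "Not at all",
           "Some extent", "Large extent", "Some extent"]

tally = Counter(answers)
total = len(answers)

# Report counts and percentages in the order the form lists them.
for option in ("Large extent", "Some extent", "Not at all"):
    count = tally[option]
    print(f"{option}: {count} ({100 * count / total:.0f}%)")
```

The "why not" reasons from question 4 can be tallied the same way to separate program problems (content wasn't practical) from climate problems (supervisor discouraged change).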
  • 30. Evidence and Portfolio ▪ Thank you for participating. I am very interested in how the evaluation skills you have learned are used in your work. ▪ Please send me a copy of at least one of the following: ▪ a Level Three evaluation that you have designed; ▪ a copy of Level Two evaluations that use more than one method of evaluating participant learning; ▪ a copy of a Level One evaluation that you have modified, and tell me how it influenced program improvement. ▪ (Indicate if you would like my critique on any of the evaluations.) ▪ If I do not hear from you before January 30, I will give you a call – no pressure – I'd just love to learn what you are doing.
  • 31. Results Level ▪ The impact of education and training on the organization or community. ▪ The final results that occurred as a consequence of training. ▪ The ROI for training.
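ROI at the Results Level is commonly expressed as net program benefits divided by program costs, as a percentage. A minimal sketch of that formula; the dollar figures are invented for illustration:

```python
def training_roi(benefits: float, costs: float) -> float:
    """Return ROI as a percentage: ((benefits - costs) / costs) * 100."""
    return (benefits - costs) / costs * 100

# Hypothetical figures: monetized benefits (cost savings, increased
# output, quality improvements) versus total program cost.
benefits = 150_000.0
costs = 100_000.0

print(f"ROI: {training_roi(benefits, costs):.0f}%")
```

The hard part is not the arithmetic but the monetizing: translating Level Four evidence (output, quality, social benefit) into the `benefits` figure, which is why the deck later advises being satisfied with evidence when proof is not possible.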
  • 32. Examples of Level Four ➢ How did the training save costs? ➢ Did work output increase? ➢ Was there a change in the quality of work? ➢ Did the social condition improve? ➢ Did the individual create an impact on the community? ➢ Is there evidence that the organization or community has changed?
  • 33. How do you know if your outcomes are good? ▪ Good training outcomes need to be: ▪ Relevant ▪ Reliable ▪ Discriminate ▪ Practical
  • 34. Good Outcomes: Relevance ▪ Criteria relevance ▪ Criterion contamination ▪ Criterion deficiency ▪ Reliability ▪ Discrimination ▪ Practicality
  • 36. Evaluation Designs: Threats to Validity ▪ Ensuring internal validity means you can be more certain that your intervention or program caused the effect observed and that the effect is not due to other causes. If there is a threat to external validity, you might be wrong in generalizing your findings.
  • 37. Threats to Validity 1. Threats to internal validity: ▪ Company ▪ Persons ▪ Outcome measures 2. Threats to external validity: ▪ Reaction to pretest ▪ Reaction to evaluation ▪ Interaction of selection and training ▪ Interaction of methods
  • 38. Methods to Control for Threats to Validity ▪ Pre- and post-tests ▪ Use of comparison groups ▪ Random assignment
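The last control listed above, random assignment, can be sketched in a few lines: shuffle the participant list and split it into a trained group and a comparison group. The participant IDs below are invented for illustration:

```python
import random

# Hypothetical pool of participants to be split into two groups.
participants = ["P01", "P02", "P03", "P04", "P05", "P06", "P07", "P08"]

rng = random.Random(42)  # fixed seed so the split is reproducible
shuffled = participants[:]
rng.shuffle(shuffled)

half = len(shuffled) // 2
training_group = shuffled[:half]    # receives the program
comparison_group = shuffled[half:]  # does not; serves as the baseline

print("Training group:  ", sorted(training_group))
print("Comparison group:", sorted(comparison_group))
```

Because assignment is random rather than self-selected, differences in posttest outcomes between the two groups are less likely to be due to pre-existing differences in the people, which is exactly the internal-validity threat the slide is addressing.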
  • 39. Types of Evaluation Design ▪ Posttest-only ▪ Pretest / Posttest ▪ Posttest-only with comparison group ▪ Pretest / Posttest with comparison group
  • 41. Guidelines for Evaluating Results ▪ Measure before and after. ▪ Allow time for change to take place. ▪ Repeat at appropriate times. ▪ Use a control group if practical. ▪ Consider cost vs. benefits of doing Level Four. ▪ Remember, other factors can affect results. ▪ Be satisfied with evidence if proof is not possible.