Orlando V. Griego, PhD




Assessments and the Kirkpatrick Model
                                   1
Then he looked at wisdom and
appraised it; he confirmed it and
            tested it.
            Job 28:27               2
Class Objective
   • Write questions by
     the end of class
      that represent
     Kirkpatrick’s four
     levels of training.




                           3
Activity Objective
    • Using an activity, be
      able to measure my
      effectiveness from
      individual to team by
      increasing my score




                              4
Behind the 8 ball

                    5
Donald Kirkpatrick

• Kirkpatrick developed a model of training
  evaluation in 1959
• Arguably the most widely used approach
• Simple, Flexible and Complete
• 4-level model




                                              6
The Four Levels
• Level I: Evaluate Reaction
• Level II: Evaluate Learning
• Level III: Evaluate Behavior
• Level IV: Evaluate Results
• Fifth level was recently “added” for
  return on investment (“ROI”) but this
  was not in Kirkpatrick’s original model




                                            7
Relationship Between Levels

Level 4 - Results:  Was it worth it?
Level 3 - Behavior: KSA being used on the job?
Level 2 - Knowledge: Did they learn anything?
Level 1 - Reaction: Was the environment suitable for learning?

• Each subsequent level is predicated upon doing evaluation at the lower level
• A Level 3 evaluation will be of marginal use if a Level 2 evaluation is not
  conducted

Only by assessing each level can
     we yield actionable results

Level 4 - Results: Was it worth it?
  → Check requirements, systems, and processes
Level 3 - Behavior: KSA being used on the job?
  → Check performance environment
Level 2 - Knowledge: Did they learn anything?
  → Improve knowledge/skill transfer
Level 1 - Reaction: Was the environment suitable for learning?
  → Improve learning environment

Types of Assessments Used at Each Level

Level 4 - Results: Was it worth it?
  Type: Summative
  Form: Correlation of business results with other assessment results
Level 3 - Behavior: KSA being used on the job?
  Type: Summative
  Form: Observation of performance; 360° survey
Level 2 - Knowledge: Did they learn anything?
  Type: Diagnostic, Summative
  Form: Self-assessment; test
Level 1 - Reaction: Was the environment suitable for learning?
  Type: Reaction, Formative
  Form: Survey; real-time polling; quizzing

                                                                                10
Reaction - What Is It?
• How favorably
  participants react to the
  training (“Customer
  satisfaction”)
   – Collects reactions to
     instructor, course, and
     learning environment
   – Communicates to
     trainees that their
     feedback is valued
   – Can provide quantitative
     information
                                  11
Reaction - What It Looks Like
• Questionnaire - Most common
  collection tool
  – Content: I enjoyed the content.          SD: 1 2 3 4 5 6 :SA
  – Methods: The seminar approach
    helped me learn.
  – Media: The AVs were helpful to me.
  – Trainer style: I liked the instructor.
  – Facilities: The room was useful for
    my learning.
  – Course materials: The materials
    provided make my learning better.




                                                              12
Reaction - How to Perform
          • Determine what you want
            to find out
          • Design a form to
            collect/quantify reactions
          • Do Immediately
          • Develop acceptable
            scoring standards
          • Follow-up as appropriate

                                     13
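The reaction steps above (design a form, quantify reactions, set acceptable scoring standards) can be sketched in code. This is a hypothetical example: the item names, ratings, and the 4.5 acceptance threshold are invented for illustration, using the slide's 1-6 (SD to SA) scale.

```python
# Hypothetical sketch: scoring a Level One reaction questionnaire.
# Items use the 1-6 Likert scale (SD..SA) shown on the slide; the 4.5
# acceptance threshold is an assumed scoring standard, not from the source.

def mean_score(responses):
    """Average a list of 1-6 Likert ratings for one item."""
    return sum(responses) / len(responses)

def summarize(questionnaire):
    """Return (mean, meets_standard) per item, flagging weak areas."""
    standard = 4.5  # assumed acceptable-score threshold
    report = {}
    for item, ratings in questionnaire.items():
        avg = mean_score(ratings)
        report[item] = (round(avg, 2), avg >= standard)
    return report

# Example run with invented ratings from four participants:
results = summarize({
    "content": [5, 6, 4, 5],
    "methods": [3, 4, 4, 3],
    "trainer": [6, 6, 5, 6],
})
```

Items that fall below the standard (here, "methods") would be the ones to follow up on.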
Learning - What Is It?


        • Knowledge
        • Skills
        • Attitudes




                         14
Learning - What It
 Looks Like
• Media used to measure
  learning:
  – Text: I felt the textbook helped me
    learn.
  – Voice: The audio materials
    increased my knowledge.
  – Demonstration: I learned well from
    the demonstration.
• Methods used to measure
  learning:
  –   Interviews
  –   Surveys
  –   Tests (pre-/post-)
  –   Observations
  –   Combinations
                                         15
Learning - How to
    Perform
• Use a control group, if
  feasible
• Evaluate knowledge,
  skills, and/or attitudes
  before and after
• Get 100% participation or
  use statistical sample
• Follow-up as appropriate


                              16
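The pre-/post-test comparison with a control group described above can be sketched as follows; the score data are invented for illustration only.

```python
# Illustrative sketch of a Level Two evaluation: compare average test-score
# gains of trained participants against a control group. All scores here
# are invented example data.

def mean(xs):
    return sum(xs) / len(xs)

def learning_gain(pre, post):
    """Average per-participant gain from pre-test to post-test."""
    return mean([after - before for before, after in zip(pre, post)])

trained_gain = learning_gain(pre=[55, 60, 48], post=[80, 85, 78])
control_gain = learning_gain(pre=[57, 61, 50], post=[60, 62, 55])

# The training effect is the gain beyond what the control group shows.
effect = trained_gain - control_gain
```

With a statistical sample rather than 100% participation, a significance test on the two gain distributions would be the natural next step.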
Behavior - What Is It?
• Transfer of knowledge, skills,
  and/or attitude to the real
  world
  – Measure achievement of
    performance objectives




                                   17
Behavior - What It Looks Like
• Observe performer, first-
  hand
• Survey key people who
  observe performer
• Use checklists,
  questionnaires, interviews,
  or combinations
  – I believe coming in early is
    helpful.
  – I work better on the new
    production system.
                                   18
Behavior - How
  to Perform
• Evaluate before and
  after training
• Allow ample time
  before observing
• Survey key people
• Consider cost vs.
  benefits
  – 100% participation or a
    sampling
  – Repeated evaluations
    at appropriate intervals
  – Use of a control group
                           19
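The observation checklist approach above can be sketched as a simple transfer-rate calculation; the behavior names and True/False markings are invented for illustration.

```python
# Illustrative sketch of a Level Three behavior checklist: an observer marks
# whether each target behavior was seen on the job. Behavior names and the
# boolean markings are hypothetical example data.

def transfer_rate(observations):
    """Fraction of checklist behaviors observed being used on the job."""
    seen = sum(1 for demonstrated in observations.values() if demonstrated)
    return seen / len(observations)

checklist = {
    "uses new production system": True,
    "follows updated safety steps": True,
    "applies coaching feedback": False,
    "completes status reports on time": True,
}
rate = transfer_rate(checklist)  # 3 of 4 target behaviors observed
```

Repeating this at appropriate intervals, as the slide suggests, would show whether transfer holds over time.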
Results - What Is It?
         • Assesses “bottom
           line,” final results
         • Definition of “results”
           dependent upon the
           goal of the training
           program



                                     20
Results - What
It Looks Like
• Depends upon objectives of
  training program
   – Quantify: Bottom line,
     productivity, improvement
• Proof vs. Evidence
   – Proof is concrete
   – Evidence is soft
• Sample Questions:
   – I believe the new system has
     improved productivity.
   – My training has allowed me to
     be more productive.
   – My boss is a better leader
     after my training.

                                 21
Results - How
  to Perform
• Use a control group
• Allow time for results
  to be realized
• Measure before and
  after the program
• Consider cost versus
  benefits
• Be satisfied with
  evidence when proof is
  not possible             22
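The before-and-after measurement and cost-versus-benefit comparison above can be sketched numerically. The figures below are invented; a real Level Four evaluation would also use a control group and allow time for results to be realized.

```python
# Hypothetical Level Four sketch: compare a business metric before and after
# the program, and weigh the dollar benefit against program cost (the ROI
# "fifth level" add-on). All figures are invented example data.

def percent_change(before, after):
    """Percentage change in a business metric across the program."""
    return (after - before) / before * 100

def roi(benefit, cost):
    """Simple return-on-investment percentage."""
    return (benefit - cost) / cost * 100

productivity_change = percent_change(before=200.0, after=230.0)  # units/week
program_roi = roi(benefit=45_000.0, cost=30_000.0)               # dollars
```

When proof is not possible, numbers like these serve only as evidence: they correlate results with training rather than establishing cause.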
Summary & Assignment
• Level I:     Evaluate Reaction
• Level II:    Evaluate Learning
• Level III:   Evaluate Behavior
• Level IV:     Evaluate Results


• Write five questions for each
  level using 4 different training
  scenarios.
   – Use “I” questions.
   – Ensure they apply to the content.
   – Tricks: Thesaurus, book TOC,
     ask each other.
                                         23
Readings to Consider
•   Kirkpatrick, Donald L. (1998). Evaluating Training
    Programs: The Four Levels. Berrett-Koehler
    Publishers.
•   Worthen, Blaine R., James R. Sanders, & Jody L.
    Fitzpatrick (1997). Program Evaluation: Alternative
    Approaches and Practical Guidelines (Second
    Edition). Addison Wesley Longman, Inc.
•   Kirkpatrick, Donald L. (1998). Another Look at
    Evaluating Training Programs. American Society for
    Training & Development.
•   Sieloff, Debra A. (1999). The Bridge Evaluation
    Model. International Society for Performance
    Improvement.

                                                          24


Editor's Notes

  • #2 Welcome to our tutorial on Kirkpatrick's Four Levels of Evaluation.
  • #7 Donald Kirkpatrick developed a model of training evaluation in 1959 that has served the training community like no other. This 4-level model is arguably the most widely used approach in the world, even today. It's simple. Flexible. Complete. It presents four types of evaluations: those for reaction, learning, behavior, and results.
  • #8 It is important to remember that each level is different. Level 1 evaluates Reaction; Level 2 measures Learning; Level 3 refers to Behavior; Level 4 looks at Results. Let’s take a look at each of Kirkpatrick’s levels in detail.
  • #10 Go from the top down. What if we did nothing? Applications of percentage: Level 1, Level 2, Level 3, etc.
  • #12 A Level One evaluation measures audience reaction. Kirkpatrick says that evaluating reaction is the same thing as measuring customer satisfaction in that: it gives you feedback so you know how customers are reacting right now; and it shows you care how they react. Level One also provides quantitative data on customer satisfaction.
  • #13 Most evaluators use the questionnaire for Level One evaluations. A questionnaire can identify what to keep, delete, or improve. As a result, changes may be made in the areas of content, methods, media, trainer style, facilities, and/or course materials.
  • #14 In order to perform a Level One evaluation, first determine what you want to find out. Then, design a form to collect or quantify learner reactions. You want to do this immediately, so that you capture everyone’s responses. Develop acceptable scoring standards. Know what you will accept, and what you won’t. Follow-up as appropriate, communicating the results of your evaluation with those who need to know.
  • #15 Let’s review Level Two, “Learning”. Instructors can affect three cognitive areas during training: they can teach new knowledge, skills, and attitudes. Therefore, when we measure learning, we determine if one or more of the following has occurred: Was knowledge learned or gained? Were new skills, or ways to improve existing skills, gained? Were attitudes changed?
  • #16 Media used to help measure learning include the written word, voice, and demonstration. Specific methods that may be employed to check for learning include interviews, surveys, pre- and post-tests, observations, or any combination of these. To view actual examples of learning evaluation tools, be sure to view the QuestionMark web site examples.
  • #17 If practical, use a control group to compare knowledge changes. Design the tests around your objectives. You may want to evaluate learning both before and after the training, too. Try to survey 100% of the participants to get useful data. If that’s not feasible, use a statistical sampling. Finally, follow through and communicate with those who need to know.
  • #18 Let’s move to Level Three, Behavior. Kirkpatrick says this level identifies behavior change. Level Three focuses on the learner’s ability to transfer the learning to where it is actually needed--the real world. Therefore, evaluators measure achievement of performance objectives in real-world settings.
  • #19 To measure behavior changes, Level Three evaluators collect data from the setting where the learner must exhibit the new behavior. Evaluators observe the performer and might gather data from other key individuals, such as subordinates, supervisors, and any others who are credible witnesses of the trainee’s behavior. Besides observation checklists, evaluators might use questionnaires or interviews to gather data.
  • #20 Observation is the primary method used in a Level Three evaluation, whether you choose to: Evaluate before and after training, or Survey key people who observe the trainee in action Allow ample time before observing, and consider the cost-to-benefits of your tactics, such as: 100 percent participation or a sampling Repeated evaluations Use of a control group
  • #21 Let’s turn our attention to Level Four, “Evaluate Results.” Here we are looking at the final results that occur because the participants attended and participated in the training. Of course, when we speak of the term “results” we must often consider the goal of the training program. Therefore we must be careful to identify what results we’re seeking as we evaluate different programs. Results can be financial. Results should be quantifiable.
  • #22 Level Four evaluations capture quantifiable results. There are two ways to get the numbers: prove it, or find strong evidence. Proof is concrete: the results relate to the training in direct, indisputable ways. In some situations, the cost of getting proof is monumental. Therefore, evidence, a softer form of correlating results to training, will suffice.
  • #23 To perform this measurement, use a control group. The control group can provide important data, such as the effect of that advertising on increased sales levels mentioned in the last slide. Allow ample time for results to be achieved. Collect data both before and after the training program to get additional perspective on the trainee’s performance. Consider the cost versus the benefits of performing this evaluation. Finally, be prepared to accept evidence when proof is not possible.
  • #24 Congratulations. You’ve made it through Kirkpatrick’s Four Levels of evaluation. Now you know that: Level One measures participant reaction Level Two measures new learning Level Three measures behaviors in the real world; and Level Four measures the results. But we’re not going to measure your learning today--We’ll leave that up to you!