
Levels 1-4 Evaluation


Slide presentation used to explain the Kirkpatrick levels of evaluation to instructors


  1. Did the Training Work?
     The Four Levels of Evaluation
     Created by Dawn Drake, 2008
  2. Birth of the Model
     • Developed by Donald L. Kirkpatrick
     • 1959: four-article series, “Techniques for Evaluating Training Programs” (Training and Development, ASTD)
     • Became known as the Kirkpatrick Model
     • 1998: Evaluating Training Programs: The Four Levels
     • Widely recognized by training and HR professionals
  3. The Four Levels
     Following instruction, evaluation occurs at two points:
     Immediate
     1. Learner Reactions
     2. Learning
     Delayed
     3. Job Behavior
     4. Observable Results
  4. Level 1: Learner Reaction (“Smile Sheets”)
     Features
     • Asks for learners’ opinions: were they “smiling when they left”?
     Challenges
     • Provides no proof of learning
     • Responses may not be honest (“If you can’t say something nice…”)
     • Determining when negative results indicate a valid need to modify the training
     Benefits
     • Can be used to identify support vs. resistance
     • Can identify some trouble spots that may impact learning
     • Indicates what learners will tell others (managers, peers)
  5. Level 2: Learning
     Features
     • Tests knowledge and skills
     • Ties directly to course objectives
     • Simulates performance as much as possible
     Challenges
     • Defining performance objectives that can be measured in the testing environment
     • Obtaining client agreement that achieving the objectives will provide the skills needed to meet performance standards
     • Ensuring evaluations measure the learning objectives
     Benefits
     • Provides evidence of whether learners gained the knowledge/skills targeted by training
     • Identifies areas for improving training
     • Identifies areas where learners’ knowledge/skills still fall short of requirements
     Note: A single evaluation shows only the current state. To show a change, you must establish a baseline (pre-evaluation) for comparison.
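The baseline comparison described in the note above can be sketched in a few lines of code. This is a minimal illustration, not part of the Kirkpatrick materials; the function name and the score values are hypothetical, assuming each learner takes the same assessment before and after training.

```python
# Hypothetical sketch: measuring change at Level 2 by comparing
# post-training scores against a pre-training baseline.
# All names and numbers here are illustrative, not from the slides.

def learning_gain(pre_scores, post_scores):
    """Average score change from the pre-evaluation (baseline) to the post-evaluation."""
    pre_avg = sum(pre_scores) / len(pre_scores)
    post_avg = sum(post_scores) / len(post_scores)
    return post_avg - pre_avg

# Example: five learners, scored out of 100 before and after training.
pre = [55, 60, 48, 70, 62]
post = [78, 82, 65, 88, 80]
print(learning_gain(pre, post))  # average gain across learners
```

Without the `pre` baseline, the `post` average alone would only show the current state, not whether the training produced the change.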
  6. Level 3: Job Behavior (Transfer)
     Features
     • Evaluates whether skills taught are being used on the job (transferred)
     • Delayed between training and evaluation (usually several weeks)
     Challenges
     • May need to survey others, including managers and “customers”
     • Other factors may affect transfer
     Benefits
     • Can provide evidence of training effectiveness
     • When Level 2 results are good but Level 3 results are poor, prompts consideration of other blocks to performance:
       • Too much time between training and implementation?
       • Lack of support on the job?
       • Problems with the tool?
       • Lack of motivation?
  7. Level 4: Observable Results (Impact)
     Features
     • Intended to evaluate the impact of training on business results (longer range), e.g., reducing costs by increasing efficiency
     • Usually done only for selected programs, due to cost and difficulty
     Challenges
     • Because of the time elapsed, it is usually impossible to screen out other variables that impact results
     • Establishing criteria for “success” in advance, with the client’s agreement
     Benefits
     • Shows the value of training for achieving strategic business goals (ROI)