Training Effectiveness - Clause 6.2


  • Additional Speaker Notes: Bring the discussion close to home by asking the audience of trainers to think about the questions they have as a training course concludes. Share these important questions and ask if and how they find out the answers. While we may walk away with a “feeling” about how things went, it is important (for the reasons on the prior slides) to be able to validate these feelings. However, how do we go about validating these feelings and finding concrete answers? What answers can we draw from in the training community?
  • Additional Speaker Notes: Kirkpatrick’s 4 Levels of Evaluation provides a possible framework to answer the key questions we face as a training session concludes. Developed in 1952 by Donald Kirkpatrick, it is widely cited and viewed by most as the first evaluation model for corporate training (Pershing & Gilmore, 2004).
  • Additional Speaker Notes: Given that Kirkpatrick’s framework touches on important areas to assess, as well as its wide citation within the profession, it is easy to understand the broad appeal within the industry. However, as we will see in the next slide, this wide appeal has not translated to wide use . . .
  • Additional Speaker Notes: While Kirkpatrick is widely known, not all levels are widely used in practice. As cited in Pershing and Gilmore (2004), an ASTD study found that only Level 1 was regularly used. This reflects important potential problems in adopting Kirkpatrick’s Model, including the noted perception problems (time consuming, difficult to measure, beyond the realm of most trainers), as well as the pitfall of relying on Level 1 results that may or may not translate to similar Learning, Transfer, and ROI results, which should be evaluated in Levels 2, 3, and 4.
  • Additional Speaker Notes: Brinkerhoff & Dressler also note three important problems with reliance on Kirkpatrick. They stress that training must be considered as part of the greater Performance Environment, which includes the “owners” (such as senior and line managers) as well as other non-training factors (such as management support, incentives, or rewards) that may impede or enable training. Given these limitations, it is recommended that an alternative to Kirkpatrick be considered at BIG.
  • Additional Speaker Notes: As noted at the beginning of the presentation, it is necessary to incorporate an evaluation plan into the training programs at BIG. While Kirkpatrick’s Four Levels of Evaluation is a well-known framework, it has important limitations. It is unlikely that the training staff at BIG will complete all four levels needed to evaluate the training program effectively. Further, it lacks a performance-system focus. Brinkerhoff & Dressler propose a streamlined Success Case Evaluation Model that is recommended for use at BIG. Because the training staff has not routinely performed evaluation as part of the underwriting training programs, its relatively rapid evaluation and feedback process is a good fit. It will also address the key business-impact issues while considering the entire performance environment.
Slide transcript: Training Effectiveness - Clause 6.2

1. Training Effectiveness, Design and Delivery - an ISO-9000 Perspective. By Saroj Ku. Behera
2. Training Evaluation
3. Objective
   - How is the training evaluated?
   - What factors affect evaluation?
4. Evaluation Myths
   - I Can’t Measure the Result of My Training Efforts.
   - I Don’t Know What Information to Collect.
   - If I Cannot Calculate the ROI, the Evaluation Is Useless.
   - My HR Head Does Not Require Evaluation, So Why Should I Do It?
   - There Are Too Many Variables Affecting the Behavior Change for Me to Evaluate the Impact of Training.
   - Evaluation Will Lead to Criticism.
   - I Don’t Need to Justify My Existence; I Have a Proven Track Record.
5. Factors Affecting the Type of Evaluation Design

   Factor               | How the factor influences the evaluation design
   ---------------------|------------------------------------------------
   Change potential     | Can the program be modified?
   Importance           | Does ineffective training affect customer service, product delivery, or relationships between employees?
   Scale                | How many trainees are involved?
   Purpose of training  | Is training conducted for learning, results, or both?
   Organization culture | Is demonstrating results part of company norms and expectations?
   Time frame           | When do we need the information?

   (A small sketch of how these factors could drive the choice of design follows.)
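A team could capture these six factors in a simple checklist before picking an evaluation design. The sketch below is a hypothetical Python illustration: the `EvaluationContext` fields mirror the table, and the `suggested_rigor` heuristic is our assumption, not part of the slides or of any standard.

```python
# Minimal sketch, assuming invented names: a structured checklist of the six
# design factors, plus a crude heuristic for how deep the evaluation should go.
from dataclasses import dataclass

@dataclass
class EvaluationContext:
    change_potential: bool    # Can the program be modified?
    high_importance: bool     # Does ineffective training hurt customers or operations?
    trainee_count: int        # Scale: how many trainees are involved?
    purpose: str              # "learning", "results", or "both"
    results_culture: bool     # Is demonstrating results part of company norms?
    weeks_until_needed: int   # Time frame: when is the information needed?

def suggested_rigor(ctx: EvaluationContext) -> str:
    """Heuristic only: higher stakes, larger scale, and more lead time
    justify evaluating deeper than reaction sheets."""
    if ctx.high_importance and ctx.trainee_count > 50 and ctx.weeks_until_needed >= 8:
        return "full design: reaction, learning, behavior, and results"
    if ctx.purpose in ("results", "both") and ctx.results_culture:
        return "learning checks plus on-the-job behavior follow-up"
    return "reaction and learning checks only"

print(suggested_rigor(EvaluationContext(True, True, 120, "both", True, 12)))
```

The thresholds are placeholders; the point is that writing the factors down forces the design decision to be explicit rather than habitual.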
6. Questions We Must Answer
   - How do participants feel about our training program?
   - Are participants learning?
   - Is their learning transferring to the job?
   - Does the organization benefit from our training efforts?
7. Donald Kirkpatrick’s Model for Training Evaluation
8. Finding Answers: Kirkpatrick’s 4 Levels of Evaluation
   - Level 1: Reaction. How do participants feel about our training program?
   - Level 2: Learning. Are participants learning?
   - Level 3: Behavior. Is their learning transferring to the job?
   - Level 4: Results. Does the organization benefit?
9. Donald Kirkpatrick’s 4 Levels of Evaluating Training

   Level             | Description | Comments
   ------------------|-------------|---------
   Level 1: Reaction | Trainee reaction to the course: does the trainee like it? Usually collected via evaluation forms, sometimes called "smile sheets". | The most basic and most widely used method; easy, quick, and inexpensive to administer. Negative reactions can indicate difficulty learning in the course.
   Level 2: Learning | Did trainees learn what was intended, based on the course objectives? | Learning can be measured by pre- and post-tests, either written tests or performance tests.
   Level 3: Behavior | Trainee behavior changes on the job: are the learners applying what they learned? | Difficult to do. Use follow-up questionnaires or observation after the class; telephone interviews can also be conducted.
   Level 4: Results  | Ties training to the company’s bottom line. | Generally applies to training that seeks to overcome a business problem caused by a lack of knowledge or skill; examples include reductions in costs, turnover, absenteeism, and grievances. May be difficult to tie directly to training.

   (A small pre-/post-test scoring sketch for Level 2 follows.)
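For Level 2, the table suggests pre- and post-tests. One common way to summarize such scores is a normalized gain: the share of the possible improvement a trainee actually achieved. The sketch below assumes scores out of 100 and uses invented trainee data.

```python
# Minimal sketch of a Level 2 check: normalized gain from pre/post scores.
def learning_gain(pre: float, post: float, max_score: float = 100.0) -> float:
    """Normalized gain: (post - pre) as a share of the available headroom."""
    if pre >= max_score:            # already at the ceiling; nothing left to gain
        return 0.0
    return (post - pre) / (max_score - pre)

# Hypothetical pre-/post-test scores for three trainees.
scores = {"trainee_a": (40, 85), "trainee_b": (70, 90), "trainee_c": (55, 60)}
for name, (pre, post) in scores.items():
    print(f"{name}: normalized gain = {learning_gain(pre, post):.2f}")
```

Normalizing by headroom keeps a trainee who moved from 70 to 90 comparable with one who moved from 40 to 85, which a raw difference would obscure.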
10. Appeal of Kirkpatrick’s Model
   - Assesses important areas
   - Widely known
   - Simple framework
   - Easy to explain and understand
11. However . . .
   - Widely known ≠ widely used:
     - Level 1: often (over 90%)
     - Level 2: sometimes (less than 35%)
     - Levels 3 & 4: rarely (less than 15%)
   - Why is this a problem?
     - Levels 3 and 4 are often perceived as difficult to measure, time consuming, and beyond the realm of most trainers.
     - A good Level 1 result does not always mean similar Learning / Transfer / ROI results.
12. Other Problems . . .
   - Undermines the management partnership:
     - Training is not a “silver bullet”; it is only one strategy within the entire performance system.
     - Level 3 and 4 evaluations should cover the entire performance system, not just training.
   - Lacks a performance-system focus:
     - What about the rest of the performance environment?
     - What factors impede or enable the use of the training?
   - Feedback goes to the wrong people:
     - Feedback to the training function alone is incomplete; it must also reach the owners of the performance environment.
13. In Summary
   - An evaluation tool must be integrated into the training programs.
   - Kirkpatrick’s Four Levels of Evaluation is well known, but it has limitations:
     - completing all four levels is unlikely
     - it lacks a performance-system focus
14. Training: for Organizational Needs
15. Training Is a Four-Step Process
   For selecting and implementing training to close the gaps between required and existing competence, management should monitor the following stages (a sketch of the cycle follows this list):
   a) defining training needs;
   b) designing and planning training;
   c) providing for the training;
   d) evaluating the outcome of training.
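A minimal sketch of the four-stage cycle, with placeholder function names of our own invention; the point is only that each stage consumes the previous stage’s output and offers a monitoring point between stages, as the slide describes.

```python
# Illustrative pipeline only; each stage would produce records that
# management reviews before the next stage starts.
def define_needs(competence_gaps):
    return {"needs": competence_gaps}

def design_and_plan(state):
    return {**state, "plan": f"plan covering {len(state['needs'])} needs"}

def provide_training(state):
    return {**state, "delivered": True}

def evaluate_outcome(state):
    return {**state, "evaluated": True}

state = define_needs(["risk assessment", "policy rating"])
for stage in (design_and_plan, provide_training, evaluate_outcome):
    state = stage(state)   # monitoring point between stages
print(state)
```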
16. The Training Process
17. Defining Training Needs (clause 4.2)
   - 4.2.1 General
   - 4.2.2 Defining the needs of the organisation
   - 4.2.3 Defining the competence needs
   - 4.2.4 Reviewing competence
   - 4.2.5 Defining competence gaps
   - 4.2.6 Identifying solutions to close the competence gaps
   - 4.2.7 Defining the specifications for training needs
   (A small competence-gap sketch follows.)
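Clauses 4.2.4 and 4.2.5 amount to comparing required competence against existing competence. A minimal sketch, with invented competences, treating the gap as a set difference:

```python
# Required competence comes from the organisation's needs (4.2.2-4.2.3);
# existing competence comes from the review (4.2.4). Examples are made up.
required = {"risk assessment", "policy rating", "regulatory reporting"}
existing = {"policy rating"}

gap = required - existing   # 4.2.5: the competence gap as a set difference
print("Competence gap:", sorted(gap))

# 4.2.6: training is only one way to close the gap; redesigning the work or
# recruiting are non-training alternatives.
```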
18. Training Processes - 1
19. Training Processes - 2
20. Training Processes - 3
21. Training Processes - 4
22. Designing and Planning Training (clause 4.3)
   - 4.3.1 General
   - 4.3.2 Defining the constraints
   - 4.3.3 Training methods and criteria for their selection
   - 4.3.4 Training plan specifications
   - 4.3.5 Selecting a training provider
24. Providing the Training
   - General
   - Providing support:
     1. pre-training support
     2. training support
     3. end-of-training support
26. Evaluating Training Outcomes
   - General
   - Preparing the evaluation report
   - Contents of the evaluation report:
     - specification of the training needs
     - evaluation criteria and a description of the sources, methods, and schedule for the evaluation
     - analysis of data and interpretation of results
     - review of training costs
     - conclusions and recommendations for improvement
   (A sketch of the report as a structured record follows.)
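The listed report contents map naturally onto a structured record. A minimal sketch with assumed field names and invented example values; this is not a schema from any standard.

```python
# Illustrative record of the evaluation report contents from the slide.
from dataclasses import dataclass, field

@dataclass
class TrainingEvaluationReport:
    training_needs_spec: str        # specification of training needs
    evaluation_criteria: str        # criteria plus sources, methods, schedule
    data_analysis: str              # analysis of data and interpretation
    training_costs: float           # review of training costs
    recommendations: list = field(default_factory=list)  # improvement actions

report = TrainingEvaluationReport(
    training_needs_spec="Close the underwriting competence gap",
    evaluation_criteria="Pre-/post-tests; 90-day on-the-job follow-up",
    data_analysis="Average normalized gain 0.45; transfer seen in 7 of 10 cases",
    training_costs=12500.0,
    recommendations=["Add line-manager coaching after the course"],
)
print(report)
```

Keeping the report in one structure makes it easy to check that every section the slide requires is actually present before the report is issued.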
28. Monitoring and Improving the Training Process
   - General
   - Validation of the training process
   - Monitoring is the key tool:
     - Monitoring is to be done by competent personnel only.
     - Monitor records at all stages; detect nonconformity at all stages; take preventive and corrective actions.
     - If procedures were followed and the requirements are met, update the personnel competence records.
     - If procedures were not followed yet the requirements are met, revise the procedures and update the training records.
     - If procedures were followed but the requirements are not met, either correct the training process or seek a non-training solution.
   - Identify further opportunities for improvement. (A sketch of this decision logic follows.)
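The three if-then rules on this slide are plain decision logic. A minimal sketch with an assumed function name; note the fourth case (procedures not followed and requirements not met) is not covered on the slide, so its handling below is our assumption.

```python
# Decision logic from the slide: outcome depends on whether procedures were
# followed and whether the competence requirements were met.
def monitoring_decision(procedures_followed: bool, requirements_met: bool) -> str:
    if procedures_followed and requirements_met:
        return "update the personnel competence records"
    if not procedures_followed and requirements_met:
        return "revise the procedures and update the training records"
    if procedures_followed and not requirements_met:
        return "correct the training process or seek a non-training solution"
    # Assumption: the slide leaves this case open; treat it as full rework.
    return "revise the procedures and repeat the training stage"

for followed in (True, False):
    for met in (True, False):
        print(f"followed={followed}, met={met} -> {monitoring_decision(followed, met)}")
```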
30. Thank You
   For L&M
   Saroj Ku. Behera