Training Effectiveness: An ISO Perspective


  • Additional Speaker Notes: Bring the discussion close to home by asking the audience of trainers to think about the questions they have as a training course concludes. Share these important questions and ask whether, and how, they find the answers. While we may walk away with a “feeling” about how things went, it is important (for the reasons on the prior slides) to be able to validate these feelings. But how do we go about validating these feelings and finding concrete answers? What answers can we draw on from the training community?
  • Additional Speaker Notes: Kirkpatrick’s 4 Levels of Evaluation provides a possible framework for answering the key questions we face as a training session concludes. Developed in 1952 by Donald Kirkpatrick, it is widely cited and viewed by most as the first evaluation model for corporate training (Pershing & Gilmore, 2004).
  • Additional Speaker Notes: Given that Kirkpatrick’s framework touches on important areas to assess and is widely cited within the profession, its broad appeal within the industry is easy to understand. However, as we will see on the next slide, this wide appeal has not translated into wide use...
  • Additional Speaker Notes: While Kirkpatrick is widely known, not all levels are widely used in practice. As cited in Pershing and Gilmore (2004), an ASTD study found that only Level 1 was regularly used. This reflects important potential problems in adopting Kirkpatrick’s model, including the noted perception problems (time-consuming, difficult to measure, beyond the realm of most trainers) and the pitfall of relying on Level 1 results, which may or may not translate into similar Learning / Transfer and ROI results that should be evaluated in Levels 2, 3 and 4.
  • Additional Speaker Notes: Brinkerhoff & Dressler also note three important problems with reliance on Kirkpatrick. They stress that training must be considered as part of the greater Performance Environment, which includes the “owners” (such as senior and line managers) as well as other non-training factors (such as management support, incentives or rewards) that may impede or enable training. Given these limitations, it is recommended that an alternative to Kirkpatrick be considered at BIG.
  • Additional Speaker Notes: As noted at the beginning of the presentation, it is necessary to incorporate an evaluation plan into the training programs at BIG. While Kirkpatrick’s Four Levels of Evaluation is a well-known framework, it has important limitations: it is unlikely that the training staff at BIG will complete all four levels needed to effectively evaluate the training program, and it lacks a performance-system focus. Brinkerhoff & Dressler propose a streamlined Success Case Evaluation Model that is recommended for use at BIG. Because the training staff has not routinely performed evaluation as part of the underwriting training programs, its relatively rapid evaluation and feedback process is a good fit. It will also address the key business-impact issues while contemplating the entire performance environment.

    1. Training Effectiveness, Design and Delivery: An ISO-9000 Perspective. By Saroj Ku. Behera
    2. Agenda
       - Overview:
         - ISO Model
         - Requirements (9K, 14K, 18K)
       - Kirkpatrick Model for Training Evaluation
       - ISO-10015: Guidelines for Training
    3. Background
    4. Quality Management System Model (diagram): continual improvement of the unified management system, with customer requirements as input to Management Responsibility, Resource Management, Product Realization and Measurement/Analysis/Improvement, and product output leading to customer satisfaction.
    5. ISO 9001 Clause 6: Resource Management
       - 6.2 Human Resources
       - 6.2.1 General
         - Personnel whose work affects product quality are to be COMPETENT on the basis of appropriate education, training, skills and experience.
    6. Clause 6: Resource Management (contd.)
       - 6.2 Human Resources
       - 6.2.2 Competence, Awareness and Training
         - The organization is to:
           - Determine the necessary competence for personnel performing work affecting product quality
           - Provide training or take any other actions to satisfy these needs ***
           - Evaluate the effectiveness of the actions taken ***
           - Ensure that its personnel are aware of
             - the relevance and importance of their activities
             - how they contribute to the achievement of the Quality Objectives
           - Maintain records of education, experience, training and skills
         - *** Changed in the 2008 version
    7. Clause 6: Resource Management (contd.)
       - 6.2 Human Resources (contd.)
       - 6.2.2 Competence, Awareness and Training (guidelines)
         - Make employees at each relevant function and level aware of:
           - the importance of conformance with the Quality Policy and QMS requirements
           - the significant impacts of their work activities on quality, actual or potential
           - the benefits of improved personnel performance
           - their roles and responsibilities in achieving conformance to the quality policy
           - the potential consequences of departure from specified procedures
    8. Changes in ISO 9001:2008
       - Paragraph 6.2: the words "affecting product quality" changed to "affecting conformity to product requirements".
       - Paragraph 6.2.2:
         - Sub-clause (b): "provide training or take other actions to satisfy these needs" changed to "where applicable training needs to be provided to achieve the necessary competence".
         - Sub-clause (c): obligation to ensure that the staff attained the expected competence.
    9. ISO-14000 & OHSAS-18001: 4.4.2 Training, Awareness & Competence
       - Identify training needs
       - Establish training procedures covering:
         - the importance of conformance
         - the consequences of departure from procedures
       - Personnel competent by education, training and/or experience
    10. 4.4.2 Training, Awareness & Competence (contd.)
       - Training department completes a Training Needs Analysis (TNA) and training schedule, reviews them at least annually and revises as necessary
       - Permanent on-site contractors included in the training program
       - Training records maintained
    11. Effectiveness
       - Effectiveness: the extent to which planned activities are realized and planned results achieved. (ISO 9000:2000, 3.2.14)
       - ISO 9001 specifies requirements for a quality management system that can be used for internal application by organizations, or for certification, or for contractual purposes. It focuses on the effectiveness of the quality management system in meeting customer requirements. (ISO 9001:2000, 0.3)
    12. Training Design: ISO 9001
       - Part of the Product Realization process, Clause 7.3
       - Training design planning
       - Design inputs
       - Design output
       - Design review
       - Design verification
       - Design validation
       - Control of changes
    13. Clause 7: Product Realization - 7.3 Design & Development (D/D)
       - 7.3.1 Design and Development Planning
         - Plan and control design and development of the product
         - During planning, determine:
           - the stages of design and development
           - the review, verification and validation appropriate to each D/D stage
           - responsibilities and authorities for D/D
         - Manage the interface between the different groups involved to ensure effective communication and clear responsibilities
         - Update planning output, as appropriate, as D/D progresses
    14. 7.3 Design & Development (contd.)
       - 7.3.2 Design & Development Inputs
         - Determine the inputs related to product requirements and maintain records
         - D/D inputs to include:
           - functional and performance requirements
           - applicable statutory and regulatory requirements
           - where applicable, information derived from previous similar designs
           - other requirements essential for D/D
         - Review inputs for adequacy; requirements to be complete, unambiguous and non-conflicting
    15. 7.3 Design & Development (contd.)
       - 7.3.3 Design & Development Outputs
         - Outputs of D/D provided in a form that enables verification against D/D inputs; they shall be approved prior to release
         - D/D outputs shall:
           - meet D/D input requirements
           - provide appropriate information for purchasing, production and service provision
           - contain or reference product acceptance criteria
           - specify product characteristics essential for safe and proper use
    16. 7.3 Design & Development (contd.)
       - 7.3.4 Design & Development Review
         - At suitable stages, conduct systematic D/D reviews:
           - to evaluate the ability of the results of D/D to fulfill requirements
           - to identify any problems and propose necessary actions
         - Involve representatives of the functions concerned with the design stage(s) being reviewed
         - Maintain records of the results of reviews and necessary actions
       - 7.3.5 Design & Development Verification
         - Perform verification to ensure that D/D outputs have satisfied D/D input requirements
         - Maintain records of the results of verification and necessary actions
    17. 7.3 Design & Development (contd.)
       - 7.3.6 Design & Development Validation
         - Perform D/D validation in accordance with planned arrangements (see 7.3.1) to ensure the resulting product is capable of fulfilling the requirements for the specified or known intended use or application
         - Wherever practicable, validation to be completed prior to delivery or implementation of the product
         - Maintain records of the results of validation and necessary actions
    18. 7.3 Design & Development (contd.)
       - 7.3.7 Control of Changes
         - Changes in D/D to be identified and records maintained
         - The changes to:
           - be reviewed, verified and validated, as appropriate
           - be approved before implementation
           - include evaluation of the effect of the changes on constituent parts and delivered products
         - Maintain records of the results of the review of changes and necessary actions
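The 7.3 clauses above map design-and-development controls onto training design: planned stages with inputs, outputs, review, verification, validation and controlled changes. Below is a minimal Python sketch of a record structure a training team might use to evidence each sub-clause; the class and field names are illustrative assumptions, not taken from ISO 9001 or this deck.

```python
# Minimal sketch: a record structure for applying ISO 9001 clause 7.3 to the
# design of a training course. Field names are illustrative, not from the standard.
from dataclasses import dataclass, field
from typing import List

@dataclass
class DesignRecord:
    course: str
    inputs: List[str] = field(default_factory=list)       # 7.3.2 functional / regulatory requirements
    outputs: List[str] = field(default_factory=list)      # 7.3.3 materials, acceptance criteria
    reviews: List[str] = field(default_factory=list)      # 7.3.4 review findings and actions
    verified: bool = False                                 # 7.3.5 outputs satisfy inputs
    validated: bool = False                                # 7.3.6 pilot run before full delivery
    change_log: List[str] = field(default_factory=list)   # 7.3.7 reviewed and approved changes

record = DesignRecord(course="Underwriting basics")
record.inputs.append("Competence gap: risk pricing (hypothetical TNA reference)")
record.outputs.append("Trainer guide v1 with assessment criteria")
record.reviews.append("Stage 1 review: objectives aligned with identified gap - no actions")
```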
    19. Training Evaluation
    20. How is the training evaluated? What factors affect evaluation?
    21. Evaluation Myths
       - I Can't Measure the Result of My Training Efforts.
       - I Don't Know What Information to Collect.
       - If I Cannot Calculate the ROI, the Evaluation Is Useless.
       - My CEO Does Not Require Evaluation, So Why Should I Do It?
       - There Are Too Many Variables Affecting the Behavior Change for Me to Evaluate the Impact of Training.
       - Evaluation Will Lead to Criticism.
       - I Don't Need to Justify My Existence; I Have a Proven Track Record.
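One of the myths above concerns ROI. As a point of reference, here is a minimal Python sketch of the commonly used training ROI formula (net programme benefits divided by programme costs, expressed as a percentage); the formula is a standard industry one rather than something defined in this deck, and the figures are hypothetical.

```python
# Minimal sketch (hypothetical figures): the commonly used training ROI formula,
# ROI % = (programme benefits - programme costs) / programme costs * 100.
def training_roi(benefits: float, costs: float) -> float:
    """Return ROI as a percentage of programme costs."""
    return (benefits - costs) / costs * 100

# Example: a course costing 40,000 credited with 52,000 in benefits
# (e.g. reduced rework) yields an ROI of 30%.
print(training_roi(benefits=52_000, costs=40_000))  # 30.0
```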
    22. Factors influencing the type of evaluation design
       - Change potential: Can the program be modified?
       - Importance: Does ineffective training affect customer service, product development, or relationships between employees?
       - Scale: How many trainees are involved?
       - Purpose of training: Is training conducted for learning, results, or both?
       - Organization culture: Is demonstrating results part of company norms and expectations?
       - Time frame: When do we need the information?
    23. Questions we must answer...
       - How do participants feel about our training program?
       - Are participants learning?
       - Is their learning transferring to the job?
       - Does the organization benefit from our training efforts?
    24. Donald Kirkpatrick's Model for Training Evaluation
    25. Finding answers...
       - Kirkpatrick's 4 Levels of Evaluation:
         - Level 1: Reaction - How do participants feel about our training program?
         - Level 2: Learning - Are participants learning?
         - Level 3: Behavior - Is their learning transferring to the job?
         - Level 4: Results - Does the organization benefit?
       - Source: Kirkpatrick, 1998
    26. Donald Kirkpatrick's 4 Levels of Evaluating Training
       - Level 1 - Reaction: Trainee reaction to the course - does the trainee like it? Usually captured with evaluation forms, sometimes called "smile sheets". The most primitive and most widely used method of evaluation; easy, quick and inexpensive to administer. Negative indicators could mean difficulty learning in the course.
       - Level 2 - Learning: Did trainees learn what was intended, based on the course objectives? Learning can be measured by pre- and post-tests, either written tests or performance tests.
       - Level 3 - Behavior: Trainee behavior changes on the job - are learners applying what they learned? Difficult to do; typically a follow-up questionnaire or observation after the training class, or telephone interviews.
       - Level 4 - Results: Ties training to the company's bottom line. Generally applies to training that seeks to overcome a business problem caused by a lack of knowledge or skill. Examples include reductions in costs, turnover, absenteeism and grievances. May be difficult to tie directly to training.
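Level 2 suggests measuring learning with pre- and post-tests. Here is a minimal Python sketch, using hypothetical scores, of how raw and normalized gains could be computed from such tests; the normalized-gain measure is an added illustration, not part of Kirkpatrick's model.

```python
# Minimal sketch with hypothetical scores: quantifying Level 2 (Learning)
# from pre- and post-test results for a small group of trainees.
pre_scores = [45, 60, 55, 70]    # percentage scores before training
post_scores = [75, 80, 70, 90]   # percentage scores after training

gains = [post - pre for pre, post in zip(pre_scores, post_scores)]
average_gain = sum(gains) / len(gains)

# Normalized gain: how much of the possible improvement was actually achieved.
normalized = [(post - pre) / (100 - pre) for pre, post in zip(pre_scores, post_scores)]
average_normalized_gain = sum(normalized) / len(normalized)

print(f"average gain: {average_gain:.1f} points")            # ~21.2 points
print(f"average normalized gain: {average_normalized_gain:.2f}")  # ~0.51
```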
    27. Appeal of Kirkpatrick's Model
       - Assesses important areas
       - Widely known
       - Simple framework
       - Easy to explain and understand
    28. However...
       - Widely known ≠ widely used
         - Level 1: Often (over 90%)
         - Level 2: Sometimes (less than 35%)
         - Levels 3 & 4: Rarely (less than 15%)
       - Why is this a problem?
         - Levels 3 and 4 are often perceived as:
           - difficult to measure
           - time consuming
           - beyond the realm of most trainers
         - A Level 1 result does not always mean similar Learning / Transfer / ROI results
       - Source: Pershing & Gilmore, 2004
    29. Other problems...
       - Undermines the management partnership
         - Training ≠ "silver bullet"
         - Training is only one strategy within the entire Performance System
         - Level 3 & 4 evaluations should cover the entire Performance System, not just training
       - Lacks a Performance System focus
         - What about the rest of the Performance Environment?
         - What factors impede or enable use of the training?
       - Feedback goes to the wrong people
         - Feedback to the training function only is incomplete
         - Must include Performance Environment owners
       - Source: Brinkerhoff & Dressler, 2002
    30. In summary
       - An evaluation tool must be integrated into the training programs.
       - Kirkpatrick's Four Levels of Evaluation is well known, but has limitations:
         - completion of all 4 levels is unlikely
         - it lacks a performance system focus
    31. ISO 10015: Guidelines for Training
    32. Background of ISO 10015
       - Representatives of 22 countries developed the draft text over several years, culminating in the publication of ISO 10015 in December 1999.
       - Advantages:
         - Based on the process-oriented concepts of the ISO 9000:2000 family of standards.
         - Offers specific guidance for training design, review and implementation.
         - Provides context for the organization's needs.
    33. Key features of ISO 10015
       - 1. Linking training investment with company performance
         - The key client is the organization, not only the persons being trained (ROI approach).
         - A company first has to recognize the performance challenge it faces and its causes.
         - ISO 10015 offers a clear road map to connect training to performance goals and use it for individual and group performance improvement.
       - 2. Organizing training on the basis of pedagogical principles and processes
         - Training as a tool to cover an identified performance gap.
         - Establishing an appropriate training design and effective learning processes.
         - Ensures that training uses resources (finances, time and energy) optimally.
    34. Training: for organizational needs
    35. 4.1 Training is a four-step process
       - For selecting and implementing training to close the gaps between required and existing competence, management should monitor the following stages:
         - a) defining training needs;
         - b) designing and planning training;
         - c) providing for the training;
         - d) evaluating the outcome of training.
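The four stages above form a closed loop in ISO 10015. Here is a minimal Python sketch of that cycle as an ordered workflow; the stage names paraphrase clause 4 and the handler functions are hypothetical stubs, not defined by the standard.

```python
# Minimal sketch: the ISO 10015 clause 4 training cycle as an ordered loop.
# Stage names paraphrase the standard; handlers are hypothetical stubs.
from enum import Enum

class Stage(Enum):
    DEFINE_NEEDS = "defining training needs"          # 4.2
    DESIGN_AND_PLAN = "designing and planning"        # 4.3
    PROVIDE_TRAINING = "providing for the training"   # 4.4
    EVALUATE_OUTCOMES = "evaluating the outcome"      # 4.5

def run_training_cycle(handlers: dict) -> None:
    """Run each stage of the cycle in its defined order."""
    for stage in Stage:
        handlers[stage](stage)

if __name__ == "__main__":
    run_training_cycle({s: (lambda st: print(f"-> {st.value}")) for s in Stage})
```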
    36. The Training Process
    37. 4.2 Defining training needs
       - 4.2.1 General
       - 4.2.2 Defining the needs of the organization
       - 4.2.3 Defining the competence needs
       - 4.2.4 Reviewing competence
       - 4.2.5 Defining competence gaps
       - 4.2.6 Identifying solutions to close competence gaps
       - 4.2.7 Defining the specifications for training needs
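Clauses 4.2.4 and 4.2.5 compare existing competence against required competence to expose the gaps that training, or a non-training action, must close. Here is a minimal Python sketch with hypothetical data; the 0-5 scale and the skill names are assumptions for illustration only.

```python
# Minimal sketch with hypothetical data: a 4.2.4/4.2.5-style comparison of
# required vs. existing competence (0-5 scale) to expose the gaps.
required = {"underwriting basics": 4, "risk pricing": 3, "QMS awareness": 2}
existing = {"underwriting basics": 4, "risk pricing": 1, "QMS awareness": 0}

gaps = {
    skill: required[skill] - existing.get(skill, 0)
    for skill in required
    if required[skill] > existing.get(skill, 0)
}

for skill, gap in gaps.items():
    print(f"gap of {gap} level(s) in '{skill}' -> candidate for training or other action")
# risk pricing: gap 2; QMS awareness: gap 2
```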
    38. Training Processes-1
    39. Training Processes-2
    40. Training Processes-3
    41. Training Processes-4
    42. 4.3 Designing and planning training
       - 4.3.1 General
       - 4.3.2 Defining the constraints
       - 4.3.3 Training methods and criteria for their selection
       - 4.3.4 Training plan specifications
       - 4.3.5 Selecting a training provider
    43. (no slide text)
    44. 4.4 Providing the training
       - 4.4.1 General
       - 4.4.2 Providing support
         - 1. pre-training support
         - 2. training support
         - 3. end-of-training support
    45. (no slide text)
    46. 4.5 Evaluating training outcomes
       - 4.5.1 General
       - 4.5.2 Preparing the evaluation report
       - Contents of the evaluation report:
         - specification of training needs
         - evaluation criteria and a description of the sources, methods and schedule for evaluation
         - analysis of data and interpretation
         - review of training costs
         - conclusions and recommendations for improvement
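The report contents listed under 4.5.2 can be captured in a fixed structure so that evaluations stay comparable across courses. Here is a minimal Python sketch; the field names paraphrase the bullets above and the example values are hypothetical.

```python
# Minimal sketch: the ISO 10015 4.5.2 evaluation report contents as a typed record.
# Field names paraphrase the clause; the example values are hypothetical.
from dataclasses import dataclass
from typing import List

@dataclass
class EvaluationReport:
    training_needs_spec: str     # specification of training needs
    evaluation_criteria: str     # criteria, data sources, methods and schedule
    data_analysis: str           # analysis and interpretation of collected data
    cost_review: float           # review of training costs
    conclusions: List[str]       # conclusions and recommendations for improvement

report = EvaluationReport(
    training_needs_spec="Close risk-pricing competence gap for new underwriters",
    evaluation_criteria="Pre/post test plus 90-day supervisor follow-up",
    data_analysis="Average post-test gain of 21 points; 3 of 4 trainees applying the method",
    cost_review=40_000.0,
    conclusions=["Extend follow-up coaching", "Revise module 2 exercises"],
)
```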
    47. (no slide text)
    48. 5 Monitoring and improving the training process
       - 5.1 General
       - 5.2 Validation of the training process
       - Monitoring is the key tool:
         - Monitoring is to be done by competent personnel only.
         - Monitor records at all stages; detect nonconformity at all stages; take preventive and corrective actions.
         - If procedures were followed and requirements met, update the personal competence records.
         - If procedures were not followed yet requirements are met, revise the procedures and update the training records.
         - If procedures were followed but requirements not met, either correct the training process or seek a non-training solution.
       - Identify further opportunities for improvement.
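The three monitoring bullets above amount to a decision rule on two observations: were the procedures followed, and were the requirements met? Here is a minimal Python sketch of that rule; the fourth branch (neither condition holding) is my own addition, since the slide does not cover that case.

```python
# Minimal sketch: the clause 5.2 monitoring decision rule from the slide,
# expressed as a function of the two observations it depends on.
def monitoring_action(procedures_followed: bool, requirements_met: bool) -> str:
    if procedures_followed and requirements_met:
        return "update the personal competence records"
    if not procedures_followed and requirements_met:
        return "revise the procedures and update the training records"
    if procedures_followed and not requirements_met:
        return "correct the training process or seek a non-training solution"
    # Not covered by the slide: neither procedures followed nor requirements met.
    return "investigate: neither procedures followed nor requirements met"

print(monitoring_action(procedures_followed=True, requirements_met=False))
```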
    49. (no slide text)
    50. Thank You
       - Lawrence & Mayo Pvt Ltd