Evaluating HRD Programs



Published in: Business


  1. 1. Evaluating HRD Programs
  2. 2. Effectiveness <ul><li>The degree to which a training (or other HRD program) achieves its intended purpose </li></ul><ul><li>Measures are relative to some starting point </li></ul><ul><li>Measures how well the desired goal is achieved </li></ul>
  3. 3. Evaluation
  4. 4. HRD Evaluation <ul><li>Textbook definition: </li></ul><ul><li>“The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.” </li></ul>
  5. 5. In Other Words… <ul><li>Are we training: </li></ul><ul><li>the right people </li></ul><ul><li>the right “stuff” </li></ul><ul><li>the right way </li></ul><ul><li>with the right materials </li></ul><ul><li>at the right time? </li></ul>
  6. 6. Evaluation Needs <ul><li>Descriptive and judgmental information needed </li></ul><ul><ul><li>Objective and subjective data </li></ul></ul><ul><li>Information gathered according to a plan and in a desired format </li></ul><ul><li>Gathered to provide decision making information </li></ul>
  7. 7. Purposes of Evaluation <ul><li>Determine whether the program is meeting the intended objectives </li></ul><ul><li>Identify strengths and weaknesses </li></ul><ul><li>Determine cost-benefit ratio </li></ul><ul><li>Identify who benefited most or least </li></ul><ul><li>Determine future participants </li></ul><ul><li>Provide information for improving HRD programs </li></ul>
  8. 8. Purposes of Evaluation – 2 <ul><li>Reinforce major points to be made </li></ul><ul><li>Gather marketing information </li></ul><ul><li>Determine if training program is appropriate </li></ul><ul><li>Establish management database </li></ul>
  9. 9. Evaluation Bottom Line <ul><li>Is HRD a revenue contributor or a revenue user? </li></ul><ul><li>Is HRD credible to line and upper-level managers? </li></ul><ul><li>Are benefits of HRD readily evident to all? </li></ul>
  10. 10. How Often are HRD Evaluations Conducted? <ul><li>Not often enough!!! </li></ul><ul><li>Frequently, only end-of-course participant reactions are collected </li></ul><ul><li>Transfer to the workplace is evaluated less frequently </li></ul>
  11. 11. Why HRD Evaluations are Rare <ul><li>Reluctance to having HRD programs evaluated </li></ul><ul><li>Evaluation needs expertise and resources </li></ul><ul><li>Factors other than HRD cause performance improvements – e.g., </li></ul><ul><ul><li>Economy </li></ul></ul><ul><ul><li>Equipment </li></ul></ul><ul><ul><li>Policies, etc. </li></ul></ul>
  12. 12. Need for HRD Evaluation <ul><li>Shows the value of HRD </li></ul><ul><li>Provides metrics for HRD efficiency </li></ul><ul><li>Demonstrates value-added approach for HRD </li></ul><ul><li>Demonstrates accountability for HRD activities </li></ul><ul><li>Everyone else has it… why not HRD? </li></ul>
  13. 13. Make or Buy Evaluation <ul><li>“I bought it, therefore it is good.” </li></ul><ul><li>“Since it’s good, I don’t need to post-test.” </li></ul><ul><li>Who says it’s: </li></ul><ul><ul><li>Appropriate? </li></ul></ul><ul><ul><li>Effective? </li></ul></ul><ul><ul><li>Timely? </li></ul></ul><ul><ul><li>Transferable to the workplace? </li></ul></ul>
  14. 14. Evolution of Evaluation Efforts <ul><li>Anecdotal approach – talk to other users </li></ul><ul><li>Try before buy – borrow and use samples </li></ul><ul><li>Analytical approach – match research data to training needs </li></ul><ul><li>Holistic approach – look at overall HRD process, as well as individual training </li></ul>
  15. 15. Models and Frameworks of Evaluation <ul><li>Table 7-1 lists six frameworks for evaluation </li></ul><ul><li>The most popular is that of D. Kirkpatrick: </li></ul><ul><ul><li>Reaction </li></ul></ul><ul><ul><li>Learning </li></ul></ul><ul><ul><li>Job Behavior </li></ul></ul><ul><ul><li>Results </li></ul></ul>
  16. 16. Kirkpatrick’s Four Levels <ul><li>Reaction </li></ul><ul><ul><li>Focus on trainee’s reactions </li></ul></ul><ul><li>Learning </li></ul><ul><ul><li>Did they learn what they were supposed to? </li></ul></ul><ul><li>Job Behavior </li></ul><ul><ul><li>Was it used on job? </li></ul></ul><ul><li>Results </li></ul><ul><ul><li>Did it improve the organization’s effectiveness? </li></ul></ul>
  17. 17. Issues Concerning Kirkpatrick’s Framework <ul><li>Most organizations don’t evaluate at all four levels </li></ul><ul><li>Focuses only on post-training </li></ul><ul><li>Doesn’t treat inter-stage improvements </li></ul><ul><li>WHAT ARE YOUR THOUGHTS? </li></ul>
  18. 18. A Suggested Framework – 1 <ul><li>Reaction </li></ul><ul><ul><li>Did trainees like the training? </li></ul></ul><ul><ul><li>Did the training seem useful? </li></ul></ul><ul><li>Learning </li></ul><ul><ul><li>How much did they learn? </li></ul></ul><ul><li>Behavior </li></ul><ul><ul><li>What behavior change occurred? </li></ul></ul>
  19. 19. Suggested Framework – 2 <ul><li>Results </li></ul><ul><ul><li>What were the tangible outcomes? </li></ul></ul><ul><ul><li>What was the return on investment (ROI)? </li></ul></ul><ul><ul><li>What was the contribution to the organization? </li></ul></ul>
  20. 20. Data Collection for HRD Evaluation <ul><li>Possible methods: </li></ul><ul><li>Interviews </li></ul><ul><li>Questionnaires </li></ul><ul><li>Direct observation </li></ul><ul><li>Written tests </li></ul><ul><li>Simulation/Performance tests </li></ul><ul><li>Archival performance information </li></ul>
  21. 21. Interviews <ul><li>Advantages : </li></ul><ul><li>Flexible </li></ul><ul><li>Opportunity for clarification </li></ul><ul><li>Depth possible </li></ul><ul><li>Personal contact </li></ul><ul><li>Limitations: </li></ul><ul><li>High reactive effects </li></ul><ul><li>High cost </li></ul><ul><li>Face-to-face threat potential </li></ul><ul><li>Labor intensive </li></ul><ul><li>Trained observers needed </li></ul>
  22. 22. Questionnaires <ul><li>Advantages : </li></ul><ul><li>Low cost to administer </li></ul><ul><li>Honesty increased </li></ul><ul><li>Anonymity possible </li></ul><ul><li>Respondent sets the pace </li></ul><ul><li>Variety of options </li></ul><ul><li>Limitations : </li></ul><ul><li>Possible inaccurate data </li></ul><ul><li>Response conditions not controlled </li></ul><ul><li>Respondents set varying paces </li></ul><ul><li>Uncontrolled return rate </li></ul>
  23. 23. Direct Observation <ul><li>Advantages : </li></ul><ul><li>Nonthreatening </li></ul><ul><li>Excellent way to measure behavior change </li></ul><ul><li>Limitations : </li></ul><ul><li>Possibly disruptive </li></ul><ul><li>Reactive effects are possible </li></ul><ul><li>May be unreliable </li></ul><ul><li>Need trained observers </li></ul>
  24. 24. Written Tests <ul><li>Advantages : </li></ul><ul><li>Low purchase cost </li></ul><ul><li>Readily scored </li></ul><ul><li>Quickly processed </li></ul><ul><li>Easily administered </li></ul><ul><li>Wide sampling possible </li></ul><ul><li>Limitations : </li></ul><ul><li>May be threatening </li></ul><ul><li>Possibly no relation to job performance </li></ul><ul><li>Measures only cognitive learning </li></ul><ul><li>Relies on norms </li></ul><ul><li>Concern for racial/ ethnic bias </li></ul>
  25. 25. Simulation/Performance Tests <ul><li>Advantages : </li></ul><ul><li>Reliable </li></ul><ul><li>Objective </li></ul><ul><li>Close relation to job performance </li></ul><ul><li>Includes cognitive, psychomotor and affective domains </li></ul><ul><li>Limitations : </li></ul><ul><li>Time consuming </li></ul><ul><li>Simulations often difficult to create </li></ul><ul><li>High cost to develop and use </li></ul>
  26. 26. Archival Performance Data <ul><li>Advantages : </li></ul><ul><li>Reliable </li></ul><ul><li>Objective </li></ul><ul><li>Job-based </li></ul><ul><li>Easy to review </li></ul><ul><li>Minimal reactive effects </li></ul><ul><li>Limitations : </li></ul><ul><li>Criteria for keeping/ discarding records </li></ul><ul><li>Information system discrepancies </li></ul><ul><li>Indirect </li></ul><ul><li>Not always usable </li></ul><ul><li>Records prepared for other purposes </li></ul>
  27. 27. Choosing Data Collection Methods <ul><li>Reliability </li></ul><ul><ul><li>Consistency of results, and freedom from collection method bias and error </li></ul></ul><ul><li>Validity </li></ul><ul><ul><li>Does the device measure what we want to measure? </li></ul></ul><ul><li>Practicality </li></ul><ul><ul><li>Does it make sense in terms of the resources used to get the data? </li></ul></ul>
  28. 28. Type of Data Used/Needed <ul><li>Individual performance </li></ul><ul><li>Systemwide performance </li></ul><ul><li>Economic </li></ul>
  29. 29. Individual Performance Data <ul><li>Individual knowledge </li></ul><ul><li>Individual behaviors </li></ul><ul><li>Examples: </li></ul><ul><ul><li>Test scores </li></ul></ul><ul><ul><li>Performance quantity, quality, and timeliness </li></ul></ul><ul><ul><li>Attendance records </li></ul></ul><ul><ul><li>Attitudes </li></ul></ul>
  30. 30. Systemwide Performance Data <ul><li>Productivity </li></ul><ul><li>Scrap/rework rates </li></ul><ul><li>Customer satisfaction levels </li></ul><ul><li>On-time performance levels </li></ul><ul><li>Quality rates and improvement rates </li></ul>
  31. 31. Economic Data <ul><li>Profits </li></ul><ul><li>Product liability claims </li></ul><ul><li>Avoidance of penalties </li></ul><ul><li>Market share </li></ul><ul><li>Competitive position </li></ul><ul><li>Return on investment (ROI) </li></ul><ul><li>Financial utility calculations </li></ul>
  32. 32. Use of Self-Report Data <ul><li>Most common method </li></ul><ul><li>Pre-training and post-training data </li></ul><ul><li>Problems: </li></ul><ul><ul><li>Mono-method bias </li></ul></ul><ul><ul><ul><li>Desire to be consistent between tests </li></ul></ul></ul><ul><ul><li>Socially desirable responses </li></ul></ul><ul><ul><li>Response Shift Bias: </li></ul></ul><ul><ul><ul><li>Trainees adjust expectations to training </li></ul></ul></ul>
  33. 33. Research Design <ul><li>Specifies in advance: </li></ul><ul><li>the expected results of the study </li></ul><ul><li>the methods of data collection to be used </li></ul><ul><li>how the data will be analyzed </li></ul>
  34. 34. Research Design Issues <ul><li>Pretest and Posttest </li></ul><ul><ul><li>Shows trainee what training has accomplished </li></ul></ul><ul><ul><li>Helps eliminate pretest knowledge bias </li></ul></ul><ul><li>Control Group </li></ul><ul><ul><li>Compares performance of group with training against the performance of a similar group without training </li></ul></ul>
  35. 35. Recommended Research Design <ul><li>Pretest and posttest with control group </li></ul><ul><li>Whenever possible: </li></ul><ul><ul><li>Randomly assign individuals to the test group and the control group to minimize bias </li></ul></ul><ul><ul><li>Use “time-series” approach to data collection to verify performance improvement is due to training </li></ul></ul>
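The recommended pretest/posttest control-group design can be sketched as a simple gain comparison: the trained group's average improvement minus the control group's average improvement. All scores below are hypothetical, for illustration only.

```python
# Minimal sketch of a pretest/posttest control-group comparison.
# Scores are hypothetical placeholders.

def training_effect(pre_trained, post_trained, pre_control, post_control):
    """Estimate the training effect as the trained group's average gain
    minus the control group's average gain."""
    avg = lambda xs: sum(xs) / len(xs)
    gain_trained = avg(post_trained) - avg(pre_trained)
    gain_control = avg(post_control) - avg(pre_control)
    return gain_trained - gain_control

effect = training_effect(
    pre_trained=[60, 65, 70], post_trained=[80, 85, 90],
    pre_control=[62, 64, 69], post_control=[67, 70, 73],
)
print(effect)  # 15.0
```

Subtracting the control group's gain is what separates the training's contribution from improvements caused by other factors (economy, equipment, policies) noted earlier.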
  36. 36. Ethical Issues Concerning Evaluation Research <ul><li>Confidentiality </li></ul><ul><li>Informed consent </li></ul><ul><li>Withholding training from control groups </li></ul><ul><li>Use of deception </li></ul><ul><li>Pressure to produce positive results </li></ul>
  37. 37. Assessing the Impact of HRD <ul><li>Money is the language of business. </li></ul><ul><li>You MUST talk dollars, not HRD jargon. </li></ul><ul><li>No one (except maybe you) cares about “the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control-group data.” </li></ul>
  38. 38. HRD Program Assessment <ul><li>HRD programs and training are investments </li></ul><ul><li>Line managers often see HR and HRD as costs – i.e., revenue users, not revenue producers </li></ul><ul><li>You must prove your worth to the organization </li></ul><ul><ul><li>Or you’ll have to find another organization… </li></ul></ul>
  39. 39. Evaluation of Training Costs <ul><li>Cost-benefit analysis </li></ul><ul><ul><li>Compares cost of training to benefits gained such as attitudes, reduction in accidents, reduction in employee sick-days, etc. </li></ul></ul><ul><li>Cost-effectiveness analysis </li></ul><ul><ul><li>Focuses on increases in quality, reduction in scrap/rework, productivity, etc. </li></ul></ul>
  40. 40. Return on Investment <ul><li>Return on investment = Results/Costs </li></ul>
  41. 41. Calculating Training Return on Investment <ul><li>Quality of panels – measured as % rejected: 2% rejected (1,440 panels per day) before training; 1.5% rejected (1,080 panels per day) after training; a difference of 0.5% (360 panels), worth $720 per day, or $172,800 per year </li></ul><ul><li>Housekeeping – measured by visual inspection using a 20-item checklist: 10 defects (average) before training; 2 defects (average) after training; a difference of 8 defects, not measurable in $ </li></ul><ul><li>Preventable accidents – 24 per year before training; 16 per year after training; a difference of 8 per year; direct cost of $144,000 per year before vs. $96,000 per year after, a saving of $48,000 per year </li></ul><ul><li>Total savings: $220,800 per year </li></ul><ul><li>ROI = Operational Results / Training Costs = $220,800 / $32,564 = 6.8 </li></ul><ul><li>SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission. </li></ul>
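The ROI arithmetic in the Robinson & Robinson example can be sketched as follows, using the dollar figures from the slide (only the two dollar-measurable benefits enter the total):

```python
# ROI calculation using the figures from the Robinson & Robinson example.
panel_savings_per_year = 172_800    # quality-of-panels improvement
accident_savings_per_year = 48_000  # fewer preventable accidents
total_savings = panel_savings_per_year + accident_savings_per_year  # $220,800

training_costs = 32_564             # total training costs from the example

roi = total_savings / training_costs
print(round(roi, 1))  # 6.8
```

Note that the housekeeping improvement, while real, is excluded from the numerator because it was not measurable in dollars.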
  42. 42. Types of Training Costs <ul><li>Direct costs </li></ul><ul><li>Indirect costs </li></ul><ul><li>Development costs </li></ul><ul><li>Overhead costs </li></ul><ul><li>Compensation for participants </li></ul>
  43. 43. Direct Costs <ul><li>Instructor </li></ul><ul><ul><li>Base pay </li></ul></ul><ul><ul><li>Fringe benefits </li></ul></ul><ul><ul><li>Travel and per diem </li></ul></ul><ul><li>Materials </li></ul><ul><li>Classroom and audiovisual equipment </li></ul><ul><li>Travel </li></ul><ul><li>Food and refreshments </li></ul>
  44. 44. Indirect Costs <ul><li>Training management </li></ul><ul><li>Clerical/Administrative </li></ul><ul><li>Postal/shipping, telephone, computers, etc. </li></ul><ul><li>Pre- and post-learning materials </li></ul><ul><li>Other overhead costs </li></ul>
  45. 45. Development Costs <ul><li>Fee to purchase program </li></ul><ul><li>Costs to tailor program to organization </li></ul><ul><li>Instructor training costs </li></ul>
  46. 46. Overhead Costs <ul><li>General organization support </li></ul><ul><li>Top management participation </li></ul><ul><li>Utilities, facilities </li></ul><ul><li>General and administrative costs, such as HRM </li></ul>
  47. 47. Compensation for Participants <ul><li>Participants’ salary and benefits for time away from job </li></ul><ul><li>Travel, lodging, and per-diem costs </li></ul>
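Totaling the five cost categories above is straightforward bookkeeping; a minimal sketch follows. The dollar amounts are hypothetical placeholders, not figures from the text.

```python
# Hedged sketch: totaling training costs by the five categories above.
# All dollar amounts are hypothetical placeholders.
costs = {
    "direct": 9_000,         # instructor pay, materials, travel, food
    "indirect": 4_500,       # admin support, shipping, pre/post materials
    "development": 6_000,    # program purchase and tailoring
    "overhead": 3_000,       # facilities, general organization support
    "compensation": 10_000,  # participants' salary/benefits while training
}
total_training_cost = sum(costs.values())
print(total_training_cost)  # 32500
```

This total is the denominator in the ROI and cost-benefit calculations; understating any category (participant compensation is easy to forget) inflates the apparent return.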
  48. 48. Measuring Benefits <ul><ul><li>Change in quality per unit measured in dollars </li></ul></ul><ul><ul><li>Reduction in scrap/rework measured in dollar cost of labor and materials </li></ul></ul><ul><ul><li>Reduction in preventable accidents measured in dollars </li></ul></ul><ul><ul><li>ROI = Benefits/Training costs </li></ul></ul>
  49. 49. Utility Analysis <ul><li>Uses a statistical approach to support claims of training effectiveness: </li></ul><ul><ul><li>N = Number of trainees </li></ul></ul><ul><ul><li>T = Length of time benefits are expected to last </li></ul></ul><ul><ul><li>d_t = True performance difference resulting from training </li></ul></ul><ul><ul><li>SD_y = Dollar value of untrained job performance (in standard deviation units) </li></ul></ul><ul><ul><li>C = Cost of training </li></ul></ul><ul><li> U = (N)(T)(d_t)(SD_y) – C </li></ul>
  50. 50. Critical Information for Utility Analysis <ul><li>d_t = difference in units produced between trained and untrained workers, divided by the standard deviation in units produced by the trained group </li></ul><ul><li>SD_y = standard deviation, in dollars, of job performance (overall productivity) in the organization </li></ul>
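The utility formula U = (N)(T)(d_t)(SD_y) – C can be sketched directly. The input values below are hypothetical, chosen only to show the arithmetic.

```python
# Sketch of the utility formula U = (N)(T)(d_t)(SD_y) - C.
# All input values are hypothetical placeholders.

def utility(n_trainees, years, d_t, sd_y, cost):
    """Dollar value of a training program: N trainees, benefits lasting
    T years, true effect size d_t, SD_y dollars per standard deviation
    of performance, minus total training cost C."""
    return n_trainees * years * d_t * sd_y - cost

u = utility(n_trainees=50, years=2, d_t=0.5, sd_y=10_000, cost=100_000)
print(u)  # 400000.0
```

A positive U supports the claim that the program adds value; note how sensitive the result is to the estimates of d_t and SD_y, which is why the earlier advice to use credible, conservative estimates matters.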
  51. 51. Ways to Improve HRD Assessment <ul><li>Walk the walk, talk the talk: MONEY </li></ul><ul><li>Involve HRD in strategic planning </li></ul><ul><li>Involve management in HRD planning and estimation efforts </li></ul><ul><ul><li>Gain mutual ownership </li></ul></ul><ul><li>Use credible and conservative estimates </li></ul><ul><li>Share credit for successes and blame for failures </li></ul>
  52. 52. HRD Evaluation Steps <ul><li>Analyze needs. </li></ul><ul><li>Determine explicit evaluation strategy. </li></ul><ul><li>Insist on specific and measurable training objectives. </li></ul><ul><li>Obtain participant reactions. </li></ul><ul><li>Develop criterion measures/instruments to measure results. </li></ul><ul><li>Plan and execute evaluation strategy. </li></ul>
  53. 53. Summary <ul><li>Training results must be measured against costs </li></ul><ul><li>Training must contribute to the “bottom line” </li></ul><ul><li>HRD must justify itself repeatedly as a revenue enhancer, not a revenue waster </li></ul>