    • 1. Evaluating HRD Programs
    • 2. Effectiveness
      • The degree to which a training (or other HRD) program achieves its intended purpose
      • Measures are relative to some starting point
      • Measures how well the desired goal is achieved
    • 3. Evaluation
    • 4. HRD Evaluation
      • Textbook definition:
      • “The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities.”
    • 5. In Other Words…
      • Are we training:
      • the right people
      • the right “stuff”
      • the right way
      • with the right materials
      • at the right time?
    • 6. Evaluation Needs
      • Descriptive and judgmental information needed
        • Objective and subjective data
      • Information gathered according to a plan and in a desired format
      • Gathered to provide decision making information
    • 7. Purposes of Evaluation
      • Determine whether the program is meeting the intended objectives
      • Identify strengths and weaknesses
      • Determine cost-benefit ratio
      • Identify who benefited most or least
      • Determine future participants
      • Provide information for improving HRD programs
    • 8. Purposes of Evaluation – 2
      • Reinforce major points to be made
      • Gather marketing information
      • Determine if training program is appropriate
      • Establish management database
    • 9. Evaluation Bottom Line
      • Is HRD a revenue contributor or a revenue user?
      • Is HRD credible to line and upper-level managers?
      • Are benefits of HRD readily evident to all?
    • 10. How Often are HRD Evaluations Conducted?
      • Not often enough!!!
      • Frequently, only end-of-course participant reactions are collected
      • Transfer to the workplace is evaluated less frequently
    • 11. Why HRD Evaluations are Rare
      • Reluctance to have HRD programs evaluated
      • Evaluation needs expertise and resources
      • Factors other than HRD cause performance improvements – e.g.,
        • Economy
        • Equipment
        • Policies, etc.
    • 12. Need for HRD Evaluation
      • Shows the value of HRD
      • Provides metrics for HRD efficiency
      • Demonstrates value-added approach for HRD
      • Demonstrates accountability for HRD activities
      • Everyone else has it… why not HRD?
    • 13. Make or Buy Evaluation
      • “I bought it, therefore it is good.”
      • “Since it’s good, I don’t need to post-test.”
      • Who says it’s:
        • Appropriate?
        • Effective?
        • Timely?
        • Transferable to the workplace?
    • 14. Evolution of Evaluation Efforts
      • Anecdotal approach – talk to other users
      • Try before buy – borrow and use samples
      • Analytical approach – match research data to training needs
      • Holistic approach – look at overall HRD process, as well as individual training
    • 15. Models and Frameworks of Evaluation
      • Table 7-1 lists six frameworks for evaluation
      • The most popular is that of D. Kirkpatrick:
        • Reaction
        • Learning
        • Job Behavior
        • Results
    • 16. Kirkpatrick’s Four Levels
      • Reaction
        • Focus on trainee’s reactions
      • Learning
        • Did they learn what they were supposed to?
      • Job Behavior
        • Was it used on the job?
      • Results
        • Did it improve the organization’s effectiveness?
    • 17. Issues Concerning Kirkpatrick’s Framework
      • Most organizations don’t evaluate at all four levels
      • Focuses only on post-training
      • Doesn’t treat inter-stage improvements
      • WHAT ARE YOUR THOUGHTS?
    • 18. A Suggested Framework – 1
      • Reaction
        • Did trainees like the training?
        • Did the training seem useful?
      • Learning
        • How much did they learn?
      • Behavior
        • What behavior change occurred?
    • 19. Suggested Framework – 2
      • Results
        • What were the tangible outcomes?
        • What was the return on investment (ROI)?
        • What was the contribution to the organization?
    • 20. Data Collection for HRD Evaluation
      • Possible methods:
      • Interviews
      • Questionnaires
      • Direct observation
      • Written tests
      • Simulation/Performance tests
      • Archival performance information
    • 21. Interviews
      • Advantages:
      • Flexible
      • Opportunity for clarification
      • Depth possible
      • Personal contact
      • Limitations:
      • High reactive effects
      • High cost
      • Face-to-face threat potential
      • Labor intensive
      • Trained interviewers needed
    • 22. Questionnaires
      • Advantages:
      • Low cost to administer
      • Honesty increased
      • Anonymity possible
      • Respondent sets the pace
      • Variety of options
      • Limitations:
      • Possible inaccurate data
      • Response conditions not controlled
      • Respondents set varying paces
      • Uncontrolled return rate
    • 23. Direct Observation
      • Advantages:
      • Nonthreatening
      • Excellent way to measure behavior change
      • Limitations:
      • Possibly disruptive
      • Reactive effects are possible
      • May be unreliable
      • Need trained observers
    • 24. Written Tests
      • Advantages:
      • Low purchase cost
      • Readily scored
      • Quickly processed
      • Easily administered
      • Wide sampling possible
      • Limitations:
      • May be threatening
      • Possibly no relation to job performance
      • Measures only cognitive learning
      • Relies on norms
      • Concern for racial/ethnic bias
    • 25. Simulation/Performance Tests
      • Advantages:
      • Reliable
      • Objective
      • Close relation to job performance
      • Includes cognitive, psychomotor and affective domains
      • Limitations:
      • Time consuming
      • Simulations often difficult to create
      • High costs to develop and use
    • 26. Archival Performance Data
      • Advantages:
      • Reliable
      • Objective
      • Job-based
      • Easy to review
      • Minimal reactive effects
      • Limitations:
      • Criteria for keeping/discarding records
      • Information system discrepancies
      • Indirect
      • Not always usable
      • Records prepared for other purposes
    • 27. Choosing Data Collection Methods
      • Reliability
        • Consistency of results, and freedom from collection method bias and error (see the sketch after this list)
      • Validity
        • Does the device measure what we want to measure?
      • Practicality
        • Does it make sense in terms of the resources used to get the data?
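Reliability, in particular, can be estimated directly from data. Below is a minimal test-retest sketch, assuming invented scores and Python 3.10+ (for statistics.correlation); a real evaluation would use a fuller psychometric analysis.

```python
from statistics import correlation  # Python 3.10+

# Test-retest check: give the same instrument to the same people twice
# and correlate the two score sets. All scores here are invented.
first_administration  = [70, 85, 60, 90, 75, 80]
second_administration = [72, 83, 62, 88, 77, 79]

r = correlation(first_administration, second_administration)
print(f"Test-retest reliability estimate: r = {r:.2f}")
# r near 1.0 suggests consistent measurement; r near 0 suggests the
# scores are dominated by error.
```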
    • 28. Type of Data Used/Needed
      • Individual performance
      • Systemwide performance
      • Economic
    • 29. Individual Performance Data
      • Individual knowledge
      • Individual behaviors
      • Examples:
        • Test scores
        • Performance quantity, quality, and timeliness
        • Attendance records
        • Attitudes
    • 30. Systemwide Performance Data
      • Productivity
      • Scrap/rework rates
      • Customer satisfaction levels
      • On-time performance levels
      • Quality rates and improvement rates
    • 31. Economic Data
      • Profits
      • Product liability claims
      • Avoidance of penalties
      • Market share
      • Competitive position
      • Return on investment (ROI)
      • Financial utility calculations
    • 32. Use of Self-Report Data
      • Most common method
      • Pre-training and post-training data
      • Problems:
        • Mono-method bias
          • Desire to be consistent between tests
        • Socially desirable responses
        • Response Shift Bias:
          • Trainees adjust expectations to training
    • 33. Research Design
      • Specifies in advance:
      • the expected results of the study
      • the methods of data collection to be used
      • how the data will be analyzed
    • 34. Research Design Issues
      • Pretest and Posttest
        • Shows trainee what training has accomplished
        • Helps eliminate pretest knowledge bias
      • Control Group
        • Compares performance of group with training against the performance of a similar group without training
    • 35. Recommended Research Design
      • Pretest and posttest with control group
      • Whenever possible:
        • Randomly assign individuals to the test group and the control group to minimize bias
        • Use “time-series” approach to data collection to verify performance improvement is due to training
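To make the recommended design concrete, the sketch below compares mean gain scores for a trained group against a control group. It is a minimal illustration, not the textbook's procedure: all scores are invented, and in practice you would apply a proper significance test rather than comparing raw means.

```python
from statistics import mean

# Hypothetical pretest/posttest scores for randomly assigned groups.
trained_pre, trained_post = [62, 58, 70, 65], [81, 76, 88, 84]
control_pre, control_post = [61, 60, 68, 66], [64, 63, 70, 69]

# Gain score per person: posttest minus pretest.
trained_gain = mean(post - pre for pre, post in zip(trained_pre, trained_post))
control_gain = mean(post - pre for pre, post in zip(control_pre, control_post))

# Netting out the control group's gain removes improvement that would
# have occurred without training (economy, equipment, policies, etc.).
print(f"Trained group gain: {trained_gain:.2f} points")
print(f"Control group gain: {control_gain:.2f} points")
print(f"Estimated training effect: {trained_gain - control_gain:.2f} points")
```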
    • 36. Ethical Issues Concerning Evaluation Research
      • Confidentiality
      • Informed consent
      • Withholding training from control groups
      • Use of deception
      • Pressure to produce positive results
    • 37. Assessing the Impact of HRD
      • Money is the language of business.
      • You MUST talk dollars, not HRD jargon.
      • No one (except maybe you) cares about “the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control group data.”
    • 38. HRD Program Assessment
      • HRD programs and training are investments
      • Line managers often see HR and HRD as costs – i.e., revenue users, not revenue producers
      • You must prove your worth to the organization
        • Or you’ll have to find another organization…
    • 39. Evaluation of Training Costs
      • Cost-benefit analysis
        • Compares the cost of training to the benefits gained, such as improved attitudes, reductions in accidents, reductions in employee sick days, etc.
      • Cost-effectiveness analysis
        • Focuses on increases in quality, reduction in scrap/rework, productivity, etc.
    • 40. Return on Investment
      • Return on investment = Results/Costs
    • 41. Calculating Training Return on Investment
      • Quality of panels
        • How measured: % rejected
        • Before training: 2% rejected (1,440 panels per day)
        • After training: 1.5% rejected (1,080 panels per day)
        • Difference: 0.5% (360 panels per day)
        • Expressed in $: $720 per day ($172,800 per year)
      • Housekeeping
        • How measured: visual inspection using a 20-item checklist
        • Before training: 10 defects (average)
        • After training: 2 defects (average)
        • Difference: 8 defects
        • Expressed in $: not measurable in $
      • Preventable accidents
        • How measured: number of accidents and direct cost of each accident
        • Before training: 24 per year ($144,000 per year)
        • After training: 16 per year ($96,000 per year)
        • Difference: 8 per year
        • Expressed in $: $48,000 per year
      • Total savings: $220,800
      • ROI = Operational Results / Training Costs = $220,800 / $32,564 = 6.8
      • SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
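To make the arithmetic concrete, the following minimal Python sketch reproduces the figures above. The variable names are ours; the 240-working-day year is inferred from the $720-per-day and $172,800-per-year figures, and the $32,564 training cost is taken as given in the original example.

```python
# Reproduces the Robinson & Robinson (1989) ROI example above.
# Figures come from the table; variable names are illustrative.

# Quality of panels: 0.5% fewer rejects, worth $720 per day.
panel_savings_per_day = 720
work_days_per_year = 240               # implied by $720/day -> $172,800/year
panel_savings = panel_savings_per_day * work_days_per_year   # $172,800

# Preventable accidents: direct costs fell from $144,000 to $96,000 per year.
accident_savings = 144_000 - 96_000    # $48,000

# Housekeeping improved (10 -> 2 defects) but was "not measurable in $",
# so it adds nothing to the quantified total.
total_savings = panel_savings + accident_savings             # $220,800

training_costs = 32_564                # taken as given in the example
roi = total_savings / training_costs
print(f"Total savings: ${total_savings:,}")   # Total savings: $220,800
print(f"ROI = {roi:.1f}")                     # ROI = 6.8
```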
    • 42. Types of Training Costs
      • Direct costs
      • Indirect costs
      • Development costs
      • Overhead costs
      • Compensation for participants
    • 43. Direct Costs
      • Instructor
        • Base pay
        • Fringe benefits
        • Travel and per diem
      • Materials
      • Classroom and audiovisual equipment
      • Travel
      • Food and refreshments
    • 44. Indirect Costs
      • Training management
      • Clerical/Administrative
      • Postal/shipping, telephone, computers, etc.
      • Pre- and post-learning materials
      • Other overhead costs
    • 45. Development Costs
      • Fee to purchase program
      • Costs to tailor program to organization
      • Instructor training costs
    • 46. Overhead Costs
      • General organization support
      • Top management participation
      • Utilities, facilities
      • General and administrative costs, such as HRM
    • 47. Compensation for Participants
      • Participants’ salary and benefits for time away from job
      • Travel, lodging, and per-diem costs
    • 48. Measuring Benefits
        • Change in quality per unit measured in dollars
        • Reduction in scrap/rework measured in dollar cost of labor and materials
        • Reduction in preventable accidents measured in dollars
        • ROI = Benefits/Training costs
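Putting the cost categories from slides 42-47 together with these benefit measures, here is a hedged sketch of the full calculation. Only the category structure comes from the slides; every dollar amount is invented for illustration.

```python
# Rolls up the five cost categories (slides 42-47) and divides total
# benefits by total costs. Every dollar figure here is invented.

costs = {
    "direct":       12_000,   # instructor pay/benefits, materials, travel, food
    "indirect":      3_500,   # admin support, shipping, pre/post materials
    "development":   6_000,   # program purchase, tailoring, instructor training
    "overhead":      2_500,   # facilities, utilities, general support
    "compensation":  8_500,   # participants' salary/benefits while off the job
}
total_costs = sum(costs.values())

benefits = {
    "quality_change_in_dollars":  40_000,
    "scrap_rework_savings":       25_000,
    "accident_reduction":         15_000,
}
total_benefits = sum(benefits.values())

print(f"Total training costs: ${total_costs:,}")      # $32,500
print(f"Total benefits:       ${total_benefits:,}")   # $80,000
print(f"ROI = {total_benefits / total_costs:.2f}")    # ROI = 2.46
```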
    • 49. Utility Analysis
      • Uses a statistical approach to support claims of training effectiveness:
        • N = Number of trainees
        • T = Length of time benefits are expected to last
        • d_t = True performance difference resulting from training
        • SD_y = Dollar value of untrained job performance (in standard deviation units)
        • C = Cost of training
      • U = (N)(T)(d_t)(SD_y) – C
    • 50. Critical Information for Utility Analysis
      • d_t = difference in units between trained/untrained, divided by standard deviation in units produced by trained
      • SD_y = standard deviation in dollars, or overall productivity of organization
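A minimal worked example of the formula, with every input value chosen purely for illustration:

```python
# Utility analysis: U = (N)(T)(d_t)(SD_y) - C
# Every input value below is an illustrative assumption.

N    = 50       # number of trainees
T    = 2.0      # years the benefit is expected to last
d_t  = 0.5      # true performance difference in SD units (trained minus
                # untrained output, divided by the SD of the trained group)
SD_y = 10_000   # dollar value of one SD of job performance
C    = 75_000   # total cost of training all N trainees

U = N * T * d_t * SD_y - C
print(f"Estimated utility: ${U:,.0f}")   # Estimated utility: $425,000
```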
    • 51. Ways to Improve HRD Assessment
      • Walk the walk, talk the talk: MONEY
      • Involve HRD in strategic planning
      • Involve management in HRD planning and estimation efforts
        • Gain mutual ownership
      • Use credible and conservative estimates
      • Share credit for successes and blame for failures
    • 52. HRD Evaluation Steps
      • Analyze needs.
      • Determine explicit evaluation strategy.
      • Insist on specific and measurable training objectives.
      • Obtain participant reactions.
      • Develop criterion measures/instruments to measure results.
      • Plan and execute evaluation strategy.
    • 53. Summary
      • Training results must be measured against costs
      • Training must contribute to the “bottom line”
      • HRD must justify itself repeatedly as a revenue enhancer, not a revenue waster
