Transcript of "Evaluating HRD Programs"

Slide 1: Evaluating HRD Programs
By Si-Hosseini

Slide 2: Effectiveness
• The degree to which a training program (or other HRD program) achieves its intended purpose
• Measures are relative to some starting point
• Measures how well the desired goal is achieved
Slide 3: Evaluation

Slide 4: HRD Evaluation
Textbook definition: "The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities."

Slide 5: In Other Words…
Are we training:
• the right people
• the right "stuff"
• the right way
• with the right materials
• at the right time?
Slide 6: Evaluation Needs
• Descriptive and judgmental information needed
• Objective and subjective data
• Information gathered according to a plan and in a desired format
• Gathered to provide decision-making information
Slide 7: Purposes of Evaluation
• Determine whether the program is meeting the intended objectives
• Identify strengths and weaknesses
• Determine the cost-benefit ratio
• Identify who benefited most or least
• Determine future participants
• Provide information for improving HRD programs

Slide 8: Purposes of Evaluation – 2
• Reinforce major points to be made
• Gather marketing information
• Determine if the training program is appropriate
• Establish a management database

Slide 9: Evaluation Bottom Line
• Is HRD a revenue contributor or a revenue user?
• Is HRD credible to line and upper-level managers?
• Are the benefits of HRD readily evident to all?

Slide 10: How Often Are HRD Evaluations Conducted?
• Not often enough!
• Frequently, only end-of-course participant reactions are collected
• Transfer to the workplace is evaluated even less frequently
Slide 11: Why HRD Evaluations Are Rare
• Reluctance to having HRD programs evaluated
• Evaluation needs expertise and resources: it costs time and money, and HR staff who cannot do it alone must outsource tools
• Factors other than HRD also cause performance improvements, e.g., the economy, equipment, and policies
Slide 12: Need for HRD Evaluation
• Shows the value of HRD
• Provides metrics for HRD efficiency
• Demonstrates a value-added approach for HRD
• Demonstrates accountability for HRD activities
• Everyone else has it… why not HRD?

Slide 13: Make or Buy Evaluation
• "I bought it, therefore it is good."
• "Since it's good, I don't need to post-test."
• Who says it's:
  • Appropriate?
  • Effective?
  • Timely?
  • Transferable to the workplace?

Slide 14: Evolution of Evaluation Efforts
1. Anecdotal (story) approach – talk to other users
2. Try before buy – borrow and use samples
3. Analytical approach – match research data to training needs
4. Holistic approach – look at the overall HRD process, as well as individual training

Slide 15: Models and Frameworks of Evaluation
• Table 7-1 lists six frameworks for evaluation
• The most popular is that of D. Kirkpatrick:
  • Reaction
  • Learning
  • Job Behavior
  • Results

Slide 16: Kirkpatrick's Four Levels (1994)
• Reaction: focus on the trainee's reactions immediately after the program (e.g., "Did you like the program?")
• Learning: did they learn what they were supposed to?
• Job Behavior: was it used on the job?
• Results: did it improve the organization's effectiveness?
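A minimal sketch of how these four levels might be organized as an evaluation plan. The level names and guiding questions come from the slide above; the instruments paired with them here are illustrative assumptions, not prescriptions.

```python
# Hypothetical evaluation plan keyed to Kirkpatrick's four levels.
kirkpatrick_plan = {
    "reaction":     {"question": "Did trainees like the program?",
                     "instrument": "end-of-course questionnaire"},        # assumed
    "learning":     {"question": "Did they learn what they were supposed to?",
                     "instrument": "pretest/posttest written exam"},      # assumed
    "job_behavior": {"question": "Was it used on the job?",
                     "instrument": "supervisor observation after 90 days"},  # assumed
    "results":      {"question": "Did organizational effectiveness improve?",
                     "instrument": "scrap rates, accident counts, ROI"},  # assumed
}

for level, spec in kirkpatrick_plan.items():
    print(f"{level}: {spec['question']} -> {spec['instrument']}")
```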
Slide 17: Issues Concerning Kirkpatrick's Framework
• Most organizations don't evaluate at all four levels
• Focuses only on post-training
• Doesn't treat inter-stage improvements
• WHAT ARE YOUR THOUGHTS?

Slide 18: Other Frameworks/Models
• CIPP: Context, Input, Process, Product (Galvin, 1983)
• Brinkerhoff (1987):
  • Goal setting
  • Program design
  • Program implementation
  • Immediate outcomes
  • Usage outcomes
  • Impacts and worth

Slide 19: Other Frameworks/Models – 2
• Kraiger, Ford, & Salas (1993):
  • Cognitive outcomes
  • Skill-based outcomes
  • Affective outcomes
• Holton (1996), five categories:
  • Secondary influences
  • Motivation elements
  • Environmental elements
  • Outcomes
  • Ability/enabling elements

Slide 20: Other Frameworks/Models – 3
• Phillips (1996):
  • Reaction and planned action
  • Learning
  • Applied learning on the job
  • Business results
  • ROI

Slide 21: A Suggested Framework – 1
• Reaction
  • Did trainees like the training?
  • Did the training seem useful?
• Learning
  • How much did they learn?
• Behavior
  • What behavior change occurred?

Slide 22: Suggested Framework – 2
• Results
  • What were the tangible outcomes?
  • What was the return on investment (ROI)?
  • What was the contribution to the organization?

Slide 23: Data Collection for HRD Evaluation
Possible methods:
• Interviews
• Questionnaires
• Direct observation
• Written tests
• Simulation/performance tests
• Archival performance information
Slide 24: Interviews
Advantages:
• Flexible
• Opportunity for clarification
• Depth possible
• Personal contact
Limitations:
• High reactive effects
• High cost
• Face-to-face threat potential
• Labor intensive
• Trained observers needed

Slide 25: Questionnaires
Advantages:
• Low cost to administer
• Honesty increased
• Anonymity possible
• Respondent sets the pace
• Variety of options
Limitations:
• Possible inaccurate data
• Response conditions not controlled
• Respondents set varying paces
• Uncontrolled return rate

Slide 26: Direct Observation
Advantages:
• Nonthreatening
• Excellent way to measure behavior change
Limitations:
• Possibly disruptive
• Reactive effects are possible
• May be unreliable
• Trained observers needed

Slide 27: Written Tests
Advantages:
• Low purchase cost
• Readily scored
• Quickly processed
• Easily administered
• Wide sampling possible
Limitations:
• May be threatening
• Possibly no relation to job performance
• Measures only cognitive learning
• Relies on norms
• Concern for racial/ethnic bias

Slide 28: Simulation/Performance Tests
Advantages:
• Reliable
• Objective
• Close relation to job performance
• Includes cognitive, psychomotor, and affective domains
Limitations:
• Time consuming
• Simulations often difficult to create
• High cost to develop and use

Slide 29: Archival Performance Data
Advantages:
• Reliable
• Objective
• Job-based
• Easy to review
• Minimal reactive effects
Limitations:
• Criteria for keeping/discarding records
• Information system discrepancies
• Indirect
• Not always usable
• Records prepared for other purposes

Slide 30: Choosing Data Collection Methods
• Reliability: consistency of results, and freedom from collection-method bias and error
• Validity: does the device measure what we want to measure?
• Practicality: does it make sense in terms of the resources used to get the data?
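One way to make the reliability criterion concrete is a test-retest check: correlate two administrations of the same instrument to the same trainees. A minimal sketch with invented scores; `statistics.correlation` requires Python 3.10+.

```python
# Test-retest reliability: correlate two administrations of one instrument.
from statistics import correlation  # Python 3.10+

first_administration  = [72, 85, 64, 90, 78, 69, 81]   # invented scores
second_administration = [70, 88, 61, 92, 75, 72, 79]   # invented scores

r = correlation(first_administration, second_administration)
print(f"test-retest reliability estimate: r = {r:.2f}")  # closer to 1.0 = more consistent
```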
Slide 31: Type of Data Used/Needed
• Individual performance
• Systemwide performance
• Economic

Slide 32: Individual Performance Data
• Individual knowledge
• Individual behaviors
• Examples:
  • Test scores
  • Performance quantity, quality, and timeliness
  • Attendance records
  • Attitudes

Slide 33: Systemwide Performance Data
• Productivity
• Scrap/rework rates (waste)
• Customer satisfaction levels
• On-time performance levels
• Quality rates and improvement rates

Slide 34: Economic Data
• Profits
• Product liability claims
• Avoidance of penalties
• Market share
• Competitive position
• Return on investment (ROI)
• Financial utility calculations
Slide 35: Use of Self-Report Data
• Most common method
• Pre-training and post-training data
• Problems:
  • Mono-method bias (desire to be consistent between tests)
  • Socially desirable responses
  • Response shift bias: trainees adjust their expectations to the training
Slide 36: Research Design
Specifies in advance:
• the expected results of the study
• the methods of data collection to be used
• how the data will be analyzed

Slide 37: Research Design Issues
• Pretest and posttest
  • Shows the trainee what training has accomplished
  • Helps eliminate pretest knowledge bias
• Control group
  • Compares the performance of a group with training against the performance of a similar group without training

Slide 38: Recommended Research Design
• Pretest and posttest with control group
• Whenever possible:
  • Randomly assign individuals to the test group and the control group to minimize bias
  • Use a "time-series" approach to data collection to verify that performance improvement is due to training
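A small sketch of the analysis this design supports, with invented scores: compare the trained group's average gain against the control group's, so improvements that affect both groups (the economy, new equipment, and so on) cancel out.

```python
# Pretest-posttest with control group: compare average gains across groups.
from statistics import mean

trained_pre,  trained_post = [60, 55, 70, 62], [78, 74, 85, 80]  # invented
control_pre,  control_post = [61, 58, 68, 63], [64, 60, 70, 66]  # invented

trained_gain = mean(post - pre for pre, post in zip(trained_pre, trained_post))
control_gain = mean(post - pre for pre, post in zip(control_pre, control_post))

# The difference in gains is the improvement attributable to training.
print(f"training effect estimate: {trained_gain - control_gain:.1f} points")
```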
Slide 39: Ethical Issues Concerning Evaluation Research
• Confidentiality
• Informed consent
• Withholding training from control groups
• Use of deception
• Pressure to produce positive results

Slide 40: Assessing the Impact of HRD
• Money is the language of business.
• You MUST talk dollars, not HRD jargon.
• No one (except maybe you) cares about "the effectiveness of training interventions as measured by an analysis of formal pretest/posttest control-group data."
Slide 41: HRD Program Assessment
• HRD programs and training are investments
• Line managers often see HR and HRD as costs – i.e., revenue users, not revenue producers
• You must prove your worth to the organization…
• Or you'll have to find another organization

Slide 42: Two Basic Methods for Assessing Financial Impact
• Evaluation of training costs
• Utility analysis

Slide 43: Evaluation of Training Costs
• Cost-benefit analysis
  • Compares the cost of training to the benefits gained, such as improved attitudes, reduction in accidents, reduction in employee sick days, etc.
• Cost-effectiveness analysis
  • Focuses on increases in quality, reduction in scrap/rework, productivity, etc.

Slide 44: Return on Investment
• Return on investment = Results / Costs
Slide 45: Calculating Training Return on Investment

Operational      How                Results Before    Results After     Differences     Expressed
Results Area     Measured           Training          Training          (+ or –)        in $
---------------  -----------------  ----------------  ----------------  --------------  -----------------
Quality of       % rejected         2% rejected:      1.5% rejected:    0.5%:           $720 per day;
panels                              1,440 panels      1,080 panels      360 panels      $172,800 per year
                                    per day           per day
Housekeeping     Visual inspection  10 defects        2 defects         8 defects       Not measurable
                 using a 20-item    (average)         (average)                         in $
                 checklist
Preventable      Number of          24 per year       16 per year       8 per year
accidents        accidents
                 Direct cost of     $144,000          $96,000           $48,000         $48,000 per year
                 each accident      per year          per year

Total savings: $220,800

ROI = Operational Results / Training Costs = $220,800 / $32,564 = 6.8

SOURCE: From D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
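The table's arithmetic, restated as a short script. The $2-per-panel cost and the 240 working days per year are back-calculated from the table's own figures ($720 / 360 panels; $172,800 / $720), not stated in the source.

```python
# Reproducing the Robinson & Robinson ROI arithmetic from the table above.
panels_saved_per_day = 1_440 - 1_080                      # rejected panels avoided per day
panel_savings_year   = panels_saved_per_day * 2.00 * 240  # $2/panel x 240 days = $172,800

accidents_avoided = 24 - 16
accident_savings  = accidents_avoided * 6_000             # $144,000 / 24 = $6,000 per accident

operational_results = panel_savings_year + accident_savings  # $220,800
training_costs      = 32_564

roi = operational_results / training_costs
print(f"ROI = {roi:.1f}")  # ~6.8, matching the table
```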
Slide 46: Types of Training Costs
• Direct costs
• Indirect costs
• Development costs
• Overhead costs
• Compensation for participants

Slide 47: Direct Costs
• Instructor
  • Base pay
  • Fringe benefits
  • Travel and per diem
• Materials
• Classroom and audiovisual equipment
• Travel
• Food and refreshments

Slide 48: Indirect Costs
• Training management
• Clerical/administrative
• Postal/shipping, telephone, computers, etc.
• Pre- and post-learning materials
• Other overhead costs

Slide 49: Development Costs
• Fee to purchase the program
• Costs to tailor the program to the organization
• Instructor training costs

Slide 50: Overhead Costs
• General organization support
• Top management participation
• Utilities, facilities
• General and administrative costs, such as HRM

Slide 51: Compensation for Participants
• Participants' salary and benefits for time away from the job
• Travel, lodging, and per-diem costs

Slide 52: Measuring Benefits
• Change in quality per unit, measured in dollars
• Reduction in scrap/rework, measured in the dollar cost of labor and materials
• Reduction in preventable accidents, measured in dollars
• ROI = Benefits / Training costs
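A hedged sketch tying slides 46 through 52 together: total the five cost categories, total the dollar-valued benefits, and take the ratio. All dollar figures here are invented for illustration.

```python
# ROI = Benefits / Training costs, over the five cost categories above.
costs = {
    "direct":       12_000,  # instructor pay, materials, equipment, travel (invented)
    "indirect":      3_500,  # admin support, shipping, pre/post materials (invented)
    "development":   6_000,  # purchase fee, tailoring, instructor training (invented)
    "overhead":      2_000,  # facilities, utilities, general support (invented)
    "compensation":  9_000,  # participants' salary/benefits while away (invented)
}
benefits = {
    "quality_improvement":    28_000,  # change in quality per unit, in dollars (invented)
    "scrap_rework_reduction": 14_000,  # labor and materials saved (invented)
    "accidents_avoided":      10_000,  # preventable-accident savings (invented)
}

roi = sum(benefits.values()) / sum(costs.values())
print(f"ROI = {roi:.2f}")  # benefits returned per dollar of training cost
```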
Slide 53: Utility Analysis (Brogden-Cronbach-Gleser model, Personnel Psychology)
Uses a statistical approach to support claims of training effectiveness:
• N = number of trainees
• T = length of time benefits are expected to last
• dt = true performance difference resulting from training
• SDy = dollar value of untrained job performance (in standard deviation units)
• C = cost of training

∆U = (N)(T)(dt)(SDy) – C
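A direct transcription of the formula above into a function, with invented example values.

```python
# Brogden-Cronbach-Gleser utility: Delta-U = (N)(T)(dt)(SDy) - C.
def utility_gain(n_trained, years, d_t, sd_y_dollars, total_cost):
    """Dollar gain attributed to training, net of its cost."""
    return n_trained * years * d_t * sd_y_dollars - total_cost

# e.g., 50 trainees, benefits lasting 2 years, effect size 0.4,
# SDy of $10,000, and $60,000 in training costs (all assumed values):
print(utility_gain(50, 2, 0.4, 10_000, 60_000))  # 340000.0, a $340,000 net gain
```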
Slide 54: Critical Information for Utility Analysis
• dt = difference in units produced between trained and untrained employees, divided by the standard deviation in units produced by the trained group
• SDy = standard deviation in dollars, or overall productivity of the organization
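A sketch of the dt estimate described above, using invented unit counts: the difference in mean output between trained and untrained groups, divided by the standard deviation of the trained group's output.

```python
# Estimating dt from raw output counts per employee.
from statistics import mean, stdev

trained_output   = [105, 112, 98, 110, 107]  # invented units produced
untrained_output = [ 96, 101, 90,  99,  94]  # invented units produced

d_t = (mean(trained_output) - mean(untrained_output)) / stdev(trained_output)
print(f"dt = {d_t:.2f}")  # feeds into Delta-U on the previous slide
```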
Slide 55: Ways to Improve HRD Assessment
• Walk the walk, talk the talk: MONEY
• Involve HRD in strategic planning
• Involve management in HRD planning and estimation efforts
• Gain mutual ownership
• Use credible and conservative estimates
• Share credit for successes and blame for failures

Slide 56: HRD Evaluation Steps
1. Analyze needs.
2. Determine an explicit evaluation strategy.
3. Insist on specific and measurable training objectives.
4. Obtain participant reactions.
5. Develop criterion measures/instruments to measure results.
6. Plan and execute the evaluation strategy.

Slide 57: Summary
• Training results must be measured against costs
• Training must contribute to the "bottom line"
• HRD must justify itself repeatedly as a revenue enhancer, not a revenue waster