1. Evaluating HRD Programs
   Professor Jayashree Sadri and Dr. Sorab Sadri

2. Effectiveness
   - The degree to which a training (or other HRD program) achieves its intended purpose
   - Measures are relative to some starting point
   - Measures how well the desired goal is achieved

3. Evaluation

4. HRD Evaluation
   Textbook definition: "The systematic collection of descriptive and judgmental information necessary to make effective training decisions related to the selection, adoption, value, and modification of various instructional activities."

5. In Other Words…
   Are we training:
   - the right people
   - the right "stuff"
   - the right way
   - with the right materials
   - at the right time?

6. Evaluation Needs
   - Descriptive and judgmental information needed
     - Objective and subjective data
   - Information gathered according to a plan and in a desired format
   - Gathered to provide decision-making information

7. Purposes of Evaluation
   - Determine whether the program is meeting the intended objectives
   - Identify strengths and weaknesses
   - Determine the cost-benefit ratio
   - Identify who benefited most or least
   - Determine future participants
   - Provide information for improving HRD programs

8. Purposes of Evaluation – 2
   - Reinforce major points to be made
   - Gather marketing information
   - Determine if the training program is appropriate
   - Establish a management database

9. Evaluation Bottom Line
   - Is HRD a revenue contributor or a revenue user?
   - Is HRD credible to line and upper-level managers?
   - Are the benefits of HRD readily evident to all?

10. How Often Are HRD Evaluations Conducted?
    - Not often enough!
    - Frequently, only end-of-course participant reactions are collected
    - Transfer to the workplace is evaluated less frequently
11. Why HRD Evaluations Are Rare
    - Reluctance to have HRD programs evaluated
    - Evaluation needs expertise and resources
    - Factors other than HRD cause performance improvements, e.g.:
      - Economy
      - Equipment
      - Policies, etc.
12. Need for HRD Evaluation
    - Shows the value of HRD
    - Provides metrics for HRD efficiency
    - Demonstrates a value-added approach for HRD
    - Demonstrates accountability for HRD activities
    - Everyone else has it… why not HRD?

13. Make or Buy Evaluation
    - "I bought it, therefore it is good."
    - "Since it's good, I don't need to post-test."
    - Who says it's:
      - Appropriate?
      - Effective?
      - Timely?
      - Transferable to the workplace?

14. Evolution of Evaluation Efforts
    1. Anecdotal approach – talk to other users
    2. Try before buy – borrow and use samples
    3. Analytical approach – match research data to training needs
    4. Holistic approach – look at the overall HRD process, as well as individual training

15. Models and Frameworks of Evaluation
    - Table 7-1 lists six frameworks for evaluation
    - The most popular is that of D. Kirkpatrick:
      - Reaction
      - Learning
      - Job Behavior
      - Results

16. Kirkpatrick's Four Levels
    - Reaction: focus on trainees' reactions
    - Learning: did they learn what they were supposed to?
    - Job Behavior: was it used on the job?
    - Results: did it improve the organization's effectiveness?

17. Issues Concerning Kirkpatrick's Framework
    - Most organizations don't evaluate at all four levels
    - Focuses only on post-training
    - Doesn't treat inter-stage improvements
    - WHAT ARE YOUR THOUGHTS?

18. Other Frameworks/Models
    - CIPP: Context, Input, Process, Product (Galvin, 1983)
    - Brinkerhoff (1987):
      - Goal setting
      - Program design
      - Program implementation
      - Immediate outcomes
      - Usage outcomes
      - Impacts and worth

19. Other Frameworks/Models – 2
    - Kraiger, Ford, & Salas (1993):
      - Cognitive outcomes
      - Skill-based outcomes
      - Affective outcomes
    - Holton (1996), five categories:
      - Secondary influences
      - Motivation elements
      - Environmental elements
      - Outcomes
      - Ability/enabling elements

20. Other Frameworks/Models – 3
    - Phillips (1996):
      - Reaction and planned action
      - Learning
      - Applied learning on the job
      - Business results
      - ROI

21. A Suggested Framework – 1
    - Reaction
      - Did trainees like the training?
      - Did the training seem useful?
    - Learning
      - How much did they learn?
    - Behavior
      - What behavior change occurred?

22. Suggested Framework – 2
    - Results
      - What were the tangible outcomes?
      - What was the return on investment (ROI)?
      - What was the contribution to the organization?

23. Data Collection for HRD Evaluation
    Possible methods:
    - Interviews
    - Questionnaires
    - Direct observation
    - Written tests
    - Simulation/performance tests
    - Archival performance information
24. Interviews
    Advantages:
    - Flexible
    - Opportunity for clarification
    - Depth possible
    - Personal contact
    Limitations:
    - High reactive effects
    - High cost
    - Face-to-face threat potential
    - Labor intensive
    - Trained observers needed

25. Questionnaires
    Advantages:
    - Low cost to administer
    - Honesty increased
    - Anonymity possible
    - Respondent sets the pace
    - Variety of options
    Limitations:
    - Possible inaccurate data
    - Response conditions not controlled
    - Respondents set varying paces
    - Uncontrolled return rate

26. Direct Observation
    Advantages:
    - Nonthreatening
    - Excellent way to measure behavior change
    Limitations:
    - Possibly disruptive
    - Reactive effects are possible
    - May be unreliable
    - Need trained observers

27. Written Tests
    Advantages:
    - Low purchase cost
    - Readily scored
    - Quickly processed
    - Easily administered
    - Wide sampling possible
    Limitations:
    - May be threatening
    - Possibly no relation to job performance
    - Measures only cognitive learning
    - Relies on norms
    - Concern for racial/ethnic bias
28. Simulation/Performance Tests
    Advantages:
    - Reliable
    - Objective
    - Close relation to job performance
    - Includes cognitive, psychomotor, and affective domains
    Limitations:
    - Time consuming
    - Simulations often difficult to create
    - High costs to develop and use
29. Archival Performance Data
    Advantages:
    - Reliable
    - Objective
    - Job-based
    - Easy to review
    - Minimal reactive effects
    Limitations:
    - Criteria for keeping/discarding records
    - Information system discrepancies
    - Indirect
    - Not always usable
    - Records prepared for other purposes

30. Choosing Data Collection Methods
    - Reliability: consistency of results, and freedom from collection-method bias and error
    - Validity: does the device measure what we want to measure?
    - Practicality: does it make sense in terms of the resources used to get the data?
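Reliability on the slide above means consistency of results. A minimal sketch, not from the slides and using entirely hypothetical scores, of one common check: test-retest reliability estimated as the correlation between two administrations of the same knowledge test.

```python
# Test-retest reliability: correlate scores from two administrations of the
# same test for the same trainees; values near 1.0 indicate consistent results.
from statistics import correlation  # available in Python 3.10+

test_1 = [72, 85, 90, 64, 78, 88, 70, 95, 60, 82]  # hypothetical first-round scores
test_2 = [70, 88, 87, 66, 80, 85, 73, 93, 62, 84]  # same trainees, second round

r = correlation(test_1, test_2)
print(f"Test-retest reliability estimate: r = {r:.2f}")
```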
31. Type of Data Used/Needed
    - Individual performance
    - Systemwide performance
    - Economic

32. Individual Performance Data
    - Individual knowledge
    - Individual behaviors
    - Examples:
      - Test scores
      - Performance quantity, quality, and timeliness
      - Attendance records
      - Attitudes

33. Systemwide Performance Data
    - Productivity
    - Scrap/rework rates
    - Customer satisfaction levels
    - On-time performance levels
    - Quality rates and improvement rates

34. Economic Data
    - Profits
    - Product liability claims
    - Avoidance of penalties
    - Market share
    - Competitive position
    - Return on investment (ROI)
    - Financial utility calculations
35. Use of Self-Report Data
    - Most common method
    - Pre-training and post-training data
    - Problems:
      - Mono-method bias: desire to be consistent between tests
      - Socially desirable responses
      - Response shift bias: trainees recalibrate their standards of self-assessment after experiencing the training
36. Research Design
    Specifies in advance:
    - the expected results of the study
    - the methods of data collection to be used
    - how the data will be analyzed

37. Research Design Issues
    - Pretest and posttest
      - Shows the trainee what training has accomplished
      - Helps eliminate pretest knowledge bias
    - Control group
      - Compares the performance of the group with training against the performance of a similar group without training

38. Recommended Research Design
    - Pretest and posttest with control group
    - Whenever possible:
      - Randomly assign individuals to the test group and the control group to minimize bias
      - Use a "time-series" approach to data collection to verify that performance improvement is due to training
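A minimal sketch of the recommended design, using purely hypothetical data: employees are randomly assigned to a trained group and a control group, and the training effect is estimated as the difference in average pretest-to-posttest gain between the two groups.

```python
# Pretest/posttest with control group: compare mean gains of randomly
# assigned trained vs. control groups (all scores are hypothetical).
import random

random.seed(7)                                     # reproducible example
employees = list(range(20))
random.shuffle(employees)                          # random assignment reduces bias
trained, control = employees[:10], employees[10:]

pre = {e: random.gauss(60, 5) for e in employees}  # hypothetical pretest scores
post = {e: pre[e] + random.gauss(12 if e in trained else 2, 3) for e in employees}

def mean_gain(group):
    return sum(post[e] - pre[e] for e in group) / len(group)

effect = mean_gain(trained) - mean_gain(control)
print(f"Estimated training effect (difference in mean gains): {effect:.1f} points")
```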
39. Ethical Issues Concerning Evaluation Research
    - Confidentiality
    - Informed consent
    - Withholding training from control groups
    - Use of deception
    - Pressure to produce positive results
40. Assessing the Impact of HRD
    - Money is the language of business.
    - You MUST talk dollars, not HRD jargon.
    - No one (except maybe you) cares about "the effectiveness of training interventions as measured by an analysis of formal pretest, posttest control group data."
41. HRD Program Assessment
    - HRD programs and training are investments
    - Line managers often see HR and HRD as costs, i.e., revenue users, not revenue producers
    - You must prove your worth to the organization, or you'll have to find another organization…

42. Two Basic Methods for Assessing Financial Impact
    - Evaluation of training costs
    - Utility analysis

43. Evaluation of Training Costs
    - Cost-benefit analysis
      - Compares the cost of training to benefits gained, such as attitudes, reduction in accidents, reduction in employee sick days, etc.
    - Cost-effectiveness analysis
      - Focuses on increases in quality, reduction in scrap/rework, productivity, etc.

44. Return on Investment
    - Return on investment = Results / Costs
45. Calculating Training Return on Investment

    Operational results area | How measured | Before training | After training | Difference (+ or –) | Expressed in $
    Quality of panels | % rejected | 2% rejected (1,440 panels per day) | 1.5% rejected (1,080 panels per day) | 0.5% (360 panels per day) | $720 per day; $172,800 per year
    Housekeeping | Visual inspection using 20-item checklist | 10 defects (average) | 2 defects (average) | 8 defects | Not measurable in $
    Preventable accidents | Number of accidents | 24 per year | 16 per year | 8 per year |
    Preventable accidents | Direct cost of accidents | $144,000 per year | $96,000 per year | $48,000 per year | $48,000 per year

    Total savings: $220,800
    ROI = Return / Investment = Operational results / Training costs = $220,800 / $32,564 = 6.8

    SOURCE: D. G. Robinson & J. Robinson (1989). Training for impact. Training and Development Journal, 43(8), 41. Printed by permission.
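The same arithmetic expressed as a short sketch; the benefit and cost figures are the ones from the Robinson & Robinson table above.

```python
# ROI sketch using the figures from the Robinson & Robinson (1989) example:
# ROI = operational results (annual dollar benefits) / training costs.
annual_benefits = {
    "quality of panels": 172_800,     # 360 fewer rejected panels per day at $720/day
    "preventable accidents": 48_000,  # reduction in direct accident costs per year
}
training_costs = 32_564

total_benefits = sum(annual_benefits.values())  # $220,800
roi = total_benefits / training_costs
print(f"Total savings: ${total_benefits:,}")
print(f"ROI = {total_benefits:,} / {training_costs:,} = {roi:.1f}")
```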
46. Types of Training Costs
    - Direct costs
    - Indirect costs
    - Development costs
    - Overhead costs
    - Compensation for participants

47. Direct Costs
    - Instructor
      - Base pay
      - Fringe benefits
      - Travel and per diem
    - Materials
    - Classroom and audiovisual equipment
    - Travel
    - Food and refreshments

48. Indirect Costs
    - Training management
    - Clerical/administrative
    - Postal/shipping, telephone, computers, etc.
    - Pre- and post-learning materials
    - Other overhead costs

49. Development Costs
    - Fee to purchase the program
    - Costs to tailor the program to the organization
    - Instructor training costs

50. Overhead Costs
    - General organization support
    - Top management participation
    - Utilities, facilities
    - General and administrative costs, such as HRM

51. Compensation for Participants
    - Participants' salary and benefits for time away from the job
    - Travel, lodging, and per-diem costs
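To obtain the training-cost denominator used in the ROI formula, the five categories on the preceding slides are costed out and summed. The sketch below uses purely hypothetical figures (they are not the figures behind the $32,564 in the earlier example).

```python
# Hypothetical cost build-up across the five training cost categories.
costs = {
    "direct":        9_000,  # instructor pay and benefits, materials, travel, food
    "indirect":      3_500,  # training management, clerical support, shipping, phones
    "development":  10_000,  # program purchase fee, tailoring, instructor training
    "overhead":      2_500,  # facilities, utilities, general organizational support
    "compensation":  7_500,  # participants' salary and benefits while off the job
}
total_training_cost = sum(costs.values())

print(f"Total training cost: ${total_training_cost:,}")
for category, amount in costs.items():
    print(f"  {category:<12} ${amount:>7,}  ({amount / total_training_cost:.0%})")
```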
52. Measuring Benefits
    - Change in quality per unit, measured in dollars
    - Reduction in scrap/rework, measured in the dollar cost of labor and materials
    - Reduction in preventable accidents, measured in dollars
    - ROI = Benefits / Training costs
53. Utility Analysis
    - Uses a statistical approach to support claims of training effectiveness:
      - N = number of trainees
      - T = length of time benefits are expected to last
      - d_t = true performance difference resulting from training
      - SD_y = dollar value of untrained job performance (in standard deviation units)
      - C = cost of training
    - ΔU = (N)(T)(d_t)(SD_y) – C

54. Critical Information for Utility Analysis
    - d_t = difference in units produced between trained and untrained groups, divided by the standard deviation in units produced by the trained group
    - SD_y = standard deviation in dollars, or the overall productivity of the organization
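A short sketch of the ΔU calculation with hypothetical values for each term; in practice d_t and SD_y would be estimated as described on the two preceding slides.

```python
# Utility analysis sketch: ΔU = (N)(T)(d_t)(SD_y) - C, with hypothetical inputs.
N = 50          # number of trainees
T = 2.0         # years the training benefit is expected to last
d_t = 0.4       # true performance difference from training, in SD units
SD_y = 10_000   # dollar value of one standard deviation of job performance
C = 150_000     # total cost of training all N trainees

delta_U = N * T * d_t * SD_y - C
print(f"Estimated dollar utility of the program: ${delta_U:,.0f}")
```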
55. Ways to Improve HRD Assessment
    - Walk the walk, talk the talk: MONEY
    - Involve HRD in strategic planning
    - Involve management in HRD planning and estimation efforts
      - Gain mutual ownership
    - Use credible and conservative estimates
    - Share credit for successes and blame for failures

56. HRD Evaluation Steps
    1. Analyze needs.
    2. Determine an explicit evaluation strategy.
    3. Insist on specific and measurable training objectives.
    4. Obtain participant reactions.
    5. Develop criterion measures/instruments to measure results.
    6. Plan and execute the evaluation strategy.

57. Summary
    - Training results must be measured against costs
    - Training must contribute to the "bottom line"
    - HRD must justify itself repeatedly as a revenue enhancer
