Module 5
Training Evaluation
Topics
• Meaning and significance
• Donald Kirkpatrick’s Model
• Return on investment in training
• Data collection for training evaluation
• Design of training evaluation
Training Process
Step 1: Identify the needs
Step 2: Design the training
Step 3: Implement the training
Step 4: Evaluate the training
Meaning of Training Evaluation
• Refers to activities aimed at determining the effectiveness of training programmes, after they are conducted, against their objectives
• It answers questions such as:
– Where was the capability level of the learners before training, and where is it now?
– What was intended to be achieved, and what was actually achieved?
– What is the monetary value of the training outcome?
• Evaluation brings rationality, objectivity, accountability and credibility to the training function
Significance of Training Evaluation
1. Identifies strengths and weaknesses in the training programme
2. Compares the costs of the training programme to its benefits
3. Tests the clarity and validity of tests, cases and exercises
4. Gathers data to assist in marketing future programmes
5. Establishes a database that can assist management in making decisions
Principles of Training Evaluation
1. Must be planned and consciously designed
2. Must be relevant to the purposes of both training and development
3. Must be verifiable
4. Must be a continuous and ongoing activity
5. Must be specific, explicit and exact
6. Must be quantitative
7. Must be cost effective
Types of Training Evaluation
• Formative evaluation
– Provides information to staff for the purpose of improving the training programme while it is in progress
– It measures progress during the programme
– It helps create a positive learning environment
– It also serves to train the trainer
• Summative evaluation
– Provides information to show the merit of a training programme
– It is conducted after the training programme is completed
– It is like providing a summary report of the training results
Donald Kirkpatrick’s Model
• The four levels of Kirkpatrick’s Model are:
– Reaction
– Learning
– Behaviour
– Results
Reaction
• Evaluation focuses on how trainees felt and their personal reactions to the learning
• E.g. Did they enjoy the training? Did they consider the training relevant? Did they like the style, venue, timing, etc.?
• A positive reaction may make it easier to encourage employees to attend future programmes
• If trainees did not like the programme, they may discourage others and may be reluctant to use the skills they learnt
• This level only conveys the satisfaction of the learners, not what they learnt
Learning
• Evaluation aims to measure the increase in knowledge or intellectual capability after training
• Did trainees learn what the programme expected them to learn?
• Although time consuming, assessments or tests before and after training, interviews and observations can be used (a simple scoring sketch follows below)
• Hard-copy, electronic or online tests, or interview-style assessments, may be used
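As a rough illustration of the pre-test/post-test idea, the Python sketch below scores one trainee's answers against an answer key before and after training. The questions, answers and scores are invented for illustration only; they are not taken from the slides.

# Hypothetical answer key and one trainee's pre/post answers (illustration only)
answer_key   = {"Q1": "B", "Q2": "D", "Q3": "A", "Q4": "C"}
pre_answers  = {"Q1": "B", "Q2": "A", "Q3": "C", "Q4": "C"}
post_answers = {"Q1": "B", "Q2": "D", "Q3": "A", "Q4": "C"}

def score(answers, key):
    # Percentage of questions answered correctly
    correct = sum(answers.get(q) == a for q, a in key.items())
    return 100 * correct / len(key)

pre, post = score(pre_answers, answer_key), score(post_answers, answer_key)
print(f"Pre-test score:  {pre:.0f}%")    # 50%
print(f"Post-test score: {post:.0f}%")   # 100%
print(f"Learning gain:   {post - pre:.0f} percentage points")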
Behaviour
• It is the extent to which the trainees applied the learning and changed their workplace behaviour
• Change can be seen immediately or several months after the training
• Did the trainees put their learning into effect? Were the relevant skills and knowledge used? Was the new knowledge sustained?
• Can be measured by feedback from customers, suppliers, bosses, peers, etc.
• Assessment may be subtle and ongoing
• 360-degree feedback is useful
• Measurement is easy for quantifiable and technical skills but difficult for complex learning such as attitudinal development
Results
• Evaluation focuses on the business results arising from the improved performance of trainees
• Checks whether the organisation is more profitable and better able to serve its clients
• Business and financial data are analysed
• Key performance indicators include volumes, values, percentages, non-compliance, ROI, etc.
ROI
• Jack Phillips added ROI as a fifth level of evaluation
• Organisations use the ROI model because it:
– Demonstrates the effect of training on profitability
– Helps justify the cost of training programmes
– Is useful for selecting future training programmes
• It links training needs to business results; a sample ROI calculation is sketched below
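Phillips-style ROI is commonly expressed as net programme benefits divided by programme costs, multiplied by 100. The Python sketch below uses hypothetical cost and benefit figures purely for illustration.

# Hypothetical figures for illustration only
programme_costs    = 50_000   # total cost of design, delivery and participant time
programme_benefits = 80_000   # monetary value of benefits attributed to the training

net_benefits = programme_benefits - programme_costs
bcr          = programme_benefits / programme_costs       # benefit-cost ratio
roi_percent  = (net_benefits / programme_costs) * 100     # ROI as a percentage

print(f"Benefit-cost ratio: {bcr:.2f}")   # 1.60
print(f"ROI: {roi_percent:.0f}%")         # 60%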
Criteria for Effective ROI Process
1. It must be simple, without complex formulas, lengthy equations or complicated methodologies
2. It must be economical and capable of being implemented easily
3. Its assumptions, methodologies and outcomes must be credible
4. It must account for other factors that influence output
5. It must include the cost of the programme
Data collection for Training Evaluation
• Good evaluation depends upon good data
• Collecting appropriate and valid data using scientific methods helps in producing an acceptable evaluation
• The different types of data available are:
– Individual performance details
– Performance details of an entire department
– Increase in the economic value of the organisation
• Depending upon convenience and purpose, evaluation at one or two levels may be sufficient
Methods of Data Collection for Training Evaluation
• Interview
– Advantages: Flexible, opportunity for clarification, depth possible, personal interaction
– Limitations: High cost, face-to-face threat potential, labour intensive and time consuming
• Questionnaire
– Advantages: Low cost, honesty if anonymous, variety of options
– Limitations: Possibly inaccurate data, respondents set varying pace
• Direct observation
– Advantages: Non-threatening, excellent way of measuring behaviour change
– Limitations: Possibly disruptive, reactive effect possible, may be unreliable
• Written test
– Advantages: Low purchase cost, readily scored, quickly processed
– Limitations: Possible cultural bias, may be threatening
• Performance test
– Advantages: Reliable, objective, closely related to job performance
– Limitations: Time consuming, simulation often difficult, high development cost
• Performance data
– Advantages: Reliable, objective, job based, easy to review, minimal reactive effects
– Limitations: Information system discrepancies
Designs of Training Evaluation
1. One-shot case study
2. One-group pre-test/post-test design
3. One-group time-series design
4. Randomized non-equivalent control design
5. Randomized equivalent control design
6. Post-test-only control group design
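As a minimal sketch of how designs such as 2 and 5 above might be analysed, the Python snippet below compares the average pre-test/post-test gain of a trained group with that of a control group. All scores are hypothetical and for illustration only.

# Hypothetical test scores (0-100) for a trained group and a control group
trained_pre  = [52, 61, 48, 70, 66]
trained_post = [74, 80, 69, 85, 78]
control_pre  = [55, 60, 50, 68, 64]
control_post = [58, 62, 51, 70, 66]

def mean_gain(pre, post):
    # Average improvement per person between pre-test and post-test
    return sum(b - a for a, b in zip(pre, post)) / len(pre)

trained_gain = mean_gain(trained_pre, trained_post)   # 17.8 points
control_gain = mean_gain(control_pre, control_post)   #  2.0 points
print(f"Estimated training effect: {trained_gain - control_gain:.1f} points")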
Suggestions for Better Evaluation
1. Plan for metrics before writing survey questions
2. Ensure the measurement process is replicable and scalable
3. Ensure measurements are internally and externally comparable
4. Use industry-accepted measurement approaches
5. Define value in the eyes of stakeholders
Suggestions for Better Evaluation
6. Manage the change associated with
measurement
7. Ensure metrics are well balanced
8. Leverage automation and technology
9. Crawl, walk, run
10. Ensure metrics have flexibility
Questions
1. Write a brief note on training evaluation design.
2. “A trainer must be a subject-matter expert.” Comment.
3. Discuss the Kirkpatrick model for measuring the
training effectiveness.
4. What is training evaluation?
5. What are the reasons for evaluating training?
6. Discuss various types of evaluation design.
7. What is the ROI of training?
Thank you
