2. PRESENTATION OVERVIEW
The presentation will discuss:
The importance of assessment
Assessment planning
Avoiding confounds
What to measure
Assessment designs
Training R.O.I.
3. GROUND RULES FOR LEARNING
Participate fully
Share ideas and experience.
Respect others’ opinions.
Knowledge isn’t power; it’s -------. The power is ------.
Have fun.
4. THINK . . .
What is training assessment?
An effort to determine the value, significance, or extent of a learning event.
5. WHY ASSESS?
Simply stated, you assess training programs to determine whether you are having a positive impact on your learners and, in turn, whether your training initiative is having a positive impact on your organization.
7. ASSESSMENT IS IMPORTANT
For continuous improvement of your
training curricula
To determine training needs
To determine the impact of training
initiatives
To determine training R.O.I.
To justify what you do every day
8. ASSESSMENT LEVELS
In general, assessment happens at two
levels:
At the course or curricular level
Did the learner learn what you wanted them to
learn?
At the organizational performance level
Did the organization’s performance improve the
way you anticipated?
11. THINK . . .
When should you begin your assessment?
Before you do anything else.
12. WHEN TO ASSESS
If you are trying to measure how much
you have changed things, it’s much
better (and easier) to begin
measuring before you have changed
anything, so . . .
13. WHEN TO ASSESS
You must plan and implement your assessment before you begin your training initiative.
Really.
14. A PRACTICAL APPROACH . . .
Figure out how to measure what you
are going to train
If possible, use existing measures
If not, use easily managed measures
Measure before you train (baseline)
Measure after you train
Compare your measures
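The measure-before, measure-after, compare steps above can be sketched in a few lines of Python. The learner names and scores below are invented for illustration; scores are the fraction of objectives each learner met.

```python
# Compare baseline (pre-training) and post-training scores.
# Names and numbers are hypothetical.
baseline = {"ana": 0.40, "ben": 0.35, "carla": 0.50}
post_training = {"ana": 0.85, "ben": 0.70, "carla": 0.90}

def mean(scores):
    """Average score across all learners."""
    return sum(scores.values()) / len(scores)

# The comparison: how much did the group average move?
improvement = mean(post_training) - mean(baseline)
print(f"Average improvement: {improvement:.0%}")
```

The point is not the arithmetic but the discipline: without the baseline dictionary, the comparison on the last line is impossible.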
15. WHAT TO MEASURE
Figure out how to measure what you
are trying to change:
Measure your training objectives
Objectives should be clear
Objectives should be performance based
“Learning is something you can see.”
16. PERFORMANCE BASED
Focus on measurable and observable
skills.
Instead of saying that a learner:
Should know . . .
Will understand . . .
Must believe . . .
Operationalize:
Define a concept or variable so that it can
be measured or expressed quantitatively.
17. OPERATIONALIZE
Instead of saying that a learner:
Should know . . .
Will understand . . .
Must believe . . .
Use statements like this:
Will list the 7 steps . . .
Will accurately describe . . .
Will be able to perform . . .
19. EXISTING MEASURES
If possible, use existing measures:
Use metrics that are already
supported within your organization.
Much easier
Instant “face validity”
Already integrated with management
systems
20. EXISTING MEASURES
Some examples of existing
measures might be:
Service utilization data
Quality data
Outcomes data
Customer evaluations
Competency tools
21. NO EXISTING MEASURES?
If your existing measures and your training are completely unrelated, then either you’re training something entirely new, you’re training the wrong things, your organization is measuring the wrong things, or some combination of these.
22. EASY MEASURES
If you have to use new measures, use
easily managed measures:
Complex measurement systems are
typically a mistake, because they . . .
Take too long
Consume too many resources
Cost too much money
Are a general pain in the “caboose”
23. SIMPLE TOOLS
Some examples of easy measurement
tools include:
Pre-tests
Test the same concepts you will test with
your post-test
Checklists
These are easy to create and easy to use,
but they are powerful (really!)
24. CHECKLIST
One of the best ways to measure performance is a simple checklist:
• Just list the things that your learners should be able to do if your training is successful, then . . .
25. CHECKLIST
• Observe the learner(s) and check the behaviors off the list (Wow!!)
• Or let the learner(s) check them off (self-observation actually works)
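A checklist like the one described above is easy to represent and score in code. This is a minimal sketch; the behavior names are made up for illustration.

```python
# A performance checklist: the behaviors a successful learner
# should demonstrate. Behavior names are hypothetical examples.
checklist = [
    "greets the customer",
    "verifies the account",
    "documents the call",
    "offers follow-up",
]

# Behaviors actually checked off during observation (or self-observation).
observed = {"greets the customer", "documents the call"}

# Score = fraction of checklist items the learner demonstrated.
score = sum(item in observed for item in checklist) / len(checklist)
print(f"Checklist score: {score:.0%}")
```

Scoring the same checklist before and after training gives you the baseline and follow-up measures in one simple tool.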
26. BEFORE & AFTER
Use your checklist before you train
(to create a baseline).
Then use your checklist after the
training is complete and see if there
has been improvement.
27. BASELINE
A measurement, calculation, or
location used as a basis for
comparison, or . . .
How your learners perform the skills
(or don’t perform the skills) you are
going to train, before you train them.
28. THINK . . .
If you train something to a group in your organization, and related organizational performance improves, did your training cause the improvement?
29. CONFOUNDS
Maybe . . .
Maybe not.
It could have been your training, but it
also could have been a related
change in management strategies.
That’s a confound.
30. CONFOUNDS
You can only avoid confounds by making sure that your assessment system uses an appropriate, empirically sound design.
For example . . .
31. DESIGNS
Group comparison
Two distinct groups; one is trained, one isn’t; compare to find differences
(Drawback: one group goes untrained, which may not be reasonable)
Multiple baseline
Two groups; collect baseline for both, implement training at different times, and see if changes coincide with the training.
33. MULTIPLE BASELINE
Collect baseline data in more than one department
Train one department
Assess performance at both departments
Wait
Train next department
Assess performance at both departments
Did performance improve? When?
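The staggered-training logic above can be sketched as a small script. The weekly numbers below are invented; the assumed scenario is that Department A is trained after week 2 and Department B after week 4.

```python
# Multiple-baseline sketch: weekly checklist averages for two departments.
# All numbers are hypothetical. Dept A is trained after week 2,
# Dept B after week 4.
weeks  = [1, 2, 3, 4, 5, 6]
dept_a = [0.40, 0.42, 0.75, 0.78, 0.80, 0.79]
dept_b = [0.38, 0.41, 0.40, 0.39, 0.72, 0.76]

def jump_week(series, threshold=0.2):
    """Return the first week where performance rises by more than threshold."""
    for i in range(1, len(series)):
        if series[i] - series[i - 1] > threshold:
            return weeks[i]
    return None

# If each department's jump lands right after its own training date,
# the training (not some organization-wide change) is the likely cause.
print(jump_week(dept_a), jump_week(dept_b))
```

If both departments had improved at the same time regardless of when each was trained, that would point to a confound instead.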
35. MULTIPLE BASELINE
Inter-ocular trauma test . . .
Can you see a difference?
Did the difference occur immediately after the training at each department?
36. DESIGNS
Avoiding confounds may seem like a hassle, but it will allow you to answer potentially embarrassing questions like . . .
“How do you know for sure?”
37. TRAINING R.O.I.
Return On Investment
Answer the following . . .
Are your training outcomes creating
value that exceeds the cost and
resources required to implement the
training program itself?
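The value-versus-cost question above is conventionally expressed as the standard R.O.I. formula: (value created − program cost) ÷ program cost. The dollar figures below are invented for illustration.

```python
# Standard R.O.I. formula applied to a training program.
# Both dollar amounts are hypothetical.
program_cost = 25_000       # development, delivery, and learner time
performance_value = 40_000  # value assigned to the measured improvement

# ROI as a fraction of cost: positive means the program paid for itself.
roi = (performance_value - program_cost) / program_cost
print(f"Training ROI: {roi:.0%}")
```

The formula is the easy part; as the following slides note, the hard parts are proving the training caused the change and assigning a defensible dollar value to it.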
38. TRAINING R.O.I.
Yes.
Establishing learning ROI is very difficult
because there are so many other
factors that can influence the
performance of your organization.
There are always lots of confounds!
39. TRAINING R.O.I.
In order to measure R.O.I. you must use
the same empirical strategies that you
use to eliminate confounds when
assessing any training.
You have to prove that your training is
causing the improvement.
40. TRAINING R.O.I.
The hardest part is often assigning a
value to the performance change that
you have created.
Get help.
At the very least this gives the rest of the
organization ownership of the
valuation.
41. JUST DO IT.
The things that can happen if you don’t focus on assessment . . .
are a lot scarier than
the things that can happen if you do.