If you are responsible for managing your nonprofit's training, then you know that providing courses and classes is only part of the challenge. You also need to investigate, plan, coordinate, communicate, budget, and persuade. All of these management functions become easier when you have a solid set of training metrics to work from. A "training scorecard" gives you a tool to track how things are going, and gives you the data to stand on equal footing with other leaders in your organization.
Curated by the Cornerstone OnDemand Foundation and Steve Semler, Senior Training Manager at MoneyGram International, this special presentation for nonprofits focuses on Learning Metrics: Building Your Training Scorecard. You will learn:
• The four ascending categories of learning metrics
• How to capture and present qualitative and quantitative training evaluation data
• Which metrics to include on a training scorecard
• How to establish a rhythm of evaluation and reporting that supports your organization's training and learning needs
2. CSOD Foundation: Our Mission
The Cornerstone OnDemand Foundation transforms the way
people help people. Through the contribution of our technology and
talent management expertise, we strengthen nonprofit organizations
around the world by helping them develop, engage, and empower
their employees and the people they serve.
3. HR Pro Bono Corps
The HR Pro Bono Corps brings much-needed
human capital management expertise to the
nonprofit sector at no cost.
The HR Pro Bono Corps focuses its support in three areas:
• Performance Management
• Learning Management
• Succession Management
4. This Presentation
Our Goal Today:
• Learn how to create a training scorecard that will help you
better understand and manage learning
How We’ll Get There:
• Set the foundation about what to measure and why
• Explore methods for how and when to measure and report
on your metrics
5. About the Presenter
Steve Semler
Senior Manager
Risk Operations Training
MoneyGram International
ssemler@moneygram.com
www.linkedin.com/in/ssemler
6. Poll 1
Which of these topics is of MOST interest to you?
• Measurement background (0/28)
• The 4 categories of learning metrics (3/28)
• How to capture and present evaluation data (5/28)
• What to put on a scorecard (12/28)
• How to get into an evaluation and reporting rhythm (3/28)
7. Poll 2
What kind of metrics, measurement, or evaluation do you do now?
• None (1/28)
• Just class reaction evals (0/28)
• Reaction evals for all courses (4/28)
• Pre-/Post-Course learning (level 2) (2/28)
• Application (level 3) (2/28)
• Impact (level 4) (1/28)
• Other/A mix of some of these (9/28)
9. Why do Metrics?
• Make informed decisions
• What gets measured gets done
• Drive the car, steer the ship, fly the plane
• Know: “Are we really helping?”
10. Open Question 1
• What decisions do you need to help leaders in your
organizations make with regard to learning?
• How best to present material, from format to frequency
• Are employees using what they learned in training?
• Which courses/workshops are most needed?
• How do we retain top talent? What gaps currently exist?
• Does e-learning replace live events?
• The impact of e-learning vs. live events
• Is the training worth what we spend on it?
• We want to see that our training is changing the behavior of the person taking
the course.
• Are people learning and acquiring skills?
• What do our people need and want?
Please type your answers into the Chat panel, sending to Everyone in
the webinar.
12. Are we Adding Value?
Risk: Learning reduces risk or increases predictability.
• Reduce compliance incident frequency and cost
• Reduce turnover costs
• Reduce variation and error
Return: Learning adds capabilities, improves or prevents degradation of
performance, or decreases the time to achieve a capability.
• Increase productivity or quality; decrease costs
• Prevent loss of productivity; realize the potential of the improvement during change
• Get to productivity faster; spend less on training to achieve the same result
Liquidity: Learning increases or broadens capabilities, providing
increased business flexibility.
• Realize more opportunities
• Reduce time and cost of implementing improvements
• Reduce staff required to cover a business need
Adapted from Boudreau and Ramstad, Beyond HR, 2007
13. Poll 3
What is the most common type of added value your organization
expects from training?
o Reduced risk (7/28)
o Increased return (10/28)
o Increased liquidity (3/28)
14. What can we actually measure?
• Quantity
• Quality
• Cost
• Time
• Satisfaction
If it’s observable,
we can measure it.
“Not everything that can be counted counts, and not
everything that counts can be counted.”
– William Bruce Cameron (often attributed to Albert Einstein)
15. What learning can we measure?
Types of Learning
• Intentional, formal: we can measure this
• Intentional, informal: we can measure this, if we try very hard
• Unplanned: we basically don't even know that this is happening
17. Four Categories Explained
• Activity – What are we doing? (Basic QQCTS: quantity, quality, cost, time, satisfaction)
• Efficiency – How well are we using our resources?
• Effectiveness – Is it doing what we intended it to do; what are
the results?
• Impact – What benefit are we getting from those results?
Efficiency, Effectiveness, and Impact form the basis for a
“decision science” of talent management…
Activity is really just bookkeeping.
18. Poll 4
What is your familiarity with Donald Kirkpatrick’s four level
evaluation model?
o Not familiar with it (9/28)
o Heard about or studied it (5/28)
o Have used it (2/28)
o Using it now (4/28)
Reference: http://www.mindtools.com/pages/article/kirkpatrick.htm
19. You may be familiar with Kirkpatrick…
Level 1 – Reaction
Level 2 – Learning
Level 3 – Behavior
Level 4 – Results
These are not the same things as
Activity, Efficiency, Effectiveness, and Impact
20. Training Results (Scorecard)
• Activity
o Requests Received: 6, Requests Answered: 6 (100%), Solutions Offered: 6 (100%)
o Courses: 208, Learners: 88, Hours: 122
o Other Activities: 240, Learners: 200, Hours: 40
o Cost: $16,550
• Efficiency
o Cost/Learning Hour: $102.16, Hours/Activity: 0.36, Activities/L&D Resource: 448,
Cost/Activity: $34.94
• Effectiveness
o Objectives Met: 624, Impact Estimate: $52,584 (est.)
o Usability: 89%, Net Promoter Score: 54%, Manager Rating: 72%
o Engagement Score: 3.3 (% change: -0.3 month, -0.5 year)
• Impact*
o # of Successes: 6, Success %: 100%
o Value of Successes: $372,000
* Impact metrics do not capture the value of all activities; only the ones specifically evaluated.
21. Poll 5
• Which category of metrics will be most helpful to your
organization?
o Activity (3/28)
o Efficiency (1/28)
o Effectiveness (10/28)
o Impact (8/28)
23. Training Results (Scorecard)
• Activity
o Requests Received: 6, Requests Answered: 6 (100%), Solutions Offered: 6 (100%)
o Courses: 208, Learners: 88, Hours: 122
o Other Activities: 240, Learners: 200, Hours: 40
o Cost: $16,550
• Efficiency
o Cost/Learning Hour: $102.16, Hours/Activity: 0.36, Activities/L&D Resource: 448,
Cost/Activity: $34.94
• Effectiveness
o Objectives Met: 624, Impact Estimate: $52,584 (est.)
o Usability: 89%, Net Promoter Score: 54%, Manager Rating: 72%
o Engagement Score: 3.3 (% change: -0.3 month, -0.5 year)
• Impact*
o # of Successes: 6, Success %: 100%
o Value of Successes: $372,000
* Impact metrics do not capture the value of all activities; only the ones specifically evaluated.
24. Capturing Activity Data
Outlook Mailbox
• Requests Received, Requests Answered, Solutions Offered
LMS/Training Reports
• Courses Completed, Learners Completing Courses, Learning Hours
(courses)
Internal Activity Reports
• Other Activities, Learners, Learning Hours
ERP Cost Reports
• Training Costs, Learner Costs (may have to estimate or average)
25. Total Cost Calculations
Learner Labor = Avg hourly cost x learning hours
L&D Labor = Monthly labor cost
L&D Program Expenses = From budget report
L&D Amortized Expenses = From budget report
= Total Monthly L&D Cost
Option: You could also calculate the opportunity cost of having
employees off the job, if that was relevant to your business.
26. Capturing Efficiency Data
ERP + LMS + Internal Reporting
• Cost/Learning Hour (total costs ÷ all learning hours from courses and
activities)
• Hours/Activity (total hours ÷ combined count of activities)
• Activities/L&D Resource (combined count of activities ÷ count of L&D
FTEs)
• Cost/Activity (total cost ÷ total activities)
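These ratios can be reproduced from the Activity numbers on the sample scorecard (208 courses, 240 other activities, 122 + 40 learning hours, $16,550 cost). A Python sketch follows; the single L&D FTE is an assumption, since the FTE count is not stated on the slide. Note that dividing the stated cost by the 448 activities gives roughly $36.94 rather than the scorecard's $34.94, so that figure may rest on a slightly different cost basis.

```python
# Activity inputs taken from the sample scorecard
course_hours, activity_hours = 122, 40
courses, other_activities = 208, 240
total_cost = 16550.0
ld_ftes = 1  # assumed; not stated on the slide

total_hours = course_hours + activity_hours       # 162 learning hours
total_activities = courses + other_activities     # 448 activities

cost_per_learning_hour = total_cost / total_hours       # ~102.16
hours_per_activity = total_hours / total_activities     # ~0.36
activities_per_resource = total_activities / ld_ftes    # 448
cost_per_activity = total_cost / total_activities
```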
27. Capturing Effectiveness Data
Course Reports + Business Data
• Objectives Met, Impact Estimate*
Learner Evaluations
• Usability, Net Promoter Score
Manager Evaluations
• Manager Rating (also Objectives Met, Impact Estimate)
Engagement Surveys
• Engagement Score, % change
28. Impact Estimate Logic
# Objectives x Average Value/Objective = Estimated Impact
Example: A learner who completes an objective eventually saves 1
hour of labor/month over 3 months, for a value of 3 hours of labor
cost as the benefit for one objective.
Key: The estimate must be reasonable for the organization!
Note: This only works if you are counting performance objectives.
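The estimate is a single multiplication once you have settled on a defensible average value per objective. A hedged Python sketch of the slide's logic, using the scorecard's 624 objectives met but a hypothetical $25/hour labor cost:

```python
def estimated_impact(objectives_met, avg_value_per_objective):
    """Impact estimate = # objectives met x average value/objective."""
    return objectives_met * avg_value_per_objective

# Slide example: one completed objective saves 1 labor hour/month
# over 3 months, i.e. 3 hours of labor cost per objective.
hourly_labor_cost = 25.0                      # hypothetical figure
value_per_objective = 3 * hourly_labor_cost   # 3 hours of labor
impact = estimated_impact(624, value_per_objective)
```

The key caution from the slide applies directly: the result is only as credible as the average value you plug in, so agree on that number with your leaders before reporting it.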
29. Poll 6
What is your familiarity with Robert Brinkerhoff’s Success Case
Evaluation model?
• Not familiar with it (17/28)
• Heard about or studied it (1/28)
• Have used it (0/28)
• Certified in it/Using it now (0/28)
30. Capturing Impact Data
Success Case Evaluations
• # of Successes, Success %, Value of Successes
Manager Evaluations
• Value of Successes
31. Success Case Evaluation Logic
# Successes x Value/Success = Impact Value
Example: 6 people demonstrated successes from the New Hire Training
program. They got up to full productivity at least 2 weeks faster than
they did under the old program.
The value of this was 40 extra activities performed/week x 2 weeks x
$775/activity x 6 people = $372,000.
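The slide's arithmetic can be reproduced step by step, which is also a useful habit when presenting the figure to leaders:

```python
# Success Case inputs from the slide's New Hire Training example
extra_activities_per_week = 40
weeks_saved = 2
value_per_activity = 775       # dollars per activity
people_with_successes = 6

# Value per success, then total impact across the proven successes
value_per_success = extra_activities_per_week * weeks_saved * value_per_activity
impact_value = value_per_success * people_with_successes
print(f"Impact Value: ${impact_value:,}")  # matches the slide's $372,000
```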
33. For your own scorecard…
What will help you drive
learning in your organization?
34. Open Question 2
• What are some things you want on your scorecard?
• Budget, appropriate curriculum
• Matching metrics to performance
• Is it building morale
• Are employees using what they learned in training?
• What's the "net promoter score" and how do you collect / compute it?
• Successes
Please type your answers into the Chat panel, sending to Everyone in
the webinar.
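One participant asked how the Net Promoter Score is collected and computed. The standard approach: survey "How likely are you to recommend this training?" on a 0-10 scale, treat 9-10 as promoters and 0-6 as detractors, and report % promoters minus % detractors. A small Python sketch with hypothetical ratings:

```python
def net_promoter_score(ratings):
    """NPS = % promoters (9-10) minus % detractors (0-6) on a 0-10 scale."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

sample = [10, 9, 8, 7, 6, 3, 10, 9]   # hypothetical survey responses
print(net_promoter_score(sample))      # 4 promoters, 2 detractors -> 25.0
```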
35. What goes on your scorecard?
• What do you need?
• What are they asking for?
• What is interesting?
• What will help you steer learning?
36. Recommended – Work Up the 4 Categories
• Activity
• Efficiency
• Effectiveness
• Impact
You can benefit by showing
your willingness to gather and
report on business data.
Having a strategy can be the
key to greater influence and
more informed decision
making.
Hint: Be willing to test and refine your scorecard metrics.
Some metrics won’t be helpful, and some might be too
difficult to capture.
37. How to Get into an Evaluation and Reporting Rhythm
38. Build an Evaluation and Reporting Rhythm
• Start with a strategy, then build and revise
• Regular, short training meetings
• Monthly metrics reporting
39. Regular, Short Training Meetings
• Monthly
• With internal customers (leaders) and L&D staff
• Same agenda (see the reference presentation for an example)
• Focus on being helpful
• Make it part of the routine!
40. Training Meeting Agenda (30 min)
Training Activity
• Training Report: Quick update on training during the past month; check for
errors or corrections needed on report
• Training Evaluation Results: Updated results from evaluations, as applicable
• Other Training Completed: Anything done that wasn’t reflected on the current
report
• Upcoming Training: Planned training and upcoming opportunities
Training Priorities
• Next Month: What is happening that will affect training priorities? Impact on
training—move up, move back, put on hold, accelerate, etc.?
• Next Quarter: Same, but farther out
Training Plans:
• Existing Requests: Status update on current requests and projects
• New Requests: Any new requests that we did not already capture earlier in
the meeting; verify what we heard
• Coordination: Assign work, tasks, responsibilities as appropriate
42. In Summary…
Metrics can help you
manage learning, if you
use them well.
Build the strategy, work the strategy, and
make it work for you and your organization.
44. Open Question 3
• Questions?
• Q: We would like to know the real impact of our training program: for
example, whether an adult's thoughts and actions around child sexual
abuse protection behaviors actually change. It is hard to measure this in
one evaluation, so perhaps we need to look at sending evaluations 3
months and 6 months out. We are really curious how we can get
people to respond to those evaluations when they are sent out after
the training.
• A: Set the expectation that you will be following up. Emphasize how
valuable their own individual perspectives are. Include a token gift of
appreciation for responding. Thank the person in advance for
completing the survey. State that successful people are more likely to
respond to the evaluation.
Please type your answers into the Chat panel, sending to Everyone in
the webinar.
45. Open Question 3
• Questions?
• Q: How do you get time on the leadership meeting agenda to talk
about training?
• A: Get 5 minutes, and rehearse your message so that you can
present your “elevator speech” drawn from the metrics in 30 seconds
or so. This should be the most important “what’s happening” point or
two. Be prepared to be dropped from the agenda when the meeting
runs long. Don’t worry; stay persistent. You’re in this for the long haul.
Please type your answers into the Chat panel, sending to Everyone in
the webinar.