In this interactive discussion, Kelby Zorgdrager, CEO of DevelopIntelligence, shares strategies for measuring the impact of training, drawn from large organizations across the country: Salesforce, Eventbrite, VMware, and Autodesk. In this session, you will explore current training program methodologies in order to determine your company’s return on investment in learning.
2. About Me
Kelby Zorgdrager
CEO & Founder
DevelopIntelligence
• “Recovering” software developer
• Formal education in Computer Science and Mechanical Engineering
• Held every role from Jr. Developer to CTO
• 20+ Years in Learning and Development Industry
• Focus on bridging the gap between R&D and L&D
• Have supported over 30 learning organizations in the past 15 years
3. About DevelopIntelligence
…Impacts you daily.
When you talk on the phone, watch a movie, connect with friends on social media, drive a car, fly on a plane, pay with a credit card, shop online, or order a latte with your mobile app, you are interacting with technology developed by one of our customers.
Our purpose…
We help organizations learn and adopt new technologies by delivering highly customized learning solutions.
In 2018 alone...
6. Questions we are commonly asked…
How much does the training cost?
Do we really have to train them, or can they learn on their own?
Will the training help?
Why do we always need to prove our worth?
7. Questions we are commonly asked…
How much does the training cost?
Do we really have to train them, or can they learn on their own?
Will the training help?
Questions we should be asking…
How will we measure impact?
Is our learning strategy aligned with learner preferences?
Is our learning organization designed to make an impact?
Is our learning investment aligned with our business strategy?
Is the impact worth the investment?
We need to answer a different set of questions
8. Can we measure impact?
L&D Maturity Model
• Base Level: BASE ZERO. No tech training in place; tech talent is left on their own to learn
• Level 1: REACTIONARY. Ad hoc, typically as a result of a major issue
• Level 2: ORGANIZED. Some reactionary, some proactive; still ad hoc
• Level 3: CENTRALIZED. Strategic and managed; no or very little measurement
• Level 4: MEASURED. Metrics and management in place to determine ROI and business impact
• Level 5: OPTIMIZED. Continuous improvement focused on increased ROI and efficiencies
Moving up the model increases ROI measurement through metrics and reporting; the lower levels represent missed opportunities for metrics and reporting.
9. How will we measure impact?
[Chart: metrics plotted by business impact versus difficulty and cost to measure, grouped into L&D Efficiency, Impact on Productivity, and Impact on Culture]
• Utilization of instructor-led programs
• Utilization of online library
• Cost per employee
• Cost per experience
• Time to market
• Attraction
• Retention
• Increased velocity of release
• Time to productivity
• Software quality
10. Learning Strategies that Minimize the Impact
• Providing a one-size-fits-all learning approach
• Context-insensitive learning programs
• Delivering training too early in the development lifecycle
• Delivering training too late in the development lifecycle
• Delivering training in a modality rejected by the learner
11. Misalignment: Early Intervention Training
[Timeline chart: a row for each role, spanning Project Inception to Project Delivery, showing when training lands for each role]
Roles: Business Analysts, Software Architects, Developers, Quality Assurance Team, Source Control Managers, Project Managers, Business Stakeholders
12. Alignment: Early Intervention
[Timeline chart: the same roles across Project Inception to Project Delivery]
13. Misalignment: Save the Day Training
[Timeline chart: the same roles across Project Inception to Project Delivery]
15. Misalignment: Maintenance Training
[Timeline chart: the same roles across Project Inception to Project Delivery]
16. Alignment: Maintenance Training
[Timeline chart: the same roles across Project Inception to Project Delivery]
18. Is the learning strategy aligned with learner preferences?
[Bar chart: preference ratings (0 to 7) for the entire R&D organization, and preference by role]
• Mobile learning
• VILT (online instructor-led)
• Meet-ups / brown bags
• Conferences
• ILT (in-person instructor-led)
• Videos or eLearning
• Reading a book
• Learning from peers (one on one)
19. Common L&D Organization Structures
Decentralized:
• ”Autonomy”
• Self-service
• Nearly impossible to measure consistently
Hybrid:
• Multiple funding sources
• Autonomy & alignment
• Supports some level of measurement
Centralized:
• Aligned to “funders”
• Proactive and strategic
• Designed for impact
20. Is the Organization designed to make an impact?
Centralized L&D :: Traditional Model
Budget: People 60%, Infrastructure 20%, Experiences 20%
Pros:
• Ability to execute a learning strategy and be a trusted learning partner to the organization
• In-house expertise and complete control over the learning experience supply chain
Cons:
• Impact of learning is hindered by the limited budget allocated to actual experiences
• Time to market of internal L&D experiences
21. Is the Organization designed to make an impact?
Centralized L&D :: Agile Model
Budget: People 25%, Infrastructure 20%, Experiences 55%
Pros:
• Majority of budget invested in experiences
• Ability to pivot, coupled with fast time to market for L&D experiences
Cons:
• Under-staffed; hard to execute a learning strategy and be a trusted learning partner to the organization
• Impact of learning may be less than expected
22. Common Investment Strategies
No Strategy:
• No predefined budget
• Pay/justify as needed
• Nearly impossible to measure consistently
Reactive:
• Predefined budget
• Money not aligned to business units or needs
• Supports some level of measurement
Strategic:
• Earmarked budget
• Money allocated by business units and needs
• Designed for impact
23. Professional Development Investment Model
Population reached:
• Self-paced library: 100% of population
• Experiential learning: 40% of population
• Transformational programs: 5% of population
• Conferences: 1% of population
Budget split (based on a $1,000 investment per person):
• Self-paced library: 21%
• Experiential learning: 47%
• Workforce transformation: 29%
• Conferences: 3%
Chart callout: 71% of budget
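To make the arithmetic behind these percentages concrete, here is a minimal sketch in Python that derives the implied spend per participant for each modality from the slide’s population reach and budget split. The 1,000-person headcount is an assumption used only for scaling; the shares and the $1,000-per-person figure come from the slide.

```python
# Implied spend per participant, derived from slide 23's population
# reach and budget split. HEADCOUNT is an illustrative assumption;
# the per-participant results do not depend on its exact value.

HEADCOUNT = 1_000
BUDGET_PER_PERSON = 1_000  # from the slide: $1,000 invested per person
TOTAL_BUDGET = HEADCOUNT * BUDGET_PER_PERSON

# modality -> (share of population reached, share of budget)
professional_model = {
    "Self-paced library":        (1.00, 0.21),
    "Experiential learning":     (0.40, 0.47),
    "Transformational programs": (0.05, 0.29),
    "Conferences":               (0.01, 0.03),
}

def spend_per_participant(model):
    for modality, (pop_share, budget_share) in model.items():
        participants = pop_share * HEADCOUNT
        dollars = budget_share * TOTAL_BUDGET
        print(f"{modality}: ${dollars / participants:,.0f} per participant")

spend_per_participant(professional_model)
# Self-paced library: $210 per participant
# Experiential learning: $1,175 per participant
# Transformational programs: $5,800 per participant
# Conferences: $3,000 per participant
```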
24. Transformational Investment Model
Population reached:
• Self-paced library: 100% of population
• Experiential learning: 15% of population
• Transformational programs: 35% of population
• Conferences: no budget
Budget split (based on a $1,000 investment per person):
• Self-paced library: 21%
• Experiential learning: 9%
• Workforce transformation: 70%
• Conferences: 0%
Chart callout: 70% of budget
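The same sketch applies here. Swapping in slide 24’s shares, and reusing the spend_per_participant helper and headcount assumption from the sketch above, shows how concentrating the budget on transformation changes the per-participant spend:

```python
# Slide 24's shares; conferences are omitted because they get no budget.
transformational_model = {
    "Self-paced library":        (1.00, 0.21),
    "Experiential learning":     (0.15, 0.09),
    "Transformational programs": (0.35, 0.70),
}

spend_per_participant(transformational_model)
# Self-paced library: $210 per participant
# Experiential learning: $600 per participant
# Transformational programs: $2,000 per participant
```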
25. Is the impact worth the investment?
[Chart: the same metrics from slide 9, plotted by business impact versus difficulty and cost to measure, now grouped into Efficiency, Engagement, Culture, and Productivity]
26. Efficiency of the Learning Organization
One of the most common ways to measure “impact”
Typical measurements:
• Utilization of the learning library
• Cost per learner
• Cost per learning experience
• Cost per learning hour
• Number of learners per experience
Can be used as a benchmark while maturing your L&D organization (see the sketch below)
Impact: An efficient organization should be able to offer more learning experiences without increasing the budget
Caution: The “cost of efficiency” may reduce the quality of the impact
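A minimal sketch of these efficiency metrics in Python; all of the input numbers are hypothetical, invented only to show the arithmetic:

```python
# Hypothetical program-year totals, used only to illustrate the metrics.
total_spend = 500_000    # annual L&D spend, in dollars
learners = 800           # distinct employees who took any training
experiences = 120        # classes, workshops, etc. delivered
learning_hours = 9_600   # total person-hours of instruction delivered
library_seats = 1_000    # employees with access to the learning library
library_active = 350     # employees who actually used the library

print(f"Library utilization:     {library_active / library_seats:.0%}")
print(f"Cost per learner:        ${total_spend / learners:,.0f}")
print(f"Cost per experience:     ${total_spend / experiences:,.0f}")
print(f"Cost per learning hour:  ${total_spend / learning_hours:,.2f}")
print(f"Learners per experience: {learners / experiences:.1f}")
```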
27. Impact on Culture
An emerging trend in the industry
Typical measurements:
• Employee engagement
• Ability to attract (how easy or hard it is to hire new candidates)
• Employee tenure (how long they stay)
• Promotion and career advancement
• Innovation and collaboration
Requires L&D and TA (talent acquisition) to work together
Impact: When learning becomes part of the culture, you will see a reduction in attrition and an increase in career growth (promotions)
Caution: Taken too far, training becomes a benefit and could be viewed like free food
28. Impact on Productivity
The “panacea” of our industry
Typical measurements include:
• Increased utilization and output per worker
• Decreased initial time to productivity
• Consistency of work
• Faster time to market
• Better quality of work output
Requires upfront investment in strategy, program design, and measurement systems
Impact: A high-performing workforce delivers better results more efficiently
Caution: Most organizations are not “set up” for this type of measurement
Caution: This type of measurement could cost more than the learning experience itself
Kirkpatrick
29. A moment on NPS
Why are we using NPS (Net Promoter Score) to measure training?
Are we trying to:
• Prove the learning experience is ‘good enough’ to justify the spend?
• Determine the quality of the vendor we are using?
• Drive enrollments in other programs?
Caution: Make sure the use of NPS helps, not hurts, your organization
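For reference, NPS itself is simple arithmetic: the percentage of promoters (scores of 9 or 10) minus the percentage of detractors (scores of 0 through 6). A minimal sketch with made-up survey responses:

```python
def nps(scores):
    """Net Promoter Score: % promoters (9-10) minus % detractors (0-6)."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# Hypothetical post-class survey responses on the standard 0-10 scale
responses = [10, 9, 9, 8, 8, 7, 7, 6, 5, 10]
print(f"NPS: {nps(responses):+.0f}")  # 4 promoters, 2 detractors -> +20
```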