TRUTHS ABOUT TRAINING EVALUATION

David Kelly, CPLP, CRP
A LITTLE ABOUT ME

• Training Director for over 10 years
• Director, Center for Learning at ACLD
• Certified Professional in Learning and Performance (CPLP)
• Certified ROI Professional (CRP)
• Board Member of ASTD Long Island
• Member of ASTD's National Advisors for Chapters
• Member of the eLearn Magazine Editorial Board
TODAY'S DISCUSSION

• A Brief History of Measurement in Learning and Development Programs
• What Do Trainers Measure? Does What We Measure Even Matter?
• The Question of Credibility
• A Better Definition for Training Success
WHAT CAME BEFORE…

Why are we talking about 'evaluation' of training in the first place?
A BRIEF HISTORY OF TRAINING EVALUATION

Don Kirkpatrick – The Four Levels of Evaluation
1. Reaction
2. Learning
3. Behavior
4. Results

Jack Phillips – ROI Methodology
5. ROI (Net Program Benefits / Program Costs × 100%)
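A worked example with illustrative numbers (mine, not from the deck): a program that costs $50,000 and produces $80,000 in monetized benefits has net benefits of $80,000 - $50,000 = $30,000, so ROI = ($30,000 / $50,000) × 100% = 60%. The corresponding benefit-cost ratio is $80,000 / $50,000 = 1.6.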
A BRIEF HISTORY OF TRAINING EVALUATION
WHY MEASURE TRAINING?

CAPITAL INVESTMENTS (ROI is usually measured):
• Buildings
• Equipment
• Tools
• Vehicles
• Companies
• Some Technology

NON-CAPITAL INVESTMENTS (ROI is not typically measured):
• Marketing
• Human Resources
• Quality
• Some Technology
• Staff Support
• Processes
WHY MEASURE TRAINING?

CAPITAL INVESTMENTS (~15% of total expenditures):
• Buildings
• Equipment
• Tools
• Vehicles
• Companies
• Some Technology

NON-CAPITAL INVESTMENTS (~85% of total expenditures):
• Marketing
• Human Resources
• Quality
• Some Technology
• Staff Support
• Processes
SATISFACTION WITH MEASURES OF SUCCESS OF LEARNING AND DEVELOPMENT

To what extent are you satisfied with the measures (value) of learning and development?

1. Very Dissatisfied – 8%
2. Dissatisfied – 45%
3. Satisfied – 41%
4. Very Satisfied – 6%

*CEO Survey—Fortune 500 and Large Private Companies, ROI Institute
WHAT METRICS ARE YOU USING?
COMMON LEARNING METRICS

• Cost per Employee or Hour
• Course Completions
• Learner Satisfaction
• Attendance
• Test Scores

Do these metrics REALLY matter?
MEANINGFUL PERFORMANCE METRICS

• Decreased Time to Performance
• Reduced Performance Cycle Times
• Quality Improvements
• Increased Customer Satisfaction
• Increased Sales
HERE'S MY TWO CENTS…

Measurement for measurement's sake is a waste of time. Ask yourself: "Will I be doing anything with the data we're collecting?"
THE CREDIBILITY ISSUE

The truth is that most training evaluation data is either irrelevant or does not hold up to scrutiny.

*Bozarth-Ferguson Magic Formula used with permission
THE CREDIBILITY ISSUE

Training: "Sales increased 40% after training."
Marketing: "That wasn't training; we ran advertisements during that period."

Training: "Turnover has decreased 25% as a result of our new orientation programs."
Human Resources: "That wasn't training; we updated the hiring criteria."
THE CREDIBILITY ISSUE

Source: http://elearningindustry.com/subjects/free-elearning-resources/item/256-free-elearning-roi-
THE BOTTOM LINE

ROI and the metrics used for Training Evaluation are driven by companies and consultants with a vested interest in people believing these metrics matter.

In most cases, the data collected isn't really relevant to business outcomes.
HERE'S MY TWO CENTS…

• It's not about Learning; it's about Performance.
• It's not about what people KNOW; it's about what people DO.
HERE'S MY TWO CENTS…

Most executives who ask for an ROI DO NOT WANT AN ACTUAL ROI. They want to know that the efforts had value.
MAYBE IT'S NOT ABOUT EVALUATION…

What other ways can we determine success besides traditional training evaluation?
WHO DECIDES WHAT SUCCESS MEANS?

The Program Stakeholders:
• Executives
• Management approving the financing of training
• Learning and Development Professionals
• Program Participants
• Managers of Program Participants
HERE'S MY TWO CENTS…

The time to think about how you will define success is during the DESIGN of a program, not after its completion.
WHAT'S YOUR STORY?

A strong testimonial from a participant or executive on how training helped workers perform better is often more powerful than any number on a spreadsheet.
A STRUCTURED APPROACH
THE SUCCESS CASE METHOD

Step 1. Identify targeted business goals and impact expectations.
THE SUCCESS CASE METHOD

Step 2. Survey a large, representative sample of all participants in a program to identify high-impact and low-impact cases.
THE SUCCESS CASE METHOD

Step 3. Analyze the survey data to identify:
• a small group of successful participants
• a small group of unsuccessful participants
(a minimal selection sketch follows below)
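To make Step 3 concrete, here is a minimal Python sketch of the group-selection analysis. The participant and impact_score fields, the ranking logic, and the group size are illustrative assumptions for this sketch, not Brinkerhoff's published survey items or screening criteria.

def select_case_groups(responses, group_size=5):
    """Return (high_impact, low_impact) groups from the Step 2 survey data."""
    # Rank respondents by a hypothetical self-reported impact score (e.g., 1-5).
    ranked = sorted(responses, key=lambda r: r["impact_score"], reverse=True)
    high_impact = ranked[:group_size]    # strongest success candidates
    low_impact = ranked[-group_size:]    # strongest non-success candidates
    return high_impact, low_impact

# Example with made-up survey responses:
survey = [
    {"participant": "A", "impact_score": 5},
    {"participant": "B", "impact_score": 1},
    {"participant": "C", "impact_score": 4},
    {"participant": "D", "impact_score": 2},
]
high, low = select_case_groups(survey, group_size=2)
print([r["participant"] for r in high])  # ['A', 'C']
print([r["participant"] for r in low])   # ['D', 'B']

The two selected groups then become the interview pools for Step 4.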
THE SUCCESS CASE METHOD

Step 4. Conduct interviews with the two selected groups to:
• document the nature and business value of their application of learning
• identify the performance factors that supported learning application and the obstacles that prevented it
THE SUCCESS CASE METHOD

Step 5. Document and disseminate the story:
• report impact
• applaud successes
• use the data to educate managers and the organization
IT'S NOT ABOUT HOW MANY SEATS ARE FILLED

It's not about the Training; it's about the Performance Improvement. It's about an ongoing process, not an event.

"Evaluating training is like evaluating the wedding instead of the marriage."
- Robert Brinkerhoff
IT'S NOT ABOUT HOW MANY SEATS ARE FILLED

• Most learning and performance improvement takes place informally and on the job.
• The success of a training program is not determined via a test; it is determined by how effectively trainees use the training when they return to their work.
SUMMARY

• Data is critical to show the value and effectiveness of Learning and Performance Programs.
• Performance is what happens after a Learning Program, so the most important data pertains to how work has changed.
SUMMARY

• Success starts with identifying desired impacts during the needs assessment.
• Managers of program participants are the key stakeholders for driving successful learning and performance programs.
QUESTIONS?

David Kelly
• E-Mail: LnDDave@gmail.com
• Phone: (516) 474-1852
• Twitter: @LnDDave
• Blog: http://davidkelly.me
• Also connect via LinkedIn or Facebook
