Scorecards, Learning Metrics and Measurement Strategies


  1. Scorecards, Learning Metrics and Measurement Strategies – You can listen to today’s webinar using your computer’s speakers, or you may dial into the teleconference. If you would like to join the teleconference, please dial 1.866.469.3239 and enter access code xxx xx xxx. You will be on music hold until the seminar begins. #CLOwebinar
  2. Scorecards, Learning Metrics and Measurement Strategies – Speakers: Dennis Bonilla, Senior Vice President, GP Learning Solutions Group; Andrew Snoey, Senior Program Manager, Operations, Microsoft Corporation. Moderator: Kellye Whitney, Managing Editor, Chief Learning Officer magazine
  3. Tools You Can Use – Q&A: click on the Q&A panel (?) in the bottom right corner, type your question in the space provided, and click “Send.”
  4. Tools You Can Use – Polling: the poll will appear on the right side of your screen; select the best option for each question and click “Submit.”
  5. Frequently Asked Questions •  Will I receive a copy of the slides after the webinar? Yes. •  Will I receive a copy of the webinar recording? Yes. Please allow up to 2 business days to receive these materials.
  6. Scorecards, Learning Metrics and Measurement Strategies – Kellye Whitney, Managing Editor, Chief Learning Officer magazine
  7. Scorecards, Learning Metrics and Measurement Strategies – Dennis Bonilla, Senior Vice President, GP Learning Solutions Group; Andrew Snoey, Senior Program Manager, Operations, Microsoft Corporation
  8. LEARNING METRICS, SCORECARDS AND MEASUREMENT STRATEGIES
  9. POLL QUESTION – Where does your role fit in your Learning and Development organization? •  Not at all •  Instructional Designer •  Instructor •  Training Coordinator •  Training Manager •  In the Line of Business
  10. POLL QUESTION – Does your organization formally track and report learning metrics? •  Not at all •  Sometimes •  I Don’t Know •  Yes
  11. POLL QUESTION – If you answered yes to the previous poll, are the metrics: •  Just learning process and event statistics (butts in seats, courses completed, etc.) •  Tied to organizational business outcomes •  Both
  12. SECTION 1 – CRITERIA FOR DEVELOPING EFFECTIVE SCORECARDS
  13. SCORECARD OPERATING PRINCIPLES •  Drive Business Leadership •  Aim for Simplicity •  Set Realistic, but Stretched, Targets •  Ensure Consistency in Taxonomy and Calculations •  Ensure Similar Metric Types Have Similar Thresholds •  Align Related Scorecards’ Metrics and Targets
  14. IS LEARNING MAKING A DIFFERENCE? •  Are business objectives being met? •  Business leaders are demanding more linkage to business outcomes and measurement of results, yet this is the outcome least often measured. •  There is huge confusion between internal process metrics and actual results. •  Sponsors want to see evidence of IMPROVED BUSINESS PERFORMANCE, not internal training operations metrics.
  15. SECTION 2 – DISCUSS THE EVOLUTION OF LEARNING MEASUREMENT FRAMEWORKS
  16. POLL QUESTION – How does your organization / company measure the quality of the training programs for your key strategic initiatives? •  Not at all •  Data collection measuring attendee satisfaction •  Data collection measuring attendee competency •  Data collection measuring manager satisfaction in attendee performance •  Data collection measuring business impact after course completion
  17. WHAT METHODS ARE USED IN EVALUATION? •  Evaluation is traditionally represented as the final stage in a systematic approach to training. •  More recent Instructional Systems Design (ISD) models incorporate evaluation throughout the process, not as an end state. •  There are six general approaches to educational evaluation, with Goal-based and Systems-based approaches the predominant modes used in the evaluation of training programs. •  Various frameworks have been proposed under the influence of these two approaches. •  The most influential framework, Kirkpatrick’s, follows the Goal-based approach and is the most commonly used today.
  18. KIRKPATRICK LEVELS •  Level IV – Results: business impact as a consequence of training. Level IV is the only place where we can easily perform an ROI calculation. •  Level III – Transfer: students’ behavioral change resulting in performance at a higher level. Assessing Levels II and III takes more time and effort; this is where our certification program is located. •  Level II – Learning: the student understands and can recall primary concepts from the training. •  Level I – Reactions: student perceptions of the course and its value. •  Level 0 – Attendance: baseline info such as student presence and participation. Most organizations measure only Levels 0 to I.
  19. SECTION 3 – ALIGNING TRAINING MEASURES WITH BUSINESS MEASURES
  20. TALK ABOUT A DISCONNECT – [Bar chart: what executives want, rated 0–4 across Business Linkage, Registration & Delivery, Program Measurement, and Alignment.] Data from the Forum Corporation, Customer Driven Training Organization
  21. TALK ABOUT A DISCONNECT – [Bar chart comparing what executives want with current effort allocation, rated 0–4 across Business Linkage, Registration & Delivery, Program Measurement, and Alignment.] Data from the Forum Corporation, Customer Driven Training Organization
  22. MEASUREMENT STRATEGIES •  The first question that should always be asked: “Does this metric actually measure an outcome for which the training program was created?” •  For some, that outcome could be a behavior change; for others, improved perception by stakeholders; for still others, a quantitative measure of business impact (revenue, market share, market position, etc.). •  Measure what matters to the customer in a way that leads to informed action. •  Don’t pick a process before defining the purpose. Let form follow function!
  23. MEASUREMENT STRATEGY CORE PRINCIPLES •  Relevant •  Credible and Compelling •  Simple and Consistent •  Efficient. “For every complex problem there is an answer that is clear, simple and wrong!” – H.L. Mencken
  24. POLL (TRIVIA) QUESTION – When was the first documented, formal use of an educational program evaluation conducted in the USA? •  1792 •  1815 •  1897 •  1916 •  1958
  25. “NOT EVERYTHING THAT CAN BE COUNTED, COUNTS.” So what counts? •  Outcomes of interest to the business to which training contributed in a demonstrable way. •  Sponsors may not be as wedded to proof of financial ROI as many HR professionals assume. •  An evaluation that provides sound evidence to support informed decisions in the best interest of the organization. •  Evaluations that prove, to an appropriate level of certainty, that the program did or did not achieve the desired outcomes, and that provide insights to improve subsequent courses or iterations: actionable information (Bersin).
  26. PRESENTING TRAINING RESULTS – The Power of Narrative •  “Stories are easier to remember – because in many ways, stories are how we remember.” Pink (2005): A Whole New Mind •  Stories bring statistics to life •  Stories will be repeated and retold. Sell the sizzle: it doesn’t matter how good your results are if no one knows about them. Results × Marketing = Perception of Value
  27. SECTION 4 – SHARING EXAMPLES OF MICROSOFT’S EFFORTS
  28. PROJECT PICASSO – In the fall of 2009, the Microsoft training organization applied our measurement principles to our course assessment survey. •  Relevant: questions refer to key course success metrics → eliminated non-action-oriented questions •  Simple: shortened the question count to improve response rate •  Consistent: aligned format, wording, and calculations → minor wording revisions •  Compelling: added business impact questions (manager engagement with the material; what different actions the employee will take as a result of the course). Question counts, old survey → new survey: OLT 12 + 4 → 6 + 4; cILT 11 + 7 → 12 + 4; vILT 11 + 7 → 12 + 4
  29. PROFESSIONAL CERTIFICATION – Technical employees: •  Microsoft has an extensive certification program for internal and external candidates using third-party testing organizations •  Our technical staff can be certified in 18 tracks; looking across the technologies and levels, this results in 64 discrete certifications. Sales & marketing employees: •  Sales staff was taught “soft” (professional) skills, but how do you measure the knowledge? •  In January 2010, Microsoft started an effort to train Sales & Marketing staff through an academy program with certifications associated with certain skills. http://www.microsoft.com/learning/en/us/certification/cert-default.aspx
  30. SALES AND MARKETING ACADEMY – Our Sales & Marketing Academy is part of our People focus. Purpose: •  Accelerate sales and marketing staff readiness •  Promote management coaching and reinforcement. Options considered: •  Certification of the growth of a skill •  Certification of the growth of a skill set •  Separate certifications for each skill •  Certify high-priority skills. Built: •  Curriculum roadmaps •  Course knowledge assessments – demand a pass rate of 80% •  Single certification activity across all courses in a level, involving case study/simulation, live work product review, role play, and an oral review panel. [Curriculum matrix of course codes by track – Business Management (A101, A201, A301), Customer Relationship Management (B101, B201, B301), Products & Services, Organizational Awareness, and others – across Basic, Competent, and Master levels]
  31. MEASURING TRAINING IMPACT – Almost two years ago, Microsoft started an effort to measure the business impact of our core sales training curriculum (CST). Key questions asked: •  Does CST increase the Account Manager’s annual quota attainment? •  Does CST increase customer satisfaction? •  Does CST increase the Account Manager’s pipeline velocity? Tried to control for: •  Region / country •  Tenure / experience level •  Account Manager’s satisfaction level with CST courses •  Account Manager’s self-assessed impact of CST courses
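The controlled comparison described in the slide above can be sketched in a few lines of Python. This is a hypothetical illustration only, not Microsoft's analysis: the records, field layout, and the `attainment_lift` helper are invented, and a real study would use far more data plus proper statistical controls.

```python
# Hypothetical sketch: compare quota attainment of Account Managers who took
# core sales training (CST) against those who did not, within one
# region/tenure cell, to hold the controlled variables constant.
# All data below is invented for illustration.
from statistics import mean

reps = [
    # (region, tenure_band, took_cst, quota_attainment)
    ("EMEA", "0-2y", True,  0.96),
    ("EMEA", "0-2y", False, 0.88),
    ("EMEA", "0-2y", True,  1.02),
    ("EMEA", "0-2y", False, 0.91),
]

def attainment_lift(rows, region, tenure):
    """Mean quota attainment of trained minus untrained reps in one cell."""
    cell = [r for r in rows if r[0] == region and r[1] == tenure]
    trained = mean(r[3] for r in cell if r[2])
    untrained = mean(r[3] for r in cell if not r[2])
    return trained - untrained

print(round(attainment_lift(reps, "EMEA", "0-2y"), 3))  # lift in this cell
```

Slicing by cell is the simplest way to approximate "tried to control for region and tenure"; a production analysis would instead fit a model with those covariates.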
  32. ALIGNING TRAINING TO A BUSINESS SCORECARD – [Flow diagram: develop corporate scorecard → sales & marketing scorecard executes to strategic initiatives → quarterly scorecard results (Sell $, Share %, Cust Sat) scored Green or Yellow/Red → for Yellow/Red, assess root cause (training, adoption, or other) → map courses (Sell…, Market…, Present…, Create…, Close…) against metrics A–E → formulate response plan: 1. drive existing course(s), 2. bolster manager reinforcement, 3. revamp existing course(s), 4. develop new course(s); feedback loops connect results back to training plans]
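The Green/Yellow/Red scoring step in a quarterly scorecard like the one above can be sketched as a small function. This is an illustrative sketch only: the `metric_status` helper, the 10% yellow band, and the sample numbers are assumptions, not Microsoft's actual thresholds.

```python
# Illustrative sketch: score a metric against its target as Green/Yellow/Red.
# The 10% yellow band and all numbers are hypothetical.
def metric_status(actual, target, yellow_band=0.10):
    """'G' at/above target, 'Y' within the band below target, else 'R'."""
    if actual >= target:
        return "G"
    if actual >= target * (1 - yellow_band):
        return "Y"
    return "R"

scorecard = {
    "Sell $":   (9.5, 10.0),   # (actual, target) -- invented values
    "Share %":  (22.0, 20.0),
    "Cust Sat": (3.1, 4.0),
}
for name, (actual, target) in scorecard.items():
    print(name, metric_status(actual, target))
```

Keeping the thresholds in one shared function supports the deck's earlier principle that similar metric types should have similar thresholds and consistent calculations.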
  33. SUMMARY •  Align Your Scorecard with the Desired Business Results •  Do Not Abandon Measurement at the Smiley Face •  Count Training Results, Not Effort
  34. LET’S CONTINUE TALKING – Dennis Bonilla, Senior Vice President, Learning Solutions, General Physics Corporation, v-deboni@microsoft.com, 425-636-3410 office, 702-885-5814 mobile; Andrew Snoey, Senior Program Manager, Operations, Microsoft Corporation, asnoey@microsoft.com, 425-722-8510 office
  35. Join Our Next CLO Webinar – To the Agile Go the Spoils: Why Learning Must Evolve, Tuesday, June 21, 2011. CLO Webinars start at 2 p.m. Eastern / 11 a.m. Pacific. Register at www.clomedia.com/events. Join the CLO Network: http://network.clomedia.com/
