Rethinking Evaluation: Its Value and Limits in Learning
David Wilson, [email_address]
Europe's leading Corporate Learning Analysts
[Diagram, built over three slides: the overlapping concerns of ROI, Evaluation, Value and Measurement; then Learning added; then all of them shown together.]
Research – L&D
- Evaluation of learning impact is systemically weak
- It is dominated by the Kirkpatrick model, with increasing pressure to talk about ROI
- As a process, it typically focuses on capturing feedback from the immediate learning event: reactionnaires / "happy sheets"
- Use of technology to automate the process is variable but generally very low
  - Aggregation and reporting of happy-sheet information
  - Some use of e-survey and e-assessment tools
Research – Business
At Exec level:
- There is a disconnect between organisational performance and investment in L&D
  - Investment is largely an act of faith, with some tangential evidence from benchmarking
- L&D provides little or no direct evidence to correct this
- This negatively impacts budgets and the commitment of management time and resources
At line-manager level:
- Ensuring staff learn is low on their priority list and claims little of their personal time
- They are uninterested in aggregated happy-sheet information
The Cynic's View
The "business" invests in L&D because it:
(A) knows it adds value to the business, i.e. has evidence
(B) thinks it a good idea, i.e. has belief
(C) is forced to …, i.e. it is non-discretionary
But there is little (A), lots of (B), and (C) is overtaking both.
And unfortunately there is also a (D): wanting to shift the focus of blame for non-performance.
So What Should We Do About It?
[Diagram: ROI, Evaluation, Value, Measurement]
[Diagram: ROI, Evaluation, Value and Measurement mapped against Outcomes and Processes.]
Learning Impact
- Make the concept of learning impact central to all your processes: Analysis, Design, Delivery, Review
- A good example ….
B&Q University – "The way we work"
Orientation
- To gain an understanding of the business context
- To explain the role of the university and how we can support
- To explain the University process and agree expectations
- To agree the budget and resources available
- To agree next steps within the process
Learning Context (Why learn? To blend or not to blend?)
- To define the desired business outcomes
- To gain a view of the current performance
- To analyse the learning context
- To agree the way we will measure the value of learning
- To establish the wider business / change context within which the learning will fit
- To agree next steps within the process
Identify the Learning Needs
- To discover the learning needs of all job roles
- To agree the next steps in the process
Design the Solution
- To design the solution using university design methodology
- To agree the sustainable training solution
Planning & Set-up
- To agree the resources required
- To plan the development of the team to deliver
- To define and agree the administration process
- To plan the logistics of delivery
- To establish reporting requirements
- To plan the trial / pilot and how feedback will be captured
Trial / Pilot
- To test out the delivery of the solution
- To capture the feedback and learning from the trial / pilot
- To agree the improvements required and plan for this to take place
Delivery and Action
- To deliver the solution as agreed
- To establish how the learning is being put into action within the business
Value of Learning
- To complete the agreed activity to measure the value of the learning solution
Review
- To complete the PIR activity (for projects)
- To ensure the sustainable training is set up
Learning Impact
- Impact focuses on Outcomes: Capability, Behaviour, Performance
- What other factors influence these Outcomes? Is this a learning problem at all?
- How does the solution design alleviate or leverage these other influences?
- What can we measure?
Rethinking the Models
Kirkpatrick is a bad reference model:
- Implied causality
- Legitimises focus on reaction (Level 1)
- Ignores input measures: Did they attend? Time / resource / money invested
- Ignores influencing factors
Refocus on:
- Input measures: time / attendance / cost
- Output measures
Impact of Learning – Our Basic Model
Three measurement levels (Activity, Effectiveness, Value) across three stages (Knowledge, Application, Performance):
- Activity: How many KNOW it? How many DO it? How does it change PERFORMANCE?
- Effectiveness: How effectively do they KNOW it? How effectively do they DO it? How effectively does it change PERFORMANCE?
- Value: The value of KNOWING. The value of DOING. The value of the PERFORMANCE change.
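As a minimal sketch of how the Activity and Effectiveness rows of this matrix might be computed from programme data: the field names, pass threshold and sample figures below are illustrative assumptions, not part of the model itself.

```python
# Illustrative sketch only: field names, the pass threshold and the sample
# figures are assumptions, not part of the original model.
from statistics import mean

learners = [
    # pre/post test scores (0-100), observed on-the-job use of the skill,
    # and a performance KPI before/after the programme
    {"pre": 40, "post": 75, "applies": True,  "kpi_pre": 100, "kpi_post": 115},
    {"pre": 55, "post": 80, "applies": True,  "kpi_pre": 100, "kpi_post": 108},
    {"pre": 35, "post": 50, "applies": False, "kpi_pre": 100, "kpi_post": 101},
]

PASS = 70  # assumed mastery threshold

# Activity: how many KNOW it, how many DO it
knows = sum(l["post"] >= PASS for l in learners) / len(learners)
does = sum(l["applies"] for l in learners) / len(learners)

# Effectiveness: how well, on average
knowledge_gain = mean(l["post"] - l["pre"] for l in learners)
kpi_change = mean(l["kpi_post"] - l["kpi_pre"] for l in learners)

print(f"Activity:      {knows:.0%} know it, {does:.0%} apply it")
print(f"Effectiveness: avg knowledge gain {knowledge_gain:.1f} pts, "
      f"avg KPI change {kpi_change:+.1f}")
# The Value row would translate the KPI change into money, which requires
# an (estimated, not measured) attribution to the learning -- see the ROI
# slide below.
```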
[Diagram – Impact Framework (Research Version): the learner moves, with Motives/Expectations and Commitment, through the Learning Process from Pre-Capability (Pre Knowledge/Skills) to Post-Capability (Post Knowledge/Skills), then on to Application and Performance. The chain is measured at the Activity, Effectiveness and Value levels and is shaped by Analysis and Design. Internal Factors (Reward, Environment, Time, Support, Motivation, Goals) influence Application; External Factors (Competition, Market, Customer, Time, Budget, Goals) influence Performance.]
IOL Process Outline
- Analysis
- Measurement
- Design
- Influencing Factors
- Delivery
- Review
ROI?
ROI is a legitimate discussion for:
- Aggregated investment and activity
- Specific types of learning programmes
It is not legitimate if:
- There is no meaningful way of measuring the benefits
- It is not possible to establish the causal relationship between the learning and the benefits
- Estimation may be possible, but it is not the same as measurement
We are sceptical of the value of the Phillips ROI model, although we do not offer a systematic process of our own.
And what's the ROI of measuring and calculating the ROI?
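To make the estimation caveat concrete, here is the standard ROI arithmetic in a minimal sketch; all figures, and especially the attribution fraction, are invented for illustration.

```python
# Standard ROI arithmetic. All figures, and especially the attribution
# fraction, are invented for illustration: attribution is an estimate,
# not a measurement, which is exactly the caveat above.
programme_cost = 120_000          # design, delivery, learner time, etc.
performance_benefit = 400_000     # total observed benefit over the period
attribution = 0.25                # estimated share caused by the learning

net_benefit = performance_benefit * attribution - programme_cost
roi_pct = net_benefit / programme_cost * 100

print(f"ROI = {roi_pct:.0f}%")    # -> ROI = -17%
# Double the estimated attribution to 0.5 and the same programme shows
# ROI = +67%: the result is driven by the estimate, not by measurement.
```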
Measurement and Reporting
L&D needs a clear strategy and processes for measurement and reporting.
What is already measured?
- Input measures from the LMS
- Output measures from the business
What can be meaningfully measured in addition?
- Surveying behaviour and performance change
- Knowledge and competency assessments
Reporting:
- Value added / dashboards
- Key programmes and outcomes
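As a sketch of what joining the two sources into a per-programme dashboard row might look like: the record layouts below are assumptions, since LMS and business-system exports vary widely.

```python
# Minimal sketch of combining LMS input measures with business output
# measures into one dashboard row per programme. The record layouts are
# assumptions; real exports will differ.
from collections import defaultdict

lms_records = [  # input measures: attendance, hours, cost
    {"programme": "Sales Induction", "attended": True,  "hours": 6, "cost": 250},
    {"programme": "Sales Induction", "attended": False, "hours": 0, "cost": 250},
]
business_records = [  # output measures reported by the business
    {"programme": "Sales Induction", "kpi_change_pct": 4.0},
]

dashboard = defaultdict(
    lambda: {"learners": 0, "attended": 0, "hours": 0, "cost": 0}
)
for r in lms_records:
    row = dashboard[r["programme"]]
    row["learners"] += 1
    row["attended"] += r["attended"]
    row["hours"] += r["hours"]
    row["cost"] += r["cost"]
for r in business_records:
    dashboard[r["programme"]]["kpi_change_pct"] = r["kpi_change_pct"]

for programme, row in dashboard.items():
    print(programme, row)
```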
Measuring and Reporting Impact
Project-Driven:
- Defined by: the Project
- Success: meet specific project learning goals
- Valuation: directly (Outcomes)
- Level of assessment: Individual
- Impact: often Impact (L4) and Learning (L2)
Core Competency:
- Defined by: the Organisation
- Success: performing the role with the requisite knowledge/skills
- Level of assessment: On Demand
- Impact: Behaviour (L3) & Learning (L2)
Valuing the Learning Portfolio:
- Defined by: The Business
- Success: addressed specific learning needs
- Valuation: organisationally (Human Capital)
- Level of assessment: Aggregated (Benchmarked)
Questions?
[email_address]
Research Knowledge Base: http://research.elearnity.com
