Rethinking Evaluation

Rethinking Evaluation: its value and limits in corporate learning. From the Learning Technologies 2008 conference.

Transcript

  • 1. Rethinking evaluation: its value and limits in learning – David Wilson [email_address] – Europe’s leading Corporate Learning Analysts
  • 2. ROI Evaluation Value Measurement
  • 3. ROI Evaluation Value Measurement Learning
  • 4. ROI Evaluation Value Measurement Learning (ALL)
  • 5. Research – L&D
    • Evaluation of learning impact is systemically weak
    • It is dominated by the Kirkpatrick model, with increasing pressure to talk about ROI
    • As a process, it typically focuses on capturing feedback from the immediate learning event
      • Reactionnaires / happy sheets
    • Use of technology to automate the process is variable but generally very low
      • Aggregation and reporting of happy-sheet info (see the sketch after this list)
      • Some use of e-survey and e-assessment tools
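To make the automation gap concrete, here is a minimal sketch in Python of the happy-sheet aggregation step the research found is only patchily automated; the course names and scores are invented for illustration.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical reaction-sheet ("happy sheet") records: course name plus a 1-5 score.
responses = [
    {"course": "Negotiation Skills", "score": 4},
    {"course": "Negotiation Skills", "score": 5},
    {"course": "Health & Safety", "score": 3},
    {"course": "Health & Safety", "score": 2},
]

# Aggregate per course -- the basic "aggregation and reporting" step
# that is rarely automated in practice.
by_course = defaultdict(list)
for r in responses:
    by_course[r["course"]].append(r["score"])

for course, scores in sorted(by_course.items()):
    print(f"{course}: mean reaction score {mean(scores):.1f} (n={len(scores)})")
```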
  • 6. Research – Business
    • At Exec level
      • There is a disconnect between organisational performance and investment in L&D
        • Largely an act of faith
        • Some tangential evidence from benchmarking
        • L&D provides little or no direct evidence to correct this
      • Negatively impacts budgets and commitment of management time and resources
    • At a line manager level
      • Ensuring staff learning ranks low on their priority list and in their personal time
      • Uninterested in aggregated happy-sheet information
  • 7. The Cynic’s View
    • The “business” invests in L&D because it:
      • (A) knows it adds value to the business i.e. has evidence
      • (B) thinks it to be a good idea, i.e. has belief
      • (C) is forced to … i.e. non-discretionary
    • But, there is:
      • Little (A), Lots of (B), and (C) is overtaking both
    • And unfortunately, there is also:
      • (D) a desire to shift the blame for non-performance
  • 8. So What Should We Do About it? ROI Evaluation Value Measurement
  • 9. ROI Evaluation Value Measurement Outcomes Processes
  • 10. Learning Impact
    • Make the concept of learning impact central to all your processes
      • Analysis, Design, Delivery, Review
    • A good example…
  • 11. B&Q University – “The way we work” (process outline)
    • Orientation: to gain an understanding of the business context; to explain the role of the university and how we can support; to explain the University process and agree expectations; to agree the budget and resources available; to agree next steps within the process
    • Identify the learning needs: to discover the learning needs of all job roles; to agree the next steps in the process
    • Define the desired business outcomes (learning context; why learn?; to blend or not to blend?): to define the desired business outcomes; to gain a view of the current performance; to analyse the learning context; to agree the way we will measure the value of learning; to establish the wider business / change context within which the learning will fit; to agree next steps within the process
    • Design the solution: to design the solution using university design methodology; to agree the sustainable training solution
    • Planning & set up: to agree the resources required; to plan the development of the team to deliver; to define and agree the administration process; to plan the logistics of delivery; to establish reporting requirements
    • Trial / pilot: to plan the trial / pilot and how feedback will be captured; to test out the delivery of the solution; to capture the feedback and learning from the trial / pilot; to agree the improvements required and plan for these to take place
    • Delivery and action: to deliver the solution as agreed; to establish how the learning is being put into action within the business
    • Value of learning: to complete the agreed activity to measure the value of the learning solution
    • Review: to complete the PIR activity (for projects); to ensure the sustainable training is set up
  • 12. Learning Impact
    • Impact focuses on Outcomes
      • Capability, Behaviour, Performance
    • What other factors influence these Outcomes?
      • Is this a learning problem at all?
    • How does the solution design alleviate or leverage these other influences?
    • What can we measure?
  • 13. Rethinking the Models
    • Kirkpatrick is a bad reference model
      • Implied causality
      • Legitimises focus on reaction (Level 1)
      • Ignores input measures
        • Did they attend?
        • Time / resource / money invested
      • Ignores Influencing Factors
    • Refocus on:
      • Input measures – time/attendance/cost
      • Output measures
  • 14. Impact of Learning – Our Basic Model (a 3×3 grid: Knowledge, Application and Performance, each viewed through Activity, Effectiveness and Value)
    • Activity: How many KNOW it? How many DO it? How does it change PERFORMANCE?
    • Effectiveness: How effectively do they KNOW it? How effectively do they DO it? How effectively does it change PERFORMANCE?
    • Value: the value of KNOWING; the value of DOING; the value of the PERFORMANCE change
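One way to operationalise the grid is as a simple checklist structure: the nine questions below come straight from the slide, while the evidence entries are invented, and empty cells make visible what has not been measured. A minimal sketch:

```python
# The slide's two axes: what is measured (knowledge, application, performance)
# and the lens applied (activity, effectiveness, value). Questions are the
# slide's own; the evidence entries are invented.
QUESTIONS = {
    ("knowledge", "activity"): "How many KNOW it?",
    ("application", "activity"): "How many DO it?",
    ("performance", "activity"): "How does it change PERFORMANCE?",
    ("knowledge", "effectiveness"): "How effectively do they KNOW it?",
    ("application", "effectiveness"): "How effectively do they DO it?",
    ("performance", "effectiveness"): "How effectively does it change PERFORMANCE?",
    ("knowledge", "value"): "The value of KNOWING",
    ("application", "value"): "The value of DOING",
    ("performance", "value"): "The value of the PERFORMANCE change",
}

# A programme's evidence maps sparsely onto the same keys; empty cells
# expose what has not been measured.
evidence = {
    ("knowledge", "activity"): "82% passed the post-course assessment",
    ("application", "activity"): "61% used the new process within 30 days",
}

for key, question in QUESTIONS.items():
    print(f"{question:<50} -> {evidence.get(key, 'NOT MEASURED')}")
```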
  • 15. Impact Framework (Research Version) [diagram] Analysis and Design feed a chain running from the Learning Process (motives/expectations, commitment, pre- and post-knowledge/skills, pre- and post-capability) through Application to Performance, read through the Activity, Effectiveness and Value lenses. Internal Factors (reward, environment, time, support, motivation, goals) influence Application; External Factors (competition, market, customer, time, budget, goals) influence Performance.
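The chain and its influencing factors can be captured in a few lines; the stage and factor names are taken from the diagram, while the `Stage` structure itself is an assumption for illustration.

```python
from dataclasses import dataclass, field

# A sketch of the framework's causal chain. Stage and factor names come
# from the slide; the Stage structure is an assumption for illustration.
@dataclass
class Stage:
    name: str
    influences: list[str] = field(default_factory=list)

CHAIN = [
    Stage("learning process (pre/post knowledge & skills)"),
    # Internal Factors act on Application
    Stage("application", ["reward", "environment", "time", "support", "motivation", "goals"]),
    # External Factors act on Performance
    Stage("performance", ["competition", "market", "customer", "time", "budget", "goals"]),
]

for stage in CHAIN:
    extra = f" (also shaped by: {', '.join(stage.influences)})" if stage.influences else ""
    print(f"-> {stage.name}{extra}")
```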
  • 16. IOL Process Outline [diagram] Stages: Analysis, Design, Delivery, Review, with Measurement and Influencing Factors running alongside.
  • 17. ROI?
    • ROI is a legitimate discussion for:
      • Aggregated investment and activity
      • Specific types of learning programmes
    • It is not legitimate if:
      • There is no meaningful way of measuring the benefits
      • It is not possible to establish the causal relationship between the learning and the benefits
    • Estimation may be possible, but it is not the same as measurement (see the sketch after this list)
      • Sceptical of the value of the Phillips ROI model, though not of having a systematic process
    • What’s the ROI of measuring and calculating the ROI?
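For reference, the ROI arithmetic itself is trivial; the contested inputs are the benefit figure and the attribution (causality) factor, both usually estimates. A sketch with invented figures, where `attribution` stands for the estimated share of the benefit genuinely caused by the learning:

```python
def roi_percent(benefit: float, cost: float, attribution: float = 1.0) -> float:
    """Classic ROI formula: net benefit over cost, as a percentage.

    `attribution` is the estimated share of the benefit caused by the
    learning rather than by other influences. It is an estimate, not a
    measurement -- which is exactly the slide's objection.
    """
    return (benefit * attribution - cost) / cost * 100

# Invented figures for illustration only.
cost = 50_000       # design, delivery and learner time, in money terms
benefit = 120_000   # claimed performance gain, in money terms
print(roi_percent(benefit, cost, attribution=1.0))  # 140.0: if learning caused it all
print(roi_percent(benefit, cost, attribution=0.4))  # -4.0: if it caused only 40%
```

Note how the conclusion flips sign with the attribution estimate, which is the slide's point: estimation is not measurement.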
  • 18. Measurement and Reporting
    • L&D needs a clear strategy and processes for measurement and reporting
    • What is already measured?
      • Input measures from LMS
      • Output measures from business
    • What can be meaningfully measured in addition?
      • Surveying behaviour and performance change
      • Knowledge and competency assessments
    • Reporting
      • Value Added / Dashboards (sketched after this list)
      • Key programmes and outcomes
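A minimal sketch of one dashboard line such a strategy might produce, pairing input measures of the kind an LMS already holds with output measures from the business; every field name and figure here is hypothetical.

```python
# Hypothetical input measures (the kind an LMS can already report)
inputs = {"learners_enrolled": 240, "completions": 198, "hours": 1580, "cost": 36000}

# Hypothetical output measures (these come from the business, not the LMS)
outputs = {"assessment_pass_rate": 0.86, "behaviour_change_reported": 0.54}

# One dashboard line per programme: input measures for accountability,
# output measures for impact, reported side by side.
completion_rate = inputs["completions"] / inputs["learners_enrolled"]
cost_per_completion = inputs["cost"] / inputs["completions"]
print(f"Completion {completion_rate:.0%} | "
      f"Cost per completion £{cost_per_completion:,.0f} | "
      f"Pass rate {outputs['assessment_pass_rate']:.0%} | "
      f"Behaviour change {outputs['behaviour_change_reported']:.0%}")
```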
  • 19. Measuring and Reporting Impact (comparing three approaches to valuing learning)
    • Project-Driven: driver: the project; success: meet specific project learning goals; level of assessment: often Impact (L4), defined by project; valuation: directly (Outcomes)
    • Core Competency: driver: the business; success: performing the role with requisite knowledge/skills; level of assessment: Behaviour (L3) & Learning (L2), individual, on demand; valuation: organisationally (Human Capital)
    • Valuing the Learning Portfolio: driver: the organisation; success: addressed specific learning needs; level of assessment: Learning (L2); valuation: aggregated (benchmarked)
  • 20. Questions? [email_address] Research Knowledge Base: http://research.elearnity.com