Quick, Cheap and Dirty Training Evaluation
Published in Business, Technology

The Turin 2009 Learning Link

Session 4.4.c) Enhancing institutional capacity development approaches

Day 4 - December 10
Slide notes

  • Free goods; soft skills; attribution; the tradeoff between cost and accuracy.
  • Proving vs. improving: it is very expensive to gather accurate information, and even when gathered, it doesn’t tell us much.
  • Participatory methods: online communities which provide services before, during and after the course; action plans. Inhibiting factors: support of managers/colleagues, availability of resources, need for follow-up support (e.g. what did you have trouble implementing?).

Transcript

  • 1. Quick, Cheap and Dirty Training Evaluation (Okay, maybe not so dirty….)
  • 2. What do We Want Evaluation to Do?
    • Accountability
    • Give quantitative measures of training results
      • Level 1: Participant Satisfaction (during/end of course)
      • Level 2: Learning (during/end of course)
      • Level 3: Participant Behaviour Outcomes (3 months-2 years after training)
      • Level 4: Impact on participants’ organizations
  • 3. What do We Want Evaluation to Do?
    • Learning
    • Give information that can be used to improve training results
      • Is course content targeted to participant needs? If not, what can we do to better focus training?
      • Are we choosing the right participants?
      • Lecturers? Training Material? Training methods?
      • Are participants succeeding in applying what they’ve learned? If not, why not?
  • 4. What do We Want Evaluation to Do?
    • Planning (developmental evaluation)
    • Pre-course needs assessment
  • 5. Why Doesn’t Evaluation Do all These Things?
    • No money
    • No time
    • Not sure how to measure outcomes and impact
    • Low response rates from participants
  • 6. Evaluation of Training for CD
    • Good Training for CD is Hard
    • Good Evaluation of Training for CD Is Harder
    • Good Evaluation of Short-Term One-Off Multi-Organization Training for CD is Hardest
  • 7. A Few Propositions….
    • “98% of Participants Thought Training Was Useful”
    • IS NOT A GOOD TRAINING EVALUATION RESULT
  • 8. A Few More Propositions…
    • MEASURING RESULTS IS NOT THE BEST WAY TO USE EVALUATION TO IMPROVE THEM
  • 9. A Few Propositions, Cont’d….
    • THE EASIEST PART OF TRAINING FOR CD IS THE TRAINING
  • 10. And One More Proposition
    • Lord Give me the Strength to Measure What I Can Change
    • The Courage to Acknowledge that Which I Can’t Change
    • And the Wisdom to Know the Difference
  • 11. The Way Forward
    • To improve overall institute performance: evaluate training processes, not training participants
    • To improve training content: measure obstacles to implementation
    • To enhance survey response rate and accuracy: use participatory methods
    • To improve training outcomes/impact: get managers involved in the process; get participants thinking about why they are there
    • To evaluate results: focus on Level 3, outcomes/behaviour change
  • 12. Measuring Training Processes
    • Planning:
      • Quality of Needs Assessment
      • Meaningful Client Involvement in Needs Assessment
      • Clarity of Training Objectives, Logic Model for Change
      • Strategic Participant Selection
    • Teaching:
      • Quality of Lecturers/Course Materials/Learning Methods
    • Linking Learning to Implementation:
      • Need for Follow-up Support/Quality of Support
  • 13. Measuring Obstacles to Implementation
    • Support of Managers
    • Support of Colleagues
    • Available Resources/Equipment
    • Appropriate Organizational Structure
    • Appropriate Policy Structure
    • Need for More Follow-Up Support
  • 14. Participatory Training Assessment and Evaluation in Three Stages
    • Stage 1: Pre-course needs assessment with application form
    • Stage 2: End-of-course assessment
    • Stage 3: Post-course assessment (3-6 months)
  • 15. Stage 1: Course Application
    • What knowledge, skills, or other benefits would you most like to gain from the course? Please be as specific as possible. You may list up to three goals.
    • For each of the goals listed above, how do you expect to use what you have gained here once you have returned to your workplace?
    • Supervisor Comments/Approval: Goal 1: Goal 2: Goal 3:
  • 16. Stage 2: End-of-Course
    • What knowledge, skills, or other benefits would you most like to gain from the course? Please be as specific as possible. You may list up to three goals.
    • For each of the goals listed above, how do you expect to use what you have gained here once you have returned to your workplace?
    • Now that you have completed the course, please review the goals you set for yourself. For each goal, please rate on a scale of 1-5 the extent to which the course met your needs, with 1 = not at all and 5 = entirely. Where course content did not cover the topic, please choose not applicable (N/A).
    • Comments: Goal 1: 1 2 3 4 5 N/A
  • 17. Stage 2, Cont’d: Second Goal Table
    • Of the knowledge/skills which you acquired in the course, which do you believe will be most useful to you back on the job? You can choose up to three knowledge/skill areas.
    • How do you believe you will use the knowledge/skills in your work?
    • Knowledge/skill area #1; Knowledge/skill area #2; Knowledge/skill area #3
  • 18. Stage 3: Post-Course Follow-Up
    • Of the knowledge/skills which you acquired in the course, which do you believe will be most useful to you back on the job? You can choose up to three knowledge/skill areas.
    • How do you believe you will use the knowledge/skills in your work?
    • Please rate the extent to which you have been able to use this knowledge/skill in your work, on a scale of 1-5 where 1 = not at all and 5 = I have fully used what I have learned at work.
    • Have you faced obstacles in using in your work the knowledge/skills that you have acquired in this course? If so, please circle any statements that apply. Please answer separately for each knowledge/skill area that you listed here.
    • Knowledge/skill area #1: 1 2 3 4 5
      • 1. I do not have adequate funds. 2. I do not have adequate facilities and/or equipment. 3. My managers do not support me. 4. My colleagues do not support me. 5. I need more expert advice on how to apply what I’ve learned. 6. What I learned was not relevant for my present workplace.
  • 19. Why The Tables?
    • They can be used to:
    • Evaluate participant selection
    • Adjust training content to needs of persons in course
    • Focus participants to enhance their learning experience
    • Evaluate impact of training modules on participants with very different needs/expectations
    • Get a more nuanced picture of course results, including unintended ones
    • Get a more nuanced picture of obstacles to training implementation
  • 20. All this and more……
    • Quantifiable across courses while still giving richer, open-ended information
    • Inexpensive to administer
    • Simple: a standardized questionnaire means that you don’t need to prepare a separate one for every course.
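Because the questionnaire is standardized, the 1-5 goal ratings can be tallied across courses with very little effort. A minimal sketch in Python, using hypothetical response data and field names (the course titles and record layout are illustrative, not from the deck):

```python
from collections import defaultdict

# Hypothetical responses to the standardized Stage 2 questionnaire:
# each participant rates up to three goals on a 1-5 scale (None = N/A).
responses = [
    {"course": "M&E Basics", "ratings": [4, 5, None]},
    {"course": "M&E Basics", "ratings": [3, 4, 2]},
    {"course": "Budgeting", "ratings": [5, 5, 4]},
]

ratings_by_course = defaultdict(list)
for r in responses:
    # Skip N/A ratings so they don't drag down the average.
    ratings_by_course[r["course"]].extend(x for x in r["ratings"] if x is not None)

averages = {course: round(sum(v) / len(v), 2)
            for course, v in ratings_by_course.items()}
print(averages)  # → {'M&E Basics': 3.6, 'Budgeting': 4.67}
```

The same pattern works for the Stage 3 obstacle checklist: count how often each of the six numbered obstacles is circled, per course or across all courses.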
  • 21. Preliminary Field Test Findings
    • With detailed directions, participants did, for the most part, state their goals in clear, specific, measurable terms.
    • Participants took the time to provide detailed responses to the table questions.
    • Participant ratings of achievement of learning goals reflected training content.
    • Participants mostly gave different ratings for different learning goals.
    • Trainers reported that the information received gave them a much more nuanced picture of participant expectations/needs and course results.