Training Evaluation workshop slides.may2011
 

PD518, Training Evaluation Workshop course slides. U.S. Department of State. Foreign Service Institute.



Usage Rights

© All Rights Reserved


  • Hi, I'm a facilitator that find your slides very interesting. Would be highly appreciative if you could email me these slides as the download is disabled. My email add is wayfarer68@gmail.com

    Thanks a million
  • I am a training director and was looking for some resources on training evaluation and saw your presentation. It is very good and to the point. However, I could not download it. Is it possible to send me the PowerPoint presentation? I would appreciate it. Thanks, Sam
  • PowerPoint, video, flipcharts
  • To improve current and future training courses/programs
    To decide whether to create, continue, or discontinue training courses/programs
    To determine training’s contribution to desired organizational results
    To maintain an organizational database for institutional knowledge and future decision-making
    What do you think of these reasons? How do they relate to your organization? Can you think of additional reasons?
  • Systematic
    Planned
    Documented
    Trained
    Implemented
    Results communicated
    Actions taken
    “How we do business”
  • Question: Does this look familiar? Each level helps us answer those 4 key questions about how we know that our training works.
  • Staff Resources and Time
    Evaluation planning
    Data collection and administration
    Data analysis
    Reporting
    Tools
    In-house and commercial
    Tool expertise/training
  • Tips to increase response rate:
    Communicate before – let people know in class that you will be following up 3-6 months out
    Alert – survey email inviting them to participate
    Follow up – subsequent emails
  • Quantitative and Qualitative Data Some methods provide data which are quantitative and some methods data which are qualitative. Quantitative methods are those which focus on numbers and frequencies rather than on meaning and experience. Quantitative methods (e.g. experiments, questionnaires and psychometric tests) provide information which is easy to analyse statistically and fairly reliable. Quantitative methods are associated with the scientific and experimental approach and are criticised for not providing an in depth description. Qualitative methods are ways of collecting data which are concerned with describing meaning, rather than with drawing statistical inferences. What qualitative methods (e.g. case studies and interviews) lose on reliability they gain in terms of validity. They provide a more in depth and rich description. Quantitative methods have come under considerable criticism. In modern research, most psychologists tend to adopt a combination of qualitative and quantitative approaches, which allow statistically reliable information obtained from numerical measurement to be backed up by and enriched by information about the research participants' explanations. You will find that many of the core studies do collect both types of data.  http://www.holah.karoo.net/quantitativequalitative.htm
  • Question: Does this look familiar?Each level helps us answer those 4 key questions about how we know that our training works.
  • Strategic Goal 1 – “Workforce Meets Priority Diplomatic & Operational Requirements as a Result of FSI Training”
    Strategic Goal 4 – “Core Training Continues To Fulfill Baseline Requirements and Meet New Challenges/New Skills”
  • Standard language for training: As a result of training, students will be able to achieve each learning objective.
  • Participant self-assessment
    Skills in presenting
    Progress
    Feedback from others
    Coaching using a checklist
    PD505 examples
  • PD 505, Training Tradecraft and PD513, Training and Presentation Skills: 10-minute recorded training presentations
    Public Diplomacy: “Elevator speech” to a host national walking down the hall
    PD611, Leading a Small Post: Simulated press interview with “reporter” in FSI TV studio
  • Reported in Kirkpatrick, Jim, PhD and Kirkpatrick, Wendy Kayser. “The Kirkpatrick Four Levels: A Fresh Look after 50 Years.” p. 5.
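The note above on quantitative and qualitative data can be made concrete with a short sketch: quantitative survey responses can be summarized numerically, while qualitative comments have to be read and coded for themes. This is a generic illustration with made-up ratings and comments, not data from any FSI survey.

```python
from collections import Counter
from statistics import mean

# Hypothetical Level 1 survey results: 1-5 satisfaction ratings (quantitative)
# and free-text comments (qualitative). All values are made-up examples.
ratings = [5, 4, 4, 3, 5, 2, 4, 5, 4, 3]
comments = [
    "The role-play exercises were the most useful part.",
    "Needed more time for the evaluation-plan templates.",
]

avg = mean(ratings)                  # central tendency of the numeric data
freq = Counter(ratings)              # frequency distribution (how many 5s, 4s, ...)
pct_favorable = sum(1 for r in ratings if r >= 4) / len(ratings)  # "top-2 box" share

print(f"Mean rating: {avg:.1f}")
print(f"Distribution: {dict(sorted(freq.items()))}")
print(f"% rating 4 or 5: {pct_favorable:.0%}")
# The qualitative comments are not averaged; they are grouped into themes
# (e.g. "exercises", "pacing") to explain *why* the numbers look as they do.
```

In practice the two kinds of data are reported side by side, as the note suggests: the numbers show the trend, the comments explain it.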

Training Evaluation workshop slides.may2011: Presentation Transcript

  • Training Evaluation Workshop
    PD518
    May 12-13, 2011
    Facilitator:
    Kathy Beckman FSI/SPAS/CSD
  • Training Evaluation is a
    “Hot Topic”
    because . . .
  • Setting the Learning Space
    Expectations, Learning Objectives, Group Norms
  • As a result of participating in this workshop, you should be able to:
    Define training evaluation
    Differentiate between Kirkpatrick’s four levels of training evaluation
    Write learning objectives that are specific and measurable
    Learning Objectives
  • Identify strategies and methods for assessing student reactions to training (Level 1)
    Specify steps for creating effective, valid tests to measure learning gains (Level 2).
    Identify strategies and methods for determining changes in behavior after training (Level 3).
    Locate resources for assessing organizational results (Level 4)
    Develop an Evaluation Plan for evaluating your training course/program.
    Learning Objectives, cont.
  • Your Training Evaluation Plan
    Throughout the course, you will be using templates to develop a Training Evaluation Plan for your own course(s)
  • Training Evaluation Overview
    How do we know that our training works?
  • How did participants react?
    Were they satisfied?
    Did it meet their needs?
    Would they recommend it to others?
  • What did they learn?
    Knowledge?
    Skills?
    Attitudes?
    “KSAs”
  • Did behavior (performance) change as a result?
    How have they applied knowledge, skills, and attitudes back on the job?
  • What is the organizational impact?
    Did the organization achieve its desired results from the training?
  • Training Evaluation is…
    “An integrated, four-level approach to determine the effectiveness of training programs.”
    Source: Kirkpatrick, Donald L. and Kirkpatrick, James D.
    Evaluating Training Programs. The Four Levels. 3rd Edition.
    San Francisco: Berrett-Koehler. 2006.
  • Training Evaluation is…
    “A systematic process to determine the worth, value, or meaning of a training activity or process.”
    Source: Jack J. Phillips and Ron Drew Stone. How to Measure Training Results. A Practical Guide to Tracking the 6 Key Indicators. New York: McGraw-Hill. 2002.
  • Why Evaluate?
    Sources:
    Kirkpatrick, Donald L. and Kirkpatrick, James D. Evaluating Training Programs. The Four Levels.
    Jack J. Phillips and Ron Drew Stone. How to Measure Training Results.
  • Successful Training Evaluation is…
    Aligned with organization’s mission and strategic goals
    A systematic process
    Data-driven
    Focused on continuing improvement
  • Benefits of Successful Training Evaluation
    Provides data over time (trends)
    Multiple stakeholders can use data as evidence when making decisions
    Helps improve quality of training activities
    Contributes to maximizing the talents of Department of State personnel
  • FSI’s Training Evaluation Model
    Kirkpatrick’s Four Levels
  • Level 4
    Organizational
    Results
     Plan
     Evaluate
    Level 3
    Student Behavior Change
    on the job
    Level 2
    Student Learning
    Level 1
    Student Reaction
     Train
    (KIRKPATRICK MODEL)
  • Evaluation Timeframes
  • Key Points about the Kirkpatrick Model
    A sequence of ways to evaluate training
    Each level is important and yields valuable data
    Each level impacts the next level – don’t skip levels!
    Organizations should strategically select the scope of their evaluation activities
    “All levels for all training” is usually too costly
  • Evaluation Data Creates a “Chain of Evidence”
    Source: Jim Kirkpatrick, PhD and Wendy Kayser Kirkpatrick. “The Kirkpatrick Four Levels: A Fresh Look After 50 Years. 1959 – 2009.” Copyright 2009. Kirkpatrick Partners, LLC.
  • Sample Evaluation Strategy
    Source: Strategy of a large telecommunications company.
    Jack J. Phillips and Ron Drew Stone. How to Measure
    Training Results, p. 19.
  • Requirements – ALL Levels
    Resources & Time
    • Evaluation planning
    • Data collection and administration
    • Data analysis
    • Reporting
    Tools
    • Survey
    • Data analysis
  • Potential Selection Criteria for Levels 3 and 4
    Core programs to implement organization’s strategic goals
    New programs or major upgrades
    High visibility to management and key stakeholders
    Large target audience
    “Shelf life” of at least 1 year
    Source: Jack J. Phillips and Ron Drew Stone. How to Measure
    Training Results. A Practical Guide to Tracking the 6 Key Indicators.
    New York: McGraw-Hill. 2002.
  • Target Audience and Sample Size
    • How many people in your target audience do you need to contact to get meaningful results?
    • All participants
    • Sample
    • Confidence level (95%)
    • Confidence interval (margin of error)
    • See Sample Size Calculator at: http://www.surveysystem.com/sscalc.htm
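The arithmetic behind sample-size calculators like the one linked above can be sketched with the standard formula for estimating a proportion (conservative 50/50 split, z = 1.96 for a 95% confidence level) plus a finite-population correction. This is a generic illustration, not the linked calculator's published code, and the population figures are made-up examples.

```python
import math

def sample_size(population, margin_of_error=0.05, z=1.96, p=0.5):
    """Respondents needed to estimate a proportion at 95% confidence
    (z = 1.96), assuming the most conservative split p = 0.5, with a
    finite-population correction for small target audiences."""
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2   # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))   # shrink for a finite audience

# Example: a target audience of 500 course graduates, +/-5% margin of error.
print(sample_size(500))      # about 218 respondents needed
print(sample_size(100_000))  # very large audiences still need only ~383
```

Note how weakly the requirement grows with audience size: that is why sampling, rather than surveying every participant, is often good enough.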
  • Response Rate- What’s High Enough?
    • What percent of your target audience do you want to get responses from?
    • What response rate will give you a “statistically valid” sample?
    • How can you increase your response rate?
    See FSI 2010 Annual Training Survey, p. 48
    Industry research paper – on back table
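One way to judge whether a response rate is "high enough" is to compute the margin of error the achieved responses actually deliver, using the standard formula for a proportion with a finite-population correction. The audience and response counts below are made-up examples, not figures from the FSI survey.

```python
import math

def margin_of_error(population, responses, z=1.96, p=0.5):
    """Margin of error achieved for a proportion at 95% confidence
    (z = 1.96), conservative split p = 0.5, with a finite-population
    correction for surveying a large fraction of the audience."""
    se = z * math.sqrt(p * (1 - p) / responses)                    # sampling error
    fpc = math.sqrt((population - responses) / (population - 1))   # correction factor
    return se * fpc

# Example: 150 of 500 graduates respond (a 30% response rate).
print(f"{margin_of_error(500, 150):.1%}")  # roughly +/-6.7%
```

A low response rate can still yield a usable margin of error if the absolute number of responses is large enough, though non-response bias (who chose not to answer) is a separate concern the formula cannot fix.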
  • Data Collection Methods
  • Data Collection Types
    Qualitative Data
    1. Describes context, meaning
    2. Examples: personal experiences, case studies
    Quantitative Data
    1. Can be expressed and analyzed as a number
    2. Examples: demographics, rankings (1-5)
    Both types of data are important!
  • FSI Survey Tools
  • Applying the Kirkpatrick Model
    at the Department of State
  • Level 4
    Organizational
    Results
     Plan
     Evaluate
    Level 3
    Student Behavior Change
    on the job
    Level 2
    Student Learning
    Level 1
    Student Reaction
     Train
    (KIRKPATRICK MODEL)
  • FSI Mission
    “The mission of the Foreign Service Institute (FSI) is to develop the men and women our nation requires to fulfill our leadership role in world affairs and to defend U.S. interests.”
    From FY 2011 Bureau Strategic Plan
  • FSI’s 4 Strategic Goals
    Workforce Meets Priority Diplomatic & Operational Requirements as a Result of FSI Training
    Global Workforce Can More Widely Access Training Through Distance Learning Technologies
    From FY 2011 Bureau Strategic Plan
  • FSI’s 4 Strategic Goals, cont.
    Management Practices Promote Efficiency and Effectiveness
    Core Training Continues To Fulfill Baseline Requirements and Meet New Challenges/New Skills
    From FY 2011 Bureau Strategic Plan
  • Excerpt from FSI 2010 Annual Training Survey (Level 4)
    IMPACT OF FSI TRAINING ON IMPROVED JOB PERFORMANCE - % Agree or Strongly Agree
    2010 (Question Added in 2010)
    94% Employee Self-Assessment
    96% Supervisor Assessment of Employee(s)
    48-51
  • Who can identify critical on-the-job behaviors?
    Leadership teams
    Supervisors
    Subject matter experts
    Within the Department
    Outside the Department
  • Knowledge
    Skills
    Attitudes
    “KSAs”
  • Examples of Learning Conditions
    Location
    Date/time
    Format (classroom, DVC, Webinar, distance learning)
    Facilitators
    Materials & methods
  • Writing Learning Objectives
    Make them specific and measurable!
  • What is a Learning Objective?
    A specific statement of measurable results a student can expect to achieve as a result of training
    Introduced by: “As a result of participating in this training, you will be able to…”
  • Which one of these is a Learning Objective?
    A – Identify (underline or circle) all misspelled proper nouns in a 500-word Spanish language news article.
    B – Play soccer.
    C – Demonstrate to students the correct way to complete a Leave Request form.
  • Characteristics of an Effective Learning Objective
    Performance
    Conditions
    Criteria
    Source: Robert F. Mager. Preparing Instructional Objectives.
    Revised 2nd Edition. 1984. Lake Publishing Company. Belmont, CA.
  • Reaction
    Level 1
  • Level 1 Questions
    What did participants like about the program?
    What did they not like?
    How do they plan to use their training on the job?
  • Evaluation Samples– Level 1
    Kirkpatrick
    Chapter 4
    pp. 29-34
  • PD512, Training Design Workshop
    Workbook
    Generic evaluation for FSI Distance Learning class
    Workbook
    SAIT Instructor Led Training, Post Event Survey
    Handout
    FSI Evaluation Samples– Level 1
    38-41
  • Eight Tips on Developing Valid Level 1 Evaluation Forms
    Ken Phillips. Training Today, Fall 2007 (A quarterly magazine published by the Chicagoland Chapter of ASTD)
    Resources for Level 1
  • Are we on target?
    I want to learn more about…
  • Level 2
    Learning: knowledge, skills, attitudes
  • What did participants learn?
    What knowledge, skills, or attitudes did they develop or enhance during the course?
    Level 2 Questions
  • Some Level 2 Evaluation Methods
  • PD 505, Training Tradecraft
    10 minute training sessions (recorded)
    Public Diplomacy
    “Elevator speech”
    In-country TV interview (recorded)
    FSI Examples of Performance Tests
    Do you use performance tests?
    If so, please describe.
  • PD505, Training Tradecraft
    Interactive Presentation Checklist for training and presentation skills
    FSI Level 2 Evaluation Sample
    42
  • Research on Kirkpatrick Levels
    There is statistical correlation between Levels 1 and 2
    A positive learner reaction to training resulted in more learning
    There is also a statistical correlation between Levels 3 and 4
    “When employees consistently perform critical on-the-job behaviors, individual and overall productivity increased”1
    1 Research for Kirkpatrick Partners conducted by Sandy Almeida, MD, MPH.
  • Research on Kirkpatrick Levels, cont.
    However, there is not a statistical correlation between Levels 2 and 3
    “Even providing excellent training does not lead to significant transfer of learning to behavior and subsequent results without a good deal of deliberate and consistent reinforcement.”2
    2 Ibid., p. 5.
  • Failure to Transfer Learning to the Workplace
    More than 70% of learning failures occur after the training event is over.
  • Behavior
    Level 3
  • Level 3 Questions
    “Rubber meets the road”
    What specific behavior (performance) changes have resulted from the training?
    How well are participants applying the KSAs from training?
  • PD505, Training Tradecraft evaluation of 3 regional training classes
    School of Language Studies, FSI Classroom Training Impact Survey
    Handout
    Evaluation Samples– Level 3
    43-47
  • Resources for Level 3
    Quick Tips – a weekly online newsletter with practical tips on improving learning transfer to behavior on the job
    Register at www.kirkpatrickpartners.com
    Kirkpatrick Evaluation group
    www.linkedin.com
  • Organizational Results
    Overview of Level 4
  • Level 4 Questions
    What is the organizational impact?
    How did individual behavior change result in organizational success?
    The global view
  • Excerpt from FSI 2010 Annual Training Survey (Level 4)
    IMPACT OF FSI TRAINING ON IMPROVED JOB PERFORMANCE - % Agree or Strongly Agree
    2010 (Question Added in 2010)
    94% Employee Self-Assessment
    96% Supervisor Assessment of Employee(s)
    48-51
  • Resources for Level 4
    Books, articles, white papers
    Donald Kirkpatrick
    Jim Kirkpatrick
    Jack J. Phillips
    Websites
    American Society for Training and Development
    Kirkpatrick Partners
    ROI Institute (Phillips)
  • Complete your Training Evaluation Planning
    Templates, pp. 14, 33
  • Complete the templates for your training course/program
    Workbook, p. 14 – Starting your Training Evaluation Plan
    Workbook, p. 33 Data Collection Plan Worksheet
    Get together with partner(s) and share your evaluation plan
    Give and receive feedback to perfect your plans!
    Learning Activity
  • Presentations of Training Evaluation Plans