This document discusses evaluating training programs. It identifies four levels of evaluation: reaction, learning, behavior, and results. Reaction measures how learners felt about the training. Learning assesses what new skills and knowledge were gained. Behavior looks at lasting changes on the job. Results captures the training's impact on organizational goals. Methods are provided for each level, including surveys, tests, observation, and statistical analysis. The importance of developing an evaluation strategy is emphasized. The strategy should include planning the evaluation, collecting and analyzing data, and presenting results with recommendations for improvement. Evaluating training ensures objectives are met and identifies ways to enhance future programs.
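To make the four-level structure concrete, here is a minimal sketch, an editorial illustration rather than anything drawn from the document itself, pairing each Kirkpatrick level with the focus and measurement method the summary names.

```python
# An editorial illustration (not from the document): each Kirkpatrick level
# paired with the focus and measurement method named in the summary above.
kirkpatrick = {
    1: ("Reaction", "how learners felt about the training", "surveys"),
    2: ("Learning", "new skills and knowledge gained", "tests"),
    3: ("Behavior", "lasting changes on the job", "observation"),
    4: ("Results",  "impact on organizational goals", "statistical analysis"),
}

for level, (name, focus, method) in kirkpatrick.items():
    print(f"Level {level} ({name}): {focus} -- measured via {method}")
```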
This document discusses team development for an assignment that involves an expedition. It outlines the team's objectives of building friendship and leadership through completing the expedition. It also discusses methods for reviewing team performance such as peer reviews and expert reviews. Key strategies for evaluating and achieving team performance are identified, such as developing evaluation forms and setting guidelines for feedback. Finally, task management techniques like flip charts and computer charts are mentioned.
This document discusses training evaluation. It begins by defining training evaluation and identifying its objectives: to define training evaluation, discuss its purpose, and identify different evaluation types. It then discusses that evaluation is a state of mind, not just techniques. The purposes of evaluation are feedback, control, and intervention. Evaluation can assess the training plan, process, and product. Different evaluation methods and tools are reviewed. The document advocates for evaluating all stages of training and discusses frameworks for doing so, including Kirkpatrick's model and evaluating achievement of targets, attracting resources, and satisfying interested parties. It argues that the important question is not why evaluate, but whether an organization can afford not to evaluate training activities.
Mark Bailye, Client Success Specialist, A/NZ | Bb Education on Tour 2015 | Ed... — Blackboard APAC
The document discusses assessment and feedback principles for encouraging positive learning habits. It describes Mark Bailye's role as an Adoption Specialist with Blackboard, providing expertise on effective adoption and implementation of Blackboard solutions. The document outlines areas of Mark's expertise including academic change management, curriculum design, and strategic learning management system planning and implementation. It focuses on encouraging positive learning habits, assessment and feedback, outlining key principles and how tools in a learning management system can support these areas. Examples are provided around using tools like assignments, discussions, badges and adaptive release to trigger actions, provide rewards and encourage investment in learning.
Measuring the Impact of eLearning: Turning Kirkpatrick’s Four Levels of Evalu... — Lambda Solutions
Access to webinar recording here: http://go.lambdasolutions.net/webinar-growing-trend-of-open-source-learning
Whether the aim is to inform, to improve, to change, or a combination of these, training must have measurable outcomes that contribute to larger organizational goals. Good training evaluation techniques identify and measure the impact of learning on job performance and, ultimately, organization-wide business results. When it comes to measuring eLearning, Donald Kirkpatrick's Four Levels of Evaluation model is one of the most widely used and respected worldwide.
Co-hosted by Paula Yunker, who brings 30+ years of instructional design experience and certification in Kirkpatrick's Four Levels of Evaluation, this webinar will explore why learning evaluation is an important component of any training program and how you can measure the application of learning beyond the learning event itself. We'll discuss how to implement learning evaluation that's practical and provides value but isn't complicated, time-consuming, or expensive. Paula will also share her favorite learning evaluation resources after the webinar!
Check out the slides to learn more about:
- Why learning evaluation is critical for business results
- Kirkpatrick’s four levels of evaluation explained
- Aligning learning to organizational goals
- Typical challenges implementing evaluation in an organization
- Practical strategies for implementing learning evaluation
- Our favorite learning evaluation resources
The document discusses several instructional design models including ADDIE, Dick and Carey, Kemp, and Smith and Ragan. ADDIE is a five-phase model for developing instruction that includes analysis, design, development, implementation, and evaluation. The Dick and Carey model focuses on identifying instructional goals and ensuring objectives are met. The Kemp model determines learner characteristics and needed resources. Smith and Ragan emphasize analysis of learners, strategies for organization and management, and evaluation.
Mark Bailye, Blackboard - Moodlemoot 2015 Presentation - Enhancing Your Asses... — Blackboard APAC
This document discusses assessment practices and using the Moodle learning management system to enhance assessment. It describes the purposes of assessment as being for learning, as learning, and of learning. Principles of effective assessment and feedback are outlined, including clarifying expectations, encouraging effort, delivering high-quality feedback, and developing self-assessment. The document then examines how different tools in Moodle, such as assignments, quizzes, and forums, can be used to implement these principles of assessment and feedback. Several examples are also provided of how assessment and feedback tools in Moodle have been incorporated into actual courses.
The Use of Rubrics to Support Assessment
What, Why, How?
This presentation examines the assessment rubric as a powerful tool to support student engagement, consistent academic practice, and high-quality feedback.
Enhancing the value of training and professional development — Kathleen Wolfhope
The document discusses relevant training and professional development. It outlines adult learning principles according to Malcolm Knowles, including moving from dependency to self-directedness. It also discusses Kirkpatrick's four levels of evaluation plus a fifth level added by Jack Phillips on return on investment. The document advocates for creating value through real-world application and accessibility of credible trainers to foster continuous learning for both short- and long-term impact. It emphasizes the importance of collaboration and learning communities.
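Phillips's fifth level is conventionally expressed as a simple ratio: ROI(%) = (program benefits - program costs) / program costs x 100. The sketch below works through that formula; all figures are invented for illustration.

```python
# Hypothetical worked example of Phillips's Level 5 (return on investment).
# The formula is the standard ROI ratio; all figures are invented.
program_benefits = 120_000.0  # estimated monetary benefit attributed to the training
program_costs = 80_000.0      # design, delivery, and participant-time costs

net_benefits = program_benefits - program_costs
roi_percent = net_benefits / program_costs * 100

print(f"Net benefits: ${net_benefits:,.0f}")   # Net benefits: $40,000
print(f"ROI: {roi_percent:.0f}%")              # ROI: 50%
```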
[DevDay2018] Build an effective learning program - By Vinh Hoang, Training & ... — DevDay Da Nang
This document outlines steps for building an effective learning program. It discusses assessing needs, developing learning objectives, designing the program, implementing training, and evaluating performance. Key points include spending 80% of effort on needs assessment and objectives to focus training on addressing organizational pain points. Adult learning theory and techniques like serious eLearning are important to consider in design. Coaching after training is also highlighted as critical for providing real-time feedback to help trainees apply their new knowledge on the job. The overall goal is to increase training effectiveness beyond its typical 36% rate.
This document discusses rubrics and their use in project-based learning. It defines rubrics as tools that provide a common vocabulary for assessment and facilitate peer review. Good project-based learning problems relate to the real world, require decision-making, are multi-stage, and incorporate higher-order thinking skills. Rubrics can be used to evaluate project-based learning problems according to these criteria. The document also explores different types of rubrics and considerations for developing effective rubrics.
4. how will you know when you are achieving your aims co dev toolkit - activi... — juko101
To effectively evaluate progress towards goals, one must first clearly define what those goals are. The document provides guidance on setting measurable goals, including thinking about how learners will demonstrate achievement through their actions and words. It also recommends trialling different methods of collecting goal-related data in classes, getting feedback, and deciding on the most appropriate metrics to track achievements over time. Being deliberate about setting goals and implementing measures to assess impact is important for curriculum development work to demonstrate real progress and inform future efforts.
This document outlines six steps for ensuring success in mathematics and science education: 1) having a well-developed curriculum, 2) using instructional materials aligned with the curriculum, 3) employing consistent instructional strategies, 4) providing adequate professional development, 5) implementing an assessment system aligned with standards and curriculum, and 6) establishing policies and procedures that support teaching and learning. It instructs teachers to rate their school on each step, discuss as a team, develop a consensus, and summarize their ratings to identify needs and guide next steps in planning.
This document discusses using rubrics to assess enhanced finished products. It defines rubrics and their purpose, and provides an example rubric with criteria for materials used, procedure followed, and workmanship. The rubric scores products as excellent, good, or fair in each category and can be used to assess the quality of enhanced products. The teacher will teach students about rubrics and their use in assessment, and have students use internet resources to learn more about rubrics.
This document provides an agenda and overview for an instructional strategies course for teaching in medicine. It includes:
- Introductions and an overview of course expectations and assignments
- A guided inquiry exercise where participants are organized into groups based on teaching experience to discuss models of teaching
- An introduction to the course blog where reading assignments will be posted
- An assignment, due next week, to write a short paper reflecting on why they chose to teach
4. how will you know when you are achieving your aims co dev toolkit - activi... — juko101
To effectively evaluate your progress towards achieving your goals, you must first clearly define what you intend to accomplish. One helpful way to do this is to describe what you expect to see learners do and hear them say once a particular outcome is reached. The document then provides some examples of ways to measure success, such as evaluating learners' work, observing classroom behaviors, and getting feedback from learners themselves.
This document outlines the assessment components for a course on curriculum and instructional design. It includes:
1) A 30% individual portfolio requiring weekly reflective essays and notes on course readings and discussions. Students are expected to critically analyze what they are learning.
2) A 40% group curriculum and instructional design project. Groups will analyze and compare curricula from different countries and contexts. Each student will then design lesson plans based on one curriculum.
3) A 30% final exam assessing students' competency in curriculum and instructional design through a 2-hour written examination.
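As a quick arithmetic check of the 30/40/30 weighting above, the sketch below computes a weighted final grade; the weights come from the breakdown, while the individual scores are hypothetical.

```python
# A quick arithmetic check of the 30/40/30 weighting described above;
# the component scores are hypothetical.
weights = {"portfolio": 0.30, "group_project": 0.40, "final_exam": 0.30}
scores = {"portfolio": 82, "group_project": 75, "final_exam": 68}  # each out of 100

final_grade = sum(weights[part] * scores[part] for part in weights)
print(f"Final grade: {final_grade:.1f}/100")  # 0.30*82 + 0.40*75 + 0.30*68 = 75.0
```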
The Morrison, Ross & Kemp (MRK) model is a holistic instructional design approach that considers all factors in the learning environment. It uses a learner-focused approach where the instructional designer asks initial questions about learner skills and knowledge and then determines which of nine elements to use to develop a flexible instructional plan. The MRK model emphasizes continuous refinement through formative evaluation and allows flexibility in its application.
Evaluate the Effectiveness of Your Online Learning & Training Programs — Nimritta Parmar
The ability to measure the effectiveness of your organization's training and development programs is critical to ensure your learning strategy is aligned to your desired business outcomes. However, having a system in place to accurately measure the impact of your learning programs can be a complex challenge, one that organizations struggle with the most.
Join this webinar to discover how to enhance employee performance and prove the value of your learning investments by implementing simple and effective strategies for measuring the impact of your online training programs.
Watch the presentation to learn:
- Common issues and challenges with measuring learning outcomes
- How to align learning outcomes to business objectives
- Best practices for evaluating the effectiveness of your training programs
- How to analyze and interpret learning data to understand the impact on individual learning and performance (one simple approach is sketched after this list)
- How to leverage these insights to improve training programs and enable the organization to make better-informed decisions about learning
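The sketch below shows one common way to quantify learning from pre/post assessment data, Hake's normalized gain; it is an editorial illustration with invented scores, and the webinar's own methods may well differ.

```python
# A hedged sketch of one way to quantify learning gain from assessment data;
# the pre/post scores are invented, and Hake's normalized gain is just one
# of many metrics a team might choose.
pre_scores = [55, 60, 48, 72, 65]    # hypothetical pre-training quiz scores (%)
post_scores = [70, 78, 66, 85, 80]   # the same learners after training

def normalized_gain(pre, post):
    """Hake's normalized gain: improvement relative to available headroom."""
    return (post - pre) / (100 - pre)

gains = [normalized_gain(p, q) for p, q in zip(pre_scores, post_scores)]
avg_gain = sum(gains) / len(gains)
print(f"Average normalized gain: {avg_gain:.2f}")  # ~0.40: 40% of possible improvement realized
```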
Problem-Based Learning: An Inquiry Approach — hugenerawewehewr
A step-by-step guide for teaching your students to think critically and solve complex problems! Problem-based learning expert John Barell troubleshoots the PBL process for teachers, drawing from practical classroom experience. Step-by-step procedures make this remarkably effective teaching model accessible and highly doable for all teachers, from beginners to veterans. This standards-based, teacher-friendly second edition of the author's popular PBL guide includes: examples showing problem-based learning in action; answers to frequently asked questions on standards-based implementation; thorough guidelines for developing problems for students to solve; and rubrics and assessment tips to ensure that standards are met.
Learning Evaluation - How to get the most out of your training programs simply an... — Lambda Solutions
Whether it's to inform, teach, improve, change, or a combination of these factors, training should have a purpose. But how can you be sure you've done what you set out to do? Good training evaluation techniques identify and measure what learning has occurred during and after training, whether job performance improvements have been realized, and, most importantly, how organizations can get the best training value for their money and efforts.
In this webinar, we explore why learning evaluation is an important component of any training program and how you can measure the application of learning beyond the learning event itself. We discuss how to implement learning evaluation that's practical and provides value but doesn't have to be complicated, time-consuming, or expensive.
Tune in!
T Level Assessment & Curriculum Planning - Simple Example — CapellaSystems
This document provides an overview of the T Level assessment and curriculum planning process, including the following key steps:
1. Create an assessment plan by setting course aims and outcomes, which is then used to plan curriculum intent and review impact.
2. Develop curriculum implementation by creating activities aligned to the assessment outcomes and mapping the curriculum sequence.
3. Analyze curriculum impact once delivery begins by adding findings from student assessments and identifying improvement actions.
Measures That Matter: How to Capture and Communicate the Value of Your Learni... — Casey Cramer
By Noelle K. Akins, L.P.C., Navigator Management Partners
Learning Objectives
• Identify five levels of “matterful” learning evaluation
• Recognize how, when and where to gather data for each level
• Apply evaluation data to improve future programming and support organizational outcomes
We recently updated the Developmental Continua at GWA in Math and Literacy. Here is a parent presentation I did to help parents understand the new continua.
April 2015 Creating Learning Outcomes — Christa Downey
This document outlines an institution's process for creating learning outcomes and a rubric to guide career services programming, convey expectations to students, and assess effectiveness. It describes adopting broad learning outcomes in 2010, then setting program-specific outcomes from 2010-2014. In 2014, outcomes were expanded and a rubric with developmental levels (emerging, developing, accomplished) was created. The rubric will be used to inform programming, communicate expectations, measure student growth, and assess programs and services through methods like student and alumni surveys. The overall goal is to demonstrate the value added by career services.
This document outlines the agenda and goals of a meeting between the Riverside School Board and McGill University to discuss enhancing mathematics teaching and learning through technology. The day will include reviewing project goals, formative assessment, focus groups, and a wrap-up. Key themes of the CCCM project are student success, digital literacy, transition from elementary to secondary school, professional learning, and using data. Objectives for the first year are to foster communities of practice around mathematics and digital tools and develop shared understandings.
This document outlines a whole school literacy and numeracy plan for Coffs Harbour High School for 2009-2010. It establishes that improving literacy and numeracy is a core responsibility of all teachers. The plan details the school improvement planning process, which involves collecting and analyzing student achievement data, setting targets, researching effective practices, implementing action plans, and evaluating the impact on student learning. Key dates and responsibilities are provided for establishing literacy and numeracy goals, developing faculty strategies and action plans, and monitoring implementation progress.
The ADDIE model is an instructional design model consisting of 5 phases: Analysis, Design, Development, Implementation, and Evaluation. The Analysis phase involves gathering data to determine training needs and goals. In the Design phase, objectives and lessons are created. During Development, materials and activities are produced. In Implementation, the training is delivered. Finally, Evaluation assesses the effectiveness of the training through various methods. The ADDIE model provides a systematic approach to creating effective instructional materials and programs.
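For readers who think in code, here is a minimal editorial sketch, not part of the document, that renders the five ADDIE phases as an ordered pipeline; the phase outputs paraphrase the descriptions above.

```python
# An editorial sketch (not from the document): the five ADDIE phases as an
# ordered pipeline, with outputs paraphrasing the descriptions above.
addie_phases = {
    "Analysis":       "training needs and goals identified from gathered data",
    "Design":         "objectives and lessons created",
    "Development":    "materials and activities produced",
    "Implementation": "training delivered to learners",
    "Evaluation":     "effectiveness assessed and fed back into the cycle",
}

for step, (phase, output) in enumerate(addie_phases.items(), start=1):
    print(f"{step}. {phase}: {output}")
```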
The aim of this presentation was to provide college staff and faculty with a framework for developing a competency-based curriculum. The workshop was presented during the national conference of the Vietnam Association of Community Colleges on September 19, 2013.
The Systematic Design of Instruction: Dick and Carey — Cathy Cousear
This document outlines the key principles and steps of instructional design according to Dick and Carey's systematic design model. It discusses establishing instructional goals, analyzing the learning context, writing objectives and criterion-referenced tests, developing instructional strategies and materials, and conducting formative evaluations to revise instruction. The goal is to design effective instruction by thoroughly analyzing learning needs and contexts, developing appropriate objectives and assessments, selecting optimal instructional approaches, and refining the design through evaluation and revision.
This document provides information and guidance for developing effective assessment tasks. It discusses linking assessment to learning outcomes, setting the appropriate level according to the NQF framework, and different types and purposes of assessment. Guidelines are provided for writing good learning outcomes and developing rubrics and criteria for assessment tasks. Different taxonomies for generating outcomes and assessments are explained, including Bloom's and Biggs' SOLO taxonomy. The document also covers reliability and validity in assessment, and provides tips for writing exam papers and checklists for moderation. Participants will work on tasks to develop assessment activities and criteria for outcomes, and compare sample exam papers.
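As a side illustration of how a taxonomy can drive outcome writing, the sketch below pairs Bloom's cognitive levels with typical outcome verbs; it is a hypothetical aid, not taken from the workshop materials.

```python
# A hypothetical aid (not from the workshop materials): Bloom's cognitive
# levels paired with typical outcome verbs, used to draft a learning outcome.
blooms_verbs = {
    "remember":   ["define", "list", "recall"],
    "understand": ["explain", "summarize", "classify"],
    "apply":      ["use", "demonstrate", "solve"],
    "analyze":    ["compare", "differentiate", "organize"],
    "evaluate":   ["justify", "critique", "assess"],
    "create":     ["design", "construct", "formulate"],
}

def draft_outcome(level, topic):
    """Draft an outcome statement using the first verb listed for the level."""
    return f"Students will be able to {blooms_verbs[level][0]} {topic}."

print(draft_outcome("analyze", "curricula from two different countries"))
# -> Students will be able to compare curricula from two different countries.
```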
Developing Assessment Instruments — Gary Howard
This document discusses developing criterion-referenced assessment instruments. It describes four types of criterion-referenced tests: entry skills tests, pretests, practice tests, and posttests. These tests are used to assess prerequisite skills, previous mastery, provide feedback during instruction, and determine overall effectiveness. The document also covers developing test items aligned to instructional objectives, using various item formats, and creating alternative assessments like portfolios. It emphasizes the importance of congruence between instructional goals, learner characteristics, performance contexts, and assessments throughout the instructional design process.
Rubrics: Transparent Assessment in Support of Learning — Kenneth Ronkowitz
Rubrics provide concise descriptions of criteria for evaluating student work or performance. They define multiple levels of quality for each criterion from excellent to poor. Rubrics benefit both students and teachers by making clear expectations, providing transparency and consistency in grading, and giving effective feedback to improve learning. Teachers can create rubrics for assignments, assessments, or course materials. Rubrics can be holistic, evaluating work as a whole, or analytic, separately rating each criterion. Moodle has a rubric tool to create and apply rubrics for grading assignments.
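To show the analytic style concretely, here is a minimal sketch that rates each criterion separately and totals the points. The level names and criteria echo examples elsewhere in this listing; the point values are an assumption, and this is not Moodle's actual rubric tool.

```python
# A minimal sketch of an analytic rubric: each criterion is rated on its own
# level scale and the points are totalled. The level names and criteria echo
# the examples in this listing; the point values are an assumption.
LEVELS = {"excellent": 3, "good": 2, "fair": 1}
CRITERIA = ["materials used", "procedure followed", "workmanship"]

def score_submission(ratings):
    """Sum the points awarded for each criterion's rated level."""
    return sum(LEVELS[ratings[criterion]] for criterion in CRITERIA)

ratings = {"materials used": "excellent", "procedure followed": "good", "workmanship": "good"}
max_points = len(CRITERIA) * max(LEVELS.values())
print(f"Analytic score: {score_submission(ratings)}/{max_points}")  # Analytic score: 7/9
```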
Instructional design is the process of improving instruction through analyzing learning needs and systematically developing learning materials. It helps learners make sense of information through thoughtful, engaging content design rather than just presenting information. The ADDIE model is a five-step process used by instructional designers: analysis to identify learners and needs, design of the strategy and prototype, development of refined materials, implementation to deliver the training, and evaluation through formative and summative feedback.
The Dick and Carey Systems Approach Model is a behaviorist approach to instructional design consisting of 10 components across 6 phases: design, analysis, development, formative assessment, revision, and summative evaluation. The first two phases involve assessing learner needs to identify goals, then conducting an instructional analysis to determine entry skills and conditions for learning. Next, performance objectives are written describing what learners will do. Assessments are developed, an instructional strategy is planned, and materials are prepared. During formative assessment, instruction is evaluated for effectiveness and revised as needed before summative evaluation of outcomes.
This document outlines a presentation on assessing and reporting training quality. It discusses:
1. Identifying the benefits of comprehensive training assessment, including learner accountability, organizational development, and demonstrating value.
2. Kirkpatrick's four levels of training evaluation - reaction, learning, behavior, and results - and how to assess each using both quantitative and qualitative metrics.
3. Holton's model of external factors that can influence training assessment.
4. Methods for assessment, including surveys, performance-based assessments, client contact, and focus groups to evaluate different levels.
5. Elements to include in a comprehensive training evaluation report, such as needs, objectives, recommendations, and references.
This document provides an overview of assessment for teachers. It defines assessment and differentiates it from evaluation. It discusses the importance of assessment in the teaching and learning cycle and its role in planning instruction. Formative and summative assessments are defined and examples are provided. The acronym TIPS for providing effective formative feedback is introduced. Teachers will learn to incorporate assessment into their unit planning using the Understanding by Design framework. The objectives are to help teachers understand assessment and use it to improve student learning.
CAIeRO: Practical Tools for Course Design — Julie Usher
The document provides an agenda and information about a CAIeRO course design retreat. The retreat will cover topics like setting learning outcomes, storyboarding, and action planning. CAIeRO stands for "Creating Aligned Interactive Educational Resource Opportunities" and is a course design toolkit. The toolkit includes tools like a module blueprint to define the mission and approach, storyboarding to plan learning activities, prototyping activities, and reviewing and reflecting on the design. The retreat aims to help participants design learner-focused, collaborative, and flexible courses using the CAIeRO toolkit and principles.
This document provides guidance for teachers delivering the GCE A2 Photography course, including examining the structure and assessment objectives, considering delivery models, and sharing good practices. It outlines the agenda for a training course, discusses assessment and moderation procedures, and emphasizes selecting and presenting candidates' best work for evaluation.
This document provides an overview and training on the Home Base IIS system. It covers how to log in, navigate student data and reports, create and schedule lessons and assessments, and administer online tests. The training discusses tracking student progress, analyzing assessment data, locating instructional materials, and using reports to identify struggling students and standards. It demonstrates how to create items, tests, and rubrics as well as print, schedule and assign assessments.
Discusses the facets of performance assessment: definition, advantages and disadvantages, types, process, guidelines and procedures, and the types of rubrics.
The A.D.D.I.E. of Developing a Strategic Training Roadmap — Henry John Nueva
Whatever size business you run, it is important to remember that learning is an ongoing experience. This applies as much to the upper management of the business as the employees.
It follows that training should also be a part of the company's day-to-day business activities. Of course, employees who are motivated and keen to see the business succeed will often take new ideas that they come across during the course of their work, and will sometimes be in a position to make suggestions for improvement that can benefit the company's bottom line. Check this out!
This presentation is based on the Tuning process in education. The module was presented during training for university faculty. It explains how to apply Tuning at the course, degree, and programme levels.
2016-12-07 Development of a Project/Problem Based Learning Body of Knowledge ... — Yoshiki Sato
Our main goal in this study was to address the difficulty of facilitating problem-solving learning in schools, so that all facilitators can carry out effective facilitation of Problem/Project Based Learning (PBL) at a consistent level of quality.
This paper discusses a "Project/Problem Based Learning Body of Knowledge (PBLBOK)" that was developed to enable facilitation suitable for learning scenarios.
Drawing on project management methods, we classify the causes of learners falling into difficult situations, and define the viewpoints, processes, and intermediate artifacts (deliverables) of PBL in developing PBLBOK.
We then describe how we organized the knowledge to facilitate PBL.
PBLBOK provides viewpoints, processes, and deliverables of facilitation, and also provides viewpoints for the evaluation of PBL by referring to the project management framework.
We found that teachers could efficiently and effectively facilitate and evaluate PBL by using PBLBOK.
Yoshiki Sato, Atsuo Hazeyama, Youzou Miyadera: "Development of a Project/Problem Based Learning Body of Knowledge (PBLBOK)", Proc. 2016 IEEE 8th International Conference on Engineering Education (ICEED 2016), pp. 189-194, Kuala Lumpur, Malaysia, 2016.
Similar to Train the Trainer: Evaluating Training (20)
This document provides guidance on sending thank you letters after a job interview. It discusses who should receive a thank you, when to follow up, and tips for thank you emails and letters. The key points are:
1) Send a thank you note within 24 hours of the interview to each interviewer to stand out from other candidates and reaffirm your interest.
2) Thank you emails are usually best as they allow for a fast response, easy customization, and proofreading. Letters are better if building rapport, for small employers, or when a decision will take over 2 weeks.
3) Thank you communications should thank the interviewer, provide examples from the discussion, and confirm continued interest in the
Barbara Hauck-Mah has over 20 years of experience in library management, programming, and community partnership. She has led the demolition, relocation, and construction of a new $3.2 million public library building in Pennsylvania. As the founder and CEO of Smart Art, she created a $1 million photo-insert national greeting card company. Currently she teaches job search and resume writing skills to adults in Philadelphia.
Celebration of Successful Library ProgramsBarb Hauck-Mah
The document lists various fall events happening at the Loodi Memorial Library in 2012, including visits from the Columbus School, weekly Book Buddies afterschool programs, Community Partnership events in fall 2012, Star Wars Reads Day on October 6th with storytime, Fright Night Partnerships on October 20th and Fright Night at the library on October 20th. It also lists a Post Superstorm Sandy ALA Games Day, afterschool craft programs, toddler and preschool storytimes, a Friends BCCLS Breakfast in fall 2012, a Diwali celebration on November 2nd, an Ultimate Videogaming event for middle and high school ages on November 2nd, and a learn to knit and weave workshop
This document discusses green library partnerships and programs that libraries can implement through community collaborations. It provides tips for developing partnerships and outlines examples of easy, green programs libraries have implemented, such as organic food events, composting demonstrations, community paper shredding events, book recycling programs, and sneaker recycling drives. These partnership ideas are meant to enhance environmental education, have a positive environmental impact, and promote sustainable practices over time. Contact information and online resources are also included to help libraries get started with their own green partnership programs.
The Public Library as Community Disaster Recovery CenterBarb Hauck-Mah
This document outlines how libraries can prepare for emergencies by developing a Service Continuity Plan. It discusses Project OPAL, a pilot program that provides a template for a one-page disaster plan. The West Deptford Public Library shared their experience customizing the OPAL template into a Service Continuity Plan. Key aspects of an effective plan include establishing service priorities, roles and responsibilities, communication procedures, and collection rescue plans. The presentation emphasizes the importance of training staff, conducting drills, and regularly reviewing and updating the plan.
This document discusses how LinkedIn can be used beyond just job hunting. It notes that LinkedIn has over 100 million US members and 2 new members join every second. It recommends joining groups on LinkedIn as a way to connect with others, find professional development opportunities, and stay informed on topics of interest. The document provides examples of groups for librarians and other professionals as well as tips on using LinkedIn for networking, getting recommendations, finding mentors, and following thought leaders in your industry.
This document summarizes a partnership between the Rockaway Township Library and Dover Public Library to provide English language learning programs for adult learners. The programs included an ESL tutoring program in partnership with Literacy Volunteers, computer skills classes taught at the Rockaway Library, and English conversation meetings at the Dover Library. Over 300 adult English learners participated in the programs. Survey results found that participants' English and computer skills improved significantly. One student obtained a job due to the new technical skills learned in the classes.
This document provides an overview of a job interviewing workshop that teaches attendees the 4 P's of successful interviewing: preparation, presentation, persuasion, and practice. It covers topics like identifying strengths, researching the company, dressing appropriately, answering common interview questions, asking questions of the interviewer, discussing weaknesses, handling illegal questions, and following up after the interview. The workshop aims to help job seekers learn how to present themselves positively and persuade interviewers that they are the best candidate.
This document discusses a job readiness skills training program in New Jersey public libraries. It received a $5.1 million federal grant to provide resources like 938 new computers, faster internet access, online job resources, and job training classes for 12,800 residents across 365 libraries. The program goal is to help 445,000 job seekers by offering online and in-person training on skills like computer use, resume writing, and internet job searching to address high unemployment rates in New Jersey during the economic downturn of the late 2000s. Over 2700 students have participated in 800 library-hosted classes through partnerships with community colleges so far.
Barb Hauck-Mah, Director of Lodi Memorial Library, took a delegation trip to China from September 23 to October 3, 2011 to tour public and university libraries. She met with librarians from the Beijing Normal University Library and National Library of China in Beijing, and the Guizhou Provincial Library and Guiyang Medical University Library in Guiyang. Key learnings included the growth of ebooks in China, the lack of school libraries and programming in public libraries, limited technology updates due to funding, and mandatory retirement ages of 55 for women and 60 for men. Highlights of site visits included the National Library of China's new building and collections, as well as tours of the Guizhou Prov
This document summarizes a computer literacy program for English language learners run by the Rockaway Township Library. The program requires partnerships with literacy organizations, weekly computer access, instructors and volunteers. Lessons and materials are provided in English. The 10-week program teaches basic computer, email and internet skills, as well as Microsoft Office. Students saw improvements in English fluency and computer skills, with many gaining jobs or enrolling in further education. The librarian provides contact information and lessons learned to help others start similar programs.
This document discusses a literacy program for English language learners at libraries. It notes that 14% of adults in the US lack basic English skills, including many recent immigrants. The literacy program provides $5,000 grants to 75 public libraries in 24 states to develop literacy programs and resources for English language learners. The programs focus on developing collections, outreach, teaching literacy and citizenship skills, and sharing resources. Examples of programs mentioned include one-on-one English tutoring, a bookmobile, and citizenship classes.
The Caldwell Public Library director's report summarizes the library's 2013 highlights and upcoming MakerSpace initiative. In 2013, the library saw 65,876 visitors and circulated over 48,000 items. It will launch a mobile MakerSpace in 2014 to preserve local history through oral interviews and a student project, made possible by a $3,750 grant. The MakerSpace aims to share collected information through a library website and serve as a model for other libraries.
This document discusses how libraries can form green partnerships with other organizations in their community to promote environmental awareness and sustainability. It provides tips for developing successful partnerships, examples of "easy to be green" programs libraries have implemented through partnerships, such as organic food events, composting demonstrations, community paper shredding events, materials swaps, and recycling programs. The document also shares resources for libraries to find more information on starting their own green partnership programs.
Carrer goals.pptx and their importance in real lifeartemacademy2
Career goals serve as a roadmap for individuals, guiding them toward achieving long-term professional aspirations and personal fulfillment. Establishing clear career goals enables professionals to focus their efforts on developing specific skills, gaining relevant experience, and making strategic decisions that align with their desired career trajectory. By setting both short-term and long-term objectives, individuals can systematically track their progress, make necessary adjustments, and stay motivated. Short-term goals often include acquiring new qualifications, mastering particular competencies, or securing a specific role, while long-term goals might encompass reaching executive positions, becoming industry experts, or launching entrepreneurial ventures.
Moreover, having well-defined career goals fosters a sense of purpose and direction, enhancing job satisfaction and overall productivity. It encourages continuous learning and adaptation, as professionals remain attuned to industry trends and evolving job market demands. Career goals also facilitate better time management and resource allocation, as individuals prioritize tasks and opportunities that advance their professional growth. In addition, articulating career goals can aid in networking and mentorship, as it allows individuals to communicate their aspirations clearly to potential mentors, colleagues, and employers, thereby opening doors to valuable guidance and support. Ultimately, career goals are integral to personal and professional development, driving individuals toward sustained success and fulfillment in their chosen fields.
This presentation by Tim Capel, Director of the UK Information Commissioner’s Office Legal Service, was made during the discussion “The Intersection between Competition and Data Privacy” held at the 143rd meeting of the OECD Competition Committee on 13 June 2024. More papers and presentations on the topic can be found at oe.cd/ibcdp.
This presentation was uploaded with the author’s consent.
Why Psychological Safety Matters for Software Teams - ACE 2024 - Ben Linders.pdfBen Linders
Psychological safety in teams is important; team members must feel safe and be able to communicate and collaborate effectively to deliver value. It is also necessary for building long-lasting teams, since setbacks will happen and relationships will be strained.
But, how safe is a team? How can we determine if there are any factors that make the team unsafe or have an impact on the team’s culture?
In this mini-workshop, we’ll play games for psychological safety and team culture utilizing a deck of coaching cards, The Psychological Safety Cards. We will learn how to use gamification to gain a better understanding of what’s going on in teams. Individuals share what they have learned from working in teams, what has impacted the team’s safety and culture, and what has led to positive change.
Different game formats will be played in groups in parallel. Examples are an ice-breaker to get people talking about psychological safety, a constellation where people take positions about aspects of psychological safety in their team or organization, and collaborative card games where people work together to create an environment that fosters psychological safety.
This presentation by OECD, OECD Secretariat, was made during the discussion “Artificial Intelligence, Data and Competition” held at the 143rd meeting of the OECD Competition Committee on 12 June 2024. More papers and presentations on the topic can be found at oe.cd/aicomp.
This presentation by OECD, OECD Secretariat, was made during the discussion “Pro-competitive Industrial Policy” held at the 143rd meeting of the OECD Competition Committee on 12 June 2024. More papers and presentations on the topic can be found at oe.cd/pcip.
This presentation by Nathaniel Lane, Associate Professor in Economics at Oxford University, was made during the discussion “Pro-competitive Industrial Policy” held at the 143rd meeting of the OECD Competition Committee on 12 June 2024. More papers and presentations on the topic can be found at oe.cd/pcip.
XP 2024 presentation: A New Look to Leadership (samililja)
Presentation slides from the XP 2024 conference in Bolzano, Italy. The slides describe a new view of leadership and combine it with anthro-complexity (aka Cynefin).
This presentation by Katharine Kemp, Associate Professor at the Faculty of Law & Justice at UNSW Sydney, was made during the discussion “The Intersection between Competition and Data Privacy” held at the 143rd meeting of the OECD Competition Committee on 13 June 2024. More papers and presentations on the topic can be found at oe.cd/ibcdp.
The importance of sustainable and efficient computational practices in artificial intelligence (AI) and deep learning has become increasingly critical. This webinar focuses on the intersection of sustainability and AI, highlighting the significance of energy-efficient deep learning, innovative randomization techniques in neural networks, the potential of reservoir computing, and the cutting-edge realm of neuromorphic computing. This webinar aims to connect theoretical knowledge with practical applications and provide insights into how these innovative approaches can lead to more robust, efficient, and environmentally conscious AI systems.
Webinar Speaker: Prof. Claudio Gallicchio, Assistant Professor, University of Pisa
Claudio Gallicchio is an Assistant Professor at the Department of Computer Science of the University of Pisa, Italy. His research involves merging concepts from Deep Learning, Dynamical Systems, and Randomized Neural Systems, and he has co-authored over 100 scientific publications on the subject. He is the founder of the IEEE CIS Task Force on Reservoir Computing, and the co-founder and chair of the IEEE Task Force on Randomization-based Neural Networks and Learning Systems. He is an associate editor of IEEE Transactions on Neural Networks and Learning Systems (TNNLS).
This presentation by OECD, OECD Secretariat, was made during the discussion “The Intersection between Competition and Data Privacy” held at the 143rd meeting of the OECD Competition Committee on 13 June 2024. More papers and presentations on the topic can be found at oe.cd/ibcdp.
Suzanne Lagerweij - Influence Without Power - Why Empathy is Your Best Friend...Suzanne Lagerweij
This is a workshop about communication and collaboration. We will experience how we can analyze the reasons for resistance to change (exercise 1) and practice how to improve our conversation style and be more in control and effective in the way we communicate (exercise 2).
This session will use Dave Gray’s Empathy Mapping, Argyris’ Ladder of Inference and The Four Rs from Agile Conversations (Squirrel and Fredrick).
Abstract:
Let’s talk about powerful conversations! We all know how to lead a constructive conversation, right? Then why is it so difficult to have those conversations with people at work, especially those in powerful positions that show resistance to change?
Learning to control and direct conversations takes understanding and practice.
We can combine our innate empathy with our analytical skills to gain a deeper understanding of complex situations at work. Join this session to learn how to prepare for difficult conversations and how to improve our agile conversations in order to be more influential without power. We will use Dave Gray’s Empathy Mapping, Argyris’ Ladder of Inference and The Four Rs from Agile Conversations (Squirrel and Fredrick).
In the session you will experience how preparing and reflecting on your conversation can help you be more influential at work. You will learn how to communicate more effectively with the people needed to achieve positive change. You will leave with a self-revised version of a difficult conversation and a practical model to use when you get back to work.
Come learn more on how to become a real influencer!
This presentation by Juraj Čorba, Chair of OECD Working Party on Artificial Intelligence Governance (AIGO), was made during the discussion “Artificial Intelligence, Data and Competition” held at the 143rd meeting of the OECD Competition Committee on 12 June 2024. More papers and presentations on the topic can be found at oe.cd/aicomp.
This presentation by Professor Alex Robson, Deputy Chair of Australia’s Productivity Commission, was made during the discussion “Competition and Regulation in Professions and Occupations” held at the 77th meeting of the OECD Working Party No. 2 on Competition and Regulation on 10 June 2024. More papers and presentations on the topic can be found at oe.cd/crps.
This presentation by Yong Lim, Professor of Economic Law at Seoul National University School of Law, was made during the discussion “Artificial Intelligence, Data and Competition” held at the 143rd meeting of the OECD Competition Committee on 12 June 2024. More papers and presentations on the topic can be found at oe.cd/aicomp.
3. Lesson Objectives
You will be able to:
• Identify 4 levels of evaluation
• Apply methods for each level
• Develop an evaluation strategy
5. Why evaluate?
• Ensure objectives are achieved
• Reinforce key program points
• Assess value of training
• Identify areas needing improvement
• Sell program
• Identify appropriate audience for future
6. When to evaluate?
• During training
• End of training
• Back at job/home
7. Who is involved?
• Trainer
• Learners
• Coworkers
• Manager
22. Apply to your lesson
Take a few minutes
• Find “Levels of Evaluation” Worksheet (p 7-10)
• Choose one evaluation level
• Jot down some methods for your plan
Table 7.1 (p 7-3) and Table 7.2 (p 7-7)
23. Group Exercise
• Find a partner
• Share your evaluation plan
• Pick one to illustrate
• Present to group
Objective, level, method
24. Develop an Evaluation Strategy
1) Create a Plan
• Pick an evaluation level
• Identify information to be collected
• Select method
25. Develop an Evaluation Strategy
2) Collect and Analyze Information
• Design an instrument to collect data
Search online for “sample training evaluation forms”
• Decide how to analyze data (see the sketch below)
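Here is a minimal sketch, in Python, of one way to handle the analysis step. It assumes (purely for illustration) that the closed-ended responses have been exported to a CSV file named evaluation_responses.csv, with one column per question and ratings from 1 to 5; neither the file nor its layout comes from the coursebook.

    import csv
    from statistics import mean

    # Load all responses; each row maps a question to one learner's rating.
    with open("evaluation_responses.csv", newline="") as f:
        reader = csv.DictReader(f)
        rows = list(reader)
        questions = reader.fieldnames or []

    # Report the average 1-5 rating and response count per question.
    for question in questions:
        ratings = [int(row[question]) for row in rows if row[question].isdigit()]
        if ratings:
            print(f"{question}: average {mean(ratings):.2f} (n={len(ratings)})")

Even a tally this simple makes it easy to spot which parts of a session rated well and which need rework.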
26. Some Tips on Form Design
• Keep it short
• Tie questions to objectives
• Focus on actionable items
• Mainly closed-ended questions
• Use open-ended questions to find gaps
• Make it anonymous
27. Develop an Evaluation Strategy
3) Present Evaluation Results
• Highlight lessons learned
• Present results with recommendations to management
31. Best of Luck with your Presentations!
Contact Information:
Joan Serpico
Manager Special Projects, Mount Laurel Library
jserpico@mtlaurel.lib.nj.us 856-234-7319 x314
Barb Hauck-Mah
Director, Lodi Memorial Library
hauckmah@bccls.org 973-365-4044 x7
Editor's Notes
Hi. My name is Joan Serpico and I am the Manager of Special Projects at Mount Laurel Library. I graduated from TTT in 2004 and I am happy to continue to be involved in it today. My experience with TTT has been very positive – not only did it improve my training, but I got to know dozens of smart, helpful people in my field whom I have come to lean on for ideas and enjoy working with on projects. It has been a great experience for me.
Hi. I’m Barb Hauck-Mah, Director of Lodi Memorial Library, a public library in Bergen County. I graduated from TTT in 2009 and share Joan’s enthusiasm for the value of this special program.
At this point you are probably all getting a little more nervous as your presentations get closer. (That is how we all felt.) But I guarantee you that you will be so happy that you took this training. Do you sense that already?
Well, like all of the trainers, we are here to help you make your future training better. This section is Evaluating Training.
Carol talked about evaluating your training in her section on lesson plans and course objectives. It’s good to think about perfecting your technique throughout your planning (timing, clarity, etc.)
This section focuses on evaluating your training based on the impact that the training has on your learners. What are they taking away from your training? We’ll talk about how you can determine this.
Though Barb and I will be sharing some thoughts and techniques, I know you have some experiences with these and I hope you will share them as we go along. (By now you realize that some of the most valuable things you will learn at TTT come from your fellow attendees.)
By the end of this lesson, you will be able to evaluate your training.
Specifically, you will be able to …
Identify four levels of evaluation
Identify evaluation methods that can be used for each level
Develop an evaluation strategy
I’ll be keeping the floor open to hear what methods you have already used and what has worked and what hasn’t – especially what hasn’t. (These make for valuable learning opportunities and the best stories.)
The most important question is why bother? It is extra work. Why not present some flawless training, take a deep breath and celebrate?
Barb/Joan is going to help record why we would decide to bother with this.
(Barb/Joan records answers from the audience on the flipchart.)
Let’s see how many we identified and if there are any extras here. (Instructor to underline answers on flipchart that correspond to items in list.)
To ensure training objectives are achieved
To review & reinforce key program points for learners
To assess value of training
To identify areas needing improvement
To sell program to management & learners
To identify appropriate audience for future
OK. Those seem to indicate it would be worth some time to do this step. Let’s look at when we would evaluate…
When you evaluate depends on what level and method you select. You may evaluate during training, at the end of the session, and later on.
Next, let’s talk about who would be involved in the evaluating...
Who are some of the people who can be helpful in the evaluation?
Remember, rather than judging you, the evaluation process is designed to give you information to improve future training.
So we know why, when and who. What kind of evaluating are we going to do?
Donald Kirkpatrick introduced a model for evaluating training in 1959.
It has become the standard for trainers.
In Donald Kirkpatrick’s model, Level 1, the Reaction level, tells how learners felt about the training. Level 2, the Learning level, evaluates the new skills the learners have acquired. Level 3, the Behavior level, looks at how the learners’ behavior has changed either on the job or at home. Level 4, the Results level, looks at the organizational impact of the training.
Let’s take a closer look at each level and some methods we can use for each. We’ll apply a training example together for each level and then later, you will have some time to apply a level and a method for your practice training.
The training example we will use throughout our discussion of evaluation levels will be an eReaders class for Library patrons – how to download our ever popular ebooks onto various devices. How many of you are offering some kind of training for your patrons on downloading ebooks? What kind of training is it? How is it going? Throughout the explanation of evaluation, we will imagine that we are planning an evaluation of a class on downloading ebooks. Let’s get started examining the different levels.
The reaction evaluation measures how the learner liked the training. How did they feel about it?
It is subjective and does not evaluate what they have learned but how they felt about it.
This is done right after the training sessions. (Make sure to leave time for learners to complete the evaluation.)
It can be done via an evaluation form (also known as a “smile sheet”) or an interview.
There are two evaluation samples in your course book that are examples of Level 1 evaluation. They are located on pages 7-5 and 7-6.
How many of you have filled out evaluations like this before? How many of you have used these before in your own training? What do you think?
Let’s imagine that we were giving a training session on downloading ebooks for patrons. Take a look at the sample level 1 evaluation forms in your coursebooks. Which of these could be adapted to be used for this ereader training? (Instructor waits for responses.)
Right. Either of them could be used. Feel free to adapt any of these samples for your own training.
An alternative to paper evaluation forms is a free online evaluation survey from websites such as Survey Monkey or Zoomerang.
The survey can be emailed as a link, embedded on a webpage, or sent via an app as soon as training is completed.
How many of you have filled out an online evaluation before? How many have used online evaluations in your own training? What do you think?
Most survey websites have evaluation templates you can customize. Both multiple-choice and open-ended questions can be included.
We’ve provided a handout on popular online survey sites to get you started.
The Learning evaluation level measures what the learners are now able to do or what they know as a result of the training. In short, what have they learned? It can be either subjective or objective, and it can take place during training or before/after training, as in a pre-test/post-test.
Evaluating participants’ learning can take place through observation of how they are applying their skills – role plays, case studies, exercises. Interviews can also gather this kind of information.
Who has used this level of evaluation before? How has it worked for you?
What would be a way that we could apply Level 2 (Learning) evaluation to the eReaders training example? Feel free to look at Table 7-2 (Level 2 Evaluation Methods) on page 7-7 for some ideas. (Instructor to wait for responses.) Possible answers: exercises, observation.
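To make the pre-test/post-test idea concrete, here is a minimal sketch in Python; the learner names and scores are invented for illustration, not drawn from the coursebook.

    # Paired pre-test and post-test scores (out of 100) for each learner.
    pre_scores = {"Learner A": 55, "Learner B": 70, "Learner C": 40}
    post_scores = {"Learner A": 80, "Learner B": 85, "Learner C": 75}

    # A positive gain suggests the session taught something new.
    gains = {name: post_scores[name] - pre_scores[name] for name in pre_scores}
    for name, gain in sorted(gains.items()):
        print(f"{name}: {gain:+d} points")
    print(f"Average gain: {sum(gains.values()) / len(gains):.1f} points")

The average gain gives you a single Level 2 number you can track from one session to the next.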
The Behavior evaluation level measures how the learners actually apply what they have learned as a result of the training. In short, how are they performing differently? How have they transferred what they have learned to the work or home situation?
It is usually done a few months after the training.
Evaluating behavior can be done through observation, interviews or via surveys.
When observing, it makes sense to see how the learners behaved during training vs after. A checklist is a good idea to maintain consistency in observations.
You could also survey or interview the learners or the people that they work with to see how behavior has changed.
Has anyone ever done this kind of evaluating?
How could we apply this level of evaluation to our ereaders example to determine if the training has altered behavior? (Sample answers – interview or survey learners about numbers of ebooks downloaded.)
Level 4 evaluation (Results) determines the impact of the training on the organization.
It is performed by the organization.
It may not be appropriate for all training (like the ereader training). I mention it mostly so you are aware of it.
If your organization wanted to evaluate a Customer Service training, for example, they might compare the customer complaints gathered before and after the training.
The organization could do an analysis of pre- and post-training statistics or a cost-benefit analysis to see the impact of training on goals.
(Examples of items to measure include absenteeism, sales, and turnover rate.)
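As a hedged illustration of what that pre/post analysis might look like, here is a minimal sketch in Python; every figure (complaint counts, training cost, value per avoided complaint) is invented for the example.

    # Monthly customer complaints, three months before and after training.
    complaints_before = [42, 38, 45]
    complaints_after = [30, 27, 24]

    avg_before = sum(complaints_before) / len(complaints_before)
    avg_after = sum(complaints_after) / len(complaints_after)
    change = (avg_after - avg_before) / avg_before * 100
    print(f"Avg monthly complaints: {avg_before:.1f} -> {avg_after:.1f} ({change:+.1f}%)")

    # A very rough cost-benefit estimate, using assumed dollar figures.
    training_cost = 5000.0        # assumed total cost of the program
    value_per_complaint = 150.0   # assumed cost of handling one complaint
    annual_benefit = (avg_before - avg_after) * 12 * value_per_complaint
    print(f"Estimated benefit/cost ratio: {annual_benefit / training_cost:.2f}")

A ratio above 1 suggests the training more than paid for itself, though a real organization would of course use real figures and more careful attribution.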
Going back to Donald Kirkpatrick’s model as a review: Level 1, the Reaction level, tells how learners felt about the training. Level 2, the Learning level, evaluates the new skills the learners have acquired. Level 3, the Behavior level, looks at how the learners’ behavior has changed either on the job or at home. Level 4, the Results level, looks at the organizational impact of the training.
Does that make sense to everyone?
Now let’s look at some evaluation methods and make sure we understand what level of evaluation they measure. Please turn to the worksheet on page 7-9. It’s titled “Determining Levels of Evaluation.” Take a few minutes to complete it and then compare your answers with a partner.
Take about 10 minutes to do this and then we’ll go over these briefly as a group.
After 10 minutes when all are finished with the worksheet and comparing with a partner.
Did anyone have any disagreements about the levels? (If so, go over those first. If not, review answers.)
Great! Do you feel like you have an understanding of the levels, what they determine and how the methods can be used to measure this?
Now let’s apply these methods to your training. Go to page 7-10 “Levels of Evaluation” worksheet. List your lesson objective and choose which level of evaluation you wish to use. Then jot down a few possible methods of evaluation that would work for you at that level. Use Tables 7.1 and 7.2 as “cheat sheets.” Take about 5-10 minutes for this.
Great. Let’s take a few minutes to share our ideas with each other. Find your partner and discuss your choices. Feel free to share your experiences with these levels and different methods and how they have worked for you already.
Choose one method between the two of you and illustrate it on a piece of flipchart paper. When we come together, one of you will describe your partner’s objective, chosen level of evaluation and method and why it was chosen.
Here is an example of what I’m talking about. We talked about how a survey would be one Level 3 evaluation method for our eReaders training evaluation. Here is an artistic rendering of that. (Instructor shows flipchart paper example.) The trainer is sending an email to the learners about 3 months after an eReaders class to ask about their current ebook downloading habits.
Let’s take about 10-15 minutes for this. We’ll be around to answer questions, if needed.
The next steps are outlined in How to Create an Evaluation Strategy found on page 7-11. Let’s take a look at that.
First: Develop a Plan
Pick an evaluation level
Decide what information needs to be collected
Decide on a method
(We’ve done all that.)
Once you have an evaluation plan, you will need to design an instrument.
You do not need to do this from scratch. Feel free to borrow one and personalize it so it works for your training. Sample evaluation forms can be found in your coursebook.
When designing an evaluation form, here are a few tips:
Keep the form simple and short. A page or two that could be completed in a few minutes is recommended.
Focus questions on aspects of the training that could be modified.
Offer a mix of rating, multiple choice and open ended questions. This approach may reveal issues or strengths you hadn’t identified.
If possible, make the evaluation anonymous to ensure honesty. Or offer identification as an option.
Finally, you will collect the data, analyze it, and you may decide to share your results with colleagues or your supervisor. All of this is designed to help improve your training.
Presenting your evaluation findings and describing how you plan to adjust your training as a result can go a long way toward convincing your managers and administration of the value of the training and its relevance to your learners.
As we have discussed, evaluation is a crucial part of improving your training. Your Evaluation Strategy involves deciding what you want to evaluate, picking an appropriate level and method, designing an instrument to collect the information, and analyzing and presenting your findings.
Evaluation is simply part of the ongoing cycle of the training process, one that allows you to be the best trainer you can be.
To that end, Joanne is up next to talk about practice training.
Please feel free to contact us if we can ever be of help now or in the future.