Evaluation Guide

When you know what you’re doing will achieve your goals, it’s time to evaluate and continuously improve!
I wrote this evaluation guide to simplify and standardise evaluation processes for teams, strategic projects, and learning programs.

Document Transcript

Workforce Development Evaluation Guide
Version: 2012.03.19
Last released on: 17/04/2012
Last released by: Dain Sanyë, Senior Consultant, Evaluation and Continuous Improvement
Document title
Evaluation Guide

Document information
This document was produced for the Government of South Australia (SA), Department for Education and Child Development (DECD), Human Resources and Workforce Development. This document was created by Dain Sanyë, Senior Consultant, Evaluation and Continuous Improvement for Workforce Development, using Microsoft Word 2010.

Purpose
This guide describes the evaluation framework created for the Department's Workforce Development directorate.

Audience
This document is designed as a guide for Workforce Development officers and managers to effectively evaluate and report on projects and programs. Evaluators should be familiar with:
• the layout of a personal computer and desktop
• the purpose, creation and analysis of data and survey tools
• the Department's Improvement and Accountability framework (DIAf)
This guide is written in international English.

Contact details
Dain Sanyë, Senior Consultant, Evaluation and Continuous Improvement
Phone: (08) 820 41402
Fax: (08) 8206 4200
Email: Dain.Sanye3@sa.gov.au

Copyright notice
Copyright ©2012 Government of South Australia, Department for Education and Child Development. This publication is copyright and contains information which is the property of the Department for Education and Child Development. No part of this document may be copied or stored in a retrieval system without the written permission of the author or the Chief Executive of the Department for Education and Child Development.
Related files
• Checkbox survey tool < http://www.decssurveys.sa.edu.au/online/ >
• DIAf < http://www.decd.sa.gov.au/quality/pages/quality/26420/ >

Update plan
Updates are the responsibility of the current Workforce Development Senior Consultant, Evaluation and Continuous Improvement. Updates must occur when scheduled as a part of ongoing role responsibility or as requested by a Workforce Development line manager.
Updates must include an incrementally increased version number in the format yyyy.mm.dd.A (year.month.day.draft version of update).
Old versions must be moved to an archive location. The current version must maintain the same file name and be stored and accessible from a single fixed network location. Shortcuts should be created at additional locations linking to the single fixed network storage location.

Revision history
Please destroy any printed copies of this document earlier than version 2012.03.19.

Version        Reviser       Details
2012.02.22.A   Dain Sanyë    Initial draft
2012.03.19     Dain Sanyë    Initial document released
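As an illustration of the version-numbering convention above, the following minimal sketch (not part of the original guide; the function name is hypothetical) checks whether a proposed version string matches the yyyy.mm.dd or yyyy.mm.dd.A format.

```python
import re

# Hypothetical helper: checks a version string against the guide's
# yyyy.mm.dd convention, with an optional trailing letter marking a draft.
VERSION_PATTERN = re.compile(r"^\d{4}\.\d{2}\.\d{2}(\.[A-Z])?$")

def is_valid_version(version):
    """Return True if the string matches the yyyy.mm.dd[.A] format."""
    return bool(VERSION_PATTERN.match(version))

if __name__ == "__main__":
    print(is_valid_version("2012.03.19"))    # True  (released version)
    print(is_valid_version("2012.02.22.A"))  # True  (draft version)
    print(is_valid_version("19/04/2012"))    # False (wrong format)
```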
Contents
Document title
Document information
Purpose
Audience
Contact details
Copyright notice
Related files
Update plan
Revision history
1.0 Introduction
2.0 Background
3.0 Evaluative review of a workgroup or project team
  DIAf corporate self-review snapshot survey
  Improvement principles
  Improvement principle criteria
    Focus on core business
    Think systemically
    Share leadership
    Attend to culture
    Listen and respond
    Make data count
    Set directions
    Target resources
    Continuously improve
  Principle analysis
    Aggregate score
    Graphing foci scores
  Collaborative discussion
3.A Appendix: Example—DIAf corporate self review survey
4.0 Strategic evaluation of a project
  Objectives
  Deliverables
  Strategic plans and alignment
  Evaluation measures
  Focus areas
    Focusing on development or improvement
    Designed and implemented with systemic thinking
    Enables individual leadership
    Attentive to organisational culture while transforming capacity
    Managed stakeholders through effective listening and responsiveness
    Effectively collated, used and reported data
    Maintained direction and scope throughout the project
    Innovatively and effectively aligned resources
    Building a sustainable future including continuous improvement
  Scoring
    Undeveloped
    Developing
    Functioning
    Strategic
    Embedded
4.A Appendix: Example—strategic project evaluation
5.0 Evaluation of an ongoing program, with a focus on knowledge and skills transfer
  Level 1—evaluation of motivation and reaction
  Level 2—evaluation of learning content
  Level 3—evaluation of performance and behaviour
  Level 4—evaluation of impact and results
A.0 Appendix: Level 1 sample survey questions
  A.1 Level 1—About the training needs analysis
  A.2 Level 1—About the training
  A.3 Level 1—About the participant
  A.4 Level 1—Individual learning needs
  A.5 Level 1—Business needs
  A.6 Level 1—Core skill needs
B.0 Appendix: Level 2 sample survey questions
  B.1 Level 2—Participant reaction and learning outcomes introduction
  B.2 Level 2—General feedback
  B.3 Level 2—Training methods and materials
  B.4 Level 2—Trainer(s)
  B.5 Level 2—Tests and qualifications
  B.6 Level 2—Progress to other learning
  B.7 Level 2—Pre-training activities and instructions
  B.8 Level 2—Facilities, courses and resources
  B.9 Level 2—Facilities and administration
C.0 Appendix: Level 3 sample survey questions
  C.1 Level 3—Job performance impact
  C.2 Level 3—Relevance of the training
  C.3 Level 3—Application of learning
  C.4 Level 3—Post-training skills observation
  C.5 Level 3—Core skills improvement
  C.6 Level 3—Job-specific skills evaluation
  C.7 Level 3—Training objectives
D.0 Appendix: Level 4 sample survey questions
  D.1 Level 4—Business impact
  D.2 Level 4—Test results
  D.3 Level 4—Learning gain
  D.4 Level 4—Skills gain
  D.5 Level 4—Changes to business performance
  D.6 Level 4—Business performance/impact measures (fixed and open)
  D.7 Level 4—Financial impact (including ROTI)
E.0 Appendix: Additional survey questions
  E.1 Open question modifiers and extensions
  E.2 Knowledge and skills tests (question examples)
F.0 Frequently asked questions (FAQs)
  F.1 How do I get a user account to create surveys in Checkbox
G.0 Glossary
  G.1 Definitions
  G.2 Formatting convention
  G.3 Colour coding
  G.4 Hyperlinks
  G.5 Acronyms
1.0 Introduction

Evaluation is defined as the act of appraising to assess value.[1]

Evaluation identifies and promotes best practice in planning and implementation methodology, and ensures the delivery and creation of Workforce Development products and services is performed at the most efficient and productive level.

Effective evaluation of projects and programs ensures consistent, high quality services and products with strong alignment to strategic goals and objectives for all teams' work across the Workforce Development Directorate.

Evaluation also assists in the expansion and roll-over of projects to create sustainable programs by providing a clear and focussed relationship between deliverables, objectives and strategic goals as targeted outcomes and responsibilities in the management process.

Evaluation utilises descriptive foci to assist the evaluator to get into a specific frame of mind and effectively evaluate the project or program from a variety of strategic viewpoints. Project and program managers are most likely to evaluate their own projects and programs.

Evaluative foci have been described from the Department's Improvement and Accountability framework (DIAf).[2]

Some of the foci this guide will assist you to evaluate against include your project or program's:
• attention to organisational culture
• effective collation, use and reporting of data
• effective use of human, financial and physical resources
• embedding of leadership and responsibility
• focus on development and continuous improvement
• maintenance of scope and direction
• stakeholder management
• sustainability over the medium to long term
• systemic thinking and integration

The three evaluative processes described in this guide are designed to assist officers and managers strategically prepare and evaluate projects and programs, including:
• Evaluative review of a workgroup or project team
• Strategic evaluation of a project
• Evaluation of an ongoing program, with a focus on knowledge and skills transfer

[1] http://dictionary.reference.com/browse/evaluation?s=t
[2] www.decd.sa.gov.au/quality/pages/quality/26420/
2.0 Background

Prior to 2012 the Workforce Development directorate did not have a consistent framework to evaluate the Directorate's projects and programs.

Individual teams within the Directorate use a number of different and differently applied evaluation tools and historical or mandatory evaluative processes, without scheduled or regular continuous improvement analysis of their evaluative processes, effectiveness and outcomes.

The Department's enterprise Registered Training Organisation (RTO), Organisation and Professional Development Services (OPDS), use:
• national VET surveys for participants and line managers of participants
• AQTF compliance requirements

The Directorate's Quality Leadership programs use:

The Directorate's Teacher Quality projects and programs use:

The Directorate's Performance and Development projects and programs use:

The Directorate's Projects and Innovations projects and programs use:
3.0 Evaluative review of a workgroup or project team

An evaluative review of a workgroup or project team is recommended once per year; however, an evaluation may be prudent following any significant change or development within the team; for example, new leadership, new location, change in organisational strategic plan, or prior to the planning phase of a significant project.

The evaluation is designed as a self-review to identify and promote collaborative discussion on opportunities for continuous improvement and increased effectiveness that aim to improve the productivity, efficiency and capacity across the team. Analysis of the survey responses provides specific direction to improve team engagement, satisfaction, focus, responsiveness and resource use.

The evaluation encompasses the Department's nine DIAf improvement principles.

DIAf corporate self-review snapshot survey

This survey is stored within the Department's Checkbox survey tool under the following URL < http://www.decssurveys.sa.edu.au/online/selfreveiw.aspx > and is accessible through the internet to all Directorate officers. The survey is password protected with the current password "opds".

The survey details key criteria to consider as a lead into self-review of any team's performance and operations.

Team members should complete the survey individually and be encouraged to be honest, noting all responses are anonymous.[3] After all team members have been given sufficient time to comfortably respond,[4] the responses should be collated to provide data for collaborative discussion.

All of the survey's focus questions are answered on a rating scale from Strongly Disagree to Strongly Agree. This provides an opportunity within the principle analysis to index the team's aggregate score and accurately report on the team's improvement trend and recommended foci for collaborative discussion.[5]

There are no text or long answer questions in the survey; these types of comments should be raised in the collaborative discussion following the survey's completion and principle analysis.[6]

After the survey is closed it is important to allow and allocate adequate time for a collaborative discussion with all team members present at a team meeting or other dedicated meeting time.

At the collaborative discussion all members of the team are encouraged to discuss the Focus on core business principle and two other focus principles identified through the principle response analysis as the greatest opportunities to improve effectiveness within the team.

[3] Preview and analysis of the survey responses should not begin until after a sufficient number of responses have been received to ensure anonymity for the respondents.
[4] As the survey is accessible over the internet, some team members may feel more comfortable completing the survey out-of-office, after hours or at home.
[5] Indices should not be compared between teams; however, all staff within the Directorate can be anonymously surveyed at the same time to achieve a current Directorate-level index.
[6] Analysis of survey responses should be carried out by an independent and skilled analytical or evaluative Directorate officer; for example, the Directorate's Data Management and Analysis Officer or Senior Consultant, Evaluation and Continuous Improvement.
The collaborative discussion should encompass:
a) evidence used by team members to score the focus questions
b) what the next steps would be for improvement and who will be responsible for managing implementation[7]
c) ongoing self-review processes including evaluation of the implementation of the continuous improvement project
d) the data and other evidence that would be gathered to monitor and evaluate improvement over time[8]

Improvement principles

The nine DIAf improvement principles are:
1. Focus on core business
2. Think systemically
3. Share leadership
4. Attend to culture
5. Listen and respond
6. Make data count
7. Set directions
8. Target resources
9. Continuously improve

Improvement principle criteria

For each improvement principle there are four criteria[9] that can be scored on a rating scale from Strongly Agree to Strongly Disagree by each individual team member.

Focus on core business
1.1 Team goals are clear, known and used to drive decisions
1.2 The team has high service and delivery standards that result in high quality outcomes
1.3 Team members are committed to the team's goals
1.4 The team's plans, processes and practices work effectively to support team members to achieve goals

[7] Implementation of continuous improvement should be managed using the project management framework laid out in the Directorate's Information Management System (IMS) and using the PDSA (plan, do, study, act) continuous improvement framework cycle.
[8] See the chapter in this Evaluation Guide on strategically evaluating a project.
[9] Note that principle criteria contain more than one question, which may cause skewing of responses where a respondent agrees with part of the criteria and disagrees with another part. If this issue is raised, respondents should score the criteria with their lowest desired response rating to ensure the improvement principle has the best opportunity to be discussed for improvement by the team.
Think systemically
2.1 Political, system and contextual issues are identified and strategically addressed in plans and practices
2.2 Effective research and development processes enable team members to improve operations, outcomes and service delivery
2.3 Internal management processes are routinely reviewed to continuously improve operations
2.4 Effective partnerships exist with key stakeholders, other business units and professional groups to support the achievement of unit goals

Share leadership
3.1 Leaders provide clear direction and supportive leadership, and take an effective stance appropriate to the individual/situation to achieve agreed outcomes
3.2 Leadership is shared, with strategies and processes to build the leadership capacity of individuals and the leadership density of the business unit
3.3 Leaders support effective business unit management through a focus on professional learning for team members and themselves
3.4 Leaders ensure change is managed positively and successfully, with workload balance and direction sustained

Attend to culture
4.1 A positive workplace culture supports team members to work with enthusiasm, commitment and energy, and the business unit to achieve success
4.2 Team members' roles and responsibilities are clearly known and professional team interactions optimise success
4.3 Professional development and performance management processes provide team members with recognition, support and feedback to develop expertise
4.4 Culture and morale building processes effectively support positive team member interactions and address issues and concerns

Listen and respond
5.1 Quality partnerships are deliberately developed with key clients and stakeholders to achieve outcomes
5.2 Communication processes provide information to and from clients and stakeholders to improve service delivery and outcomes
5.3 Decision making structures are effective, with high levels of team member and stakeholder input, support for, and engagement in decisions
5.4 Commitment to quality service delivery and responsiveness by all team members provides high levels of satisfaction and positive client relationships
Make data count
6.1 Effective data management processes are in place to collect, store and access reliable data
6.2 Multiple measures of data are analysed and used to inform improvement directions, evaluate programs and report on outcomes
6.3 Data is used to identify root causes and variation to target improvement efforts while monitoring the effectiveness of implementation strategies
6.4 Data creates knowledge and learning for the team, organisation and system to inform decisions on development and innovation

Set directions
7.1 An explicitly stated vision, values and purpose developed in collaboration with team members drives business unit decisions, plans and directions
7.2 Planning processes build team members' capacity and expertise to achieve the vision and continuously improve outcomes
7.3 Communication, monitoring and evaluation of planning processes occurs with high levels of team member involvement and ownership
7.4 Strategic plans are integrated and enacted in daily operations to ensure strategic directions are achieved

Target resources
8.1 Effective resource management systems identify, support and develop the team's human, financial and physical resources
8.2 Resources are targeted to achieve successful outcomes, with processes in place to review resource needs and effectiveness
8.3 Assets and resources are acquired, organised and maintained to support performance
8.4 Risk management processes ensure prudent financial management, regulatory compliance and safe workplace practices

Continuously improve
9.1 Effective, known improvement processes support team members to ensure that goals are achieved and outcomes are continuously improved
9.2 Rigorous, regular self-review processes occur, with team member involvement and engagement, to monitor outcomes, evaluate progress and inform future directions
9.3 A commitment to continuous improvement is evidenced by routine policy review and development cycles, and effective document and records management
9.4 The team's members develop integrated, sustainable and systemic programs, projects and products to achieve business unit goals and respond to the emerging needs of stakeholders
Principle analysis

The principle analysis of the survey's responses should be calculated per improvement principle focus and include:
• an aggregate score of the responses to the criteria for each improvement principle (focus)
• the total number of responses and total number of possible responses (team members)
• a graph displaying the distribution of relative scores across the focus and its four criteria independently

Aggregate score

The aggregate score may:
• remove outliers—where the aggregate of one rating score is significantly inconsistent with the median or mode rating score (eg out of 25 responses: one response Strongly Disagrees while 24 responses Agree or Strongly Agree; the Strongly Disagree response may be removed from the analysis)[10]
• be reported as a definitive modal score[11]—if the modal score aggregate exceeds the sum aggregate of all other rating scores (eg out of 25 responses: 16 responses are Agree while 9 responses are other ratings; the aggregate score for this focus can be reported as Agree)
• be reported as a definitive median score[12]—if the aggregate scores per rating appear as a balanced bell curve (eg out of 25 responses: 1 Strongly Disagree, 3 Disagree, 6 Neither, 9 Agree, 6 Strongly Agree; the median score for this focus can be reported as Agree)
• be reported as a mean or average index—with each rating given a linear aggregate multiplier (eg for the above bell curve distribution, with the multiplier increasing by x1 per rating step):

Rating        Strongly Disagree  Disagree  Neither  Agree  Strongly Agree  TOTAL
(Multiplier)  (x1)               (x2)      (x3)     (x4)   (x5)
Score         1                  3         6        9      6
Aggregate     1                  6         18       36     30              91

Possible maximum: 125
INDEX = 3.64 (Agree, 72.8%)

The aggregate score for each improvement principle focus should always be calculated and reported consistently.

[10] If there is a significant cause behind the response removed from the analysis, this is expected to be raised during the collaborative discussion.
[11] The modal score is the distinct rating with the highest aggregate.
[12] The median score is the exact middle rating when all scores are accurately ordered from lowest to highest.
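As a worked illustration of the mean index above, the following sketch (not part of the original guide; the function name is hypothetical) computes the aggregate, index and percentage of the possible maximum from a tally of responses for one focus.

```python
# Minimal sketch of the aggregate index calculation described above.
# Ratings are weighted x1 (Strongly Disagree) through x5 (Strongly Agree).
RATING_MULTIPLIERS = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Neither": 3,
    "Agree": 4,
    "Strongly Agree": 5,
}

def focus_index(counts):
    """Return (mean index, percentage of possible maximum) for one focus."""
    total_responses = sum(counts.values())
    aggregate = sum(RATING_MULTIPLIERS[r] * n for r, n in counts.items())
    possible_maximum = 5 * total_responses
    return aggregate / total_responses, 100 * aggregate / possible_maximum

# The bell-curve example from the table: 25 responses, aggregate 91.
counts = {"Strongly Disagree": 1, "Disagree": 3, "Neither": 6, "Agree": 9, "Strongly Agree": 6}
index, percent = focus_index(counts)
print(round(index, 2), round(percent, 1))  # 3.64 72.8
```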
Graphing foci scores

The distributive graph should be chosen and produced to provide a highly visual discussion prompt for the collaborative discussion; for example:

[Example distribution graph of focus and criteria scores not reproduced in this transcript.]
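A chart along those lines could be produced with any charting tool; the sketch below is not from the original guide and uses matplotlib with the hypothetical response counts from the earlier example to plot the distribution for a single focus.

```python
import matplotlib.pyplot as plt

# Hypothetical response counts for one focus (the bell-curve example above).
ratings = ["Strongly Disagree", "Disagree", "Neither", "Agree", "Strongly Agree"]
counts = [1, 3, 6, 9, 6]

fig, ax = plt.subplots(figsize=(8, 4))
ax.bar(ratings, counts, color="steelblue")
ax.set_title("Focus on core business: distribution of responses")
ax.set_xlabel("Rating")
ax.set_ylabel("Number of responses")
plt.tight_layout()
plt.savefig("focus_distribution.png")  # or plt.show() for interactive use
```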
Collaborative discussion

The collaborative discussion is the most important part of the evaluation. The discussion enables team members to focus on specific areas for improvement, identify barriers and work as a team to trouble-shoot solutions to issues that impact the self-identified effectiveness of the team.

Only three foci should be discussed in this meeting: the Focus on core business principle plus two other foci recommended by the independent principle analyst as areas which have a very low aggregate score or where the aggregate score does not appear representative of a clear majority (eg out of 25 responses: 12 Strongly Disagree, 12 Agree, 1 Strongly Agree).

The collaborative discussion should occur as a dedicated meeting with no other agenda items, in a closed meeting room set up for brainstorming (whiteboards, butcher's paper, post-it notes), and with all team members present. If possible the discussion should be led by an independent facilitator.

The meeting must allow for anonymous reporting of causative factors leading to the opportunity for improvement (eg all team members may be asked to write anonymously a possible cause for the low criteria or focus score on a post-it note and pass those up to the independent facilitator, who will ensure no individuals are identified while the causes are being discussed).

The facilitator should use techniques like the 5 Whys[13] to determine root causes of the score and to provoke collaborative discussion on potential resolutions. After resolutions have been brainstormed, the facilitator should lead the team in identifying the resolution project to be recommended.

The project proposal should be developed through the project management framework, with the team brainstorming components including:
• potential risks
• implementation schedule
• stakeholders to be engaged
• project team and responsibilities
• objectives and deliverables to be achieved
• evaluation process to ensure completion and measurable, reportable achievement[14]

The improvement project manager or team should report the improvement project's ongoing status back to the whole team as an agenda item at every future normal team meeting until the project is closed.

[13] To identify the root cause of an issue, ask why the issue occurred, then ask why that cause occurred, then ask why the cause of the cause occurred, to a total of five whys.
[14] See the chapter in this Evaluation Guide on strategically evaluating a project.
3.A Appendix: Example—DIAf corporate self review survey

[Screenshots of the example DIAf corporate self-review survey are not reproduced in this transcript.]
4.0 Strategic evaluation of a project

A project is a temporary measure with distinct start and end dates comprising a time constraint to meet specific goals and objectives, and produce deliverables within a defined budget and limited resources, in order to achieve beneficial, value-adding change.[15]

A program does not have a defined end date and usually contains multiple repetitious project instances defined, for example, per annum.[16]

Projects and programs should be evaluated by the project or program manager, or a specified third party, during the:
• project concept or proposal phase
• project implementation, as required
• post-project review or closure phase

Strategic evaluation of a project or program enables evaluation against its:
• Objectives
• Deliverables
• Strategic alignment
• Evaluation measures

Project objectives align to strategic plans and goals, and are achieved and evidenced through the production of project deliverables throughout the implementation of the project. The project's deliverables are evaluated to ensure they have effectively achieved the project's objectives and strategic alignment through evidence of achievement.

This evaluation framework assists the project or program manager to evaluate their project's deliverables, objectives, strategic alignment and evaluative processes against foci developed from the Department's Improvement and Accountability framework (DIAf).

All Workforce Development projects and programs, including courses and events, should be recorded in the Directorate's information management system (IMS):
< decsgla01user3GroupsHRWorkforce DevelopmentIMSinterfacesProjectManagement.mdb >

This evaluation framework is planned to be integrated into the IMS Work Program and Project Management interface.

[15] For example, the 2012 South Australian Public Teaching Awards is a project.
[16] For example, the South Australian Public Teaching Awards is a program.
Objectives

Objectives to be achieved through the project should be recorded in the IMS.

An objective is a key benefit achievable by the project that aims to meet a strategic direction, plan or vision for the Department. Objectives are usually described in the format: To..., in a way that..., so that... Objectives have no physical substance; they are not created in a visible way.[17]

• Undeveloped: objectives resemble a brainstorming list, are vague and have no indication of change or outcomes. Example: "To improve image"
• Developing: objectives are stated broadly and infer or imply non-specific outcomes or changes in practice. Example: "To improve the image of the Department"
• Functioning: objectives are clear about the outcomes to be targeted or improved but the change is stated in broad non-specific terms. Example: "To improve the image teachers have of the Department recognising them as part of the workforce"
• Strategic: objectives are linked to a strategic objective in a plan and expressed in terms of the outcomes to be targeted or improved. Example: "To improve the image teachers have of the Department recognising them as an important part of the education and care workforce"
• Embedded: objectives are STRATEGIC, based on best practice, aligned to increase effectiveness and consistency, and embedded with a specific objective in a strategic plan. Example: "To improve the image teachers have of the Department recognising them as the part of the education workforce that provides excellence in education and care"

Please update the objectives of your project in the IMS before scoring.

[17] Compare with deliverables.
Deliverables

Deliverables to be created during the project's implementation should be recorded in the IMS.

A deliverable is a key product that will be created by the project, providing physical evidence of the project's activity.[18] Deliverables include key performance indicators (KPIs) as statistics and are used as proof the project is meeting its objectives.

• Undeveloped: no deliverables are stated and implementation does not produce any physical evidence. Example: none
• Developing: deliverables are common or historically repeated with no significant improvement evident. Example: "Hold an awards ceremony for 2012"
• Functioning: deliverables include data related to the project but lack specificity against the changes or improvements to be achieved. Example: "Hold a ceremony for 100 award winners in 2012"
• Strategic: deliverables are assigned to specific team members, are demonstrable and supported by data, and are expressed as SMART (specific, measurable, achievable, relevant, timely). Example: "The Workforce Recognition Officer will event manage the 2012 SA Public Teaching Awards ceremony for up to 100 teachers recognised and nominated in the last 12 months for outstanding teaching practice"
• Embedded: deliverables are STRATEGIC and include agreements for continuous support and improvement, with deliverables for post-project analysis and review. Example: "The Workforce Recognition Officer will event manage the 2012 SA Public Teaching Awards ceremony promoting excellence in education and care for up to 100 teachers recognised and nominated in the last 12 months for outstanding teaching practice, using a documented, up-to-date event management process including a survey of participants on the event's delivery"

Please update the planned and achieved deliverables of your project in the IMS before scoring.

[18] Compare with objectives.
Strategic plans and alignment

A link from each project objective to a specific strategic plan's objective or goal should be recorded against each project objective in the IMS. Linking objectives to a strategic direction, plan or vision is an important analysis of the value of your project.

• Undeveloped: there are a high number of competing strategies which are very broad, are management or compliance requirements, or are determined by the Department
• Developing: there are a range of broad strategy areas, based on complex targets that are a mix of management and improvement issues, already described by the Department
• Functioning: there are a number of strategies that are mainly improvements or developments, identified from strategic planning, Departmental or national objectives
• Strategic: there are a number of strategies that clearly define continuous improvement integrated with Departmental or national objectives, determined through data analysis, agreed to by staff and described against a particular field of expertise
• Embedded: there are a number of strategies that all staff agree on, from data analysis and stakeholder consultation, that are key objectives to focus on for continuous improvement, and that, expressed against particular fields of expertise, will concurrently and strategically achieve Departmental or national objectives

Please update your project's objective links to strategic plans in the IMS before scoring.
Evaluation measures

Projects managed through the IMS have measures enabling evaluation to be described, and its results recorded, against your project's objectives, deliverables and strategic links. Evaluation includes your project's ongoing status updates as well as the final project review closure report.

• Undeveloped: data may be collated but is not used to monitor progress, support status reporting or evaluate the effectiveness of the project. Example: # participants
• Developing: measures of progress are merely numbers or deliverables ticked as completed or achieved, with minimal or no evaluation of the relative change or improvement achieved. Examples: gender distribution of # participants; survey of participant interest
• Functioning: data is used throughout the project to monitor and report on the progress and achievement of targets, deliverables and objectives against strategic plans. Example: rate of registration of interest
• Strategic: specific data is regularly collated and used to monitor the progress of the continuous improvement changes throughout the project's lifecycle, with regular review of the completion of deliverables to support the achievement of objectives against strategic plans by all project team members. Example: regional and index-of-disadvantage distributions of rate of registration and achievement vs student outcomes (NAPLAN)
• Embedded: evaluation is STRATEGIC, with the analysis of multiple measures of improvement and regular collaboration and input from all project team members to evaluate the project's progress and refine and redefine the objectives and next steps towards the most effective deliverables to achieve the strategic outcomes of the project. Example: post-event survey and data retrieval quantitatively analysing changes and improvement in education and care activity

Please start your post-project review in the IMS before scoring, including:
• what worked
• what didn't work
• lessons learned
• your recommendations for future project managers
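To make the closure-report prompts concrete, here is a minimal sketch (not part of the original guide; the class and field names are hypothetical) of how the four review prompts above could be captured as a single record before being entered into the IMS.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class PostProjectReview:
    """Hypothetical record mirroring the four closure-report prompts above."""
    project_name: str
    what_worked: List[str] = field(default_factory=list)
    what_did_not_work: List[str] = field(default_factory=list)
    lessons_learned: List[str] = field(default_factory=list)
    recommendations: List[str] = field(default_factory=list)

# Example usage with placeholder content
review = PostProjectReview(
    project_name="2012 SA Public Teaching Awards",
    what_worked=["Nomination process completed ahead of schedule"],
    what_did_not_work=["Venue booking confirmed late"],
    lessons_learned=["Confirm the venue during project planning"],
    recommendations=["Reuse the documented event management process"],
)
print(review.project_name, len(review.lessons_learned))
```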
Focus areas

The following focus areas may be used to categorise the project's objectives, deliverables, strategic alignment and evaluation measures for projects where the implementation of the evidentiary deliverables to achieve the project objectives aligned with strategic plans is significantly large or complex.

Focusing on development or improvement
The project produced evidence of continuous improvement in achievement/outcomes for stakeholders, with challenging targets for ongoing improvement and quality practice, while creating an ethos with high expectations and a consistent understanding to drive, develop and improve policy, practice and performance.

Designed and implemented with systemic thinking
The project is integrated with wider systems and is committed to improving these systems to create an aligned and effective integrated system that supports the Department's continual improvement, with targets for achievement appropriate to the context of community needs and aspirations.

Enables individual leadership
The project develops leadership within the project's team members and stakeholders, enabling them to exhibit principled and visible responsibility that is shared and fostered throughout the project through effective project management expertise and capacity at all levels to achieve the project's objectives, deliverables and strategic goals.

Attentive to organisational culture while transforming capacity
The project intentionally creates a culture that involves individuals and groups in transforming the capacity of the system through a positive culture with high levels of team and stakeholder satisfaction, morale and support for individuals to grow and improve their performance.

Managed stakeholders through effective listening and responsiveness
The project enables team members and stakeholders to have an active voice and be responsive to current and future needs of the Department, using a client-focussed approach to communication, risk management, prioritisation, and organisation of the project's components and responsibilities.

Effectively collated, used and reported data
The project creates or gathers the necessary information and knowledge required from data sources, including stakeholder needs, to strategically evaluate and improve outcomes through an informed, structured and organised approach with a clear evidence base for decision-making and recommendations, as well as identifying clear drivers for continuous improvement opportunities.

Maintained direction and scope throughout the project
The project planning process, documentation and stakeholder communication explicitly state values, vision and purpose within its objectives, deliverables and strategic links, and all documentation is consistently up-to-date and available to all project team members.
Innovatively and effectively aligned resources
The project innovatively targets resources to most effectively align all physical, human and financial resources with project needs through sustainable resource management to achieve successful and efficient outcomes.

Building a sustainable future including continuous improvement
The project achieves improvement for the Department through a structured continuous improvement cycle with regular evaluation and review of existing related processes and activity.

Scoring

Scoring of your project's objectives, deliverables, alignment to strategic plans, and evaluation measures is recorded as a single rating across each component, either as a whole or categorised into focus areas:
1. Undeveloped
2. Developing
3. Functioning
4. Strategic
5. Embedded

Undeveloped
An undeveloped rating refers to objectives, deliverables, strategic alignment and evaluation measures that are undocumented, unrecorded and unable to be reviewed by an appropriate third party.

Developing
A developing rating refers to objectives, deliverables, strategic alignment and evaluation measures that are drafted, or only documented on the initial project plan, concept or proposal.

Functioning
A functioning rating refers to objectives, deliverables, strategic alignment and evaluation measures that have been or are intended to be ticked off as achieved or not achieved; however, the project's objectives or deliverables are not linked, related or aligned to strategic plans, goals or objectives.

Strategic
A strategic rating refers to objectives, deliverables, strategic alignment and evaluation measures where the project's objectives and deliverables are linked, related, aligned and described against strategic plans, goals or objectives.

Embedded
An embedded rating refers to objectives, deliverables, strategic alignment and evaluation measures where the project only includes objectives that are linked to strategic objectives; with all project objectives evidenced by deliverables (with no non-strategic, orphan objectives or deliverables); and with the project plan, implementation, delivery and closure evaluation focused on strategic achievement.
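The five-level scale lends itself to a simple numeric encoding. The sketch below is illustrative only (the enum and the component scores are not from the guide); it records one rating per evaluated component and reports the lowest-scoring component as a prompt for the next improvement cycle.

```python
from enum import IntEnum

class Rating(IntEnum):
    """The five-level scoring scale described above, encoded 1 to 5."""
    UNDEVELOPED = 1
    DEVELOPING = 2
    FUNCTIONING = 3
    STRATEGIC = 4
    EMBEDDED = 5

# Hypothetical scores for one project's four evaluated components.
project_scores = {
    "objectives": Rating.STRATEGIC,
    "deliverables": Rating.FUNCTIONING,
    "strategic alignment": Rating.STRATEGIC,
    "evaluation measures": Rating.DEVELOPING,
}

# The weakest component is a natural starting point for improvement.
weakest = min(project_scores, key=project_scores.get)
print(f"Lowest-rated component: {weakest} ({project_scores[weakest].name})")
```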
4.A Appendix: Example—strategic project evaluation

[Screenshots of the example strategic project evaluation are not reproduced in this transcript.]
5.0 Evaluation of an ongoing program, with a focus on knowledge and skills transfer

This evaluation is designed to encompass both effective training (improvement of programs) and training effectiveness (implementation of skills to improve organisational effectiveness).

The Kirkpatrick[19] model will be used as the structural framework in the evaluation of Workforce Development training programs. This model has the following four levels:
• Level 1—evaluation of feedback from participants and stakeholders
• Level 2—evaluation of physical content, resources and deliverables
• Level 3—evaluation of application and change to organisational processes
• Level 4—evaluation of strategic outcomes and measurable benefits

Levels 1–2 provide instructive feedback to program deliverers to update and improve their relationship with participants, and to project managers in the roll-over of project initiatives to ongoing programs for the future program manager.

Levels 3–4 produce a summation of strategic value and effectiveness in the implementation of a project and ongoing program delivery, reportable to senior management.

Some methods of evaluation include:
• asking for self-evaluation from participants and facilitators
• testing participants' knowledge, understanding and decision-making skill
• observing participants' performance
• examining organisational business results
• comparing social media learning with traditional learning intervention
• seeing how much productivity is lost or gained from the time required

Results from evaluations can be analysed in global categories to calculate the effectiveness of a type of training for the organisation; for example, all leadership program evaluations.

Sample evaluation survey questions are listed in Appendices A–E.

[19] www.kirkpatrickpartners.com
Level 1—evaluation of motivation and reaction

The first level of evaluation is used to determine how well your participants engage with the registration and learning process.

It is used to identify why participants have enrolled in the program, which needs in their workplace they expect the program to address to improve their working environment, and what their learning expectations are from participation in the program.

The motivation and reaction evaluation informs on how appealing, relevant and effective the learning content and delivery is to the individual participants, measuring how well the learning engagement processes and program reputation work.

Even if all the other levels are effectively covered by the program, missing level 1 can cause participants to fail to see a purpose in the learning, and they may disengage from the program.

Effective evaluation of level 1 motivation and reaction, measuring participants' engagement with the program's structure and marketing, can be collated and analysed through:
- reaction sheets
- surveys
- focus groups
- interviews

Participants may self-assess their impression of the program pre-, during and post-participation against any improvement rating scale with the following foci:
- relevant
- specific
- practical
- accessible
- social
Level 2—evaluation of learning content

The second level of evaluation is used to improve the quality and effectiveness of the learning content, structure and delivery.

It is used to update, refine and ensure best practice in the knowledge, skills and resources provided to, and developed by, participants in the facilitator's program.

The learning content evaluation informs on what participants learned and the extent to which knowledge and skills were gained by the participants, including how effectively the learning delivery transferred the content, knowledge and skills to the participants.

Even if all the other levels are effectively covered by the program, missing level 2 can cause the participants to devalue the learning and consider the program to be out-of-date and irrelevant.

Effective evaluation of level 2 learning content, measuring successful content design and development, delivery methods and achievement, can be collated and analysed through:
- pre- and post-training test results
- written knowledge tests
- role-play and simulation
- activities and games
Level 3—evaluation of performance and behaviour

The third level of evaluation is used to report on changes and increases in job performance capacity resulting from participation in the program, driving accountability, measuring effectiveness and value to the organisation, and enabling appropriate resource and support allocation.

It is used to prove the value of participation in the program by identifying changes in the way employees behave and perform in their working environment to increase the efficiency, productivity and quality of their work using newly learned skills and knowledge.

The performance and behaviour evaluation informs on the capability of participants to effectively take up and perform newly developed skills, and the degree to which learned skills and knowledge actually transfer to, and are used in, the working environment post-participation.

Even if all the other levels are effectively covered by the program, missing level 3 can cause the program to be devalued by staff and management, as they may consider the skills and knowledge learned through participation to be academic and ineffective in the real workplace.

Effective evaluation of level 3 performance and behaviour following participation in the program, measuring successful transfer and implementation of learning from the training to working environments, can be collated and analysed through:
- surveys
- interviews
- focus groups
- observations
- work reviews

Participants may self-assess changes in their working environment following participation in the program against any improvement rating scale with the following foci:
- performance
- confidence
- contribution
- engagement
- retention
- customer satisfaction
- business success
- systems use
- drivers and strategic alignment
Level 4—evaluation of impact and results

The fourth level of evaluation is used to report on the tangible organisational outcomes achieved through the implementation of skills developed by participation in the program.

It is used to demonstrate graphically the productive impact, outcomes and results of participation, showing organisational improvement.

The impact and results evaluation provides quantitative results and scoring of added organisational value, including reduced costs, improved quality, and increased production and efficiency indices, and enables calculation of return on investment (ROI) from the program (a simple worked example is sketched below).

Even if all the other levels are effectively covered by the program, missing level 4 can cause the program to have reduced priority and acknowledgement by C-level and executive management, as they may not easily identify a positive impact on the organisation's bottom line from supporting the program.

Effective evaluation of level 4 impact and results, measuring the outcomes of implementing learning to achieve described targets, can be collated and analysed through:
- borrowed metrics from other data systems, including human resources data
- surveys
- focus groups

Decision-makers prefer the results of level 4 evaluations, although not necessarily in dollars and cents. For example, a study of financial and information technology executives found that they consider both hard and soft returns when it comes to customer-centric technologies, but give more weight to non-financial (soft) metrics, such as customer satisfaction and loyalty. [20]

These evaluation results enable real business results to be connected to training programs, including:
- operational efficiency
- compliance
- retention of top talent
- customer satisfaction
- sales volume

Through analysis, these results also identify the gaps and needs of the organisation that can be filled through training and skills development.

[20] Hayes 2003
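The following minimal sketch shows one conventional way of expressing the ROI arithmetic referred to above. The cost and benefit figures are hypothetical placeholders; in practice the monetised benefits would be drawn from the level 4 measures an evaluation actually collects, and this guide does not prescribe a single ROI formula.

```python
# Sketch: a conventional return-on-investment (ROI) calculation for a program.
# All dollar figures are hypothetical placeholders.
program_costs = 40_000.00        # design, delivery, venue, participant time ($)
monetised_benefits = 55_000.00   # e.g. reduced rework, retention savings ($)

net_benefits = monetised_benefits - program_costs
roi_percent = (net_benefits / program_costs) * 100
benefit_cost_ratio = monetised_benefits / program_costs

print(f"Net benefits:        ${net_benefits:,.2f}")
print(f"ROI:                 {roi_percent:.1f}%")
print(f"Benefit-cost ratio:  {benefit_cost_ratio:.2f}:1")
```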
This level of program evaluation may be met with a more balanced approach or scorecard [21] covering four perspectives:
- Financial—a measurement that shows a monetary return or impact, such as how the output from a process is improved (financial results can be either soft or hard)
- Customer—improving an area in which the organisation differentiates itself from competitors to attract, retain, and deepen relationships with its targeted customers
- Internal—achieving excellence by improving processes such as supply-chain management, production or support processes
- Innovation and learning—ensuring learning packages support a climate for organisational change, innovation, and the growth of individuals

Evaluative results from the above scorecard are preferred by management, but may be supplemented with the more commonly provided levels 1–2 quantitative results listed below (a cost-comparison sketch follows this list).
- How many people (participants) will receive the training?
- How often can the training be repeated, and how often is it planned to be repeated? Note that online training courses can be developed once and re-used with relatively low ongoing costs, whereas the volume of financial, human and physical resources required for classroom delivery continues to increase.
- How many total hours of learning have been successfully achieved?
- A comparison of the cost of initial development (initial or single class) with the reducing cost to maintain and deliver multiple instances of the same content.
- Overall resource costs involved in preparing, running and evaluating programs, in both dollars ($) and hours.
- Do participants and/or facilitators need to travel, be accommodated, and be supplemented at the workplace while they are training?
- What proportion of programs were evaluated, and how stringent were those evaluations?
- What proportion of programs were evaluated by the program manager, versus those evaluated by an independent third-party evaluator?
- How effectively were evaluation results distributed?
- How many programs or instances were planned using or referencing evaluation results from previous programs or relevant projects before implementation?
- What evidence is there of the implementation of evaluation recommendations in the delivery of programs?

[21] Kaplan, Norton 2001
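To illustrate the initial-development versus repeat-delivery comparison in the list above, the sketch below tallies cumulative costs for a classroom program against a develop-once, re-use e-learning package over a growing number of deliveries. All dollar figures are hypothetical and exist only to show the shape of the comparison.

```python
# Sketch: comparing cumulative costs of repeated classroom deliveries with a
# develop-once, re-use e-learning package. Figures are hypothetical.
classroom_cost_per_run = 5_000.00       # facilitator, venue, catering per delivery ($)
elearning_development_cost = 20_000.00  # one-off build cost ($)
elearning_cost_per_run = 500.00         # hosting and support per delivery ($)

for runs in (1, 2, 5, 10, 20):
    classroom_total = classroom_cost_per_run * runs
    elearning_total = elearning_development_cost + elearning_cost_per_run * runs
    cheaper = "e-learning" if elearning_total < classroom_total else "classroom"
    print(f"{runs:>2} deliveries: classroom ${classroom_total:>9,.2f}  "
          f"e-learning ${elearning_total:>9,.2f}  -> {cheaper} cheaper")
```

With these placeholder figures the e-learning option breaks even at around five deliveries; the real break-even point depends entirely on the program's own costings.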
A.0 Appendix: Level 1 sample survey questions

The first level of evaluation is used to determine how well your participants engage with the registration and learning process. [22]

A.1 Level 1—About the training needs analysis
1. This training needs analysis asks about [DELETE AS APPROPRIATE] the business need for training / your individual learning needs and preferences.
2. The assessment is made up of [ADD NUMBER] questions and should take about [ADD NUMBER] minutes to complete. Please answer the questions as fully as possible.
3. Please complete the assessment by [ADD DETAILS].
4. What will happen with your responses?
   [DELETE AS APPROPRIATE]
   - Your responses will be used to help decide on the most appropriate training needed.
   - Your responses will be anonymous and feedback will be reported for all respondents in generalised form only.
   - All responses will be analysed and reported to [ADD DETAILS]. Findings are due to be reported in [ADD DETAILS].
   - A copy of the assessment report will be available from [ADD LOCATION/PARTICIPANT] on [ADD DATE].
5. Any questions?
6. If you have any questions about this assessment, please contact [ADD DETAILS].

[22] The following questions have been adapted from < www.trainingcheck.com >.
A.2 Level 1—About the training
1. What was the name of the training/activity you attended?
2. Which training event did you attend?
3. Which sessions did you attend? Choose as many as apply.
   - All sessions
   - Session 1
   - Session 2
   - Session 3
4. Which modules did you attend?
   - All
   - Module 1 only
   - Module 2 only
   - Other, please specify
5. How many sessions did you attend in total?
6. Where was the training/activity held?
7. Which location did you attend?
8. What was the date of the training?
   - dd/mm/yyyy
9. Which date did you attend?
10. Which times did you attend?
11. Which level did you attend?
12. What type of training/activity was it?
   - Classroom-based
   - Web-based/e-learning
   - On-the-job learning
   - Coaching/mentoring
   - Project work
   - Job shadowing
   - Other, please specify
13. What was/were the name(s) of the trainer(s)?
14. Who was the trainer?
A.3 Level 1—About the participant
1. Your name:
2. Your organisation:
3. Your job title/grade:
4. What is your job title/grade?
   - [FIRST JOB TITLE / GRADE / CLASSIFICATION]
   - [SECOND JOB TITLE / GRADE / CLASSIFICATION]
   - [THIRD JOB TITLE / GRADE / CLASSIFICATION]
   - Other, please specify
5. Which of these best describes your job role?
   - Senior manager
   - Manager
   - Supervisor
   - Employee
   - Other, please specify
6. On what basis are you employed?
   - Full-time
   - Part-time
   - Contract/agency
   - Other, please specify
7. Which department/section do you work in?
8. What is your manager's name?
9. Which shift pattern do you work?
10. How long (in months) have you been in your current role?
11. How long have you worked for the organisation?
   - Less than one year
   - 1–2 years
   - 3–4 years
   - 5–10 years
   - 11–15 years
   - 16–20 years
   - 21–25 years
   - More than 25 years
12. Your age:
   - Under 18
   - 18–25
   - 26–35
   - 36–45
   - 46–55
   - 56+
13. What is your date of birth?
   - dd/mm/yyyy
14. Your gender:
   - Male
   - Female
15. Is English your first language?
   - Yes
   - No
16. What is your primary language?
   - English
   - [LANGUAGE 2]
   - [LANGUAGE 3]
   - [LANGUAGE 4]
   - Other, please specify
17. What is the highest level of education that you have completed?
   - Primary/elementary school
   - Secondary/high/vocational school
   - Further tertiary education/vocational college
   - University
   - Other, please specify
18. What is the highest grade or year of school you completed?
   - Never attended school or only attended kindergarten
   - Grades 1 through 7 (Primary)
   - Grades 8 through 11 (High school)
   - Grade 12 (High school graduate)
   - Tertiary 1–3 years (TAFE or University)
   - Tertiary 4+ years (Post-graduate)
   - Post-tertiary (Post-graduate Honours, Masters, Doctorate)
   - Other, please specify
19. What is your highest qualification?
   - Doctorate
   - Master's
   - Honours
   - Bachelor's
   - Diploma
   - Certificate
   - None
   - Other, please specify
A.4 Level 1—Individual learning needs
1. Overall, how would you describe your level of proficiency in your current role?
2. Please give brief details of the tasks that you perform which are critical to your job role.
3. What additional knowledge or skills might you need in order to perform your job role more effectively?
4. In your view, what are the most important training needs of your unit/section/department? Please say why.
5. Please rate your current level of skill in doing the following:
   Rating (1 = No skills at all, 5 = Very good skills)
   - [SKILL 1]
   - [SKILL 2]
   - [SKILL 3]
6. How often do you do the following as part of your job?
   Rating (1 = Not at all, 5 = Very often)
   - [TASK/ACTIVITY 1]
   - [TASK/ACTIVITY 2]
   - [TASK/ACTIVITY 3]
7. Which of the topics listed below would you like to receive training on? Please rate them from 1 (least important) to 5 (most important):
   - Communication skills
   - Computer skills
   - Customer service skills
   - Equalities and diversity
   - Health and safety
   - Leadership
   - Managing change
   - Performance appraisals
   - Presentation skills
   - Strategic business planning
   - Stress management
   - Time management
   - Supervisory skills
   - Writing for business
8. Please rate your current level of knowledge in the following areas:
   Rating (1 = No knowledge at all, 5 = Very good knowledge)
   - [KNOWLEDGE AREA 1]
   - [KNOWLEDGE AREA 2]
   - [KNOWLEDGE AREA 3]
9. How confident do you feel in the following areas?
   Rating (1 = No confidence at all, 5 = Very confident)
   - [AREA 1]
   - [AREA 2]
   - [AREA 3]
10. In order of importance (1 = most important), please identify up to three topics that you feel you would most like to receive training on.
   - 1.
   - 2.
   - 3.
11. How would you expect to receive training in these areas to improve your job performance?
12. Please give brief details of any relevant training that you have already done in these areas.
13. Which of the following training programs have you already participated in?
   - [PROGRAM 1]
   - [PROGRAM 2]
   - [PROGRAM 3]
14. How do you prefer to learn? Choose as many as apply.
   - Through pictures and spatial understanding
   - Through audio and/or music input
   - Through words, both in speech and writing
   - Through using the body, hands and sense of touch
   - Through logic, reasoning and systems
   - In groups or with other people
   - Working alone / self-study
   - Other, please specify
15. What type of training/learning would you find most helpful to support your development? Choose as many as apply.
   - Bite-size learning
   - Blended learning
   - Classroom-based
   - Coaching/mentoring
   - Conference/workshop
   - e-Learning
   - Informal learning
   - Job shadowing
   - On-the-job training
   - Remote training/webinar
   - Task/project work
   - Don't know
   - Other, please specify
16. How would you rate your skills in using a computer?
   Rating (1 = No skills, 5 = Very good skills)
17. What areas of your work do you hope to develop in the future?
18. What new knowledge and skills do you think you will need to achieve your goals?
19. What other training might you want in the future? Please say why.
20. Which of the following locations would you prefer to attend training?
21. Which of the following times would you prefer to attend training?
22. How would you feel, and what challenges do you think there might be for you, in attending training (eg excitement, nervousness, travel distance)?
23. Please add any other comments about your training needs here.
A.5 Level 1—Business needs
1. Please identify up to 3 key business goals/objectives which you believe could be supported through training.
   - Goal/objective 1:
   - Goal/objective 2:
   - Goal/objective 3:
2. What additional knowledge and skills are needed within the organisation to achieve these goals/objectives?
   - For goal/objective 1:
   - For goal/objective 2:
   - For goal/objective 3:
3. What other significant changes are taking place, or issues are being faced, within the organisation, and what new knowledge and skills are needed to address these?
4. Other than training, how else could the necessary knowledge and skills be developed within the organisation?
5. How can the expertise and effective practice that already exists within the organisation be captured and used to develop the necessary knowledge and skills?
6. Who (which people/departments/sections) should be trained to develop the necessary knowledge and skills? Include contact details where possible.
7. Which specific job skills should be addressed by training?
8. Please give details of any existing training that you are aware of which you feel could address the identified needs.
9. Which existing organisational strategies/plans should the training be aligned to?
10. How will developing knowledge and skills in these areas help to achieve the business goals/objectives?
11. How would you expect the training to change job and/or business performance? Please be as specific as possible.
12. Which business performance/impact measures should be used to measure the success of the training, and what changes/improvements would you expect to see in these? Please give figures where possible.
13. What other changes might occur, and how might these be measured?
14. What additional systems or processes need to be set up to measure performance both before and after the training?
15. When, or over what time period, should changes in performance be measured (eg at 3 months, then monthly for 12 months after the training)?
16. What potential barriers might there be to the training being effective, and how might these be overcome?
17. What can be done to ensure that the impact of the training is maximised (eg supporting participants to apply their learning)?
18. What support could be given to participants to ensure that their learning will be transferred to their jobs?
19. What budget should be allocated to this training?
20. Should the evaluation of the training set out to measure value for money/return on investment, and, if so, what measures should be used?
21. What should be the target return on investment (%), cost-benefit ratio and/or payback period (the length of time it takes to repay the investment) for this training? (A payback-period sketch follows these questions.)
22. Please add any other comments about the business need for training here.
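As a minimal sketch of the payback-period idea in question 21, the example below divides a hypothetical up-front training cost by an estimated monthly monetised benefit and checks the result against an assumed 12-month target. Both the figures and the target are placeholders, not departmental benchmarks.

```python
# Sketch: payback period for a training investment, using hypothetical figures.
# The payback period is the time taken for accumulated monthly benefits to
# repay the up-front training cost.
training_cost = 24_000.00    # up-front investment ($)
monthly_benefit = 3_000.00   # estimated monetised benefit per month ($)

payback_months = training_cost / monthly_benefit
print(f"Payback period: {payback_months:.1f} months")

# A target can then be expressed as a simple check, e.g. "pay back within a year".
target_months = 12
print("Within target" if payback_months <= target_months else "Outside target")
```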
A.6 Level 1—Core skill needs
1. Please choose a participant from the list to provide feedback on. You can provide feedback on others later by retaking the evaluation and choosing a new participant from the list.
2. Please enter the name of the participant you are providing feedback on.
3. What best describes your work relationship to this participant?
   - Manager/Supervisor
   - Colleague/Peer
   - Subordinate
   - Other, please specify
4. How often do you interact with this participant at work?
   - Very often
   - Quite often
   - Sometimes
   - Rarely
   - Never
5. In your assessment, how effective is this participant at the following skills in situations which require their use? Please use the following scale.
   - 1 = Not at all - effective behaviour is not demonstrated
   - 2 = Poorly - effective behaviour is demonstrated little
   - 3 = Marginally - effective behaviour is demonstrated on some occasions
   - 4 = Moderately - effective behaviour is demonstrated most of the time
   - 5 = Very effective - effective behaviour is demonstrated at all times
   Please leave blank if not applicable/not able to comment.
6. In your own assessment, how effective are you at the following skills in situations which require their use? Please use the following scale.
   - 1 = Not at all - effective behaviour is not demonstrated
   - 2 = Poorly - effective behaviour is demonstrated little
   - 3 = Marginally - effective behaviour is demonstrated on some occasions
   - 4 = Moderately - effective behaviour is demonstrated most of the time
   - 5 = Very effective - effective behaviour is demonstrated at all times
   Please leave blank if not applicable/not able to comment.
7. Problem Solving (1 = Not at all effective, 5 = Very effective)
   - Steps back from problems in order to properly define them.
   - Thinks laterally in order to identify solutions to problems.
   - Continuously identifies opportunities for process improvement.
   - Assesses possible solutions against the requirements of the business and what is realistic.
   - Identifies risks of recommended solutions and proposes ways to manage those risks.
   - Encourages others to propose solutions to line management.
8. Teamwork (1 = Not at all effective, 5 = Very effective)
   - Treats people with respect and integrity.
   - Encourages and supports the contributions of others in achieving team goals.
   - Makes a full contribution to successful team performance.
   - Puts personal preferences aside to achieve team goals.
   - Demonstrates personal commitment to the decisions of the team.
   - Makes good use of the talents of colleagues.
   - Helps colleagues when they are under pressure.
9. Communication (1 = Not at all effective, 5 = Very effective)
   - Communicates clearly and effectively with others using spoken language.
   - Communicates clearly and effectively with others in writing.
   - Listens to others, correctly interprets and responds appropriately.
   - Asks questions to clarify, and ensures communication is two-way.
   - Tailors language, tone, style and format to suit the audience.
   - Demonstrates openness in sharing information and keeping others informed.
10. Customer Service (1 = Not at all effective, 5 = Very effective)
   - Presents a good image of the company to customers.
   - Delivers courteous and prompt service.
   - Always delivers what has been promised.
   - Builds customers' trust by understanding and meeting their individual needs.
   - Takes personal responsibility for resolving customer concerns.
   - Checks customers' levels of satisfaction.
   - Seeks customers' improvement ideas.
   - Strives to exceed customer expectations.
   - Makes the organisation easy to do business with.
11. Decision Making (1 = Not at all effective, 5 = Very effective)
   - Makes responsible decisions, taking into account the facts and the feelings of others.
   - Analyses available information in detail.
   - Refers decisions beyond personal authority levels, seeking out second opinions where necessary.
   - Explains reasons for decisions to those affected.
   - Ensures that decisions are implemented.
   - Records the reasons for making a decision when this may be useful to others.
   - Reviews the quality of personal decisions, and modifies the decision-making process.
12. Influencing (1 = Not at all effective, 5 = Very effective)
   - Focuses proposals for action on meeting the requirements of the customer.
   - Plans proposals in advance, ensuring they are timed to create greatest interest.
   - Makes clear recommendations for action rather than presenting options.
   - Reinforces the benefits of proposals and recommendations by using relevant facts, figures and opinions.
   - Offers support and challenge to the proposals of others.
   - Modifies position, where appropriate, to achieve a win-win.
   - Accepts questions and challenges without acting defensively.
13. Planning and Control (1 = Not at all effective, 5 = Very effective)
   - Establishes priorities, tasks and work schedules in advance.
   - Maximises the use of available resources.
   - Clarifies the responsibilities of self and others, avoiding duplication of activity and wasted effort.
   - Describes milestones in terms of what is achieved and delivered.
   - Monitors the progress of plans and ensures that action is taken to resolve delays.
   - Anticipates and promptly raises operational or resource issues.
14. Attention to Detail (1 = Not at all effective, 5 = Very effective)
   - Works within limits of authority, seeking guidance when unsure.
   - Keeps an eye on the detail, checking own work for mistakes.
   - Completes all aspects of a task.
   - Establishes realistic deadlines and then sticks to them.
   - Keeps up to date on current internal and external procedures and regulations.
15. Leadership (1 = Not at all effective, 5 = Very effective)
   - Acts as a role model to others.
   - Adapts personal style to suit the situation and needs of others.
   - Treats all staff as individuals, recognising and valuing diversity.
   - Gives others direct, constructive, and actionable feedback.
   - Praises and rewards achievement.
   - Communicates business goals and objectives effectively.
   - Motivates and empowers others to reach and exceed business goals and objectives.
16. Delivering Business Results (1 = Not at all effective, 5 = Very effective)
   - Applies skill, effort and judgement to get the job done.
   - Ensures own role and objectives are clear.
   - Identifies opportunities to develop business and meet customer needs.
   - Redirects own time and resources to ensure objectives are met.
   - Seeks out products or services that match customer needs.
   - Prioritises time and attention on high-value activities.
   - Ensures that own objectives are aligned with business plans.
17. Please describe how performance has been assessed or measured.
18. In your view, how might this participant be able to improve their effectiveness in these skills?
19. In your view, how might you be able to improve your effectiveness in these skills?
20. Please provide any additional comments on this participant's overall performance as it relates to their job responsibilities. Please give examples of specific behaviours and situations if possible.
21. Please provide any additional comments on your overall performance as it relates to your job responsibilities. Please give examples if possible.
22. Please add any further comments relating to the above skills here. (A sketch showing how multi-rater skill ratings can be summarised appears after this question set.)
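Responses to the core-skill items above can be summarised per participant and per rater relationship. The minimal sketch below assumes a flat list of (participant, relationship, skill area, rating) records; the names, skills and scores are hypothetical and do not reflect any departmental data.

```python
# Sketch: rolling up multi-rater core-skill ratings (1-5) into a per-skill
# summary by rater relationship. All records are hypothetical examples.
from collections import defaultdict
from statistics import mean

# (participant, rater relationship, skill area, rating)
ratings = [
    ("A. Participant", "Manager/Supervisor", "Problem Solving", 4),
    ("A. Participant", "Colleague/Peer",     "Problem Solving", 3),
    ("A. Participant", "Colleague/Peer",     "Teamwork",        5),
    ("A. Participant", "Subordinate",        "Teamwork",        4),
]

summary = defaultdict(list)
for participant, relationship, skill, score in ratings:
    summary[(participant, skill, relationship)].append(score)

for (participant, skill, relationship), scores in sorted(summary.items()):
    print(f"{participant} | {skill:<16} | {relationship:<18} "
          f"mean {mean(scores):.1f} (n={len(scores)})")
```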
B.0 Appendix: Level 2 sample survey questions

The second level of evaluation is used to improve the quality and effectiveness of the learning content, structure and delivery. [23]

B.1 Level 2—Participant reaction and learning outcomes introduction
1. The following questions are about the [ADD DETAILS] training, which took place at [ADD LOCATION] between [ADD DATE] and [ADD DATE].
2. Your responses will help to evaluate the effectiveness of the training.
3. The evaluation asks about the learning that has taken place, including the outcomes of assessments, tests or qualifications. It would be useful to have relevant documents and figures at hand when completing the evaluation.
4. The evaluation asks about the impact of the training on the way you do your job since completion of the training.
5. This evaluation is made up of [ADD NUMBER] questions and should take about [ADD NUMBER] minutes to complete. Please think about each question carefully and answer as fully as possible.
6. Please complete the evaluation by [ADD DATE].
7. What will happen with your responses?
   [DELETE AS APPROPRIATE]
   - To protect confidentiality, the names of training participants will not be used in any reports produced.
   - Your responses are anonymous and data will be reported for the whole group only.
   - All responses will be analysed and reported to [ADD DETAILS]. Findings are due to be reported in [ADD DETAILS].
   - A copy of the evaluation report will be available from [ADD LOCATION/PARTICIPANT] on [ADD DATE].
8. Any questions?
9. If you have any questions about this evaluation, please contact [ADD DETAILS].

[23] The following questions have been adapted from < www.trainingcheck.com >.
B.2 Level 2—General feedback
1. How would you rate the training overall? (1 = Very poor, 5 = Excellent)
   - Rating
2. What do you feel you have learned or gained overall from the training?
3. What did you like most about the training?
4. What did you like least about the training?
5. Would you recommend this training to your work colleagues?
   - Yes
   - No
   - Not sure
6. What further help or support do you need to enable you to improve in this area?
7. Other than what you have already told us, how could the training be improved, eg to meet your needs, make the training more relevant to your job role or provide a better learning experience?
B.3 Level 2—Training methods and materials
1. Overall, how effective were the learning materials in helping you to learn?
   Rating (1 = Not at all effective, 5 = Very effective)
2. How useful did you find the following in helping you to learn?
   Rating (1 = Not at all useful, 5 = Very useful)
   - Teaching
   - Group discussions
   - Case studies
   - Role plays
   - Practical exercises
   - Videos/DVDs
   - Handouts
3. Which of the exercises/case studies/role plays etc did you find most useful in helping you learn?
4. How effective was the technology used in helping you to learn?
   Rating (1 = Not at all effective, 5 = Very effective)
5. How easy did you find the e-learning system to use?
   Rating (1 = Not at all easy, 5 = Very easy)
6. How did you find the pace (speed) of the training?
   Rating (1 = Very poor, 5 = Very good)
7. How did you find the content of the training (eg amount and difficulty)?
   Rating (1 = Very poor, 5 = Very good)
8. How well was the training structured (eg manageable chunks, logical order, linked to objectives)?
   Rating (1 = Not structured, 5 = Very structured)
9. How did you find the length of the training?
   Rating (1 = Too long, 5 = Too short)
10. How much support from the following did you receive with your learning?
   Rating (1 = Not at all, 5 = Very much)
   - Help with study skills
   - Help with English
   - Help with maths
   - Help with using computers
11. How far did you feel supported to complete the training?
   Rating (1 = Not at all, 5 = Very much)
12. How far did the training methods suit the way you prefer to learn?
   Rating (1 = Not at all, 5 = Very much)
13. How far did you have a chance to share your ideas and opinions?
   Rating (1 = Not at all, 5 = Very much)
14. How far were equal opportunities promoted?
   Rating (1 = Not at all, 5 = Very much)
15. How do you think the training methods could be improved?
16. Did anything prevent you from learning effectively? If so, please give details.
17. If you would like to make any other comments about the training methods, please add them here.
B.4 Level 2—Trainer(s)
1. What was/were the name(s) of your trainer(s)?
2. Please choose your trainer from the following list.
3. Overall, how skilled was your trainer in helping you to learn?
   Rating (1 = Not skilled, 5 = Very skilled)
4. Please rate your trainer in the following areas.
   Rating (1 = Very poor, 5 = Very good)
   - Knowledge of the subject/activity
   - Creating interest in the subject/activity
   - Relating the training to your job role
   - Understanding your needs
   - Supporting you to set targets
   - Responding to questions
5. How well did the trainer establish and maintain rapport with the participants?
   Rating (1 = Not at all, 5 = Very well)
6. How well did the trainer relate the training to your job role?
   Rating (1 = Not at all, 5 = Very well)
7. How far were your specific requirements and feedback taken into account during the training delivery?
   Rating (1 = Not at all, 5 = Very much)
8. How far did the trainer encourage the transfer of learning to the workplace?
   Rating (1 = Not at all, 5 = Very much)
9. How well did the trainer summarise and review the training at the end of each session?
   Rating (1 = Not at all, 5 = Very well)
10. Would you attend another course facilitated by this trainer?
   - Yes
   - No
   - Not sure
11. If you would like to make any other comments about the trainer(s), please add them here.
B.5 Level 2—Tests and qualifications
1. How did you feel about your results from the [ADD ASSESSMENT/TEST/QUALIFICATION], and what do you think were the reasons for your results?
2. How well did you think the [ADD ASSESSMENT/TEST/QUALIFICATION] assessed the knowledge and skills you learned on the training?
   Rating (1 = Not at all, 5 = Very well)
3. If you are aware of other assessments/tests or qualifications which you feel may be more suitable, please give details here.
4. If you have any further comments about assessments/tests or qualifications you took, please add them here.

B.6 Level 2—Progress to other learning
1. How far has this training encouraged you to continue to learn?
   Rating (1 = Not at all, 5 = Very much)
2. Which of the following training programs would you like to participate in in the future?
   - [PROGRAM 1]
   - [PROGRAM 2]
   - [PROGRAM 3]
3. What, if any, other training would you like to receive? Please only include your name if you are happy for this information to be forwarded to your manager.
   - Your name:
   - Type of training:
4. What other training do you plan to take part in in the future? Please give dates if possible.
B.7 Level 2—Pre-training activities and instructions
1. How did you learn about the training?
   - From the training manager
   - Department's website
   - Email list
   - Recommendation from a colleague/co-worker
   - Recommendation from a supervisor/manager
   - Other, please specify
2. How clear were the joining instructions?
   Rating (1 = Not at all clear, 5 = Very clear)
3. How far did you feel informed and prepared to start the training?
   Rating (1 = Not at all, 5 = Very well)
4. How far did you discuss the reasons for attending the training with your manager before the start?
   Rating (1 = Not at all, 5 = Extensively)
5. How far did you feel able to discuss any problems or concerns that might have affected your ability to make the best use of the training?
   Rating (1 = Not at all able, 5 = Very able)
6. How useful was the pre-training meeting/briefing in helping you to find out the following?
   Rating (1 = Not useful, 5 = Very useful)
   - The objectives of the training
   - Reasons for your participation in the training
   - How the training relates to your job role
   - The preparation you needed to do for the training
7. How useful did you find the pre-reading materials?
   Rating (1 = Not useful, 5 = Very useful)
8. How did you feel about going on the training (eg excited, worried)? What challenges/problems did you think there might be?
9. Before you attended the training, how did you think it might help you perform your job?
10. If you have any other comments about the pre-training materials/activities, please add them here.
B.8 Level 2—Facilities, courses and resources
1. What previous training have the relevant employees had in this area?
2. Has training already been carried out in this area within the organisation and, if so, how effective was it?
3. Should the training be delivered using in-house trainers or an external training provider?
   - In-house
   - External provider
   - Don't know
4. What internal resources (people, materials, facilities, budget etc) are already available for the training?
5. What additional resources (people, materials, facilities, budget etc) are needed for the training to take place?
6. What type of training would be most appropriate to develop the necessary knowledge and skills? (Please select as many as apply.)
   - Bite-size learning
   - Blended learning
   - Classroom-based
   - Coaching/mentoring
   - Conference/workshop
   - e-Learning
   - Informal learning
   - Job shadowing
   - On-the-job training
   - Remote training/webinar
   - Task/project work
   - Don't know
   - Other, please specify
7. What facilities are available for hosting the training?
8. What accreditation should be attached to the training?
9. How might people be encouraged to participate in the training?
B.9 Level 2—Facilities and administration
1. Please rate the following aspects of the event facilities and administration:
   Rating (1 = Very poor, 5 = Very good)
   - Administration and enrolment
   - Room and venue
   - Convenience of location
   - Technical support
   - Catering
   - Overnight accommodation
2. Please rate the administration and enrolment for the event.
   Rating (1 = Very poor, 5 = Very good)
3. Please rate the room/venue.
   Rating (1 = Very poor, 5 = Very good)
4. How accessible was the venue?
   Rating (1 = Not accessible, 5 = Very accessible)
5. How easy was it to find the room once you were at the venue?
   Rating (1 = Not at all easy, 5 = Very easy)
6. Please rate the convenience of the location.
   Rating (1 = Very inconvenient, 5 = Very convenient)
7. Please rate the technical support.
   Rating (1 = Very poor, 5 = Very good)
8. Please rate the catering.
   Rating (1 = Very poor, 5 = Very good)
9. Please rate the overnight accommodation.
   Rating (1 = Very poor, 5 = Very good)
10. If you have any other comments about the event facilities or administration, please add them here.
C.0 Appendix: Level 3 sample survey questions

The third level of evaluation is used to report on changes and increases in job performance capacity resulting from participation in the program, driving accountability, measuring effectiveness and value to the organisation, and enabling appropriate resource and support allocation. [24]

C.1 Level 3—Job performance impact
1. The following questions are about the [ADD DETAILS] training, which took place at [ADD LOCATION] between [ADD DATE] and [ADD DATE].
2. Your responses will help to evaluate the effectiveness of the training.
3. The evaluation asks about the impact of the training on the way you (the participants) have performed in your (their) jobs since completion of the training.
4. It would be useful to have details of the training participants and any relevant performance documents and figures on hand (such as work records and [ADD DETAILS]).
5. Please complete the evaluation by [ADD DATE].
6. What will happen with your responses?
   - To protect confidentiality, the names of training participants will not be used in any reports produced.
   - The data you supply will not be used to discipline, select out, or otherwise disadvantage participants.
   - All responses will be analysed and reported to [ADD DETAILS]. Findings are due to be reported in [ADD DETAILS].
   - A copy of the evaluation report will be available from [ADD LOCATION/PARTICIPANT] on [ADD DATE].
7. Any questions?
8. If you have any questions about this evaluation, please contact [ADD DETAILS].

[24] The following questions have been adapted from < www.trainingcheck.com >.
C.2 Level 3—Relevance of the training
1. What were your main reasons for taking part in the training (please choose as many as apply)?
   - Your job or responsibilities have changed
   - It is part of your personal development plan
   - To improve how you work with colleagues
   - To improve your skills or knowledge
   - New technology or work processes have been introduced
   - It may be of some use in the future
   - You were asked to take part
   - It is a legal requirement
   - Other, please specify
2. How much do you feel you learned from the training?
   Rating (1 = Nothing, 5 = Very much)
3. Which parts of the training did you find most useful, and why?
4. Which parts of the training did you find least useful, and why?
5. What were the three main learning points you took away from the training?
   - Learning point 1:
   - Learning point 2:
   - Learning point 3:
6. How useful was the following information for you? (1 = Not at all useful, 5 = Very useful)
   - [LEARNING AREA 1]
   - [LEARNING AREA 2]
   - [LEARNING AREA 3]
7. Was there anything that you would have liked or expected to learn but didn't?
8. How far did the training repeat what you had learned before?
   Rating (1 = Not at all, 5 = Very much)
9. How useful do you think your learning from the training will be for your job?
   Rating (1 = Not useful, 5 = Very useful)
10. How much do you think the learning from the training will help you improve your work performance?
   Rating (1 = Not at all, 5 = Very much)
11. How far do you feel the training provided the knowledge and skills required for the workplace?
   Rating (1 = Not at all, 5 = Very much)
12. How far have the following improved as a result of the training?
   Rating (1 = Not at all, 5 = Very much)
   - Your confidence
   - Your knowledge and skills
   - Your awareness of options available to you
   - Your motivation to take further steps
13. Please rate your ability to [ADD SKILL/SKILLS].
   Rating (1 = No skills, 5 = Very good skills)
   - Before the training:
   - After the training:
   (A sketch showing how paired before/after ratings can be summarised appears after question 19.)
14. Please identify 3 new key actions you will be able to put into practice over the next 3 months.
   - Key action 1:
   - Key action 2:
   - Key action 3:
15. How and in what context do you expect to put what you have learned into practice?
16. What will you do differently at work as a result of the training?
17. Is there anything that you are aware of that might stop you using your learning in your job?
18. What things (eg people, equipment, skills) might you need to help you use your learning in your job?
19. If you have any other comments about the relevance of the training, please add them here.
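Where a before/after item such as question 13 is used, the paired ratings can be summarised per skill. The minimal sketch below assumes one (before, after) pair per participant on the 1–5 scale; the skill names and scores are hypothetical examples.

```python
# Sketch: calculating average improvement from paired before/after self-ratings.
# The skills and ratings below are hypothetical.
from statistics import mean

# skill -> list of (before, after) ratings on the 1-5 scale, one pair per participant
paired_ratings = {
    "Report writing": [(2, 4), (3, 4), (2, 3)],
    "Presenting":     [(3, 3), (2, 4)],
}

for skill, pairs in paired_ratings.items():
    before = mean(b for b, _ in pairs)
    after = mean(a for _, a in pairs)
    print(f"{skill:<15} before {before:.1f}  after {after:.1f}  "
          f"improvement {after - before:+.1f}")
```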
C.3 Level 3—Application of learning
1. Overall, how effective do you believe the training has been in improving your (the participant's) job performance?
   Rating (1 = Not effective, 5 = Very effective)
2. How has your perception of your (the participants') job role and responsibilities changed since the training?
3. From your observation of the participants since the completion of the training, how effective has the training been in improving their job performance?
   Rating (1 = Not effective, 5 = Very effective)
4. In your view, how closely was the training linked to job performance needs?
   Rating (1 = Not at all, 5 = Very closely)
5. How closely is the training integrated with other job performance systems and factors?
   Rating (1 = Not at all, 5 = Very closely)
6. How far have the following aspects of your (the participant's/participants') work improved as a result of attending the training?
   Rating (1 = No change, 5 = Very much improvement)
   - Job satisfaction
   - Overall quality of work
   - Attitude towards work
   - Ability to work together with colleagues
   - Confidence to tackle new things
   - Desire to learn new knowledge or skills
7. How far have you (the participants) been able to use the new knowledge or skills learned in your (their) job/s?
   Rating (1 = Not at all, 5 = Very much)
8. How far do you feel your (the participant's/participants') ability to contribute ideas at work has improved as a result of attending the training?
   Rating (1 = No change, 5 = Very much improvement)
9. Please rate your (the participant's/participants') ability to [ADD SKILL/SKILLS].
   Rating (1 = No skills, 5 = Very good skills)
   - Before the training:
   - After the training:
10. Are you (the participants) currently using [ADD SKILL] in your (their) job/work?
   - Yes, I am (they are) using it
   - I (they) haven't had the opportunity to use it, but plan to/will do
   - I (they) don't need to use this skill in my (their) job/work
   - No, I'm (they're) not using it
   - Other, please specify
11. What have been the biggest changes in the way you do your job (the participants carry out their jobs) as a result of attending the training?
12. How did the training program influence/contribute to these changes?
13. How soon after completing the training were you (the participants) able to use the knowledge or skills learned?
   - Immediately
   - Within 1 month
   - Within 2 to 6 months
   - Within 1 year
   - Have not used yet
14. Please give brief examples of how the knowledge or skills learned in the training have been implemented.
15. What was the impact of this training on your/your team's/your department's (the participant's/team's/department's) performance?
16. What has helped you (the participants) use the newly learned knowledge or skills in the workplace? (Choose as many answers as apply.)
   - Nothing
   - Plenty of opportunities/times to apply/use them
   - Knowledge/skills relevant to my (their) role/s
   - Received support from manager/managers/management
   - Encouraged by my (the) workgroup
   - Did a similar course previously
   - Effective tools available on the job
   - Work processes support use of skills
   - Had the time
   - Encouraged by early success
   - Other, please specify
17. What has stopped you (the participants) from using any of the newly learned knowledge or skills in the workplace? (Choose as many answers as apply.)
   - Nothing
   - No opportunity to apply
   - My (their) job has changed
   - Little or no support from manager/managers/management
   - Discouraged by my (the) workgroup
   - Was applying knowledge or skills already
   - Tools not available on the job
   - Work processes do not support use of skills
   - Have not had the time
   - Tried without success
   - Do not remember course content
   - Other, please specify
18. If you (the participants) are not currently using [ADD SKILL] in your (their) job/work, what would help you (them) to start using it?
   - More practice
   - Job aids to remind me (them) of the steps
   - Support from my (their) manager
   - Other, please specify
19. How far have managers supported you (the participants) to apply the learning to your (their) work?
   Rating (1 = Not at all, 5 = Very much)
20. Would you (as manager) support this (or another) participant's/group's participation in this/further training?
   - Yes
   - No
   - Not sure
21. How far do you feel your (the participant's/participants') ability to solve problems and make decisions has improved as a result of attending the training?
   Rating (1 = No change, 5 = Very much improvement)
22. How far do you feel your (the participant's/participants') overall confidence or morale has improved as a result of attending the training?
   Rating (1 = No change, 5 = Very much improvement)
23. What other factors have helped you (the participants) develop knowledge or skills in this area?
24. In what other ways are you (the participants) planning to use the knowledge or skills learned on the training in the future?
25. How far do you feel your (the participant's/participants') career development opportunities have improved as a result of attending the training?
   Rating (1 = No change, 5 = Very much improvement)
26. How far do you feel your (the participant's/participants') confidence to join a further training program has improved as a result of attending the training?
   Rating (1 = No change, 5 = Very much improvement)
27. How far do you feel your (the participant's/participants') work-life balance has improved as a result of attending the training?
   Rating (1 = No change, 5 = Very much improvement)
28. What might hold you (the participants) back from applying the learning to your (their) work in the future?
29. What other things (eg people, equipment, knowledge, skills) and/or further action might you (the participants) need to help you (them) use the learning in your (their) job/s?
30. Overall, how effective do you believe the training has been in improving the performance of the workgroup/team/department?
   Rating (1 = Not effective, 5 = Very effective)
31. What changes have you seen in the way your workgroup/team/department works since the training program?
32. Please describe any changes.
33. What other factors may have influenced these changes?
34. What else could be done to improve performance in this area?
35. Looking back on the training, what suggestions do you have for improving its effectiveness in meeting (your) job performance needs?
36. Please add any other comments you may have about the impact of the training on (your) job performance here.
37. If you have any further comments about the impact of the training, please add them here.
C.4 Level 3—Post-training skills observation
1. Please choose a participant from the list to provide feedback on. You can provide feedback on other participants later by retaking the evaluation and choosing a new participant from the list.
2. What is the name of the participant you are providing feedback on? You can provide feedback on other participants later by retaking the evaluation and choosing a new participant from the list.
3. What best describes your work relationship to this participant?
   - Manager
   - Supervisor
   - Trainer
   - Other, please specify
4. Please ask the participant to proceed to [ADD SKILL] and assess their actions below. (1 = No skills/could not do, 5 = Very good skills)
   - [ADD ACTION]
   - [ADD ACTION]
   - [ADD ACTION]
5. Please ask the participant to proceed to [ADD SKILL] and assess the improvement in their actions since the completion of the training. (1 = No improvement, 5 = Very much improvement)
   - [ADD ACTION]
   - [ADD ACTION]
   - [ADD ACTION]
6. From your observations, please provide your assessment of the participant's ability to [ADD SKILL]. (1 = No skills/could not do, 5 = Very good skills)
   - [ADD ACTION]
   - [ADD ACTION]
   - [ADD ACTION]
7. From your observations, please provide your assessment of the improvement in the participant's ability to [ADD SKILL] since the completion of the training. (1 = No improvement, 5 = Very much improvement)
   - [ADD ACTION]
   - [ADD ACTION]
   - [ADD ACTION]
8. Please add any other comments you may have about the skill observation or other learning outcomes here.
9. From your observations, please provide your assessment of the ability of the following participants to [ADD SKILL]. (1 = No skills/could not do, 5 = Very good skills)
    • [PARTICIPANT 1]
    • [PARTICIPANT 2]
    • [PARTICIPANT 3]
10. From your observations, please provide your assessment of the improvement in the ability of the following participants to [ADD SKILL] since the completion of the training. (1 = No improvement, 5 = Very much improvement)
    • [PARTICIPANT 1]
    • [PARTICIPANT 2]
    • [PARTICIPANT 3]
11. Was the participant able to [ADD SKILL]?
    • Yes
    • No
    • Comments:
12. Please briefly describe the skill(s) being assessed.
13. Was the participant able to perform the skill to the required standard?
    • Yes
    • No
14. What, if any, further action is needed at this stage? eg support/coaching, further training etc.
15. Please add any further comments relating to the skill(s) observed here.

C.5 Level 3—Core skills improvement

1. Please choose a participant from the list to provide feedback on. You can provide feedback on other participants later by retaking the evaluation and choosing a new participant from the list.
2. Please enter the name(s) of the participant/participants you are providing feedback on.
3. In your assessment, how far do you feel your (this participant's) ability to do the following has improved as a result of the training? Please use the following scale.
    • 1 = No change, 2 = Minimal improvement, 3 = Moderate improvement, 4 = Much improvement, 5 = Very much improvement
4. Taken as a group, how far have the training participants' abilities to do the following improved as a result of the training? Please use the following scale.
    • 1 = No change, 2 = Minimal improvement, 3 = Moderate improvement, 4 = Much improvement, 5 = Very much improvement
5. Problem Solving (1 = No change, 5 = Very much improvement)
    • Steps back from problems in order to properly define them.
    • Thinks laterally in order to identify solutions to problems.
    • Continuously identifies opportunities for process improvement.
    • Assesses possible solutions against the requirements of the business and what is realistic.
    • Identifies risks of recommended solutions and proposes ways to manage those risks.
    • Encourages others to propose solutions to line management.
6. Teamwork (1 = No change, 5 = Very much improvement)
    • Treats people with respect and integrity.
    • Encourages and supports the contributions of others in achieving team goals.
    • Makes a full contribution to successful team performance.
    • Puts personal preferences aside to achieve team goals.
    • Demonstrates personal commitment to the decisions of the team.
    • Makes good use of the talents of colleagues.
    • Helps colleagues when they are under pressure.
7. Communication (1 = No change, 5 = Very much improvement)
    • Communicates clearly and effectively with others using spoken language.
    • Communicates clearly and effectively with others in writing.
    • Listens to others, correctly interprets and responds appropriately.
    • Asks questions to clarify, and ensures communication is two-way.
    • Tailors language, tone, style and format to suit the audience.
    • Demonstrates openness in sharing information and keeping others informed.
8. Decision Making (1 = No change, 5 = Very much improvement)
    • Makes responsible decisions, taking into account the facts and the feelings of others.
    • Analyses available information in detail.
    • Refers decisions beyond personal authority levels, seeking out second opinions where necessary.
    • Explains reasons for decisions to those affected.
    • Ensures that decisions are implemented.
    • Records the reasons for making a decision when this may be useful to others.
    • Reviews the quality of personal decisions and modifies the decision-making process accordingly.
9. Influencing (1 = No change, 5 = Very much improvement)
    • Focuses proposals for action on meeting the requirements of the customer.
    • Plans proposals in advance, ensuring they are timed to create greatest interest.
    • Makes clear recommendations for action rather than presenting options.
    • Reinforces the benefits of proposals and recommendations by using relevant facts, figures and opinions.
    • Offers support and challenge to the proposals of others.
    • Modifies position, where appropriate, to achieve a win-win.
    • Accepts questions and challenges without acting defensively.
10. Customer Service (1 = No change, 5 = Very much improvement)
    • Presents a good image of the company to customers.
    • Delivers courteous and prompt service.
    • Always delivers what has been promised.
    • Builds customers' trust by understanding and meeting their individual needs.
    • Takes personal responsibility for resolving customer concerns.
    • Checks customers' levels of satisfaction.
    • Seeks customers' improvement ideas.
    • Strives to exceed customer expectations.
    • Makes the organisation easy to do business with.
11. Planning and Control (1 = No change, 5 = Very much improvement)
    • Establishes priorities, tasks and work schedules in advance.
    • Maximises the use of available resources.
    • Clarifies the responsibilities of self and others, avoiding duplication of activity and wasted effort.
    • Describes milestones in terms of what is achieved and delivered.
    • Monitors the progress of plans and ensures that action is taken to resolve delays.
    • Anticipates and promptly raises operational or resource issues.
12. Attention to Detail (1 = No change, 5 = Very much improvement)
    • Works within limits of authority, seeking guidance when unsure.
    • Keeps an eye on the detail, checking own work for mistakes.
    • Completes all aspects of a task.
    • Establishes realistic deadlines and then sticks to them.
    • Keeps up to date on current internal and external procedures and regulations.
13. Leadership (1 = No change, 5 = Very much improvement)
    • Acts as a role model to others.
    • Adapts personal style to suit the situation and needs of others.
    • Treats all staff as individuals, recognising and valuing diversity.
    • Gives others direct, constructive, and actionable feedback.
    • Praises and rewards achievement.
    • Communicates business goals and objectives effectively.
    • Motivates and empowers others to reach and exceed business goals and objectives.
14. Delivering Business Results (1 = No change, 5 = Very much improvement)
    • Applies skill, effort and judgement to get the job done.
    • Ensures own role and objectives are clear.
    • Identifies opportunities to develop business and meet customer needs.
    • Redirects own time and resources to ensure objectives are met.
    • Seeks out products or services that match customer needs.
    • Prioritises time and attention on high value activities.
    • Ensures that own objectives are aligned with business plans.
15. Please describe any changes.
16. Please describe how performance has been assessed or measured.
17. What in your view are the main reasons for any changes to these skills?
18. What other factors may have influenced any changes to these skills?
19. What, if any, were the main barriers to changes in these skills?
20. How might you (the participant) be able to further improve your (their) effectiveness in these areas/skills?
21. Please add any further comments relating to any changes to the above skills here.
22. Please provide any additional comments on changes to this participant's overall performance as it relates to their job responsibilities.
23. Please give examples of specific behaviours and situations if possible.

C.6 Level 3—Job-specific skills evaluation

1. Please choose a participant from the list to provide feedback on. You can provide feedback on others later by retaking the evaluation and choosing a new participant from the list.
2. Please enter the name(s) of the participant/participants you are providing feedback on.
3. Please rate your [this participant's/the participant's] ability to [ADD SKILL].
    • Rating (1 = No skills, 5 = Very good skills)
    • Before the training:
    • After the training:
4. In your own assessment, how far do you feel your (this participant's) ability to do the following has improved as a result of the training? Please use the following scale.
    • 1 = No change, 2 = Minimal improvement, 3 = Moderate improvement, 4 = Much improvement, 5 = Very much improvement
5. Taken as a group, how far have the training participants' abilities to do the following improved as a result of the training? Please use the following scale.
    • 1 = No change, 2 = Minimal improvement, 3 = Moderate improvement, 4 = Much improvement, 5 = Very much improvement
6. [ADD SKILL AREA]
    • Rating (1 = No change, 5 = Very much improvement)
    • [ADD SKILL 1]
    • [ADD SKILL 2]
    • [ADD SKILL 3]
7. How far has the training helped you (this participant) to do the following more effectively?
    • Rating (1 = No change, 5 = Very much improvement)
    • [ADD SKILL 1]
    • [ADD SKILL 2]
    • [ADD SKILL 3]
8. Please describe any changes.
9. How far do you feel that your (this participant's) ability to [ADD SKILL] has improved as a result of attending the training?
    • Rating (1 = No change, 5 = Very much improvement)
10. Please describe any changes.
11. Please describe how performance has been assessed or measured.
12. What in your view are the main reasons for any changes to this skill?
13. What other factors may have influenced any changes to this skill?
14. What, if any, were the main barriers to changes to this skill?
15. How might you (the participant) be able to further improve your (their) effectiveness in this area/skill?
16. Please add any further comments relating to any changes to the above skills here.

C.7 Level 3—Training objectives

1. How far do you feel the following training objectives were met?
    • Rating (1 = Not met, 5 = Fully met)
    • By the end of the training participants should be able to:
    • [ADD OBJECTIVE 1]
    • [ADD OBJECTIVE 2]
    • [ADD OBJECTIVE 3]
2. How far do you feel able to [ADD OBJECTIVE] as a result of the training?
    • Rating (1 = Not at all able to, 5 = Fully able to)
3. How effective was the training in providing the following?
    • Rating (1 = Not at all effective, 5 = Very effective)
    • [ADD OUTCOME 1]
    • [ADD OUTCOME 2]
    • [ADD OUTCOME 3]
4. What were your personal learning goals?
5. How far do you feel your personal learning goals were met?
    • Rating (1 = Not met, 5 = Fully met)
6. Which of your personal learning goals were not met by the training? Please say why:

D.0 Appendix: Level 4 sample survey questions

The fourth level of evaluation is used to report on the tangible outcomes of participation in the program.[25]

[25] The following questions have been adapted from < www.trainingcheck.com >.

D.1 Level 4—Business impact

1. The following questions are about the [ADD DETAILS] training, which took place at [ADD LOCATION] between [ADD DATE] and [ADD DATE].
2. This evaluation is to find out about the impact of the training on business performance over the period from [ADD TIMEFRAME] to [ADD TIMEFRAME].
3. To complete the evaluation you will need to have access to information about changes in business performance that have resulted (at least in part) from the training program.
4. Please complete the evaluation by [ADD DATE].
5. What will happen with your responses?
    • All responses will be analysed and reported to [ADD DETAILS]. Findings are due to be reported in [ADD DETAILS].
    • Please note that any financial data that you provide may be used as part of a return on investment (ROI) analysis.
    • A copy of the evaluation report will be available from [ADD LOCATION/PARTICIPANT] on [ADD DATE].
6. Any questions?
7. If you have any questions about this evaluation, please contact [ADD DETAILS].

D.2 Level 4—Test results

1. Please enter the name of the assessment/test used to measure participants' learning.
2. Please briefly describe what knowledge or skills were assessed.
3. Please briefly describe the grade scale/scoring system used.
4. What is the highest grade/maximum score for this assessment/test?
5. What grade/score is required to pass this assessment/test?
6. Please enter the results of the assessment/test for each of the following learners.

                    Target result (optional)    Result (actual)
    [LEARNER 1]
    [LEARNER 2]
    [LEARNER 3]

7. Please enter the pre- and post-learning results of the assessment/test for each of the following learners.

                    PRE-learning result    Target result    POST-learning result
                    (if available)         (optional)       (actual)
    [LEARNER 1]
    [LEARNER 2]
    [LEARNER 3]

8. Please enter details of assessment/test results for the following learners.

                    [ASSESSMENT/TEST 1]    [ASSESSMENT/TEST 2]    [ASSESSMENT/TEST 3]
    [LEARNER 1]
    [LEARNER 2]
    [LEARNER 3]

9. How many learners achieved a pass or better in this assessment/test?
10. How many learners did not achieve a pass in this assessment/test?
11. What was the average PRE-learning result (score/grade)?
12. What was the average POST-learning result (score/grade)? (A short sketch of these summary calculations appears at the end of this section.)
13. Please give reasons for the overall results, including what factors other than the training may have influenced them.
14. Please add any further information about assessment or test results that you feel may be relevant to the evaluation.
15. Please add any other comments you may have about the assessment/test results here.
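Questions 9–12 above ask for simple summaries of the test results table. The following Python sketch shows one way those summaries could be produced outside Checkbox; the learner names, scores and pass mark are placeholder assumptions, not values from the guide.

```python
# Placeholder results keyed by learner: (pre-learning score, post-learning score)
results = {"LEARNER 1": (45, 72), "LEARNER 2": (55, 81), "LEARNER 3": (38, 49)}
pass_mark = 50  # assumed pass score for this hypothetical test

pre_scores = [pre for pre, _ in results.values()]
post_scores = [post for _, post in results.values()]

passed = sum(score >= pass_mark for score in post_scores)   # question 9
failed = len(post_scores) - passed                          # question 10
avg_pre = sum(pre_scores) / len(pre_scores)                 # question 11
avg_post = sum(post_scores) / len(post_scores)              # question 12

print(passed, failed, round(avg_pre, 1), round(avg_post, 1))  # 2 1 46.0 67.3
```

The same dictionary of pre/post scores can also feed the learning gain calculations described in the next section.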

D.3 Level 4—Learning gain

1. The learning gain shows the improvement between the pre- and post-learning assessment scores. It is calculated using the following formula (a worked code example appears at the end of this section):
    • (Post-learning Score minus Pre-learning Score) / (Maximum Score minus Pre-learning Score) * 100
    • For example, if the maximum score is 100, and the learner achieved a pre-learning score of 50 and a post-learning score of 80, the learning gain would be calculated as follows: {(80 – 50) = 30} / {(100 – 50) = 50} * 100 = 60
    • This shows that there was a 60% learning gain.
2. Individual Learning Gains—what was the learning gain for each of the following learners?

                    Learning gain (%)
    [LEARNER 1]
    [LEARNER 2]
    [LEARNER 3]

3. Overall Learning Gain—what % of learners achieved learning gains in each of the following bands?
    • The sum of the numbers entered must equal 100.
    • 0–20% Gain
    • 21–40% Gain
    • 41–60% Gain
    • 61–80% Gain
    • 81–100% Gain
    • N/A
4. What was the average % learning gain for this group of learners? (Enter numbers only)
5. Please give reasons for the learning gain results, including what factors other than the training may have influenced them.
6. Please add any further information about the learning gains that you feel may be relevant to the evaluation.
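As a quick check of the learning-gain arithmetic above, this short Python sketch applies the D.3 formula. The function name and the group of learner scores are illustrative assumptions, not part of the guide or of Checkbox.

```python
def learning_gain(pre_score, post_score, max_score):
    """Learning gain (%): (post - pre) / (max - pre) * 100, as defined in D.3."""
    if max_score <= pre_score:
        raise ValueError("Pre-learning score must be below the maximum score.")
    return (post_score - pre_score) / (max_score - pre_score) * 100

# Worked example from the guide: maximum 100, pre-learning 50, post-learning 80
print(learning_gain(50, 80, 100))  # 60.0 -> a 60% learning gain

# Hypothetical group of learners (names and scores are placeholders)
results = {"LEARNER 1": (50, 80), "LEARNER 2": (40, 70), "LEARNER 3": (60, 95)}
gains = {name: learning_gain(pre, post, 100) for name, (pre, post) in results.items()}
average_gain = sum(gains.values()) / len(gains)
print(gains, round(average_gain, 1))  # individual gains (question 2) and average gain (question 4)
```

The band breakdown asked for in question 3 can be derived from the same `gains` dictionary by counting how many values fall into each range.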

D.4 Level 4—Skills gain

1. The skills gain shows how effective the training has been in improving a particular skill. It is calculated from numerical ratings of changes to skills using the following formula (a worked code example appears at the end of this section):
    • (Most Recent Skill Rating minus Pre-learning Skill Rating) / (Maximum Skill Rating minus Pre-learning Skill Rating) * 100
    • For example: {(4 – 2) = 2} / {(5 – 2) = 3} * 100 ≈ 66.7%
    • This shows that there was a skill gain of approximately 66.7%.
2. Please enter the pre- and post-learning ratings for this learner for the following skill:

                    PRE-learning rating    POST-learning rating    Max/Target rating
    [ADD SKILL]

3. Individual Skill Gains—what was the skill gain for each of the following participants?

                    Skill gain (%)
    [PARTICIPANT 1]
    [PARTICIPANT 2]
    [PARTICIPANT 3]

4. Overall Skill Gain—what % of participants achieved skill gains in each of the following bands?
    • The sum of the numbers entered must equal 100.
    • 0–20% Gain
    • 21–40% Gain
    • 41–60% Gain
    • 61–80% Gain
    • 81–100% Gain
    • N/A
5. Average Skill Gain—what was the average % skill gain for all participants? (Enter numbers only)
6. Approximately how much of this change is attributable to the training?
    • 0%  10%  20%  30%  40%  50%  60%  70%  80%  90%  100%
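For completeness, here is the same kind of sketch for the skill-gain formula in D.4, reusing the guide's 1–5 rating scale. The function, the ratings and the attribution adjustment shown at the end are illustrative assumptions; the guide itself only asks for the attribution percentage and does not prescribe how it is applied.

```python
def skill_gain(pre_rating, recent_rating, max_rating=5):
    """Skill gain (%): (most recent - pre) / (max - pre) * 100, per D.4."""
    if max_rating <= pre_rating:
        raise ValueError("Pre-learning rating must be below the maximum rating.")
    return (recent_rating - pre_rating) / (max_rating - pre_rating) * 100

# Worked example from the guide: pre-learning rating 2, most recent rating 4, maximum 5
gain = skill_gain(2, 4)
print(round(gain, 1))  # 66.7 -> a skill gain of approximately 66.7%

# Question 6 asks how much of the change is attributable to the training;
# one simple (assumed) adjustment is to scale the gain by that estimate.
attributable_gain = gain * 0.50  # eg respondent attributes 50% of the change to the training
print(round(attributable_gain, 1))  # 33.3
```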

D.5 Level 4—Changes to business performance

1. Overall, in your assessment, how effective has the training been in meeting business performance needs?
    • Rating (1 = Not effective, 5 = Very effective)
2. If business performance needs have not been met to the required standard, please identify reasons for this and indicate what things (eg people, equipment, knowledge, skills) and/or further action might be needed.
3. In your assessment, how far have the following business performance areas improved as a result of the training?
    • Rating (1 = No change, 5 = Very much improvement)
    • Productivity and output
    • Efficiency (eg errors, wastage)
    • Customer service and satisfaction
    • Employee satisfaction
    • Sickness absenteeism
    • Grievances and disciplinary action
    • Recruitment and retention
    • Teamwork and communication
    • Supervisory and management skills
    • Health and safety
4. Please describe the main changes and include relevant performance data where possible.
5. Briefly summarise if and how business performance needs have been met by the training.
6. What other benefits have there been from the training (eg other changes in performance, cost savings, improvements in motivation, morale, relationships etc)?
7. What, if any, future outcomes do you expect from the training?
8. How could any future outcomes from the training be measured (eg using quality assessments, customer feedback, performance data etc)?
9. Do you feel that the training was a worthwhile investment of the participants' time? Please comment.
10. What do you think would have happened if this training had not taken place?
11. What are the key lessons learned from delivering this training, eg what could be done differently if similar training was delivered in the future?
12. Please add any other comments you would like to make about the impact of the training on business performance.

D.6 Level 4—Business performance/impact measures (fixed and open)

1. Please identify up to three (3) business performance/impact measures which have changed as a result of the training. Examples of business performance/impact measures include:
    • Performance:
        – Volume of sales
        – Production rates
        – Wastage and error rates
    • Human capital:
        – Absenteeism rates
        – Job retention rates
        – Overtime costs
    • Customers:
        – Number of customers
        – Number of complaints
        – Amount of repeat business
    • Health and safety:
        – Number and seriousness of accidents
        – Lost time due to injuries
        – Insurance costs
2. You will also be asked to estimate the % influence of the training on any changes.
    • Business performance/impact measure 1:
    • Business performance/impact measure 2:
    • Business performance/impact measure 3:
3. Please briefly describe any changes to [ADD BUSINESS PERFORMANCE/IMPACT MEASURE] as a result of the training, including timescales where possible.
4. Please give details of data relating to this measure before and after the training. Please also estimate the % influence of the training on any change (eg if you think the training is responsible for half of the change, enter 50%). A short worked sketch of applying this estimate appears at the end of this section. For the business performance/impact measure, enter:
    • Measure (eg %, rate, ratio):
    • Pre-training data (amount):
    • Post-training data (amount):
    • % training influence:
5. How was this data collected?
6. What other factors may have influenced this change (eg organisational change, new recruits etc)?
7. Please add any other comments about changes to this measure.
8. Please give details of any other changes to business performance/impact measures that have resulted from the training.
9. What other benefits have there been from the training (eg cost savings, improvements in motivation, morale, relationships etc)?
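The guide asks for an estimated % influence of the training in question 4 but does not prescribe how it is combined with the before/after data, so the following Python sketch shows one common, assumed approach: scale the observed change in the measure by the estimated influence. The measure and figures are placeholders only.

```python
def training_attributable_change(pre_value, post_value, influence_pct):
    """Portion of an observed change in a business measure attributed to the training.

    Assumes a simple scaling approach: observed change * estimated % influence.
    """
    observed_change = post_value - pre_value
    return observed_change * (influence_pct / 100)

# Placeholder figures: absence days per month fell from 60 to 40, and the
# respondent estimates the training drove half of that change (50% influence).
print(training_attributable_change(60, 40, 50))  # -10.0 -> 10 fewer absence days attributed to the training
```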

D.7 Level 4—Financial impact (including ROTI)

1. Please identify up to three (3) financial benefits which have resulted from the training. Examples of financial benefits include:
    • Gains from changes in performance (eg increased sales or production)
    • Cost savings (eg reduced errors, lower overtime costs)
2. Please enter benefits based on actual figures or using reasonable estimates.
3. Please estimate the % influence of the training on any benefits (eg if you think the training is responsible for half of the benefit, enter 50%).
    • Financial Benefit 1:
    • Financial Benefit 2:
    • Financial Benefit 3:
4. Please describe how this benefit has been calculated.
5. Approximately what % of this benefit is attributable to the training? (Enter numbers only)
6. What other factors may have influenced this change (eg organisational change, new recruits etc)?
7. Please calculate/estimate the total financial benefits of the change described, eg if the number of absence days has gone down by 20, and the average cost per day is $100, then the total financial benefit of the change = 20 x $100 = $2000. (Enter numbers only.) A worked sketch of this calculation appears at the end of this section.
8. Please give details of data relating to changes to the financial benefit described before and after the training. For the financial benefit, enter:
    • Measure (eg %, rate, ratio):
    • Pre-training data (amount):
    • Post-training data (amount):
    • Approx total financial benefit ($):
    • Approx % of benefit attributable to training:
9. What quantitative financial benefits have resulted from changes in job performance due to the training? Please describe how any figures have been arrived at.
10. Please give details of any other financial benefits that have resulted from the training (eg from other performance changes, cost savings etc).
11. Please add any other comments you would like to make about the financial benefits from the training.
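To make the arithmetic in question 7 concrete, here is a short Python sketch that monetises a change and then estimates a return on training investment (ROTI). This section names ROTI but does not give a formula, so the net-benefit-over-cost calculation shown is a standard formulation used here as an assumption; the training cost figure is also a placeholder.

```python
def total_financial_benefit(units_changed, cost_per_unit):
    """Total financial benefit of a change, eg 20 fewer absence days x $100/day = $2000."""
    return units_changed * cost_per_unit

def roti_percent(total_benefit, attributable_pct, training_cost):
    """Assumed ROTI formulation: (attributable benefit - training cost) / training cost * 100."""
    attributable_benefit = total_benefit * (attributable_pct / 100)
    return (attributable_benefit - training_cost) / training_cost * 100

benefit = total_financial_benefit(20, 100)  # $2000, the worked example from question 7
print(roti_percent(benefit, 50, 500))       # $1000 attributable benefit against a $500 cost -> 100.0% ROTI
```

The attributable percentage corresponds to question 5; where actual training costs are recorded elsewhere, they would replace the placeholder cost figure.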

E.0 Appendix: Additional survey questions

The following questions may be used as templates or to supplement previous questions.

E.1 Open question modifiers and extensions

These open questions may be used multiple times, following any of the questions in the previous appendices, to gather more qualitative information on the previous response.

1. Please say why.
2. Please say why you chose this rating.
3. If you have any further comments about [ADD TOPIC], please add them here.
4. If your score is 3 or less, please say why.
5. If your score for any of the above questions is 3 or less, please say why.
6. If your score is 4 or more, please say why.
7. If your score for any of the above questions is 4 or more, please say why.
8. If your score is 2 or less or 4 or more, please say why.
9. If your score for any of the above questions is 2 or less or 4 or more, please say why.

E.2 Knowledge and skills tests (question examples)

1. Is [ADD TEST ITEM] true or false?
2. Is it true that [ADD TEST ITEM]?
3. What is the answer to [ADD TEST ITEM]?
4. Define / Describe / Identify / List / Name / Outline / State why [ADD TEST ITEM].
5. Give 3 reasons why [ADD TEST ITEM].
    • Reason 1
    • Reason 2
    • Reason 3
6. Which of the following is the correct answer?
    • [ADD TEST ITEM]
    • [ADD TEST ITEM]
    • [ADD TEST ITEM]
7. Which of the following is the correct answer? You can choose more than one answer.
    • [ADD TEST ITEM]
    • [ADD TEST ITEM]
    • [ADD TEST ITEM]
8. How important are the following in [ADD TEST ITEM]?
    • Your rating: (1 = Not at all important, 5 = Very important)
9. Please rank the following in order of importance in relation to [ADD TEST ITEM]. Rank the items below, using numeric values starting with 1 (1 = Most important, 5 = Least important).
    • [ADD TEST ITEM]
    • [ADD TEST ITEM]
    • [ADD TEST ITEM]
10. Choose one answer for each of the following:

                        [ADD TEST ITEM]    [ADD TEST ITEM]    [ADD TEST ITEM]
    [ADD TEST ITEM]
    [ADD TEST ITEM]
    [ADD TEST ITEM]
11. Choose as many answers as apply to each of the following:

                        [ADD TEST ITEM]    [ADD TEST ITEM]    [ADD TEST ITEM]
    [ADD TEST ITEM]
    [ADD TEST ITEM]
    [ADD TEST ITEM]

12. Write your answers into the relevant box for each of the following. You can add as many answers as you like.

                        [ADD TEST ITEM]    [ADD TEST ITEM]    [ADD TEST ITEM]
    [ADD TEST ITEM]
    [ADD TEST ITEM]
    [ADD TEST ITEM]

13. What is the total of [ADD TEST ITEM]? Enter numbers only.
14. On what date was [ADD TEST ITEM]?
    • dd/mm/yyyy
15. Out of every 100 [ADD TEST ITEM], how many will be the following? The sum of the numbers entered must equal 100.
    • [ADD TEST ITEM]
    • [ADD TEST ITEM]
    • [ADD TEST ITEM]
16. Based on the text below, what is [ADD TEST ITEM]?
    • [ADD DESCRIPTION / TEXT HERE]
17. Based on the following picture/diagram/image, what is [ADD TEST ITEM]?
    • Custom image

F.0 Frequently asked questions (FAQs)

F.1 How do I get a user account to create surveys in Checkbox?

Send an email request to Adrian Barnett (DECD), ext. 61080, for access and training.

G.0 Glossary

The following terms are used interchangeably:
    • ‘staff member’ and ‘team member’
    • ‘team’, ‘project team’, ‘stream’ and ‘business unit’
    • ‘manager’ and ‘program manager’
    • ‘officer’ and ‘project officer’
    • ‘participant’ and ‘learner’
    • ‘facilitator’ and ‘trainer’
    • ‘mean’ and ‘average’

G.1 Definitions

Project—a temporary, time-constrained endeavour with distinct start and end dates, undertaken to meet specific goals and objectives and to produce deliverables within a defined budget and limited resources, in order to achieve beneficial, value-adding change

Program—has no defined end date; it may contain multiple repeated project instances defined, for example, per annum

G.2 Formatting convention

<…>     a network address or file name
italic   a title

G.3 Colour coding

Red—high; critical; off track; urgent
Yellow or orange—medium; caution; at risk
Green—low; controlled; on track

G.4 Hyperlinks

Checkbox < http://www.decssurveys.sa.edu.au/online/ >
Kirkpatrick < http://www.kirkpatrickpartners.com/ >
DIAf < http://www.decd.sa.gov.au/quality/pages/quality/26420/ >

G.5 Acronyms

More acronyms relevant to Workforce Development can be found in the IMS General Administration Registers interface > Information, questions and answers (menu) > Acronyms (form).

AQTF—Australian Quality Training Framework
DECD—Department for Education and Child Development
DIAf—DECD Improvement and Accountability framework
FAQs—frequently asked questions
HR—Human Resources
HRWD—Human Resources and Workforce Development
ICT—Information and Communications Technology
IT—Information Technology
IMS—Information Management System
KPI—Key Performance Indicator
MS—Microsoft
OPDS—Organisation and Professional Development Services
PDF—Portable Document Format
PDS—Professional Development System
PDSA—plan, do, study, act (continuous improvement framework)
PM—Performance Management (1)
PM—Project Management (2)
QL—Quality Leadership
R:—network drive mapped to \\decsgla01\user3\groups
ROI—return on investment
ROTI—return on training investment
RTO—Registered Training Organisation
SA—South Australia
URL—uniform resource locator
WD—Workforce Development