Evaluation Guide

When you know what you’re doing will achieve your goals, it’s time to evaluate and continuously improve! I wrote this evaluation guide to simplify and standardise the evaluation processes for team evaluation, strategic project evaluation and learning program evaluation.


Workforce Development Evaluation Guide

Version: 2012.03.19
Last released on: 17/04/2012
Last released by: Dain Sanyë, Senior Consultant, Evaluation and Continuous Improvement
Document title

Evaluation Guide

Document information

This document was produced for the Government of South Australia (SA), Department for Education and Child Development (DECD), Human Resources and Workforce Development.

This document was created by Dain Sanyë, Senior Consultant, Evaluation and Continuous Improvement for Workforce Development, using Microsoft Word 2010.

Purpose

This guide describes the evaluation framework created for the Department’s Workforce Development directorate.

Audience

This document is designed as a guide for Workforce Development officers and managers to effectively evaluate and report on projects and programs.

Evaluators should be familiar with:
  • the layout of a personal computer and desktop
  • the purpose, creation and analysis of data and survey tools
  • the Department’s Improvement and Accountability framework (DIAf)

This guide is written in international English.

Contact details

Dain Sanyë, Senior Consultant, Evaluation and Continuous Improvement
Phone: (08) 820 41402
Fax: (08) 8206 4200
Email: Dain.Sanye3@sa.gov.au

Copyright notice

Copyright ©2012 Government of South Australia, Department for Education and Child Development. This publication is copyright and contains information which is the property of the Department for Education and Child Development. No part of this document may be copied or stored in a retrieval system without the written permission of the author or the Chief Executive of the Department for Education and Child Development.
Related files

  • Checkbox survey tool < http://www.decssurveys.sa.edu.au/online/ >
  • DIAf < http://www.decd.sa.gov.au/quality/pages/quality/26420/ >

Update plan

Updates are the responsibility of the current Workforce Development Senior Consultant, Evaluation and Continuous Improvement.

Updates must occur when scheduled as a part of ongoing role responsibility or as requested by a Workforce Development line manager.

Updates must include an incrementally increased version number in the format yyyy.mm.dd.A (year.month.day.draft version of update).

Old versions must be moved to an archive location.

The current version must maintain the same file name and be stored and accessible from a single fixed network location.

Shortcuts should be created at additional locations linking to the single fixed network storage location.

Revision history

Please destroy any printed copies of this document earlier than version 2012.03.19.

Version        Reviser      Details
2012.02.22.A   Dain Sanyë   Initial draft
2012.03.19     Dain Sanyë   Initial document released
Contents

  Document title
  Document information
  Purpose
  Audience
  Contact details
  Copyright notice
  Related files
  Update plan
  Revision history
1.0 Introduction
2.0 Background
3.0 Evaluative review of a workgroup or project team
  DIAf corporate self-review snapshot survey
  Improvement principles
  Improvement principle criteria
    Focus on core business
    Think systemically
    Share leadership
    Attend to culture
    Listen and respond
    Make data count
    Set directions
    Target resources
    Continuously improve
  Principle analysis
    Aggregate score
    Graphing foci scores
  Collaborative discussion
3.A Appendix: Example—DIAf corporate self-review survey
4.0 Strategic evaluation of a project
  Objectives
  Deliverables
  Strategic plans and alignment
  Evaluation measures
  Focus areas
    Focusing on development or improvement
    Designed and implemented with systemic thinking
    Enables individual leadership
    Attentive to organisational culture while transforming capacity
    Managed stakeholders through effective listening and responsiveness
    Effectively collated, used and reported data
    Maintained direction and scope throughout the project
    Innovatively and effectively aligned resources
    Building a sustainable future including continuous improvement
  Scoring
    Undeveloped
    Developing
    Functioning
    Strategic
    Embedded
4.A Appendix: Example—strategic project evaluation
5.0 Evaluation of an ongoing program, with a focus on knowledge and skills transfer
  Level 1—evaluation of motivation and reaction
  Level 2—evaluation of learning content
  Level 3—evaluation of performance and behaviour
  Level 4—evaluation of impact and results
A.0 Appendix: Level 1 sample survey questions
  A.1 Level 1—About the training needs analysis
  A.2 Level 1—About the training
  A.3 Level 1—About the participant
  A.4 Level 1—Individual learning needs
  A.5 Level 1—Business needs
  A.6 Level 1—Core skill needs
B.0 Appendix: Level 2 sample survey questions
  B.1 Level 2—Participant reaction and learning outcomes introduction
  B.2 Level 2—General feedback
  B.3 Level 2—Training methods and materials
  B.4 Level 2—Trainer(s)
  B.5 Level 2—Tests and qualifications
  B.6 Level 2—Progress to other learning
  B.7 Level 2—Pre-training activities and instructions
  B.8 Level 2—Facilities, courses and resources
  B.9 Level 2—Facilities and administration
C.0 Appendix: Level 3 sample survey questions
  C.1 Level 3—Job performance impact
  C.2 Level 3—Relevance of the training
  C.3 Level 3—Application of learning
  C.4 Level 3—Post-training skills observation
  C.5 Level 3—Core skills improvement
  C.6 Level 3—Job-specific skills evaluation
  C.7 Level 3—Training objectives
D.0 Appendix: Level 4 sample survey questions
  D.1 Level 4—Business impact
  D.2 Level 4—Test results
  D.3 Level 4—Learning gain
  D.4 Level 4—Skills gain
  D.5 Level 4—Changes to business performance
  D.6 Level 4—Business performance/impact measures (fixed and open)
  D.7 Level 4—Financial impact (including ROTI)
E.0 Appendix: Additional survey questions
  E.1 Open question modifiers and extensions
  E.2 Knowledge and skills tests (question examples)
F.0 Frequently asked questions (FAQs)
  F.1 How do I get a user account to create surveys in Checkbox
G.0 Glossary
  G.1 Definitions
  G.2 Formatting convention
  G.3 Colour coding
  G.4 Hyperlinks
  G.5 Acronyms
1.0 Introduction

Evaluation is defined as the act of appraising to assess value.[1]

Evaluation identifies and promotes best practice in planning and implementation methodology, and ensures the delivery and creation of Workforce Development products and services is performed at the most efficient and productive level.

Effective evaluation of projects and programs ensures consistent, high-quality services and products with strong alignment to strategic goals and objectives for all teams’ work across the Workforce Development directorate.

Evaluation also assists in the expansion and roll-over of projects to create sustainable programs by providing a clear and focussed relationship between deliverables, objectives and strategic goals as targeted outcomes and responsibilities in the management process.

Evaluation uses descriptive foci to help the evaluator adopt a specific frame of mind and evaluate the project or program from a variety of strategic viewpoints. Project and program managers are most likely to evaluate their own projects and programs.

The evaluative foci have been derived from the Department’s Improvement and Accountability framework (DIAf).[2]

Some of the foci this guide will assist you to evaluate against include your project or program’s:
  • attention to organisational culture
  • effective collation, use and reporting of data
  • effective use of human, financial and physical resources
  • embedding of leadership and responsibility
  • focus on development and continuous improvement
  • maintenance of scope and direction
  • stakeholder management
  • sustainability over the medium to long term
  • systemic thinking and integration

The three evaluative processes described in this guide are designed to assist officers and managers to strategically prepare and evaluate projects and programs:
  • Evaluative review of a workgroup or project team
  • Strategic evaluation of a project
  • Evaluation of an ongoing program, with a focus on knowledge and skills transfer

[1] http://dictionary.reference.com/browse/evaluation?s=t
[2] www.decd.sa.gov.au/quality/pages/quality/26420/
2.0 Background

Prior to 2012 the Workforce Development directorate did not have a consistent framework for evaluating the Directorate’s projects and programs.

Individual teams within the Directorate use a number of different, and differently applied, evaluation tools and historical or mandatory evaluative processes, without scheduled or regular continuous-improvement analysis of their evaluative processes, effectiveness and outcomes.

The Department’s enterprise Registered Training Organisation (RTO), Organisation and Professional Development Services (OPDS), uses:
  • national VET surveys for participants and line managers of participants
  • AQTF compliance requirements

The Directorate’s Quality Leadership programs use:

The Directorate’s Teacher Quality projects and programs use:

The Directorate’s Performance and Development projects and programs use:

The Directorate’s Projects and Innovations projects and programs use:
3.0 Evaluative review of a workgroup or project team

An evaluative review of a workgroup or project team is recommended once per year; however, an evaluation may be prudent following any significant change or development within the team; for example, new leadership, a new location, a change in the organisational strategic plan, or prior to the planning phase of a significant project.

The evaluation is designed as a self-review to identify and promote collaborative discussion on opportunities for continuous improvement and increased effectiveness that aim to improve productivity, efficiency and capacity across the team. Analysis of the survey responses provides specific direction to improve team engagement, satisfaction, focus, responsiveness and resource use.

The evaluation encompasses the Department’s nine DIAf improvement principles.

DIAf corporate self-review snapshot survey

This survey is stored within the Department’s Checkbox survey tool at the following URL < http://www.decssurveys.sa.edu.au/online/selfreveiw.aspx > and is accessible through the internet to all Directorate officers.

The survey is password protected with the current password “opds”.

The survey details key criteria to consider as a lead-in to self-review of any team’s performance and operations.

Team members should complete the survey individually and be encouraged to be honest, noting that all responses are anonymous.[3]

After all team members have been given sufficient time to comfortably respond,[4] the responses should be collated to provide data for collaborative discussion.

All of the survey’s focus questions are answered on a rating scale from Strongly Disagree to Strongly Agree. This provides an opportunity within the principle analysis to index the team’s aggregate score and accurately report on the team’s improvement trend and recommended foci for collaborative discussion.[5]

There are no text or long-answer questions in the survey; these types of comments should be raised in the collaborative discussion following the survey’s completion and principle analysis.[6]

After the survey is closed it is important to allow and allocate adequate time for a collaborative discussion with all team members present, at a team meeting or other dedicated meeting time.

At the collaborative discussion all members of the team are encouraged to discuss the Focus on core business principle and two other focus principles identified through the principle response analysis as the greatest opportunities to improve effectiveness within the team.

[3] Preview and analysis of the survey responses should not begin until after a sufficient number of responses have been received to ensure anonymity for the respondents.
[4] As the survey is accessible over the internet, some team members may feel more comfortable completing the survey out-of-office, after hours or at home.
[5] Indices should not be compared between teams; however, all staff within the Directorate can be anonymously surveyed at the same time to achieve a current Directorate-level index.
[6] Analysis of survey responses should be carried out by an independent and skilled analytical or evaluative Directorate officer; for example, the Directorate’s Data Management and Analysis Officer or Senior Consultant, Evaluation and Continuous Improvement.
The collaborative discussion should encompass:
  a) evidence used by team members to score the focus questions
  b) the next steps for improvement and who will be responsible for managing implementation[7]
  c) ongoing self-review processes, including evaluation of the implementation of the continuous improvement project
  d) the data and other evidence that will be gathered to monitor and evaluate improvement over time[8]

Improvement principles

The nine DIAf improvement principles are:
  1. Focus on core business
  2. Think systemically
  3. Share leadership
  4. Attend to culture
  5. Listen and respond
  6. Make data count
  7. Set directions
  8. Target resources
  9. Continuously improve

Improvement principle criteria

For each improvement principle there are four criteria[9] that can be scored on a rating scale from Strongly Agree to Strongly Disagree by each individual team member.

Focus on core business
  1.1 Team goals are clear, known and used to drive decisions
  1.2 The team has high service and delivery standards that result in high quality outcomes
  1.3 Team members are committed to the team’s goals
  1.4 The team’s plans, processes and practices work effectively to support team members to achieve goals

[7] Implementation of continuous improvement should be managed using the project management framework laid out in the Directorate’s Information Management System (IMS) and the PDSA (plan, do, study, act) continuous improvement cycle.
[8] See the chapter in this Evaluation Guide on strategically evaluating a project.
[9] Note that principle criteria contain more than one statement, which may skew responses where a respondent agrees with one part of a criterion and disagrees with another. If this issue is raised, respondents should score the criterion with their lowest desired response rating to ensure the improvement principle has the best opportunity to be discussed for improvement by the team.
Think systemically
  2.1 Political, system and contextual issues are identified and strategically addressed in plans and practices
  2.2 Effective research and development processes enable team members to improve operations, outcomes and service delivery
  2.3 Internal management processes are routinely reviewed to continuously improve operations
  2.4 Effective partnerships exist with key stakeholders, other business units and professional groups to support the achievement of unit goals

Share leadership
  3.1 Leaders provide clear direction and supportive leadership, and take an effective stance appropriate to the individual/situation to achieve agreed outcomes
  3.2 Leadership is shared, with strategies and processes to build the leadership capacity of individuals and the leadership density of the business unit
  3.3 Leaders support effective business unit management through a focus on professional learning for team members and themselves
  3.4 Leaders ensure change is managed positively and successfully, with workload balance and direction sustained

Attend to culture
  4.1 A positive workplace culture supports team members to work with enthusiasm, commitment and energy, and the business unit to achieve success
  4.2 Team members’ roles and responsibilities are clearly known and professional team interactions optimise success
  4.3 Professional development and performance management processes provide team members with recognition, support and feedback to develop expertise
  4.4 Culture and morale building processes effectively support positive team member interactions and address issues and concerns

Listen and respond
  5.1 Quality partnerships are deliberately developed with key clients and stakeholders to achieve outcomes
  5.2 Communication processes provide information to and from clients and stakeholders to improve service delivery and outcomes
  5.3 Decision-making structures are effective, with high levels of team member and stakeholder input, support for, and engagement in decisions
  5.4 Commitment to quality service delivery and responsiveness by all team members provides high levels of satisfaction and positive client relationships
Make data count
  6.1 Effective data management processes are in place to collect, store and access reliable data
  6.2 Multiple measures of data are analysed and used to inform improvement directions, evaluate programs and report on outcomes
  6.3 Data is used to identify root causes and variation to target improvement efforts while monitoring the effectiveness of implementation strategies
  6.4 Data creates knowledge and learning for the team, organisation and system to inform decisions on development and innovation

Set directions
  7.1 An explicitly stated vision, values and purpose, developed in collaboration with team members, drives business unit decisions, plans and directions
  7.2 Planning processes build team members’ capacity and expertise to achieve the vision and continuously improve outcomes
  7.3 Communication, monitoring and evaluation of planning processes occurs with high levels of team member involvement and ownership
  7.4 Strategic plans are integrated and enacted in daily operations to ensure strategic directions are achieved

Target resources
  8.1 Effective resource management systems identify, support and develop the team’s human, financial and physical resources
  8.2 Resources are targeted to achieve successful outcomes, with processes in place to review resource needs and effectiveness
  8.3 Assets and resources are acquired, organised and maintained to support performance
  8.4 Risk management processes ensure prudent financial management, regulatory compliance and safe workplace practices

Continuously improve
  9.1 Effective, known improvement processes support team members to ensure that goals are achieved and outcomes are continuously improved
  9.2 Rigorous, regular self-review processes occur, with team member involvement and engagement, to monitor outcomes, evaluate progress and inform future directions
  9.3 A commitment to continuous improvement is evidenced by routine policy review and development cycles, and effective document and records management
  9.4 The team’s members develop integrated, sustainable and systemic programs, projects and products to achieve business unit goals and respond to the emerging needs of stakeholders
Principle analysis

The principle analysis of the survey’s responses should be calculated per improvement principle (focus) and include:
  • an aggregate score of the responses to the criteria for each improvement principle (focus)
  • the total number of responses and total number of possible responses (team members)
  • a graph displaying the distribution of relative scores across the focus and its four criteria independently

Aggregate score

The aggregate score may:
  • remove outliers—where the aggregate of one rating score is significantly inconsistent with the median or modal rating score (eg out of 25 responses: one response Strongly Disagrees while 24 responses Agree or Strongly Agree; the Strongly Disagree response may be removed from the analysis)[10]
  • be reported as a definitive modal score[11]—if the modal score aggregate exceeds the sum aggregate of all other rating scores (eg out of 25 responses: 16 responses are Agree while 9 responses are other ratings; the aggregate score for this focus can be reported as Agree)
  • be reported as a definitive median score[12]—if the aggregate scores per rating appear as a balanced bell curve (eg out of 25 responses: 1 Strongly Disagree, 3 Disagree, 6 Neither, 9 Agree, 6 Strongly Agree; the median score for this focus can be reported as Agree)
  • be reported as a mean or average index—with each rating given a linear aggregate multiplier (eg using the bell curve distribution above):

Rating        Strongly Disagree  Disagree  Neither  Agree  Strongly Agree  TOTAL
(Multiplier)  (×1)               (×2)      (×3)     (×4)   (×5)
Score         1                  3         6        9      6
Aggregate     1                  6         18       36     30              91

Possible maximum: 125
INDEX = 91 / 25 = 3.64 → Agree (72.8% of the possible maximum)

The aggregate score for each improvement principle focus should always be calculated and reported consistently.

[10] If there is a significant cause behind the response removed from the analysis, this is expected to be raised during the collaborative discussion.
[11] The modal score is the distinct rating with the highest aggregate.
[12] The median score is the exact middle rating when all scores are accurately ordered from lowest to highest.
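For illustration only, the mean index above can be reproduced in a few lines of Python. This is a minimal sketch: the rating labels and multipliers come from the table above, while the function and variable names are invented for this example and are not part of any Departmental tool.

    # Illustrative sketch of the mean/index aggregate score described above.
    # Rating labels and multipliers follow the guide; all names are hypothetical.
    RATINGS = ["Strongly Disagree", "Disagree", "Neither", "Agree", "Strongly Agree"]
    MULTIPLIERS = {label: i + 1 for i, label in enumerate(RATINGS)}  # x1 .. x5

    def aggregate_index(counts):
        """Return (index, percentage, nearest rating label) for a dict of
        rating label -> number of responses, eg {"Agree": 9, ...}."""
        responses = sum(counts.values())
        aggregate = sum(MULTIPLIERS[label] * n for label, n in counts.items())
        possible_maximum = responses * 5
        index = aggregate / responses                  # mean score, 1.0 to 5.0
        percentage = 100 * aggregate / possible_maximum
        nearest = RATINGS[round(index) - 1]            # label closest to the index
        return index, percentage, nearest

    # The bell-curve example from the table: 25 responses in total.
    example = {"Strongly Disagree": 1, "Disagree": 3, "Neither": 6,
               "Agree": 9, "Strongly Agree": 6}
    print(aggregate_index(example))  # -> (3.64, 72.8, 'Agree')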
Graphing foci scores

The distributive graph should be chosen and produced to provide a highly visual discussion prompt for the collaborative discussion. [The example graphs from the original document are not reproduced in this transcript.]
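One common choice for this kind of prompt is a stacked horizontal bar chart showing the spread of ratings for each of a focus’s four criteria. The sketch below is one possible rendering, assuming matplotlib is available; the criterion labels and response counts are invented sample data, not a chart format prescribed by this guide.

    # Sketch of a rating-distribution chart for one focus and its four
    # criteria. Labels and counts are invented for illustration.
    import matplotlib.pyplot as plt

    ratings = ["Strongly Disagree", "Disagree", "Neither", "Agree", "Strongly Agree"]
    criteria = ["1.1 Goals", "1.2 Standards", "1.3 Commitment", "1.4 Plans"]
    # counts[criterion][rating] — one row of response counts per criterion
    counts = [
        [1, 3, 6, 9, 6],
        [0, 2, 8, 10, 5],
        [2, 4, 5, 9, 5],
        [1, 1, 7, 11, 5],
    ]

    fig, ax = plt.subplots(figsize=(8, 3))
    left = [0] * len(criteria)                      # running offset per bar
    for i, rating in enumerate(ratings):
        values = [row[i] for row in counts]
        ax.barh(criteria, values, left=left, label=rating)
        left = [l + v for l, v in zip(left, values)]
    ax.set_xlabel("Number of responses")
    ax.legend(loc="lower right", fontsize="small")
    ax.set_title("Focus on core business — rating distribution by criterion")
    plt.tight_layout()
    plt.show()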
Collaborative discussion

The collaborative discussion is the most important part of the evaluation.

The discussion enables team members to focus on specific areas for improvement, identify barriers and work as a team to trouble-shoot solutions to issues that impact the self-identified effectiveness of the team.

Only three foci should be discussed in this meeting: Focus on core business plus two other foci recommended by the independent principle analyst as areas which have a very low aggregate score or where the aggregate score does not appear representative of a clear majority (eg out of 25 responses: 12 Strongly Disagree, 12 Agree, 1 Strongly Agree).

The collaborative discussion should occur as a dedicated meeting with no other agenda items, in a closed meeting room set up for brainstorming (whiteboards, butcher’s paper, post-it notes), and with all team members present. If possible the discussion should be led by an independent facilitator.

The meeting must allow for anonymous reporting of causative factors leading to the opportunity for improvement (eg all team members may be asked to write anonymously a possible cause of the low criterion or focus score on a post-it note and pass it up to the independent facilitator, who will ensure no individuals are identified while the causes are being discussed).

The facilitator should use techniques like the 5 Whys[13] to determine root causes of the score and to provoke collaborative discussion on potential resolutions.

After resolutions have been brainstormed, the facilitator should lead the team in identifying the resolution project to be recommended.

The project proposal should be developed through the project management framework, with the team brainstorming components including:
  • potential risks
  • implementation schedule
  • stakeholders to be engaged
  • project team and responsibilities
  • objectives and deliverables to be achieved
  • evaluation process to ensure completion and measurable, reportable achievement[14]

The improvement project manager or team should report the improvement project’s ongoing status back to the whole team as an agenda item at every subsequent team meeting until the project is closed.

[13] To identify the root cause of an issue, ask why the issue occurred, then ask why that cause occurred, then ask why the cause of the cause occurred, to a total of five whys.
[14] See the chapter in this Evaluation Guide on strategically evaluating a project.
3.A Appendix: Example—DIAf corporate self-review survey

[The example survey screenshots from the original document are not reproduced in this transcript.]
4.0 Strategic evaluation of a project

A project is a temporary measure with distinct start and end dates, comprising a time constraint to meet specific goals and objectives and produce deliverables within a defined budget and limited resources, in order to achieve beneficial, value-adding change.[15]

A program does not have a defined end date and usually contains multiple repetitious project instances, defined, for example, per annum.[16]

Projects and programs should be evaluated by the project or program manager, or a specified third party, during the:
  • project concept or proposal phase
  • project implementation, as required
  • post-project review or closure phase

Strategic evaluation of a project or program enables evaluation against its:
  • Objectives
  • Deliverables
  • Strategic alignment
  • Evaluation measures

Project objectives align to strategic plans and goals, and are achieved and evidenced through the production of project deliverables throughout the implementation of the project.

The project’s deliverables are evaluated to ensure they have effectively achieved the project’s objectives and strategic alignment through evidence of achievement.

This evaluation framework assists the project or program manager to evaluate their project’s deliverables, objectives, strategic alignment and evaluative processes against foci developed from the Department’s Improvement and Accountability framework (DIAf).

All Workforce Development projects and programs, including courses and events, should be recorded in the Directorate’s information management system (IMS).
< decsgla01user3GroupsHRWorkforce DevelopmentIMSinterfacesProjectManagement.mdb >

This evaluation framework is planned to be integrated into the IMS Work Program and Project Management interface.

[15] For example, the 2012 South Australian Public Teaching Awards is a project.
[16] For example, the South Australian Public Teaching Awards is a program.
Objectives

Objectives to be achieved through the project should be recorded in the IMS.

An objective is a key benefit achievable by the project that aims to meet a strategic direction, plan or vision for the Department.

Objectives are usually described in the format: To..., in a way that..., so that...

Objectives have no physical substance; they are not created in a visible way.[17]

  • Undeveloped: objectives resemble a brainstorming list, are vague and have no indication of change or outcomes.
    Example: To improve image
  • Developing: objectives are stated broadly and infer or imply non-specific outcomes or changes in practice.
    Example: To improve the image of the Department
  • Functioning: objectives are clear about the outcomes to be targeted or improved but the change is stated in broad, non-specific terms.
    Example: To improve the image teachers have of the Department recognising them as part of the workforce
  • Strategic: objectives are linked to a strategic objective in a plan and expressed in terms of the outcomes to be targeted or improved.
    Example: To improve the image teachers have of the Department recognising them as an important part of the education and care workforce
  • Embedded: objectives are STRATEGIC, based on best practice, aligned to increase effectiveness and consistency, and embedded with a specific objective in a strategic plan.
    Example: To improve the image teachers have of the Department recognising them as the part of the education workforce that provides excellence in education and care

Please update the objectives of your project in the IMS before scoring.

[17] Compare with deliverables.
Deliverables

Deliverables to be created during the project’s implementation should be recorded in the IMS.

A deliverable is a key product that will be created by the project, providing physical evidence of the project’s activity.[18]

Deliverables include key performance indicators (KPIs) as statistics and are used as proof the project is meeting its objectives.

  • Undeveloped: no deliverables are stated and implementation does not produce any physical evidence.
    Example: None
  • Developing: deliverables are common or historically repeated with no significant improvement evident.
    Example: Hold an awards ceremony for 2012
  • Functioning: deliverables include data related to the project but lack specificity against changes or improvements to be achieved.
    Example: Hold a ceremony for 100 award winners in 2012
  • Strategic: deliverables are assigned to specific team members, are demonstrable (including and supported by data), and are expressed as SMART (specific, measurable, achievable, relevant, timely).
    Example: The Workforce Recognition Officer will event manage the 2012 SA Public Teaching Awards ceremony for up to 100 teachers recognised and nominated in the last 12 months for outstanding teaching practice
  • Embedded: deliverables are STRATEGIC, and include agreements for continuous support and improvement, with deliverables for post-project analysis and review.
    Example: The Workforce Recognition Officer will event manage the 2012 SA Public Teaching Awards ceremony promoting excellence in education and care for up to 100 teachers recognised and nominated in the last 12 months for outstanding teaching practice, using a documented, up-to-date event management process including a survey of participants on the event’s delivery

Please update the planned and achieved deliverables of your project in the IMS before scoring.

[18] Compare with objectives.
Strategic plans and alignment

A link from each project objective to a specific strategic plan’s objective or goal should be recorded against each project objective in the IMS.

Linking objectives to a strategic direction, plan or vision is an important analysis of the value of your project.

  • Undeveloped: there are a high number of competing strategies which are very broad, are management or compliance requirements, or are determined by the Department.
  • Developing: there are a range of broad strategy areas, based on complex targets that are a mix of management and improvement issues, already described by the Department.
  • Functioning: there are a number of strategies that are mainly improvements or developments, identified from strategic planning, Departmental or national objectives.
  • Strategic: there are a number of strategies that clearly define continuous improvement integrated with Departmental or national objectives, determined through data analysis, agreed to by staff and described against a particular field of expertise.
  • Embedded: there are a number of strategies that all staff agree on from data analysis and stakeholder consultation, that are key objectives to focus on for continuous improvement, and that, expressed against particular fields of expertise, will concurrently and strategically achieve Departmental or national objectives.

Please update your project’s objective links to strategic plans in the IMS before scoring.
Evaluation measures

Projects managed through the IMS have measures enabling evaluation to be described and reported through your project’s objectives, deliverables and strategic links.

Evaluation includes your project’s ongoing status updates as well as the final project review closure report.

  • Undeveloped: data may be collated but is not used to monitor progress, support status reporting or evaluate the effectiveness of the project.
    Example: # participants
  • Developing: measures of progress are merely numbers or deliverables ticked as completed or achieved, with minimal or no evaluation of the relative change or improvement achieved.
    Examples: gender distribution of # participants; survey of participants’ interest
  • Functioning: data is used throughout the project to monitor and report on the progress and achievement of targets, deliverables and objectives against strategic plans.
    Example: rate of registration of interest
  • Strategic: specific data is regularly collated and used to monitor the progress of the continuous improvement changes throughout the project’s lifecycle, with regular review of the completion of deliverables to support the achievement of objectives against strategic plans by all project team members.
    Example: regional and index-of-disadvantage distributions of rate of registration and achievement vs student outcomes (NAPLAN)
  • Embedded: evaluation is STRATEGIC, with the analysis of multiple measures of improvement and the regular collaboration and input of all project team members to evaluate the project’s progress and to refine and redefine the objectives and next steps towards the most effective deliverables to achieve the strategic outcomes of the project.
    Example: post-event survey and data retrieval quantitatively analysing changes and improvement in education and care activity

Please start your post-project review in the IMS before scoring, including:
  • what worked
  • what didn’t work
  • lessons learned
  • your recommendations for future project managers
Focus areas

The following focus areas may be used to categorise the project’s objectives, deliverables, strategic alignment and evaluation measures for projects where the implementation of the evidentiary deliverables to achieve the project objectives aligned with strategic plans is significantly large or complex.

Focusing on development or improvement

The project produced evidence of continuous improvement in achievement/outcomes for stakeholders, with challenging targets for ongoing improvement and quality practice, while creating an ethos with high expectations and consistent understanding to drive, develop and improve policy, practice and performance.

Designed and implemented with systemic thinking

The project is integrated with wider systems and is committed to improving these systems to create an aligned and effective integrated system that supports the Department’s continual improvement, with targets for achievement appropriate to the context of community needs and aspirations.

Enables individual leadership

The project develops leadership within the project’s team members and stakeholders, enabling them to exhibit principled and visible responsibility that is shared and fostered throughout the project through effective project management expertise and capacity at all levels to achieve the project’s objectives, deliverables and strategic goals.

Attentive to organisational culture while transforming capacity

The project intentionally creates a culture that involves individuals and groups in transforming the capacity of the system through a positive culture with high levels of team and stakeholder satisfaction, morale and support for individuals to grow and improve their performance.

Managed stakeholders through effective listening and responsiveness

The project enables team members and stakeholders to have an active voice and be responsive to current and future needs of the Department, using a client-focussed approach to communication, risk management, prioritisation, and organisation of the project’s components and responsibilities.

Effectively collated, used and reported data

The project creates or gathers the necessary information and knowledge required from data sources, including stakeholder needs, to strategically evaluate and improve outcomes through an informed, structured and organised approach, with a clear evidence base for decision-making and recommendations as well as clear drivers for continuous improvement opportunities.

Maintained direction and scope throughout the project

The project planning process, documentation and stakeholder communication explicitly state values, vision and purpose within the project’s objectives, deliverables and strategic links, and all documentation is consistently up-to-date and available to all project team members.
Innovatively and effectively aligned resources

The project innovatively targets resources to most effectively align all physical, human and financial resources with project needs through sustainable resource management to achieve successful and efficient outcomes.

Building a sustainable future including continuous improvement

The project achieves improvement for the Department through a structured continuous improvement cycle with regular evaluation and review of existing related processes and activity.

Scoring

Scoring of your project’s objectives, deliverables, alignment to strategic plans, and evaluation measures is recorded as a single rating across each component, either as a whole or categorised into focus areas:
  1. Undeveloped
  2. Developing
  3. Functioning
  4. Strategic
  5. Embedded

Undeveloped

An undeveloped rating refers to objectives, deliverables, strategic alignment and evaluation measures that are undocumented, unrecorded and unable to be reviewed by an appropriate third party.

Developing

A developing rating refers to objectives, deliverables, strategic alignment and evaluation measures that are drafted, or only documented on the initial project plan, concept or proposal.

Functioning

A functioning rating refers to objectives, deliverables, strategic alignment and evaluation measures that have been or are intended to be ticked off as achieved or not achieved; however, the project’s objectives or deliverables are not linked, related or aligned to strategic plans, goals or objectives.

Strategic

A strategic rating refers to objectives, deliverables, strategic alignment and evaluation measures where the project’s objectives and deliverables are linked, related, aligned and described against strategic plans, goals or objectives.

Embedded

An embedded rating refers to objectives, deliverables, strategic alignment and evaluation measures where the project only includes objectives that are linked to strategic objectives; with all project objectives evidenced by deliverables (with no non-strategic, orphan objectives or deliverables); and with the project plan, implementation, delivery and closure evaluation focused on strategic achievement.
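For illustration, the single rating per component could be recorded as a simple scorecard. This sketch uses invented class and field names rather than the IMS data model, and the averaging in summary() is an added convenience for discussion, not a step this guide prescribes.

    # Illustrative sketch of recording the strategic evaluation ratings.
    # The five ratings come from the guide; the classes and field names
    # are hypothetical, not the IMS data model.
    from dataclasses import dataclass
    from enum import IntEnum

    class Rating(IntEnum):
        UNDEVELOPED = 1
        DEVELOPING = 2
        FUNCTIONING = 3
        STRATEGIC = 4
        EMBEDDED = 5

    @dataclass
    class ProjectScorecard:
        objectives: Rating
        deliverables: Rating
        strategic_alignment: Rating
        evaluation_measures: Rating

        def summary(self):
            """Mean rating across the four components, as a rough overall view."""
            values = [self.objectives, self.deliverables,
                      self.strategic_alignment, self.evaluation_measures]
            return sum(values) / len(values)

    card = ProjectScorecard(Rating.STRATEGIC, Rating.FUNCTIONING,
                            Rating.STRATEGIC, Rating.DEVELOPING)
    print(card.summary())  # -> 3.25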
4.A Appendix: Example—strategic project evaluation

[The example evaluation screenshots from the original document are not reproduced in this transcript.]
5.0 Evaluation of an ongoing program, with a focus on knowledge and skills transfer

This evaluation is designed to encompass both effective training (improvement of programs) and training effectiveness (implementation of skills to improve organisational effectiveness).

The Kirkpatrick[19] model will be used as the structural framework in the evaluation of Workforce Development training programs. This model has the following four levels:

  Level 1—evaluation of feedback from participants and stakeholders
  Level 2—evaluation of physical content, resources and deliverables
  Level 3—evaluation of application and change to organisational processes
  Level 4—evaluation of strategic outcomes and measurable benefits

Levels 1–2 provide instructive feedback to program deliverers, to update and improve their relationship with participants, and to project managers in the roll-over of project initiatives to ongoing programs for the future program manager.

Levels 3–4 produce a summation of strategic value and effectiveness in the implementation of a project and ongoing program delivery, reportable to senior management.

Some methods of evaluation include:
  • asking for self-evaluation from participants and facilitators
  • testing participants’ knowledge, understanding and decision-making skill
  • observing participants’ performance
  • examining organisational business results
  • comparing social media learning with traditional learning interventions
  • seeing how much productivity is lost or gained from the time required

Results from evaluations can be analysed in global categories to calculate the effectiveness of a type of training for the organisation; for example, all leadership program evaluations (see the sketch below).

Sample evaluation survey questions are listed in Appendices A–E.

[19] www.kirkpatrickpartners.com
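As a purely illustrative sketch of that roll-up, grouping per-program evaluation indices by category might look like the following; the records, category names and index values are invented sample data, not Directorate figures.

    # Illustrative sketch only: rolling up evaluation indices into global
    # categories (eg "all leadership program evaluations").
    from collections import defaultdict

    # (program category, evaluation index on the 1-5 scale used earlier)
    records = [
        ("leadership", 3.6), ("leadership", 4.1), ("leadership", 3.9),
        ("core skills", 3.2), ("core skills", 3.8),
    ]

    by_category = defaultdict(list)
    for category, index in records:
        by_category[category].append(index)

    for category, indices in by_category.items():
        mean = sum(indices) / len(indices)
        print(f"{category}: {mean:.2f} across {len(indices)} evaluations")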
Level 1—evaluation of motivation and reaction

The first level of evaluation is used to determine how well your participants engage with the registration and learning process.

It is used to identify why participants have enrolled in the program, which workplace needs they expect the program to address to improve their working environment, and what they expect to learn from participating.

The motivation and reaction evaluation shows how appealing, relevant and effective the learning content and delivery are to individual participants, measuring how well the engagement processes and the program's reputation are working.

Even if all the other levels are effectively covered by the program, missing level 1 can leave participants unable to see a purpose in the learning, and they may disengage from the program.

Level 1 data on participants' engagement with the program's structure and marketing can be collected and analysed through:

- reaction sheets
- surveys
- focus groups
- interviews

Participants may self-assess their impression of the program pre-, during and post-participation against any improvement rating scale with the following foci (a sketch of summarising such ratings follows this list):

- relevance
- specificity
- practicality
- accessibility
- social connection
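As a simple illustration of how reaction-sheet ratings against these foci might be collated, the sketch below averages hypothetical 1–5 ratings per focus and flags any focus falling below an assumed acceptability threshold; the threshold, focus names and responses are all invented for illustration.

```python
from statistics import mean

# Hypothetical level 1 reaction-sheet ratings (1-5) per focus.
responses = [
    {"relevance": 5, "specificity": 4, "practicality": 4, "accessibility": 3, "social": 4},
    {"relevance": 4, "specificity": 3, "practicality": 4, "accessibility": 2, "social": 5},
    {"relevance": 5, "specificity": 4, "practicality": 5, "accessibility": 3, "social": 4},
]

THRESHOLD = 3.5  # assumed minimum acceptable mean rating

# Average each focus across all respondents and flag weak areas.
for focus in responses[0]:
    avg = mean(r[focus] for r in responses)
    flag = "  <-- review" if avg < THRESHOLD else ""
    print(f"{focus:<14} mean {avg:.2f}{flag}")
```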
Level 2—evaluation of learning content

The second level of evaluation is used to improve the quality and effectiveness of the learning content, structure and delivery.

It is used to update, refine and ensure best practice in the knowledge, skills and resources provided to and developed by participants in the facilitator's program.

The learning content evaluation shows what participants learned and the extent to which they gained knowledge and skills, including how effectively the delivery transferred the content, knowledge and skills to them.

Even if all the other levels are effectively covered by the program, missing level 2 can cause participants to devalue the learning and consider the program out-of-date and irrelevant.

Level 2 data on content design and development, delivery methods and achievement can be collected and analysed through the methods below; a sketch comparing pre- and post-training results follows the list.

- pre- and post-training test results
- written knowledge tests
- role-play and simulation
- activities and games
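One common way to read pre- and post-training test results is as a learning gain per participant. The sketch below computes the raw gain and a normalised gain (improvement as a share of the headroom the participant had available); the maximum score and the participant results are invented for illustration.

```python
from statistics import mean

MAX_SCORE = 20  # assumed maximum test score

# Hypothetical paired (pre, post) test scores per participant.
results = {
    "participant_01": (12, 17),
    "participant_02": (8, 15),
    "participant_03": (15, 18),
}

gains, norm_gains = [], []
for name, (pre, post) in results.items():
    gain = post - pre
    # Normalised gain: improvement as a fraction of the possible headroom.
    norm = gain / (MAX_SCORE - pre) if pre < MAX_SCORE else 0.0
    gains.append(gain)
    norm_gains.append(norm)
    print(f"{name}: pre {pre}, post {post}, gain {gain}, normalised {norm:.2f}")

print(f"mean gain {mean(gains):.1f}, mean normalised gain {mean(norm_gains):.2f}")
```

Normalising by headroom stops participants who started with high pre-test scores from looking like they learned less than they did.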
Level 3—evaluation of performance and behaviour

The third level of evaluation is used to report on changes and increases in job-performance capacity resulting from participation in the program; it drives accountability, measures effectiveness and value to the organisation, and enables appropriate allocation of resources and support.

It is used to prove the value of participation by identifying changes in how employees behave and perform in their working environment, increasing the efficiency, productivity and quality of their work using newly learned skills and knowledge.

The performance and behaviour evaluation shows how capably participants take up and perform newly developed skills, and the degree to which learned skills and knowledge actually transfer to, and are used in, the working environment post-participation.

Even if all the other levels are effectively covered by the program, missing level 3 can cause staff and management to devalue the program, considering the skills and knowledge learned through participation academic and ineffective in the real workplace.

Level 3 data on the transfer and implementation of learning from the training environment to the working environment can be collected and analysed through the methods below; a sketch of estimating a transfer rate from work reviews follows this page.

- surveys
- interviews
- focus groups
- observations
- work reviews

Participants may self-assess changes in their working environment following participation in the program against any improvement rating scale with the following foci:

- performance
- confidence
- contribution
- engagement
- retention
- customer satisfaction
- business success
- systems use
- drivers and strategic alignment
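A minimal sketch of estimating a transfer rate from work reviews, assuming each review simply records whether a targeted skill was observed in use on the job post-participation; the participants, skill names and observations are hypothetical.

```python
# Hypothetical work-review observations: for each participant, whether each
# targeted skill was observed in use on the job post-participation.
reviews = {
    "participant_01": {"delegation": True,  "coaching": True},
    "participant_02": {"delegation": False, "coaching": True},
    "participant_03": {"delegation": True,  "coaching": False},
}

# Count skill observations across all participants (True counts as 1).
observed = sum(used for skills in reviews.values() for used in skills.values())
total = sum(len(skills) for skills in reviews.values())

# Transfer rate: share of targeted skills actually seen in the workplace.
print(f"transfer rate: {observed}/{total} = {observed / total:.0%}")
```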
Level 4—evaluation of impact and results

The fourth level of evaluation is used to report on the tangible organisational outcomes achieved through implementing the skills developed by participation in the program.

It is used to demonstrate graphically the productive impact, outcomes and results of participation, showing organisational improvement.

The impact and results evaluation provides quantitative results and scoring of added organisational value, including reduced costs, improved quality, and increased production and efficiency indices, and it enables calculation of return on investment (ROI) from the program; a sketch of the ROI arithmetic appears at the end of this section.

Even if all the other levels are effectively covered by the program, missing level 4 can reduce the program's priority and acknowledgement by C-level and executive management, as they may not easily identify a positive impact on the organisation's bottom line from supporting the program.

Level 4 data on the outcomes of implementing the learning against described targets can be collected and analysed through:

- metrics borrowed from other data systems, including human resources data
- surveys
- focus groups

Decision-makers prefer the results of level 4 evaluations, although not necessarily in dollars and cents. For example, a study of financial and information technology executives found that they consider both hard and soft returns when it comes to customer-centric technologies, but give more weight to non-financial (soft) metrics such as customer satisfaction and loyalty.[20]

These evaluation results enable real business results to be connected to training programs, including:

- operational efficiency
- compliance
- retention of top talent
- customer satisfaction
- sales volume

Analysis of these results also identifies gaps and needs in the organisation that can be filled through training and skills development.

[20] Hayes 2003
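The ROI calculation itself is simple arithmetic: net benefit as a percentage of cost. The sketch below assumes hypothetical monetised (hard) benefits and program costs over the same period; deciding how to monetise benefits is the difficult part and is not shown here.

```python
# Hypothetical monetised program benefits and costs over the same period.
benefits = {
    "reduced_rework": 42_000.0,  # hard: cost of errors avoided
    "time_saved":     18_500.0,  # hard: productivity hours recovered
}
costs = {
    "development":      25_000.0,
    "delivery":         15_000.0,
    "participant_time": 12_000.0,  # salary cost of time spent in training
}

total_benefit = sum(benefits.values())
total_cost = sum(costs.values())

# ROI expressed the usual way: net benefit as a percentage of cost.
roi = (total_benefit - total_cost) / total_cost * 100
print(f"benefit ${total_benefit:,.0f}, cost ${total_cost:,.0f}, ROI {roi:.0f}%")
```

Soft benefits such as customer satisfaction can be reported alongside this figure rather than forced into the dollar calculation, consistent with the executive preference noted above.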
This level of program evaluation may take a more balanced approach, using a scorecard[21] built from four perspectives:

- Financial—a measurement that shows a monetary return or impact, such as how the output from a process is improved (financial results can be either soft or hard)
- Customer—improving an area in which the organisation differentiates itself from competitors to attract, retain, and deepen relationships with its targeted customers
- Internal—achieving excellence by improving processes such as supply-chain management, production or support processes
- Innovation and learning—ensuring learning packages support a climate for organisational change, innovation, and the growth of individuals

Management prefers evaluative results from the above scorecard, but these may be supplemented with the more commonly provided levels 1–2 quantitative measures listed below; a sketch of the development-versus-delivery cost comparison follows the list.

- How many people (participants) will receive the training?
- How often can the training be repeated, and how often is it planned to be? Note that online training courses can be developed once and re-used at relatively low ongoing cost, whereas the financial, human and physical resources required for classroom delivery continue to increase with volume.
- How many total hours of learning have been successfully achieved?
- A comparison of the initial development cost (initial or single class) with the reducing cost of maintaining and delivering multiple instances of the same content
- Overall resource costs involved in preparing, running and evaluating programs, in both dollars ($) and hours
- Do participants and/or facilitators need to travel, be accommodated, or have their regular duties covered at the workplace while they are training?
- What proportion of programs were evaluated, and how stringent were those evaluations?
- What proportion of programs were evaluated by the program manager, versus by an independent third-party evaluator?
- How effectively were evaluation results distributed?
- How many programs or instances were planned using or referencing evaluation results from previous programs or relevant projects before implementation?
- What evidence is there of evaluation recommendations being implemented in the delivery of programs?

[21] Kaplan, Norton 2001
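To make the development-versus-delivery comparison concrete, the sketch below computes cost per participant for an online course (high one-off development, low marginal cost) against classroom delivery (low development, high per-cohort cost) as cohorts accumulate; every figure is invented for illustration, and real costings would come from the program budget.

```python
# Hypothetical cost models (all figures invented for illustration).
ONLINE_DEV = 60_000.0       # one-off development of the online course
ONLINE_PER_COHORT = 500.0   # hosting/support per cohort delivered
CLASS_DEV = 8_000.0         # one-off design of the classroom course
CLASS_PER_COHORT = 6_000.0  # venue, facilitator and materials per cohort
COHORT_SIZE = 20

def cost_per_participant(dev: float, per_cohort: float, cohorts: int) -> float:
    """Total cost to date spread across every participant trained so far."""
    return (dev + per_cohort * cohorts) / (COHORT_SIZE * cohorts)

# Show how the per-head cost of each mode changes as delivery is repeated.
for cohorts in (1, 5, 10, 20):
    online = cost_per_participant(ONLINE_DEV, ONLINE_PER_COHORT, cohorts)
    classroom = cost_per_participant(CLASS_DEV, CLASS_PER_COHORT, cohorts)
    print(f"{cohorts:>2} cohorts: online ${online:,.0f}/head, "
          f"classroom ${classroom:,.0f}/head")
```

With these assumed figures the online course starts far more expensive per head but overtakes classroom delivery at around ten cohorts, which is exactly the kind of crossover the level 4 cost comparison is meant to surface.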
A.0 Appendix: Level 1 sample survey questions

The first level of evaluation is used to determine how well your participants engage with the registration and learning process.[22]

A.1 Level 1—About the training needs analysis

1. This training needs analysis asks about [DELETE AS APPROPRIATE] the business need for training / your individual learning needs and preferences.
2. The assessment is made up of [ADD NUMBER] questions and should take about [ADD NUMBER] minutes to complete. Please answer the questions as fully as possible.
3. Please complete the assessment by [ADD DETAILS].
4. What will happen with your responses? [DELETE AS APPROPRIATE]
   - Your responses will be used to help decide on the most appropriate training needed.
   - Your responses will be anonymous and feedback will be reported for all respondents in generalised form only.
   - All responses will be analysed and reported to [ADD DETAILS]. Findings are due to be reported in [ADD DETAILS].
   - A copy of the assessment report will be available from [ADD LOCATION/PARTICIPANT] on [ADD DATE].
5. Any questions?
6. If you have any questions about this assessment, please contact [ADD DETAILS].

[22] The following questions have been adapted from < www.trainingcheck.com >.
A.2 Level 1—About the training

1. What was the name of the training/activity you attended?
2. Which training event did you attend?
3. Which sessions did you attend? Choose as many as apply.
   - All sessions
   - Session 1
   - Session 2
   - Session 3
4. Which modules did you attend?
   - All
   - Module 1 only
   - Module 2 only
   - Other, please specify
5. How many sessions did you attend in total?
6. Where was the training/activity held?
7. Which location did you attend?
8. What was the date of the training? (dd/mm/yyyy)
9. Which date did you attend?
10. Which times did you attend?
11. Which level did you attend?
12. What type of training/activity was it?
    - Classroom-based
    - Web-based/e-learning
    - On-the-job learning
    - Coaching/mentoring
    - Project work
    - Job shadowing
    - Other, please specify
13. What was/were the name(s) of the trainer(s)?
14. Who was the trainer?
A.3 Level 1—About the participant

1. Your name:
2. Your organisation:
3. Your job title/grade:
4. What is your job title/grade?
   - [FIRST JOB TITLE / GRADE / CLASSIFICATION]
   - [SECOND JOB TITLE / GRADE / CLASSIFICATION]
   - [THIRD JOB TITLE / GRADE / CLASSIFICATION]
   - Other, please specify
5. Which of these best describes your job role?
   - Senior manager
   - Manager
   - Supervisor
   - Employee
   - Other, please specify
6. On what basis are you employed?
   - Full-time
   - Part-time
   - Contract/agency
   - Other, please specify
7. Which department/section do you work in?
8. What is your manager's name?
9. Which shift pattern do you work?
10. How long (in months) have you been in your current role?
11. How long have you worked for the organisation?
    - Less than one year
    - 1–2 years
    - 3–4 years
    - 5–10 years
    - 11–15 years
    - 16–20 years
    - 21–25 years
    - More than 25 years
12. Your age:
    - Under 18
    - 18–25
    - 26–35
    - 36–45
    - 46–55
    - 56+
13. What is your date of birth? (dd/mm/yyyy)
14. Your gender:
    - Male
    - Female
15. Is English your first language?
    - Yes
    - No
16. What is your primary language?
    - English
    - [LANGUAGE 2]
    - [LANGUAGE 3]
    - [LANGUAGE 4]
    - Other, please specify
17. What is the highest level of education that you have completed?
    - Primary/elementary school
    - Secondary/high/vocational school
    - Further tertiary education/vocational college
    - University
    - Other, please specify
18. What is the highest grade or year of school you completed?
    - Never attended school or only attended kindergarten
    - Grades 1 through 7 (primary)
    - Grades 8 through 11 (high school)
    - Grade 12 (high school graduate)
    - Tertiary 1–3 years (TAFE or university)
    - Tertiary 4+ years (post-graduate)
    - Post-tertiary (post-graduate Honours, Masters, Doctorate)
    - Other, please specify
