Center for Excellence in Education: Return on Investment for Faculty Development Program

An independent case study by the ROI Institute showed a Return on Investment (ROI) of over 500% for the Center for Excellence in Education (CEE) Faculty Development Program. The Center for Excellence in Education is a strategic partnership between MaxKnowledge and the Imagine America Foundation. For more information, visit www.iaf-cee.org.

Transcript of "Center-for-Excellence-in-Education-Return-on-Investment-for-Faculty-Development-Program"

ROI of Faculty Development: A Case Study
Imagine America Foundation, 2010
1101 Connecticut Ave., N.W., Suite 901, Washington, DC 20036
www.imagine-america.org
Contact Information

Bob Martin, President, Imagine America Foundation
Phone: (202) 336-6758, Email: bobm@imagine-america.org

Jenny Faubert, Manager of Marketing and Project Development, Imagine America Foundation
Phone: (202) 336-6743, Email: jennyf@imagine-america.org

Imagine America Foundation, 1101 Connecticut Ave., N.W., Suite 901, Washington, DC 20036

Prepared by: ROI Institute, PO Box 380637, Birmingham, AL 35238-0637

Imagine America Foundation Board of Directors

Chairman: Jim Gessner, Duluth Business University, Duluth, MN
Vice-Chairman: Dennis Spisak, McGraw-Hill Career College Division, McGraw-Hill Higher Education, St. Louis, MO
Treasurer: Russell "Wicker" Freeman, Coyne American Institute, Chicago, IL
Former Chairman: Michael Platt, Ad Venture Interactive, Lenexa, KS
Christopher M. Cimino, NelNet, Inc., Portland, ME
Danny Finuf, Brown Mackie Colleges (EDMC), Pittsburgh, PA
Piper Jameson, Lincoln Educational Services, Whitehouse Station, NJ
Patricia Kapper, Forefront Education, Inc., Inverness, IL
Harris N. Miller, Career College Association, Washington, D.C.
Keith Zakarin, Duane Morris, LLP, San Diego, CA

About the Imagine America Foundation

The Imagine America Foundation (IAF), established in 1982, is a not-for-profit organization dedicated to providing scholarship, research and training support for the career college sector. Since its inception, the Foundation has provided over $40 million in scholarship and award support for graduating high school seniors, adult learners and U.S. military veterans attending career colleges nationwide through its award-winning Imagine America® programs. The Foundation also publishes vital research publications for the higher education sector, honors achievement in career education and offers faculty development training. For more information about the Imagine America Foundation, please visit www.imagine-america.org.

Copyright © 2009 Imagine America Foundation. Printed in the United States of America. All rights reserved. No part of this book may be reproduced or transmitted in any form by any means (electronic, mechanical, photocopying, recording, or otherwise) without the prior written consent of the Imagine America Foundation. For more information on getting permission for reprints and excerpts, please contact the Imagine America Foundation at (202) 336-6800.
ABOUT THE IMAGINE AMERICA FOUNDATION

Established in 1982, the Imagine America Foundation is a not-for-profit organization dedicated to serving the career college community by providing scholarships and awards, conducting sector research, offering faculty training, honoring achievement in career education, and supporting and promoting the benefits of career colleges to the general public.

The Foundation currently sponsors three scholarship and award programs, including Imagine America for graduating high school seniors; the Military Award Program (MAP) for active duty, reservist or honorably discharged U.S. military personnel; and the Adult Skills Education Program (ASEP) for adult learners. To date, through the Imagine America® programs, the Foundation has awarded over $40 million in scholarships and awards to students enrolling at career colleges and universities all over the United States and Puerto Rico.

Through its supporters, the Foundation sponsors additional programs such as the Imagine America Promise scholarship program for adult students. Since its inception, the Promise scholarship program has secured over $550,000 in grants, which have supported over 650 continuing career college students. The LDRSHIP Award recognizes exceptional military personnel who have decided to further their education by attending participating career colleges. LDRSHIP Award honorees receive up to $5,000 toward their education.

Educational research has been an integral component of the Foundation's activities since its establishment in 1982. In 2007, the Foundation created the 21st Century Workforce Fund. One of the goals of the Fund is to conduct research that elevates the public understanding of the vital role of career colleges and their students nationwide. The Foundation, through financial support from the 21st Century Workforce Fund, has initiated research studies focusing on the economic impact of career colleges, their role in meeting the nation's current skilled-worker shortage and other broad public policy issues facing the higher education sector.

Thousands of career college instructors have been and continue to be successfully trained through the Center for Excellence in Education (CEE), a unique lifecycle training process for faculty development. A case study conducted by the ROI Institute found that the CEE Faculty Development Program was a positive investment, with a return on investment of 517%.

For more information about the Imagine America Foundation, please visit www.imagine-america.org.
TABLE OF CONTENTS

EXECUTIVE SUMMARY
INTRODUCTION
PROGRAM BACKGROUND
    Evaluation Design
    Evaluation Framework
    Evaluation Planning
    Data Collection
    Data Analysis
EVALUATION RESULTS
    Level 1: Reaction & Planned Action
    Level 2: Learning
    Level 3: Application & Implementation
    Level 4: Impact
    Level 5: ROI and Intangible Benefits
    Program Suggestions and Recommendations
CONCLUSIONS
REFERENCES
APPENDIX A: PROGRAM DOCUMENTS
APPENDIX B: APPLICATION AND IMPLEMENTATION – SKILLS USED
APPENDIX C: ADDITIONAL PROGRAM BENEFITS
APPENDIX D: PROGRAM SUGGESTIONS AND RECOMMENDATIONS (FULL LIST)
ACKNOWLEDGEMENTS
EXECUTIVE SUMMARY

This research report was sponsored by the Imagine America Foundation to measure the impact of faculty development on student retention in addition to the corresponding Return on Investment (ROI). The research was conducted by the ROI Institute, Inc. with the Phillips ROI Methodology™ (Phillips, 2003) serving as the structure for designing, planning, and implementing the evaluation study. The study focused on measuring the impact of the comprehensive Faculty Development Program offered by the Center for Excellence in Education (CEE), a career college employee development and performance improvement initiative formed by the Imagine America Foundation and MaxKnowledge, Inc.

The evaluation study was conducted in collaboration with Universal Technical Institute, Inc. (UTI), a CEE client. To perform a careful analysis and isolate the effects of the CEE Faculty Development Program, the UTI campus in Mooresville, NC, was selected for the evaluation study.

The goals of the evaluation study were to:

• Identify participant satisfaction, planned action, and knowledge increase
• Validate the program's alignment to UTI's performance needs
• Determine the success with the implementation of skills acquired from the program, including identifying any enablers and barriers to application
• Understand the impact of the program on student retention and course retakes
• Compare the benefits of the program to the costs and determine the ROI
• Set the stage for future program evaluation studies within UTI and the career college sector as a whole

The study was initiated in early 2008 and completed mid-2009. The instructors from the Mooresville campus (program participants) and individuals identified in an overseeing/management role (leadership group) were the primary sources of data for the study. A detailed plan was structured to ensure applicable data from all sources was collected and analyzed. Additionally, specific tools – including comprehensive project and communication plans – were utilized to ensure the evaluation was successful.

Overall, the conclusions from the evaluation study reflect that the CEE Faculty Development Program was a positive investment for UTI's Mooresville campus. The findings indicate the participants were satisfied with the program and acquired knowledge and skills needed to enhance their job performance.

Identifying the success of applying the knowledge and skills learned from the program was a key component of the evaluation. In order for the program to impact the business, behaviors on the job needed to change and/or improve. 79% of the participants reported their teaching performance had improved as a result of their CEE Faculty Development Program participation. The majority of the leadership group also reported improvement in instructor teaching skills. These findings, along with the positive instructor observation results, indicated that the participants are applying the skills and have improved their job performance.

As a result of applying the skills on the job, there was an impact to the business. According to both the participant and leadership groups, the program contributed to student retention and course retake improvements. After isolating the effects of the program, converting the measures to monetary value, and identifying the fully loaded costs, the result was a positive ROI of 517% for the CEE Faculty Development Program. Additionally, there were notable intangible benefits of the program, including job satisfaction, faculty career development, and student satisfaction.
INTRODUCTION

Organizations implement training programs for their employees to ensure that they have the necessary skills to perform their jobs in the most effective and efficient way possible. An organization must be willing to invest resources in order to implement effective learning and career development programs. Organizations are typically willing to make such investments if they receive a return on their investment. Simply put, the benefits of implementing the training program should ideally outweigh the costs.

Additionally, there is a great demand for accountability within the field of human resource development. If an organization cannot demonstrate evidence that significant benefits are actually achieved as a result of the training program, the program may be deemed a waste of resources and discontinued. Conducting program evaluations is the solution to providing accountability for training programs – and, for that matter, any program – within organizations.

Most executives in the career college sector realize the importance and value of employee development beyond simply meeting compliance requirements. Intuitively, these executives know that effective employee development leads to job satisfaction, student satisfaction and increased student outcomes. However, there is usually no system or process in place to link training and development programs to organizational objectives and business results. And for the employees, there is no clear link between learning and performance. Thus, most institutions do not really know if the benefits of implementing a training program outweigh the costs of the program. This leads to an organizational mindset that employee development is a cost center, resulting in executives implementing the lowest-cost training options without considering the return on their investment.

The Imagine America Foundation's commitment to the enhancement of continuing education and training opportunities for career college employees goes back to its original charter established over 25 years ago. Recognizing the relationship between faculty performance and student outcomes, the Foundation engaged the ROI Institute, Inc. to conduct an impact study.

An impact study is best suited for comprehensive programs that meet specific criteria such as high visibility, links to business objectives, large audience offerings, and being of interest to management. The Faculty Development Program offered by the Center for Excellence in Education (CEE), the Foundation's employee development initiative with MaxKnowledge, was considered by the ROI Institute as an ideal candidate for a comprehensive evaluation study.

This case study report presents the evaluation results of the CEE Faculty Development Program at Universal Technical Institute's Mooresville campus. Key stakeholders had a strong interest in understanding the impact the program had on student retention and course retakes. Additionally, the results of the study set the stage for other program evaluations at UTI, as well as other studies across the career college sector.
PROGRAM BACKGROUND

Formed through a strategic partnership between the Imagine America Foundation and MaxKnowledge, Inc., the Center for Excellence in Education (CEE) provides turnkey employee development solutions for career colleges and schools. One such solution is the CEE Faculty Development Program, which has been developed in consultation with career college executives with the ultimate goal of increasing employee and organizational performance and enhancing student retention.

The program is based on the instructional competency standards identified by the Career College Association (CCA) and the Imagine America Foundation (IAF). The program also incorporates the standards developed by the National Center for Competency Testing (NCCT) and prepares the participants for NCCT's Certified Postsecondary Instructor (CPI) examination. The program combines online training with onsite transfer of training processes and activities to produce measurable results. All instructors (including part-time faculty) at participating institutions receive up to 12 hours of training on an annual basis.

Figure 1: CEE Faculty Development Program Flowchart. Components: Baseline Assessment; Individual Development Plan; Training Course; Discussion Meeting; Performance Forum; Instructor Observation; Feedback and Analysis; Learning Webinars; Success Tutorials; Knowledge Assessments; Certified Postsecondary Instructor; Continuing Education; Faculty Development Guide; Competency Mapping Tools; Training Activity Reports; Coaching Support.
Baseline Assessment

Participants entering the CEE Faculty Development Program take an initial baseline assessment. This assessment is comprised of a series of questions that are mapped to nationally established competency standards for career education instructors. Based on a participant's responses, specific training courses are recommended. The assessment is not an appraisal of teaching performance or experience, but rather a resource to help instructors and their faculty coaches collaboratively choose training options that best support their individual development goals.

Individual Development Plan

The Individual Development Plan (IDP) is a self-developed, professional development portfolio that captures the participant's journey through the program. This online portfolio management system can also be used to record developmental activities completed outside of the CEE Faculty Development Program. The IDP serves as a platform for instructors and coaches to collaborate on identified performance-based outcomes and customizes instructor training by linking course content and applications directly to each participant's individual development goals. The IDP may be used to document an instructor's development activities for accrediting and licensing agencies. It serves as a documented summary of the participant's learning experience and results achieved.

Training Course

In concert with individual development goals, baseline knowledge assessment recommendations and collaboration with the faculty coach, participants take three online training courses on an annual basis. Each course begins a training cycle that includes online and onsite activities and observations designed to transfer immediate concepts and applications to the classroom. Facilitated by an expert in the field, each online course in the program creates an interactive and asynchronous learning experience through content, assessments, and discussion forums.

Discussion Meeting

To enhance the application of training to the classroom, faculty coaches – after each core training course – facilitate onsite discussion meetings with their instructors. These meetings provide faculty the opportunity to further discuss course topics, applications, questions, and examples with their coaches and with each other in a post-course, onsite environment. The CEE provides faculty coaches with guidelines and sample questions for the discussion meetings.

Performance Forum

Working in concert with the onsite post-course discussion meetings, the online performance forums allow instructors to reconnect with the CEE facilitators and other program participants to discuss current questions, comments, issues, or examples. Each core course in the program has its own post-course performance forum to enhance transfer of training to the workplace and provide just-in-time opportunities for instructors to continuously improve their teaching performance.

Instructor Observation

Each core training cycle in the CEE Faculty Development Program culminates with the instructor observation. This observation, supplemented by observer guidelines and observation instruments from the CEE, provides instructors the opportunity to demonstrate – and the faculty coaches to observe and assess – specific and agreed-upon training applications from each course. Instructor observations add a measurable feature to the program, as instructors are achieving their own goals and improving performance.

Feedback and Analysis

At the heart and center of the CEE Faculty Development Program is the ongoing feedback and analysis among the instructors, faculty coaches, and program itself. From the initial baseline assessment and individual development plan to the courses, discussions, and instructor observations, each program component provides the opportunity for the instructor and faculty coach to concentrate on training outcomes in relation to specific instructional goals and teaching performance. Additionally, constant data is provided on each instructor's progression through the program.
Learning Webinars

As faculty coaches provide feedback to instructors and assess training outcomes, they may request focused learning webinars to address specific training issues and performance objectives. This allows for an additional enhancement of training by customizing webinar outcomes to the needs of each institution. The webinars provide an opportunity for participants to have live interactions with CEE expert facilitators through discussions on the selected topics. Webinars are provided for both faculty coaches and instructional staff.

Success Tutorials

Success Tutorials are condensed, self-paced, non-facilitated tutorials that address workplace success skills in areas such as career development, communication, creativity, management, and leadership. Though Continuing Education Unit (CEU) credit is not offered through these tutorials, they provide valuable and informal learning opportunities that are available online anytime to program participants.

Knowledge Assessments

As instructors progress through their CEE training courses, they may take a series of interactive knowledge assessments that include questions from the original baseline assessment. These assessments provide immediate feedback and help prepare participants for the CPI exam administered by the NCCT. For instructors who do not wish to proceed with the CPI exam, the assessments still provide an ongoing evaluation of a participant's mastery of training content.

Certified Postsecondary Instructor

The NCCT administers the CPI exam. This exam, based upon instructor competencies identified by the NCCT, is recognized as a benchmark for successful teaching in career education. Completion of CEE core training courses helps prepare instructors for the CPI exam. Once an instructor earns the CPI designation, 12 hours of continuing education credits are required to maintain CPI status. Continued subscription to the CEE Faculty Development Program ensures that instructors meet this requirement.

Continuing Education

The CEE Faculty Development Program provides four hours of continuing education credit for successful completion of each online course, or 12 hours of continuing education credit for each instructor during the subscription year. Therefore, continued subscription to the Faculty Development Program fulfills annual continuing education requirements for Certified Postsecondary Instructors, as well as the professional development requirements for career college licensing and accrediting agencies. While CEUs provide valuable documentation of an instructor's professional development activities, CEE's continuing education focus is on the ongoing accomplishment of each instructor's goals and improvement of teaching performance.

Faculty Development Guide

The CEE online Faculty Development Guide provides comprehensive guidelines for every component of the program and is the central resource for management and faculty coaches in the implementation of all program activities. The guide is designed to maximize transfer of training by providing techniques and strategies for effective implementation of the program. Included in the guide are downloadable tools and instruments to use for faculty discussion meetings and instructor observations.

Competency Mapping Tools

The core training courses in the CEE Faculty Development Program focus on the instructional competencies identified by both the NCCT and CCA. These core courses are mapped to the established instructional competencies to provide a thorough and practical approach to faculty training. Our interactive competency mapping tools easily identify which instructor competencies are covered in which courses.

Training Activity Reports

The CEE Faculty Development Program provides online training activity reports that can be accessed at any time. These reports allow participants to privately view their individual progress and activities as they proceed through the program. Additionally, reports on all participants may be accessed by faculty coaches, administrators, or managers identified by the institution to keep a pulse on overall program outcomes and participant accountability.

Coaching Support

In addition to all of the online resources, coaching support by email or phone is always available to assist subscribed institutions in the implementation of the CEE Faculty Development Program. The CEE staff works closely, every step of the way, with faculty coaches as they utilize the Faculty Development Guide to facilitate the transfer of training activities.
Evaluation Design

The ROI Methodology™ served as the structure for designing, planning, and implementing the evaluation study. This approach reports a balanced set of measures, follows a step-by-step process, and adheres to a set of guiding principles. These elements ensure a thorough and credible process for communicating the impact of the CEE Faculty Development Program.

Evaluation Framework

The evaluation approach begins with a fundamental framework by which evaluation data are categorized. It is based on the five-level framework described by Phillips (1983; 2003) and serves as a categorization of data representing measures that capture program success from the participant, system, and economic perspectives. Table 1 presents the definition of each level of evaluation and conveys the complete story of the program's success.

Along with the categorization of data within the five-level framework, a process model, presented in Figure 2, is used to provide a consistent approach to collecting and analyzing data. The process begins with developing the program objectives and planning the evaluation. Following are the data collection and data analysis phases, including the key step of isolating the effects of the program from other influencing factors. Lastly, the results are communicated to stakeholders in various formats, including a full impact report.

Table 1: Evaluation Framework
Level 1. Reaction, Satisfaction, and Planned Action – Measures participant satisfaction with the program and captures planned action
Level 2. Learning – Measures changes in knowledge, skills, and attitudes
Level 3. Application and Implementation – Measures changes in on-the-job behavior
Level 4. Impact – Measures changes in business impact measures
Level 5. Return on Investment (ROI) – Compares the monetary benefits to the costs

Figure 2: ROI Methodology™ Process Model. Steps: Develop Objectives; Plan Evaluation; Collect Data During Program; Collect Data After Program; Isolate the Effects of the Program; Convert Data to Monetary Value; Calculate ROI; Report Results, with Tabulate Fully Loaded Costs and Identify Intangible Benefits feeding into the ROI calculation.
Figure 3: ROI Methodology™ Chain of Impact: Reaction & Planned Action → Learning → Application & Implementation → (Isolate the Effects of the Program) → Impact → ROI, plus Intangible Benefits.

The results of the evaluation communicate the complete story of a program's success or failure. The chain of impact shown in Figure 3 represents the sequence of events that occurs when the participants of a program react positively, acquire the needed knowledge/skills, apply the skills back on the job, and, as a consequence, positively affect key business measures.

However, because the CEE Faculty Development Program develops faculty members in order for them to better teach students, the evaluation design can include a dual evaluation (see Figure 4). This design incorporates the impact of the program from the faculty perspective as well as from the student perspective. This involves taking measurements from the students who observe the faculty in real-life, on-the-job situations. The students' reaction to the faculty can help illustrate the extent to which faculty are applying what they learn through the faculty development program. While Figure 4 represents the ideal evaluation design, for purposes of this initial study, the focus is to evaluate the student perspective from Level 1 – Reaction and Planned Action – only.

To help ensure the evaluation process is consistent, the study follows the ROI Methodology™ Guiding Principles. These principles, as reflected in Table 2, keep the evaluation credible by incorporating a conservative, standard approach.

Figure 4: ROI Methodology™ Dual Chain of Impact: the full chain of impact evaluated from the faculty perspective, paired with the same chain of impact evaluated from the student perspective.
Table 2: Guiding Principles
1. When conducting a higher-level evaluation, collect data at lower levels.
2. When planning a higher-level evaluation, the previous level of evaluation is not required to be comprehensive.
3. When collecting and analyzing data, use only the most credible sources.
4. When analyzing data, choose the most conservative alternative for calculations.
5. Use at least one method to isolate the effects of a program.
6. If no improvement data are available for a population or from a specific source, assume that little or no improvement has occurred.
7. Adjust estimates of improvement for potential errors of estimation.
8. Avoid using extreme data items and unsupported claims when calculating ROI.
9. Use only the first year of annual benefits in ROI analysis of short-term solutions.
10. Fully load all costs of a solution, project, or program when analyzing ROI.
11. Intangible measures are defined as measures that are purposely not converted to monetary values.
12. Communicate the results of the ROI Methodology to all key stakeholders.

Evaluation Planning

As with any project, planning is essential. By having comprehensive plans in place, there is greater opportunity for success. Additionally, the planning tools provide a means for validating the direction of a study and ensuring key stakeholder needs are addressed. For this evaluation study, there were three distinct planning deliverables developed:

• Data Collection Plan – The data collection plan outlined the primary objectives and measures for each evaluation level. To ensure all data is collected, specific details were included, such as the timing of the collection, responsibility, and sources of data. While this plan was developed prior to the launch of the study, it was considered a "living" document and referenced throughout the study.

• ROI Analysis Plan – The ROI analysis plan detailed the elements needed to complete the ROI calculation. This tool identified the key Level 4 business measures and the process for converting the measures to monetary value. Additionally, it documented the method for isolating the impact of the program, program costs, potential intangible benefits, and communication strategies.

• Project Plan – A detailed project plan was created prior to the launch of the evaluation and maintained throughout the life cycle of the study. This plan highlighted the overall timeline, key deliverables and milestones, and applicable resources needed.

Data Collection

A sensible, efficient approach to data collection was selected for this evaluation study to further support the reasonably low cost of the program to UTI's Mooresville campus. However, strategies were incorporated to ensure the results were credible. Following the review of the program objectives, data collection instruments, sources, and timing details were identified.

Data Collection Instruments

For each level of evaluation, specific data collection instruments from the ROI Methodology™ Chain of Impact were implemented as outlined below:

• Level 1: Reaction & Planned Action – To capture the participants' reaction data regarding the CEE Faculty Development Program, the standard end-of-course survey data was used. Following the conclusion of each course within the program, participants completed a survey that contained 20+ questions. For the purposes of this study, the results of six key questions from each course were used.
• Level 2: Learning – As part of each course, the participants completed assessments, including a final quiz where they had to achieve a score of 70% or higher in one attempt. The final quiz results from the program's courses were analyzed for this study.

• Level 3: Application & Implementation – There was an opportunity to use three different data collection instruments to capture specific elements regarding application of skills and knowledge. First, on-the-job observations were completed by the training manager at the UTI Mooresville campus for a group of instructors, linking their performance improvement to the CEE training courses. Second, course survey results from the courses taught by the UTI instructors were used. This student evaluation data provided insight into the students' perception and reaction to the instructors' effectiveness. Third, a streamlined follow-up questionnaire was developed to capture data regarding the instructors' application of the skills/knowledge on the job, as well as other needed information.

• Level 4: Impact – In addition to using the follow-up questionnaire to identify estimated improvements in business measures and isolation information (estimated contribution percentages), data from UTI-specific reports were used. These reports provided specific information on student retention and course retake measures.

To ensure data was collected for all applicable CEE Faculty Development Program courses, the strategy illustrated in Figure 5 was followed. This strategy formatted the data collection process for each course and ensured the needed data was incorporated into the final analysis.

Figure 5: Data Collection Strategy. Level 1, Level 2, and Level 3 data were collected for each program course – ED101 Effective Teaching Strategies, ED102 Student Retention Methods, ED103 Student Learning & Assessment, ED104 Class Management Strategies, ED105 Instructional Planning for Student Success, ED106 Enhancing Student Learning, ED107 Creating an Accelerated Learning Environment, ED108 Learning Theory and Practice, ED110 Time and Stress Management, RT101 Improving Retention, and RT104 Best Practices to Enhance Retention – followed by the collection of Level 4 data and the completion of the data analysis.
Data Sources

Participants in the CEE Faculty Development Program were the primary sources for the Level 1 and Level 2 evaluation data. For Level 3, the participants and the leadership team served as the data sources. Specifically, both the participant and leadership groups completed the follow-up questionnaire during the study. For Level 4, business impact data was provided by UTI while the isolation estimates were gathered by both the participant and leadership groups.

For the participant group, 64 Mooresville instructors were identified as active participants in the program. During the evaluation period, a total of 133 enrollments occurred across various faculty development courses in the program, and there were 98 course completions. On average, the participants completed 1.5 courses each. For the leadership group, six individuals were identified as data sources. The leadership group was identified as individuals in management or overseeing roles associated with the UTI Mooresville campus.

Data Collection Timing

Level 1 and Level 2 data was collected at the end of each course within the program. Level 3 data was collected after the majority of participants had completed the ED101 training cycle and appropriate time for the opportunity to apply the skills/knowledge in their normal job environments had occurred. In some cases, this may have been longer than 90 days post-completion of ED101, but potentially not too long after completing other courses. Level 4 data was collected after the completion of fiscal year 2008 to ensure retention and course retake results reflected both pre-program and post-program information.

Data Collection Success

A data collection administration strategy was implemented for this study to help ensure the needed data was collected. This strategy included a comprehensive communication plan (e.g., follow-up questionnaire announcement and reminder emails) and the involvement of key sponsors, including the UTI Mooresville training manager. Additionally, the timing of collection was monitored to avoid conflict with other participant responsibilities. Lastly, the follow-up questionnaire was formatted to support anonymous responses, as it did not require the participants to provide their names or demographic information. All of these factors fostered the successful data collection as reflected in Table 3.

For both the end-of-course surveys and learning assessments, data was collected from all participants who completed the CEE training courses. With the follow-up questionnaire, the participant response rate was almost 89% and the leadership response rate was 83%.
Table 3: Data Collection Response Rates
• End-of-Course Survey (1) – Instructors: 108 invited, 108 participated (100%); timing: end of course
• Learning Assessment (1) – Instructors: 102 invited, 102 participated (100%); timing: end of course
• Instructor Observation (2) – Instructors: 35 observed (invitations: N/A); timing: 90-120 days after course
• End-of-Course Survey, Student (3) – 89 surveys analyzed (invitations: N/A); timing: pre- and post-program participation
• Follow-up Questionnaire (4) – Instructors: 53 invited, 47 responded (88.7%); Leadership: 6 invited, 5 responded (83.3%); timing: no earlier than 90 days after ED101 course completion

(1) End-of-course survey or assessment potentially completed by individuals not in the study group (e.g., campus leaders taking a course to review it)
(2) A sampling of observations occurred; program participants were not invited to be a part of this data collection, but rather selected
(3) End-of-course survey completed by instructors' students to identify instructor effectiveness; the number reflects the number of surveys analyzed
(4) The population invited was identified as having participated in one or more program courses and as active instructors at the time of data collection

Data Analysis

The data analysis phase of the study involved the completion of the following five key activities:

• Isolating Program Effects – This activity addresses the question "How do we know it was the CEE Faculty Development Program that influenced the business measures?" Isolating the effects of the program takes all variables that could have influenced the measures into consideration and identifies the specific amount that is attributable to the evaluated program. Due to the structure of the program, the location of participants, and the availability of data, it was determined that participant and leadership estimations would be the most appropriate isolation technique. Specific details on how the isolation technique was implemented and the results of the estimations are discussed later in the report.

• Converting to Monetary Value – Determining the monetary value of the business measures is a key step, as it identifies the figures for the ROI equation. While there are a variety of techniques available, this study leveraged information provided by UTI. As part of the initial project planning, UTI provided the specific monetary value of student retention and course retakes. Since these values are standard for UTI, it was the most accurate, credible process to use.

• Tabulating Fully Loaded Program Costs – Identifying the fully loaded cost of the program is another important step that must be completed to ensure information is available for the ROI calculation. For the CEE Faculty Development Program, the following categories were included:
    • Delivery costs, including UTI Mooresville campus program costs and participants' time for completing the courses
    • Evaluation costs, including consulting fees and time involved in collecting data
Assumptions, sources of information, and calculations used as part of developing the fully loaded costs are detailed in a later section of the report.

• Identifying Intangible Benefits – Within the results section of the study, any benefits of the program that are not converted to monetary value will be discussed. These benefits are identified as intangible benefits and are considered a valuable element of the program's success.
• Comparing Benefits to Costs – The ROI and Benefit-Cost Ratio (BCR) formulas are highlighted below, and the actual results are discussed later in the report:

ROI (%) = (Net Program Benefits / Program Costs) × 100

BCR = Program Benefits / Program Costs

For this study, it is important to note that sample sizes, particularly among the leadership group, were not large enough to conduct inferential statistical analyses. Thus, descriptive statistics (mean and frequency of responses) were obtained for applicable items. Although no statistical inferences can be made, the examination of descriptive statistics has implications for the effectiveness of the CEE Faculty Development Program. However, based on the standards of the Phillips ROI Methodology™, results are inferred only to those providing data, regardless of the analysis. This standard ensures a more conservative accounting of results.
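To make the arithmetic behind these formulas concrete, the short worked example below uses hypothetical dollar figures; the study's actual benefit and cost values are not stated at this point in the report, so the numbers are chosen only to reproduce the reported 517% ROI and to show how the BCR relates to it.

```latex
% Illustration only: the cost and benefit figures below are hypothetical,
% not the study's actual values. Assume fully loaded program costs of
% $100,000 and isolated, first-year program benefits of $617,000
% (benefits already adjusted by the respondents' contribution and
% confidence estimates, per the guiding principles).
\[
  \text{BCR} = \frac{\text{Program Benefits}}{\text{Program Costs}}
             = \frac{617{,}000}{100{,}000} = 6.17
\]
\[
  \text{ROI} = \frac{\text{Net Program Benefits}}{\text{Program Costs}} \times 100
             = \frac{617{,}000 - 100{,}000}{100{,}000} \times 100 = 517\%
\]
```

Read together, a BCR of 6.17 and an ROI of 517% describe the same outcome: the BCR compares gross benefits to costs, while the ROI expresses only the net gain as a percentage of costs.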
EVALUATION RESULTS

The results of the study are categorized by the levels of evaluation as follows:

• Level 1: Reaction & Planned Action
• Level 2: Learning
• Level 3: Application & Implementation
• Level 4: Impact
• Level 5: ROI, Including Intangible Benefits

For each section, details are outlined and findings are discussed as they relate to the chain of impact discussed previously. Following the levels of evaluation results, suggestions and recommendations for program improvements are discussed.

Level 1: Reaction & Planned Action

Upon completion of a training course within the CEE Faculty Development Program, the participants completed an end-of-course survey. This survey contained a variety of questions that were answered using a rating scale of strongly agree (5.00) to strongly disagree (1.00). In general, the target was for the questions to receive a 4.00 or higher, as this rating represents an overall agreement with the question's statement.

While all survey question results were analyzed, there were six primary questions of interest, and the average results for these key questions across all courses are represented in Table 4. The results for other questions in the survey are represented in Table 5. When considering the overall average for 108 responses, each of the questions received a rating that exceeded the target.

Table 4: Level 1: Results to Key Course Survey Questions (Overall Average, N = 108)
• My learning focused on issues that interest me. – 4.22
• Overall, I would say that I am satisfied with this course. – 4.26
• I would recommend this online course to others. – 4.27
• What I learned connects well with my professional practice. – 4.36
• I learned how to improve my professional practice. – 4.37
• What I learned is important for my professional practice. – 4.44
Rating scale point values: Strongly Agree (5.00), Agree (4.00), Neutral (3.00), Disagree (2.00), Strongly Disagree (1.00)
Table 5: Level 1: Results to Other Course Survey Questions (Overall Average, N = 108)
• The discussion forum activities contributed to my learning. – 4.01
• I feel that I had sufficient interaction with the facilitator. – 4.09
• I feel that I had sufficient interaction with other participants. – 4.10
• The facilitator stimulated my thinking. – 4.11
• This course was as effective as the traditional classroom courses I have taken. – 4.12
• The facilitator seemed adequately knowledgeable. – 4.14
• The facilitator encouraged me to participate in the discussion activities. – 4.14
• I feel that I had reasonable access to the facilitator. – 4.16
• The feedbacks in the computer-scored quizzes were adequate. – 4.16
• The course content was clear and adequate. – 4.24
• The course topics were clearly organized and easy to understand. – 4.28
• The online learning tools were easy to understand and use. – 4.30
• I had sufficient time to complete the course requirements. – 4.33
• The learning objectives were clearly defined. – 4.34
• The course site was well-organized and easy to navigate. – 4.36
• This course was an appropriate course to take online. – 4.38
• The course requirements were clearly defined. – 4.39
• I had no technical problems taking this course. – 4.41
Rating scale point values: Strongly Agree (5.00), Agree (4.00), Neutral (3.00), Disagree (2.00), Strongly Disagree (1.00)

Level 2: Learning

Participant learning can be measured using various techniques. For the CEE Faculty Development Program, there were two primary techniques used: 1) the end-of-course evaluation, which contained an item about learning, and 2) structured, objective testing administered during and at the end of each training course.

The question "I learned how to improve my professional practice" was asked on the end-of-course survey to assess if learning occurred. Overall, when considering all responses, this statement received a rating of 4.37, higher than the target of 4.00.

The learning assessments in the training courses were used as another indicator of learning. For each training course in the CEE Faculty Development Program, participants completed four in-course quizzes and a final quiz where they had to score 70% or higher on the first attempt for successful completion of the course.
In summary, the Level 2 results indicate that learning occurred. The participants reported learning ways to improve in their profession and also successfully achieved final quiz scores of 70% or higher. The overall average final quiz score across all courses was 85.54%.

Level 3: Application & Implementation

Three primary data collection instruments were utilized to measure the extent to which the skills and knowledge gained through the CEE Faculty Development Program were being applied on the job. The follow-up questionnaire and observations were administered following the completion of one or more training cycles, including ED101: "Effective Teaching Strategies." Additionally, results from UTI course surveys (student evaluations) of the instructors' courses were analyzed.

Regarding the follow-up questionnaire, participants and leadership provided insight into success with:

• Accomplishing development/action plan activities
• Improving teaching performance
• Delivering class and lab instruction
• Managing the classroom
• Teaching to different learning styles
• Planning instruction
• Assessing student performance

The first section of results analyzed involved the respondents' agreement with the statement, "As a result of participating in the CEE Faculty Development Program, I/they have had greater success accomplishing activities and goals identified in individual development/action plans." Of the 47 participants who responded to this question, 30 (63.83%) either strongly agreed or agreed with the statement (see Figure 6). In contrast, only 4.30% reported disagreement. For the leadership group, the findings were more evenly distributed, with 40% agreeing with the program's contributions and 60% reporting a neutral rating. Overall, the responses suggest that, on average, the program contributed to the successful completion of development activities. This is important because completing the development plan activities links learning to performance objectives and fosters an increase in instructor effectiveness.

Figure 6: Accomplishing Development/Action Plan Ratings by Participants and Leadership Group. Participants (N = 47): Strongly Agree 2.13%, Agree 61.70%, Neither Agree nor Disagree 31.92%, Disagree 4.30%, Strongly Disagree 0.00%. Leadership (N = 5): Strongly Agree 0.00%, Agree 40.00%, Neither Agree nor Disagree 60.00%, Disagree 0.00%, Strongly Disagree 0.00%.

The next statement the respondents rated was, "As a result of participating in the CEE Faculty Development Program, my/their teaching performance has improved." As shown in Figure 7, 78.72% of the participants agreed or strongly agreed that the program contributed to improving their teaching performance. Only one individual reported the program did not contribute to the improvement of his/her performance. For the leadership group, 40.00% agreed that the program participation improved instructor performance and 60% neither agreed nor disagreed. The overall findings indicate the program supports improving teaching performance, since the majority of the participants reported agreement with the program contribution.
Figure 7: Teaching Performance Improvement Ratings by Participants and Leadership Group. Participants (N = 47): Strongly Agree 8.51%, Agree 70.21%, Neither Agree nor Disagree 19.15%, Disagree 2.13%, Strongly Disagree 0.00%. Leadership (N = 5): Strongly Agree 0.00%, Agree 40.00%, Neither Agree nor Disagree 60.00%, Disagree 0.00%, Strongly Disagree 0.00%.

To further understand the program's contribution to improving instructors' performance, the respondents were asked to rate the program's contribution to the improvement of specific skills. The average participant ratings are reflected in Figure 8. Overall, "Deliver class and lab instruction" and "Assess student performance" received the highest "Very significant improvement"/"Strong improvement" ratings. All of the skills were noted as having at least a moderate improvement by 39% or more of the participants. These findings indicate that the program objectives and the skills taught in the courses are aligned to the instructors' work. Additionally, the program is having an impact on the job, as a majority of skills were reported to have improved as a result of program participation.

Figure 8: Skill Improvement Ratings by Participants (Very Significant Improvement / Strong Improvement / Moderate Improvement / No Improvement / No Opportunity). Teach to different learning styles: 6.52% / 34.78% / 50.00% / 8.70% / 0.00%. Deliver class and lab instruction: 8.70% / 28.26% / 43.48% / 19.57% / 0.00%. Assess student performance: 4.35% / 32.61% / 45.65% / 17.39% / 0.00%. Plan instruction: 4.35% / 28.26% / 47.83% / 17.39% / 2.17%. Manage classroom: 4.35% / 28.26% / 39.13% / 28.26% / 0.00%.
In addition to rating skill improvement, the instructors were asked to identify techniques, skills, and/or knowledge gained from the CEE courses that fostered success. There were approximately 90 responses to this question (see Appendix C), and, in general, they fell into two categories. The first category, "Class management and facilitation techniques," received approximately 70% of the responses. Examples of items included in this group were:

• Incorporating multiple teaching styles to better suit a diverse class
• Developing more ways to make presentations
• Getting students involved in classroom teaching to help learners be more active
• Increasing listening skills to hear students' concerns

The second category revolved around time management and organization. This category received approximately 30% of the responses and included items such as:

• Learning how to plan out the day and class a little better
• Creating lists of personal and professional tasks that need to be completed
• Following through with projects to completion
• Thinking outside the box

All of the responses to this question further support the finding that the program offers training aligned to the instructors' work efforts and provides tools for increasing their effectiveness.

Lastly, with relation to applying what was learned in the CEE Faculty Development Program on the job, the participants were asked to identify what supported (enablers) and deterred (barriers) them. The information from this question is important because it can assist in identifying and reinforcing actions that foster the use of the knowledge/skills. Additionally, if barriers are identified, steps can be taken to mitigate them. As reflected in Figure 9, all of the enablers received high "Strongly Agree"/"Agree" ratings. With regards to barriers to application, the participants did not report any significant barriers.

Figure 9: Enablers Identified by Participants (Strongly Agree / Agree / Neither Agree nor Disagree / Disagree / Strongly Disagree). Opportunity to use material (N = 42): 23.81% / 54.76% / 16.67% / 4.76% / 0.00%. Confidence applying material (N = 43): 16.28% / 58.14% / 23.26% / 2.33% / 0.00%. Management support (N = 42): 23.81% / 42.86% / 30.95% / 2.38% / 0.00%. Peer support/guidance (N = 42): 23.81% / 40.48% / 30.95% / 4.76% / 0.00%. Other (N = 10): 0.00% / 10.00% / 80.00% / 10.00% / 0.00%.

Figure 10 illustrates the leadership group's ratings regarding skill improvement associated with the instructors participating in the CEE Faculty Development Program. For the most part, this group reported moderate improvement for the skill areas assessed.
Figure 10: Skill Improvement Ratings by Leadership Group (Very Significant Improvement / Strong Improvement / Moderate Improvement / No Improvement / No Opportunity). Deliver class and lab instruction: 0.00% / 20.00% / 60.00% / 20.00% / 0.00%. Manage classroom: 0.00% / 20.00% / 60.00% / 20.00% / 0.00%. Teach to different learning styles: 0.00% / 0.00% / 60.00% / 40.00% / 0.00%. Plan instruction: 0.00% / 0.00% / 60.00% / 40.00% / 0.00%. Assess student performance: 0.00% / 0.00% / 40.00% / 60.00% / 0.00%.

In order to provide a comparison of the participants' and leadership ratings, an analysis of the results was completed by assigning number values to the rating choices. As Figure 11 illustrates, the participants reported an overall improvement rating of 3.23, while the leadership group was lower at 2.72. The participants gave "Teach to different learning styles" the highest improvement rating, 3.39, and "Manage the classroom" the lowest, 3.09. With the leadership group, the results were slightly different. "Manage the classroom" and "Deliver class and lab instruction," which are both the closest aligned to the instructor on-the-job observation activity completed by leadership, received the highest rating of 3.00. When comparing the two groups' ratings, "Manage the classroom" received the closest rating, with a difference of 0.09. "Assess student performance," which had a 0.84 difference, was notably rated higher by the participants.

Figure 11: Skill Improvement Ratings Comparison by Participants and Leadership Group. Participants (N = 46): all tasks 3.23; manage the classroom 3.09; plan instruction 3.15; assess student performance 3.24; deliver class/lab instruction 3.26; teach to different learning styles 3.39. Leadership (N = 5): all tasks 2.72; manage the classroom 3.00; plan instruction 2.60; assess student performance 2.40; deliver class/lab instruction 3.00; teach to different learning styles 2.60. Rating scale: Very Significant Improvement (5.00), Strong Improvement (4.00), Moderate Improvement (3.00), Little Improvement (2.00), No Improvement (1.00).
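As a brief note on mechanics, the averages compared in Figure 11 come from weighting each rating choice's point value by the share of respondents who selected it; the sketch below uses an invented distribution purely for illustration (the percentages are not taken from Figure 8 or Figure 10).

```latex
% Hypothetical response distribution for a single skill area (illustration only):
% 10% Very Significant (5.00), 30% Strong (4.00), 50% Moderate (3.00), 10% Little (2.00).
\[
  \bar{r} = \sum_i p_i\, v_i
          = 0.10(5.00) + 0.30(4.00) + 0.50(3.00) + 0.10(2.00)
          = 3.40
\]
```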