Doctoral Dissertation: "Examining the Impact of Training on Business Results Through Post-Training ROI"

Training expense represents a substantial investment in training resources. This dissertation details research on the business impact of a leadership-training program using the return on investment (ROI) methodology. The primary objective of this study was to determine whether a leadership-training program had a positive financial impact on a business and to present a verifiable, valid, and meaningful ROI.

Transcript

  • 1. Examining the Impact of Training on Business Results Through Post-Training ROI Dissertation Submitted to Northcentral University Graduate Faculty of the Department of Business and Management in Partial Fulfillment of the Requirements for the Degree of DOCTOR OF PHILOSOPHY by JACK L. KULES Prescott Valley, Arizona May 2008
  • 2. APPROVAL Examining the Impact of Training on Business Results Through Post-Training ROI by Jack L. Kules Approved by: __________________________________________ ________________ Chair: Thomas Driver, Ph.D. __________________________________________ ________________ Member: David Moody, Ph.D. __________________________________________ ________________ Member: William Shriner, Ph.D. Certified by: __________________________________________ ________________ School Chair: Freda Turner, Ph.D.
  • 3. ABSTRACT Examining the Impact of Training on Business Results Through Post-Training ROI by Jack L. Kules Northcentral University, May 2008 Training expense represents a substantial investment in training resources. This dissertation details research on the business impact of a leadership-training program using the return on investment (ROI) methodology. The primary objective of this study was to determine whether a leadership-training program had a positive financial impact on a business and to present a verifiable, valid, and meaningful ROI. Training application and effectiveness were measured through four research tests, and ROI results and relevance were measured using two additional research tests. Questionnaire responses and action plan information were examined from 48 employees (from a target population of about 65) who went through a 15-hour strategic leadership training program. The training was found to be both effective and successful in teaching and developing strategic leadership concepts. Perhaps the most important and tangible indication of success came directly from the company. Based on the study’s results, there was no meaningful difference in learning among the four organizational groups, and all participants had a net positive impact on the business. iii
  • 4. ACKNOWLEDGEMENTS When you begin a learning journey such as the one represented by this dissertation, as the researcher, you think you know where you are headed but you cannot be sure of the final destination. The experience of completing this dissertation has been challenging and rewarding. Not only did it foster a sense of accomplishment and contribution to my field of study, but it also allowed me to meet and become close to some brilliant people. I thank the participants in this research who gave their generous input and support to this project. Likewise, I thank “SP” for allowing me to use many of their resources to make this dream possible. A special thank you is extended to my dissertation committee. Dr. Thomas Driver (Committee Chair), Dr. David Moody, and Dr. William Shriner supported me all the way through the dissertation process, and their direction, suggestions, and concerns during this project made this journey an exceptional one. Finally, I want to thank my wife, Bridget, for all the love, patience, understanding, and support that she gave me over the past four years—without which this life-long dream would never have become a reality. I am truly and deeply indebted to her. iv
  • 5. TABLE OF CONTENTS APPROVAL ........................................................................................................ ii ABSTRACT ....................................................................................................... iii ACKNOWLEDGEMENTS .................................................................................. iv TABLE OF CONTENTS ..................................................................................... v LIST OF TABLES ............................................................................................. vii LIST OF FIGURES ............................................................................................ ix CHAPTER I: INTRODUCTION .......................................................................... 1 Statement of the Problem............................................................................. 2 Definition of Key Terms ................................................................................ 4 Brief Review of Related Literature ................................................................ 5 Highlights and Limitations of Methodology ................................................... 6 Limitations of the Study ................................................................................ 7 Research Expectations ................................................................................ 8 CHAPTER II: REVIEW OF RELATED LITERATURE ...................................... 10 How Much Is Performance Improvement Really Worth? ............................ 10 Using ROI Forecasting to Develop a High-Impact, High-Value Training Curriculum ............................................................................................ 12 Measuring Return on Investment for a Mandatory Training Program ......... 14 Resisting Measurement: Evaluating Soft Skills Training for Senior Police Officers ................................................................................................. 16 A Preprogram ROI for Machine Operator Training ..................................... 17 Getting Results With Interpersonal Skills Training ..................................... 19 Training’s Contribution to a Major Change Initiative ................................... 21 ROI Case Studies ...................................................................................... 23 CHAPTER III: METHODOLOGY ..................................................................... 26 Overview .................................................................................................... 26 Restatement of the Problem....................................................................... 27 Statement of Hypotheses ........................................................................... 27 Description of Research Design ................................................................. 29 Operational Definition of Constructs and Key Variables ............................. 34 v
  • 6. Description of Materials and Instruments ................................................... 35 Selection of Subjects .................................................................................. 37 Procedures ................................................................................................. 38 Discussion of Data Processing................................................................... 44 Methodological Assumptions and Limitations............................................. 46 Ethical Assurances ..................................................................................... 47 CHAPTER IV: FINDINGS ................................................................................ 50 Overview .................................................................................................... 50 Findings...................................................................................................... 51 Analysis and Evaluation of Findings ........................................................... 71 Summary .................................................................................................... 76 CHAPTER V: SUMMARY, CONCLUSIONS AND RECOMMENDATIONS ..... 77 Summary .................................................................................................... 77 Conclusions................................................................................................ 84 Recommendations ..................................................................................... 93 REFERENCES ................................................................................................ 96 APPENDICES ................................................................................................101 Appendix A ............................................................................................... 102 Appendix B ............................................................................................... 108 Appendix C............................................................................................... 110 Appendix D............................................................................................... 118 Appendix E ............................................................................................... 122 Appendix F ............................................................................................... 124 Appendix G .............................................................................................. 131 Appendix H............................................................................................... 136 Appendix I ................................................................................................. 140 vi
  • 7. LIST OF TABLES Table 1: ROI Case Studies .............................................................................. 23 Table 2: Demographics of the Population ........................................................ 50 Table 3: Mean and Standard Deviation for Questionnaire Responses Average Across Four Classes .................................................................... 51 Table 4: Mean and Standard Deviation for the 15 Objectives in Question 1 Across Four Classes .................................................................................. 52 Table 5: Mean and Standard Deviation for the Six Elements in Question 2 Across Four Classes .................................................................................. 55 Table 6: Mean and Standard Deviation for the Five Skill Areas in Question 3 Across Four Classes ............................................................................... 57 Table 7: Mean and Standard Deviation for the 14 Topics in Question 4 Across Four Classes .................................................................................. 58 Table 8: Metrics for Action Plans ..................................................................... 63 Table 9: Action Plan Topic Selection ............................................................... 63 Table 10: Action Plan Input from Selection ...................................................... 65 Table 11: Individual Costs for Strategic Leadership Program .......................... 68 Table 12: Total Program Costs, By Class ........................................................ 69 Table 13: One-Way ANOVA of Perceived Relevance of the 15 Objectives by Organizational Group ............................................................................ 71 Table 14: One-Way ANOVA of Perceived Relevance of the Six Elements by Organizational Group ............................................................................ 72 Table 15: One-Way ANOVA of Perceived Relevance of the Five Skill Areas by Organizational Group .................................................................. 73 Table 16: One-Way ANOVA of Perceived Relevance of the 13 Topics by Organizational Group ................................................................................. 74 Table 17: Mean Time to Action Plan Completion Based on Class................... 74 vii
  • 8. Table 18: One-Way ANOVA of Mean Time to Action Plan Completion Across the Four Classes ............................................................................ 75 Table F1: Question Data Table From Questionnaires, Question 1 ................ 125 Table F2: Question Data Table From Questionnaires, Question 2 ................ 127 Table F3: Question Data Table From Questionnaires, Question 3 ................ 128 Table F4: Question Data Table From Questionnaires, Question 4 ................ 129 Table G1: ROI Data Table from Action Plans ................................................ 132 Table H1: Values of One-Way ANOVA of Question 1 ................................... 137 Table H2: Values of One-Way ANOVA of Question 2 ................................... 138 Table H3: Values of One-Way ANOVA of Question 3 ................................... 138 Table H4: Values of One-Way ANOVA of Question 4 ................................... 139 Table I1: Mean ROI Across Four Classes ..................................................... 141 Table I2: Values of One-Way ANOVA of ROI Results ................................... 142 viii
  • 9. LIST OF FIGURES Figure 1. Question 1 Breakdown of Questionnaire Responses for Class 1 ..... 54 Figure 2. Question 1 Breakdown of Questionnaire Responses for Class 2 ..... 54 Figure 3. Question 1 Breakdown of Questionnaire Responses for Class 3 ..... 55 Figure 4. Question 1 Breakdown of Questionnaire Responses for Class 4 ..... 55 Figure 5. Question 2 Breakdown of Questionnaire Responses for Classes 1 and 2 .......................................................................................................... 57 Figure 6. Question 2 Breakdown of Questionnaire Responses for Classes 3 and 4 .......................................................................................................... 57 Figure 7. Question 3 Breakdown of Questionnaire Responses for Classes 1 and 2 .......................................................................................................... 59 Figure 8. Question 3 Breakdown of Questionnaire Responses for Classes 3 and 4 .......................................................................................................... 59 Figure 9. Question 4 Breakdown of Questionnaire Responses for Class 1 ..... 61 Figure 10. Question 4 Breakdown of Questionnaire Responses for Class 2 ... 61 Figure 11. Question 4 Breakdown of Questionnaire Responses for Class 3 ... 62 Figure 12. Question 4 Breakdown of Questionnaire Responses for Class 4 ... 62 ix
  • 10. 1 Chapter I: Introduction Training expenses make up a substantial portion of the budget of an organization, and have come to be seen as an investment in training resources (Phillips, 2001). Large training expenditures and the need to show value are two of the primary drivers that have set in motion an increased emphasis on return on investment (ROI). Attention to ROI is rapidly becoming a central concern of organizations. Executives are showing an increased interest in ROI, and have become mindful of how training budgets have grown with limited or no accountability (Bartram, 1999; Rothwell, 2003). Executives are now demanding a return on investment for these programs. Further illustrating its prominence, numerous case studies (Phillips, 1994; Phillips, 1997; Phillips, 2001) have used return on investment to validate training’s contribution to business results. Training budgets can be very large and now have the full attention of executives. The costs can be immense. IBM has a training budget of about $1 billion, and Kinko’s training budget is over $30 million—or 6% of Kinko’s total payroll (Phillips, 2001). Regardless of the measurement methodology—total budget, expenditure per employee, percentage of payroll, percentage of revenue—a large training budget elicits additional evaluation and measurement. Executives are now demanding increased accountability for the increasing training expenditures. The use of the ROI methodology offers a view of training that reflects the bottom line (Phillips, 2002). The debate as to what should be measured and which results provide the best evidence of training success will continue, and no measurement has been
  • 11. 2 clearly proven the most reliable. A valid system would employ a balanced set of measures that take into consideration trainee preferences, learning retention, learning application, changes in business measures, and actual ROI (Phillips, 1997). This need for balanced measures is the major driver of ROI methodology, as it measures financial impact along with other important concerns. Statement of the Problem The problem addressed in this study was to determine whether a leadership-training program had a positive financial impact on a business and to present a verifiable, valid, and meaningful ROI. SP had a major need to determine the value and applicability of its leadership training to the company for current and future leaders. Therefore, a study of this nature was proposed and was based on the hypotheses and research questions addressing the differences in various learning relationships across four organizational groups in a targeted company. Many studies (Phillips, 1994; Phillips, 1997; Phillips, 2001) have indicated a positive ROI, but the lack of a statistically sound approach in the noted studies (see Review of Related Literature for examples) hinders verification of the reliability of their data. Using the ROI methodology to examine the value of training, when developed, analyzed, and reported with meaningful data, will support an organization’s business success (the ROI methodology will be covered in Chapter 3). It can verify or validate that the training initiatives are meeting the needs of the business and having a positive impact on the bottom line of the business. This methodology has already been used in numerous case studies
  • 12. 3 (Phillips, 1994; Phillips, 1997; Phillips, 2001) to validate the contribution of training to business results. Specialty Pharmaceutical (SP)—the fictitious name providing anonymity to the organization where this study took place—was the organizational body used for this research. SP opened three new facilities due to its substantial growth and trained a group of key current and potential supervisors and managers in leadership skills. SP needed to change its predominantly autocratic, dictatorial style of management to a leadership style that lends itself to a higher-performance workforce. Most of the supervisory staff had modeled their leadership after the former style, as they had been exposed to it as workers. The leadership-training program was designed to promote the essential skills of creativity development, motivation, delegation, communication, and decision-making. Emphasis was placed on balancing the human-relations side of management with the drive for results. The time-spaced format allowed for the real-world application of concepts between sessions as well as a forum to report on the results achieved. Each session contained practical application projects that corresponded to the participants’ responsibilities—so that the projects completed during the program would result in improved performance. During the program, each participant completed a one-year management plan (action plan). This project required follow-up actions to initiate actual cost savings and improvements in the work setting. This positive form of accountability was to ensure long-lasting and measurable results.
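
The one-year management plans (action plans) described above are the study's primary business-impact instrument: each participant records planned improvements and the follow-up results that later feed the ROI calculation. Below is a minimal sketch of how such a record might be represented for analysis; the field names and sample values are hypothetical illustrations, not the actual layout of the instrument in Appendix B.

from dataclasses import dataclass

@dataclass
class ActionPlanRecord:
    """One participant's action-plan result (hypothetical field layout)."""
    participant_id: str
    improvement: str          # the business measure the participant targeted
    monetary_value: float     # estimated annual benefit of the improvement, in dollars
    confidence: float         # participant's confidence in the estimate, 0.0 to 1.0
    weeks_to_completion: int  # time taken to complete the planned actions

def adjusted_benefit(record: ActionPlanRecord) -> float:
    """Discount the estimated benefit by the participant's confidence,
    a conservative step commonly applied to participant estimates."""
    return record.monetary_value * record.confidence

example = ActionPlanRecord("P01", "Reduce batch rework", 12000.0, 0.8, 10)
print(adjusted_benefit(example))  # 9600.0
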
  • 13. 4 Five research questions were formulated to address the problem which drove the hypotheses and resulting testing: 1. What differences, if any, existed in the perceived relevance of the 15 objectives across the four classes at SP? 2. What differences, if any, in the perceived relevance of the six elements of the job existed across the four classes at SP? 3. What differences, if any, in the perceived relevance of the five skill areas existed across the four classes at SP? 4. What differences, if any, in the perceived relevance of the 13 topics in one’s own work or that of the work unit existed across the four classes at SP? 5. What are the differences, if any, in the ROI across the organizational groups at SP? Definition of Key Terms Action Plan. A specific plan for the actions or steps that will be undertaken to implement the ROI methodology within the organization. Action items focus on specific spheres of influence. (Phillips, 2003) Evaluation Framework. Defines the levels at which programs are evaluated and how data were captured at different times from different sources. The framework involves a four-level evaluation process: reaction, learning, behavior, and results (Kirkpatrick, 1998). Isolating Program Effects. Used to ensure accuracy in calculating the ROI and to ensure an accurate picture of the program’s benefits. Excluding this step in the process will result in an incorrect, invalid, and inappropriate ROI
  • 14. 5 calculation. Return on Investment (ROI). This is the ratio of earnings (net benefits) to investments (costs); it is the most common measure for value-added benefits in operational functions. Brief Review of Related Literature A review of literature related to the topic of return on investment in the training industry revealed that there has been a body of knowledge generating the standard approach to determining ROI. Several popular texts (Phillips, 1983; Phillips, 1994; Phillips, 1997; Phillips, 2001) focus on the methodologies presented by the ROI Institute (Phillips, 2002). Phillips, along with his partners Patti Phillips and Ron Stone, created the ROI Institute as a means of communicating the philosophies and methodologies of ROI to training and human resource professionals around the world. Other authors have considered the main views of Phillips and have made some changes to the methodologies that were established. One major dissenter is Dennis Kravetz, with his own approach to measuring human capital. Kravetz’ approach accommodates the financial aspects of the complete human capital concept, whereas Phillips’ approach focuses more on the business results produced by an intervention—such as training (Kravetz, 2004). Although Kravetz’ approach is different, the results are generally complementary to those of Phillips’ approach. The American Society for Training and Development (ASTD) and the International Society for Performance Improvement (ISPI) have published several
  • 15. 6 books on the topic of ROI by different authors as well as three volumes of case studies based on Jack Phillips’ ROI methodology (Phillips, 1994; Phillips, 1997; Phillips, 2001). These case studies are the foundation of the methodology that was used during the research. Because of the infancy of the methodology, there is more potential for uncovering new insights and approaches to solving the problem. Highlights and Limitations of the Methodology The research methodology used descriptive and inferential statistics to characterize the data and to predict similarities. The primary analytical test used was the one-way analysis of variance (ANOVA). The data collection process involved using questionnaires and action plans from the new or potential supervisors who participated in the training program. It further included objectives and methodologies for each level of evaluation targeted. These covered the following targets, by objective (Phillips, 1983; Kirkpatrick, 1998): Reaction. Data collection included a participant feedback form at the end of the training program to judge reactions to the training in regards to the relevance and effectiveness. Learning. Pre- and post-self-assessments, observed behaviors during skill practice, and review activities were used to evaluate how much they learned. Job Application. On-the-job behavior changes were monitored and measured during the action plan implementation. Business Impact. In the action plan, participants estimated the potential
  • 16. 7 cost benefits of their applied behaviors over a 90-day period after training. Questionnaires (Appendix A) were administered during a 90-minute follow-up session that was scheduled about three months after the initial training and co-facilitated by senior management. The business impact was evaluated by comparing the identified measures on action plans (Appendix B) at implementation of the high-performing leadership action plans (at the end of the training cycle) during those 90 days following implementation. Although several strategies were available to isolate the effects of training, most of the methods were thought to be ill-suited to this situation. Participants’ direct estimates were found to be the most appropriate technique. Participants’ estimates of the impact of training have been shown to be a reliable indicator of results through practical application of this process by numerous ROI professionals (Phillips, 1994, 1997, 2001). It is a proven methodology for deriving reliable data and establishing reliable metrics, and it was the best fit for use in this ROI study. Seven examples of its use are provided in the Review of Related Literature. Limitations of the Study In the data collection, the focus was on impact and not process. Consequently, very little effort was made to collect input on the actual training delivery processes and mechanisms themselves. Most of the emphasis was on the effect of the program in relation to the investment required. To remain objective, data were collected only from people who experienced the training. A standard practice in ROI evaluation of short-term training programs is to capture the first-year benefits after the program has been conducted (Brinkerhoff,
  • 17. 8 1994; Graber, 1997). This practice, in essence, limits the analysis of benefits to one year of operation. Although this could slightly overstate the results in some cases, it represents a conservative approach. The benefits obtained in subsequent years are not necessarily useful to the analysis. In this study, data were collected and analyzed over a three-month period following the training. These data were then extrapolated over a 12-month period to simulate standard practice. It is recognized that not all data collected and analyzed are absolute and that there may be qualifiers that need to be researched at a later date. There may be variables that are qualitative in nature that are treated quantitatively for the purpose of measuring results. Research Expectations The business impact of the training program was examined in this quantitative study. Every attempt was made to uncover specific business results linked to the training program. The impact of the training program was measured by the extent of application of the skills and knowledge promoted in the program. The program’s impact was indicated by the extent to which participants saw a connection between the training program and the application of skills in the work setting. It was further shown through the consistency of their reactions with their responses on the action plans. Intangible results are those benefits that cannot be assigned a dollar value or those for which the assigned value is questionable. Even though these benefits were not used in the ROI calculation, they are important to the goals of SP. Comments from the learners’ action plan feedback indicated various
  • 18. 9 intangibles that would benefit SP. There will be a follow-up study conducted by SP to evaluate the intangible benefits.
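
Chapter I names one-way analysis of variance (ANOVA) as the primary analytical test for comparing questionnaire responses across the four classes. A minimal sketch of that comparison is shown below; the ratings are invented for illustration, and SciPy is assumed to be available.

from scipy.stats import f_oneway

# Hypothetical mean relevance ratings (1-5 scale) from the four training classes
class_1 = [4.2, 3.8, 4.5, 4.0, 3.9]
class_2 = [4.1, 4.4, 3.7, 4.3, 4.0]
class_3 = [3.9, 4.2, 4.1, 3.8, 4.4]
class_4 = [4.0, 4.3, 3.9, 4.1, 4.2]

# One-way ANOVA: do the class means differ more than chance would explain?
f_stat, p_value = f_oneway(class_1, class_2, class_3, class_4)
print(f"F = {f_stat:.3f}, p = {p_value:.3f}")

# A p-value above the chosen alpha (e.g., 0.05) gives no evidence of a
# difference across classes, the kind of result the abstract reports
# (no meaningful difference in learning among the groups).
alpha = 0.05
print("difference detected" if p_value < alpha else "no significant difference")
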
  • 19. 10 Chapter II: Review of Related Literature After a review of more than 150 journal articles and case studies, seven case studies were selected as having the most relevance to the research topic. Each of these articles deals with return on investment (ROI) in training programs and is often cited by experts in the field. The following are short summaries of each article and the different statistical approaches that each took. How Much Is Performance Improvement Really Worth? Berthiez (2001) conducted an ROI study on a sales training program for a major global automobile corporation in Europe. The primary project objective focused on the following questions: exactly what financial effect did this specific training have on the overall bottom line in sales of new cars? What percentage of new sales, if any, could the training process claim to represent? The results were substantial and unquestionably beneficial to executives in determining how to allocate shrinking budgets to gain maximum return on human performance for dollars invested (Berthiez, 2001). The Phillips ROI methodology was used in this impact study. The steps used were data collection, training effects isolation, data conversion to monetary value, intangible benefits identification, program costs tabulation, and ROI calculation (the ROI methodology will be covered in Chapter 3). In addition to the overall ROI model, it was found to be useful to add an additional component at the beginning of the process model—the training needs analysis (TNA). The focus on a TNA helped to identify clearly what needed to be accomplished with a given training initiative (Berthiez, 2001).
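
The Berthiez study applies the ROI methodology steps listed above: data collection, isolation of training effects, conversion of data to monetary value, identification of intangible benefits, tabulation of program costs, and calculation of the ROI. The sketch below strings those steps together in simplified form. The 9% impact and 65% confidence values anticipate the figures reported on the next two pages of this summary; the benefit and cost amounts are invented, and combining the two discounts multiplicatively is an assumption consistent with the participant-estimate approach.

def isolate_training_effects(gross_benefit, impact_share, confidence):
    """Attribute only part of the measured improvement to training, then
    discount further by the estimators' average confidence."""
    return gross_benefit * impact_share * confidence

# Illustrative figures only
gross_benefit = 500_000.0   # monetary value of the measured sales improvement
impact_share = 0.09         # share of the improvement attributed to training
confidence = 0.65           # average confidence in the consultants' estimates
program_costs = 20_000.0    # fully loaded training costs

training_benefit = isolate_training_effects(gross_benefit, impact_share, confidence)
net_benefit = training_benefit - program_costs
roi = 100.0 * net_benefit / program_costs   # ROI = net benefits / costs, as a percentage
print(f"Isolated benefit: ${training_benefit:,.0f}; ROI: {roi:.0f}%")
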
  • 20. 11 To isolate the relationship between training and performance improvement, the following three approaches were used: training impact—sales consultants’ perception of the influence of sales training on actual car sales; confidence factor—sales consultants’ certainty of their estimates about the influence of training and other factors; and customer validation—final sales data collected by the customer and used to substantiate sales consultant estimates. These approaches were selected for ease of use and the realistic credibility of sources. Control groups, monitoring of on-the-job application of principles learned in training, or trend-line analyses could have been used to further isolate the data on training effects. Berthiez suggests it would be beneficial to compare and contrast other methods of isolating data in future ROI initiatives wherever practical and cost efficient to do so (Berthiez, 2001). The conversion of data was relatively easy, since units of cars sold can be multiplied by a given unit price and unit margin to clearly establish the monetary benefits. The data were calibrated and crosschecked against actual car sales results reported in Standard & Poor’s annual report, objective industry statistics, and internal company sales reports. The findings were discounted by a training impact of 9%, taken from the questionnaire responses. Training impact represented sales consultants’ perceptions of the influence of the sales training on actual car sales. The findings were discounted further by an average confidence factor of 65%, representing sales consultants’ certainty of their estimates regarding the influence of training and other factors. To maintain the integrity of the statistical data, the study excluded any values that were outside of
  • 21. 12 realistic possibility (Berthiez, 2001). Data in this study were used to compare against known results of the business and the industry. The results for the retail distributor, with an ROI of 325%, were conclusive evidence that the investors, the manufacturer, and the retail distributor did realize a significant payback for the capital invested (Berthiez, 2001). Using ROI Forecasting to Develop a High-Impact, High-Value Training Curriculum With a variety of approaches to forecasting addressed, Graber (1997) described the process used by a Midwest electrical power provider to allocate funds for a variety of training initiatives and projects. The process built on the principles of forecasting financial benefits and provided an important tool for the training and human resource managers (Graber, 1997). The purpose of the ROI forecasting was to identify the training that would provide the highest possible payback and, more generally, to make wise training and development decisions. The training itself was seen to have no inherent value; the worth was dependent on the performance gains it catalyzed, the performance gaps it addressed, and the opportunities it created in a given environment. ROI forecasting did not affect the cost of training; however, it maximized the payback from limited training resources and helped to avoid training dollars going to waste (Graber, 1997). The ROI forecasting process began by selecting employees and supervisors who were most familiar with a job—the subject-matter experts (SME). The SMEs agreed on the key accountabilities of the job, which were
  • 22. 13 given a weight based on their importance and the typical time spent doing them during a year. An estimation procedure (Cascio-Ramos estimation) was used to make the weights more accurate. Subject experts picked the highest-weighted accountability and gave it 100 points; every other accountability was then compared to it and given a lesser number of points. Finally, the subject experts identified from seven to ten critical skills for each key accountability (Graber, 1997). Using a five-point rating scale (beginner, novice, skilled, advanced, and expert), skill assessment questionnaires were completed separately by employees and their managers, and both perspectives were weighted equally. Employee skill gaps were identified, and the cost of the gaps in terms of lost performance was estimated. Rather than calculate the value of each employee, the process was simplified by using the median of the employees’ pay range to establish their value within each of three levels: professional, supervisory, and middle management (Graber, 1997). To increase its value as a good measure of training need, the skill gap was calculated differently than is typically done. A percentage skill gap value was calculated using the traditional gap rating scale in conjunction with the importance of the skill to the job. The dollar value of the job was then used to calculate the dollar impact of the skill gap (Graber, 1997). For example, if salary and benefits equaled $100,000, formal presentations (the skill) made up a 4.8% weight, and the subject rated a 3 on formal presentations (which equates to 50%), the calculation would be: (a)
  • 23. 14 $100,000 x 4.8% = $4,800 (the skill value of a fully qualified employee), and (b) $4,800 x 50% rating = $2,400 (the size of the gap from the optimum) (Graber, 1997). Based on the skill gaps identified by this process for all applicable employees, 11 training programs were chosen and the expected ROI for each was calculated. These results show that only 6 of the 11 courses selected showed a positive ROI; therefore, only about $11,800 was spent on this program (Graber, 1997). Measuring Return on Investment for a Mandatory Training Program Marcial (2001) illustrated how a Florida-based government agency measured the ROI for a mandatory training program on self-mastery. The program evaluated the impact of using a specific training delivery methodology and its ability to channel employees to participate in and contribute to the organization (Marcial, 2001). It involved a learning map on a high-performance development model (HPDM), due to the perceived importance of self-mastery, and this particular learning map had undergone several beta tests and revisions before it was used by the agency. The learning map included workplace change sheets, how we learn sheets, teammate skills sheets, development approach sheets, information guides, and personal opportunity plans. Each person was asked to complete the questionnaire alone, return to debrief the questionnaire as a group, identify one thing he or she could contribute or do that could make a difference to the facility, and tell two co-workers what he or she has learned about HPDM (Marcial, 2001).
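
A short sketch of the Graber skill-gap valuation worked through on the preceding pages, before the Marcial case continues: the dollar value of a skill is salary and benefits multiplied by the skill's weight, and the gap is that value multiplied by the percentage implied by the employee's rating. The numbers reproduce the $100,000 / 4.8% / 50% example; the function and parameter names are illustrative, not Graber's.

def skill_value(salary_and_benefits, skill_weight):
    """Dollar value of the skill for a fully qualified employee."""
    return salary_and_benefits * skill_weight

def skill_gap(salary_and_benefits, skill_weight, rating_percent):
    """Dollar size of the gap from the optimum: the skill value multiplied
    by the percentage that the employee's rating maps to on the gap scale."""
    return skill_value(salary_and_benefits, skill_weight) * rating_percent

print(skill_value(100_000, 0.048))       # 4800.0  -> the $4,800 skill value
print(skill_gap(100_000, 0.048, 0.50))   # 2400.0  -> the $2,400 gap in Graber's example
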
  • 24. 15 The data collection methodology was set up to take advantage of all the data generated in the sessions. A comparison arrangement was established to isolate the effects of the learning map. To isolate the effects of the learning map further, the participants were asked to estimate the impact of the program themselves (Marcial, 2001). The researchers used the participants’ application of what they learned in the learning map sessions to convert the data to a monetary value. Monetary values were assigned to the changes made by the participants using regulations and methods found within the agency. A database of employee time was readily available from human resources. An internal specialist provided the compensation data, including the cost of medical care for injury using billing codes and allowances paid to providers for treatment (Marcial, 2001). The benefits-to-cost ratio came out as 1.03, calculated by dividing the total benefits ($2,819.37) by the total costs ($2,737.10). An ROI of 3% was found by subtracting the total costs ($2,737.10) from the total benefits ($2,819.37) and dividing the difference by the total costs (Marcial, 2001). At the project outset, an ROI of 25% was anticipated. Because this was a mandated program, the whole cost was an expected expense with no financial benefits; therefore, the attempt to calculate the cost of mandatory training was valuable. The use of the learning maps for HPDM did not appear to be an economical delivery method at first, but it became evident that its use was worth the time invested and that it had the potential to bring about a significant ROI (Marcial, 2001).
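
The Marcial figures above can be checked with the two formulas used throughout these case studies: the benefits-to-cost ratio (BCR) divides total benefits by total costs, and ROI divides net benefits (benefits minus costs) by costs. A small verification sketch using the reported $2,819.37 in benefits and $2,737.10 in costs:

def benefits_cost_ratio(benefits, costs):
    """BCR: total benefits divided by total (fully loaded) program costs."""
    return benefits / costs

def roi_percent(benefits, costs):
    """ROI: net benefits (benefits minus costs) divided by costs, as a percentage."""
    return 100.0 * (benefits - costs) / costs

benefits, costs = 2819.37, 2737.10
print(round(benefits_cost_ratio(benefits, costs), 2))  # 1.03
print(round(roi_percent(benefits, costs)))             # 3, i.e., an ROI of about 3%

The same two formulas reproduce the 144% ROI reported for the McCarty study and the 132% first-year ROI in the Renaud study later in this chapter.
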
  • 25. 16 Resisting Measurement: Evaluating Soft Skills Training for Senior Police Officers Police organizations are traditionally governed from the top down in a military-like hierarchical structure. However, police work often requires the exercise of independent judgment within limited contexts. McCarty’s (2001) research illustrated the problems of implementing and evaluating a program focused on interpersonal skills training in a highly structured, often resistant organization in New York. Two methods were used to collect data in this study, which was based on a Dale Carnegie training program: action plans and questionnaires. Participants used the action plans to track progress and to collect actual performance data over a three-month period following the final training session. Participants received the follow-up questionnaire three months after the final session so that they could return it with the action plan. The questionnaire provided data regarding the extent to which the participants had used the training on the job and the results that came from these applications (McCarty, 2001). Participants’ estimates of the impact of training were a reliable indicator when appropriate steps were taken to collect data. Though their judgment was subjective, the participants had direct experience to guide their estimates and had first-hand knowledge of other influences that could have had an impact on performance measures. Participants’ estimates had proven to be extremely reliable in other studies where they were compared to results from control groups (McCarty, 2001). In McCarty’s study, the primary strategy for converting data to monetary
  • 26. 17 value was to ask the participants to make estimates and calculations based on improvements in their work units. On the action plans submitted, participants used accepted standards and conversion factors to arrive at the monetary value. Some of the action plans were incomplete or otherwise flawed, invalidating the data for purposes of calculating the ROI; even then, there were indications of performance improvement (McCarty, 2001). To calculate the ROI, the benefits from the group were compared with the fully loaded cost of the program for the group as follows: the total benefits were $333,168 and the cost of the program was $136,530; therefore, the ROI (($333,168 - $136,530) / $136,530) was 144%. The high yield for a small number of contributors was indicative of the type of results manifested when senior officials who have a large sphere of influence participate in action plan improvements (McCarty, 2001). A Preprogram ROI for Machine Operator Training This proposed program included significant capital expenditures and the creation of a Canadian training facility (Renaud, 1997). Prior to pursuing the project, an ROI was developed using a small-scale pilot effort. The ROI was developed using methods typically reserved for post-program evaluation. The results of the process can apply to almost any type of setting in which a major training expenditure is under consideration (Renaud, 1997). According to Renaud, one of the most difficult tasks in completing this ROI evaluation was estimating the expected benefits from the program. The pilot program presented some measurable improvements and this information was
  • 27. 18 used in five tangible benefit areas: training time, machining scrap, turnover, safety, and maintenance expense (Renaud, 1997). As a standard practice, supervisors recorded production shortfalls with new employees until they reached the standard rate for a machine. These losses were essentially production lost to trainees taking the time allowed to learn to operate a machine at a standard rate. The pilot program showed a 64% reduction in this production lost to training, and the supervisors estimated that trainee losses could be reduced by 50% with a structured training program (Renaud, 1997). Many factors contribute to machining scrap; one of the biggest factors is the lack of training of new and inexperienced operators. The supervisors estimated that there could be at least a 10% reduction in total scrap costs with the new training program. The turnover rate in the machining area was eight employees per month; because of the smaller numbers of employees involved in the pilot program, turnover reduction data were inconclusive. The supervisors felt that training could reduce turnover by at least 30%; a 30% savings was $115,200. This estimate was considered conservative (Renaud, 1997). Most of the accidents in the machining area were not lost-time injuries. The pilot program indicated a 25% reduction in accidents, but the supervisors estimated that accidents could be reduced by 30%. To remain conservative, the 25% value was used, resulting in an annual savings of $14,250 (Renaud, 1997). Effective training of new employees should result in less maintenance required on production machines. The pilot program showed a dramatic
  • 28. 19 reduction of 45%; however, the supervisors estimated that the unscheduled maintenance expenses could be reduced by 10% each year with the implementation of the training program (Renaud, 1997). The total projected annual savings were $304,950. The annualized costs were $131,500. The annual gross savings of $304,950 less the program costs of $131,500 resulted in a net savings of $173,450. The expected ROI ($173,450 / $131,500) for the first year was 132%. The investment in the equipment and the program development was to be spread over several years (Renaud, 1997). Getting Results with Interpersonal Skills Training Because of the soft nature of the skills involved, interpersonal skills training posed a particular challenge when calculating ROI. Russ-Eft (1994) described a very successful, commercially available interpersonal skills training program being implemented in a large information service organization in New York. The organization was facing several challenges during the implementation of the training program: it had more than 100 locations spread across the United States and wanted to bring about a cultural change to make the climate more supportive and cooperative and to foster improved performance throughout the organization. The ROI and evaluation study was designed to identify solutions to these cost-related issues (Russ-Eft, 1994). The organization decided to use a financial approach to justify the implementation of the skills program. The parameters were defined and the evaluation was conducted using surveys designed to evaluate the transfer of skills acquired. The evaluation instruments gathered ratings of subjects’ on-the-
  • 29. 20 job behavior as well as ratings of organizational climate and job satisfaction. The on-the-job behavior ratings included 44 items grouped into four categories: dealing with problems, communicating with co-workers, working with supervisors, and improving work. Members of the trained and control groups then indicated the percentage of time that they spent on the job dealing with the items grouped under the four behavioral categories. Ratings of organizational climate and job satisfaction were obtained from seven additional items (Russ-Eft, 1994). The Russ-Eft study used pre-training and post-training ratings of the behavior of training and control groups, to which people had been randomly assigned. Ratings were gathered from the members of both groups, their supervisors, and their colleagues. Pre-training ratings were gathered immediately before training; post-training ratings were gathered approximately three months following training (Russ-Eft, 1994). A series of analyses of variance with repeated measures were conducted. These analyses compared trained participants with control participants, training- group supervisors with control-group supervisors, and trained-group colleagues with control-group colleagues (Russ-Eft, 1994). The costs of training included trainees’ time away from work, trainers’ time away from work for preparation and training, the costs of materials used during the sessions, the time required for designing the sessions, and certification costs. Costs incurred for the entire population of 85 trainees were estimated at approximately $70,000 (Russ-Eft, 1994). The analysis of variance showed overall improvement, retrospectively
  • 30. 21 comparing skill ratings before and after training (i.e., post-training ratings of pre- training skill) after the training was completed. Significant differences appeared between trained and control groups due to these overall improvements (Russ-Eft, 1994). The results indicated that the total benefits were approximately $305,000 for a sample of 42 trainees out of the total population of 85 trainees; resulting in a net benefit of $235,000 when the costs are subtracted from the benefits. The traditional ROI formula yields an ROI of 336%. These calculations underestimate the net benefits somewhat, as none of the indirect benefits was included in the analysis (Russ-Eft, 1994). Training’s Contribution to a Major Change Initiative Stone (1997) illustrated how the ROI was calculated for an extensive training program for relationship bankers headquartered in North Carolina. The program evaluated the impact of training in the face of a variety of other change initiatives, including a process improvement effort implemented prior to the training. The study illustrated one approach to isolate the impact of the various factors contributing to improvement (Stone, 1997). Because the decision to determine the ROI for training was not made until after the program had been implemented, the process of calculating the ROI was much more difficult. Several factors can contribute to performance improvements. In the Stone study, the possible factors were the re-engineering effort that preceded the training, the support of the deal team, incentives, coaching by managers, capital market liaison assistance, and other training
  • 31. 22 initiatives. Strategies were thus required to isolate the effects of the training. Four strategies were considered: control group arrangement, trend-line analysis, estimates taken directly from participants, and estimates taken directly from the managers of the participants (Stone, 1997). The data collection plan included several approaches to converting data to a monetary value. The specific benefit from each of the six business performance measures had to be converted to dollar values so they could be compared to the training program costs. For the customer satisfaction measure, it was decided that no value would be placed on the actual training; instead, customers would be asked to indicate the specific benefit they received (Stone, 1997). The participant and manager questionnaires provided significant information on changes in behavior; the information obtained indicated the skills were being used on the job. Based on the input from the team questionnaire, the five business measures from the training program with the strongest influence were increased sales of capital market products, improved customer satisfaction, improved employee satisfaction, new business from existing clients, and new relationships established. Customers were asked to provide specific information regarding the impact of the training on the business; 67% of customers responded that the bankers added value to their business (Stone, 1997). The ROI for the training was developed only based on participants’ input. The total fully-loaded cost to train the participants was $698,725. When this amount is combined with the benefits in the standard ROI formula, the ROI comes out to 47.2%. Although this value may be lower than anticipated, the
  • 32. 23 return is much higher in reality; the 47.2% ROI value is an understatement of the actual return that does not consider several factors. When factored in, the actual ROI could easily approach a value in the 100% to 200% range (Stone, 1997). ROI Case Studies Table 1 has a list of additional case studies that are pertinent to using ROI in a training environment and that hold relevance to this study.
  Table 1
  ROI Case Studies
  Organization | Industry | Program | ROI (%)
  Office of Personnel Management | U.S. Government | Supervisory Training | 150
  Magnavox Electronic Systems Company | Electronics | Literacy Training | 741
  Litton Guidance and Control Systems | Avionics | Self-Directed Work Team Training | 650
  Coca-Cola Bottling Company of San Antonio | Soft Drinks | Supervisory Training | 1,447
  Texas Instruments | Electronics | Sales Negotiation Training | 2,827
  Apple Computer | Computer Manufacturing | Process Improvement Training | 182
  Hewlett-Packard Company | Computer Support Services | Sales Training | 195
  First National Bank | Financial Services | Sales Training | 555
  Causeway Corporation | Financial Services | Total Quality Management Training | 154
  • 33. 24 Table 1 (continued)
  ROI Case Studies
  Organization | Industry | Program | ROI (%)
  Multi-Marques Inc. | Bakery | Supervisory Work Process Analysis Training | 215
  Midwest Banking Company | Banking | Loan Officer Training | 1,988
  Financial Services Inc. | Financial Services | Human Resource Selection Training | 2,140
  North County Electric & Gas | Electric and Gas Utility | Applied Behavior Management Training | 400
  Yellow Freight System | Trucking | Performance Management Training | 1,115
  Healthcare, Inc. | Healthcare Services | Sexual Harassment Training | 1,052
  Apex Corporation | Manufacturing and Distribution | Advanced Sales Skills Training | 2,981
  Eastman Chemical Company | Chemical | Empowerment Training | 2,307
  Nortel Learning Institute | Telecomm | Finance Training | 317
  NYNEX Corporation | Communications and Media | Information Technology Training | 511
  Texas Instruments Systems Group | Technology | Negotiation Skills Training | 2,827
  First Union National Bank | Banking | Change Initiative Training | 472
  Bell Atlantic Network Services | Telecomm | Computer-Based Maintenance Training | 319
  • 34. 25 Table 1 (continued)
  ROI Case Studies
  Organization | Industry | Program | ROI (%)
  Speedy Telecommunications Company | Telecomm | Performance Management System Training | 1,600
  Cracker Box, Inc. | Restaurant Chain | Performance Management Training | 298
  Focus Corporation | Computer Manufacturing | Build-to-Customer-Order Training | 570
  Verizon Communication | Telecomm | Customer Service Skills Training | -54
  Slick Manufacturing | Government Agency (Ireland) | Computer Training | 125
  Compiled from In Action: Measuring Return on Investment, Volume 1, by J. J. Phillips (1994), In Action: Measuring Return on Investment, Volume 2, by J. J. Phillips (1998), and In Action: Measuring Return on Investment, Volume 3, by J. J. Phillips (2001).
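  The case-study figures summarized above and in Table 1 all rest on the same arithmetic: a benefit-cost ratio (program benefits divided by fully loaded costs) and an ROI percentage (net benefits divided by costs). The short sketch below restates both formulas and reproduces the McCarty (2001) and Renaud (1997) figures cited earlier; the function names are illustrative only and are not drawn from any of the cited studies.

      def benefit_cost_ratio(benefits, costs):
          """Benefit-cost ratio: total program benefits divided by fully loaded costs."""
          return benefits / costs

      def roi_percent(benefits, costs):
          """ROI: net benefits (benefits minus costs) divided by costs, as a percentage."""
          return (benefits - costs) / costs * 100

      # McCarty (2001): $333,168 in benefits against $136,530 in fully loaded costs
      print(round(roi_percent(333_168, 136_530)))  # 144

      # Renaud (1997): $304,950 in projected annual savings against $131,500 in annualized costs
      print(round(roi_percent(304_950, 131_500)))  # 132

  The same function applied to the Russ-Eft (1994) figures cited earlier (roughly $305,000 in benefits against $70,000 in costs) yields approximately 336%, matching the reported value.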
  • 35. 26 Chapter III: Methodology Overview The success of a training program at SP, a pharmaceutical company located in the Midwest, was examined for this quantitative study. Two specific objectives of this study were met through the implementation of a comprehensive data collection and analysis process: (a) to examine the specific impact of the training program in measurable business contributions to the extent possible, up to and including the calculation of the ROI for SP; and (b) to examine the extent to which participants applied on the job what they learned during the training. At SP, the management had become more interested in measuring the impact of training and development programs. Four major trends were driving these actions: 1. Training programs were rapidly getting more expensive to develop and deliver. 2. The importance of training in meeting strategic objectives within SP placed the training process at a level where accountability was necessary. 3. A trend toward measurement and metrics at SP was recognized because of regulatory compliance issues. 4. Executive management, in an attempt to manage resources efficiently at SP, had brought closer scrutiny to the training and development process and was requiring accountability for large training expenditures. Collectively, these trends were driving a need for more accountability and evaluation in training and employee development at SP.
  • 36. 27 Restatement of the Problem The problem addressed in this study was to determine whether a leadership-training program had a positive financial impact on a business and to present a verifiable, valid, and meaningful ROI. SP had a major need to determine the value and applicability of its leadership training to the company for current and future leaders. Therefore, a study of this nature was proposed, based on hypotheses and research questions addressing the differences in various learning relationships across four organizational groups in a targeted company. Many studies (Phillips, 1994; Phillips, 1997; Phillips, 2001) indicated a positive ROI, but the lack of a statistically sound approach in the studies noted (see the Review of Related Literature for examples) hinders verification of the reliability of their data. Statement of Hypotheses Five hypotheses are defined in this section, and each involves an analysis of variance (ANOVA) test. The research questions were developed to test the hypotheses to which they apply, not to prove them. Research Question 1: What differences, if any, in the perceived relevance of the 15 objectives existed across the four classes at SP? H10: There will be no difference in relation to the perception of relevance of the 15 objectives across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). H1a: There will be a difference in relation to the perception of relevance of the 15 objectives across the four organizational groups at SP (Operations, Sales
  • 37. 28 and Marketing, Scientific Research, and Support Groups). Research Question 2: What differences, if any, in the perceived relevance of the six elements of the job existed across the four classes at SP? H20: There will be no difference in relation to the perception of relevance of the six elements of the job across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). H2a: There will be a difference in relation to the perception of relevance of the six elements of the job across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). Research Question 3: What differences, if any, in the perceived relevance of the five skill areas existed across the four classes at SP? H30: There will be no difference in relation to the perception of relevance of the five skill areas across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). H3a: There will be a difference in relation to the perception of relevance of the five skill areas across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). Research Question 4: What differences, if any, in the perceived relevance of the 13 topics in one’s own work or that of the work unit existed across the four classes at SP? H40: There will be no difference in relation to the perception of relevance of the 13 topics in one’s own work or that of the work unit across the four organizational groups at SP (Operations, Sales and Marketing, Scientific
  • 38. 29 Research, and Support Groups). H4a: There will be a difference in relation to the perception of relevance of the 13 topics in one’s own work or that of the work unit across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). Research Question 5: What are the differences, if any, in the ROI across the organizational groups at SP? In this context, a significant difference simply means there is statistical evidence that there is a consistent difference; it does not mean the difference is necessarily large or important (Sleezer, 1994). H50: There will be no meaningful ROI across the organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). H5a: There will be meaningful ROI across the organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). Description of Research Design Phillips’ ROI Methodology: This ten-step model provides a process for collecting data, summarizing and processing data, isolating the effects of programs, converting the data to monetary value, and capturing the actual ROI. In the first step, the planning is initiated and the specific business drivers of the solution are identified. Discussion and decisions revolve around how the solution will satisfy the business drivers. Business measures are clearly identified. The objectives are established and revised to ensure that stakeholders agree on the training to be applied, the behavior changes initiated and the business impact measures to be influenced.
  • 39. 30 Detailed planning takes place in the next step. The purpose of the evaluation is clearly defined and baseline data is developed and collected. If the purpose is to calculate the ROI, the entire ROI process (ten steps) will be followed. This step includes determining the data collection strategy and developing the necessary planning documents that specify in detail how steps three through ten will be carried out. The third step involves the client organization collecting the Level 1 and Level 2 evaluation data during the solution implementation (the evaluation levels are explained in Chapter 5). In step four, data on the application of training, behavior changes and business impact are collected. The business impact data is converted to monetary values to calculate the ROI. Throughout the process, data is collected at all levels to show a chain of impact up to the highest level that satisfies the purposes of the study. Step five begins the data analysis phase of the process as the effects of the solution are isolated to determine the extent that the business measures were influenced by the solution. The sixth step is only applied when the purpose of the evaluation includes calculating the ROI. If stakeholders have determined that there is no interest in the ROI calculation for a specific initiative, then the business impact and behavior change data is reported minus the calculation. Step seven reports data on intangible benefits along with business metric improvements. Barriers and enablers to implementation and behavior change are also reported. Any improvement in behavior and business metrics influenced by
  • 40. 31 the solution is reported in step ten. Before the ROI is calculated (step nine), the costs (step eight) are compared to the benefits that are converted to a monetary value from step six. Additionally, all of the data from steps three, four, five, and seven are reported, along with conclusions and recommendations. Conclusions address information such as what caused the results, what worked, and what did not work, while recommendations address where to go from there and how the findings can be used to implement improvement. In this study, two methods were used to collect data in keeping with the Phillips’ ROI methodology. A follow-up questionnaire (see Appendix A) was used to determine the extent to which participants utilized the training and achieved self-reported on-the-job success. The action plan (see Appendix B) was implemented during the training to identify areas for individual improvement as a result of the training program, to link achievements to department-level contributions, and to convert the contributions to monetary value. Participants were required to continue using the process to track progress and collect actual performance data for a three-month period after the last training session. Participant estimates of training impact were a reliable indicator when appropriate steps were taken to collect the data. The participants were the closest individuals to the performance improvement and often were aware of the other influences that affected the performance measures. For this study, participants were asked to indicate the degree to which a specific improvement resulted from the training program. The action plan was the tool used to capture this data. The action plan was a tested and validated document approach
  • 41. 32 devised by the ROI Institute (Phillips, 1997). While data can be converted to monetary values in many ways, the primary strategy that was used in this study was to ask participants to make estimates and calculations based on the improvement in their work units. The participants used accepted standards and conversion factors to arrive at monetary values. The leadership training program was believed to show positive business results using the return on investment methodology. A structured approach was employed, allowing a comparison of the data gathered from questionnaires and action plans. The structure used in this study, designated by Gay and Airasian (2000) as a quasi-experimental design, is well suited to situations where it is not feasible to randomly assign individual participants to groups. This situation arose from the purposive sampling methods that were applied in this study. The following criteria were used for this quasi-experimental study: (a) four training courses took place—a total of 48 employees were trained, (b) two experienced senior leaders of SP, who were certified through an extensive train- the-trainer program, conducted the class instruction, (c) the class was designed for a group of 10-16 students, and (d) all assignments among all classes had the same objectives. Sample Size Selection. A population of 200 was identified by SP management from which to select the sample size. It was determined that only about 65 current employees of the leadership population of approximately 200 would require the training—based on a screening of performance reviews and
  • 42. 33 succession planning data by SP, which were not a part of this study. Only 48 participants (24% of the total population, but 74% of the target population) would be available for the training and subsequent study; therefore, sample size calculations were based on the following data. The acceptable margin of error was set at 5% with a confidence level of 95%. The sample size of 48 would yield a response distribution of less than 4% based on an anticipated 100% response rate (which was achieved). With this in mind, a sample size of 48 was likely to provide meaningful input. Therefore, the sample of 48 drawn from the approximately 65 eligible and required employees represented about 73.8% of the target population, with a confidence level of 95% and a confidence interval of 7.32 (based on a sample size of 48 from a population of 65 and a worst-case percentage of 50%). The size of the study sample was critical to producing meaningful results. A power analysis could have been used to determine whether the sample was large enough; however, because neither the effect size nor a useful standard deviation (based on past data) was known, a power analysis was not performed. All research questions were tested to determine whether a meaningful effect existed, that is, whether the null hypothesis could be rejected. Operational Definition of Constructs and Key Variables The independent and dependent variables for each research question are presented in this chapter under the Statement of Hypotheses in the five research questions and the accompanying null and alternate hypotheses. As well, the
  • 43. 34 dependent variables for each research question can be found in the referenced Statement of Hypotheses section. Research Question 1. There were four organizational groups selected (Operations, Sales and Marketing, Scientific Research, and Support Groups) with each group being an independent variable. These four organizational groups were used consistently throughout the research as the independent variable for all five of the hypotheses and the associated research questions. Independent variable data was collected using both the questionnaires and the action plans. The averaged class score for the research question (based on the questionnaire responses) was the dependent variable for the ANOVA test in hypothesis one. For hypothesis one and research question one, the questionnaire was the source of all data. Research Question 2. There were four organizational groups selected (Operations, Sales and Marketing, Scientific Research, and Support Groups) with each group being an independent variable. The averaged class score for the research question (based on the
  • 44. 35 questionnaire responses) was the dependent variable for the ANOVA test in hypothesis two. For hypothesis two and research question two, the questionnaire was the source of all data. Research Question 3. There were four organizational groups selected (Operations, Sales and Marketing, Scientific Research, and Support Groups) with each group being an independent variable. The averaged class score for the research question (based on the questionnaire responses) was the dependent variable for the ANOVA test in hypothesis three. For hypothesis three and research question three, the questionnaire was the source of all data. Research Question 4. There were four organizational groups selected (Operations, Sales and Marketing, Scientific Research, and Support Groups) with each group being an independent variable. The averaged class score for the research question (based on the questionnaire responses) was the dependent variable for the ANOVA test in hypothesis four. For hypothesis four and research question four, the questionnaire was the source of all data. Research Question 5. There were four organizational groups selected (Operations, Sales and Marketing, Scientific Research, and Support Groups) with each group being an independent variable. The ROI for each group, expressed as a percentage and derived from the action plan data, was the dependent variable for the ANOVA test in hypothesis five. For hypothesis five and research question five, the action plans were used as the data source. Description of Materials and Instruments Questionnaire. The participants received a questionnaire (see Appendix A) that provided data regarding the extent to which participants used the training on the job while involved in the program and the results that came from these applications. The questionnaire had the participants: (a) rate the success of the course in meeting 15 objectives, (b) rate the relevance of the program elements
  • 45. 36 to the job, (c) indicate the degree to which their use of the 15 skills was enhanced, and (d) indicate the extent to which they thought this course would influence the measures in their own work or that of the work unit. The questionnaire included a 13-item checklist, and it requested examples and details. The questionnaire approach and format had been validated through successful and effective use in multiple ROI projects and case studies completed by the ROI Institute, as well as by ROI practitioners worldwide (Phillips, 1994; Phillips, 1997; Phillips, 2001). Used in conjunction with the action plan, the questionnaire was an invaluable assessment tool. Action Plan: The requirement for the action plan (see Appendix B) was communicated prior to the program start date. On the first day of training, the program instructors described the action planning process in a 15-minute discussion. The participants received notepads on which to capture specific action items throughout the training program. They were instructed to make notes when they learned a technique or skill that would be useful in improving one of the three measures that they each identified as important. In essence, this notepad became a rough draft of the action plans. For this mixed-method study, the analysis relied on the reasons behind various aspects of the results; the why and how of improved leadership were investigated. The need was for smaller and more focused samples rather than large random samples. The researcher relied on qualitative methods for gathering information on: (a) participation in the training, (b) direct experiential learning, and (c) analysis of outcomes through documentation. The action plans
  • 46. 37 provided qualitative data that was converted to quasi-quantitative data by the participants. The action planning process was discussed in detail in a one-hour session during the second day of training. This discussion included three parts: (a) the actual forms, (b) the guidelines for developing action plans and SMART (specific, measurable, achievable, realistic, and time-based) requirements, and (c) examples to illustrate what a complete action plan should look like. The instructors distributed five blank action plans (only three are required, one for each measure) and examples of completed action plans. During the second day of training, the participants completed the booklets. The participants worked in teams to complete all three action plans. Each plan took about 20-to- 30 minutes to complete. During the third day of training, the participants briefly reviewed the action planning process as a group, with each action plan taking about five minutes to review. The program instructors then explained the follow-up steps to the group. Selection of Subjects Four functionally-defined groups (Operations, Sales and Marketing, Scientific Research, and Support) were asked to submit the names of candidates they considered key and potential leaders that would be contributing significantly to the future of SP. They were considered the role models or leaders within their respective departments, as determined by the executive leadership at SP. The individuals that participated varied from new first-line supervisors to director-level management.
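  The confidence interval of 7.32 reported in the Sample Size Selection discussion above follows from a standard finite-population margin-of-error calculation. The short sketch below is illustrative only; it was not part of the study's procedures, and the function and variable names are assumptions introduced here.

      import math

      def finite_population_margin_of_error(n, population, p=0.5, z=1.96):
          """Margin of error for a sample of size n drawn from a finite population,
          at the confidence level implied by z, using the worst-case proportion p."""
          standard_error = math.sqrt(p * (1 - p) / n)
          # Finite population correction for sampling without replacement
          fpc = math.sqrt((population - n) / (population - 1))
          return z * standard_error * fpc

      # Sample of 48 from a target population of about 65, 95% confidence, p = 0.50
      print(round(finite_population_margin_of_error(48, 65) * 100, 2))  # about 7.3

  With these inputs the formula yields a margin of error of roughly 7.3 percentage points, consistent with the reported confidence interval of 7.32; the small difference is attributable to rounding or to the particular calculator used in the study.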
  • 47. 38 From a population of about 65 leaders companywide who were in need of the leadership training, 48 were selected by executive management to participate in the training. For the population of 48 participants, the breakdown by function was Operations, 15; Sales and Marketing, 12; Scientific Research, 10; and Support Groups, 11. The sample size of 48, with a confidence level of 95% and a confidence interval of 7.32, allowed for a meaningful number of participants, adding to the validity of the study. Procedures The training program was conducted in several sessions, with each session lasting two to four hours and delivered one or two days a week for three to four weeks. An orientation session also was conducted prior to the first session. The training program included 48 participants from various functional organizations of SP. Employees attended the training sessions on SP’s time. This sample represented about 24% (48 participants) of a total target population (about 65 targeted leaders and potential leaders) at SP. The research included four phases: process planning, data collection, data analysis, and the communication of results. This was summed up in an ROI impact study of the Strategic Leadership training program within SP. 1. Process planning was the most critical phase. Thorough planning ensured that the process addressed the appropriate objectives and used the proper data collection instruments. 2. Data collection occurred at two periods—during the selected training program(s) to measure participant reaction, satisfaction, and learning; and on a
  • 48. 39 post-program basis to gather information on the application of skills and knowledge as well as the impact the training program had on the organization. 3. The results of the program were shown by isolating the effects of the training program. The costs were tabulated and the ROI calculation was developed. 4. The communication of results included several issues that are often neglected, but are as important as the process itself: (a) the process and measurements were meaningless without communication; (b) communicating the results was necessary to make improvements and to show accountability in the training programs; (c) communication was a sensitive issue and could have been a source of great benefit or a cause of major problems; and (d) the varieties of target audience need different information. Calculating the ROI required a value to be placed on each data element connected with the training programs. The following are some strategies that were used to convert data to monetary values. 1. Some output data converted to cost savings or profit contributions and was able to be reported as a standard value, such as increased sales. 2. The cost of some quality measures were calculated and reported as a standard value, such as customer complaints. 3. The historical costs of preventing a measure were used when available, such as with time lost to accidents. 4. External databases contain an approximate value or cost of some data elements, such as employee turnover.
  • 49. 40 5. Internal and external estimates of the value of a measure, such as employee complaints. 6. Measures linked to other measures for which costs are developed, such as employee satisfaction linked to turnover. 7. Supervisors’ or managers’ estimates of costs or values, such as unscheduled absence. 8. Employee time saved converted to wages and benefits. 9. Participants’ estimates of the cost or value of the data element, such as work group conflict. 10. Training staff estimates of the value of a data element, such as harassment complaints. Converting data to monetary benefits was critical. The process was challenging, but was methodically accomplished using one or more of the above strategies. Data Collection Issues. A data collection strategy was designed to meet the objectives of this study. The questionnaires and the action plans were utilized to ensure that adequate, quality input was obtained for the evaluation. In both data collection methods, the focus was on impact and not process. Consequently, very little effort was made to collect input on the actual training delivery processes and mechanisms themselves, although some data were collected. Most of the emphasis was on the impact of the program, which was obtained with evaluation Levels 3 and 4 for data collection and analysis (Phillips, 1983; Kirkpatrick, 1998). To remain objective, data were collected only from people who took part in
  • 50. 41 the training program. Although data from the instruction team and others could have proven helpful, it was essential that input was free from any perceived bias. These steps helped to ensure that the process was unbiased, objective, and contained a minimum of errors. Data Collection Timing. Another important issue to address in this study was the timing of the data collection. Although the training program was designed to have a long-term impact, the specific improvements from the training program would be difficult to capture if assessed years after the training program was completed. For longer periods, additional variables could influence output measures, thus complicating the relationship between the training and the improvement. The training program was time-spaced (one or two sessions a week for four to six weeks), which provided opportunities for on the job application of the training on an ongoing basis. Because of the above factors, it was decided to measure the success of the training program during a three- month period after the last training session. The data were collected and analyzed over a three-month period following the training. This data was then extrapolated over a 12-month period to simulate standard practice. A standard practice in program evaluation is to capture the annual benefits after the program has been conducted and compare them to the cost of the program (Brinkerhoff, 1994; Graber, 1997). In essence, this limits the benefits in an ROI calculation to the impact of a training program for one year of improvements. While this could slightly overstate the results in some cases, it usually understates them in practice. The skill transfer techniques used by the
  • 51. 42 instructors were among the most effective in the training industry at building the confidence and skills that contribute to long-lasting results. Questionnaire and Action Plan. The most common follow-up method— questionnaires—provided a rich source of information on the extent to which participants applied what they had learned in the program and the success they achieved with the application. Because of the need for business impact data, the action plan process provided a capable means of gauging the actual impact of the training program as its information was applied. The action plan was also a useful tool to keep employees focused on changing their behavior in the work setting. Appendix A is an example of the questionnaire and Appendix B is an example of an action plan. The action plan’s necessity was communicated prior to the program start date. On the first day of training, the program instructors described the action planning process in a 15-minute discussion. The participants received the questionnaire 90 days after the completion of the leadership-training program. It provided data regarding the extent to which participants used the training on the job while the training program was ongoing and the results that came from these applications. Effects of Training Isolation. Participant estimates of training impact are a reliable indicator when appropriate steps are taken to collect the data (Phillips, 2002). The participants were the closest individuals to the performance improvement and were often aware of the other influences that affect the performance measures. For this study, participants were asked to indicate the
  • 52. 43 degree to which a specific improvement was caused by the training program. The action plan was the tool used to capture this data. Participants’ direct estimates were deemed the most appropriate technique of evaluation for this study. Their estimates of the impact of training are a reliable indicator of value (Phillips, 2002). The participants are the individuals closest to the training and are often aware of the other influences that have an impact on the leadership training measures. In studies where participants’ estimates have been compared to the differences obtained from control group experiments, their estimates were found to be very reliable (Bernthal, 1994; McCarty, 2001; Russ-Eft, 1997; Westcott, 1994; Zigarmi, 1997). Data to Monetary Value Conversion. While data could be converted to monetary values in many ways, the primary strategy that was used in this study was to ask participants to make estimates and calculations based on the improvement in their individual work units. Participants used accepted standards and conversion factors to arrive at monetary values. Tabulating the costs of the training effort involved monitoring or developing all of the costs related to the training program. A fully-loaded cost profile was recommended when tabulating all of the direct and indirect costs (Marelli, 1993). The return on investment was then calculated by comparing the monetary benefits and costs. The benefit-cost ratio was obtained by dividing the monetary benefits of the program by the costs. The return on investment used the net benefits (benefits minus costs) divided by costs. This formula is commonly used to evaluate other investments where the ROI is traditionally
  • 53. 44 reported as earnings divided by investment. Intangible Benefits. Intangible results are those benefits that cannot be assigned a dollar value or the assigned value is questionable. Even though these benefits were not used in the ROI calculation, they are important to the goals of SP. Comments from the participants and action plan reporting indicated various intangibles that will benefit SP. Discussion of Data Processing The data collection strategy was designed to meet the objectives of this study. The action plans and the questionnaire were utilized to ensure that adequate, quality input was obtained for the evaluation. To assure anonymity, the questionnaire was sent confidentially from Corporate Training and Development to the 48 participants and the participants were not required to write their names on it. Experience has shown that participants will provide more data that are valid if anonymous feedback is ensured. They were under no pressure to exaggerate the data to impress superiors. The questionnaires and action plans were returned to the Corporate Training and Development department for review and analysis (note: the researcher is also the Director of Corporate Training and Development). Then a report was developed (an internal privileged document) and presented to the Executive Leadership Council for presentation to executive management at SP (see Appendix C). Responses to the questionnaire provided a very good source of data because of the number of write-in comments and the quality of data supplied.
  • 54. 45 The return rate of completed questionnaires was 100%, and the return rate of completed action plans was 100%. Descriptive and inferential statistics were used to analyze the questionnaire and action plan data. The primary analytical test used was the one-way analysis of variance (ANOVA); an illustrative sketch of this type of test appears at the end of this chapter. Tables and figures were used to show the distribution of the participants’ selections in absolute numbers, means, and standard deviations. Because the main thrust of this study was to determine the business impact of the training program, every attempt was made to uncover specific business results linked to the training program. The impact of the training program was presented to indicate the extent of application of the skills and knowledge obtained. Each participant was asked to select a number of skills they used the most on the job since taking the training program. The following tests were conducted in support of the five research questions and five hypotheses. Test 1. A one-way analysis of variance (ANOVA) was conducted to find if there existed a difference in the perceived relevance of instruction across the four organizational groups where the training took place. Each organizational group was the independent variable with the averaged class score for question 1 on the questionnaire as the dependent variable. Test 2. A one-way ANOVA was conducted to find if there existed a difference in the perceived relevance of instruction across the elements of the job for which the training took place. Each organizational group was the independent variable with the averaged class score for question 2 on the questionnaire as the
  • 55. 46 dependent variable. Test 3. A one-way ANOVA was conducted to find if there existed a difference in the perceived degree of enhancement from the instruction across the skills of the job for which the training took place. Each organizational group was the independent variable with the averaged class score for question 3 on the questionnaire as the dependent variable. Test 4. A one-way ANOVA was conducted to find if there existed a difference in the perceived influence that instruction had on the measure of performance in one’s own work or that of the work unit. Each organizational group was the independent variable with the averaged class score for question 4 on the questionnaire as the dependent variable. Test 5. A one-way ANOVA was conducted to examine if there existed a difference in the ROI across the organizational groups. Each organizational group was the independent variable and the ROI expressed as a percent was the dependent variable. Methodological Assumptions and Limitations Assumptions. The underlying assumptions for this study were: (a) two fulltime instructors were sufficient to deliver all the training throughout the entire initiative, and (b) standard SP leadership training curricula, used previously within SP, were suitable for use. Limitations. SP was in charge of assigning who participated in the leadership-training program; therefore, the researcher had no control over who took part in the training and only limited control over class size and scheduling.
  • 56. 47 Further, four organizational areas of one company were the focus of this study: Operations, Sales and Marketing, Scientific Research, and Support Groups. Therefore, the sample collected may not be representative of organizations in other companies, limiting the ability to generalize. Because the sample population obtained for this study was limited only to key and potential leaders within SP, it is insufficient to use for generalizations about the entire population of SP. The sample taken was representative only of the population of key and potential leaders requiring leadership training at SP in the four representative organizational groups, making it possible to generalize about this population and these organizational groups alone. Ethical Assurances Of the many definitions of the term ethics, no one definition has emerged as universally accepted. Any time ethics is the topic of discussion, terms such as conscience, morality, legality, trust, values, responsibility, and integrity will frequently be heard. Although these terms are closely associated with ethics, they do not—by themselves—define it. Assumptions about ethical underpinnings of human behavior are reflected in the complexities involved in relating one culture to another, the distribution of scarce resources, the allocation of power, the dynamics of groups, the codification of ethical constructs, and the rewarding of ethical behavior and discouraging of unethical behavior. As leadership becomes more complex and deals with more situations, the application of ethics can also become more complex (Bonhoeffer, 1995).
  • 57. 48 Two standards are applied in order to help protect the privacy of research participants. Almost all research guarantees the participants’ confidentiality; the stricter standard is anonymity, which essentially means that the participant will remain anonymous throughout the study. Clearly, the anonymity standard is a stronger guarantee of privacy, but it is often difficult to accomplish—especially when participants have to be measured at multiple time points (Blackburn, 1996). Increasingly researchers have had to deal with the ethical issue of a person’s right to service. Good research practice often requires the use of a non- treatment control group. When the program may have beneficial effects, however, persons assigned to the non-treatment control may feel their rights to equal access to services are being curtailed (Blackburn, 1996). Even when clear ethical standards and principles exist, there will be times when the need to do accurate research runs up against the rights of potential participants. No set of standards can possibly anticipate every ethical circumstance. Furthermore, there needs to be a procedure that assures that researchers will consider all relevant ethical issues in formulating research plans. The participants of this study were employees of SP and agreed to participate. They were broken into four groups, by organization, each of which received the learning materials and participated in the training. Each person completed the informed consent form (see Appendix D). The study was non- invasive, and the participants were asked only to complete the questionnaire and the action plan as appropriate for the group to which they were assigned. With no risks to any participant, no additional safeguards needed to be
  • 58. 49 established. All participants were identified by a randomly generated study participant number that only the researcher had access to. All information gathered was protected by confidentiality agreements and not shown to the organization and other members of SP. This researcher made every attempt to comply with the guidelines established in Appendix B of the Dissertation Handbook (Northcentral, 2005). Approval from the university’s Ethics Committee was granted via electronic notification (see Appendix E).
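  To recap the analysis plan described in this chapter before turning to the findings, the following minimal sketch shows how a one-way ANOVA of the form used in Tests 1 through 5 can be computed with SciPy. The group score vectors are fabricated placeholders for illustration only; they are not data from this study.

      from scipy import stats

      # Four organizational groups (the single factor) compared on one dependent
      # measure, e.g., the averaged class score for a questionnaire question.
      # These values are fabricated placeholders, not study data.
      operations = [3.1, 3.4, 2.9, 3.3, 3.0]
      sales_and_marketing = [3.0, 3.2, 3.1, 2.8, 3.3]
      scientific_research = [3.2, 3.5, 3.1, 3.0, 3.4]
      support_groups = [3.3, 3.1, 3.4, 3.2, 3.0]

      f_statistic, p_value = stats.f_oneway(operations, sales_and_marketing,
                                            scientific_research, support_groups)
      print(f"F = {f_statistic:.2f}, p = {p_value:.3f}")
      # The null hypothesis of equal group means is rejected only if p < .05.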
  • 59. 50 Chapter IV: Findings Overview The impact of a leadership-training program on the bottom-line of the organization using return on investment (ROI) methodology was examined in this study. From a population of about 200 leaders companywide, 48 were selected by executive management to participate in the training and, thus, the study. The training spanned 15 facilities in the St. Louis area and all divisions of SP. Four series of the leadership course were taught during the month of December 2006, and data were collected from mid-December 2006 to mid-March 2007. Data were analyzed in March and April 2007. Table 2 displays research population demographics. Each participant submitted a completed questionnaire and each participant submitted a completed action plan for review and analysis. The findings of this study are based on the data gathered during this process. A 100% questionnaire and action plan return rate (N = 48) for all four series of classes was achieved. None of the submitted questionnaires and action plans had to be rejected. Absence of errors or ambiguities on the survey responses is attributed to special care taken by all instructors (a) to begin each training session with a class review of the intent of questionnaire and action plan questions, as well as reviewing the scaling used to record question responses; and (b) to reiterate the course requirement that an anonymous questionnaire and an anonymous action plan be submitted by each participant upon completion of each training session.
  • 60. 51 Table 2
  Demographics of the Population (N = 48)
  Demographic | Number | Percentage
  Participants by Class/Operating Area
  Class 1, Operations | 15 | 31.25
  Class 2, Sales and Marketing | 12 | 25.00
  Class 3, Scientific Research | 10 | 20.83
  Class 4, Support Groups | 11 | 22.92
  Participants Taught by Instructor
  Instructor 1, Classes 1 and 3 | 25 | 52.08
  Instructor 2, Classes 2 and 4 | 23 | 47.92
  Number of Participants by Gender
  Male | 37 | 77.08
  Female | 11 | 22.92
  For the population of 48 participants, the percentage breakdown of respondents by class/operating area was: Class 1, Operations, 31.25%; Class 2, Sales and Marketing, 25.00%; Class 3, Scientific Research, 20.83%; and Class 4, Support Groups, 22.92%. The breakdown of respondents by instructor was: Instructor 1, Classes 1 and 3, 52.08%; and Instructor 2, Classes 2 and 4, 47.92%. The majority of respondents were male (77.08%). Findings Questionnaire. The average scores and the standard deviations for Questions 1, 2, 3, and 4 in the questionnaires are summarized in Table 3. The lowest average point value among all questions was a 2.94 (Question 4, Class 1), with the second lowest score being a 3.00 (Question 4, Class 4). This means 93.8% (15 out of 16 table entries) of the averaged class responses to
  • 61. 52 questionnaire questions were at or above the 3.00 median response level. This indicated positive overall training effectiveness based upon Kirkpatrick’s (1998) and Phillips’ (2003) training criteria, which will be discussed in detail in Chapter 5. It is difficult to attach greater meaningfulness to the data of Table 3 because (a) classes were small, ranging from 10 to 15 participants; (b) there were only four classes conducted; and (c) because of the small groups, there was meaningful variability in the standard deviation among the classes, ranging from .34 to .74. Table 3 Means and Standard Deviations for Questionnaire Response Average Across Four Classes) Class 1 Class 2 Class 3 Class 4 Program (n = 15) (n = 12) (n = 10) (n = 11) Element M SD M SD M SD M SD Question 1 3.13 .71 3.04 .62 3.21 .68 3.33 .63 Question 2 3.98 .70 3.21 .71 3.30 .74 3.38 .65 Question 3 3.50 .41 3.39 .47 3.48 .48 3.57 .34 Question 4 2.95 .66 3.01 .66 3.05 .69 3.03 .70 The average score and the standard deviation for the 15 objectives of Question 1 of the questionnaire, which asked the participant to indicate his or her degree of success in meeting these objectives, are summarized in Table 4. There were six possible responses for each objective in Question 1: 0 = Not Applicable, 1 = No Success, 2 = Very Little Success, 3 = Limited Success, 4 = Generally Successful, and 5 = Completely Successful. To avoid distraction resulting from an excessive number of tables inserted in the text, interested readers will find basic data tables for the completed questionnaires in Appendix F.
  • 62. 53 Table 4 Means and Standard Deviations for the 15 Objectives in Question 1 Across Four Classes) Class 1 Class 2 Class 3 Class 4 Overall (n = 15) (n = 12) (n = 10) (n = 11) (N = 48) Obj M SD M SD M SD M SD M SD A 3.27 .80 3.17 .72 2.90 .57 3.45 .52 3.21 .68 B 3.33 .49 3.17 .39 2.90 .74 3.64 .50 3.27 .57 C 3.93 .26 3.77 .39 3.70 .48 3.55 .52 3.60 .49 D 2.73 .59 2.75 .45 2.90 .32 3.45 .52 2.94 .56 E 2.53 .64 3.50 .52 2.50 .53 2.82 .40 2.83 .66 F 3.33 .49 3.42 .51 3.80 .42 3.73 .47 3.54 .50 G 3.27 .70 2.83 .39 3.40 .52 3.27 .47 3.19 .57 H 3.60 .51 3.83 .39 3.90 .32 3.73 .47 3.75 .44 I 3.27 .46 3.25 .45 3.80 .42 3.82 .40 3.50 .51 J 2.47 .64 2.50 .52 2.90 .32 2.55 .52 2.58 .54 K 2.53 .52 2.50 .52 2.80 .42 2.73 .47 2.63 .49 L 3.47 .52 2.75 .45 3.50 .53 3.09 .54 3.21 .58 M 3.67 .49 3.50 .52 3.80 .42 3.64 .50 3.65 .48 N 3.20 .41 2.92 .29 3.10 .32 3.73 .47 3.23 .47 O 2.33 .49 2.33 .29 2.20 .42 3.09 .83 2.48 .65 There were no responses in the Not Applicable or in the No Success response range. This means 100% of the averaged class responses to the 15 objectives in Question 1 were at or above 2, the Very Little Success response level. To visualize the distribution of point responses given by participants for each of the 15 objectives across the four classes taught, a graphical analysis was conducted. The results of this analysis are provided in Figures 1 through 4, portraying class response data for objectives A through O, respectively.
  • 63. 54 Examination of these figures reveals that, for the 15 objectives, the majority of responses were of 3 points or higher.
  Figure 1. Question 1 Breakdown of Questionnaire Responses for Class 1 (bar chart of response counts by objective, A through O).
  Figure 2. Question 1 Breakdown of Questionnaire Responses for Class 2 (bar chart of response counts by objective, A through O).
  • 64. 55 Figure 3. Question 1 Breakdown of Questionnaire Responses for Class 3 (bar chart of response counts by objective, A through O).
  Figure 4. Question 1 Breakdown of Questionnaire Responses for Class 4 (bar chart of response counts by objective, A through O).
  As will be discussed in Chapter 5, a predominance of responses above a 3-point neutral value is an indication that participants perceived both the teaching materials and the course instruction to be positively effective (Kirkpatrick, 1998; Phillips, 1983; Phillips, 1997). The average score and the standard deviation for the six program elements of Question 2 of the questionnaire, which asked the participants to rate,
  • 65. 56 on a scale of 1-to-5, the relevance of each of the program elements to his or her job, are summarized in Table 5. There were five possible responses for each objective in Question 2: 1 = No Relevance, 2 = Limited Relevance, 3 = Some Relevance, 4 = Relevant, and 5 = Very Relevant. Table 5 Means and Standard Deviations for the Six Elements in Question 2 Across Four Classes Class 1 Class 2 Class 3 Class 4 Overall Program (n = 15) (n = 12) (n = 10) (n = 11) (N = 48) Element M SD M SD M SD M SD M SD Group Discussions 3.20 .41 3.00 .43 2.90 .74 2.91 .70 3.02 .56 Small Team Discussion 3.47 .52 2.75 .45 2.60 .52 2.73 .47 2.94 .60 Case Study/Skill 3.00 .76 2.50 .52 2.90 .32 3.73 .47 3.02 .70 Exercises Program Content 3.87 .35 3.75 .45 3.90 .32 3.91 .30 3.85 .36 Team Building 4.47 .52 3.92 .51 4.20 .42 3.82 .40 4.13 .53 Strategies Special Projects 3.47 .52 3.33 .65 3.30 .48 3.18 .40 3.33 .52 There were no responses in the No Relevance response range. This means 100% of the averaged class responses to the six program elements in Question 2 were at or above 2, the Limited Relevance response level. To visualize the distribution of point responses given by participants for each of the six program elements, a graphical analysis was conducted. The results of this analysis are provided in Figures 5 and 6, portraying class response data for the six program elements, respectively. Examination of these figures
  • 66. 57 reveals that, for the six program elements across all four classes, the majority of the responses were of 3 points or higher.
  Figure 5. Question 2 Breakdown of Questionnaire Responses for Classes 1 and 2 (bar chart of response counts by program element, 1 through 6; panels for Class 1, n = 15, and Class 2, n = 12).
  Figure 6. Question 2 Breakdown of Questionnaire Responses for Classes 3 and 4 (bar chart of response counts by program element, 1 through 6; panels for Class 3, n = 10, and Class 4, n = 11).
  The average score and the standard deviation for the five skill areas of Question 3 of the questionnaire are summarized in Table 6. There were between 2 and 6 sub-parts of each skill area, and the sub-parts were averaged to reflect the response values for the five skill areas. Question 3 asked each participant to indicate the degree to which his or her application of the leadership skills or
  • 67. 58 behaviors were enhanced as a result of his or her participation in the Strategic Leadership training program. There were six possible responses for each objective in Question 3: 0 = No Opportunity to Use the Skill, 1 = No Change, 2 = Little Change, 3 = Some Change, 4 = Significant Change, and 5 = Very Significant Change. Table 6 Means and Standard Deviations for the Five Skill Areas in Question 3 Across Four Classes Class 1 Class 2 Class 3 Class 4 Overall (n = 15) (n = 12) (n = 10) (n = 11) (N = 48) Skill Area M SD M SD M SD M SD M SD Developing Creativity 3.57 .53 3.25 .50 3.40 .52 3.45 .42 3.43 .49 Motivating 3.20 .25 3.08 .19 3.20 .42 3.55 .27 3.25 .33 Delegating 3.23 .32 3.25 .26 3.55 .37 3.68 .25 3.41 .35 Communicating 3.76 .27 3.80 .47 3.63 .56 3.63 .36 3.71 .41 Decision Making 3.75 .19 3.58 .46 3.64 .45 3.84 .29 3.70 .36 There were no responses in the No Opportunity to Use the Skill or the No Change response range. This means 100% of the averaged class responses to the five skill areas in Question 3 were at or above 2, the Little Change response level. To visualize the distribution of point responses given by participants for each of the five skill areas, a graphical analysis was conducted. The results of this analysis are provided in Figures 7 and 8, portraying class response data for each of the five skill areas, respectively. Examination of these figures reveals that for the five skill areas, across all four classes, the majority of responses were of 3
  • 68. 59 points or higher.

Figure 7. Question 3 Breakdown of Questionnaire Responses for Classes 1 and 2 (response counts for skill areas A-E by averaged score range, 2.5-2.9 through 5; Class 1 n = 15, Class 2 n = 12).

Figure 8. Question 3 Breakdown of Questionnaire Responses for Classes 3 and 4 (response counts for skill areas A-E by averaged score range, 2.5-2.9 through 5; Class 3 n = 10, Class 4 n = 11).

The average score and the standard deviation for the 13 topics of Question 4 of the questionnaire are summarized in Table 7. In Question 4, participants were asked to indicate the extent to which they thought their application of the knowledge and skills learned in the Strategic Leadership
  • 69. 60 training program had had a positive influence on leadership measures in their own work or that of their work unit. There were five possible responses for each objective in Question 4: 1 = No Influence, 2 = Some Influence, 3 = Moderate Influence, 4 = Significant Influence, and 5 = Very Much Influence.

Table 7
Means and Standard Deviations for the 13 Topics in Question 4 Across Four Classes

          Class 1 (n = 15)   Class 2 (n = 12)   Class 3 (n = 10)   Class 4 (n = 11)   Overall (N = 48)
Topic     M      SD          M      SD          M      SD          M      SD          M      SD
A         3.20   .41         3.25   .45         3.50   .53         3.55   .52         3.35   .48
B         3.47   .52         3.33   .49         3.40   .52         3.64   .50         3.46   .50
C         2.73   .46         2.67   .49         2.70   .48         2.64   .50         2.69   .47
D         3.53   .63         3.50   .52         3.60   .52         3.45   .52         3.54   .54
E         3.13   .35         3.25   .45         3.20   .42         3.27   .47         3.21   .41
F         3.60   .51         3.67   .49         3.70   .48         3.73   .47         3.67   .48
G         2.07   .26         2.33   .49         2.20   .42         2.55   .52         2.27   .45
H         2.60   .51         2.50   .52         2.40   .52         2.45   .52         2.50   .51
I         2.53   .52         2.75   .45         2.60   .52         2.45   .52         2.58   .50
J         2.47   .52         2.67   .49         2.80   .42         2.55   .52         2.60   .49
K         3.40   .51         3.50   .52         3.60   .52         3.55   .52         3.50   .51
L         3.07   .26         3.33   .49         3.40   .52         3.18   .40         3.23   .42
M         2.47   .52         2.33   .49         2.50   .53         2.36   .50         2.42   .50

There were no responses in the No Influence response range. This means 100% of the averaged class responses to the 13 topics in Question 4 were at or above 2, the Some Influence response level. To visualize the distribution of point responses given by participants for each of the 13 topics across the four classes taught, a graphical analysis was conducted. The results of this analysis are provided in Figures 9 through 12,
  • 70. 61 portraying class response data for each of the 13 topics, respectively. Examination of these figures reveals that, for the 13 topics, the majority of responses were of 3 points or higher.

Figure 9. Question 4 Breakdown of Questionnaire Responses for Class 1 (response counts by rating value for topics A-M; n = 15).

Figure 10. Question 4 Breakdown of Questionnaire Responses for Class 2 (response counts by rating value for topics A-M; n = 12).
  • 71. 62 Figure 11. Question 4 Breakdown of Questionnaire Responses for Class 3 (response counts by rating value for topics A-M; n = 10).

Figure 12. Question 4 Breakdown of Questionnaire Responses for Class 4 (response counts by rating value for topics A-M; n = 11).

Action Plans. The data collected in Part I of the action plans will be used in a future study and by the company for developing future performance objectives; therefore, the Specific Steps, End Results, and Expected Intangible Results section will not be considered in this study. The qualitative data collected in Part I of the action plans were converted into quantitative data in Part II of the action
  • 72. 63 plans. Part II of the action plans was the basis for determining whether the training provided a positive ROI to the business. All Study Identifications were assigned randomly to the participants and numbered 01 through 48. The initial follow-up date for all participants was March 21, 2007; subsequent follow-up dates have been assigned for further internal study. For data analysis, each participant identified on the action plan was a member of one of the following four groups: Operations (class 1), Sales and Marketing (class 2), Scientific Research (class 3), or Support Group (class 4). Participants in the four classes agreed on a list of possible metrics for analysis on their respective action plans. As the need was for smaller but focused samples, the researcher relied on qualitative methods for gathering information, including: (a) participation in the training, (b) direct experiential learning and (c) analysis of outcomes through documentation. Because of the wide range of job requirements among the participants, a number of metrics had to be selected and agreed upon in order to ensure that each participant was effectively focusing on an area for improvement in which they had some control. Each participant then selected three metrics from this group to use on their individual action plan. Only one metric from each action plan, selected by the participant, was used during this study. A list of the possible metrics, as agreed to by the participants in the four classes, is shown in Table 8. The analysis data collected in Part II of the action plans consisted of subjective evaluations by all participants in the Strategic Leadership training program. Because the main thrust of this study was to determine the impact of
  • 73. 64 the Strategic Leadership training program, every attempt was made to uncover specific results linked to the program. Although the program was not designed to produce measurable, quantifiable results, it did produce significant changes that had an impact on the bottom line of the business.

Table 8
Metrics for Action Plans

Absenteeism            Accidents                  Budget
Coaching               Communication              Conflict management
Customer service       Decision making            Delegation
Feedback               Performance improvement    Planning
Planning               Productivity               Rewards and recognition
Teamwork development   Time management            Turnover

Participants rated improvements in skills and knowledge from the start of the Strategic Leadership training program. They considered the 18 metric areas listed in Table 8, each participant selecting one metric on which to focus for this study. From the 48 possible selections (one per participant), choices were distributed as shown in Table 9, with 10 of the possible 18 topics receiving priority emphasis.

Table 9
Action Plan Topic Selections

Metric Topic        Number Selected    Metric Topic       Number Selected
Efficiency          9                  Time management    4
Customer service    8                  Turnover           4
Delegation          6                  Absenteeism        3
Productivity        5                  Budget             3
Accidents           4                  Communication      2
  • 74. 65 The response rate was 100% and the input from each of the participants for the metrics being measured and analyzed is shown in Table 10.

Table 10
Action Plan Input from Participants

Participant ID    Annualized Improvement Value ($)    Contribution Estimate (%)    Confidence (%)    Adjusted Value ($)
01                21,000                              60                           50                6,300
02                30,000                              40                           50                6,000
03                20,000                              40                           60                4,800
04                18,000                              45                           50                4,050
05                26,400                              55                           60                8,712
06                16,000                              50                           50                4,000
07                30,000                              35                           60                6,300
08                25,200                              30                           75                5,670
09                51,120                              25                           50                6,390
10                15,000                              80                           80                9,600
11                25,800                              75                           60                11,610
12                9,600                               60                           50                2,880
13                71,000                              35                           40                9,940
14                12,000                              80                           75                7,200
15                55,000                              40                           50                11,000
Class 1 Total     426,120                                                                            104,452
16                30,000                              40                           60                7,200
17                21,000                              60                           60                7,560
18                19,200                              40                           60                4,608
19                24,000                              75                           50                9,000
20                18,000                              60                           90                9,720
21                14,400                              50                           100               7,200
22                30,000                              45                           50                6,750
23                28,800                              50                           80                11,520
24                30,000                              70                           50                10,500
25                5,760                               60                           90                3,110
26                14,400                              60                           60                5,184
27                19,200                              60                           100               11,520
Class 2 Total     254,760                                                                            93,872
  • 75. 66 Table 10 (continued)
Action Plan Input from Participants

Participant ID    Annualized Improvement Value ($)    Contribution Estimate (%)    Confidence (%)    Adjusted Value ($)
28                48,000                              60                           50                14,400
29                18,000                              60                           80                8,640
30                12,000                              50                           80                4,800
31                21,500                              45                           65                6,289
32                19,200                              50                           60                5,760
33                2,916                               85                           90                2,231
34                12,000                              50                           50                3,000
35                28,800                              40                           75                8,640
36                2,500                               75                           80                1,500
37                60,000                              80                           75                36,000
Class 3 Total     224,916                                                                            76,260
38                4,800                               80                           90                3,456
39                24,000                              45                           50                5,400
40                18,750                              90                           80                13,500
41                12,600                              50                           75                4,725
42                32,000                              40                           60                7,680
43                31,800                              50                           60                9,540
44                85,200                              40                           40                13,632
45                16,000                              60                           60                5,760
46                85,200                              30                           50                12,780
47                45,000                              45                           60                10,800
48                12,000                              60                           80                5,760
Class 4 Total     363,350                                                                            88,533
Grand Total       1,269,146                                                                          363,117

Comprehensive data can be found in Table G1 in Appendix G. Because the main thrust of this study was to determine the impact of the strategic leadership training program, every attempt was made to uncover specific results
  • 76. 67 linked to the program. In an effort to use the metrics to calculate business impact, participants provided annualized dollar values representing specific improvements related to the strategic leadership training program. A number of metrics were selected by the participants to ensure that each participant was effectively focusing on a personalized area for improvement. All 48 participants provided usable data, expressed in dollar values. Participants indicated the percentage of the improvement that was directly related to the training program. In addition, participants provided their level of confidence in the estimate they provided. The value, ranging from 100% for certainty to 0% for no confidence, reflects the perceived potential for error in the estimate. Two adjustments were made to the data, using the action plan form. First, the percentage of the improvement related to training was multiplied by the dollar value. Second, the confidence level estimate, expressed as a percent, was multiplied by the adjusted dollar value to adjust for the uncertainty of the data as perceived by the participant. The data taken from the participants’ action plans is presented in Table 10. These are subjective values, which are inherently problematic due to the potential mismatch between qualitative reports and quantitative scales. Therefore, the above adjustments were made to understate the results. From Phillips’ research, it is better to understate than overstate the results (Phillips, 1997). This approach is evident in the studies presented in the Review of Related Literature section of this study.
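In code, the two adjustments reduce to multiplying each annualized improvement value by the contribution estimate and then by the confidence estimate. The short Python sketch below applies that arithmetic to the first three Class 1 rows of Table 10; the function name is arbitrary.

```python
# Adjusted value = annualized improvement x contribution estimate x confidence
# estimate, as described in the text. Rows are Participants 01-03 from Table 10.
def adjusted_value(improvement_dollars, contribution_pct, confidence_pct):
    return improvement_dollars * (contribution_pct / 100) * (confidence_pct / 100)

rows = [
    ("01", 21_000, 60, 50),  # expected adjusted value: 6,300
    ("02", 30_000, 40, 50),  # expected adjusted value: 6,000
    ("03", 20_000, 40, 60),  # expected adjusted value: 4,800
]

for participant_id, improvement, contribution, confidence in rows:
    print(participant_id,
          f"${adjusted_value(improvement, contribution, confidence):,.0f}")
```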
  • 77. 68 An important task was to define which specific costs were to be included in a tabulation of the program costs. A fully loaded cost profile was developed in this study. This approach accounted for all the costs of training: instructors' salaries and benefits, participants' salaries and benefits, program development and material costs, and refreshments. The training was conducted in a company facility and there were no travel or lodging expenses.

The cost of instructors' salaries and benefits was estimated using their estimated average annual salary and benefits:
(a) Instructor 1: base salary of $96,500.00 x 1.42 for benefits = $137,030.00; divided by 1,950 work hours per year = $70.27 per hour; x 30 hours of course preparation and delivery time = $2,108.10 per class.
(b) Instructor 2: base salary of $84,600.00 x 1.42 for benefits = $120,132.00; divided by 1,950 work hours per year = $61.61 per hour; x 30 hours of course preparation and delivery time = $1,848.30 per class.

The cost of participants' salaries and benefits was estimated using their estimated average annual salary and benefits:
(a) Class 1: average base salary of $62,400.00 per participant x 1.42 for benefits = $88,608.00; divided by 1,950 work hours per year = $45.44 per hour; x 15 class hours = $681.60 per Class 1 participant.
  • 78. 69 (b) Class 2: average base salary of $66,100.00 per participant x 1.42 for benefits = $93,862.00; divided by 1,950 work hours per year = $48.13 per hour; x 15 class hours = $721.95 per Class 2 participant.
(c) Class 3: average base salary of $74,250.00 per participant x 1.42 for benefits = $105,435.00; divided by 1,950 work hours per year = $54.07 per hour; x 15 class hours = $811.05 per Class 3 participant.
(d) Class 4: average base salary of $58,720.00 per participant x 1.42 for benefits = $83,382.40; divided by 1,950 work hours per year = $42.76 per hour; x 15 class hours = $641.40 per Class 4 participant.

Program materials cost $5,400.00 for development, copies, etc., which, divided among 48 participants and two instructors, equals $108.00 per participant and instructor. Refreshments cost $1,125.00, which, divided among 48 participants and two instructors, equals $22.50 per participant and instructor. The cost estimates based on the above are shown in Table 11.

Table 11
Individual Costs for Strategic Leadership Training Workshop

Instructor's Salary and Benefits
  Instructor 1      $2,108.10 per class
  Instructor 2      $1,848.30 per class
Participant's Salary and Benefits
  Class 1           $681.60 per participant (average)
  Class 2           $721.95 per participant (average)
  Class 3           $811.05 per participant (average)
  Class 4           $641.40 per participant (average)
Program Materials   $108.00 per participant
Refreshments        $22.50 per participant
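The fully loaded figures in Table 11 all follow from the same arithmetic: base salary, times a 1.42 benefits factor, divided by 1,950 work hours per year, times the hours spent on the program. The Python sketch below restates that arithmetic; results may differ from the table by a few cents because the study rounds the hourly rate to the nearest cent before multiplying.

```python
# Fully loaded cost arithmetic described above: salary x 1.42 benefits load,
# divided by 1,950 work hours per year, times program hours.
def loaded_hourly_rate(base_salary, benefits_factor=1.42, work_hours_per_year=1950):
    return base_salary * benefits_factor / work_hours_per_year

INSTRUCTOR_HOURS = 30   # course preparation and delivery time
PARTICIPANT_HOURS = 15  # class time

instructor_1_per_class = loaded_hourly_rate(96_500) * INSTRUCTOR_HOURS    # ~ $2,108.10
instructor_2_per_class = loaded_hourly_rate(84_600) * INSTRUCTOR_HOURS    # ~ $1,848.30
class_1_per_participant = loaded_hourly_rate(62_400) * PARTICIPANT_HOURS  # = $681.60
class_4_per_participant = loaded_hourly_rate(58_720) * PARTICIPANT_HOURS  # ~ $641.40

print(f"${instructor_1_per_class:,.2f}", f"${class_1_per_participant:,.2f}")
```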
  • 79. 70 The total costs by class and the total program costs are important and are fully loaded for use in the ROI calculation. The total costs, by class, are listed in Table 12.

Table 12
Total Program Cost, by Class

CLASS 1
  Instructor's Salary and Benefits                        $2,108
  Participant's Salary and Benefits     $681.60 x 15      $10,224
  Program Materials                     $108.00 x 15.5    $1,774
  Refreshments                          $22.50 x 16       $360
  CLASS TOTAL                                             $14,466

CLASS 2
  Instructor's Salary and Benefits                        $1,848
  Participant's Salary and Benefits     $721.95 x 12      $8,663
  Program Materials                     $108.00 x 12.5    $1,350
  Refreshments                          $22.50 x 13       $293
  CLASS TOTAL                                             $12,154

CLASS 3
  Instructor's Salary and Benefits                        $2,108
  Participant's Salary and Benefits     $811.05 x 10      $8,111
  Program Materials                     $108.00 x 10.5    $1,134
  Refreshments                          $22.50 x 11       $248
  CLASS TOTAL                                             $11,601

CLASS 4
  Instructor's Salary and Benefits                        $1,848
  Participant's Salary and Benefits     $641.40 x 11      $6,707
  Program Materials                     $108.00 x 11.5    $1,242
  Refreshments                          $22.50 x 12       $270
  CLASS TOTAL                                             $10,067

PROGRAM TOTAL                                             $48,288
  • 80. 71 The final step in the impact equation is to calculate the ROI, which was perceived to be important for this program. Although the strategic leadership training program was not specifically intended to show a bottom-line impact, or ultimately a measurable return, the ROI was calculated to quantify the benefits of the program. Using a benefit of $104,452 for Class 1 and a cost of $14,466, the estimated ROI for Class 1 was 622%. Using a benefit of $93,872 for Class 2 and a cost of $12,154, the estimated ROI for Class 2 was 672%. Using a benefit of $76,260 for Class 3 and a cost of $11,601, the estimated ROI for Class 3 was 557%. Using a benefit of $88,533 for Class 4 and a cost of $10,067, the estimated ROI for Class 4 was 779%. Finally, using a benefit of $363,117 for the program and a cost of $48,288, the estimated ROI for the program was 652%. Based on these assumptions and calculations, the Strategic Leadership training program yielded a very high estimated ROI. In addition to this ROI, additional value may be attached to the improvement in business metrics as well as the change in skills experienced by the participants.
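Each of the percentages above comes from the same calculation: net benefits (benefits minus costs) divided by costs, expressed as a percentage. A minimal Python sketch using the class and program totals reported in Tables 10 and 12:

```python
# ROI (%) = (benefits - costs) / costs x 100, using the totals reported above.
def roi_percent(benefits, costs):
    return (benefits - costs) / costs * 100

totals = {
    "Class 1": (104_452, 14_466),   # ~622%
    "Class 2": (93_872, 12_154),    # ~672%
    "Class 3": (76_260, 11_601),    # ~557%
    "Class 4": (88_533, 10_067),    # ~779%
    "Program": (363_117, 48_288),   # ~652%
}

for name, (benefits, costs) in totals.items():
    print(f"{name}: ROI = {roi_percent(benefits, costs):.0f}%")
```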
  • 81. 72 Analysis and Evaluation of Findings

The problem addressed in this study was to determine whether a leadership-training program had positive financial impacts on a business and to present a verifiable, valid, and meaningful ROI. There were many studies (Phillips, 1994; Phillips, 1997; Phillips, 2001) that indicated a positive ROI, but the lack of a statistically sound approach in the noted studies (see the Review of Related Literature for examples) hinders verifying the reliability of the data in those studies. This study was done to validate a 200% ROI with strong meaningfulness. Each of the following research questions corresponds directly to the hypothesis for which it was intended, where applicable.

Research Question 1. What differences, if any, in the perceived relevance of the 15 objectives existed across the four classes at SP? This question was investigated by applying a one-way analysis of variance (ANOVA) to questionnaire data segregated by class. Class-based data consisted of individual participant scores calculated by summing the responses to Question 1 of the questionnaire. Table 13 presents the ANOVA test results for this question, indicating that no meaningful differences in the perceived relevance of training across the 15 objectives existed by organizational group (F(3, 56) = 1.210, α = .05). Since Fcalc is less than Fcritical, the null hypothesis for hypothesis one (that there will be no meaningful difference in relation to the perception of relevance of the 15 objectives across the four organizational groups at SP) cannot be rejected. To avoid distraction resulting from an excessive number of tables inserted in the text, the interested reader will find the basic data tables in Appendix H.

Table 13
One-Way ANOVA of Perceived Relevance of the 15 Objectives by Organizational Group

Source           df    SS       MS      Fcalc    Fcritical
Between Groups   3     0.794    0.265   1.210    2.78
Within Groups    56    12.226   0.219
Total            59    13.020
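The same decision rule is applied to each research question: compute the one-way ANOVA F statistic across the four classes and fail to reject the null hypothesis when Fcalc is less than Fcritical at α = .05. The Python sketch below illustrates that rule; the per-class score lists are placeholders rather than the study's data, and SciPy is assumed to be available.

```python
# Illustrative one-way ANOVA across the four organizational groups.
# The scores below are placeholders, NOT the study's questionnaire data.
from scipy import stats

class_scores = {
    "Operations":          [3.1, 3.3, 3.0, 3.2, 3.4],
    "Sales and Marketing": [3.0, 3.1, 3.2, 2.9, 3.3],
    "Scientific Research": [3.2, 3.4, 3.1, 3.3, 3.0],
    "Support":             [3.3, 3.2, 3.4, 3.1, 3.5],
}

groups = list(class_scores.values())
f_calc, p_value = stats.f_oneway(*groups)

# Critical F at alpha = .05 with (k - 1, N - k) degrees of freedom
df_between = len(groups) - 1
df_within = sum(len(g) for g in groups) - len(groups)
f_critical = stats.f.ppf(0.95, df_between, df_within)

# Decision rule used for Tables 13 through 18
print(f"Fcalc = {f_calc:.3f}, Fcritical = {f_critical:.3f}, p = {p_value:.3f}")
print("Fail to reject H0" if f_calc < f_critical else "Reject H0")
```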
  • 82. 73 Research Question 2. What differences, if any, in the perceived relevance of the six elements of the job existed across the four classes at SP? This question was investigated by applying a one-way ANOVA to questionnaire data segregated by class. Class-based data consisted of individual participant scores calculated by summing the responses to Question 2 of the questionnaire. Table 14 presents the ANOVA test results for this question, indicating that no meaningful differences in the perceived relevance of training across the six job elements existed by organizational group (F(3, 20) = 0.28, α = .05). Since Fcalc is less than Fcritical, the null hypothesis for hypothesis two (that there will be no meaningful difference in relation to the perception of relevance of the six elements of the job across the four organizational groups at SP) cannot be rejected.

Table 14
One-Way ANOVA of Perceived Relevance of the Six Job Elements by Organizational Group

Source           df    SS      MS     Fcalc    Fcritical
Between Groups   3     0.259   .086   0.28     3.10
Within Groups    20    6.128   .306
Total            23    6.387

Research Question 3. What differences, if any, in the perceived relevance of the five skill areas existed across the four classes at SP? This question was investigated by applying a one-way ANOVA to questionnaire data segregated by class. The class-based data consisted of individual participant scores calculated by summing the responses to Question 3 of the questionnaire. Table 15 presents
  • 83. 74 the ANOVA test results for this question, indicating that no meaningful differences in the perceived relevance of training across the five skill areas existed by organizational group (F(3, 16) = 0.89, α = .05). Since Fcalc is less than Fcritical, the null hypothesis for hypothesis three (that there will be no meaningful difference in relation to the perception of relevance of the five skill areas across the four organizational groups at SP) cannot be rejected.

Table 15
One-Way ANOVA of Perceived Relevance of the Five Skill Areas by Organizational Group

Source           df    SS       MS       Fcalc    Fcritical
Between Groups   3     0.1440   0.0480   0.89     3.24
Within Groups    16    0.8605   0.0538
Total            19    1.0045

Research Question 4. What differences, if any, existed in the perceived relevance of the 13 topics in one's work or those of the work unit across the four classes at SP? This question was investigated by applying a one-way ANOVA to questionnaire data segregated by class. Class-based data consisted of individual participant scores calculated by summing the responses to Question 4 of the questionnaire, which focused on assessing teaching material effectiveness. Table 16 presents the ANOVA test results for this question, indicating that no meaningful differences in the perceived relevance of training across the 13 topics existed by organizational group (F(3, 48) = 0.09, α = .05). Since Fcalc is less than Fcritical, the null hypothesis for hypothesis four (that there will be no meaningful difference in relation to the perception of relevance of the 13 topics in one's work or those of
  • 84. 75 the work unit across the four organizational groups at SP) cannot be rejected.

Table 16
One-Way ANOVA of Perceived Relevance of the 13 Topics by Organizational Group

Source           df    SS       MS      Fcalc    Fcritical
Between Groups   3     0.069    0.023   0.09     2.80
Within Groups    48    12.455   0.259
Total            51    12.524

Research Question 5. What are the differences, if any, in the ROI across the organizational groups at SP? Table 17 summarizes the average ROI by class. Table 18 shows the one-way ANOVA results (F(3, 44) = 0.22, α = .05), which indicate that no meaningful differences in mean return on investment existed across classes. Since Fcalc is less than Fcritical, the null hypothesis (that there will be no meaningful differences in the ROI across organizational groups at SP) cannot be rejected. In this context, a meaningful difference means there is statistical evidence that a difference exists; it does not mean the difference is necessarily large or important. To avoid distraction resulting from an excessive number of tables inserted in the text, the interested reader will find the basic data tables in Appendix I.

Table 17
Mean ROI Across the Four Classes

             Class 1 (n = 15)   Class 2 (n = 12)   Class 3 (n = 10)   Class 4 (n = 11)
Variable     M      SD          M      SD          M      SD          M      SD
ROI (%)      631    279         678    270         562    524         606    296
  • 85. 76 Table 18
One-Way ANOVA of Mean ROI Across the Four Classes

Source           df    SS          MS        Fcalc    Fcritical
Between Groups   3     77,057      25,686    0.22     2.82
Within Groups    44    5,242,625   119,151
Total            47    5,319,682

Summary

The research findings of this study were presented and analyzed in this chapter. A series of one-way ANOVAs indicated that no meaningful differences in the perceived relevance of the learned knowledge and skills existed across the four classes. Analysis of action plan completions indicated that no meaningful differences existed across the four classes. Finally, a one-way ANOVA indicated that no meaningful differences in return on investment existed across the four classes. The significance of these findings and the conclusions to be drawn from them follow in Chapter 5.
  • 86. 77 Chapter V: Summary, Conclusions and Recommendations Summary The problem addressed in this study was the determination if there were positive financial impacts of a leadership-training program on a business and to present a verifiable and valid ROI with meaningfulness. SP had a major need to determine the value and applicability of its leadership training to the company for current and future leaders. Therefore, a study of this nature was proposed and was based on the hypotheses and research questions addressing the differences in various learning relationships across four organizational groups in a targeted company. There were many studies (Phillips, 1994; Phillips, 1997; Phillips, 2001) that indicated a positive ROI, but the lack of a statistically sound approach in noted studies (see Review of Related Literature for examples) hinders verifying the reliability of the data in the studies. The issue this study addressed was the determination of the financial impact of a leadership-training program on business. Using the ROI methodology to examine the value of training, when developed, analyzed and reported with meaningful data, will support an organization’s business success (Phillips, 2001). The study verified and validated that the training initiatives are meeting the needs of the business and having a positive impact on the bottom line of the business. The problem of this study was investigated through the development of five research questions and five test hypotheses. Research Question 1. What differences, if any, in the perceived relevance of the 15 objectives existed across the four classes at SP?
  • 87. 78 H10: There will be no difference in relation to the perception of relevance of the 15 objectives across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). H1a: There will be a difference in relation to the perception of relevance of the 15 objectives across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). A one-way analysis of variance (ANOVA) was conducted to examine if there existed a difference in the perceived relevance of instruction across the four organizational groups where the training took place. Each organizational group was the independent variable with the averaged class score for question 1 on the questionnaire as the dependent variable. Research Question2. What differences, if any, in the perceived relevance of the six elements of the job existed across the four classes at SP? H20: There will be no difference in relation to the perception of relevance of the six elements of the job across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). H2a: There will be a difference in relation to the perception of relevance of the six elements of the job across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). A one-way ANOVA was conducted to examine if there existed a difference in the perceived relevance of instruction across the elements of the job for which the training took place. Each organizational group was the independent variable with the averaged class score for question 2 on the questionnaire as the
  • 88. 79 dependent variable. Research Question 3. What differences, if any, in the perceived relevance of the five skill areas existed across the four classes at SP? H30: There will be no difference in relation to the perception of relevance of the five skill areas across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). H3a: There will be a difference in relation to the perception of relevance of the five skill areas across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). A one-way ANOVA was conducted to examine if there existed a difference in the perceived degree of enhancement of instruction across the skills of the job for which the training took place. Each organizational group was the independent variable with the averaged class score for question 3 on the questionnaire as the dependent variable. Research Question 4. What differences, if any, in the perceived relevance of the 13 topics in one’s own work or that of the work unit existed across the four classes at SP? H40: There will be no difference in relation to the perception of relevance of the 13 topics in one’s own work or that of the work unit across the four organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). H4a: There will be a difference in relation to the perception of relevance of the 13 topics in one’s own work or that of the work unit across the four
  • 89. 80 organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). A one-way ANOVA was conducted to examine if there existed a difference in the perceived influence that instruction had on the measure of performance in one’s own work or that of the work unit for which the training took place. Each organizational group was the independent variable with the averaged class score for question 4 on the questionnaire as the dependent variable. Research Question 5. What are the differences, if any, in the ROI across organizational groups at SP? H50: There will be no differences in the ROI across organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). H5a: There will be differences in the ROI across organizational groups at SP (Operations, Sales and Marketing, Scientific Research, and Support Groups). A one-way ANOVA was conducted to examine if there existed a difference in the ROI across organizational groups. Each organizational group was the independent variable and the ROI expressed in a percent was the dependent variable. In this study, two methods were used to collect data. A follow-up questionnaire (see Appendix A) was used to determine the extent to which participants utilized the training and achieved self-reported on-the-job success. The questionnaire included the following evaluation items: (a) rate the success of the course in meeting the 15 objectives, (b) rate the relevance of the program
  • 90. 81 elements to the job, (c) indicate the degree to which the use of the 15 skills are enhanced, and (d) indicate the extent to which one thinks this course will influence the measures in one’s own work or that of the work unit. The action plan (see Appendix B) was implemented during the training to identify areas of individual improvement as a result of the training program, to link achievements to department-level contributions, and to convert the contributions to monetary value. The requirement of filling out the action plan was communicated prior to the program start date. Furthermore, the program instructors described the action planning process in a 15-minute discussion on the first day of training. Four functional areas (Operations, Sales and Marketing, Scientific Research, and Support) were asked to submit the names of candidates who they considered key and potential leaders that would be contributing significantly to the future of SP. They were considered the role models or leaders within their respective departments, as determined by the executive leadership at SP. From a population of about 200 leaders companywide, 48 were selected by executive management to participate in the training (see Chapter 3). The training program was conducted in several sessions, with each session lasting two to four hours and delivered one or two days a week for three to four weeks. An orientation session was also conducted prior to the first session. The research included four phases: process planning, data collection, data analysis, and the communication of results. This was summed up in an ROI impact study of the Strategic Leadership training program within SP (see
  • 91. 82 Appendix C). A data collection strategy was designed to meet the objectives of this study. The questionnaires and the action plans were utilized to ensure that adequate, quality input was obtained for the evaluation. In both data collection methods, the focus was on the impact and not the process. Most of the emphasis was on the impact of the program, which was obtained with evaluation levels 3 and 4 for data collection and analysis (Phillips, 1983; Kirkpatrick, 1998). Another important issue to address in this study was the timing of the data collection. Although the training program was designed to have a long-term impact, the specific improvements from the training program would be difficult to capture if assessed years after the training program was completed. Because of the above factors, it was decided to measure the success of the training program during a three-month period after the last training session. Participant input was limited to three months and then extrapolated out to one year. Participant estimates of training impact are a reliable indicator when appropriate steps are taken to collect the data (Phillips, 2002). The participants were the closest individuals to the performance improvements and often were aware of the other influences that could have affected the performance measures. For this study, participants were asked to indicate the degree to which a specific improvement was caused by the training program. The action plan was the tool used to capture this data. While the data could be converted to monetary value in many ways, the primary strategy that was used in this study was to ask participants to make
  • 92. 83 estimates and calculations based on the improvement in their work units. Participants used accepted standards and conversion factors to arrive at monetary values. Tabulating the costs of the training effort involved monitoring or formulating all of the costs related to the training program. A fully-loaded cost profile was recommended when tabulating all direct and indirect costs (Marelli, 1996). The return on investment was then calculated by comparing the monetary benefits and costs. The benefit-cost ratio was the monetary benefits of the program divided by the costs. The return on investment used the net benefits (benefits minus costs) divided by costs. This is the same formula commonly used to evaluate other investments, where the ROI is traditionally reported as earnings divided by investment. The data collection strategy was designed to meet the objectives of this study. To assure anonymity, the questionnaire was sent confidentially from Corporate Training and Development to the 48 participants, and participants were not required to write their names on the questionnaire. Descriptive and inferential statistics were used to analyze the questionnaire and action plan data. Descriptive statistics were used to characterize the data through means, standard deviations, and response counts. Inferential statistics were used to test for differences between participants, classes, and operating areas. The primary analytical test used was the one-way analysis of variance (ANOVA). Tables and figures were used to show the distribution of the participants' selections in absolute numbers, means, and standard deviations. In
  • 93. 84 addition to the research questions, the questionnaires and action plans collected demographic data for participants by class and operating area, instructor, and gender. Because the main thrust of this study was to determine the business impact of the training program, every attempt was made to uncover specific business results linked to the training program. The impact of the training program was presented indicating the application of the skills and knowledge gained in the program. Each participant was asked to select a number of skills they used the most on the job at the start of the training program. To manage the abundance and complexity of research findings during this study, material presented in this section has been broken into two parts: first, results for each research question are presented individually; second, an integrated summary of overall study research findings is presented. Conclusions Because much of the following discussion expresses test results of this study in terms of training relevance, the following review of Kirkpatrick (1998) and Phillips (1983 and 1997) training effectiveness criteria is offered before addressing individual research questions findings. Level 1–Reactions. This is a basic level of effectiveness measurement concerned with determining if (a) the training experience was enjoyable for participants, and (b) course content was considered by students to be valuable and relevant to their jobs (Kirkpatrick, 1998; Phillips, 1983). Reaction represents an important area of measurement, particularly for training and development
  • 94. 85 staff. At this level, participants’ reactions to and satisfaction with the training program are measured. Data captured on Level 1 instruments are the relevance of the training to the job, the recommendation of the training to others, the importance of the information received, and the intention to use the knowledge and skills acquired. These four items have predictive validity for projecting actual applications and are compared from one program to another. Level 1 evaluations were conducted as a part of the training program, but were not considered in this study. Level 2–Learning. The second training effectiveness tier focuses on assessing student mastery of tools and knowledge delivered during training (Kirkpatrick, 1998; Philips, 1983). Learning can be measured informally with self- assessments, team assessments, or facilitator assessments; or formally with subjective tests, performance testing, or simulations. Learning instruments ask the participants to rate their understanding of the knowledge and skills acquired, their ability to use the knowledge and skills acquired, and their confidence in the use of the knowledge and skills acquired. Level 2 evaluations were conducted on some learning activities as a part of the training program, but were not considered in this study. Level 3–Knowledge Transfer or Application. The third training tier defines the long-term effectiveness in terms of (a) the extent to which students assimilate the knowledge and skills delivered, and (b) assessing the degree to which the learning has become permanently integrated into normal work activities. Measurement of Level 3 training effectiveness requires the ability to assess the
  • 95. 86 presence of long-term behavioral change deemed to be a direct result of the training received (Kirkpatrick, 1998; Phillips, 1983). Application information is often collected through a follow-up survey or questionnaire. Key questions asked concern the importance of the knowledge and skills back on the job, the frequency of use of the new knowledge and skills, and the effectiveness of the knowledge and skills when applied on the job. The Level 3 transfer of knowledge was measured in this study by collecting information with the questionnaire about the barriers and enablers to the application of the new knowledge and skills. The barrier information provided insight into the reasons for unsuccessful application of the new knowledge and skills. The enabler information provided insight into the reasons behind the successful implementation of a training program. Level 4–Results. The bottom-line financial benefits are the focus of the fourth strata, typically expressed in terms of process or quality improvements through which quantifiable dollar savings result (Kirkpatrick, 1998; Phillips, 1983). Both the questionnaire and the action plan were used to gather this data. This study included the methodology used to isolate the effects of the training program. Level 5–ROI. The monetary value of the business impact is compared with the costs for the program by a ratio of the Level 4 bottom-line savings with the total training costs (Phillips, 1997). The costs of the training program are fully loaded and the methods used to convert the data to monetary value are reported. The ROI calculation for this training program is identical to the ROI ratio used for any other business investment.
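Written out, the Level 5 comparison uses two related ratios, the benefit-cost ratio (BCR) and the ROI percentage; plugging in the program totals reported in Chapter 4 reproduces the 652% figure. A short worked statement of the formulas, under those reported totals:

```latex
\mathrm{BCR} = \frac{\text{program benefits}}{\text{program costs}}, \qquad
\mathrm{ROI}\,(\%) = \frac{\text{program benefits} - \text{program costs}}{\text{program costs}} \times 100

\mathrm{ROI}_{\text{program}} = \frac{\$363{,}117 - \$48{,}288}{\$48{,}288} \times 100 \approx 652\%
```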
  • 96. 87 Level 6–Intangible Benefits. Intangible benefits are measures that are intentionally not converted to monetary values because the conversion to monetary data would be too subjective (Phillips, 1997). It is important to capture and report the intangible benefits of the training program to get a full understanding of its value, including benefits such as increased job satisfaction, reduced conflicts, reduced stress, or improved teamwork. There are varieties of other intangible measures that are often very important to an organization. Intangible benefits were collected as a part of the training program, but were not considered in this study.

Discussion of Individual Test Results. The average class scores for all questionnaires were at or above a neutral measure of 3.0 for all classes, with the exception of the Class 1 mean score of 2.94 for Question 4. This score is within the class standard deviation of 0.49 and is not considered abnormal. Individual participant scores of less than 3.0 are not considered in this study; however, SP will be evaluating these individual scores for future training program improvements.

Whether or not differences existed in the perceived relevance of the 15 objectives across the four classes at SP was queried in research question 1. Analysis of the data for questionnaire Question 1 indicated that no meaningful differences existed in the perception of relevance of the 15 objectives across the four classes, failing to reject the null hypothesis. These above-neutral scores indicate agreement existed among all four classes that the 15 objectives were perceived to be relevant. The positive findings above represent a fulfillment of Kirkpatrick (1998)
  • 97. 88 Level 1 and Level 2 evaluations. Whether or not differences existed in the perceived relevance of the six elements across the four classes at SP was queried in research question 2. Analysis of the data gathered for Question 2 of the questionnaire indicated no meaningful differences in the perceived relevance of the six elements of the job across the four classes, failing to reject the null hypothesis. These above-neutral scores indicated agreement among all classes that the six elements were perceived to be relevant. These findings represent another positive fulfillment of Kirkpatrick (1998) Level 1 and Level 2 evaluations.

Whether or not differences existed in the perceived relevance of the five skill areas across the four classes at SP was queried in research question 3. The analysis of questionnaire Question 3 indicated no meaningful differences in the perceived relevance of the five skill areas across the four classes, failing to reject the null hypothesis. These above-neutral scores indicate that agreement existed among all four classes that the five skill areas were perceived to be relevant. This finding fulfills the Kirkpatrick (1998) Level 1 and Level 2 evaluation goals.

Whether or not differences existed in the perceived relevance of the 13 topics in one's own work or that of the work unit across the four classes at SP was queried in research question 4. Analysis of questionnaire Question 4 indicated no meaningful differences in the perceived relevance of the 13 topics across the four classes, failing to reject the null hypothesis. These above-neutral scores indicate agreement existed among all four classes that the 13 topics were perceived to be relevant. This finding fulfills the Kirkpatrick (1998) Level 1 and
  • 98. 89 Level 2 evaluation goals. Whether or not differences existed in the ROI across the four classes at SP was asked in research question 5. The analysis of the ROI indicated that no meaningful differences existed across the four organizations, failing to reject the null hypothesis. The consistently positive ROI values indicate agreement existed among all four classes that the ROI was relevant. This finding fulfills the Phillips (1997) Level 5 evaluation goal.

Integration of Study Research Findings. Conclusive results regarding both the purpose and the problem underlying the study inquiry were presented with the research. The problem for this study was the lack of means to verify the positive financial impact of a leadership-training program on the business. Evidence indicated that the training initiative was effective for the following reasons: (a) a successful completion rate of 100% was realized for all action plans (see individual results from research question 2), and (b) the average, understated return on investment of successfully completed action plans was 652% (see individual results for research question 5). There are many studies (Phillips, 1994; Phillips, 1997; Phillips, 2001) that indicate a positive ROI, but none that indicate a statistically sound approach to verifying the ROI. This study was conducted to validate a program with a 200% ROI with good meaningfulness. Another problem this study addressed was whether a leadership-training program could be accomplished across four disparate organizations at SP and successfully yield an ROI that shows a positive effect on the business. Evidence
  • 99. 90 indicated that the training initiative was effective in accomplishing this goal for the following reasons: (a) as described in the individual results of research question 1, the average participant evaluations of perceived relevance of the 15 objectives across the four classes were fairly high (Class 1 = 3.13, Class 2 = 3.04, Class 3 = 3.21, and Class 4 = 3.33); (b) as described in the individual results of research question 2, the average participant evaluation of perceived relevance of the six elements across the four classes were high (Class 1 = 3.98, Class 2 = 3.21, Class 3 = 3.30, and Class 4 = 3.38); (c) per the individual results of research question 3, the average participant evaluation of relevance of the five skill areas, regardless of class, were moderately high (Class 1 = 3.50, Class 2 = 3.21, Class 3 = 3.30, and Class 4 = 3.38); and (d) per the individual results of research question 4, the average participant evaluation of perceived relevance of the 13 job-related topics across the four classes were above average (Class 1 = 2.94, Class 2 = 3.01, Class 3 = 3.05, and Class 4 = 3.03). The evidence described thus far indicated that the SP sponsored training initiative was both effective and successful in teaching and developing strategic leadership concepts, as new ideas, across the four organizational areas of the company. In total, research findings portrayed the SP training initiative as effective and successful. High participant evaluations for the perceived relevance of all areas of the training programs are considered to have resulted from substantial effort expended in the developing and delivering of the comprehensive training, which was deployed uniformly among all classes. Employing open and frequent instructor-participant communication provided a
  • 100. 91 common approach ensuring all participants had easy and immediate access to all of the information they required. Augmentation of the investigation protocol to include Kirkpatrick and Phillips criteria proved beneficial, because it provided a well-proven method for evaluation across all classes. Perhaps the most important and tangible indication of successes came directly from SP. The study outcome motivated SP upper management to develop a training and development plan for all current and prospective organizational leaders in strategic leadership philosophies. During the time this dissertation was written in 2007, follow-on leadership training at SP had already begun, with expected completion sometime in 2008. All research results must be viewed in context of the limitations of this study: (a) the study was carried out within one company, (b) participants represented only a small percentage (about 24%) of all organizational leaders at SP, (c) a select number of organizations were involved, and (d) typically the top performers within the organization were given the opportunity to participate in the training and in the study. Although a population of 65 leaders was identified and only 48 were trained, this represents a meaningful percentage of the target population (73.8%). The four organizations identified accounted for nearly 80% of the total population of SP. These limits raise questions as to whether or not study findings were dependent upon selecting the more highly ranked organizational leaders to participate in the training. An additional issue must be addressed before closing this chapter. Study findings reflected generally high participant evaluation scores for all
  • 101. 92 questionnaire questions. The tendency for high learner scores on classroom questionnaires is a well-known occurrence (Fink, 2003; Fowler, 1995). Hence, conclusions made about the success and effectiveness of this study would be suspect if predicated solely upon participant questionnaire scores and action plan data. Fortunately, such criticism is dissuaded by two considerations. First, achieving action plan completion was possible only after participants had demonstrated proficiency in applying the leadership tools and techniques taught in the training program, thus showing successful and effective knowledge transfer. Second, action plan completion was accomplished only after the management sponsor had signed-off on the attested ROI results, thus representing affirmation that training goals had been successfully met. Implications of the Study. This company-sponsored training initiative demonstrated a positive ROI. This validated for SP the relevance of training intended to help leaders better meet changing organizational needs and expectations. Cultivating leadership expertise through the application action plans, as demonstrated with this study, offers a pathway for continual process improvement by validating results to business goals. Possibly of greater importance to SP is the relevance of this study’s findings to other organizations within the company and their leadership training requirements. This study may hold similarly important implications for companies other than SP with varied organizations requiring leadership training. The findings of this study are evidence that effective leadership training has a significant business impact on the leaders of the future. Identifying,
  • 102. 93 measuring, analyzing, implementing, and controlling training outcomes at SP will satisfy SP’s need for leadership growth in a rapidly growing company. The market could actually diminish, as well. Should that happen, fewer leaders will exist, and those that remain will likely focus on resolving business impact training issues. Leadership training programs may go on to change the experience level and educational requirements for the leaders of the future. Recommendations Building on the work of this study, the leadership training program could be further explored in relation to dimensions such as nationality (or cultural background), education level, and age. Examination of these areas could offer insight into factors impacting the success and effectiveness for training conducted across the SP organizations. The impact of gender differences is an area this study failed to address, as well, due to the low percentage of female participants available for the program. Future study should include more equal gender representation among participants. Doing so could provide an opportunity to learn how gender is related to the relevance of a leadership training initiative in terms of positive ROI outcomes. The limits under which this study was conducted offer several research opportunities: 1. Given this study was carried out within only one company, the opportunity exists for similar research to be conducted across similar organizations within other companies.
  • 103. 94 2. Because the participants represented only a small percentage (24%) of organizational leaders within SP, the opportunity remains for similar research with participants from other organizations within SP. 3. Because only a select number of organizations were involved, an opportunity exists for similar research on the other SP organizations. 4. Since only high performers within the SP organizational leadership were selected to participate in the training and in the study, future research could include a more representative sample of the entire SP population to have a wider understanding of the effects of the program. As stated earlier in this chapter, the evaluation accomplished in this study covered three of the four evaluation levels in the Kirkpatrick (1998) model. The Kirkpatrick Level 3 evaluation, focusing on long-term knowledge transfer, was not addressed in this study. Future research using the Level 3 evaluation would allow SP to gauge the long-term benefits of the training initiative. This could be accomplished some time hence by revisiting and surveying leaders trained during this study to measure the extent to which the knowledge and tools delivered during the training program continue to be applied in the course of normal professional duties. Making this kind of assessment would enable SP to gauge lasting behavioral changes resulting from the initial training program. In this study, Phillips’ (1997) Level 5 evaluation was the focus of the development of the individual action plans used to capture data that could be measured. This measured data established the foundation for continued evaluation of this research, as well as the development of new research data. By
  • 104. 95 continuing the ROI process, SP could evaluate and validate the future business impact of training initiatives using an ROI model.
  • 105. 96 References Barksdale, S. & Lund T. (2002). Rapid strategic planning. Alexandria, VA: ASTD Press. Barraza, D. (2001). ROI from individualized training for computer manufacturing employees. In Phillips, J. J. (Ed.) In action: Measuring return on investment, Volume 3. Alexandria, VA: ASTD Press. Bartram, S. & Gibson, B. (1999). Evaluating training. Amherst, MA: HRD Press. Bernthal, P. R. & Byham, W.C. (1994). Interactive skills training for supervisors. In Phillips, J. J. (Ed.), In action: Measuring return on investment, Volume 1. Alexandria, VA: ASTD Press. Bernthal, P. & Byham, B. (1997). Evaluation techniques for an empowered workforce. In Phillips, J. J. (Ed.), In action: Measuring return on investment, Volume 2. Alexandria, VA: ASTD Press. Berthiez, G. G. & Kluseman, D. (2001). How much is performance improvement really worth? In Phillips, J. J. (Ed.) In action: Measuring return on investment, Volume 3. Alexandria, VA: ASTD Press. Blackburn, S. (1996). Dictionary of philosophy. Oxford, UK: Oxford University Press. Bonhoeffer, D. (1995). Ethics. New York: Prentice Hall. Brinkerhoff, R. O., Formella, L. & Smalley, K. (1994). Total quality management training for white-collar worker. In Phillips, J. J. (Ed.), In action: Measuring return on investment, Volume 1. Alexandria, VA: ASTD Press. Broad, M. L., Szymanski, L. & Douds, A. (1994). Built-in evaluation. In Phillips, J. J. (Ed.), In action: Measuring return on investment, Volume 1. Alexandria, VA: ASTD Press. Burkett, H. (2001). Program process improvement teams. In Phillips, J. J. (Ed.) In action: Measuring return on investment, Volume 3. Alexandria, VA: ASTD Press. Chernick, J. F. (1997). An information technology program evaluation. In Phillips, J.J. (Ed.) In action: Measuring return on investment, Volume 2. Alexandria, VA: ASTD Press.
  • 110. APPENDICES
  • 111. Appendix A Impact Questionnaire for the Strategic Leadership Training Program
  • 112. 103 Impact Questionnaire for the Strategic Leadership Training Program

Instructions

1. Please complete this questionnaire as promptly as possible and return it to the address shown at the bottom of this page. To provide responses, you will need to reflect on the Strategic Leadership Training Program and think about specific ways in which you have applied what you learned from each session. It may be helpful to review the materials from each session.

2. Please take your time as you provide responses. Accurate and complete responses are very important. You should be able to provide thorough responses in about 30 minutes.

3. You will need your action plan as you respond to several items on the questionnaire. Please review the action plan and make sure that each page is accurate and complete. Attach a copy of the action plan to the questionnaire when it is returned.

4. Please be objective in providing responses. In no way will your name be linked to your input. Your questionnaire and action plan will be viewed only by the Director of Corporate Training and Development. Specific responses or comments related to any individual will not be communicated to anyone inside or outside the Company.

5. Your responses will help determine the impact of this program. In exchange for your participation in this evaluation, a copy of a report summarizing the success of the entire class will be distributed to you within three months. Please make sure that your input is included along with that of your classmates.

6. Should you need clarification or more information, please contact the Director of Corporate Training and Development at extension 3100.

Please return your completed questionnaire with action plan(s) to: Jack Kules, Director of Corporate Training and Development
  • 113. 104 STRATEGIC LEADERSHIP TRAINING PROGRAM IMPACT QUESTIONNAIRE

Class: (Circle one) Operations Sales/Market SciResearch Support

1. Listed below are the objectives of the Strategic Leadership Training Program. After reflecting on this course, please indicate your degree of success in meeting these objectives. Success means you achieved the objective to the extent that learning occurred during the sessions. Please check the appropriate response beside each item.

Rating scale for each objective: Not Applicable / No Success / Very Little Success / Limited Success / Generally Successful / Completely Successful

OBJECTIVE
A. Use the creative process effectively.
B. Use the planning process effectively.
C. Complete the action planning process.
D. Identify and write useful performance standards.
E. Identify opportunities for profitable actions in your organization.
F. Apply the time management tools.
G. Use the delegation process effectively.
H. Use the Company approach to handling mistakes.
I. Use the problem analysis and decision making process effectively.
J. Increase your sensitivity to people’s personal and business problems.
K. Seek to understand, appreciate, and respond to the diversity of associates.
L. Develop and apply programs and principles to build a more motivational atmosphere.
M. Identify specific applications of leadership principles.
N. Increase effectiveness in the performance management system.
O. Assess the results of training programs.
  • 114. 105 2. Please rate the relevance of each of the following program elements, with (1) indicating no relevance and (5) indicating very relevant. Please check the appropriate response beside each item.

Rating scale for each element: No Relevance / Limited Relevance / Some Relevance / Relevant / Very Relevant

PROGRAM ELEMENT
Group (Class) Discussions
Small Team Discussions
Case Study/Skill Exercises
Program Content
Team Building Strategies
Special Projects

3. Please indicate the degree to which your application of the following skills or behaviors was enhanced as a result of your participation in Strategic Leadership. Please check the appropriate response beside each item.

Rating scale for each skill: No Opportunity to Use the Skill / No Change / Little Change / Some Change / Significant Change / Very Significant Change

SKILL AREA
A. Developing Creativity
1) I hold creative meetings to develop the creative potential of my people and I involve them in active participation in solving problems.
2) I use the organized approach to creative thinking, such as using checklists and the Company brainstorming technique.
B. Motivating
1) I help build purpose and meaning into the job of team members.
2) I provide realistic and fair rewards for meritorious job performance.
3) I build and maintain a climate of mutual trust and respect in my area of operations.
4) I make a special effort to maintain communication effectiveness.
5) I set an example that influences improvement, progress, and achievement by people.
  • 115. 106 (Rating scale, continued: No Opportunity to Use the Skill / No Change / Little Change / Some Change / Significant Change / Very Significant Change)

C. Delegating
1) I broaden the responsibility of my people for more important work when they achieve desired results.
2) When delegating, I communicate the results to be achieved, the area of responsibility and the scope of authority.
3) When delegating, I provide for authority and responsibility to go hand in hand.
4) I provide necessary data needed by my employees to achieve desired results on a timely basis and provide assistance as required.
D. Communicating
1) I maintain a permissive and creative climate within which associates are motivated to express their ideas.
2) I listen with empathy.
3) When communicating, I take into consideration the personal make up of each individual.
4) I keep faith with associates, report facts honestly and listen sincerely.
5) I speak pleasantly and courteously with due regard for feelings of others.
6) I see things from the other person’s point of view.
E. Decision Making
1) I look behind the symptoms to find and define the real problem.
2) I evaluate my decisions by predetermining their possible impact on people and things.
3) I involve my employees in creative meetings in order to develop their decision making potential.
4) I use the ideas and creativity of my employees in developing alternative solutions.
  • 116. 107 (Rating scale, continued: No Opportunity to Use the Skill / No Change / Little Change / Some Change / Significant Change / Very Significant Change)

5) As I communicate decisions to my employees, I clarify the total picture, the reasons for the decisions, the desired results, and their part in implementation.

4. Indicate the extent to which you think your application of knowledge and skills learned from Strategic Leadership had a positive influence on the following measures in your own work or your work unit. Please check the appropriate response beside each item.

Rating scale for each topic: No Influence / Some Influence / Moderate Influence / Significant Influence / Very Much Influence

TOPIC
A. Productivity
B. Quality
C. Cost Control
D. Efficiency
E. Customer Response Time
F. Cycle Time
G. Sales
H. Employee Turnover
I. Employee Absenteeism
J. Employee Satisfaction
K. Employee Complaints
L. Customer Satisfaction
M. Customer Complaints

Thank you for your time and candor!
  • 117. Appendix B Action Plan for the Strategic Leadership Training Program
  • 118. 109 Action Plan for the Strategic Leadership Training Program -- Part I

Name:
Instructor Signature:
Follow-Up Date:
Objective:
Class: (Circle One) Operations Sales/Market SciResearch Support

SPECIFIC STEPS: I will do this...
END RESULTS: So far...
1.
2.
3.

EXPECTED TANGIBLE BENEFITS:
  • 119. 110 Action Plan for the Strategic Leadership Training Program -- Part II

Name:

Analysis
A. What is the unit of measure? Does this measure reflect your performance alone? Yes o No o If not, how many employees are represented in this measure?
B. What is the cost of one unit? $
C. How did you arrive at this value?
D. What percent of this change was actually caused by the application of Strategic Leadership? %
E. What level of confidence do you place on the above information? % (100% = Certainty, 0% = No Confidence)
F. If your measure is time savings, what percentage of the time saved was actually applied toward productive tasks? %

ACTUAL INTANGIBLE BENEFITS:
  • 120. Appendix C Evaluating Return on Investment for Strategic Leadership Training (Redacted)
  • 121. 111 EVALUATING RETURN ON INVESTMENT FOR STRATEGIC LEADERSHIP TRAINING (REDACTED)

GENERAL INFORMATION

INDUSTRY AND COMPANY
... Pharmaceutical Company is a fully integrated specialty pharmaceutical company that develops, acquires, manufactures, and markets technology-distinguished branded and generic/non-branded prescription pharmaceutical products. The pharmaceutical industry continues to face transformation through tighter regulatory requirements, increased competitive pressures, and changing technologies. This situation has forced pharmaceutical companies to focus on technology, cost control and customer service. For years, .. . has enjoyed a tremendous edge in technology, having developed a research and development (R&D) capability with its Scientific Affairs organization. In recent years, .. . has undergone significant changes to focus on controlling cost while providing excellent products and services. As part of this transformation, .. . is making minor internal adjustments in some parts of . . while growing in others. One critical group for this transformational activity is the first-level manager (sometimes labeled supervisor, team leader or project leader). He or she must manage and lead a team or group through the constant change and challenges. This study involves a program known as Strategic Leadership, directed at this group, which is one of several training programs targeted at various audiences within .. ..

BACKGROUND AND RATIONALE FOR REVIEW
Although the reactions to Strategic Leadership and other Corporate Training & Development programs have been very positive, management must ensure that the investment in leadership development has maximum impact on .. ., which is essential in today’s competitive environment. Consequently, .. . Corporate Training & Development initiated this study as part of a major assessment into the business impact of leadership development for . .. The study had three specific objectives:
• To assess the specific impact of Strategic Leadership in measurable contributions to the extent possible, up to and including the calculation of the return on investment (ROI).
  • 122. 112 • To identify specific barriers to successful program implementation and utilization.
• To recommend specific changes or adjustments in the program.

STRATEGIC LEADERSHIP PROGRAM
The Strategic Leadership program is designed for newly appointed supervisors or managers (or prospective supervisors and managers) in people management positions. The program teaches them the basic tools for success as leaders of people. The program provides an overview of .. .’s global vision and presents critical people management policies and skills necessary to achieve this vision. A previous program had been operational for nearly three years. Two or more sessions were presented each year, with an average of 24 participants in each session. This study was conducted on four 15-hour sessions of Strategic Leadership that totaled 48 participants in December 2006. Several issues surrounding Strategic Leadership made the project more difficult. A comprehensive needs assessment was not conducted before Strategic Leadership was implemented. Without a defined need, an economic benefit may not be realized. Initially, the program was not designed to deliver a measurable ROI. Consequently, in the design stages, key performance measures were not identified as being linked to the program and specific objectives were not developed to measure performance improvement; data collection systems were not refined to link with Strategic Leadership; and performance data were scattered through .. .. Even with these difficulties, there was a need to measure ROI utilizing the most appropriate processes. Consequently, this project attempted to connect the program with a specific measurable return.

ROI PROCESS
To understand the ROI process, it is helpful to examine the key steps involved in developing the ROI. The first step is the collection of data after a program has been conducted. A variety of post-program data collection methods is available. Perhaps the most important step focuses on the issue of isolating the effects of training. In every training program, a variety of factors influence the output measures of business impact; training is only one of many influences that will drive a particular measure.
  • 123. 113 The next step in the ROI model is converting data to monetary values. Output measures must be converted to dollar values so they can be compared to the cost of the program to develop the ROI. Another essential step is to tabulate the program costs to determine the specific investment. All fully loaded costs that are related directly or indirectly to the training program are included. Finally, the costs and benefits come together in the ROI equation: the net benefits (program benefits minus program costs) are divided by the total investment in the training program and multiplied by 100 to derive a percentage. This process provides an ROI formula comparable to ROI calculations for other financial investments, which typically show the net earnings divided by the average investment. This methodology provides a framework to measure the ROI in any type of training program and is the model used in this study. The key decisions for the study involve selecting specific methods to collect data, isolate the effects of training, and convert data to monetary values. These three most difficult and critical steps in the process are described in more detail below.

DATA COLLECTION METHODS
In this impact study, two methods were used to collect data. A questionnaire was administered to the target audience to determine the extent to which participants have utilized the training and have achieved on-the-job success. The second method of data collection was the completion of an individualized action plan.

ISOLATING THE EFFECTS OF TRAINING
Although there are several strategies available to isolate the effects of training, most of the methods were not appropriate in this situation. Participants’ direct estimates were the most appropriate technique. Although they were subjective, their estimates of the impact of training can be a reliable indicator. The participants are the individuals closest to the performance improvement and are often aware of the other influences that have an impact on the performance measures. In studies where participants’ estimates have been compared to the differences obtained from control group experiments, their estimates were found to be very reliable.
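The ROI formula described under the ROI Process above reduces to a one-line calculation. The following is a minimal sketch, not part of the original study; the numbers in the example are hypothetical placeholders, and the study's actual figures appear later in this appendix.

```python
def roi_percent(program_benefits: float, program_costs: float) -> float:
    """ROI (%) = (program benefits - program costs) / program costs x 100."""
    return (program_benefits - program_costs) / program_costs * 100


# Hypothetical illustration only.
print(roi_percent(150_000, 50_000))  # 200.0, i.e., an ROI of 200%
```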
  • 124. 114 CONVERSION OF DATA
There are many ways in which data can be converted to monetary values, but the primary strategy used in this study was to ask participants to make estimates of the value of improvements in their work groups. In some cases, participants used accepted standards and conversion factors provided to them to arrive at monetary values.

DATA COLLECTION STRATEGY

TIMING OF DATA COLLECTION
Strategic Leadership is designed to have a long-term impact, but the specific improvements from training programs are difficult to capture years after the program is completed. Although the connection between a training program and specific improvement may exist, it is very difficult for the participants to make the connection. In addition, for longer periods, additional variables will influence output measures, thus complicating the relationship between training and performance improvement. Because of this, the evaluation was limited to the four programs conducted in December 2006. Participants’ input was limited to the three-month period of December 2006 through mid-March 2007. A standard practice in ROI evaluation for short-term training programs is to capture the first-year benefits after the program has been conducted. This, in essence, limits the benefits to one year of improvements. Although this could slightly overstate the results in some cases, it represents a conservative approach because the benefits obtained in subsequent years are not used in the analysis. This practice is used in this study, but for a three-month assessment prorated to an annual financial impact.

PROGRAM COSTS

COST COMPONENTS
When the ROI is developed, a tabulation of the costs for the program is necessary. A fully loaded cost profile was used in this study. This approach accounts for all of the costs of training so that management will fully understand the total costs of Strategic Leadership. The four cost elements considered in this analysis included:
• Participant salaries and benefits (average daily salary times benefits factor times number of program days)
• Instructor salaries and benefits (for direct work with the program)
  • 125. 115 • Ongoing cost of materials (handouts, purchased materials)
• Refreshments

The cost of participants’ salaries and benefits was estimated using their estimated average salary and benefits; benefits were determined at 42% of salary. These estimates were derived with input from Human Resources. The cost estimates based on these assumptions are:

Salaries and benefits for participants = $33,705
Salaries and benefits for instructor = $7,912
Cost of materials = $5,500
Cost of refreshments = $1,171
TOTAL PROGRAM COSTS = $48,288

IMPACT OF PROGRAM
Because the main thrust of this project is to determine the impact of Strategic Leadership, every attempt has been made to uncover specific results linked to the program. Although the program was not designed to produce measurable, quantifiable results, it did produce significant changes and did have a business impact, as outlined below. Overall, Strategic Leadership is a very meaningful skill-building program with reports of important and significant changes in skill levels after the program. Participants’ managers were more optimistic about skill changes, reporting slightly higher levels of change than the participants.

BUSINESS IMPACT
In an effort to calculate business impact, participants were asked to provide annualized dollar values representing specific improvements related to the training program. One hundred percent of the participants provided usable data expressed in dollar values. Participants were asked to indicate the percent of the improvement that is directly related to the training program. In addition, participants were asked to provide the level of confidence they placed in their estimate. Two adjustments were made to the data. First, the percent of the improvement related to training is multiplied by the dollar value. Second, the confidence level estimate, expressed as a percent, is multiplied by the dollar value to adjust for the
  • 126. 116 uncertainty of the data as perceived by the participant. These two adjustments yield an average per participant of $7,559. All 48 participants in the program furnished measurable data, and this yields for Strategic Leadership a value of $363,117, a significant impact for a 15-hour leadership program. A word of caution is needed here. These are subjective values, although adjustments have been made to bring them closer to the real values and possibly understate the results. From the perspective of the target audience, it is better to understate than overstate the results.

RETURN ON INVESTMENT
The final step in the impact equation is to calculate the ROI, which is perceived to be significant for this program. Although Strategic Leadership was not specifically designed to show a bottom-line impact, or ultimately a measurable return, the ROI was calculated. An acceptable target ROI is 25%. Using a benefit of $363,117 for Strategic Leadership (based on input from all 48 participants) and considering the program had a cost of $48,288, the estimated ROI becomes:

ROI = (($363,117 - $48,288) / $48,288) x 100 = 652%

Based on these assumptions and calculations, Strategic Leadership yields a very high estimate of ROI. In addition to this ROI, additional value can be attached to the improvement in business metrics, outlined above, as well as the change in skills experienced by the participants and other intangible benefits. Although the ROI was positive and significant, and the program shows important connections with business results, there is still much room for improvement, as suggested in the recommendations. Several important ingredients must be in place for the program to enhance business measures:
• There must be a comprehensive assessment conducted to determine needs at Level 3 (on-the-job behavior and environment) and at Level 4 (business impact). Without this, it becomes extremely difficult to evaluate the program at Levels 3 and 4.
• The program needs to have specific objectives at Levels 3 and 4 to provide direction to program designers, facilitators and evaluators as well as a focus
  • 127. 117 for the participants as they attempt to use the skills and knowledge on the job.
• A follow-up mechanism should be an integral part of the program instead of an add-on activity. This enhances the quality and quantity of the data obtained in the follow-up questionnaire.
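To make the arithmetic reported in this appendix easy to verify, the sketch below recomputes the fully loaded cost total and the 652% ROI from the figures given above. It is an illustrative reconstruction, not code from the study; the dictionary labels are mine, and only the dollar amounts, the 42% benefits factor, and the participant count come from the appendix.

```python
# Fully loaded cost components reported above (US dollars).
costs = {
    "participant salaries and benefits": 33_705,  # estimated with a 42% benefits factor
    "instructor salaries and benefits": 7_912,
    "materials": 5_500,
    "refreshments": 1_171,
}
total_cost = sum(costs.values())                       # 48,288

adjusted_benefit = 363_117   # annualized participant estimates after attribution and confidence adjustments
participants = 48
print(adjusted_benefit / participants)                 # ~7,565 per participant (appendix reports about $7,559)

roi = (adjusted_benefit - total_cost) / total_cost * 100
print(f"Total cost ${total_cost:,}; ROI {roi:.0f}%")   # Total cost $48,288; ROI 652%
```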
  • 128. Appendix D Informed Consent Form
  • 129. 119 INFORMED CONSENT FORM IMPACT OF TRAINING ON BUSINESS RESULTS USING RETURN ON INVESTMENT METHODOLOGY Purpose. You are invited to participate in a research study. The purpose of this study is to determine the return on investment of a leadership-training program in meeting the business needs of the Company. It focuses on the learning provided and the data collected with the questionnaire and the action plan. There is no deception in this study. We are interested in your input on objectives and results of leadership skill development. Participation Requirements. You will be asked to attend a multi-session leadership-training program, Strategic Leadership. In addition you will be asked to complete a paper-and-pencil (or electronic) questionnaire about your leadership training experience and application of learned skills, and a paper-and- pencil (or electronic) action plan focusing on your leadership objectives and results after a 90-day period after the training has been completed. The training will take place in the Training Center in accordance with the schedule promulgated separately. Research Personnel. Jack Kules, Director of Corporate Training and Development, is the only researcher directly involved in this research, and he can be contacted at any time at (314) 645-6600 ext. 3100 (work) or (314) 239-3290 (cell). Potential Risk/Discomfort. Although there are no known risks in this study, some of the information may be personally sensitive and will include questions about your personal leadership goals and objectives—which may be personally confidential to some people. You may withdraw from the study at any time and you may choose not to answer any question that you feel uncomfortable in answering. Potential Benefit. There is the benefit of personal feedback on your leadership skills and objectives from the researcher and other facilitators of the training program. No incentives are offered. The results will have scientific interest that may eventually have benefits for people who must conduct return on investment studies in an area such as training and development. Anonymity/Confidentiality. The data collected in this study are confidential. All data are coded such that your name is not associated with them. In addition, the coded data are made available only to the researcher associated with this study.
  • 130. 120 Right to Withdraw. You may decline to participate in the study and you have the right to withdraw from the study at any time without penalty. You may omit answering questions on the questionnaire and the action plan if you do not want to respond to them. We would be happy to answer any question that may arise about the study. Please direct your comments or questions to Jack Kules. Signatures I have read the above description of the Impact of Training on Business Results Using Return on Investment Methodology study and understand the conditions of my participation. My signature indicates that I agree to participate in the study. Participant’s Name: ______________________________ Participant’s Signature: ___________________________ Researcher’s Name: Jack L. Kules Researcher’s Signature: __________________________ Date: ______________
  • 131. 121 VERBATIM INSTRUCTIONS TO THE PARTICIPANTS

You have been selected and are required to participate in a leadership development training program at the Company. You are also being requested to voluntarily participate in a study on the impact of a training program on the business results of the Company, using return on investment methodology. In addition to attending the required training sessions, you will be requested to complete a questionnaire focused on the outcomes of the training program for your leadership skill development. You will also be requested to complete an action plan that is based on your goals and objectives for leadership skill development and your results, after three months, against these objectives.

In addition to the individualized feedback you will be given from the training program facilitators, as a result of your participation in this study you will receive relevant and useful feedback on your leadership goals and objectives through the questionnaire and the action plan, the Company will receive an analysis and feedback on the return on investment based on business results, and the researcher will receive sufficient and valuable data for consideration in the study. At all times the data collected in this study will remain confidential. All data are coded such that your name is not associated with them, and the coded data are made available only to the researcher associated with this study.

Should you have any questions, please direct your comments or questions to Jack Kules, Director of Corporate Training and Development and the researcher in this study, at (314) 645-6600 ext. 3100 (work) or (314) 239-3290 (cell). Your participation is greatly appreciated.
  • 132. Appendix E IRB Research Approval
  • 133. 123 IRB Research Approval

From: Chris Cozby [ccozby@ncu.edu]
Sent: Friday, December 08, 2006 3:42 PM
To: Jack.Kules@sbcglobal.net
Cc: Thomas Driver
Subject: IRB Project 2006_12_7 - 1

To Jack Kules

IRB Project 2006_12_7 - 1

The Northcentral University IRB has approved your research proposal titled Impact of Training on Business Results on Dec. 8, 2006. This approval extends for a period of one year. Please inform me when the project is completed; if not completed within one year, you will need to apply for an extension. In the interim, if there are any changes to the research protocol described in your proposal, a written change request describing the proposed changes must be submitted for approval. As applicable, you must also obtain approval from the site where you will be conducting your research. Please provide us with a copy of any such approval for our files.

Good luck with your project. Thank you.

Chris Cozby
Chair IRB
---------------------
Chris Cozby
Professor
School of Psychology
Northcentral University
505 W. Whipple
Prescott, AZ 86301
928-541-7777, ext 8054
888-327-2877, ext 8054
ccozby@ncu.edu
  • 134. Appendix F Question Data Tables from Questionnaires
  • 135. 125 Table F1 Question 1, Class 1 (n = 15) Stud Question Identifier ID A B C D E F G H I J K L M N O Total 1 3 3 4 3 2 3 3 4 3 3 3 4 4 3 2 47 2 3 3 4 3 2 3 3 4 3 2 3 3 4 3 2 45 3 3 4 4 3 2 3 2 4 4 2 3 4 4 3 2 47 4 2 3 4 2 2 3 2 4 3 2 2 3 3 3 2 40 5 4 4 4 3 3 4 4 4 4 3 3 4 4 3 3 54 6 3 3 4 2 3 3 3 3 3 2 2 3 3 3 2 42 7 4 3 4 3 3 4 4 4 3 3 3 4 4 4 3 53 8 3 3 4 2 2 3 4 3 3 2 2 3 4 3 2 43 9 2 3 4 2 2 3 3 3 3 2 2 3 3 3 2 40 10 5 4 4 3 4 4 4 4 4 4 3 4 4 4 3 58 11 3 3 4 2 3 3 3 3 3 2 2 3 3 3 2 42 12 4 4 4 3 3 4 4 4 3 3 3 4 4 3 3 53 13 4 4 4 4 3 4 4 4 4 3 3 4 4 4 3 56 14 3 3 3 3 2 3 3 3 3 2 2 3 4 3 2 42 15 3 3 4 3 2 3 3 3 3 2 2 3 3 3 2 42 Total 49 50 59 41 38 50 49 54 49 37 38 52 55 48 35 704 M 3.27 3.33 3.93 2.73 2.53 3.33 3.27 3.60 3.27 2.47 2.53 3.47 3.67 3.20 2.33 3.13 SD 0.80 0.49 0.26 0.59 0.64 0.49 0.70 0.51 0.46 0.64 0.52 0.52 0.49 0.41 0.49 0.71 Question 1, Class 2 (n = 12) Stud Question Identifier ID A B C D E F G H I J K L M N O Total 16 4 3 3 3 4 4 3 4 3 2 2 3 4 3 3 48 17 4 3 3 3 4 4 3 4 3 3 2 3 4 3 3 49 18 3 3 3 3 4 3 3 4 3 2 3 2 4 3 2 45 19 3 3 3 3 3 3 3 4 3 3 3 3 3 3 2 45 20 2 3 4 3 3 3 2 3 3 2 2 2 3 3 2 40 21 4 4 4 3 4 4 3 4 4 3 2 3 4 3 3 52 22 3 3 3 2 4 4 3 4 3 2 3 3 3 3 2 45 23 3 3 3 2 3 3 3 4 3 3 3 3 3 3 2 44 24 3 3 3 3 3 3 3 4 3 2 2 3 3 3 2 43 25 2 3 3 2 3 3 2 3 3 2 2 2 3 2 2 37 26 3 3 3 3 3 3 3 4 4 3 3 3 4 3 2 47 27 4 4 3 3 4 4 3 4 4 3 3 3 4 3 3 52 Total 38 38 38 33 42 41 34 46 39 30 30 33 42 35 28 547 M 3.17 3.17 3.17 2.75 3.50 3.42 2.83 3.83 3.25 2.50 2.50 2.75 3.50 2.92 2.33 3.04 SD 0.72 0.39 0.39 0.45 0.52 0.51 0.39 0.39 0.45 0.52 0.52 0.45 0.52 0.29 0.49 0.62
  • 136. 126 Table F1 (continued) Question 1, Class 3 (n = 10) Stud Question Identifier ID A B C D E F G H I J K L M N O Total 28 2 2 3 3 2 4 3 4 4 3 3 4 4 3 2 46 29 3 2 3 3 2 4 3 4 4 3 3 3 4 3 2 46 30 3 3 4 3 3 4 4 4 4 3 3 4 4 3 2 51 31 2 2 3 2 3 4 4 4 3 2 3 3 4 3 2 44 32 3 3 4 3 2 4 3 4 4 3 3 4 4 3 3 50 33 3 3 4 3 2 4 3 4 4 3 3 3 3 3 2 47 34 3 4 4 3 3 4 4 4 4 3 3 4 4 3 2 52 35 4 3 4 3 3 3 4 4 4 3 3 3 4 4 3 52 36 3 4 4 3 3 4 3 4 4 3 2 4 4 3 2 50 37 3 3 4 3 2 3 3 3 3 3 2 3 3 3 2 43 Total 29 29 37 29 25 38 34 39 38 29 28 35 38 31 22 481 M 2.90 2.90 3.70 2.90 2.50 3.80 3.40 3.90 3.80 2.90 2.80 3.50 3.80 3.10 2.20 3.21 SD 0.57 0.74 0.48 0.32 0.53 0.42 0.52 0.32 0.42 0.32 0.42 0.53 0.42 0.32 0.42 0.68 Question 1, Class 4 (n = 11) Stud Question Identifier ID A B C D E F G H I J K L M N O Total 28 2 2 3 3 2 4 3 4 4 3 3 4 4 3 2 46 29 3 2 3 3 2 4 3 4 4 3 3 3 4 3 2 46 30 3 3 4 3 3 4 4 4 4 3 3 4 4 3 2 51 31 2 2 3 2 3 4 4 4 3 2 3 3 4 3 2 44 32 3 3 4 3 2 4 3 4 4 3 3 4 4 3 3 50 33 3 3 4 3 2 4 3 4 4 3 3 3 3 3 2 47 34 3 4 4 3 3 4 4 4 4 3 3 4 4 3 2 52 35 4 3 4 3 3 3 4 4 4 3 3 3 4 4 3 52 36 3 4 4 3 3 4 3 4 4 3 2 4 4 3 2 50 37 3 3 4 3 2 3 3 3 3 3 2 3 3 3 2 43 Total 29 29 37 29 25 38 34 39 38 29 28 35 38 31 22 481 M 2.90 2.90 3.70 2.90 2.50 3.80 3.40 3.90 3.80 2.90 2.80 3.50 3.80 3.10 2.20 3.21 SD 0.57 0.74 0.48 0.32 0.53 0.42 0.52 0.32 0.42 0.32 0.42 0.53 0.42 0.32 0.42 0.68
  • 137. 127 Table F2 Question 2, Class 1 (n = 15) Question 2, Class 2 (n = 12) Stud Question Identifier Stud Question Identifier ID 1 2 3 4 5 6 Total ID 1 2 3 4 5 6 Total 1 3 3 2 4 4 3 19 16 3 3 3 4 4 4 21 2 3 3 3 4 4 4 21 17 3 3 3 4 4 3 20 3 3 4 4 4 5 3 23 18 3 3 3 4 4 4 21 4 4 4 3 4 5 4 24 19 3 3 2 4 4 3 19 5 3 3 2 3 4 3 18 20 3 2 2 3 3 2 15 6 3 3 3 4 4 4 21 21 2 2 2 3 3 3 15 7 3 4 4 4 5 3 23 22 3 3 3 4 4 4 21 8 4 4 3 4 5 4 24 23 3 3 3 4 4 3 20 9 3 3 2 3 4 3 18 24 3 3 2 4 4 4 20 10 3 3 3 4 4 4 21 25 3 3 2 4 4 3 19 11 3 4 4 4 5 3 23 26 3 2 2 3 4 4 18 12 4 4 3 4 5 4 24 27 4 3 3 4 5 3 22 13 3 3 2 4 4 3 19 14 3 3 3 4 4 4 21 15 3 4 4 4 5 3 23 Total 48 52 45 58 67 52 322 Total 36 33 30 45 47 40 231 M 3.20 3.47 3.00 3.87 4.47 3.47 3.58 M 3.00 2.75 2.50 3.75 3.92 3.33 3.21 SD 0.41 0.52 0.76 0.35 0.52 0.52 0.70 SD 0.43 0.45 0.52 0.45 0.51 0.65 0.71 Question 2, Class 3 (n = 10) Question 2, Class 4 (n = 11) Stud Question Identifier Stud Question Identifier ID 1 2 3 4 5 6 Total ID 1 2 3 4 5 6 Total 28 2 3 3 4 4 3 19 38 4 3 4 4 4 4 23 29 3 3 3 4 4 3 20 39 3 3 4 4 4 3 21 30 4 3 3 4 5 4 23 40 3 3 4 4 4 3 21 31 3 2 3 4 4 3 19 41 2 2 3 4 3 3 17 32 2 2 2 4 4 3 17 42 2 2 3 4 3 3 17 33 3 2 3 4 4 4 20 43 2 2 3 3 4 3 17 34 4 3 3 4 5 3 22 44 3 3 4 4 4 3 21 35 3 3 3 3 4 3 19 45 3 3 4 4 4 3 21 36 2 3 3 4 4 4 20 46 4 3 4 4 4 4 23 37 3 2 3 4 4 3 19 47 3 3 4 4 4 3 21 48 3 3 4 4 4 3 21 Total 29 26 29 39 42 33 198 Total 32 30 41 43 42 35 223 M 2.90 2.60 2.90 3.90 4.20 3.30 3.30 M 2.91 2.73 3.73 3.91 3.82 3.18 3.38 SD 0.74 0.52 0.32 0.32 0.42 0.48 0.74 SD 0.70 0.47 0.47 0.30 0.40 0.40 0.65
  • 138. 128 Table F3 Question 3, Class 1 (n = 15) Question 3, Class 2 (n = 12) Stud Question Identifier Stud Question Identifier ID A B C D 5 Total ID 1 2 3 4 5 Total 1 3.5 3.0 3.0 3.3 3.6 16.4 16 4.0 3.5 3.5 4.7 4.2 19.9 2 4.0 3.5 3.5 4.0 3.8 18.8 17 3.5 3.0 3.5 4.3 4.0 18.3 3 3.0 3.0 3.0 3.7 3.8 16.5 18 3.0 3.0 3.0 3.7 3.2 15.9 4 4.0 3.5 3.5 4.0 3.8 18.8 19 2.5 3.0 3.0 3.3 3.0 14.8 5 3.5 3.0 3.0 3.3 3.4 16.2 20 3.0 3.0 3.0 3.3 3.4 15.7 6 3.5 3.0 3.0 3.7 3.6 16.8 21 3.5 3.0 3.5 4.0 3.6 17.6 7 3.0 3.0 3.0 3.3 3.4 15.7 22 4.0 3.5 3.5 4.3 4.4 19.7 8 4.0 3.5 3.5 4.0 3.8 18.8 23 3.5 3.0 3.5 4.0 3.8 17.8 9 4.5 3.5 4.0 4.0 4.0 20.0 24 3.0 3.0 3.0 3.7 3.4 16.1 10 4.0 3.5 3.5 4.0 4.0 19.0 25 2.5 3.0 3.0 3.3 3.0 14.8 11 4.0 3.5 3.5 4.0 4.0 19.0 26 3.0 3.0 3.0 3.3 3.2 15.5 12 3 3.0 3.0 4.0 3.8 16.8 27 3.5 3.0 3.5 3.7 3.8 17.5 13 2.5 3.0 3.0 3.7 3.6 15.8 14 3.5 3.0 3.0 3.7 3.8 17.0 15 3.5 3.0 3.0 3.7 3.6 16.8 Total 53.5 48.0 48.5 56.4 56.0 262.4 Total 39.0 37.0 39.0 45.6 43.0 203.6 M 3.57 3.20 3.23 3.76 3.73 3.50 M 3.25 3.08 3.25 3.80 3.58 3.39 SD 0.53 0.25 0.32 0.27 0.19 0.41 SD 0.50 0.19 0.26 0.47 0.46 0.47 Question 3, Class 3 (n = 10) Question 3, Class 4 (n = 11) Stud Question Identifier Stud Question Identifier ID 1 2 3 4 5 Total ID 1 2 3 4 5 Total 28 4.0 4.0 4.0 4.7 4.4 21.1 38 3.0 3.5 3.5 3.3 3.6 16.9 29 4.0 3.5 4.0 4.3 4.2 20.0 39 3.0 3.5 3.5 3.3 3.8 17.1 30 3.5 3.5 3.5 4.0 3.8 18.3 40 3.0 3.5 3.5 3.3 3.4 16.7 31 3.5 3.0 3.5 3.7 3.6 17.3 41 3.5 3.5 4.0 3.7 3.8 18.5 32 3.0 3.0 3.5 3.3 3.4 16.2 42 3.5 3.5 4.0 3.7 4.0 18.7 33 3.0 2.5 3.0 3.0 3.2 14.7 43 3.5 3.5 3.5 3.7 3.6 17.8 34 4.0 3.5 4.0 3.7 4.0 19.2 44 3.0 3.0 3.5 3.3 3.8 16.6 35 3.5 3.0 3.5 3.3 3.4 16.7 45 3.5 3.5 3.5 3.3 3.6 17.4 36 3.0 3.0 3.5 3.3 3.4 16.2 46 4.0 3.5 4.0 4.0 4.0 19.5 37 2.5 3.0 3.0 3.0 3.0 14.5 47 4.0 4.0 4.0 4.3 4.4 20.7 48 4.0 4.0 3.5 4.0 4.2 19.7 Total 34.0 32.0 35.5 36.3 36.4 174.2 Total 38.0 39.0 40.5 39.9 42.2 199.6 M 3.40 3.20 3.55 3.63 3.64 3.48 M 3.45 3.55 3.68 3.63 3.84 3.63 SD 0.52 0.42 0.37 0.56 0.45 0.48 SD 0.42 0.27 0.25 0.36 0.29 0.34
  • 139. 129 Table F4 Question 4, Class 1 (n – 15) Stud Question Identifier ID A B C D E F G H I J K L M Total 1 3 3 2 4 3 4 2 3 3 2 4 3 3 39 2 3 4 3 3 3 4 2 2 2 2 3 3 2 36 3 3 3 3 4 3 4 2 3 3 3 3 3 2 39 4 4 4 3 3 3 3 2 2 2 3 3 3 2 37 5 3 3 2 4 3 3 2 3 3 3 4 3 3 39 6 3 4 3 3 4 4 2 2 2 2 4 3 3 39 7 3 3 3 4 3 4 2 3 3 2 3 3 2 38 8 4 4 3 3 3 4 3 3 2 3 3 4 3 42 9 3 3 2 5 3 3 2 3 3 2 3 3 2 37 10 3 4 3 3 4 3 2 2 2 2 4 3 3 38 11 3 3 3 4 3 4 2 3 3 3 4 3 3 41 12 4 4 3 3 3 4 2 2 2 3 3 3 2 38 13 3 3 2 4 3 4 2 3 3 3 3 3 2 38 14 3 4 3 3 3 3 2 2 2 2 3 3 2 35 15 3 3 3 4 3 3 2 3 3 2 4 3 3 39 Total 48 52 41 54 47 54 31 39 38 37 51 46 37 575 M 3.20 3.47 2.73 3.60 3.13 3.60 2.07 2.60 2.53 2.47 3.40 3.07 2.47 2.95 SD 0.41 0.52 0.46 0.63 0.35 0.51 0.26 0.51 0.52 0.52 0.51 0.26 0.52 0.66 Question 4, Class 2 (n = 12) Stud Question Identifier ID A B C D E F G H I J K L M Total 16 3 3 2 3 3 4 3 3 2 3 4 4 3 40 17 4 3 3 4 3 4 2 3 3 3 3 3 2 40 18 3 4 3 3 3 3 2 2 3 2 4 3 2 37 19 3 3 2 4 4 4 3 2 3 3 3 4 3 41 20 4 3 3 3 3 4 2 3 2 3 4 3 2 39 21 3 4 3 4 3 3 2 3 3 2 3 3 2 38 22 3 3 2 3 3 4 3 2 3 3 4 4 3 40 23 3 3 3 4 4 4 2 2 3 3 3 3 2 39 24 4 4 3 3 3 3 2 3 2 2 4 3 2 38 25 3 3 2 4 3 4 3 3 3 3 3 4 3 41 26 3 3 3 3 3 4 2 2 3 3 4 3 2 38 27 3 4 3 4 4 3 2 2 3 2 3 3 2 38 Total 39 40 32 42 39 44 28 30 33 32 42 40 28 469 M 3.25 3.33 2.67 3.50 3.25 3.67 2.33 2.50 2.75 2.67 3.50 3.33 2.33 3.01 SD 0.45 0.49 0.49 0.52 0.45 0.49 0.49 0.52 0.45 0.49 0.52 0.49 0.49 0.66
  • 140. 130 Table F4 (continued) Question 4, Class 3 (n = 10) Stud Question Identifier ID A B C D E F G H I J K L M Total 28 4 4 3 4 3 3 2 2 2 2 3 4 3 39 29 3 3 3 4 3 4 2 2 2 3 4 3 2 38 30 4 3 2 3 3 4 2 2 3 3 4 3 3 39 31 3 4 3 3 4 4 2 3 3 3 3 4 2 41 32 4 3 3 4 3 3 3 3 3 3 4 3 3 42 33 3 3 2 4 3 4 2 2 2 2 4 3 2 36 34 4 4 3 3 3 4 2 2 2 3 3 4 3 40 35 3 3 3 3 4 4 2 2 3 3 4 3 2 39 36 4 3 2 4 3 3 2 3 3 3 4 3 3 40 37 3 4 3 4 3 4 3 3 3 3 3 4 2 42 Total 35 34 27 36 32 37 22 24 26 28 36 34 25 396 M 3.50 3.40 2.70 3.60 3.20 3.70 2.20 2.40 2.60 2.80 3.60 3.40 2.50 3.05 SD 0.53 0.52 0.48 0.52 0.42 0.48 0.42 0.52 0.52 0.42 0.52 0.52 0.53 0.69 Question 4, Class 4 (n = 11) Stud Question Identifier ID A B C D E F G H I J K L M Total 38 4 3 3 3 4 3 3 2 3 3 4 3 3 41 39 3 4 3 3 3 4 2 2 2 3 4 4 2 39 40 4 4 3 4 3 4 3 3 2 2 3 3 2 40 41 4 3 3 4 3 4 2 3 2 3 3 3 3 40 42 3 4 2 3 4 4 3 2 3 2 4 3 2 39 43 3 4 2 3 3 3 2 2 3 3 4 3 2 37 44 4 3 2 4 3 4 3 3 3 2 3 4 3 41 45 4 4 3 4 3 4 2 3 2 3 3 3 2 40 46 4 4 3 3 4 4 3 2 2 2 4 3 2 40 47 3 3 3 3 3 4 2 2 2 3 4 3 3 38 48 3 4 2 4 3 3 3 3 3 2 3 3 2 38 Total 39 40 29 38 36 41 28 27 27 28 39 35 26 433 M 3.55 3.64 2.64 3.45 3.27 3.73 2.55 2.45 2.45 2.55 3.55 3.18 2.36 3.03 SD 0.52 0.50 0.50 0.52 0.47 0.47 0.52 0.52 0.52 0.52 0.52 0.40 0.50 0.70
  • 141. Appendix G ROI Data Table from Action Plans
  • 142. 132 Table G1 Complete Action Plan Input from Participants Annualized Student Improvement Contribution Confidence Adjusted ID Value ($) Basis for Value Estimate (%) (%) Value ($) Improvement in efficiency of group at 1 21,000 $1,750 per month for 12 months 60 50 6,300 Improvement in efficiency of group at 2 30,000 $2,500 per month for 12 months 40 50 6,000 Under budget for the year by this amount 3 20,000 40 60 4,800 Improved effectiveness due to delegation of responsibilities ($1,500 per month for 12 months) 4 18,000 45 50 4,050 Improvement in efficiency of group at 5 26,400 $2,200 per month for 12 months 55 60 8,712 5% improvement in group productivity 6 16,000 ($320,000 x 5%) 50 50 4,000 20% improvement in group productivity 7 30,000 ($150,000 x 20%) 35 60 6,300 Improvement in efficiency of group at 8 25,200 $2,100 per month for 12 months 30 75 5,670 Turnover reduction, one per year at $51,120 each (base salary $36,000 times 9 51,120 1.42) 25 50 6,390 12.5% improvement in group productivity 10 15,000 ($120,000 x 12.5%) 80 80 9,600 Improvement in efficiency of group at 11 25,800 $2,150 per month for 12 months 75 60 11,610 Absenteeism reduction (80 absences per 12 9,600 year at $120 per absence) 60 50 2,880 Turnover reduction, one per year at $71,000 each (base salary $50,000 times 13 71,000 1.42) 35 40 9,940 Absenteeism reduction (60 absences per 14 12,000 year at $200 per absence) 80 75 7,200 One lost time accident per year at $27,500 each 15 55,000 40 50 11,000 Class 1 426,120 104,452 Totals
  • 143. 133 Table G1 (continued) Complete Action Plan Input from Participants Annualized Student Improvement Contribution Confidence Adjusted ID Value ($) Basis for Value Estimate (%) (%) Value ($) Improvement in customer service responsiveness from ten hours to six hours (estimated value is $2,500 per month) 16 30,000 40 60 7,200 Improvement in customer service responsiveness from ten hours to six hours (estimated value is $1,750 per month) 17 21,000 60 60 7,560 Improved effectiveness due to delegation of responsibilities ($1,600 per month for 12 18 19,200 months) 40 60 4,608 Improvement in customer service responsiveness from eight hours to six hours (estimated value is $2,000 per month) 19 24,000 75 50 9,000 Improvement in customer service responsiveness from 14 hours to 10 hours (estimated value is $1,500 per month) 20 18,000 60 90 9,720 Improvement in customer service responsiveness from ten hours to eight hours (estimated value is $1,200 per month) 21 14,400 50 100 7,200 Improvement in efficiency of group at 22 30,000 $2,500 per month for 12 months 45 50 6,750 Improvement in customer service responsiveness from eight hours to four hours (estimated value is $2,400 per month) 23 28,800 50 80 11,520 Improvement in customer service responsiveness from ten hours to four hours (estimated value is $2,500 per month) 24 30,000 70 50 10,500 Improvement due to time management of 60 hours per month (estimated value is $96 per hour) 25 5,760 60 90 3,110 Improved effectiveness due to effective communications ($1,200 per month for 12 months) 26 14,400 60 60 5,184 Improvement in customer service responsiveness from ten hours to six hours (estimated value is $1,600 per month) 27 19,200 60 100 11,520 Class 2 254,760 93,872 Totals
  • 144. 134 Table G1 (continued) Complete Action Plan Input from Participants Annualized Student Improvement Contribution Confidence Adjusted ID Value ($) Basis for Value Estimate (%) (%) Value ($) Two lost time accidents per year at $24,000 28 48,000 each 60 50 14,400 Improvement in efficiency of group at 29 18,000 $1,500 per month for 12 months 60 80 8,640 Improved effectiveness due to delegation of responsibilities ($1,000 per month for 12 30 12,000 months) 50 80 4,800 10% improvement in group productivity 31 21,500 ($215,000 x 10%) 45 65 6,289 Improvement in efficiency of group at 32 19,200 $1,600 per month for 12 months 50 60 5,760 Improvement due to time management of 36 hours per month (estimated value is $81 per hour) 33 2,916 85 90 2,231 Absenteeism reduction (50 absences per year at $240 per absence) 34 12,000 50 50 3,000 Improved effectiveness due to delegation of responsibilities ($2,400 per month for 12 months) 35 28,800 40 75 8,640 Improvement due to time management of 25 hours per month (estimated value is $100 per hour) 36 2,500 75 80 1,500 Under budget for the year by this amount 37 60,000 50 70 21,000 Class 3 224,916 76,260 Totals
  • 145. 135 Table G1 (continued) Complete Action Plan Input from Participants Annualized Student Improvement Contribution Confidence Adjusted ID Value ($) Basis for Value Estimate (%) (%) Value ($) Improvement due to time management of 40 hours per month (estimated value is $120 per hour) 80 90 3,456 38 4,800 Improvement in efficiency of group at 39 24,000 $2,000 per month for 12 months 45 50 5,400 12.5% improvement in group productivity 40 18,750 ($150,000 x 12.5%) 90 80 9,000 Improved effectiveness due to delegation of responsibilities ($1,050 per month for 12 months) 50 75 4,725 41 12,600 Two lost time accident per year at $16,000 42 28,000 each 40 60 7,680 Improved effectiveness due to delegation of responsibilities ($2,650 per month for 12 months) 50 60 9,540 43 31,800 Turnover reduction, one per year at $85,200 each (base salary $60,000 times 44 85,200 1.42) 40 40 13,632 One lost time accident per year at $16,000 45 16,000 each 60 60 5,760 Turnover reduction, one per year at $85,200 each (base salary $60,000 times 1.42) 30 50 12,780 46 85,200 Under budget for the year by this amount 47 45,000 45 60 10,800 Improved effectiveness due to effective communications ($1,000 per month for 12 months) 60 80 5,760 48 12,000 Class 4 363,350 88,533 Totals Grand 1,269,146 363,117 Total
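The adjusted values in Table G1 follow the two adjustments described in Appendix C: each annualized improvement is discounted by the participant's contribution (attribution) estimate and again by the confidence estimate. Below is a minimal sketch of that calculation, checked against the first two rows of the table; the row values come from the table, while the function itself is illustrative.

```python
def adjusted_value(annualized_value: float, contribution_pct: float, confidence_pct: float) -> float:
    """Discount an annualized improvement estimate by attribution and confidence percentages."""
    return annualized_value * (contribution_pct / 100) * (confidence_pct / 100)


# (student ID, annualized value $, contribution %, confidence %) from Table G1, Class 1.
rows = [(1, 21_000, 60, 50), (2, 30_000, 40, 50)]
for student_id, value, contribution, confidence in rows:
    print(student_id, adjusted_value(value, contribution, confidence))  # 6300.0 and 6000.0, matching the table
```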
  • 146. Appendix H Values of One-Way ANOVA Tables
  • 147. 137 Table H1 Values for One-Way ANOVA of Question 1 Class 1 Class 2 Class 3 Class 4 Ave. Ave. Ave. Ave. 2 2 2 2 Score X Score X Score X Score X A 3.27 10.693 3.17 10.049 2.90 8.410 3.45 11.903 B 3.33 11.089 3.17 10.049 2.90 8.410 3.64 13.250 C 3.93 15.445 3.17 10.049 2.70 13.690 3.55 12.603 D 2.73 7.453 2.75 7.563 2.90 8.410 3.45 11.903 E 2.53 6.401 3.50 12.250 2.50 6.250 2.82 7.952 F 3.33 11.089 3.42 11.696 3.80 14.440 3.73 13.913 G 3.27 10.693 2.83 8.009 3.40 11.560 3.27 10.693 H 3.60 12.960 3.83 14.669 3.90 15.210 3.73 13.913 I 3.27 10.693 3.25 10.563 3.80 14.440 3.82 14.592 J 2.47 6.101 2.50 6.250 2.90 8.410 2.55 6.502 K 2.53 6.401 2.50 6.250 2.80 7.840 2.73 7.453 L 3.47 12.041 2.75 7.563 3.50 12.250 3.09 9.548 M 3.67 13.469 3.50 12.250 3.80 14.440 3.64 13.250 N 3.20 10.240 2.92 8.526 3.10 9.610 3.73 13.913 O 2.33 5.429 2.33 5.429 2.20 4.840 3.09 9.548 n 15 15 15 15 N = 60 X 46.93 45.59 48.10 50.29 X = 190.910 2 (X) /N = 607.440 (x ) (X ) = 620.509 2 2 150.197 141.165 158.210 170.937 (X) /n = 608.238 2 2 (x) /n 146.83 138.563 154.241 168.606
  • 148. 138 Table H2 Values for One-Way ANOVA of Question 2 Class 1 Class 2 Class 3 Class 4 Ave. Ave. Ave. Ave. 2 2 2 2 Score X Score X Score X Score X 1 3.24 10.240 3.00 9.000 2.90 8.410 2.91 8.468 2 3.47 12.041 3.75 14.063 2.60 6.760 2.73 7.453 3 3.00 9.000 2.50 6.250 2.90 8.410 3.73 13.913 4 3.87 14.977 3.75 14.063 3.90 15.210 3.91 15.288 5 4.47 19.981 3.92 15.366 4.20 17.640 3.82 14.592 6 3.47 12.041 3.33 11.089 3.30 10.890 3.18 10.112 N 6 6 6 6 N = 24 X 21.48 20.25 19.80 20.28 X = 81.81 2 (X) /N = 278.870 (x ) (X ) = 285.237 2 2 78.280 69.811 67.320 69.826 (X) /n = 279.128 2 2 (x) /n 76.898 68.344 65.340 68.546 Table H3 Values for One-Way ANOVA of Question 3 Class 1 Class 2 Class 3 Class 4 Ave. Ave. Ave. Ave. 2 2 2 2 Score X Score X Score X Score X A 3.57 12.745 3.25 10.563 3.40 11.560 3.45 11.903 B 3.20 10.240 3.08 9.486 3.20 10.240 3.55 12.603 C 3.23 10.433 3.25 10.563 3.55 12.603 3.68 13.542 D 3.76 14.138 3.80 14.440 3.63 13.177 3.63 13.177 E 3.75 14.063 3.58 12.816 3.64 13.250 3.84 14.746 N 5 5 5 5 N = 20 X 17.51 16.96 17.42 18.15 X = 70.04 (X)2/N = 245.280 (x ) (X ) = 246.288 2 2 61.619 57.868 60.830 65.971 (X) /n = 245.424 2 2 (x) /n 61.132 57.528 60.691 65.885
  • 149. 139 Table H4 Values for One-Way ANOVA of Question 4 Class 1 Class 2 Class 3 Class 4 Ave. Ave. Ave. Ave. 2 2 2 2 Score X Score X Score X Score X A 3.20 10.240 3.25 10.563 3.50 12.250 3.55 12.603 B 3.47 12.041 3.33 11.089 3.40 11.560 3.64 13.250 C 2.73 7.453 2.67 7.129 2.70 7.290 2.64 6.970 D 3.60 12.960 3.50 12.250 3.60 12.960 3.45 11.903 E 3.13 9.797 3.25 10.563 3.20 10.240 3.27 10.693 F 3.60 12.960 3.67 13.469 3.70 13.690 3.73 13.913 G 2.07 4.285 2.33 5.429 2.20 4.840 2.55 6.503 H 2.60 6.760 2.50 6.250 2.40 5.760 2.45 6.003 I 2.53 6.401 2.75 7.563 2.60 6.760 2.45 6.003 J 2.47 6.101 2.67 7.129 2.80 7.840 2.55 6.503 K 3.40 11.560 3.50 12.250 3.60 12.960 3.55 12.603 L 3.07 9.425 3.33 11.089 3.40 11.560 3.18 10.112 M 2.47 6.101 2.33 5.429 2.50 6.250 2.36 5.570 N 13 13 13 13 N = 52 X 38.34 39.08 39.60 39/37 X = 156.39 (X)2/N = 470.343 (x ) (X ) = 482.875 2 2 116.084 120.202 123.960 122.629 (X) /n = 470.413 2 2 (x) /n 113.074 117.480 120.628 119.231
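The group sums, sums of squared scores, and correction terms tabulated above are the standard building blocks of a one-way ANOVA. The sketch below shows how an F statistic would be computed from raw group scores using those same computational formulas; it uses hypothetical scores and is not a re-analysis of the study data.

```python
from typing import Sequence


def one_way_anova(groups: Sequence[Sequence[float]]) -> tuple[float, int, int]:
    """One-way ANOVA from group sums and sums of squares, as laid out in Tables H1-H4.
    Returns (F statistic, between-groups df, within-groups df)."""
    k = len(groups)
    n_total = sum(len(g) for g in groups)
    grand_sum = sum(sum(g) for g in groups)
    correction = grand_sum ** 2 / n_total                          # (sum of all X)^2 / N
    ss_total = sum(x * x for g in groups for x in g) - correction  # sum of X^2 minus correction
    ss_between = sum(sum(g) ** 2 / len(g) for g in groups) - correction
    ss_within = ss_total - ss_between
    df_between, df_within = k - 1, n_total - k
    return (ss_between / df_between) / (ss_within / df_within), df_between, df_within


# Hypothetical group scores (not the study's data).
f_stat, df_b, df_w = one_way_anova([[3.2, 3.4, 3.1], [2.9, 3.0, 3.3], [3.5, 3.7, 3.4]])
print(f"F({df_b}, {df_w}) = {f_stat:.2f}")
```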
  • 150. Appendix I ROI Data Tables
  • 151. 141 Table I1 Mean ROI Across the Four Classes Class 1 Class 2 Class 3 Class 4 Student Student Student Student ID ROI (%) ID ROI (%) ID ROI (%) ID ROI (%) 01 561 16 616 28 1,150 38 203 02 530 17 651 29 650 39 374 03 404 18 358 30 317 40 689 04 325 19 795 31 446 41 314 05 814 20 866 32 400 42 574 06 320 21 616 33 94 43 737 07 561 22 571 34 160 44 1,096 08 495 23 1,045 35 650 45 405 09 571 24 944 36 30 46 1,021 10 907 25 209 37 1,723 47 847 11 1,118 26 415 48 405 12 202 27 1,045 13 943 14 656 15 1,054 Total 9,641 Total 8,137 Total 5,620 Total 6,665 M 631 M 678 M 562 M 606 SD 279 SD 270 SD 524 SD 296
  • 152. 142 Table I2 Values for One-Way ANOVA of ROI Results Class 1 Class 2 Class 3 Class 4 2 2 2 2 ROI % X ROI % X ROI % X ROI % X 561 314,721 616 379,456 1,150 1,322,500 203 41,209 530 280,900 651 423,801 650 422,500 374 139,876 404 163,216 358 128,164 317 100,489 689 474,721 325 105,625 795 632,025 446 198,916 314 98,596 814 662,596 866 749,956 400 160,000 574 329,476 320 102,400 616 379,456 94 8,836 737 543,169 561 314,721 571 326,041 160 25,600 1,096 1,201,216 495 245,025 1,045 1,092,025 650 422,500 405 164,025 571 326,041 944 891,136 30 900 1,021 1,042,441 907 822,649 209 43,681 1,723 2,968,729 847 717,409 1,118 1,249,924 415 172,225 405 162,025 202 408,094 1,045 1,092,025 943 889,249 656 430,336 1,054 1,110,916 n 15 12 10 11 N = 48 X 9,461 8,131 5,620 6,665 X = 29,877 (x ) (X ) = 20,352,931 2 2 4,587,822 5,217,976 5,630,970 4,916,163 2 (X) /N =18,596,565 (X) /n =18,673,622 2 2 (x) /n 5,967,368 5,509,430 3,158,440 4,038,384
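The per-class summary rows in Table I1 (M and SD) can be reproduced directly from the listed ROI percentages; for Class 1 the listed values sum to 9,461, consistent with Table I2 and the reported mean of 631. A short sketch using the Class 1 column as a check; only the data are taken from the table.

```python
from statistics import mean, stdev

# ROI percentages for Class 1 participants, as listed in Table I1.
class1_roi = [561, 530, 404, 325, 814, 320, 561, 495, 571, 907, 1118, 202, 943, 656, 1054]

print(sum(class1_roi))            # 9461
print(round(mean(class1_roi)))    # 631
print(round(stdev(class1_roi)))   # 279 (sample standard deviation)
```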
