THE ASSESSMENT CENTER METHOD AND METHODOLOGY
NEW APPLICATIONS AND TECHNOLOGIES
A MONOGRAPH BY WILLIAM C. BYHAM, PH.D.
TABLE OF CONTENTS

INTRODUCTION

I. ASSESSMENT CENTER METHOD
   1. How an Assessment Center Works
   2. Validity and Fairness
   3. Adoption of the Assessment Center Method Outside the United States
   4. New Applications
   5. New Simulations, Tests, and Methods
   6. New Technology

II. ASSESSMENT CENTER METHODOLOGY
   7. What is Assessment Center Methodology?
   8. Targeted Selection®: Obtaining Behavior in Interviews
   9. Targeted Observation: Obtaining Behavioral Information from Direct Observation of Performance
   10. Third-party Evaluations
   11. Developing Integrated Systems
   12. The Future of Assessment Centers
   13. Conclusion

REFERENCES

© Development Dimensions International, Inc., MCMXCIV-MMIV. All rights reserved under U.S., International, and Universal Copyright Conventions. Reproduction in whole or part without written permission from DDI is prohibited.
INTRODUCTION

The "assessment center method" is the name given to the formal assessment approach pioneered by AT&T in the United States and now used by thousands of organizations worldwide. In the most common application of this method, three or more line managers observe a group of six assessees participating in a series of exercises that simulate tasks related to the job or job level for which they are being assessed. After participants have completed the exercises, assessors meet to consider each participant against a predetermined list of job-related dimensions to reach an overall evaluation. (See How an Assessment Center Works for a complete description.)

Twenty-one years ago the Harvard Business Review published an article I authored entitled "Assessment Centers for Spotting Future Managers" (Byham, 1970). This was the first article that described the assessment center method for a general audience. Since that time it has been widely reprinted and quoted. This monograph is an update and expansion of that original article.

Two thrusts have characterized assessment centers in the last 21 years:
1. Applications of the classic assessment center described in the Harvard Business Review article have expanded widely, and new technology has increased validity, reliability, and efficiency of application.
2. The key components that produced the validity of the assessment center method have been applied in many other personnel functions, including selection interviewing, reference checking, resume screening, and observing and evaluating on-the-job performance.

In recognition of these two thrusts, this monograph is divided into two sections:
I. Assessment Center Method, Applications, and Technologies
II. Assessment Center Methodology

Since 1970 I have written, co-written, edited, or co-edited more than 60 books and articles about assessment centers. This monograph will make no attempt to repeat or even summarize the information in those published materials. Instead, it will overview new developments and provide a glimpse of the future.

The reader is advised to follow up on specific interests through the supplemental reading suggestions provided after many sections of the monograph. Additionally, the following books and monographs are suggested as background reading:
> Ashe, L., Todd, K., & Byham, W. C. (1991). Employee evaluation for the 1990s: Paper-and-pencil tests, assessment centers, performance appraisals, and interviews. A review of court cases and discussion of future prospects. Pittsburgh, PA: Development Dimensions International.
> Byham, W. C. (1987). Applying a systems approach to personnel activities (Monograph IX). Pittsburgh, PA: Development Dimensions International.
> Byham, W. C. (1989). Targeted selection: A behavioral approach to improved hiring decisions (Basic concepts and methodology) (Monograph XIV, rev. ed.). Pittsburgh, PA: Development Dimensions International.
> Hauenstein, P., & Byham, W. C. (1989). Understanding job analysis (Monograph XI). Pittsburgh, PA: Development Dimensions International.
> Thornton, G. C. III, & Byham, W. C. (1982). Assessment centers and managerial performance. New York: Academic Press.
> Wellins, R., Byham, W., & Wilson, J. (1991). Empowered teams: Creating self-directed work groups that improve quality, productivity, and participation. San Francisco: Jossey-Bass.

These books and monographs are available from Development Dimensions International (DDI).
I. ASSESSMENT CENTER METHOD

HOW AN ASSESSMENT CENTER WORKS
The assessment center method involves multiple evaluation techniques, including various types of job-related simulations, and sometimes interviews and psychological tests. Common job simulations used in assessment centers are:
> In-basket exercises
> group discussions
> simulations of interviews with "subordinates" or "clients"
> fact-finding exercises
> analysis/decision-making problems
> oral presentation exercises
> written communication exercises

Simulations are designed to bring out behaviors relevant to the most important aspects of the position or level for which the assessees are being considered. Known as "dimensions," these aspects of the job are identified prior to the assessment center by analyzing the target position. A job analysis procedure identifies the behaviors, motivations, and types of knowledge that are critical for success in the target position. During assessment, the job simulations bring out assessees' behavior or knowledge in the target dimensions.

A typical assessment center involves six participants and lasts from one to three days. As participants work through the simulations, they are observed by assessors (usually three line managers) who are trained to observe and evaluate behavior and knowledge level. Assessors observe different participants in each simulation and take notes on special observation forms. After participants have completed their simulations, assessors spend one or more days sharing their observations and agreeing on evaluations. If used, test and interview data are integrated into the decision-making process. The assessors' final assessment, contained in a written report, details participants' strengths and development needs, and may evaluate their overall potential for success in the target position if that is the purpose of the center.

Perhaps the most important feature of the assessment center method is that it relates not to current job performance but to future performance. By observing how a participant handles the problems and challenges of the target job or job level (as simulated in the exercises), assessors get a valid picture of how that person would perform in the target position. This is especially useful when assessing individuals who hold jobs that don't offer them an opportunity to exhibit behavior related to the target position or level. This is often the case with individuals who aspire to management positions but presently hold positions that don't give them an opportunity to exhibit management-related behavior on the job.

In addition to improved accuracy in diagnosis and selection, the organization that operates an assessment center enjoys a number of indirect benefits. Candidates accept the fairness and accuracy of promotion decisions more readily and have a better understanding of job requirements. Training managers to be assessors increases their skills in many other managerial tasks, such as handling performance appraisals and conducting coaching and feedback discussions.
VALIDITY AND FAIRNESS
In 1970 the assessment center method was unique in that extensive research had established its validity before it came into popular use. The assessment center method, in its modern form, came into existence as a result of the AT&T Management Progress Study (Bray, Campbell, & Grant, 1974). In this study, which began in the late 1950s, individuals entering management positions in Bell Telephone operating companies were assessed and, from then on, their careers were followed. The study was unusual in that it was pure research. Neither the individuals assessed nor their bosses were given information about their performance in the center. Nor was this information in any way allowed to affect participants' careers. Participants were assessed soon after they entered management as new college recruits or after they were promoted from the ranks. The 1970 Harvard Business Review article presented the results from the first eight years of the study.

Additional data from this landmark study are now available. Not only have researchers followed participant advancement during the ensuing years, but a second assessment also was conducted eight years after the first (Howard & Bray, 1988). Table 1 shows the validity of both assessment predictions. The criterion used was advancement to the fourth level of management in a seven-level hierarchy. The eight-year prediction is more valid—an expected finding since most individuals would have begun to consolidate their management skills after eight years in management. Yet the original assessment ratings were still valid—even after 20 years.

Table 1. Ratings at Original Assessment and Eight Years Later, and Management Level Attained at Year 20

   Original Assessment Rating of Potential        N     Attained Fourth Level
   Predicted to Achieve Fourth Level or Higher    25    60%
   Predicted to Achieve Third Level               23    25%
   Predicted to Remain Below Third Level          89    21%
   Total                                          137

   Eighth-Year Assessment Rating of Potential     N     Attained Fourth Level
   Predicted to Achieve Fourth Level or Higher    30    73%
   Predicted to Achieve Third Level               29    38%
   Predicted to Remain Below Third Level          76    12%
   Total                                          135

Thornton and Byham (1982) reviewed 29 studies of the validity of assessment center methodology. The authors found more support for the assessment center method than for other selection methodologies, while lamenting the fact that most of the studies were done by a few large organizations (AT&T, GE, IBM, SOHIO, and Sears).

In 1985 Thornton and his associates at Colorado State University processed 220 validity coefficients from 50 studies using a statistical approach called meta-analysis. They estimated the method's validity at .37
(Gaugler, Rosenthal, Thornton, & Bentson, 1985). Working independently of Thornton, Wayne Cascio of the University of Colorado arrived at the same figure (.37) in studying the validity of first-level assessment centers in an operating company of the Bell System. Cascio's main interest, however, was in measuring the "bottom-line impact" of promotion decisions based on assessment center information versus decisions based on criteria extracted from other methods (Cascio & Ramos, 1984). To determine the dollar impact of assessment centers, Cascio needed more than validity information; he needed cost data (fully loaded costs of the assessment process), plus job performance data expressed in dollars.

Over a four-year period he developed a simple methodology for expressing in dollar terms the job performance levels of managers. Using information provided by more than 700 line managers, Cascio combined data on the validity and cost of the assessment center with the dollar-valued job performance of first-level managers. With this data he produced an estimate of the organization's net gain in dollars resulting from the use of assessment center information in the promotion process. Over a four-year period the gain to the company in terms of the improved job performance of new managers was estimated at $13.4 million, or approximately $2,700 each year for each of the 1,100 people promoted into first-level management jobs. (An illustrative sketch of this kind of calculation appears after the Adverse Impact discussion below.)

As stated 21 years ago, the assessment center method is not a perfect predictor of success, but it is the single best aid for making promotion decisions. Its validity is enhanced when coupled with other methodologies, such as behavioral interviews and appropriate paper-and-pencil tests. (See New Applications.)

Adverse Impact
One area the 1970 Harvard Business Review article did not address was the fairness of the assessment center method relative to women and minorities. The method seemed uniquely fair because of its emphasis on actual behavior rather than psychological constructs, but no confirming data were available. That has changed—considerable data exist today. Compared to other selection methodologies, the assessment center method generally is seen as more fair and objective in terms of gender, race, and age. Some differential performance has been found, but this usually is the result of differential applicant populations.

There is consistent research showing that assessment centers are unbiased in their predictions of future performance. These studies considered a candidate's age, race, and gender and found that predictions by assessment center methodology are equally valid for all candidates. (See Thornton & Byham, 1982, for a complete discussion of these issues.)

Federal courts have viewed assessment centers as valid and fair. Indeed, they often have mandated assessment centers to overcome selection problems stemming from the use of paper-and-pencil and other selection instruments (e.g., James C. Edwards v. City of Evanston; Frank J. Macchiavola v. New York City Board of Examiners). A case in point involved a valve company whose use of paper-and-pencil tests to select supervisors was struck down by a federal court of appeals. As part of the settlement, the judge allowed the company to substitute the assessment center method as the principal means of selecting supervisors—even though a slightly higher number of whites than blacks succeeded in the centers. The judge ruled that a sufficient number of black candidates was found to possess acceptable potential for supervision to meet the company's affirmative action goals.
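The monograph does not reproduce Cascio's formulas, so the sketch below only illustrates the general logic of this kind of utility estimate, using the standard Brogden-Cronbach-Gleser model as an assumption: the gain in validity is multiplied by the dollar value of a standard deviation of job performance and by the average quality of those selected, and assessment costs are then subtracted. Every parameter marked as hypothetical below is invented for illustration and is not taken from the Cascio and Ramos study.

```python
# Illustrative utility estimate in the spirit of Cascio's "bottom-line" analysis.
# The formula is the standard Brogden-Cronbach-Gleser model, assumed here for illustration.

def utility_gain(n_selected, years, validity_new, validity_old,
                 sd_dollars, mean_z_of_selected, cost_per_candidate, n_assessed):
    """Net dollar gain from replacing a selection procedure with a more valid one."""
    # Gain per person per year attributable to the increase in validity
    gain_per_person_year = (validity_new - validity_old) * sd_dollars * mean_z_of_selected
    total_gain = n_selected * years * gain_per_person_year
    total_cost = n_assessed * cost_per_candidate  # fully loaded assessment costs
    return total_gain - total_cost

if __name__ == "__main__":
    estimate = utility_gain(
        n_selected=1100,          # people promoted into first-level management (from the text)
        years=4,                  # evaluation period used in the text
        validity_new=0.37,        # assessment center validity reported in the text
        validity_old=0.15,        # hypothetical validity of the previous procedure
        sd_dollars=10000,         # hypothetical SD of yearly job performance, in dollars
        mean_z_of_selected=1.0,   # hypothetical average standard score of those promoted
        cost_per_candidate=800,   # hypothetical fully loaded cost per assessee
        n_assessed=2500,          # hypothetical number of candidates assessed
    )
    print(f"Estimated net gain over the period: ${estimate:,.0f}")
```

Under these invented assumptions the net gain is on the order of several million dollars; the point is the structure of the calculation, not the specific figures.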
One final bit of evidence suggesting the acceptance of the assessment center method comes from the use of the method by the Equal Employment Opportunity Commission (EEOC) in 1977 and again in 1978. They used the method to evaluate executives from both inside and outside the government to fill high-level positions that resulted from a reorganization.

In most situations, an assessment center is the best method available to organizations whose aim is to make accurate selection and promotion decisions while minimizing adverse impact.

Ashe, L., Todd, K., & Byham, W. C. (1991). Employee evaluation for the 1990s: Paper-and-pencil tests, assessment centers, performance appraisals, and interviews. A review of court cases and discussion of future prospects. Pittsburgh, PA: Development Dimensions International.
Howard, A., & Bray, D. W. (1988). Managerial lives in transition: Advancing and changing times. New York: Guilford Press.

ADOPTION OF THE ASSESSMENT CENTER METHOD OUTSIDE THE UNITED STATES
In 1970 U.S. assessment center methodology had barely spread to Canada, and only a few applications of the somewhat different British assessment center methodology could be found (e.g., in South Africa and Australia). Since that time the U.S. version has been adopted more widely. In 1971 American assessment center methodology was introduced in Japan; now at least 150 organizations in that country are involved. U.S. assessment methodology has become predominant in Australia and South Africa and accounts for more than 50 percent of the method's applications in the United Kingdom. The methodology can be found in nearly every industrialized or industrializing country in the world, with other growth areas including West Germany, Scandinavia, the Netherlands, the Philippines, and Singapore. Although exercises are translated and adapted, the methodology in each of these countries is virtually identical to that used in the United States.

The chief reason the assessment center method is valid in so many different countries is that it is an easily adaptable evaluation system, not an evaluation instrument. Users need not adopt dimensions or standards of performance that are important in the U.S. but perhaps unimportant in their country; they merely adopt a systematic procedure for evaluating candidates against job-related dimensions that are specific to their particular organization and environment. For example, the dimension Interpersonal Sensitivity is shown in vastly different ways in Japan than in the United States, but the method by which the dimension is assessed works just the same (and as well).

While the same types of assessment center exercises seem to work in most countries (with appropriate translation and cultural adaptation), this is not always so. A case in point is the assessment center run in 1972 by Edgars, a large department store chain in South Africa. In this, the first center used to identify blacks for supervisory and management positions, pretesting revealed that blacks had difficulty with In-basket exercises because they couldn't easily visualize the people and situations described.

To overcome this, innovative assessment center exercises were developed. Assessees did not receive an In-basket full of items to handle, in writing, that were "left over" from a previous incumbent. Instead, they sat at desks in their "offices" and role players brought problems and concerns to them just as employees would in real-life situations. They were faced with new items, problems, and demands on their time throughout the day, both on the phone and in face-to-face interactions. This live, large-scale, multifaceted simulation was a precursor of
the single-setting assessment centers that have achieved popularity recently in the United States.

NEW APPLICATIONS
Although the primary focus of assessment centers is to select candidates for first-line supervisory positions (e.g., factory supervisors, clerical supervisors, sales managers, technical managers), applications have expanded far beyond the typical business domain. Cities like New York, Orlando, Los Angeles, and Philadelphia have used assessment centers to make promotion decisions in their police and fire departments. School systems in 34 states use the methodology to select high school principals in a program run by the National Association of Secondary School Principals. Applications by agencies of the federal government are numerous and varied (e.g., FAA, NSA, FBI, IRS). In addition, many new applications of the method—other than supervisory skills evaluations—have developed. These applications have taken the method far beyond what one would have predicted 21 years ago.

Selection and Placement of Candidates for Higher Levels of Management
In the early 1970s organizations began using the assessment center method to help select and place individuals in higher levels of management. Assessment centers have been used to help evaluate candidates for presidencies of organizations (Readers' Digest of Canada), plant managers (Ross Labs), general managers (H. H. Robertson, Chessie), and many senior government positions (Office of Management and Budget, Department of Agriculture, National Transportation Safety Board, Federal Trade Commission, Home Loan Bank, Federal Reserve). As predicted in the 1970 Harvard Business Review article, most of these assessments were made by a team of outside "professional" assessors (consultants). It is difficult to find qualified high-level, in-house people who can take the time to assess and evaluate candidates objectively.

Nebraska Public Power District (NPPD) is a typical example of a governmental application. NPPD put mid-managers through a one-and-a-half-day assessment center to evaluate candidates for higher-level management positions. The center was administered by NPPD staff using consultants as assessors. Results from the assessment center were also used to make career development recommendations.

At Frisch's, a team of outside assessors developed an assessment center process for evaluating candidates for vice president positions and for the CEO position.

At Burroughs Wellcome, outside assessors conducted an assessment center to identify developmental needs of candidates for upper management positions throughout the organization. This center combined dimensional questionnaires with the assessment process to generate specific career development plans for each person.

The federal government's interest in high-level assessment has been spurred by the creation of the Federal Senior Executive Service. Individuals who attain these executive positions are entitled to special performance bonuses and other perquisites. Assessment centers have been used widely by government agencies to qualify individuals for these opportunities.

As applications have expanded, so has the variety of exercises: top-level executives are evaluated in simulated press conferences; long-range planning and organization design exercises are used; and there is more emphasis on larger social issues.
Selection and Placement of Empowered Personnel
The greatest growth of assessment centers from 1985 to 1991 has taken place in connection with organizations moving to an empowered workforce. These organizations are giving employees:
> Responsibility for their designated areas or outputs.
> Control over resources, systems, methods, and equipment.
> Control over working conditions and schedules.
> Authority (within defined limits) to commit the organization.
> Evaluation by achievements.

Most also are organizing employees into self-directed work teams. The teams are made up of team members and a team leader (the team leader is a working, nonmanagement member of the team). Teams take responsibility for:
> Improving quality and productivity; job rotation.
> Planning and scheduling.
> Who works on what.
> Quality audit.
> Equipment adjustment, maintenance, and repair.
> Housekeeping, vacation planning, absenteeism, tardiness, and performance issues.
> Choosing the team leader.
> Many other areas.

The adoption of self-directed teams drastically changes the role of supervisors and managers. Supervisors (often called group leaders) have a very large span of control, with as many as 100 subordinates. Because teams and team leaders take on many of the normal supervisory functions, the supervisors become more managerial in function, concentrating more on budgeting and planning. This, in turn, affects the role of middle managers. The multiple-level changes in job functions have forced organizations to use new methods in connection with selection, promotion, and placement decisions. Because assessment centers worked so well at supervisory and managerial levels, it was natural to turn to assessment centers as a methodology.

Hundreds of manufacturing plants have used assessment centers to select employees, team leaders, and group leaders. To accomplish this, many new processes have been developed, especially in connection with "greenfield" plant start-ups where large numbers of applicants must be processed. Toyota assessed 22,000 applicants to staff their 3,000-person plant in Kentucky.

At the employee level, exercises involve applicants in problem-solving group exercises, simulations of the manufacturing process, and one-to-one interactive exercises. Supervisor exercises provide opportunities to demonstrate coaching, leadership, and decision-making skills.

Wellins, R., Byham, W., & Wilson, J. (1991). Empowered teams: Creating self-directed work groups that improve quality, productivity, and participation. San Francisco: Jossey-Bass.
Selection and Placement of Candidates for Other Nonmanagement Positions
The use of assessment centers also has expanded to other entry-level and individual contributor positions. Applications include the selection of U.S. foreign service officers, police officers, sales people, trainers, quality circle facilitators, and customer service representatives.

Candidates for vocational counseling positions conduct simulated counseling sessions as part of their selection procedure. Psychologists trying to achieve board status from the American Board of Professional Psychology are evaluated using an interview, a work sample review, an oral fact-finding case, and an analysis of a videotaped, therapist-client interaction. Applicants for apprentice positions are taught basic skills and observed by professionals who evaluate their ability to learn the technical job skills. As far back as 1967, validity studies were conducted showing the effectiveness of assessment center methodology for evaluating entry-level position candidates (Bray & Campbell, 1968). Research showed that using assessment centers for selection could increase the performance of sales representatives considerably.

The use of assessment center methodology for entry-level selection or placement has fostered the development of some unique exercises. For example, interesting new exercises have been developed for Japanese companies seeking engineers who have more creative abilities. The exercises were developed to assess the brain dominance of young engineers to find "right-brained" or "whole-brained" engineers who could fit into assignments requiring innovativeness. Six Japanese companies are using these specially designed exercises for placement purposes.

Diagnosis of Training and Development Needs
Quick, easy training methods don't change people's skill levels. Skill acquisition requires intensive, time-consuming classroom training and must be coupled with opportunities for on-the-job practice and feedback so new behaviors are set in the individual's repertoire. Because skill development takes a lot of time and effort, not everyone can be trained in every skill. The assessment center method provides an effective means to determine training or developmental needs. Individuals then can be placed in the most appropriate program.

The assessment center method is an excellent diagnostic tool because it separates an individual's abilities into specific areas (dimensions) and then seeks specific examples of good and poor behavior within each dimension. This helps the assessee and his/her boss determine more precisely what training and developmental activities are required. Almost all organizations using assessment centers for selection or promotion also use the information obtained to diagnose training needs. However, a major shift in focus has become more prevalent in the last 21 years: A large number of firms now use assessment centers solely to diagnose training needs. An article describing the use of assessment centers as a diagnostic tool was published first in the Training & Development Journal in 1971 and later was updated (Byham, 1980).

Table 2 shows a profile of two individuals who were assessed in a training-needs diagnostic program. One had extensive needs in interpersonal skills, the other in decision making. As a result of these profiles, very different training prescriptions emerged. Such information saves the individuals and their organizations a great deal of time and effort by getting them into the right training program at the most appropriate time.
Table 2. Dimensional Profiles of Two Middle Managers. [Profile chart: the dimensions Oral Communication, Oral Presentation, Written Communication, Meeting Leadership, Group Leadership, Sensitivity, Planning & Organizing, Delegation, Control, Development of Subordinates, Analysis, Judgment, Initiative, and Motivation to Work are each rated on a 1-5 scale anchored by "Much Less than Acceptable," "Acceptable," and "Much More than Acceptable," with separate profile lines for Manager #1 and Manager #2.]

Most diagnostic assessment is done within an organization using in-house assessors or consultants. (See New Technology on the use of videotape.) The major exception, however, is the "Looking Glass" simulation used by the Center for Creative Leadership in Greensboro, North Carolina. The Center uses professional psychologists to analyze information from assessment center exercises and extensive paper-and-pencil tests. They produce profiles of participants' strengths and weaknesses, which are used to work with the individuals in planning developmental activities.

Another way of obtaining developmental insights is through peer and self-assessment. A few organizations, such as Xerox and the United States Army War College, train assessees to evaluate themselves. First, the participants are videotaped as they work through assessment center exercises. Then they are trained to evaluate the exercises so they can evaluate their own and their peers' performance. The participants, with input from their peers and instructor, determine their strengths and developmental needs and then consider specific developmental plans.
Peer assessment is particularly prevalent in Japan, where more than 50 companies use it. Because peer assessment relies heavily on the use of videotape equipment (so individuals can view their own performance), it is not surprising that a major user of this technology has been Matsushita Electric Industrial Company, the world's largest supplier of videotape equipment.

However, the accuracy of peer and self-assessment has yet to be proven fully. While peer assessments at Ezaki Gliko Company (Japan) were found to be related to concurrent supervisor evaluations (Thornton & Byham, 1982), and many organizations have reported a strong relationship between peer and assessor evaluations, the National Association of Secondary School Principals and DDI have found little validity in self-assessments obtained as part of their large-scale assessment center applications.

Byham, W. C. (1980, June). The assessment center as an aid in management development (rev. ed.). Training & Development Journal, 34(6), pp. 24-36.
Byham, W. C. (1982, February). How assessment centers are used to evaluate training's effectiveness. Training Magazine, 19(2), pp. 32-38.

Diagnosing Management Skills and Assumptions as Part of a Corporate Culture Change Strategy
Individual assessments in a plant or department can be combined to form an integral part of an organization's culture change strategy. After an organization has decided on the desired culture, the next logical step is to define the behaviors necessary to implement that culture and evaluate incumbents' skill levels in these behavioral areas. For example, an essential ingredient of a participative culture is the ability to run a meeting so all participants can speak their minds and have a sense of ownership in decision making. A leader's skill in accomplishing this can be determined in an assessment center. Table 3 shows the distribution of assessment center ratings of four management levels in the dimension Group Leadership. The percentage rated "less than acceptable" is shown on the left, "more than acceptable" on the right.

Table 3. XYZ Manufacturing Company, Group Leadership Using a Participative Style—Frequency Distribution. [Bar chart: for each of four levels (Top Management, Middle Management, Supervision Level #2, Supervision Level #1), the percentage of incumbents rated "less than acceptable" is plotted to the left of center and the percentage rated "more than acceptable" to the right, on 0-100% scales.]
This kind of diagnostic information is extremely useful in developing a culture-change strategy. Individuals who lack the skills needed to manage participatively cannot implement a participative strategy even if they want to—they must increase their basic skill level first. In addition, research shows that the easiest way to change a person's attitudes or basic assumptions about people is to change the person's behavior first. This represents a marked departure from the previous strategy in which organizations tried to change attitudes and hoped that behavioral change would follow. With the new strategy, individuals are identified whose attitudes or basic assumptions about people can be considered out of line with the desired culture. Their behavior is changed through an effective training and developmental program. This addresses their attitudes and assumptions through the positive reinforcement they receive for improved behavior. In time, management effects the desired culture change throughout the organization.

Evaluating the Effectiveness of Training Programs
The American Society for Training and Development (ASTD) estimates that U.S. companies spend $212 billion each year on training (Carnevale & Goldstein, 1990). The fastest-growing portion of this amount is for sales, supervisory, and management training, yet most companies have not evaluated the effectiveness of their training programs properly.

Assessment center methodology is an excellent method for establishing the validity and effectiveness of training programs. Three research designs commonly are used (see Figure 1). In the first design, a group of individuals is trained while a matched group is not. Both groups then are put through an assessment center. The second and third designs have a group of individuals assessed, then trained, then assessed again. Table 4 shows the results of an application of the first method of evaluation. This is an evaluation of the Interaction Management® supervisory training program at the Lukens Steel Company. The assessment center results show that there were marked changes in individuals' performance after training.

Figure 1. Research Designs for Training Evaluation
1. Post test only with control group
   Treatment group: Training Program, then Assessment Center
   Matched control group: Assessment Center only
2. Pre/post evaluation
   Assessment Center, then Training Program, then Assessment Center
3. Pre/post evaluation with control group
   Treatment group: Assessment Center, Training Program, Assessment Center
   Matched control group: Assessment Center and Assessment Center, with no training program

In addition to Lukens Steel, organizations such as SOHIO, AT&T, Central Telephone Utilities Corporation, and the New York Metropolitan Transit Authority have used assessment center technology to evaluate training programs (Byham, 1982). The advent of video technology, which allows the relatively inexpensive evaluation of individuals, has increased the application of assessment center methodology dramatically in this area. (See New Technology.)
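As an illustration of how the third design in Figure 1 might be analyzed, the hypothetical sketch below computes a simple difference-in-differences estimate of a training effect from pre- and post-training dimension ratings. The ratings, group sizes, and analysis rule are invented; the monograph does not specify how the Lukens Steel data were analyzed.

```python
# Hypothetical illustration of analyzing design 3 (pre/post evaluation with a control group).
# Ratings are on the 5-point dimension scale used in assessment centers; values are invented.

def mean(xs):
    return sum(xs) / len(xs)

# Average dimension ratings before and after, for a trained group and a matched control group
treatment_pre, treatment_post = [2.4, 2.8, 2.6, 2.5], [3.4, 3.6, 3.2, 3.5]
control_pre,   control_post   = [2.5, 2.7, 2.6, 2.4], [2.6, 2.8, 2.7, 2.5]

# Difference-in-differences: change in the trained group minus change in the control group,
# which nets out improvement that would have occurred without training (e.g., retest effects).
training_effect = (mean(treatment_post) - mean(treatment_pre)) - (mean(control_post) - mean(control_pre))
print(f"Estimated training effect: {training_effect:+.2f} rating points")
```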
Table 4. Overall Ratings of Performance in Three Assessment Center Simulations by Trained and Non-trained Supervisors

   Rating                           Non-trained    Trained
   Excellent or Above Acceptable    21%            58%
   Acceptable                       37%            21%
   Below Acceptable or Poor         42%            21%

Evaluating the Output of Educational Institutions
Just as the assessment center method can be used to evaluate the effectiveness of training programs, it also can be used to evaluate how well undergraduate or graduate schools impart specific skills to their students. For example, a business school can determine the extent of skill transfer by putting incoming and outgoing students through assessment centers designed to measure the skills in question. The primary application of the assessment center method in the educational sphere has been promulgated by the American Assembly of Collegiate Schools of Business (AACSB), which encouraged six schools to use the technology as part of an exploratory project attempting to evaluate the "value added" of business schools. Table 5 shows the skill level in three dimensions of undergraduate business majors entering the six schools, as well as that of graduating undergraduates, students entering MBA programs, and graduating MBAs.* In rough terms, a "3" rating equates with what most organizations would consider acceptable performance.

Because of this research, interest has developed in the use of assessment center methodology to evaluate and document a wide range of college and other educational outputs. For example, the University of California at Berkeley is using an In-basket exercise in a longitudinal study of MBAs. Indiana University of Pennsylvania measures the competencies of graduates of their teacher training program. A summary of applications of assessment centers in education can be found in "Using the assessment center method to measure life competencies" (Byham, 1988), a chapter in Performance and judgment: Essays on principles and practice in the assessment of college student learning (pp. 255-278). Washington, DC: U.S. Government Printing Office, 1988.

* DDI's Skills Diagnostic Program (SDP), which collects data via videotape and written outputs, was used to collect and evaluate data in this research. (See New Technology.)
Table 5. Skills/Personal Characteristics Scores in Three Dimensions (Decision Making; Information Gathering/Problem Analysis; Leadership) for Incoming Undergraduates, Graduating Undergraduates, Incoming MBAs, and Graduating MBAs. [Bar chart: mean scores on the five-point dimension scale range from 1.75 to 3.41 across the four groups and three dimensions.]

Evaluating the Effectiveness of College Recruiting and Other Selection Activities
The normative data obtained from more than 1,000 students in the AACSB and similar studies allow organizations to evaluate the effectiveness of their recruiting and selection systems in a way never before possible. By administering the same assessment center exercises used in the AACSB study to a sample of recent college hires, an organization can determine whether it is getting the best students available. Almost every organization says it hires only top-quality graduates, but not every organization can be getting the best. One company that took a look at its college hires relative to the AACSB norms found that it was, in fact, getting people who were average or below average in the most important dimensions, such as Leadership and Decision Making. The only dimensional areas where its new hires exceeded national averages were Personal Impact and Oral Communications. Obviously, its interviewing procedure needed to be improved.

Evaluating Students as a Criterion for Graduation
If assessment center methodology is as accurate as it seems in evaluating life skills such as leadership, interpersonal relations, presentation skills, and decision making, it would make sense to use assessment center evaluations as one criterion in determining readiness for graduation. Few universities have adopted this philosophy, although a number are looking into it. One major example is Alverno College in Milwaukee, Wisconsin, which uses assessment centers throughout its entire curriculum as the primary method of evaluating a student's progress. Faculty members and business people from the local community serve as assessors.

Cromwell, L., Loacker, G., & O'Brien, K. (1986). Assessment in higher education: To serve the learner. In C. Bennett (Ed.), Assessment in higher education: Issues in contexts (pp. 47-62). Washington, DC: U.S. Department of Education.
NEW SIMULATIONS, TESTS, AND METHODS
Simulations such as In-basket exercises, group discussions, management games, and analysis exercises described in Byham's 1970 Harvard Business Review article are still the bedrock of assessment center methodology. However, they have been supplemented by new types of exercises, most importantly the interaction simulation. In this exercise the assessee is given background information about the need to interact with an individual (subordinate, peer, or customer) and personal information about the individual. After the assessee has had an opportunity to prepare, he or she conducts a simulated interaction with a person trained as a role player. The "interviewee" follows a well-defined role and makes standard responses to all issues that might come up. A trained assessor observes the assessee's behavior.

Although leaderless group exercises still are used commonly to assess leadership, one-to-one interaction simulations have become more popular. This change reflects a general feeling that individual leadership skills are not necessarily correlated with group leadership skills. Another reason for the switch is that people going through the same group exercise may have quite different experiences. Group interactions depend on the nature of the people involved. Sometimes the group is highly competitive; other times it is quite cooperative. Sometimes several people vie for leadership; other times only one person takes charge. This lack of consistency has caused organizations especially concerned with EEO issues to opt for the more standardized interaction simulations or different forms of group exercises.

Simulations now are shorter and more often targeted to a few dimensions. Most 1970 exercises were omnibus exercises in terms of the dimensions being evaluated. It was not unusual for an assessor to attempt to evaluate eight dimensions from a single exercise. This led to the high intercorrelations of dimensions found in early research. The assessors, no matter how well trained, could not distinguish behavior among that many dimensions. Exercises now are designed to elicit information on only one or two dimensions, thus making assessor ratings more reliable and assessor training easier and quicker.

A growing number of organizations, such as the Civil Service Commission of Canada and The Center for Creative Leadership, have adopted a "total simulation" approach to assessment. Instead of having a number of distinct and independent exercises, these organizations have integrated their exercises into a common scenario. Characters introduced in the In-basket exercise are seen in later simulations, and candidates play the same role throughout the assessment process.

The total simulation approach is not without problems. Some practitioners are concerned that information—or psychological "sets"—developed in one exercise will contaminate performance in another and thus negate the important advantage of independent observations. Proponents feel that this problem can be overcome through exercise design and that the realism is a major positive feature. A compromise employed by several organizations involves using a common setting and roles, while using independent exercises (information in one exercise does not necessarily depend on information in other exercises).

Using Videotape to Stimulate Behavior
A somewhat controversial new development involves the use of videotape to stimulate assessee behavior. An assessee watches a video of a situation he or she will face on the job (e.g., an interaction with a subordinate). Periodically the tape stops and the assessee is presented with four choices of what to do or say. A score is calculated based on the
assessee's responses to a number of these situations. The scoring system is developed based on a validity study.

While such exercises may appear to be assessment center exercises, they are not. They are related more closely to paper-and-pencil tests and are thus best used as part of screening prior to an assessment center or as a supplement to an assessment center.

Orientation Simulations and Videos
Many organizations are using "minisimulations" as part of an orientation program for employees who are interested in becoming supervisors and who may attend an assessment center. During the orientation participants experience scaled-down versions of the simulations they would face in the assessment center. For example, they spend one-half hour doing a seven-item In-basket, participate in a 15-minute group discussion, and observe a videotaped interaction simulation. An orientation program will:
> Ensure that participants have complete information about the assessment center.
> Clear up any concerns people may have about how it will operate and what the simulations will be like.
> Dissuade employees from seeking "inside" information from friends who already have been participants.
> Ensure fairness to minority groups who may have less exposure to evaluation situations by providing them with a thorough explanation of the assessment process.
> Provide participants with a realistic job preview that helps them make informed decisions about the job; often participants will decide not to pursue the opportunity once they obtain this information.

Another way of providing effective orientation to an assessment center is to make a video showing assessees participating in the various exercises and explaining how the assessment center operates. These videos are becoming increasingly popular, especially in large-scale assessment center applications. The introduction to the assessment center often is linked to a video job preview that shows what the target job will be like, and that shows employees the values that will guide the organization. If given a realistic job and assessment center preview, applicants can make informed decisions about whether or not they want to participate. Outstanding examples of such videos have been produced by Toyota and Southwestern Bell.

Psychological Inventories and Projective Tests
The original AT&T research assessment centers used psychological inventories (e.g., the Edwards Personal Preference Schedule) and projective tests (e.g., the Thematic Apperception Test) to supplement observations of assessees' behaviors as they progressed through assessment center simulations. AT&T dropped these instruments after they converted their research assessment centers to operational assessment centers run by the Bell operating companies. AT&T dropped the tests for two reasons: (1) the operational assessment centers used managers (rather than psychologists) as assessors, and (2) paper-and-pencil instruments were disputed during this period (the 1960s) because of possible adverse impact on protected groups. Most organizations that adopted the assessment center methodology followed AT&T's lead, concentrating on behavioral exercises rather than paper-and-pencil tests. Most even dropped intelligence tests because of the common finding of adverse impact on blacks. However, the Bell companies retained these tests.

Today some of these tests, particularly intelligence tests (general ability tests), are being used again in conjunction with assessment centers. Research data show that the combination of
intelligence data and behavioral observations provides a markedly better means of evaluating people than either used alone (Thornton & Byham, 1982). The problem with using paper-and-pencil intelligence tests and other psychological instruments is that they require very careful validation efforts, and assessors must be specially trained in both data interpretation and how to integrate that data with behavioral data.

The dimension Need or Desire to Lead Others, which is evaluated through psychological instruments, has achieved considerable attention in the last few years as a result of work by Howard and Bray. They found a precipitous drop in this dimension when comparing Bell company college recruits of the 1950s to recruits of the 1980s. Confirmation of this decline was also obtained from the DDI-AACSB study. Table 6 shows the results of these three studies. Because of these findings, many organizations are interested in using psychological inventories to collect information from assessees regarding this dimension.

Table 6. Ratings on "Disposition to Lead" from Three Different Studies
   AACSB (1985), N=600: 28%
   AT&T (late 1970s), N=344: 32%
   AT&T (late 1950s), N=266: 50%

A large number of organizations have adopted an inventory developed by DDI called the Job Fit Inventory (JFI). The JFI measures desire to work in an empowered organization and desire to empower others.

Self-Report, Boss, and Workplace Peer Evaluation Instruments
An assessment center provides insights into many job dimensions, but usually not all dimensions. Dimensions such as Work Standards and Energy are not evaluated well in assessment centers. To fill in these gaps and to get additional insights into dimensions that are assessed in assessment centers, many organizations supplement their assessment centers with self-reports and with evaluations by the assessee's boss and workplace peers or subordinates. Usually, the assessee is given six questionnaires that list the target dimensions with definitions. The assessee completes one and gives the other five to his/her boss, peers, or subordinates. All questionnaires are sent to a central location where a computer summarizes the data and prepares a report.

The combination of assessment center, self-, and boss/peer/subordinate evaluations of a common set of dimensions makes a powerful impact on assessees. The feedback counselor and the assessee can compare and contrast each dimension's ratings from each source (self, others, and the assessment center). Based on these insights, they can define developmental actions more accurately.
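The passage above describes questionnaires being sent to a central location where "a computer summarizes the data and prepares a report." The sketch below is a hypothetical illustration of that kind of multi-source summary, grouping ratings by dimension and source so a feedback counselor can compare them side by side. It is not DDI's actual reporting software; the dimension names and ratings are invented.

```python
# Hypothetical multi-source feedback summary (not DDI's reporting system).
from collections import defaultdict
from statistics import mean

# (dimension, source, rating on the 1-5 scale)
responses = [
    ("Group Leadership", "assessment center", 3.5),
    ("Group Leadership", "self", 4.0),
    ("Group Leadership", "boss", 3.0),
    ("Group Leadership", "peer", 3.0),
    ("Group Leadership", "peer", 2.5),
    ("Work Standards",   "self", 4.5),   # not observable in the center; covered by questionnaires
    ("Work Standards",   "boss", 3.5),
    ("Work Standards",   "peer", 4.0),
]

# Group ratings by dimension and source, then average each source's ratings
summary = defaultdict(lambda: defaultdict(list))
for dimension, source, rating in responses:
    summary[dimension][source].append(rating)

for dimension, by_source in summary.items():
    print(dimension)
    for source, ratings in by_source.items():
        print(f"  {source:<18} {mean(ratings):.1f}  (n={len(ratings)})")
```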
NEW TECHNOLOGY
The biggest drawback in the ongoing use of assessment centers is the amount of managerial time required. In a typical assessment center, a manager leaves his/her job for two or three days to observe participants' performance in simulations, and then spends an additional day or two meeting with other observers to make final evaluations. Although managers recognize the importance of selection and promotion decisions, they are often reluctant to devote this much concentrated time to assessment. A related problem is the formality of the traditional assessment center, which tends to make the center an event. This may build expectations and call attention to who is being assessed and who has not been asked to participate. The traditional assessment center also forces organizations to put people through the process in groups; the method is useless when there are only two candidates for a position.

These constraints have limited assessment center method applications in some organizations to only a few selection or promotion decisions. As a result, many important and effective applications, such as defining training needs, have not been utilized widely. Although organizations recognize the increasing importance of accurately diagnosing training needs before sending people to training programs, the problems associated with staffing developmental assessment centers often make their use prohibitive, even though assessment center methodology is the best available diagnostic instrument for many positions. Managers agree on the importance of thorough and accurate diagnoses, but are reluctant to spend the time needed to produce the excellent diagnoses that the assessment center methodology yields.

Deformalizing the Method
A number of organizations in the United States and overseas have overcome the implementation problems noted earlier by making their assessment centers less formal and rigid while keeping the basic components that provide validity. Organizations do this by incorporating the assessment center method into an organization's day-to-day activities, rather than by having their managers go off to a designated place, or to a "center."

The individual to be assessed is given a list of managers responsible for filling the position. The assessee then schedules his/her own meetings with these managers over a period of several weeks, according to the schedules of all parties. The managers involved fit the time for the exercises into their usual activities. During these meetings the managers put the assessee through the same job simulations used in formal assessment centers. For example, one manager might interview the assessee about why he or she took certain actions in the In-basket exercise; another might have the assessee present findings from an analysis and planning exercise; and a third might observe the assessee in a one-to-one interaction with another manager who role-plays a subordinate.

At an appointed time the managers (assessors) meet to hold an assessor discussion that works exactly like such discussions in a traditional assessment center. The assessors give actual examples of the participant's behavior to back up their ratings on each of the dimensions they evaluated. After sharing all their observations, the assessors reach consensus on the individual's strengths and weaknesses in each dimension. Then, if the purpose of the assessment center is to provide the basis for selection or promotion decisions, the assessors make an overall evaluation. If the objective of assessment is to diagnose training needs, the assessors' final step is to develop a profile of the assessee's strengths and developmental needs.

All key components of the assessment center method are present: multiple job simulations; use of behavior observed in simulations to predict future behavior in the target job; organization of observed behavior around job-related dimensions; and a systematic data integration session involving several assessors who have observed participants independently in the simulations. Only the rigidity is removed. This allows even the smallest organization to apply the assessment center method in making selection/promotion decisions.
Using Videotape to Record Behavior

Another increasingly popular technology is the use of videotape equipment to capture assessee behavior. Rather than having managers observe individuals in simulations, participants' behavior is recorded on videotape. The tape and the assessee's written output then can be sent virtually anywhere, and assessors can view and evaluate the taped and written performance at their convenience. After each assessor has observed and evaluated the assigned simulation, a standard data integration session can be held, or the data can be integrated by a computer using an expert system.

The most popular application of this technology is DDI's Skills Diagnostic Program (SDP). More than 100 organizations use the SDP as a substitute for, or as a supplement to, the traditional assessment center. These organizations send In-baskets or videos of interactive exercises to DDI to be scored.

The SDP uses a special version of the AcceleRATE program described below. The output consists of behavioral descriptions of performance on each dimension and percentile rankings relative to as many as three normative groups. The organization chooses the normative groups. For example, the performance of a middle manager might be compared with other middle managers in the company, with a nationwide sample of middle managers, or with a nationwide sample of middle managers with M.B.A.s.

One of the largest applications of SDP technology was the evaluation of more than 600 business school graduates as part of the AACSB-sponsored research project. Students from six representative universities worked through four assessment center exercises administered by the staff of each institution. Participants' written outputs, along with their videotaped performances, were then sent to DDI, where they were evaluated and the data were integrated via computer to arrive at dimensional ratings.

AcceleRATE

The AcceleRATE software program expedites the assessment process and is therefore advocated by many assessors and administrators. With AcceleRATE, assessors input their observations directly into computers. The computer organizes behavior by dimension and feeds it back to the assessor in a way that facilitates the rating of each dimension. The computer, using an expert system, then checks the rating; if the expert system's rating differs from that of the assessor, a second assessor reviews the data and shares his or her insights with the first assessor. Together, they make a decision on the dimension rating for the exercise.

At the integration meeting, a computer integrates all the behavioral observations across exercises and presents the data in a convenient way for assessor analysis and decision making. In some organizations, an expert system substitutes for the integration meeting. This mathematical data integration is possible because of the high reliability of the assessor exercise dimensional ratings, where reliabilities of .90 and higher are common.

The computer prints out a detailed final report giving dimensional ratings with behavioral examples. AcceleRATE decreases assessor time by more than half and dramatically decreases assessor and administrator training time.
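Neither the integration rule nor the norm tables behind AcceleRATE and the SDP are given in this monograph, so the sketch below is a plausible illustration rather than the actual algorithm: it collapses exercise-by-dimension ratings into one rating per dimension and then expresses that rating as a percentile of an invented normative group. All dimension names, ratings, and norm values are assumptions.

```python
# Illustrative sketch: integrating exercise ratings by dimension and ranking
# the result against a normative group. The averaging rule, the norm data,
# and every identifier here are assumptions, not DDI's actual algorithm.
from bisect import bisect_right
from statistics import mean

def integrate(exercise_ratings):
    """Collapse {exercise: {dimension: rating}} into one mean rating per dimension."""
    by_dimension = {}
    for ratings in exercise_ratings.values():
        for dimension, value in ratings.items():
            by_dimension.setdefault(dimension, []).append(value)
    return {dimension: mean(values) for dimension, values in by_dimension.items()}

def percentile(score, norm_scores):
    """Percent of the normative group scoring at or below this score."""
    ranked = sorted(norm_scores)
    return round(100 * bisect_right(ranked, score) / len(ranked))

exercises = {
    "In-basket":   {"Planning": 4.0, "Delegation": 3.0},
    "Analysis":    {"Planning": 3.5, "Judgment": 4.0},
    "Interaction": {"Delegation": 2.5, "Judgment": 3.5},
}
norms = {   # invented norm-group scores for each dimension
    "Planning":   [2.0, 2.5, 3.0, 3.0, 3.5, 4.0, 4.5],
    "Delegation": [1.5, 2.0, 2.5, 3.0, 3.5, 4.0, 4.5],
    "Judgment":   [2.0, 2.5, 3.0, 3.5, 3.5, 4.0, 4.5],
}

for dimension, rating in integrate(exercises).items():
    rank = percentile(rating, norms[dimension])
    print(f"{dimension}: rating {rating:.2f}, percentile rank {rank}")
```

Because the SDP lets the organization choose the normative group, the norms table above would simply be swapped for the norm group the organization selects.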
II. ASSESSMENT CENTER METHODOLOGY

WHAT IS ASSESSMENT CENTER METHODOLOGY?

The validity and effectiveness of the assessment center method can be credited to six basic underlying methodological concepts. During the past 21 years, many organizations have used these concepts to improve the effectiveness of personnel procedures outside the traditional assessment center. Assessment center methodology has been applied in interviewing, job observation, and obtaining third-party information (reference data).

The six methodological concepts that give the assessment center method its validity are:

1. Organize the assessment process around target dimensions.

One of the two keys to the job relatedness of assessment center methodology is the focus of assessment center observations on dimensions that have been defined as important to success (or failure) in the target job. Dimensions are defined through an analysis of the target job. This job analysis procedure usually involves interviewing incumbents and their supervisors to identify common factors that have a direct bearing on success and failure. See:

Byham, W., & Associates (1990). Dimensions of effective performance for the 1990s: What they are, how they differ among levels, how they are changing (Monograph XV, rev. ed.). Pittsburgh, PA: Development Dimensions International.

Hauenstein, P., & Byham, W. C. (1989). Understanding job analysis (Monograph XI). Pittsburgh, PA: Development Dimensions International.

2. Use behavior to predict behavior.

Assessors in assessment centers make decisions based on behavior; they don't try to psychoanalyze the individuals they observe. They connect behavior in the assessment center exercises with behavior required on the job. If the assessee's behavior is similar to that required in the target job, that assessee receives a high rating. If the candidate does not use behaviors required in the target job, he or she receives a low rating.

3. Have two or more individuals independently observe and evaluate.

Observations made by two or more trained observers provide multiple perspectives on the meaning and importance of an assessee's behavior. This reduces the chance that an assessee's performance in one exercise will influence assessor evaluation in others.

4. Develop a system that ensures all target dimensions are covered and that uses inputs from multiple sources.

Assessment centers are organized to force the evaluation of all target dimensions. Exercises are selected to provide the most complete coverage possible, with overlap built in for the most important dimensions. But simulations may not provide information on all dimensions. Very seldom is a job so unidimensional that a single source of data can predict future behavior. In reality, most jobs are extremely complicated in terms of the activities and dimensions necessary for success. For this reason, a variety of assessment sources, such as interview data and reference checks, are needed. (A simple coverage check of this kind is sketched after this list.)

5. Organize a discussion so two or more assessors systematically share and debate their behavioral insights and relate these findings to each target dimension prior to reaching an overall decision.

Research evidence and practical experience clearly indicate that, in most situations, a group process in which data are shared and the judgments of several knowledgeable individuals are polled enhances decision making. The assessment center really is an organized group decision-making process that allows assessors systematically to collect data, organize it, share observations, and come to a consensus.

The integration session in assessment centers forces individuals to substantiate their ratings with examples of actual assessee behavior, thus keeping subjective elements out of the discussion. The process also helps assessors focus on each key job dimension prior to reaching overall decisions.

6. Use simulations to stimulate behavior to be observed.

Simulations are an important method (but not the only method; see concept #4) of obtaining behavioral examples that can be used to predict future behavior. Simulations give organizations a chance to see how a person would perform in a particular job prior to giving him/her the position.
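As a concrete picture of concept 4, the following sketch checks a dimension-by-source coverage matrix and flags any target dimension that fewer than two sources would evaluate. The exercises, sources, and dimensions listed are invented for the example, not a recommended center design.

```python
# Illustrative sketch: verifying that every target dimension is covered by at
# least two independent sources. All dimensions and sources are invented.
coverage = {
    "In-basket":        {"Planning", "Delegation", "Judgment"},
    "Group discussion": {"Leadership", "Judgment"},
    "Interview":        {"Work Standards", "Leadership"},
    "Reference checks": {"Work Standards"},
}
target_dimensions = ["Planning", "Delegation", "Judgment",
                     "Leadership", "Work Standards", "Energy"]

for dimension in target_dimensions:
    sources = [name for name, covered in coverage.items() if dimension in covered]
    status = "ok" if len(sources) >= 2 else "needs another source"
    print(f"{dimension}: covered by {', '.join(sources) or 'no source'} ({status})")
```

Dimensions flagged by a check like this are the ones for which additional sources, such as interview data or reference checks, would be added to the system.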
The next three sections deal with applications of assessment center concepts in other personnel procedures such as interviewing, on-the-job observation of performance, and obtaining third-party information about an individual. The section titled Developing Integrated Systems deals with assessment methodology as the basis for integrated personnel systems.

TARGETED SELECTION®: OBTAINING BEHAVIOR IN INTERVIEWS

About 1,000 organizations throughout the world are using assessment center methodology with no simulations at all. They include Citicorp, Alcoa, Caterpillar, Hoffmann-LaRoche Inc., Imperial Chemicals, McGraw-Hill, and PepsiCo. These companies are using a technology known as Targeted Selection®, wherein applicant behavior is gathered not through job simulations but through a series of behaviorally based interviews. Just as assessors are trained to observe and evaluate behavior in simulations, Targeted Selection interviewers are trained to ask questions that elicit clear examples of past behavior in interviews and to use these examples to predict future behavior. Groups of target dimensions are assigned to two or more interviewers, who use special questioning skills to obtain the required behaviors. After the interviews, the interviewers meet and systematically share data on each dimension until they can reach a consensus rating for each dimension. The result is a profile of strengths and weaknesses just like that obtained in an assessment center. Research at companies like Holiday Inn, J. C. Penney, and McDonald's has shown that this methodology markedly increases the quality of hires (Development Dimensions International, 1983).

Although it is not an assessment center, the Targeted Selection interviewing process is a true application of assessment center methodology. It is very different from the kinds of interviews that were used in early assessment centers, which tended to be very psychological and called for extensive interpretation by the person (usually a psychologist) who conducted the interview. The strength of the behavioral Targeted Selection interview is that it requires almost no psychological interpretation; it is merely an extrapolation of past behavior to future behavior.
Table 7 compares key concepts of assessment center methodology and Targeted Selection methodology.

Table 7: Key Concepts of Assessment Center Methodology and Targeted Selection Methodology

      Assessment Centers                                           Targeted Selection
   1. Organize the assessment process around target                Same
      dimensions.
   2. Use behavior to predict behavior.                            Same
   3. Have two or more individuals independently observe           Same
      and evaluate.
   4. Develop a system that ensures all target dimensions          Same
      are covered and that uses inputs from multiple sources.
   5. Organize a discussion so two or more assessors               Same
      systematically share and debate their behavioral
      insights and relate these findings to each target
      dimension prior to reaching an overall decision.
   6. Use simulations to stimulate behavior to be observed.        Use interviews to obtain
                                                                    past behavior.

Combining Simulations with Behavioral Interviews

While the Targeted Selection behavioral interviewing procedure is an appropriate alternative to assessment center data when the interviewee has had experience in the dimensions being assessed, it breaks down when he or she has had no experience in the areas being evaluated. For example, it is difficult to interview a new college graduate for the dimension Control if he or she has never had a management job, or a sales applicant for Selling Skills if the applicant has never had a selling job. In these situations the assessee is being evaluated for a position that is markedly different from those he or she has held before. Even the highly skilled interviewer might have trouble getting sufficient behavioral data on some dimensions to project future behavior.

This is an ideal situation for behavioral simulations such as those used in assessment centers. Simulations allow direct observation of the desired behavior. Indeed, combining behavioral interviewing with assessment center simulations was a natural wedding of technologies. Interviews provide information on a subset of the target dimensions, while behavioral simulations provide information on additional dimensions. Because only one or two simulations typically are used, these applications of the methodology are more often thought of as an elaboration of the interviewing process than as an assessment center, even though technically they meet the requirements of an assessment center set out by the International Congress on the Assessment Center Method. The simulations can be interjected into an interview, or one interviewer may interview the assessee while another administers simulations. Organizations using this combination methodology include the Upjohn Company, A. E. Staley Manufacturing Company, and Florida Steel Corporation.
Two types of simulations commonly are integrated into behaviorally based interviewing systems:

1. Regular assessment simulations are used when there is sufficient time, usually when a single assessor is administering or observing the simulation and has no other interviewing responsibilities. However, the individual may, at another point in time, conduct an interview as part of the overall system.

2. Targeted Simulations® are used when the simulation must be put into the interview process itself. In these situations an interviewer conducts part of the interview, administers a very short simulation, and then continues the interview. These Targeted Simulations are shorter and have a simpler rating procedure for the assessors.

Byham, W. C. (1987). Applying a systems approach to personnel activities (Monograph IX). Pittsburgh, PA: Development Dimensions International.

TARGETED OBSERVATION: OBTAINING BEHAVIORAL INFORMATION FROM DIRECT OBSERVATION OF PERFORMANCE

Just as basic assessment center methodology can be applied when the behavioral data come from candidate interviews, it also can be used when the behavioral data come from on-the-job observation of performance. The accuracy of on-the-job performance observations can be improved markedly by training managers to differentiate between true behavioral observations and nonbehavioral observations, and to observe and record behavior. Without training, many managers tend to generalize their observations, making comments such as, "He's not organized," "She's not trying hard enough," or "He's not mature." The manager will have greater impact if his or her feedback to a subordinate is more behavioral. The employee will better understand the nature of his or her performance and what changes are necessary. In addition, managers' reports (promotion recommendations and others) will mean more to higher management because they contain behavioral examples.

Several companies have set up performance appraisal systems that parallel assessment methodology directly, including systematic data integration. In these systems managers at the same level meet to share performance data regarding their subordinates and agree on dimensional and overall ratings, as in an assessment center.

Although complete adoption of assessment methodology is still rather rare, many companies with assessment centers are integrating parts of the methodology into their performance appraisal systems. Most of them are using the same dimensions to evaluate incumbents that they use in assessing applicants for the position. Kodak, ARA, Tampa Electric, and Owens Illinois are leading the way by integrating their selection, training, and appraisal programs around a consistent assessment center methodology and job dimensions.

Strangely enough, simulations also are playing a role in direct observation of performance. There are many situations in which it is difficult for a manager to observe a subordinate supervisor or manager in action. A prime example is in interpersonal situations, such as when subordinate supervisors conduct performance appraisal interviews with their employees. Most managers feel it is inappropriate to sit in on their subordinate supervisors' discussions with subordinates. This creates a dilemma: How can the manager observe his/her subordinates' performance in this important area, and how can the manager coach if he or she cannot observe behavior? An obvious answer is for the manager and the subordinate
