Business Communication Quarterly
http://bcq.sagepub.com

Literacy in Decline: Untangling the Evidence
Daphne A. Jameson
Business Communication Quarterly 2007; 70; 16
DOI: 10.1177/1080569906297923

The online version of this article can be found at:
http://bcq.sagepub.com/cgi/content/abstract/70/1/16

Published by SAGE Publications (http://www.sagepublications.com) on behalf of the Association for Business Communication.
LITERACY IN DECLINE: UNTANGLING THE EVIDENCE

Daphne A. Jameson
Cornell University

National assessments of U.S. high school students' writing and other verbal abilities do not show that literacy has declined substantially in recent years. In fact, scores have been relatively stable since the 1980s. The proportion of students with solid writing and reading abilities has held fairly steady but remained small during the past 25 years. During this period, however, the proportion of high school graduates who enter higher education has soared. Thus, more students with weak verbal abilities now enter college. Initiatives that encourage people to continue their education have succeeded, whereas initiatives to improve writing, reading, and reasoning abilities have not. The complex causes of entering college students' weak verbal abilities include social and cultural forces as well as decisions by educational institutions. By understanding the complicated history of this issue and reframing it in positive terms, business and technical communication faculty can help effect change.

Keywords: literacy; writing assessment; reading assessment; verbal abilities; curriculum

College faculty often complain that today's high school graduates in the United States are less literate than in the past. Writing and reading skills have declined rapidly, some say. But is this perception accurate?

Nationwide assessments of students' verbal abilities are complex and sometimes seem contradictory. Reports often highlight small year-to-year changes rather than longer-term trends. Modifications in test methodologies and scoring systems complicate comparisons, making it important to look closely at the details of each study.

To untangle the evidence and discover how entering college students' writing and other verbal abilities have changed over the years, I analyzed reports and statistics from governmental and private organizations that have measured students' knowledge and skills in the United States, some since the early 20th century. My goal was to develop coherent information that faculty and administrators need
to know as they plan curricula, respond to stakeholders, and strive to provide the best education possible for students. It is important to understand the nature of the problem before trying to devise solutions.

What I found was a surprise: The trends in U.S. national assessments of secondary students' writing and broader verbal abilities do not substantiate the perception that literacy has declined rapidly in recent years. In fact, scores have been relatively stable since the 1980s. Instead, the issue involves two trends. The proportion of students with solid writing and reading abilities has held fairly steady but remained small during the past 25 years. Meanwhile, the proportion of high school graduates who enroll in higher education has soared. Thus, more students with weak verbal abilities are entering college. National initiatives to encourage more people to continue their education have apparently succeeded, whereas efforts to improve students' writing, reading, and reasoning abilities have not.

Although the weak verbal abilities of entering college students have implications for all of higher education, the issue is especially relevant to business and technical communication faculty. They are on the front lines, helping students meet high standards of literacy. Especially in business schools, these faculty serve as resident experts on language issues. This article will contribute to that expertise by providing a historical perspective and analyzing the latest statistical data concerning the verbal skills students have when they enter college.

This article first explains the trends in assessment of entering college students' writing, reading, and broader verbal abilities and then explores cultural and social factors that have affected these trends. The study has several limitations, one being that it uses data from standardized tests, which do not measure the full range of writing and reading tasks that are important in college or in the workplace. Such tests do, however, provide useful information about broad literacy trends. In addition, the study concerns the situation in the United States; other countries use different assessment approaches. Another limitation is that it analyzes the issue of students' inadequate verbal abilities but does not go on to propose specific actions. That important but difficult next step must derive from a solid understanding of the trends and their underlying causes, both of which are discussed in this article.
TRENDS IN NATIONAL WRITING ASSESSMENTS

The writing performance of students about to finish high school in the United States is low in absolute terms and has changed little in relative terms during the past 25 years. Starting in 1969, the National Assessment of Educational Progress (NAEP) evaluated students' abilities in writing and other subjects. Because the studies used well-designed matrix sampling and careful statistical methods, the results can be generalized to parallel groups in the whole U.S. population. In the NAEP assessments, students respond to a series of open-ended prompts that reflect the practical purposes for which people write: to inform, narrate, and persuade. The prompts specify a genre, audience, and purpose, for instance, a letter to persuade a school board to adopt a particular policy. Professional evaluators score each response.

The NAEP has used two types of assessments. Long-term trend assessments of 11th graders, conducted from 1984 to 1999, kept the test design and methodology consistent so full comparisons could be made over many years. Main assessments, conducted from 1969 to the present, change the test design about once a decade to reflect current curriculum practices and to refine the methodology. Initially, the group tested was 17-year-olds but in the 1990s was changed to 12th graders. These studies excluded the approximately one fifth of students who drop out of school before 11th grade, according to a study by the Editorial Projects in Education Research Center (2006).

Results of Long-Term Trend Assessments

Every time the long-term writing assessment was held, the results were disappointing in both absolute and relative terms; ultimately, the program was discontinued. Overall performance was measured in five categories, from effective, coherent writing to disjointed, unclear writing. Students in the top category, who would be most able to perform well in college, increased from 1984 to 1994 but only from 2% to 3% (U.S. Department of Education [USDOE], 1996). Those in the second category, complete, sufficient writing, which also would indicate an ability to do college-level work, declined from 37% to 30%. The percentage in the third category also declined, whereas those in the two bottom categories grew.
The long-term trend assessment also measured accuracy in grammar, spelling, and punctuation. From 1984 to 1996, there was no significant change in the rate of errors for 11th graders, and the rate was quite high: an average of over seven errors per 100 words (USDOE, 1999). To many readers, a piece of writing with that many errors signals that the writer is not literate.

When the results of the 1999 long-term trend assessment were due to be made public, the acting commissioner announced that the data would not be released (Phillips, 2000). He attributed this decision to a lack of confidence in the data because of the limited number of items and questions about analytical techniques. His statement said that the data would be reanalyzed and then released, but that did not happen. Instead, the NAEP subsequently discontinued the long-term trend assessments of writing. Furthermore, the report of the 1996 assessment (USDOE, 1997), which showed continuing decline in writing scores, was reissued with those data removed. Did the disappointing results lead to the demise of the program? Given the politically charged environment of the NAEP (Epstein, 2005), one wonders whether the decision was based on more than technical grounds.

Results of Main Assessments

Unlike long-term trend assessments, main assessments continue; the results, though, raise similar concerns. In the eight assessments since 1969, the writing achievement of students nearing high school graduation has remained low in absolute terms and has improved little in relative terms.

Main assessments are modified about once a decade to reflect curriculum changes and refinements in methodology. Although these changes preclude long-term comparisons, shorter-term comparisons within 10-year periods provide valuable insights.

After analyzing NAEP writing results from 1974 to 1984, the executive director concluded that students' performance was "distressingly poor" and added that "If one accepts the assumption that a piece of writing is a reflection on how the writer thinks, then the
problem seems even more serious" (Applebee, Langer, & Mullis, 1985, p. 3). On the writing tasks included in all three assessments of 17-year-olds during this period, only 2% scored in the top category of writing achievement (called elaborated). Those in the second category (acceptable) declined from 25% to 22%. The third category (minimal) rose, and the lowest category (unsatisfactory) held steady.

The results in the late 1980s and early 1990s continued to reveal deficiencies in students' writing. A report on the 1992 assessment noted that even the best students had difficulty constructing an argument and marshalling evidence to persuade others (USDOE, 1994). Overall performance of 17-year-olds dropped in four of five assessments held from 1988 to 1996 and was unchanged in the fifth (USDOE, 2003a).

An NAEP report analyzing the last two national writing assessments, conducted in 1998 and 2002, shows little change (USDOE, 2003b). The proportion of 12th graders in the top category (renamed advanced) rose from 1% to 2% but remained extremely low in absolute terms. The second category (proficient) grew from 21% to 22%, again low in absolute terms. These top two categories imply an ability to do college-level work. Meanwhile, the third category (basic) declined, and the lowest category (below basic) increased. Thus, the most recent national assessments of writing performance suggest that only about one quarter of current students who are nearing the end of high school demonstrate the ability to do the kind of writing needed in college and later in careers.

TRENDS IN NATIONAL READING ASSESSMENTS

Since 1971, the National Assessment of Educational Progress has also conducted long-term trend assessments of students' reading ability. As with the long-term trend studies of writing, these can be compared across years and generalized to parallel groups in the whole U.S. population.

The results showed little change. For students age 17, no statistically significant difference existed between average reading scores in 1971 and 2004 (USDOE, 2005b). However, the proportion of students in the top two levels showed a small but statistically significant drop from 41% in 1990 to 38% in 2004. These top
two performance levels—understanding complicated information and learning from specialized reading materials—imply the ability to do college-level work.

The NAEP's assessments of both reading and writing reveal a discrepancy between verbal abilities and college enrollment. The proportion of high school graduates who enter college within a year of graduation has soared, from one half to two thirds since 1980 (USDOE, 2006a) (Figure 1). In contrast, the proportion of students who have the verbal abilities needed to do college work has lagged. Less than 40% of students have the reading skills and less than 30% the writing skills needed to do college work, according to the most recent national assessments (Figure 2). This discrepancy explains why colleges and universities face a growing problem and why people perceive that literacy is declining.
[Figure 1. Percentage of High School Graduates Who Enrolled in Higher Education Within 1 Year of Graduation, 1960-2005. SOURCE: U.S. Department of Education (2006a).]

[Figure 2. Discrepancy Between College Enrollment and Achievement in Writing and Reading (in percentages), 1984-2004. Series: high school graduates who enroll in college within a year; long-term reading study, students age 17 in top two categories; long-term writing study, grade 11 students in top two categories; most recent writing study, grade 12 students in top two categories. SOURCE: U.S. Department of Education (1996, 1997, 2003b, 2005b, 2006a). NOTE: The National Assessment of Educational Progress did not release 1999 data and subsequently discontinued its long-term trend writing study.]

TRENDS IN BROADER MEASURES OF VERBAL ABILITIES

Whereas NAEP assessments provide information about the writing and reading abilities of a nationally representative sample of students nearing high school graduation, college entrance examinations focus on the broader verbal abilities of a self-selected group, those who hope to attend a college that requires such tests. Verbal in this context means reading comprehension, inferential reasoning, and vocabulary breadth. The most widely used entrance tests in the United States are the ACT (American College Test) and SAT (Scholastic Assessment Test, previously called the Scholastic Aptitude Test). This discussion focuses on the SAT, which has a much longer history.

A sharp decline in SAT verbal scores occurred in the 1970s, but since the 1980s, scores have held fairly steady (Figure 3). However, changes in the demographics of test takers, score reporting, and test design complicate the interpretation of the trends. The following discussion explains these complications.

What the SAT Measures

The basic SAT assesses general ability to succeed in college courses as indicated by verbal and mathematical skills developed over a long
time. Entrance tests predict freshman grades, not longer term achievement. The basic SAT does not measure achievement in specific courses; that purpose is fulfilled by the optional subject tests.

The SAT verbal score is based on assessment of logical reasoning, vocabulary, and critical reading: the ability to interpret, synthesize, analyze, and evaluate written texts. These verbal abilities are broader than, but related to, writing ability. Before 2005, the SAT did not test writing directly for all test takers but did test it for some. An optional subject test in writing was offered until 2004, and a subtest in standard written English was incorporated into most but not all basic SATs from 1973 to 1993 (Frisch-Kowalski, 2003).

Trends in SAT Verbal Scores

The College Entrance Examination Board (CEEB), founded in 1900, initially offered only essay examinations but in the 1920s developed the SAT (Frisch-Kowalski, 2003). From 1926 until 1940, the scale was recentered annually, and scores could not be compared across years. When the test began to be given more than once a year, it was necessary to establish a standardized group and define a scale to express future scores. This was because stronger students typically take the spring test in their junior years, skewing the results when compared to the fall test. The 1940 reference group's scores formed the basis for the scale used until 1995. It had a mean of 500 and a standard deviation of 100.

As shown in Figure 3, the average SAT verbal score has declined at uneven rates. The sharpest declines occurred in the 1940s and then between 1963 and 1980 (Dorans, 2002; USDOE, 2005a). In the 1940s, the average verbal score dropped from 500 to 476. Year-by-year scores are not available for this period. Part of the change in test scores was related to changing demographics. After World War II, the GI Bill expanded educational opportunities, and a strong postwar economy made it possible for more people to attend college.

In the 1950s and through the mid-1960s, the average verbal score hovered in the 470 to 480 range. More women were enrolling in college at this time.
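To make the fixed-reference-group scale concrete, the relation between a raw score and a reported score can be written as follows. This is an illustrative sketch of the principle only, not the CEEB's actual equating procedure; the symbols \(\mu_{\text{ref}}\) and \(\sigma_{\text{ref}}\), the raw-score mean and standard deviation of the 1940 reference group, are introduced here purely for illustration:

\[
S(x) = 500 + 100 \cdot \frac{x - \mu_{\text{ref}}}{\sigma_{\text{ref}}}
\]

On such a scale, a test taker whose raw score \(x\) equals the reference group's mean reports as 500, and a raw score one standard deviation above that mean reports as 600. A later cohort whose average falls below 500 is therefore performing below the 1940 group's mean in absolute terms, which is what makes the pre-1995 scale usable for the long-term comparisons shown in Figure 3.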
[Figure 3. Average SAT Verbal Scores, 1941-2006. SOURCE: U.S. Department of Education (2006a) and College Entrance Examination Board (1992, 2002, 2005, 2006). NOTE: Data not available for 1942-1951. Pre-1966 data cover all candidates and all scores for those who took the test multiple times. Data from 1966 on cover seniors and the most recent score for those who took the test multiple times. Data from 1995 on are converted back to the original scale from the recentered one. The 2006 test was renamed Critical Reading.]

In the late 1960s and throughout the 1970s, average scores declined sharply. Then in the 1980s, they leveled off in the 420s. The lowest average verbal score, 422, occurred in 1991. During this period, a broader group of students was taking the test and going to college. From 1960 to 1990, the profile of SAT test takers came to include a greater proportion of minority students, students of lower socioeconomic status, students whose first language was not English, and students not in the top fifth of their high school classes (Carson, Huelskamp, & Woodall, 1993).

To determine how these demographic changes might have affected SAT scores, the CEEB conducted several in-depth statistical studies. The conclusion was that this broader base of test takers accounted for about half of the drop in SAT scores (Morgan, 1991). The other half, however, was due to student performance, not demographics.
Recentering the SAT in 1995

In response to the decline in scores, the CEEB recentered the SAT scales to return the average score to 500 (Winerip, 1994). Used for the first time in 1995, the new scales were based on a 1990 reference group. Although the absolute performance of the whole group of test takers had declined, the scores were adjusted upward to indicate each student's relative performance. After the recentering, the typical student got 80 extra points for the same level of demonstrated ability. For instance, a raw score of 35 out of a possible 78 translated to a recentered score of 510 instead of the previous 430 (Winerip, 1994). Thousands more students per year received the top score of 800, which no longer signified answering all questions correctly.

The test content was also changed. The revised test gave students more time to answer fewer questions, provided contexts for vocabulary questions, and omitted the antonym questions and the subtest on standard written English.

The recentering of scales and changing of test content were criticized as moves to make students, parents, and school administrators feel good, even though actual achievement had deteriorated. A Washington Post political columnist protested that "This is nationalized grade inflation: The solution to low test scores is to jack them up"; he said that "the 'recentering' hurts us all by sanctioning mediocrity" (Samuelson, 1994, p. A27). In The Wall Street Journal, a former assistant secretary of education complained that "Our tests are getting better at confusing us and clouding over the issue of how our young people are really performing" (Manno, 1995, p. A12).

The CEEB defended the recentering vigorously. Officials stressed that the test's purpose is to help college admissions officers understand the relative aptitude of current applicants, not to track historical trends in absolute achievement (Dorans, 2002). Another argument was that the discrepancy between the verbal and mathematics scales caused confusion (Bridgeman, McCamley-Jenkins, & Ervin, 2000). The CEEB president emphasized that percentile measures would not change (Stewart, 1995). Even the staunchest defenders admitted though that the recentering "did not occur solely on the basis of its technical merit" (Dorans, 2002, p. 1).
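In the illustrative notation sketched earlier (again an approximation; the CEEB's 1995 conversion used equating tables rather than a single linear formula), recentering amounts to swapping the 1940 reference group for the 1990 one:

\[
S_{\text{recentered}}(x) = 500 + 100 \cdot \frac{x - \mu_{1990}}{\sigma_{1990}}
\]

Because the 1990 group's average raw performance was lower than the 1940 group's, the same raw score maps to a higher reported number after recentering. The example cited above fits this pattern: a raw verbal score of 35 out of 78 that reported as 430 on the old scale reported as 510 on the new one, 80 points more for the same demonstrated ability.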
Converting recentered scores back to the original scale reveals the stable, though low, trend of recent scores (Figure 3).

Adding a Writing Section to the SAT in 2005

In the decade after the SAT was recentered, the average verbal score varied by only a few points (USDOE, 2005a). Yet concerns about students' weak writing skills remained so strong that in 2002, the CEEB announced that it would add a mandatory writing section to the basic SAT. The CEEB president said this would "serve as a call to educators to emphasize strong writing instruction. America's students will need strong writing skills to succeed in college and in life. It is time for writing to become a priority in our schools" (p. 4).

The new SAT, given for the first time in March 2005, has three sections: mathematics, critical reading, and writing. The writing section includes both an essay and multiple-choice questions. The critical reading section has almost all the elements that the verbal section previously included.

Not surprisingly, controversy about the new SAT, especially its writing section, arose almost immediately. Critics questioned whether length counted more than accuracy and pointed out that the CEEB's own studies showed that assessments based on a single essay had low reliability (Winerip, 2005). The National Council of Teachers of English (2005) noted that a critical element of good writing—revision—is excluded.

Another concern that has arisen during the first year of the new SAT is the decline in scores in the critical reading test, which covers verbal abilities other than writing. For all test takers, the decline was 5 points, the largest drop in 30 years (CEEB, 2006). Males' scores dropped 8 points, females' 3. Many universities, including the University of California, reported that scores of their applicant pools or admitted students dropped much more (Arenson, 2006).

The score drop has several possible causes. Because the test costs more, fewer students retook it to try to raise their scores; only the last score is counted in the averages. Some students may have suffered fatigue because the new test is longer, though the CEEB's research did not substantiate this effect. The falling scores may also signal an actual decline in students' performance.
In summary, the evidence does not support people's perceptions that literacy is rapidly declining. Test scores have been relatively consistent since the 1980s. What has happened during this time, however, is that the proportion of high school graduates who go on to higher education within a year has soared to more than two thirds. Yet a far smaller proportion of students demonstrates the verbal abilities needed to do college-level work. So on the one hand, higher education is becoming more accessible, but on the other, more students with weak skills are enrolling.

POSSIBLE CAUSES OF WEAK WRITING AND OTHER VERBAL ABILITIES

The underlying causes for the inadequate writing and other verbal abilities of entering college students are complex. Educational institutions and individual teachers bear much responsibility, but broader social and cultural changes are also significant forces.

Decline in Writing Instruction and Practice

Schools are responsible for the lack of direct instruction in the principles of good writing. In 2003, the National Commission on Writing reported "Both the teaching and practice of writing are increasingly shortchanged throughout the school and college years" (p. 14). The commission recommended that schools double the amount of time spent on writing.

Despite concerns about students' writing, instruction has decreased, not increased (Figure 4). The percentage of college-bound high school students who studied English composition declined from 81% in 1992 to 62% in 2005; the proportion who studied grammar during that period declined from 85% to 65% (CEEB, 1992, 2005). Research paper assignments are "rapidly being abandoned, a victim of time constraints, new state mandates, and a burgeoning emphasis on accountability and assessment" (National Commission on Writing, 2003, p. 40).

[Figure 4. Percentage of College-Bound Students (SAT Takers) Who Reported Studying English Composition and Grammar in High School, 1992 and 2005. SOURCE: College Entrance Examination Board (1992, 2005).]

This curriculum trend is troubling because it does not result from teacher shortages, which are primarily in math and science, or from budget limitations, as it costs no more to teach writing than other
elements of English. One possible explanation is the continuing influence of an educational philosophy rooted in the 1960s and 1970s, which advocated "students' right to their own language" (Smitherman, 1999) and privileged free-spirited self-expression over logically structured, carefully edited prose.

Another possible factor is the decline in the quality of elementary and secondary teachers. The highest achieving women are far less likely to become teachers now than in the past (Corcoran, Evans, & Schwab, 2004; Postrel, 2004). According to a recent economic analysis, the reasons for this phenomenon are not only the broader opportunities open to women but more important, the higher pay in other fields (Hoxby & Leigh, 2004).

Decline in Time Spent on Homework

Despite falling test scores and rising concern about writing skills, students spend less time on homework. According to an annual survey by the University of California (2003), the percentage of college-bound high school seniors who spent 6 or more hours per week on homework reached an all-time low of 33% in 2002. More than a third of 17-year-olds reported in 2004 that they typically spend no
time on homework (USDOE, 2005a). This may or may not mean that less homework was assigned. Grade inflation may underlie this phenomenon.

Rise in Grades

Ironically, during the period when college entrance test scores were falling, high school grades were rising. The percentage of college freshmen who had A averages in high school increased from 18% in 1968 to 48% in 2004, an all-time high (Sax et al., 2004). Those with C+ or below averages fell from 23% in 1968 to 5% in 2004, an all-time low. The discrepancy between standardized tests and grades shows that the rise in grades does not reflect increased mastery of the verbal skills that these tests cover.

Distorted Self-Perceptions of Abilities

The rise in grades combined with the decline in time spent on homework may have contributed to students' distorted self-perceptions of their academic qualities. In 1971, students showed realistic self-assessment: 50% rated themselves above average in academic ability and drive to achieve (American Council on Education, 1971). By 2004, 70% did (Sax et al., 2004).

Decrease in Reading

The decrease in reading is another possible cause of the lack of writing abilities but may be due more to cultural and social forces than to educational practice. Reading provides an important though indirect route to writing ability. Those who read frequently develop a command of language that forms a strong knowledge basis upon which to develop writing skills. Studies over many years show that proficient, frequent readers are likely to be good writers (Stotsky, 1983). Better readers write more mature, cohesive prose (Cox, Shanahan, & Sulzby, 1990). Because some syntactic patterns exist in written but not oral language, people must learn these from reading (Fitzgerald & Shanahan, 2000). By reading well-written prose, people expand their vocabularies and subconsciously assimilate principles of writing style and rhetorical structure.

However, the ability to read is not the same as the practice of reading. Students whose reading ability is good, as measured in the NAEP long-term
trend studies, do not necessarily choose to read frequently or to read the kinds of materials that would expand their vocabulary, increase their knowledge of organization patterns, and make their writing style more sophisticated.

Although the amount of school-related reading did not change significantly from 1984 to 2004, the amount of leisure reading decreased significantly. The proportion of 17-year-olds who say they never read for pleasure more than doubled in these 20 years. Studies by the National Assessment of Adult Literacy (USDOE, 2006b) and the National Endowment for the Arts (2004) confirm similar trends for adults.

Shifts in Use of Leisure Time

The rise of a culture immersed in electronic media has affected what and how much people read, which in turn affects how well they write. Children whose families restrict television viewing are more involved with reading (U.S. Census Bureau, 2003), and children who watch television the least score the highest on reading assessments (USDOE, 2001). The NAEP reported a significant increase in 17-year-olds' time spent watching television in 2004 compared to 1978 (USDOE, 2005b). The UCLA study found a slight decrease in television watching from 1987 to 2004, but that was more than offset by time devoted to computer and video games (Sax et al., 2004). Neither survey asked about time devoted to other electronic activities such as computer surfing, chatting, emailing, blogging, instant messaging, or use of CDs, DVDs, or digital audio players such as iPods.

Some electronic media, such as chat rooms, email, and blogs, do provide practice writing but nevertheless may be counterproductive. The texts are typically short and unstructured. The standards of expression in grammar, spelling, syntax, and vocabulary are quite different from those required in academic and professional contexts. The electronic writing process often involves little planning and no revision. These types of writing experiences may detract from rather than enhance writing skills.

In summary, the causes of entering college students' weak writing and reading abilities are interrelated. The decline in writing instruction, combined with grade inflation and little homework, may lead students to overestimate their abilities and thus lack motivation to
improve. The time spent on electronic media activities leaves less time for reading, which decreases knowledge of vocabulary, style, and rhetorical structure. Without this knowledge, students do not mature as writers.

CONCLUSION

For a variety of reasons—some technical, some perhaps political—testing organizations have changed scoring systems, test designs, and program goals in ways that have obscured rather than clarified trends. By looking carefully at what is tested and who participates, however, we can unravel the evidence about the literacy of entering college students.

Despite perceptions to the contrary, measures of U.S. high school students' writing, reading, and related skills have held fairly steady in the past 25 years. However, the proportion of students whose achievement indicates the ability to do college-level work remains small. Because the rate of college enrollment has grown dramatically while achievement has languished, more entering college students have weak verbal abilities.

Understanding the issue from this perspective, we can frame it in a positive way that does not focus on blame: More students are getting the opportunity to benefit from a college education, but to do so, they need to make up lost ground in writing, reading, and other verbal abilities. The rapid expansion of higher education has outpaced student achievement. What will it take to synchronize these forces?

For faculty in business and technical communication, these trends suggest that developing innovative approaches to help weaker students must be a priority for years to come. Because of financial and enrollment pressures, only a few institutions of higher education will be able to address this issue by setting higher admission standards. Dramatic improvement at the secondary-school level seems unlikely because strong initiatives and much public pressure over the past 25 years have resulted in stability rather than improvement. Teachers and educational institutions have almost no power over the social and cultural forces that affect literacy. By taking into account the historical context and the statistical trends related to this issue, business and technical communication faculty will be better able to seek new ways to increase student achievement and at the same time to hold realistic expectations for progress.
References

American Council on Education. (1971). The American freshman: National norms for fall 1971. Washington, DC: Author.
Applebee, A. N., Langer, J. A., & Mullis, I. V. S. (1985). Writing: Trends across the decade, 1974-84 (ETS Report 15-W-01). Princeton, NJ: Educational Testing Service.
Arenson, K. W. (2006, May 11). Colleges report mystery decline in SAT scores. The New York Times, p. 26.
Bridgeman, B., McCamley-Jenkins, L., & Ervin, N. (2000). Predictions of freshman grade-point average from the revised and recentered SAT I: Reasoning Test (College Board Rep. No. 2000-1). New York: College Entrance Examination Board.
Carson, C. C., Huelskamp, R. M., & Woodall, T. D. (1993). Perspectives on education in America: An annotated briefing, April 1992. Journal of Educational Research, 86, 259-310.
College Entrance Examination Board. (1992). College-bound seniors: 1992 profile of SAT and achievement test takers. New York: Author. (ERIC Document Reproduction Service No. ED351352)
College Entrance Examination Board. (2002). 10-year trend in SAT scores indicates increased emphasis on math is yielding results; reading and writing are causes for concern (News release N0179). New York: Author.
College Entrance Examination Board. (2005). 2005 college-bound seniors: Total group profile report. New York: Author.
College Entrance Examination Board. (2006). 2006 college-bound seniors: Total group profile report. New York: Author.
Corcoran, S. P., Evans, W. N., & Schwab, R. M. (2004). Women, the labor market, and the declining relative quality of teachers. Journal of Policy Analysis and Management, 23, 449-471.
Cox, B. E., Shanahan, T., & Sulzby, E. (1990). Good and poor elementary readers' use of cohesion in writing. Reading Research Quarterly, 25, 47-65.
Dorans, N. J. (2002). The recentering of SAT scales and its effects on score distributions and score interpretations (College Board Rep. No. 2002-11). New York: College Entrance Examination Board.
Editorial Projects in Education Research Center. (2006). Diplomas count: An essential guide to graduation policy and rates. Bethesda, MD: Author.
Epstein, J. (2005). The National Assessment of Educational Progress: The story of a national test in the making. American Educational History Journal, 32, 10-19.
Fitzgerald, J., & Shanahan, T. (2000). Reading and writing relations and their development. Educational Psychologist, 35, 39-50.
Frisch-Kowalski, S. (2003). The SAT: A timeline of changes. New York: College Entrance Examination Board.
Hoxby, C. M., & Leigh, A. (2004). Pulled away or pushed out? Explaining the decline of teacher aptitude in the United States. American Economic Review, 94, 236-240.
Manno, B. V. (1995, September 13). The real score on the SATs. The Wall Street Journal, p. A12.
Morgan, R. (1991). Cohort differences associated with trends in SAT score averages (ETS Rep. No. 91-27). New York: College Entrance Examination Board.
National Commission on Writing in America's Schools and Colleges. (2003). The neglected 'R': The need for a writing revolution. New York: College Entrance Examination Board.
National Council of Teachers of English. (2005). The impact of the SAT and ACT timed writing tests. Urbana, IL: Author.
National Endowment for the Arts. (2004). Reading at risk. Washington, DC: Author.
Phillips, G. W. (2000). Statement on long-term trend writing NAEP. Retrieved March 16, 2006, from http://nces.ed.gov/whatsnew/commissioner/remarks2000/4_11_2000.asp
Postrel, V. (2004, March 25). In their hiring of teachers, do the nation's public schools get what they pay for? The New York Times, p. C2.
Samuelson, R. J. (1994, July 27). Merchants of mediocrity: The College Board nationalizes grade inflation. The Washington Post, p. A27.
Sax, L. J., Hurtado, S., Lindholm, J. A., Astin, A. W., Korn, W. S., & Mahoney, K. M. (2004). The American freshman: National norms for fall 2004. Los Angeles: UCLA, Higher Education Research Institute.
Smitherman, G. (1999). CCCC's role in the struggle for language rights. College Composition and Communication, 50, 349-376.
Stewart, D. M. (1995, October 4). New SATs for a new kind of student [Letter to the editor]. The Wall Street Journal, p. A15.
Stotsky, S. (1983). Research on reading/writing relationships: A synthesis and suggested directions. Language Arts, 60, 627-642.
University of California, Higher Education Research Institute. (2003). College freshmen spend less time studying and more time surfing the net, UCLA survey reveals [Press release]. Retrieved March 16, 2006, from http://www.gseis.ucla.edu/heri/02_press_release.pdf
U.S. Census Bureau. (2003). Child's day: 2000 (selected indicators of child well-being) (Rep. P70-89). Washington, DC: Author.
U.S. Department of Education, National Center for Education Statistics. (1994). NAEP 1992 writing report card. Washington, DC: Author. (ERIC Document Reproduction Service No. ED370119)
U.S. Department of Education, National Center for Education Statistics. (1996). NAEP 1994 trends in academic progress (NCES No. 97095). Washington, DC: Author.
U.S. Department of Education, National Center for Education Statistics. (1997). NAEP 1996 trends in academic progress (NCES No. 97986). Washington, DC: Author.
U.S. Department of Education, National Center for Education Statistics. (1999). NAEP 1996 trends in writing: Fluency and writing conventions (NCES No. 1999-456). Washington, DC: Author.
U.S. Department of Education, National Center for Education Statistics. (2001). The nation's report card: Fourth-grade reading 2000 (NCES No. 2001-499). Washington, DC: Author.
U.S. Department of Education, National Center for Education Statistics. (2003a). Digest of education statistics, 2002 (NCES No. 2003-060). Washington, DC: Author.
U.S. Department of Education, National Center for Education Statistics. (2003b). The nation's report card: Writing 2002 (NCES No. 2003-529). Washington, DC: Author.
U.S. Department of Education, National Center for Education Statistics. (2005a). Digest of education statistics, 2004 (NCES No. 2006-005). Washington, DC: Author.
U.S. Department of Education, National Center for Education Statistics. (2005b). NAEP 2004 trends in academic progress: Three decades of student performance in reading and mathematics (NCES No. 2005-464). Washington, DC: Author.
U.S. Department of Education, National Center for Education Statistics. (2006a). Digest of education statistics, 2005 (NCES No. 2006-030). Washington, DC: Author.
U.S. Department of Education, National Center for Education Statistics. (2006b). A first look at the literacy of America's adults (NCES No. 2006-470). Washington, DC: Author.
Winerip, M. (1994, June 11). S.A.T. increases the average score, by fiat. The New York Times, p. A1.
Winerip, M. (2005, May 4). SAT essay test rewards length and ignores errors. The New York Times, p. B9.

Daphne A. Jameson is on the faculty of the School of Hotel Administration, Cornell University. She has been active in the Modern Language Association and the Conference on College Composition and Communication. She has served as president of the Association for Business Communication. Address correspondence to Daphne A. Jameson, Cornell University, 350 Statler Hall, Ithaca, NY 14853; email: daj2@cornell.edu.
