Assessing Writing 8 (2002) 17–30
An assessment of ESL writing placement
assessment
Deborah Crusan∗
Department of English Language and Literatures, Wright State University,
3640 Colonel Glenn Highway, Dayton, OH 45435, USA
Abstract
What are the best assessment practices for ESL and international students who must
be placed into first-year composition courses at both two- and four-year institutions in the
United States? In this article, I examine this issue in three ways. First, I enumerate the
stances in the literature; following that I submit results of preliminary research which ques-
tions modes of writing assessment and their relationship to final grades in composition
classes. Finally, I present results of an internet search of one set of prominent American
institutions’ placement practices. From these data, it might be inferred that we should use
multiple instruments to place ESL students into composition classes rather than resorting to a
multiple-choice test (especially a standardized instrument) as the sole means of determining
placement.
My primary purposes in this paper are to argue that second language composition special-
ists need to examine our placement practices and aim for a reconciliation of these practices
with our classroom pedagogies. Further, if we are not involved in placement decisions at
our various institutions, we must strive to be included in important decisions concerning
our students and to be advocates for second language writers.
© 2002 Elsevier Science Inc. All rights reserved.
Keywords: Assessment; Reconciliation; Second language
Writing assessment has always been problematic. The importance of respon-
sible assessment practices cannot be overstated, particularly when the assessment
measures influence critical decisions such as placement into composition classes,
∗ Tel.: +1-937-775-2846.
E-mail address: deborah.crusan@wright.edu (D. Crusan).
an act laden with pedagogical, ethical, political, psychometric, and financial im-
plications. In fact, many researchers hold that “any project to assess individ-
ual students’ writing abilities, over time or in a single instance, is fraught with
philosophical, ethical, and methodological peril” (Bullock, 1991, p. 195). Further,
White (1985) asserts, “most testing of writing is poorly done, destructive to the
goals of teaching, and improperly used” (p. 2).
In the United States, most writing assessment 25 years ago used indirect mea-
sures of writing ability, specifically multiple-choice tests. This procedure was used
because direct writing measures in which students actually produce text were con-
sidered unreliable due to inconsistencies in scoring by independent raters (Gordon,
1987; Greenberg, 1992; Harrington, 1998). Factors potentially confounding the use of
essay assessment included the lack of benchmark papers, the lack of clear and
consistent outcome criteria, and the lack of quantitative measures of inter-rater
reliability.
However, in the past 25 years, the US practice of indirectly assessing writing
ability has been criticized from many academic quarters (Bullock, 1991; Conlan,
1986; Moss, 1994; Stiggins, 1982; White, 1986, 1990). Experts cite problems
with validity (Bachman & Palmer, 1996), ethics (CCCC Committee on Assess-
ment, 1995; Courts & McInerney, 1993; Spolsky, 1997; White, 1994a), efficiency
(Williamson, 1994), the inability to measure skills teachers consider important
(Hamp-Lyons, 2001), and myriad other concerns (Hamp-Lyons, 1996; Shor, 1997).
Some researchers (Hamp-Lyons, 1990; Huot, 1990a) have claimed that indirect
assessment of writing by means of a multiple-choice instrument is now a thing of
the past. Hamp-Lyons (1990) states: “20 years or so ago, many if not most people
in North America (to a lesser extent in Great Britain and Australia) believed that
writing could be validly tested by an indirect test of writing. As we enter the 1990s,
however, they have not only been defeated but also chased from the battlefield”
(p. 69). Unfortunately, even at the current time, the opposite may be closer to the
truth. White (1996) cautions: “Anyone with a role in writing assessment must keep
in mind the multiple-choice specter that hovers just offstage; no stake has ever been
driven through its heart to keep it in its coffin, however much it may be wounded”
(p. 109).
In this paper, I review recent literature defining placement testing. I also define
direct and indirect assessment of writing, and I cite inherent problems with both
direct and indirect methods. Further, I summarize published placement practices at
Big Ten universities, confirming the assertion that, despite the contentions of many
in the composition assessment community, indirect testing as the sole measure of
placement of ESL students into writing courses is alive and well at some of the
most prestigious universities in America. Finally, I summarize a study conducted
at The Pennsylvania State University which compares student performance on an
essay and a multiple-choice grammar test and correlates them with students’ final
grades. I argue that collectively, these studies offer evidence that, while we know
what we should do, we often do not do it. Our reasons are legion: cost, speed,
practicality and efficiency, validity, and reliability.
1. What is a placement test?
Leki (1991) defines a placement exam as a test that pairs a student with an
appropriate course. At two- and four-year institutions in the United States, place-
ment tests are used to measure readiness for college-level course work and help
faculty decide whether incoming students should enroll in remedial or introductory
courses (Patkowski, 1991). They are not proficiency tests, but rather tools which
allow academic advisers to place students in the level of coursework best suited
to their preparation and skills. Institutions regularly employ scores obtained from
a variety of tests (ACT, SAT, Test of English as a Foreign Language (TOEFL),
locally designed essay tests, locally designed measures of the sub-skills of writ-
ing) in order to make admission and placement decisions for students, including
students for whom English is a second language.
2. Direct and indirect testing
An essay test is a direct assessment of writing ability, an integrative test which
has an emphasis on communication, authenticity, and context. It attempts to test
knowledge of language as a whole, not the individual components of language.
Although the constraints of timed essays have been well-noted in the literature
(Crowley, 1995; Shor, 1997), the advantage of essays is that they are able to gauge
the ability of students to identify and analyze problems, to identify audience and
purpose, to argue, describe, and define, skills that are valued in composition classes
in the United States (Crusan & Cornett, 2002; Ferretti, 2001; White, 1994b). In-
direct assessments are also used to assess writing ability. “Discrete-point tests are
constructed on the assumption that language can be broken down into its com-
ponent parts and each of those parts adequately tested. Those components are
basically the four skills (listening, speaking, reading, and writing), the various hi-
erarchical units of language (phonology/graphology, morphology, lexicon, syntax)
within each skill, and the sub-categories within those units” (Brown, 1994, p. 262).
An example of an indirect test would be a multiple-choice proficiency test divided
into grammar, reading comprehension, vocabulary, and “writing” which attempts
to isolate and evaluate knowledge of specific components of language.
Discussion in the literature abounds regarding the many problems surrounding
writing assessment — reliability and validity issues, rater training, holistic scoring,
and whether indirect or direct methods should be used for placement, proficiency,
and achievement; however, little quantitative evidence exists to support the po-
sitions taken in the literature (Hamp-Lyons, 1997). Polio (1998) points out that
relatively few studies address actual empirical research into ethical assessment for
placement of ESL composition students. Matsuda (1998) identifies an even more
serious problem, that “despite the growth of the ESL population, there has not been
a corresponding increase in the amount of attention given to ESL students in many
writing programs” (p. 99). Huot (1994) found a similar situation in mainstream
composition, commenting, “It is alarming to note the dearth of research and the-
ory in writing assessment and the lack of qualified personnel who direct writing
placement programs” (p. 61).
While there may be university administrators who believe in the value (mainly
in terms of efficiency and cost effectiveness) of indirect measures, over the past few
decades few writing specialists have publicly argued this, in print or at conferences
(Crusan, 2002a). Some writing assessment professionals operate on the assumption
that the most prevalent placement procedure in US institutions involves at least a
one-shot writing sample coupled with a multiple-choice instrument.
Haswell (1998) has been unusual in recognizing the possibility of university use
of a multiple-choice test as the sole instrument for placement into writing courses.
Other than Haswell, it seems that neither native English speaker (L1) nor non-native
English speaker (L2) composition specialists are concerned with finding quantitative
evidence to settle the question of whether a direct or indirect measure is better
at assessing writing in the context of college placement. The tacit assumption is
that the debate over the merits of direct versus indirect testing has been resolved
and the issue buried. Unfortunately, this issue has many lives, as demonstrated by
the proportion of institutions still using an indirect test as their primary mode of
assessment for placement into writing courses.
In a study of 100 private and public colleges and universities in the United States
selected for their “potentially significant second language populations” (Williams,
1995, p. 158), the researcher found that 32% of institutions reported the use of an
institutionally administered indirect test; 19% used a standardized test combined
with an essay; 23% an essay alone; and 26% relied on TOEFL scores for placement
in ESL composition courses. If we combine the percentages of those reporting use
of an indirect measure alone (the 32% using an institutional indirect test plus the
26% relying on the TOEFL), we find that 58% of the institutions in her study used
an indirect measure as the sole means to assess writing for placement.
More recently, Lewiecki-Wilson, Sommers, and Tassoni (2000) stated, “On the
grounds of expediency, many two-year institutions have turned to computerized
editing tests such as COMPASS for placing entering students into writing courses,
even though such tests do not directly measure writing” (p. 165). COMPASS is
a computerized assessment in which students find and correct errors in essays
on a computer screen. According to ACT (2002), the Writing Skills Placement
Test helps to determine placement into either entry-level or developmental writing
classes and assesses students’ knowledge and skills in eight areas: usage/mechanics,
rhetorical skills, punctuation, basic grammar and usage, sentence structure, strategy,
organization, and style.
Extending Yancey’s (1999) metaphor likening the history of writing assessment
to three waves — from objective tests to holistically scored essays to portfolio
assessment — Lewiecki-Wilson et al. suggest that there is, especially at two-year
institutions, evidence of a new “undertow — a backward-moving current” (p. 166)
returning to indirect, objective testing for placing students.
Writing assessment often seems to be carried out with little thought for who is
being tested and how these persons might be affected by the results of the tests. I feel
strongly that, as educators, we must be more knowledgeable about assessment and
aid administrators and colleagues to realize the pedagogical, social, and political
implications of the tests we administer to our students. Drawing on Haswell’s
(1998) contention that “most schools use indirect testing through a standardized
exam,” (p. 136), in the remainder of this paper, I explore the contradiction between
what composition specialists believe about assessment for placement and what is
actually done. As I will outline, indirect measures of writing ability continue to be
accepted as the sole means of writing placement at many academic institutions in
the United States.
3. Problems with the tests — or — who says which is best and why
Researchers (Bachman, 1990; Bailey, 1998; Conlan, 1986; White, 1985) have
documented problems inherent in using a multiple-choice test as an assessment
of writing ability. Some have found that students, when taking multiple-choice
tests, have a “penchant for guessing, especially those who are not well-prepared to
take the examination or those who are more impulsive than reflective in test-taking
behavior” (Henning, 1987, p. 32). The flaw, it seems, is using a grammar test as
the only means to measure writing ability. Grammar is one of the components of
language and is generally more appropriately tested through its actual use in one
or more of the four skills (reading, writing, speaking, and listening) rather than in
isolation (Bailey, 1998). A score on a single multiple-choice instrument may not
provide sufficient depth of information.
ESL writers might well be marginalized by indirect tests. Haswell (1998) as-
serts: “Indirect testing may be less valid for international students since they often
show a larger gap between the skill and habits needed to write extended essays
and the facility with the kind of surface features measured in test items” (p. 136).
Some ESL students, because they generally have memorized grammar rules very
well, tend to score very high on tests like the TOEFL. Consequently, it has been
claimed that multiple-choice tests overpredict minority students’ performance on
essay tests (Bailey, 1998; O’Malley & Valdez Pierce, 1996).
However, direct assessment of writing ability poses a different set of challenges.
While numerous writing experts value the perceived gain in authenticity that timed
writing tests represent in assessing writing competence, others find fault with their
reliability and validity as measures of writing ability (McNenny, 2001). Many ESL
writers, unsure of the structures of the language, have great difficulty producing
fluent written discourse, especially in timed situations. This, coupled with “a simple
lack of motor fluency in writing English script” (Leki, 1991, p. 55), causes the
second language writer to write much more slowly than a native speaker, producing
fewer words over a longer period of time. Hence, timed writings might impose a
handicap on many ESL students.
For any student, another problem presented by timed writing tests is that they
are given under artificial conditions: students must compose on an assigned topic,
and they are most often not allowed to access any reference material such as
dictionaries. Many students find it difficult to write “cold” on a topic they might
never have seen before and perhaps care nothing about or, even worse, know
nothing about (Huot, 1990a; Leki, 1991). Further, in the timed essay test there
is no time for revision or any other process approaches (Wolcott & Legg,
1998).
The relative merits of both direct and indirect measures were carefully scruti-
nized by Breland (1977), whose study concluded that a multiple-choice test com-
bined with an essay can be the most educationally sound solution to administrative
problems in placement. Breland (1983) later presented evidence of the validity
of direct assessment over and above other available measures; however, he
cites cost and reliability as two obstacles and recommends automated textual
analysis, as well as the production and reporting of more than a single score,
as ways to improve direct assessment. Breland, Camp, Jones, Morris,
and Rock (1987) studied the differences between essay and multiple-choice instru-
ments for assessing writing and once more suggested the multiple-choice/essay
test combination in an effort to reduce the cost of assessment and increase its
reliability and validity. Hudson (1982) drew similar conclusions and recommended
either a holistically- or analytically-scored essay combined with a carefully chosen
objective test.
In an effort to “discover how prevalent placement testing was and what forms
it took” (p. 49), Huot (1994) surveyed 1,080 institutions, both two- and four-year
US colleges and universities on the MLA list of English Chairpersons. He calls
this initial study “tentative and exploratory” (p. 59) and encourages more research
in this area. He found that only one-half of the institutions surveyed used any
kind of direct assessment for placement of students into composition classes.
In other words, almost 50% of institutions were still using some form of in-
direct assessment, a problem lamented by a number of composition theorists
(Huot, 1990b; Shor, 1997; Tedick, 1990; White, 1990). Further, 54% of the in-
stitutions that reported using some form of direct assessment “employed writing
criteria developed outside the university” (p. 60).
3.1. Placement in the ESL context
Because Huot’s data did not include ESL programs, I gathered my own data and
conducted an internet search of US institutions of higher education, both two- and
four-year. Preliminary findings are reported in the following sections.
First, I compared a grammar test to an essay test to determine which was a
better predictor of final grade in an ESL composition course. Six composition
instructors and 124 ESL students took part in a study at The Pennsylvania State
University. The students were both basic and freshman level writers enrolled in
“sheltered” (ESL only) classes. They ranged in age from 18 to 40; 65% were male,
35% female, and they represented 26 different languages; however, Spanish (25%)
and Chinese (21%) were most prevalent followed by Korean (12%). The students
were of varying levels of writing ability and were initially placed in classes by
means of a score on a university-administered indirect assessment taken just prior
to enrollment as freshmen.
There were two independent variables in the study. The first was a direct
measure written on the first day of class on one of three assigned topics. The
second independent variable, an indirect measure based on disclosed copies of the
TOEFL, included 40 questions divided into four sub-skills: word order, error
identification, vocabulary/expression, and reading comprehension. The
reliability of this measure was .95. This test provided objective measure scores
in lieu of the university-administered placement test scores which, despite re-
peated requests, were not made available for this study. The dependent vari-
able, final grade, was reported to me by composition teachers at the end of the
semester.
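The article does not report how that .95 reliability figure was obtained; for a 40-item, dichotomously scored test, an internal-consistency coefficient such as Kuder-Richardson 20 (KR-20) would be a conventional choice. The sketch below computes KR-20 under that assumption, with an invented response matrix standing in for the unpublished data.

    import numpy as np

    def kr20(items: np.ndarray) -> float:
        """KR-20 internal-consistency reliability for a 0/1 response matrix
        (rows = examinees, columns = items)."""
        k = items.shape[1]                  # number of items
        p = items.mean(axis=0)              # proportion correct per item
        totals = items.sum(axis=1)          # each examinee's total score
        return (k / (k - 1)) * (1 - (p * (1 - p)).sum() / totals.var(ddof=1))

    # Invented responses for five examinees on four items (placeholder data):
    demo = np.array([[1, 1, 1, 0],
                     [1, 0, 1, 1],
                     [0, 0, 1, 0],
                     [1, 1, 0, 1],
                     [0, 0, 0, 0]])
    print(round(kr20(demo), 3))             # 0.693 for this toy matrix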
The six instructors were trained to score the essays with the Test of Written English
(TWE) scoring guide, chosen because it was designed to measure writing proficiency
rather than growth in the skill (Educational Testing Service, 1996). In cases of a
discrepancy of more than two points, a third rater’s score was used to adjudicate.
The inter-rater reliability was .92. Two English as a Second Language instructors
hand-scored the grammar tests.
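In outline, the adjudication rule might be expressed as follows. This is a hypothetical sketch: the article states only that a third rater’s score settled discrepancies of more than two points, so the convention of averaging the third rating with the closer of the original two is an assumption.

    from typing import Optional

    def resolve_essay_score(r1: float, r2: float, r3: Optional[float] = None) -> float:
        """Combine two TWE-scale ratings, adjudicating with a third when needed."""
        if abs(r1 - r2) <= 2:
            return (r1 + r2) / 2            # raters agree closely: average them
        if r3 is None:
            raise ValueError("discrepancy > 2 points: a third rating is required")
        # Assumed convention: average the adjudicator's score with whichever
        # original rating lies closer to it.
        closer = r1 if abs(r1 - r3) <= abs(r2 - r3) else r2
        return (closer + r3) / 2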
The most interesting finding in the data is the correlations between the dependent
and independent variables. Both the direct and indirect assessments positively
and significantly correlated with final grade, although the placement essay had a
stronger correlation with final grade. The correlation between the grammar test
and final grade was the lowest but still significant (r = .190, P < .01). The correlation
between the placement essay and final grade was higher (r = .260, P < .001).
It is important to note here that scholars have issued caveats when employing
final grades as variables. Armstrong (1995) argues that any model that uses final
grades as the basis for validity of a placement would likely fail to account for a
significant source of variation, that of instructor grading variability. Further, studies
of the presence of grade inflation (see Compton & Metheny, 2000; Edwards, 2000;
Nagle, 1998; Zirkel, 1999 for several perspectives on this matter) present reasons
for care and circumspection when utilizing such data.
Possibly of more significance, both statistically and pragmatically, is the correlation
between the two independent variables, the grammar test and the placement
essay (r = .327, P < .0001), supporting the use of a combination of direct and indirect
measures as a means to place our ESL writers into composition courses. Despite
the fact that this correlation is derived from a relatively small sample, it supports
the use of multiple assessment instruments, rather than one or the other, in the
context of ESL student placement.
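For readers who want to see the arithmetic behind such figures, the sketch below computes Pearson correlations with SciPy. The arrays are invented placeholders, since the study’s raw data were not published.

    from scipy.stats import pearsonr

    essay_scores   = [3.5, 4.0, 2.5, 5.0, 3.0, 4.5]  # TWE-scale essay ratings (invented)
    grammar_scores = [28, 31, 22, 36, 25, 33]        # 40-item grammar test scores (invented)
    final_grades   = [2.7, 3.3, 2.0, 3.7, 2.3, 3.0]  # end-of-semester grade points (invented)

    for name, predictor in [("essay", essay_scores), ("grammar", grammar_scores)]:
        r, p = pearsonr(predictor, final_grades)
        print(f"{name} vs. final grade: r = {r:.3f}, P = {p:.4f}")

    r, p = pearsonr(essay_scores, grammar_scores)    # the two independent variables
    print(f"essay vs. grammar: r = {r:.3f}, P = {p:.4f}")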
The second data set represents findings from institutional websites at some of
the largest universities in the US on the instruments used for initial placement of
ESL students.
[Table 1: ESL placement instruments reported on Big Ten university websites; the table itself did not survive conversion.]
Table 1 clearly shows that at least three of the largest and most prestigious uni-
versities (Penn State, Purdue, and Wisconsin) currently use only a multiple-choice
instrument to place ESL students into composition courses.
In separate internet searches, I found that a great many two-year colleges —
which are often the educational entry point for many immigrant ESL students, par-
ticularly Hispanics (Rendon & Nora, 1994) — continue to use indirect instruments
as the sole means of placement into composition courses.
3.2. Placement and its importance for ESL students
Assessment for placement cannot be trivialized and weighs heavily on writing
program administrators (Silva, 1994). So important is placement that the CCCC
Committee on Second Language Writing in its Statement on Second-Language
Writing and Writers (2001) states: “Decisions regarding the placement of sec-
ond language writers into writing courses should be based on students’ writing
proficiency” and not “based solely on the scores from standardized tests of general
language proficiency or of spoken language proficiency. Instead, scores from the
direct assessment of students’ writing proficiency should be used, and multiple
writing samples should be consulted whenever possible” (pp. 670–671).
Armstrong (2001) recommends that institutions’ placement tests align with
course content. Likewise, Bachman and Palmer (1996) and Moss (1994) advise
that writing assessments should be linked with curricula. Students should be tested
in a manner similar to the work of the course: In other words, if the work of the
course is writing, then students should write as a means of placement. I believe that
a test is inappropriate unless it leads to the best available treatment or placement
for students (Crusan, 2002b).
It has been suggested that “assessment defines goals and expresses values more
clearly than do any number of mission statements” (White, Lutz, & Kamusikiri,
1996, p. 9). If a university’s definition of writing ability differs from that of
the composition community, its goals will differ as well. When a university de-
fines writing ability as product rather than a non-linear, recursive process, it is
justified in assessing writing ability indirectly. However, whether such assess-
ment is meaningful is a separate question. Experts such as Ferris and
Hedgcock (1998) point out that “an indirect measure might attempt to evaluate writ-
ing ability by testing verbal reasoning, error recognition, or grammatical accuracy,
all of which are related to writing performance in some way but only indirectly
because they do not involve the act of composing” (p. 229). The view that student
mastery of individual, bite-size, discrete-point rules can lead to full mastery of
the language is dangerous if it informs curriculum and leads to decontextualized
“writing” pedagogy (Leki, 1991).
Institutions may cling to indirect assessment techniques because they have a
narrow definition of writing ability. Institutions using only indirect assessment
tools for ESL placement may consider writing to be simply a set of sub-skills such
as vocabulary knowledge, error recognition, and mechanics. If this is the case, it
is not surprising that an indirect assessment is used. However, the assessment of
writing is complex (White, 1994a) and that complexity should not be reduced to
one score on one measure (Bailey, 1998; Belanoff, 1991; Carlson & Bridgeman,
1986; Haswell & Wyche-Smith, 1994; Huot, 1994, 1996). Many institutions cite
cost and ease of administration as justification for continuing to assess ESL students
solely with a discrete, indirect measure. Such a defense is suspect in the face of
evidence that what is being done may not be for the greater good of the students
involved. Is ease of administration
and cost effectiveness for the university enough to justify probable financial and
academic hardships to students? Rose (1989) considers testing to be the supreme
irony. He states that “the very means we use to determine students’ needs — and
the various remedial procedures that derive from them — can wreak profound
harm on our children, usually, but by no means only, those who are already behind
the economic and political eight ball” (p. 127).
A number of researchers have argued that teachers should be involved in testing
(Bachman & Palmer, 1996; Hamp-Lyons, 1996; White, 1996). Kroll and Reid
(1994) contend that test developers should be those who teach the students who
will be tested, since these teachers would be better equipped to know what their
students can do, how they can do it, and what they need, and can, therefore,
better assess their students’ writing. Moreover, White (1996) strongly advocates
that composition faculty become involved in assessment outside the classroom;
he asserts that the measurement community regards the writing community as
“largely irrelevant to serious measurement” (p. 102). However, he also argues that
informed teachers can convince those in power to use and develop better types of
writing assessment procedures once they have begun to discover that assessment
knowledge is power.
Discussing the necessity for all teachers to address the needs of ESL stu-
dents, Zamel (1995) calls upon faculty to be border-crossers, “blurring the bor-
ders within what was once a fairly well-defined and stable academic community,”
(p. 519). In much the same way, I call for more compositionists to become involved
with the writing placement processes in their respective institutions, to cross the
border into the world of assessment and struggle to achieve “a vision of active
learning, creative thinking, and a much needed blend of skills with imagination”
(White, 1996, p. 111) thereby supporting the educational needs of our students.
How many of us could carry on an informed dialogue if asked about the writing
placement test procedures at our colleges and universities? Again, knowledge is
power.
“The form of assessment we choose is always important because as a rhetorical
act, assessment speaks to students, faculty, the wider institution, and the community
about the values and practices of college writing” (Lewiecki-Wilson et al., 2000,
p. 183). This article’s purpose was to encourage ESL and composition specialists
to explore how we might better serve those we teach and test. In an age where we
seem to be cycling back into criticism (White, 2001), we must be cautious that our
students do not get lost — or worse, disappear from the academic scene altogether.
As teachers, we should re-evaluate our commitment to ESL students (Williams,
1995) by becoming more involved in the assessments that affect their lives, helping
our ESL students to carve out a legitimate space for themselves at the university,
a promise made to them by universities touting much-publicized diversity programs.
References
American College Test, Inc., COMPASS/ESL. (2002). Writing placement skills. Retrieved June 4,
2002, from the Website: http://www.act.org/compass/sample/writing.html
Armstrong, W. B. (1995, May). Validating placement tests in the community college: The role of test
scores, biographical data, and grading concerns. Paper presented at the 35th Annual Forum of the
Association for Institutional Research, Boston, MA.
Armstrong, W. B. (2001). Pre-enrollment placement testing and curricular content: Correspondence or
misalignment. Abstract retrieved May 26, 2002, from: ERIC database.
Bachman, L. F. (1990). Fundamental considerations in language testing. Oxford, UK: Oxford Univer-
sity Press.
Bachman, L. F., & Palmer, A. S. (1996). Language testing in practice. Oxford, UK: Oxford University
Press.
Bailey, K. M. (1998). Learning about language assessment: Dilemmas, decisions, and directions.
Boston: Heinle & Heinle.
Belanoff, P. (1991). The myths of assessment. Journal of Basic Writing, 1, 54–66.
Breland, H. M. (1977). Can multiple-choice tests measure writing skills? College Board Review, 103,
32–33.
Breland, H. M. (1983). The direct assessment of writing skill: A measurement review (College Board
Report No. 83-6, ETS RR No. 83-32). New York: College Entrance Examination Board.
Breland, H., Camp, R., Jones, R., Morris, M., & Rock, D. (1987). Assessing writing skill (Research
Monograph No. 11). New York: College Entrance Examination Board.
Brown, H. D. (1994). Principles of language learning and teaching (3rd ed.). Englewood Cliffs, NJ:
Prentice Hall.
Bullock, R. (1991). Autonomy and community in the evaluation of writing. In: R. Bullock & J. Trimbur
(Eds.), The politics of writing instruction: Postsecondary. Portsmouth, NH: Boynton/Cook.
Carlson, S., & Bridgeman, B. (1986). Testing ESL student writers. In: K. L. Greenberg, H. S. Wiener,
& R. A. Donovan (Eds.), Writing assessment: Issues and strategies. New York: Longman.
CCCC Committee on Assessment. (1995). Writing assessment: A position statement. College
Composition and Communication, 46, 430–437.
CCCC Committee on Second Language Writing. (2001). CCCC statement on second-language writing
and writers. College Composition and Communication, 52 (4), 669–674.
Compton, D. M., & Metheny, B. (2000). An assessment of grade inflation in higher education. Percep-
tual and Motor Skills, 90, 527–536.
Conlan, G. (1986). “Objective” measures of writing ability. In: K. L. Greenberg, H. S. Wiener, & R.
A. Donovan (Eds.), Writing assessment: Issues and strategies. New York: Longman.
Courts, P. L., & McInerney, K. H. (1993). Assessment in higher education: Politics, pedagogy, and
portfolios. Westport, CT: Praeger.
Crowley, S. (1995). Composition’s ethic of service, the universal requirement, and the discourse of
student need. Journal of Composition Theory, 15 (2), 227–239.
Crusan, D. (2002a). The marginalization of ESL students through placement exams. Paper presented
at the 36th Annual TESOL Convention and Exhibition, Salt Lake City, UT, April 2002.
Crusan, D. (2002b). The quagmire of assessment for placement: Talking out of both sides of our mouths.
Manuscript submitted for publication.
Crusan, D., & Cornett, C. (2002). The cart before the horse: Teaching assessment criteria before writing.
The International Journal for Teachers of English Writing Skills, 9, 20–33.
Educational Testing Service. (1996). TOEFL: Test of written English guide (4th ed.). Princeton, NJ:
Author.
Edwards, C. H. (2000). Grade inflation: The effects on educational quality and personal well being.
Education, 120, 538–546.
Ferretti, E. (2001). Just a little higher education: Teaching working-class women on the vocational
track. In: B. Alford & K. Kroll (Eds.), The politics of writing in the two-year college (pp. 1–18).
Portsmouth, NH: Heinemann.
Ferris, D., & Hedgcock, J. (1998). Teaching ESL composition: Purpose, process, and practice. Mahwah,
NJ: Lawrence Erlbaum.
Gordon, B. L. (1987). Another look: Standardized tests for placement in college composition courses.
Writing Program Administration, 10 (3), 29–38.
Greenberg, K. (1992). Validity and reliability issues in direct assessment of writing. Writing Program
Administration, 16, 7–22.
Hamp-Lyons, L. (1990). Second language writing: Assessment issues. In: B. Kroll (Ed.), Second
language writing: Research insights for the classroom. Cambridge: Cambridge University Press.
Hamp-Lyons, L. (1996). The challenges of second-language writing assessment. In: E. M. White, W.
D. Lutz, & S. Kamusikiri (Eds.), Assessment of writing: Politics, policies, practices. New York:
The Modern Language Association of America.
Hamp-Lyons, L. (1997). Exploring bias in essay tests. In: C. Severino, J. C. Guerra, & J. E. Butler
(Eds.), Writing in multicultural settings. New York: The Modern Language Association of America.
Hamp-Lyons, L. (2001). Fourth generation writing assessment. In: T. Silva & P. K. Matsuda (Eds.),
On second language writing (pp. 117–127). Mahwah, NJ: Lawrence Erlbaum.
Harrington, S. (1998). New visions of authority in placement test rating. Writing Program Administra-
tion, 22 (1/2), 53–84.
Haswell, R. H. (1998). Searching for Kiyoko: Bettering mandatory ESL writing placement. Journal of
Second Language Writing, 7, 133–174.
Haswell, R. H., & Wyche-Smith, S. (1994). Adventuring into writing assessment. College Composition
and Communication, 45, 220–236.
Henning, G. H. (1987). A guide to language testing: Development, evaluation, research. Cambridge,
MA: Newbury House.
Hudson, S. A. (1982). An empirical investigation of direct and indirect measures of writing. Report of
the 1980–81 Georgia Competency Based Education Writing Assessment Project — 1981. ERIC:
ED #205993.
Huot, B. (1990a). The literature of direct writing assessment: Major concerns and prevailing trends.
Review of Educational Research, 60, 237–263.
Huot, B. (1990b). Reliability, validity, and holistic scoring: What we know and what we need to know.
College Composition and Communication, 41, 201–213.
Huot, B. (1994). A survey of college and university writing placement practices. Writing Program
Administration, 17, 49–65.
Huot, B. (1996). Toward a new theory of writing assessment. College Composition and Communication,
47, 549–566.
Indiana University, South Bend, IN, Office of International Affairs. (2002). The English as a sec-
ond language (ESL) placement test. Retrieved June 5, 2002, from the Website: http://www.
iusb.edu/~abridger/test.htm
Kroll, B., & Reid, J. (1994). Guidelines for designing writing prompts: Clarifications, caveats, and
cautions. Journal of Second Language Writing, 3, 231–255.
Leki, I. (1991). A new approach to advanced ESL placement testing. Writing Program Administration,
14 (3), 53–68.
Lewiecki-Wilson, C., Sommers, J., & Tassoni, J. P. (2000). Rhetoric and the writer’s profile: Prob-
lematizing directed self-placement. Assessing Writing, 7, 165–183.
Matsuda, P. K. (1998). Situating ESL writing in a cross-disciplinary context. Written Communication,
15, 99–121.
McNenny, G. (2001). Writing instruction and the post-remedial university: Setting the scene for the
mainstreaming debate in basic writing. In: G. McNenny & S. H. Fitzgerald (Eds.), Mainstreaming
basic writers: Politics and pedagogies of access (pp. 1–15). Mahwah, NJ: Lawrence Erlbaum.
Michigan State University, Lansing, MI, MSU Testing Office. (2002). English language proficiency
exams. Retrieved June 5, 2002, from the Website: http://www.couns.msu.edu/testing/tests.htm
Moss, P. A. (1994). Validity in high stakes writing assessment: Problems and possibilities. Assessing
Writing, 1, 109–128.
Nagle, B. (1998). A proposal for dealing with grade inflation: The relative performance index. Journal
of Education for Business, 74, 40–43.
Northwestern University, Evanston, IL, Writing Program. (2002). English proficiency. Retrieved June
5, 2002, from the Website: http://www.cas.northwestern.edu/handbook/IV.html#IV.A.2.
O’Malley, J. M., & Valdez Pierce, L. (1996). Authentic assessment for English language learners:
Practical approaches for teachers. New York: Addison-Wesley.
Patkowski, M. S. (1991). Basic skills tests and academic success of ESL college students. TESOL
Quarterly, 25, 735–738.
Polio, C. (1998). Examining the written product in L2 writing research: A taxonomy of measures and
analyses. Paper presented at the Symposium on Second Language Writing, Purdue University, West
Lafayette, IN, September 1998.
Rendon, L. I., & Nora, A. (1994). A synthesis and application of research on Hispanic students in
community colleges. In: J. L. Ratcliff, S. Schwarz, & L. H. Ebbers (Eds.), Community colleges
(2nd ed.). Needham Heights, MA: Simon & Schuster.
Rose, M. (1989). Lives on the boundary: A moving account of the struggles and achievements of
America’s educationally underprepared. New York: Penguin Books.
Shor, I. (1997). Our apartheid: Writing instruction and inequality. Journal of Basic Writing, 16, 91–104.
Silva, T. (1994). An examination of writing program administrators’ options for the placement of ESL
students in first year writing classes. Writing Program Administration, 18 (1/2), 37–43.
Spolsky, B. (1997). The ethics of gatekeeping tests: What have we learned in a hundred years? Language
Testing, 14, 242–247.
Stiggins, R. J. (1982). A comparison of direct and indirect writing assessment methods. Research in
the Teaching of English, 16, 101–114.
Tedick, D. J. (1990). ESL writing assessment: Subject-matter knowledge and its impact on performance.
English for Specific Purposes, 9, 123–143.
The Ohio State University, Columbus, OH, English as a Second Language Composition Program.
(2002). Placement. Retrieved June 5, 2002, from the Website: http://www.esl.ohio-state.edu/Comp/Placement_Information.html
The Pennsylvania State University, University Park, PA, First-Year Testing, Counseling and Ad-
vising Program (FTCAP). (2002). English test. Retrieved June 5, 2002, from the Website:
http://www.psu.edu/dus/ftcap/ftcoverv.htm
University of Illinois, Urbana-Champaign, IL, Division of English as an International Language.
(2002). The ESL placement test (EPT). Retrieved June 5, 2002, from the Website: http://www.
deil.uiuc.edu/esl.service/EPT.html
University of Iowa, Iowa City, IA, The English as a Second Language Program. (2002). English
as a second language credit classes. Retrieved June 5, 2002, from the Website: http://www.
uiowa.edu/%7Eiiepesl/ESL/eslindex.html
University of Michigan, Ann Arbor, MI, English Language Institute. (2002). The Academic English
Evaluation (AEE) Schedule and Information Sheet for Winter, Spring, and Summer Terms, 2003.
Retrieved June 5, 2002, from the Website: http://websvcs.itd.umich.edu/eli-bin/main
University of Minnesota, Minneapolis, MN. (2002). Test scores. Retrieved June 5, 2002, from the
Website: http://www1.umn.edu/twincities/
University of Wisconsin, Madison, WI, Center for Placement Testing. (2002). English placement
test. Retrieved June 5, 2002, from the Website: http://wiscinfo.doit.wisc.edu/exams/english_placement_test.htm
White, E. M. (1985). Teaching and assessing writing: Recent advances in understanding, evaluating,
and improving student performance. San Francisco: Jossey-Bass.
White, E. M. (1986). Pitfalls in the testing of writing. In: K. L. Greenberg, H. S. Wiener, & R. A.
Donovan (Eds.), Writing assessment: Issues and strategies. New York: Longman.
White, E. M. (1990). Language and reality in writing assessment. College Composition and Commu-
nication, 41, 187–200.
White, E. M. (1994a). Issues and problems in writing assessment. Assessing Writing, 1, 11–27.
White, E. M. (1994b). Teaching and assessing writing (2nd ed.). San Francisco: Jossey-Bass.
White, E. M. (1996). Writing assessment beyond the classroom. In: L. Z. Bloom, D. D. Daiker, &
E. M. White (Eds.), Composition in the twenty-first century: Crisis and change (pp. 101–111).
Carbondale, IL: Southern Illinois University Press.
White, E. M. (2001). Revisiting the importance of placement and basic studies: Evidence of success.
In: G. McNenny & S. H. Fitzgerald (Eds.), Mainstreaming basic writers: Politics and pedagogies
of access (pp. 19–28). Mahwah, NJ: Lawrence Erlbaum.
White, E. M., Lutz, W. D., & Kamusikiri, S. (1996). Assessment of writing: Politics, policies, practices.
New York: The Modern Language Association of America.
Williams, J. (1995). ESL composition program administration in the United States. Journal of Second
Language Writing, 4, 157–179.
Williamson, M. (1994). The worship of efficiency: Untangling theoretical and practical considerations
in writing assessment. Assessing Writing, 1, 147–193.
Wolcott, W., & Legg, S. M. (1998). An overview of writing assessment: Theory, research, and practice.
Urbana, IL: National Council of Teachers of English.
Yancey, K. B. (1999). Looking back as we look forward: Historicizing writing assessment. College
Composition and Communication, 50, 483–503.
Zamel, V. (1995). Strangers in academia: The experiences of faculty and ESL students across the
curriculum. College Composition and Communication, 46, 506–521.
Zirkel, P. A. (1999). Grade inflation: A leadership opportunity for schools of education? Teachers
College Record, 101, 247–260.

 
College Essay Diagnostic Essay Assignment
College Essay Diagnostic Essay AssignmentCollege Essay Diagnostic Essay Assignment
College Essay Diagnostic Essay Assignment
 
Stationery Paper Free Stock Photo Free Printable Stati
Stationery Paper Free Stock Photo  Free Printable StatiStationery Paper Free Stock Photo  Free Printable Stati
Stationery Paper Free Stock Photo Free Printable Stati
 
Critical Analysis Sample. 7 Critical Analysis Examples
Critical Analysis Sample. 7 Critical Analysis ExamplesCritical Analysis Sample. 7 Critical Analysis Examples
Critical Analysis Sample. 7 Critical Analysis Examples
 

Recently uploaded

How to setup Pycharm environment for Odoo 17.pptx
How to setup Pycharm environment for Odoo 17.pptxHow to setup Pycharm environment for Odoo 17.pptx
How to setup Pycharm environment for Odoo 17.pptxCeline George
 
Understanding Accommodations and Modifications
Understanding  Accommodations and ModificationsUnderstanding  Accommodations and Modifications
Understanding Accommodations and ModificationsMJDuyan
 
How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17Celine George
 
Python Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxPython Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxRamakrishna Reddy Bijjam
 
Spellings Wk 4 and Wk 5 for Grade 4 at CAPS
Spellings Wk 4 and Wk 5 for Grade 4 at CAPSSpellings Wk 4 and Wk 5 for Grade 4 at CAPS
Spellings Wk 4 and Wk 5 for Grade 4 at CAPSAnaAcapella
 
Details on CBSE Compartment Exam.pptx1111
Details on CBSE Compartment Exam.pptx1111Details on CBSE Compartment Exam.pptx1111
Details on CBSE Compartment Exam.pptx1111GangaMaiya1
 
Model Attribute _rec_name in the Odoo 17
Model Attribute _rec_name in the Odoo 17Model Attribute _rec_name in the Odoo 17
Model Attribute _rec_name in the Odoo 17Celine George
 
UGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdf
UGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdfUGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdf
UGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdfNirmal Dwivedi
 
What is 3 Way Matching Process in Odoo 17.pptx
What is 3 Way Matching Process in Odoo 17.pptxWhat is 3 Way Matching Process in Odoo 17.pptx
What is 3 Way Matching Process in Odoo 17.pptxCeline George
 
How to Add a Tool Tip to a Field in Odoo 17
How to Add a Tool Tip to a Field in Odoo 17How to Add a Tool Tip to a Field in Odoo 17
How to Add a Tool Tip to a Field in Odoo 17Celine George
 
Introduction to TechSoup’s Digital Marketing Services and Use Cases
Introduction to TechSoup’s Digital Marketing  Services and Use CasesIntroduction to TechSoup’s Digital Marketing  Services and Use Cases
Introduction to TechSoup’s Digital Marketing Services and Use CasesTechSoup
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxheathfieldcps1
 
Jamworks pilot and AI at Jisc (20/03/2024)
Jamworks pilot and AI at Jisc (20/03/2024)Jamworks pilot and AI at Jisc (20/03/2024)
Jamworks pilot and AI at Jisc (20/03/2024)Jisc
 
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptxHMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptxmarlenawright1
 
Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Jisc
 
Interdisciplinary_Insights_Data_Collection_Methods.pptx
Interdisciplinary_Insights_Data_Collection_Methods.pptxInterdisciplinary_Insights_Data_Collection_Methods.pptx
Interdisciplinary_Insights_Data_Collection_Methods.pptxPooja Bhuva
 
80 ĐỀ THI THỏ TUYỂN SINH TIáșŸNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỏ TUYỂN SINH TIáșŸNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...80 ĐỀ THI THỏ TUYỂN SINH TIáșŸNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỏ TUYỂN SINH TIáșŸNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...Nguyen Thanh Tu Collection
 
OSCM Unit 2_Operations Processes & Systems
OSCM Unit 2_Operations Processes & SystemsOSCM Unit 2_Operations Processes & Systems
OSCM Unit 2_Operations Processes & SystemsSandeep D Chaudhary
 

Recently uploaded (20)

How to setup Pycharm environment for Odoo 17.pptx
How to setup Pycharm environment for Odoo 17.pptxHow to setup Pycharm environment for Odoo 17.pptx
How to setup Pycharm environment for Odoo 17.pptx
 
Understanding Accommodations and Modifications
Understanding  Accommodations and ModificationsUnderstanding  Accommodations and Modifications
Understanding Accommodations and Modifications
 
How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17How to Create and Manage Wizard in Odoo 17
How to Create and Manage Wizard in Odoo 17
 
Python Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docxPython Notes for mca i year students osmania university.docx
Python Notes for mca i year students osmania university.docx
 
Spellings Wk 4 and Wk 5 for Grade 4 at CAPS
Spellings Wk 4 and Wk 5 for Grade 4 at CAPSSpellings Wk 4 and Wk 5 for Grade 4 at CAPS
Spellings Wk 4 and Wk 5 for Grade 4 at CAPS
 
Details on CBSE Compartment Exam.pptx1111
Details on CBSE Compartment Exam.pptx1111Details on CBSE Compartment Exam.pptx1111
Details on CBSE Compartment Exam.pptx1111
 
Model Attribute _rec_name in the Odoo 17
Model Attribute _rec_name in the Odoo 17Model Attribute _rec_name in the Odoo 17
Model Attribute _rec_name in the Odoo 17
 
UGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdf
UGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdfUGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdf
UGC NET Paper 1 Unit 7 DATA INTERPRETATION.pdf
 
What is 3 Way Matching Process in Odoo 17.pptx
What is 3 Way Matching Process in Odoo 17.pptxWhat is 3 Way Matching Process in Odoo 17.pptx
What is 3 Way Matching Process in Odoo 17.pptx
 
Our Environment Class 10 Science Notes pdf
Our Environment Class 10 Science Notes pdfOur Environment Class 10 Science Notes pdf
Our Environment Class 10 Science Notes pdf
 
VAMOS CUIDAR DO NOSSO PLANETA! .
VAMOS CUIDAR DO NOSSO PLANETA!                    .VAMOS CUIDAR DO NOSSO PLANETA!                    .
VAMOS CUIDAR DO NOSSO PLANETA! .
 
How to Add a Tool Tip to a Field in Odoo 17
How to Add a Tool Tip to a Field in Odoo 17How to Add a Tool Tip to a Field in Odoo 17
How to Add a Tool Tip to a Field in Odoo 17
 
Introduction to TechSoup’s Digital Marketing Services and Use Cases
Introduction to TechSoup’s Digital Marketing  Services and Use CasesIntroduction to TechSoup’s Digital Marketing  Services and Use Cases
Introduction to TechSoup’s Digital Marketing Services and Use Cases
 
The basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptxThe basics of sentences session 3pptx.pptx
The basics of sentences session 3pptx.pptx
 
Jamworks pilot and AI at Jisc (20/03/2024)
Jamworks pilot and AI at Jisc (20/03/2024)Jamworks pilot and AI at Jisc (20/03/2024)
Jamworks pilot and AI at Jisc (20/03/2024)
 
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptxHMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
HMCS Vancouver Pre-Deployment Brief - May 2024 (Web Version).pptx
 
Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)Accessible Digital Futures project (20/03/2024)
Accessible Digital Futures project (20/03/2024)
 
Interdisciplinary_Insights_Data_Collection_Methods.pptx
Interdisciplinary_Insights_Data_Collection_Methods.pptxInterdisciplinary_Insights_Data_Collection_Methods.pptx
Interdisciplinary_Insights_Data_Collection_Methods.pptx
 
80 ĐỀ THI THỏ TUYỂN SINH TIáșŸNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỏ TUYỂN SINH TIáșŸNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...80 ĐỀ THI THỏ TUYỂN SINH TIáșŸNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
80 ĐỀ THI THỏ TUYỂN SINH TIáșŸNG ANH VÀO 10 SỞ GD – ĐT THÀNH PHỐ HỒ CHÍ MINH NĂ...
 
OSCM Unit 2_Operations Processes & Systems
OSCM Unit 2_Operations Processes & SystemsOSCM Unit 2_Operations Processes & Systems
OSCM Unit 2_Operations Processes & Systems
 

An Assessment Of ESL Writing Placement Assessment

  • 1. Assessing Writing 8 (2002) 17–30 An assessment of ESL writing placement assessment Deborah Crusan∗ Department of English Language and Literatures, Wright State University, 3640 Colonel Glenn Highway, Dayton, OH 45435, USA Abstract What are the best assessment practices for ESL and international students who must be placed into first-year composition courses at both two- and four-year institutions in the United States? In this article, I examine this issue in three ways. First, I enumerate the stances in the literature; following that I submit results of preliminary research which ques- tions modes of writing assessment and their relationship to final grades in composition classes. Finally, I present results of an internet search of one set of prominent American institutions’ placement practices. From this data, it might be inferred that we use multi- ple instruments to place ESL students into composition classes rather than resorting to a multiple-choice test (especially a standardized instrument) as the sole means of determining placement. My primary purposes in this paper are to argue that second language composition special- ists need to examine our placement practices and aim for a reconciliation of these practices with our classroom pedagogies. Further, if we are not involved in placement decisions at our various institutions, we must strive to be included in important decisions concerning our students and to be advocates for second language writers. © 2002 Elsevier Science Inc. All rights reserved. Keywords: Assessment; Reconciliation; Second language Writing assessment has always been problematic. The importance of respon- sible assessment practices cannot be overstated, particularly when the assessment measures influence critical decisions such as placement into composition classes, ∗ Tel.: +1-937-775-2846. E-mail address: deborah.crusan@wright.edu (D. Crusan). 1075-2935/02/$ – see front matter © 2002 Elsevier Science Inc. All rights reserved. PII: S1075-2935(02)00028-4
(Williamson, 1994), the inability to measure skills teachers consider important (Hamp-Lyons, 2001), and myriad other concerns (Hamp-Lyons, 1996; Shor, 1997).

Some researchers (Hamp-Lyons, 1990; Huot, 1990a) have claimed that indirect assessment of writing by means of a multiple-choice instrument is now a thing of the past. Hamp-Lyons (1990) states: “20 years or so ago, many if not most people in North America (to a lesser extent in Great Britain and Australia) believed that writing could be validly tested by an indirect test of writing. As we enter the 1990s, however, they have not only been defeated but also chased from the battlefield” (p. 69). Unfortunately, even at the current time, the opposite may be closer to the truth. White (1996) cautions: “Anyone with a role in writing assessment must keep in mind the multiple-choice specter that hovers just offstage; no stake has ever been driven through its heart to keep it in its coffin, however much it may be wounded” (p. 109).

In this paper, I review recent literature defining placement testing. I also define direct and indirect assessment of writing, and I cite inherent problems with both direct and indirect methods. Further, I summarize published placement practices at Big Ten universities, confirming the assertion that, despite the contentions of many in the composition assessment community, indirect testing as the sole measure of placement of ESL students into writing courses is alive and well at some of the most prestigious universities in America. Finally, I summarize a study conducted at The Pennsylvania State University which compares student performance on an essay and a multiple-choice grammar test and correlates them with students’ final grades. I argue that collectively, these studies offer evidence that, while we know what we should do, we often do not do it. Our reasons are legion: cost, speed, practicality and efficiency, validity, and reliability.
1. What is a placement test?

Leki (1991) defines a placement exam as a test that pairs a student with an appropriate course. At two- and four-year institutions in the United States, placement tests are used to measure readiness for college-level course work and help faculty decide whether incoming students should enroll in remedial or introductory courses (Patkowski, 1991). They are not proficiency tests, but rather tools which allow academic advisers to place students in the level of coursework best suited to their preparation and skills. Institutions regularly employ scores obtained from a variety of tests (ACT, SAT, Test of English as a Foreign Language (TOEFL), locally designed essay tests, locally designed measures of the sub-skills of writing) in order to make admission and placement decisions for students, including students for whom English is a second language.

2. Direct and indirect testing

An essay test is a direct assessment of writing ability, an integrative test with an emphasis on communication, authenticity, and context. It attempts to test knowledge of language as a whole, not the individual components of language. Although the constraints of timed essays have been well noted in the literature (Crowley, 1995; Shor, 1997), the advantage of essays is that they are able to gauge the ability of students to identify and analyze problems, to identify audience and purpose, to argue, describe, and define, skills that are valued in composition classes in the United States (Crusan & Cornett, 2002; Ferretti, 2001; White, 1994b). Indirect assessments are also used to assess writing ability. “Discrete-point tests are constructed on the assumption that language can be broken down into its component parts and each of those parts adequately tested. Those components are basically the four skills (listening, speaking, reading, and writing), the various hierarchical units of language (phonology/graphology, morphology, lexicon, syntax) within each skill, and the sub-categories within those units” (Brown, 1994, p. 262). An example of an indirect test would be a multiple-choice proficiency test divided into grammar, reading comprehension, vocabulary, and “writing” which attempts to isolate and evaluate knowledge of specific components of language.

Discussion in the literature abounds regarding the many problems surrounding writing assessment — reliability and validity issues, rater training, holistic scoring, and whether indirect or direct methods should be used for placement, proficiency, and achievement; however, little quantitative evidence exists to support the positions taken in the literature (Hamp-Lyons, 1997). Polio (1998) points out that relatively few studies address actual empirical research into ethical assessment for placement of ESL composition students. Matsuda (1998) identifies an even more serious problem, that “despite the growth of the ESL population, there has not been a corresponding increase in the amount of attention given to ESL students in many writing programs” (p. 99). Huot (1994) found a similar situation in mainstream
composition, commenting, “It is alarming to note the dearth of research and theory in writing assessment and the lack of qualified personnel who direct writing placement programs” (p. 61).

While there may be university administrators who believe in the value (mainly in terms of efficiency and cost effectiveness) of indirect measures, over the past few decades few writing specialists have publicly argued this, in print or at conferences (Crusan, 2002a). Some writing assessment professionals operate on the assumption that using at least a one-shot writing sample coupled with a multiple-choice instrument is the most prevalent placement procedure used in US institutions. Haswell (1998) has been unusual in recognizing the possibility of university use of a multiple-choice test as the sole instrument for placement into writing courses. Other than Haswell, it seems that neither native English speaker (L1) nor non-native English speaker (L2) composition specialists are concerned about finding quantitative evidence to settle the question of whether a direct or indirect measure is better at assessing writing in the context of college placement. The tacit assumption is that the debate over the merits of direct versus indirect testing has been resolved and the issue buried. Unfortunately, this issue has many lives, as demonstrated by the proportion of institutions still using an indirect test as their primary mode of assessment for placement into writing courses.

In a study of 100 private and public colleges and universities in the United States selected for their “potentially significant second language populations” (Williams, 1995, p. 158), the researcher found that 32% of institutions reported the use of an institutionally administered indirect test; 19% used a standardized test combined with an essay; 23% an essay alone; and 26% relied on TOEFL scores for placement in ESL composition courses. If we combine the percentages of those reporting use of an indirect measure alone (the 32% using institutionally administered indirect tests plus the 26% relying on the TOEFL), we find that 58% of the institutions in her study used an indirect measure as the sole means to assess writing for placement.

More recently, Lewiecki-Wilson, Sommers, and Tassoni (2000) stated, “On the grounds of expediency, many two-year institutions have turned to computerized editing tests such as COMPASS for placing entering students into writing courses, even though such tests do not directly measure writing” (p. 165). COMPASS is a computerized assessment in which students find and correct errors in essays on a computer screen. According to ACT (2002), the Writing Skills Placement Test helps to determine placement into either entry-level or developmental writing classes and assesses students’ knowledge and skills in eight writing skills areas: usage/mechanics, rhetorical skills, punctuation, basic grammar and usage, sentence structure, strategy, organization, and style. Extending Yancey’s (1999) metaphor likening the history of writing assessment to three waves — from objective tests to holistically scored essays to portfolio assessment — Lewiecki-Wilson et al. suggest that there is, especially at two-year institutions, evidence of a new “undertow — a backward-moving current” (p. 166) returning to indirect, objective testing for placing students.

Writing assessment often seems to be carried out with little thought for who is being tested and how these persons might be affected by the results of the tests. I feel
strongly that, as educators, we must be more knowledgeable about assessment and aid administrators and colleagues to realize the pedagogical, social, and political implications of the tests we administer to our students. Drawing on Haswell’s (1998) contention that “most schools use indirect testing through a standardized exam” (p. 136), in the remainder of this paper, I explore the contradiction between what composition specialists believe about assessment for placement and what is actually done. As I will outline, indirect measures of writing ability continue to be accepted as the sole means of writing placement at many academic institutions in the United States.

3. Problems with the tests — or — who says which is best and why

Researchers (Bachman, 1990; Bailey, 1998; Conlan, 1986; White, 1985) have documented problems inherent in using a multiple-choice test as an assessment of writing ability. Some have found that students, when taking multiple-choice tests, have a “penchant for guessing, especially those who are not well-prepared to take the examination or those who are more impulsive than reflective in test-taking behavior” (Henning, 1987, p. 32). The flaw, it seems, is using a grammar test as the only means to measure writing ability. Grammar is one of the components of language and is generally more appropriately tested through its actual use in one or more of the four skills (reading, writing, speaking, and listening) rather than in isolation (Bailey, 1998). A score on a single multiple-choice instrument may not provide sufficient depth of information.

ESL writers might well be marginalized by indirect tests. Haswell (1998) asserts: “Indirect testing may be less valid for international students since they often show a larger gap between the skill and habits needed to write extended essays and the facility with the kind of surface features measured in test items” (p. 136). Some ESL students, because they generally have memorized grammar rules very well, tend to score very high on tests like the TOEFL. Consequently, it has been claimed that multiple-choice tests overpredict minority students’ performance on essay tests (Bailey, 1998; O’Malley & Valdez Pierce, 1996).

However, direct assessment of writing ability poses a different set of challenges. While numerous writing experts value the perceived gain in authenticity that timed writing tests represent in assessing writing competence, others find fault with their reliability and validity as a measure of writing ability (McNenny, 2001). Many ESL writers, unsure of the structures of the language, have great difficulty producing fluent written discourse, especially in timed situations. This, coupled with “a simple lack of motor fluency in writing English script” (Leki, 1991, p. 55), causes the second language writer to write much more slowly than a native speaker, producing fewer words over a longer period of time. Hence, timed writings might impose a handicap on many ESL students.

For any student, another problem presented by timed writing tests is that they are given under artificial conditions: students must compose on an assigned topic,
and they are most often not allowed to access any reference material such as dictionaries. Many students find it difficult to write “cold” on a topic they might never have seen before and perhaps care nothing about or, even worse, know nothing about (Huot, 1990a; Leki, 1991). Further, in the timed essay test there is no time for revision or any other process approaches (Wolcott & Legg, 1998).

The relative merits of both direct and indirect measures were carefully scrutinized by Breland (1977), whose study concluded that a multiple-choice test combined with an essay can be the most educationally sound solution to administrative problems in placement. Breland (1983) later presented evidence of the validity of direct assessment over and above other available measures; however, he cites cost and reliability as two obstacles and recommends consideration of automated textual analysis, as well as production and reporting of more than a single score, as ways to improve direct assessment. Breland, Camp, Jones, Morris, and Rock (1987) studied the differences between essay and multiple-choice instruments for assessing writing and once more suggested the multiple-choice/essay test combination in an effort to reduce the cost of assessment and increase its reliability and validity. Hudson (1982) drew similar conclusions and recommended either a holistically- or analytically-scored essay combined with a carefully chosen objective test.

In an effort to “discover how prevalent placement testing was and what forms it took” (p. 49), Huot (1994) surveyed 1,080 institutions, both two- and four-year US colleges and universities on the MLA list of English Chairpersons. He calls this initial study “tentative and exploratory” (p. 59) and encourages more research in this area. He found that only one-half of the institutions surveyed used any kind of direct assessment for placement of students into composition classes. In other words, almost 50% of institutions were still using some form of indirect assessment, a problem lamented by a number of composition theorists (Huot, 1990b; Shor, 1997; Tedick, 1990; White, 1990). Further, 54% of the institutions that reported using some form of direct assessment “employed writing criteria developed outside the university” (p. 60).

3.1. Placement in the ESL context

Because Huot’s data did not include ESL program data, I was led to gather my own data as well as to conduct an internet search of US institutions of higher education — both two- and four-year institutions. Preliminary findings are reported in the following sections.

First, I compared a grammar test to an essay test to determine which was a better predictor of final grade in an ESL composition course. Six composition instructors and 124 ESL students took part in a study at The Pennsylvania State University. The students were both basic and freshman level writers enrolled in “sheltered” (ESL only) classes. They ranged in age from 18 to 40; 65% were male and 35% female; and they represented 26 different languages, of which Spanish (25%) and Chinese (21%) were the most prevalent, followed by Korean (12%). The students were of varying levels of writing ability and were initially placed in classes by means of a score on a university-administered indirect assessment taken just prior to enrollment as freshmen.

There were two independent variables in the study. The first was a direct measure written on the first day of class on one of three assigned topics. The second independent variable, an indirect measure based on disclosed copies of the TOEFL, included 40 questions and was divided into four sub-skills — word order, error identification, vocabulary/expression, and reading comprehension. The reliability of this measure was .95. This test provided objective measure scores in lieu of the university-administered placement test scores which, despite repeated requests, were not made available for this study. The dependent variable, final grade, was reported to me by composition teachers at the end of the semester.

The six instructors were trained and used the Test of Written English (TWE) scoring guide to score the essays, because it was initially designed to measure writing proficiency rather than growth in the skill (Educational Testing Service, 1996). In case of a discrepancy of more than two points, we used a third rater’s score to adjudicate. The inter-rater reliability was .92. Two English as a Second Language instructors hand-scored the grammar tests.
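The adjudication procedure described above can be stated precisely as a scoring rule. The following is a minimal sketch, in Python, of one form it might take; the text does not specify how the third rating was combined with the original two, so the averaging scheme below is an assumption, and all scores in the example calls are hypothetical.

def combine_ratings(rater1, rater2, third_rater=None):
    """Combine two holistic essay ratings (TWE-style 0-6 scale).

    Mirrors the procedure described above: when the two independent
    ratings differ by more than two points, a third rater adjudicates.
    The text does not say how the third score is folded in, so this
    sketch assumes it is averaged with the closer of the two originals.
    """
    if abs(rater1 - rater2) <= 2:
        return (rater1 + rater2) / 2      # ordinary case: average the two ratings
    if third_rater is None:
        raise ValueError("discrepancy exceeds two points; a third rating is needed")
    closer = min((rater1, rater2), key=lambda r: abs(r - third_rater))
    return (closer + third_rater) / 2     # assumed adjudication: side with the nearer rating

print(combine_ratings(4, 5))       # -> 4.5
print(combine_ratings(2, 5, 4))    # -> 4.5 (the third rater effectively sides with the 5)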
The most interesting findings in the data are the correlations between the dependent and independent variables. Both the direct and indirect assessments positively and significantly correlated with final grade, although the placement essay had a stronger correlation with final grade. The correlation between the grammar test and final grade was the lowest but still significant (.190, P < .01). The correlation between the placement essay and final grade was higher and more significant (.260, P < .001). It is important to note here that scholars have issued caveats when employing final grades as variables. Armstrong (1995) argues that any model that uses final grades as the basis for the validity of a placement would likely fail to account for a significant source of variation, that of instructor grading variability. Further, studies of the presence of grade inflation (see Compton & Metheny, 2000; Edwards, 2000; Nagle, 1998; Zirkel, 1999 for several perspectives on this matter) present reasons for care and circumspection when utilizing such data.

Possibly of more significance, both statistically and pragmatically, is the correlation between the two independent variables, the grammar test and the placement essay (.327, P < .0001), supporting the use of a combination of direct and indirect measures as a means to place our ESL writers into composition courses. Despite the fact that this correlation is derived from a relatively small sample, it supports the use of multiple assessment instruments, rather than one or the other, in the context of ESL student placement.
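For readers who wish to see the shape of this analysis concretely, the sketch below recomputes pairwise Pearson correlations in Python with scipy. The scores are fabricated placeholders (the study’s raw data are not reproduced in this article); only the method, not the numbers, mirrors the analysis reported above.

# A minimal sketch of the reported analysis: pairwise Pearson correlations
# among essay score, grammar-test score, and final grade. The data below
# are fabricated placeholders; only the method mirrors the study.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(0)
n = 124                                        # sample size reported in the study
essay = rng.normal(4.0, 1.0, n)                # TWE-scale essay scores (hypothetical)
grammar = 0.3 * essay + rng.normal(0, 1.0, n)  # 40-item grammar test (hypothetical)
final_grade = 0.4 * essay + 0.2 * grammar + rng.normal(0, 1.0, n)

for label, x, y in [("grammar vs. final grade", grammar, final_grade),
                    ("essay vs. final grade", essay, final_grade),
                    ("grammar vs. essay", grammar, essay)]:
    r, p = pearsonr(x, y)
    print(f"{label}: r = {r:.3f}, P = {p:.4f}")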
The second data set represents findings from institutional websites at some of the largest universities in the US on the instruments used for initial placement of ESL students.

[Table 1, a summary of the ESL writing placement instruments reported on Big Ten universities’ websites, appeared here in the original; the table itself is not reproduced. See the institutional entries in the References for the underlying sources.]
Table 1 clearly shows that at least three of the largest and most prestigious universities (Penn State, Purdue, and Wisconsin) currently use only a multiple-choice instrument to place ESL students into composition courses. In separate internet searches, I found that a great many two-year colleges — which are often the educational entry point for many immigrant ESL students, particularly Hispanics (Rendon & Nora, 1994) — continue to use indirect instruments as the sole means of placement into composition courses.

3.2. Placement and its importance for ESL students

Assessment for placement cannot be trivialized and weighs heavily on writing program administrators (Silva, 1994). So important is placement that the CCCC Committee on Second Language Writing in its Statement on Second-Language Writing and Writers (2001) states: “Decisions regarding the placement of second language writers into writing courses should be based on students’ writing proficiency” and not “based solely on the scores from standardized tests of general language proficiency or of spoken language proficiency. Instead, scores from the direct assessment of students’ writing proficiency should be used, and multiple writing samples should be consulted whenever possible” (pp. 670–671).

Armstrong (2001) recommends that institutions’ placement tests align with course content. Likewise, Bachman and Palmer (1996) and Moss (1994) advise that writing assessments should be linked with curricula. Students should be tested in a manner similar to the work of the course: in other words, if the work of the course is writing, then students should write as a means of placement. I believe that a test is inappropriate unless it leads to the best available treatment or placement for students (Crusan, 2002b).
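Taken together, these recommendations imply a placement rule in which no single indirect score decides a student’s placement. The sketch below is one hypothetical form such a rule could take; the cutoffs, weights, and course labels are invented for illustration and are not drawn from any institution or policy document cited here. Weighting the essay more heavily echoes the correlation findings reported above, and a sharp disagreement between the two measures triggers a request for a further writing sample, in the spirit of the CCCC statement.

# A hypothetical multiple-measures placement rule: a direct and an indirect
# measure are considered together, and neither decides placement alone.
# All cutoffs, weights, and course labels are invented for illustration.
def place_student(essay_score, objective_score):
    """Place an ESL writer using two measures together.

    essay_score: holistic rating of a writing sample (assumed 0-6 scale)
    objective_score: multiple-choice sub-skills test (assumed 0-40 scale)
    """
    essay_norm = essay_score / 6
    objective_norm = objective_score / 40
    if abs(essay_norm - objective_norm) > 0.35:
        # the measures disagree sharply; consult another writing sample
        return "refer for an additional writing sample"
    composite = 0.7 * essay_norm + 0.3 * objective_norm   # essay weighted more heavily
    if composite < 0.45:
        return "basic writing (ESL section)"
    if composite < 0.70:
        return "first-year composition (ESL section)"
    return "first-year composition (mainstream section)"

print(place_student(5, 30))   # -> first-year composition (mainstream section)
print(place_student(3, 38))   # a high grammar score cannot outweigh a weak essay: referred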
It has been suggested that “assessment defines goals and expresses values more clearly than do any number of mission statements” (White, Lutz, & Kamusikiri, 1996, p. 9). If a university’s definition of writing ability differs from that of the composition community, its goals will differ as well. When a university defines writing ability as product rather than as a non-linear, recursive process, it is justified in assessing writing ability indirectly. However, whether such assessment is meaningful is a separate question. Continually, experts such as Ferris and Hedgcock (1998) point out that “an indirect measure might attempt to evaluate writing ability by testing verbal reasoning, error recognition, or grammatical accuracy, all of which are related to writing performance in some way but only indirectly because they do not involve the act of composing” (p. 229). The view that student mastery of individual, bite-size, discrete-point rules can lead to full mastery of the language is dangerous if it informs curriculum and leads to decontextualized “writing” pedagogy (Leki, 1991).

Institutions may cling to indirect assessment techniques because they have a narrow definition of writing ability. Institutions using only indirect assessment tools for ESL placement may consider writing to be simply a set of sub-skills such as vocabulary knowledge, error recognition, and mechanics. If this is the case, it is not surprising that an indirect assessment is used. However, the assessment of writing is complex (White, 1994a), and that complexity should not be reduced to one score on one measure (Bailey, 1998; Belanoff, 1991; Carlson & Bridgeman, 1986; Haswell & Wyche-Smith, 1994; Huot, 1994, 1996). Many institutions cite cost and ease of administration as justification for continuing to assess ESL students solely with a discrete, indirect measure. Such defense of indirect assessment, even in the face of information which indicates that what is being done may not be for the greater good of the students involved, is suspect. Are ease of administration and cost effectiveness for the university enough to justify probable financial and academic hardships to students? Rose (1989) considers testing to be the supreme irony. He states that “the very means we use to determine students’ needs — and the various remedial procedures that derive from them — can wreak profound harm on our children, usually, but by no means only, those who are already behind the economic and political eight ball” (p. 127).

A number of researchers have argued that teachers should be involved in testing (Bachman & Palmer, 1996; Hamp-Lyons, 1996; White, 1996). Kroll and Reid (1994) contend that test developers should be those who teach the students who will be tested, since these teachers would be better equipped to know what their students can do, how they can do it, and what they need, and can, therefore, better assess their students’ writing. Moreover, White (1996) strongly advocates that composition faculty become involved in assessment outside the classroom; he asserts that the measurement community regards the writing community as “largely irrelevant to serious measurement” (p. 102). However, he also argues that informed teachers can convince those in power to use and develop better types of writing assessment procedures once they have begun to discover that assessment knowledge is power.

Discussing the necessity for all teachers to address the needs of ESL students, Zamel (1995) calls upon faculty to be border-crossers, “blurring the borders within what was once a fairly well-defined and stable academic community” (p. 519). In much the same way, I call for more compositionists to become involved with the writing placement processes in their respective institutions, to cross the border into the world of assessment and struggle to achieve “a vision of active learning, creative thinking, and a much needed blend of skills with imagination” (White, 1996, p. 111), thereby supporting the educational needs of our students. How many of us could carry on an informed dialogue if asked about the writing placement test procedures at our colleges and universities? Again, knowledge is power.

“The form of assessment we choose is always important because as a rhetorical act, assessment speaks to students, faculty, the wider institution, and the community about the values and practices of college writing” (Lewiecki-Wilson et al., 2000, p. 183). This article’s purpose was to encourage ESL and composition specialists to explore how we might better serve those we teach and test. In an age where we seem to be cycling back into criticism (White, 2001), we must be cautious that our students do not get lost — or worse, disappear from the academic scene altogether.
As teachers, we should re-evaluate our commitment to ESL students (Williams, 1995) by becoming more involved in the assessments that affect their lives, helping our ESL students to carve out a legitimate space for themselves at the university, a promise made to them by universities when touting much publicized diversity programs.

References

American College Test, Inc., COMPASS/ESL. (2002). Writing placement skills. Retrieved June 4, 2002, from the Website: http://www.act.org/compass/sample/writing.html
Armstrong, W. B. (1995, May). Validating placement tests in the community college: The role of test scores, biographical data, and grading concerns. Paper presented at the 35th Annual Forum of the Association for Institutional Research, Boston, MA.
Armstrong, W. B. (2001). Pre-enrollment placement testing and curricular content: Correspondence or misalignment. Abstract retrieved May 26, 2002, from: ERIC database.
Bachman, L. F. (1990). Fundamental considerations in language testing. Oxford, UK: Oxford University Press.
Bachman, L. F., & Palmer, A. S. (1996). Language testing in practice. Oxford, UK: Oxford University Press.
Bailey, K. M. (1998). Learning about language assessment: Dilemmas, decisions, and directions. Boston: Heinle & Heinle.
Belanoff, P. (1991). The myths of assessment. Journal of Basic Writing, 1, 54–66.
Breland, H. M. (1977). Can multiple-choice tests measure writing skills? College Board Review, 103, 32–33.
Breland, H. M. (1983). The direct assessment of writing skill: A measurement review (College Board Report No. 83-6, ETS RR No. 83-32). New York: College Entrance Examination Board.
Breland, H., Camp, R., Jones, R., Morris, M., & Rock, D. (1987). Assessing writing skill (Research Monograph No. 11). New York: College Entrance Examination Board.
Brown, H. D. (1994). Principles of language learning and teaching (3rd ed.). Englewood Cliffs, NJ: Prentice Hall.
Bullock, R. (1991). Autonomy and community in the evaluation of writing. In: R. Bullock & J. Trimbur (Eds.), The politics of writing instruction: Postsecondary. Portsmouth, NH: Boynton/Cook.
Carlson, S., & Bridgeman, B. (1986). Testing ESL student writers. In: K. L. Greenberg, H. S. Wiener, & R. A. Donovan (Eds.), Writing assessment: Issues and strategies. New York: Longman.
CCCC Committee on Assessment. (1995). Writing assessment: A position statement. College Composition and Communication, 46, 430–437.
CCCC Committee on Second Language Writing. (2001). CCCC statement on second-language writing and writers. College Composition and Communication, 52 (4), 669–674.
Compton, D. M., & Metheny, B. (2000). An assessment of grade inflation in higher education. Perceptual and Motor Skills, 90, 527–536.
Conlan, G. (1986). “Objective” measures of writing ability. In: K. L. Greenberg, H. S. Wiener, & R. A. Donovan (Eds.), Writing assessment: Issues and strategies. New York: Longman.
Courts, P. L., & McInerney, K. H. (1993). Assessment in higher education: Politics, pedagogy, and portfolios. Westport, CT: Praeger.
Crowley, S. (1995). Composition’s ethic of service, the universal requirement, and the discourse of student need. Journal of Composition Theory, 15 (2), 227–239.
Crusan, D. (2002a). The marginalization of ESL students through placement exams. Paper presented at the 36th Annual TESOL Convention and Exhibition, Salt Lake City, UT, April 2002.
Crusan, D. (2002b). The quagmire of assessment for placement: Talking out of both sides of our mouths. Manuscript submitted for publication.
Crusan, D., & Cornett, C. (2002). The cart before the horse: Teaching assessment criteria before writing. The International Journal for Teachers of English Writing Skills, 9, 20–33.
Educational Testing Service. (1996). TOEFL: Test of written English guide (4th ed.). Princeton, NJ: Author.
Edwards, C. H. (2000). Grade inflation: The effects on educational quality and personal well being. Education, 120, 538–546.
Ferretti, E. (2001). Just a little higher education: Teaching working-class women on the vocational track. In: B. Alford & K. Kroll (Eds.), The politics of writing in the two-year college (pp. 1–18). Portsmouth, NH: Heinemann.
Ferris, D., & Hedgcock, J. (1998). Teaching ESL composition: Purpose, process, and practice. Mahwah, NJ: Lawrence Erlbaum.
Gordon, B. L. (1987). Another look: Standardized tests for placement in college composition courses. Writing Program Administration, 10 (3), 29–38.
Greenberg, K. (1992). Validity and reliability issues in direct assessment of writing. Writing Program Administration, 16, 7–22.
Hamp-Lyons, L. (1990). Second language writing: Assessment issues. In: B. Kroll (Ed.), Second language writing: Research insights for the classroom. Cambridge: Cambridge University Press.
Hamp-Lyons, L. (1996). The challenges of second-language writing assessment. In: E. M. White, W. D. Lutz, & S. Kamusikiri (Eds.), Assessment of writing: Politics, policies, practices. New York: The Modern Language Association of America.
Hamp-Lyons, L. (1997). Exploring bias in essay tests. In: C. Severino, J. C. Guerra, & J. E. Butler (Eds.), Writing in multicultural settings. New York: The Modern Language Association of America.
Hamp-Lyons, L. (2001). Fourth generation writing assessment. In: T. Silva & P. K. Matsuda (Eds.), On second language writing (pp. 117–127). Mahwah, NJ: Lawrence Erlbaum.
Harrington, S. (1998). New visions of authority in placement test rating. Writing Program Administration, 22 (1/2), 53–84.
Haswell, R. H. (1998). Searching for Kiyoko: Bettering mandatory ESL writing placement. Journal of Second Language Writing, 7, 133–174.
Haswell, R. H., & Wyche-Smith, S. (1994). Adventuring into writing assessment. College Composition and Communication, 45, 220–236.
Henning, G. H. (1987). A guide to language testing: Development, evaluation, research. Cambridge, MA: Newbury House.
Hudson, S. A. (1982). An empirical investigation of direct and indirect measures of writing. Report of the 1980–81 Georgia Competency Based Education Writing Assessment Project — 1981. ERIC: ED #205993.
Huot, B. (1990a). The literature of direct writing assessment: Major concerns and prevailing trends. Review of Educational Research, 60, 237–263.
Huot, B. (1990b). Reliability, validity, and holistic scoring: What we know and what we need to know. College Composition and Communication, 41, 201–213.
Huot, B. (1994). A survey of college and university writing placement practices. Writing Program Administration, 17, 49–65.
Huot, B. (1996). Toward a new theory of writing assessment. College Composition and Communication, 47, 549–566.
Indiana University, South Bend, IN, Office of International Affairs. (2002). The English as a second language (ESL) placement test. Retrieved June 5, 2002, from the Website: http://www.iusb.edu/~abridger/test.htm
Kroll, B., & Reid, J. (1994). Guidelines for designing writing prompts: Clarifications, caveats, and cautions. Journal of Second Language Writing, 3, 231–255.
Leki, I. (1991). A new approach to advanced ESL placement testing. Writing Program Administration, 14 (3), 53–68.
Lewiecki-Wilson, C., Sommers, J., & Tassoni, J. P. (2000). Rhetoric and the writer’s profile: Problematizing directed self-placement. Assessing Writing, 7, 165–183.
Matsuda, P. K. (1998). Situating ESL writing in a cross-disciplinary context. Written Communication, 15, 99–121.
McNenny, G. (2001). Writing instruction and the post-remedial university: Setting the scene for the mainstreaming debate in basic writing. In: G. McNenny & S. H. Fitzgerald (Eds.), Mainstreaming basic writers: Politics and pedagogies of access (pp. 1–15). Mahwah, NJ: Lawrence Erlbaum.
Michigan State University, Lansing, MI, MSU Testing Office. (2002). English language proficiency exams. Retrieved June 5, 2002, from the Website: http://www.couns.msu.edu/testing/tests.htm
Moss, P. A. (1994). Validity in high stakes writing assessment: Problems and possibilities. Assessing Writing, 1, 109–128.
Nagle, B. (1998). A proposal for dealing with grade inflation: The relative performance index. Journal of Education for Business, 74, 40–43.
Northwestern University, Evanston, IL, Writing Program. (2002). English proficiency. Retrieved June 5, 2002, from the Website: http://www.cas.northwestern.edu/handbook/IV.html#IV.A.2.
O’Malley, J. M., & Valdez Pierce, L. (1996). Authentic assessment for English language learners: Practical approaches for teachers. New York: Addison-Wesley.
Patkowski, M. S. (1991). Basic skills tests and academic success of ESL college students. TESOL Quarterly, 25, 735–738.
Polio, C. (1998). Examining the written product in L2 writing research: A taxonomy of measures and analyses. Paper presented at the Symposium on Second Language Writing, Purdue University, West Lafayette, IN, September 1998.
Rendon, L. I., & Nora, A. (1994). A synthesis and application of research on Hispanic students in community colleges. In: J. L. Ratcliff, S. Schwarz, & L. H. Ebbers (Eds.), Community colleges (2nd ed.). Needham Heights, MA: Simon & Schuster.
Rose, M. (1989). Lives on the boundary: A moving account of the struggles and achievements of America’s educationally underprepared. New York: Penguin Books.
Shor, I. (1997). Our apartheid: Writing instruction and inequality. Journal of Basic Writing, 16, 91–104.
Silva, T. (1994). An examination of writing program administrators’ options for the placement of ESL students in first year writing classes. Writing Program Administration, 18 (1/2), 37–43.
Spolsky, B. (1997). The ethics of gatekeeping tests: What have we learned in a hundred years? Language Testing, 14, 242–247.
Stiggins, R. J. (1982). A comparison of direct and indirect writing assessment methods. Research in the Teaching of English, 16, 101–114.
Tedick, D. J. (1990). ESL writing assessment: Subject-matter knowledge and its impact on performance. English for Specific Purposes, 9, 123–143.
The Ohio State University, Columbus, OH, English as a Second Language Composition Program. (2002). Placement. Retrieved June 5, 2002, from the Website: http://www.esl.ohio-state.edu/Comp/Placement_Information.html
The Pennsylvania State University, University Park, PA, First-Year Testing, Counseling and Advising Program (FTCAP). (2002). English test. Retrieved June 5, 2002, from the Website: http://www.psu.edu/dus/ftcap/ftcoverv.htm
University of Illinois, Urbana-Champaign, IL, Division of English as an International Language. (2002). The ESL placement test (EPT). Retrieved June 5, 2002, from the Website: http://www.deil.uiuc.edu/esl.service/EPT.html
University of Iowa, Iowa City, IA, The English as a Second Language Program. (2002). English as a second language credit classes. Retrieved June 5, 2002, from the Website: http://www.uiowa.edu/%7Eiiepesl/ESL/eslindex.html
University of Michigan, Ann Arbor, MI, English Language Institute. (2002). The Academic English Evaluation (AEE) schedule and information sheet for winter, spring, and summer terms, 2003. Retrieved June 5, 2002, from the Website: http://websvcs.itd.umich.edu/eli-bin/main
University of Minnesota, Minneapolis, MN. (2002). Test scores. Retrieved June 5, 2002, from the Website: http://www1.umn.edu/twincities/
University of Wisconsin, Madison, WI, Center for Placement Testing. (2002). English placement test. Retrieved June 5, 2002, from the Website: http://wiscinfo.doit.wisc.edu/exams/english_placement_test.htm
White, E. M. (1985). Teaching and assessing writing: Recent advances in understanding, evaluating, and improving student performance. San Francisco: Jossey-Bass.
White, E. M. (1986). Pitfalls in the testing of writing. In: K. L. Greenberg, H. S. Wiener, & R. A. Donovan (Eds.), Writing assessment: Issues and strategies. New York: Longman.
White, E. M. (1990). Language and reality in writing assessment. College Composition and Communication, 41, 187–200.
White, E. M. (1994a). Issues and problems in writing assessment. Assessing Writing, 1, 11–27.
White, E. M. (1994b). Teaching and assessing writing (2nd ed.). San Francisco: Jossey-Bass.
White, E. M. (1996). Writing assessment beyond the classroom. In: L. Z. Bloom, D. D. Daiker, & E. M. White (Eds.), Composition in the twenty-first century: Crisis and change (pp. 101–111). Carbondale, IL: Southern Illinois University Press.
White, E. M. (2001). Revisiting the importance of placement and basic studies: Evidence of success. In: G. McNenny & S. H. Fitzgerald (Eds.), Mainstreaming basic writers: Politics and pedagogies of access (pp. 19–28). Mahwah, NJ: Lawrence Erlbaum.
White, E. M., Lutz, W. D., & Kamusikiri, S. (1996). Assessment of writing: Politics, policies, practices. New York: The Modern Language Association of America.
Williams, J. (1995). ESL composition program administration in the United States. Journal of Second Language Writing, 4, 157–179.
Williamson, M. (1994). The worship of efficiency: Untangling theoretical and practical considerations in writing assessment. Assessing Writing, 1, 147–193.
Wolcott, W., & Legg, S. M. (1998). An overview of writing assessment: Theory, research, and practice. Urbana, IL: National Council of Teachers of English.
Yancey, K. B. (1999). Looking back as we look forward: Historicizing writing assessment. College Composition and Communication, 50, 483–503.
Zamel, V. (1995). Strangers in academia: The experiences of faculty and ESL students across the curriculum. College Composition and Communication, 46, 506–521.
Zirkel, P. A. (1999). Grade inflation: A leadership opportunity for schools of education? Teachers College Record, 101, 247–260.