One way to improve writing instruction would be something similar to the July 2009 MDE reading mandate (all teachers must take a reading diagnostic course to advance to the professional certificate). Although this mandate is definitely needed, I take issue with how long it takes MDE to respond to the crises in our school systems, especially in urban areas. Clearly, students’ reading deficiency levels are not a new phenomenon.
In 2003, the National Commission on Writing (the Neglected “R” article) proposed that all teachers take a writing theory and practice course. Of all the MEAP testing areas, writing has consistently been among the lowest for decades. Let’s see how long it will take MDE to mandate a writing diagnostic course for teachers similar to the mandated reading course.
Trupe (1997) begins by assessing academic writing via student portfolios with traditional grading rubric components such as prewriting strategies, thesis statement, topic sentence, supporting evidence, sentence structure, transitions, spelling, grammar, and mechanics. The twist of the article, however, is how teachers will have to begin to address writing in an evolving technological world. Technological writing, such as acronym usage, creative spelling, and condensed texts, directly contrasts with how students are initially taught to write (complete, correctly spelled phrases and elaborations). Drafting and editing are also apparently absent from this latter approach. As a result, English teachers must educate students on becoming multidimensional writers, not just for their classrooms but for interdisciplinary purposes as well.
According to the ARCS (attention, relevance, confidence, and satisfaction) motivation theory by Keller (1979), students need to feel a sense of confidence that they can complete a given task. I address this concern via rubrics and modeling: I show students what I expect. For example, if I want my students to complete a speech, I show them a videotaped student speech, and I model a speech for them as well. I even give them a chance to grade me using the rubric! They absolutely love this activity! William Arthur Ward said, “The mediocre teacher tells. The good teacher explains. The superior teacher demonstrates. The great teacher inspires.” To this end, I am a superior teacher striving to be a great teacher.
“…leaving teaching of writing to inexperienced graduate students…” I was quite offended by this generality. Although I am a graduate student, I do not consider myself to be an inexperienced instructor.
However, the article redeemed itself by suggesting that in-service workshops could help teachers understand good writing and develop as writers themselves. I share my writings with my students. Food for thought: since many of us are English or language arts teachers, how many of us share our writings with our students? How can we expect them to perform if we only tell them and rarely show them?
(Give students a simple paragraph. Using a rubric, have students evaluate it.) This activity should be considered a summative assessment, not a formative one. Due to its brevity, it cannot stand alone as a formative assessment; it does not yield enough data to drive instruction (that is, to actually create a lesson from it).
Three Types of Formative Assessment
--Self (ungraded)
--Peer (ungraded)
--Teacher (ungraded):
--Participate in writing workshops with students
--Offer feedback to students before papers are submitted for grading
--Have one-on-one writing conferences with students (prewriting, drafting, and editing conferences)
Similar to the Andrade (2009) Assessment-Driven Improvement article, I grade in green, which is not psychologically damaging; green represents growth. At all costs, I want my students to see themselves as ever-evolving writers.
Similar to Andrade (2009), the National Commission on Writing suggests that assessment must be fair and authentic. Assessment drives instruction. Where are my students’ strengths and weaknesses? What am I doing to address their weaknesses? How am I reinforcing their strengths?
As educators, we must reflect on our teaching practices. Ask ourselves, regularly, “What’s working?” and “What’s not working?” Have the students mastered the material? If not, what alternative teaching methods must I employ for them to master it? We need to adjust to our students and not require them to adjust to us all the time. Remember, we have more tools in our teacher toolbox than they have in their limited student toolbox.
Also, we must check for reliability within our grading rubrics. Score a student’s essay, and then give that same essay to your colleagues to see if your scores are similar. If so, then reliability has been achieved. Similarly, in my work with Evaluation Testing Systems, we had a table leader who did a second read of college-bound students’ writing samples. If she noticed that her scores greatly differed from ours, she would have us re-read the essay to ensure reliability.
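The table-leader procedure can be sketched as a simple check. This is a minimal illustration, not the actual scoring system: the function name and the one-point threshold are my assumptions, since the text only says a re-read happens when scores greatly differ.

```python
# Minimal sketch of a second-read reliability check, assuming holistic
# scores on a small numeric scale. The one-point threshold is a
# hypothetical value chosen for illustration only.

def needs_reread(first_score: float, second_score: float,
                 threshold: float = 1.0) -> bool:
    """Return True when two raters' scores diverge enough to warrant a re-read."""
    return abs(first_score - second_score) > threshold

# Raters agree: reliability achieved, no re-read needed
print(needs_reread(4, 4))   # False
# Raters differ by 3 points: flag the essay for a second look
print(needs_reread(2, 5))   # True
```

In practice the threshold would come from the scoring program’s own adjacency rules rather than a fixed constant.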
Are we assessing what we taught? Food for Thought: How often have we assessed students on material that we have not even thoroughly taught? As teachers, our assessments should not be “gotcha” moments to our students’ detriment, but tools to determine if our students learned what we taught.
Just as the middle school students’ writing improved in the Andrade (2009) article, so did my students’. In 1998, when I taught 8th-grade English, I connected with WSU freshmen to partner with my students. This mentor relationship resulted in my students’ MEAP Writing scores surpassing the State’s average by 10%. The 2003 National Commission on Writing article suggests university-school partnerships. I guess I was ahead of the curve (pat myself on the back)!
Similar to Andrade (2009), I address my students’ trends in writing. What are the common errors that they are making? For my college English classes, I typically begin the semester with a 1-2 page essay. As I review those essays, I note common errors that thread throughout my students’ writing, and I then address them as a class. Also, I take an error-filled sentence from each of my students’ papers, and we review them as a class. Although these sentences are anonymous, collectively students learn how to improve their own and their peers’ writing.
Writing = POWER, a powerful process (pun intended). Each step builds on the previous step.
Plan: list all possible ideas without judgment
Organize: group common ideas into an outline or cluster
Write: create a draft
Evaluate: receive two critiques (peer and scholarly)
Revise: include critics’ suggestions in the final copy
Authentic learning occurs when students can transfer their knowledge to another context, e.g., from the classroom to a standardized test setting (Andrade, 2009). Create acronyms, if necessary. IOPVWSC is an acronym for Ideas, Organize, Paragraphs, Voice, Word Choice, Sentences, and Conventions: “I only play videogames while snacking chips.” This idea is similar to the POWER concept that I use in my classroom.
When teachers focus on mechanics too much, students do not get a chance to develop their ideas properly.
“Today’s most pressing domestic challenge is that of improving public schools…one of the greatest potential rewards lies in better writing—and improved thinking.” —The Neglected “R”: The Need for a Writing Revolution (2003)
Poll: 3-question survey via http://www.pollev.com/alongbenton
3 Questions
1. In a standard five-paragraph essay, how many main points must be supported?
A. 5
B. 3
C. 1
D. All of the above
E. None of the above
2. True or False: Transitions should begin each body paragraph.
3. A transition is…
A. Furthermore
B. Similarly
C. Consequently
D. All of the above
E. None of the above
MDE Mandate and National Commission on Writing Similarities
--How many of you have a professional teaching certificate?
--July 2009 MDE reading mandate for professional certificate (3 components)
--Mandate needed, but MDE slow response time
--Reading deficiency is an old phenomenon.
Similarities cont.
--Neglected “R” article writing course proposal
--MEAP writing scores consistently among the lowest for decades
--MDE response time to this proposal?
My article: Trupe, A. (1997), Academic literacy in a wired world: What should a literate student text look like?
--Traditional grading rubric
--Twist of the article: writing & technology
--Technological writing vs. English class
--English teachers to create multidimensional writers (interdisciplinary approach)
Writing Across the Curriculum
MDE & Michigan Science Teacher Association example: http://www.michigan.gov/documents/mde/Science_WAC_2_3_264454_7.pdf
Writing to Learn
--Critical thinking (higher-order thinking) skills
--Analysis
--Application
Writing to Demonstrate Knowledge
--Synthesize information
--Explain concepts/ideas
--E.g., essays, letters, projects, reports, article reviews, research papers, etc.
Rubrics Students must understand what they will be graded on before they can adequately perform. It gives them a basis for writing successfully.
ARCS Motivation Theory, Keller (1979)
Rubrics and modeling
Speech Example
--Student videotaped speech
--I model as well
--Grade me using the rubric. They love it!
William Arthur Ward said, “The mediocre teacher tells. The good teacher explains. The superior teacher demonstrates. The great teacher inspires.” To this end, I am a superior teacher striving to be a great teacher.
The Neglected “R”: The Need for a Writing Revolution article
“…leaving teaching of writing to inexperienced graduate students…”
--Quite offended by this generality
--Graduate student, but not inexperienced: 17 years of teaching experience
Article Redemption
In-service workshops help teachers:
--understand good writing
--develop as writers themselves
I share my writings with my students.
Food for Thought: How many of us share our writings with our students? How can we expect them to perform, if we only tell them and rarely show them?
Complete Writing Rubric Activity
A Word of Caution Regarding Spell Check (Strickland & Strickland article)
--Give students a paragraph; have them evaluate it via the rubric.
--Summative assessment, not formative
--Too short for a formative assessment alone; not enough data to drive instruction
Corrected Paragraph
I have a spell check,
It came with my PC;
It plainly marks four my revue
Mistakes I cannot sea.
I’ve run this paragraph threw it,
I’m sure your please too no.
Its letter perfect in it’s weigh,
My checker tolled me sew.
Writing Rubric

Category        | 25 pts  | 16.5 pts     | 8.25 pts
Capitalization  | Correct | Minor Errors | Major Errors
Grammar         | Correct | Minor Errors | Major Errors
Punctuation     | Correct | Minor Errors | Major Errors
Spelling        | Correct | Minor Errors | Major Errors

Rubric Key
Correct: No errors
Minor Problems: 1-5 errors
Major Problems: 6 or more errors
Total Points Earned __________ Final Grade __________
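The rubric’s arithmetic can be made explicit in a short sketch. The point values (25 / 16.5 / 8.25) and error bands (0, 1-5, 6 or more) come directly from the rubric key; the function names and the sample error counts are my own illustrative assumptions.

```python
# Sketch of the four-category writing rubric above. Each category earns
# 25 (Correct), 16.5 (Minor Problems), or 8.25 (Major Problems) points,
# for a maximum total of 100.

CATEGORIES = ["Capitalization", "Grammar", "Punctuation", "Spelling"]

def category_points(error_count: int) -> float:
    """Map an error count to rubric points per the rubric key."""
    if error_count == 0:
        return 25.0      # Correct: no errors
    elif error_count <= 5:
        return 16.5      # Minor Problems: 1-5 errors
    else:
        return 8.25      # Major Problems: 6 or more errors

def score_essay(error_counts: dict) -> float:
    """Total points earned for a dict of {category: error_count}."""
    return sum(category_points(error_counts[c]) for c in CATEGORIES)

# Hypothetical essay: 0 capitalization, 3 grammar, 7 punctuation, 1 spelling
example = {"Capitalization": 0, "Grammar": 3, "Punctuation": 7, "Spelling": 1}
print(score_essay(example))  # 25 + 16.5 + 8.25 + 16.5 = 66.25
```

Because every category is weighted equally, a single category of major problems caps the total at 91.75 even if the rest of the essay is flawless.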
Assessment-Driven Improvements in Middle School Students’ Writing article
Three Types of Formative Assessment
--Self (ungraded)
--Peer (ungraded)
--Teacher (ungraded)
Anderson article: short list of goals; “zone of proximal development” (Vygotsky, 1962)
Teacher (summative, graded)
I grade in green, which is not psychologically damaging.
Green = growth
Goal = ever-evolving writers
Assessment Goals
--Assessment must be fair and authentic.
--Assessment drives instruction.
--Students’ strengths/weaknesses?
--Address their weaknesses?
--Reinforcing their strengths?
Teacher Reflections
--Ask, “What’s working?” / “What’s not working?”
--Student mastery? Alternative teaching methods?
--Adjust to them, not them always adjusting to us
--Anderson article: boxed paragraph example (visual learners)
--Our vast teacher toolbox vs. their limited student toolbox
Consider the following picture scenario. Tiger, a young boy, is talking to a friend about his dog, and he says, “I taught Stripe how to whistle.” His friend replies, “I don’t hear him whistling.” Tiger, with a disgusted look on his face, responds, “I said I taught him. I didn’t say he learned.”
If students are not learning from our teaching, then we are just talking. Learning and teaching are uniquely tied together.
Reliability Check & Grading Rubrics
--Same essay, different colleague
--Similar grade = reliability has been achieved
--Evaluation Testing Systems example
--Table leader: second read of writing samples
--If scores greatly differed, re-read the essay to ensure reliability
Validity Check Are we assessing what we taught? Food for Thought: Assessed students on untaught material? As teachers, our assessments should not be “gotcha” moments to our students’ detriment, but tools to determine if our students learned what we taught.
My Classroom Best Practices
Like the middle school students’ writing in Andrade (2009), my students’ writing improved.
--1998: my 8th-grade English class & WSU freshmen partnership
--MEAP Writing scores surpassed the State’s average by 10%.
--The 2003 article suggested university-school partnerships.
Address students’ trends in writing
--Common errors?
--My college English classes: initial 1-2 page essay
--Note common errors, then address them as a class
--“Teaching to the individual,” not “teaching to the middle” (Strickland & Strickland, 2000)
--Review as a class error-filled sentences from each student (anonymous)
--Goal = improve their own and peers’ writing
Writing Process = POWER (pun intended)
Plan: list all possible ideas without judgment
Organize: group common ideas into an outline or cluster
Write: create a draft
Evaluate: receive two critiques (peer & scholarly)
Revise: include critics’ suggestions in the final copy
Each step builds on the previous step.
Authentic Learning
--Transfer knowledge, e.g., from the classroom to a standardized test setting
--Create acronyms
--IOPVWSC: Ideas, Organize, Paragraphs, Voice, Word Choice, Sentences, and Conventions
--“I only play videogames while snacking chips.”
--My Writing POWER example
“When children are required to learn to spell words correctly before they learn to compose, it stifles the writing process.” (Strickland & Strickland, 2000)