Session description
Our curricula, ever evolving, are described as nimble. Our institution prides itself on its adaptability and resiliency. But are our assessment practices keeping pace with our methods and content? In the wake of the implementation of proficiency-based admissions thresholds and placement policies, the state initiated system-wide alignment in order to achieve common course numbering and course objectives. As a result, the English faculty has reconstructed its notions about writing tasks, processes, and assessment models within both developmental and transfer-level courses. This presentation will share some of our ways of thinking about student writing, including tasks and assessment, and will encourage you to share your experiences as well.
Assessment, like revision, is recursive: Re-designing and re-thinking metrics and methods for the assessment of student writing
Jana Carter
Montana State University Great Falls, English Department
NISOD, May 2010
Why this is important
Why this is important
Too often, especially within two-year programs, Freshman Composition is a student’s last opportunity for explicit instruction in writing skills. While we know there is value in the explicit instruction and opportunities for guided practice we offer in Freshman Composition, we also know transitioning into writing in the disciplines is often difficult for students.
Why this is important
Ellen Lavelle, in "The Quality of University Writing: a preliminary analysis of undergraduate portfolios," writes: "Surprisingly, in examining the differences between early and late samples [of 325 student writing portfolios collected over eight years at Southern Illinois University Edwardsville], early writing was supported as significantly better than late writing, as measured by both the holistic scoring method and by the deep-and-surface rubric" (91).
Enrollment
Developmental courses
Transfer courses
Enrollment challenges
The number of sections taught by part-time faculty makes it necessary for us to communicate our expectations clearly.
Challenges for maintaining coherence:
- Finding qualified and experienced part-time faculty to teach face-to-face
- Navigating the adjunct hiring process
- Department chair’s active participation in hiring, evaluating, and mentoring part-time faculty
- Time
What to do as enrollment rises
Cheer! Additional sections and an infusion of instructional staff new to the institution provide exciting opportunities.
Groan. Anecdotal perception of a wide range of student skills entering the next course in the sequence.
Get to work. Develop curriculum and coherence, develop staff, achieve buy-in, identify a pool of available contingent faculty, develop assessment practices, and retool metrics.
Getting to work: Achieving buy-in and building coherence across sections
We started with staff development and development of practices:
- Annual kick-off meetings to discuss objectives, departmental philosophies, and practices, and to share ideas and activities
- Regular departmental meetings, to which adjuncts are invited, including time for mentoring, brainstorming, and problem solving
- Course template, maintained by the department, for each course in the composition sequence
- Common final exam for WRIT101
- Level-specific textbook and materials recommendations, maintained by the department
- Centralized book ordering through the Division secretary to avert problems with editions, unendorsed texts, and communication
Focused discussion: Adjuncts and high-enrollment courses
- How does your department build collaboration between adjunct faculty and full-time faculty?
- How does your institution support efforts to achieve coherence across multiple sections of courses?
Statewide articulation project: The Montana University System Transfer Initiative
MUS Transfer Initiative
"Faculty Learning Outcomes Councils [are] drawn from specific disciplines at institutions throughout the state [and] examine individual transferable courses and reach consensus about appropriate learning outcomes for each one" (http://mus.edu/transfer/index2.asp).
- A departmental representative attended the relevant councils.
- We worked to articulate developmental as well as transfer-level courses to increase the portability of credits for highly mobile students.
MUS Transfer Initiative: Articulation Matrix
- The Faculty Learning Outcome Councils wrote objectives.
- We revised our curriculum so that we were able to articulate each course in our Composition sequence.
MUS Transfer Initiative Participating Institutions
Outcomes
How our outcomes measure up: 2004
WRIT 101 course outcomes in 2004: Topics, activities, and philosophy, little of which spoke to observable student behaviors, provided conditions under which performance should occur, or included a definition of acceptable performance. In effect, what we had was a mess.
Best practices from Instructional Design: "A learning objective is a statement of what students will be able to do when they have completed instruction. A learning objective has three major components:
- A description of what the student will be able to do
- The conditions under which the student will perform the task
- The criteria for evaluating student performance" (Arreola)
How our outcomes measure up: 2006
WRIT 101 course outcomes in 2006: A specific list of observable student skills, conditions under which performance should occur, and oblique references to performance thresholds. In effect, what we created was a list of enabling objectives.
Best practices from Instructional Design: "Well-written learning objectives are often written at two levels:
- Terminal objectives: The primary skills (desired end results of instruction)
- Enabling objectives: Incremental skills that allow the desired end results to occur" (Shank 5)
How our outcomes measure up: 2010
WRIT 101 course outcomes today: A specific list of observable student skills, conditions under which performance should occur, and references to rubric levels. The Montana University System Transfer Initiative provided terminal objectives, but we still have far too many.
Best practices from Instructional Design: "Each individual learning objective should support the overarching goal of the course, that is, the thread that unites all the topics that will be covered and all the skills students should have mastered by the end of the semester. Best practice dictates that learning objectives be kept to no more than half a dozen" ("Learning Objectives")
Focused discussion: Outcomes
- Who writes your outcomes?
- Does that level of control/autonomy allow you to meet the needs of your campus or community?
- How do you participate in revision of those outcomes?
Performance levels and indicators
Where we got stuck
Primary challenges of assessment at the department level: "fear of compromised autonomy," "methodological skepticism," and "lack of time" (Beld 6-7).
We didn’t get stuck when we wrote outcomes. We did get stuck figuring out how to talk about and negotiate performance levels and indicators!
Why we got stuck
- Recipe for disaster: equal parts preparation and professional experience, working without institutional history or common expectations
- We had to recognize that what each of us valued as good academic writing was very different
- We had to realize that we weren’t going to develop a perfect solution or model on the first attempt
- Northern Arizona University initiated a rubrics program in response to a very similar problem (Garcia 13); while we’re no NAU, our rubrics project was productive. It built community by providing a springboard for discussion about values, beliefs, training, preparation, and professional experience
How we got un-stuck
Beld’s recommendations:
- "Focusing on intended users and uses," since "methodological choices should be governed by users and uses—not simply to foster ownership of the results, but to enhance their quality" (Beld 8)
- "Limiting the agenda" by adopting an annual departmental assessment agenda that is "intentionally modest" (Beld 8)
- "Treating findings as a means to an end, not as the ends in themselves" (Beld 8)
How we’ve applied those principles: Collaborate on all assessment-related work, think small, and stay flexible.
Evolution of assessment: Where we’ve been
2004: No official departmental standards of performance; assessment was primarily individualized and private.
2006: Beginnings of rubrics to align with the Montana University System Writing Assessment (MUSWA), COMPASS e-Write, and other placement instruments.
Evolution of assessment: Where we are today
- WRIT095, WRIT101, and WRIT201 are paperless classes.
- Tools provided by our LMS improve the quality of course administration and reduce the time invested in it, which is consistent with the literature: "A practical benefit of using e-learning tools mentioned most often by the lecturers was reducing effort and time they spent on administration" (Heinrich et al. 180).
- We use rubrics to gauge performance on enabling objectives (defined locally) and terminal objectives (defined by the Faculty Learning Outcomes Councils) for WRIT101.
Arriving at the composite-score paradigm
Benefits:
- Content and conventions rubrics provide a fair and representative composite score.
- Grammar is not an emphasis of WRIT101; the Conventions rubric places value on careful editing.
- Rubrics edited for "mean-spiritedness" (Sommers 149) at the C, D, and F levels.
- Models of student writing.
- Buy-in from stakeholders: collaboration, use, and wide/early distribution.
Drawbacks:
- Limitations of the current release of our LMS mean our composite score is taken from a chart and explained by an external document (an illustrative sketch follows this slide).
- We don’t yet have a large pool of essay models.
- We do a good job training students to analyze writing using the language of the rubrics, but we don’t yet do as good a job explaining holistic scoring to colleagues.
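Note on the composite chart (illustrative only): the deck says the composite score is "taken from a chart" but does not reproduce the chart itself. The sketch below therefore invents everything numeric: the 1-6 rubric scales, the 7/3 weighting, the grade bands, and the composite_grade helper are assumptions for illustration, not the department’s actual instrument. Only the general idea, combining a Content score and a Conventions score into one grade, comes from the slide.

```python
# Hypothetical composite-score chart. Only the idea of combining a
# Content score with a Conventions score comes from the presentation;
# every number below is invented for illustration.

CONTENT_WEIGHT = 7       # out of 10; assumes content counts more, since
CONVENTIONS_WEIGHT = 3   # grammar "is not an emphasis of WRIT101"

# Minimum composite needed for each letter grade (invented bands).
GRADE_BANDS = [(5.0, "A"), (4.0, "B"), (3.0, "C"), (2.0, "D"), (0.0, "F")]

def composite_grade(content, conventions):
    """Combine two 1-6 rubric scores into a weighted composite and a letter."""
    composite = (CONTENT_WEIGHT * content + CONVENTIONS_WEIGHT * conventions) / 10
    for floor, letter in GRADE_BANDS:
        if composite >= floor:
            return composite, letter

print(composite_grade(5, 3))  # -> (4.4, 'B')
```

A published lookup chart, with every score pair mapped directly to a grade, serves the same purpose as this formula and is easier to hand to students and colleagues, which may be why the chart lives in an external document.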
Evolution of assessment: Looking ahead
- Formalize the final exam processes.
- Train adjunct faculty to increase inter-rater reliability for the final exam; it’s unlikely we’ll reach California’s 90% (U.S. Congress, Office of Technology Assessment) because our process is not that formal, but we’ll be much closer. (A minimal illustration of the metric follows this list.)
- Refine rubrics and assignments to better meet enabling objectives, e.g., information literacy.
- Collect more models of student essays, especially essays that require research and handle sources properly.
- Investigate rubrics that are bigger than we are, e.g., AAC&U’s VALUE project, which provides rubrics for skill areas ("VALUE: Valid Assessment of Learning in Undergraduate Education").
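Since the training item above turns on inter-rater reliability, here is a minimal sketch of its simplest form: percent exact agreement between two raters scoring the same exams. The 1-6 holistic scale and the sample scores are invented; figures like the 90% cited above are agreement rates of this general kind.

```python
# Illustrative sketch: percent exact agreement between two raters'
# holistic scores on the same set of final exams. The 1-6 scale and
# the scores below are invented, not data from the presentation.

def percent_agreement(rater_a, rater_b):
    """Share of essays on which both raters assigned the same score."""
    assert len(rater_a) == len(rater_b), "raters must score the same essays"
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

# Two raters score ten sample essays on a 1-6 holistic scale.
rater_a = [4, 3, 5, 4, 2, 6, 3, 4, 5, 3]
rater_b = [4, 3, 4, 4, 2, 6, 3, 5, 5, 3]

print(f"Exact agreement: {percent_agreement(rater_a, rater_b):.0%}")  # 80%
```

Exact agreement is the bluntest such measure; chance-corrected statistics such as Cohen’s kappa are common refinements, but percent agreement is typically what norming sessions report first.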
Works Cited
Beld, Jo Michelle. "Engaging Departments in Assessing Student Learning: Overcoming Common Obstacles." Peer Review 12.1 (2010): 6-9. Academic Search Premier. EBSCO. Web. 23 May 2010.
Garcia, Paula. "Involving Graduate Teaching Assistants in Freshman Composition in Rubric Implementation." Assessment Update 18.6 (2006): 1-13. Academic Search Premier. EBSCO. Web. 23 May 2010.
Heinrich, Eva, John Milne, and Maurice Moore. "An Investigation into E-Tool Use for Formative Assignment Assessment - Status and Recommendations." Journal of Educational Technology & Society 12.4 (2009): 176-192. Academic Search Premier. EBSCO. Web. 23 May 2010.
Lavelle, Ellen. "The Quality of University Writing: a preliminary analysis of undergraduate portfolios." Quality in Higher Education 9.1 (2003): 87. Academic Search Premier. EBSCO. Web. 23 May 2010.
"Learning Objectives (Teaching and Learning Laboratory @ MIT)." MIT. Web. 28 May 2010. <http://web.mit.edu/tll/teaching-materials/learning-objectives/index-learning-objectives.html>.
Shank, Patti. "Writing Learning Objectives That Help You Teach and Students Learn (Part 1)." Online Classroom (2005): 4-7. Academic Search Premier. EBSCO. Web. 23 May 2010.
U.S. Congress, Office of Technology Assessment. American Schools: Asking the Right Questions. Washington, DC: U.S. Government Printing Office, 1992. ERIC. Web. 23 May 2010.
"VALUE: Valid Assessment of Learning in Undergraduate Education." Association of American Colleges and Universities. Web. 23 May 2010. <http://www.aacu.org/value/index.cfm>.
"Writing Assessment: A Position Statement." Conference on College Composition and Communication, National Council of Teachers of English. Web. 23 May 2010. <http://www.ncte.org/cccc/resources/positions/writingassessment>.
Contact and presentation information
Jana Carter, MSU Great Falls
jcarter@msugf.edu
406.771.4363 (office), 406.530.9748 (mobile)
Documents available online: http://www.slideshare.net/JanaCarter/nisod-jcarter
Session evaluation: http://www.surveymonkey.com/s/6MC2YBN