"Assessment, Like Revision, Is Recursive"
Presentation at NISOD 2010

CC Attribution License

"Assessment, Like Revision, Is Recursive" "Assessment, Like Revision, Is Recursive" Presentation Transcript

  • Session description
    Our curricula, ever evolving, are described as nimble. Our institution prides itself on its adaptability and resiliency. But are our assessment practices keeping pace with our methods and content? In the wake of the implementation of proficiency-based admissions thresholds and placement policies, the state initiated system-wide alignment in order to achieve common course numbering and course objectives. As a result, the English faculty has reconstructed its notions about writing tasks, processes, and assessment models within both developmental and transfer-level courses. This presentation will share some of our ways of thinking about student writing, including tasks and assessment, and will encourage you to share your experiences as well.
  • Assessment, like revision, is recursive: Re-designing and re-thinking metrics and methods for the assessment of student writing
    Jana Carter
    Montana State University Great Falls
    English Department
    NISOD May 2010
  • Why this is important
    Too often, especially within two-year programs, Freshman Composition is a student’s last opportunity for explicit instruction in writing skills.
    While we know there is value in the explicit instruction and opportunities for guided practice we offer in Freshman Composition, we also know transitioning into writing in the disciplines is often difficult for students.
  • Why this is important
    Ellen Lavelle, in "The Quality of University Writing: a preliminary analysis of undergraduate portfolios," writes:
    Surprisingly, in examining the differences between early and late samples [of 325 student writing portfolios collected over eight years at Southern Illinois University Edwardsville], early writing was supported as significantly better than late writing, as measured by both the holistic scoring method and by the deep-and-surface rubric. (91)
  • Enrollment
  • Developmental courses
  • Transfer courses
  • Enrollment challenges
    The number of sections taught by part-time faculty makes it necessary for us to communicate our expectations clearly.
    Challenges for maintaining coherence:
    Finding qualified and experienced part-time faculty to teach face-to-face
    Navigating the adjunct hiring process
    Department chair’s active involvement in hiring, evaluating, and mentoring part-time faculty
    Time
  • What to do as enrollment rises
    Cheer! Exciting opportunities provided by additional sections and an infusion of instructional staff new to the institution.
    Groan. Anecdotal perception of a wide range of student skills entering the next course in the sequence.
    Get to work. Develop curriculum and coherence, develop staff, achieve buy-in, identify pool of available contingent faculty, develop assessment practices, and retool metrics.
  • Getting to work: Achieving buy-in and building coherence across sections
    We started with staff development and development of practices:
    Annual kick-off meetings to discuss objectives, departmental philosophies, practices, and share ideas and activities
    Regular departmental meetings, including time for mentoring, brainstorming, and problem solving, to which adjuncts are invited
    Course template, maintained by department, for each course in the composition sequence
    Common final exam for WRIT101
    Level-specific textbook and materials recommendations, maintained by department
    Centralized book ordering process through the Division secretary to avert problems with wrong editions, unendorsed texts, and miscommunication
  • Focused discussion: Adjuncts and high-enrollment courses
    How does your department build collaboration between adjunct faculty and full-time faculty?
    How does your institution support efforts to achieve coherence across multiple sections of courses?
  • Statewide articulation project: The Montana University System Transfer Initiative
  • MUS Transfer Initiative
    "Faculty Learning Outcomes Councils [are] drawn from specific disciplines at institutions throughout the state [and] examine individual transferable courses and reach consensus about appropriate learning outcomes for each one." (http://mus.edu/transfer/index2.asp)
    Our representative attended the relevant councils
    We worked to articulate developmental courses as well as transfer-level courses to increase portability of credits for highly mobile students
  • MUS Transfer Initiative: Articulation Matrix
    The Faculty Learning Outcome Councils wrote objectives.
    We revised our curriculum so that we were able to articulate each course in our Composition sequence.
  • MUS Transfer Initiative Participating Institutions
  • Outcomes
  • How our outcomes measure up: 2004
    WRIT 101 course outcomes in 2004
    Topics, activities, and philosophy, little of which described observable student behaviors, specified the conditions under which performance should occur, or defined acceptable performance.
    In effect, what we had was a mess.
    Best practices from Instructional Design
    “A learning objective is a statement of what students will be able to do when they have completed instruction. A learning objective has three major components:
    A description of what the student will be able to do
    The conditions under which the student will perform the task.
    The criteria for evaluating student performance” (Arreola)
  • How our outcomes measure up: 2006
    WRIT 101 course outcomes in 2006
    A specific list of observable student skills, conditions under which performance should occur, and oblique references to performance thresholds.
    In effect, what we created was a list of enabling objectives.
    Best practices from Instructional Design
    “Well-written learning objectives are often written at two levels:
    • Terminal objectives: The primary skills (desired end results of instruction)
    • Enabling objectives: Incremental skills that allow the desired end results to occur” (Shank 5)
  • How our outcomes measure up: 2010
    WRIT 101 course outcomes today
    A specific list of observable student skills, conditions under which performance should occur, and references to rubric levels.
    Montana University System Transfer Initiative provided terminal objectives, but we still have far too many.
    Best practices from Instructional Design
    “Each individual learning objective should support the overarching goal of the course, that is, the thread that unites all the topics that will be covered and all the skills students should have mastered by the end of the semester. Best practice dictates that learning objectives be kept to no more than half a dozen” ("Learning Objectives")
  • Focused discussion: Outcomes
    Who writes your outcomes?
    Does that level of control/autonomy allow you to meet the needs of your campus or community?
    How do you participate in revision of those outcomes?
  • Performance levels and indicators
  • Where we got stuck
    Primary challenges of assessment at the department level: "fear of compromised autonomy," "methodological skepticism," and "lack of time" (Beld 6-7)
    We didn’t get stuck when we wrote outcomes. We did get stuck figuring out how to talk about and negotiate performance levels and indicators!
  • Why we got stuck
    Recipe for disaster: equal parts varied preparation and varied professional experience, working without institutional history or common expectations
    We had to recognize that what we valued as good, academic writing was very different
    We had to realize that we weren’t going to develop a perfect solution or model on the first attempt
    Northern Arizona University initiated a rubrics program in response to a very similar problem (Garcia 13); while we’re no NAU, our rubrics project was productive. It built community by providing a springboard for discussion about values, beliefs, training, preparation, and professional experience
  • How we got un-stuck
    Recommendations:
    "focusing on intended users and uses," by asserting that "methodological choices should be governed by users and uses—not simply to foster ownership of the results, but to enhance their quality" (Beld 8);
    "limiting the agenda," by focusing on an annual departmental assessment agenda that is "intentionally modest" (Beld 8); and
    "treating findings as a means to an end, not as the ends in themselves" (Beld 8)
    How we’ve applied those principles:
    Collaborate on all assessment-related work, think small, and stay flexible
  • Evolution of assessment: Where we’ve been
    2004
    No official departmental standards of performance: assessment was primarily individualized and private
    2006
    Beginnings of rubrics to align with Montana University System Writing Assessment (MUSWA), COMPASS e-Write, and other placement instruments
  • Evolution of assessment: Where we are today
    WRIT095, WRIT101, and WRIT201 are paperless classes
    Tools provided by our LMS improve the quality of course administration and reduce the time it requires, which is consistent with the literature: “A practical benefit of using e-learning tools mentioned most often by the lecturers was reducing effort and time they spent on administration” (Heinrich, Milne, and Moore 180)
    We use rubrics to gauge performance on enabling objectives (defined locally) and terminal objectives (defined by FLOC) for WRIT101
  • Arriving at the composite score paradigm
    Benefits
    Content and conventions rubrics provide a fair and representative composite score
    Grammar is not an emphasis of WRIT101. The Conventions rubric places value on careful editing
    Rubrics edited for "mean-spiritedness" (Sommers 149) for C, D, and F
    Models of student writing
    Buy-in from stakeholders: collaboration, use, and wide/early distribution
    Drawbacks
    Limitations of the current release of our LMS mean our composite score must be taken from a chart and explained in an external document
    We don’t yet have a large pool of essay models
    We do a good job training students to analyze writing using the language of the rubrics, but we don’t yet do as well explaining holistic scoring to colleagues
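    To make the chart lookup concrete, here is a minimal sketch in Python of how a two-rubric composite might be computed. The 2:1 weighting and the letter-grade cut points are invented for illustration; they are not our department’s actual chart, which lives in the external document noted above.
    # Hypothetical composite: combine content and conventions rubric scores
    # (each assumed to be on a 1-4 scale) into a single letter grade.
    def composite_grade(content, conventions):
        # Assumed 2:1 weighting in favor of content; the real chart differs.
        score = (2 * content + conventions) / 3
        if score >= 3.5:
            return "A"
        if score >= 2.5:
            return "B"
        if score >= 1.5:
            return "C"
        if score >= 1.0:
            return "D"
        return "F"

    print(composite_grade(4, 3))  # strong content, solid editing -> "A"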
  • Evolution of assessment: Looking ahead
    Formalize the final exam processes
    Training for adjunct faculty to increase inter-rater reliability on the final exam; our process is less formal than California’s, so we are unlikely to reach its 90% agreement rate (U.S. Congress, Office of Technology Assessment), but we will get much closer (see the sketch after this list)
    Refining rubrics and assignments to meet enabling objectives better, e.g., information literacy
    Collecting more models of student essays, especially essays that require research and handle sources properly
    Investigating rubrics that are bigger than we are, e.g., AAC&U’s VALUE project, which provides rubrics for skill areas ("VALUE: Valid Assessment of Learning in Undergraduate Education")
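    As a companion to the inter-rater reliability goal above, here is a minimal sketch in Python of simple percent agreement, the most basic reliability measure; the rater scores are invented for illustration, and more formal studies use statistics such as Cohen’s kappa.
    # Percent agreement: the fraction of exams on which two raters
    # assigned the same rubric level.
    def percent_agreement(rater_a, rater_b):
        matches = sum(a == b for a, b in zip(rater_a, rater_b))
        return matches / len(rater_a)

    # Example: two raters score ten final exams on a 1-4 rubric.
    scores_a = [3, 2, 4, 3, 3, 2, 1, 4, 3, 2]
    scores_b = [3, 2, 4, 2, 3, 2, 1, 4, 3, 3]
    print(f"{percent_agreement(scores_a, scores_b):.0%}")  # -> 80%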
  • Works Cited
    Beld, Jo Michelle. "Engaging Departments in Assessing Student Learning Overcoming Common Obstacles." Peer Review 12.1 (2010): 6-9. Academic Search Premier. EBSCO. Web. 23 May 2010.
    Garcia, Paula. "Involving Graduate Teaching Assistants in Freshman Composition in Rubric Implementation." Assessment Update 18.6 (2006): 1-13. Academic Search Premier. EBSCO. Web. 23 May 2010.
    Heinrich, Eva, John Milne, and Maurice Moore. "An Investigation into E-Tool Use for Formative Assignment Assessment - Status and Recommendations." Journal of Educational Technology & Society 12.4 (2009): 176-192. Academic Search Premier. EBSCO. Web. 23 May 2010.
    Lavelle, Ellen. "The Quality of University Writing: a preliminary analysis of undergraduate portfolios." Quality in Higher Education 9.1 (2003): 87. Academic Search Premier. EBSCO. Web. 23 May 2010.
    "Learning Objectives (Teaching and Learning Laboratory @ MIT)." MIT. Web. 28 May 2010. http://web.mit.edu/tll/teaching-materials/learning-objectives/index-learning-objectives.html
    Shank, Patti. "Writing Learning Objectives That Help You Teach and Students Learn (Part 1)." Online Classroom (2005): 4-7. Academic Search Premier. EBSCO. Web. 23 May 2010.
    U.S. Congress, Office of Technology Assessment. American Schools: Asking the Right Questions. [Full Report.] Washington, DC: U.S. Government Printing Office, 1992. ERIC. Web. 23 May 2010.
    "Writing Assessment: A Position Statement." National Council of Teachers of English. CCCC. Web. 23 May 2010. <http://www.ncte.org/cccc/resources/positions/writingassessment>.
    "VALUE: Valid Assessment of Learning in Undergraduate Education." Association of American Colleges and Universities. Web. 23 May 2010. <http://www.aacu.org/value/index.cfm>.
  • Contact and presentation information
    Jana Carter
    MSU Great Falls
    jcarter@msugf.edu
    406.771.4363 (office)
    406.530.9748 (mobile)
    Documents available online:
    http://www.slideshare.net/JanaCarter/nisod-jcarter
    Session evaluation:
    http://www.surveymonkey.com/s/6MC2YBN