Summary of Observations and Recommendations Based on Scoring Students' Written Communication Using the VALUE Written Communication Rubric and Institutionally Developed Written Communication Rubrics, Feb. 29, 2012

Overall Observations about the VALUE Rubric

Did not like the language of the levels; too easy to attach to years of study.
Had difficulty differentiating between content development and genre/disciplinary conventions because of organization.
Circumstances surrounding the writing task: how closely do we want to evaluate students' adherence to a task (assignment)? Is that what "following directions" means?
Do we need normative information (examples, by level) to use across institutions?
Should we be tracking students' performance chronologically?
Three out of four preferred the institutionally designed rubric if a "sources" criterion and an "NA" column could be added. The VALUE rubric contains a criterion for sources; the institutional example did not, as some assignments may or may not require sources.
The institutionally developed rubric has "development of topic," a good criterion lacking in the VALUE rubric.
The VALUE rubric does not have an absolute "fail," i.e., a zero (group 2).
Too much is packed into the VALUE rubric if given to students; it would be more confusing for them. The institutional rubric would be easier for them to understand.
Sources of evidence: quality is a disciplinary-based discussion. Need to have the disciplinary perspective represented in the exercise.
We did note the need to have a "0" to spread out the scale (because some will not meet benchmark) and to have a don't know/not applicable category, depending upon the assignment, what we know about the references, disciplinary differences, etc.
We also discussed the issue of intra-rater reliability: we all come at this assignment (C) with a different set point.
What are the goals; what is the purpose? Going on the street to communicate, sure. Really represent the discipline, not so much.
We have to be very careful about selecting evidence and examples. The assignment affects the scoring; variability creates more variability.
Need for extensive training/norming.
Question about guidelines for assignments.
Thought there was a lot of overlap between categories.
Good assignments are critical.
Assessing outside your expertise is difficult.
The categories Genre and Disciplinary Conventions and Content Development seem most related to the assignment; the assignment should talk specifically re: format of documentation.
The category Sources and Evidence seems to contain two vectors: sound versus unsound sources, and success in using them consistently; it is difficult to parse these two.

Ways in Which Criteria in the VALUE Rubric and the Institutionally Developed Rubric Are Similar

Mechanics and syntax are common criteria.
Sources are common in both.
Both address most of the same concerns: purpose, sources and evidence, mechanics, organization, and syntax.

Ways in Which Criteria in the VALUE Rubric and the Institutionally Developed Rubric Differ

Liked 4 better in the VALUE rubric.
VALUE has broader categories, and institutional rubrics have more specifics (e.g., tone).
Institutional rubrics offer more opportunity for constructive feedback (due to their specificity).
Critical thinking is not broken out in the VALUE rubric.
No critical thinking or synthesis in VALUE (sources and evidence is not quite sufficient as a category).
The VALUE rubric brings in the idea of attending to disciplinary conventions.
The institutional rubric may be more limited in its applicability to various types of writing, but it is also more detailed, which is good.
Different terminology for similar categories (both in the category titles and within them).

Experiences Using the VALUE Rubric to Score Samples

We certainly need a zero, because this isn't as bad as it could get; we also need a "don't know."
Difficult to judge sources of evidence; therefore a range in our scoring of Sample C.
Much has to do with how "develop" and "explore" are defined.
Would like to know more about the discipline.
Disagreement about "conforming to assignment."
What does "some errors" mean?

Sample A

Regarding the VALUE rubric and Sample A: for our group this was somewhat straightforward, with more similarity than on C. We did note the need to have a "0" to spread out the scale (because some will not meet benchmark) and to have a don't know/not applicable category, depending upon the assignment, what we know about the references, disciplinary differences, etc.

Sample C

Regarding the VALUE rubric and Sample C: we were more mixed on this one, which is a problem of the rubric. It is a much better paper, but the scores aren't that different. Clearly an upper-level English paper.
We all agreed it was a better paper, but we didn't agree on ratings, and in some cases the second paper got lower ratings from a reviewer than were given on paper A. Much of our discussion about why this was so is tied to the assignment: its complexity/sophistication and the choices the student made in how to respond.
We also discussed the issue of intra-rater reliability: we all come at this assignment with a different set point.
We also got into standards for judgment. One person said this shouldn't be graded as an English paper; the standard is more whether they communicate the capacity to "communicate on the street," and by this standard, the student would meet the requirement.
Ongoing discussion of whether or not there was a thesis: some felt there was (fitting with the assignment requirement); others saw none, or one poorly developed.
Sources of evidence: quality is a disciplinary-based discussion. Need to have the disciplinary perspective represented in the exercise.
Differences in goals for the exercise: do you want them to be able to hit the streets with this level of skill? Sure. But after four years, we want to show they are embedded in the discipline.
Speaks to the need to work on the rubrics: they are problematic in that they do not discern the greater difference in quality.
What are the goals; what is the purpose? Going on the street to communicate, sure. Really represent the discipline, not so much.
We are doing this for a comparative purpose (compared to A, compared to expectations in the disciplines), but the rubric asks you to score each paper on its own terms.
We have to be very careful about selecting evidence and examples. The assignment affects the scoring; variability creates more variability.
A representative from one campus is looking at the possibility of reviewing/assessing student work at three levels: how it is assessed at the course level, the writing-program level, and the disciplinary/departmental level. This also made us wonder how this multi-level approach could work for the Vision Project goals of comparability, a "best" standard, etc.

Discussion of UMB Criteria/Rubric

We were mixed on whether the UMB rubric could map against VALUE. One felt there was minimal mapping across the two (and that the UMB rubric was very much tied to their specific assignment and couldn't work easily for the other assignments we were given). Others felt there was a connection (particularly if we consider the Critical Thinking and Critical Reading rubrics as well).
...the UMB rubric unpacks the VALUE rubric, which makes it easier to use. Sure, I could map the two against each other and identify the commonalities (for example, in a grid), but the UMB rubric is easier to use.
The elements of writing are all common; there will be lots of overlap. Although we didn't all see this commonality in this rubric: some felt the connections are more implied, processes "behind" what the VALUE rubric identifies (for example, you can't adequately consider audience context without critical reading and thinking skills). Other relevant VALUE rubrics (Critical Thinking and Critical Reading) overlap with this. If we are going to be evaluating communication at the state level, we need to take these together.
The advantage of the UMB rubric is that it really is a gen-ed-focused assessment tool, as opposed to the VALUE rubric. There is a question of relative first-time pass rates by major discipline, the assumption being that some majors will find the task easier than others.
This is much more a reading rubric, focused very much on the readings assigned. The rubric can't be used to assess an outside paper, but the students are being asked to read resources carefully. In Sample C, some of us could make connections for the use of "source" material, so some of it could be used.
Student C doesn't do the assignment correctly, so this is not correct. The business person says: they didn't do the assignment, so it fails. (The rubric doesn't ask whether you did the assignment well.)
If the assignment were targeted for the UMB rubric, we could use it.
We also wondered about this exercise within the context, purpose, and audience of the statewide Vision assessment, with its goals of "best," comparability, a focus on three outcomes, etc. We need to be clear about these issues for the Vision/LEAP task as a whole.