Welcome!
MILEX: March 2014
TU’s Excellent Rubric Assessment Adventures:
On the RAILS
Shana Gass Claire Holmes Lisa Sweeney
sgass@towson.edu cholmes@towson.edu sweeney@towson.edu
Agenda for Today:
• Background on Assessment, RAILS & Rubrics
• Norming & Rating Sessions
• Working Lunch:
Create Draft Rubrics
• Reflections & Questions
Assessment…
• Knowing what you are doing
• Knowing why you are doing it
• Knowing what students are learning as a
result
• Changing because of the information
(~Debra Gilchrist, Dean of Libraries and Institutional
Effectiveness, Pierce College)
Information Literacy Instruction Assessment Cycle (ILIAC):
Identify learning outcomes → Create and enact learning activities → Gather data to check for learning → Interpret data → Enact decisions to increase learning → (repeat)
Oakleaf, M. (2009). The information literacy instruction assessment cycle: A guide for increasing student learning and improving librarian instructional skills. Journal of Documentation, 65(4), 539-560.
The Institute of Museum and Library Services is the primary
source of federal support for the nation’s 123,000 libraries and
17,500 museums. The Institute's mission is to create strong
libraries and museums that connect people to information and
ideas.
Megan Oakleaf,
founder of all things
RAILS.
RAILS Project Purpose
• Investigated an analytic rubric
approach to IL assessment in
higher education
• Developed a suite of IL rubrics
• Investigated rubric reliability & validity
• Developed materials for rater training,
norming, and scoring
• Explored indicators of rater expertise
Cook’s RAILS Purpose
• Gain rubric experience:
creating/norming/rating
• Identify assessment opportunities
within TU’s Core Curriculum
• Develop a rubric for use on campus
• Assess students’ information literacy skills
• Examine instructional practices
Cook
Library’s
Priorities:
• Begin cycle of tracking
student learning.
• Begin cycle of tracking
instruction practices.
• Begin cycle of collecting
aggregated & anonymous
data.
• Reinforce regular opportunities
for reflection & discussion
among library instruction
colleagues.
(facilitate development of a
Community of Reflective
Practice)
(Image: AP Images)
Our assessment
adventure…
Understanding by Design
1. What do you want students to learn?
(outcome)
2. How will you know that they have learned it?
(assessment)
3. What activities will help them learn, and at the same
time, provide assessment data?
(teaching method & assessment)
(Wiggins & McTighe, 2006)
Performance/Integrated Assessment
Students reveal their learning when they are provided with complex, authentic LEARNING ACTIVITIES that ask them to explain, interpret, apply, shift perspective, empathize, and self-assess.
What we assess. What they learn.
(Megan Oakleaf, Assessment: Demonstrating the Educational Value of the Academic Library, ACRL
Assessment Immersion, 2011.)
5 Questions for Assessment Design:
1. Outcome What do you want the student to be able to
do?
2. IL Curriculum What does the student need to know in
order to do this well?
3. Pedagogy What type of instruction will best enable
the learning?
4. Assessment How will the student demonstrate the
learning?
5. Criteria for
evaluation
How will you know the student has done
well?
(Lisa Hinchcliffe, Student Learning Assessment Cycle. ACRL Assessment Immersion, 2011)
Evidence of “authentic” student learning:
For example, the research worksheet in your packet, which asks students to break down and practice the sequential steps of the search process.
Brainstorm…
What other evidence of student learning do we already collect? What could we collect?
Evidence: Possible examples of
authentic student learning…
• Research journal
• Reflective writing
• “think aloud”
• Self or peer evaluation
• Works cited page
• Annotated bibliography
• Posters
• Multimedia presentations
• Speeches
• Open-ended question
responses
• Group projects
• Performances
• Portfolios
• Library assignments
• Worksheets
• Concept maps
• Citation maps
• Tutorial responses
• Blogs
• Wikis
• Lab reports
Rubrics – Basics
• 2 dimensions:
1. criteria
2. levels of performance
• grid or table format
• judges quality
• translates unwieldy data into accessible information
(Image: thefirstgradediaries.blogspot.com)
Criteria
1.“the conditions a [student] must meet to be
successful” (Wiggins)
2.“the set of indicators, markers, guides, or a list of
measures or qualities that will help [a scorer]
know when a [student] has met an outcome”
(Bresciani, Zelna & Anderson)
3.what to look for in [student] performance “to
determine progress…or determine when mastery
has occurred” (Arter)
Performance Levels
mastery, progressing, emerging, satisfactory, marginal, proficient, high, middle, beginning, advanced, novice, intermediate, sophisticated, competent, professional, exemplary, needs work, adequate, developing, accomplished, distinguished
(or numerical…)
SAMPLE RAILS RUBRIC
(green handout in your packet)
1. Determines Key Concepts
Level 3: Determines multiple key concepts that accurately reflect the research topic/thesis statement.
Level 2: Determines some concepts that reflect the research topic/thesis statement, but the concept breakdown is incomplete or repetitive.
Level 1: Determines concepts that inaccurately reflect the research topic/thesis statement.
Level 0: Does not determine any concepts that reflect the research question/thesis statement.

2. Identifies Synonyms and Related Terms
Level 3: Identifies relevant synonyms and/or related terms that match key concepts.
Level 2: Attempts synonym (or related term) use, but the synonym list is incomplete or not fully relevant to key concepts.
Level 1: Identifies synonyms that inaccurately reflect the key concepts.
Level 0: Does not identify synonyms.

3. Constructs a Search Strategy Using Relevant Operators
Level 3: Constructs a search strategy using an appropriate combination of relevant operators (for example: AND, OR, NOT) correctly.
Level 2: Constructs a search strategy using operator(s), but uses operators in an incomplete or limited way.
Level 1: Constructs a search strategy using operators incorrectly.
Level 0: Does not use operators.

4. Uses Evaluative Criteria to Select Source(s)
Level 3: Uses evaluative criteria to provide an in-depth explanation of the rationale for the source selected.
Level 2: Uses evaluative criteria to provide a limited/superficial explanation of the rationale for the source selected.
Level 1: Attempts to use evaluative criteria, but does so inaccurately or incorrectly.
Level 0: Does not use evaluative criteria.

5. Uses Citations
Level 3: Uses an appropriate standard citation style consistently and correctly.
Level 2: Uses an appropriate standard citation style consistently (bibliographic elements intact), but with minimal format and/or punctuation errors.
Level 1: Attempts an appropriate standard citation style, but does not include all bibliographic elements consistently or correctly.
Level 0: Does not include common citation elements or does not include citations.
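The rubric's grid of criteria × performance levels maps naturally onto a simple data structure. A minimal sketch in Python — the level descriptors are abbreviated from the sample rubric above, while the scoring helper and the rater's scores are hypothetical, added only to illustrate the shape:

```python
# Sketch: a rubric as criteria x performance levels, plus simple scoring.
# Level descriptors are abbreviated from the sample RAILS rubric above;
# the rater's scores at the bottom are invented for illustration.

RUBRIC = {
    "Determines key concepts": {
        3: "Multiple key concepts accurately reflect the topic/thesis",
        2: "Some concepts, but breakdown is incomplete or repetitive",
        1: "Concepts reflect the topic inaccurately",
        0: "Does not determine any concepts",
    },
    "Constructs a search strategy using relevant operators": {
        3: "Appropriate combination of operators (AND, OR, NOT) used correctly",
        2: "Uses operator(s), but in an incomplete or limited way",
        1: "Uses operators incorrectly",
        0: "Does not use operators",
    },
}

def score_artifact(ratings: dict) -> int:
    """Sum one rater's per-criterion levels into a single artifact score,
    checking each rating is a defined performance level for that criterion."""
    for criterion, level in ratings.items():
        assert level in RUBRIC[criterion], f"undefined level {level} for {criterion}"
    return sum(ratings.values())

# One rater's ratings for one student worksheet (hypothetical):
ratings = {
    "Determines key concepts": 2,
    "Constructs a search strategy using relevant operators": 3,
}
print(score_artifact(ratings))  # 5
```

Keeping the descriptors alongside the numbers mirrors how raters use the printed grid: the number is only meaningful with its descriptor in view.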
Workshop Norming Practice
Round 1
• For first student work sample, Claire will
“norm aloud.”
• Participants will rate 2 work samples
individually.
• Group discussion: Can we reach consensus
for what constitutes evidence for each
performance level?
Norming: Round 2
• Participants will rate 2 more work samples
individually.
• Group discussion: Are we closer to
consensus?
• Should we establish rating ground rules?
• Does the rubric need to be modified?
Keep in mind…
• An info lit skills rubric does not score
discipline content; it scores information
literacy skills.
• You can only score what you can see.
Norming/Rating Discussion
• How do we achieve
consensus?
• What was challenging?
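Whether norming is working can also be checked quantitatively. A minimal sketch, assuming each rater's levels for the same set of artifacts are collected as parallel lists; the ratings below are invented, and exact agreement is only the simplest check (more formal statistics such as Cohen's kappa also correct for chance agreement):

```python
# Sketch: checking how well two raters agree after a norming session.
# The ratings are hypothetical levels (0-3) for the same five artifacts.

def exact_agreement(rater_a: list, rater_b: list) -> float:
    """Fraction of artifacts on which both raters chose the same level."""
    assert len(rater_a) == len(rater_b), "raters must score the same artifacts"
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return matches / len(rater_a)

def adjacent_agreement(rater_a: list, rater_b: list) -> float:
    """Fraction of artifacts where the raters are within one level of each
    other -- a common looser benchmark in rubric studies."""
    near = sum(abs(a - b) <= 1 for a, b in zip(rater_a, rater_b))
    return near / len(rater_a)

rater_1 = [3, 2, 2, 1, 0]
rater_2 = [3, 2, 1, 1, 1]
print(exact_agreement(rater_1, rater_2))     # 0.6
print(adjacent_agreement(rater_1, rater_2))  # 1.0
```

A large gap between exact and adjacent agreement, as here, often signals that two neighboring performance-level descriptions need sharper wording.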
Rubrics – Benefits
Learning
• Articulate and communicate agreed upon
learning goals
• Provide direct feedback to learners
• Facilitate self-evaluation
• Focus on learning standards
Creating a rubric:
1. What are our expectations of students completing this assignment?
2. What does successful learning of this type look like?
3. What specific learning outcomes do we want to see in the completed assignment?
4. What evidence can we find that will demonstrate learning success?
More benefits of a (normed) rubric…
Data
• Facilitate consistent, accurate, unbiased scoring
• Deliver data that is easy to understand, defend,
and convey
• Offer detailed descriptions necessary for informed
decision-making
• Can be used over time or across multiple programs
Other
• Are inexpensive ($) to design & implement
Rubrics – Limitations
• Possible design flaws that impact data quality
• Require significant time for development
• Sometimes fail to balance between holistic and
analytic focus
• May fail to balance between generalized
wording and detailed description
• Can lack differentiation between
performance levels
RAILS Lessons
• Explicit, detailed performance
descriptions are crucial to achieve
inter-rater reliability.
• Raters appear to be more confident
about their ratings when student
artifacts under analysis are concrete,
focused, and shorter in length.
• The best raters “believe in” outcomes, value constructed consensus (or “disagree and commit”), negotiate meaning across disciplines, develop shared vocabulary, etc.
Information Literacy Instruction Assessment Cycle (ILIAC), revisited:
Identify learning outcomes → Create and enact learning activities → Gather data to check for learning → Interpret data → Enact decisions to increase learning → (repeat)
(Oakleaf, 2009)
Using Assessment Results…
• Internally: instruction improvement; assessment improvement
• Professionally: conferences; publications
References
Arter, J. (2000). Rubrics, scoring guides, and performance criteria:
Classroom tools for assessing and improving student learning.
Retrieved from http://eric.ed.gov/?id=ED446100
Bresciani, M., Zelna, C. & Anderson, J. (2004). Assessing student learning
and development: A handbook for practitioners. Washington, DC:
NASPA-Student Affairs Administrators in Higher Education.
Wiggins, G. P., & McTighe, J. (2006). Understanding by design. Upper Saddle River, NJ: Pearson Education.
Wiggins, G. P. (1998). Educative assessment: Designing assessments
to inform and improve student performance. San Francisco,
CA: Jossey-Bass.
Selected Readings:
Diller, K. R., & Phelps, S. F. (2008). Learning outcomes, portfolios, and rubrics, oh my! Authentic
assessment of an information literacy program. Portal: Libraries and the Academy, 8 (1),
75-89.
Fagerheim, B. A., & Shrode, F. G. (2009). Information literacy rubrics within the disciplines.
Communications in Information Literacy, 3(2), 158-170.
Holmes, C. & Oakleaf, M. (2013). The Official (and Unofficial) Rules for Norming Rubrics Successfully.
Journal of Academic Librarianship, 39(6), 599-602.
Knight, L. A. (2006). Using rubrics to assess information literacy. Reference Services Review, 34(1),
43-55.
Oakleaf, M. (2007). Using rubrics to collect evidence for decision-making: What do librarians need to
learn? Evidence Based Library and Information Practice, 2(3), 27-42.
Oakleaf, M. (2009). The information literacy instruction assessment cycle: A guide for increasing
student learning and improving librarian instructional skills. Journal of
Documentation, 65(4), 539-560.
Oakleaf, M., Millet, M., & Kraus, L. (2011). All together now: getting faculty, administrators, and staff
engaged in information literacy assessment. Portal: Libraries and the Academy, 11(3), 831-
852.
Stevens, D. D., & Levi, A. (2005). Introduction to rubrics: An assessment tool to save grading time,
convey effective feedback, and promote student learning. Sterling, VA: Stylus Publishing.
SlideShare URL:
http://www.slideshare.net/claireholmes/milex-assess-norm2014
Thank you!
Hybridoma Technology ( Production , Purification , and Application )
 
Science 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its CharacteristicsScience 7 - LAND and SEA BREEZE and its Characteristics
Science 7 - LAND and SEA BREEZE and its Characteristics
 
Crayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon ACrayon Activity Handout For the Crayon A
Crayon Activity Handout For the Crayon A
 
microwave assisted reaction. General introduction
microwave assisted reaction. General introductionmicrowave assisted reaction. General introduction
microwave assisted reaction. General introduction
 
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
18-04-UA_REPORT_MEDIALITERAСY_INDEX-DM_23-1-final-eng.pdf
 
Solving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptxSolving Puzzles Benefits Everyone (English).pptx
Solving Puzzles Benefits Everyone (English).pptx
 
Employee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptxEmployee wellbeing at the workplace.pptx
Employee wellbeing at the workplace.pptx
 
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
KSHARA STURA .pptx---KSHARA KARMA THERAPY (CAUSTIC THERAPY)————IMP.OF KSHARA ...
 
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
“Oh GOSH! Reflecting on Hackteria's Collaborative Practices in a Global Do-It...
 
Interactive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communicationInteractive Powerpoint_How to Master effective communication
Interactive Powerpoint_How to Master effective communication
 
Separation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and ActinidesSeparation of Lanthanides/ Lanthanides and Actinides
Separation of Lanthanides/ Lanthanides and Actinides
 
Paris 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activityParis 2024 Olympic Geographies - an activity
Paris 2024 Olympic Geographies - an activity
 

MILEXAssessmentRubrics2014

  • 1. Welcome! MILEX: March 2014 TU’s Excellent Rubric Assessment Adventures: On the RAILS Shana Gass Claire Holmes Lisa Sweeney sgass@towson.edu cholmes@towson.edu sweeney@towson.edu
  • 2. Agenda for Today: • Background on Assessment, RAILS & Rubrics • Norming & Rating Sessions • Working Lunch: Create Draft Rubrics • Reflections & Questions
  • 3. Assessment… • Knowing what you are doing • Knowing why you are doing it • Knowing what students are learning as a result • Changing because of the information (~Debra Gilchrist, Dean of Libraries and Institutional Effectiveness, Pierce College)
  • 4. Identify learning outcomes Create and enact learning activities Gather data to check for learning Interpret data Enact decisions to increase learning Information Literacy Instruction Assessment Cycle (ILIAC) Oakleaf, Megan. "The Information Literacy Instruction Assessment Cycle: A Guide for Increasing Student Learning and Improving Librarian Instructional Skills." Journal of Documentation. 65.4. 2009.
  • 5. The Institute of Museum and Library Services is the primary source of federal support for the nation’s 123,000 libraries and 17,500 museums. The Institute's mission is to create strong libraries and museums that connect people to information and ideas. Megan Oakleaf, founder of all things RAILS.
  • 6. RAILS Project Purpose • Investigated an analytic rubric approach to IL assessment in higher education • Developed a suite of IL rubrics • Investigated rubric reliability & validity • Developed training materials for training/ norming/ scoring • Explored indicators of rater expertise
  • 7. Cook’s RAILS Purpose • Gain rubric experience: creating/norming/rating • Identify assessment opportunities within TU’s Core Curriculum • Develop a rubric for use on campus • Assess students’ information literacy skills • Examine instructional practices
  • 9. • Begin cycle of tracking student learning. • Begin cycle of tracking instruction practices. • Begin cycle of collecting aggregated & anonymous data. • Reinforce regular opportunities for reflection & discussion among library instruction colleagues. (facilitate development of a Community of Reflective Practice) (Image: AP Images) Our assessment adventure…
  • 10. Understanding by Design 1. What do you want students to learn? (outcome) 2. How will you know that they have learned it? (assessment) 3. What activities will help them learn, and at the same time, provide assessment data? (teaching method & assessment) (Wiggins & McTighe, 2006)
  • 11. Performance/Integrated Assessment Students reveal their learning when they are provided with: complex, authentic LEARNING ACTIVITIES to explain, interpret, apply, shift perspective, empathize and self-assess. What they learn. What we assess. (Megan Oakleaf, Assessment: Demonstrating the Educational Value of the Academic Library, ACRL Assessment Immersion, 2011.)
  • 12. 5 Questions for Assessment Design: 1. Outcome What do you want the student to be able to do? 2. IL Curriculum What does the student need to know in order to do this well? 3. Pedagogy What type of instruction will best enable the learning? 4. Assessment How will the student demonstrate the learning? 5. Criteria for evaluation How will you know the student has done well? (Lisa Hinchcliffe, Student Learning Assessment Cycle. ACRL Assessment Immersion, 2011)
  • 13. Evidence of “authentic” student learning: For instance, the research worksheet in your packet that asks students to break down and practice sequential steps in the search process. Brainstorm… What other possible examples of evidence of student learning do we collect? What could we collect?
  • 15. Evidence: Possible examples of authentic student learning… • Research journal • Reflective writing • “think aloud” • Self or peer evaluation • Works cited page • Annotated bibliography • Posters • Multimedia presentations • Speeches • Open-ended question responses • Group projects • Performances • Portfolios • Library assignments • Worksheets • Concept maps • Citation maps • Tutorial responses • Blogs • Wikis • Lab reports
  • 16. • 2 dimensions 1. criteria 2. levels of performance • grid or table format • judges quality • translates unwieldy data into accessible information (Image: thefirstgradediaries.blogspot.com)
  • 17. Criteria 1.“the conditions a [student] must meet to be successful” (Wiggins) 2.“the set of indicators, markers, guides, or a list of measures or qualities that will help [a scorer] know when a [student] has met an outcome” (Bresciani, Zelna & Anderson) 3.what to look for in [student] performance “to determine progress…or determine when mastery has occurred” (Arter)
  • 18. Performance Levels mastery, progressing, emerging, satisfactory, marginal, proficient, high, middle, beginning, advanced, novice, intermediate, sophisticated, competent, professional, exemplary, needs work, adequate, developing, accomplished, distinguished (or numerical…)
  • 19. SAMPLE RAILS RUBRIC (green handout in your packet)
    1. Determines Key Concepts
       Level 3: Determines multiple key concepts that reflect the research topic/thesis statement accurately.
       Level 2: Determines some concepts that reflect the research topic/thesis statement, but concept breakdown is incomplete or repetitive.
       Level 1: Determines concepts that reflect the research topic/thesis statement inaccurately.
       Level 0: Does not determine any concepts that reflect the research question/thesis statement.
    2. Identifies synonyms and related terms
       Level 3: Identifies relevant synonyms and/or related terms that match key concepts.
       Level 2: Attempts synonym (or related term) use, but synonym list is incomplete or not fully relevant to key concepts.
       Level 1: Identifies synonyms that inaccurately reflect the key concepts.
       Level 0: Does not identify synonyms.
    3. Constructs a search strategy using relevant operators
       Level 3: Constructs a search strategy using an appropriate combination of relevant operators (for example: and, or, not) correctly.
       Level 2: Constructs a search strategy using operator(s), but uses operators in an incomplete or limited way.
       Level 1: Constructs a search strategy using operators incorrectly.
       Level 0: Does not use operators.
    4. Uses evaluative criteria to select source(s)
       Level 3: Uses evaluative criteria to provide in-depth explanation of rationale for source selected.
       Level 2: Uses evaluative criteria to provide a limited/superficial explanation of rationale for source selected.
       Level 1: Attempts to use evaluative criteria, but does so inaccurately or incorrectly.
       Level 0: Does not use evaluative criteria.
    5. Uses Citations
       Level 3: Uses an appropriate standard citation style consistently and correctly.
       Level 2: Uses an appropriate standard citation style consistently (bibliographic elements intact), but with minimal format and/or punctuation errors.
       Level 1: Attempts an appropriate standard citation style, but does not include all bibliographic elements consistently or correctly.
       Level 0: Does not include common citation elements or does not include citations.
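A criterion-by-level rubric like the sample above translates naturally into a small data structure, which makes the "aggregated & anonymous data" goal concrete. The sketch below is illustrative only: the criterion names echo the sample rubric, but the structure, function names, and sample scores are assumptions, not RAILS artifacts.

```python
from collections import defaultdict

# Criteria from the sample rubric; levels run 0 (lowest) to 3 (highest).
CRITERIA = [
    "Determines key concepts",
    "Identifies synonyms and related terms",
    "Constructs a search strategy using relevant operators",
    "Uses evaluative criteria to select source(s)",
    "Uses citations",
]
NUM_LEVELS = 4

def tally(scored_artifacts):
    """Count how many artifacts landed at each performance level,
    per criterion -- the kind of aggregate, anonymous summary a
    library might track across instruction sessions."""
    counts = defaultdict(lambda: [0] * NUM_LEVELS)
    for ratings in scored_artifacts:          # one dict per student artifact
        for criterion, level in ratings.items():
            counts[criterion][level] += 1
    return dict(counts)

# Two hypothetical scored worksheets (partial ratings for brevity).
samples = [
    {"Uses citations": 2, "Determines key concepts": 3},
    {"Uses citations": 1, "Determines key concepts": 3},
]
print(tally(samples))
```

Each criterion's list reads left to right as the number of artifacts scored at levels 0 through 3, so semester-over-semester comparison is a matter of comparing these vectors.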
  • 20. Workshop Norming Practice Round 1 • For first student work sample, Claire will “norm aloud.” • Participants will rate 2 work samples individually. • Group discussion: Can we reach consensus for what constitutes evidence for each performance level?
  • 21. Norming: Round 2 • Participants will rate 2 more work samples individually. • Group discussion: Are we closer to consensus? • Do we establish rating ground rules? • Does the rubric need to be modified?
  • 22. Keep in mind… • An info lit skills rubric does not score discipline content; it scores information literacy skills. • You can only score what you can see.
  • 23. Norming/Rating Discussion • How do we achieve consensus? • What was challenging?
  • 24. Rubrics – Benefits Learning • Articulate and communicate agreed upon learning goals • Provide direct feedback to learners • Facilitate self-evaluation • Focus on learning standards
  • 25. 1. What are our expectations of students completing this assignment? 2. What does a successful learning of this type look like? 3. What specific learning outcomes do we want to see in the completed assignment? 4. What evidence can we find that will demonstrate learning success? Creating a rubric:
  • 26. More benefits of a (normed) rubric… Data • Facilitate consistent, accurate, unbiased scoring • Deliver data that is easy to understand, defend, and convey • Offer detailed descriptions necessary for informed decision-making • Can be used over time or across multiple programs Other • Are inexpensive ($) to design & implement
  • 27. Rubrics – Limitations • Possible design flaws that impact data quality • Require significant time for development • Sometimes fail to balance between holistic and analytic focus • May fail to balance between generalized wording and detailed description • Can lack differentiation between performance levels
  • 28. RAILS Lessons • Explicit, detailed performance descriptions are crucial to achieve inter-rater reliability. • Raters appear to be more confident about their ratings when student artifacts under analysis are concrete, focused, and shorter in length. • The best raters “believe in” outcomes, value constructed consensus (or “disagree and commit”), negotiate meaning across disciplines, develop shared vocabulary, etc.
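The inter-rater reliability that norming works toward can be quantified with a simple agreement statistic. The sketch below computes Cohen's kappa for two raters, which corrects raw percent agreement for the agreement expected by chance; the scores are hypothetical, not RAILS data.

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same artifacts:
    observed agreement, corrected for the agreement expected by
    chance given each rater's marginal score distribution."""
    assert len(rater_a) == len(rater_b) and rater_a
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    counts_a, counts_b = Counter(rater_a), Counter(rater_b)
    expected = sum(
        (counts_a[c] / n) * (counts_b[c] / n)
        for c in set(rater_a) | set(rater_b)
    )
    return (observed - expected) / (1 - expected)

# Hypothetical performance levels (0-3) assigned by two raters
# to the same ten student artifacts after a norming session.
a = [3, 2, 2, 1, 0, 3, 2, 1, 1, 2]
b = [3, 2, 1, 1, 0, 3, 2, 2, 1, 2]
print(round(cohens_kappa(a, b), 3))  # prints 0.714
```

A kappa near 1 indicates the norming "took"; values that stay low after discussion are a signal, per the lesson above, that the performance descriptions need to be more explicit and detailed.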
  • 29. Identify learning outcomes Create and enact learning activities Gather data to check for learning Interpret data Enact decisions to increase learning Information Literacy Instruction Assessment Cycle (ILIAC) Oakleaf, Megan. "The Information Literacy Instruction Assessment Cycle: A Guide for Increasing Student Learning and Improving Librarian Instructional Skills." Journal of Documentation. 65.4. 2009.
  • 30. • Internally – Instruction improvement – Assessment improvement • Professionally – Conferences – Publications Using Assessment Results…
  • 31. References Arter, J. (2000). Rubrics, scoring guides, and performance criteria: Classroom tools for assessing and improving student learning. Retrieved from http://eric.ed.gov/?id=ED446100 Bresciani, M., Zelna, C., & Anderson, J. (2004). Assessing student learning and development: A handbook for practitioners. Washington, DC: NASPA-Student Affairs Administrators in Higher Education. Wiggins, G. P., & McTighe, J. (2006). Understanding by design. Upper Saddle River, NJ: Pearson Education. Wiggins, G. P. (1998). Educative assessment: Designing assessments to inform and improve student performance. San Francisco, CA: Jossey-Bass.
  • 32. Selected Readings: Diller, K. R., & Phelps, S. F. (2008). Learning outcomes, portfolios, and rubrics, oh my! Authentic assessment of an information literacy program. Portal: Libraries and the Academy, 8(1), 75-89. Fagerheim, B. A., & Shrode, F. G. (2009). Information literacy rubrics within the disciplines. Communications in Information Literacy, 3(2), 158-170. Holmes, C., & Oakleaf, M. (2013). The official (and unofficial) rules for norming rubrics successfully. Journal of Academic Librarianship, 39(6), 599-602. Knight, L. A. (2006). Using rubrics to assess information literacy. Reference Services Review, 34(1), 43-55. Oakleaf, M. (2007). Using rubrics to collect evidence for decision-making: What do librarians need to learn? Evidence Based Library and Information Practice, 2(3), 27-42. Oakleaf, M. (2009). The information literacy instruction assessment cycle: A guide for increasing student learning and improving librarian instructional skills. Journal of Documentation, 65(4), 539-560. Oakleaf, M., Millet, M., & Kraus, L. (2011). All together now: Getting faculty, administrators, and staff engaged in information literacy assessment. Portal: Libraries and the Academy, 11(3), 831-852. Stevens, D. D., & Levi, A. (2005). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback, and promote student learning. Sterling, VA: Stylus Publishing.
  • 33. MILEX: March 2014 TU’s Excellent Rubric Assessment Adventures: On the RAILS Shana Gass Claire Holmes Lisa Sweeney sgass@towson.edu cholmes@towson.edu sweeney@towson.edu SlideShare URL: http://www.slideshare.net/claireholmes/milex-assess-norm2014 Thank you!

Editor's Notes

  1. This is to cue raters to the process.
  2. Panel 2