MLA/DLAInfoLitAssessmentRubrics2014

Claire Holmes
Research & Instruction Librarian at Towson University
WELCOME TO MLA/DLA 2014! Turn it UP!
Expand your networking opportunities
with MLA’s LinkedIn page
Rev Up Information Literacy Assessment: Use Rubrics to Close That Loop!
Claire Holmes Carissa Tomlinson
cholmes@towson.edu ctomlinson@towson.edu
tweets @TUEdLibrarian
facebook.com/MDLib
@MDLibraryAssoc
#MLADLA14
Get the conference schedule on your device --
go to m.lanyrd.com and search MLADLA. #MLADLA14
Agenda for Today:
• Background on Assessment, RAILS & Rubrics
• Norming & Rating
• Rubric Evaluation
• Reflections & Questions
Assessment…
• Knowing what you are doing
• Knowing why you are doing it
• Knowing what students are learning as a
result
• Changing because of the information
(~Debra Gilchrist, Dean of Libraries and Institutional Effectiveness,
Pierce College)
Information Literacy Instruction Assessment Cycle (ILIAC):
1. Identify learning outcomes
2. Create and enact learning activities
3. Gather data to check for learning
4. Interpret data
5. Enact decisions to increase learning
Oakleaf, Megan. "The Information Literacy Instruction Assessment Cycle: A Guide for Increasing Student Learning and Improving Librarian Instructional Skills." Journal of Documentation 65.4 (2009).
The Institute of Museum and Library Services is the primary
source of federal support for the nation’s 123,000 libraries and
17,500 museums. The Institute's mission is to create strong
libraries and museums that connect people to information and
ideas.
Megan Oakleaf,
founder of all things
RAILS.
RAILS Project Purpose
• Investigated an analytic rubric
approach to IL assessment in
higher education
• Developed a suite of IL rubrics
• Investigated rubric reliability & validity
• Developed training materials for training/
norming/ scoring
• Explored indicators of rater expertise
Cook’s RAILS Purpose
• Gain rubric experience:
creating/norming/rating
• Identify assessment opportunities
within TU’s Core Curriculum
• Develop a rubric for use on campus
• Assess students’ information literacy skills
• Examine instructional practices
Cook
Library’s
Priorities:
• Begin cycle of tracking
student learning.
• Begin cycle of tracking
instruction practices.
• Begin cycle of collecting
aggregated & anonymous
data.
• Reinforce regular opportunities
for reflection & discussion
among library instruction
colleagues.
(facilitate development of a
Community of Reflective
Practice)
Our assessment
adventure…
Understanding by Design
1. What do you want students to learn?
(outcome)
2. How will you know that they have learned it?
(assessment)
3. What activities will help them learn, and at the same
time, provide assessment data?
(teaching method & assessment)
(Wiggins & McTighe, 2006)
5 Questions for Assessment Design:
1. Outcome: What do you want the student to be able to do?
2. IL Curriculum: What does the student need to know in order to do this well?
3. Pedagogy: What type of instruction will best enable the learning?
4. Assessment: How will the student demonstrate the learning?
5. Criteria for evaluation: How will you know the student has done well?
(Lisa Hinchcliffe, Student Learning Assessment Cycle. ACRL Assessment Immersion, 2011)
Evidence of “authentic” student learning:
For instance, the research worksheet in your
packet that asks students to break down and
practice sequential steps in the search process.
Brainstorm…
What other possible examples of evidence of student learning do we collect? What could we collect?
Evidence: Possible examples of
authentic student learning…
• Research journal
• Reflective writing
• “think aloud”
• Self or peer evaluation
• Works cited page
• Annotated bibliography
• Posters
• Multimedia presentations
• Speeches
• Open-ended question
responses
• Group projects
• Performances
• Portfolios
• Library assignments
• Worksheets
• Concept maps
• Citation maps
• Tutorial responses
• Blogs
• Wikis
• Lab reports
Rubrics:
• 2 dimensions:
1. criteria
2. levels of performance
• grid or table format
• judges quality
• translates unwieldy data into accessible information
SAMPLE RAILS RUBRIC
(green handout in your packet)

1. Determines Key Concepts
Level 3: Determines multiple key concepts that reflect the research topic/thesis statement accurately.
Level 2: Determines some concepts that reflect the research topic/thesis statement, but concept breakdown is incomplete or repetitive.
Level 1: Determines concepts that reflect the research topic/thesis statement inaccurately.
Level 0: Does not determine any concepts that reflect the research question/thesis statement.

2. Identifies Synonyms and Related Terms
Level 3: Identifies relevant synonyms and/or related terms that match key concepts.
Level 2: Attempts synonym (or related term) use, but synonym list is incomplete or not fully relevant to key concepts.
Level 1: Identifies synonyms that inaccurately reflect the key concepts.
Level 0: Does not identify synonyms.

3. Constructs a Search Strategy Using Relevant Operators
Level 3: Constructs a search strategy using an appropriate combination of relevant operators (for example: and, or, not) correctly.
Level 2: Constructs a search strategy using operator(s), but uses operators in an incomplete or limited way.
Level 1: Constructs a search strategy using operators incorrectly.
Level 0: Does not use operators.

4. Uses Evaluative Criteria to Select Source(s)
Level 3: Uses evaluative criteria to provide an in-depth explanation of rationale for the source selected.
Level 2: Uses evaluative criteria to provide a limited/superficial explanation of rationale for the source selected.
Level 1: Attempts to use evaluative criteria, but does so inaccurately or incorrectly.
Level 0: Does not use evaluative criteria.

5. Uses Citations
Level 3: Uses an appropriate standard citation style consistently and correctly.
Level 2: Uses an appropriate standard citation style consistently (bibliographic elements intact), but with minimal format and/or punctuation errors.
Level 1: Attempts an appropriate standard citation style, but does not include all bibliographic elements consistently or correctly.
Level 0: Does not include common citation elements or does not include citations.
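The grid structure above maps naturally onto a simple data structure. A minimal sketch, assuming a hypothetical Python encoding (criterion names and descriptors are abbreviated from the sample rubric; this is an illustration, not part of the RAILS materials):

```python
# Hypothetical encoding of the sample rubric:
# criterion name -> {performance level (3..0): descriptor}.
RUBRIC = {
    "Determines key concepts": {
        3: "Determines multiple key concepts accurately.",
        2: "Determines some concepts; breakdown incomplete or repetitive.",
        1: "Determines concepts inaccurately.",
        0: "Does not determine any concepts.",
    },
    "Identifies synonyms and related terms": {
        3: "Identifies relevant synonyms matching key concepts.",
        2: "Attempts synonyms; list incomplete or not fully relevant.",
        1: "Identifies synonyms that inaccurately reflect concepts.",
        0: "Does not identify synonyms.",
    },
    # ...remaining criteria follow the same shape.
}

def score_artifact(ratings: dict) -> int:
    """Sum one 0-3 performance level per criterion into a total score."""
    for criterion, level in ratings.items():
        assert criterion in RUBRIC and level in RUBRIC[criterion]
    return sum(ratings.values())

print(score_artifact({"Determines key concepts": 3,
                      "Identifies synonyms and related terms": 2}))  # prints 5
```

Encoding the rubric this way makes the two dimensions explicit: the dict keys are the criteria, and each inner dict holds the levels of performance for that criterion.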
Criteria
1. “the conditions a [student] must meet to be
successful” (Wiggins)
2. “the set of indicators, markers, guides, or a
list of measures or qualities that will help [a
scorer] know when a [student] has met an
outcome” (Bresciani, Zelna & Anderson)
3. what to look for in [student] performance
“to determine progress…or determine
when mastery has occurred” (Arter)
4. Names vary: Objectives, Components,
Learning Outcomes, etc.
The five criteria from the sample rubric:
1. Determines Key Concepts
2. Identifies synonyms and related terms
3. Constructs a search strategy using relevant operators
4. Uses evaluative criteria to select source(s)
5. Uses Citations
Performance Levels
mastery, progressing, emerging, satisfactory,
marginal, proficient, high, middle, beginning,
advanced, novice, intermediate,
sophisticated, competent, professional,
exemplary, needs work, adequate,
developing, accomplished, distinguished
(or numerical…)
Workshop Norming Practice
Round 1
• For first student work sample, Claire will
“norm aloud.”
• Participants will rate 2 work samples
individually.
• Group discussion: Can we reach consensus
for what constitutes evidence for each
performance level?
Norming: Round 2
• Participants will rate 2 more work samples
individually.
• Group discussion: Are we closer to consensus?
• Should we establish rating ground rules?
• Does the rubric need to be modified?
Keep in mind…
• An info lit skills rubric does not score
discipline content; it scores information
literacy skills.
• You can only score what you can see.
Norming/Rating Discussion
• How do we achieve
consensus?
• What was challenging?
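Consensus during norming can also be checked numerically. A minimal sketch, with invented ratings for illustration, computing percent agreement and Cohen's kappa for two raters who scored the same artifacts on the 0-3 scale:

```python
from collections import Counter

def percent_agreement(a, b):
    """Fraction of artifacts that two raters scored at the same level."""
    return sum(x == y for x, y in zip(a, b)) / len(a)

def cohens_kappa(a, b):
    """Chance-corrected agreement between two raters."""
    n = len(a)
    p_observed = percent_agreement(a, b)
    counts_a, counts_b = Counter(a), Counter(b)
    # Probability the raters agree by chance, from each rater's level distribution.
    p_expected = sum(counts_a[k] * counts_b[k] for k in set(a) | set(b)) / (n * n)
    return (p_observed - p_expected) / (1 - p_expected)

# Hypothetical ratings of eight student artifacts.
rater1 = [3, 2, 2, 1, 0, 3, 2, 1]
rater2 = [3, 2, 1, 1, 0, 3, 3, 1]
print(round(percent_agreement(rater1, rater2), 3))  # prints 0.75
print(round(cohens_kappa(rater1, rater2), 3))       # prints 0.667
```

Kappa is lower than raw agreement because it discounts the matches two raters would produce by chance; tracking it across norming rounds gives a concrete answer to "are we closer to consensus?"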
Rubrics – Benefits
Learning
• Articulate and communicate agreed upon
learning goals
• Provide direct feedback to learners
• Facilitate self-evaluation
• Focus on learning standards
More benefits of a (normed) rubric…
Data
• Facilitate consistent, accurate, unbiased scoring
• Deliver data that is easy to understand, defend,
and convey
• Offer detailed descriptions necessary for informed
decision-making
• Can be used over time or across multiple programs
Other
• Are inexpensive ($) to design & implement
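As a sketch of how normed scores become decision-ready data, the following uses invented ratings and abbreviated criterion names to compute a mean performance level per criterion across a set of artifacts:

```python
from statistics import mean

# Hypothetical normed ratings: one dict per student artifact,
# mapping an abbreviated criterion name to its 0-3 level.
ratings = [
    {"concepts": 3, "synonyms": 2, "operators": 1},
    {"concepts": 2, "synonyms": 2, "operators": 0},
    {"concepts": 3, "synonyms": 1, "operators": 2},
]

def mean_by_criterion(ratings):
    """Aggregate artifact-level scores into one mean per criterion."""
    return {c: mean(r[c] for r in ratings) for c in ratings[0]}

for criterion, avg in mean_by_criterion(ratings).items():
    # The lowest means flag the skills instruction should revisit.
    print(f"{criterion}: mean {avg:.2f} of 3")
```

Even this small aggregation illustrates the benefit above: the per-criterion means are easy to understand, defend, and convey, and they can be compared over time or across programs.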
Rubrics – Limitations
• Possible design flaws that impact data quality
• Require significant time for development
• Sometimes fail to balance between holistic and
analytic focus
• May fail to balance between generalized
wording and detailed description
• Can lack differentiation between
performance levels
Rubric Evaluation Activity
• In groups of 2-3, spend 20 minutes answering
the following questions about the rubrics in
your packet:
– What is your overall impression of the rubric?
– Are there benefits to any of the various ways to
describe the performance levels?
– Look at one criterion and the performance levels for that criterion. Do you think it would be hard to norm this criterion? Why or why not?
Things to think about when creating a rubric:
1. What are our expectations of students completing this assignment?
2. What does successful learning of this type look like?
3. What specific learning outcomes do we want to see in the completed assignment?
4. What evidence can we find that will demonstrate learning success?
RAILS Lessons
• Explicit, detailed performance
descriptions are crucial to achieve
inter-rater reliability.
• Raters appear to be more confident
about their ratings when student
artifacts under analysis are concrete,
focused, and shorter in length.
• The best raters “believe in” outcomes,
value constructed consensus (or
“disagree and commit”),
negotiate meaning across
disciplines,
develop shared
vocabulary, etc.
Information Literacy Instruction Assessment Cycle (ILIAC), revisited: identify learning outcomes → create and enact learning activities → gather data to check for learning → interpret data → enact decisions to increase learning.
Oakleaf, Megan. "The Information Literacy Instruction Assessment Cycle: A Guide for Increasing Student Learning and Improving Librarian Instructional Skills." Journal of Documentation 65.4 (2009).
Using Assessment Results…
• Internally
– Instruction improvement
– Assessment improvement
• Professionally
– Conferences
– Publications
References
Arter, J. (2000). Rubrics, scoring guides, and performance criteria:
Classroom tools for assessing and improving student learning.
Retrieved from http://eric.ed.gov/?id=ED446100
Bresciani, M., Zelna, C. & Anderson, J. (2004). Assessing student learning
and development: A handbook for practitioners. Washington, DC:
NASPA-Student Affairs Administrators in Higher Education.
Wiggins, G. P., & McTighe, J. (2006). Understanding by design. Upper Saddle
River, NJ: Pearson Education.
Wiggins, G. P. (1998). Educative assessment: Designing assessments
to inform and improve student performance. San Francisco,
CA: Jossey-Bass.
Selected Readings:
Diller, K. R., & Phelps, S. F. (2008). Learning outcomes, portfolios, and rubrics, oh my! Authentic
assessment of an information literacy program. Portal: Libraries and the Academy, 8 (1),
75-89.
Fagerheim, B. A., & Shrode, F. G. (2009). Information literacy rubrics within the disciplines.
Communications in Information Literacy, 3(2), 158-170.
Holmes, C. & Oakleaf, M. (2013). The official (and unofficial) rules for norming rubrics successfully.
Journal of Academic Librarianship, 39(6), 599-602.
Knight, L. A. (2006). Using rubrics to assess information literacy. Reference Services Review, 34(1),
43-55.
Oakleaf, M. (2007). Using rubrics to collect evidence for decision-making: What do librarians need to
learn? Evidence Based Library and Information Practice, 2(3), 27-42.
Oakleaf, M. (2009). The information literacy instruction assessment cycle: A guide for increasing
student learning and improving librarian instructional skills. Journal of
Documentation, 65(4), 539-560.
Oakleaf, M., Millet, M., & Kraus, L. (2011). All together now: getting faculty, administrators, and staff
engaged in information literacy assessment. Portal: Libraries and the Academy, 11(3), 831-
852.
Stevens, D. D., & Levi, A. (2005). Introduction to rubrics: An assessment tool to save grading time,
convey effective feedback, and promote student learning. Sterling, VA: Stylus Publishing.
MLA/DLA: May 2014
Rev Up Information Literacy Assessment:
Use Rubrics to Close That Loop!
Claire Holmes Carissa Tomlinson
cholmes@towson.edu ctomlinson@towson.edu
SlideShare URL:
Thank you!
1 of 34

Recommended

WILU Assessment Rubrics Workshop by
WILU Assessment Rubrics WorkshopWILU Assessment Rubrics Workshop
WILU Assessment Rubrics WorkshopClaire Holmes
648 views43 slides
MILEXAssessmentRubrics2014 by
MILEXAssessmentRubrics2014MILEXAssessmentRubrics2014
MILEXAssessmentRubrics2014Claire Holmes
676 views33 slides
Creating Meaningful Rubrics by
Creating Meaningful Rubrics Creating Meaningful Rubrics
Creating Meaningful Rubrics OPEN NH / NH e-Learning for Educators
1.8K views21 slides
Kenan Dikilitas "Developing ways of analyzing data" by
Kenan Dikilitas "Developing  ways of analyzing data"Kenan Dikilitas "Developing  ways of analyzing data"
Kenan Dikilitas "Developing ways of analyzing data"ClassResearchEVO
1.2K views33 slides
TESTA Masterclass by
TESTA MasterclassTESTA Masterclass
TESTA MasterclassTansy Jessop
115 views42 slides
Assessing your one shot instruction session - Kathy Stroud by
Assessing your one shot instruction session - Kathy StroudAssessing your one shot instruction session - Kathy Stroud
Assessing your one shot instruction session - Kathy StroudUniversity of Oregon
157 views12 slides

More Related Content

What's hot

Laos Session 5: Analysing Test Items using Classical Test Theory (CTT) by
Laos Session 5: Analysing Test Items using Classical Test Theory (CTT)Laos Session 5: Analysing Test Items using Classical Test Theory (CTT)
Laos Session 5: Analysing Test Items using Classical Test Theory (CTT)NEQMAP
972 views33 slides
Evo research topics to r qs (judith hanks), january 2016 (1) by
Evo research topics to r qs (judith hanks), january 2016 (1)Evo research topics to r qs (judith hanks), january 2016 (1)
Evo research topics to r qs (judith hanks), january 2016 (1)ClassResearchEVO
712 views32 slides
Assessment = Improved Teaching and Learning: Using Rubrics to Measure Inform... by
Assessment = Improved Teaching and Learning:  Using Rubrics to Measure Inform...Assessment = Improved Teaching and Learning:  Using Rubrics to Measure Inform...
Assessment = Improved Teaching and Learning: Using Rubrics to Measure Inform...Kathryn Crowe
592 views29 slides
Classroom research presentation by
Classroom research presentationClassroom research presentation
Classroom research presentationMBSHOLEH
743 views23 slides
Laos Session 2: Introduction to Modern Assessment Theory (continued) (EN) by
Laos Session 2:  Introduction to Modern Assessment Theory (continued) (EN)Laos Session 2:  Introduction to Modern Assessment Theory (continued) (EN)
Laos Session 2: Introduction to Modern Assessment Theory (continued) (EN)NEQMAP
308 views20 slides
Information Literacy Assessment: From the Classroom to the Curriculum by
Information Literacy Assessment: From the Classroom to the CurriculumInformation Literacy Assessment: From the Classroom to the Curriculum
Information Literacy Assessment: From the Classroom to the CurriculumSara Miller
301 views24 slides

What's hot(18)

Laos Session 5: Analysing Test Items using Classical Test Theory (CTT) by NEQMAP
Laos Session 5: Analysing Test Items using Classical Test Theory (CTT)Laos Session 5: Analysing Test Items using Classical Test Theory (CTT)
Laos Session 5: Analysing Test Items using Classical Test Theory (CTT)
NEQMAP972 views
Evo research topics to r qs (judith hanks), january 2016 (1) by ClassResearchEVO
Evo research topics to r qs (judith hanks), january 2016 (1)Evo research topics to r qs (judith hanks), january 2016 (1)
Evo research topics to r qs (judith hanks), january 2016 (1)
ClassResearchEVO712 views
Assessment = Improved Teaching and Learning: Using Rubrics to Measure Inform... by Kathryn Crowe
Assessment = Improved Teaching and Learning:  Using Rubrics to Measure Inform...Assessment = Improved Teaching and Learning:  Using Rubrics to Measure Inform...
Assessment = Improved Teaching and Learning: Using Rubrics to Measure Inform...
Kathryn Crowe592 views
Classroom research presentation by MBSHOLEH
Classroom research presentationClassroom research presentation
Classroom research presentation
MBSHOLEH743 views
Laos Session 2: Introduction to Modern Assessment Theory (continued) (EN) by NEQMAP
Laos Session 2:  Introduction to Modern Assessment Theory (continued) (EN)Laos Session 2:  Introduction to Modern Assessment Theory (continued) (EN)
Laos Session 2: Introduction to Modern Assessment Theory (continued) (EN)
NEQMAP308 views
Information Literacy Assessment: From the Classroom to the Curriculum by Sara Miller
Information Literacy Assessment: From the Classroom to the CurriculumInformation Literacy Assessment: From the Classroom to the Curriculum
Information Literacy Assessment: From the Classroom to the Curriculum
Sara Miller301 views
Application of assessment and evaluation data to improve a dynamic graduate m... by Pat Barlow
Application of assessment and evaluation data to improve a dynamic graduate m...Application of assessment and evaluation data to improve a dynamic graduate m...
Application of assessment and evaluation data to improve a dynamic graduate m...
Pat Barlow725 views
Final 23.3.12 cs3 mod 3 review of analysis and learning 3760 by Paula Nottingham
  Final 23.3.12 cs3  mod 3 review of analysis and learning 3760  Final 23.3.12 cs3  mod 3 review of analysis and learning 3760
Final 23.3.12 cs3 mod 3 review of analysis and learning 3760
Paula Nottingham476 views
Planning for learning in he by NewportCELT
Planning for learning in hePlanning for learning in he
Planning for learning in he
NewportCELT395 views
Formative Assessment Strategies for Library Instruction by Melissa Mallon
Formative Assessment Strategies for Library InstructionFormative Assessment Strategies for Library Instruction
Formative Assessment Strategies for Library Instruction
Melissa Mallon4.2K views
Reading and writing at m level for scitt by Philwood
Reading and writing at m level for scittReading and writing at m level for scitt
Reading and writing at m level for scitt
Philwood554 views
Information Literacy by BSU
Information  LiteracyInformation  Literacy
Information Literacy
BSU828 views
Webinar 1 performance assessment grades 6.12 by msnkeb19
Webinar 1 performance assessment grades 6.12Webinar 1 performance assessment grades 6.12
Webinar 1 performance assessment grades 6.12
msnkeb19366 views
Laos Session 7: Developing Quality Assessment Items - Rubrics by NEQMAP
Laos Session 7: Developing Quality Assessment Items - RubricsLaos Session 7: Developing Quality Assessment Items - Rubrics
Laos Session 7: Developing Quality Assessment Items - Rubrics
NEQMAP1.1K views
Writing Learning Objectives by Carla Piper
Writing Learning ObjectivesWriting Learning Objectives
Writing Learning Objectives
Carla Piper601 views
Data and assessment powerpoint presentation 2015 by Erica Zigelman
Data and assessment powerpoint presentation 2015Data and assessment powerpoint presentation 2015
Data and assessment powerpoint presentation 2015
Erica Zigelman836 views

Similar to MLA/DLAInfoLitAssessmentRubrics2014

Communicating value through student learning assessment - Andrea Falcone & Ly... by
Communicating value through student learning assessment - Andrea Falcone & Ly...Communicating value through student learning assessment - Andrea Falcone & Ly...
Communicating value through student learning assessment - Andrea Falcone & Ly...IL Group (CILIP Information Literacy Group)
1.1K views27 slides
Backwards Design & Student Learning Outcomes in Library Instruction by
Backwards Design & Student Learning Outcomes in Library InstructionBackwards Design & Student Learning Outcomes in Library Instruction
Backwards Design & Student Learning Outcomes in Library InstructionKristy Padron
4K views23 slides
Assessing for Improvement: learning outcomes assessment for library instruction by
Assessing for Improvement: learning outcomes assessment for library instructionAssessing for Improvement: learning outcomes assessment for library instruction
Assessing for Improvement: learning outcomes assessment for library instructionDiane Harvey
2.3K views47 slides
AACUpresentationfeb24draft by
AACUpresentationfeb24draftAACUpresentationfeb24draft
AACUpresentationfeb24draftNing Zou
205 views38 slides
Creating Rubrics by
Creating RubricsCreating Rubrics
Creating Rubricschedisky
14.5K views31 slides
LOTW by
LOTWLOTW
LOTWbrandywhitlock
482 views56 slides

Similar to MLA/DLAInfoLitAssessmentRubrics2014(20)

Backwards Design & Student Learning Outcomes in Library Instruction by Kristy Padron
Backwards Design & Student Learning Outcomes in Library InstructionBackwards Design & Student Learning Outcomes in Library Instruction
Backwards Design & Student Learning Outcomes in Library Instruction
Kristy Padron4K views
Assessing for Improvement: learning outcomes assessment for library instruction by Diane Harvey
Assessing for Improvement: learning outcomes assessment for library instructionAssessing for Improvement: learning outcomes assessment for library instruction
Assessing for Improvement: learning outcomes assessment for library instruction
Diane Harvey2.3K views
AACUpresentationfeb24draft by Ning Zou
AACUpresentationfeb24draftAACUpresentationfeb24draft
AACUpresentationfeb24draft
Ning Zou205 views
Creating Rubrics by chedisky
Creating RubricsCreating Rubrics
Creating Rubrics
chedisky14.5K views
SLO Template steps 1, 2, 3 by emilycaryn
SLO Template steps 1, 2, 3SLO Template steps 1, 2, 3
SLO Template steps 1, 2, 3
emilycaryn3.3K views
Rubrics: Transparent Assessment in Support of Learning by Kenneth Ronkowitz
Rubrics: Transparent Assessment in Support of LearningRubrics: Transparent Assessment in Support of Learning
Rubrics: Transparent Assessment in Support of Learning
Kenneth Ronkowitz8.1K views
Logic Models for 21st Century Learning by EdTechTeacher.org
Logic Models for 21st Century LearningLogic Models for 21st Century Learning
Logic Models for 21st Century Learning
EdTechTeacher.org1.9K views
CHAPTER 6 curriculum Evaluation 1.2.pptx by AngelouRivera
CHAPTER 6 curriculum Evaluation 1.2.pptxCHAPTER 6 curriculum Evaluation 1.2.pptx
CHAPTER 6 curriculum Evaluation 1.2.pptx
AngelouRivera803 views
Performance based assessment by Jen_castle
Performance based assessmentPerformance based assessment
Performance based assessment
Jen_castle542 views
Performance based assessment by Jen_castle
Performance based assessmentPerformance based assessment
Performance based assessment
Jen_castle1.3K views
TESTA, Kingston University Keynote by TESTA winch
TESTA, Kingston University KeynoteTESTA, Kingston University Keynote
TESTA, Kingston University Keynote
TESTA winch378 views
Jace Hargis Designing Online Teaching by Jace Hargis
Jace Hargis Designing Online TeachingJace Hargis Designing Online Teaching
Jace Hargis Designing Online Teaching
Jace Hargis394 views

More from Claire Holmes

ACRL 2015 Panel Presentation. More than just recycling: transforming informat... by
ACRL 2015 Panel Presentation. More than just recycling: transforming informat...ACRL 2015 Panel Presentation. More than just recycling: transforming informat...
ACRL 2015 Panel Presentation. More than just recycling: transforming informat...Claire Holmes
1.1K views28 slides
Helping students connect the dots: Tools to engage students in their own le... by
Helping students connect the dots: Tools to engage students in their own le...Helping students connect the dots: Tools to engage students in their own le...
Helping students connect the dots: Tools to engage students in their own le...Claire Holmes
340 views10 slides
Udl info litinstructionacrl_milexfa14 by
Udl info litinstructionacrl_milexfa14Udl info litinstructionacrl_milexfa14
Udl info litinstructionacrl_milexfa14Claire Holmes
641 views11 slides
UDL@Cook Library: Implementing UDL Practices in Information Literacy Instruction by
UDL@Cook Library: Implementing UDL Practices in Information Literacy InstructionUDL@Cook Library: Implementing UDL Practices in Information Literacy Instruction
UDL@Cook Library: Implementing UDL Practices in Information Literacy InstructionClaire Holmes
770 views17 slides
Web search2011 by
Web search2011Web search2011
Web search2011Claire Holmes
256 views17 slides
MILEXPollEverywherePresentation by
MILEXPollEverywherePresentationMILEXPollEverywherePresentation
MILEXPollEverywherePresentationClaire Holmes
478 views14 slides

More from Claire Holmes(6)

ACRL 2015 Panel Presentation. More than just recycling: transforming informat... by Claire Holmes
ACRL 2015 Panel Presentation. More than just recycling: transforming informat...ACRL 2015 Panel Presentation. More than just recycling: transforming informat...
ACRL 2015 Panel Presentation. More than just recycling: transforming informat...
Claire Holmes1.1K views
Helping students connect the dots: Tools to engage students in their own le... by Claire Holmes
Helping students connect the dots: Tools to engage students in their own le...Helping students connect the dots: Tools to engage students in their own le...
Helping students connect the dots: Tools to engage students in their own le...
Claire Holmes340 views
Udl info litinstructionacrl_milexfa14 by Claire Holmes
Udl info litinstructionacrl_milexfa14Udl info litinstructionacrl_milexfa14
Udl info litinstructionacrl_milexfa14
Claire Holmes641 views
UDL@Cook Library: Implementing UDL Practices in Information Literacy Instruction by Claire Holmes
UDL@Cook Library: Implementing UDL Practices in Information Literacy InstructionUDL@Cook Library: Implementing UDL Practices in Information Literacy Instruction
UDL@Cook Library: Implementing UDL Practices in Information Literacy Instruction
Claire Holmes770 views
MILEXPollEverywherePresentation by Claire Holmes
MILEXPollEverywherePresentationMILEXPollEverywherePresentation
MILEXPollEverywherePresentation
Claire Holmes478 views

Recently uploaded

Berry country.pdf by
Berry country.pdfBerry country.pdf
Berry country.pdfMariaKenney3
80 views12 slides
The Future of Micro-credentials: Is Small Really Beautiful? by
The Future of Micro-credentials:  Is Small Really Beautiful?The Future of Micro-credentials:  Is Small Really Beautiful?
The Future of Micro-credentials: Is Small Really Beautiful?Mark Brown
102 views35 slides
11.30.23A Poverty and Inequality in America.pptx by
11.30.23A Poverty and Inequality in America.pptx11.30.23A Poverty and Inequality in America.pptx
11.30.23A Poverty and Inequality in America.pptxmary850239
181 views18 slides
BUSINESS ETHICS MODULE 1 UNIT I_B.pdf by
BUSINESS ETHICS MODULE 1 UNIT I_B.pdfBUSINESS ETHICS MODULE 1 UNIT I_B.pdf
BUSINESS ETHICS MODULE 1 UNIT I_B.pdfDr Vijay Vishwakarma
52 views21 slides
PRELIMS ANSWER.pptx by
PRELIMS ANSWER.pptxPRELIMS ANSWER.pptx
PRELIMS ANSWER.pptxsouravkrpodder
50 views60 slides
Interaction of microorganisms with vascular plants.pptx by
Interaction of microorganisms with vascular plants.pptxInteraction of microorganisms with vascular plants.pptx
Interaction of microorganisms with vascular plants.pptxMicrobiologyMicro
58 views33 slides

Recently uploaded(20)

The Future of Micro-credentials: Is Small Really Beautiful? by Mark Brown
The Future of Micro-credentials:  Is Small Really Beautiful?The Future of Micro-credentials:  Is Small Really Beautiful?
The Future of Micro-credentials: Is Small Really Beautiful?
Mark Brown102 views
11.30.23A Poverty and Inequality in America.pptx by mary850239

MLA/DLAInfoLitAssessmentRubrics2014

  • 1. WELCOME TO MLA/DLA 2014! Turn it UP! Expand your networking opportunities with MLA’s LinkedIn page Rev Up Information Literacy Assessment: Use Rubrics to Close That Loop! Claire Holmes Carissa Tomlinson cholmes@towson.edu ctomlinson@towson.edu tweets @TUEdLibrarian facebook.com/MDLib @MDLibraryAssoc #MLADLA14 Get the conference schedule on your device -- go to m.lanyrd.com and search MLADLA. #MLADLA14
  • 2. Agenda for Today: • Background on Assessment, RAILS & Rubrics • Norming & Rating • Rubric Evaluation • Reflections & Questions
  • 3. Assessment… • Knowing what you are doing • Knowing why you are doing it • Knowing what students are learning as a result • Changing because of the information (~Debra Gilchrist, Dean of Libraries and Institutional Effectiveness, Pierce College)
  • 4. Identify learning outcomes Create and enact learning activities Gather data to check for learning Interpret data Enact decisions to increase learning Information Literacy Instruction Assessment Cycle (ILIAC) Oakleaf, Megan. "The Information Literacy Instruction Assessment Cycle: A Guide for Increasing Student Learning and Improving Librarian Instructional Skills." Journal of Documentation. 65.4. 2009.
  • 5. The Institute of Museum and Library Services is the primary source of federal support for the nation’s 123,000 libraries and 17,500 museums. The Institute's mission is to create strong libraries and museums that connect people to information and ideas. Megan Oakleaf, founder of all things RAILS.
  • 6. RAILS Project Purpose • Investigated an analytic rubric approach to IL assessment in higher education • Developed a suite of IL rubrics • Investigated rubric reliability & validity • Developed training materials for training/norming/scoring • Explored indicators of rater expertise
  • 7. Cook’s RAILS Purpose • Gain rubric experience: creating/norming/rating • Identify assessment opportunities within TU’s Core Curriculum • Develop a rubric for use on campus • Assess students’ information literacy skills • Examine instructional practices
  • 9. Our assessment adventure… • Begin cycle of tracking student learning. • Begin cycle of tracking instruction practices. • Begin cycle of collecting aggregated & anonymous data. • Reinforce regular opportunities for reflection & discussion among library instruction colleagues. (facilitate development of a Community of Reflective Practice) (Image: AP Images)
  • 10. Understanding by Design 1. What do you want students to learn? (outcome) 2. How will you know that they have learned it? (assessment) 3. What activities will help them learn, and at the same time, provide assessment data? (teaching method & assessment) (Wiggins & McTighe, 2006)
  • 11. 5 Questions for Assessment Design: 1. Outcome What do you want the student to be able to do? 2. IL Curriculum What does the student need to know in order to do this well? 3. Pedagogy What type of instruction will best enable the learning? 4. Assessment How will the student demonstrate the learning? 5. Criteria for evaluation How will you know the student has done well? (Lisa Hinchcliffe, Student Learning Assessment Cycle. ACRL Assessment Immersion, 2011)
  • 12. Evidence of “authentic” student learning: For instance, the research worksheet in your packet that asks students to break down and practice sequential steps in the search process. Brainstorm… What other possible examples of evidence of student learning do we collect? What could we collect?
  • 14. Evidence: Possible examples of authentic student learning… • Research journal • Reflective writing • “think aloud” • Self or peer evaluation • Works cited page • Annotated bibliography • Posters • Multimedia presentations • Speeches • Open-ended question responses • Group projects • Performances • Portfolios • Library assignments • Worksheets • Concept maps • Citation maps • Tutorial responses • Blogs • Wikis • Lab reports
  • 15. • 2 dimensions 1. criteria 2. levels of performance • grid or table format • judges quality • translates unwieldy data into accessible information (Image: thefirstgradediaries.blogspot.com)
  • 16. SAMPLE RAILS RUBRIC (green handout in your packet)
    1. Determines Key Concepts
       Level 3: Determines multiple key concepts that reflect the research topic/thesis statement accurately.
       Level 2: Determines some concepts that reflect the research topic/thesis statement, but concept breakdown is incomplete or repetitive.
       Level 1: Determines concepts that reflect the research topic/thesis statement inaccurately.
       Level 0: Does not determine any concepts that reflect the research question/thesis statement.
    2. Identifies Synonyms and Related Terms
       Level 3: Identifies relevant synonyms and/or related terms that match key concepts.
       Level 2: Attempts synonym (or related term) use, but the synonym list is incomplete or not fully relevant to key concepts.
       Level 1: Identifies synonyms that inaccurately reflect the key concepts.
       Level 0: Does not identify synonyms.
    3. Constructs a Search Strategy Using Relevant Operators
       Level 3: Constructs a search strategy using an appropriate combination of relevant operators (for example: and, or, not) correctly.
       Level 2: Constructs a search strategy using operator(s), but uses operators in an incomplete or limited way.
       Level 1: Constructs a search strategy using operators incorrectly.
       Level 0: Does not use operators.
    4. Uses Evaluative Criteria to Select Source(s)
       Level 3: Uses evaluative criteria to provide an in-depth explanation of the rationale for the source selected.
       Level 2: Uses evaluative criteria to provide a limited/superficial explanation of the rationale for the source selected.
       Level 1: Attempts to use evaluative criteria, but does so inaccurately or incorrectly.
       Level 0: Does not use evaluative criteria.
    5. Uses Citations
       Level 3: Uses an appropriate standard citation style consistently and correctly.
       Level 2: Uses an appropriate standard citation style consistently (bibliographic elements intact), but with minimal format and/or punctuation errors.
       Level 1: Attempts an appropriate standard citation style, but does not include all bibliographic elements consistently or correctly.
       Level 0: Does not include common citation elements or does not include citations.
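The rubric's grid of criteria and performance levels maps directly onto a simple data structure. A minimal sketch in Python (not part of any RAILS tooling; the criterion names come from the sample rubric above, and the scoring helper is purely illustrative) showing how one rater's scores for a single student artifact might be recorded and totaled:

```python
# Minimal sketch: an analytic rubric as a data structure.
# Criterion names follow the sample RAILS rubric; the scoring
# helper is illustrative, not part of RAILS itself.

CRITERIA = [
    "Determines key concepts",
    "Identifies synonyms and related terms",
    "Constructs a search strategy using relevant operators",
    "Uses evaluative criteria to select source(s)",
    "Uses citations",
]
LEVELS = (0, 1, 2, 3)  # performance levels, 0 = lowest

def score_artifact(ratings):
    """Validate one rater's scores (criterion -> level) and total them."""
    if set(ratings) != set(CRITERIA):
        raise ValueError("every criterion must be scored exactly once")
    for criterion, level in ratings.items():
        if level not in LEVELS:
            raise ValueError(f"level for {criterion!r} must be one of {LEVELS}")
    return sum(ratings.values())

# A hypothetical rater scores every criterion at Level 2:
ratings = {c: 2 for c in CRITERIA}
print(score_artifact(ratings))  # 10
```

Keeping criteria and levels as explicit data (rather than hard-coding them in a grading script) mirrors the rubric's two-dimensional design and makes later aggregation across classes or semesters straightforward.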
  • 17. Criteria 1. “the conditions a [student] must meet to be successful” (Wiggins) 2. “the set of indicators, markers, guides, or a list of measures or qualities that will help [a scorer] know when a [student] has met an outcome” (Bresciani, Zelna & Anderson) 3. what to look for in [student] performance “to determine progress…or determine when mastery has occurred” (Arter) 4. Names vary: Objectives, Components, Learning Outcomes, etc. 1. Determines Key Concepts 2. Identifies synonyms and related terms 3. Constructs a search strategy using relevant operators 4. Uses evaluative criteria to select source(s) 5. Uses Citations
  • 18. Performance Levels mastery, progressing, emerging, satisfactory, marginal, proficient, high, middle, beginning, advanced, novice, intermediate, sophisticated, competent, professional, exemplary, needs work, adequate, developing, accomplished, distinguished (or numerical…) Performance Level 3 Performance Level 2 Performance Level 1 Performance Level 0
  • 19. SAMPLE RAILS RUBRIC (green handout in your packet; the rubric from slide 16 is shown again for the norming exercise)
  • 20. Workshop Norming Practice Round 1 • For first student work sample, Claire will “norm aloud.” • Participants will rate 2 work samples individually. • Group discussion: Can we reach consensus for what constitutes evidence for each performance level?
  • 21. Norming: Round 2 • Participants will rate 2 more work samples individually. • Group discussion: Are we closer to consensus? • Do we establish rating ground rules? • Does the rubric need to be modified?
  • 22. Keep in mind… • An info lit skills rubric does not score discipline content; it scores information literacy skills. • You can only score what you can see.
  • 23. Norming/Rating Discussion • How do we achieve consensus? • What was challenging?
  • 24. Rubrics – Benefits Learning • Articulate and communicate agreed upon learning goals • Provide direct feedback to learners • Facilitate self-evaluation • Focus on learning standards
  • 25. More benefits of a (normed) rubric… Data • Facilitate consistent, accurate, unbiased scoring • Deliver data that is easy to understand, defend, and convey • Offer detailed descriptions necessary for informed decision-making • Can be used over time or across multiple programs Other • Are inexpensive ($) to design & implement
  • 26. Rubrics – Limitations • Possible design flaws that impact data quality • Require significant time for development • Sometimes fail to balance between holistic and analytic focus • May fail to balance between generalized wording and detailed description • Can lack differentiation between performance levels
  • 27. Rubric Evaluation Activity • In groups of 2-3, spend 20 minutes answering the following questions about the rubrics in your packet: – What is your overall impression of the rubric? – Are there benefits to any of the various ways to describe the performance levels? – Look at one criterion and the performance levels for that criterion. Do you think it would be hard to norm this criterion? Why or why not?
  • 28. 1. What are our expectations of students completing this assignment? 2. What does a successful learning of this type look like? 3. What specific learning outcomes do we want to see in the completed assignment? 4. What evidence can we find that will demonstrate learning success? Things to think about when creating a rubric
  • 29. RAILS Lessons • Explicit, detailed performance descriptions are crucial to achieve inter-rater reliability. • Raters appear to be more confident about their ratings when student artifacts under analysis are concrete, focused, and shorter in length. • The best raters “believe in” outcomes, value constructed consensus (or “disagree and commit”), negotiate meaning across disciplines, develop shared vocabulary, etc.
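The inter-rater reliability mentioned above can be quantified once two raters have scored the same set of artifacts after norming. A hedged sketch in Python (the rater data is invented for illustration) computing simple percent agreement and Cohen's kappa, a common chance-corrected agreement statistic:

```python
from collections import Counter

def percent_agreement(r1, r2):
    """Fraction of artifacts on which two raters assigned the same level."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Cohen's kappa: observed agreement corrected for chance agreement."""
    n = len(r1)
    p_o = percent_agreement(r1, r2)
    c1, c2 = Counter(r1), Counter(r2)
    # Expected chance agreement from each rater's marginal level frequencies.
    p_e = sum(c1[k] * c2[k] for k in set(c1) | set(c2)) / (n * n)
    return (p_o - p_e) / (1 - p_e)

# Two hypothetical raters scoring 8 artifacts on one criterion (levels 0-3):
rater_a = [3, 2, 2, 1, 0, 3, 2, 1]
rater_b = [3, 2, 1, 1, 0, 3, 2, 2]
print(percent_agreement(rater_a, rater_b))       # 0.75
print(round(cohens_kappa(rater_a, rater_b), 3))  # 0.652
```

Tracking a statistic like this before and after a norming session gives concrete evidence that the "explicit, detailed performance descriptions" are working; percent agreement alone can look inflated when raters cluster on one level, which is why kappa's chance correction is often reported alongside it.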
  • 30. Identify learning outcomes Create and enact learning activities Gather data to check for learning Interpret data Enact decisions to increase learning Information Literacy Instruction Assessment Cycle (ILIAC) Oakleaf, Megan. "The Information Literacy Instruction Assessment Cycle: A Guide for Increasing Student Learning and Improving Librarian Instructional Skills." Journal of Documentation. 65.4. 2009.
  • 31. • Internally – Instruction improvement – Assessment improvement • Professionally – Conferences – Publications Using Assessment Results…
  • 32. References Arter, J. (2000). Rubrics, scoring guides, and performance criteria: Classroom tools for assessing and improving student learning. Retrieved from http://eric.ed.gov/?id=ED446100 Bresciani, M., Zelna, C., & Anderson, J. (2004). Assessing student learning and development: A handbook for practitioners. Washington, DC: NASPA-Student Affairs Administrators in Higher Education. Wiggins, G. P., & McTighe, J. (2006). Understanding by design. Upper Saddle River, NJ: Pearson Education. Wiggins, G. P. (1998). Educative assessment: Designing assessments to inform and improve student performance. San Francisco, CA: Jossey-Bass.
  • 33. Selected Readings: Diller, K. R., & Phelps, S. F. (2008). Learning outcomes, portfolios, and rubrics, oh my! Authentic assessment of an information literacy program. Portal: Libraries and the Academy, 8(1), 75-89. Fagerheim, B. A., & Shrode, F. G. (2009). Information literacy rubrics within the disciplines. Communications in Information Literacy, 3(2), 158-170. Holmes, C., & Oakleaf, M. (2013). The official (and unofficial) rules for norming rubrics successfully. Journal of Academic Librarianship, 39(6), 599-602. Knight, L. A. (2006). Using rubrics to assess information literacy. Reference Services Review, 34(1), 43-55. Oakleaf, M. (2007). Using rubrics to collect evidence for decision-making: What do librarians need to learn? Evidence Based Library and Information Practice, 2(3), 27-42. Oakleaf, M. (2009). The information literacy instruction assessment cycle: A guide for increasing student learning and improving librarian instructional skills. Journal of Documentation, 65(4), 539-560. Oakleaf, M., Millet, M., & Kraus, L. (2011). All together now: Getting faculty, administrators, and staff engaged in information literacy assessment. Portal: Libraries and the Academy, 11(3), 831-852. Stevens, D. D., & Levi, A. (2005). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback, and promote student learning. Sterling, VA: Stylus Publishing.
  • 34. MLA/DLA: May 2014 Rev Up Information Literacy Assessment: Use Rubrics to Close That Loop! Claire Holmes Carissa Tomlinson cholmes@towson.edu ctomlinson@towson.edu SlideShare URL: Thank you!