MLA/DLA: May 2014
Rev Up Information Literacy Assessment: Use Rubrics to Close That Loop!
Claire Holmes and Carissa Tomlinson

  1. WELCOME TO MLA/DLA 2014! Turn it UP!
     Expand your networking opportunities with MLA’s LinkedIn page.
     Rev Up Information Literacy Assessment: Use Rubrics to Close That Loop!
     Claire Holmes (cholmes@towson.edu) and Carissa Tomlinson (ctomlinson@towson.edu)
     Tweets: @TUEdLibrarian | facebook.com/MDLib | @MDLibraryAssoc | #MLADLA14
     Get the conference schedule on your device -- go to m.lanyrd.com and search MLADLA #MLADLA14
  2. Agenda for Today:
     • Background on Assessment, RAILS & Rubrics
     • Norming & Rating
     • Rubric Evaluation
     • Reflections & Questions
  3. Assessment…
     • Knowing what you are doing
     • Knowing why you are doing it
     • Knowing what students are learning as a result
     • Changing because of the information
     (~Debra Gilchrist, Dean of Libraries and Institutional Effectiveness, Pierce College)
  4. Information Literacy Instruction Assessment Cycle (ILIAC):
     Identify learning outcomes → Create and enact learning activities → Gather data to check for learning → Interpret data → Enact decisions to increase learning
     Oakleaf, M. (2009). The information literacy instruction assessment cycle: A guide for increasing student learning and improving librarian instructional skills. Journal of Documentation, 65(4), 539-560.
  5. The Institute of Museum and Library Services is the primary source of federal support for the nation’s 123,000 libraries and 17,500 museums. The Institute's mission is to create strong libraries and museums that connect people to information and ideas.
     Megan Oakleaf, founder of all things RAILS.
  6. RAILS Project Purpose
     • Investigated an analytic rubric approach to IL assessment in higher education
     • Developed a suite of IL rubrics
     • Investigated rubric reliability & validity
     • Developed training materials for training/norming/scoring
     • Explored indicators of rater expertise
  7. Cook’s RAILS Purpose
     • Gain rubric experience: creating/norming/rating
     • Identify assessment opportunities within TU’s Core Curriculum
     • Develop a rubric for use on campus
     • Assess students’ information literacy skills
     • Examine instructional practices
  8. Cook Library’s Priorities:
  9. Our assessment adventure…
     • Begin cycle of tracking student learning.
     • Begin cycle of tracking instruction practices.
     • Begin cycle of collecting aggregated & anonymous data.
     • Reinforce regular opportunities for reflection & discussion among library instruction colleagues (facilitate development of a Community of Reflective Practice).
     (Image: AP Images)
  10. Understanding by Design
      1. What do you want students to learn? (outcome)
      2. How will you know that they have learned it? (assessment)
      3. What activities will help them learn, and at the same time, provide assessment data? (teaching method & assessment)
      (Wiggins & McTighe, 2006)
  11. 5 Questions for Assessment Design:
      1. Outcome: What do you want the student to be able to do?
      2. IL Curriculum: What does the student need to know in order to do this well?
      3. Pedagogy: What type of instruction will best enable the learning?
      4. Assessment: How will the student demonstrate the learning?
      5. Criteria for evaluation: How will you know the student has done well?
      (Lisa Hinchcliffe, Student Learning Assessment Cycle. ACRL Assessment Immersion, 2011)
  12. Evidence of “authentic” student learning:
      For instance, the research worksheet in your packet that asks students to break down and practice sequential steps in the search process.
      Brainstorm… What other possible examples of evidence of student learning do we collect? What could we collect?
  13. Brainstorm ideas…
  14. Evidence: Possible examples of authentic student learning…
      • Research journal
      • Reflective writing
      • “Think aloud”
      • Self or peer evaluation
      • Works cited page
      • Annotated bibliography
      • Posters
      • Multimedia presentations
      • Speeches
      • Open-ended question responses
      • Group projects
      • Performances
      • Portfolios
      • Library assignments
      • Worksheets
      • Concept maps
      • Citation maps
      • Tutorial responses
      • Blogs
      • Wikis
      • Lab reports
  15. • 2 dimensions: (1) criteria, (2) levels of performance
      • Grid or table format
      • Judges quality
      • Translates unwieldy data into accessible information
      (Image: thefirstgradediaries.blogspot.com)
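
To make the grid idea concrete, here is a minimal sketch (Python; not part of the original slides) that treats an analytic rubric as criteria crossed with performance levels and tallies one rater's scores for a single student artifact. The criterion names follow the sample RAILS rubric on the next slide; the example ratings are hypothetical.

```python
# Minimal sketch: an analytic rubric as criteria x performance levels.
# Criterion names follow the sample RAILS rubric; the example ratings are hypothetical.

CRITERIA = [
    "Determines key concepts",
    "Identifies synonyms and related terms",
    "Constructs a search strategy using relevant operators",
    "Uses evaluative criteria to select source(s)",
    "Uses citations",
]
PERFORMANCE_LEVELS = (0, 1, 2, 3)  # 0 = lowest, 3 = highest

def score_artifact(ratings: dict) -> dict:
    """Validate one rater's ratings for a single artifact and summarize them."""
    for criterion in CRITERIA:
        if ratings.get(criterion) not in PERFORMANCE_LEVELS:
            raise ValueError(f"Missing or invalid level for: {criterion}")
    total = sum(ratings[c] for c in CRITERIA)
    return {
        "by_criterion": ratings,
        "total": total,
        "max_possible": max(PERFORMANCE_LEVELS) * len(CRITERIA),
    }

# Hypothetical ratings for one student worksheet.
example = {
    "Determines key concepts": 3,
    "Identifies synonyms and related terms": 2,
    "Constructs a search strategy using relevant operators": 2,
    "Uses evaluative criteria to select source(s)": 1,
    "Uses citations": 3,
}
print(score_artifact(example))  # total: 11 of a possible 15
```

This is the payoff of the last bullet above: each worksheet or bibliography becomes a small, comparable record (five criterion scores plus a total) rather than an unwieldy stack of artifacts, which makes aggregating results across a class or program straightforward.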
  16. SAMPLE RAILS RUBRIC (green handout in your packet)
      Performance levels run from 3 (highest) to 0 (lowest); each cell describes what the student does.
      1. Determines Key Concepts
         • Level 3: Determines multiple key concepts that reflect the research topic/thesis statement accurately.
         • Level 2: Determines some concepts that reflect the research topic/thesis statement, but concept breakdown is incomplete or repetitive.
         • Level 1: Determines concepts that reflect the research topic/thesis statement inaccurately.
         • Level 0: Does not determine any concepts that reflect the research question/thesis statement.
      2. Identifies synonyms and related terms
         • Level 3: Identifies relevant synonyms and/or related terms that match key concepts.
         • Level 2: Attempts synonym (or related term) use, but synonym list is incomplete or not fully relevant to key concepts.
         • Level 1: Identifies synonyms that inaccurately reflect the key concepts.
         • Level 0: Does not identify synonyms.
      3. Constructs a search strategy using relevant operators
         • Level 3: Constructs a search strategy using an appropriate combination of relevant operators (for example: and, or, not) correctly.
         • Level 2: Constructs a search strategy using operator(s), but uses operators in an incomplete or limited way.
         • Level 1: Constructs a search strategy using operators incorrectly.
         • Level 0: Does not use operators.
      4. Uses evaluative criteria to select source(s)
         • Level 3: Uses evaluative criteria to provide in-depth explanation of rationale for source selected.
         • Level 2: Uses evaluative criteria to provide a limited/superficial explanation of rationale for source selected.
         • Level 1: Attempts to use evaluative criteria, but does so inaccurately or incorrectly.
         • Level 0: Does not use evaluative criteria.
      5. Uses Citations
         • Level 3: Uses an appropriate standard citation style consistently and correctly.
         • Level 2: Uses an appropriate standard citation style consistently (bibliographic elements intact), but with minimal format and/or punctuation errors.
         • Level 1: Attempts an appropriate standard citation style, but does not include all bibliographic elements consistently or correctly.
         • Level 0: Does not include common citation elements or does not include citations.
  17. Criteria
      1. “the conditions a [student] must meet to be successful” (Wiggins)
      2. “the set of indicators, markers, guides, or a list of measures or qualities that will help [a scorer] know when a [student] has met an outcome” (Bresciani, Zelna & Anderson)
      3. what to look for in [student] performance “to determine progress…or determine when mastery has occurred” (Arter)
      4. Names vary: Objectives, Components, Learning Outcomes, etc.
      In the sample rubric, the criteria are: 1. Determines Key Concepts; 2. Identifies synonyms and related terms; 3. Constructs a search strategy using relevant operators; 4. Uses evaluative criteria to select source(s); 5. Uses Citations.
  18. Performance Levels
      Common labels: mastery, progressing, emerging, satisfactory, marginal, proficient, high, middle, beginning, advanced, novice, intermediate, sophisticated, competent, professional, exemplary, needs work, adequate, developing, accomplished, distinguished (or numerical…)
      The sample rubric uses numerical labels: Performance Level 3, 2, 1, 0.
  19. SAMPLE RAILS RUBRIC (green handout in your packet): the full criteria and performance-level descriptions from slide 16, shown again before the norming practice.
  20. Workshop Norming Practice: Round 1
      • For the first student work sample, Claire will “norm aloud.”
      • Participants will rate 2 work samples individually.
      • Group discussion: Can we reach consensus for what constitutes evidence for each performance level?
  21. Norming: Round 2
      • Participants will rate 2 more work samples individually.
      • Group discussion: Are we closer to consensus? (One way to check is sketched below.)
      • Do we establish rating ground rules?
      • Does the rubric need to be modified?
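
A simple way to see whether the group is “closer to consensus” after a round is to compare raters’ scores criterion by criterion. The sketch below is illustrative only: the ratings are hypothetical and the percent-agreement check is our own simplification, not part of the RAILS training materials (RAILS investigated rubric reliability more formally). Criteria that stay flagged round after round are natural candidates for the “does the rubric need to be modified?” discussion.

```python
from collections import Counter

# Hypothetical ratings from one norming round: for each rubric criterion,
# the performance level (0-3) each of four raters assigned to the same work sample.
round_ratings = {
    "Determines key concepts":      [3, 3, 3, 2],
    "Identifies synonyms":          [2, 2, 2, 2],
    "Constructs a search strategy": [3, 1, 2, 2],
    "Uses evaluative criteria":     [1, 1, 0, 1],
    "Uses citations":               [2, 3, 2, 2],
}

def agreement_report(ratings, threshold=0.75):
    """Print the share of raters who chose the most common level for each criterion."""
    for criterion, levels in ratings.items():
        modal_level, count = Counter(levels).most_common(1)[0]
        agreement = count / len(levels)
        flag = "" if agreement >= threshold else "  <-- still far apart: discuss, or revise the wording"
        print(f"{criterion:30s} modal level {modal_level}, agreement {agreement:.0%}{flag}")

agreement_report(round_ratings)
```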
  22. Keep in mind…
      • An info lit skills rubric does not score discipline content; it scores information literacy skills.
      • You can only score what you can see.
  23. Norming/Rating Discussion
      • How do we achieve consensus?
      • What was challenging?
  24. Rubrics – Benefits
      Learning
      • Articulate and communicate agreed-upon learning goals
      • Provide direct feedback to learners
      • Facilitate self-evaluation
      • Focus on learning standards
  25. More benefits of a (normed) rubric…
      Data
      • Facilitate consistent, accurate, unbiased scoring
      • Deliver data that is easy to understand, defend, and convey
      • Offer detailed descriptions necessary for informed decision-making
      • Can be used over time or across multiple programs
      Other
      • Are inexpensive ($) to design & implement
  26. Rubrics – Limitations
      • Possible design flaws that impact data quality
      • Require significant time for development
      • Sometimes fail to balance between holistic and analytic focus
      • May fail to balance between generalized wording and detailed description
      • Can lack differentiation between performance levels
  27. Rubric Evaluation Activity
      • In groups of 2-3, spend 20 minutes answering the following questions about the rubrics in your packet:
        – What is your overall impression of the rubric?
        – Are there benefits to any of the various ways to describe the performance levels?
        – Look at one criterion and the performance levels for that criterion. Do you think it would be hard to norm this criterion? Why or why not?
  28. Things to think about when creating a rubric
      1. What are our expectations of students completing this assignment?
      2. What does successful learning of this type look like?
      3. What specific learning outcomes do we want to see in the completed assignment?
      4. What evidence can we find that will demonstrate learning success?
  29. RAILS Lessons
      • Explicit, detailed performance descriptions are crucial to achieve inter-rater reliability.
      • Raters appear to be more confident about their ratings when the student artifacts under analysis are concrete, focused, and shorter.
      • The best raters “believe in” outcomes, value constructed consensus (or “disagree and commit”), negotiate meaning across disciplines, develop shared vocabulary, etc.
  30. Information Literacy Instruction Assessment Cycle (ILIAC), revisited:
      Identify learning outcomes → Create and enact learning activities → Gather data to check for learning → Interpret data → Enact decisions to increase learning
      Oakleaf, M. (2009). The information literacy instruction assessment cycle: A guide for increasing student learning and improving librarian instructional skills. Journal of Documentation, 65(4), 539-560.
  31. Using Assessment Results…
      • Internally
        – Instruction improvement
        – Assessment improvement
      • Professionally
        – Conferences
        – Publications
  32. References
      Arter, J. (2000). Rubrics, scoring guides, and performance criteria: Classroom tools for assessing and improving student learning. Retrieved from http://eric.ed.gov/?id=ED446100
      Bresciani, M., Zelna, C., & Anderson, J. (2004). Assessing student learning and development: A handbook for practitioners. Washington, DC: NASPA-Student Affairs Administrators in Higher Education.
      Wiggins, G. P., & McTighe, J. (2006). Understanding by design. Upper Saddle River, NJ: Pearson Education.
      Wiggins, G. P. (1998). Educative assessment: Designing assessments to inform and improve student performance. San Francisco, CA: Jossey-Bass.
  33. Selected Readings
      Diller, K. R., & Phelps, S. F. (2008). Learning outcomes, portfolios, and rubrics, oh my! Authentic assessment of an information literacy program. Portal: Libraries and the Academy, 8(1), 75-89.
      Fagerheim, B. A., & Shrode, F. G. (2009). Information literacy rubrics within the disciplines. Communications in Information Literacy, 3(2), 158-170.
      Holmes, C., & Oakleaf, M. (2013). The official (and unofficial) rules for norming rubrics successfully. Journal of Academic Librarianship, 39(6), 599-602.
      Knight, L. A. (2006). Using rubrics to assess information literacy. Reference Services Review, 34(1), 43-55.
      Oakleaf, M. (2007). Using rubrics to collect evidence for decision-making: What do librarians need to learn? Evidence Based Library and Information Practice, 2(3), 27-42.
      Oakleaf, M. (2009). The information literacy instruction assessment cycle: A guide for increasing student learning and improving librarian instructional skills. Journal of Documentation, 65(4), 539-560.
      Oakleaf, M., Millet, M., & Kraus, L. (2011). All together now: Getting faculty, administrators, and staff engaged in information literacy assessment. Portal: Libraries and the Academy, 11(3), 831-852.
      Stevens, D. D., & Levi, A. (2005). Introduction to rubrics: An assessment tool to save grading time, convey effective feedback, and promote student learning. Sterling, VA: Stylus Publishing.
  34. MLA/DLA: May 2014
      Rev Up Information Literacy Assessment: Use Rubrics to Close That Loop!
      Claire Holmes (cholmes@towson.edu) and Carissa Tomlinson (ctomlinson@towson.edu)
      SlideShare URL:
      Thank you!
