Assessing Information Literacy From the Ground Up
How to start an information literacy program at your academic library

Published in: Education

  • Students enrolled in special education.
  • Number of students at Wartburg College; focus: small, private, liberal arts; Information Literacy Across the Curriculum Program; implementation schedule; testing schedule; buy-in by faculty; successes and challenges.
  • Transcript

    • Assessing Information Literacy from the Ground Up – Creating a Culture of Assessment
      Kristy Motz, Randall Schroeder & Mari Kermit-Canfield
      Ferris State University
    • FLITE Assessment Team
    • What’s Out There?
    • Starting Up – Graduate Level Assessments (CMS)
      • Developed tools two years ago in the Blackboard Vista product
      • Ran assessments through several semesters
      • Summer 2009: refined questions and mapped them to the Information Literacy Competency Standards for Higher Education
    • MMBA Pre- and Post-Instruction Assessments
      • Mixed-delivery (blended) class
      • Two Saturdays on-site; remaining course online
      • Class runs eight weeks (½ semester)
      • Assessments run in class on the first and last day
      • 8 questions pre-instruction (first day of class)
      • 5 questions post-instruction (end of class)
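A pre- and post-instruction design like the one above lends itself to a simple score comparison. Below is a minimal Python sketch of that comparison; the student responses and question IDs are invented for illustration and do not come from the actual MMBA data:

```python
# Hypothetical sketch: comparing class-average scores before and after
# instruction. Each response maps a question ID to 1 (correct) or 0 (incorrect).

def mean_score(responses):
    """Average percent-correct across all students in `responses`."""
    per_student = [sum(answers.values()) / len(answers) for answers in responses]
    return sum(per_student) / len(per_student)

# Invented sample data: three students pre-instruction, three post-instruction.
pre = [
    {"q1": 1, "q2": 0, "q3": 0},
    {"q1": 1, "q2": 1, "q3": 0},
    {"q1": 0, "q2": 0, "q3": 1},
]
post = [
    {"q1": 1, "q2": 1, "q3": 1},
    {"q1": 1, "q2": 1, "q3": 0},
    {"q1": 1, "q2": 0, "q3": 1},
]

improvement = mean_score(post) - mean_score(pre)
print(f"pre mean: {mean_score(pre):.2f}, post mean: {mean_score(post):.2f}, "
      f"gain: {improvement:+.2f}")
```

With real data, the same comparison would be restricted to the questions shared between the pre- and post-instruction forms.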
    • Online Assessment in CMS
    • Pre-Instruction Course Assessment
    • Post-Instruction Assessment MMBA
    • MMBA CMS Issues
      • Survey or ungraded quiz tools in the CMS don’t provide results.
      • Instructor must allow point credit for results to remain in the Gradebook.
      • Students with first-day user ID/password problems can’t reach the assessment.
      • Questions must be tweaked to fit the CMS question software format.
    • Fall 2008 MMBA - Fagerman
    • Summer 2009 MMBA - Hamel
    • ESPN 505 Special Education
      • 19 students took the Pre-Instruction Assessment
      • 15 students took the Post-Instruction Assessment
      • Instructor added a 2-point bonus for participation
      • Instructor used the CMS with an on-campus course
    • ESPN 505 Pre- and Post-Instruction Assessment
    • Next Steps: Real-Time Undergraduate Assessments
    • Vogel Library → FLITE
      • FLITE’s new department head: Feb 2008
      • Assessment Team’s new focus on an Information Literacy instrument
      • Announcement of the new James Madison Information Literacy Testing
    • How To Begin? Vogel Library – Wartburg College
    • Vogel Library
      • Seven years of pre-instruction and post-instruction information literacy testing
      • Based on the Information Literacy Competency Standards for Higher Education
      • Designed question by question to highlight targeted standards
    • Vogel Library
      • Developed by librarians Randall Schroeder, Jill Gremmels, and Karen Lehman, with Dr. Judith Griffith, English Faculty Chair
      • Led to Faculty Buy-In
    • Vogel Adaptations to FLITE
      • FSU Students
      • FSU Focus and Mission
      • Information Literacy a new Concept
      • Implementation and Testing
      • Faculty Buy-in: 3 volunteers worked with us
    • Challenges
      • Conflicts with Scantrons and the Testing Center
      • Need for definitive learning outcomes
      • Testing on concepts not always covered in classes
    • What are we Testing?
      16. A student who is deciding what topic to research for an assigned paper approaches the reference librarian and says, “This summer was too hot. I wonder if it is because of global warming?”
      What might be the thesis statement of this student’s paper?
      1. It was too hot this summer.
      2. Global warming is causing the mean temperature of the earth to rise, which could have disastrous effects on the ecosystems of the planet.
      3. Did global warming cause abnormal heat around the planet?
      4. We should buy hybrid vehicles because of global warming.
      5. This paper will be about the effects of global warming.
    • Successes - Survey Questions
      Rank the activities from the one on which you spend the most time (4) to the least time (1) when you are on the computer.
      • Searching for specific information related to a school or personal task.
      • Communicating with individuals through Facebook/MySpace, IM, or via e-mail.
      • Downloading movies, music, or graphics.
      • Surfing the Internet for enjoyment or playing games.
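A rank-order question like the one above can be tabulated by averaging each activity's rank across respondents. Here is a minimal sketch; the short activity labels paraphrase the four options and the response data is invented, not taken from the actual survey:

```python
# Hypothetical sketch: tabulating a rank-order survey question.
# Each student ranks four activities from 4 (most time) to 1 (least time).

ACTIVITIES = ["searching", "communicating", "downloading", "surfing"]

# Invented responses: each maps an activity to that student's rank.
responses = [
    {"searching": 2, "communicating": 4, "downloading": 1, "surfing": 3},
    {"searching": 3, "communicating": 4, "downloading": 2, "surfing": 1},
    {"searching": 2, "communicating": 3, "downloading": 1, "surfing": 4},
]

# Average rank per activity; higher means more time spent.
average_rank = {
    activity: sum(r[activity] for r in responses) / len(responses)
    for activity in ACTIVITIES
}

# List activities from most to least time spent.
for activity, avg in sorted(average_rank.items(), key=lambda kv: -kv[1]):
    print(f"{activity}: {avg:.2f}")
```

Average ranks make the class-level pattern visible at a glance, which is all a first-pass "what do we know about our students" analysis needs.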
       
    • What do we know about our students?
    • Where Do You Get Your Information?
    • Which of these have you done within the last year?
    • How many papers did you write?
    • Schroeder Words of Wisdom: Hard Lessons Learned
      • Give yourself permission to fail
      • Assess to learn things – don’t just assess to assess
      • Be sure to have faculty buy-in
    • Starting Again
      • Involve new staff
      • Improve librarian buy-in
      • Imagine: what will happen when the developers leave?
    • What are the Information Literacy Competency Standards?
      http://ala.org/ala/mgrps/divs/acrl/standards/standards.pdf
    • FLITE Information Literacy Assessment Team CREATES LEARNING OUTCOMES
      • In-depth study of the 5 competency standards
      • Focus on performance indicators
      • Curriculum mapping: group discussion and consensus
      • 7 performance indicators
      • 1-2 indicators from each standard
    • FLITE Information Literacy Assessment Team CREATES LEARNING OUTCOMES
      • Group work: individual questions written for each learning outcome (performance indicator) - 35 questions for assessments
      • Discussion and consensus: team selected 34 questions for the Question Bank
      • 12 questions are survey questions asked on every assessment
      • 22 questions can be tailored for the class
    • Information Literacy – Round Two Fall 2009
      • Question Bank built for use with different classes
      • Individualized assessments built by individual library instructors from the Question Bank
      • Assessments delivered electronically when possible
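The Question Bank workflow above (12 fixed survey questions plus items an instructor picks from the 22 tailorable ones) can be sketched in a few lines of Python. The question IDs and the helper function are invented for illustration; they are not part of the team's actual tooling:

```python
# Hypothetical sketch of assembling an individualized assessment from a
# question bank: a fixed survey block plus instructor-chosen items.

SURVEY_QUESTIONS = [f"survey-{i}" for i in range(1, 13)]    # 12 fixed questions
TAILORABLE_QUESTIONS = [f"item-{i}" for i in range(1, 23)]  # 22 tailorable items

def build_assessment(chosen_items):
    """Combine the fixed survey block with an instructor's chosen items."""
    unknown = [q for q in chosen_items if q not in TAILORABLE_QUESTIONS]
    if unknown:
        raise ValueError(f"not in the Question Bank: {unknown}")
    return SURVEY_QUESTIONS + list(chosen_items)

# An instructor tailoring an assessment for one class:
assessment = build_assessment(["item-3", "item-7", "item-15"])
print(len(assessment))  # 12 survey questions plus 3 tailored items
```

Keeping the survey block identical across classes is what makes results comparable later, while the tailored items let each instructor match the assessment to what was actually taught.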
    • Survey Monkey
      Started in 1999, SurveyMonkey is an online survey tool that enables people of all experience levels to create their own surveys quickly and easily.
    • FLITE Library’s Assessment Team picked Survey Monkey because…
      • History: team members have previous experience with the program
      • Ease of use: user needs no statistical background and no knowledge of statistical software
      • Malleable: accessible from the web, in class or at home, and easily printable for a paper version
    • Success – Yes, You Can Too
      • We went beyond the “what ifs” and created a project.
      • We gave ourselves permission to fail.
      • We didn’t have to be perfect – we just had to start.
      • We created a Culture of Assessment: we refined what we did and moved on.
    • Next Steps
      Post-instruction assessment – Dec 2009
      Refine questions based on data input
      Expand to other faculty and departments
      Adapt procedures for Distance Learning
      Write new assessments for upper-level classes
    • Accomplished Major Goal
      • Closed the loop
      • Curriculum mapping: using assessment data to revise and improve the instruction curriculum
      • Keep assessing – TracDat
    • Culture of Assessment
      • Helps us inform faculty where we want them to go with us
      • Allows us to create customized instruction across the curriculum
      • Showcases FLITE – the Library is one of the early reporters of assessment data on campus
    • Just Start!
      • Get Faculty Buy-in
      • Give Yourself Permission to Fail
      • Take the Plunge and Begin
    • Questions?
      Kristen Motz motzk@ferris.edu
      Randall Schroeder randallschroeder@ferris.edu
      Mari Kermit-Canfield kermitt@ferris.edu
