Why can't students get the resources they need? Results from a real availability study

Availability studies estimate the proportion of items in a collection that library users can access. This traditional research method can help librarians find and fix the most significant access problems with electronic resources, and connect patrons with information through better collection development and acquisitions decisions.

To date, all electronic resource availability studies have been "simulated" studies, in which a librarian tests access to a sample of items. Simulated availability studies identify technical problems with electronic resources, but don't address how database interface design or insufficient library research skills could prevent a student from successfully obtaining a desired item.

This study represents the first known attempt at a "real" electronic resource availability study, in which recruited students generate and test the sample. It uses quantitative methods to estimate overall resource availability, and a cognitive walkthrough (a usability research method) to compare the way Redlands students actually retrieve full text against an ideal process articulated by Redlands librarians.

The study's conclusions can be used to benchmark studies of e-resource availability at other campuses, provide input into database interface design and improve library instruction concerning electronic resources.
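Because an availability study ultimately reports an estimated proportion from a finite sample, it helps to attach a confidence interval to that estimate. The following is a minimal, hypothetical Python sketch of that calculation using a Wilson score interval; the counts (102 available out of 142 interactions) are illustrative only and are not results from this study.

```python
from math import sqrt

def wilson_interval(successes: int, n: int, z: float = 1.96) -> tuple[float, float]:
    """Wilson score confidence interval for a proportion (z=1.96 gives ~95%)."""
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    margin = (z / denom) * sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return centre - margin, centre + margin

# Hypothetical example: 102 of 142 sampled interactions ended in full text.
low, high = wilson_interval(102, 142)
print(f"Availability: {102/142:.0%} (95% CI {low:.0%}-{high:.0%})")
# → Availability: 72% (95% CI 64%-79%)
```

The Wilson interval is preferable to the simpler normal approximation at the sample sizes typical of availability studies, since it behaves sensibly even when the proportion is near 0% or 100%.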

Presenter:
Sanjeet Mann
Arts and Electronic Resources Librarian, University of Redlands
Redlands, CA



Transcript

  • 1. Why can’t students get the sources they need? Results from a real availability study
    Sanjeet Mann, Arts & Electronic Resources Librarian, University of Redlands
    29th Annual NASIG Conference, Ft. Worth, Texas, May 2, 2014
  • 2. How do you get full text? Librarian Student Paige Mann Carlos Puma
  • 3. Research methods
    Technical errors = availability study
      • Quantitative
      • Large sample for statistical validity
      • Researcher tests access (“simulated availability”)
    Human interaction = usability study
      • Qualitative
      • 5-7 users
      • Researcher observes library users
  • 4. My methodology
      • Cognitive walkthrough
      • 7 students x 2 searches x 10 results ≈ 142 interactions
      • Jing screen capture software
      • Demographic survey and results spreadsheet
  • 5. Results
      • Did not obtain item: 25%
      • Requested via ILL: 43%
      • Located physically: 3%
      • Downloaded online: 29%
  • 6. Errors
      • System: 31
      • User: 35
      • Both: 16
  • 7. Severe system errors
      • A&I database has no OpenURL link
      • Target database refuses the OpenURL
      • A&I database has bad/missing metadata
        • Can info be found in Google?
      • Knowledge base doesn’t offer article-level linking
        • Is student willing to browse?
  • 8. Student encounters a system error
  • 9. Severe human errors
      • Didn’t test link
      • Used system incorrectly
      • Overlooked important information
      • Got frustrated and gave up
  • 10. Student experiences user error
  • 11. Conceptual model
  • 12. Questions availability studies can address
      • How often do errors occur? Should we be satisfied with our technical infrastructure? (systems)
      • How often do users need ILL? (interlibrary loan)
      • Do we have enough full text in the collection? (collection development)
      • Are we teaching users what they need to be successful at obtaining electronic resources? (instruction)
  • 13. For discussion at University of Redlands
      • 13% error rate in 2013 simulated study
      • Common problems = source metadata, KB support for OA titles
      • 41% local availability (research libraries average 60%)
      • 2 of 3 items not held in local collection in 2013 study
      • 43% of interactions resulted in ILL in 2014 study
      • Threshold concepts vs. search/retrieval mechanics
      • How do you teach students to be thoughtful, resilient searchers?
  • 14. What would I do differently?
      • Larger, personalized incentives
      • Simplify research design
      • Cognitive walkthrough as group activity
      • Jing + Camtasia Studio worked OK
  • 15. Further Reading
      • Selective bibliography: http://goo.gl/4fu47
      • Slides: http://www.slideshare.net/sanjeetmann
      • University of Redlands availability study datasets
        • 2012 (simulated) http://goo.gl/606us
        • 2013 (simulated) http://goo.gl/O5XK9A
        • 2014 (real) http://goo.gl/BaAm5T
