Online Catalogs: What Users and Librarians Want

    1. Online catalogs: What users and librarians want
       A review of market research data
       Prepared for the Charleston Conference by Karen Calhoun and Janet Hawk
       7 November 2008
    2. With thanks to Joanne Cantrell, OCLC Market Research Analyst. Photo by allw3ndy: http://flickr.com/photos/allw3ndy/2757149584/
    3. What did catalog quality mean in 1989? Duplicate records, bad name headings, bad subject headings.
       Davis, Carol C. 1989. “Results of a survey on record quality in the OCLC database.” Technical Services Quarterly 7 (2): 43-53.
    4. The perception of “quality”: the eye of the beholder
       • Specialist’s view: conformance to specifications (rules); priorities: fullness and detail
       • Pragmatist’s view: make as many materials as possible available as quickly as possible; priorities: speed and efficiency
       • End user’s view: easy and convenient
    5. 30-second summary of online catalog user studies
       • Keyword searching reigns
       • The default search is chosen most often
       • Number of terms in a query: 1 to 3
       • Search failure rate (zero hits) is very high: 20 to 40 percent
       The latest study: Moulaison, Heather L. 2008. “OPAC queries at a medium-sized academic library: a transaction log analysis.” LRTS 52 (4): 230-237.
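       (For anyone wanting to reproduce figures like these, a minimal sketch of a transaction-log analysis in Python. It assumes a hypothetical tab-separated log with one line per search, holding the query text and its hit count; real OPAC logs differ by vendor, and the file name is illustrative.)

          # Summarize query length and zero-hit rate from a hypothetical
          # tab-separated log: query_text<TAB>hit_count, one line per search.
          import csv

          def summarize_log(path="opac_search_log.tsv"):
              total = zero_hits = 0
              term_counts = []
              with open(path, newline="", encoding="utf-8") as fh:
                  for query, hits in csv.reader(fh, delimiter="\t"):
                      total += 1
                      term_counts.append(len(query.split()))
                      if int(hits) == 0:
                          zero_hits += 1
              print(f"Searches: {total}")
              print(f"Mean terms per query: {sum(term_counts) / total:.1f}")
              print(f"Zero-hit rate: {zero_hits / total:.0%}")

          summarize_log()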
    6. Will Google Books usurp the library catalog? Ludwig, Mark J. and Margaret R. Wells. “Google Books vs. BISON.” Library Journal, July 15, 2008. http://www.libraryjournal.com/article/CA6566451.html
    7. Assumptions and mindsets: Where do subject-rich index terms come from? LCSH: from 3 to 7 words per record.
       Markey, Karen and Karen Calhoun. 1987. “Unique words contributed by MARC records with summary and/or contents notes.” Proceedings of the 50th ASIS Annual Meeting (Medford, NJ: Learned Information), p. 153-162.
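       (A rough sketch of the kind of unique-word comparison behind this finding, using the pymarc library. Treating field 650 as LCSH and fields 520/505 as summary and contents notes follows standard MARC 21 tagging but is an assumption about how one might redo the count today, not the authors’ published procedure; the file name is illustrative.)

          # Compare unique words in subject headings vs. summary/contents notes.
          import re
          from pymarc import MARCReader

          def unique_words(fields):
              """Lower-cased unique words across a list of MARC fields."""
              words = set()
              for field in fields:
                  words.update(re.findall(r"[a-z0-9']+", field.value().lower()))
              return words

          with open("records.mrc", "rb") as fh:
              for record in MARCReader(fh):
                  lcsh = unique_words(record.get_fields("650"))
                  notes = unique_words(record.get_fields("520", "505"))
                  extra = notes - lcsh  # words the notes add beyond the headings
                  print(f"LCSH words: {len(lcsh):3d}  added by notes: {len(extra):3d}")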
    8. Assumptions and mindsets: What is “full”? (Screenshots contrast record detail: one display offers product description and purchase information, ‘More like this’, editorial reviews and author info, ‘Inside the book’, tags, ratings, customer reviews, lists and more, plus three more screens; the catalog displays offer bibliographic information, library holdings, details, subjects, editions, and reviews, or bibliographic information and Australian library holdings.) With thanks to David Lankes: http://quartz.syr.edu/rdlankes/Presentations/2007/ALCTS.pdf
    9. What is online catalog “quality”? “A persistent shortcoming in the decision-making process [about library database quality] that needs to be addressed is the lack of serious research into user needs and benefits, and the actual impact on users of database quality decisions.” — Janet Swan Hill
       Hill, Janet Swan. 2008. “Is it worth it? Management decisions related to database quality.” CCQ 46 (1): 5-26.
    10. “You need more book descriptions. Telling me the author name and book title does not tell me what a book is about.” — High school student
        “I would like to preview actual pages from the books. This would greatly help me educate myself on the subject matter presented and get a sense of what the book actually offers.” — College student
        “Please link me to the item I’m searching for.” — Graduate student
    11. Objectives of our metadata quality research
        • Start over with a blank page
        • Identify and compare the metadata expectations of end users and librarians
        • Compare expectations across types of librarians
        • Determine end-user satisfaction with WorldCat.org
        • Define a new WorldCat quality program that considers the perspectives of all constituencies of WorldCat: end users (and subgroups of end users) and librarians (and subgroups of librarians)
    12. How did we conduct the research? Research methodologies:
        • Focus groups, conducted by Blue Bear, LLC
        • Pop-up survey on WorldCat.org, conducted by ForeSee Results
        • Librarian survey, conducted by Marketing Backup
    13. End-user focus groups
        • Groups: college students (ages 18–24); general public (ages 25–59); scholars, including academic faculty and graduate students
        • Format: individual usability tests with comments captured on-screen, followed by facilitator-led group discussion
    15. What did we learn? End-user focus group results. Key observations:
        • Delivery is as important as discovery, if not more so; a seamless, easy flow from discovery through delivery is critical
        • Improved search relevance is necessary
    16. Pop-up survey
        • Live on WorldCat.org since May 12
        • 11,000+ responses through July 10
        • Evaluates the metadata most helpful in identifying a needed item
    17. Who responded to the survey?
        • Role: students 19%; teachers/professors 15%; other general searchers 34%; librarians/other library staff 32%
        • Country: USA 56%; Canada 4%; Mexico 3%; United Kingdom 3%
        • Language: English 84%; Spanish 8%; other 3%; French 2%
        • Age: 18 and younger 5%; 19–30 24%; 31–40 17%; 41–50 20%; 51–60 20%; 61+ 13%
    18. What did we learn? Pop-up survey results. Information most essential in identifying the item needed? End users (n=7,535). (Chart: each kind of information is labeled as serving discovery or delivery; most of the labels are discovery.)
    19. What did we learn? Pop-up survey suggestions. Changes to help identify an item? End users (n=7,535).
    20. ‘Item details’ in WorldCat.org: The World Is Flat (lots of detail) versus Not Quite the Diplomat (not much detail).
    21. ‘Subject information’ in WorldCat.org: six subject-rich words: Barack Obama; Travel; Africa; Presidential Candidates.
    22. End-user recommendations
        • Improve search relevance
        • Add more links to online full text (and make linking easy)
        • Add more summaries/abstracts and make summaries more prominent
        • Add more details in the search results (e.g., cover art and summaries)
    23. Librarian survey
        • In the field since September 2008 (U.S. and non-U.S.)
        • Preliminary data as of 10/24/08: 1,138 responses; North America (844) and international (171)
        • Evaluates: the metadata most helpful in identifying a needed item; the attributes liked most about WorldCat; recommended enhancements to WorldCat
    24. Librarian survey: current areas of responsibility
        • Acquisitions: 28%
        • Cataloging: 65%
        • Collection development or selection: 32%
        • Interlibrary loan: 25%
        • Reference/public service: 46%
        • Library director/administration: 20%
    25. What did we learn? Librarian survey results: reactions to WorldCat.org compared to end users. Most essential information: discovery.
    26. What did we learn? Librarian survey results compared to end-user results. Recommended enhancements: discovery.
    27. What did we learn? End-user survey data compared to librarian survey data. Recommended enhancements to WorldCat: discovery.
    28. What did we learn? Librarian survey results. Top recommended enhancements to WorldCat: top 5 total librarian responses.
    29. What did we learn? Librarian survey results. Recommended enhancements to WorldCat: top 5 acquisitions librarian responses.
    30. What did we learn? Librarian survey results. Recommended enhancements to WorldCat: top 5 cataloging librarian responses.
    31. What did we learn? Librarian survey results. Recommended enhancements to WorldCat: top 5 library director responses.
    32. What did we learn? Librarian survey results. Top enhancements for WorldCat: top 5 responses by academic librarians.
    33. What did we learn? Librarian survey results. Top enhancements for WorldCat: top 5 responses by public librarians.
    34. What did we learn? Librarian survey results. Top enhancements for WorldCat: top 5 responses among international librarians.
    35. What did we learn? Pop-up survey suggestions. Changes to help identify an item? End users (n=7,535): the bottom 8 mentions.
    36. Recommendations from librarian survey (so far)
        • Merge duplicates
        • Make it easier to correct records (fix typos, do upgrades); a “social cataloging” experiment along Wikipedia lines
        • More emphasis on the accuracy and currency of library holdings
        • Enrichment (TOCs, summaries, cover art): work with content suppliers, use APIs, etc.
        • Education about what users say they want
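        (The slide names no particular suppliers or APIs; as one illustration of API-based enrichment, here is a minimal sketch that asks the Open Library Covers API for cover art by ISBN.)

           # Look up cover art for an ISBN via the Open Library Covers API.
           import urllib.error
           import urllib.request

           def cover_url(isbn, size="M"):
               """Return a cover-image URL if Open Library has one, else None."""
               # ?default=false asks the API to answer 404 instead of serving a
               # blank placeholder when no cover exists for the ISBN.
               url = f"https://covers.openlibrary.org/b/isbn/{isbn}-{size}.jpg?default=false"
               try:
                   with urllib.request.urlopen(url) as resp:
                       return resp.url
               except urllib.error.HTTPError:
                   return None

           print(cover_url("0374292884"))  # example ISBN; substitute any ISBN of interest

        (A production workflow would batch lookups, cache results, and respect supplier licensing terms.)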
    37. A few ideas to discuss
        • Catalogs have many audiences, inside and outside the library
        • With respect to metadata “quality,” librarians’ and end users’ definitions generally differ
        • Different groups of end users have different priorities, but there are some commonalities across groups:
          • The end user’s delivery experience is as important as the discovery experience, if not more so
          • Most important for analog materials: summaries, TOCs, etc.
          • Most important for licensed e- and digital materials: the ability to link easily and conveniently to the online content itself
        • Different groups of librarians have different priorities, but there are some commonalities across groups:
          • Merge duplicate records
          • Add TOCs
