Results of Web-scale discovery: Data, discussions and decisions


By comparing year-over-year usage before and after implementation of discovery services, libraries can quantify the impact discovery is having on the usage of their resources. Early results reported by Michigan's Grand Valley State University (GVSU) in June 2010, followed by the University of Houston (UH) in May 2011, show web-scale discovery having a transformational effect: astronomical growth in the usage of their electronic resources. GVSU continues to look at the numbers, but is also measuring the impact of discovery at its library by the discussions that the introduction of this new "digital front door" has prompted. Learning more about how students and faculty approach and use library resources, and the importance (or unimportance, depending on the audience) that the type of resource plays in the research process, is serious food for thought. This session will focus on new analytics and the availability of additional metrics; determining how best to help researchers of all kinds; and the choices that libraries consider as they enter and navigate this new world of web-scale discovery.
Presenter: Jeffrey Daniels, Grand Valley State University


  1. Results of Web-scale discovery: Data, discussions and decisions
     Jeffrey Daniels, Head of Technical Services and Electronic Resource Management, Grand Valley State University Libraries
  2. • 24,000+ students
     • $4.5 million annual library materials budget
     • 63 library staff members
     • Over 300 databases
     • 60,000 ejournal titles
     • 600,000+ ebook titles
     • 2012 ACRL Excellence in Academic Libraries Award
  3. Why should we listen to this guy?
     • Grand Valley State University was the first commercial implementer of Serials Solutions' Summon
     • Went live in August 2009
     • Nearly 3 years of experience with a live web-scale discovery product
  4. Key Questions and Decisions
     • Do we want web-scale discovery?
     • Where do we want it?
     • Who is our audience?
     • Should we teach it?
     • Is it working?
  5. Do we want web-scale discovery?
     • Federated searching never panned out
     • Enter web-scale discovery
     • Summon by Serials Solutions
     • Product search
     • Implementation
  6. Where do we want it?
     • Searching before Summon
       ▫ Keyword – Encore
       ▫ Keyword – Classic
       ▫ Title
       ▫ Author
       ▫ Subject
       ▫ Journal Title
     • Could there be more choices? Yes
       ▫ Tabs for Nautilus
       ▫ Document delivery
       ▫ Course reserve
  7. Searching after Summon
     • Summon search is the only search on the page
       ▫ Why?
       ▫ Wanted to emphasize the search box
       ▫ Links to other searches
  8. Did that help?
  9. Who is our audience?
     • Should determine this prior to implementation
     • 1st- and 2nd-year users
     • Anyone doing research outside their discipline
     • Summon drives users to subject-specific resources
  10. Should we teach it?
      • Discovery Delivery and Management task force
      • Not that "type" of librarian
      • What type of class is this?
      • Bookend the classes
      • The "ah-ha" factor
  11. Is it working?
      • Catalog items in Summon
      • Usage statistics
      • Article by GVSU's Head of Collections: Doug Way, "The Impact of Web-scale Discovery on the Use of a Library Collection," Serials Review, Volume 36, Issue 4, December 2010, Pages 214–220, ISSN 0098-7913, doi:10.1016/j.serrev.2010.07.002
      • Collection development
  12. Everybody loves statistics!
      • August 2009 – May 2012 (34 months)
        ▫ Summon sessions: 535,247
        ▫ Summon searches: 2,868,046
      • 2011 calendar year
        ▫ Database sessions: 116,308
        ▫ Summon sessions: 210,181
        ▫ Database searches: 192,357
        ▫ Summon searches: 1,147,807
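One way to read these figures is searches per session, which shows how much more intensively Summon was used than the traditional databases. The sketch below is illustrative only; the raw counts come from the slide, and the helper function is not part of the presentation.

```python
# Sketch: derive searches-per-session ratios from the figures reported on
# the slide. The function name is an assumption for illustration.

def searches_per_session(searches: int, sessions: int) -> float:
    """Average number of searches run per session."""
    return searches / sessions

# 34-month Summon totals (Aug 2009 - May 2012)
print(round(searches_per_session(2_868_046, 535_247), 2))   # 5.36

# 2011 calendar year: Summon vs. traditional databases
print(round(searches_per_session(1_147_807, 210_181), 2))   # 5.46
print(round(searches_per_session(192_357, 116_308), 2))     # 1.65
```

On these numbers, a 2011 Summon session averaged roughly five searches, compared with well under two per traditional database session.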
  13. [Chart: monthly Summon visits and searches, August 2009 – May 2012]
  14. Effect on collection development?
      • This is a frequently asked question
      • Using COUNTER usage statistics to measure the effect:
        ▫ 2 years pre-Summon (September 2007 – August 2009)
        ▫ 2 years post-Summon (September 2009 – August 2011)
  15. Abstract and index databases
      • Expectations:
        ▫ Searches would go down
        ▫ Possibly able to cancel general A&I resources
      • Statistics (2 years pre compared to 2 years post):
        ▫ RILM searches: 37% decrease
        ▫ Sociological Abstracts searches: 38% decrease
      • Effect on collection development:
        ▫ None; no cancellations to date
          ▪ Want our upper-level students using discipline-appropriate resources
          ▪ Accreditation and purchase model
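The percentages on this and the following slides are simple pre/post percent changes computed from the two-year COUNTER totals. A minimal sketch, with hypothetical raw counts (the slides report only the resulting percentages):

```python
# Sketch of the pre/post comparison method. percent_change is an assumed
# helper name; the raw counts below are hypothetical, chosen only to show
# how a "37% decrease" or "80% increase" would fall out of COUNTER totals.

def percent_change(pre: int, post: int) -> float:
    """Signed percentage change from the pre-Summon to the post-Summon period."""
    return (post - pre) / pre * 100

print(round(percent_change(10_000, 6_300)))    # -37  (a 37% decrease)
print(round(percent_change(10_000, 18_000)))   # 80   (an 80% increase)
```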
  16. Full text journal packages
      • Expectations:
        ▫ Full text access would increase
        ▫ Would allow us to cancel some aggregated databases
      • Statistics:
        ▫ Springer full text: 80% increase
      • Effect on collection development:
        ▫ We're pleased
        ▫ Reinforced the desire to get these packages
  17. Full text aggregated databases
      • Expectations:
        ▫ We would be able to drop some aggregators
      • Statistics:
        ▫ ProQuest Medical searches: 3% decrease
        ▫ ProQuest Medical full text: 3% increase
        ▫ Most aggregated resources trended similarly
      • Effect on collection development:
        ▫ Use is as strong or stronger
        ▫ No longer looking to drop them
  18. Effect on collection development
      • Summon has clearly changed how our resources are being used, but we feel they are good changes
      • We hoped to recoup Summon costs through cancellations, but didn't see usage change that way
      • *Also haven't had the need (knock on wood)
  19. And in closing…
      • If you are looking at web-scale discovery, ask the questions
      • Great resource for undergraduates
      • Makes library resources less "scary"
      • Not at the ideal experience yet; keep asking the questions!
      • If you don't know where to start: it's the best product we've ever worked with
  20. Questions?
      • Jeffrey Daniels
        ▫
        ▫ 616-331-2702