
Electronic Resources and Libraries Workshop at INFO 2012, Tel Aviv


Bonnie Tijerina (@bonlth) presented a workshop at the INFO 2012 Conference in Tel Aviv, Israel. The workshop, entitled "E-Resource Management, Workflow, and Discovery in the Digital Age," summarized current eresources management work, drawing on presentations from the 2012 Electronic Resources and Libraries Conference (@ERandL). More information about the conference is available online.

Published in: Technology, Education


  1. 1. INFO 2012 Workshop | Tel Aviv, Israel
  2. 2. Current Work  Assistant Director for Collections Services at Claremont Colleges Library  Founder, ER&L/ Electronic Resources & Libraries  ALA's Digital Content & Libraries Working Group  Past Work  Editor, JERL/ Journal of Electronic Resources Librarianship  Digital Collections, UCLA  Eresources Management, GeorgiaTech
  3. 3. Things I like when I’m not working Yoga Hiking Oatmeal  Travel Aunt
  4. 4. About you! Do you work in a library? Which library types? Which library roles? Do you work for a company supporting libraries? Which types of products or services?
  5. 5. Electronic Resources & Libraries  Founded in 2005 by an eresources librarian  100-person gathering in 2006  600+ in-person and online attendees in 2012  In 2012, the most heavily tweeted and blogged conference  Practical, tactical, honest and strategic work from all levels
  6. 6.  Great networking and camaraderie More sessions than you could possibly fit in 3 days  #erl12 Flickr page
  7. 7. Electronic Resources Librarian, Electronic Serials Librarian, Cataloguer, Head of Collection Development, Director/Associate Director, Serials Acquisitions Librarian, Scholarly Communications Librarian, Licensing Specialist, Digital Resources Librarian, Systems Librarian, Webservices Librarian, Metadata Librarian, Assistant Director, Health Sciences Librarian; eresources, serials, acquisitions, collections, technical/tech services, health/medical, sales/marketing (#erl12 Flickr page)
  8. 8.  Lightning talk opportunities for attendees 50+ Panels, workshops and keynote sessions  Program search tool
  9. 9.  Managing e-Resources in Libraries ER&L Track detail Collection Development & Assessment Workflow & Organizations External & User Relationships Emerging & Future Technologies Scholarly Communication & Licensing Library as Publisher ER&L’s tracks are annually reviewed and updated by volunteers on the Program Planning committee.
  10. 10.  09.00-10.30 | E-Resource Management Lifecycle, Part 1: Overview of the lifecycle and new models for Collection Development and Acquisitions 10.30-11.00 | Break 11.00-12.00 | E-Resource Management Lifecycle, Part 2: Workflow Analysis, E-Resource Maintenance, and Standards Updates
  11. 11.  12.00-13.00 | TDNet presentations 13.00-14.30 | Lunch break 14.30-15.30 | Use, Users and Assessment: An investigation of measuring impact and determining value and ROI (return on investment). 15.30-16.00 | Break 16.00-17.00 | E-Resource Discovery and Promotion: An evaluation and exploration of discovery services
  12. 12. I’m here today to learn _________.
  13. 13. My greatest challenge in managing eresources is ________.
  14. 14. E-Resource Management History
  15. 15. 1997 | Advent of e-resources (JSTOR, SIAM)  2001 | Advent of e-resource management tools (DLF, Serials Solutions, TDNet)  2003 | Metasearch tools (Metafind, Ex Libris)  2004 | Commercial ERMs on market (ERMS, Verde)  2009 | Flip to e-access over print purchase (ARL)  2011 | Multiple discovery tools on market (Summon, EDS, Primo)  2012 | Demand-driven tools (Get-It-Now, GIST)  Future… | Open access management?
  16. 16. Does this all sound familiar?  Reworking workflows  Fully implementing ERMS  Measuring Usage, Value and ROI  Value and Use of Discovery Tools  Licensing and negotiation skills  Useful patron-driven acquisition  Dismantling the Big Deal  User Experience  Leadership in libraries
  17. 17. E-Resource Management Lifecycle | Part 1: Overview of the lifecycle and new models for Collection Development and Acquisitions
  18. 18. What is TERMS? TERMS is an attempt to create an internationally crowdsourced set of best practices for electronic resource management.  Based on the electronic resources lifecycle, each segment of the lifecycle has been developed to give the basic techniques used.  Workflows are shared via an open Dropbox site
  19. 19. Where is TERMS available? TERMS is freely available from three social media sites:  Facebook: TERMS group page  TUMBLR:  TWITTER: @6terms  Documentation regarding the best practices is posted to the Facebook & Tumblr sites
  20. 20. Future of TERMS Working with JISC Collections in the UK to find the best place to openly share within their web site environment TERMS will be presented as a poster session at LIBER this summer, and at various events in 2013 TERMS will be published as an ALA Technical Report in April 2013
  21. 21. eResource Management Lifecycle – Selection/Evaluation Criteria  Content: • Relevance to research and/or curriculum needs • Depth and breadth of content • Simultaneous multiple user or single user  Access: • User interface, response time and reliability • Digital Rights Management (DRM) • Authentication  Technical: • Support (local and vendor) • Customization • Provision of usage statistics, cataloging records  Cost: • 1-time or Subscription • Platform/hosting fees
  22. 22. eResource Management Lifecycle –Purchasing/ Pricing Models Purchase/own or Lease e-content Pay-per-view (articles) o Library-sponsored or end-user service such as DeepDyve Single-user or Multiple-user Institutional or Consortial purchases o Shared licensing and content; discounted cost Aggregator, “Big Deal” or Title-by-title o Fulltext databases; Publisher eJournal collections; Individual eJournals User demand-driven acquisition (DDA) or Librarian-selected eContent
  23. 23. Models are publisher-driven, in most cases, but when given a choice, libraries must assess which model is most cost-effective for each e-resource.
  24. 24. eResource Management Lifecycle – Licensing considerations  Authorized use and users o Limits on use or users o Downloading and printing o Fair Use, Inter-Library Loan  Country rights (outside of N. America)  Governing Law o Stipulate local laws govern  Cancellation and Archival rights o What happens to content already purchased?  Model license available at LIBLICENSE
  25. 25. eBook Demand-Driven Acquisitions - Key Aspects  New titles identified by eBook supplier for DDA service o Based on library-selected subject/non-subject parameters  Bibliographic records loaded to catalog for users to discover o Creates an expanding database of relevant titles  Users can access eBook for 24-hour loan periods  Short-term loans trigger a purchase after library-defined threshold (3 loans, 4 loans, etc.)  Automated ordering & e-invoicing of purchased eBooks (depends on vendor)  Expenditure data for loans and purchases available (from vendor and eBook aggregator)
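The loan-counting and purchase-trigger mechanism described on this slide can be sketched in a few lines. This is a minimal illustration, not any vendor's actual implementation; the threshold value and the `record_loan` helper are hypothetical.

```python
from collections import defaultdict

# Hypothetical sketch of the DDA trigger: short-term loans accumulate
# per title, and a purchase fires once a library-defined threshold
# (e.g. 3 or 4 loans) is reached.
PURCHASE_THRESHOLD = 4

loan_counts = defaultdict(int)
purchased = set()

def record_loan(title_id):
    """Record one short-term loan; return True if it triggers a purchase."""
    if title_id in purchased:
        return False  # already owned; no further short-term-loan charges
    loan_counts[title_id] += 1
    if loan_counts[title_id] >= PURCHASE_THRESHOLD:
        purchased.add(title_id)
        return True
    return False

# Example: the 4th loan of a title triggers the purchase.
events = [record_loan("ebook-123") for _ in range(4)]
print(events)  # [False, False, False, True]
```

In practice the threshold and the per-title accounting live on the vendor side; the library only sets the parameters and receives the expenditure data.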
  26. 26. eBook Demand-Driven Acquisitions - A Case Study  Colorado State University – July 2011  Rationale o Declining budget for books/eBooks o Low use of books purchased via approval plan or librarian selections  Used existing subject/non-subject parameters for weekly records load for new eBooks  4 short-term loans before eBook purchase triggered  After 8 months, total US dollars spent on short-term loans and purchases considerably less than what would have been spent on print approval plan shipments  Given the cost savings, library is considering broadening scope of eBook titles available for users to discover
  27. 27. Demand-Driven Acquisitions Metrics
  28. 28. eBook Demand-Driven Acquisitions Do you provide access to eBooks? Do your users use eBooks?
  29. 29. Break
  30. 30. Workflow Mapping  Many institutions in both the US & UK are mapping out their processes for various electronic resource management workflows  Mapping workflows helps to reveal workflow process overlaps in different departments & duplication of effort across various management tools
  31. 31. Duke University library - Case Study
  32. 32. staff responsibility matrix → staff interviews → workflow diagrams → analysis → assessment of best practices → recommendations
  33. 33. Duke University Case Study
  34. 34. Duke University Case Study
  35. 35. Duke University Case Study
  36. 36.  Proactive troubleshooting strategies Working more with vendors Extensive cross-training Leverage tools and technology to maximize efficiencies Improve/expedite loading of MARC records Improve transparency of e-resource workflow
  37. 37. Understand | Mapping helps to outline problems in processes.  Insight | Mapping depicts missing steps of management.  Alignment | Mapping helps all staff in the organization to understand what the current workflow is.
  38. 38. Resolution of access problems often requires working multiple angles at once:  Access: what device patrons are using, what browsers are being employed for access, is the patron an authorized user  Service: what library services are being used to gain access: OpenURL, webpages, LibGuides, LMS
  39. 39. Software ticketing programs like JIRA  Homegrown ticketing systems  SharePoint by Microsoft  Google Forms
  40. 40.  Knowing the total number of problems with any given publisher  Having percentages for when spikes of troubleshooting requests come in, to better manage staffing for troubleshooting  Finding or distinguishing trends with library management tools, like an OpenURL provider, and where their targets can be improved
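The ticket metrics described above, problems per publisher and the share of requests arriving in each month, amount to simple counting over whatever the ticketing system exports. A minimal sketch with made-up sample data (the ticket fields here are assumptions, not a real JIRA or SharePoint schema):

```python
from collections import Counter

# Hypothetical ticket export: one record per troubleshooting request.
tickets = [
    {"publisher": "Publisher A", "month": "2012-03"},
    {"publisher": "Publisher A", "month": "2012-03"},
    {"publisher": "Publisher B", "month": "2012-04"},
    {"publisher": "Publisher A", "month": "2012-09"},
]

# Total problems per publisher.
by_publisher = Counter(t["publisher"] for t in tickets)

# Share of tickets per month, to plan staffing around spikes.
by_month = Counter(t["month"] for t in tickets)
total = len(tickets)
monthly_share = {m: n / total for m, n in by_month.items()}

print(by_publisher.most_common(1))  # [('Publisher A', 3)]
```

The same tallies, run against OpenURL resolver logs instead of tickets, surface which link targets fail most often and should be fixed first.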
  41. 41. NISO: National Information Standards Organization  COUNTER: Usage data standard reports (Release 4)  ESPRESSO: Establishing Suggested Practice Regarding Single Sign-On [in use]  I2: Institutional Identifiers  IOTA: Improving OpenURLs Through Analytics [in development]  KBART: Knowledgebases & related tools (KBART5)  ONIX Suite: EDI for various processes  SERU: Shared Electronic Resources Understanding [just updated]  SUSHI: Standardized Usage Statistics Harvesting Initiative [in use]  Electronic Resource Management (ERM) Data Standards and Best Practices Working Group  Open Discovery Initiative
  42. 42.  Identify efficiencies between libraries, publishers and discovery service providers Identify needs and requirements of stakeholder groups Create recommendations and tools to streamline ways to communicate with each other Ways of assessing: 1. participation level of info providers in services 2. breadth and depth of indexed content 3. the degree that content is available and accessible to the end users
  43. 43.  Standard vocabulary NISO Recommended practice  Data and format transfer  Communicating context rights  Level of indexing, content availability  Linking to content  Usage Statistics  Evaluate Compliance Spread this Information
  44. 44.  In most of these areas, targeted standards and best practices have evolved to fulfill and/or exceed the scope of the ERMI Data Dictionary  KBART  COUNTER  SUSHI  I2 for Institutional Identifiers  ONIX for Serials (SOH, SPS, SRN)  NISO should continue to encourage well-focused ERM standards development
  45. 45.  Workflows still a big issue NISO should convene series of webinars in 2012 to identify common needs & best practices Discuss findings at future conferences to guide further work
  46. 46. Break
  47. 47. Users think, process, and manage information differently  Expect more personalization and instant gratification  Are collaborative and multitask  Learn experimentally through trial and error rather than by formal learning and reading  Prefer non-linear access to information  Respond better to graphics than text  Expect highly intuitive interfaces and convenience
  48. 48. To assess how well the library's resources support the needs of its users  To demonstrate the value of the library to curriculum and research  To show the Return on Investment an institution has made in the library and its electronic resources
  49. 49. Use • Journal Usage Statistics Portal (JUSP) – locally developed tool • Provides single point of access to COUNTER usage reports (UK academic libraries) • 21 publishers participating • Automated gathering of usage data through SUSHI • Enables report comparisons across publishers and years  Transactions • Analysis of transaction logs measures system response times, hit rates, session lengths, whether user is inside the library or not  Quality • SNIP - contextual citation impact • Impact Factor - perceived ‘prestige’ of a journal • Eigenfactor - measure of time researchers spend with a journal
  50. 50. Make data meaningful Gather & analyze usage over time o Multiple years vs. one point in time to identify trends Factor in cost o Cost/use ratio, Cost-benefit analysis Analyze by subject, publishers, or user type o Variations may be meaningful and aid decision-making Look beyond the numbers o Barriers to use (user interface, training) o System/network/technical issues
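The "gather usage over time and factor in cost" advice above reduces to a cost-per-use ratio tracked across years. A small illustration with hypothetical figures (the usage and cost numbers are invented, not from any real resource):

```python
# COUNTER-style annual full-text downloads and subscription cost
# for one resource, over three years (hypothetical figures).
usage_by_year = {2009: 1800, 2010: 1450, 2011: 900}
cost_by_year = {2009: 5000.0, 2010: 5200.0, 2011: 5400.0}

# Cost per use, year by year: a rising ratio flags a review candidate,
# which a single-year snapshot would hide.
cost_per_use = {
    year: cost_by_year[year] / usage_by_year[year]
    for year in usage_by_year
}

for year in sorted(cost_per_use):
    print(year, round(cost_per_use[year], 2))
```

Breaking the same ratio out by subject or user type, as the slide suggests, is the same arithmetic over a filtered subset of the usage data.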
  51. 51. California Digital Library - Value-based strategy utilizing objective metrics to calculate the value of scholarly journals  Used to identify titles that make a greater or lesser contribution to the University of California’s mission of teaching, research, and public service  Analysis for over 8,600 journals in 36 UC-licensed e-journal packages  Use of locally developed Weighted Value Algorithm by Subject  3 vectors of value encompassing 6 data metrics: o Utility (usage and citations) o Quality (Impact Factor, SNIP) o Cost Effectiveness (cost per use, cost per SNIP)
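A weighted-value score of this kind can be sketched as: normalize each metric within the comparison set, then combine the three vectors with weights. The slide does not give CDL's actual formula, so the weights, metric values, and journal names below are all hypothetical; this only illustrates the shape of the calculation.

```python
# Hypothetical illustration of a weighted-value calculation: six metrics
# across utility, quality, and cost-effectiveness, normalized and combined.
# Weights and figures are assumptions, not CDL's actual algorithm.
journals = {
    "Journal A": {"usage": 12000, "citations": 340, "snip": 1.8, "cost_per_use": 1.20},
    "Journal B": {"usage": 2500, "citations": 80, "snip": 0.9, "cost_per_use": 4.75},
}

def normalize(values, invert=False):
    """Scale values to 0..1 within the set; invert when lower is better."""
    lo, hi = min(values.values()), max(values.values())
    span = (hi - lo) or 1.0
    return {k: (hi - v) / span if invert else (v - lo) / span
            for k, v in values.items()}

usage = normalize({k: v["usage"] for k, v in journals.items()})
cites = normalize({k: v["citations"] for k, v in journals.items()})
snip = normalize({k: v["snip"] for k, v in journals.items()})
cpu = normalize({k: v["cost_per_use"] for k, v in journals.items()}, invert=True)

# Hypothetical weights: utility 0.6 (usage + citations), quality 0.2, cost 0.2.
scores = {
    k: 0.4 * usage[k] + 0.2 * cites[k] + 0.2 * snip[k] + 0.2 * cpu[k]
    for k in journals
}
print(sorted(scores, key=scores.get, reverse=True))  # ['Journal A', 'Journal B']
```

Ranking the resulting scores within a subject grouping is what surfaces the titles making a greater or lesser contribution.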
  52. 52. Understanding the Future: Next Wave of User Data Analysis - ITHAKA  Analysis of JSTOR usage data led to product enhancements ▪ Turnaways resulted in making out-of-copyright content freely available to users ▪ Proxy re-direct feature for users who started in Google but weren’t authenticated even when they did have institutional access  Analysis of usage and turnaway data by discipline o Patterns of use for current content and archival content o Impact discovery services have on usage
  53. 53. Which tools are you using? What data do you have? How do you use it?
  54. 54. Break
  55. 55. An Evaluation and Exploration of Discovery Services
  56. 56.  Slow response  Databases are being searched, not indexes Ranking by relevance not possible or problematic Results not de-duplicated Not all of a library’s resources could be searched  Libraries selected which resources should be searched—too many and search might time-out
  57. 57.  Allow users to search internal and external library resources—print & electronic—simultaneously o Fulltext article databases o Library OPAC o Locally created digital collections o Open-access content Considerations o Simple, single search o Results presented quickly o Filtering & manipulation of search results o Customization of interface by library o Mobile interface
  58. 58. EBSCO Discovery Service (EBSCO) - 2010Primo Central Total Care (Ex Libris) - 2010Summon (SerialsSolutions/ Proquest) -2009OCLC Worldcat Local - 2009
  59. 59.  Ecole Polytechnique Federale Lausanne (EPFL)  Parallel comparison of Summon, EBSCO Discovery, Worldcat Local and Primo (2011) o Original methodology included focus group with users o Technical & set-up issues resulted in a shortened evaluation by librarians only  Looked at: o Content & Relevance (Content gaps?) o Search functions (user interface, advanced search) o Results view and manipulation and subsequent result use o User account (integration with circulation to request/hold materials) o Administration (local expertise; vendor support) o Professional interface (permanent URLs to content records?)  “Be realistic, demand the impossible: Comparison of 4 Discovery Tools using real data at the EPFL (Ecole Polytechnique Federale Lausanne).” D. Aymonin, et al.
  60. 60. EPFL conclusion: No one ‘winner’, each service had strengths and weaknesses What’s important for your library and users? o Content focus (local collections, articles, books) o Commercial databases (content-neutral or are some databases excluded?) o User interface o Price
  61. 61. Grand Valley State University, Michigan, USA Implemented Summon in 2009 Used Google Analytics and vendor-provided usage data to study impact of Summon on use of eResources Results – o Use of abstracting & Indexing databases, already declining, continued to decrease o Use of fulltext resources increased ‘dramatically.’
  62. 62. “Web‐scale discovery services represent a dramatic change in how libraries provide access to collections. Silos that existed based on subject content, publisher or content provider in many ways no longer exist or are no longer important.”  Way, Doug, "The Impact of Web-scale Discovery on the Use of a Library Collection" (2010). Scholarly Publications. Paper 9.
  63. 63. Which discovery service have you investigated?  Share your evaluation and/or implementation experience  Has the discovery service impacted eResource usage at your library?
  64. 64. Questions / Comments