Building a foundation for collection management decisions: two approaches

Salisbury University and the University of Maryland both undertook projects to evaluate the effectiveness of EBSCO Information Services' Usage Consolidation product and the usefulness of the extracted data for collection development decisions. The goals of implementation were to centralize the collection and analysis of e-resource usage data and to give collection management librarians easy access to usage and cost-per-use data to aid their decision-making. The presenters will discuss how staff at each institution populated Usage Consolidation and presented usage reports to collection managers; how collection managers responded to the data; and how they used the data to inform collection management decisions.

Leigh Ann DePope
Serials/Electronic Services Librarian, Salisbury University
Leigh Ann DePope is the Serials/Electronic Services Librarian at Salisbury University. She is responsible for all aspects of serials and electronic resource management and has serials experience in both public and academic libraries. Leigh Ann earned her MLS from Clarion University of Pennsylvania and a BA from the Pennsylvania State University.

Mark Hemhauser
Systems Librarian, University of Maryland
Mark Hemhauser has 18 years of experience managing serials acquisitions and is currently the Systems Librarian for the Aleph Acquisitions and Serials module at the University System of Maryland and Affiliated Institutions. He also serves on the e-Acquisitions Team of the Kuali OLE (Open Library Environment) project, an open-source, library-driven project to build a truly integrated library system.

Rebecca Kemp
University of Maryland
Rebecca Kemp is Continuing Resources Librarian at University of Maryland, College Park. She has served as a continuing resources librarian since 2004, has served on national library association committees, and has participated in a variety of state and national conferences.

  • Screenshot of a UC report (Title Details); the preview of results is helpful for quick viewing.
  • The report represents over 20,000 titles with more than 458,000 uses for SU. I used this report to look for trends in usage, for the titles with the most use, and for other patterns to discuss with the subject librarians.
  • These are some other handy reports that can be generated: usage by database (usage for almost all databases in one place), title usage from one specific platform (to see what resources are being used), and usage for an e-journal (this was handy in seeing that an expensive biology title was not being used). Statistics for e-journals are the most helpful because we were not collecting those before and so had no way to evaluate them.
  • As I said before, we are new at targeted assessment. The subject liaisons were excited about being able to get the information but became overwhelmed by the amount. It was hard for them to see trends and to sift through all the data for titles relevant to their departments. This is a drawback for subject liaisons: they only want information pertaining to their departments, and the data is not labeled that way in these reports. The smaller subject-specific databases work better for this but do not represent the bulk of our usage. They felt the database reports would be most useful for selection and renewal considerations. Again, we do not take a proactive approach to collection development, which makes it harder for me to produce reports, analyze them, and assist the subject liaisons with e-resource collection development decisions. Reports are usually produced on demand with a short turn-around time. It is my intent to fully realize the power of this product to change this.
  • My administration was much more receptive to UC. They were very excited to see the different types of reports that could be generated and to review the information in them. They immediately spotted trends and odd results and were eager for those items to be researched. They also now have hard numbers to use in public relations and outreach literature. We are starting a building project, so this is quite important to us right now.

    1. Building a Foundation for Collection Management Decisions: Two Approaches
       Leigh Ann DePope, Salisbury University
       Mark Hemhauser, University of Maryland, College Park
       Rebecca Kemp, University of Maryland, College Park
    2. Presentation Objectives
       • Understand why and how each institution populated EBSCONET Usage Consolidation
       • Understand how Acquisitions staff presented the tool to Collection Management Librarians
       • Understand how Collection Management Librarians responded to the tool and used it (or plan to use it) to inform collection management decisions
    3. Why test/implement Usage Consolidation?
       • Perennial problem: matching usage with cost data; cost of potential solutions
       • Displays usage and cost-per-use (CPU) information for EBSCO-subscribed titles in EBSCONET
       • College Park: leverage Public Services Librarians' familiarity with the EBSCONET Subscription Management interface; 93% of individually subscribed e-journals are paid through EBSCO (a lot of cost data available)
       • Salisbury: see usage across different platforms; see which packages contain specific titles
    4. What Usage Consolidation does
       • Matches e-journals profiled in A-to-Z with EBSCO e-journal orders and with titles in COUNTER usage statistics files
       • Can profile SUSHI-compliant platforms so that COUNTER usage stats are automatically harvested by Usage Consolidation
       • Shows usage and CPU for EBSCO-subscribed titles only; does not take into account costs of journal aggregator packages
       • Handles the following COUNTER reports: JR1, DB1, BR1, and BR2
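A COUNTER JR1 report is, in essence, a spreadsheet of full-text requests per journal per month, and a consolidation tool's first job is to total those monthly columns per title. A minimal sketch of that step, using invented column names and sample data rather than EBSCO's actual schema:

```python
import csv
import io

# Illustrative JR1-style rows: journal name, print ISSN, online ISSN,
# then one column of full-text requests per month (invented sample data).
JR1_SAMPLE = """Journal,Print ISSN,Online ISSN,Jan,Feb,Mar
Journal of Examples,1234-5678,8765-4321,10,12,8
Annals of Testing,,2222-3333,5,0,7
"""

def annual_totals(jr1_csv):
    """Sum the monthly full-text request columns for each title."""
    totals = {}
    for row in csv.DictReader(io.StringIO(jr1_csv)):
        # Every column that is not an identifier is a month of usage.
        months = [k for k in row if k not in ("Journal", "Print ISSN", "Online ISSN")]
        totals[row["Journal"]] = sum(int(row[m] or 0) for m in months)
    return totals

print(annual_totals(JR1_SAMPLE))
# {'Journal of Examples': 30, 'Annals of Testing': 12}
```

Real JR1 files carry extra header rows and a reporting-period total column, so a production loader would skip or cross-check those; the per-title totals are what get matched to orders and costs.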
    5. What Usage Consolidation does: SUSHI details
       • SUSHI version (optional)
       • SUSHI server URL (required)
       • SUSHI Requestor ID (required)
       • SUSHI Customer ID (required)
       • SUSHI authentication method (optional)
       • Use OASIS 1.0 authentication? (required for web-service-level authentication)
       • SUSHI username (required if OASIS is "Yes")
       • SUSHI password (required if OASIS is "Yes")
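The SUSHI fields listed above map onto a SUSHI (NISO Z39.93) ReportRequest, a SOAP message posted to the provider's SUSHI server URL. As a hedged sketch of how those profile fields end up in the request, with placeholder IDs and simplified namespaces rather than any real endpoint:

```python
# Skeleton of a SUSHI ReportRequest envelope. The Requestor ID and
# Customer ID come straight from the profile fields on the slide;
# authentication (OASIS/WS-Security username and password) would be
# carried in a SOAP header, omitted here for brevity.
SUSHI_ENVELOPE = """<SOAP-ENV:Envelope
 xmlns:SOAP-ENV="http://schemas.xmlsoap.org/soap/envelope/"
 xmlns:sus="http://www.niso.org/schemas/sushi">
 <SOAP-ENV:Body>
  <sus:ReportRequest>
   <sus:Requestor><sus:ID>{requestor_id}</sus:ID></sus:Requestor>
   <sus:CustomerReference><sus:ID>{customer_id}</sus:ID></sus:CustomerReference>
   <sus:ReportDefinition Name="JR1" Release="4">
    <sus:Filters>
     <sus:UsageDateRange>
      <sus:Begin>{begin}</sus:Begin><sus:End>{end}</sus:End>
     </sus:UsageDateRange>
    </sus:Filters>
   </sus:ReportDefinition>
  </sus:ReportRequest>
 </SOAP-ENV:Body>
</SOAP-ENV:Envelope>"""

def build_request(requestor_id, customer_id, begin, end):
    """Fill the envelope with this library's profile values."""
    return SUSHI_ENVELOPE.format(requestor_id=requestor_id,
                                 customer_id=customer_id,
                                 begin=begin, end=end)

print(build_request("req-001", "cust-001", "2011-01-01", "2011-12-31"))
```

This is only the request side; the harvester must also parse the returned report and handle server timeouts, which is exactly the back-and-forth described on the "Profiling SUSHI" slide.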
    6. This slide contained a video demonstration of Usage Consolidation. If you are interested in seeing how Usage Consolidation works, contact an EBSCO representative.
    7. Acceptance Criteria for College Park
       • Loaded a file of ten titles with intentional ISSN and title-spelling errors to track how use was matched to titles on the A-to-Z/order list: matching was on e-ISSN or p-ISSN, and the match failed only if both were wrong; title errors had no effect when an ISSN was present
       • Loaded a larger set of titles and examined EBSCONET Subscription Management results for "comes with" titles: usage data for child titles correctly totaled on the parent record, with cost per use calculated at the parent level (originally this did not work correctly; EBSCO fixed it)
       • Membership packages don't always sum use to the parent (a bug that needs working out)
       • Major packages, e.g. the Freedom Collection, are not summed; this is challenging because these packages are different for each customer
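The matching behavior College Park verified — use is linked to an order by e-ISSN or p-ISSN, with title spelling ignored whenever an ISSN is present — can be sketched as follows. The data structures are hypothetical illustrations, not Usage Consolidation's internals:

```python
def normalize(issn):
    """Uppercase and strip hyphens so '1234-5678' and '12345678' compare equal."""
    return issn.replace("-", "").upper() if issn else None

def match_title(usage_row, orders):
    """Return the order matching a usage row, trying e-ISSN then p-ISSN.

    Title spelling is never consulted when an ISSN is present, mirroring
    the observed behavior: title errors had no effect, and the match
    failed only when both ISSNs were wrong.
    """
    for key in ("eissn", "pissn"):
        issn = normalize(usage_row.get(key))
        if issn:
            for order in orders:
                if issn in (normalize(order.get("eissn")), normalize(order.get("pissn"))):
                    return order
    return None  # both ISSNs wrong or absent: no match

orders = [{"title": "Journal of Examples", "eissn": "8765-4321", "pissn": "1234-5678"}]
# Misspelled title and bad e-ISSN, but a good p-ISSN: still matches.
row = {"title": "Jornal of Exmples", "eissn": "0000-0000", "pissn": "1234-5678"}
print(match_title(row, orders)["title"])  # Journal of Examples
```

Test files like College Park's ten-title set are exactly how you confirm a black-box matcher behaves this way, since the vendor's precedence rules are not otherwise visible.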
    8. Acceptance Criteria for College Park
       • Loaded data from multiple publisher-provided sites, e.g. HighWire and Ingenta Connect, to test whether cost-per-use calculations were based on total use across all publisher platforms. They are.
       • Loaded data for one title from the EBSCOhost platform to see how aggregator use was handled in EBSCONET Subscription Management. It is not included in the publisher cost-per-use calculation.
       • We advise testing any new usage tool that matches use to orders/costs to confirm the system behaves as advertised and expected.
    9. Acceptance Criteria for College Park
       • Data should be accessible to selectors within a tool they can learn easily.
       • Data should be extractable for further manipulation.
       • The tool should not require dependence on a local Microsoft Access guru; it should be sustainable without local support.
       • The tool should save staff time in matching journal usage with subscription costs.
       • Support should be available for detected bugs and any user problems.
       • SUSHI: nice, but not required.
    10. Profiling SUSHI at College Park
        • Required a bit more back-and-forth with publishers than we were willing to do
        • Publishers' servers sometimes timed out before data was retrieved, so harvests had to be re-scheduled
        • The latest release allows manual retrieval (a useful option when needed)
        • Matching "exceptions" to link use to cost would need to be done monthly; not appealing, as we prefer a one-time gathering of full-calendar-year statistics
        • The latest release allows auto-completion of SUSHI loads, so all matched data goes into UC and EBSCONET immediately; unmatched titles can be worked later
    11. Presenting UC to subject librarians @ CP
        • Loaded three years of data for: Elsevier, Springer, Wiley, Taylor and Francis, Oxford (HighWire and Ingenta Connect), Sage, Cambridge, and the Royal Society of Chemistry
        • The CPU calculation is done against whichever order year you are viewing, even if you have chosen a different usage-statistics year.
    12. Presenting UC to subject librarians @ CP
        • Ignore the "All Platforms" cost-per-use calculation.
        • The Subscription Usage Details report gives only the last completed year's use and the current year's cost.
        • The CPU calculation for "child"/"comes with" package titles is problematic: child titles have sometimes been treated, inappropriately, as parent records of a sub-package, creating incorrect usage reporting. Overall cost per use is correct.
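The parent/child ("comes with") arithmetic behind these slides amounts to rolling child-title usage up to the parent before dividing the package cost. A minimal sketch under that assumption, with an invented package price and usage figures:

```python
def cost_per_use(parent_cost, parent_use, child_uses):
    """CPU at the parent level: the package cost divided by the summed
    use of the parent title and all of its 'comes with' children."""
    total_use = parent_use + sum(child_uses)
    if total_use == 0:
        return None  # avoid dividing by zero for an unused package
    return round(parent_cost / total_use, 2)

# Hypothetical package: $1,200 subscription, parent title used 100 times,
# two child titles used 40 and 60 times -> 1200 / 200 uses.
print(cost_per_use(1200.00, 100, [40, 60]))  # 6.0
```

The bug described on slide 12 corresponds to a child title being treated as its own parent: its use is then divided against the wrong (or no) cost, which is why the per-title CPU can be wrong even while the overall package CPU is correct.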
    13. UMCP Reports for Selectors
        • Usage and CPU for EBSCO-subscribed titles within Subscription Management
        • Searched by fund code within the current subscription year
        • The 'i' button opens a mouse-over with the latest year's use and the A-to-Z holdings list
    14. UMCP Reports for Selectors
        • Usage and CPU for EBSCO-subscribed titles within Subscription Management
        • Use on multiple publisher platforms is summed (yellow highlighting)
        • Aggregator use is kept separate from publisher-platform use, but the cost-per-use calculation across all platforms is meaningless
    15. UMCP Reports for Selectors
        • Usage and CPU within Subscription Management: the Subscription Usage Details report
        • Acquisitions will likely export this report and add additional years of cost data, using the PO number to match our ILS data.
    16. Subject Librarian Feedback @ CP
        • What do you like about EBSCONET Usage Consolidation? Seeing cost and usage data in one place for a journal title.
        • What do you dislike about EBSCONET Usage Consolidation? Interface issues, content issues, and the complexity involved in selecting the order year that corresponds to the use year.
        • Will it be useful in serials review? Yes, if CPU is correct; concern over bundled titles.
        • Should we continue to load publisher-platform usage for all EBSCO-subscribed publishers, not just the pilot publishers? Yes.
        • Should we continue to load EBSCOhost or other aggregator database usage into Usage Consolidation? Yes.
    17. Presenting UC at Salisbury
    18. Presenting UC at Salisbury
    19. Presenting UC at Salisbury
    20. Subject liaison feedback at Salisbury
        • Useful for database renewals
        • Too cumbersome at the title level
        • Reactive versus proactive
    21. Library administration feedback at Salisbury
        • Useful for generating hard data
        • More efficient for collection development
    22. Conclusion
        • Selectors want CPU data that is easily understandable
        • Reasonably successful in reporting usage and cost per use
        • College Park found its acceptance criteria largely met and is confident the remaining issues will be resolved
        • College Park will add all the data it can; Systems will then load the data into a locally tweaked version of North Carolina State's Collection Management Review tool
        • Implementation continues at Salisbury
    23. Questions? Contact:
        • Leigh Ann DePope (
        • Mark Hemhauser (
        • Rebecca Kemp (