
Effectively Applying Usage Statistics in E-Resource Collection Development


  1. Effectively Applying Usage Statistics in E-Resource Collection Development
     Using Evidence and Outreach in Decision-Making
     ACRL-MD – New Identities: Adapting the Academic Library
     November 14, 2014
     Randy Lowe – Collection Development, Acquisition & Serials Librarian, Frostburg State University
  2. Overview
     • Why E-Resources Assessment?
     • Usage Statistics – Types, Reports, Collection
     • Assessment: Evidence & Outreach
       ◦ Applying usage statistics to collection management decision-making
       ◦ Engaging librarians, faculty and administrators in the process
  3. Why E-Resource Assessment?
     • Libraries have historically measured use of services (circulation statistics, re-shelving counts, gate counts, etc.)
     • The technology upon which e-resources reside inherently allows for extensive collection of usage data – and assessment of that use
     • Assessment of use data supports evidence-based collection management
     • Libraries operate in a challenging fiscal environment – demonstrating e-resource value and fiscal responsibility is a must
  4. Effective E-Resources Assessment
     • Two essential elements of effective e-resource assessment:
       ◦ Efficient and accurate data collection
       ◦ Clear and succinct analysis
     • E-resource assessment is more than just collecting usage statistics – it is applying them to make sound management decisions about library resources
     • Usage statistics measure the volume, not the value, of resources
  5. What Can You Do with E-Resources Usage Statistics?
     • Track usage / assess overall collection use
     • Track expenditures / figure cost-per-use (see the sketch after this list)
     • Track turnaways
     • Assess title, subject, publisher and other usage elements
     • Identify user behavior trends
     • Assist in making collection development decisions, including acquisition model selection
     • Effectively advocate for resources – especially if assessment is tied to institutional goals/strategic plans, curricular initiatives and student learning goals
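Cost-per-use is the simple arithmetic behind several of the items above: expenditure divided by recorded use. A minimal Python sketch, assuming a locally compiled CSV named usage_costs.csv with title, annual_cost and annual_uses columns (the file and column names are illustrative, not from the presentation):

    import csv

    # Compute cost-per-use per title from a locally compiled CSV.
    with open("usage_costs.csv", newline="") as f:
        for row in csv.DictReader(f):
            uses = int(row["annual_uses"])
            cost = float(row["annual_cost"])
            # Titles with zero recorded use cannot yield a cost-per-use figure
            cpu = cost / uses if uses else None
            print(row["title"], f"${cpu:.2f}" if cpu is not None else "no recorded use")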
  6. Types of Usage Statistics Reports and When to Use Them
     • Vendor-Defined
       ◦ Analyzing usage data from a single vendor
       ◦ Obtaining cost information
       ◦ Comprehensive data files make it easy to analyze combinations of various data elements [Example]
       ◦ When COUNTER reports do not provide adequate detail
     • COUNTER-Compliant (see the sketch after this list)
       ◦ Analyzing usage data across multiple vendors
       ◦ Ensuring data integrity through adherence to recognized standards
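Because COUNTER-compliant reports share a common tabular shape, usage from multiple vendors can be combined programmatically. A sketch under stated assumptions: the inputs are COUNTER Release 4 CSV exports whose column-heading row contains "Reporting Period Total", the title sits in the first column, and the file names are placeholders; exact layouts vary by vendor and COUNTER release.

    import csv

    def read_counter(path):
        """Return {title: reporting period total} from one COUNTER CSV export."""
        with open(path, newline="") as f:
            rows = list(csv.reader(f))
        # Locate the column-heading row rather than assuming a fixed layout
        for i, row in enumerate(rows):
            if "Reporting Period Total" in row:
                col = row.index("Reporting Period Total")
                return {
                    r[0]: int(r[col])
                    for r in rows[i + 1:]
                    if len(r) > col and r[col].isdigit()
                    and not r[0].lower().startswith("total")  # skip the summary row
                }
        raise ValueError(f"No COUNTER heading row found in {path}")

    usage = {}
    for path in ["vendor_a_jr1.csv", "vendor_b_jr1.csv"]:  # placeholder file names
        usage.update(read_counter(path))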
  7. Collecting Usage Data
     • Define objectives
       ◦ What you need to know, or are trying to find out, should drive your data collection decisions
       ◦ Collecting usage statistics can be a major time commitment
     • Use your assessment objectives to determine not only what data to collect, but also when you have collected enough data to analyze
     • Properly balancing the time and resources dedicated to data collection and analysis is vital
  8. Collecting Usage Data
     • Vendors present data differently – a challenge not only across vendors, but even when combining data elements from a single vendor
     • Manipulating and formatting the raw data will likely be necessary
     • Example: COUNTER BR1 report + acquisition type data + manually compiled cost data = data for assessment (see the sketch after this list)
     • Schedule time(s) to collect data
     • Vendors' archival policies for maintaining usage statistics vary
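One way to assemble the combined data set in the example above, sketched with pandas. The three input files stand in for a COUNTER BR1 export reduced to title/total pairs plus two manually maintained spreadsheets; every file and column name is an assumption for the sketch.

    import numpy as np
    import pandas as pd

    # Join COUNTER BR1 usage totals with manually compiled
    # acquisition-type and cost data into one assessment file.
    br1 = pd.read_csv("br1_totals.csv")        # columns: title, total_uses
    acq = pd.read_csv("acquisition_type.csv")  # columns: title, acq_type
    cost = pd.read_csv("costs.csv")            # columns: title, cost

    merged = br1.merge(acq, on="title", how="left").merge(cost, on="title", how="left")
    # Avoid dividing by zero for titles with no recorded use
    merged["cost_per_use"] = merged["cost"] / merged["total_uses"].replace(0, np.nan)
    merged.to_csv("assessment_data.csv", index=False)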
  9. Assessing Usage Data
     You have usage data – what do you do with it?
     • It is easy to get overwhelmed by usage data – analysis should be guided by your assessment objectives
       ◦ What do you want/need to assess?
       ◦ What questions are you trying to answer?
       ◦ Who is your audience?
     • Have a purpose for using your data
 10. Assessing Usage Data
     • Assessment is most powerful when it is tied to an action or potential action (including requests)
     • There is no single method for assessing usage statistics in every case – the "right data" to analyze and include in your report is that which will support your assessment objectives
 11. Usage Data Analysis
     • Data analysis should be thorough, but presented succinctly
     • Conclusions, trends, etc. should be clear and verifiable
     • Beware of preconceived notions, perceptions and opinions – hypotheses can be both proven and refuted
     • State the known limitations of the data you have collected and how they may affect your analysis
 12. Using/Applying Evidence: Writing Your Report
     • Know your audience
     • Include a brief purpose/introduction
     • Write clearly and succinctly
     • Reported usage data should support the purpose of the assessment
       ◦ Only include data that supports your stated objectives – don't include all collected data; it won't be read by administrators
 13. Using/Applying Evidence: Writing Your Report
     • Reported usage data should support the purpose of the assessment (continued)
       ◦ Include data within the text of your report where it is necessary and provides clear evidence for the points you are making
       ◦ It is usually more effective to include visual representations (charts, graphs) rather than just figures within the text of reports (see the sketch after this list)
       ◦ Larger tables and data sets, if they must be included, are best placed in appendices
     • Conclusions and recommendations should be easily identified and based on the evidence presented
     • State the desired action and/or response clearly
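As one illustration of the charts-over-raw-figures advice, a short matplotlib sketch; the database names and search counts are invented solely so the example runs.

    import matplotlib.pyplot as plt

    # Invented sample data: searches per database for one fiscal year
    databases = ["Database A", "Database B", "Database C"]
    searches = [5200, 3100, 800]

    plt.bar(databases, searches)
    plt.ylabel("Searches")
    plt.title("Database Use, FY2014")
    plt.tight_layout()
    plt.savefig("database_use.png")  # embed the image in the report body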
 14. Using/Applying Evidence: The Frostburg Experience
     • Effectively applying e-resources data to collection management has been an evolution
     • The lay of the land – 2007:
       ◦ We had data (searches & link resolver)
       ◦ Study to compare journal costs by format
       ◦ Data sat in a vacuum outside of annual database budgeting
     • Needed to establish a frame of reference to begin applying usage statistics in engaging faculty and administrators
 15. Evidence & Outreach Example 1: Faculty Survey, 2007-2008
     • Faculty had not previously been engaged systematically in collection development efforts
     • User behavior demonstrated in link resolver statistics indicated that users preferred online full text
     • The library determined that periodicals and standing orders should be migrated to online format – but which ones?
     • Fall 2007: faculty surveyed regarding the value (content) and usefulness (format) of journals, standing orders and databases
     • Spring 2008: survey results matched link resolver usage statistics
     • Subscription cancellations, additions and format migrations made over the next 5 years
 16. Evidence & Outreach Example 2: Underutilized Journals
     • The library began collecting full-text article retrievals in 2009-2010 (and re-shelving counts in 2011-2012)
     • All journal subscriptions are reviewed by librarians annually
     • Faculty are involved in a second level of review for underutilized subscriptions (see the sketch after this list)
     • The objective is to use the process as a means of continued dialogue with faculty in collection development
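A hypothetical filter for the annual review described above: flag titles whose combined electronic and print use falls below a cutoff, then route that list to faculty for second-level review. The threshold, file and column names are assumptions, not Frostburg's actual criteria.

    import pandas as pd

    REVIEW_THRESHOLD = 10  # assumed cutoff; set locally

    # columns assumed: title, ft_retrievals, reshelves
    journals = pd.read_csv("journal_use.csv")
    journals["total_use"] = journals["ft_retrievals"] + journals["reshelves"]

    # Titles below the threshold go to faculty for second-level review
    flagged = journals[journals["total_use"] < REVIEW_THRESHOLD]
    flagged.to_csv("faculty_review_list.csv", index=False)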
 17. Evidence & Outreach Example 3: Collaboration with Academic Departments
     • Academic departments are becoming increasingly engaged in e-resource subscription discussions, including funding
       ◦ Chemistry – CAS SciFinder
       ◦ Visual Arts – Artstor
     • The current collaboration is with Biology
       ◦ The department was not satisfied with current e-resources
       ◦ No funds were available for additional resources
       ◦ Use of current journal subscriptions and the content of requested databases were reviewed
       ◦ The department suggested journal cancellations to fund the databases
       ◦ New e-resource scenarios were developed
 18. Evidence & Outreach Example 4: E-Book Assessment
     • Frostburg State University: report overall use of and expenditures on e-books over time; implement the most cost-effective DDA acquisition model(s) [Report] (see the sketch after this list)
     • USMAI Consortial E-Book Pilot: assess the effectiveness of a specific DDA acquisition model for the consortium; track use and expenditures by consortium members and user types; identify possible future program funding models [Report]
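A sketch of the kind of comparison the e-book assessments above set out to make: aggregate cost per use by DDA acquisition model. The transaction file, its columns and the model labels are placeholders, not the actual Frostburg or USMAI data.

    import pandas as pd

    # columns assumed: model, cost, uses (one row per DDA transaction)
    events = pd.read_csv("dda_transactions.csv")

    # Total expenditure and use per acquisition model, then cost per use
    by_model = events.groupby("model").agg(total_cost=("cost", "sum"),
                                           total_uses=("uses", "sum"))
    by_model["cost_per_use"] = by_model["total_cost"] / by_model["total_uses"]
    print(by_model.sort_values("cost_per_use"))  # cheapest model per use first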
 19. Thank You
     • Questions?
     • Contact Information:
       Randy Lowe
       Frostburg State University
       rlowe@frostburg.edu
