Cost-per-use vs. hours-per-report: usage data collection and the value of staff time


Cost-per-use for electronic journals has become a common standard for judging the value of individual titles, but the reports needed to make such judgments can be complex to create. Different options exist for collecting, collating and reporting the necessary data. This session will look at the costs estimated for the in-house process followed at the University of Mississippi, and how those costs in personnel time compared to pricing from outside vendors. It will also report on a survey of other libraries that use outside vendors to judge the perceived value of those services.
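
A minimal sketch of the cost-per-use arithmetic the abstract refers to: the annual subscription cost divided by the reported use count (typically COUNTER full-text requests). The titles, costs, and use counts below are hypothetical examples, not figures from the presentation.

```python
# Cost-per-use sketch: annual subscription cost divided by reported uses.
# All titles and numbers here are hypothetical illustrations.

def cost_per_use(annual_cost: float, uses: int) -> float:
    """Return cost per use; report infinity for a title with zero recorded use."""
    return annual_cost / uses if uses else float("inf")

journals = [
    {"title": "Journal A", "annual_cost": 1200.00, "uses": 480},
    {"title": "Journal B", "annual_cost": 950.00, "uses": 12},
    {"title": "Journal C", "annual_cost": 300.00, "uses": 0},
]

for j in journals:
    print(f'{j["title"]}: ${cost_per_use(j["annual_cost"], j["uses"]):,.2f} per use')
```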

The survey data reported in the presentation is available upon request from the presenter.

Presenter:
Christina Torbert
Head of Continuing Resources, University of Mississippi
ctorbert@olemiss.edu

Cost-per-use vs. hours-per-report: usage data collection and the value of staff time

  1. COST-PER-USE VS. HOURS-PER-REPORT: USAGE STATISTICS COLLECTION AND THE VALUE OF STAFF TIME. Presentation for NASIG 2014 Annual Conference. Christina Torbert, Head of Continuing Resources, University of Mississippi
  2. Starting Question: Which process is more cost-effective, gathering electronic resource usage data in-house or outsourcing data collection to a vendor?
  3. Other Questions • Is vendor-collected data as reliable and consistent as library-collected data? • What about the cost of the other work involved in producing reports from that data?
  4. Survey of libraries that collect usage data • 96% of libraries have collected usage data at least once • 59% of those libraries collect data on more than 75% of their subscriptions • 82% collected data from more than half of their subscriptions • 47% of those libraries used a third-party service to help with data collection
  5. Survey results continued • What percentage of data was collected by vendor? • Majority of libraries that use a vendor also do some in-house data collection [Chart categories: 1-25%, 26-50%, 51-75%, 76-100%]
  6. Survey results continued • Who from the library participated in collecting usage data? • Bulk of the work is done by professional librarians • Assume that many libraries use a combination of people [Chart categories: Librarians, Library staff, Students]
  7. Experience at Mississippi • Carnegie High Research institution with continuing resources budget of about $4 million • Continuing Resources Department consists of two librarians and four paraprofessionals • Two paraprofessionals are Specialists and two are Senior Library Assistants • ER Librarian’s part of the project, focused entirely on databases, is not included in this report
  8. Experience at Mississippi • Time spent on collecting data • Librarian – 24 hours • ER Specialist – 120 hours • Subscription Specialist – 3 hours • Senior Library Assistants – 9 hours • Total cost – $2,519.19 (A rough hours-to-cost sketch follows the transcript.)
  9. Experience at Mississippi • What were we trying to collect? • Total titles: 2,264 • Total publisher platforms: 114 • 44 platforms did not provide any statistics • 9 platforms provided non-COUNTER statistics • 61 platforms with COUNTER statistics
  10. Experience at Mississippi • What went right? • Single largest publisher = 18% of titles • Top two publishers = 30% of titles • Top five publishers = 67% of titles (almost 10% of COUNTER platforms) • Top 20% of platforms = 85% of titles, collected by non-ER staff in 12 hours for $160
  11. Third-party vendor costs • Contacted three vendors • None provide collection-only services • One service separates out collection costs at $100/platform • All include some level of consolidation, smoothing of match points, and reduction of duplication • Also include tools for loading cost data and running analysis reports
  12. Survey results continued • How do vendor-collected data compare to in-house-collected data?
      Quality          Vendor   In-house
      Reliable         3.55     3.62
      Accurate         3.64     3.62
      Easy to obtain   3.25     2.79
      Good value       3.41     3.48
      *Median values on a five-point Likert scale
  13. Survey results continued • Categories of problems reported
  14. Quotes from survey • “Time consuming and requires library staff to maintain and manage usage statistics access information. In some cases what is not collected by our vendor can sometimes be difficult for library staff to collect. It is a small portion of our total stats needed so we are sometimes expending a lot of energy for a little return.” • “Vendors are continually changing how to get to the data and logging in sometimes becomes a barrier.”
  15. Quotes from survey • “Outdated username/passwords usually involve a lot of manhours to resolve – transferring the admin rights from one individual (who usually no longer works at the institution) to another staff person.” • “There are frequently problems with the data such as noticeable outliers which suggest problems with the data collection and additionally outages acknowledged by the publisher.”
  16. Quotes from survey • “There are always discrepancies between COUNTER stats and vendor-reported stats, even from the same vendor – sometimes seems one is counting apples, the other oranges.” • “What barriers didn’t we experience: inconsistent numbers from month to month (random extreme fluctuations), difficulty finding access to vendor-provided stats, difficulty understanding how to request reports from third-party stats collector and poor communication with their trainer when finally did figure out how to request assistance; despite the use of protocols like COUNTER, there are still wide fluctuations in number from vendor to vendor.”
  17. Survey results continued • What do libraries do with the data collected? • 72% of respondents did cost-per-use analysis • 92% of respondents did the matching of cost and use data in-house
  18. Experience at Mississippi • Two paraprofessionals produced a spreadsheet of cost and ISSNs: 64 hours = $850.81 • Librarian collected print volume/issue-level usage: 20 hours = $577 • Librarian collated and corrected the report: 72 hours = $2,077.20
  19. Experience at Mississippi
      Collecting ejournal usage data    $2,591.19
      Collecting print usage data       $  577.77
      Collecting cost data              $  850.81
      Finalizing cost-per-use report    $2,077.20
      Total process                     $6,126.19
  20. Vendor costs • Vendor A: quote based on FTE, unlimited platforms: $10,750/yr + $700 set-up fee • Vendor B: consolidation tool = $2,705 + collection service = $100/platform (min. five platforms) • Vendor C: priced by number of platforms: up to 5 = $3,494; up to 10 = $4,732; up to 15 = $7,036; up to 25 = $11,513; up to 40 = $16,906; etc.
  21. Cost comparison • Collection, data matching, and report for 60 platforms:
      Vendor A:         $11,450
      Vendor B:         $ 8,750
      Vendor C:         $21,420
      In-house at UM:   $ 6,126
      (A vendor-pricing sketch follows the transcript.)
  22. Survey results continued • Who saw this data or report? [Bar chart of counts; categories: Other, Library committee, Staff, Students/Patrons, Librarians, Faculty, Library administration, Institutional administration]
  23. Survey results continued • Who made decisions based on this report? [Bar chart of counts; same categories as the previous slide: Other, Library committee, Staff, Students/Patrons, Librarians, Faculty, Library administration, Institutional administration]
  24. Survey results continued • How was time spent over the process?
      Action                     Average Value (0-100)   Standard Deviation
      Collection of usage data   41.12                   26.47
      Data matching               9.49                   11.92
      Quality control             7.26                    7.29
      Analysis                   16.02                   12.74
      Sharing/Publicity           4.84                    5.09
      Decision making             9.7                     8.89
  25. Survey results compared to Mississippi’s process
      Action                     Survey Average Result   UM’s Time Analysis
      Collection of usage data   41.12                   47
      Data matching               9.49                   17
      Quality control             7.26                   17
      Analysis                   16.02                    0
      Sharing/Publicity           4.84                    2
      Decision making             9.7                    14
  26. Conclusions • For UM, in-house collection and reporting is the most cost-effective method • Vendor-collected usage data is considered as accurate as in-house-collected data • Publishers/platforms are not considered reliable reporters • Most librarians are still doing a great deal of data checking and data matching to produce reports
  27. THANK YOU. Questions? Christina Torbert, ctorbert@olemiss.edu
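
The staff-time figures on slides 8 and 18 reduce to hours multiplied by what each person's time is worth. The sketch below (referenced from slide 8) shows that arithmetic; the hourly rates are hypothetical placeholders, since the slides report only hours and dollar totals, so the output is illustrative rather than a reconstruction of UM's actual figures.

```python
# Converting staff hours into a dollar cost for the usage-data collection step.
# Hours come from slide 8; the hourly rates below are hypothetical and are NOT
# the University of Mississippi's actual pay rates.

hours_by_role = {
    "Librarian": 24,
    "ER Specialist": 120,
    "Subscription Specialist": 3,
    "Senior Library Assistants": 9,
}

assumed_hourly_rate = {  # hypothetical rates, for illustration only
    "Librarian": 28.00,
    "ER Specialist": 15.00,
    "Subscription Specialist": 15.00,
    "Senior Library Assistants": 13.00,
}

total = 0.0
for role, hours in hours_by_role.items():
    cost = hours * assumed_hourly_rate[role]
    total += cost
    print(f"{role}: {hours} h x ${assumed_hourly_rate[role]:.2f}/h = ${cost:,.2f}")

print(f"Estimated cost of collecting e-journal usage data: ${total:,.2f}")
```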
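
Slides 20 and 21 compare vendor quotes with the in-house total for 60 platforms (the vendor-pricing sketch referenced from slide 21). Vendor A's flat quote plus set-up fee and Vendor B's per-platform formula are taken from slide 20, and the in-house total from slide 19; Vendor C is omitted because slide 20 does not list its tier for 60 platforms. The computed Vendor B figure is a straight application of the stated formula and may differ slightly from the quote shown on slide 21.

```python
# Sketch of the slide-21 comparison for a chosen platform count.
# Pricing rules are taken from slide 20; actual quotes on slide 21 may
# include adjustments these simple formulas do not capture.

PLATFORMS = 60

vendor_a = 10_750 + 700                      # flat annual fee plus one-time set-up fee
vendor_b = 2_705 + 100 * max(PLATFORMS, 5)   # consolidation tool plus $100/platform (min. 5)
in_house = 6_126.19                          # UM's total for the full in-house process (slide 19)

options = {
    "Vendor A": vendor_a,
    "Vendor B": vendor_b,
    "In-house at UM": in_house,
}

for name, cost in sorted(options.items(), key=lambda kv: kv[1]):
    print(f"{name}: ${cost:,.2f}")
```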
