Presentation delivered at the UKSG Usage Statistics for Decision Making workshop. Held at the Institute of Materials, Minerals and Mining, London. 2 February 2012.
Selena Killick, Senior Library Manager (Quality & Insight) at The Open University
Evaluating the Big Deal: Usage Statistics for Decision Making
1. Evaluating the Big Deal:
What metrics matter?
Usage Statistics for Decision Making
London, 2 February 2012
Selena Killick
Library Quality Officer
2. Introduction
• Institutional, financial and strategic context
• Evaluating the Big Deals
• Quantitative Reporting
• Qualitative Measures
• Using the Results
• Conclusions
3. Cranfield University
• The UK's only wholly postgraduate university focused on science, technology, engineering and management
• One of the UK's top five research-intensive universities
• Annual turnover £150m
• 40% of our students study whilst in employment
• We deliver the UK Ministry of Defence's largest educational contract
4. Key Drivers
• Financial realities
• Demonstrating value for money
• Strategic alignment
• Research Excellence Framework (REF)
• Income
Mission critical
• Reputation
9. Previous Techniques Used
Annual journals review using the following data:
• Circulation figures – issues and renewals
• “Sweep survey” to capture in-house use
• Journal contents page requests
• Download figures
• Journal prices vs. the cost of ILL requests
More recent focus on “cost per download”
10. New Approach
Quantitative:
• Size
• Usage
• Coverage
• Value for Money
Qualitative:
• Academic Liaison
• Reading Lists Review
• REF Preferred
11. Requirements
• Systematic
• Sustainable
• Internal benchmarking
• Elevator pitch
• So what?
• Enable informed decision making
• Demonstrate smart procurement
14. Our Approach
• What has everyone else done?
• Analysing Publisher Deals Project
• Storage centre
• Excel training
• Template design
15. Basic Metrics
• Number of titles within a package
• Total annual full-text downloads
• Cost:
• Core titles
• e-Access Fee
• Total costs
16. Value Metrics
• Average number of requests per title
• Average cost per title
• Total cost as % of information provision expenditure
• Cost per full-text download
• Average download per FTE student/staff/total
• Average cost per FTE student/staff/total
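The value metrics above are simple ratios over a package's totals. A minimal sketch of the arithmetic, using invented figures rather than Cranfield data:

```python
# Hypothetical sketch of the value metrics listed above.
# All figures are invented placeholders for illustration.

def value_metrics(total_cost, downloads, titles, info_budget, fte_total):
    """Return the per-title, per-download and per-FTE metrics for a package."""
    return {
        "requests_per_title": downloads / titles,
        "cost_per_title": total_cost / titles,
        "pct_of_info_budget": 100 * total_cost / info_budget,
        "cost_per_download": total_cost / downloads,
        "downloads_per_fte": downloads / fte_total,
        "cost_per_fte": total_cost / fte_total,
    }

metrics = value_metrics(total_cost=50_000, downloads=100_000,
                        titles=1_000, info_budget=1_000_000, fte_total=5_000)
print(f"Cost per download: £{metrics['cost_per_download']:.2f}")  # £0.50
```

In practice the same formulas would be repeated per FTE student and per FTE staff as well as for the combined total.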
18. Subscribed Titles
• Reviewing performance of core collection
• REF Preferred?
• Popular?
• Three year trends in cost / downloads / CPD
• Cost / Downloads / CPD categorised:
• Zero
• Low
• Medium
• High
• Cancel?
19. Popular Titles
• Which titles are the most popular?
• Top 30 titles in the package
• Three year trends in downloads
• REF Preferred?
• Subscribed title?
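Selecting the top titles by annual downloads is a simple ranking cut. A sketch with invented titles (the slide's actual cut-off is the top 30):

```python
# Hypothetical top-N cut over annual download counts per title.
import heapq

downloads = {"J. Invented A": 900, "J. Invented B": 1500, "J. Invented C": 300}
top = heapq.nlargest(2, downloads.items(), key=lambda kv: kv[1])
print([title for title, _ in top])  # ['J. Invented B', 'J. Invented A']
```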
20. Considerations
• When to measure from/to?
• calendar, financial/academic, or contract year?
• Which titles make up our core collection?
• Do we have access to all of the 'zero use' titles?
• What constitutes Low/Medium/High?
• What about the aggregator usage statistics?
• Do we trust the usage statistics?
• What is the size of the target population?
22. Downloading Statistics
• Get organised
• Gather your usernames and passwords
• Create local files to save and store usage reports
• Software now on the market to manage this for you
• Joint Usage Statistics Portal
25. Reporting Period
• Calendar, financial/academic, or contract year?
• COUNTER Reports = Calendar year
• Converted using VLOOKUP on ISSNs
• Manual
• Problematic
• Automatically converted using JUSP
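Because COUNTER JR1 reports run to the calendar year, an August-to-July academic or contract year has to be stitched together from the monthly columns of two reports. A rough sketch of that conversion, with invented monthly figures:

```python
# Hedged sketch of the calendar-to-academic-year conversion.
# {issn: {(year, month): downloads}} built from two calendar-year JR1 reports;
# the ISSN and the counts are invented examples.
monthly = {
    "1234-5678": {**{(2010, m): 10 for m in range(1, 13)},
                  **{(2011, m): 20 for m in range(1, 13)}},
}

def academic_year_total(issn, start_year, start_month=8):
    """Sum twelve months of use starting at start_month of start_year."""
    months = ([(start_year, m) for m in range(start_month, 13)] +
              [(start_year + 1, m) for m in range(1, start_month)])
    return sum(monthly[issn].get(ym, 0) for ym in months)

print(academic_year_total("1234-5678", 2010))  # Aug-Dec 2010 + Jan-Jul 2011 = 190
```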
26. Aggregator Usage Statistics
• Combining usage from publishers and aggregators at a title level
• Combined using Pivot Tables
• Manual
• Problematic
• Where possible combined using JUSP
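What the pivot-table step does, in essence, is sum full-text downloads for the same title (keyed on ISSN) across the publisher and aggregator reports. A small sketch with invented ISSNs and counts:

```python
# Rough equivalent of the pivot-table combination step; data are invented.
from collections import Counter

publisher_use = {"0001-0002": 120, "0003-0004": 45}
aggregator_use = {"0001-0002": 30, "0005-0006": 12}

# Counter addition sums values for matching keys and keeps unmatched ones.
combined = Counter(publisher_use) + Counter(aggregator_use)
print(combined["0001-0002"])  # 150: publisher and aggregator use combined
```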
28. Excel Template
• Two main data sources:
• COUNTER JR1
• Subscription agent financial report
• Automated as much as possible
• Match formulas working with ISSNs to link title price to usage/holdings
• All calculations are completed automatically when the data sources are added
31. Academic Liaison
• Who's using it?
• Why?
• How?
• What will be the impact if we cancel?
• Teaching?
• Research?
• How valuable is it?
32. Quantitative on the Qualitative
Analysis of the five REF Preferred Recommended Journals Lists:
• Overlapping titles
• Unsubscribed titles
• Financial shortfall
• Current recommended subscribed titles
• Usage data
33. Reading List Review
Qualitative analysis on course reading lists:
• What are our academics recommending?
• Where is it published?
• How often is it recommended?
• Are there alternatives?
36. What they can do:
• Both qualitative and quantitative measures tell the story of the resource
• Aid decision making
• Justify procurement
• Safeguard budgets…?