Evaluating the Big Deal: Usage Statistics for Decision Making

Presentation delivered at the UKSG Usage Statistics for Decision Making workshop, held at the Institute of Materials, Minerals and Mining, London, 2 February 2012.

Transcript

  • 1. Evaluating the Big Deal: What metrics matter? Usage Statistics for Decision Making, London, 2 February 2012. Selena Killick, Library Quality Officer
  • 2. Introduction • Institutional, financial and strategic context • Evaluating the Big Deals • Quantitative Reporting • Qualitative Measures • Using the Results • Conclusions
  • 3. Cranfield University • The UK's only wholly postgraduate university focused on science, technology, engineering and management • One of the UK's top five research-intensive universities • Annual turnover £150m • 40% of our students study whilst in employment • We deliver the UK Ministry of Defence's largest educational contract
  • 4. Key Drivers • Financial realities • Demonstrating value for money • Strategic alignment • Research Excellence Framework (REF) • Income • Mission critical • Reputation
  • 5. Expenditure on Journals [chart: journal spend by year, 2006-07 to 2009-10]
  • 6. Information Expenditure by Format 2010-11 [pie chart: Journals 68%, Databases 24%, Books 4%, eBooks 4%]
  • 7. Information Expenditure by Format 2010-11 [pie chart: Big Deals 37%, Journals 31%, Databases 24%, Books 4%, eBooks 4%]
  • 8. Evaluating the ‘Big Deals’
  • 9. Previous Techniques Used: Annual journals review using the following data • Circulation figures – issues and renewals • “Sweep survey” to capture in-house use • Journal contents page requests • Download figures • Journal prices vs. the cost of ILL requests. More recent focus on “cost per download”
  • 10. New Approach • Quantitative: Size, Usage, Coverage, Value for Money • Qualitative: Academic Liaison, Reading Lists Review, REF Preferred
  • 11. Requirements • Systematic • Sustainable • Internal benchmarking • Elevator pitch • So what? • Enable informed decision making • Demonstrate smart procurement
  • 12. Quantitative Reporting
  • 13. Brought to you by the letters… &
  • 14. Our Approach • What has everyone else done? • Analysing Publisher Deals Project • Storage centre • Excel training • Template design
  • 15. Basic Metrics • Number of titles within a package • Total annual full-text downloads • Cost: • Core titles • e-Access Fee • Total costs
  • 16. Value Metrics • Average number of requests per title • Average cost per title • Total cost as % of information provision expenditure • Cost per full-text download • Average download per FTE student/staff/total • Average cost per FTE student/staff/total
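A rough worked example of these value metrics in Python; all figures are hypothetical stand-ins for one package's JR1 totals and costs:

    # Illustrative value-metric arithmetic for one Big Deal (hypothetical figures).
    titles = 1800            # titles in the package
    downloads = 95_000       # total annual full-text downloads (JR1)
    total_cost = 120_000.00  # core titles + e-access fee, GBP
    fte_students, fte_staff = 4_500, 1_500
    fte_total = fte_students + fte_staff

    print(f"Requests per title: {downloads / titles:.1f}")
    print(f"Cost per title:     £{total_cost / titles:.2f}")
    print(f"Cost per download:  £{total_cost / downloads:.2f}")
    print(f"Downloads per FTE:  {downloads / fte_total:.1f}")
    print(f"Cost per FTE:       £{total_cost / fte_total:.2f}")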
  • 17. The Long Tail [charts: downloads by title, illustrating packages with a long tail, a short tail and no tail]
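The long-tail check itself is easy to script. A minimal pandas sketch on hypothetical per-title download counts, showing how concentrated use is in the top titles and how many titles go unused:

    import pandas as pd

    # Hypothetical per-title annual downloads from a COUNTER JR1 report.
    downloads = pd.Series([4200, 1850, 900, 410, 160, 75, 30, 12, 5, 0],
                          index=[f"Journal {c}" for c in "ABCDEFGHIJ"])

    ranked = downloads.sort_values(ascending=False)
    share = ranked.cumsum() / ranked.sum()
    top_n = max(1, int(len(ranked) * 0.2))  # top 20% of titles
    print(f"Top {top_n} titles account for {share.iloc[top_n - 1]:.0%} of downloads")
    print(f"Titles with zero use: {(ranked == 0).sum()}")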
  • 18. Subscribed Titles • Reviewing performance of core collection • REF Preferred? • Popular? • Three year trends in cost / downloads / CPD • Cost / Downloads / CPD categorised: • Zero • Low • Medium • High • Cancel?
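The Zero/Low/Medium/High banding is a one-line cut over the annual counts. In this sketch the cut-offs are assumptions chosen for illustration; slide 20 leaves "what constitutes Low/Medium/High" as an open question:

    import pandas as pd

    # Hypothetical annual downloads for four subscribed titles, keyed on ISSN.
    downloads = pd.Series({"1234-5678": 0, "2345-6789": 45,
                           "3456-7890": 310, "4567-8901": 1250})

    # Assumed thresholds: 0 = Zero, 1-100 = Low, 101-500 = Medium, 500+ = High.
    bands = pd.cut(downloads, bins=[-1, 0, 100, 500, float("inf")],
                   labels=["Zero", "Low", "Medium", "High"])
    print(bands)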
  • 19. Popular Titles • Which titles are the most popular? • Top 30 titles in the package • Three year trends in downloads • REF Preferred? • Subscribed title?
  • 20. Considerations • When to measure from/to? • Calendar, financial/academic, or contract year? • Which titles make up our core collection? • Do we have access to all of the ‘zero use’ titles? • What constitutes Low/Medium/High? • What about the aggregator usage statistics? • Do we trust the usage statistics? • What is the size of the target population?
  • 21. Capturing the data
  • 22. Downloading Statistics • Get organised • Gather your usernames and passwords • Create local files to save and store usage reports • Software now on the market to manage this for you • Joint Usage Statistics Portal
  • 23. Introductory Workshop, April 18, Birmingham
  • 24. Reporting Period • Calendar, financial/academic, or contract year? • COUNTER Reports = Calendar year • Converted using VLOOKUP on ISSNs • Manual • Problematic • Automatically converted using JUSP
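Sketched in pandas rather than VLOOKUP, the manual conversion amounts to joining two calendar-year JR1 extracts on ISSN and summing the months that fall inside the contract year (here an assumed August-July year; all counts are hypothetical, and JUSP automates this step):

    import pandas as pd

    # Hypothetical monthly JR1 extracts for two calendar years, keyed on ISSN.
    jr1_2010 = pd.DataFrame({"issn": ["1234-5678", "2345-6789"],
                             "Aug": [40, 10], "Sep": [55, 12], "Oct": [70, 9],
                             "Nov": [65, 11], "Dec": [50, 8]})
    jr1_2011 = pd.DataFrame({"issn": ["1234-5678", "2345-6789"],
                             "Jan": [60, 14], "Feb": [58, 10], "Mar": [66, 13],
                             "Apr": [49, 7], "May": [52, 9], "Jun": [44, 6],
                             "Jul": [38, 5]})

    # Join on ISSN (the VLOOKUP step), then total August 2010 - July 2011.
    academic = jr1_2010.merge(jr1_2011, on="issn", how="outer").fillna(0)
    month_cols = [c for c in academic.columns if c != "issn"]
    academic["aug_jul_total"] = academic[month_cols].sum(axis=1)
    print(academic[["issn", "aug_jul_total"]])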
  • 25. Aggregator Usage Statistics • Combining usage from publishers and aggregators at a title level • Combined using Pivot Tables • Manual • Problematic • Where possible combined using JUSP
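The pivot-table step translates directly to pandas. A sketch on hypothetical rows: stack publisher and aggregator JR1 lines, pivot to one row per ISSN, and add a combined total:

    import pandas as pd

    # Hypothetical usage rows drawn from a publisher JR1 and an aggregator JR1.
    usage = pd.DataFrame({
        "issn":      ["1234-5678", "1234-5678", "2345-6789"],
        "source":    ["publisher", "aggregator", "publisher"],
        "downloads": [310, 120, 45],
    })

    # One row per title, one column per source, plus the combined total.
    combined = usage.pivot_table(index="issn", columns="source",
                                 values="downloads", aggfunc="sum", fill_value=0)
    combined["total"] = combined.sum(axis=1)
    print(combined)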
  • 26. Analysing the data
  • 27. Excel Template • Two main data sources: • COUNTER JR1 • Subscription agent financial report • Automated as much as possible • Match formulas working with ISSNs to link title price to usage/holdings • All calculations are completed automatically when the data sources are added
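The template's core linkage, sketched in pandas under the same idea: match the subscription agent's price file to JR1 usage on ISSN and derive cost per download, leaving zero-use titles blank rather than dividing by zero (column names and figures are hypothetical):

    import pandas as pd

    # Hypothetical extracts: JR1 usage and the subscription agent's price report.
    jr1 = pd.DataFrame({"issn": ["1234-5678", "2345-6789", "3456-7890"],
                        "downloads": [1250, 45, 0]})
    prices = pd.DataFrame({"issn": ["1234-5678", "2345-6789", "3456-7890"],
                           "price": [990.0, 1200.0, 640.0]})

    # The ISSN match (VLOOKUP equivalent) plus the derived cost-per-download.
    report = jr1.merge(prices, on="issn", how="left")
    nonzero = report["downloads"].where(report["downloads"] > 0)
    report["cost_per_download"] = report["price"] / nonzero
    print(report)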
  • 28. Quantitative Reporting • Systematic • Sustainable • Internal benchmarking • Elevator pitch • So what? • Enable informed decision making • Demonstrate smart procurement
  • 29. Qualitative Measures
  • 30. Academic Liaison • Who’s using it? • Why? • How? • What will be the impact if we cancel? • Teaching? • Research? • How valuable is it?
  • 31. Quantitative on the Qualitative: Analysis of the five REF Preferred Recommended Journals Lists: • Overlapping titles • Unsubscribed titles • Financial shortfall • Current recommended subscribed titles • Usage data
  • 32. Reading List Review • Qualitative analysis of course reading lists: • What are our academics recommending? • Where is it published? • How often is it recommended? • Are there alternatives?
  • 33. Quantitative & Qualitative Reporting • Systematic • Sustainable • Internal benchmarking • Elevator pitch • So what? • Enable informed decision making • Demonstrate smart procurement
  • 34. Using the results
  • 35. What they can do: • Both qualitative and quantitative measures tell the story of the resource • Aid decision making • Justify procurement • Safeguard budgets…?
  • 36. What they can’t do:
  • 37. Conclusions
  • 38. Closing thoughts • Is it worth investing in this? • Qualitative & Quantitative • Danger of relying on cost-per-download
  • 39. Looking Ahead • Review of all budgets • All Resources • Systems • Staff • Services • Demonstrating Value and Impact • Resources • Services
  • 40. Thank You. Selena Killick, Cranfield University. s.a.killick@cranfield.ac.uk Tel: 01793 785561
