Evaluating the Big Deal: Usage Statistics for Decision Making

More Related Content

Similar to Evaluating the Big Deal: Usage Statistics for Decision Making

Related Books

Free with a 30 day trial from Scribd

See all

Related Audiobooks

Free with a 30 day trial from Scribd

See all

Evaluating the Big Deal: Usage Statistics for Decision Making

  1. 1. Evaluating the Big Deal: What metrics matter? Usage Statistics for Decision Making London, 2 February 2012 Selena Killick Library Quality Officer
  2. 2. Introduction • Institutional, financial and strategic context • Evaluating the Big Deals • Quantitative Reporting • Qualitative Measures • Using the Results • Conclusions
  3. 3. Cranfield University • The UK's only wholly postgraduate university focused on science, technology, engineering and management • One of the UK's top five research intensive universities • Annual turnover £150m • 40% of our students study whilst in employment • We deliver the UK Ministry of Defence's largest educational contract
  4. 4. Key Drivers • Financial realities • Demonstrating value for money • Strategic alignment • Research Excellence Framework (REF) • Income: mission critical • Reputation
  5. 5. Expenditure on Journals (chart: journal spend by year, 2006-07 to 2009-10)
  6. 6. Information Expenditure by Format 2010-11 (pie chart): Books 4%, eBooks 4%, Databases 24%, Journals 68%
  7. 7. Information Expenditure by Format 2010-11 (pie chart): Books 4%, eBooks 4%, Databases 24%, Journals 31%, Big Deals 37%
  8. 8. Evaluating the ‘Big Deals’
  9. 9. Previous Techniques Used: Annual journals review using the following data • Circulation figures – issues and renewals • “Sweep survey” to capture in-house use • Journal contents page requests • Download figures • Journal prices vs the cost of ILL requests More recent focus on “cost per download”
  10. 10. New Approach Quantitative: • Size • Usage • Coverage • Value for Money Qualitative: • Academic Liaison • Reading Lists Review • REF Preferred
  11. 11. Requirements • Systematic • Sustainable • Internal benchmarking • Elevator pitch • So what? • Enable informed decision making • Demonstrate smart procurement
  12. 12. Quantitative Reporting
  13. 13. Brought to you by the letters… &
  14. 14. Our Approach • What has everyone else done? • Analysing Publisher Deals Project • Storage centre • Excel training • Template design
  15. 15. Basic Metrics • Number of titles within a package • Total annual full-text downloads • Cost: • Core titles • e-Access Fee • Total costs
  16. 16. Value Metrics • Average number of requests per title • Average cost per title • Total cost as % of information provision expenditure • Cost per full-text download • Average download per FTE student/staff/total • Average cost per FTE student/staff/total
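
These value metrics are simple ratios of package-level figures. As a rough illustration only, with invented numbers rather than Cranfield data, the calculations could look like this in Python:

```python
# Hypothetical package-level figures (illustrative only, not real data)
titles_in_package = 1_200
annual_downloads = 180_000
total_cost = 95_000.00            # core title subscriptions + e-access fee
information_spend = 1_000_000.00  # total information provision expenditure
fte_students, fte_staff = 4_000, 1_500
fte_total = fte_students + fte_staff

metrics = {
    "Average downloads per title": annual_downloads / titles_in_package,
    "Average cost per title": total_cost / titles_in_package,
    "Cost as % of information spend": 100 * total_cost / information_spend,
    "Cost per full-text download": total_cost / annual_downloads,
    "Downloads per FTE (total)": annual_downloads / fte_total,
    "Cost per FTE (total)": total_cost / fte_total,
}

for name, value in metrics.items():
    print(f"{name}: {value:,.2f}")
```
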
  17. 17. The Long Tail (charts: downloads vs titles, illustrating long-tail, short-tail and no-tail usage distributions)
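
One way to see which of these shapes a deal has is to rank titles by downloads and check how quickly the cumulative share of use builds up. A minimal sketch, assuming a JR1-style table with Title and Downloads columns (both names are assumptions):

```python
import pandas as pd

# Assumed JR1-style data: one row per title, annual full-text downloads
jr1 = pd.DataFrame({
    "Title": ["Journal A", "Journal B", "Journal C", "Journal D", "Journal E"],
    "Downloads": [5200, 1400, 300, 25, 0],
})

ranked = jr1.sort_values("Downloads", ascending=False).reset_index(drop=True)
ranked["CumulativeShare"] = ranked["Downloads"].cumsum() / ranked["Downloads"].sum()

# How few titles deliver 80% of all use? A small number suggests a long tail.
titles_for_80pct = (ranked["CumulativeShare"] < 0.8).sum() + 1
print(ranked)
print(f"{titles_for_80pct} of {len(ranked)} titles deliver 80% of downloads")
```
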
  18. 18. Subscribed Titles • Reviewing performance of core collection • REF Preferred? • Popular? • Three year trends in cost / downloads / CPD • Cost / Downloads / CPD categorised: • Zero • Low • Medium • High • Cancel?
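
The Zero/Low/Medium/High bands are a local judgement call (slide 20 asks where the thresholds should sit). A sketch with invented cut-offs, banding subscribed titles by downloads and flagging low-use titles that are not REF preferred for cancellation review:

```python
import pandas as pd

# Assumed data for subscribed (core) titles; thresholds below are illustrative
core = pd.DataFrame({
    "ISSN": ["1111-1111", "2222-2222", "3333-3333"],
    "Downloads": [0, 45, 2100],
    "Cost": [350.0, 420.0, 980.0],
    "REFPreferred": [False, False, True],
})

bins = [-1, 0, 100, 1_000, float("inf")]        # zero / low / medium / high
labels = ["Zero", "Low", "Medium", "High"]
core["UsageBand"] = pd.cut(core["Downloads"], bins=bins, labels=labels)
core["CostPerDownload"] = core["Cost"] / core["Downloads"].where(core["Downloads"] > 0)

# Candidate cancellations: zero or low use and not on a REF preferred list
core["ReviewForCancellation"] = core["UsageBand"].isin(["Zero", "Low"]) & ~core["REFPreferred"]
print(core)
```
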
  19. 19. Popular Titles • Which titles are the most popular? • Top 30 titles in the package • Three year trends in downloads • REF Preferred? • Subscribed title?
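
Ranking the popular titles is a simple sort of the same usage data; a short sketch with assumed columns:

```python
import pandas as pd

# Assumed title-level usage; in practice this holds every title in the package
usage = pd.DataFrame({
    "Title": ["Journal A", "Journal B", "Journal C"],
    "Downloads": [5200, 1400, 300],
    "Subscribed": [True, False, True],
})

# Top 30 titles by annual downloads, with the subscribed flag carried through
top30 = usage.nlargest(30, "Downloads")
print(top30)
```
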
  20. 20. Considerations • When to measure from/to? • calendar, financial/academic, or contract year? • Which titles make up our core collection? • Do we have access to all of the ‘zero use’ titles? • What constitutes Low/Medium/High? • What about the aggregator usage statistics? • Do we trust the usage statistics? • What is the size of the target population?
  21. 21. Capturing the data
  22. 22. Downloading Statistics • Get organised • Gather your usernames and passwords • Create local files to save and store usage reports • Software now on the market to manage this for you • Joint Usage Statistics Portal
  23. 23. Introductory Workshop: April 18, Birmingham
  24. 24. Reporting Period • Calendar, financial/academic, or contract year? • COUNTER Reports = Calendar year • Converted using VLOOKUP on ISSNs • Manual • Problematic • Automatically converted using JUSP
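
The underlying problem is that COUNTER JR1 reports run January to December while the deal may run to a different year. The slides describe doing the conversion with VLOOKUP, and later JUSP; a rough pandas equivalent, assuming monthly JR1 columns named by year and month, realigning two calendar years to an August to July academic year:

```python
import pandas as pd

# Two calendar-year JR1 extracts, one column per month, keyed on ISSN (invented data)
jr1_2010 = pd.DataFrame({"ISSN": ["1111-1111"], **{f"2010-{m:02d}": [10 * m] for m in range(1, 13)}})
jr1_2011 = pd.DataFrame({"ISSN": ["1111-1111"], **{f"2011-{m:02d}": [5 * m] for m in range(1, 13)}})

merged = jr1_2010.merge(jr1_2011, on="ISSN")

# Academic year 2010/11: August 2010 to July 2011
academic_year = [f"2010-{m:02d}" for m in range(8, 13)] + [f"2011-{m:02d}" for m in range(1, 8)]
merged["Downloads 2010/11"] = merged[academic_year].sum(axis=1)
print(merged[["ISSN", "Downloads 2010/11"]])
```
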
  25. 25. Aggregator Usage Statistics • Combining usage from publishers and aggregators at a title level • Combined using Pivot Tables • Manual • Problematic • Where possible combined using JUSP
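
The slide combines the two sources with Excel pivot tables; a pandas group-by is a rough equivalent. A sketch assuming both sources can be reduced to an ISSN plus a download count:

```python
import pandas as pd

publisher = pd.DataFrame({"ISSN": ["1111-1111", "2222-2222"], "Downloads": [900, 120]})
aggregator = pd.DataFrame({"ISSN": ["1111-1111", "3333-3333"], "Downloads": [60, 40]})

# Stack both sources and total the usage per title, using ISSN as the common key
combined = (
    pd.concat([publisher.assign(Source="Publisher"), aggregator.assign(Source="Aggregator")])
    .groupby("ISSN", as_index=False)["Downloads"]
    .sum()
)
print(combined)
```
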
  26. 26. Analysing the data
  27. 27. Excel Template • Two main data sources: • COUNTER JR1 • Subscription agent financial report • Automated as much as possible • Match formulas working with ISSNs to link title price to usage/holdings • All calculations are completed automatically when the data sources are added
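
In non-Excel terms, the template's match formulas amount to joining the JR1 usage report to the subscription agent's financial report on ISSN, after which cost per download follows automatically. A sketch with assumed column names:

```python
import pandas as pd

usage = pd.DataFrame({"ISSN": ["1111-1111", "2222-2222"], "Downloads": [900, 0]})
prices = pd.DataFrame({"ISSN": ["1111-1111", "2222-2222"], "Price": [450.0, 380.0]})

# Link price to usage on ISSN (roughly what the template's match formulas do)
report = usage.merge(prices, on="ISSN", how="left")
report["CostPerDownload"] = report["Price"] / report["Downloads"].where(report["Downloads"] > 0)
print(report)
```
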
  28. 28. Quantitative Reporting • Systematic  • Sustainable  • Internal benchmarking  • Elevator pitch  • So what?  • Enable informed decision making  • Demonstrate smart procurement 
  29. 29. Qualitative Measures
  30. 30. Academic Liaison • Who’s using it? • Why? • How? • What will be the impact if we cancel? • Teaching? • Research? • How valuable is it?
  31. 31. Quantitative on the Qualitative: Analysis of the five REF Preferred Recommended Journals Lists: • Overlapping titles • Unsubscribed titles • Financial shortfall • Current recommended subscribed titles • Usage data
  32. 32. Reading List Review Qualitative analysis on course reading lists: • What are our academics recommending? • Where is it published? • How often is it recommended? • Are there alternatives?
  33. 33. Quantitative & Qualitative Reporting • Systematic  • Sustainable  • Internal benchmarking  • Elevator pitch  • So what?  • Enable informed decision making  • Demonstrate smart procurement 
  34. 34. Using the results
  35. 35. What they can do: • Both qualitative and quantitative measures tell the story of the resource • Aid decision making • Justify procurement • Safeguard budgets…?
  36. 36. What they can’t do:
  37. 37. Conclusions
  38. 38. Closing thoughts • Is it worth investing in this? • Qualitative & Quantitative • Danger of relying on cost-per-download
  39. 39. Looking Ahead • Review of all budgets • All Resources • Systems • Staff • Services • Demonstrating Value and Impact • Resources • Services
  40. 40. Thank You Selena Killick Cranfield University s.a.killick@cranfield.ac.uk Tel: 01793 785561
