Evaluating the Big Deal: What metrics matter?

In April 2010 the Cranfield University Libraries embarked upon a review of their electronic journal packages. Following research into the usage metrics employed at other institutions, a number of key performance indicators were developed and assessed using a standardised Excel template. The resulting information helped to inform a cancellation decision.

  1. Evaluating the Big Deal: What metrics matter?
     Effektivare hantering av statistik för e-resurser med Excel
     (More effective management of e-resource statistics with Excel)
     Stockholm, 9 May 2011
     Selena Killick, Library Quality Officer

  2. Evaluating the Big Deal: What metrics matter?
     Disclaimer: This presentation will not provide one single answer to the above question!

  3. Introduction
     • Evaluating the ‘Big Deal’
     • Researching and choosing metrics
     • Capturing the data
     • Reviewing the results
     • Next steps

  4. Cranfield University
     • The UK's only wholly postgraduate university focused on science, technology, engineering and management
     • One of the UK's top five research-intensive universities
     • Annual turnover £150m
     • 40% of our students study whilst in employment
     • We deliver the UK Ministry of Defence's largest educational contract

  5. My Role (amongst other things)
     • Provide analysis and advice on customer feedback and library performance data:
       • Key Performance Indicators
       • SCONUL Statistics
       • LibQUAL+
       • Enquiry monitoring
       • Focus groups

  6. My Usage Statistics Journey
     • 2005: UKSG Usage Statistics training course
       • Personal interest but limited application
     • 2007/08: Need to report usage statistics for the SCONUL Statistical Questionnaire [1]
     • 2008: First budget reduction
     • 2010: Review of the Big Deals

  7. Usage Statistics Learning Curve
     [Chart: knowledge plotted against time, 2005 to 2011, rising through the SCONUL training, the budget cut and the Big Deal review up to now]

  8. Evaluating the ‘Big Deals’

  9. Evaluating the ‘Big Deals’
     • Resources budget cut predicted
     • Small publishers and book budgets previously reduced
     • Review of ‘Big Deals’ required
     • Initial discussion focussed on cost-per-download
     • More analysis needed

  10. Researching Metrics
     • Angela Conyers:
       • UKSG E-Resources Management Handbook [2]
       • Analysing Publisher Deals project from Evidence Base [3]
     • Cliff Spencer:
       • Lib-Stats discussion list and archive [4]
       • A librarian’s view of usage metrics [5]
     • Managing and Understanding Data in Libraries (MUDL) [6]

  11. Metrics, Metrics, Metrics: Basic Metrics
     • Number of titles within a package
     • Total annual full-text downloads
     • Downloads per Full-Time-Equivalent (FTE) student/staff/total
     • Number and % of titles in usage group zero/low/medium/high
     • 20 most popular titles as % of total downloads

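As a rough sketch of how these figures can be produced, assume a worksheet holding the COUNTER JR1 data with journal titles in column A and annual full-text downloads in column B (rows 2 to 500), and the relevant FTE count in cell F1; the ranges and cell references are illustrative, not those of the Cranfield template:

    Number of titles in package:   =COUNTA(A2:A500)
    Total full-text downloads:     =SUM(B2:B500)
    Downloads per FTE:             =SUM(B2:B500)/F1
    Top 20 titles as % of total:   =SUMPRODUCT(LARGE(B2:B500,ROW(INDIRECT("1:20"))))/SUM(B2:B500)

The last formula sums the 20 largest values in the downloads column without needing a separate sorted list; formatting that cell as a percentage gives the share directly.
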
  12. The Long Tail
     [Chart: downloads plotted against titles for three packages, illustrating a long tail, a short tail and no tail of low-use titles]

  13. Metrics, Metrics, Metrics: Value Metrics
     • Average number of requests per title
     • Total cost as % of information provision expenditure
     • Average cost per title
     • Cost per full-text download
     • Average cost per FTE student/staff/total

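Continuing the same illustrative layout, and assuming the annual package cost sits in cell F2 and total information provision expenditure in cell F3 (again, hypothetical cells), the value metrics reduce to simple divisions:

    Average cost per title:        =F2/COUNTA(A2:A500)
    Cost per full-text download:   =F2/SUM(B2:B500)
    Average cost per FTE:          =F2/F1
    Cost as % of expenditure:      =F2/F3

The last cell is formatted as a percentage.
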
  14. Metrics, Metrics, Metrics: Print Metrics
     • Costs by format and as % of information provision expenditure
     • Number of print subscriptions
     • Full-text downloads of print titles
     • Number and % of print subs in usage group zero/low/medium/high
     • Average number of requests per title
     • Average cost per title
     • Cost-per-download by title
     • Number and % of print subs in Top 20

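For the title-level print metrics, a sketch assuming a print-titles sheet with downloads in column C and the subscription price in column D (rows 2 to 200), and purely illustrative usage-band thresholds of 1-99 (low), 100-999 (medium) and 1,000+ (high):

    Cost per download, per title:  =IF(C2=0,"n/a",D2/C2)
    Zero-use titles:               =COUNTIF(C2:C200,0)
    Low-use titles:                =COUNTIFS(C2:C200,">=1",C2:C200,"<100")
    Medium-use titles:             =COUNTIFS(C2:C200,">=100",C2:C200,"<1000")
    High-use titles:               =COUNTIF(C2:C200,">=1000")
    % of titles with zero use:     =COUNTIF(C2:C200,0)/COUNT(C2:C200)

The IF guard avoids a #DIV/0! error on unused titles, and each library would set its own band thresholds.
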
  15. Capturing the data

  16. Get Organised…
     More metrics than I knew what to do with!
     1. Become an Excel whizz
     2. Download relevant reports and store locally
     3. Design Excel template
     4. Run the analysis on each package
     5. Benchmark between packages
     6. Report back to Library Management

  17. Downloading Statistics
     • Get organised
     • Create local files to save and store usage reports (publishers will not save them forever)
     • Gather your usernames and passwords
     • Software is now on the market to manage this for you
     • In the UK: Joint Usage Statistics Portal [7]

  18. Excel Template
     • Two main data sources:
       • COUNTER JR1 report
       • Publisher/subscription agent financial report
     • Automated as much as possible
     • Match formulas working with ISSN to link title price to usage/holdings
     • All metric calculations are worked out when the data sources are added

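The ISSN linkage described above can be sketched with a lookup. Assuming the usage sheet holds ISSNs in column D and a sheet named Prices holds ISSNs in column A with title prices in column B (the sheet and column layout are assumptions, not the actual template):

    Price pulled into the usage sheet:   =IFERROR(VLOOKUP($D2,Prices!$A$2:$B$400,2,FALSE),"")
    Row position of the ISSN (alt.):     =MATCH($D2,Prices!$A$2:$A$400,0)

VLOOKUP with FALSE forces an exact match and IFERROR blanks out titles with no price row; the ISSNs need to be stored consistently on both sheets (all as text, with or without the hyphen) for the match to succeed.
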
  19. Handy Excel Formulas
     • =SUM: sums the entries in a range
     • =AVERAGE: calculates the average value of a range
     • =COUNT: counts non-blank numeric values in a range
     • =COUNTA: counts non-blank numeric or text values in a range
     • =COUNTIF: counts the values in a range that meet a criterion
     • =SUMIF: sums the values in a range that meet a criterion
     • =MIN: returns the lowest value in a range
     • =MAX: returns the highest value in a range
     Key formulas: LOOKUP, VLOOKUP, MATCH

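A few of these in context, using the same illustrative JR1 layout plus a hypothetical Subscribed? flag in column E containing Yes/No:

    Titles with a usage figure reported:   =COUNT(B2:B500)
    Downloads from subscribed titles:      =SUMIF(E2:E500,"Yes",B2:B500)
    Number of subscribed titles:           =COUNTIF(E2:E500,"Yes")
    Highest and lowest title usage:        =MAX(B2:B500)   =MIN(B2:B500)
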
  20. Reviewing the results

  21. A Margin of Error
     • When to measure from/to: calendar, financial/academic, or contract year?
     • How many titles do we have?
     • Do we have access to all of the ‘zero use’ titles?
     • Which are our subscribed titles?
     • What about the aggregator usage statistics?
     • Do we trust the usage statistics?
     • What is the size of the target population?
     Sometimes you’ve got to work with what you’ve got.

  22. Key Performance Indicators
     • Overview of basic and value metrics
     • Basis for comparing metrics against each other
     • Quick-glance % change column

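The quick-glance column is simply a year-on-year change. A minimal sketch, assuming last year's value in cell B2 and this year's in C2:

    % change:   =(C2-B2)/B2

Formatted as a percentage, with a negative result flagging a fall in the metric.
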
  23. Top 20
     • What are the highest-used titles within the package?
     • How many do we subscribe to?
     • Does this list change annually?
     • Which titles are consistently being used heavily?
     • Which subject areas do they support?
     • How much do they cost?

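One way to pull out the Top 20 from the illustrative JR1 layout is to rank titles by downloads. With the rank number (1 to 20) in cell G2:

    Nth-highest download count:   =LARGE($B$2:$B$500,G2)
    Corresponding title:          =INDEX($A$2:$A$500,MATCH(LARGE($B$2:$B$500,G2),$B$2:$B$500,0))

Copying both formulas down for ranks 1 to 20 builds the list; MATCH returns the first title when two titles share the same download count.
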
  24. Print Titles
     • Trends in:
       • Usage
       • Title price
       • Cost per use at title level
     • High/medium/low groupings for all three
     • Problems with gaps in the data from the subscription agent

  25. Metrics, Metrics, Metrics… but what metrics matter?
     • Average cost per title?
     • Cost per download?
     • Number of titles with high use?
     • Percentage of zero-use titles? (The long tail)
     • Cost to replace the highly used titles?
     • ILL equivalent costs if titles weren’t taken?
     • Downloads per FTE in target audience?
     • Cost per FTE in target audience?

  26. For cancellation decisions, one metric held more weight than any other: the Subscription End Date.

  27. Metrics, Metrics, Metrics… but what metrics matter?
     Answer: All of them, and more besides…

  28. Metrics Used in Decision Making
     • All of them tell a story about the resource
     • Danger of relying just on cost-per-download: the package with the highest cost-per-download also had the second-highest usage
     • BUT: usage statistics are only two-dimensional; they should never be used in isolation
     • Other factors that must feed into the decision-making process include qualitative local knowledge, e.g.:
       • A local academic’s ‘top journal’ for their discipline
       • Small (but income-generating) research areas
       • Groundbreaking research information needs

  29. So, what did we cancel?
     • Out of the four packages under review, one had a cancellation date in this financial year
     • Other resources were cut, but we successfully lobbied to renew the one ‘Big Deal’ for another year
     • All four packages now have the same renewal date for the next financial year…

  30. Next Steps

  31. Qualitative Metrics
     • Subject coverage
     • Overlap with other packages / uniqueness
     • Unique selling points
     • Titles in the package that Cranfield publishes in
     • Ease of use
     • Known users of the package
     • Authentication options
     • Content for which access is retained post-cancellation

  32. Looking Ahead
     • Systematic review of all resources to demonstrate smart procurement
     • Qualitative research into customer expectations
     • Review of resource ‘uniqueness’
     • Benchmarking resources:
       • Between each other
       • With peer institutions

  33. Selena Killick
     Library Quality Officer
     Cranfield University
     s.a.killick@cranfield.ac.uk
     Tel: +44 (0)1793 785561
