Electronic Collection Management: How statistics can, and can't, help.

Presentation delivered at the ASLIB Engineering & Technology Group and Aerospace & Defence Librarians Group event "Surviving the recession: maximising your value", held at Imperial College on 15 November 2011.


  1. Electronic Collection Management: How statistics can, and can’t, help. Surviving the recession: maximising your value. ASLIB Engineering & Technology Group / Aerospace & Defence Librarians Group. John Harrington, Head of Information Services; Selena Killick, Library Quality Officer.
  2. Introduction • Institutional, financial and strategic context • Previous methods used to review journal collections • Role of qualitative and quantitative measures • What these measures can and cannot tell us
  3. Cranfield University • The UK's only wholly postgraduate university focused on science, technology, engineering and management • One of the UK's top five research intensive universities • Annual turnover £150m • 40% of our students study whilst in employment • We deliver the UK Ministry of Defence's largest educational contract
  4. Key Drivers • Financial realities • Demonstrating value for money • Strategic alignment • Research Excellence Framework (REF) • Income – mission critical • Reputation
  5. Expenditure on Journals [chart: journal spend, 2006-07 to 2009-10]
  6. Expenditure on Resources [pie chart: Cranfield University information provision expenditure by format, 2009-10 – Total Journals, Books inc. special collections, e-Books, Other databases, Other digital documents]
  7. How do we demonstrate that the collection is meeting the needs of the University?
  8. Previous Techniques Used: Annual journals review using the following data • Circulation figures – issues and renewals • “Sweep survey” to capture in-house use • Journal contents page requests • Download figures • Journal prices vs the cost of ILL requests. More recent focus on “cost per download”
  9. New Approach • Quantitative: Size, Usage, Coverage, Value for Money • Qualitative: Academic Liaison, Reading Lists Review, REF Preferred
  10. Quantitative Reporting
  11. Quantitative Reporting • Systematic • Sustainable • Internal benchmarking • Elevator pitch • So what? • Enable informed decision making • Demonstrate smart procurement
  12. Brought to you by the letters… &
  13. Our Approach • What has everyone else done? • Analysing Publisher Deals Project • Storage centre • Excel training • Template design
  14. Basic Metrics • Number of titles within a package • Total annual full-text downloads • Cost: • Core titles • e-Access Fee • Total costs
  15. Value Metrics • Average number of requests per title • Average cost per title • Total cost as % of information provision expenditure • Cost per full-text download • Average download per FTE student/staff/total • Average cost per FTE student/staff/total
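
A minimal sketch of how these value metrics can be derived from the basic metrics on the previous slide, assuming full-text downloads stand in for requests and that package costs are known; all figures and names below are hypothetical, not Cranfield data:

# Hypothetical package figures -- illustrative only, not real data.
package = {
    "titles": 1200,            # number of titles within the package
    "downloads": 85000,        # total annual full-text downloads
    "core_cost": 95000.00,     # cost of core (subscribed) titles, GBP
    "e_access_fee": 12000.00,  # e-access fee, GBP
}
total_info_spend = 1500000.00  # total information provision expenditure, GBP
fte = {"student": 4000, "staff": 1500}
fte["total"] = fte["student"] + fte["staff"]

total_cost = package["core_cost"] + package["e_access_fee"]

value_metrics = {
    "average downloads per title": package["downloads"] / package["titles"],
    "average cost per title": total_cost / package["titles"],
    "total cost as % of information provision spend": 100 * total_cost / total_info_spend,
    "cost per full-text download": total_cost / package["downloads"],
}
for group, count in fte.items():
    value_metrics[f"average downloads per FTE ({group})"] = package["downloads"] / count
    value_metrics[f"average cost per FTE ({group})"] = total_cost / count

for name, value in value_metrics.items():
    print(f"{name}: {value:,.2f}")
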
  16. The Long Tail [charts: downloads by title – long tail, short tail, no tail]
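
A rough way to check which tail shape a package has is to sort its titles by downloads and see how concentrated the use is; a sketch using randomly generated per-title counts purely for illustration:

import random

# Illustrative per-title download counts -- replace with real COUNTER figures.
random.seed(1)
downloads = sorted((int(random.paretovariate(1.2)) - 1 for _ in range(500)), reverse=True)

total = sum(downloads)
top_10_percent = downloads[: len(downloads) // 10]

print(f"titles: {len(downloads)}, total full-text downloads: {total}")
print(f"share of downloads from the top 10% of titles: {100 * sum(top_10_percent) / total:.1f}%")
print(f"titles with zero recorded use: {sum(1 for d in downloads if d == 0)}")
# Heavy concentration in the top titles plus many zero/low-use titles is the
# classic long tail; use spread evenly across titles suggests little or no tail.
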
  17. Subscribed Titles • Reviewing performance of core collection • REF Preferred? • Popular? • Three year trends in cost / downloads / CPD • Cost / Downloads / CPD categorised: • Zero • Low • Medium • High • Cancel?
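
A sketch of the Zero/Low/Medium/High banding described above, assuming annual cost and download figures are available for each subscribed title; the thresholds are placeholders, since what counts as Low/Medium/High is a local decision (see the Considerations slide):

# Hypothetical subscribed titles: (title, annual cost GBP, annual downloads).
subscribed = [
    ("Journal A", 850.00, 1200),
    ("Journal B", 1400.00, 35),
    ("Journal C", 600.00, 0),
]

def band(value, low, high):
    """Classify a value as Zero/Low/Medium/High against local thresholds."""
    if value == 0:
        return "Zero"
    if value < low:
        return "Low"
    if value < high:
        return "Medium"
    return "High"

for title, cost, downloads in subscribed:
    cpd = cost / downloads if downloads else None
    print(
        title,
        "| downloads:", band(downloads, low=50, high=500),
        "| cost per download:", "n/a (zero use)" if cpd is None else band(cpd, low=1.0, high=10.0),
        "| review for cancellation?" if downloads == 0 else "",
    )
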
  18. Popular Titles • Which titles are the most popular? • Top 30 titles in the package • Three year trends in downloads • REF Preferred? • Subscribed title?
  19. Considerations • When to measure from/to? • calendar, financial/academic, or contract year? • Which titles make up our core collection? • Do we have access to all of the “zero use” titles? • What constitutes Low/Medium/High? • What about the aggregator usage statistics? • Do we trust the usage statistics? • What is the size of the target population?
  20. Electronic Collection Management: How statistics can, and can’t, help.
  21. Qualitative Measures
  22. Academic Liaison • Who’s using it? • Why? • How? • How valuable is it? • What will be the impact if we cancel? • Teaching? • Research?
  23. Quantitative on the Qualitative: Analysis of the five REF Preferred Recommended Journals Lists: • Overlapping titles • Unsubscribed titles • Financial shortfall • Current recommended subscribed titles • Usage data
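
Once the five recommended lists and the current subscriptions are expressed as sets of titles, the overlap, gap, and shortfall figures fall out of simple set operations; a sketch with hypothetical lists and prices:

# Hypothetical REF-preferred journal lists per school and current subscriptions.
ref_lists = {
    "School 1": {"Journal A", "Journal B", "Journal C"},
    "School 2": {"Journal B", "Journal D"},
}
subscribed = {"Journal A", "Journal B"}
list_prices = {"Journal C": 950.00, "Journal D": 1200.00}  # GBP, illustrative

all_recommended = set().union(*ref_lists.values())
overlapping = set.intersection(*ref_lists.values())    # recommended on every list
unsubscribed = all_recommended - subscribed            # gaps in current holdings
shortfall = sum(list_prices.get(t, 0.0) for t in unsubscribed)

print("recommended on every list:", sorted(overlapping))
print("recommended but unsubscribed:", sorted(unsubscribed))
print(f"financial shortfall to cover the gaps: £{shortfall:,.2f}")
print("currently subscribed recommended titles:", sorted(all_recommended & subscribed))
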
  24. Reading List Review Qualitative analysis of course reading lists: • What are our academics recommending? • Where is it published? • How often is it recommended? • Are there alternatives?
  25. Using the results
  26. What they can do: • Both qualitative and quantitative measures tell the story of the resource • Aid decision making • Justify procurement • Safeguard budgets
  27. What they can’t do:
  28. Conclusions
  29. Closing thoughts • Is it worth investing in this? • Qualitative & Quantitative • Danger of relying on cost-per-download
  30. Looking Ahead • Review of all budgets • All Resources • Systems • Staff • Services • Demonstrating Value and Impact • Resources • Services
  31. Thank You. Selena Killick, Cranfield University, s.a.killick@cranfield.ac.uk, Tel: 01793 785561. John Harrington, Cranfield University, j.harrington@cranfield.ac.uk, Tel: 01234 754477.
