Service Assessment Programmes: The SCONUL Experience

Presentation delivered as Chair of the SCONUL Working Group on Performance Improvement.

  1. Service Assessment Programmes: The SCONUL Experience
     Stephen Town, Cranfield University
     Chair, SCONUL Working Group on Performance Improvement
  2. Quality in Higher Education
     • Quality Assurance – e.g. Audit, TQA, ISOs, Bologna
     • Traditional patterns of Peer Review – e.g. RAE, Library Reviews
     • Batteries of Performance Indicators – CVCP, SCONUL, HESA
     • Quality Culture – e.g. IIP, TQM, Satisfaction Surveys
  3. The University Context (from the Library Assessment Conference, Charlottesville, VA, September 2006)
     Universities have two "bottom lines":
     1. Financial (as in business)
     2. Academic, largely through reputation in:
        • Research (the priority in "leading" universities)
        • Teaching (& maybe Learning)
  4. Library Pressures for Accountability
     The need is therefore to demonstrate the Library's contribution in these two dimensions:
     1. Financial, through "value for money" or related measures
     2. Impact on research, teaching and learning
     This also implies that "competitive" data will be highly valued.
  5. The SCONUL Response
     The SCONUL Working Group on Performance Improvement:
     • Ten years of "toolkit" development to assist in performance measurement and improvement (for both management and advocacy)
     • SCONUL 'Top Concern Survey' 2005, leading to VAMP
  6. Frameworks: Issues
     • Balanced Scorecard
     • EFQM
     • Key Performance Indicators
  7. Quality: Issues & Solutions
     • Benchmarking
       – Process Benchmarking
       – Statistical Benchmarking
       – Peer Group Benchmarking
     • Customer Satisfaction Surveys (gap scoring sketched below)
       – LibQUAL+
       – SCONUL Satisfaction Survey
       – Priority Research
       – Opinion Meters
     • Charter Mark
     • Customer Relationship Management
     • Investors in People
     • Quality Assurance
       – QAA Guidelines for Assessors
       – ISO 9001
     • Quality Maturity Model
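
LibQUAL+ asks respondents to rate each service item on three 9-point scales: the minimum acceptable, desired, and perceived levels of service. A minimal Python sketch of the resulting gap scores, using invented response data:

```python
# LibQUAL+-style gap scoring: a sketch with made-up responses.
from statistics import mean

responses = [  # (minimum, desired, perceived) ratings for one item, 1-9
    (5, 8, 6),
    (6, 9, 7),
    (4, 7, 5),
]

minimum = mean(r[0] for r in responses)
desired = mean(r[1] for r in responses)
perceived = mean(r[2] for r in responses)

adequacy_gap = perceived - minimum     # positive: service clears the minimum
superiority_gap = perceived - desired  # usually negative: short of the desired level

print(f"adequacy gap {adequacy_gap:+.2f}, superiority gap {superiority_gap:+.2f}")
```
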
  8. SCONUL Satisfaction Survey
  9. SCONUL LibQUAL+ Results 2006
  10. CMM 'Capability Maturity Model' (staged levels; assessment rule sketched below)
      1 Initial
      2 Repeatable
      3 Defined
      4 Managed
      5 Optimising
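
In a staged model of this kind, a service sits at the highest level whose criteria, together with those of every level below it, are satisfied. A hedged Python sketch of that assessment rule; the criteria flags are invented placeholders, not the working group's actual Quality Maturity Model checklist:

```python
# Staged maturity assessment: climb the ladder while criteria are met.
LEVELS = ["Initial", "Repeatable", "Defined", "Managed", "Optimising"]

def maturity_level(criteria_met: list[bool]) -> str:
    """criteria_met[i] is True if the criteria for level i+2 are met
    (level 1, Initial, needs no criteria)."""
    level = 1
    for met in criteria_met:
        if not met:
            break
        level += 1
    return f"{level} {LEVELS[level - 1]}"

# Example: repeatable and defined practices in place, but not yet managed.
print(maturity_level([True, True, False, False]))  # -> "3 Defined"
```
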
  11. Statistics: Issues & Solutions
      • SCONUL Statistical Questionnaire
      • SCONUL Statistics on the Web
      • SCONUL Annual Library Statistics
      • SCONUL Statistics: Trend Analysis
      • Higher Education Library Management Statistics (HELMS)
      • E-Measures Project
  12. New SCONUL Statistics Measures (modelled in the sketch below)
      • 2d: Breakdown of 'unique serial titles' into:
        – print only (2e)
        – electronic only (2f)
        – print and electronic (2g)
      • 2k: 'number of electronic databases'
      • 2l: 'number of electronic books'
      • 4r: 'number of successful requests for full-text articles'
      • 4s: 'number of successful accesses to electronic books'
      • 7g: Breakdown of 'electronic resources' into:
        – 'subscriptions to electronic databases' (7h)
        – 'expenditure on e-books' (7j)
        – 'expenditure on other digital documents' (7k)
      SCONUL Statistics Web Site
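
One way to model these return fields in code, purely as illustration: the question codes come from the slide, while the class, field names and derived totals are assumptions. The breakdowns imply that 2d = 2e + 2f + 2g and 7g = 7h + 7j + 7k:

```python
# Hypothetical model of the new SCONUL e-measures return.
from dataclasses import dataclass

@dataclass
class SconulEMeasures:
    serials_print_only: int            # 2e
    serials_electronic_only: int       # 2f
    serials_print_and_electronic: int  # 2g
    electronic_databases: int          # 2k
    electronic_books: int              # 2l
    fulltext_article_requests: int     # 4r
    ebook_accesses: int                # 4s
    spend_database_subscriptions: float  # 7h
    spend_ebooks: float                  # 7j
    spend_other_digital: float           # 7k

    @property
    def unique_serial_titles(self) -> int:
        # 2d is the sum of its breakdown (2e + 2f + 2g)
        return (self.serials_print_only
                + self.serials_electronic_only
                + self.serials_print_and_electronic)

    @property
    def electronic_resources_spend(self) -> float:
        # 7g is the sum of its breakdown (7h + 7j + 7k)
        return (self.spend_database_subscriptions
                + self.spend_ebooks
                + self.spend_other_digital)
```
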
  13. Impact & Value: Issues & Solutions
      • Impact
        – Impact Initiative (LIRG / SCONUL)
        – Institutional Critical Success Factors for Information Literacy
      • Value
        – Contingent Valuation (see the sketch below)
        – Transparency Costing
        – Staff Value Added
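
Contingent valuation estimates a service's worth by asking users what they would be willing to pay for it. A deliberately simplified sketch, with invented figures, of scaling mean stated willingness-to-pay to the served population:

```python
# Contingent valuation, reduced to its core arithmetic (illustrative only).
wtp_responses = [0, 25, 40, 10, 60, 15, 30]  # stated annual WTP per respondent (GBP)
population = 12_000                          # registered library users (assumed)

mean_wtp = sum(wtp_responses) / len(wtp_responses)
estimated_annual_value = mean_wtp * population

print(f"mean WTP: £{mean_wtp:.2f}; estimated annual value: £{estimated_annual_value:,.0f}")
```
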
  14. Conclusions of Impact Measurement
      "Helps us to move library performance on from simply counting inputs and outputs to looking at what difference we really make." (Payne et al., 2004)
  15. SCONUL VAMP Objectives
      • New instruments & tools to fill measurement gaps
      • A full, coherent framework for performance, improvement and innovation
      • Persuasive data for university senior managers, to prove value, impact, comparability and worth
  16. VAMP Project Structure
      • Phase 1 (March-June 2006)
        – Critical review
        – SCONUL Member Survey
        – Gap analysis & synthesis
        – SCONUL Conference Workshops
      • Phases 2 & 3 (July 2006 - April 2007)
        – Development of new measures & techniques
        – Review and re-branding of existing tools
        – Web site development
        – Dissemination & maintenance strategy
  17. VAMP Web site
      • Showcasing approaches to performance measurement in the following areas:
        – Frameworks
        – Impact
        – Quality
        – Statistics
        – Value
      • Providing detailed methods on how to apply the different approaches
      • Sharing experience from those who have already applied the approach
      • Discussion areas
  18. Communities of Practice
      "groups of people who share a passion for something that they know how to do, and who interact regularly to learn how to do it better"
      "coherence through mutual engagement"
      (Etienne Wenger, 1998 & 2002)
  19. [Site structure diagram: VAMP Home Page linking to Simple Introductions and Detailed Techniques, with Techniques in Use (Wiki?) and a Member's Forum (Blog?) forming the Community of Practice around the techniques]
  20. J. Stephen Town
      j.s.town@cranfield.ac.uk
