Building a resource for practical assessment: adding value to value and impact
Stephen Town
University of York, UK
Library Assessment Conference, Seattle
Wednesday 6th August, 2008

Summary
• The SCONUL Value & Impact Measurement Program (“VAMP”): a recap
• The Performance Portal
• ‘Value’ options
  – UK drivers (TRAC)
  – An institutional case: the Open University’s Best Value project
  – Benchmarking & national statistics

Introduction & recap

The University Context
(from the 2006 Library Assessment Conference, after Lombardi)
Universities have two “bottom lines”:
1. Financial (as in business)
2. Academic, largely through reputation in:
   • Research (the priority in “leading” universities)
   • Teaching (& maybe learning)

Library Pressures for Accountability
The need is therefore to demonstrate the Library’s contribution in these two dimensions:
1. Financial, through “value for money” or related measures
2. Impact on research, teaching and learning
This also implies that “competitive” data will be highly valued.

The Aim & Role of Universities & their Libraries: cautions for measurement
• Research, Teaching & Reductionism
  – ‘Mode 1’ research & impact: ‘transcendental’
  – ‘Mode 2’ research & impact: ‘instrumental’
  – Value, price & the ‘mandarinisation’ of research and its support
  – Interdisciplinary research
  – Collaborative research across institutions
  – Learning as a set of discrete assessed modules
• All of this may damage the idea of libraries as ‘transcendent’, collective and connective services

SCONUL Member Survey Findings
• 70% have undertaken value or impact measurement
• The main rationales are advocacy, service improvement and comparison
• Half used in-house methodologies; half used standard techniques
• The main barrier is a lack of tools, creating issues of time and buy-in

Member Survey Conclusions
• There is a need to demonstrate value, and that libraries make a difference
• Measurement needs to show ‘real’ value
• Measurement needs to link to the University mission
• Libraries are, and intend to be, ahead of the game
• Impact may be difficult or impossible to measure
• All respondents welcomed the programme, and the prospect of an available toolkit with robust and simple tools

VAMP Objectives
• New (currently missing) measurement instruments & frameworks
• A full, coherent framework for performance, improvement and innovation
• Persuasive data for university senior managers, to prove value, impact, comparability and worth

Missing methods
• An impact tool or tools, for both teaching & learning and research (from the LIRG/SCONUL initiative?)
• A robust value-for-money/economic-impact tool
• Staff measures
• Process & operational costing tools

Program Benefits
1. Attainment & retention of Library institutional income
2. Proof of value and impact on education and research
3. Evidence of comparability with peer institutions
4. Justification of a continuing role for libraries and their staff
5. Meeting national costing requirements for separating spend on teaching and research

Communities of Practice
“groups of people who share a passion for something that they know how to do, and who interact regularly to learn how to do it better”
“coherence through mutual engagement”
(Etienne Wenger, 1998 & 2002)

VAMP Project Structure
• Analysis: March–June 2006
• Tools I (Impact): June 2007
• Site development: June 2007
• Tools II (Value): ?
• CoP development
• Maintenance

The Performance Portal

[Site structure diagram] The VAMP Home Page links to:
• a Members’ Forum (blog? chat?)
• Techniques in Use (wiki?)
• Simple Introductions
• Detailed Techniques
• Community of Practice Techniques

The ‘Performance Portal’
• A wiki of library performance measurement containing a number of ‘approaches’, each (hopefully) with:
  – A definition
  – A method or methods
  – Some experience of their use in libraries (or links to this)
  – The opportunity to discuss use

Content submission

User guide

Discussion Tools
• An experiment in social networking & Web 2.0 technologies

The Ontology of Performance
• ‘Frameworks’
• ‘Impact’
• ‘Quality’
• ‘Statistics’
• ‘Value’
• A visual mind map?

Frameworks
Mounted:
• The European Foundation for Quality Management (EFQM) Excellence Model
Desired:
• Key Performance Indicators
• The Balanced Scorecard
• Critical Success Factors
• The Effective Academic Library

Impact
Mounted:
• Impact tools
Desired:
• Detailed UK experience from the LIRG/SCONUL initiatives
• Outcome-based evaluation
• Information literacy measurement
• More on research impact

Quality
Mounted:
• Charter Mark
• Customer surveys
  – LibQUAL+
  – SCONUL Survey
  – Priority Research
• Investors in People
Desired:
• Benchmarking
• Quality assurance
• ISO 9000s
• ‘Investors in People’ experience
• Opinion meters
• Quality Maturity Model

Statistics
Mounted:
• SCONUL Statistics & the interactive service
• HELMS statistics
Desired:
• Institutional experience of using SCONUL statistics for local advocacy
• COUNTER
• E-resource tools

Value
Mounted:
• (nothing yet)
Desired:
• Contingent valuation
• ‘Transparency’ costing
• Staff & process costing, value & contribution
• E-resource value tools

Value

What is value?
• Cost efficiency
• Cost effectiveness
• Cost comparison (Case 3)
• Financial management process standards & audit
• Financial allocation (Case 1)
• Valuation
• Value added
• Return on investment
• Best value (Case 2)

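Several of the notions above reduce to simple arithmetic once a library settles on its cost, output and benefit figures. A minimal sketch of two of them; every number here is hypothetical, invented for illustration, and not drawn from SCONUL or OU data:

```python
# Illustrative 'value' metrics for a library service.
# All figures below are hypothetical examples.

def cost_efficiency(total_cost: float, outputs: int) -> float:
    """Cost per unit of output, e.g. cost per loan."""
    return total_cost / outputs

def return_on_investment(benefit: float, cost: float) -> float:
    """(Benefit - cost) / cost, expressed as a ratio."""
    return (benefit - cost) / cost

annual_budget = 2_000_000.0      # hypothetical library budget (GBP)
loans = 400_000                  # hypothetical annual loans
estimated_benefit = 2_600_000.0  # e.g. from a contingent valuation survey

print(f"Cost per loan: {cost_efficiency(annual_budget, loans):.2f}")         # 5.00
print(f"ROI: {return_on_investment(estimated_benefit, annual_budget):.0%}")  # 30%
```

The interesting work is of course in obtaining defensible inputs (hence the contingent valuation and costing tools listed under ‘Value’ above); the ratios themselves are trivial.
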
Case 1: TRAC
The UK Higher Education transparency initiative, 2000–09

Transparent Approach to Costing
• The standard method for costing in UK HEIs
• A government requirement
• Ends cross-subsidy between teaching and research
• Research funding based on full economic costing (fEC)
• Positive effects on funding
• Positive effects on pricing

Implications
• All activity to be identified as ‘research’, ‘teaching’ or ‘other’
• The Library treated as ‘other’? or
• All library activities classed as either research or teaching, or a simplistic apportioning to each
• Libraries omitted as a component of research costs, and therefore as a share recipient

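The “simplistic apportioning” option can be made concrete with a toy calculation. A sketch assuming a single usage-based driver; the driver shares and the cost figure are invented for illustration and are not TRAC data:

```python
# A simplistic TRAC-style apportionment of library spend across the
# 'teaching' / 'research' / 'other' categories, using one usage proxy.
# The total cost and the driver shares below are hypothetical.

library_cost = 3_000_000.0  # hypothetical annual library spend (GBP)

# Hypothetical activity driver: share of library use per TRAC category
driver = {"teaching": 0.55, "research": 0.35, "other": 0.10}

apportioned = {category: library_cost * share
               for category, share in driver.items()}

for category, cost in apportioned.items():
    print(f"{category:>9}: {cost:,.0f}")
```

A single driver is exactly the simplification the slide warns about: different drivers (loans, enquiries, reading-list use, research downloads) can produce very different teaching/research splits.
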
Case 2: the UK Open University Library’s Best Value Program

OU Best Value Program Objectives
• To increase the business skills of library managers & staff
• To develop skills to support customer-focused, cost-efficient management decision-making
• To develop benchmarking evaluation skills that balance quality, value and cost efficiency

Strands
• Business reporting
• Process costing and continuous improvement
• Service planning
• Benchmarking
‘to generate real accountability’

Business reporting elements
• Library business areas
• Five PIs per area, including cost, quality & customer impact
• Forecast, variance & remedial action
This has improved the use of management information, efficiency, prioritisation and expenditure control.

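The “forecast, variance & remedial action” element is essentially exception reporting. A sketch of how such a report might flag PIs; the PI names, figures and the 10% tolerance are all invented assumptions (the slide specifies none of them):

```python
# Variance reporting for performance indicators: compare actuals
# against forecast and flag items needing remedial action.
# PI names, figures and the threshold are hypothetical.

THRESHOLD = 0.10  # flag variances larger than 10% of forecast

pis = [
    {"pi": "cost per enquiry",   "forecast": 4.00, "actual": 4.60},
    {"pi": "satisfaction score", "forecast": 4.20, "actual": 4.10},
]

for row in pis:
    variance = row["actual"] - row["forecast"]
    relative = variance / row["forecast"]
    flag = "remedial action" if abs(relative) > THRESHOLD else "on track"
    print(f"{row['pi']}: variance {variance:+.2f} ({relative:+.1%}) -> {flag}")
```
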
Process Costing
• Complete process and stage costing
• Average times and skill levels
• Covered enquiries, cataloguing, e-resources, IT support, document delivery and counter services
This has delivered justification for staffing levels against activity, staffing formulae, redeployment to priority areas, and process improvements.

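Stage costing of this kind combines the average time per stage with the pay rate of the skill level performing it. A minimal sketch for a cataloguing process; the stage names, times and rates are invented examples, not OU figures:

```python
# Unit cost of a process = sum over stages of
# (average stage time in hours) x (hourly rate of the skill level).
# Stages, times and rates below are hypothetical.

rates = {"assistant": 15.0, "librarian": 25.0}  # hypothetical GBP/hour

cataloguing_stages = [
    # (stage, average hours per item, skill level)
    ("receipt and checking",     0.10, "assistant"),
    ("descriptive cataloguing",  0.25, "librarian"),
    ("classification",           0.15, "librarian"),
]

unit_cost = sum(hours * rates[level]
                for _stage, hours, level in cataloguing_stages)
print(f"Cost per item catalogued: {unit_cost:.2f}")  # 11.50
```

Multiplying the unit cost by annual throughput gives the activity-based staffing justification the slide describes.
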
Service plans
• Costed service plans to achieve medium-term improvement and development through a rolling program
• Covered document delivery, enquiries, information literacy, and e-resources

Program benefits and outcomes
• Staff development
  – cost-conscious decision-making
  – business skills
• Management information improvement
• Clarity about customers and use
• Improved quality
• Ability to ‘sell benefits’

Case 3: Financial benchmarking

International Benchmarking Initiatives
• The OU was able to engage in and lead an exercise against distance education universities worldwide
• In one 2008 international benchmarking study, only one institution out of eight had a comprehensive costing model

Financial Statistical Convergence
• York meeting, 2008
  – OCLC/RLG
  – ARL
  – SCONUL
  – CAUL

Conclusion & Questions
• What do we mean by value?
• Why do we not yet have a collective view on costing approaches?
  – A skills deficiency?
  – A lack of real need, or of real financial performance accountability?
  – Would we rather not know?
  – Are we more intent on increasing budgets than on seeking efficiency improvements?

Acknowledgments
• The VAMP Subgroup of SCONUL WGPI: Maxine Melling, Philip Payne, Rupert Wood
• The Cranfield VAMP Team: Darien Rossiter, Michael Davis, Selena Lock, Heather Regan
• The Open University: Ann Davies
• ‘Value’ consultants: Sue Boorman, Larraine Cooper
• Attendees at the York statistics meeting