My presentation from the Health Libraries Group 2016 conference #hlg2016. It examines a range of publications on impact measures and quality assurance, mostly from within libraries and higher education, and uses these to establish four overarching principles for considering metrics: Meaningful, Actionable, Reproducible and Comparable. It discusses these with a worked example using a metrics creation / recording template.
Principles for good metrics: theory to practice
1. Principles for good metrics:
theory to practice
Alan Fricker - Head of NHS Partnership
& Liaison, King’s College London
Richard Parker – Knowledge Manager,
Heart of England NHSFT
2. Why Metrics?
• How are we doing?
• How do we compare?
• Have changes made a
difference?
• Something to talk about
@NHS_HealthEdEng #heelks
3. Defining terms
• "A metric is criteria against which something is
measured" (Ben Showers (2015) Library Analytics
and Metrics)
• "a criterion or set of criteria stated in quantifiable
terms" (OED)”
@NHS_HealthEdEng #heelks
4. Our task
• Take a look around
• Identify appropriate methodologies
and mechanisms
• Help people get better with metrics
• Support Knowledge for Healthcare
@NHS_HealthEdEng #heelks
5. Your favourite metric
• First one that comes to mind
• Pop it on a sticky for later
@NHS_HealthEdEng #heelks
10. @NHS_HealthEdEng #heelks
NHS explorations
Library Quality Assurance Framework (LQAF)
• Replaced HeLICon (2010 onwards)
• 48 criteria across 5 domains
– Strategic Management
– Finance and Service Level Agreements
– Human Resources and staff management
– Infrastructure and facilities
– Library/ Knowledge Services Delivery and Development
• Annual submission
11. @NHS_HealthEdEng #heelks
NHS explorations
LQAF
Pro
• Rigorous
• Regular
• Linked to stakeholders
• Growing pool of data
Con
• Inconsistent compliance regimes
• Self-assessment subjective
• Burden of evidence collection
12. @NHS_HealthEdEng #heelks
NHS explorations
SHALL National KPI
• 2011 consultation on 6 national KPI
• Revised to 4 (not all from original list)
– % of the organisation’s workforce (headcount) who are registered library
members.
– % of the organisation’s workforce (headcount) who have registered as a
library member in the last year.
– % of the organisation’s workforce (headcount) who have used ATHENS in
the last year.
– % increase in compliance with the Library Quality Assurance Framework
(LQAF) compared with the previous year.
• Not implemented
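The four revised KPIs are all simple ratios over the organisation's workforce; the editor's notes give worked examples for the original six (e.g., 1,000 Athens users in an organisation of 10,000 staff = 10%). A minimal sketch of the arithmetic in Python; the function names and figures are illustrative, not part of the SHALL proposal:

```python
def workforce_penetration(count: int, workforce_headcount: int) -> float:
    """Workforce-penetration KPI: count as a percentage of headcount."""
    return 100 * count / workforce_headcount

def yoy_increase(current: float, previous: float) -> float:
    """Percentage increase on the previous year (for the LQAF KPI)."""
    return 100 * (current - previous) / previous

workforce = 10_000  # illustrative headcount, as in the editor's notes
print(workforce_penetration(1_000, workforce))  # e.g. used ATHENS in last year: 10.0 (%)
print(yoy_increase(85.0, 80.0))                 # e.g. LQAF compliance vs last year: 6.25 (%)
```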
13. Current practice in the NHS
• Brief KfH survey on metrics in use
• 150 responses but only 47 offered a metric
• 117 metrics suggested
@NHS_HealthEdEng #heelks
14. Areas of focus and approaches
[Bar chart: number of metrics (scale 0-30) by area of focus (Access, Book/physical, Current awareness, Document Supply/ILLs, Enquiries, E-Resource Use, Literature Searches, Outreach, Quality assurance, Training, Unclear, User registration, Website), broken down by approach: Impact, LQAF, Satisfaction, Timely Response, Usage statistics, Value, Not stated]
15. Reasons for choosing metrics
Code | Definition | Metrics
Easy to understand | Metric clear to them and/or stakeholders | 4
Impact | Used for impact work | 15
Satisfaction | Satisfaction or quality related | 11
Simple to collect | Metric felt easy to get data for | 10
Stakeholder agreed | Requested / required by stakeholder | 18
Timely response | Measures of speed of response | 20
Usage | Measures of usage | 52
User insight | Understand user behaviour (and segment users) | 41
Value | Value for money | 26
@NHS_HealthEdEng #heelks
16. Serendipity
• Areas for focus (Van Loo in Haines-Taylor &
Wilson, 1990):
– time consuming
– space intensive
– high cost
– affect most users
– directly linked to library objectives
– well defined and easy to describe
– relatively easy to collect
– are in areas where library staff have some
control to make changes
@NHS_HealthEdEng #heelks
17. @NHS_HealthEdEng #heelks
Wider world - libraries
International standard (ISO 11620:2014)
• Generic approach to performance
indicators
• Well defined terms
– Resources
– Use (activity)
– Efficiency (cost)
– Potentials and Development (value added work)
• 52 indicators offered
18. @NHS_HealthEdEng #heelks
Wider world - libraries
International standard - criteria
Informative content (provides information for decision making)
Reliability (produces same result when repeated)
Validity (measures what it is intended to measure –
though indirect measures can be valid)
Appropriateness (units and methods of measurement
appropriate to purpose)
Practicality (does not require unreasonable staff or user
time)
Comparability (the extent to which a score will mean the
same for different services – standard is clear you should
only compare similar services)
19. @NHS_HealthEdEng #heelks
Wider world - libraries
RLUK – service standards
• Pilot of 8 initial standards
• “We will achieve X% in Y”
• Shift to benchmarking
approach
• Potential kite mark
20. @NHS_HealthEdEng #heelks
Wider world
The Metric Tide - dimensions
“Robustness: basing metrics on the best possible data
in terms of accuracy and scope
Humility: recognising that quantitative evaluation should
support – but not supplant – qualitative, expert
assessment
Transparency: keeping data collection and analytical
processes open and transparent, so that those being
evaluated can test and verify the results
Diversity: accounting for variation by field, and using a
range of indicators to reflect and support a plurality of
research and researcher career paths across the system
Reflexivity: recognising and anticipating the systemic
and potential effects of indicators, and updating them in
response.”
21. @NHS_HealthEdEng #heelks
Wider world
HSCIC – Quality Assurance Indicators Tool
Relevance (Does it meet user need? Is it actionable?)
Accurate and reliable (Quality of data? Is it a good estimate of reality?)
Timeliness and Punctuality (How long after the event is data available / collected?)
Accessibility and clarity (How easy is it to access the data? How easy is it to interpret?)
Coherence and comparability (Are data from different sources on the same topic
similar? Can it be compared over time?)
Trade-offs (Would improving this metric have a negative impact on another?)
Assessment of user needs and perceptions (What do stakeholders think?)
Performance, cost and respondent burden (How much work is involved in
collection?)
Confidentiality and transparency
23. @NHS_HealthEdEng #heelks
Principles for good metrics
Meaningful
• Relates to goals of organisation
• Relates to needs of stakeholders
• Re-examined over time to ensure
still valid
24. @NHS_HealthEdEng #heelks
Principles for good metrics
Actionable
• Measures what matters
• Measures something you can
influence
• Drives changes to behaviour /
services
• Investigate not assume
25. @NHS_HealthEdEng #heelks
Principles for good metrics
Reproducible
• Clearly defined in advance
• Transparent
• Can be replicated
• Best available data
• Non-burdensome (to allow repetition)
26. @NHS_HealthEdEng #heelks
Principles for good metrics
Comparable
• Valid over time for internal use
• Valid externally for benchmarking
• Respect diversity of services
31. Definition of the Indicator or Measure: GMC Survey scores against Access to Educational Resources and sub-questions on Library Services, Online Journals and Space for Private Study. Overall score, specialty outliers, positive versus negative satisfaction ratings.
Which LQAF Sections does this link to?
1.2e Service development informed by evidence
1.3c Positive impact
Data Source: GMC Survey, delivered annually with results publicly available: http://www.gmc-uk.org/education/surveys.asp
Reason Indicator is tracked: High quality national data with good granularity from a core user group (can look at Trust, Site and Specialty). Very high participation rate. Consistent year on year application. Not Library delivered, reducing bias.
How to Calculate: Data from the GMC Survey site.
• Overall score for the Trust for Access to Educational Resources from the Summary page.
• Download scores for individual sub-questions (click through the overall Access to Educational Resources score).
• Site by site data is available, but there are some question marks over the accuracy of coding to sites.
• Specialty data for outliers should be examined.
• Sentiment analysis by calculating (Very Good + Good) – (Very Poor + Poor) = sentiment score (see the sketch after this template).
How to interpret:
• Compare performance on different measures year on year.
• Compare shifts within specialties that have been targeted following red flags in previous years.
• Compare sites for local issues.
• Benchmark against equivalent organisations.
• Be aware of wider issues within the Trust / specialties that may have a negative halo.
Targets:
• Have useful conversations with Medical Education.
• Zero red flags for specialties.
• Improve absolute performance.
• Improve performance against benchmark Trusts.
Link to strategic directions:
• Support key stakeholders and funders.
• Increase satisfaction with the online offer.
• Provide high quality study space.
Plans for performance improvement: Subject to areas highlighted and research on benchmark services
Where are the results reported and how regularly: Results included in annual report. Annual GMC Survey Report prepared for each Trust and discussed at Library User Boards. Annual benchmarking report prepared for Library Leadership Team / wider Library Services.
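To make the sentiment score in the template concrete, a minimal Python sketch; the response counts are invented and the rating labels are assumed to follow the slide's wording rather than any actual GMC survey export format:

```python
# Hypothetical response counts for one question (e.g. "Access to
# Educational Resources" for a single Trust). Labels follow the
# slide: (Very Good + Good) - (Very Poor + Poor) = sentiment score.
responses = {
    "Very Good": 120,
    "Good": 210,
    "Neither": 90,
    "Poor": 45,
    "Very Poor": 15,
}

def sentiment_score(counts: dict[str, int]) -> int:
    """Positive responses minus negative responses."""
    positive = counts["Very Good"] + counts["Good"]
    negative = counts["Very Poor"] + counts["Poor"]
    return positive - negative

print(sentiment_score(responses))  # 330 - 60 = 270
```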
32. @NHS_HealthEdEng #heelks
Have a go
Make a metric
• Paper templates to scribble on
• Ask for electronic
• Ask for help
• Share your metrics #heelks
33. Thanks
Alan Fricker - Head of NHS Partnership & Liaison,
King’s College London
Alan.Fricker@kcl.ac.uk
Richard Parker – Knowledge Manager, Heart of
England NHSFT
Richard.Parker2@heartofengland.nhs.uk
@NHS_HealthEdEng #heelks
Editor's Notes
You could say "Something to argue with"
This was my first thought in a hot summer office!
We started by considering where metrics (and quality assurance / KPI) had been discussed in the NHS previously – we stuck with major initiatives and did not seek out an exhaustive picture of local work.
HeLICon's roots go back to the original LINC health panel accreditation checklist and toolkit (1996-1998)
Example of use of these figures in the NLH finance report
How have we addressed the cons?
Previous attempt to address this issue. Feel very culpable here as one of the people who shot holes in things. Basically – I could game almost every single one – the question was – did they matter?
First six
KPI1. Percentage of the organisation's workforce (headcount) who are "active"* library users. (Indicates penetration of the library service.)
KPI2. Percentage of the organisation's workforce (headcount) who are registered ATHENS users. (Indicates use of e-resources. E.g., 1,000 Athens users in an organisation of 10,000 staff = 10%.)
KPI3. Recurrent expenditure commitment on library services based on the organisation's workforce (WTE). (Indicates Trust commitment to Library Services. E.g., £100,000 spent on library services in a Trust of 10,000 staff = £10 spent on library services per WTE.)
KPI4. Number of information consultancy enquiries per member of staff based on the organisation's workforce (WTE). (Indicates penetration level of library enquiries in the organisation. E.g., 400 enquiries in an organisation with 1,000 staff = a penetration level of 0.4.)
KPI5. Percentage of the organisation's workforce (headcount) that subscribes to current awareness services. (Indicates penetration level of current awareness services in the organisation.)
KPI6. Percentage of the organisation's workforce (headcount) who have received information skills training in one year. (Indicates penetration of information skills / information literacy training in the organisation.)
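A minimal Python sketch of the per-WTE arithmetic in KPI3 and KPI4, using the figures from the worked examples above (illustrative only):

```python
# KPI3: recurrent library expenditure per WTE
spend_gbp, workforce_wte = 100_000, 10_000
print(spend_gbp / workforce_wte)  # 10.0 -> £10 per WTE

# KPI4: enquiry penetration per member of staff (WTE)
enquiries, staff_wte = 400, 1_000
print(enquiries / staff_wte)      # 0.4 enquiries per WTE
```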
Why so few metrics? Issue with tool? Survey overload? Discomfort with metrics?
Lots of people offering pure usage without the context to make it a metric. Cost per download widely used for collection development and VFM evidence. Speed felt important for users but contested – we always negotiate deadlines.
Discovered on the discard pile – they describe what we were seeing in the survey data perfectly
Bingo! A powerful way to think about metrics
Research Libraries UK. Targets set across the piece do not make sense.
Debate in HE around use of Metrics – post REF 2014 and in an increasingly numbers driven approach to career futures.
Now known as NHS Digital. National library of quality assurance indicators – a task under the 2012 Health and Social Care Act – aimed at healthcare delivery and performance, but they work for our quality purposes too
People care about this metric
This metric makes a difference
You could repeat my metric and the results would be consistent
Take care with comparisons!
Doing this is not easy! The template is there to help
Main template
Checklist – good enough for Gawande and the WHO – good enough for me