Metrics Workshop for YOHHLNet

How can we use metrics to better monitor, develop and discuss our services? Building on past experience in NHS Libraries and the wider sector, a set of principles for good metrics is described. These are illustrated using data generated by running. The Quality Metrics Template is introduced as a tool to support the definition and sharing of metrics.

Metrics Workshop for YOHHLNet

  1. Metrics – KfH task and finish group to practical application. Alan Fricker - Head of NHS Partnership & Liaison, King’s College London
  2. What are we up to today? • Thinking about what we mean by metrics • Looking at how we devised some principles • Keeping things practical
  3. Be not afraid!
  4. Why metrics? • How are we doing? • How do we compare? • Have changes made a difference? • Have better conversations @NHS_HealthEdEng #heelks
  5. Defining terms • "A metric is criteria against which something is measured" (Ben Showers (2015) Library Analytics and Metrics) • "a criterion or set of criteria stated in quantifiable terms" (OED) @NHS_HealthEdEng #heelks
  6. What about KPIs? 1. Result Indicators (RIs) tell you what you have done 2. Performance Indicators (PIs) tell you what to do 3. Key Result Indicators (KRIs) tell you how you have done in a perspective or critical success factor 4. Key Performance Indicators (KPIs) tell you what to do to increase performance dramatically. According to Parmenter, as cited in Appleton, L. (2017)
  7. What was the KfH plan? • Take a look around • Identify appropriate methodologies and mechanisms • Help people get better with metrics • Support Knowledge for Healthcare @NHS_HealthEdEng #heelks
  8. @NHS_HealthEdEng #heelks NHS explorations HeLICon • Checklist approach • Accreditation level based on achieving core criteria and excellence in other criteria • Three-year cycle
  9. @NHS_HealthEdEng #heelks NHS explorations HeLICon Pro • Standard criteria • Rigorous • Demonstrated improvements & impacts Con • Laborious • Paper heavy • Infrequent
  10. @NHS_HealthEdEng #heelks NHS explorations National statistics return • Covers finance, activity and staffing • Long history of collection • More on this later today!
  11. @NHS_HealthEdEng #heelks NHS explorations National statistics return Pro • Consistent questions • Reasonable completion rate • Trends discernible Con • Some regions better than others • Missing data • Inconsistent interpretation • Do they matter?
  12. @NHS_HealthEdEng #heelks NHS explorations Library Quality Assurance Framework (LQAF) • Replaced HeLICon (2010 onwards) • 48 criteria across 5 domains – Strategic Management – Finance and Service Level Agreements – Human Resources and staff management – Infrastructure and facilities – Library/Knowledge Services Delivery and Development • Annual submission
  13. @NHS_HealthEdEng #heelks NHS explorations LQAF Pro • Rigorous • Regular • Linked to stakeholders • Growing pool of data Con • Inconsistent compliance regimes • Self-assessment subjective • Burden of evidence collection
  14. @NHS_HealthEdEng #heelks NHS explorations SHALL National KPI • 2011 consultation on 6 national KPIs • Revised to 4 (not all from the original list) – % of the organisation’s workforce (headcount) who are registered library members – % of the organisation’s workforce (headcount) who have registered as a library member in the last year – % of the organisation’s workforce (headcount) who have used ATHENS in the last year – % increase in compliance with the Library Quality Assurance Framework (LQAF) compared with the previous year • Not implemented
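(Aside: the four SHALL KPIs are all simple ratios, so once the counts are agreed they are trivial to compute. A minimal sketch; all figures and variable names here are hypothetical, not from the slides, and the fourth KPI is read as a relative year-on-year change:)

```python
# Hypothetical counts for one organisation -- illustrative only.
workforce_headcount = 5000     # organisation workforce (headcount)
registered_members = 1750      # currently registered library members
new_members_last_year = 420    # registered as members in the last year
athens_users_last_year = 1340  # used ATHENS in the last year
lqaf_now, lqaf_prev = 92.0, 88.0  # LQAF compliance scores (%)

def pct(part, whole):
    """Express part as a percentage of whole."""
    return 100 * part / whole

print(f"KPI 1, members:      {pct(registered_members, workforce_headcount):.1f}%")
print(f"KPI 2, new members:  {pct(new_members_last_year, workforce_headcount):.1f}%")
print(f"KPI 3, ATHENS users: {pct(athens_users_last_year, workforce_headcount):.1f}%")
# KPI 4 reads "% increase in compliance"; taken here as change relative to last year.
print(f"KPI 4, LQAF change:  {pct(lqaf_now - lqaf_prev, lqaf_prev):.1f}%")
```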
  15. Practice in the NHS (at the time) • Brief KfH survey on metrics in use • 150 responses but only 47 offered a metric • 117 metrics suggested @NHS_HealthEdEng #heelks
  16. Areas of focus and approaches [bar chart, counts 0-30, of suggested metrics by area of focus (Access, Book/physical, Current awareness, Document Supply/ILLs, Enquiries, E-Resource Use, Literature Searches, Outreach, Quality assurance, Training, Unclear, User registration, Website) and by approach (Impact, LQAF, Satisfaction, Timely Response, Usage statistics, Value, Not stated)]
  17. Serendipity • Areas for focus (Van Loo in Haines-Taylor & Wilson, 1990): – time consuming – space intensive – high cost – affect most users – directly linked to library objectives – well defined and easy to describe – relatively easy to collect – are in areas where library staff have some control to make changes @NHS_HealthEdEng #heelks
  18. @NHS_HealthEdEng #heelks Wider world - libraries International standard (ISO 11620:2014) • Generic approach to performance indicators • Well defined terms – Resources – Use (activity) – Efficiency (cost) – Potentials and Development (value added work) • 52 indicators offered
  19. @NHS_HealthEdEng #heelks Wider world - libraries International standard - criteria Informative content (provides information for decision making) Reliability (produces the same result when repeated) Validity (measures what it is intended to measure, though indirect measures can be valid) Appropriateness (units and methods of measurement appropriate to purpose) Practicality (does not require unreasonable staff or user time) Comparability (the extent to which a score will mean the same for different services; the standard is clear you should only compare similar services)
  20. @NHS_HealthEdEng #heelks Wider world - libraries RLUK – service standards • Pilot of 8 initial standards • “We will achieve X% in Y” • Shift to benchmarking approach • Potential kite mark
  21. @NHS_HealthEdEng #heelks Wider world The Metric Tide - dimensions “Robustness: basing metrics on the best possible data in terms of accuracy and scope Humility: recognising that quantitative evaluation should support – but not supplant – qualitative, expert assessment Transparency: keeping data collection and analytical processes open and transparent, so that those being evaluated can test and verify the results Diversity: accounting for variation by field, and using a range of indicators to reflect and support a plurality of research and researcher career paths across the system Reflexivity: recognising and anticipating the systemic and potential effects of indicators, and updating them in response.”
  22. @NHS_HealthEdEng #heelks Wider world HSCIC – Quality Assurance Indicators Tool Relevance (Does it meet user need? Is it actionable?) Accurate and reliable (Quality of data? Is it a good estimate of reality?) Timeliness and punctuality (How long after the event is data available / collected?) Accessibility and clarity (How easy is it to access the data? How easy is it to interpret?) Coherence and comparability (Are data from different sources on the same topic similar? Can it be compared over time?) Trade-offs (Would improving this metric have a negative impact on another?) Assessment of user needs and perceptions (What do stakeholders think?) Performance, cost and respondent burden (How much work is involved in collection?) Confidentiality and transparency
  23. Tea and biscuits @NHS_HealthEdEng #heelks
  24. @NHS_HealthEdEng #heelks Principles for good metrics Meaningful • Relates to goals of the organisation • Relates to needs of stakeholders • Re-examined over time to ensure still valid
  25. @NHS_HealthEdEng #heelks Principles for good metrics Actionable • Measures what matters • Measures something you can influence • Drives changes to behaviour / services • Investigate, not assume
  26. @NHS_HealthEdEng #heelks Principles for good metrics Reproducible • Clearly defined in advance • Transparent • Can be replicated • Best available data • Non-burdensome (to allow repetition)
  27. @NHS_HealthEdEng #heelks Principles for good metrics Comparable • Valid over time for internal use • Valid externally for benchmarking • Respect diversity of services
  28. A quick run through • What are your running metrics? • Scribble a couple down and think how they work out as we go along
  29. Meaningful running • Citius, Altius, Amplius • What are my goals? • Are they valid this year? • Not just about times
  30. Actionable running • What matters? • Can I influence it? • Changes behaviour? • Danger of assumptions
  31. Reproducible running • Defined in advance • Transparent • Replicable • Non-burdensome • Best available data
  32. Comparable running • Valid for my performance over time? • Valid externally for benchmarking?
  33. Autopsy of a dashboard!
  34. NHS Athens usage levels M – Very precise goal! A – Influencing how? R – What is usage in this context? Summer versus Autumn? C – This year versus last?
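(Aside on the C question above: the summer/autumn dip means month-on-month comparison is misleading; the like-for-like check is the same month a year earlier. A minimal sketch with hypothetical figures:)

```python
# Hypothetical monthly ATHENS login counts -- illustrative only.
this_year = {"Jul": 310, "Aug": 280, "Sep": 420}
last_year = {"Jul": 290, "Aug": 260, "Sep": 400}

# Compare each month against the same month last year, not the month before,
# so seasonal variation does not masquerade as a trend.
for month, count in this_year.items():
    change = (count - last_year[month]) / last_year[month]
    print(f"{month}: {count} logins, {change:+.0%} year on year")
```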
  35. Lit search turnaround time M – How set? Negotiated deadlines? Does it matter? A – Is it ever broken? R – Is this days or searches? Is there a combined measure that would be clearer? C –
  36. ToC generated requests M – What do the colours mean? What do the requests mean? Importance? A – Ability to influence this? R – 5 or less what? C – Longer run of data?
  37. ToC generated requests M – What do I get from this as a stakeholder? Is this what I want? A – How would we change? R – What is included here? Timed how? Is this accurate? C –
  38. Thanks Dom!
  39. Practical time!
  40. Quality Metrics Template @NHS_HealthEdEng #heelks • Putting theory into practice • Supporting better metrics • Supporting sharing
  41. Quality Metrics Template (2.0) @NHS_HealthEdEng #heelks http://kfh.libraryservices.nhs.uk/metrics/
  42. Quality Metrics Template (2.0) @NHS_HealthEdEng #heelks
  43. Metric Definition: Achieve Bronze for devotion (regularity of running) at 4 week, 12 week, 24 week and 1 year time periods using the SmashRun Rank system
      Why is it important? Demonstrates consistent activity over an extended period. Regular running is the way to improve. Having a target motivates. Looking at all the time periods encourages regularity while smoothing patterns through the year.
      Process for compiling the metric: Open http://smashrun.com/alanfricker/ranks and select the Devotion option. Ensure Demographic is set correctly for comparison (M / 40-49). Check for medal status.
      What does it mean? Records number of sessions versus days in each time period and ranks me against others in the same demographic. Bronze equals top 50%.
      Limitations: Takes no account of distance, so I could track very brief runs to up averages. Seasonal variation in others' running means the bronze standard on shorter time periods varies through the year (fewer sessions in the winter overall).
      Desired outcomes: Increase average number of runs over a year from 1.8 per week in 2017 to 2.1.
      Improvement plans: Find someone to run with / drag the family to Park Run. Take running shoes on holiday. Organise #libraruns meetups at races / conferences.
      Reporting: Review in year but consider full impact at year end.
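(A quick check of the arithmetic behind the desired outcome; the devotion ratio below is paraphrased from the slide's own description, as SmashRun's exact ranking formula is not given:)

```python
# Devotion, as the slide describes it: sessions versus days in a window.
def devotion(runs, days):
    return runs / days

print(f"4-week devotion at 1.8 runs/week: {devotion(1.8 * 4, 28):.2f} runs/day")
print(f"4-week devotion at 2.1 runs/week: {devotion(2.1 * 4, 28):.2f} runs/day")

# The desired outcome restated as whole-year figures.
current, target = 1.8, 2.1
print(f"Current: ~{current * 52:.0f} runs a year")             # ~94
print(f"Target:  ~{target * 52:.0f} runs a year")              # ~109
print(f"Needed:  {(target - current) / current:+.0%} change")  # about +17%
```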
  44. @NHS_HealthEdEng #heelks In use - Quality Framework at Lancashire Teaching Hospitals
  45. Great tools • Running has Strava, Endomondo, Smashrun, Parkrun data, RunBritainRankings etc. • What do we have? – OpenAthens – COUNTER – LMS – Other systems (KnowledgeShare, MailChimp etc.) – Tools for feedback? – Your favourites…
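(Most of the library-side tools on that list can export usage as CSV. A minimal sketch of totalling one usage column from such an export; the filename and column name are assumptions, and real COUNTER reports carry extra header rows, so the parsing would need adjusting:)

```python
import csv

def total_usage(path, column="Reporting_Period_Total"):
    """Sum one numeric column from a CSV usage export (hypothetical layout)."""
    with open(path, newline="") as f:
        # Treat blank cells as zero so a patchy export still totals cleanly.
        return sum(int(row[column] or 0) for row in csv.DictReader(f))

# Example call against a hypothetical export file:
# print(total_usage("ejournal_usage_2017.csv"))
```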
  46. Get stuck in • You should have brought some ideas! • Your tables have – Copies of the template – Copies with space to write – Copies of the LYP dashboard – Copies of the LTH Quality Framework – Copies of metrics from the bank
  47. Have a go • Draft one yourself • Discuss with your table • Ask for help • Be ready to share!
  48. @NHS_HealthEdEng #heelks Make a deposit in the bank Promoting sharing and supporting use • Open submission and publication • Quality check – mostly around reproducibility • One from each of you please!
  49. Thanks Alan Fricker - Head of NHS Partnership & Liaison, King’s College London Alan.Fricker@kcl.ac.uk Images author's own or CC0 from pixabay.com @NHS_HealthEdEng #heelks
