
Considering metrics for NHS Library Services


Prepared for the NHS HE London Health Libraries Conference 2015. Based on survey data collected as part of the Knowledge for Healthcare Metrics task and finish group, it examines what metrics are being used by library and knowledge services working with the NHS, and why.



Considering Metrics for NHS Library Services
Alan Fricker, Head of NHS Partnership & Liaison, King’s College London
Tracey Pratchett, KLS Manager, Royal Preston Hospital

1. Introduction and aims
How do we know if our services are performing well? How do we convince our stakeholders? Good metrics can help us but are not straightforward to set. The Metrics Task and Finish Group (part of the Quality and Impact stream of Knowledge for Healthcare (KfH)) is preparing a report that will advance principles for good metrics in libraries serving the NHS. To support this we wanted to establish a picture of metrics already in use that are felt to be effective.

2. Data collection
An online survey was created in SurveyMonkey. This was distributed via the KfH blog and NHS network mailing lists and ran for slightly over a fortnight. 150 responses were received, but only 47 of these included a metric; a total of 117 metrics were put forward in all. Responses were received from across England and from teams working with primary, acute and mental health staff. The majority of responses came from services based in NHS organisations, but local government and higher education were also represented.

What makes a metric? The survey included a simple definition: "A metric is criteria against which something is measured" (Ben Showers (2015) Library Analytics and Metrics) and "a criterion or set of criteria stated in quantifiable terms" (OED).

Selected respondent comments:
“A better count would be average usage of Athens accounts”
“Differences across various sites helps Library ensure that library staffing levels are appropriate”
“Very effective as it demonstrates the turnaround time”
“Difficulty is relating stats to a specific resource”
“Demonstrates the currency of our stock”
“Demonstrates cost effectiveness (or not) to senior managers”

3. What are people considering and how are they measuring?
People were asked to name their metrics and indicate how they collected them. A wide range of service areas is being examined. Most approaches include usage data, either system generated or manually recorded. More complex measures included LQAF and survey tools incorporating satisfaction measures. The focus for metrics was frequently linked to areas requiring large inputs of staff time or where data is readily available.

[Chart: number of metrics (0 to 30) reported for each service area (Access, Book/physical, Current awareness, Document Supply/ILLs, Enquiries, E-Resource Use, Literature Searches, Outreach, Quality assurance, Training, Unclear, User registration, Website), broken down by measure type (Impact, LQAF, Satisfaction, Timely Response, Usage statistics, Value, Not stated).]

4. Why these particular metrics?
Respondents were asked why they found their metric effective. Responses were coded to identify themes; a single metric can carry multiple codes.

Code                Definition                                       Metrics (n)
Easy to understand  Metric clear to them and/or stakeholders         4
Impact              Used for impact work                             15
Satisfaction        Satisfaction or quality related                  11
Simple to collect   Metric felt easy to get data for                 10
Stakeholder agreed  Requested / required by stakeholder              18
Timely response     Measures of speed of response                    20
Usage               Measures of usage                                52
User insight        Understand user behaviour (and segment users)    41
Value               Value for money                                  26
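The figures in the Metrics (n) column are tallies of how often each thematic code was applied across the metrics put forward, with one metric able to contribute to several codes. As a purely illustrative sketch (the example metrics and code assignments below are invented, not taken from the survey), a multi-code tally of this kind can be produced in a few lines of Python:

from collections import Counter

# Hypothetical coded survey responses: each metric put forward can carry
# more than one thematic code, which is why the code counts in the table
# can sum to more than the 117 metrics collected.
coded_metrics = [
    {"metric": "Cost per download of e-journals", "codes": ["Usage", "Value"]},
    {"metric": "Literature searches completed within 5 days", "codes": ["Timely response", "Stakeholder agreed"]},
    {"metric": "Training attendance by staff group", "codes": ["Usage", "User insight"]},
]

# Tally how many metrics were assigned each code.
code_counts = Counter(code for item in coded_metrics for code in item["codes"])

for code, count in code_counts.most_common():
    print(f"{code}: {count}")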
5. Discussion
Survey responses showed a mixed level of understanding of metrics. Pure activity measures were offered unqualified, and a number of suggestions did not meet the metric definition. Limitations around the quality of electronic resource data were flagged, yet cost per download was the single most widely used metric. Cost per download / usage was widely used to support decision making, including consideration of activity by different user groups. Speed of response was useful to flag staffing issues and to reassure users. Service level agreements are driving the adoption of metrics that are readily comprehensible to stakeholders. The high cost of literature search services makes them a strong focus. The metrics offered will be used to test the proposed principles.
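As the Discussion notes, cost per download was the single most widely used metric, often considered by user group. The arithmetic is simply subscription cost divided by recorded downloads; the sketch below illustrates this with invented figures and user groups (nothing here is drawn from the survey data):

# Illustrative only: the subscription cost, user groups and download counts
# below are invented, not taken from the survey.
annual_subscription_cost = 12_000.00  # cost of a hypothetical e-journal package (GBP)

downloads_by_group = {
    "Doctors": 4200,
    "Nurses": 2600,
    "Allied health professionals": 1100,
}

# Overall cost per download: total cost divided by total recorded downloads.
total_downloads = sum(downloads_by_group.values())
print(f"Overall cost per download: £{annual_subscription_cost / total_downloads:.2f}")

# Activity by user group: each group's share of use and of the cost.
for group, downloads in downloads_by_group.items():
    share = downloads / total_downloads
    print(f"{group}: {downloads} downloads ({share:.0%} of use), "
          f"£{annual_subscription_cost * share:,.2f} of the cost")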
