JUSP Streamlines University of Portsmouth's Big Deal Benchmarking Process
1. JUSP: The University of Portsmouth Experience
Sarah Weston
Data Manager
University Library
2. Background
At Portsmouth we do not currently have an ERM system or
usage statistics packages
Usage data is stored locally and retrieved from multiple
administration accounts. Currently collecting data from 60+
different sources in relation to electronic journals alone
Primary objective of our internal benchmarking was to
evaluate our ‘Big Deals’, determine value for money and
provide a sound evidence base to assist decision making
3. Benchmarking activities
Initial venture into ‘Big Deal’ benchmarking was around 18-24
months ago – adopted a teamwork approach
Keen to explore the extent to which we felt our deals were
providing us with value for money and to look at the
implications if we were to consider cancellations
At that time we did not have the benefit of JUSP and so our
early activity was a little ad hoc and not necessarily the
most time efficient in terms of process
4. Initial process
Decided to focus on three medium sized deals and adopted
a two strand approach
Activity A
• Obtain full title lists across multiple years and track changes
• Obtain lists of PRE X subs
• Obtain title counts for deals
• Obtain costs data
Activity B
• Access usage from publisher platforms
• Amalgamate any usage from aggregator/host platforms
• Remove any archive data
• Match usage with deal titles
Having determined the number of titles in the deal on a year-by-year basis, how
much they cost, and how much they were used ((JR1-JR1a) + Aggregator +
Host), it was possible to do some cost per use calculations
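The calculation described above can be sketched in Python. This is a minimal illustration of the ((JR1-JR1a) + Aggregator + Host) formula only; the figures in the example are invented, not Portsmouth data:

```python
def cost_per_use(cost, jr1, jr1a, aggregator_use, host_use):
    """Cost per use for a deal title, following the formula
    ((JR1 - JR1a) + Aggregator + Host): archive (JR1a) downloads are
    removed from the JR1 total before adding usage recorded on
    aggregator and host platforms."""
    total_use = (jr1 - jr1a) + aggregator_use + host_use
    if total_use == 0:
        return None  # no recorded use, so cost per use is undefined
    return cost / total_use

# Hypothetical example: a title costing 1,200 with 400 JR1 downloads,
# 50 of which were archive (JR1a), plus 30 aggregator and 20 host downloads.
print(cost_per_use(1200, 400, 50, 30, 20))  # 1200 / 400 = 3.0
```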
5. Key issues
• For a three year period this was time consuming and
involved lots of steps
• Obtaining accurate title lists (current and old) was not
always easy
• Records of PRE X subs did not always match
• No single place to access information, and data formats
often differed
• Needed to remove all of the ‘weird and wonderful’
6. Internal Coding
• Colour coding was
adopted to distinguish
PRE X subs from titles
within the deal
• Titles were also tracked
to show at what points
they entered the deal as
this was important in
terms of calculations
7. What could JUSP do for us?
We like JUSP, and it is doing more for us on a month-by-month
basis!
Our needs:
• On-going time series of data
• Usage amalgamated from all sources
• Ability to easily identify PRE X subs and titles within the
deal over time
• Ability to extract title usage relating to open access, trials
etc.
• Need to include some elements of print
(on our own here!)
8. A few of our favourite things!
Having already started to add our subscribed titles, the ‘titles versus deals’ report
enables us to identify titles within our deal, which is our baseline for analysis, and
to separate the PRE X titles so we can accurately benchmark our costs
9. A few of our favourite things!
Downloading a copy of the CSV file for this report, you can see that some
additional information has been added in terms of aggregator usage
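As a sketch of the kind of post-download processing this CSV enables, the file can be loaded and publisher plus aggregator usage amalgamated per title. The column names here ('Title', 'Publisher use', 'Aggregator use') are illustrative placeholders, not the portal's actual headers:

```python
import csv
from collections import defaultdict
from io import StringIO

def total_use_per_title(csv_file):
    """Amalgamate publisher and aggregator usage per title from a
    downloaded report. Column names are hypothetical placeholders
    for whatever headers the report actually provides."""
    totals = defaultdict(int)
    for row in csv.DictReader(csv_file):
        totals[row["Title"]] += int(row["Publisher use"]) + int(row["Aggregator use"])
    return dict(totals)

# Hypothetical report with one title split across two rows.
sample = StringIO(
    "Title,Publisher use,Aggregator use\n"
    "Journal A,120,15\n"
    "Journal A,30,5\n"
    "Journal B,40,0\n"
)
print(total_use_per_title(sample))  # {'Journal A': 170, 'Journal B': 40}
```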
10. Titles included in deals across multiple years
The titles within deals over time report gives at-a-glance information on
how deal content has changed, facilitating accurate reporting
11. Publisher usage by title and year
The most valuable report for our benchmarking, eliminating a significant number
of steps and providing an accurate time series upon which to import our own data
12. Titles and usage range
This report is likely to be important; one of our internal benchmarks has been three-
figure usage. The ability to see at a glance the breakdown of package usage will be
helpful
13. Impact
The portal manipulates our usage data and significantly
reduces the number of steps prior to our own analysis
Our benchmarking had focused on smaller deals; however,
this will make our larger reports much easier to manage
and more time-efficient to produce
We have not always known exactly what we have wanted
and some of the more experimental reports have been
particularly welcomed
14. Where do we go from here?
The portal provides us with an accurate record of titles and
usage in a deal over time
Allows us to produce accurate reports into which we can
now import cost data and subsequently calculate cost per
download, either at deal or title level
From this we are able to apply some of our own internal
criteria for benchmarking and look at titles within a certain
cost-per-download banding, three-figure usage, PRE X subs,
or the status of the title as determined by faculty/departments
15. Summary
• The portal provides us with a valuable ‘one stop shop’
• It has assisted us with our internal processes
• Continues to evolve and responds to user needs
Editor's Notes
Detailed benchmarking was a new activity. Part of the process involved reporting to Library Committee. Looked at the value of deals in comparison to other options.
Adopted a teamwork approach divided into two activity strands; this was beneficial as a means of sharing ideas, given that there was no precedent for this activity.
Identifying precise deal content and changes over time was time consuming. This involved removing titles within reports that did not form part of the deal but were showing usage for a variety of reasons, e.g. individual subscriptions, freely accessible titles, etc.
In our case, calendar-year data was required for the process; however, we recognised that there might be a preference for other timescales.
Significant use is being made of the ‘experimental’ reports. This report provides at-a-glance information on titles included in the deal and those that can be removed. The inclusion of details relating to the deal is an important addition to the portal. This is a quick exercise in terms of updating each year.
If ‘core’ titles have been added to the portal these become flagged as ‘core’ within reports which allows filtering.
This simplifies the colour coding process which was done prior to JUSP. Such detail is important in terms of accurate calculations.
Favourite report and a base report for our benchmarking, providing a detailed time series of usage data. Pre-existing subscription titles are easily identified for further analysis. We also intend to add individual subscription details to the portal so that these will be clearly identifiable when running other reports.
Useful for summary reporting of the deal in comparison with the whole usage report.
Use of the portal is helping to open up the debate/discussion about the value of resources and how we measure them.
The portal simplifies the process, reducing the steps required for detailed analysis, and provides an easy-to-use interface which responds to the changing needs of the community.