Presentation for the pre-conference workshop at the Charleston Conference, 2014. There will be a time when your library will need to evaluate all of your electronic resources. How would you do it? In response to a cut to our materials budget, we developed a method that condenses a large amount of information into a few select criteria. In this day-long workshop, we will walk through the process, using the decision grid process developed at the University of Maryland at College Park (Foudy and McManus 533-538) as a starting point. The workshop leaders will first demonstrate each step of our process, and then the participants will work in small groups (5-7) using their own experiences and a sample data set. The steps covered will include selecting and defining the criteria, gathering and analyzing the data, and determining how to make final decisions. We will cover some technical aspects of gathering and analyzing data, including using Excel functions.
Planning for the budget-ocalypse: The evolution of a serials/ER cancellation ...University of North Texas
The University of North Texas Libraries are funded almost entirely by undergraduate student use fees. As the undergraduate enrollment has plateaued in recent years, the libraries have not been able to keep up with rising costs, resulting in a series of cuts to the materials budget totaling nearly $4 million. While some of these cuts took the form of reductions in firm orders and dissolution of approval plans, for the past three years the bulk have come from cancellations of serials and electronic resources. With each year's cuts, the UNT Collection Development department has been forced to modify and refine their deselection process. This presentation will show the development of UNT's strategy for determining cancellations using a variety of methods (overlap analysis, usage statistics, faculty input) and tools (EBSCO Usage Consolidation, Serials Solutions 360).
Planning for the Budget-ocalypse: The Evolution of a Serials/ER Cancellation ...NASIG
The University of North Texas Libraries are funded almost entirely by undergraduate student use fees. As the undergraduate enrollment has plateaued in recent years, the libraries have not been able to keep up with rising costs, resulting in a series of cuts to the materials budget totaling nearly $4 million. While some of these cuts took the form of reductions in firm orders and dissolution of approval plans, for the past three years the bulk have come from cancellations of serials and electronic resources. With each year's cuts, the UNT Collection Development department has been forced to modify and refine their deselection process. This presentation will show the development of UNT's strategy for determining cancellations using a variety of methods (overlap analysis, usage statistics, faculty input) and tools (EBSCO Usage Consolidation, Serials Solutions 360).
Presenters:
Todd Enoch
Head of Serials and Electronic Resources, University of North Texas
Karen Harker
Collection Assessment Librarian, University of North Texas
Lecture presented by Vivian Praxedes D. Sy at PAARL's Summer Conference on the theme "Library Analytics: Data-driven Library Management", held at Pearl Hotel, Manila on 20-22 April 2016
Action-Oriented Research Agenda on Library Contributions to Student Learning ...Lynn Connaway
Connaway, Lynn Silipigni, William Harvey, Vanessa Kitzie, and Stephanie Mikitish. 2017. “Action-Oriented Research Agenda on Library Contributions to Student Learning and Success.” Presented at the ALA Midwinter Meeting, Atlanta, Georgia, January 22.
Lecture presented by Fernan R. Dizon at PAARL's Summer Conference on the theme "Library Analytics: Data-driven Library Management", held at Pearl Hotel, Manila on 20-22 April 2016
Slides from a presentation given 9 March 2017 at the Digital Education Summit at Sam Houston State University in Huntsville, TX. Session description: "Open Educational Resources (OER) can be great tools to enhance online courses. But what exactly are they, and how do you find them and put them to use? This session will define and illustrate OER broadly (and open textbooks in particular), highlight key tools for discovering OER, and share examples of how the integration of OER can benefit you and your students."
Mary Moser, Learning Commons Librarian, and Satu Riutta, Institutional Research Associate, both of Oxford College of Emory University, presented their findings from the Research Practices Survey at the Association of General and Liberal Studies conference in October 2009.
Demonstrating the Value of Academic Libraries in Times of Uncertainty: A Rese...Lynn Connaway
Connaway, Lynn Silipigni. 2017. “Demonstrating the Value of Academic Libraries in Times of Uncertainty: A Research Agenda for Student Learning and Success.” Presented at the University of Macau, Macau, April 6.
NISO Virtual Conference: Expanding the Assessment Toolbox: Blending the Old and New Assessment Practices
Keynote Address: The Value of Library-Provided Content: Assessing Usage and Demonstrating Impact
Megan Oakleaf, Associate Professor of Library and Information Science, iSchool at Syracuse University
Where are We Going and What Do We Do Next? Demonstrating the Value of Academi...Lynn Connaway
Connaway, Lynn Silipigni. 2017. “Where are We Going and What Do We Do Next? Demonstrating the Value of Academic Libraries in Time of Uncertainty.” Presented at the RLUK Conference 2017, London, United Kingdom, March 9.
Lecture presented by Rhea Rowena U. Apolinario at PAARL's Summer Conference on the theme "Library Analytics: Data-driven Library Management", held at Pearl Hotel, Manila on 20-22 April 2016
Demonstrating the Value of Academic Libraries in Times of Uncertainty: A Rese...Lynn Connaway
Connaway, Lynn Silipigni. 2017. “Demonstrating the Value of Academic Libraries in Times of Uncertainty: A Research Agenda for Student Learning and Success.” Presented at the University of Hong Kong, Hong Kong, April 7.
Working together: the final report: ALA 2012 (long)SAGE Publishing
Slides from Elisabeth Leonard's presentation on the "working together: evolving value for academic libraries" research by LISU and commissioned by SAGE
Final session in a series of four seminars presented to University of North Texas librarians. This presentation brings together some best practices for gathering, organizing, analyzing, and presenting statistics and data.
The World Statistics Pocketbook, 2013 edition is an annual compilation of key statistical indicators prepared by the United Nations Statistics Division of the Department of Economic and Social Affairs. Over 50 indicators have been collected from more than 20 international statistical sources and are presented in one-page profiles for 216 countries or areas of the world. This issue covers various years from 2005 to 2012. For the economic indicators, in general, three years - 2005, 2010 and 2011 - are shown; for the indicators in the social and environmental categories, data for one year are presented.
World of Watson Data Science and Machine Learning trackArmand Ruis
Explore the many Data Science related presentations and labs at World of Watson. Hear about the one-stop-shop of IBM Data Science Experience which allows teams to collaborate and learn in one place. In addition, learn more about Machine Learning, Apache® Spark™, and a host of other related technology
Analysis of user experience is typically done by taking a random sample of users, measuring their experiences and extracting a single number from that sample. In terms of web performance, the experience we need to measure is user perceived page load time, and the single number we need to extract depends on the distribution of measurements across the sample.
There are a few contenders for what the magic number should be. Do you use the mean, median, mode, or something else? How do you determine the correctness of this number or whether your sample size is large enough? Is one number sufficient?
This talk covers some of the statistics behind figuring out which numbers one should be looking at and how to go about extracting it from the sample.
SPSS Statistics 17 completes the core programmability building blocks begun in SPSS 14. This presentation reviews the benefits and technology of programmability and shows four examples.
This presentation provides an overview of where programmability started, what features were available in SPSS 14 and what new features were added to SPSS 15. It explains how programmability works, shows where developers can find a host of resources, and provides numerous practical examples.
Extending and customizing ibm spss statistics with python, r, and .net (2)Armand Ruis
This presentation provides an overview of the programmability features available with the SPSS Statistics product (as of release 19), and contains examples highlighting a number of these features.
Presented at the PAARL Convention on the theme "Collection Development in the Digital Age," held at Corporate Inn, Ma. Orosa St., Manila, Philippines, 2003 Jan. 30.
What ARE we thinking? Collections decisions in an Academic LibraryLinda Galloway
When faced with multiple competing priorities for investment in library resources, there are many important aspects to consider. From student enrollment to prominence of programs, there are both data-driven and intangible factors to weigh. In addition, most library collections now focus on the immediate needs of students and researchers instead of collecting for posterity. This just-in-time versus just-in-case collection development mindset prioritizes different resource attributes and requires an often unfamiliar level of acquisitions flexibility.
The Changing Nature of Collection Development in Academic LibrariesFe Angela Verzosa
Presented at the seminar-workshop sponsored by the Center for Human Research and Development Foundation Inc. at PBSP Bldg, Intramuros, Manila, Philippines on 24 August 2006
lecture presented by Janice Penaflor for PAARL's 1st Marina G. Dayrit Lecture Series 2016 held at Asian Institute of Maritime Studies, Roxas Boulevard, Pasay City on February 19, 2016
Presenter(s): Emily Thornton, Cristina Trotter, Michael Holt, Louise Lowe.
“What is being assessed in libraries today? What tools and methods are being used? What should be assessed but is not? Why?” A national survey in Spring 2016 explored these pressing questions while investigating the current practice of assessment in libraries today. In this presentation, the researchers discuss the survey results and implications of the data.
Similar to Keeping it real: A comprehensive and transparent evaluation of electronic resources
The Building Blocks of QuestDB, a Time Series Databasejavier ramirez
Talk Delivered at Valencia Codes Meetup 2024-06.
Traditionally, databases have treated timestamps just as another data type. However, when performing real-time analytics, timestamps should be first class citizens and we need rich time semantics to get the most out of our data. We also need to deal with ever growing datasets while keeping performant, which is as fun as it sounds.
It is no wonder time-series databases are now more popular than ever before. Join me in this session to learn about the internal architecture and building blocks of QuestDB, an open source time-series database designed for speed. We will also review a history of some of the changes we have gone over the past two years to deal with late and unordered data, non-blocking writes, read-replicas, or faster batch ingestion.
Adjusting OpenMP PageRank : SHORT REPORT / NOTESSubhajit Sahu
For massive graphs that fit in RAM, but not in GPU memory, it is possible to take
advantage of a shared memory system with multiple CPUs, each with multiple cores, to
accelerate pagerank computation. If the NUMA architecture of the system is properly taken
into account with good vertex partitioning, the speedup can be significant. To take steps in
this direction, experiments are conducted to implement pagerank in OpenMP using two
different approaches, uniform and hybrid. The uniform approach runs all primitives required
for pagerank in OpenMP mode (with multiple threads). On the other hand, the hybrid
approach runs certain primitives in sequential mode (i.e., sumAt, multiply).
Levelwise PageRank with Loop-Based Dead End Handling Strategy : SHORT REPORT ...Subhajit Sahu
Abstract — Levelwise PageRank is an alternative method of PageRank computation which decomposes the input graph into a directed acyclic block-graph of strongly connected components, and processes them in topological order, one level at a time. This enables calculation for ranks in a distributed fashion without per-iteration communication, unlike the standard method where all vertices are processed in each iteration. It however comes with a precondition of the absence of dead ends in the input graph. Here, the native non-distributed performance of Levelwise PageRank was compared against Monolithic PageRank on a CPU as well as a GPU. To ensure a fair comparison, Monolithic PageRank was also performed on a graph where vertices were split by components. Results indicate that Levelwise PageRank is about as fast as Monolithic PageRank on the CPU, but quite a bit slower on the GPU. Slowdown on the GPU is likely caused by a large submission of small workloads, and expected to be non-issue when the computation is performed on massive graphs.
Chatty Kathy - UNC Bootcamp Final Project Presentation - Final Version - 5.23...John Andrews
SlideShare Description for "Chatty Kathy - UNC Bootcamp Final Project Presentation"
Title: Chatty Kathy: Enhancing Physical Activity Among Older Adults
Description:
Discover how Chatty Kathy, an innovative project developed at the UNC Bootcamp, aims to tackle the challenge of low physical activity among older adults. Our AI-driven solution uses peer interaction to boost and sustain exercise levels, significantly improving health outcomes. This presentation covers our problem statement, the rationale behind Chatty Kathy, synthetic data and persona creation, model performance metrics, a visual demonstration of the project, and potential future developments. Join us for an insightful Q&A session to explore the potential of this groundbreaking project.
Project Team: Jay Requarth, Jana Avery, John Andrews, Dr. Dick Davis II, Nee Buntoum, Nam Yeongjin & Mat Nicholas
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
Discussion on Vector Databases, Unstructured Data and AI
https://www.meetup.com/unstructured-data-meetup-new-york/
This meetup is for people working in unstructured data. Speakers will come present about related topics such as vector databases, LLMs, and managing data at scale. The intended audience of this group includes roles like machine learning engineers, data scientists, data engineers, software engineers, and PMs.This meetup was formerly Milvus Meetup, and is sponsored by Zilliz maintainers of Milvus.
4. WHO WE ARE
Karen R. Harker, MLS, MPH
Collection Assessment Librarian
Laurel Crawford, MLS
Manager, Collection Development
Todd Enoch, MLS
Manager, Serials & Electronic Resources
University of North Texas Libraries
5. WHO ARE YOU?
Get to know your table-mates
What is the most interesting thing?
Who came the farthest?
Name your table team
1980’s Band
6. OUR STORY
The Good Stuff
University of North Texas
Solid teaching university
Small but growing graduate programs
Largest public university in the North Texas area
Third largest university system in Texas
Strong, vibrant community support
Enrollment grew strongly
Low rate of tuition increase
“Best Deal” in education
The Not-So-Good Stuff
State funding declined
Tuition increased
Enrollment plateaued
Funding of library based on student fees
Library appropriations have been stable
Continuation costs have grown 5-7% per year
Effectively reducing collections expenditures
8. OUR STORY
In October 2013
Prepared for at least another $1M cut to the collections budget for FY2014
Decisions would be based on data, not feelings or hunches or the squeakiest wheels
Inspired by the article by Gerri Foudy & Alesia McManus:
"Using a decision grid process to build consensus in electronic resources cancellation decisions," The Journal of Academic Librarianship 31, no. 6 (November 2005): 533-538.
9. THEIR STORY – OUR STORY
University of Maryland, College Park
Public university
Enrollment: 35,000
Budget year: 2004
Expected cuts: 25%
University of North Texas, Denton
Public university
Enrollment: 33,000
Budget year: 2014
Expected cuts: 20%
10. THEIR GOALS – OUR GOALS
UMCP
To evaluate all serial subscriptions to identify lesser priority serial titles in all subject areas.
To develop a process to be managed by the libraries' subject teams and coordinated by the collection management staff.
To ensure that the serial review process be fair and open to the campus community, with all faculty having an opportunity to respond to suggested cancellations.
To balance specific departmental needs and more interdisciplinary needs.
To respect the libraries' responsibilities to manage the collections budget in a responsible manner.
To identify the lowest priority of serials expenditures in all formats and sort these commitments into three levels.
UNT
To evaluate all of the most expensive serial subscriptions to identify lesser priority titles in all subject areas.
To develop a process managed by the collection development staff with input from the subject specialists.
To ensure that the serial review process be fair and open to the campus community, with all faculty having an opportunity to respond to suggested cancellations.
To support a more interdisciplinary collection.
To respect the libraries' responsibilities to manage the collections budget in a responsible manner.
To identify the lowest priority of serials expenditures.
11. THEIR CRITERIA – OUR CRITERIA
UMCP
Cost-effectiveness
Access
Breadth/Audience
Uniqueness
UNT
Cost-effectiveness
Ease of Use
Breadth/Audience
Uniqueness to the curriculum
12. THEIR METHODS – OUR METHODS
UMCP
Subject teams
Subject librarians
Broad disciplines
Three levels:
3 = does not meet the criteria well
2 = somewhat meets the criteria
1 = good at meeting the criteria
UNT
Collection development team
Resources grouped by type
Ranked resources
Subject librarians rated resources
Ease of Use
Breadth or Audience
Uniqueness to the curriculum
13. THEIR ANALYSIS – OUR ANALYSIS
UMCP
Summed the scores for each resource across all criteria
Grouped the resources by score
Priority 3 = could be canceled with the least damage to library services.
Priority 2 = cancellation of these resources would more severely damage library services.
Priority 1 = cancellation of these resources would cause the greatest damage to library services.
UNT
Ranked each group of resources by the score for each criterion
Cost-per-Use
Change in price
Subject librarians' ratings
Weighted average of 3 scores ((ease of use + breadth*2 + uniqueness*3)/3)
Converted to percentile distributions
Lowest 10th percentile was the worst-performing
Averaged the percentile distributions
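The UNT pipeline above (weighted rubric score, conversion to percentiles on a common scale, averaging the percentiles) can be sketched in Python. This is an illustrative sketch, not UNT's actual code: the helper names and sample numbers are invented, and the weighted-average formula is copied verbatim from the slide.

```python
from statistics import mean

def weighted_rating(ease, breadth, uniqueness):
    # Weighted average of the three librarian ratings, exactly as the
    # slide states it: (ease of use + breadth*2 + uniqueness*3) / 3.
    return (ease + breadth * 2 + uniqueness * 3) / 3

def percentile_ranks(values, higher_is_better=True):
    # Put every measure on the same 0-100 scale, pointed in the same
    # direction, so percentiles from different criteria can be averaged.
    ordered = sorted(values, reverse=not higher_is_better)
    n = len(values)
    return [100 * (ordered.index(v) + 1) / n for v in values]

# Hypothetical resources: cost-per-use (lower is better) and the
# librarians' weighted rubric score (1 = best, so lower is better too).
cpu = [1.50, 12.00, 4.25, 7.80]
rubric = [2.5, 1.8, 3.0, 1.2]
combined = [mean(p) for p in zip(percentile_ranks(cpu, higher_is_better=False),
                                 percentile_ranks(rubric, higher_is_better=False))]
# The lowest combined percentiles mark the worst-performing resources.
```

The lowest decile of the combined percentiles then corresponds to the "worst-performing" group described above.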
14. THEIR RESULTS – OUR RESULTS
UMCP
Had 10% budget target
Cancelled all resources in Priorities 2 & 3
Campus community was kept informed and had ample opportunity for comment…
Libraries received very little negative feedback
…"and thanked us for doing the review"
UNT
Had $1.25M target
Cancelled the lowest-performing resources
Selectively modified or cancelled mid-level resources
Advised to refrain from communicating with the faculty in a timely manner
Some negative comments, some grousing about library funding in general
Always able to back up decisions with data
Provided a full inventory of electronic resources with updated information
Thanked by the Provost and Vice-Provost of Academic Affairs
16. WHAT’S NEXT
Selecting & Defining the Criteria – Karen Harker
Gathering the Data – Todd Enoch
LUNCH!
Analyzing the Data – Todd Enoch
Making Final Decisions – Laurel Crawford
Conclusions & Discussion – You!
Each section
Some storytelling
Some guided discussion
Some hands-on activities
18. THE STORY WE WANTED TO TELL
Our decisions will be based on the data that most closely matches our values.
19. WHAT ARE OUR VALUES?
Education over research
First two of Four Bold Goals
Library funded by student fees
Research often not well-connected with education
Cost-effectiveness
Third of Four Bold Goals
Doing more with less
University coming under scrutiny
Need to take a stand
Holistic, interdisciplinary collection development
No more subject-specific funds or formulas
Resources are inherently interdisciplinary
Education is becoming more interdisciplinary
Transparency
Communicating with the faculty & subject librarians
Provide data that supports all decisions made
20. OAKLEAF’S LIBRARY IMPACT MODEL
Oakleaf, Megan. "Are They Learning? Are We? Learning Outcomes and the Academic Library." The Library Quarterly 81, no. 1 (January 2011): 61-82. doi:10.1086/657444.
21. WHAT ARE YOUR VALUES?
Value of Academic Libraries – Megan Oakleaf
Align library’s values with the values of stakeholders.
Activity 1: Institutional Focus Areas
Top 5 Institutional Focus Areas
Individually
For whole team
23. THEIR CRITERIA – OUR CRITERIA
UMCP
Cost-effectiveness
Breadth/Audience
Uniqueness
Access
UNT
Cost-effectiveness
Breadth/Audience
Uniqueness to the curriculum
Ease of Use
24. COST-EFFECTIVENESS
UMCP
Cost-per-search
Rapid inflator
UNT
Cost-per-use
Use depends on resource type
Renewal price divided by the three-year average of annual uses
Inflationary trends
5-year change in expenditures
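The two cost-effectiveness measures above reduce to simple arithmetic. A minimal sketch (the dollar figures and use counts are invented for illustration):

```python
def cost_per_use(renewal_price, annual_uses):
    # Renewal price divided by the three-year average of annual uses,
    # per the slide's definition; zero use yields an infinite cost-per-use.
    avg_uses = sum(annual_uses) / len(annual_uses)
    return renewal_price / avg_uses if avg_uses else float("inf")

def pct_change(expenditures):
    # Inflationary trend: percent change from the first to the last year
    # of the history (e.g., a 5-year change in expenditures).
    return 100 * (expenditures[-1] - expenditures[0]) / expenditures[0]
```

For example, a $3,000 renewal with 150, 120, and 130 uses over three years works out to $22.50 per use, and an expenditure history running from $1,000 to $1,350 is a 35% five-year change.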
25. COST-EFFECTIVENESS
UMCP
Cost-per-search
Rapid inflator
UNT
80/20 Rule for Big Deals
Packages only
Distribution of titles across a package by usage
Target set: 80% of uses served by 20% of titles
Other ways of obtaining content?
PPV, ILL, Get it Now?
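The 80/20 check above can be computed directly from a package's per-title usage. A sketch (the package and its counts are hypothetical):

```python
def usage_concentration(title_uses, top_fraction=0.20):
    # Fraction of a package's total uses served by its most-used
    # `top_fraction` of titles, for comparison with the 80/20 target.
    ordered = sorted(title_uses, reverse=True)
    k = max(1, round(len(ordered) * top_fraction))
    total = sum(ordered)
    return sum(ordered[:k]) / total if total else 0.0

# Hypothetical package of ten titles: the top two titles carry 80% of
# total use, so this package exactly meets the 80/20 target.
package = [500, 300, 80, 40, 30, 20, 15, 10, 3, 2]
```

A package whose concentration is well below the target spreads its use more evenly across titles; one far above it may be a candidate for replacing the Big Deal with PPV or ILL for the long tail.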
26. BREADTH/AUDIENCE
UMCP
Impact on research and/or curriculum needs
Number of users affected
Primary user groups
Number of searches per year
UNT
Interdisciplinary:
The primary user group is very broad; it is used by students across several disciplines. The number of students who would be affected by cancellation would be high. It has a high impact on curriculum and teaching.
Disciplinary:
The primary user group is limited to the discipline. It may impact curriculum and teaching.
Niche:
The number of students who would be affected by cancellation would be limited. It has a minimal impact on curriculum and teaching; it may be used only for faculty research.
27. UNIQUENESS
UMCP
Material Covered
Overlap with other sources
Unique resource for curriculum and/or teaching
UNT
Subjective rating of liaisons
Totally unique
Somewhat unique
Not unique
Overlap with other sources
Full-text: Ejournals management system
A&I databases: Search Ulrich's
28. THEIR CRITERIA - OUR CRITERIA
Access
Technical reliability
OpenURL or Z39.50 compliance
Ease of Use
Accessibility remotely
CD-ROM format (2004)
Ease of Use
Easy
Most students can use both simple and advanced features without assistance; students routinely successfully find and access information.
Moderate
Students can figure out simple tasks on their own, but need coaching for advanced tasks; students have some difficulty finding or accessing information without help.
Difficult
Students require coaching for even the simplest tasks; only expert, experienced users are able to find and access information without assistance.
29. USAGE MEASURES
Highest & Best Measure of Usage
That which is closest to the end-user experience
Varies by what the resource provides
30. HIGHEST & BEST USES
Individual journals: Full-text downloads
Ejournal packages: Full-text downloads; Distribution of usage across titles
Audiovisual: Items streamed/full-text downloads
Literature databases: Abstracts/record views
Full-text databases: Abstract/record views; Full-text downloads
Online reference (miscellaneous): Abstract/record views
31. QUALITY
Impact Factor
Journal Lists
• From accrediting agencies
• From P&T committees
Subjective Opinions
• Librarians
• Faculty
32. WHAT DATA MATCHES OUR VALUES?
Education: Ease of Use; Breadth or Audience; Uniqueness of content
Cost-effectiveness: Cost-per-Use; Inflationary Trends; 80/20 distribution (packages)
Interdisciplinarity: Breadth or Audience
33. WHAT DATA MATCHES YOUR VALUES?
SELECT 3 MEASURES THAT MOST CLOSELY MATCH YOUR VALUES.
35. ELEMENTS OF A CLEAR DEFINITION
Characteristic: What exactly is being measured?
Source: Vendor; Local
Time frame: # of years; Fiscal or calendar
Summary: Sum; Average; Weighted
Scale: What is considered "good"? What is "bad"?
Direction: Higher is better or worse?
36. EXAMPLE: USE OF FULL-TEXT JOURNALS
Characteristic: Full-text views & downloads; COUNTER JR1 (v4)
Source: Vendor
Time frame: 3 calendar years
Summary: Annual average
Scale: Minimum of 24 uses per year
Direction: Higher is better
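Once a criterion is pinned down like this, applying it is mechanical. A minimal sketch (the journal names and JR1 counts are invented) that flags titles falling below the 24-uses-per-year floor:

```python
def meets_usage_floor(annual_uses, minimum=24):
    # Summary: annual average over the time frame; scale: minimum of
    # 24 uses per year; direction: higher is better.
    return sum(annual_uses) / len(annual_uses) >= minimum

# Hypothetical three-calendar-year JR1 counts per journal.
journals = {"Journal A": [40, 35, 52], "Journal B": [10, 8, 15]}
flagged = [name for name, uses in journals.items() if not meets_usage_floor(uses)]
```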
37. EXAMPLE: COST-PER-USE OF JOURNALS
Characteristic: Full-text views or downloads; Subscription price
Source: Vendor; ILS/ERM
Time frame: 3 calendar years
Summary: Average
Scale: Maximum of $25
Direction: Lower is better
38. PROBLEMS OF SCALE: DIRECTION
Set in the same direction
Scores: higher is better (1, 2, 3)
Ranks: lower is better (1st, 2nd, 3rd)
Scales of different measures
All the same scale? Different scales?
Set on same scale, if possible
Use percentiles
39. PROBLEMS OF SCALE: REFERENCE POINTS
What is considered “good”? “bad”? “middlin’”?
Absolute or pre-selected?
Relative to each other?
Distribution of the values
Minimum, maximum, median, average and percentiles
Is it a smooth, gradual trend?
Are there big jumps? steep climb?
40. VIEWING DISTRIBUTIONS IN EXCEL
Open the BudgetData.xlsx file
Select a tab of interest (Ejournals, Packages, etc.)
Select a measure of interest
Copy the column
Open the ScalesExercises.xlsx file
Paste the copied data into the first column as values & number formatting
Distribution statistics will appear
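The same kind of distribution statistics the exercise workbook surfaces can also be computed outside Excel. A sketch using Python's standard library (the sample values are invented; the workbook's exact set of statistics may differ):

```python
from statistics import mean, median, quantiles

def describe(values):
    # Minimum, maximum, median, average, and decile cut points:
    # enough to see whether the distribution trends smoothly or jumps.
    return {
        "min": min(values),
        "max": max(values),
        "median": median(values),
        "mean": mean(values),
        "deciles": quantiles(values, n=10, method="inclusive"),
    }

stats = describe([12, 7, 31, 5, 18, 44, 9, 26, 3, 15])
```

Big gaps between adjacent deciles are the "big jumps" the previous slide asks about, and they make natural reference points for cut-off scores.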
41. DEFINE YOUR CRITERIA
Characteristic
Source
Time frame
Summary
Scale & Reference Points
Direction
45. ORGANIZING DATA
First question – Where to store?
ILS
ERMS
Usage Consolidation Service
Wiki/Sharepoint
Spreadsheets/Databases
Designate one tool/file as Master Repository of Data
Should contain final forms of data from other sources
46. GATHERING TITLES
Identify all titles to be evaluated
Determine a unique identifier for each item
OCLC #, Vendor title #, ISSN, ILS order #
Include identifying information to be used in later analysis
Subject area
Resource type (journal, package, database, reference work)
Subscription type (annual sub, standing order, maintenance fee)
47. GATHERING COST PER USE
Gathering Cost
ILS
Publisher
Subscription agent
Ulrich’s
Gathering Usage
Manual download from publisher
SUSHI
Usage consolidation service
48. THINGS TO CONSIDER WHEN GATHERING CPU
Cost considerations
Pro/super-rated prices
One-time discounts/credits
Usage considerations
Matching usage to appropriate title(s) in master list
Frequency
Platform changes
Rolling access
Lack of usage
51. DETERMINING SCOPE
How much of target collection should be reviewed?
All?
Titles paid from particular fund?
Titles related to a specific subject area?
52. PREPARING THE SURVEY
Lessons Learned
If looking for specific values: Lock it down!
Be clear
53. OUR RUBRIC
Rating 1
Ease of Use: Easy. Most students can use both simple and advanced features without assistance; students routinely and successfully find and access information.
Breadth of Audience: Interdisciplinary audience. The primary user group is very broad; the resource is used by students across several disciplines. The number of students who would be affected by cancellation would be high. It has a high impact on curriculum and teaching.
Uniqueness: Totally unique. This resource contains curriculum and teaching information not available anywhere else.
Rating 2
Ease of Use: Moderate. Students can figure out simple tasks on their own but need coaching for advanced tasks; students have some difficulty finding or accessing information without help.
Breadth of Audience: Disciplinary audience. The primary user group is limited to the discipline. It may impact curriculum and teaching.
Uniqueness: Somewhat unique. This resource contains curriculum and teaching information which can be found elsewhere, or it contains the same information but has a uniquely useful search interface or metadata.
Rating 3
Ease of Use: Difficult. Students require coaching for even the simplest tasks; only expert, experienced users are able to find and access information without assistance.
Breadth of Audience: Niche audience. The number of students who would be affected by cancellation would be limited. It has a minimal impact on curriculum and teaching; it may be used only for faculty research.
Uniqueness: Not unique. This resource contains curriculum and teaching information widely or freely available elsewhere.
62. SURVEY RESPONSES
Collating responses for each item
Median vs. Mode
Weighting responses
Are some factors more important than others?
Pay attention to scoring direction
63. PULLING IT ALL TOGETHER
Composite score
Convert to same scale: Percentile rankings
Pay attention to direction of comparison
Reverse selected rankings by subtracting percentile from 1
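The composite-score step can be sketched briefly in Python. Equal weights and the specific measures are assumptions for illustration; note that the rubric's direction is reversed (a rating of 1 is best), as is cost-per-use (lower is better), so both percentiles are subtracted from 1 before averaging:

```python
# Composite score from three percentile-ranked measures (0-1 scale).
# Usage: higher is better, keep as-is.
# Cost-per-use and rubric rating: lower is better, so reverse each.
def composite(use_pct, cpu_pct, survey_pct):
    cpu_aligned = 1 - cpu_pct
    survey_aligned = 1 - survey_pct
    return (use_pct + cpu_aligned + survey_aligned) / 3  # equal weights assumed

print(round(composite(0.90, 0.10, 0.20), 2))  # 0.87: strong keep candidate
```

If some factors matter more than others, the simple mean above would be replaced with a weighted one.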
67. CONDITIONAL FORMATTING CPU: CHANGING RULE TO PERCENTILE
Change “Lowest” to “Percentile”
Change “Highest” to “Percentile”
68. OTHER TYPES OF ANALYSIS
Packages
80/20 rule
Helps measure the efficiency of a package.
If 80% of use comes from more than 20% of titles: more efficient
If 80% of use comes from less than 20% of titles: less efficient
Databases
Overlap analysis
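The 80/20 check above can be sketched as follows. This assumes title-level usage data for the package; the helper name and figures are hypothetical:

```python
# 80/20 efficiency check: what fraction of a package's titles accounts
# for 80% of its total use? Above 0.20, use is spread widely (more
# efficient); below 0.20, use is concentrated in a few titles (less
# efficient, since most paid-for titles contribute little).
def share_of_titles_for_80pct(uses_per_title):
    total = sum(uses_per_title)
    running, count = 0, 0
    for u in sorted(uses_per_title, reverse=True):  # heaviest-used first
        running += u
        count += 1
        if running >= 0.8 * total:
            break
    return count / len(uses_per_title)

concentrated = [500, 300, 10, 10, 10, 10, 10, 10, 10, 10]
print(share_of_titles_for_80pct(concentrated))  # 0.2: borderline package
```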
71. OBJECTIVES
Develop a plan for making final decisions
Use SUMIF function to set up a scenario-planning tool
Plan scenarios to aid in making final decisions
Use a worksheet to document final decision-making
Use Excel charts to communicate scenarios to stakeholders
73. ON SETTING GOALS
How have you built trust among
stakeholders as you made difficult, and
maybe even unpopular, decisions?
74. ON SETTING GOALS
How have you balanced the
needs of squeaky wheels with
the needs of the entire patron
group?
75. DIFFERENT WAYS TO LOOK AT THE DATA
Value of seeing real-time impact of decisions
Open Baked Cake BRIP simplified.xlsx
76. WARM-UP
Apply conditional formatting to Status
column such that:
“Keep” turns the cell green
“Drop” turns the cell red
Make all 300 items say Keep for now
Change one to Drop to test your formatting
77. TASK 1: SEE PROGRESS TOWARD OVERALL GOAL
FUNCTION: SUM
=SUM(beginning cell:ending cell)
78. TASK 1: SEE PROGRESS TOWARD OVERALL GOAL
[Screenshot: the SUM formula and its result]
79. TASK 2: SEE PERCENT CUTS BY DEPARTMENT
FUNCTION: SUMIF
=SUMIF(criteria location, criteria to match, what to sum)
80. TASK 2: SEE PERCENT CUTS BY DEPARTMENT
[Screenshot: the SUMIF formula and its result]
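For readers working outside Excel, the SUMIF step has a direct Python equivalent: sum the cost of dropped titles per department so each department's share of the cut is visible. A sketch with invented data:

```python
# Python stand-in for =SUMIF(dept_range, dept, cost_range), restricted to
# titles marked "Drop": total the cut dollars by department.
def cuts_by_department(rows):
    """rows: iterable of (department, cost, status) tuples."""
    totals = {}
    for dept, cost, status in rows:
        if status == "Drop":
            totals[dept] = totals.get(dept, 0) + cost
    return totals

rows = [
    ("History", 1200, "Drop"),
    ("History", 800, "Keep"),
    ("Biology", 2500, "Drop"),
    ("Biology", 1500, "Drop"),
]
print(cuts_by_department(rows))  # {'History': 1200, 'Biology': 4000}
```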
82. TASK 3: SCENARIO PLANNING
Goal: to reach your budget cut while balancing* the cuts across departments
*Aim for 5%-40%
List out actual steps you took. Examples:
Dropped all titles below 0.10 composite score
Kept all titles essential for accreditation
Use filters to sort and manipulate the rows
89. COMMUNICATION & NEGOTIATION
Discussion Questions
What is your reaction to the influence of subjectivity during your scenario planning and communication? Do you feel bias would play a role?
How did you balance opinion vs. hard data? For example, what would you tell people who wanted to keep items with poor cost-per-use?
91. SPEAKER INFO
Karen Harker, Collection Assessment Librarian
University of North Texas Libraries
Karen.Harker@unt.edu
Todd Enoch, Serials Librarian
University of North Texas Libraries
Todd.Enoch@unt.edu
Laurel Sammonds Crawford, Coordinator of Collection Development
University of North Texas Libraries
Laurel.Crawford@unt.edu
92. PICTURE CREDITS
The following photos were used under a Creative Commons Attribution License:
Texas A&M Commerce, Graduation Summer 2014-7889
Jody McIntyre, Everest Base Camp
Romanlily, spreadsheet season
Editor's Notes
5 minutes max
I’d like to ask each group to talk about their stories amongst each other – I’m hoping to give about 15 minutes for this. Give about 10 minutes for within-table talking, then ask 1-3 tables to report.
The intersection of institutional and library needs and goals is where the impact is.
Demo this and then have the groups run their own views
Question: How would you handle resources which don’t provide usage data? Our solution: treat it as if there were zero uses.
EXERCISE: 5-10 minutes Ask them to calculate CPU. Remind to format cells for currency/accounting.