Presenter(s): Emily Thornton, Cristina Trotter, Michael Holt, Louise Lowe.
“What is being assessed in libraries today? What tools and methods are being used? What should be assessed but is not? Why?” A national survey conducted in Spring 2016 explored these pressing questions by investigating current assessment practice in libraries. In this presentation, the researchers discuss the survey results and the implications of the data.
Surveying the State of Library Assessment
1. Surveying the State of Library Assessment
MICHAEL HOLT Valdosta State University
LOUISE LOWE Mercer University
EMILY THORNTON University of North Georgia
CRISTINA HERNÁNDEZ TROTTER Oconee Regional Library System
Presentation graphics by Abbie Gettys
2. SURVEY GOALS
What were our goals with the survey?
What is being assessed in libraries today?
What tools and methods are being used?
What should be assessed, but is not? Why?
3. SURVEY BACKGROUND
DATES:
Survey open May 3rd – June 3rd, 2016
INTENDED PARTICIPANTS:
Anyone who self-identifies as being responsible for any type of assessment or evaluation in any kind of library.
DISTRIBUTION OF INVITATIONS:
Acrlassesdg (ACRL Assessment listserv), Arl-assess, GLA-L, MLA, PUBLIB, ALA Spectrum Scholars List
4. SURVEY CONTENTS
• How they are responsible for library assessment
• Job title
• Years of experience in libraries overall and in assessment
• Perception of effectiveness of assessment plan/practices
• Examples of what is being assessed at their library
  • What tools/methods?
  • How are results used?
• Examples of what is not being assessed at their library
  • What are the barriers?
10. HOW WOULD YOU RATE YOUR KNOWLEDGE OF ASSESSMENT IN LIBRARIES?
Number of respondents per rating:
• Not knowledgeable at all: 3
• Slightly knowledgeable: 30
• Moderately knowledgeable: 129
• Very knowledgeable: 95
• Extremely knowledgeable: 23
20. DOES EXPERIENCE CHANGE PERCEPTIONS OF EFFECTIVENESS?
• Most effectiveness ratings fell into "very effective" or "moderately effective," regardless of experience level with assessment.
• There is a slight upward shift in effectiveness ratings as experience with assessment increases.
• "Moderately effective" was the most popular rating across all experience levels.
22. DOES HAVING A FORMAL ASSESSMENT PLAN AFFECT PERCEPTIONS OF EFFECTIVENESS?
Having a plan seems to make for a higher effectiveness rating of assessment practices.
• 41% of those with a formal plan rated their practices as very or extremely effective.
• 23.4% of those without a plan rated their practices that highly.
• 25% of those unsure whether or not they had a plan rated their practices that highly.
23. EFFECTIVENESS BY JOB TITLE
Those with assessment explicitly listed in their job description rated their effectiveness slightly higher than their counterparts.
• 44% of those with assessment in their job title/description rated the effectiveness of their practices as extremely or very effective.
• 20% of those without assessment as an explicit component of their job rated their practices as extremely or very effective.
• The largest percentage of both groups (46% with assessment in title/description, 57% without) rated their practices as moderately effective.
24. DO THEY FEEL THEIR ASSESSMENT EFFORTS ARE EFFECTIVE?
27. DATA AREAS BEING ASSESSED
460 unique responses were coded.
Two categories of codes:
• Data Point/Focus - What is the thing being measured that is actually increasing or decreasing?
• Area of Assessment - What aspect of the library is being assessed?
34. HOW ARE THE RESULTS BEING USED?
Most common response:
• To track and improve user services and spaces
Other popular responses:
• Reporting - Internal and External
• Collection Development
• Strategic Planning
• Staff Performance
Less popular responses:
• Adherence to Policy
• Promotion and Marketing
36. WHAT ARE WE UNABLE TO ASSESS?
MOST EXPRESSED A DESIRE, BUT DIFFICULTY, IN ASSESSING:
• Impact
• Correlation
• Adequacy
• Value-added services
IN THESE THREE MAIN ASSESSMENT AREAS:
• Instruction & Information Literacy
• Collections
• General Services
37. EXAMPLES OF COMMON CHALLENGES
• How can we show direct correlations between library instruction and student retention/academic performance?
• How effective are "one-shot" library instruction sessions?
• How can we build adequate collections through faculty support and usage data?
• How can we evaluate electronic collections using various vendor-provided data?
38. EXAMPLES OF COMMON CHALLENGES
• How can libraries prove they are meeting needs?
• Are library services and programs reaching specific patron groups?
• Are library services and programs relevant for specific patron groups?
• How can libraries prove activities, collections, and services are worth the investment?
• How can libraries prove their overall value to the community/institution?
40. “OTHER” LIMITING FACTORS
FROM ACADEMIC LIBRARIANS:
• Difficulty defining measurable goals
• Lack of an organized mission statement to align goals to
• No cooperation from teaching faculty
FROM PUBLIC LIBRARIANS:
• Trouble with vendor standards
• Trouble defining what is being assessed
42. DISCUSSION
How do you address the common assessment problems in your library?
Do you have a problem not mentioned here?
Are there other obstacles to assessment?
43. QUESTIONS?
This presentation is available at:
slideshare.net/chtrotter
Presentation graphics by Abbie Gettys - agettys@ocrl.org
MICHAEL HOLT
moholt@valdosta.edu
EMILY THORNTON
Emily.Thornton@ung.edu
LOUISE LOWE
lowe_ll@mercer.edu
CRISTINA HERNÁNDEZ TROTTER
chtrotter@gmail.com
COMMENTS?