Statistics! Love Them, But What Do They Mean?
iCon 2013 Presentation Transcript
Dana Belcher, Asst Library Director
East Central University
August 2, 2013
Isn't that why we became librarians?
• Measurement of services & resources
• Student Learning Outcomes!!!!
• Use, etc.
Began working with the University
Assessment Committee in 2006
• All about student learning outcomes
• Struggle for libraries
• We don’t see the end results of our instruction
• We don’t grade papers
• ECU focused on reference &
instruction services, and high-
• Numbers made no sense
• No trend could be detected
• Focus was on big picture
• Entire collection numbers
• How does that fit into the
Moved to total Student Learning Outcomes
• SAILS – university buy in
• UNIV 1001 Freshman Seminar
• UNIV 3001 General Ed Seminar
• In-house Assessment tools
• SAILS 1.2 Developing appropriate
• SAILS 3.2 Articulating evaluation
Freshmen compared to Juniors:
• Instruction numbers
• Can now help in decision making
• Which SAILS criteria were weak – beef
up instruction in those skills
• Which SAILS criteria improved – did
instruction during those two years help?
• Provide results by discipline to academic
departments – lead-in to future
information literacy sessions
• More numbers!
• Still big picture with no correlation to
academic departments or the university
• No structure among multiple library departments
• Refocus was needed, desperately
First focus - to create a template
• Provost’s requirements
• Work plan items
• Program data & accomplishments
• Data in context
• Departmental Projects
No more big picture
• Break down into academic departments
• Group departments into colleges & schools
• PCODE2 = Classification
• PCODE3 = Major (based on
• PTYPE = Type of Patron
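The three patron codes above drive the roll-up from individual patrons to colleges and schools. A minimal Python sketch of that lookup idea follows; all of the major codes and college names in it are hypothetical, invented only to show the mapping:

```python
# Minimal sketch of rolling a patron's PCODE3 (major) up to a
# college/school. All codes and names here are hypothetical.
PCODE3_TO_COLLEGE = {
    "BIOL": "College of Health & Sciences",
    "ENGL": "College of Liberal Arts",
    "ELED": "College of Education & Psychology",
}

def college_for(pcode3):
    """Return the college/school for a major code, or a catch-all
    bucket for connects not tied to an academic unit."""
    return PCODE3_TO_COLLEGE.get(pcode3, "Non-academic / unassigned")
```

In Excel the same mapping is a lookup table; the catch-all bucket mirrors the segregation of non-academic connects described later in the counting steps.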
Data in Context (Linscheid Library)
Master file in Excel of P codes & colors:
The big picture is now in smaller, more manageable pieces.
No longer what the library has done, but
who is using it and how.
• Provides the needed connection to
the university & individual academic departments
Now easier to compare these numbers to
external numbers, e.g., enrollment.
Web Access Management (WAM)
• Tracks connections to databases from
non-institutional networked computers
• Smart phones
• Home computers
Tracked since 2007-2008
• Never used data except to report
number of connects
• Does report by PCODE3
Vendor-supplied COUNTER statistics
• Prove resources are being used
• Don’t tell you who uses them
• Take a lot of time to gather
Problem: How do connects (WAM)
intersect with COUNTER statistics?
Solution: Excel and percentages
Number of database connections by
major for AY12-13
• Majors listed with ‘total’ indicate more
than one degree available
• Connects not associated with a
college/school are segregated out
• Not included in totals/percentages
used in calculations
• Allows comparison of apples to apples
Number of connections compared to full-text use:
• Eliminated any databases not providing full-text
• Inserted two rows between databases
• First: divided each major’s total connects
by the total connects of all majors to
come up with a percentage of connects
• Second: took the percentage of connects
and multiplied by the total full-text
(COUNTER) for that database
• The total for the percentage row for all
majors = 100%
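The two inserted rows can be sketched in Python. The connect counts and the COUNTER full-text total below are invented for illustration; only the method matches the steps above:

```python
# Apportion one database's COUNTER full-text total across majors by
# each major's share of WAM connects. All numbers are invented.
connects_by_major = {"Biology": 300, "English": 500, "Nursing": 200}
counter_full_text = 12_000  # COUNTER full-text total for this database

total_connects = sum(connects_by_major.values())

# Row 1: each major's percentage of all connects (row sums to 100%).
pct_of_connects = {m: c / total_connects for m, c in connects_by_major.items()}

# Row 2: that percentage applied to the COUNTER full-text total.
est_full_text = {m: p * counter_full_text for m, p in pct_of_connects.items()}
```

Here English makes 50% of the connects, so it is credited with 6,000 of the 12,000 full-text retrievals; the estimates always sum back to the COUNTER total.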
At-a-glance, you can see which databases are being used by each academic major.
Reminder: these statistics only track connections made by non-institutional networked computers.
For AY1213, total connections = 1,115,228 with 249,849, or 22.40% from
institution networked computers.
For AY1213, total connections by major = 762,921, or 68.41%. Remaining
connections were made by non-academic department entities.
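The percentages quoted above follow directly from the slide's own totals; a quick arithmetic check:

```python
# Recompute the reported AY12-13 shares from the slide's totals.
total_connections = 1_115_228
from_institution = 249_849   # connects from institution networked computers
by_major = 762_921           # connects tied to an academic major

pct_institution = 100 * from_institution / total_connections  # ~22.40%
pct_by_major = 100 * by_major / total_connections             # ~68.41%
```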
Using the previously mentioned master Excel file of patron codes, I can quickly insert college/school codes to roll the majors up.
Connects all the pieces into one picture
• Enrollment numbers provided by
• All other numbers provided in annual reports
• AY1213 – not all library statistics were
gathered based on PCODE3
• AY1314 – steps have been
implemented to gather as much as possible by PCODE3
At-a-glance, you can see the correlation between the size of a college/school and its share of library use.
• It’s no longer that the library had 9,000+ checkouts, but that CEP had
24% of the checkouts and they are 25% of the total enrollment.
• Easier to see where there are strengths and weaknesses.
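The CEP example is a share-of-use versus share-of-enrollment comparison. The raw counts below are hypothetical, chosen only to reproduce the 24%/25% split from the slide:

```python
# Compare a unit's share of a library statistic to its share of
# enrollment. Counts are hypothetical; only the shares echo the slide.
checkouts = {"CEP": 2_160, "All others": 6_840}    # ~9,000 checkouts total
enrollment = {"CEP": 1_000, "All others": 3_000}

def share(counts, unit):
    """Fraction of the total attributable to one unit."""
    return counts[unit] / sum(counts.values())

cep_use = share(checkouts, "CEP")      # 0.24 -> 24% of checkouts
cep_size = share(enrollment, "CEP")    # 0.25 -> 25% of enrollment
```

When a unit's share of use runs well below its share of enrollment, that gap flags a weakness worth investigating; a share of use above enrollment share flags a strength.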
Master Table Exploded:
• Each item is coded to a library department.
• All numbers entered come directly from departmental annual reports.
• Highlighted areas weren’t counted by major for AY1213.
• All areas are now being counted by major for AY1314 through use of other
ILS codes or Excel functions.
Other Master Table – designed the same & includes all OTHER statistics
Libraries need to make internal statistics
correlate more to the university
• Refocused assessment to true
SLOs, providing individual results to academic departments
• Refocused annual reports to also
provide individual results to academic departments
Any files shown or spoken about are
available – just email me.