3. Agenda
• The assessment lifecycle
• When to survey
• Sampling populations and methods
• Survey planning
• Survey validation and piloting
4. Time to hear from you!
1. What type of library do you work in?
2. Which statement best describes assessment
in your library?
5. Time to hear from you!
Free response: What do you MOST hope to get
out of this workshop? (type ONE thing you hope
to get out of this workshop in the chat box)
9. The assessment lifecycle:
Plan
• Determine your objectives
• Define the questions that need to be
answered
• Map questions to data
• Design a method to answer the questions
(set up a study, collect new data, extract
existing data from a system)
11. The assessment lifecycle:
Implement
• We frequently measure everything that’s
easy to measure, without a good reason
• For data collection to foster assessment,
we must first determine what it is we
really care about, then initiate data
collection that will inform meaningful
analysis and outcomes
13. The assessment lifecycle:
Analyze and report
• Analyze our data and report them to
stakeholders
• Unfortunately, it’s easy to lose
momentum at this phase…
15. The assessment lifecycle:
React and refine
• The piece of the assessment cycle most
frequently ignored is the last: making
change based on the findings of data
analysis
• It is often inaction that causes the
assessment loop to remain incomplete
17. Time to hear from you!
Free response: What is ONE actionable thing you’ve
been able to do based on data you’ve collected
through surveys or other means? OR What is ONE
actionable thing you HOPE to be able to do based
on data you collect? Type your example in the chat
box.
20. Pros and cons of a survey
Pros
• Inexpensive, quick, ability to reach large
numbers of people, can collect both
qualitative and quantitative data, no
observer subjectivity
Cons
• Subject to misinterpretation, inflexible
design, inability to follow up or probe deeper
21. Surveys are best when…
• You want data on attitudes, beliefs,
experiences, needs, demographics, perceived
behavior, etc.
• You can’t acquire the data from a machine source
• You want info from a large number of people
and do not need to follow up on questions or
probe deeply (at least not yet!)
22. …but you may need more
• It’s often necessary to follow up on what you
learn in a survey.
• We use a technique we call triangulation.
23. Triangulating data
• Methods to consider when triangulating data:
– thoughtfully planned focus groups
– semi-structured interviews
– observational studies
– targeted, more focused surveys
– usage statistics or other numerical data (e.g., gate
counts, circulation stats, web metrics)
24. Time to hear from you!
Which of the following have you conducted or
been involved in at your library?
28. Census survey
• While most surveys rely on sampling, you can
also gather information from every single
person in a target population.
• This is called a census.
• Example: your academic library provides
carrels to select graduate students and in
return, each student is required to complete a
survey. You have information from 100% of
the population.
29. Sampling
• Sampling uses a
representative group
of a given population
to determine
characteristics of the
entire population.
• If you can’t talk to
everyone, you get a
sample.
30. Random sampling
Ideally we use random sampling:
• We invite a smaller group of people (the sample)
from a larger group (the population) to answer
our survey.
• Each person is chosen randomly and each
member of the population has an equal chance
of being included in the sample.
• In this way, we hope the various groups within
the population end up proportionally
represented in the results.
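The selection step above can be sketched in a few lines of Python. This is a hypothetical example (the patron list and sample size are made up, not from the workshop): `random.sample` draws without replacement, so every member of the population has an equal chance of being invited and no one is invited twice.

```python
import random

# Hypothetical population: 500 library patrons.
population = [f"patron_{i}" for i in range(1, 501)]

# Draw a simple random sample of 50 invitees.
# random.sample selects without replacement, so each
# patron has an equal chance and none repeats.
sample = random.sample(population, k=50)

print(len(sample))       # 50 invitees
print(len(set(sample)))  # 50 -- no duplicates
```

In practice, an Institutional Research office (see slide 39) may supply this sample for you, already matched to demographic data.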
31. Convenience sampling
• A convenience sample is
composed of people
who are easy to reach
• Unlikely that a
convenience sample will
accurately represent
your target population
• Example: placing a
satisfaction survey at
your library’s reference
desk
32. Who are we missing?
• Sometimes you want
feedback from people
who are not easily
accessible
– Patrons who only use
online resources
– Community members
who do not use the
library
– People without email
• Plan accordingly!
37. Before drafting your survey
• What problems are you trying to solve?
• What questions are you trying to answer?
• What data will help you answer the questions,
and does it already exist?
38. Getting permission and buy-in
• Which staff, administrators, users, or other
stakeholders need to be involved or kept
informed?
• Do you need IRB approval (colleges and
universities)?
• Who will ultimately receive your survey
results? It’s wise to include them from the
start!
39. Academic libraries: Working with IR
• Do you have an Institutional Research office?
• Develop a good working relationship with IR
• Collaborate on surveys (when appropriate)
months ahead of time
– You may need their approval or assistance
– Often they can provide a random sample
– Can they provide demographic data with a
sample, or add demographics to returned data?
40. Academic libraries: IRB
• Institutional Review Board (IRB)
• Reviews and approves research involving
human subjects to ensure that it is conducted
in accordance with all federal, institutional,
and ethical guidelines
• IRB is concerned with protecting the welfare,
rights, and privacy of human subjects
• Established in 1974 after major human rights
abuses in research of the 20th century
41. IRB rules of thumb
1. Does it involve human subjects? – Yes
2. Is it “research”? – Quite possibly not!
– Systematic investigation designed to develop or
contribute to generalizable knowledge through
public dissemination such as published articles,
presentations, and poster displays.
• Anonymous surveys are often exempt (you
must still follow all ethical guidelines but may
not be subject to IRB oversight)
42. Do I need to go through IRB?
• Ask others at your library if there is a library-
wide policy – sometimes libraries have blanket
policies with their local IRBs for most surveys
• Contact the IRB – if you have medical facilities
there may be two, contact the non-medical
– Describe your project
– They will tell you whether you need to submit
at all, or whether you can submit an exempt proposal
44. Timing is key!
• Consider your primary audience when
thinking about timing for survey release
• Consider the circumstances of different target
populations
45. Distribution details
• What type of distribution is required?
– Rolling: Is the survey ongoing, without a closing
date?
– One time/periodic: Is the survey distributed one
time, or once a year, etc.?
– Program-dependent: Is the survey distribution
linked to a particular program?
• Beware survey fatigue!
46. Incentives
• Will you provide
an incentive?
– A raffle, or a
reward for each
participant?
– Cash? Services?
Goods?
• Anonymity may
pose a problem
for incentives
47. Recruitment strategy
• Target audience: Program participants, users
of a particular service you want to know more
about
• Random sample if you are interested in a cross
section of users
• Recruitment methods: Direct email, links on
homepage or pertinent webpages, email
blasts/listservs, bathroom fliers
48. Web-based distribution
• For open web links
– Will you collect an identifier?
– Will you prevent “ballot box stuffing”?
• For direct invites
– Leave open 1-3 weeks
– Consider invitation email carefully
– Send at least one follow-up reminder (ideally only
to those who have not yet completed the survey)
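One way to guard against “ballot box stuffing” on an open web link is to collect an identifier and keep only the first response per identifier. The sketch below is a hypothetical illustration (the field names and sample responses are invented, and most survey tools handle this for you): it walks the responses in order and drops repeat submissions.

```python
# Hypothetical responses keyed by an identifier
# (e.g., email address or one-time survey token).
responses = [
    {"id": "a@example.edu", "rating": 5},
    {"id": "b@example.edu", "rating": 3},
    {"id": "a@example.edu", "rating": 5},  # duplicate submission
]

# Keep only the first response from each identifier.
seen = set()
deduped = []
for r in responses:
    if r["id"] not in seen:
        seen.add(r["id"])
        deduped.append(r)

print(len(deduped))  # 2
```

Collecting an identifier does trade away anonymity, so weigh this against the incentive and privacy considerations discussed on slide 46.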
51. What is survey validation?
• The process of
assessing the survey
questions for their
dependability
• Have two parties review
the survey if possible:
1. People familiar with
the topic
2. Experts in survey
question design
52. Validating the survey
• Start with a simple text document
• Have validators go through the survey and
make notes
– Do all questions and answer choices make sense,
are they unbiased, etc.?
– Will the resulting data help you answer your
questions? Is all topical content accurate?
• Make changes based on validation!
53. What is survey piloting?
• Select a small subset of
your target population
to take your survey
• Even one pilot tester is
better than none!
• Try to get a range of
different people who
represent your target
group
54. Pilot testing your survey
• Enter survey into online
tool or final paper doc,
and then test with
several respondents
• Revise your survey and
re-test
• Time respondents
56. Plan for next week’s workshop
• Questions or comments from last week?
• Structure of the survey
• Writing unbiased, actionable questions
• Survey tools
• Acting on survey data
• Tips and lessons learned
57. What else for next week?
• Enter in the chat box other topics you’d like
covered in next week’s session
58. Thank you!
ALA eLearning will send participants a link to the
recorded workshop and slides.
Questions about this presentation?
Contact Emily Daly, emily.daly@duke.edu
Questions for ALA eLearning?
Contact editionscoursehelp@ala.org