This document provides an overview of Emily Daly's ALA eLearning Workshop on using surveys to improve libraries. The workshop covered survey validation and piloting, basic survey structure, writing actionable questions, survey tools, and acting on survey data. Daly emphasized involving colleagues in survey design, sharing results, coding free text responses, crowdsourcing work, and following through on projects identified through survey findings. The key takeaways included involving colleagues and users, ensuring form follows function, planning data analysis, testing surveys early, triangulating methods, and acting on insights from surveys.
3. Agenda
• Questions from last week?
• Survey validation and piloting
• Basic survey structure
• Writing actionable survey questions
• Survey tools (briefly!)
• Acting on survey data
• Key takeaways
4. Time to hear from you!
What is one thing you hope to do or explore as a result of what you learned in last week’s workshop? Enter it in the chat box.
6. What is survey validation?
• The process of assessing the survey questions for their dependability
• Have two parties review the survey if possible:
1. People familiar with the topic
2. An “expert” in survey question design
7. Who are these “experts”?
Public libraries might get help from:
– A staff person at the State Library
– City/county government staff working in assessment/planning
College/university libraries might get help from:
– Assessment/UX staff in the library
– Assessment office on campus
8. Validating the survey
• Start with a simple text document
• Have validators go through the survey and make notes
– Do all questions and answer choices make sense, are they unbiased, etc.?
– Will the resulting data help you answer your questions? Is all topical content accurate?
• Make changes based on validation!
9. What is survey piloting?
• Select a small subset of your target population to take your survey
• Even one pilot tester is better than none!
• Try to get a range of different people who represent your target group
10. Pilot testing your survey
• Pilot one by one
• Pilot using your final tool (digital or paper)
• Observe, ask testers to think out loud, or ask them to write notes
• Time respondents
• Revise your survey and re-test if desired
13. Form follows function
• What are you most interested in learning?
• Who will analyze the data? How?
• Who will see the results?
• Who is your target response group?
• How much time do they have?
• How invested are your respondents?
16. Some examples
“The Duplin County Public Library is interested in learning more about what community members think of services provided by the library.”
“The purpose of this study is to identify areas undergraduate students feel should be addressed in order to maintain an effective academic library.”
18. More examples
“The library is particularly interested in the opinions of patrons who live in the western part of the county. You have been selected at random from community residents in western Greene County.”
“Your opinions are very important to us. There are no right or wrong answers. Your responses will be treated confidentially. Survey results will in no way be traceable to individual respondents.”
20. Strategies for good sequence
• Start with well-formatted, engaging questions
• Don’t lead with questions on sensitive topics
• Place the questions you care about most in the first half
• Avoid repetitive, consecutive questions that lead to reflexive responses
• Consider ending with demographics questions
25. Closed responses
Advantages
• A uniform response set facilitates comparison
• Answer choices clarify the meaning of the question and remind respondents of alternatives
• Pre-established choices make sensitive questions easier to answer
• Increased response rate and speed
Disadvantages
• Random selection, the “closest representation” issue, loss of distinction among responses
27. Open responses
Advantages
• Allow for deep explanations; the only way to get responses you wouldn’t otherwise know about
Disadvantages
• Require communication skills from respondents; more time-consuming for respondents and for analysis
33. Last call for feedback!
Any additional comments about the Natrona County Public Library?
What else would you like to tell us about your experience using this page?
Is there anything else you’d like to share?
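Open-ended feedback like the prompts above is usually analyzed by assigning each comment one or more topic codes and then tallying the codes. A minimal sketch in Python (the comments and code labels here are hypothetical, not from the workshop's data):

```python
from collections import Counter

# Hypothetical coded free-text responses: each comment was tagged with
# one or more topic codes during a manual review pass.
coded_responses = [
    ["hours"],
    ["hours", "parking"],
    ["staff"],
    ["hours"],
    ["parking"],
]

# Flatten and tally the codes so the most-mentioned topics surface first.
tally = Counter(code for codes in coded_responses for code in codes)
top_topics = tally.most_common()
```

Sorting by frequency makes it easy to spot which themes deserve follow-up projects.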
34. Closing your survey
• Demographics questions for data you can’t get elsewhere
• Call for volunteers
• Clear SUBMIT button
• Thank respondents for participating!
• Web-based: Redirect to a related page or the library homepage
36. Invitations to participate
• “Would you be willing to participate in future discussions or focus groups about the library? If so, please provide your contact information below.”
• “Would you be interested in joining a community of makers to present their work in the Maker Space?”
– Yes, I'm interested in being contacted about this opportunity
– No, I'm not interested in being contacted about this opportunity
45. Avoid jargon and colloquialisms
Think of your audience. Wording should be simple. Avoid technical words.
X "How many times last week did you use the library's Internet-enabled public access computers?"
O "How many times last week did you use the library's computers [to access the Internet]?"
47. Abbreviations and acronyms
Assume your audience does not know any of these.
X “CCPL is interested in replacing its OPAC.”
O “Cleveland County Public Library is interested in replacing its online catalog.”
X “How frequently do you use our ILL services?”
O “How frequently do you use our Interlibrary Loan services? [This service allows us to request material from another library for you if we do not have it here.]”
49. Avoid ambiguity
Look at the survey from every angle: are there ways that someone could interpret a question to have two meanings?
X "What is your income?"
O "What is your income before taxes? Include salary as well as other sources of income."
X "How many people are there in your household?"
O "Including yourself, how many people are there in your household?"
51. Confusing phrasing
The respondent should not have to spend time re-reading/interpreting the question.
X "Does it seem likely or does it seem unlikely to you that you would use a Maker Space if the library had one?"
It seems likely __ It seems unlikely __ I’m not sure __
O "If the library had a Maker Space, would you use it?"
Yes__ No__ Unsure__
53. Avoid double-barreled questions
A question that introduces two or more issues with the expectation of a single response
X "Is our staff friendly and professional?"
O Question 1: "Is our staff friendly?"
O Question 2: "Is our staff professional?"
55. Avoid non-specific questions
Do not leave questions open to a wide range of interpretations.
X “How do you feel about public transportation?”
O “How do you feel about the DATA bus system in Durham County, North Carolina?”
57. Manipulative information
Certain questions require some background. Be careful that explanatory statements do not unduly influence responses.
X "The county government spends approximately $10 per resident on landscaping public areas. Do you believe that the county government is adequately allocating funds for our library by designating only $1.15 per resident?"
O "Do you believe that the county government is adequately allocating funds for our library by designating $1.15 per resident?"
58. Manipulative information
If we want to know how knowledge of the spending difference might affect responses, we can ask the straightforward question first, then repeat it with the additional information.
O "Do you believe that the county government is adequately allocating funds for our library by designating $1.15 per resident?"
O "If you were to learn that the county government spends approximately $10 per resident on landscaping public areas, would that change your opinion about the adequacy of allocating $1.15 per resident to the library?"
60. Order of response options
Often there is a logical, inherent order. If order is irrelevant, list choices alphabetically so respondents don't assume answers at the top are more important to the interviewer, or have software randomize them.
X Group study rooms, Digital media lab, Laptop lending, E-books, Printing/copying
O Digital media lab, E-books, Group study rooms, Laptop lending, Printing/copying
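When your survey software supports scripting, per-respondent randomization can be sketched in a few lines of Python; the seeded-shuffle helper below is an illustration, not a feature of any particular survey tool, though the option labels come from the slide:

```python
import random

# Answer choices from the slide; their order carries no inherent meaning.
options = ["Group study rooms", "Digital media lab",
           "Laptop lending", "E-books", "Printing/copying"]

def options_for_respondent(seed=None):
    """Return a per-respondent randomized copy of the answer choices."""
    rng = random.Random(seed)
    shuffled = list(options)
    rng.shuffle(shuffled)
    return shuffled

# Alphabetical listing is the fallback when the tool can't randomize.
alphabetical = sorted(options)
```

Either approach keeps respondents from reading importance into whichever choice happens to sit at the top.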
62. Interval categories
Do not allow categories to overlap. Provide an unbounded final category if appropriate.
X Age 0-10, 10-15, 15-20, 20-50, 50-75
O Age 0-9, 10-19, 20-29, 30-39, 40-49, 50+
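The corrected brackets can be checked mechanically: every age maps to exactly one category, which is what the overlapping version (where a 10-year-old fits both "0-10" and "10-15") fails to guarantee. A minimal sketch using the slide's corrected brackets:

```python
def age_category(age):
    """Map an age to one non-overlapping bracket, with an unbounded final category."""
    brackets = [(0, 9), (10, 19), (20, 29), (30, 39), (40, 49)]
    for low, high in brackets:
        if low <= age <= high:
            return f"{low}-{high}"
    return "50+"  # unbounded final category
```

Under the original overlapping scheme, ages 10, 15, 20, and 50 would each have had two valid answers.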
64. Multiple response clarification
Sometimes we allow respondents to choose only one option and sometimes we let them choose multiple. Be very clear when you are allowing multiple! Otherwise results are unclear.
X For which of the following reasons do you use the library?
X__ Y__ Z__
O For which of the following reasons do you use the library? Choose all that apply.
X__ Y__ Z__
66. Appropriate response choices
– Surveys can be frustrating when the questions are fixed response without appropriate answer choices.
– Provide answer choices such as “Don’t know,” “N/A,” “Unsure,” and “Other” where appropriate.
X Does the laptop lending program meet your needs?
Yes__ No__
O Does the laptop lending program meet your needs?
Yes__ No__ N/A__ [or “I’ve never used this program”__]
69. Time to hear from you!
Which survey tools have you used? (check all
that apply)
70. Survey tools
There are many tools! I’ll mention a few – feel free to share your experiences in the chat
• Google Forms
• Qualtrics
• SurveyMonkey
• PLA’s Project Outcome
• Paper!
85. Share your results
• Short email with key points about the results
• Presentation for key staff or departments, or even the entire library
• Brief report, sent only to the staff most interested in the results
• Links to the data for staff to explore on their own
88. Outreach opportunities
“Funding for students to purchase articles that
Duke Libraries doesn't have access to would be
nice. Often I find current articles that I can use
for research but they are over $60.”
“It would be nice to be able to return Lilly
Library DVDs to Perkins & Bostock.”
89. Triangulating data
• Methods to consider when triangulating data:
– thoughtfully planned focus groups
– semi-structured interviews
– observational studies
– targeted, more focused surveys
– usage statistics or other numerical data (e.g., gate counts, circulation stats, web metrics)
101. Process from start to finish
1. Conduct survey
2. Analyze data
3. Share findings
4. Develop recommendations
5. Prioritize according to ease of implementation and potential impact
6. Assign ownership
7. Follow through
8. Report out!
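Step 5's prioritization, which the editor's notes describe as an impact matrix exercise, can be sketched as a simple scoring pass. The recommendation names and the 1-5 scales below are hypothetical illustrations, not data from the workshop:

```python
# Hypothetical recommendations, each scored 1-5 for potential impact and
# for ease of implementation (higher is better on both axes).
recommendations = [
    {"name": "Redesign website navigation", "impact": 5, "ease": 1},
    {"name": "Add signage for printers", "impact": 3, "ease": 5},
    {"name": "Extend weekend hours", "impact": 4, "ease": 2},
]

def prioritize(recs):
    """Sort recommendations so high-impact, easy-to-implement items come first."""
    return sorted(recs, key=lambda r: r["impact"] + r["ease"], reverse=True)
```

Summing the two axes is only one possible scoring rule; a real impact matrix might instead tackle all "easy wins" before anything hard regardless of impact.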
105. Time to hear from you!
When thinking about all that we’ve covered during this workshop, what is one key takeaway for you? Enter it in the chat box.
114. Thank you!
ALA eLearning will send participants a link to the
recorded workshop and slides.
Questions about this presentation?
Contact Emily Daly, emily.daly@duke.edu
Questions for ALA eLearning?
Contact elsmarketing@ala.org
Editor's Notes
Photo by Gary Simmons
Photo by Mark Seton
We were pleased to have 47 staff from across the Libraries, representing tech services, public services, IT, building services, and administration, register for a workshop to explore Tableau dashboards, discuss ways we might follow up on survey findings, and consider the potential impact and ease of actually implementing the recommendations they brainstormed, using an impact matrix like the one here.
Images from https://emba.mit.edu/images/uploads/Impact_Matrix_MIT_EMBA.jpg and https://www.flickr.com/photos/dukeunivlibraries/.
Images of Duke University available at https://www.flickr.com/photos/dukeunivlibraries/.
Image: https://commons.wikimedia.org/wiki/File:Japanese_Map_symbol_(Triangulation_point).svg
Surveys are an extremely useful starting point – again, they’re a great way to reach a lot of people quickly and fairly easily, and results are typically very easy to analyze. They are not, however, the best way to collect information about users’ actual behavior, since surveys rely on users to self report their habits and experiences. Additionally, it can be difficult to be sure that survey respondents understand exactly what you mean by a particular question or answer choice, and there is, of course, no opportunity in a survey to follow up on a particular question or response with individual respondents.
So, you might start by distributing a survey to get a sense of broader trends. If you decide after your survey that you need more information about a particular topic, or if your results are inconclusive or incomplete (which is not uncommon for surveys!), it can be very useful to follow up with additional assessment and triangulate results from multiple studies.
For instance, you might lead a group discussion or individual semi-structured interview. You might observe users in your library to get a better sense of some of the behavior they may have hinted at in the survey. There may be existing data like circulation stats, usage stats, or website hits and clicks that will help further inform your survey results. You might even decide to conduct another survey so you can get more information on a few targeted aspects of your initial survey.
Joyce and I rarely, if ever, base decisions or track trends entirely on a survey, or any single assessment measure, for that matter – again, we consider surveys to be one tool in a larger assessment effort.