Presented by Annabelle Mortensen and Stephanie Anderson at the PLA 2016 Conference.
Content strategy and user testing are buzzwords from the online realm, but these principles can be just as useful for practitioners of old-school collection development. Tear out some pages from the digital librarian’s playbook and learn how user interviews, evaluative research, A/B testing, and other fast, inexpensive UX techniques can revolutionize your approach to collection management.
At the end of this session, participants will:
1: Understand the basic principles of content strategy and user research.
2: Be able to identify myriad ways to put these methods into practice at their library.
3: Learn how to apply specific, scalable UX techniques to collection management.
11. Creating alignment
means: interviewing stakeholders, identifying
goals, creating a common understanding
like: ensuring staff is on board with the weeding
plan, making displays based on holds lists
13. Providing assessment
means: examining existing content through
audits
like: evaluating circulation stats over time,
comparing collection to similar libraries,
reviewing booklists
14. Developing strategy
means: turning ideas into direction, deciding
how success is measured, planning for
maintenance over time
like: collection development policy changes,
creating budgets in response to use patterns
21. Observation
means: observe people in context of usage
like: watching a heavily browsed display or shelf
to see what patrons pick up and what they take
with them
22. Interviews
means: actually talk to people!
like: asking someone in the stacks how the
library can make browsing easier, or developing
a survey to learn where patrons get book
recommendations
23. A/B testing
means: creating multiple versions of a feature
and showing each to different users to see which
performs better
like: creating two shelf labels for two sets of
DVDs and asking patrons at the circ desk for
their opinion on which is easier to read
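A rough sketch of what that label test boils down to: tally which version patrons prefer and see how lopsided the split is. The label names and counts below are invented for illustration.

```python
# Tally patron preferences from a two-label shelf test.
# Label names and response counts are hypothetical.
responses = (["large-print label"] * 18) + (["icon + text label"] * 32)

votes = {}
for choice in responses:
    votes[choice] = votes.get(choice, 0) + 1

total = len(responses)
for label, count in sorted(votes.items(), key=lambda kv: -kv[1]):
    print(f"{label}: {count} of {total} ({100 * count / total:.0f}%)")
```

Even a paper tally sheet gives you the same thing; the point is counting real preferences instead of guessing.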
24. Card sorting
means: to help structure information, ask users
to put words in logical groups
like: rather than renaming sections based on
what staff calls them, asking a group of patrons
to sort book titles into groups and seeing what
happens
25. Analytics
means: stats!
like: How many patrons recommended a book to
us this month? How many book groups did we
host? How many patrons asked where the new
DVDs are?
26. UX questions you can apply
to anything:
Who is this for?
What problem does it solve for them?
How do we know they need it?
How do they access it?
How can we test this?
27. Core UX ideas anyone
can use in their work:
Strive for alignment.
Question your assumptions.
Work small.
Only research/test things that support specific
decisions.
People want what they want, not what you
want.
36. Recap!
-Strive for alignment.
-Question your assumptions.
-Work small.
-Only research/test things that support
specific decisions.
-Remember people want what they
want, not what you want.
37. Just Enough Research by Erika Hall
The Elements of Content Strategy by Erin Kissane
Lean UX: Applying Lean Principles to Improve User
Experience by Jeff Gothelf with Josh Seiden
Rocket Surgery Made Easy: The Do-It-Yourself Guide to
Finding and Fixing Usability Problems by Steve Krug
Read more …
38. Useful, Usable, Desirable: Applying User Experience
Design to Your Library by Aaron Schmidt and
Amanda Etches
UK Government Digital Service Design Principles:
https://www.gov.uk/design-principles
And more …
I realize the title should have referred to user research, not testing. Tempted to change it for the program.
I changed it! We can always say it was a little A/B testing on our part ;)
I wonder if here, rather than the opening slide, would be a better place to talk about ourselves a bit—who we are and how we came to learn about this stuff.
Annabelle: RA/reference lib who joined the library’s web redesign and social media team as content strategy coordinator b/c of journalism background. At first it was scary and foreign. Once I learned more about user experience tools and put them into practice, I realized how applicable they are to other areas of the library, although sadly most staff members don’t know about them. Brought this UX mindset to collection development. One frigid Midwinter day, began talking about this to Stephanie …
Great idea!
I’m the Assistant Director for Public Services at Darien Library, which has a UX department. One of my goals in my current position has been to make UX more accessible for everybody—as Annabelle and I discussed when we met last year, most people working on collection development (as well as public services) can learn a lot from UX, but the jargon and siloing in many libraries have made it hard for people to benefit from what is really, at its heart, just a set of tools.
Excellent question! Content strategy definitely sounds more like something you’d hear about in an article about Silicon Valley juicing start-ups and not at a public library conference.
Some libraries are beginning to rebrand collections as content. In the web world, this is the most common definition of content strategy, as defined by Halvorson, one of the pioneers in the discipline.
So really, when we talk about content strategy, what we’re talking about is knowledgeable people picking stuff, organizing it, and making it easy to find.
You know, sort of like our profession has been doing for centuries.
https://www.flickr.com/photos/susannatron/4474968603
Rather than our jobs becoming obsolete because of content strategy, they are in fact more important than ever!
This is not a comprehensive list of everything a content strategist does, but it covers some of the most useful tools.
This helps you get your bearings and identify everyone’s role in the process. It also means that your users have as good a sense of your goals as your staff.
You don’t want anybody to feel this way! That’s the role of content strategy.
Ugh, audits! Just pretend that says “circulation stats” and it will make more sense.
This includes figuring out what you’re going to measure, why that’s important (thanks to alignment and assessment phases), how you’re going to do it, and how you’ll evaluate it over time. By addressing these issues, you’ll stay focused on patrons and hopefully won’t waste your energy chasing useless objectives or data.
We’re using the terms user testing and research somewhat interchangeably here; testing is part of the umbrella of research.
Nick Disabato is a UX consultant (and library-school grad!) in Chicago
If you work for a library, I am guessing you spend some of your time being like WHAT IS WRONG WITH YOU PEOPLE and sometimes you might not even know whether you mean your co-workers or your patrons. User research combats that feeling and also means that you are working with facts, not emotions.
We all carry assumptions—why this genre circulates well but that one doesn’t, why this year’s winter reading club had more participants than last year, etc.,—but research helps you get more information to make better decisions. Then once you make a decision, keep getting information so you can continue refining and improving your work. The great thing about testing is that you usually don’t need anything fancy or expensive. You just need to know what your questions are and figure out how you’re going to test those questions to give you enough information to move forward. You also don’t need a huge pool of participants—usually a few people can give you enough information to begin moving forward. Small, simple tests are always best.
(Does this work, you think?) UX aligns with basic tenets of collections & RA: Ranganathan’s laws, Charlie Robinson’s “give ‘em what they want,” etc.
Again, not a comprehensive list—there are many other tools out there. But we’ve found this mix of both qualitative and quantitative tools most relevant to our work.
Yes, this can seem creepy. But it provides really helpful quantitative information.
It’s important to ask the right questions—open-ended, neutral questions work best. There is a lot of good information on our reading list and elsewhere online about the right way to do interviews; for surveys, Survey Monkey’s site has some great articles on crafting useful surveys. Interestingly, many UX practitioners frown upon using focus groups. One-on-one is best. If it’s in person, it’s good to have two interviewers—one asking questions and the other observing and taking notes.
Card sorts can be open (people sort and create their own categories) or closed (words are sorted into the categories they’ve been given). This can be really great for figuring out wording for signage, location codes, and catalog verbiage.
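One way to summarize a closed card sort like this: count how often each title lands in each given category, and use the agreement level to spot labels patrons find ambiguous. The titles, categories, and participant responses below are hypothetical.

```python
from collections import Counter

# Each tuple: (title, category a participant sorted it into).
# Responses from five hypothetical participants.
sorts = [
    ("Gone Girl", "Thrillers"), ("Gone Girl", "Thrillers"),
    ("Gone Girl", "Fiction"),   ("Gone Girl", "Thrillers"),
    ("Gone Girl", "Thrillers"),
    ("Wild", "Memoir"), ("Wild", "Travel"), ("Wild", "Memoir"),
    ("Wild", "Travel"), ("Wild", "Memoir"),
]

# Tally category votes per title.
by_title = {}
for title, category in sorts:
    by_title.setdefault(title, Counter())[category] += 1

for title, counts in by_title.items():
    winner, votes = counts.most_common(1)[0]
    agreement = votes / sum(counts.values())
    print(f"{title}: '{winner}' ({agreement:.0%} agreement)")
```

Low agreement (like the even split on the second title) is the useful signal: it tells you which wording needs rethinking.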
We have a lot of these: circ, turnover, door count, reference questions, RA questions, etc. The key is to figure out how they’re useful—what information do they tell us? What questions do they trigger? What questions do they answer? Analytics are also a great tool for testing assumptions.
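As one sketch of putting these stats to work, turnover (circulation divided by holdings) can flag sections that are over- or under-built. All figures and the weeding threshold below are invented examples, not recommendations.

```python
# Turnover rate = annual circulation / number of items held.
# Section names, counts, and the 1.0 threshold are hypothetical.
sections = {
    "Mystery":        {"circ": 4200, "holdings": 1400},
    "Biography":      {"circ":  900, "holdings": 1200},
    "Graphic Novels": {"circ": 2600, "holdings":  650},
}

for name, s in sections.items():
    turnover = s["circ"] / s["holdings"]
    flag = "consider weeding" if turnover < 1.0 else "healthy use"
    print(f"{name}: turnover {turnover:.2f} ({flag})")
```

The same calculation works in a spreadsheet; the point is comparing sections against each other and over time rather than reading raw circ numbers in isolation.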
I keep a version of this taped above my desk.
Some core CS/UX ideas to adopt: (Not sure if this should be at the end, or mentioned now and then recapped at the end.)
Strive for alignment.
(Not necessarily consensus, but common understanding within your organization)
Question your assumptions.
(Don’t have evidence that validates what you just said/think? Then it’s an assumption. People LOVE having assumptions.)
Work small.
(Do just enough research. Keep tests small and feedback loops short. Tests need not be fancy; in fact, it’s best if they are fast and cheap. Just make sure you write a summary of what went down.)
Only research/test things that support specific decisions.
(Figure out what it is you want to know and why, and test just that. Findings give you confidence in your decisions. And if findings are inconclusive, that’s okay too. Make a decision and keep testing.)
People want what they want, not what you want.
Popular Author Performance Summary (Fiction) – Analytics in action! Also a good example of creating alignment
First in Series – A/B testing example, observation/interviews as a secondary benefit
Hoopla circulations: Analytics + A/B testing (with patrons in mind!)
Collection Summit (creating organizational alignment in preparation for overhauling the collection development plan)
Testing signage for new CDs (quick paper prototypes)
Testing new DVD labels (discussed options, made three prototypes, showed them to patrons and shelvers--have photo)
Figuring out if we were buying enough DVDs (tracking holds through a spreadsheet)
Also testing databases with users, although this has worked less well.
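The holds-tracking spreadsheet mentioned above amounts to a holds-per-copy ratio, with extra copies ordered once the ratio passes some threshold. The titles, counts, and the 4:1 threshold here are all assumptions for illustration.

```python
# Holds-to-copy ratio: order more copies when the ratio
# exceeds a threshold. All data and the threshold are hypothetical.
HOLDS_PER_COPY_THRESHOLD = 4

queue = [
    {"title": "New Blockbuster DVD", "holds": 22, "copies": 3},
    {"title": "Quiet Documentary",   "holds":  2, "copies": 2},
]

for item in queue:
    ratio = item["holds"] / item["copies"]
    if ratio > HOLDS_PER_COPY_THRESHOLD:
        # Ceiling division: copies needed to bring the
        # ratio back under the threshold, minus copies owned.
        needed = -(-item["holds"] // HOLDS_PER_COPY_THRESHOLD) - item["copies"]
        print(f"{item['title']}: order {needed} more (ratio {ratio:.1f})")
    else:
        print(f"{item['title']}: ok (ratio {ratio:.1f})")
```

A plain spreadsheet column does the same job; encoding the threshold just makes the purchasing rule explicit and easy to revisit.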
When you begin looking at collections through this lens, you’ll start seeing a number of possible applications with displays, booklists, advisory questionnaires, summer-reading planning, patron-driven acquisition, and more. It’s really about developing a user-centered mindset and learning about tools that can help you even further with that process.
A lot of people working with collections and RA really want to try new things, but find themselves in a position or a department where they get some resistance to those new things. We are hoping we’ve given you the tools, the language, and the confidence to implement a lot of the cool new ideas you’ve picked up at the conference.
Most of these resources focus on web UX, but they do a great job of discussing these tools in depth and will strengthen your understanding of user research. And even if you don’t work on websites, you use them, and these books offer a really fascinating view of what bad websites and apps get wrong. Plus they are all short.