
CRIG 2017 Improving digital library services with user research


Modern libraries provide a burgeoning array of digital services, all experienced through a myriad of touch-points. To name a few: catalogue; discovery layers; website; LibGuides; Learning Management Systems; chat; Skype; social media; YouTube; blogs; portals; email...
It's a complex picture! The tension between implementing innovative new services and maintaining legacy ones rarely results in seamless, unified library experiences. Using unconnected touch-points often leads to broken user experiences. A good user experience requires research.

To increase satisfaction and delight library users, adopt an approach that gathers evidence, generates insights, and informs decision-making for iterative, incremental changes. This presentation explores some tried and tested user research methods to gather both qualitative and quantitative data from students and staff throughout all stages of project life-cycles. It aims to inspire you with examples of user research initiatives undertaken at Deakin University Library, including co-design workshops for a better homepage, and preliminary results from a longitudinal happiness tracking survey for continuous improvement.

Attendees will take away a digital set of research method card templates, and tips for conducting quality user research to improve project outcomes at their libraries.




  1. WITH USER RESEARCH @vfowler Improving digital library services DEAKIN UNIVERSITY CRICOS PROVIDER CODE: 00113B #CRIG2017 - UX & You Vernon Fowler
  2. SLIDES @vfowler j.mp/crig2017ux With notes?
  3. Today 1. Warm up 2. Methods 3. Case studies 4. Take away
  4. DIGITAL PRODUCTS @leahbuley UX → encounters HTTP://ROSENFELDMEDIA.COM/BOOKS/THE-USER-EXPERIENCE-TEAM-OF-ONE/
  5. Digital library products / services: PHOTO BY JASON ROSEWELL ON UNSPLASH
  6. WHAT USERS WANT AND NEED Researching HTTP://ROSENFELDMEDIA.COM/BOOKS/THE-USER-EXPERIENCE-TEAM-OF-ONE/
  7. HTTP://WWW.INSYNCSURVEYS.COM.AU/NEWS/2007/04/CAUL-RESULTS/
  8. Search box change https://youtu.be/59N0Ugqjiso
  9. A HIERARCHY OF EFFORT @igniting Fixing a broken user experience Stefan Klocek HTTPS://WWW.SMASHINGMAGAZINE.COM/2012/09/FIXING-BROKEN-USER-EXPERIENCE/
  10. Success BRINGING UX INTO LIBRARIES @CraigMMacDonald HTTP://DX.DOI.ORG/10.1002/PRA2.2015.145052010055
  11. Maturity & methods HTTP://DX.DOI.ORG/10.1080/01930826.2016.1232942 BUILDING UX CAPACITY IN LIBRARIES Unrecognised Recognised Considered Implemented Integrated Institutionalised
  12. Methods CARDS
  13. When to explore BEFORE HTTPS://WWW.USERTESTING.COM/BLOG/2016/04/22/CHOOSE-RESEARCH-METHOD/
  14. Evaluate & validate DURING HTTPS://WWW.USERTESTING.COM/BLOG/2016/04/22/CHOOSE-RESEARCH-METHOD/
  15. Time to measure LIVE HTTPS://WWW.USERTESTING.COM/BLOG/2016/04/22/CHOOSE-RESEARCH-METHOD/
  16. Business values
  17. Case studies PHOTO BY CLAY BANKS ON UNSPLASH
  18. Co-design workshops EXPLORE AND VALIDATE A BETTER LIBRARY HOMEPAGE
  19. Co-design benefits IMMEDIATE Better ideas, with high originality & user value HTTP://WWW.DESIGNFOREUROPE.EU/WHAT-CO-DESIGN
  20. Co-design benefits IMMEDIATE Better ideas, with high originality & user value LONGER-TERM Better relationships HTTP://WWW.DESIGNFOREUROPE.EU/WHAT-CO-DESIGN
  21. Co-design is… FUN HTTPS://WWW.NNGROUP.COM/ARTICLES/WHICH-UX-RESEARCH-METHODS/
  22. Workshops TRY THIS HTTP://UXMAG.COM/ARTICLES/CREATIVITY-BASED-RESEARCH-THE-PROCESS-OF-CO-DESIGNING-WITH-USERS
  23. 1. Gather & create 2. Prioritise 3. Present
  24. Voice of the user Empathy building PHOTO BY MATT BOTSFORD ON UNSPLASH
  25. Collages
  26. Analysis
  27. Insights → wireframes
  28. Beta test
  29. Co-design activities Invisible benefit
  30. Search Happiness CONTINUOUS IMPROVEMENT HTTPS://RESEARCH.GOOGLE.COM/PUBS/PUB43221.HTML
  31. Surveys TRACK ISSUES & DISCOVER OPPORTUNITIES HTTPS://WWW.NNGROUP.COM/ARTICLES/WHICH-UX-RESEARCH-METHODS/
  32. First 24 hours Frequency and sentiment
  33. Frustrations Log in issues Session time out Interface woes Book reviews dominate
  34. Best “That you can go straight to a document (as PDF) from the first search window. And that you can easily get the citation (harvard, for example)” “Easy to use, great filters for searching for specific resources. Lots of resources available.”
  35. Improving library search ‘exclude reviews’ – designed to resolve book reviews cluttering up results ‘session keeper’ – aims to resolve annoying timeouts
  36. Tools & templates PHOTO BY TODD QUACKENBUSH ON UNSPLASH
  37. 1 tool, 1 template Airtable 1. http://j.mp/airtableme 2. http://j.mp/uxbase2017
  38. References Alvarez, H. (2016, April 22). Consider these variables before you choose a UX research method. Retrieved 2 June 2017, from https://www.usertesting.com/blog/2016/04/22/choose-research-method/ Buley, L. (2013). The User Experience Team of One: A Research and Design Survival Guide. Rosenfeld Media. Chisholm, J. (n.d.). What is co-design? Retrieved from http://www.designforeurope.eu/what-co-design Hall, E. (2013, September 10). Interviewing Humans. Retrieved from http://alistapart.com/article/interviewing-humans Klocek, S. (2012, September 27). Fixing A Broken User Experience. Retrieved from https://www.smashingmagazine.com/2012/09/fixing-broken-user-experience/
  39. References MacDonald, C. M. (2015). User experience librarians: user advocates, user researchers, usability evaluators, or all of the above? Proceedings of the Association for Information Science and Technology, 52(1), 1–10. https://doi.org/10.1002/pra2.2015.145052010055 MacDonald, C. M. (2017). “It Takes a Village”: On UX Librarianship and Building UX Capacity in Libraries. Journal of Library Administration, 57(2), 194–214. https://doi.org/10.1080/01930826.2016.1232942 Müller, H., & Sedley, A. (2014). HaTS: Large-scale In-product Measurement of User Attitudes & Experiences with Happiness Tracking Surveys. Retrieved from https://research.google.com/pubs/pub43221.html Naranjo-Bock, C. (2012, April 24). Creativity-based Research: The Process of Co-Designing with Users. Retrieved 18 August 2017, from http://uxmag.com/articles/creativity-based-research-the-process-of-co-designing-with-users
  40. References NNgroup. (2017). Small vs. Big User Studies — What’s Best? (Jakob Nielsen). Retrieved from https://youtu.be/ZTsT8r4MWLk PRESS RELEASE: Insync Surveys launches 2006 library results: CAUL hits an all time high. (2007, April 5). Retrieved 18 August 2017, from http://www.insyncsurveys.com.au/news/2007/04/caul-results/ Rohrer, C. (2014, October 12). When to Use Which User-Experience Research Methods. Retrieved 2 June 2017, from https://www.nngroup.com/articles/which-ux-research-methods/ Sauro, J. (2013, March 12). What UX Methods to Use and When to Use Them. Retrieved from https://measuringu.com/method-when/
  41. Thank you TRY UX AT YOUR LIBRARY @vfowler
  42. LibUX community ON SLACK
  43. http://libux.co/slack
  44. https://twitter.com/UXLibs/status/872057752859947009
  45. https://twitter.com/asharman/status/872042570880253952
  46. https://twitter.com/asharman/status/872042570880253952 https://twitter.com/UXLibs/status/872057752859947009
  47. Small sample, iterate rapidly: more value https://youtu.be/ZTsT8r4MWLk
  48. User research BENEFITS HTTPS://ALISTAPART.COM/ARTICLE/INTERVIEWING-HUMANS
  49. Biggest challenge as a Lib UX Professional Knowing what the business goals are, and how we measure success.
  50. https://boagworld.com/digital-strategy/10-mistakes-in-project-management/

Editor's Notes

  • Hi and welcome to Improving digital library services with user research.
  • My slide deck is available through this URL. If you’d prefer a copy with notes, message me.
  • I’ll start with a warm up on UX in digital libraries.
    I’ll give an overview of some user research methods.
    Then I’ll highlight 2 case studies of projects exemplifying impacts that UX is having on digital library services at Deakin.
    At the end, I’ll give you some templates so you can start including more user research in projects at your library.
    Let’s start!
  • I’m sure those doing UX in libraries can relate to The User Experience Team of One in which Leah Buley says “user experience refers to encounters that people have with digital products, like software or a web app.”
    Think about digital products and services in your library …
  • Now this is the YOU part of this session. Everyone: shout out all the digital library products / services you can think of! … - catalogue; discovery layers; website; LibGuides; Reading Lists; Learning Management Systems; chat; Skype; social media; YouTube; blogs; portals; email, self checkout kiosks, Wi-Fi, BrowZine …
    Thank you.
  • Leah Buley explains that to do UX means:

    “to practice … methods and techniques for researching what users want and need, [and to design products and services for them.]”
  • Our regular Library Client Survey is one such technique for researching users’ wants and needs. So we’ve all been doing UX for years!

    Back in 2006 & 2005, the Insync survey said “the virtual library category is a sore point for some”. [A major aspect of this category is the library search box, which many libraries have iterated on several times in the last 10 years.]
  • In 2016 the Deakin Library search box was identified as an improvement opportunity.

    How will we know whether search experiences are more satisfying after a change? And where experiences are broken, how might we better meet expectations?
  • Stefan Klocek wrote Fixing a broken user experience – much like Bloom’s Taxonomy, his hierarchy of effort requires work to begin at the bottom. Given both their history and the many services you called out earlier, libraries must start from principle-based improvements to deliver learnable and desirable user experiences.
  • Points worth noting from Professor Craig MacDonald’s 2015 paper: First, that “User Experience (UX) is gaining momentum as a critical success factor across all industries and sectors, including libraries”. Second, within libraries, there is “a general preference for qualitative research methods”. And 3rd, that “some translation is required to bring professional UX knowledge into a library setting”.
  • In his 2017 article, MacDonald categorised libraries in terms of perceived level of UX maturity. In these models, libraries journey towards an institutionalised user-centred culture from an unrecognised “zero” state. 1 factor that distinguishes between stages of maturity is:
    “The extent that UX methods and techniques are employed across the organisation.” (p207)
  • In a poster presentation with Deakin Library colleagues last December, I shared 6 tried and tested user research methods on cards. At the end of my presentation today, I’ll give you 12 method cards to kick-start user research at your library.

    Now let’s look at when each method fits into 3 broad stages of project life-cycles.
  • Before development, the goal is exploration; co-design and user interviews are methods suited to this early stage.
  • During development the goals are evaluation & validation. Here we use methods such as Guerrilla (spontaneous) interviewing, and copious amounts of Prototyping.
  • At the live design or live service stage, the goal is measurement. Methods such as usability testing and intercept surveys fit here. Part of choosing research methods is determined by which stage of the project life-cycle you’re working in.
  • Another part is the business values associated with each method. For example, co-design encourages experimentation and iteration, while intercept surveys help optimise the user experience. Both these methods support discovering growth opportunities.
    Now I’d like to share with you 2 case studies that used these methods.
  • 1st is an exploration study in the form of co-design workshops, with insights guiding our library homepage re-design. After that, I’ll talk about our happiness tracking study – a continuous measurement of satisfaction with our library discovery layer.
  • Co-design workshops formed the initial research for a homepage redesign project (underway). I’ll describe some benefits of co-design, what the method entails, our workshop procedure where participants created collages for a better library homepage, the resulting artifacts for analysis, and evidence of user needs.
  • Designing digital library services needs an understanding of both the demand side (users’ needs) and the supply side (technologies and processes). So bringing student, academic and staff perspectives together in co-design workshops provides immediate benefits such as generating better ideas with a high degree of originality and user value.
  • Co-design also offers longer-term benefits like better relationships between the library and our users.
  • Co-design is fun! It’s where “participants are given design elements or creative materials … to construct their ideal experience in a concrete way, that expresses what matters to them most, and why.”

    It’s a hybrid method that “allows users to interact with and rearrange design elements that could be part of a product experience, in order to discuss how their proposed solutions would better meet their needs and why they made certain choices.”
  • Huge thanks to Rachael Wilson who was instrumental in the success of the workshops. Our procedure went like this: - 1st, our team conducted environmental scanning to create clippings of home page elements. Add blank paper, scissors, Blu tack, tape, and sticky notes. Recruit a diverse range of participants. Then in each workshop, participant groups create collages to explore ideas for an awesome homepage. They work through 3 phases:
  • They gather elements from the stock clippings & create any elements to be included.
    To understand their priorities, they order / number elements from first to last.
    Then they present their design, so we understand what they want & why it’s important for them.
  • Before participants present, ask for their consent, then record the audio. You won’t need to transcribe, but it’s crucial for empathy building to share the voice of the user with stakeholders who couldn’t attend the workshop.
  • A nice aspect of co-design workshops is that tangible artifacts are produced by participants. We ended up with 18 collages from workshops at all 4 Deakin campuses. Then our job is to analyse. You can do this solo, but where possible, build a shared understanding by analysing cooperatively / collaboratively.
  • The collages were affinity analysed, clustering elements of the same type and labelling them with the priority participants gave them.

    Summarising the data, there were popular elements such as chat and today’s hours; there were elements staff included and students didn’t, and vice versa.
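
    To make that summarising step concrete, a tally along the lines of the TypeScript sketch below surfaces the most frequently included elements and their average priority. It is an illustration only, not our actual analysis; the element names and numbers in it are made up.

      // Hypothetical sketch only: count how often each element appears across the
      // collages and the average priority participants gave it (1 = placed first).
      interface CollageElement {
        element: string;  // e.g. "chat", "today's hours"
        priority: number; // order a group gave it; 1 = most important
      }

      function summariseCollages(elements: CollageElement[]) {
        const tally = new Map<string, { count: number; prioritySum: number }>();
        for (const e of elements) {
          const entry = tally.get(e.element) ?? { count: 0, prioritySum: 0 };
          entry.count += 1;
          entry.prioritySum += e.priority;
          tally.set(e.element, entry);
        }
        // Most frequently included elements first, each with its average priority.
        return [...tally.entries()]
          .sort((a, b) => b[1].count - a[1].count)
          .map(([element, { count, prioritySum }]) => ({
            element,
            count,
            avgPriority: prioritySum / count,
          }));
      }

      // Made-up example data from two collages:
      console.log(summariseCollages([
        { element: "chat", priority: 1 },
        { element: "today's hours", priority: 2 },
        { element: "chat", priority: 2 },
      ]));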
  • Insights from our co-design workshops informed wireframe construction for the desktop and mobile experiences. The insights also validated adjustments to the priority of elements in subsequent prototypes.
  • We’ve built and pilot tested this prototype. When the beta homepage is ready, workshop participants will be first to test.

    [Currently scheduled for late September]
  • An invisible benefit of co-design activities is improved relationships with library constituents. For us, an example of this being realised was in a related project streamlining our homepage search box.

    Including diverse user groups from an early stage gives our team confidence that we’re building the best design to fit user needs.

    Now I’d like to turn to a longitudinal search happiness study.
  • Through a collaborative effort between EBSCO engineer Alvet Miranda, Hero MacDonald, Kathy Martin, and myself, we implemented a Happiness Tracking Survey for Deakin Library Search.

    We’re measuring satisfaction with Library Search and search related tasks. Our approach is based on HaTS: Large-scale In-Product Measurement of User Attitudes & Experiences with Happiness Tracking Surveys, a method developed by Google and deployed in their products to measure progress towards goals and inform decisions.

  • “Surveys measure and categorize attitudes or collect self-reported data that can help track or discover important issues to address.” We use an intercept survey method that’s triggered during use of library search.
  • A survey invitation appears on the results display. If a user dismisses the invitation, or takes the survey, they won’t see another invitation for 12 weeks.
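
    As a rough browser-side sketch of how that quiet period can be enforced (an illustration only, not our actual implementation), logic along these lines records when a user last saw the invitation; the storage key and the showInvitation() callback are hypothetical names.

      // Minimal sketch of a 12-week cooldown on the survey invitation, assuming a
      // browser context with localStorage available.
      const COOLDOWN_MS = 12 * 7 * 24 * 60 * 60 * 1000; // 12 weeks in milliseconds
      const LAST_SEEN_KEY = "searchHatsLastSeen";       // hypothetical key name

      function maybeInvite(showInvitation: () => void): void {
        const lastSeen = Number(localStorage.getItem(LAST_SEEN_KEY) ?? 0);
        if (Date.now() - lastSeen < COOLDOWN_MS) return; // still within the quiet period
        showInvitation();
        // Record the timestamp whether the user takes the survey or dismisses it,
        // so no further invitation appears for another 12 weeks either way.
        localStorage.setItem(LAST_SEEN_KEY, String(Date.now()));
      }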
  • The first question is the only mandatory one. It’s a 7-point Likert scale question with labels at the extremes and mid-point. An aggregate of responses tells us how satisfied or dissatisfied users are with Library Search.

    So how are we tracking so far? Before we see results, would you care to speculate how your library might rate for search satisfaction?
  • In the first 3 weeks up to the 23rd, we’re hovering between 3 and 5, fairly middle of the road. The middle-ground requires more in-depth analysis before we can direct efforts to improving. [Skip to next slide.]

    If satisfaction ratings were consistently high, maintain status quo. If they stay low, take action and consider alternatives.
  • Following the overall rating, are open-ended questions about frustrating or unappealing aspects as well as new feature suggestions. Analysing responses to these questions can help direct improvement efforts.
    This question may appear double-barrelled; however, through several experiments Google found that asking about frustrations and new capabilities in the same question increased response quantity and quality.

    Let’s look at common frustrations.
  • Analysis of frustrations has already revealed a handful of themes:
    - Log in issues, session time out, interface woes, and book reviews dominating results.

    Efforts are well underway to resolve our #1 issue. Kathy Martin is spearheading a massive project to unite Single Sign On across our platforms.
  • Next, our survey asks what people like best. Responses to this serve 3 purposes:
    Prevent us creating an unbalanced view of Library Search.
    Learn what aspects people like about Library Search to discover missed marketing opportunities.
    Help us be mindful when planning changes that may have knock-on effects.
  • A couple of quotes from what people like best:
  • Then there are 6 tasks we ask respondents to select if they’ve attempted them in the last month. Each selection then prompts them to rate their satisfaction on a scale.
  • From this chart, you can see what tasks are being attempted more, and prioritise accordingly. No analysis has yet been performed on the task satisfaction data, but baselines will be available when we’re ready to tackle improving specific task flows.
  • Summing up, it’s early days for our Library Search Happiness Tracking.
    I expect that once the Single Sign On project work is complete, we’ll see a drop in frustrations associated with log in issues. Developments such as exclude reviews and a session keeper should address other major issues identified. We may even see a rise in overall satisfaction. What’s important is that we now have the means to quantify and qualify the improvements we implement on a major library service.

    Now for some UX tools and templates for you.
  • UX has many research methods and a myriad of tools. My gift to you is a template of all my research methods and sample projects, wrapped in a free database tool.
  • This is what you’ll see when you copy a redacted version of my UX methods & projects database. Looking at the methods and steps, you’ll find you can easily use and adapt these to work for you. If you’d like to conduct user research at your library, this might be the head start you need.

    [The template also contains some projects, results, and so on.]
  • Airtable is my go-to tool for user research (Happiness Tracking, co-design workshops, and more). The first link gets you a free account, and the 2nd gives you a copy of my UX methods & projects database.
    Equip yourself today so you can start including more user research in projects at your library.
  • I hope I've inspired you to be catalysts for more user research, to generate insights and inform decision-making for iterative, incremental changes at your library. Try UX at home.

    Thank you.
    And we have a few minutes for questions, so over to you.
  • People doing UX in libraries are widely dispersed. I’ve found it highly beneficial to participate in a global community of practice for UX in libraries. The LibUX community uses Slack to ask questions, share knowledge, give tips, and exchange ideas.
  • So join the virtual community of practice via http://libux.co/slack. Like today, Slack is free, and the LibUX community are welcoming and helpful.
  • My post from July was a welcome to Josh Boyer of NCSU Libraries UX Department, and a mention of their decades of user studies available for perusal.
  • These 180 people are library staff from 19 countries who attended the 3rd UXLibs conference in Glasgow this June. Like you, they are from university libraries. However, only one is from Australia. If you can’t spot the Aussie in the photo, then … just look at me! [fist pump]
  • The big message from this year’s conference is “the value of getting all staff involved in research” so that initiatives get ample “buy-in and staff become more customer focussed.”

    As a member of a digital library team, it’s easy to become caught up in technological advances and implications, and lose some of that customer focus. Getting buy-in is a constant challenge, particularly for an emerging field that is largely unknown to libraries.
  • June next year is set for the 4th UXLibs conference – it’ll be in the UK, some 17,000 km from Melbourne. That may be difficult to attend, so chat with CAVAL staff if you’re looking for another local event.
  • In qualitative studies, small samples with more frequent iterations are more valuable than large samples with fewer iterations.
  • The ethos of Erika Hall’s book is to do no more than just enough research to garner insights that can inform decision-making. I thoroughly recommend the whole book, but if you’re short on time, try the excerpt of Chapter 5, which is dedicated to User Research.
  • Look into Google’s HEART framework.
  • Clustered by stage of project life-cycle, these are the research methods I’ve used in projects so far. The observation you can make here is that we’re not doing much research in the before development phase where the goal is exploration. In an episode of Digital Insights titled 10 mistakes that are ruining the success of your digital projects, Paul Boag said “many teams seem to skip over a discovery phase because of the pressure to deliver”.
