Usability for Tough Times

Like many libraries, the University of Michigan Library long had no staff dedicated to website usability. To address the gap, a Usability Group was formed. The group's structure and methodologies have evolved over the last four years, producing an efficient organization with innovative and highly effective techniques. Twenty-eight staff members have contributed to the group, six different systems have been evaluated, and the work has resulted in over 30 reports and hundreds of recommendations.

Although resources are evaluated using a wide range of traditional techniques (formal testing, focus groups, surveys, heuristic evaluations, prototype testing, etc.), the group strongly believes that usability doesn't have to be complicated and time-consuming, favoring straightforward "budget" techniques that often yield the most interesting and useful results. The group also takes an iterative approach to testing, repeating and refining tests to evaluate the effectiveness of changes and to fine-tune techniques.

This presentation will describe the Usability Group's techniques and findings from our most recent projects to evaluate the library's recently launched website. Specifically, we will describe methodologies and present testing materials and results from "guerrilla" testing, group card sorting, and participatory design sessions with undergraduates, graduate students, faculty, and staff. Participants will be able to apply these methods in their own libraries.

Speaker notes
  • Traditionally, most usability work has been done through this committee, but the recently created UX department will now also focus on this type of research. Nice to mention that UTF members volunteer and rarely have prior experience, so budget techniques are easy to learn and easy to take back to their departments for other purposes! I'm a member of the core team, and also one of the biggest 'customers'.
  • This is OUR definition and it’s pretty loose.
  • Ken – maybe you want the slide to just say "Faster" and fill in the bits in parentheses? Faster (less time investment for prep). Easier (less time designing evaluations; cut out time-consuming things like recruiting participants). Cheaper (no need for fancy software or facilities). More targeted (have a question, answer it directly). More staff (with less expertise) can take it on (and maybe get just as reliable results). Doing a couple of budget tests is better than doing nothing: the ramp-up to formal testing can be prohibitive to getting it done at all.
  • Like voting in Chicago
  • [This seems better at the end, but I like the idea of the last slide being the online guerrilla test.] A good example would be to describe our first attempt at guerrilla testing: we were looking to relabel the link to our various delivery services. We asked 9 people what they'd call it and got 9 different answers. It was still interesting and useful, but we had to redesign and redo the test to get a solid answer.
  • Turning to our site, its goals, and our evaluation of them. We launched a new website in August 2009, based around two main features: Search and Browse. Through user studies (both guerrilla and longer-term) we generated data that helped define and confirm the scope of our problem: the catalog, federated search, research guides (x 3), 300+ database providers, thousands of online journals, dozens of libraries, and a one-click survey. Broad themes emerged: "I can't find anything" (our Dean); a vehicle for a single identity, the MLibrary brand; provide a unified user experience; the Platonic ideal of eliminating silos (information, not location). The site we have now is a response to these needs. How'd we do, we wondered? Project priorities: gain a better understanding of users' perception and use of the "new" library website (it's now 1 year old!), and pinpoint & evaluate problem areas. Completed 4 evaluations using 3 different methods in 8 months.
  • Suz intro: ask how many people have done "formal" vs. "informal" testing.
  • Search language
  • Talk about how we analyzed the data (by user group, separating out areas) to identify trends.
  • Like most larger libraries, we have a bewildering array of service points, hard to organize in a way that's meaningful to the public. We needed to break out of our librarian-centric way of thinking about the org chart. Hence: card sorting.
  • Group paper card sort with students: 18 participants (undergrads and grad students, divided into 4 groups) organized 84 cards representing half of this content. This allowed us to see interaction among students, hear their thought processes, and better understand confusing labels. Individual online card sort with staff: we purchased a license to OptimalSort, allowing us to place the sort in front of many individuals; 140 staff completed the exercise. This provided more data, but didn't expose the thought process.
  • Exploring the results can be tricky. The Task Force also came up with "unified" categories, based on the categories the participants created as well as the comments they made during the card sort. Several similarities between categories surfaced across the various participant groups, whether performing a paper sort or using the online tool. Both the similar groupings across participant groups and the Task Force's "unified" categories were suggested as bases for further tests. Implementing the changes would be a large-scale effort that would add significant complexities for users and staff.
  • Goal: fine-tune the contents & labels for Quick Links. Give a brief history: we used to have Quick Links, killed them in the new site, heard from lots of faculty that eliminating them was a poor idea, and brought them back. The test: 20 participants (undergrads and grad students) were shown the current Quick Links section without its title and asked to name the section and describe where each link went, then asked what links they would most like to see in a grouping like this one. Findings: "Outages" was not understood or considered useful; more than half of users requested the addition of a Webmail link; the "Quick Links" label works well. Changes: removed/added links, rearranged links, retitled 'Ejournals' to 'Online Journals' (throughout the site).
  • URL of survey is at http://umichlib.qualtrics.com/SE/?SID=SV_3rZvKvGPvIkS1ms Still need to set up a TinyURL. And add URL to this slide.
Transcript of "Usability for Tough Times"

    1. Usability for Tough Times: Budget Usability. Suzanne Chapman, User Experience Department; Ken Varnum, Web Systems Department. (Image by flickr user alancleaver_2000)
    2. Who Are We?
    3. What is Budget Usability?
       • aka "discount," "informal," or "do it yourself"
       • It's not just about the money… it's also about the time and effort.
       • It's qualitative, informal, and unscientific.
       • Our definition: anything you can do with low overhead that involves users interacting with a site.
       "The purpose isn't to prove anything; it's to get insights that enable you to improve what you're building." – Steve Krug (Image by flickr user sarabc)
    4. Why Use Budget Methods?
       • Quick answers to simple questions
       • Faster
       • Easier
       • Cheaper
       • Targeted
       • More staff participation
       (Image by flickr user electrofantastic)
    5. When to Use Budget Techniques? When you just want a quick reaction to something:
       • Link label
       • Placement of something
       • Findability of some piece of content
       • Attitude towards a design
       • Do users get it?
       (Image by flickr user s2photo)
    6. How to Use Budget Techniques?
       • Early and often
       • Alongside usage statistics & user feedback
       • In conjunction with (or in preparation for) larger evaluations
       • With a grain of salt
       (Image by flickr user sergesegal)
    7. Participants
       • Anywhere from 6 to 100+
       • Where & how to find participants: "in the wild" & on the fly; links from the website; emails sent to departments via Subject Specialist Librarians
       • Incentives: candy, MLibrary gadgets, or a few "blue bucks" each
       (Image by flickr user faultypixel)
    8. Lessons Learned & Tips
       • Test the test. Time spent piloting the test is time well spent.
       • Articulate your expectations, but be flexible. Just want general feedback? Ask an open question. Want to solve a specific problem? Ask a direct question.
       • Iterate. Know when to admit that something didn't work well; refine and repeat.
    9. Participatory Design: actively involve users in the design process (inspired by Nancy Foster). Description: X/O & Ideal Design.
    10. Participatory Design. X/O instructions:
       • Circle the things you find useful.
       • Put an X through the things you don't find useful.
       • Add a note for anything that's missing.
    11. Participatory Design. Ideal Design instructions: draw your ideal library website.
    12. Participatory Design. Materials cost: $0. Incentives cost: $75+. Setup time: ~1 hr. Test time: ~4 hrs. Analysis: >12 hrs. 36 participants: 15 undergrads, 5 grad students, 2 faculty, 15 library staff.
    13. Participatory Design
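Most of the >12 hours of X/O analysis is tallying marks per page element across the annotated printouts. A minimal sketch of that tallying step in Python, with invented element names and made-up responses (the real sessions used printed pages of the live site, and participants would be split by user group):

```python
from collections import Counter

# Hypothetical X/O results: for each participant, which page elements
# they circled (useful) and which they crossed out (not useful).
# Element names below are invented for illustration only.
responses = [
    {"circled": {"search box", "hours"}, "crossed": {"news feed"}},
    {"circled": {"search box"}, "crossed": {"news feed", "slideshow"}},
    {"circled": {"hours", "search box"}, "crossed": {"slideshow"}},
]

useful = Counter()
not_useful = Counter()
for r in responses:
    useful.update(r["circled"])
    not_useful.update(r["crossed"])

# Net score (circles minus Xs) per element surfaces consensus:
# strongly positive = keep/promote, strongly negative = cut candidate.
elements = set(useful) | set(not_useful)
net = {e: useful[e] - not_useful[e] for e in elements}
ranking = sorted(net.items(), key=lambda kv: kv[1], reverse=True)
```

Splitting the same tallies by user group (undergrad, grad, faculty, staff) is a cheap way to spot where the groups' needs diverge before reading the free-text "missing" notes.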
    14. Card Sorting: ask users to sort a series of cards, each labeled with a piece of content, into groups that make sense to them. Materials cost: $0 ($125 for online tool). Incentives cost: $90. Setup time: ~3 hrs. Test time: ~2 hrs. Analysis: >10 hrs. 158 participants: 18 undergrads & grads, 140 library staff. Description: a combination of sessions with individual participants and with groups.
    15. Card Sorting: group paper card sort (Services/Departments/Libraries)
    16. Card Sorting: OptimalSort online card sort (Services/Departments/Libraries)
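One common way to compare results across paper sorts and an online tool's export is a co-occurrence count: how often each pair of cards landed in the same group. Pairs with high agreement suggest stable categories; pairs that split evenly flag labels worth retesting. A minimal sketch, with invented card labels and made-up sorts (not the actual 84-card data):

```python
from collections import Counter
from itertools import combinations

# Hypothetical sorts: each participant's grouping of cards, as sets.
sorts = [
    [{"Interlibrary Loan", "Course Reserves"}, {"Map Library", "Music Library"}],
    [{"Interlibrary Loan", "Course Reserves", "Map Library"}, {"Music Library"}],
    [{"Interlibrary Loan", "Course Reserves"}, {"Map Library", "Music Library"}],
]

# Count how often each (alphabetized) pair of cards shares a group.
pairs = Counter()
for participant in sorts:
    for group in participant:
        for a, b in combinations(sorted(group), 2):
            pairs[(a, b)] += 1

# Agreement = fraction of participants who grouped the pair together.
agreement = {p: n / len(sorts) for p, n in pairs.items()}
```

The same matrix works whether the input comes from photographed paper sorts or a CSV export, which makes the student and staff datasets directly comparable.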
    17. Guerrilla Testing: quick and short answers to quick and short questions. Five minutes is our goal! Materials cost: $0. Incentives cost: $0. Setup time: ~2 hrs. Test time: ~2 hrs. Analysis: ~4 hrs. Participants: 20 undergrads/grads. Description:
       • Print out a web page.
       • Approach someone "in the wild" & ask if they can spare 5 minutes.
       • Ask 1–2 short questions.
    18. Guerrilla Testing
       • Contents: removed/added links.
       • Labels: "Quick Links" is good; some link labels revised.
       • Location: not good! Needs to be more prominent.
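The speaker notes describe a first guerrilla attempt where nine people gave nine different label suggestions. Tallying the free-text answers makes that kind of flat distribution visible at a glance and signals when a test needs to be redesigned and rerun. A small sketch with hypothetical answers (the labels below are invented, not the actual responses):

```python
from collections import Counter

# Hypothetical answers to "What would you call this link?" from nine
# intercepted participants; every label here is made up for illustration.
answers = [
    "get it for me", "document delivery", "request item",
    "interlibrary loan", "ILL", "borrow", "delivery services",
    "request", "get this",
]

# Normalize lightly, then tally. A near-flat distribution (low consensus)
# means the question or candidate labels need narrowing before retesting.
tally = Counter(a.strip().lower() for a in answers)
top, count = tally.most_common(1)[0]
consensus = count / len(answers)  # share of participants on the top answer
```

A rough rule of thumb under these assumptions: if consensus stays near 1/N, switch from an open question to a forced choice among the top few candidates on the next iteration.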
    19. Online Guerrilla Testing: automated version of the paper guerrilla test to reach a larger audience. Materials cost: $0*. Incentives cost: $0. Setup time: ~1 hr. Test time: 0. Analysis: ~1 hr. Participants: in progress. Description: "survey" distributed via Subject Specialist Librarians, news items, and directly from the access system interface.
    20. Where would you click to go directly to an article?
    21. Questions? All past reports: www.lib.umich.edu/usability. Suzanne Chapman (suzchap@umich.edu), Ken Varnum (varnum@umich.edu)