
User Research on a Shoestring


1. User Research on a Shoestring
   Susan Teague-Rector and Erin White, VCU Libraries, Richmond, VA
2. What we'll cover
   - Improving the user experience of your (web) product through research before and during your design and development process.
   - Not spending a lot of money.
   - Taking the scary out of usability.
3. What we'll cover
   Before you begin: evaluating use of what you have
   - Heatmapping
   - Analytics
   As you work: assessment of user experience
   - Virtual user testing
   - Guerrilla testing
   - Microfeedback
4. Some context
   Sound familiar? Large university (32,000 students), small Libraries web team (3 folks), and many projects (~1,000,000). Your situation in a nutshell:
   - There's never enough time;
   - spending the little money you have is a hassle, but
   - you still want to make data-driven design decisions.
   Optional addition:
   - Your boss wants you to justify user research.
5. Justifying a little research
   If your supervisors still need coaxing (or if you do), here's why any teeny little bit of user research is going to make your life easier:
   - Research can significantly decrease development and maintenance time, which means cost savings.
   - Google is right: focus on the user and all else will follow.
   - Happy users = brand buy-in. When users like the library more, everybody's happy.
6. Let's get in it
   We're going to use a case study of the last VCU Libraries web redesign to demonstrate some of the tools we used to do user research.
7. The year: 2009. The goal: redesign.
8. The plan
   - Heatmapping
   - Analytics
   Combined with:
   - Existing feedback
   - Library literature
   - Virtual usability testing
   - Guerrilla tests
   - Heatmapping
   - Microfeedback
   (Diagram labels: original homepage, prototypes, new homepage)
9. Heatmapping
   - What are you measuring?
     Where are people clicking? What is getting attention?
   - What tool to use?
     We used CrazyEgg and tracked clicks on the homepage for a week in the middle of spring semester.
   - What does it track?
     Clicks based on network and browser properties and return visits, among others.
   - How does it work?
     Insert the JavaScript into the page you want to measure, then let CrazyEgg do the work.
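The "insert the JavaScript" step above usually amounts to a single script tag pasted into the page template. A hypothetical sketch of what that looks like; the URL and account ID here are placeholders we made up, not CrazyEgg's actual snippet (their dashboard gives you the real one):

```html
<!-- Hypothetical heatmap embed. The src URL and account ID are
     placeholders, not CrazyEgg's real snippet. Paste the vendor's
     snippet just before </body> on the page you want to track. -->
<script src="//example-heatmaps.invalid/tracker/0000001.js" async></script>
```

Because the snippet loads asynchronously, it doesn't block the page for users, which is part of why this kind of tracking is invisible to them.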
10. Heatmapping results
11. Heatmapping: pros and cons
   Pros
   - Robust statistics with minimal effort and $$$.
   - Fast way to evaluate components of a page.
   - Reports are easily PDF-able and aren't overly technical, so you can easily have a document to send up to managers.
   - Invisible to users.
   Cons
   - Clicks don't tell the whole story. We know where people click, but we don't know if folks are finding what they want, or if the page makes any sense in the first place.
12. We've done post-launch heatmapping, too.
13. Analytics
   - What are you measuring?
     - What pages get the most traffic?
     - What search terms are people using to get to your site?
     - What are people searching for on your site?
   - What tool to use?
     We used Google Urchin and an in-house search query tracker (removing personally identifiable data like IP addresses). Google Analytics does this as well.
   - How does it work?
     - Urchin analyzes server logs of page visits, user characteristics, and referring sites.
     - The in-house tool records search terms in a database and displays a word cloud of frequent terms.
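The in-house tracker described above boils down to tallying logged queries into term frequencies, which is all a word cloud needs. A minimal sketch (the function and the sample log are ours for illustration, not the actual VCU tool):

```python
from collections import Counter

def term_frequencies(queries):
    """Split logged search queries into lowercase terms and tally them.
    The (term, count) pairs are the raw material for a word cloud."""
    terms = []
    for q in queries:
        terms.extend(q.lower().split())
    return Counter(terms)

# Hypothetical queries pulled from the search log table
log = ["library hours", "renew book", "JSTOR", "renew a book"]
freq = term_frequencies(log)
print(freq.most_common(3))  # most frequent terms first
```

Sizing each word in the cloud proportionally to its count is then a straightforward rendering step.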
14. Analytics: Urchin
15. Analytics: Search terms
16. Analytics: pros 'n' cons
   Pros
   - Search terms tell us what pages are hard to find, and ways users think about our site.
   - Page visits tell us what's popular, and where people are coming from to get there.
   - Analytics tracking is invisible to users.
   Cons
   - These tools can't track user navigation paths through our site.
   - Again, clicks don't tell the whole story. Usage data only tells us so much.
17. And now... usability research.
   (I found this on the internet years ago, and if you can tell me what this is from or who the artist is, please e-mail me. It's driving me nuts. - Ed.)
18. User experience: what we're here for (or, what is UX?)
19. What smart people say about UX
   Peter Morville gets us thinking about UX as a honeycomb of elements that together can form a positive user experience. (Morville, P., The User Experience Honeycomb.) We list several UX resources at the end of the presentation, too.
20. Usability and UX: what's the diff?
   Usability is a component of user experience, the part of this equation that you have control of.
21. Demystifying usability
   Usability research just asks, "Does this thing make sense to other humans?"
   - There is no perfect tool.
   - Usability engineering is a process, not an end result.
   - Usability research can be easy, fun, fast, and inexpensive.
   - And anyone can do it.
22. This book rules (and is a fast read)
   Krug on how we should think about usability studies:
   - some > none
   - done > perfect
   - sooner > later
   OCLC # 61895021
23. Goals of usability testing
   Jakob Nielsen, Usability 101:
   - Learnability: how easy is it for users to do basic tasks the first time they see the site?
   - Efficiency: once users have learned the design, how quickly can they perform tasks in that session?
   - Memorability: when users return to the design after a period of not using it, how easily can they reestablish proficiency?
   - Errors: how many errors do users make, how severe are these errors, and how easily can they recover from the errors?
   - Satisfaction: how pleasant is it to use the design?
24. So let's get crackin'
   We tried to evaluate user experience through virtual usability testing with Usabilla, in-person guerrilla usability testing, and microfeedback forms.
   All these methods are components of a UX toolkit. Individually, they are helpful, but together, they give a more complete picture.
   There is no "right" answer, but with a combination of tools you can learn more than with one tool alone.
25. Virtual usability testing
   - What are you measuring? The intangibles of user experience.
     - What doesn't make sense?
     - Where do you think you should click to perform X task?
     - What works?
   - What tool to use?
     We used Usabilla. 5secondtest and a host of others provide similar services.
   - How does it work?
     - Upload a screenshot of a web interface and create questions to ask about the interface. Questions can be task-based or feelings-based.
     - Users visit the testing site on their own time and give feedback in the form of clicks and optional notes.
26. What do you like about this page?
27. What do you like about this page? (comments from users)
   Notes:
   - nice white space around outside
   - nice centralized location for web 2.0
   - boxes arranged in tight enough formation that they aren't just "a page of text"
   - dynamic (?) changing image centralized
28. What do you dislike about this page?
29. What do you dislike about this page? (comments from users)
   Notes:
   - hours at top, directions bottom left, events bottom right, services in the middle ... i'm lost
   - 'contact us' twice within inches of each other?
   - 'suggestions and feedback' (what's the difference?) down here, 'ask us' up top
   - what could possibly be in 'about us?' it all seems to be on the homepage ...
   - location of FOL thing near web 2.0 stuff is bad -- confuses def'n of "friend"
   - will this be consistent branding throughout?
   - not sure if people will understand library services a-z w/o prompting
30. What do you dislike about this page?
   (Same user notes as the previous slide; the changes we incorporated are highlighted.)
31. Virtual usability testing: pros and cons
   Pros
   - Fast and easy for all involved.
   - Data on intangibles that we couldn't get through web stats.
   - Free-text suggestions as a catch-all for unforeseen issues.
   - Bonus: users could position free-text comments contextually on the page, so we could easily figure out what the hell they were talking about when they said, "this thing here doesn't make any sense."
   Cons
   - It can be hard to find users who are invested enough to (a) participate, and (b) give free-text responses.
   - Keep context in mind if you have a small, relatively homogeneous sample (e.g., library faculty).
32. Guerrilla testing
   - What are you measuring? The intangibles of user experience: feelings, task completion, other thoughts about features and functionality.
   - What tool do you use? Paper prototypes, basic web prototypes, or anything in between; and a little extraversion.
   - How does it work?
     - Take advantage of the fact that your web users are the same people who use your physical spaces.
     - Develop a short script, walk out into public spaces with prototypes, seek out participants, and ask people questions.
     - Shut up and listen.
33. Example method
   - Before you begin: develop a short script for use at the beginning and end of each interview, as well as a few questions for each user.
   - Go out into the library and approach people who look like they could spare a few minutes to talk. Try to target folks across populations: undergrads, grads, faculty.
   - Shoot for a 5-minute interview. Keep it short, and try not to overwhelm.
   - Don't worry about getting it perfect every time.
   - Write down everything you can: verbal and nonverbal responses, gestures, facial expressions, etc.
   - Respect the user's time.
   - Listen.
   - Say thank you!
34. Example intro script
   Hello, my name is _______ and I'm with VCU Libraries. We've been working on a redesign of the Libraries web site and would love to get your feedback on the design. Would you be willing to participate in a short 5-minute survey?
   We will not be collecting any personal information except your affiliation with the university.
   This is also not a test of your abilities, but just a tool to help us understand how the site will work for our customers.
   Do you have any questions before we begin?
35. Example questions to ask
   - What is your affiliation with the university?
   - What single feature do you like most about the web page?
   - What single feature do you like least about the web page?
   - How many Top Resources does the Libraries recommend?
   - In the redesign, how would you:
     - Find a book?
     - Find articles for your specific topic?
     - Place a hold or renew a book?
     - Order/find items that VCU Libraries does not own?
     - Contact a librarian?
   - What single feature would you like to have available that you do not see included?
36. Guerrilla testing: pros and cons
   Pros
   - Most web developers have to seek out users; library users are right here in our building.
   - It's an invaluable opportunity to talk to real users and learn about problems you might not've even considered.
   - You will find usability issues FAST. Be prepared to be flexible about addressing them.
   - You can repeat guerrilla tests as you iterate on designs.
   Cons
   - Users may not be invested enough to give in-depth answers. Give small compensation if you can.
   - Be prepared to return to the drawing board (a good problem to have!).
   - Be mindful of your biases when selecting participants.
37. Microfeedback
   - What are you measuring? The intangibles of user experience during/after a new web product launch.
   - What tool do you use? Small, 3-question quick feedback forms built in PHP. Some folks also use GetSatisfaction, UserVoice, or Google Docs.
   - How does it work?
     - Create a prominent "Give Feedback" link to the survey, or just embed the survey on your page.
       - Keep it small, fast, and easy. Make targets big.
       - Don't require fields. Let users decide what to tell you.
     - Users submit quibbles/questions/comments.
     - Look at the comments, look for trends, and address issues.
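The handler behind a form like this stays tiny because nothing is required. The VCU form was built in PHP; this Python sketch (names and storage are ours, purely illustrative) just shows the shape: accept whichever fields the user filled in, drop anything unrecognized, and append to storage:

```python
# Minimal microfeedback handler sketch. Every field is optional;
# unrecognized ratings are dropped rather than rejected.
RATINGS = {"Love it", "Like it", "Indifferent", "Confused", "Hate it"}
QUESTIONS = ("ease_of_use", "content", "layout", "overall")

def record_feedback(form, store):
    """Keep only recognized ratings and a trimmed free-text comment.
    Completely empty submissions are ignored."""
    entry = {q: form[q] for q in QUESTIONS if form.get(q) in RATINGS}
    if form.get("comments", "").strip():
        entry["comments"] = form["comments"].strip()
    if entry:
        store.append(entry)
    return entry

store = []
record_feedback({"overall": "Like it", "comments": " broken link on hours page "}, store)
```

Because users decide what to submit, the analysis side is just counting what came in, which is what the next slide's table shows.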
38. The questions we asked
   Plus a free-text field: "Any other comments?"

   |             | Love it | Like it | Indifferent | Confused | Hate it |
   | Ease of use |    o    |    o    |      o      |    o     |    o    |
   | Content     |    o    |    o    |      o      |    o     |    o    |
   | Layout      |    o    |    o    |      o      |    o     |    o    |
   | Overall     |    o    |    o    |      o      |    o     |    o    |
39. Three months, 268 responses
   ...and a host of free-text responses (n ~= 100).

   |             | Love it | Like it | Indifferent | Confused | Hate it |
   | Ease of use |   93    |   65    |     20      |    21    |   38    |
   | Content     |   85    |   75    |     29      |    21    |   25    |
   | Layout      |   92    |   75    |     21      |    19    |   28    |
   | Overall     |   90    |   75    |     18      |    22    |   27    |
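One quick way to read a table like this is to collapse each row into a positive share: "Love it" plus "Like it" over all responses for that row (row totals differ because not everyone answered every question). A sketch using the counts reported on this slide:

```python
# Counts from the slide, ordered: Love it, Like it, Indifferent, Confused, Hate it
results = {
    "Ease of use": (93, 65, 20, 21, 38),
    "Content":     (85, 75, 29, 21, 25),
    "Layout":      (92, 75, 21, 19, 28),
    "Overall":     (90, 75, 18, 22, 27),
}

def positive_share(row):
    """Fraction of respondents who chose 'Love it' or 'Like it'."""
    return (row[0] + row[1]) / sum(row)

for name, row in results.items():
    print(f"{name}: {positive_share(row):.0%}")
```

Roughly two-thirds of respondents rated each dimension positively, which is useful context before diving into the angrier free-text comments.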
40. Microfeedback: pros and cons
   Pros
   - Quickly pick up on unforeseen problems, broken links, and usability issues.
   - Users can give as much or as little info as they want, fast.
   - Some users use microfeedback already and may be more willing to use it (think Facebook's "like" feature).
   Cons
   - Some users thought this was a library help form. Be clear about the form's scope when you post it.
   - The sample mostly includes lovers and haters, and not many in between. Don't take mean responses (and there will be some) personally.
41. Remember this slide?
   When doing user research, remember:
   - some > none
   - done > perfect
   - sooner > later
   OCLC # 61895021
42. Thanks! And resources
   - User Experience Network
   - UXMatters
   - Garrett, J. J. The Elements of User Experience
   - Krug, S. Don't Make Me Think
   - Morville, P. Ambient Findability
   - Nielsen, J. Designing Web Usability and UseIt.com
   - Designing Better Libraries
   - Adaptive Path
   - Quick and Dirty Remote User Testing
43. The authors
   Susan Teague-Rector, Web Design Project Librarian, NCSU Libraries
   Erin White, Web Applications Developer, VCU Libraries
