
UX Camp Amsterdam 2015 - User Checks


A 2-hour workshop for UX Camp Amsterdam 2015.
Getting to a better design fast. User Checks is an agile approach to usability testing focused on creating value. With User Checks the design reaches a higher level within a short period, at relatively low cost and with few resources. User Checks maximize the key element of usability testing: getting to empathy.

Published in: Design


  1. userneeds! @anous - User Checks! Get to a better design fast. Workshop by Userneeds (Anouschka Scholten), September 12, 2015
  2. Quantitative methods
  3. Make people’s lives better.
  4. (image slide)
  5. People do not always do what you think they do! People do not always do what you tell them to do! People do not always do what they think they do! People do not always do what they say they do! Observing and asking why lets you find out what people really do and need!
  6. Usability testing is about observing behaviour: watch the user do what comes naturally while performing a task (thinking aloud). Don't help. Don't ask for opinions.
  7. Getting to empathy: a key function of usability testing. Usability testing enables you to really see, hear and find out what moves your users - the ability to see an experience through another person's eyes.
  8. Usability ROI: • Increased conversion rates: whatever the user goal is (purchase, download, troubleshooting, etc.), they can achieve it with ease • Increased productivity • Reduced need for help desk intervention • Reduced drop-off rate at checkout
  9. No usability testing: • Customer dissatisfaction • Loss of potential income • Losing money on marketing • Higher development costs • Bad reputation • Falling behind the competition
  10. User Checks & traditional usability testing
  11. Usability test, traditional: 1 round of testing, in a lab environment, with eye tracking. The moderator is a UX researcher. Process based.
  12. Traditional usability test drawbacks: • 1 time, at the end of the design process • An extensive report, little or no redesign • No retest • Not in the user's context • High costs: lab, eye tracking, specialist UX researcher • Product team only indirectly involved • Must be scheduled well in advance (at least 1 month)
  13. (image slide)
  14. User Checks, agile usability test: test and retest within 1-3 weeks. In the user's context, quick and easy set-up. The designer conducts the test, the team is directly involved. Value based.
  15. User Checks, agile usability testing, based on the RITE model: 1. Design/MVP > 2. Round 1 > 3. Evaluate, prioritize improvements > 4. Improved design/MVP > 5. Round 2 > 6. Evaluate, prioritize improvements > 7. Improved design/MVP. Repeat or develop.
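The seven-step loop above can be sketched in code. This is purely an illustrative sketch, not part of the workshop: every name (`observe_user`, `prioritize`, `improve`) is a hypothetical stand-in for a human activity in the process.

```python
# Hypothetical sketch of the User Checks (RITE) loop described above.
# All function names and data are illustrative stand-ins for human work.

def observe_user(design, user_id):
    """Stand-in for one 'user check': observing a real user on real tasks."""
    return [f"user {user_id} hesitated at step 1 of {design}"]

def prioritize(findings):
    """Stand-in for the team session: group findings, pick improvements."""
    return sorted(set(findings))

def improve(design, improvements):
    """Stand-in for reworking the prototype before the next round."""
    return f"{design} v+1 ({len(improvements)} fixes)"

def user_checks(design, rounds=2, users_per_round=3):
    """Steps 1-7 above: test with 2-3 users, evaluate, improve, retest."""
    for _ in range(rounds):
        findings = []
        for user_id in range(users_per_round):
            findings.extend(observe_user(design, user_id))
        design = improve(design, prioritize(findings))
    return design  # repeat the loop, or hand the design to development
```

The point the sketch makes is structural: the test-evaluate-improve cycle runs at least twice before the design is handed off.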
  16. User Checks steps - 0. Prepare: focus the test, informed by data and user feedback; what is the goal, what do you want to improve, where do you want to find out the why?
  17. User Checks steps - 1. The MVP/prototype: the test subject; if in the design phase, prototyped with focus on the problem area.
  18. User Checks steps - 2. User Checks, round 1: observe 2 or 3 real users attempting real task(s); conduct a simple usability test per user (a user check). The (UX) designer acts as moderator, a team member as observer. Preferably in the user's context.
  19. User Checks steps - 3. From findings to improvements: what have we learned, prioritize, what are the improvements? Via the KJ method and an impact/effort matrix.
  20. User Checks steps - 4. Improved design > 5. User Checks round 2: improve the prototype, then do a 2nd round of user checks with the next 2 or 3 real users (not the same users as in round 1!).
  21. User Checks steps - 6. Findings and improvements > 7. Improved design: collect and group findings, prioritize. Think of design improvements, prioritize them and implement them in a better design. Repeat the process or implement the design.
  22. "Outsourcing your user research work is like outsourcing your vacation. It gets the job done, but probably won't have the effects you were seeking." - Jared Spool
  23. Examples!
  24. GGD Haaglanden project, Angi Studio: design of a new site; a user check with a doctor in his office.
  25. GGD Haaglanden project, Angi Studio: User Checks during a completely new design.
  26. Project Angi Studio, redesign of the LINDA. subscription flow. Pop-up information about LINDA.meiden: "LINDA.meiden... I expect this is for 14 to 21 year olds... oh, reading this description I think it actually does fit me?"
  27. Project by Angi Studio: sub home, subscription with incentive (mobile). "Very clear and easy to walk through this way, one below the other." Findings: scrolling a long list was no problem at all on mobile; navigating via the main menu on top wasn't used in the first round - it was ignored, even with a sticky menu (dead-end alert!); in the 2nd round the footer with the next step was used.
  28. User Checks during a redesign, project Angi Studio.
  29. Subsidy application system for the Gemeente Den Haag, project Angi Studio: User Checks during a completely new design.
  30. User Checks drawbacks: • An inexperienced or biased designer as moderator • Heavy pressure to improve the design within the agile process (keeping pace with development) • With high-fidelity testing, the retest demands velocity and a high multidisciplinary effort: product owners, IxD, visual design and content
  31. Compared - traditional usability tests vs. User Checks:
• Cost: from 10,000 EUR vs. from 5,000 EUR
• Test participants: 6-8, once vs. 2 rounds with 3 participants
• Who performs: UX researchers/specialists only vs. the product team (designers, product owner, developers)
• Where: lab vs. the user's context or a meeting room
• How often: once vs. twice or more
• With: 1 MVP/design vs. 2 or more variations of the MVP/design
• Equipment: laptop/mobile, eye tracking, video vs. laptop/mobile, audio and screen capture
• Plan: at least 1 month upfront vs. recruit (target) users and conduct
• You get: a report, 1-2 weeks later vs. a better design and a retest in 1-3 weeks, plus an engaged team
  32. Get down to it!
  33. What to test?
  34. When to test: • As early as possible • As often as possible. ROI: • Save on development time • Fewer support calls/e-mails • More conversion
  35. Procedure for a usability test: 1. Let your user experience the prototype/MVP/site: observe, don't tell. Put your prototype in the user's hands (or your user in the prototype) and give just the minimum context so they understand what to do. Give them a task. 2. Have them talk through their experience (think aloud): for example, when appropriate, ask as the host: "Tell me what you are thinking as you are doing this." 3. Actively observe: watch how they use (and misuse!) what you have given them. Don't immediately "correct" what your user tester is doing. 4. Follow up with questions: this is important; often it is the most valuable part of testing. "Show me why this would [not] work for you." "Can you tell me more about how this made you feel?" "Why?" Answer questions with questions (e.g. "Well, what do you think that button does?").
  36. Key: moderating the test. If they... / say...:
• Are not talking: "What are you thinking?"
• Ask you a question (e.g. "Is that what I should do here?"): rephrase the question (e.g. "What do you think you should do?")
• Get a task right or wrong: "Thank you, that is very helpful" / "Thanks for the feedback"
• Mess up: "Remember, you can't make any mistakes. You're doing a great job."
• Are unsure if they have completed a task and ask you: "Is this what you would do if you were doing X at home?"
• Criticize the design: "Thanks for the feedback"
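The moderator cheat sheet above is essentially a lookup table, which makes it easy to keep at hand in code form. The situations and responses below are taken from the slide; the dict itself is just an illustrative memory aid, not part of the workshop material.

```python
# The moderator cheat sheet above as a lookup table: situation -> response.
# Wording is taken from the slide; the data structure is illustrative.

MODERATOR_RESPONSES = {
    "not talking": "What are you thinking?",
    "asks a question": "Rephrase it: 'What do you think you should do?'",
    "gets a task right or wrong": "Thank you, that is very helpful.",
    "messes up": "Remember, you can't make any mistakes. "
                 "You're doing a great job.",
    "unsure if task is complete": "Is this what you would do if you "
                                  "were doing this at home?",
    "criticizes the design": "Thanks for the feedback.",
}

# Example: the participant goes quiet during the think-aloud task.
print(MODERATOR_RESPONSES["not talking"])  # "What are you thinking?"
```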
  37. User Checks set-up: • A test script with real tasks • Real target users as participants • Ideally conducted in the user's context, on their own device • Record audio and screen capture • 1 test moderator, 1 observer (more observers are possible, but not too many, and they should really stay in the background)
  38. Test script / agenda: 1. Introduction and up-front questions 2. Situation sketch + think-aloud instruction 3. Think-aloud session 4. Evaluation of the session, open questions 5. End of session - thank the participant
  39. Example introduction: Thank you for participating, we really appreciate it! I'm <name>, and <name> is also here with me. We're helping a company to develop a new service / to optimize a website. This company cares about its customers and thinks it's very important to involve potential customers when developing new services. They want the new service to be valuable for you. That's why we want to ask you some questions, and along the way we'd like to show you the service/site. You can walk through it. The interview takes about 20 minutes. If you do not understand something, or you want to add something, do not hesitate to interrupt me! Do you mind if we record the session for future reference within our organisation? Everything we discuss and record is confidential; we will not share anything without your permission! Any questions?
  40. Task and think-aloud instruction: You are going to see a draft proposal of the service, which has not yet been developed / the site as it is now. <if a prototype> The proposal is only images. It looks finished and you can click through it, but most of it is not working yet. It's also not as fast as a normal app, and you can't do anything wrong. <Situation sketch and task> Please open the link on your mobile phone and walk through the app. Can you please say things aloud? Not to me, but for yourself? Everything you do or say is fine, e.g. even "what the hell is this?!". You can start now by clicking the link.
  41. Evaluation questions: after the think-aloud task, zoom in on what you observed, for example: • You mentioned... what did you mean by that? • I saw you doing... why? What did you expect? What would you call it? • You didn't go to... why not? What did you expect to find? Also ask service-, product- or content-specific questions, e.g.: • You read this advice/text thoroughly, why? • You were going back and forth between product X and Y, why? • You chose X and said something about the participants, what did you mean by that?
  42. A case!
  43. Tell us: • The main goal(s) of the site/application • The target users • The main tasks and main questions of the users • The context of use. Do not tell us how you think the site works now!!!
  44. Volunteers: • 2 moderators • 2 users
  45. Observers: • Yellow sticky notes for findings • Be objective, record behaviour! • ONLY write down what you observe, like: "mentions X and doesn't seem to know what X is"; sighs heavily on returning to...; is confused about...; does not know where to go next; gets frustrated; ... • Other-coloured sticky notes for possible improvements that pop up in your mind, like: feedback is missing; label is not understood; no focus; X doesn't meet expectations; ...
  46. Observe the user's behaviour. Note every occasion the user: • hesitates, worries, or is forced to think • misunderstands something • gets frustrated or annoyed • gives up • is surprised
  47. 1st user check session: 20 min. 2nd user check session: 20 min. Findings & improvements: 20 min.
  48. Evaluate findings - the findings and issues: • Collect the findings • Prioritize the issues to be solved: prioritize from the business/user goals perspective and rate them. Do this with the whole team.
  49. Evaluate findings - design improvements: • Note down improvements on every level (labeling, content, categorization, visual assets, interaction, concept, flow, persuasion), but keep the business/user goals perspective in mind (focus)! • Prioritize by impact & effort (matrix) or dot-vote • Choose the improvement(s) to be retested. Do this with the whole team.
  50. Impact & effort matrix: impact is the high/low value created for users; effort is the high/low effort in terms of time, knowledge, tools, people... Do this with the whole team, incl. decision makers!
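The impact/effort sorting above can be made concrete with a small sketch: each improvement is scored high/low on both axes and falls into one of four quadrants. The quadrant labels and the example improvements are my own illustration, not from the deck.

```python
# Minimal sketch of the impact/effort matrix: score each improvement
# high/low on both axes and sort it into a quadrant. Labels and example
# improvements are illustrative, not from the workshop material.

def quadrant(impact_high, effort_high):
    """Map high/low impact and effort to common quadrant labels."""
    if impact_high and not effort_high:
        return "quick win"        # do these first
    if impact_high and effort_high:
        return "big project"      # plan deliberately
    if not impact_high and not effort_high:
        return "fill-in"          # do when convenient
    return "thankless task"       # low value, high effort: usually drop

# Hypothetical improvements from a round of user checks: (name, impact, effort)
improvements = [
    ("clarify the subscribe label", True, False),
    ("rebuild the checkout flow", True, True),
    ("tweak footer spacing", False, False),
]

for name, impact, effort in improvements:
    print(f"{name}: {quadrant(impact, effort)}")
```

In a session this scoring is done on a whiteboard with sticky notes; the code only fixes the decision rule.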
  51. Prepare to retest - hypothesis and improve: improve the design and set up the hypothesis (or more than one). We believe that [this statement is true (e.g. adding social proof, better labeling, visual cues... will...)]. We will know we are right/wrong when we have the following result: [quantitative feedback; qualitative feedback; KPI; ...].
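The hypothesis template above is a fill-in-the-blanks sentence, so it can be captured as a tiny helper. The example statement and result criterion are invented for illustration; only the sentence skeleton comes from the slide.

```python
# The retest-hypothesis template from the slide as a fill-in helper.
# The example arguments below are invented for illustration.

def hypothesis(statement, result):
    """Fill the 'We believe that... We will know...' template."""
    return (f"We believe that {statement}. "
            f"We will know we are right/wrong when we have "
            f"the following result: {result}.")

print(hypothesis(
    "adding social proof to the subscribe page will raise sign-ups",
    "round-2 users complete the subscribe task without hesitating",
))
```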
  52. Just do it!!
  53. Thank you!