
UX Tuneup


Published in: Technology, Design

  1. UX Tuneup. Tri-UPA Workshop, presented by Carol Barnum, Usability Center @ Southern Polytechnic
  2. How'd you get here? Self-taught? Read "the book"? Educated? Trained? Some other way?
  3. Workshop agenda: heuristic evaluation & usability testing; what still works; what needs a tuneup
  4. Heuristic evaluation. Morning focus: tradition to today
  5. UPA survey says . . . Heuristic evaluation/expert review: 77% of respondents in 2007, 74% in 2009, 75% in 2011
  6. Why so popular? Fast. Cheap. Easy. Effective. Convenient.
  7. (image slide)
  8. Tradition: Nielsen's 10 heuristics. 1. Visibility of system status. 2. Match between system and the real world. 3. User control and freedom. 4. Consistency and standards. 5. Error prevention. 6. Recognition rather than recall. 7. Flexibility and efficiency of use. 8. Aesthetic and minimalist design. 9. Help users recognize, diagnose, and recover from errors. 10. Help and documentation. (J. Nielsen and R. Mack, eds., Usability Inspection Methods, 1994)
  9. The Nielsen Method. Small set of evaluators: 3 to 5 is the optimal cost-benefit; a single evaluator finds 35% of problems. Each evaluator inspects alone: 1 to 2 hours, several passes through the interface, inspection based on the heuristics; if evaluators are not SMEs, hints can be given; each evaluator writes notes or a report.
  10. The Nielsen Method. After the individual evaluations are done, evaluators talk to each other, often with a facilitator; share reports/notes; collate findings; rank issues by severity; and write a compiled report.
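The collate-and-rank step above can be sketched in code. This is an illustrative sketch only: the finding names and the 0-4 severity scale here are invented examples, not material from the workshop.

```python
from collections import defaultdict

def compile_report(evaluator_findings):
    """Merge each evaluator's notes (finding -> severity, 0-4) into one
    list, ranked worst-first, noting how many evaluators hit each issue."""
    merged = defaultdict(lambda: {"severity": 0, "evaluators": 0})
    for findings in evaluator_findings:
        for finding, severity in findings.items():
            entry = merged[finding]
            # Keep the highest severity any evaluator assigned
            entry["severity"] = max(entry["severity"], severity)
            entry["evaluators"] += 1
    # Sort by severity, then by how many evaluators found it
    return sorted(merged.items(),
                  key=lambda kv: (-kv[1]["severity"], -kv[1]["evaluators"]))

report = compile_report([
    {"no visible system status": 3, "jargon labels": 2},
    {"jargon labels": 4},
])
```

Taking the maximum severity across evaluators is one defensible choice; teams that prefer consensus ratings average instead.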
  11. Nielsen's variations on the method: supply a typical usage scenario listing the steps a user would take to perform tasks; hold a design debrief with the designers; use brainstorming to focus on possible solutions; include positive findings.
  12. And the method is called . . . "Discount Usability Engineering"
  13. So, what do you get? A list of potential problems; also (sometimes) the positive findings; tied to a heuristic or rule of practice; a ranking of findings by severity; (sometimes) recommendations for fixing problems; a report of findings.
  14. What do you do? Well, I always . . .
  15. What do I do? Well, I used to follow Nielsen.
  16. Phase 1: Nielsen is my bible.
  17. CUE-4, Hotel Pennsylvania, 2003. Comparative evaluation of a reservation process by 17 teams; 8 did expert review/heuristic evaluation, and only 1 team used Nielsen's heuristics. Rolf's conclusions: findings were "overly sensitive" (too many to manage); classification schemes need improvement; recommendations need to be more precise and usable. (CHI 2003; results available at Rolf Molich's DialogDesign)
  18. UPA 2011 (image slide)
  19. After that, what did I do? I got a little older and wiser.
  20. Phase 2: Loose interpretation of Nielsen. Dropped his heuristics; kept severity ratings; added screen captures, callouts, and recommendations.
  21. Hyperspace, Shock, and Cardiac Arrest all require more clearly defined goals and objectives (H = Hyperspace; C = Cardiac Arrest; S = Shock). Sample findings, each rated severity 3 and applying to H, C, and S: (a) Objectives/goals for the modules not clear: it is unclear why content is being presented, the presentation lacks conciseness, and definitions are required. Recommendation: develop a consistent structure for the bulleted points, and avoid generic statements that don't focus users on what they will be accomplishing. (b) Evaluation criteria and assessment methods unclear. Recommendation: advise users that there is an evaluation, and indicate whether it comes at the end or is interspersed in the module. (c) Direct tie between content and assessment measure unclear. Recommendation: connect ideas in the goals and objectives with outcomes in the assessment. (d) Sequence of presentation does not follow logically from the introduction. Recommendation: follow the order of presentation defined at the beginning. (e) Quizzes do not challenge users. Recommendation: develop interesting and challenging quiz questions, and re-frame goals/objectives at the end of the module.
  22. (image slide)
  23. Then, what did I do? I broke free!
  24. Phase 3: I did it my way. Findings stated in our terminology; screen captures.
  25. (image slide)
  26. A unique password between 6 and 16 characters was required, but "unique" is not defined; this is a problem with terminology. Usually, passwords must be a combination of letters and numbers for higher security. An all-letter password ("Heuristics") was accepted; a dictionary term is not a secure password and contradicts accepted conventions, though the ability to input a dictionary word may be a component of trust for users. The username and security-question answer were rejected on submit; this result is confusing because the name was confirmed on the previous screen. This relates to establishing conventions for the form of names/passwords on the input screen: input formats need to be defined on the relevant page. Differences in spelling ("username" vs. "user name") are subtle but are consistency issues. The red banner is confusing because the user chose the gold (Free Edition); this is a consistency issue.
  27. User experience emerges in reviewer comments . . .
  28. Reviewer comment: "I wanna click on the map, not the pulldown. WAH! Also, I've got no idea what the text on this page means."
  29. Why not tell the user's story?!
  30. Strategy: persona-based scenario review. Ginny Redish and Dana Chisnell's AARP report (58 pages, 50 websites) used two personas, Edith and Matthew; evaluators "channel" the user via the personas and their tasks/goals, and the users' stories emerge. (Available from Redish & Associates)
  31. While the clickable area is very large in the navigation blocks, Edith expected to click on the labels, so she was surprised when the menu appeared. When trying to click an item in the menu above, Edith had trouble selecting because her mouse hovered close enough to the choices below to open that menu, obscuring the item she wanted to click. (Chisnell and Redish, Designing Web Sites for Older Adults: Expert Review of Usability for Older Adults at 50 Web Sites, for AARP)
  32. Engage in conversation with your reader. "Every use of every website is a conversation started by the site visitor." Ginny Redish, Letting Go of the Words, Morgan Kaufmann, 2007 (new edition coming)
  33. Tell the story of your user's experience. "Stories organize facts in memorable ways." Whitney Quesenbery and Kevin Brooks, Storytelling for User Experience, Rosenfeld Media, 2010
  34. Options for report deliverables: no deliverable; quick findings; presentation; detailed report
  35. Steve Krug's approach: all sites have usability problems; all organizations have limited resources; you'll always find more problems than you have resources to fix; it's easy to get distracted by less serious problems that are easier to solve, which means the worst ones often persist; therefore, you have to be intensely focused on fixing the most serious problems first. (Rocket Surgery Made Easy, New Riders, 2010)
  36. "Focus ruthlessly on a small number of the most important problems." Steve Krug
  37. "big honkin' report"
  38. Lighten the load: start with Quesenbery's 5 E's. Effective; efficient; engaging; error-tolerant; easy to learn. (Whitney Quesenbery)
  39. Customize your heuristics
  40. Walk in your user's shoes
  41. Your turn: expert review. Scenario: you want to do user testing in Atlanta. You heard there might be a lab at Southern Polytechnic State University; see if you can find whether they have a lab and can rent it to you. Your task for this review: work independently; jot down findings; then meet with a few others to organize the findings; discuss how you will report the top findings.
  42. Lunch
  43. Usability testing. Afternoon focus: small studies with a twist
  44. UPA survey says . . . 82% do it
  45. How do you do it? Lab; informal; contextual; remote; big; little; in between
  46. 46. What’s a small study good for? Research Exploration Testing Understanding prototypes users Answering arguments Slide 49
  47. 47. Why don’t we always test? Time Cost Can’t get Ignorance users Agile!!!! Slide 50
  48. 48. Faster, cheaper ways• First Fridays• RITE method• 5-second tests• Man on the street (or coffee shop)
  49. First Fridays: content/usability/first-fridays
  50. RITE method: rapid iterative testing and evaluation, developed by Microsoft's Game Studios. Requires full team commitment: observe; analyze findings immediately; change immediately; retest; do it again.
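The RITE loop lends itself to a compact sketch. Everything here is an illustrative stand-in (the issue names and session data are invented); the point is the shape of the loop: observe, fix immediately, then the next session serves as the retest.

```python
def rite_cycle(prototype_issues, sessions):
    """Run RITE-style sessions over a prototype.

    prototype_issues: issues currently present in the build.
    sessions: per-participant lists of observed findings.
    After each session, anything observed is fixed before the next one.
    """
    fixed = []
    for session_findings in sessions:
        # Observe + analyze immediately: which known issues surfaced?
        observed = [f for f in session_findings if f in prototype_issues]
        # Change immediately: remove them from the build before retesting
        for issue in observed:
            prototype_issues.remove(issue)
            fixed.append(issue)
    return fixed, prototype_issues

fixed, remaining = rite_cycle(
    ["label unclear", "button hidden", "slow checkout"],
    [["label unclear"], ["button hidden"], []],
)
```

Issues no participant hits (here, "slow checkout") survive the cycle, which is why RITE still needs enough sessions to exercise the whole interface.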
  51. 5-second test. Do it yourself.
  52. Remote testing. Do it yourself.
  53. Basics of testing: goals; persona(s); tasks/scenarios; protocol; assessment
  54. What's in a day?
  55. Your turn, option 1. Goal: ease of use for finding an online graduate program that supports UX interests. Create post-task questions. Select one person in your group to be the user; the user's task is to search for an online program in UX or a related field and answer: What are the requirements for admission? What are the fees? What is the next application deadline? Observers take notes; discuss findings; determine the top findings.
  56. Your turn, option 2. New device for a mobile phone user. Create a few tasks; write a few post-task questions; select a "new" user to be the participant; observers take notes; discuss findings; determine the top findings.
  57. Post-test feedback mechanisms: create your own; use SUS; use Product Reaction Cards; other?
  58. Create your own
  59. Let's write some questions. Q1. Q2. Q3.
  60. System Usability Scale (each item rated from Strongly Disagree to Strongly Agree): 1. I think that I would like to use this website frequently. 2. I found this website unnecessarily complex. 3. I thought this website was easy to use. 4. I think that I would need assistance to be able to use this website. 5. I found the various functions in this website were well integrated. 6. I thought there was too much inconsistency in this website. 7. I would imagine that most people would learn to use this website very quickly. 8. I found this website very cumbersome/awkward to use. 9. I felt very confident using this website. 10. I needed to learn a lot of things before I could get going with this website. This questionnaire is based on the System Usability Scale (SUS), developed by John Brooke while working at Digital Equipment Corporation. © Digital Equipment Corporation, 1986.
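SUS scoring is well defined: on a 1-5 scale, each odd-numbered (positively worded) item contributes its score minus 1, each even-numbered (negatively worded) item contributes 5 minus its score, and the sum is multiplied by 2.5 to give a 0-100 score. A minimal implementation:

```python
def sus_score(responses):
    """Compute a System Usability Scale score (0-100).

    responses: the ten 1-5 answers, item 1 first.
    """
    if len(responses) != 10:
        raise ValueError("SUS needs exactly 10 responses")
    total = 0
    for i, r in enumerate(responses, start=1):
        # Odd items are positively worded, even items negatively worded
        total += (r - 1) if i % 2 == 1 else (5 - r)
    return total * 2.5

# A participant who strongly agrees with every positive item and
# strongly disagrees with every negative one scores 100:
perfect = sus_score([5, 1, 5, 1, 5, 1, 5, 1, 5, 1])
```

Note the score is a single usability measure, not a percentage of tasks passed; an all-neutral response sheet (all 3s) lands at exactly 50.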
  61. 61. What’s in the cards?
  62. 62. Microsoft creates desirability toolkit 1. Faces Questionnaire 6 Faces 2. Product Reaction Cards 118 Cards Slide 66
  63. 63. How to deal the cards• Spread them out on table• Instruct user to – walk along the table and pick up cards that express the user’s experience – Share the meaning of the cards – User’s story emerges• In remote testing, provide a table or Excel spreadsheet – User highlights selections – Explains choices• Collate the results in clusters of similar/same cards
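Clustering same-card picks across participants is, at its simplest, a tally. The card names and picks below are invented examples, and `collate_cards` is a hypothetical helper, not part of the Microsoft toolkit:

```python
from collections import Counter

def collate_cards(selections):
    """Count how often each Product Reaction Card was chosen.

    selections: one list of chosen cards per participant.
    Returns (card, count) pairs, most-picked first, so repeated
    selections stand out.
    """
    counts = Counter(card for picks in selections for card in picks)
    return counts.most_common()

ranked = collate_cards([
    ["Easy-to-use", "Fast", "Confusing"],
    ["Easy-to-use", "Reliable"],
    ["Fast", "Easy-to-use"],
])
```

The counts only flag which cards recur; the participants' explanations of why they chose each card carry the actual story.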
  64. 3 TV weather websites: positive/negative card selections were 26/13 for Station A, 39/5 for Station B, and 24/17 for Station C.
  65. Easy-to-use; Helpful; Straightforward; Fast; Relevant; Reliable; Useful. Repeated positive card selections focused on ease of use, relevance, and speed.
  66. 66. Let’s try this
  67. 67. “But the light bulb has to want to change” Why do the most serious usability problems we uncover often go unfixed? Steve Krug and Caroline Jarrett #upa2012 Las Vegas
  68. 68. Survey says… Conflicted with decision makers belief or opinion Not enough resources Deferred until next major update/redesign Not enough time Too much else to do No effective decision maker Team did not have enough power to make it happen Required too big a change to a business process Technical team said it couldnt be doneOther events intervened before change could happen Disagreements emerged later Legal department objected 0 10 20 30 40 50 60 70 Number of times this reason was chosen from 131 total usable responses
  69. 69. Steve’s view: You can’t fix everythingProblems you canfind with just a fewtest participants Problems you have the resources to fix © 2001 Steve Krug
  70. 70. Jarrett/Krug theme: Do basic UX better• Do testing earlier• Make stakeholders watch the sessions• Present results better – More explanations – Use video clips 76
  71. 71. The one-two punch
  72. 72. Expert review• What’s it good for?• When do you do it?• What do you do with the results?
  73. 73. User testing• What’s it good for?• When do you do it?• What do you do with the results?
  74. 74. Happiness is . . . using both methods
  75. Read the book. Visit the website. Email me.
  76. Image credits: slides 1 and 3, Jarrod Clark.