UX Research: What They Don't Teach You in Grad School

Three case studies on UX techniques and methodologies that will inspire, amaze, and possibly strike fear. Through it all, lessons learned from the field and fundamentals of UX research are presented. The goal is to depart with practical perspectives and sufficient rigor to chart a course toward a customer-aware corporate strategy.

*Please note: we had technical difficulties during the Q&A, so we were unable to 'close out' properly, but the presentation itself was recorded without issue.*


UX Research: What They Don't Teach You in Grad School

  1. UX Research: What I did not learn in grad school… Gavin Lew, Executive Vice President, GfK User Centric, @glew. Chicago, October 2012
  2. Introduction © October 30, 2012 – Proprietary and Confidential
  3. A little about me… Gavin S. Lew
  4. Adjunct Faculty
  5. Adjunct Faculty
  6. [image slide]
  7. [image slide]
  8. [image slide]
  9. I did not finish my PhD
  10. What Happened?
  11. Overview: What Did I Learn in Grad School? [picture of grad school]
  12. What Did I Know?
  13. Founded User Centric. Started UC back in 1999… we are now GfK User Centric: 150+ global UX consultants with post-graduate degrees in behavioral sciences, human factors, or human-computer interaction
  14. Philosophy 1.0: At UC, we don’t “sell services” like usability testing or other forms of user experience research. What we strive to do is answer client questions; methodologies and techniques are just tools
  15. We have the privilege of… interacting with, designing for, and testing many user experiences. But this also means…
  16. Sometimes I feel like clients ask us to… But most of our work…
  17. This Title Slide SUCKS!!! [1 of 2]
  18. Philosophy 2.0: We believe that any business can be successful if they could just… take a bite out of suck. We take projects where we can have a positive impact with our clients, one that transforms their users’ experience
  19. Techniques (User Experience)
  20. Overview: But, What Does It Really Mean? [picture of thinker]
  21. Session Topics
  22. What They Do Not Teach You in Grad School: 1. “The experience you craft is more than just the product” 2. “Yes, usability can be measured…” 3. “Sometimes research is COMPLICATED” 4. “Design is not always walk-up-and-use”
  23. 1. “The experience you craft is more than just the product”
  24. Product or Service
  25. [diagram] Experience: the product or service sits inside a larger experience of touchpoints: out-of-the-box experience, IVR, call centers, store, web site, e-commerce, user guide
  26. [diagram] The same touchpoints, plus paper bills and HR
  27. [image slide]
  28. [image slide]
  29. Philosophy 3.0: We believe experiences matter. Must think beyond just the product itself…
  30. 2. “Yes, usability can be MEASURED”
  31. Must Fight Naysayers: As UX practitioners, we believe that concepts such as easy-to-use, intuitive, and usable can be measured. Unfortunately, naysayers believe that we cannot measure: “I know it when I see it.” Ultimately, what we do is MEASURE and CHANGE
  32. Does Anyone See More Than Snow?
  33. Anyone See a Dog?
  34. Once you see it, you cannot help but see it
  35. Dalmatian “pops”
  36. FedEx Spinner
  37. Who sees it?
  38. More help
  39. FedEx Spinner is “now” locked
  40. Spinner just “pops.” Forgive me, I just ruined your commute!
  41. Naysayers: You Cannot Measure Usability. “You just know it when you see it”
  42. Naysayers: Cannot Measure “Intuitiveness.” Correct this belief. Measurement must be: well defined, observable, quantifiable, repeatable
  43. Not All Measures Are Created Equal: Bad, Inappropriate, Good
  44. Consider a Horse Race: Which Measure Is Good? What measure is used?
  45. Different Conditions: Yes, That Is Snow…!!!
  46. Some believe the horse wants to win…
  47. Exercise Procedure: I will give you a task and count time; make a Yes or No decision; remember the time; raise your hand (right = Yes / left = No). Let’s practice first
  48. Exercise: Is the man wearing a red shirt? Decide Yes or No, note the time, raise your hand. Ready?
  49.–57. [timer slides: 1 sec through 9 sec]
  58. Task 1: Hospital setting; assessment of different prompts for an interaction with an interface. The patient has declined further treatment. The physician asked you to go into the system and cancel all of the orders. So, you select the orders and press Cancel. Decide Yes or No, note the time, raise your hand. Ready?
  59.–68. [timer slides: 1 sec through 10 sec]
  69. Task 2: Same scenario as Task 1, with a different prompt. Decide Yes or No, note the time, raise your hand. Ready?
  70.–79. [timer slides: 1 sec through 10 sec]
  80. Why Does This Matter? You can feel easy-to-use, and you need to design this way. You can feel bad interfaces, and also measure the effect
  81. 3. “Sometimes research can be COMPLICATED…”
  82. One Can Always Get Data… 1) Is this really “good” data? 2) Will the results produce actionable change? Integrity is everything in research
  83. Pick a Device, Any Device!
  84. Sanitization: MP3 players. Five selected target interfaces: Alpha, Bravo, Charlie, Delta, Echo
  85. Client Objective. Client asked User Centric to: understand the user experience related to these devices
  86. Client Objective. Client asked User Centric to: understand the user experience related to these devices; identify usability issues
  87. Client Objective. Client asked User Centric to: understand the user experience related to these devices; identify usability issues; recommend possible solutions to improve the UI
  88. Challenge: Help create a best-in-class UI. Sounded reasonable to us
  89. Effort Seemed to Be Largely Formative: Discovery emphasizes the qualitative; testing is pragmatic, with small samples to iterate the design. But not in this case, because…
  90. Case Study #1: But the client had different needs
  91. Design Research Included: 21 tasks of interest were selected, by high frequency of use (“Play Song”) and priority (“Create Playlist”). 5 user interfaces: four competitors and one client design (Echo). Alpha, Bravo, Charlie, Delta, Echo
  92. Task x Device Matrix (User Centric, Inc. UPA: June 2006)
  93. Case Study #1: But the client had EVEN MORE needs…
  94. Asked to Conduct Research to: Ensure that the design will be best-of-breed relative to the new competition
  95. Asked to Conduct Research to: Ensure that the design will be best-of-breed relative to the new competition. Oh, we failed to mention: this is extremely high profile; data will drive strategy; the report will go directly to C-level executives… And oh, did I mention that we’re gonna need
  96. Initial Design: 21 tasks x 5 designs, 100+ combinations. Picked: competitive usability testing. Within-subjects design? NOT an option: learning and fatigue. Between-subjects design…? How?
  97. When We Looked Closer… It actually got worse!
  98. Not All of the Tasks Worked for Each Device!
  99. Case Study #1: And even worse… but it gets even better! The client had another UI variant to add…
  100. And New Baby Makes Six!
  101. What else can I say?
  102. Case Study #1: And even worse… Could we try talking the company out of doing these things? Nope! Results will drive strategy
  103. We Did Manage to Convince the Company that UC would design the study such that it would be sensitive enough to detect statistical significance, if it indeed existed. Thus, there would be NO a priori assurances of finding significant differences, because they might not exist!
  104. The Real Challenge… We were charged with: research activities; a methodology that would provide data to justify design direction. A device was going to be built…
  105. Core Elements: Access points (navigating to a feature is easy); feature task flows (completing a task may be hard); design look and feel; iconography; verbiage
  106. Core Elements: Access points (navigating to a feature is easy); feature task flows (completing a task may be hard); design look and feel; iconography; verbiage
  107. Experimental Approach: Realistically, participants could only interact with 2-3 designs effectively (not 5-7); we assumed each participant could complete ~6 tasks. Create prototypes on a computer to level the “playing field” to core task-flow elements. Usability testing / quantitative data collection: recruited the target demographic (incl. high schools); needed simultaneous test teams; in the end, we really needed to know the “story”
  108. Control for Bias: Create Blocks (given the sheer size of all possible combinations). Within each block: order of task presentation (tasks were systematically counter-balanced to reduce learning and order effects); order of device presentation (for each participant, devices were randomized within each task to reduce learning and order effects)
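The block logic on this slide can be sketched in code. This is a minimal illustration, not the actual User Centric protocol: it assumes a simple rotation (Latin-square-style) for task counterbalancing and a per-task shuffle for device order, and the task and device names are stand-ins.

```python
import random

def build_block(tasks, devices, seed=0):
    """Sketch of one block: counter-balanced task order across
    participants, randomized device order within each task."""
    rng = random.Random(seed)
    # Rotate the task list so each participant starts on a different
    # task, spreading learning and order effects across the block.
    task_orders = [tasks[i:] + tasks[:i] for i in range(len(tasks))]
    block = []
    for task_order in task_orders:
        plan = []
        for task in task_order:
            order = devices[:]       # fresh copy for this task
            rng.shuffle(order)       # device order randomized per task
            plan.append((task, order))
        block.append(plan)
    return block

block = build_block(["Play Song", "Create Playlist", "Delete Song"],
                    ["Alpha", "Bravo", "Charlie"])
```

With three tasks this yields three participant plans whose first tasks all differ; a real study would additionally enforce the familiarity constraint described on the next slide (e.g. iPod owners never testing iPods).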
  109. Participants Were Assigned to Blocks: Individual participants received a block, assigned randomly to reduce learning effects. Constraint: familiarity biases were avoided (iPod owners would not interact with iPods). Estimated what could go into a 60-min session: blocks of four to six tasks seemed to “fit,” using the total number of steps to complete…
  110. So… How Many Participants? If each participant received five or six (of the 21) tasks by three (of the six) devices, there are 20 unique device combinations (P1 = devices ABC; P2 = ABD; P3 = ABE…). To complete a full block, 20 participants were needed. Each device-by-task cell was represented with at least 10 participants: N = 80 to have four blocks of 20
  111. So, This Looked Like This… [table: per-participant randomized task x device assignments for Participant #1 (Julian, 16-24 years) and Participant #2 (Tracy O., 25-44 years), each a list of pairs such as task 11 on Bravo, task 16 on Delta, task 6 on Alpha, …]
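The counting on this slide is easy to verify: choosing 3 devices out of 6 gives C(6,3) = 20 unique combinations, and any one device appears in C(5,2) = 10 of them. A quick sketch, using the device code names from the deck:

```python
from itertools import combinations

devices = ["Alpha", "Bravo", "Charlie", "Delta", "Echo", "Foxtrot"]

# Each participant interacts with 3 of the 6 devices, so one full
# block needs one participant per unique triple: C(6,3) = 20.
triples = list(combinations(devices, 3))
print(len(triples))  # 20

# Within a block, each device appears in C(5,2) = 10 triples, which
# is why each device-by-task cell sees at least 10 participants;
# four blocks of 20 give the N = 80 on the slide.
print(sum(1 for t in triples if "Echo" in t))  # 10
```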
  112. Measures: Time-on-task. Efficiency (deviation from optimal path): total screens viewed / optimal path for the task; more incorrect “steps” increase this metric. Success: % of participants in each cell (device x task) who successfully completed the task. Preference: pair-wise device preferences for a particular task, with a magnitude judgment
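These measures are straightforward to compute once task logs exist. A sketch with entirely hypothetical numbers; the efficiency ratio follows the slide's definition (total screens viewed divided by the optimal path length, so 1.0 is a perfect run and higher is worse):

```python
def efficiency(screens_viewed, optimal_screens):
    """Deviation from the optimal path for one attempt."""
    return screens_viewed / optimal_screens

def success_rate(outcomes):
    """Percent of participants in a device x task cell who succeeded."""
    return 100.0 * sum(outcomes) / len(outcomes)

# Hypothetical cell: five participants, one task on one device,
# where the optimal path for the task is 5 screens.
cell = [
    {"screens": 7,  "success": True},
    {"screens": 5,  "success": True},
    {"screens": 12, "success": False},
    {"screens": 6,  "success": True},
    {"screens": 9,  "success": True},
]
mean_eff = sum(efficiency(p["screens"], 5) for p in cell) / len(cell)
rate = success_rate([p["success"] for p in cell])
print(round(mean_eff, 2), rate)  # 1.56 80.0
```

Time-on-task and the pair-wise preference judgments would be captured and averaged per cell the same way; the preference measure is a subjective rating, not derivable from the logs.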
  113. Task-by-Task Analysis
  114. Sometimes numbers do not tell the whole story
  115. Sometimes the runner-up ain’t that bad
  116. What was the REAL STORY…? N = 80?… I asked for 100+
  117. Usability Issues / Participant Verbalizations
  118. Results Drove an Iterative Design Process: Start with the high-frequency / high-priority tasks: Why did the Foxtrot design win? Why did the Alpha design lose? Compare quantitative with qualitative. Complex tasks: the fastest time was not the best; more clicks, less error, high satisfaction. Sometimes winners would emerge for different reasons: how do you weigh different UI conventions?
  119. Lessons Learned: “Know the story”: qualitative data is an absolute “must have”; an extra 20 participants were used for qualitative data. Learning: counterbalancing was sufficient, but these are not walk-up-and-use devices… Avoid “Frankenstein” design: do not simply pick the winning task flow and implement it; consistency matters!
  120. How to Write Good Recommendations: All too often, we over-focus on making things SIMPLE
  121. 4. “Design is NOT always about walk-up-and-use”
  122. Not Everything Should Be WALK UP AND USE: Expert interfaces are around us everywhere. All too often… we design for the first hour of use, NOT the first year of use
  123. Call Center Interface
  124. Call Center Interface
  125. While on a Call, Knowing “History” Usually Helps: Click Notes
  126. Specific Notes Can Be Opened
  127. Specific Notes Can Be Opened
  128. Imagine… approx. 100 calls per day; a typical environment of cubicle farms; standing and asking questions; multi-tasking; time pressure; possible sales incentives in effect; rapid consumption of screens. Does this change anything?!?!
  129. Expert Users: Demonstrate over-learned behaviors: a high number of transactions, a huge volume of calls; rote memorization of commands and actions. The emphasis is all about their workflow: they make transactions so quickly, across multiple systems, that in most cases they do not need to look at the entire screen. IMPACT: users will not look at individual notes and will be less informed, thus driving calls back!!! Reality: traditional walk-up-and-use methods may be totally inappropriate and insufficient
  130. Main Screen
  131. Takeaway Lessons
  132. Core Issues: We measure. We change. Thank you
  133. 1. “The experience you craft is more than just the product” 2. “Yes, usability can be MEASURED” 3. “Sometimes research can be COMPLICATED…” 4. “Design is NOT always about walk-up-and-use”
