Man vs Machine: Qualitative vs Quantitative UX testing



  1. Man vs Machine (Johan Verhaegen)
  2. In my left corner: The Machines
  3. "Show stakeholders how to improve designs and marketing"?
  4. "(…) With it, you get objective statistics to quantify usability."?
  5. "Participants complete significantly more tasks than using the standard think-aloud methodology (…)"
  6. "This shows that the standard think-aloud methodology changes user behavior and thus can change the outcome of the task (…)"
  7. "There is only 1 conclusion: eye-tracking lets you uncover more usability problems without disturbing natural user behavior."
  8. In my right corner: The Men
  9. The Boss
  10. Scared… (© Disney-Pixar)
  11. And doomed…
  12. Because Machines deliver… fast
     • Quantitative usability research gives you (well, some kind of) strength
     • Reveals the invisible
     • Is visually compelling
     • Is exciting for management
     • Makes you look good
     • Bosses like that… a lot!
     • Especially in times like these
  13. So, the Super Hero buys himself a Machine that makes him look great!
  14. …or stupid
  15. Wise men say:
     "Eye-tracker: $15,000+.
     Ouija board: $22.99.
     Tarot cards: $19.99.
     Tea leaves: $0.49.
     Solid inferences from well-collected data: Priceless."
     Jared Spool, 01-05-2009
  16. So, what should we do?
     • The facts
        • quantitative usability research is popular and quite mainstream nowadays
        • customers expect "evidence" for usability problems
        • UX experts turn to Machines for help
     • Let's focus on eyetracking (ET) as an example
        • we could take any Machine
        • but ET offers such sensational eye candy
  17. What does this Machine actually do for us?
     • Eyetracking shows you what people look at
        • ≠ what people think about something
        • ≠ what people like or dislike
        • …
     • Eyetracking shows what people see
        • yes, but people also use peripheral vision…
        • …
     • Heat maps and/or gaze plots don't reveal usability problems, e.g. they don't tell you
        • why a problem pops up
        • why it persists
        • how it can be solved
        • …
     • But… it can deliver
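The point about heat maps is easy to make concrete: a heat map is nothing more than fixation positions bucketed into a grid and counted, so it can only ever encode *where* people looked, never why. A minimal sketch, assuming fixations arrive as `(x, y)` pixel coordinates (the data format and function name are illustrative, not any eyetracking tool's API):

```python
# Hypothetical sketch: a heat map is just aggregated fixation counts.

def build_heatmap(fixations, width, height, cell=50):
    """Bucket fixation points into a grid of cell x cell pixel bins."""
    cols, rows = width // cell, height // cell
    grid = [[0] * cols for _ in range(rows)]
    for x, y in fixations:
        if 0 <= x < width and 0 <= y < height:
            grid[y // cell][x // cell] += 1
    return grid

fixations = [(120, 80), (130, 85), (700, 400)]
heatmap = build_heatmap(fixations, width=800, height=600)
print(heatmap[1][2])  # two fixations land in this bin
```

Notice that nothing in the output says whether a hot spot means "found it instantly" or "stared in confusion"; that interpretation is the researcher's job.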
  18. Tame the Machine: learn how to drive
     • Get to know your technology
     • Observe users, not screens
     • Ask users questions about what happens
     • Interpret your statistics
     • And take them to a higher level
  19. So, how do you make this thing work?
     • Recruitment
     • Test object
     • Test environment
     • Test session
     • Qualitative eyetracking research
     • Analysis
     • Conclusion
  20. "Who you gonna call…?"
     • Inform participants correctly during screening
     • Ask additional questions about their eyes
     • Do-it-yourself
     • Invite an appropriate number of participants
  21. Be prepared
     • Prepare a detailed test protocol
     • Carefully prepare your test object
        • in the eyetracking software
        • before the first participant arrives
     • Run a test session (or you'll crash)
     • Run a pilot analysis on your test data
  22. On the set
     • Avoid rooms with windows or strong lighting
     • Avoid movable chairs
     • Provide an additional screen for the observer
     • Avoid testing in the user's work environment
  23. And you're off!
     • Introduction takes some time
        • communication about goals
        • communication about eyetracking technology
        • communication about scenarios
     • Calibration
     • Give the user a practice task
     • Take notes
     • Timing is crucial!
  24. Go for qualitative eyetracking research
     • Don't exclusively focus on heat maps and gaze plots
     • Include video recording and gaze replay analysis
     • Observe your participants during the test
     • Think-aloud or not?
     • Retrospective or not?
  25. And then analyze
     • You'll need time. A lot.
     • Use heat maps, gaze plots and gaze replays to illustrate your own findings, or as a start for more elaborate UX research
     • Listen to the wise men: dare to use your UX expertise, it's priceless
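One common way analysts take raw gaze data "to a higher level" is to derive dwell time per area of interest (AOI), which can then be cross-checked against the gaze replays and observation notes. The sketch below is hypothetical: the AOI names, the gaze-sample format, and the fixed 16 ms sample interval are all assumptions for illustration.

```python
# Hypothetical sketch: per-AOI dwell time from raw gaze samples.

AOIS = {
    "navigation": (0, 0, 800, 100),       # x, y, width, height in pixels
    "main_content": (0, 100, 600, 500),
}

def dwell_time_ms(samples, aois, sample_interval_ms=16):
    """samples: list of (x, y) gaze points recorded at a fixed rate."""
    totals = {name: 0 for name in aois}
    for x, y in samples:
        for name, (ax, ay, w, h) in aois.items():
            if ax <= x < ax + w and ay <= y < ay + h:
                totals[name] += sample_interval_ms
    return totals

samples = [(400, 50), (400, 60), (300, 300)]
print(dwell_time_ms(samples, AOIS))
# navigation gets two samples, main_content one
```

Even a clean number like "users spent 32 ms on the navigation" says nothing by itself about whether that attention helped or hindered the task; that is where the qualitative work comes in.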
  26. In the end…
     • …when it comes to understanding and solving usability problems: Man always beats Machine!