
Unbiased Methods to Understand the User Experience


The way we ask questions and behave during UX sessions affects the data we collect and how we interpret our findings. To collect good UX data, the moderator must be neutral, structured, and unbiased while setting a comfortable stage for participants to share their thoughts and reactions. In this interactive 45-minute session, you will learn about the importance of structured, unbiased methods for collecting user feedback. We will discuss different methods (e.g., in-lab testing, remote moderated/unmoderated testing, surveys, card sorting, focus groups) and the pros and cons of each. You will learn about the different data that can be collected from usability tests, including subjective (e.g., what participants verbalize about their experience), behavioral (e.g., what participants do), and implicit (e.g., what participants think but cannot explain) data. We will discuss how to ask participants questions in ways that do not introduce bias, and how additional methods, such as eye tracking, may be valuable in understanding the user experience. You will learn how to ensure the data we get from UX tests are reliable and valid.

Published in: Education

Unbiased Methods to Understand the User Experience

  1. Unbiased Methods to Understand the User Experience. Jennifer Romano Bergstrom, PhD, UX Researcher, @romanocog. January 23, 2015 | Convey UX
  2. The path that got me here: 1998-2002, 2003-2009, 2008-2011, 2011-2015, 2014-present: uxpa2015.org
  3. BIAS: prejudice in favor of or against one thing, person, or group compared with another, usually in a way considered to be unfair.
  4. The way we ask questions impacts answers. https://www.youtube.com/watch?v=G0ZZJXw4MTA
  5. User Experience (UX) Measures: Difficulty Ratings. Self-report measures: difficulty ratings, satisfaction ratings, think-aloud protocol, debriefing interview, probing. Candidate wordings: A. How difficult was it for you to complete the task? B. Was the task difficult for you to complete? C. How easy or difficult was the task to complete? D. Please rate your difficulty in completing the task. Response scale: 1. Extremely difficult, 2. Very difficult, 3. Moderately difficult, 4. Slightly difficult, 5. Not difficult at all. ✗ ✗ ✓ ✓
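Note that the response scale on this slide anchors every point in "difficult" terms, while option C's wording treats easy and difficult symmetrically. A tiny sketch of that imbalance check; the balanced labels below are illustrative, not from the slide:

```python
# Two 5-point scales for "How easy or difficult was the task to complete?"
# The first (from the slide) anchors every label in "difficult";
# the second (hypothetical) gives both poles equal representation.
unbalanced = [
    "Extremely difficult", "Very difficult", "Moderately difficult",
    "Slightly difficult", "Not difficult at all",
]
balanced = [
    "Very difficult", "Somewhat difficult", "Neither easy nor difficult",
    "Somewhat easy", "Very easy",
]

def count_anchor(scale, word):
    """Count how many labels mention a given anchor word."""
    return sum(word in label.lower() for label in scale)

print(count_anchor(unbalanced, "difficult"))  # → 5 (and "easy" appears 0 times)
print(count_anchor(balanced, "difficult"))    # → 3
print(count_anchor(balanced, "easy"))         # → 3 (midpoint names both poles)
```

A scale whose labels mention one pole far more than the other nudges respondents toward that pole, which is exactly the wording bias the slide's ✗ options illustrate.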
  6. User Experience (UX) Measures: Think-Aloud Protocol. A. Concurrent think aloud, while completing tasks. B. Retrospective think aloud, while watching a video replay. C. Retrospective think aloud, without video replay. D. A mix of concurrent and retrospective think aloud. Olmsted-Hawala, E. L., & Romano Bergstrom, J. C. (2012). Think-aloud protocols: Does age make a difference? Proceedings from the Society for Technical Communication Summit, May 2012, Chicago, IL. ✗ ✗ ✓ ✓ ☐
  7. User Experience (UX) Measures: Probing. A. While the participant is completing tasks. B. At the end of the task. C. At the end of the session. ✗ ✗ ✓
  8. Interrupting interferes with natural actions. https://www.youtube.com/watch?v=jyMLDN9UOrE
  9. Self-report data is great… but it is not enough.
  10. People cannot complete dual tasks well. https://www.youtube.com/watch?v=Oons6amow3I
  11. User Experience (UX) Measures: Combining Measures. A. Concurrent think aloud and accuracy. B. Concurrent think aloud and time to complete tasks. C. Retrospective think aloud and accuracy. D. Retrospective think aloud and time to complete tasks. Observational measures: first click accuracy, task accuracy, time to complete tasks, click patterns, conversion rate. References: Olmsted-Hawala, E. L., & Romano Bergstrom, J. C. (2012). Think-aloud protocols: Does age make a difference? Proceedings from the Society for Technical Communication Summit, May 2012, Chicago, IL. Olmsted-Hawala, E. L., Murphy, E. D., Hawala, S., & Ashenfelter, K. T. (2010). Think-aloud protocols: A comparison of three think-aloud protocols for use in testing data-dissemination web sites for usability. Proceedings from CHI, April 2010, Atlanta, GA. ✓ ✓ ✓ ☐ ✓
  12. (Slide 11 repeated, with the finding revealed.) Higher accuracy and satisfaction when moderators “coach.”
  13. Self-report data combined with observational data is very useful… but there is a filter.
  14. People think they make logical decisions. https://www.youtube.com/watch?v=Oons6amow3I
  15. User Experience (UX) Measures: Combining Measures. A. Eye tracking with concurrent think aloud. B. Eye tracking with retrospective think aloud. C. Eye tracking and time to complete tasks. D. Eye tracking alone. Implicit measures: eye tracking, electrodermal activity (EDA), behavioral analysis, verbalization analysis, pupil dilation. References: Romano Bergstrom, J. C., & Strohl, J. (2014). Improving government websites and surveys with usability testing: A comparison of methodologies. Proceedings from the Federal Committee on Statistical Methodology (FCSM) Conference, Nov 2013, Washington, DC. Romano Bergstrom, J. C., & Olmsted-Hawala, E. L. (2012). Effects of age and think-aloud protocol on eye-tracking data and usability measures. Poster presentation at Usability Professionals Association (UPA) Conference, Las Vegas, NV, June 2012. ✗ ✗ ✓ ☐ ✓
  16. (Slide 15 repeated.)
  17. User Experience (UX) Measures: Triangulated Approach. Self-report metrics tell us why participants think they focus on certain aspects. Observational metrics tell us how participants navigate and interact. Eye tracking tells us what, how long, and how often participants focus on design elements.
  18. What might this look like?
  19. Methods: 30 “expert” and 30 “novice” users recruited; tasks using web and app; satisfaction and knowledge questionnaire; debriefing interview. Recruit a mix of iOS and Android users. In-person moderated sessions (or remote sessions). Half of the participants will think aloud; half will work in silence.
  20. Example tasks: Download the app to your mobile device. Start a message with three friends. Check for new messages. Delete a message. Send a photo message. Send a text message to a contact. Change your notifications.
  21. Example satisfaction items: How likely are you to use the app in the future? (1: not likely at all – 5: extremely likely) How valuable is the app to you? (1: not valuable at all – 5: extremely valuable) Example knowledge item: Can others see when you are online? (yes, no, not sure)
  22. Example debriefing items: You seemed to hover your mouse over here often. Can you tell me about that? Tell me what you thought about downloading the app. What would make this experience better for you? What is the one thing holding you back from using this tool? Please rate your overall satisfaction in using this app (1. Extremely satisfied, 2. Very satisfied, 3. Moderately satisfied, 4. Slightly satisfied, 5. Not satisfied at all).
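To keep analysis as neutral as data collection, the questionnaire can be scored with a fixed rule decided before any sessions run. A minimal sketch of that idea, assuming 1-5 responses to the two satisfaction items above; the response values here are hypothetical, not from the study:

```python
from statistics import mean

# Hypothetical 1-5 Likert responses to the two satisfaction items
# ("use in the future", "value"), one tuple per participant.
responses = [(4, 5), (3, 3), (5, 4), (2, 2), (4, 4)]

def satisfaction_scores(responses):
    """Mean score per item, computed identically for every participant."""
    future_use = mean(r[0] for r in responses)
    value = mean(r[1] for r in responses)
    return future_use, value

future_use, value = satisfaction_scores(responses)
print(round(future_use, 2), round(value, 2))  # → 3.6 3.6
```

Scoring every participant with the same pre-registered rule removes one place where a moderator's impressions could leak into the numbers.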
  23. UX Data (Qualitative / Quantitative). Self-report: satisfaction and knowledge questionnaires (YES / YES); verbal think aloud, half of participants (YES / NO); moderator follow-up (YES / NO). Observational: time on page/task (NO / YES); selection/click behavior (YES / NO); success/fail rate (NO / YES); conversion rate (YES / YES). Implicit: verbalization analysis (YES / YES); eye tracking (YES / YES).
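The quantitative rows of this table (time on task, success/fail rate) can be summarized per recruited group without any moderator judgment. A sketch with made-up session logs; the field names and values are assumptions for illustration, not from the deck:

```python
from statistics import mean

# Made-up session logs: one dict per participant-task observation.
sessions = [
    {"group": "expert", "success": True,  "seconds": 42},
    {"group": "expert", "success": True,  "seconds": 35},
    {"group": "novice", "success": False, "seconds": 90},
    {"group": "novice", "success": True,  "seconds": 75},
]

def summarize(sessions, group):
    """Success rate and mean time on task for one recruited group."""
    rows = [s for s in sessions if s["group"] == group]
    rate = sum(s["success"] for s in rows) / len(rows)
    return rate, mean(s["seconds"] for s in rows)

print(summarize(sessions, "expert"))  # → (1.0, 38.5)
print(summarize(sessions, "novice"))  # → (0.5, 82.5)
```

Because these metrics come straight from logs, they are a useful check on the filtered self-report data the slides warn about.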
  24. Other Considerations: Iterative testing, keep tasks and questions the same. We cannot objectively test our own designs. Social validation, pleasing the researcher, acing the test. Age-related differences in performance. Our job is not to explain the product. Coaching = leading = waste of time. References: Romano Bergstrom, J. C., Olmsted-Hawala, E. L., Chen, J. M., & Murphy, E. D. (2011). Conducting iterative usability testing on a Web site: Challenges and benefits. Journal of Usability Studies, 7, 9-30. Romano Bergstrom, J. C., Olmsted-Hawala, E. L., & Bergstrom, H. C. (2014). Older adults fail to see the periphery during website navigation. Universal Access in the Information Society, in press. Romano Bergstrom, J. C., Olmsted-Hawala, E. L., & Jans, M. E. (2013). Age-related differences in eye tracking and usability performance: Web site usability for older adults. International Journal of Human-Computer Interaction, 29, 541-548.
  25. (Slide 24 repeated, with an eye-tracking figure comparing younger adults, middle-age adults, and older adults.)
  26. Are you ready to be unbiased? Thank you! Jennifer Romano Bergstrom, @romanocog, jenrb@fb.com, #ConveyUX
