
Benchmarking Usability Performance



Why benchmark usability testing: benchmarking vs. typical usability tests



  1. BENCHMARKING USABILITY PERFORMANCE. Jennifer Romano Bergstrom, Ph.D., UX Research Leader, Fors Marsh Group. George Mason University, Dec 9, 2014
  2. WHAT IS USER EXPERIENCE? Usability + emotions and perceptions = UX. Usability = “the extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency, and satisfaction in a specified context of use.” (ISO 9241-11)
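A rough sketch of how those three ISO components are often turned into benchmark numbers may help: effectiveness as task completion rate, efficiency as time on task, and satisfaction as a post-task rating. The Python below is illustrative only; the data values and field names are hypothetical and not from this deck.

# Illustrative only: hypothetical per-participant results for one benchmark task.
results = [
    {"completed": True,  "seconds": 48,  "rating": 6},   # rating on a 1-7 scale
    {"completed": True,  "seconds": 72,  "rating": 5},
    {"completed": False, "seconds": 120, "rating": 3},
    {"completed": True,  "seconds": 55,  "rating": 6},
]

n = len(results)

# Effectiveness: share of participants who completed the task.
effectiveness = sum(r["completed"] for r in results) / n

# Efficiency: mean time on task, often reported for successful attempts only.
success_times = [r["seconds"] for r in results if r["completed"]]
efficiency = sum(success_times) / len(success_times)

# Satisfaction: mean post-task rating (a SUS score would be handled the same way).
satisfaction = sum(r["rating"] for r in results) / n

print(f"Effectiveness: {effectiveness:.0%}")
print(f"Efficiency: {efficiency:.0f} s on successful tasks")
print(f"Satisfaction: {satisfaction:.1f} / 7")

These are the same numbers a later round of testing would be compared against, which is what makes the first round a benchmark.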
  3. USABILITY & USER EXPERIENCE: useful, valuable, desirable, accessible, trustworthy, engaging, usable. The 5 Es to Understanding Users (W. Quesenbery): http://www.wqusability.com/articles/getting-started.html
  4. WHEN TO TEST
  5. WHEN TO TEST: Benchmark
  6. WHY TEST: WHY BENCHMARK? ‣ Provide a framework of current website performance ‣ Compare metrics in future testing
  7. WHY TEST: WHY DO IT? ‣ Ensure you’re solving a problem that exists ‣ Ensure you’re building a product that is tailored to its audience ‣ Ensure that your product solution aligns to behaviors
  8. WHERE TO TEST
     LABORATORY: • Controlled environment • All participants have the same experience • Record and communicate from control room • Observers watch from control room and provide additional probes (via moderator) in real time • Incorporate physiological measures (e.g., eye tracking, EDA) • No travel costs
     REMOTE: • Participants in their natural environments (e.g., home, work) • Use video chat (moderated sessions) or online programs (unmoderated) • Conduct many sessions quickly • Recruit participants in many locations (e.g., states, countries)
     IN THE FIELD: • Participants tend to be more comfortable in their natural environments • Recruit hard-to-reach populations (e.g., children, doctors) • Moderator travels to various locations • Bring equipment (e.g., eye tracker) • Natural observations
  9. HOW TO TEST
     ONE-ON-ONE SESSIONS: • In-depth feedback from each participant • No group think • Can allow participants to take their own route and explore freely • No interference • Remote in participant’s environment • Flexible scheduling • Qualitative and quantitative
     FOCUS GROUPS: • Participants may be more comfortable with others • Interview many people quickly • Opinions collide • Peer review • Qualitative
     SURVEYS: • Representative • Large sample sizes • Collect a lot of data quickly • No interviewer bias • No scheduling sessions • Quantitative analysis
  10. WHAT TO MEASURE
  11. WHAT TO MEASURE: Benchmark
  12. EXAMPLE IN-LAB ONE-ON-ONE METHODS: Example methodology. Participants: • N = 74 | Average age = 37 • Mix of gender, ethnicity, income • Random assignment to diary condition: New, Old, Prototype, Bilingual. Usability testing session: • Participants read a description of the study • The moderator gave instructions and calibrated the eye tracker • Participants completed Steps 1-5 in the diary at their own pace • End-of-session satisfaction questionnaire • Debriefing interview. Eye tracker; moderators worked from another room (control room). Slide from: Walton, L., Romano Bergstrom, J., Hawkins, D., & Pierce, C. (2014). User Experience and Eye-Tracking Study: Paper Diary Design Decisions. Paper presented at the American Association for Public Opinion Research (AAPOR) Conference, Anaheim, CA, May 2014.
  13. EXAMPLE IN-LAB ONE-ON-ONE METHODS (same methodology as slide 12). No Think Aloud in Benchmark studies: we want a pure measure of performance.
  14. PREPARATION: CREATE TASKS ‣ What are the most important things users should be able to do on this site? ‣ Most frequent ‣ Most important (e.g., registration) ‣ Tasks should be clear and unambiguous and in the user’s language (no jargon). ‣ Don’t prompt the solution.
  15. PREPARATION: TASK SCENARIO EXAMPLE ‣ “You want to book a romantic holiday for you and your partner for Valentine’s Day. How would you do that?” ‣ “Use this site to…” is even better. It is a task. You can measure behavior. ‣ NOT: Go to the home page of romanticholidays.com and click “sign up now” then click “Valentine’s day.”
  16. PREPARATION: THINGS TO AVOID ‣ Asking participants to predict the future ‣ Asking if a participant would use something like X or might enjoy X feature is not productive ‣ Instead, ask about current behavior (do you currently do X?) or show them something and observe how they interact with it
  17. PREPARATION: THINGS TO AVOID ‣ Leading people ‣ Let them make their own mistakes; that is valuable ‣ If you give the answers, you’ll never learn what you need to learn ‣ AVOID: ‣ Telling people what to do or explaining how it works ‣ “Is there anywhere else you would click?” ‣ “Go ahead and click on that…”
  18. PREPARATION: THINGS TO AVOID ‣ Bias ‣ Try to remain neutral, even if the person is really funny or mean ‣ Use open-ended questions to understand perceptions ‣ AVOID: ‣ Testing friends ‣ Acting differently with different participants ‣ “Did you like it?” ‣ “Interesting.” ‣ “Now we are going to work with this awesome page.”
  19. PREPARATION: THINGS TO AVOID ‣ Interrupting ‣ You don’t want to interfere with what participants would normally do on their own ‣ Wait until the end to ask follow-up questions ‣ AVOID: ‣ Probing mid-task ‣ “Why?”
  20. PREPARATION: THINGS TO AVOID ‣ Explaining the purpose ‣ Your job is to pull as much information as possible ‣ Your job is not to explain how it works ‣ “What do you think it is for?” ‣ “What would you do if I was not here?” ‣ AVOID: ‣ Explaining how to find information ‣ Explaining the purpose of the product
  21. ANALYZING RESULTS: USABILITY & UX TESTING
  22. ANALYZING RESULTS: COMPARE TO GOALS ‣ It is a good idea to set goals (e.g., 90% of participants should be able to register in less than one minute). ‣ Keep results simple so people will use them and appreciate them. ‣ Compare performance to goals ‣ In future iterations, compare performance to benchmark
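As a minimal sketch of that comparison, assuming a goal like the slide’s example (90% of participants register in under one minute) and a hypothetical earlier benchmark round, the check might look like this. All numbers and names below are made up for illustration.

# Illustrative only: goal echoes the slide's example; all data are hypothetical.
GOAL_SUCCESS_RATE = 0.90   # 90% of participants should complete the task
GOAL_MAX_SECONDS = 60      # in less than one minute

benchmark_round = {"success_rate": 0.72, "median_seconds": 84}  # earlier baseline study
current_round = {"success_rate": 0.88, "median_seconds": 58}    # latest design iteration

# Compare the current round against the stated goal.
meets_goal = (
    current_round["success_rate"] >= GOAL_SUCCESS_RATE
    and current_round["median_seconds"] < GOAL_MAX_SECONDS
)

print(f"Success rate: {current_round['success_rate']:.0%} "
      f"(benchmark {benchmark_round['success_rate']:.0%}, goal {GOAL_SUCCESS_RATE:.0%})")
print(f"Median time on task: {current_round['median_seconds']} s "
      f"(benchmark {benchmark_round['median_seconds']} s, goal under {GOAL_MAX_SECONDS} s)")
print("Meets goal." if meets_goal else "Does not meet goal yet.")

Reporting each metric next to both the goal and the earlier benchmark keeps the results simple, which is what makes them usable in the next iteration.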
  23. ANALYZING RESULTS: OUTPUTS ‣ Notes, data, video/audio recordings ‣ Usability labs will create full reports (doc or PPT) ‣ Unmoderated tests may provide data reports and recorded sessions. ‣ When writing research notes, remember to: ‣ Report good and bad findings ‣ Stick to what you observed in the test ‣ Use the data!
  24. BENCHMARKING USABILITY PERFORMANCE: THANK YOU! Jennifer Romano Bergstrom, Ph.D., Fors Marsh Group | jbergstrom@forsmarshgroup.com | @romanocog. Links to more info: EdUI slides (see other slides on Slideshare too), Eye Tracking in UX Design
