Usability testing for qualitative researchers

  • 1. What qualitative researchers should know about usability testing
    QRCA - New England chapter, fall 2011 meeting
    Presenter: Kay Corry Aubrey
    September 23, 2011
  • 2. Background
  • 3. Agenda – what you will learn
    1. This presentation will be an “appetizer” on usability testing (1 hr 15 minutes)
       • Overview of how to plan, run, analyze & report on a usability study
       • Pointers to where you can learn more
    2. Hands-on demonstration
       • We’ll usability test the Android, iPhone, & iPad versions of the new QRCA VIEWS app (45 minutes)
  • 4. Where does usability testing fit in with other qualitative research methods?
  • 5. What is a usability test?
    • Qualitative study where typical users try to accomplish typical tasks on their own with the product
    • Point is to see how clearly the product “speaks” to them, meets their expectations, fits into their typical work and task flow
    • Moderator & team watch the participant working and keep score of task success & failure, comments, body language
  • 6. What can be usability tested?
    • Web sites, software applications
    • Consumer products (e.g., vacuum cleaners, ovens, mobile phones – examples from projects I’ve done to illustrate the range of where you can apply this technique)
    • Packaging
    • Customer service or ordering procedures
    • Training & documentation
    • Basically you can usability test any product or service where there is user interaction
  • 7. Many good reasons for running a usability study
    • Make interaction with a product as fluid & intuitive as possible
    • Avoid embarrassment – expose usability problems
    • Test design concepts
    • Compare design approaches
    • Challenge assumptions
    • Compare your product with a competitor’s
    • Improve ease of use and learning
    • Better understand users
    • Understand training and documentation needs
    • Increase sales, improve your product’s reputation, decrease the need for technical support
    • Save money and time (less need to rework the design, fewer calls to customer support)
  • 8. The test should focus on a specific aspect of the product
  • 9. Sync testing approach to product’s development stage
    • Paper prototype (early stage): overall product concept, terminology, navigation
    • Electronic prototype (design stage): task flow, visual design, page layout, specific features, validate redesign
    • Functioning product (development & QA): defaults, online help, feature integration, performance
    • Comparison (post-release): product features, performance benchmarks; can be within your own product or against a competitor’s
  • 10. Where can you run a usability test?
    • Facility – Pros: can invite lots of observers, fewer logistical headaches, can test a wider range of products. Cons: artificial environment, costs more.
    • Conference room – Pros: saves money (no travel), can test a wider range of products. Cons: more logistics, artificial environment, observers want to sit in the same room as the tester.
    • Live online – Pros: more natural (participant is in their environment), saves money, no travel, fewer logistics, easy for the team to observe. Cons: can only test Web-based products, hear voice but don’t see body language, need to recruit more tech-savvy participants, firewall issues.
    • Native habitat (mobile studies) – Pros: person is in context of use (people use mobile phones when they are “on the go”), cheap & quick. Cons: cannot easily record, only have 1-2 observers.
    Handout (pages 16-19) shows ways to set up the room.
  • 11. Stages of a usability test
  • 12. Major phases of a usability study
    1. Planning the study
    2. Running the study
    3. Analyzing results
    4. Reporting results
    A typical soup-to-nuts usability study takes about 70 hours.
  • 13. Planning steps
    Planning a usability study includes:
    • Setting objectives
    • Recruiting
    • Creating the task list
    • Managing the logistics
  • 14. Planning a study is very involved
    Create project plans and checklists to keep your ducks in a row.
  • 15. Forget a step and you are dead. (Just kidding.)
  • 16. Running the study
    Sessions follow a structure that is similar to any qualitative research session:
    1. Establish rapport, help them understand what they’ll be doing
    2. Begin with background questions
    3. Do the usability study
    4. Debrief to gather more feedback
    5. Administer surveys & say goodbye!
    During a usability study you are watching people’s behavior while listening closely to what they say.
  • 17. What type of data do you collect?
    • Objective results (can they do it? Define up front what “success” means for each task)
    • Emotional reactions
    • Practical information (how does this product design fit into their world? What kind of training would a person need to be productive with this technology?)
    • Typical measures (task success/failure, time on task)
    • When they struggle, note why
    • You see patterns after 3-4 sessions, but new stuff always emerges (depending on the diversity of participant backgrounds)
  • 18. Look for opportunities to collect quantitative data
    • Product Reaction Cards: participants quickly select 5 attributes from among 118 choices. The attributes are balanced between positive and negative. (Product reaction card attributes are in the Appendix.)
    • System Usability Scale: participants answer 10 questions on key aspects of usability. The survey produces a score between 0 and 100; a score below 60 is considered poor.
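If you tabulate SUS results yourself, the standard scoring (Brooke's original scheme) converts ten 1-5 Likert answers into a 0-100 score: odd-numbered, positively worded items contribute (response - 1); even-numbered, negatively worded items contribute (5 - response); the sum is multiplied by 2.5. A minimal sketch, with made-up responses:

```python
def sus_score(responses):
    """Return a 0-100 SUS score for one participant's 10 answers (each 1-5)."""
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for i, r in enumerate(responses):
        # Items 1, 3, 5, 7, 9 (index 0, 2, ...) are positively worded: score - 1.
        # Items 2, 4, 6, 8, 10 are negatively worded: 5 - score.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5  # scale the 0-40 raw sum to 0-100

# Example: a fairly positive (hypothetical) participant
print(sus_score([5, 2, 4, 1, 5, 2, 4, 2, 5, 1]))  # 87.5
```

A participant who answers "neutral" (3) to every item lands at exactly 50, which is a handy sanity check on any scoring spreadsheet.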
  • 19. Product reaction card results are fun and illuminating
    Text size indicates the number of times that attribute was chosen. Refer to the study spreadsheet to see the cards chosen by each participant.
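The word-size visual above is just a frequency count of chosen cards across participants. A quick sketch of the tally, with hypothetical card selections (the real deck has 118 attributes):

```python
from collections import Counter

# Hypothetical selections: each participant picks 5 cards
selections = [
    ["Intuitive", "Fast", "Clean", "Busy", "Useful"],
    ["Intuitive", "Confusing", "Fast", "Useful", "Flexible"],
    ["Fast", "Useful", "Intuitive", "Clean", "Reliable"],
]

# Count how often each attribute was chosen; the count drives the text size
counts = Counter(card for picks in selections for card in picks)
for card, n in counts.most_common():
    print(f"{card}: {n}")
```

Feeding these counts into any word-cloud tool reproduces the slide's "bigger text = chosen more often" effect.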
  • 20. System Usability Scale (SUS) scores cut to the chase
    [Histograms of SUS score distributions for Product ABC and Product XYZ.] XYZ’s SUS scores from all 10 participants were between 75 and 100. SUS scores below 60 indicate poor usability. See the study spreadsheet for details.
  • 21. Seek results that can be expressed in pictures
    Source: Moxie Software
  • 22. Analyzing results
    • Tests produce a huge amount of data
    • Analyze as close to the tests as possible – I try to transcribe to a heat map from notes right after the session
    • If you are doing these on your own (no observers), anticipate pushback from some quarters
    • Tabulate task times, number of errors, and other notable incidents
    • Report across the test as well as by task
    • Use descriptive statistics (mean, median, mode, ranges)
    • Include team & tester quotes in results
    • Involve the client team – debrief after each session, or reserve a conference room and gather the team
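The descriptive statistics above need nothing beyond a spreadsheet, but here is a minimal sketch of the same tabulation in code, using Python's standard-library statistics module. The tasks and times are invented for illustration:

```python
import statistics

# Hypothetical task times (seconds) across five sessions
task_times = {
    "Find a product": [42, 55, 38, 61, 47],
    "Check out":      [120, 95, 150, 110, 130],
}

# Report mean, median, and range per task, as the analysis step suggests
for task, times in task_times.items():
    print(f"{task}: mean={statistics.mean(times):.1f}s  "
          f"median={statistics.median(times)}s  "
          f"range={min(times)}-{max(times)}s")
```

With only 5-10 participants the median and range are usually more honest than the mean, since one struggling participant can drag the average badly.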
  • 23. Reporting – each discipline expects specific insights from the study results
    • Executives want a distilled version, an encapsulation of “the problems,” and to understand what is perceived as the premium this product offers
    • Product managers and marketing people seek insights on segmentation, product identity, competitive information, and participant reaction to feature sets
    • Product designers want detailed usability feedback to guide refinements to the product’s interface and behavior. Is the design intuitive? How does it fit into the user’s work and task flow?
    • Engineering needs to understand the usability bugs so they can prioritize and fix them (often their input is the bottom line)
    • Training and documentation people want to know which content to include in their work
  • 24. Hands-on exercise – usability test the QRCA VIEWS mobile app
    Steps:
    1. Translate the objectives into a task list
    2. Break into groups of 2-3 people; one person is the administrator, another the participant, and the others are observers who will take notes
    3. Run the study
    4. One group will use Morae, another will use the “sled” to record participants, and others will use the more informal “chair side” approach where you just sit next to the participant and watch them work
    5. Observers will record impressions on colored sticky notes (each group will have its own color)
    6. We’ll reconvene and do a mini affinity diagramming session to tabulate and discuss the results and recommendations for improvement. We’ll hold a debrief.
    7. Kay will present our results to Monica and Eddie
    8. Then we’ll eat lunch!!
  • 25. The End. Thank you!!
  • 26. About us
    Kay Corry Aubrey, Qualitative Researcher and author of this study
    Kay Corry Aubrey is the owner of Usability Resources, which specializes in user-centered research and design. Kay has over 20 years of experience in applying qualitative research methods and usability testing to technology-oriented products and collaborative software. She has led user research, usability, and design efforts for dozens of clients including AT&T, Affinnova, Constant Contact, Monster Worldwide, the Massachusetts Medical Society, the Mayo Clinic, and iRobot.
    Kay has taught at Northeastern University and Bentley University’s Center for Human Factors and Information Design. She is the managing editor of the QRCA VIEWS magazine, a market research journal that is read by over 5,000 qualitative research consultants and buyers. Kay has an MSW from Boston University’s School of Social Work, an MS in information systems from Northeastern University’s Graduate School of Engineering, and a BA from McGill University. She is a RIVA-certified Master Moderator who enjoys doing research with both groups and individuals.
    For further information on Kay’s background, please visit or contact her at