Usability Testing for People with Disabilities

Learn best practices, tips, and tricks for usability testing with people with disabilities.

Speaker Notes

  • Goal is to create a great, usable, accessible product experience.
    Web for All: The social value of the Web is that it enables human communication, commerce, and opportunities to share knowledge. One of W3C's primary goals is to make these benefits available to all people, whatever their hardware, software, network infrastructure, native language, culture, geographical location, or physical or mental ability. Learn more about: Web Accessibility Initiative, Internationalization, Mobile Web for Social Development.
    Web on Everything: The number of different kinds of devices that can access the Web has grown immensely. Mobile phones, smart phones, personal digital assistants, interactive television systems, voice response systems, kiosks, and even certain domestic appliances can all access the Web. Learn more about: Web of Devices, Mobile Web Initiative, Browsers and Other Agents.
  • An application might be coded for accessibility, but that does not mean it provides the best user experience. One example from our testing is the QT survey: the use of non-standard controls (although coded accessibly) surprised users, and presenting one question per page slowed them down; some suggested showing several questions per page. Unexpected page design and controls slow users down.
  • Some definitions of usability…
  • Usability testing generally involves setting a series of real tasks in prototype or real products for representative users to complete, and observing their actions and success. It's as simple as that! It doesn't need to be expensive, and it does not take as much time as people may think. Observers watch, listen, take notes, and measure effectiveness (able to complete the task), efficiency (time and effort required), and satisfaction. Why is this important?
    - Enhances the design process through feedback
    - Identifies and corrects usability problems prior to product release
    - Forces a perspective change: from how designers would use the product to how users actually do use the product
    - Includes people with disabilities in the design decisions
    - Eliminates design problems
    - Uncovers unknown or unexpected issues
    - Tests assumptions
    - Provides objective, third-party input
    - Creates a benchmark for future releases
    - Improves profitability
    - Increases customer satisfaction and retention
    - Reduces maintenance, training, and support costs
    - Provides a competitive edge
    Usability testing isn't really that different for people with disabilities, but do keep a checklist of additional items to think about when planning and conducting the test.
    Observed in evaluations: KT4TT: "Where am I?" Confusing site structure, with two pages labeled "Home." NARIC: also hard to figure out where am I. XXX???: site with obscure tab text and categories of information. Such problems slow users down, and as the graph on slide 3 shows, slow users with disabilities down much more.
  • Ideally throughout the development cycle: the earlier the better. Testing with paper prototypes can be a great start. There are different techniques besides user testing, e.g., expert evaluations and evaluations against standards and guidelines; these can be used at different times in the cycle.
  • First, test with people without disabilities:
    - Identify general usability problems, so people with disabilities aren't tripping over the same problems and never get to drill down to accessibility issues.
    - Fix any usability issues.
    After accessibility guidelines have been verified:
    - Make sure during design and development that the user interface techniques chosen for usability can be made accessible.
    - Code for accessibility throughout the development phase.
    - Test the accessibility of features before asking people with disabilities to participate in usability testing.
    When it makes sense based on the person's abilities.
  • You need to plan especially for people with disabilities. There are multiple audiences to reach based on their type of impairment, and different channels for recruiting test participants. Factor in the testing environment – the types of assistive technology required, the physical accessibility of the site or, for remote testing, the communication challenges. Plan on longer timelines for both recruiting and running the test.
  • Location options: lab, conference room, café, remote testing, informal test.
    Trends in user testing:
    - Movement away from lab testing
    - Increased interest in testing in users' environments
    - Agile development methods
    - Saving money by doing remote testing: travel, equipment, recruiting costs, compensation
  • Don't use remote testing when:
    - Security is important
    - A reliable, high-speed internet connection is unavailable
    - It is important to track the user's physical movements
    - It is important to see the magnified screen (ZoomText users)
    Issues:
    - Missed appointments (as with any testing, but more likely if remote)
    - The user cannot use the software, or cannot log in or access the website
    - Users not familiar with the technology
    - You can't see what the user is doing
  • The participant must have a speakerphone; you must be able to hear both the assistive technology and the person. It is best to have the internet and phone line separate. Also check into GoToMeeting, Lotus Sametime Unyte, YuuGuu, WebEx, and Yugma.
  • Focus on the core tasks: what most users want to do most of the time. So for an email application, navigating to the inbox is a primary task, and distribution lists probably less so. But sometimes you want to find out the usability of a new function even if it's not a core task. Plan for 5-6 tasks. Start with simple tasks and keep the instructions simple. Aim for no more than 45 minutes for all of the tasks and 1:00 to 1:15 for the entire session.
    Remember that it is easier and more effective for people with cognitive disabilities when you limit the length of a task and do one step at a time. Be sure to get their feedback after each step, while it is fresh and they are focused on that task.
    Design the tasks with both visual access and screen reader access in mind. Consider the order of the tasks so that they make sense, and the starting point for each task. Can the tester start anywhere, or should they always start from a home page? In some cases it can even make sense to randomize the tasks.
  • Disabilities include vision, hearing, speech, voice-related, manual dexterity, mobility, and cognitive. Some won't be relevant to the product (e.g., if the site does not use sound, ignore hearing). Each category that applies to your site is a user profile.
  • Jakob Nielsen, pioneering usability expert: 3-5 users will discover most usability issues. Good news: you get useful results without running many sessions. Plan on 6-8 users per test, or 5 spread over multiple tests; there is little ROI in testing more than 9 users. Group by assistive technology: testing with people with disabilities may require more test sessions, because each category of disability is a different audience of 3-5 users. (A sketch of the underlying problem-discovery model appears after these notes.)
  • Recruiting can be the hardest part. Based on the user profiles, you will want to create a screener with questions that determine whether the person qualifies for the study. For example, you can ask about assistive technology use, how often they use the Internet, which sites they visit, familiarity with the site being tested, and familiarity with the tools that will be used for remote testing. Screen reader and screen magnifier users will need an external microphone or a phone so you can hear both the person and the screen reader; the only exception is if you are using JAWS Tandem. Also check whether they have a high-speed internet connection, and ask about firewalls or any software restrictions.
  • Allow enough time for recruiting users; the more criteria you have, the harder it will be to recruit. Make sure you get multiple contact methods and let participants know that you will be contacting them. In the session confirmation email, send any instructions that are needed for the session and remind participants. Recruiting can be hard, and it is sometimes tough to get participants for the test. It may take longer to find the right participants, but do not rely on family and friends, and avoid using the same participants. If users do not fit the profile, do not be afraid to tell them that they are not right for this study. If you are planning other studies, ask them if they would like to be on the list for future studies. During the screening, avoid telling users too much about the test, as this can influence the feedback that they give during the testing.
  • If you need to recruit users in a short period of time, offering more money will attract additional participants. Pay usually depends on the length of the study.
  • It's OK to get consent by email. If you send documents, make sure that they are accessible!
  • Datalogger is a free, easy-to-use tracking tool created by User Focus. It is a good way to record and track sessions.
  • Schedule one participant for a dry run. Use it to make any changes to the test and to fine-tune the tasks. Make sure you know that the setup works, and have a backup plan; something may go wrong. Test all platforms and technology combinations in advance, and know the level of accessibility of the tools you are using. Common mistakes are not doing a proper run-through of the test in advance and bad task design.
  • Allow for twice as much time. Keep sessions under 90 minutes, including pre- and post-questionnaires and paperwork. Keep in mind that people with disabilities might require breaks, and be sure to ask participants about any possible schedule conflicts. Mobility users might have helpers accompanying them, who could have schedule constraints. Plan based on the type of disability; the time required changes based on disability and experience level.
    - Don't assume you know what they want or what's best for them.
    - Gather feedback after each task and at the end.
    - Observe user behavior and listen to user feedback.
    - Remember, you are moderating, not training.
    - Ask open-ended questions: why, how, what.
    - Look for hesitations, comments, questions, body language, and behaviors.
    - Explain the study.
    - Remind them that this is a test of the product, not of them.
    - Encourage them to think aloud.
    - Thank them for their time.
    - Tell them about the reward or gift and that it is not contingent on their performance.
  • To stop users during a task, consider saying, "Let's stop there and I'll give you something new to do." Useful resources for testing with people with disabilities include conducting a test (http://uiaccess.com/accessucd/interact.html) and the use of appropriate terms (http://www.miusa.org/ncde/tools/respect).
  • It could also be the assistive technology that is silent. Be careful not to interrupt the screen reader. Stop at a logical place to ask questions.
  • If testing people with cognitive disabilities, it is better to collect information after each task.
  • With Datalogger it is easy to track start and stop times, and it offers robust graphing that helps identify patterns in the data. Keep it simple. Track effectiveness and efficiency: the number of pages visited; problems, difficulties, and obstacles; the severity of the issues; any assistance required; and the amount of time on each task. Evaluate the satisfaction level for each task. (A sketch of this kind of per-task summary appears after these notes.) Useful resources for testing with people with disabilities include conducting a test (http://uiaccess.com/accessucd/interact.html) and the use of appropriate terms (http://www.miusa.org/ncde/tools/respect).
  • Software Usability Measurement Inventory (SUMI): requires purchase; 50 questions. Website Analysis and Measurement Inventory (WAMMI): requires purchase; 20 questions. System Usability Scale (SUS), included in the Datalogger: free; 10 questions. (A sketch of SUS scoring appears after these notes.)
  • Were there any unexpected results? Include any personal observations.
  • Write issues on sticky notes. Categorize each as accessibility, usability, technology, or AT. Prioritize the list of issues. Determine whether each is an accessibility issue, an issue with the AT, or a user experience issue. Re-test if possible to verify that problems were solved.
  • Look for patterns. Use graphs to summarize the information (generated by the Datalogger). Find patterns in the data: determine whether each issue was due to accessibility, the system, or usability; evaluate issues based on AT; compare results.
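
The sample-size guidance above (slide 17) rests on the Nielsen/Landauer problem-discovery model cited on that slide, where the proportion of problems found by n users is 1 - (1 - L)^n. Below is a minimal Python sketch of that curve, assuming the often-cited average per-user discovery rate of L = 0.31; the function name and output format are illustrative and not from the deck.

```python
# Sketch of the problem-discovery model behind "test with 5 users"
# (see http://www.useit.com/alertbox/20000319.html). The default per-user
# discovery rate of 0.31 is the average Nielsen cites; real products vary.

def problems_found(n_users: int, discovery_rate: float = 0.31) -> float:
    """Estimated proportion of usability problems found by n_users."""
    return 1.0 - (1.0 - discovery_rate) ** n_users

if __name__ == "__main__":
    for n in range(1, 10):
        print(f"{n} users: ~{problems_found(n):.0%} of problems found")
    # The curve passes roughly 85% around 5 users, which is why extra users
    # in a single test give diminishing returns. When testing with people with
    # disabilities, each AT/disability group is a separate audience, so plan
    # 3-5 users per group rather than 5 in total.
```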
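
The data-collection notes above list the per-task measures to capture (completion, time on task, pages visited, assistance needed). Here is a minimal Python sketch of summarizing such logs per task; the record fields and sample data are hypothetical and not tied to Datalogger or any other tool.

```python
# Sketch of summarizing per-task usability metrics: completion rate,
# mean time on task, mean pages visited, and assists given.
# Field names and sample data are hypothetical.
from statistics import mean

sessions = [  # one record per participant per task (made-up numbers)
    {"task": "log in",    "completed": True,  "seconds": 95,  "pages": 2, "assists": 0},
    {"task": "log in",    "completed": True,  "seconds": 210, "pages": 4, "assists": 1},
    {"task": "send memo", "completed": False, "seconds": 430, "pages": 9, "assists": 2},
    {"task": "send memo", "completed": True,  "seconds": 310, "pages": 6, "assists": 0},
]

def summarize(records):
    """Group records by task and print completion rate, time, pages, assists."""
    by_task = {}
    for rec in records:
        by_task.setdefault(rec["task"], []).append(rec)
    for task, recs in by_task.items():
        rate = sum(r["completed"] for r in recs) / len(recs)
        print(f"{task}: {rate:.0%} completed, "
              f"mean {mean(r['seconds'] for r in recs):.0f} s, "
              f"mean {mean(r['pages'] for r in recs):.1f} pages, "
              f"{sum(r['assists'] for r in recs)} assists")

summarize(sessions)
```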
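
The satisfaction questionnaires above include the System Usability Scale (SUS). Its standard scoring rule (Brooke, 1996) is: odd-numbered items contribute (rating - 1), even-numbered items contribute (5 - rating), and the sum is multiplied by 2.5 to give a 0-100 score. A minimal Python sketch, using a hypothetical set of ratings:

```python
# Sketch of standard System Usability Scale (SUS) scoring (Brooke, 1996).
# `ratings` holds a participant's 1-5 answers to the ten SUS statements, in order.

def sus_score(ratings):
    """Convert ten 1-5 SUS ratings into a 0-100 score."""
    if len(ratings) != 10 or not all(1 <= r <= 5 for r in ratings):
        raise ValueError("SUS needs ten ratings, each between 1 and 5")
    total = 0
    for i, rating in enumerate(ratings, start=1):
        # Odd items are positively worded, even items negatively worded.
        total += (rating - 1) if i % 2 == 1 else (5 - rating)
    return total * 2.5

if __name__ == "__main__":
    example = [4, 2, 5, 1, 4, 2, 4, 1, 5, 2]  # hypothetical participant
    print(f"SUS score: {sus_score(example)}")  # prints: SUS score: 85.0
```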

Transcript

  • 1. Usability Testing for People with Disabilities. Kathleen Wahlbin, Interactive Accessibility, Inc., KathyW@ia11y.com, office (978) 443-0798. Mary Hunter Utt, The Paciello Group, LLC, mary.h.utt@gmail.com, office (978) 618-9772. © 2012 Interactive Accessibility, Inc. All rights reserved.
  • 2. “Web for All. Web on Everything” http://www.w3.org/Consortium/mission 2
  • 3. Higher confidence, more satisfied, less frustrated. [Chart: Success.] Source: Beyond Alt Text, Jakob Nielsen, 2002.
  • 4. Accessible ≠ Usable 4
  • 5. "The extent to which a product can be used by specified users to achieve specified goals with effectiveness, efficiency and satisfaction in a specified context of use." ISO 9241-11
  • 6. "An approach that incorporates direct user feedback throughout the development cycle in order to reduce costs and create products and tools that meet the users' needs." Usability Professionals Association
  • 7. "… making sure that something works well: that a person of average (or even below average) ability and experience can use the thing for its intended purpose without getting hopelessly frustrated." Steve Krug, author of Don't Make Me Think
  • 8. Usability Testing: quantifying the user-friendliness of a product. [Diagram: satisfaction, error prevention, learnability, efficiency, effectiveness.]
  • 9. Usability testing has the biggest impact early. [Chart: Impact on Design (y-axis) vs. Time (x-axis): Early, Development, Prototype, Production.]
  • 10. When to include people with disabilities (PWD)?
    - First, test with people without disabilities
    - After accessibility guidelines have been verified
    - When it makes sense to test
  • 11. Planning is key to the success. [Process diagram: Plan, Prepare, Test, Analyze, Report. Activities include choosing the location, determining tasks, defining audiences, environment and materials, recruiting participants, conducting test sessions, recording results, analyzing results, and preparing findings.]
  • 12. Plan: Many location options
    - Lab
    - Conference room
    - Café
    - Remote testing
    - Informal test
    Picture sources: http://www.flickr.com/photos/44839709@N07/5739824042, http://www.uqul.uq.edu.au/
  • 13. Lab vs. Remote
    Traditional Lab
    - Pros: easy to observe users and body language; controlled environment; easy to record the user and screen / audio; no communication glitches.
    - Cons: transportation and coordination; AT and settings might not match (often a helper set up their AT for them, AT differences can distract from the real test, the user might combine AT and it is hard to match); expensive; can only draw from the local user base; the lab and building must be accessible.
    Remote Testing
    - Pros: cost effective; no travel, so easier scheduling and coordination; large pool of testers, easier to recruit; realistic, representative environment (their hardware / software, their AT devices and applications, their settings).
    - Cons: cannot see the participants; does not show a magnified screen; problems downloading and installing desktop sharing; participants can be interrupted or distracted.
  • 14. Remote testing can be effective and economical
    - Moderator / note-taker: screen sharing (Acrobat Connect, Skype, Join.me); recording (Morae, Camtasia, Acrobat Connect); robust computer; speakerphone
    - Participant: high-speed internet access; speakerphone; robust computer; camera (optional)
  • 15. Plan: Design good tasks. "Log in and locate your inbox." "Set up a distribution list of 5 recipients and send them a memo that is on your desktop."
  • 16. Plan: Who should be included?
    - Determine the purpose of testing: what are the core tasks?
    - Define the user profiles: who uses the product? Determine which categories of disabilities apply to your site.
    - Find suitable participants: the participant's background and skills should represent the product's intended user.
  • 17. Plan: How many participants? [Chart.] Source: Jakob Nielsen; http://www.useit.com/alertbox/20000319.html
  • 18. Prepare: Recruiting - the hard part
    - Organizations for specific disabilities or conditions
    - Local disability-related support groups
    - Rehabilitation or disability services (government, university, local programs)
    - AT providers
    - Client contacts: customer support or sales
    - Notice boards or mailing lists
    - Social media: Facebook, Twitter, LinkedIn
    - Classified ads: Craigslist
    - Recruitment agency or market research agency
    - Participants
  • 19. Recruiting: Lessons Learned
    Do:
    - Allow 1-2 weeks for recruiting
    - Create a screener and ask participants how they access websites
    - Get multiple contact methods
    - Send meeting invitation and follow up with participants the day before
    - Get consent over email
    - Ask for referrals
    - Get backup participants
    Don't:
    - Use the same participants
    - Use participants who are familiar with the site or application
    - Rely on family and friends
    - Explain too much about the purpose of the test
    - Be afraid to turn them down if they do not fit the profile
  • 20. Recruiting: What should we pay?
    - Usually $50-$100 depending on length of session
    - Gift cards are easy
    - Consider options for international participants
  • 21. Prepare: Don't forget the paperwork
    - Script: ensure consistent instructions
    - Consent forms: official acknowledgement for taking part
    - NDA: if needed
    - Waivers: permission for recordings
    - Questionnaires: gauge satisfaction
  • 22. Prepare: Determine how to collect data 22 Source: UserFocus http://www.userfocus.co.uk/resources/datalogger.html
  • 23. Prepare: Do a dry run, test the environment
    - Is the environment accessible?
    - How much time is needed for the setup?
    - Are the paperwork and test script material OK?
  • 24. Test: Six steps for conducting a good test
    1. Greet the participant
    2. Explain the study, your role, and their role
    3. Perform tasks
    4. Ask satisfaction questions
    5. Debrief participants
    6. Debrief the note taker and observers
  • 25. Conducting the Test: Be a good moderator
    Do:
    - Be friendly
    - Give clear instructions to the participant prior to tasks
    - Stay objective and detached
    - Give encouraging but non-committal feedback
    - Ask if you don't understand why a user did an action
    - Know when to stop a task, but don't rescue a person too soon
    Don't:
    - Start the test unless they have signed a consent/release form
    - Assume what the participants need
    - Give hints or ask leading questions
    - Use industry jargon
    - Accept just yes/no for answers
  • 26. If the participant says “hmm”, “oops” or “I wonder” What made you say that? Do you have questions or comments? Describe the steps you’re going through here 26
  • 27. If the participant is silent for 10-20 seconds… "What are you thinking?" "What are you doing now?"
  • 28. If the participant decides to stop… "What would you do next?" "Is that what you expected to happen?" "Walk me through what you did." "Do you have any comments?"
  • 29. Test: Take good notes 29
  • 30. Conduct satisfaction questionnaire. Three standard sets of questions:
    - Software Usability Measurement Inventory (SUMI)
    - Website Analysis and Measurement Inventory (WAMMI)
    - System Usability Scale (SUS)
  • 31. Test: Wrap up the session
    - Review the participant's test session
    - Understand the issues, difficulties, and omissions
    - Debrief all observers and note takers
    - Prepare a short summary of the session and the results
  • 32. Results: Analyze performance metrics
    - Completion: are users able to complete the task?
    - Time on task: how long does it take to complete the task?
    - Page views: how many pages to complete the task?
    - Errors: how many per task, and how severe?
    - Satisfaction: how does the user rate the product?
  • 33. Results: Find patterns in the data 33
  • 34. Report
    - Determine top items based on the data and observations, not opinion
    - Report results for product improvement
    - Provide recommendations by AT
    - Review recommendations with designers and developers
  • 35. Lessons learned
    - Be prepared for technology issues in remote testing
    - Incomplete or buggy prototypes can prevent users from completing tasks (code freeze; not sufficiently tested for accessibility)
    - Not scheduling enough time in between sessions (in case a user takes longer than planned; to collect notes and results while it's fresh)
  • 36. Key Points – Usable accessibility benefits all
    - You can test usability formally or informally
    - Doesn't have to cost a lot of money
    - Provides a richer understanding of how people with disabilities use the product
    - Identifies issues that lead to more usable and accessible products for all users
    - Value comes from getting as close as possible to what real people are doing with your product
  • 37. Questions? http://www.slideshare.net/kwahlbin 37