Unmoderated, Online Usability Testing for Web (Brandon Kopp & Bill Mockovak)
Given at UXPA-DC's User Focus Conference, Oct. 19, 2012


  • Notes on example videos: Feedback A, Task 8 (Students & Teachers): the participant reacts to Home as a dropdown. Feedback D, Task 6, continuing into Task 7 (Injuries & Illnesses): the participant spends 5:20 looking for Workplace Injuries and doesn't find it until the next task.

Presentation Transcript

  • The Use of an Unmoderated, Online Usability Testing Service to Test a Website. Brandon Kopp & Bill Mockovak, Research Psychologist, Office of Survey Methods Research. UXPA-DC User Focus Conference, 19 October 2012.
  • TryMyUI • Provides a panel of participants who are presented with user-defined tasks • You receive a video recording of the participant's screen with an audio voiceover of their comments
  • Many Alternatives: Userzoom.com, UserTesting.com, UTest.com, OptimalWorkshop.com, OpenHallway.com, Loop11.com
  • The BLS.gov Dropdown Usability Test • During earlier usability testing, several users had problems with dropdown menus that activated immediately on mouseover • Users would accidentally click menu links and get lost
  • Prototypes • Test design alternatives for mega-dropdowns

    Prototype   | Type of Menu        | Display Width    | List of Subject Areas
    A           | Click to show/close | Fixed, 3 columns | Partial list of content, with 'view all' option
    B           | Hover, no delay     | Fixed, 1 column  | Compact
    C           | Hover, with delay   | Fixed, 3 columns | Current content
    D (Control) | Hover, no delay     | Fixed, 3 columns | Current content
  • Prototype A • Need to click on tab to open • Click on X or Arrow to close • Abbreviated list is initially presented • User clicks View All to see entire list
  • Prototype B • Dragging mouse over menu will open it • Menu closes when you move mouse off it (no clicking is necessary) • Higher-level categories are displayed
  • Prototype C • Current, ‘full’ list of topics • Menu drops after delay, with mouseover • Moving off menu makes it disappear quickly
  • Prototype D • Control Condition • Current menu on BLS.gov • Menu drops immediately (no delay) with mouseover • Moving off menu makes it disappear quickly
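The hover-delay behavior that distinguishes the prototypes can be sketched as a small state machine. This is a hypothetical illustration, not the study's actual code: the class name, method names, and the delay value are all assumptions.

```javascript
// Minimal sketch of Prototype C's hover-with-delay dropdown behavior.
// The menu opens only after the pointer has hovered for `openDelayMs`,
// and closes as soon as the pointer leaves, so an accidental mouse pass
// does not trigger the mega-dropdown.
class HoverDelayMenu {
  constructor(openDelayMs) {
    this.openDelayMs = openDelayMs;
    this.hoverStart = null; // timestamp when the pointer entered, or null
  }
  onMouseEnter(now) { this.hoverStart = now; }
  onMouseLeave() { this.hoverStart = null; } // disappears quickly, per the slide
  isOpen(now) {
    return this.hoverStart !== null && now - this.hoverStart >= this.openDelayMs;
  }
}

// A 100 ms mouse pass does not open a 300 ms-delay menu; a 400 ms hover does.
const menu = new HoverDelayMenu(300);
menu.onMouseEnter(0);
console.log(menu.isOpen(100)); // false: pointer just passing through
console.log(menu.isOpen(400)); // true: deliberate hover
menu.onMouseLeave();
console.log(menu.isOpen(500)); // false: closed on mouse-out
```

Prototypes B and D correspond to `openDelayMs = 0` (any mouseover opens the menu immediately), while Prototype A would ignore hover entirely and toggle only on click.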
  • Method • Define participant criteria: Gender: Any; Age: 18-55; Country: U.S.; Income: Any; Education: Any; Employment: Any; Computer Experience: Beginner-Intermediate or Expert

    Prototype | # of Participants
    A         | 11
    B         | 14
    C         | 10
    D         | 10

    • Participants complete up to 10 tasks while providing verbal feedback
  • Tasks • 1. Find a publication called the Occupational Outlook Quarterly. This is an online magazine about jobs and careers.
  • Tasks: 1. Occupational Outlook Quarterly; 2. Mass Layoffs; 3. International Unemployment; 4. Green Jobs; 5. Survey of Occupational Illness and Injury; 6. Strikes and Lockouts; 7. Students and Teachers; 8. NY State Wages
  • What the Participant Sees [screenshot]: task instructions, browser/webpage view, recording time
  • What the Experimenter Sees [screenshot]: video of the participant's screen, task index
  • Example Videos: Feedback (A; Task 8), Feedback (D; Task 6)
  • Task Success and Time to Complete [bar charts]: Number of tasks successfully completed (out of 8), by prototype A-D: 7.4, 6.6, 6.7, 6.2. Time to complete task (in sec.), by prototype A-D: 96.9, 87.8, 83.8, 77.8.
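Read as completion rates, the chart's means convert directly to percentages. A quick sketch, assuming the chart's values map to prototypes A through D in order (that mapping is our assumption, not stated on the slide):

```javascript
// Hypothetical post-hoc summary: convert mean tasks completed (out of 8)
// into a percentage completion rate per prototype.
const TASKS = 8;
const meanCompleted = { A: 7.4, B: 6.6, C: 6.7, D: 6.2 }; // values read from the chart
for (const [proto, n] of Object.entries(meanCompleted)) {
  console.log(`${proto}: ${(100 * n / TASKS).toFixed(1)}%`); // e.g. A: 92.5%
}
```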
  • First Click: number of participants whose first click for each task landed on each navigation area

    Task | Home | Subject Areas | Databases & Tools | Economic Release | Publications | Elsewhere on page
    1    | 0    | 3             | 3                 | 0                | 32           | 7
    2    | 3    | 17            | 16                | 7                | 0            | 2
    3    | 3    | 26            | 9                 | 6                | 0            | 1
    4    | 2    | 33            | 4                 | 2                | 0            | 2
    5    | 1    | 34            | 6                 | 1                | 1            | 0
    6    | 3    | 38            | 3                 | 1                | 0            | 0
    7    | 4    | 25            | 2                 | 1                | 4            | 7
    8    | 1    | 25            | 9                 | 1                | 0            | 6
  • Method (continued) • Same participant criteria and prototype groups as above • Participants complete up to 10 tasks while providing verbal feedback • Following testing, participants write responses to 4 open-ended questions
  • Open-Ended Questions • How easy were these tasks to complete? Were they very easy, easy, neither easy nor difficult, difficult, or very difficult? How easy do you think these tasks would be for an average American citizen using this website? [Bar chart: % of participants selecting each rating, by prototype A-D. Easy/Very Easy: 47-69%; Neither Easy Nor Difficult: 20-39%; Difficult/Very Difficult: 0-17%.]
  • Feedback/Recommendations• Final Recommendation: Prototype C• Several participants were surprised that the Home tab had a menu• No items under Workplace Injuries subcategory on Subject Areas menu
  • Feedback/Recommendations • The Subject Areas menu extended off the screen in some participants' browsers, requiring scrolling
  • Advantages • Cost ($)

    Method              | Cost
    Traditional, In-Lab | $40 per participant
    TryMyUI.com         | $35 per participant
    UserTesting.com     | $39 per participant
    UTest.com           | --
    OpenHallway.com     | $49-$199 per month
    Loop11.com          | $350 per project
  • Advantages • Cost ($) • Cost (Time)

    Task                      | In-Lab                                              | Web
    Requesting participants   | 20 minutes total (explaining criteria to recruiter) | 30 minutes total (specifying test groups and criteria)
    Screening participants    | 10 minutes per participant                          | 0 minutes (done by TryMyUI)
    Scheduling participants   | 15 minutes per participant                          | 0 minutes (study done at participant's convenience)
    Preparing for interviews  | 10 minutes per participant                          | 60 minutes total (setting up web survey and tasks)
    Total (for 45 interviews) | 26.6 hours                                          | 1.5 hours
    Data collection period    | 3 weeks (based on interviewer schedule)             | 1-2 days (depends on participant criteria)
  • Advantages • Cost ($) • Cost (Time) • Short data collection period • Participants can be selected based on criteria • Videos can be shared • Participants are skilled/trained at thinking aloud, at dealing with problems, and at evaluating websites • Can get replacements for unusable cases
  • Disadvantages • No follow-up • Cannot correct navigation errors • Limited to 20 minutes* • Task completion time, success, and other quantitative measures have to be captured manually* • No rating scales* • Panel participants may have selection biases that make them different from a 'typical' user (* May be different on other testing services)
  • Discussion • What testing service have you used? • What are the advantages/disadvantages of that service? Loop11.com, UTest.com, Userzoom.com, UserTesting.com, TryMyUI.com, OpenHallway.com, OptimalWorkshop.com
  • Contact Information: Brandon Kopp, Research Psychologist, Office of Survey Methods Research, www.bls.gov/osmr, 202-691-7514, kopp.brandon@bls.gov