Podcamp11: DIY Usability Testing
311 views

While experts lend specific expertise and an additional level of credibility, usability t

Published in: Technology



Transcript

  • 1. DIY USABILITY: TESTING 1, 2, 3. Mandy Butters, Certified Usability Analyst
  • 2. #pcn11UT
  • 3. usability testing is:
    • observing the user’s experience to improve the design
    • performance based, not preference based
    • an ongoing process
  • 4. but you shouldn’t start there: Think “prevention” not “validation”
  • 5. why test?
    • performance problems
    • achievement of business objectives
    • critical or frequent tasks
    • new stuff
    • things you aren’t too sure about
    • user complaints
  • 6. benefits:
    • data not opinions
    • avoids rework
    • positive ROI
      • usage
      • conversion
      • drop-offs
      • errors
      • support
      • training
      • time
  • 7. the not-so-secret formulas for ROI: www.humanfactors.com/downloads/roi.asp
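The ROI idea above can be sketched as simple arithmetic: benefits (support calls avoided, conversions gained) against the cost of running the tests. This is a hypothetical back-of-envelope model in the spirit of the humanfactors.com calculators, not their actual formula, and every number below is made up.

```python
def usability_roi(testing_cost, support_calls_avoided, cost_per_call,
                  conversions_gained, value_per_conversion):
    """Return (net benefit, benefit/cost ratio) for one round of testing."""
    benefit = (support_calls_avoided * cost_per_call
               + conversions_gained * value_per_conversion)
    return benefit - testing_cost, benefit / testing_cost

net, ratio = usability_roi(
    testing_cost=2000,          # facilitator time + participant incentives
    support_calls_avoided=100,  # fewer "how do I...?" calls per year
    cost_per_call=15,
    conversions_gained=50,
    value_per_conversion=40,
)
print(net, ratio)  # 1500 1.75 -- testing pays for itself in this scenario
```

Swap in your own support, conversion, and training numbers; the point is that even rough inputs usually show a positive return.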
  • 8. if this is so fabulous…
    • why don’t more people do it?
  • 9. 3 easy steps
    • 1. recruit
    • 2. test
    • 3. analyze
  • 10. but first, let’s talk about time:
    • You could spend it here:
      • user research
    • Or here:
      • rework
      • user complaints
      • customer service
      • training / help desk
      • recovery
  • 11. step 1: recruit
    • 5 to 10 participants per user group
      • ideal: match actual or potential user
      • in a pinch: use friends, family or similar users
      • do not: use coworkers, designers, developers
    • be serious about the schedule
      • limit sessions to 1 hour
      • leave time in between
      • schedule back-ups (end-of-day); have “on-calls”
      • confirm and communicate
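The scheduling advice above (1-hour sessions, time in between, back-ups at end of day) can be blocked out mechanically. A minimal sketch, assuming a 30-minute buffer between sessions; the times and buffer length are illustrative, not prescribed by the talk.

```python
from datetime import datetime, timedelta

def session_slots(start, n_sessions, session_min=60, buffer_min=30):
    """Return (begin, end) pairs: fixed-length sessions with a gap between."""
    slots, t = [], start
    for _ in range(n_sessions):
        slots.append((t, t + timedelta(minutes=session_min)))
        t += timedelta(minutes=session_min + buffer_min)
    return slots

# Five sessions starting at 9:00 fill the day and leave room for overruns.
for begin, end in session_slots(datetime(2011, 9, 17, 9, 0), n_sessions=5):
    print(begin.strftime("%H:%M"), "-", end.strftime("%H:%M"))
```

The last slot ends mid-afternoon, which is where the end-of-day back-up participant goes.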
  • 12. recruiting tips:
    • don’t be afraid to ask
    • recruit 24/7
  • 13. step 2: test
    • you need:
      • a room
      • a table
      • a computer
      • chairs (3)
      • (and sometimes) internet connectivity
      • a watch
    • optional:
      • data capture software, audio/video equipment
  • 14. who’s in the room:
    • participant
    • facilitator
    • observer/note-taker
  • 15. start early. test often.
    • paper sketches
    • wire frames
    • design concepts
    • functioning prototypes
  • 16. what do I test, and how?
  • 17. but it’s never too late.
    • advanced prototypes
    • live site
  • 18. what do I test, and how?
  • 19. you will need:
    • written scenarios and/or tasks – OR – a script
    • test protocol
    • a form or spreadsheet to track performance
    • optional
      • consent form if recording audio or video
      • follow-up questionnaires
  • 20. a word about scenarios:
    • create a realistic situation
    • leave it open-ended, so the user is free to explore
    • -OR-
    • include tasks within the situation
    • give answers to all choices
    • don’t give away the answer
  • 21. let’s talk about performance tests:
    • observe the user:
      • how do they perform?
      • can they succeed?
      • is it efficient?
      • do they “get it”?
      • what problems did they have?
      • can they recover from problems?
  • 22. what to track on your form:
    • # tasks completed successfully
    • # successful/unsuccessful steps
    • # of wrong paths taken
    • # of retries/restarts
    • error rates
    • time to complete
    • time to achieve usability goal
    • steps required
    • tasks performed per time frame
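The tracking form above maps naturally onto a spreadsheet with one row per task per participant. A minimal sketch of such a record and a per-task summary; the field names and sample rows are my own, not from the talk.

```python
from dataclasses import dataclass

@dataclass
class TaskResult:
    participant: str
    task: str
    completed: bool
    wrong_paths: int
    retries: int
    errors: int
    seconds: float

def summarize(results):
    """Completion rate, mean time, and total errors for one task."""
    n = len(results)
    return {
        "completion_rate": sum(r.completed for r in results) / n,
        "mean_seconds": sum(r.seconds for r in results) / n,
        "total_errors": sum(r.errors for r in results),
    }

rows = [
    TaskResult("P1", "find pricing", True, 0, 0, 0, 42.0),
    TaskResult("P2", "find pricing", False, 2, 1, 3, 95.0),
]
print(summarize(rows))  # half the users completed; errors cluster on P2
```

Filling this in live (or right after each session) is what makes step 3 mostly mechanical.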
  • 23. the rating sheet:
  • 24. the usability questionnaire:
  • 25. about the test itself:
    • introduce
    • set the stage / warm up
    • distribute scenarios
    • watch and rate performance
    • ask final questions /clear up any confusion
    • pay
    • set up for next user (clear cache, new user i.d., etc.)
  • 26. tips for facilitator:
    • you’re testing the design, not the person
    • there is no “wrong” answer
    • don’t point out mistakes, just move on
    • do not give any help during the test; redirect questions back to the participant
      • Exception: when the task must be finished to reach the next scenario.
  • 27. tips for facilitator:
    • ask:
      • “Why?”
      • “How would you do that?”
      • “What do you think?”
      • “What are you feeling?”
      • “Tell me more about that.”
      • “I’m not sure. What would you do?”
      • “What are you looking for?”
  • 28. other handy advice:
    • do a practice run
    • use disclaimers
    • reassure but be neutral
    • don’t interrupt
    • save explanation for the end
  • 29. step 3: analyze
    • finish notes
    • identify usability issues
    • compile data (whatever you tracked)
    • compare results to goals
    • prioritize
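The prioritize step can be as simple as weighting each issue's severity by how many participants hit it. This scoring scheme is my own illustration; the talk does not prescribe a particular model.

```python
# Hypothetical issue list: severity 1 (cosmetic) to 3 (blocks the task),
# "hits" = how many of the participants ran into it.
issues = [
    {"issue": "checkout button not noticed", "severity": 3, "hits": 4},
    {"issue": "jargon in form labels",       "severity": 2, "hits": 5},
    {"issue": "typo on help page",           "severity": 1, "hits": 1},
]

def priority(issue, participants=5):
    """Severity weighted by the share of participants who hit the issue."""
    return issue["severity"] * issue["hits"] / participants

for item in sorted(issues, key=priority, reverse=True):
    print(round(priority(item), 2), item["issue"])
```

The ranked list drops straight into whichever report format you pick: quick fixes first, cosmetic issues last.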
  • 30. prioritizing usability:
  • 31. report formats:
    • quick fixes
    • punch list
    • pow-wow
    • presentation
    • report
  • 32. NOW GET TO TESTIN’. [email_address]