Podcamp11: DIY Usability Testing

While experts lend specific expertise and an additional level of credibility, usability t

Published in: Technology
1. DIY USABILITY: TESTING 1, 2, 3
mandy butters, certified usability analyst

2. #pcn11UT

3. usability testing is:
- observing the user’s experience to improve the design
- performance based, not preference based
- an ongoing process

4. but you shouldn’t start there: think “prevention,” not “validation”

5. why test?
- performance problems
- achievement of business objectives
- critical or frequent tasks
- new stuff
- things you aren’t too sure about
- user complaints

6. benefits:
- data, not opinions
- avoids rework
- positive ROI:
  - usage
  - conversion
  - drop-offs
  - errors
  - support
  - training
  - time

7. the not-so-secret formulas for ROI: www.humanfactors.com/downloads/roi.asp
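The linked worksheet walks through the actual formulas; the sketch below only illustrates the general shape of such an estimate. The function name, inputs, and all dollar figures are hypothetical, not taken from the slide or the worksheet: savings from avoided support contacts and rework are weighed against what the testing cost.

```python
# Hedged sketch of a back-of-the-envelope usability-testing ROI estimate.
# Every number here is made up; substitute your own before trusting the result.

def usability_roi(test_cost, support_calls_avoided, cost_per_call,
                  rework_hours_avoided, hourly_rate):
    """Return (net_savings, roi_ratio) for one round of testing."""
    savings = (support_calls_avoided * cost_per_call
               + rework_hours_avoided * hourly_rate)
    return savings - test_cost, savings / test_cost

# Example: $3,000 of testing avoids 200 support calls at $15 each
# and 40 hours of rework at $75/hour.
net, ratio = usability_roi(3000, 200, 15, 40, 75)
print(net, ratio)  # net savings of 3000, ROI ratio of 2.0
```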
8. if this is so fabulous… why don’t more people do it?

9. 3 easy steps:
  1. recruit
  2. test
  3. analyze

10. but first, let’s talk about time:
You could spend it here:
  - user research
Or here:
  - rework
  - user complaints
  - customer service
  - training / help desk
  - recovery

11. step 1: recruit
- 5 to 10 participants per user group
  - ideal: match actual or potential users
  - in a pinch: use friends, family, or similar users
  - do not: use coworkers, designers, or developers
- be serious about the schedule
  - limit sessions to 1 hour
  - leave time in between
  - schedule back-ups (end of day); have “on-calls”
  - confirm and communicate
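The scheduling advice above (one-hour sessions, time in between) is easy to lay out mechanically. This is a minimal sketch, assuming a 30-minute buffer between sessions; the start time, session count, and buffer length are all assumptions to adjust.

```python
# Hedged sketch: generate one-hour session slots with a gap between them,
# per the "limit sessions to 1 hour / leave time in between" guidance.
from datetime import datetime, timedelta

def schedule_sessions(first_start, n_sessions, buffer_minutes=30):
    """Return a list of (start, end) datetimes for the day's sessions."""
    slots = []
    start = first_start
    for _ in range(n_sessions):
        end = start + timedelta(hours=1)   # one-hour session
        slots.append((start, end))
        start = end + timedelta(minutes=buffer_minutes)  # breathing room
    return slots

# Five sessions starting at 9:00 fill the day through 16:00.
slots = schedule_sessions(datetime(2011, 11, 5, 9, 0), n_sessions=5)
for s, e in slots:
    print(s.strftime("%H:%M"), "-", e.strftime("%H:%M"))
```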
12. recruiting tips:
- don’t be afraid to ask
- recruit 24/7

13. step 2: test
- you need:
  - a room
  - a table
  - a computer
  - chairs (3)
  - (and sometimes) internet connectivity
  - a watch
- optional:
  - data-capture software, audio/video equipment

14. who’s in the room:
- participant
- facilitator
- observer/note-taker

15. start early. test often.
- paper sketches
- wireframes
- design concepts
- functioning prototypes

16. what do I test, and how?

17. but it’s never too late.
- advanced prototypes
- live site

18. what do I test, and how?

19. you will need:
- written scenarios and/or tasks, OR a script
- test protocol
- a form or spreadsheet to track performance
- optional:
  - consent form, if recording audio or video
  - follow-up questionnaires

20. a word about scenarios:
- create a realistic situation
- leave it open-ended, so the user is free to explore
-OR-
- include tasks within the situation
- give answers to all choices
- don’t give away the answer

21. let’s talk about performance tests. observe the user:
- how do they perform?
- can they succeed?
- is it efficient?
- do they “get it”?
- what problems did they have?
- can they recover from problems?

22. what to track on your form:
- # of tasks completed successfully
- # of successful/unsuccessful steps
- # of wrong paths taken
- # of retries/restarts
- error rates
- time to complete
- time to achieve the usability goal
- steps required
- tasks performed per time frame
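The tracking form amounts to one record per task attempt plus a roll-up. A minimal sketch of that idea, with field names modeled on the metrics listed above (the names and the example numbers are assumptions, not the slide’s actual form):

```python
# Hedged sketch: the tracking form as a data structure, one record per
# task attempt, with a summary that derives the headline numbers.
from dataclasses import dataclass

@dataclass
class TaskAttempt:
    task: str
    completed: bool            # task completed successfully?
    wrong_paths: int           # number of wrong paths taken
    retries: int               # number of retries/restarts
    errors: int
    seconds_to_complete: float

def summarize(attempts):
    """Roll the form up into completion rate, error rate, and average time."""
    n = len(attempts)
    return {
        "completion_rate": sum(a.completed for a in attempts) / n,
        "avg_errors": sum(a.errors for a in attempts) / n,
        "avg_time": sum(a.seconds_to_complete for a in attempts) / n,
    }

attempts = [
    TaskAttempt("find pricing", True, 1, 0, 0, 42.0),
    TaskAttempt("sign up", False, 3, 2, 2, 180.0),
]
print(summarize(attempts))
```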
23. the rating sheet:

24. the usability questionnaire:

25. about the test itself:
- introduce
- set the stage / warm up
- distribute scenarios
- watch and rate performance
- ask final questions / clear up any confusion
- pay
- set up for the next user (clear cache, new user i.d., etc.)

26. tips for the facilitator:
- you’re testing the design, not the person
- there is no “wrong” answer
- don’t point out mistakes; just move on
- do not give any help during the test; redirect questions back to the participant
  - exception: when the task must be finished to get to the next scenario

27. tips for the facilitator, continued. ask:
- “Why?”
- “How would you do that?”
- “What do you think?”
- “What are you feeling?”
- “Tell me more about that.”
- “I’m not sure. What would you do?”
- “What are you looking for?”

28. other handy advice:
- do a practice run
- use disclaimers
- reassure, but be neutral
- don’t interrupt
- save explanation for the end

29. step 3: analyze
- finish notes
- identify usability issues
- compile data (whatever you tracked)
- compare results to goals
- prioritize
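The last two analysis steps, compare results to goals and prioritize, can be sketched mechanically. The severity-times-frequency scoring below is a common scheme assumed here for illustration, not necessarily the one on the original slides, and the goal, issue names, and scores are all invented.

```python
# Hedged sketch of "compare results to goals, then prioritize."
# Issues are ranked by severity * frequency (an assumed scoring scheme).

GOAL_COMPLETION = 0.80  # assumed usability goal: 80% task completion

def prioritize(issues):
    """Sort issues so the worst (highest severity * frequency) come first."""
    return sorted(issues,
                  key=lambda i: i["severity"] * i["frequency"],
                  reverse=True)

issues = [
    {"name": "checkout button hidden", "severity": 3, "frequency": 0.9},
    {"name": "typo on help page",      "severity": 1, "frequency": 0.2},
    {"name": "search returns nothing", "severity": 3, "frequency": 0.5},
]
ranked = prioritize(issues)

completion_rate = 0.6  # from whatever you compiled off the tracking forms
if completion_rate < GOAL_COMPLETION:
    print("Below goal; fix first:", ranked[0]["name"])
```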
30. prioritizing usability:

31. report formats:
- quick fixes
- punch list
- pow-wow
- presentation
- report

32. NOW GET TO TESTIN’. [email_address]
