Users are Losers! They’ll Like Whatever we Make! and Other Fallacies.
Presented at CodeMash 2013.
If this sounds familiar it is time to make big changes or look for a new job. Failing your users will only end badly. In this session we look at the assumptions that are all-too-often made about users, usability and the User Experience (UX). In response to each of these misguided statements Carol will provide a quick method you can conduct with little or no resources to debunk these myths.


Transcript

  • 1. CodeMash 2013. Presented by Carol Smith @carologic
  • 2.  User Experience · Ethnography · Customer Insight · Usability · Interaction Design · User Research
  • 3. Plan A: R-E-S-P-E-C-T
  • 4.  Let’s find out about those losers users!  Share what is known  Existing users = usability study  Observations and interviews  Web site – use analytics  Social listening
  • 5.  Learn about:  User’s environment  Real process  Interruptions  Attitudes and opinions  Problems  Goals
  • 6.  Plan with a goal/hypothesis Questions 1. Make a guide 2. Review 3. Test 4. Start study
  • 7.  Share little Related tasks Wait for patterns Save questions Stay out of their “space” Don’t interrupt
  • 8.  Clarify observations  Why doing?  Goal?  How typical was this? Use prepared questions Don’t lead the witness Do listen closely Use their language
  • 9. Artifacts!  http://www.flickr.com/photos/heygabe/ via http://creativecommons.org/licenses/by-nc-sa/2.0/  Actual photo: http://www.flickr.com/photos/heygabe/47206241/
  • 10.  Explicit consent Record video, photo, audio Take notes Give incentives
  • 11.  When do they think about your product?  In what context?  Most important to them?  Most like to change? Web sites used most frequently? Phone? What kind? Etc. Etc.
  • 12.  Let’s find out!  Market research / segments are a start  Go where (they *think*) they are ▪ Starbucks ▪ Wal-Mart* ▪ Conferences/User Groups  Card sort to test organization of info
  • 13.  Use to determine:  Order of information  Relationships  Labels for navigation  Verify correct audience http://www.flickr.com/photos/rosenfeldmedia/ via http://creativecommons.org/licenses/by-nc-sa/2.0/
  • 14.  Maximize probability of users finding content Explore how people are likely to group items Identify content likely to be:  Difficult to categorize  Difficult to find  Misunderstood Gaffney, Gerry. (2000) What is Card Sorting? Usability Techniques Series, Information & Design. http://www.infodesign.com.au/usabilityresources/design/cardsorting.asp
  • 15. http://www.flickr.com/photos/richtpt via http://creativecommons.org/licenses/by-nc-sa/2.0/
  • 16. One title/subject  Printed stickers  Concise and clear  Example card: “36 Preventive Care Guidelines”  Numbered for analysis  Short description on back of card if needed
  • 17.  Practice session  Allow 1 hour for 50 items; 30–100 items total  Name groups of cards  Moderated (in-person or remote)  Un-moderated (online)
  • 18.  Ask to  Describe overall rationale for grouping cards  Show best example  What was difficult? What was easy?  Happy with final outcome?
  • 19.  Code cards = faster data analysis Look for patterns Excel Spreadsheet (Donna Spencer) Online tools - limited analysis Screenshot of OptimalSort online tool’s analysis - http://www.optimalworkshop.com/optimalsort.htm
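    A minimal sketch (not from the deck) of the kind of card-sort analysis slide 19 describes: tally how often participants grouped each pair of cards together so patterns stand out before you move to a spreadsheet or an online tool. The card numbers and sorts below are invented examples.

        # Tally pairwise co-occurrence across participants' open card sorts.
        # Card IDs and groupings are made-up illustration data.
        from itertools import combinations
        from collections import Counter

        # One entry per participant: a list of groups, each group a set of card numbers.
        sorts = [
            [{1, 2, 3}, {4, 5}],   # participant A
            [{1, 2}, {3, 4, 5}],   # participant B
            [{1, 2, 3, 4}, {5}],   # participant C
        ]

        pair_counts = Counter()
        for groups in sorts:
            for group in groups:
                for a, b in combinations(sorted(group), 2):
                    pair_counts[(a, b)] += 1

        # Pairs grouped together most often suggest the strongest relationships.
        for (a, b), n in pair_counts.most_common(5):
            print(f"cards {a} & {b}: grouped together by {n} of {len(sorts)} participants")

    Pairs that nearly every participant puts together are candidates for the same navigation label; pairs that split evenly flag content that is hard to categorize.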
  • 20. “They’ll Like Whatever we Make”
  • 21.  Let’s test that  Usability test prototypes  Rapid, iterative cycles of design and evaluation  Web - feedback from on-site tools  Customer feedback/Help desk
  • 22.  Real users doing real tasks Using prototypes or live products Doing assigned tasks without guidance Observed closely http://creativecommons.org/licenses/by-sa/2.0/ http://www.flickr.com/photos/raphaelquinet/513351385/sizes/l/in/photostream/ http://www.flickr.com/photos/raphaelquinet/
  • 23.  Qualitative – not quantitative  actions + comments Series of small usability tests  3 participants each day  At least 3 days of testing  Changes made between testing days
  • 24. Test-and-update cycle across Day 1, Day 2, and Day 3, with changes prioritized by level of effort: 1 = High, 2 = Medium, 3 = Low
  • 25.  End of each day - after the last session Room with a whiteboard About 30 minutes Discuss  trends seen  concerns  recommendations  prioritize changes for the next round  list lower priority changes for future iterations
  • 26.  Final prototype  Vetted with users  Base for recommendations Light Report: “Caterpillar to Butterfly”  Screenshots show progressions  What changes were made and why
  • 27.  Traditional Testing In-Person Remote  Moderated or Un-moderated
  • 28. (Yes, this is an old idea; a great one!)
  • 29.  Small focused tests Reduce waiting for recruitment Once per week/sprint Same day mid-week Fewer users, shorter sessions: analyze at lunch  3 or more participants recommended  Half hour to 1 hour each
  • 30.  Make team aware Invite everyone Recurring meeting invites for stakeholders
  • 31.  Work in Progress Multiple projects Prototypes Concepts, rough ideas, brainstorming Competing designs (A/B testing) Comparative studies across market Conduct interviews to inform research More…
  • 32. - Jeff Gothelf - http://blog.usabilla.com/5-effective-ways-for-usability-testing-to-play-nice-with-agile/
  • 33.  Team becomes  accustomed to steady stream of qualitative insight  ensures quick decisions  lines up with business and user goals Adapted from Jeff Gothelf - http://blog.usabilla.com/5-effective-ways-for-usability-testing-to-play-nice-with-agile/
  • 34.  “We are all only temporarily able-bodied. Accessibility is good for us all.” Spirit of the law  WCAG 2.0  Country specific (Section 508) -@mollydotcom at #stirtrek 2011 via @carologic
  • 35.  Test & Observation Rooms Any location will do  Conference rooms  Offices  Quiet corner of cafeteria  Remote Purchase software - always ready
  • 36.  Screener  Technology use/experience  Knowledge of topic Scripts/Guides Consent Forms Data Collection
  • 37. “Why would we need anything more?”
  • 38.  Great way to get quantitative information Questions  Words can have multiple meanings  Unintended meanings Fewer people participate now than in the past People save face  “It’s not that bad”, “It’s my fault” Vendors requesting a perfect 10
  • 39.  Too close to the project Know things about the product that others wouldn’t Concerns about ego, job, co-workers, etc. Not the intended user!
  • 40.  Studies have shown that testing 5-6 representative users of each user type will reveal 80% of usability issues. http://www.useit.com/alertbox/20000319.html Jakob Nielsen’s Alertbox. Why You Only Need to Test with 5 Users. March 19, 2000.
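    For context on where that figure comes from: the Alertbox article cited above models discovery as 1 − (1 − L)ⁿ, where L is the share of problems a single representative user uncovers (about 31% in Nielsen and Landauer's data). A quick sketch of the arithmetic, with L hard-coded to that published estimate:

        # Expected share of usability problems found after testing n users,
        # using the Nielsen/Landauer model from the article cited above.
        L = 0.31  # average share of problems one user reveals (published estimate)
        for n in range(1, 9):
            found = 1 - (1 - L) ** n
            print(f"{n} users: ~{found:.0%} of problems found")
        # Around 5 users this reaches roughly 85%, which is the basis for the
        # "5-6 users per user type" rule of thumb on the slide above.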
  • 41.  Identify repetition After pattern is found, continuation of study:  Adds cost  Delays reporting  Low probability of many new findings
  • 42.  Testing five users is always enough Can test anyone and have the same results Smaller groups yield better findings
  • 43.  Visual appearance is important  Must also be usable  Designed for users  Tasks able to be completed  Organized well  http://www.brainjuicer.com
  • 44. http://www.flickr.com/photos/kaptainkobold/5181464194/sizes/o/in/photostream/  http://www.flickr.com/photos/kaptainkobold/
  • 45.  Costs more time and money How long will product be used? Less costly to find and correct issues than to provide training to work around the problem
  • 46.  Time Money Can’t talk to our Customers Liability Not needed Invisible ROI
  • 47.  Be armed with  Facts  Questions Don’t just pick a method  What do you need to know?  What will the stakeholders respond to?
  • 48. 54
  • 49. @carologic  slideshare.net/carologic  speakerrate.com/speakers/15585-caroljsmith
  • 50. Albert, Bill, Tom Tullis, and Donna Tedesco. Beyond the Usability Lab.
    Albert, Bill, and Tom Tullis. Measuring the User Experience.
    Beyer, Hugh. User-Centered Agile Methods (Synthesis Lectures on Human-Centered Informatics).
    Gothelf, Jeff. http://blog.usabilla.com/5-effective-ways-for-usability-testing-to-play-nice-with-agile/
    Bias, Randolph G., and Deborah J. Mayhew. Cost-Justifying Usability: An Update for the Internet Age.
    Henry, S. L., and Martinson, M. Evaluating for Accessibility, Usability Testing in Diverse Situations. Tutorial, 2003 UPA Conference.
    Krug, Steve. Rocket Surgery Made Easy: The Do-It-Yourself Guide to Finding and Fixing Usability Problems.
    Molich, Rolf. A Critique of “How to Specify the Participant Group Size for Usability Studies: A Practitioner’s Guide” by Macefield. Journal of Usability Studies, Vol. 5, Issue 3, May 2010, pp. 124–128.
    Nielsen, Jakob. Why You Only Need to Test with 5 Users. Jakob Nielsen’s Alertbox, March 19, 2000.
    Nielsen, Jakob. Usability Evangelism: Beneficial or Land Grab?
    Ratcliffe, Lindsay, and Marc McNeill. Agile Experience Design: A Digital Designer’s Guide to Agile, Lean, and Continuous.
    Rubin, Jeffrey, and Dana Chisnell. Handbook of Usability Testing: How to Plan, Design, and Conduct Effective Tests. John Wiley & Sons, Inc.
    Spool, Jared. The $300 Million Button.