Measuring UX
Michael Le
Why bother?
 You did rapid prototyping using Balsamiq
 You used Agile during development
 You are feature and code complete
 You delivered the application on time
Consider the following…
 Are people using your application?
 Are people using your application the way you intended?
 Are people using your application differently?
 Do they like it?
Goal
 Continuous improvement
How?
 Observing
 Listening
 Analyzing
Observing
The user is not like me
 The sooner you accept that, the easier observing becomes
 Be humble
Task Observations
 Preset scenarios that involve the user going through one story or key action
 Thinking out loud is encouraged
 The observer should refrain from helping the user with the actions
 Video recording is recommended, both of the screen and of the user
Example
 Task 1 – Marking points on the EUR price curve
 Task 2 – Marking points on the GBP price curve
 Based on a rating scale of 1–5 (hard to easy)
 [Bar chart: number of users giving each rating, EUR pricing vs. GBP pricing]
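A distribution chart like the one above can be rebuilt from raw responses with a simple tally. This is an illustrative sketch; the rating lists below are invented, not the actual study data:

```python
from collections import Counter

# Hypothetical 1-5 ease ratings collected from ten users per task
eur_ratings = [4, 5, 4, 3, 5, 4, 4, 5, 3, 4]
gbp_ratings = [2, 3, 2, 1, 3, 2, 2, 1, 3, 2]

for name, ratings in [("EUR", eur_ratings), ("GBP", gbp_ratings)]:
    counts = Counter(ratings)
    # One column per rating value, e.g. "1:0  2:0  3:2  4:5  5:3"
    bars = "  ".join(f"{r}:{counts.get(r, 0)}" for r in range(1, 6))
    print(f"{name} pricing  {bars}")
```

Tallying both tasks side by side makes the shift in the distribution (GBP skewed toward "hard") visible at a glance.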
 Same task, but a different score
 Users mentioned that they found the second task harder because they couldn’t navigate back to the main selection area
Eye heatmaps
 Using sophisticated tools such as eye-tracking cameras, you can create heatmaps of what people are looking at in an application
 Facebook: attention is concentrated on areas with pictures
 CNN: people generally avoid looking at ads
Listening
Surveys
 General questions about the user and how they interact with the system

System usability scale
 A simple list of Likert-scaled questions
1. I think that I would like to use this system frequently.
2. I found the system unnecessarily complex.
3. I thought the system was easy to use.
4. I think that I would need the support of a technical person to be able to use this system.
5. I found the various functions in this system were well integrated.
6. I thought there was too much inconsistency in this system.
7. I would imagine that most people would learn to use this system very quickly.
8. I found the system very cumbersome to use.
9. I felt very confident using the system.
10. I needed to learn a lot of things before I could get going with this system.
Graphing the responses
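Before graphing, raw SUS responses are conventionally converted to a single 0–100 score: odd-numbered (positively worded) items contribute response − 1, even-numbered (negatively worded) items contribute 5 − response, and the sum is multiplied by 2.5. A minimal sketch of that scoring, with an invented respondent:

```python
def sus_score(responses):
    """Convert ten 1-5 Likert responses to a 0-100 SUS score.

    Odd-numbered items are positively worded (contribute response - 1);
    even-numbered items are negatively worded (contribute 5 - response).
    """
    if len(responses) != 10:
        raise ValueError("SUS requires exactly 10 responses")
    total = 0
    for item, r in enumerate(responses, start=1):
        total += (r - 1) if item % 2 == 1 else (5 - r)
    return total * 2.5

# A hypothetical respondent answering 4 to every positive item
# and 2 to every negative one:
print(sus_score([4, 2, 4, 2, 4, 2, 4, 2, 4, 2]))  # 75.0
```

Graphing the per-respondent scores (rather than raw item responses) is what makes surveys comparable across releases.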
Watch out for “bad” questions
 Leading questions
 Questions with yes/no responses
“Would you use this application?”
 For web applications aimed at the general public, this is a good question
 For an internal application used for regulatory purposes, maybe not
 The answer would always be YES
 What is a better question?
“What would make you stop using this application?”
 May uncover what is important to users
 People were generally worried about calculation accuracy
 People were worried about who did what (auditing)
Example: Walmart
 Trying to compete with Target in the area of store layout
 Clean and tidy (Target) vs. packed aisles (Walmart)
 Walmart surveyed its customers:
 “Would you like Walmart aisles to be less cluttered?”
People said
 Yes, that’s a great idea; less clutter in the aisles is good!
Walmart reacted
 15% of inventory was removed from the aisles
 Removed pallets of items like juice boxes from the centers of aisles
 Reduced displays at the ends of aisles
 Shortened shelves
What happened
 A 1.85 billion dollar loss in sales for Walmart
Cause of error
 Walmart came up with the answer first, then asked customers to agree to it
 You should react to what customers do rather than what they say
 e.g. “How often do you work out?”
Analyzing
Logs
 What are people searching for?
 How often?
 Uncover relationships between the data searched for and your application
 Add hooks to your application so that you can track when different views are invoked
 Taking the idea of tracking web clicks to the desktop application world
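One minimal way to add such hooks is a decorator that counts view invocations. A sketch only: the view name, handler, and in-memory counter below are illustrative, and a real application would ship events to a log file or analytics backend instead:

```python
import functools
from collections import Counter

view_counts = Counter()  # illustrative in-memory store

def tracked(view_name):
    """Decorator that records each invocation of a view handler."""
    def wrap(fn):
        @functools.wraps(fn)
        def inner(*args, **kwargs):
            view_counts[view_name] += 1  # the "hook": count before rendering
            return fn(*args, **kwargs)
        return inner
    return wrap

@tracked("eur_price_curve")
def show_eur_curve():  # hypothetical view handler
    return "rendering EUR curve"

show_eur_curve()
show_eur_curve()
print(view_counts["eur_price_curve"])  # 2
```

Because the hook wraps the handler rather than living inside it, the same decorator can be applied to every view without touching their bodies.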
Heuristic evaluation
 Can be done during the rapid prototyping stage as well
 A few evaluators who are familiar with heuristic evaluation go through and rate an application on certain criteria
 Considered cheaper than, and as effective as, user testing
 This is not a graphical design evaluation
Nielsen’s Heuristics
 Jakob Nielsen’s list of heuristics is one of the most widely used sets for evaluating user interfaces
 There are 10 heuristics
 Each heuristic is measured on a numeric scale:
 1 – low priority
 10 – high priority, must fix
Nielsen’s Heuristics (1/10)
Visibility of system status
 Essentially feedback
 Users should be informed of the system state within a reasonable time

Nielsen’s Heuristics (2/10)
Match between system and real world
 The system should use the users’ language: words, phrases and concepts
 “Items” versus “songs” in an album

Nielsen’s Heuristics (3/10)
User control and freedom
 Making errors is a good way for users to discover features of the application
 There should be affordances that let users fail without fear: undo, redo, exit without saving

Nielsen’s Heuristics (4/10)
Consistency and standards
 Follow platform conventions
 Users should not have to worry whether words mean different things in different situations (e.g. “close” vs. “exit”)

Nielsen’s Heuristics (5/10)
Error prevention
 The best error handling is error prevention

Nielsen’s Heuristics (6/10)
Recognition rather than recall
 Minimize the user’s memory load
 Make objects, actions and options visible
 “Preview”
 “IntelliSense”

Nielsen’s Heuristics (7/10)
Flexibility and efficiency of use
 Allow users to tailor frequent actions
 “Accelerators”

Nielsen’s Heuristics (8/10)
Aesthetic and minimalist design
 Key information should be clearly presented

Nielsen’s Heuristics (9/10)
Help users recognize, diagnose and recover from errors
 Error messages that are user friendly
 Suggest how to solve the issue

Nielsen’s Heuristics (10/10)
Help and documentation
 Provide help and documentation to simplify tasks
 Video guides
 Tool tips
 Graphed as a matrix
 Rows are the different evaluators
 Columns are the heuristic measures
 Data is sorted to group the easy problems together, highlighting the hard problems
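The matrix described above might be assembled like this. The evaluator names, heuristic labels, and severity scores are made up for illustration; ranking the column totals surfaces the heuristics most in need of fixing:

```python
heuristics = ["Visibility", "Match", "Control", "Consistency"]
scores = {  # hypothetical 1-10 severity ratings, one row per evaluator
    "Evaluator A": [2, 7, 9, 3],
    "Evaluator B": [1, 8, 8, 2],
    "Evaluator C": [3, 6, 9, 4],
}

# Total severity per heuristic (column sums of the matrix)
totals = [sum(row[i] for row in scores.values())
          for i in range(len(heuristics))]

# Sort heuristics so the highest-priority problems come first
ranked = sorted(zip(heuristics, totals), key=lambda t: t[1], reverse=True)
print(ranked[0])  # ('Control', 26) -- the heuristic most in need of fixing
```

Summing across evaluators, rather than trusting any single rating, is what makes the matrix more robust than one reviewer's opinion.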
Result of Measuring UX
 Knowing your users better
 Continuous feedback and improvement cycles
 Driving change that is measurable
 Leading businesses to make decisions on key areas rather than just speculation
Next steps?
 If there is interest, gather as a group and go through design exercises
 Similar setup to the Human Computer Interaction course from Coursera

Skills
 Rapid prototyping (Balsamiq)
 Heuristic evaluation
 Presentation
References
 http://whitneyhess.com/blog/2012/05/04/the-user-is-not-likeme/
 http://dailyartifacts.com/walmarts-185-billon-dollar-mistake
 http://www.businessinsider.com/eye-tracking-heatmaps2012-5?op=1
 http://www.useit.com/papers/heuristic/heuristic_evaluation.html
