A Practical Guide to Measuring User Experience

Measuring the success (or failure) of an experience can be a daunting and confusing endeavor. In this presentation Richard shares 11 guidelines to help you quantitatively measure your user experience.

Richard provides techniques and tips for each phase and illustrates their use with real examples from his team’s work at Vanguard. To conclude, he describes some of the cultural and change management challenges involved when an organization uses data to inform design decisions.

Transcript

  • 1. 1
    A Practical Guide to Measuring User Experience
    Richard Dalton, @mauvyrusset, #measuringux
    http://www.flickr.com/photos/kwl/5282931021
  • 2. #1 To learn from failure, you need to know if you’re failing and why – measurement can tell you that
    @mauvyrusset #measuringux
    2
  • 3. THERE IS NOTHING FUNNY ABOUT R.O.I.
    3
  • 4. 4
  • 5. #2 Evaluate your experience against something you care about – is it meeting its objectives?
    @mauvyrusset #measuringux
    5
  • 6. 6
  • 7. #3 The objectives of your experience will likely differ from those of the person sitting next to you
    @mauvyrusset #measuringux
    7
  • 8. Users have goals … and the business has goals
    Users do things to try to meet their goals … and the business wants users to do things so it can meet its goals
    Users also have feelings about their goals & tasks … and the business wants to encourage certain feelings
    Capabilities help users to do the tasks they want to do … and encourage users to do the tasks the business wants them to do
    Projects create new capabilities; projects change existing capabilities
    Goals → are realized through → Tasks + Emotions → are enabled & encouraged by → Capabilities → are created & changed by → Projects
    8
  • 9. #4 Measure how well tasks are satisfied by capabilities, not projects – otherwise you have no baseline
    @mauvyrusset #measuringux
    9
  • 10. User-driven tasks (90 tasks grouped into 8 categories): Find an investment company; Make good investment decisions; Monitor my investments; Act on my investments; Stay current on news and commentary; Deal with taxes; Help other people be successful investors; Help my heirs be successful at Vanguard
    Business-driven tasks (45 tasks grouped into 7 categories): Follow Vanguard’s investing principles; Learn why Vanguard is great; Bring assets to Vanguard; Use Vanguard's products & services; Self-provision on the web; Spread the word about Vanguard; Trust Vanguard
    Capabilities (635 capabilities and counting …): Web, Phone, Paper, E-mail, Mobile devices, Radio/TV
    10
  • 11. 11
    http://filmfanatic.org/reviews/wp-content/uploads/2008/01/anfscd-parrot.png
  • 12. User-driven tasks
    Research an item: Get information on the item; Find out how much the item costs; Compare the item to others like it; See if other people like the item; Save the item to look at later; Print details about the item; See related items
    Ready to buy: Buy the item; Find out shipping costs and times; Find out how to pay
    Tell others: Tell a friend about the item
    Business-driven tasks
    Close the sale: Buy the item
    Cross sell: See related items
    Trust us: Believe that the site is safe and secure
    Spread the word: Tell other people about the item
    Capabilities
    Item profile (web)
    12
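To make guideline #4 concrete against the example above, the capability-to-task mapping can live in a small data structure so each capability carries its own baseline from one measurement to the next. A minimal Python sketch; the class names, fields, and numbers are illustrative assumptions, not something from the deck:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    code: str   # e.g. "S1.01" for a user-driven task, "P2.01" for a business-driven one
    name: str

@dataclass
class Capability:
    name: str
    tasks: list[Task] = field(default_factory=list)            # tasks this capability must satisfy
    baselines: dict[str, float] = field(default_factory=dict)  # task code -> last measured value

# The "Item profile (web)" capability from the example, with a few of its tasks.
item_profile = Capability(
    name="Item profile (web)",
    tasks=[
        Task("S1.01", "Get information on the item"),
        Task("S2.01", "Buy the item"),
        Task("P2.01", "See related items"),
    ],
)

# Because measures attach to the capability (not to a one-off project),
# each new measurement can be compared against the previous baseline.
item_profile.baselines["S2.01"] = 0.031   # e.g. 3.1% conversion last quarter (made-up number)
```

Because the baselines hang off the capability rather than a project, the next project that touches the item profile inherits something to compare against.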
  • 13. 13
    1. Buy the item
    2. Get information on the item
    3. Find out how much the item costs
    4. See if other people like the item
    5. Find out shipping costs and times
    6. Compare the item to others like it
    7. Believe that the site is safe and secure
    8. Find out how to pay
    9. Save the item to look at later
    10. See related items
    11. See related items
    12. Print details about the item
    Tell a friend about the item
    Tell other people about the item
    Emotional considerations (Buy the item): Shoppers are commonly fearful or unsure that they have chosen the best product for their needs and want to “comparison shop” the price and/or features of several products.
    High-level design & content approach: Show recently viewed items and provide access to a “compare to similar items” tool.
    Success criteria – “what is good?”: The ratio of users looking at recently viewed items via the “compare to similar items” tool vs. pogo-sticking back to the gallery page should be 20:1 or better.
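The 20:1 success criterion on slide 13 is just a ratio of two instrumented event counts, so it can be checked directly from analytics data. A minimal sketch, assuming hypothetical event names (compare_tool_view, pogo_stick_back) that are not from the presentation:

```python
from collections import Counter

def meets_compare_criterion(events: list[str], threshold: float = 20.0) -> bool:
    """Check the 'compare to similar items' success criterion:
    views of recently viewed items via the compare tool vs. pogo-sticking
    back to the gallery page should be 20:1 or better."""
    counts = Counter(events)
    compare_views = counts["compare_tool_view"]   # hypothetical event name
    pogo_backs = counts["pogo_stick_back"]        # hypothetical event name
    if pogo_backs == 0:
        return compare_views > 0                  # no pogo-sticking at all counts as a pass
    return compare_views / pogo_backs >= threshold

# Example: 4,200 compare-tool views vs. 180 pogo-sticks -> ratio of roughly 23:1, criterion met.
sample = ["compare_tool_view"] * 4200 + ["pogo_stick_back"] * 180
print(meets_compare_criterion(sample))  # True
```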
  • 14. 14
  • 15. Measures vs. Success Criteria
    Measure
    Success Criteria
    15
    http://3.bp.blogspot.com/_iUXX3rF1Axo/TC0fGhmlV0I/AAAAAAAAAMw/atKJuTja0AM/s1600/butterfly+growth+chart2.jpg
    http://www.clarklings.com/uploaded_images/IMG_1100-732092.JPG
  • 16. Ishikawa (fishbone) diagram – http://en.wikipedia.org/wiki/Ishikawa_diagram
    Outcome: S2.01 / P1.01 Buy the item
    Drivers:
    Research an item – S1.01 Get information on the item; S1.02 Find out how much the item costs; S1.03 Compare the item to others like it; S1.04 See if other people like the item
    Ready to buy – S2.02 Find out shipping costs and times; S2.03 Find out how to pay
    Tell others – S3.01 Tell a friend about the item
    Cross sell – P2.01 See related items
    Trust us – P3.01 Believe that the site is safe and secure
    Spread the word – P4.02 Tell other people about the item
    16
  • 17. #5 Measuring outcomes can tell you if a capability is failing. Measuring drivers can tell you why
    @mauvyrusset #measuringux
    17
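Guideline #5 reads naturally as a two-level report: watch the outcome measure to detect that a capability is failing, then scan the driver measures to see why. A rough Python sketch of that idea; the measures, values, and targets below are invented for illustration:

```python
# Current values and targets for the fishbone above (all numbers are made up).
outcome = {"measure": "P1.01 Buy the item – conversion rate", "value": 0.021, "target": 0.030}
drivers = {
    "S1.01 Get information on the item – task completion": (0.88, 0.90),
    "S2.02 Find out shipping costs and times – task completion": (0.52, 0.85),
    "P3.01 Believe that the site is safe and secure – survey score": (4.1, 4.0),
}

if outcome["value"] < outcome["target"]:
    print(f"Outcome below target: {outcome['measure']}")
    # The drivers that miss their own targets are the likeliest explanations.
    for name, (value, target) in drivers.items():
        if value < target:
            print(f"  failing driver: {name} ({value} vs. target {target})")
```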
  • 18. What’s being evaluated? → Measure
    General effectiveness in satisfying the task (“I’ve done it” or “Now I understand”) → Completion or conversion rates; did future user behavior change as a result of this interaction?
    Findability of a known item (“I know where I’m going, don’t get in my way”) → Speed to find the first task
    Were the user expectations met? (“Oh, that wasn’t what I wanted, let me go back …”) → Link bounce rate
    Client satisfaction (“Huh, that information was useless”) → “Was this useful/helpful” surveys; “loss of sale” surveys
    Was enough information provided? (“Don’t leave me wanting more info”) → Usage of a second source of information
    Are users travelling the paths we expected them to? (“I can’t find the link, I’ll go back to the homepage first”) → Relative usage of one path vs. another to identical destination
    18
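Several of the measures on slide 18 fall straight out of a per-session page log. A sketch of two of them, completion (conversion) rate and link bounce rate; the session format and field names are assumptions for illustration, not an actual analytics API:

```python
# Each session is a hypothetical ordered list of (page, seconds_spent) tuples.
sessions = [
    [("gallery", 20), ("item_profile", 45), ("checkout", 60)],
    [("gallery", 15), ("item_profile", 4), ("gallery", 30)],   # bounced off the item page
    [("search", 10), ("item_profile", 90)],
]

def completion_rate(sessions, goal_page="checkout"):
    """Share of sessions that reach the goal page (a completion/conversion measure)."""
    completed = sum(1 for s in sessions if any(page == goal_page for page, _ in s))
    return completed / len(sessions)

def link_bounce_rate(sessions, page="item_profile", max_seconds=5):
    """Share of visits to `page` where the user left almost immediately -
    a proxy for 'that wasn't what I wanted, let me go back'."""
    visits = [secs for s in sessions for p, secs in s if p == page]
    if not visits:
        return 0.0
    return sum(1 for secs in visits if secs <= max_seconds) / len(visits)

print(completion_rate(sessions))   # 1/3 of sessions reached checkout
print(link_bounce_rate(sessions))  # 1/3 of item-profile visits bounced
```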
  • 19. Success Criteria
    Enduring – use for on-going monitoring: measure a single solution against predefined criteria; criteria are set by past user behavior or future expectations.
    Enduring example – Task: Get information on the item. Loss of sale surveys should show that problems with item information account for less than 5% of lost sales.
    Temporary – use for point-in-time improvement: compare two or more solutions against one another using A/B testing; point-in-time “winners” are identified by test results.
    Temporary example – Task: Print details about the item. A/B test two versions of the printer-friendly entry point (link vs. button).
    19
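For the temporary, A/B-style criteria on slide 19, the comparison usually comes down to two conversion rates (say, print-page entry clicks for a link variant versus a button variant). A minimal two-proportion z-test sketch with invented counts; this is one common way to evaluate such a test, not necessarily the method Vanguard used:

```python
import math

def two_proportion_z(success_a, n_a, success_b, n_b):
    """z statistic for the difference between two conversion rates (pooled variance)."""
    p_a, p_b = success_a / n_a, success_b / n_b
    pooled = (success_a + success_b) / (n_a + n_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / n_a + 1 / n_b))
    return (p_a - p_b) / se

# Invented counts: 'link' entry point vs. 'button' entry point to the printer-friendly page.
z = two_proportion_z(success_a=310, n_a=10_000,   # link:   3.1% clicked print
                     success_b=380, n_b=10_000)   # button: 3.8% clicked print
print(f"z = {z:.2f}")   # |z| > 1.96 would be significant at roughly the 5% level
```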
  • 20. #6 Ask: how would the user behave if we nailed the design? How would they behave if we screwed it up?
    @mauvyrusset #measuringux
    20
  • 21. 21
  • 22. Sometimes people aren’t ready to listen
    22
  • 23. #7 Be open about the uses & limitations of data & involve people early – it helps gain buy-in
    @mauvyrusset #measuringux
    23
  • 24. Beware good intentions
    24
  • 25. #8 Avoid misleading measures – the temptation to use the data is too strong. Ask “what if the result is X?”
    @mauvyrusset #measuringux
    25
  • 26. The only thing we have to fear is fear itself
    Ostrich
    26
    http://fc05.deviantart.net/fs36/f/2008/285/4/f/You_make_kitty_scared_by_GreenLabRat.jpg
  • 27. #9 Be unbiased. Don’t be afraid to measure things that might contradict your own opinion
    @mauvyrusset #measuringux
    27
  • 28. I’m sorry Dave. I’m afraid I can’t do that
    28
    http://img395.imageshack.us/img395/3899/hal90001600en6.jpg
  • 29. #10 Don’t lose your perspective about how data fits into your decision-making process
    @mauvyrusset #measuringux
    29
  • 30. But we have over 635 capabilities!
    30
    http://www.flickr.com/photos/tammra/279392432/
  • 31. #11 Start small. Pick a capability, identify objectives, define measures and watch what happens
    @mauvyrusset #measuringux
    31
  • 32. Thank you
    Richard Dalton
    richard@mauvyrusset.com
    mauvyrusset.com
    @mauvyrusset
    32
