How do you know if the user experience your team is slaving over is succeeding or failing? How can your team use data to make better decisions? Quantitative measures can help answer these questions and can complement more traditional qualitative research methods like usability testing.
How to Quantitatively Measure Your User Experience
1. Measures of Success: How to Quantitatively Measure Your User Experience
Richard Dalton, @mauvyrusset
http://www.flickr.com/photos/torontorob/4044565681/
5. #2 The objectives of your experience will likely differ from those of the person sitting next to you
@mauvyrusset #WVpdx
6. Goals → Tasks + Emotions → Capabilities → Projects

Goals: Users have goals … and the business has goals.
Goals are realized through tasks and emotions: users do things to try to meet their goals … and the business wants users to do things so it can meet its goals.
Tasks are enabled & encouraged by capabilities: capabilities help users to do the tasks they want to do … and encourage users to do the tasks the business wants them to do.
Capabilities are created & changed by projects: projects create new capabilities and change existing capabilities.
7. #3 Measure how well tasks are satisfied by capabilities, not projects; otherwise you have no baseline
@mauvyrusset #WVpdx
9. User-driven tasks (90 tasks grouped into 8 categories):
- Find an investment company
- Make good investment decisions
- Monitor my investments
- Act on my investments
- Stay current on news and commentary
- Deal with taxes
- Help other people be successful investors
- Help my heirs be successful at Vanguard

Capabilities (635 capabilities and counting), delivered via: Web, Phone, Paper, E-mail, Mobile devices, Radio/TV

Business-driven tasks (45 tasks grouped into 7 categories):
- Follow Vanguard's investing principles
- Learn why Vanguard is great
- Bring assets to Vanguard
- Use Vanguard's products & services
- Self-provision on the web
- Spread the word about Vanguard
- Trust Vanguard
11. User-driven tasks:
- Research an item: Get information on the item; Find out how much the item costs; Compare the item to others like it; See if other people like the item; Save the item to look at later; Print details about the item; See related items
- Ready to buy: Buy the item; Find out shipping costs and times; Find out how to pay
- Tell others: Tell a friend about the item

Capability: Item profile (web)

Business-driven tasks:
- Close the sale: Buy the item
- Cross sell: See related items
- Trust us: Believe that the site is safe and secure
- Spread the word: Tell other people about the item
12. Buy the item: prioritized tasks with design and success-criteria annotations

Tasks, ranked:
1. Buy the item
2. Get information on the item
3. Find out how much the item costs
4. See if other people like the item
5. Find out shipping costs and times
6. Compare the item to others like it
7. Believe that the site is safe and secure
8. Find out how to pay
9. Save the item to look at later
10. See related items
11. Print details about the item
12. Tell a friend about the item / Tell other people about the item

Emotional considerations: Shoppers are commonly fearful or unsure that they have chosen the best product for their needs and want to "comparison shop" the price and/or features of several products.

High-level design & content approach: Show recently viewed items and provide access to a "compare to similar items" tool.

Success criteria & measures ("what is good?"): The ratio of users looking at recently viewed items via the "compare to similar items" tool vs. pogo-sticking back to the gallery page should be 20:1 or better.
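That 20:1 criterion is just a ratio of two instrumented event counts, so it can be checked automatically. A minimal sketch; the function name and all counts below are hypothetical, not from the talk:

```python
# Check the 20:1 success criterion for the "compare to similar items" tool.
# Event counts would come from analytics instrumentation; the numbers used
# below are hypothetical.
def meets_ratio_criterion(compare_tool_views, pogo_stick_returns,
                          required_ratio=20.0):
    """True if users reach recently viewed items via the compare tool at
    least required_ratio times as often as they pogo-stick back to the
    gallery page."""
    if pogo_stick_returns == 0:
        return compare_tool_views > 0  # no failure path observed at all
    return compare_tool_views / pogo_stick_returns >= required_ratio

print(meets_ratio_criterion(4200, 180))  # 23.3:1 -> True
print(meets_ratio_criterion(4200, 300))  # 14:1   -> False
```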
15. Drivers and outcome, mapped as an Ishikawa (fishbone) diagram

Drivers:
- S1.01 Get information on the item
- S1.02 Find out how much the item costs
- S1.03 Compare the item to others like it
- S1.04 See if other people like the item
- S2.02 Find out shipping costs and times
- S2.03 Find out how to pay
- S3.01 Tell a friend about the item
- P2.01 See related items
- P3.01 Believe that the site is safe and secure
- P4.02 Tell other people about the item

Outcome:
- S2.01 / P1.01 Buy the item

http://en.wikipedia.org/wiki/Ishikawa_diagram
16. #4 Measuring outcomes can tell you if a capability is failing. Measuring drivers can tell you why
@mauvyrusset #WVpdx
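One way to operationalize this: when the outcome metric drops, compare each driver metric against its own baseline to see which drivers moved with it. A minimal sketch; the metric names, rates, and the 5-point threshold are all hypothetical:

```python
# Given baseline and current completion rates for the driver tasks, flag
# the drivers whose rates dropped enough to explain a falling outcome.
# All names, rates, and the threshold below are hypothetical.
def drivers_to_investigate(baseline, current, outcome, threshold=-0.05):
    """Return drivers whose rate fell by more than `threshold` vs. baseline."""
    return sorted(d for d in baseline
                  if d != outcome and current[d] - baseline[d] < threshold)

baseline = {"get_item_info": 0.72, "compare_items": 0.55,
            "trust_site": 0.81, "buy_item": 0.061}
this_week = {"get_item_info": 0.71, "compare_items": 0.38,
             "trust_site": 0.80, "buy_item": 0.042}

# The outcome ("buy_item") fell; only "compare_items" dropped with it.
print(drivers_to_investigate(baseline, this_week, "buy_item"))
# prints ['compare_items']
```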
17. What's being evaluated, and which measure fits:
A. General effectiveness in satisfying the task ("I've done it" or "Now I understand"): completion or conversion rates; did future user behavior change as a result of this interaction?
B. Findability of a known item ("I know where I'm going, don't get in my way"): speed to find the first task
C. Were the user expectations met? ("Oh, that wasn't what I wanted, let me go back …"): link bounce rate
D. Client satisfaction ("Huh, that information was useless"): "Was this useful/helpful" surveys; "loss of sale" surveys
E. Was enough information provided? ("Don't leave me wanting more info"): usage of a second source of information
F. Are users travelling the paths we expected them to? ("I can't find the link, I'll go back to the homepage first"): relative usage of one path vs. another to the identical destination
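Two of these measures, completion rate (A) and link bounce rate (C), can be computed directly from session event logs. A minimal sketch, assuming a simplified, hypothetical log format of page/event names per session:

```python
# Measure A: completion or conversion rate.
def completion_rate(sessions, completion_event):
    """Fraction of sessions that contain the completion event."""
    done = sum(1 for events in sessions if completion_event in events)
    return done / len(sessions)

# Measure C: link bounce rate ("that wasn't what I wanted, let me go back").
def link_bounce_rate(sessions, target_page):
    """Fraction of visits to target_page that immediately return to the
    page the user came from."""
    visits = bounces = 0
    for events in sessions:
        for i, page in enumerate(events):
            if page == target_page:
                visits += 1
                if 0 < i < len(events) - 1 and events[i + 1] == events[i - 1]:
                    bounces += 1
    return bounces / visits if visits else 0.0

# Hypothetical session logs.
sessions = [
    ["home", "item", "checkout", "purchase_complete"],
    ["home", "item", "home"],      # bounced straight back off the item page
    ["home", "item", "reviews"],
]
print(completion_rate(sessions, "purchase_complete"))  # 1 of 3 sessions
print(link_bounce_rate(sessions, "item"))              # 1 of 3 visits
```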
18. Success Criteria

Enduring: use for on-going monitoring. Measure a single solution against predefined criteria, set by past user behavior or future expectations.
Example. Task: Get information on the item. Loss of sale surveys should show that problems with item information account for less than 5% of lost sales.

Temporary: use for point-in-time improvement. Compare two or more solutions against one another using A/B testing, with point-in-time "winners" identified by test results.
Example. Task: Print details about the item. A/B test two versions of the printer-friendly entry point (link vs. button).
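Declaring an A/B "winner" like the link-vs.-button example needs a significance check, not just a raw comparison of counts. A minimal two-proportion z-test sketch using only the standard library; all counts below are hypothetical:

```python
from math import sqrt, erf

# Two-proportion z-test: is variant A's conversion rate reliably different
# from variant B's?
def two_proportion_z(conv_a, n_a, conv_b, n_b):
    """Return (z statistic, two-sided p-value)."""
    p_a, p_b = conv_a / n_a, conv_b / n_b
    p_pool = (conv_a + conv_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    phi = 0.5 * (1 + erf(abs(z) / sqrt(2)))  # standard normal CDF
    return z, 2 * (1 - phi)

# 130/1000 clicks on the link version vs. 90/1000 on the button version.
z, p = two_proportion_z(130, 1000, 90, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05: the link wins this test
```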
19. #5 Ask: how would the user behave if we nailed the design? How would they behave if we screwed it up?
@mauvyrusset #WVpdx