
Using data to inform design decisions in a UX immature organization - Big Design Conference 2017

Does your “customer-centric” organization all but ignore the feedback that matters most? Does the pressure to get to market faster and outmaneuver the competition crowd out your potential to deliver a great user experience?

Mike Parker shares a step-by-step approach for giving customers their rightful seat at the table. He’ll help you spot logical entry points for collecting and measuring compelling user data, then guide you on positioning your analysis to tell an influential story that ties design decisions to critical business improvement.


  1. Using data to make informed design decisions. Mike Parker, User Experience Director, CA Technologies.
  2. The ask.
  3. The problem.
  4. Larger problem. +20% …
  5. Organizational UX maturity.
  6. Organizational UX maturity.
  7. Oh dear.
  8. Gain a baseline understanding of your product’s performance.
  9. How?
     Usage analytics
     Support call logs
     Contextual interviews
     Sales representative interviews
     Usability testing
     Jira tickets
     Social media
     3rd-party measurement
  10. How?
      Usage analytics
      Support call logs
      Contextual interviews
      Sales representative interviews
      Usability testing
      Jira tickets
      Social media
      3rd-party measurement
  11. Where we started.
  12. Where we started.
  13. Where we started.
  14. Look for KPIs that matter.
      18,297 customer calls; 17 mins avg. length; 1.1 avg. calls per customer; $402,534 support cost at $22 per call.
      52% web portal: 9,514 web portal customer calls, $209,308 support cost.
      123 web portal support tickets at $45 per ticket, $5,535 support cost: 84 line setup, 28 AR setup, 11 HG setup.
  15. Focus on biggest impact.
      18,297 customer calls; 17 mins avg. length; 1.1 avg. calls per customer; $402,534 support cost at $22 per call.
      52% web portal: 9,514 web portal customer calls, $209,308 support cost.
      123 web portal support tickets at $45 per ticket, $5,535 support cost: 84 line setup, 28 AR setup, 11 HG setup.
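Before selling numbers like these, it pays to confirm they hang together. They do: every cost on these two slides is reproducible from its drivers. A minimal sanity check in Python (the variable names are mine; the figures are from the slides):

```python
# Reproduce the support-cost KPIs shown on slides 14-15 from their drivers.
customer_calls = 18_297        # total customer support calls per month
cost_per_call = 22             # dollars per call, from the slide
web_portal_calls = 9_514       # calls attributed to the web portal
portal_tickets = 123           # web portal support tickets
cost_per_ticket = 45           # dollars per ticket

print(customer_calls * cost_per_call)               # 402534 -> the $402,534 total support cost
print(web_portal_calls * cost_per_call)             # 209308 -> the $209,308 web portal cost
print(round(web_portal_calls / customer_calls, 2))  # 0.52   -> the "52% web portal" share
print(portal_tickets * cost_per_ticket)             # 5535   -> the $5,535 ticket support cost
print(84 + 28 + 11)                                 # 123    -> line + AR + HG setup tickets
```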
  16. Build a case.
      AR issues (severity / effort / est. monthly cost / priority):
      Locating the AR feature: H / L / $3,586 / H
      Special characters not allowed in the Caller ID field: M / L / $2,492 / H
      Incorrect setup of advanced features (e.g., Simultaneous Ring, Bridging): H / M / $3,827 / H
      Call Forwarding Busy feature set to on by default: M / L / $1,407 / M
      Active status for AR when not set up: M / L / $178 / M
      Setting up AR call routing: M / H / $1,157 / M
      Recording AR custom greetings: M / H / $1,068 / M
      Setting up AR schedule: M / H / $938 / M
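The deck doesn’t say how the priority column was derived. One plausible way to reproduce a ranking like this is to weight each issue’s estimated monthly cost by severity and (inverse) effort. A hypothetical sketch, where the `Issue` record and the weight tables are my assumptions, not the method behind the slide:

```python
from dataclasses import dataclass

@dataclass
class Issue:
    name: str
    severity: str        # "H", "M", or "L"
    effort: str          # "H", "M", or "L"
    monthly_cost: int    # estimated monthly support cost, dollars

# HYPOTHETICAL weights: favor high-severity, high-cost, low-effort fixes.
SEVERITY = {"H": 3, "M": 2, "L": 1}
EASE = {"L": 3, "M": 2, "H": 1}  # low effort ranks higher

def score(issue: Issue) -> int:
    return issue.monthly_cost * SEVERITY[issue.severity] * EASE[issue.effort]

issues = [
    Issue("Locating the AR feature", "H", "L", 3586),
    Issue("Special characters not allowed in Caller ID", "M", "L", 2492),
    Issue("Incorrect setup of advanced features", "H", "M", 3827),
    Issue("Setting up AR schedule", "M", "H", 938),
]

for issue in sorted(issues, key=score, reverse=True):
    print(f"{score(issue):>6}  {issue.name}")
```

With these weights the ordering matches the slide’s H/H/H/M priorities, but any monotone cost-times-weight rule gives a similar list.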
  17. Sell your case.
      18,297 customer calls; 17 mins avg. length; 1.1 avg. calls per customer; $402,534 support cost at $22 per call.
      52% web portal: 9,514 web portal customer calls, $209,308 support cost.
      123 web portal support tickets at $45 per ticket, $5,535 support cost: 84 line setup, 28 AR setup, 11 HG setup.
  18. Sell your case.
      18,297 customer calls; 17 mins avg. length; 1.1 avg. calls per customer; $402,534 support cost at $22 per call.
      52% web portal: 9,514 web portal customer calls, $209,308 support cost.
      123 web portal support tickets at $45 per ticket, $5,535 support cost: 84 line setup, 28 AR setup, 11 HG setup.
      ROI: $59,846 monthly savings. Effort: 4 weeks.
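To turn that ROI line into a payback argument, the only missing input is what the four-week effort costs. A sketch, where the weekly team cost is a loudly hypothetical placeholder (the deck gives only the savings and the duration):

```python
monthly_savings = 59_846   # projected monthly savings, from the slide
effort_weeks = 4           # effort estimate, from the slide
weekly_team_cost = 12_000  # HYPOTHETICAL loaded cost of the design/dev team

effort_cost = effort_weeks * weekly_team_cost   # $48,000 under the assumption above
payback_months = effort_cost / monthly_savings  # ~0.8 months to break even
annual_savings = monthly_savings * 12           # $718,152 per year

print(f"Payback in {payback_months:.1f} months; ~${annual_savings:,} saved per year")
```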
  19. So who is this guy anyway?
  20. “Not only does this thing work great, it’s actually fun to use too.” - Jeffrey W.
  21. Fix this.
  22. How will you know it’s fixed?
  23. Back to baseline. Test the before (baseline); test the after (benchmark).
  24. Which KPIs to measure? Quantitative: success/failure, time on task, number of clicks. Qualitative: ease of use, confidence, consistency.
  25. Don’t forget your SUS.
      1. I think that I would like to use this system frequently.
      2. I found the system to be simple.
      3. I thought the system was easy to use.
      4. I can use the system without the support of a technical person.
      5. The various functions in the system were well integrated.
      6. I thought there was a lot of consistency in the system.
      7. I would imagine that most people would learn to use the system very quickly.
      8. I found the system very intuitive.
      9. I felt confident using the system.
      10. I could use the system without having to learn anything new.
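Note that all ten items above are positively worded, unlike the standard SUS, which alternates positive and negative statements (and scores odd items as response − 1, even items as 5 − response, with the sum scaled by 2.5). For an all-positive variant like this one, a reasonable adaptation, and an assumption on my part, is to score every item as response − 1:

```python
def sus_score(responses: list[int]) -> float:
    """Score a 10-item, all-positively-worded SUS variant on a 0-100 scale.

    Each response is a 1-5 Likert rating; each item contributes
    (response - 1), and the summed contributions are scaled by 2.5.
    """
    assert len(responses) == 10 and all(1 <= r <= 5 for r in responses)
    return sum(r - 1 for r in responses) * 2.5

# Example: broadly agreeable responses land in the high range of the scale.
print(sus_score([5, 4, 5, 4, 4, 5, 4, 4, 5, 4]))  # 85.0
```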
  26. Decide on methodology.
      Online, unmoderated: UserZoom, clickable prototype.
      8 target participants, non-customers (no biases).
      5-7 tasks.
      Qualitative questionnaire following each task.
      SUS questionnaire at test completion.
      Same design for the before and after tests.
  27. Put it all together.
      Flow: Task 1 instruction → test screen 1 → test screen 2 → questionnaire, then Task 2 instruction → test screen 2 → test screen 3 → questionnaire → SUS.
      Each task should be finite. The participant attempts the task on screen 1. When testing a flow, break the larger task into bite-size chunks that can be completed one screen at a time. Screen 2 provides the participant with closure so they can give meaningful answers to the questionnaire. Start the next task at the previous closure page for continuity.
      Quantitative (automatic): success/fail, time to complete, number of clicks.
      Qualitative: ease of use, confidence, consistency.
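If you export the raw per-task results, the automatic quantitative metrics are easy to aggregate yourself. A sketch, assuming a simple `TaskResult` record (illustrative only, not UserZoom’s export format):

```python
from dataclasses import dataclass
from statistics import median

@dataclass
class TaskResult:
    participant: str
    success: bool
    seconds: float  # time to complete the task
    clicks: int

def summarize(results: list[TaskResult]) -> dict:
    """Aggregate one task's quantitative KPIs: success rate, median time, median clicks."""
    return {
        "success_rate": sum(r.success for r in results) / len(results),
        "median_seconds": median(r.seconds for r in results),
        "median_clicks": median(r.clicks for r in results),
    }

task1 = [
    TaskResult("P1", True, 74.0, 12),
    TaskResult("P2", False, 151.0, 31),
    TaskResult("P3", True, 92.0, 15),
]
print(summarize(task1))
# {'success_rate': 0.666..., 'median_seconds': 92.0, 'median_clicks': 15}
```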
  28. Do it. 1. Usability test: establish baseline, understand problems. 2. Design solutions. 3. Usability test: benchmark progress, validate design decisions.
  29. The result.
      32% improvement in user confidence (that they’ve completed the task correctly).
      20% improvement in flow efficiency (time the user spent completing the AR setup flow).
      37% improvement in ease of use.
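Improvement figures like these come from a straightforward before/after comparison: for metrics where lower is better (time on task), improvement is the reduction relative to the baseline; for ratings, it’s the gain relative to the baseline. A sketch with hypothetical raw values, since the slide shows only the percentages:

```python
def improvement(baseline: float, benchmark: float, lower_is_better: bool = False) -> float:
    """Relative improvement of the benchmark over the baseline, as a fraction."""
    if lower_is_better:                        # e.g., time spent in the setup flow
        return (baseline - benchmark) / baseline
    return (benchmark - baseline) / baseline   # e.g., an ease-of-use rating

# HYPOTHETICAL raw values, chosen only to illustrate the arithmetic:
print(f"{improvement(300, 240, lower_is_better=True):.0%}")  # 20% less time in the flow
print(f"{improvement(3.0, 4.1):.0%}")                        # 37% better ease-of-use rating
```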
  30. Study comparison.
  31. SUS comparison. Baseline: 62.25; benchmark: 90.25 (0-100 scale, graded F to A).
  32. UX ROI. 45% easier to use; $40,096 monthly cost savings; 28 issues fixed; 5 wks total UX effort.
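The 45% figure lines up with the SUS comparison on the previous slide: (90.25 − 62.25) / 62.25 ≈ 0.45, i.e., the benchmark SUS score is about 45% higher than the baseline. That reading is inferred from the numbers shown, not stated in the deck.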
  33. What it looked like. Five-week timeline (w1-w5): call center visit; benchmark usability study; AR redesign; refine; improvements usability study.
  34. Rock on!
      There’s no one right way to include your users.
      Always start with a baseline.
      Look for impact. Sell it. Track. Measure. Sell it.
  35. Thank you. Mike Parker, User Experience Director, Agile Management, CA Technologies. @parkahux
