MeasureWorks - Why your customers don't like to wait!

My presentation at the Zycko breakfast session, about why your users don't like to wait and why you should care as a site owner. It covers the importance of perceived speed, navigation, and how to do proper performance monitoring.

  1. The "Waiting" Experience. @jeroentjepkema, Zycko breakfast session, June 18, 2014
  2. Waiting = Engagement
  3. Mobile?
  4. "Both offline and online consumers associate long wait times with poor customer service"
  5. Performance tolerance: the point where users click away
  6. [Chart: average bounce rate (%) per page type/session against page load time (0-10 sec.), with lines for the median, campaign, and product-search pages. Aggregated Real User Monitoring data collected by MeasureWorks for Dutch e-commerce websites.]
  7. [The same chart, repeated as a build step.]
  8. Purpose vs. Context
  9. Online users often lack context for delays...
  10. ...and see no other option than to click away
  11. Good Design + Fast Delivery = Great User Experience
  12. NS Hispeed: site optimization. [Chart (labeled "Bol.com: speed vs. engagement per month"): non-bounce rate (%) and page views (#) per load-time bucket, from 0-0.5 sec. up to >10 sec. Data presented by MeasureWorks at Velocity Conference London 2013: http://bit.ly/MW-VEUrum]
  13. NS Hispeed: site optimization. Homepage: load time 11.5 sec. before vs. 3.8 sec. after; availability 95.61% before vs. 99.52% after. Ticket ordering funnel (7 steps): transaction time 43.2 sec. before vs. 35.7 sec. after; availability 88.72% before vs. 97.65% after. Net result: a 30% higher site success rate.
  14. Design vs. Speed?
  15. [Chart: % of respondents rating the actual website speed as slow, average, or fast, per design score (1 = bad, 5 = beautiful). Perception of good design is influenced by fast page load times.]
  16. Good Design + Fast Delivery = Great User Experience
  17. Navigating the web is not easy...
  18. https://www.youtube.com/watch?v=3Sk7cOqB9Dk&feature=youtu.be
  19. Let's design for a fast experience!
  20. 1. Recognizable navigation
  21. 99% of time online is spent on sites other than yours... (Source: Jakob Nielsen)
  22. Experience vs. surfing online...
  23. ...is always compared to past experiences
  24. ...task completion has a positive impact
  25. ...slowdowns have more impact
  26. ...and it is sensitive to specific tasks
  27. Why is this important?
  28. Consumers visit 18-23 travel websites before they finally convert (Source: TNS/Nipo)
  29. [Diagram: performance influence cycle, step 1: the search/orientation phase shapes the perception of experience. http://benchmark.measureworks.nl]
  30. [Diagram: the full performance influence cycle around your website: search/orientation phase, perception of experience, delivered experience, stimulate content/conversion. http://benchmark.measureworks.nl]
  31. The waiting experience
  32. 2. Speed up your content!
  33. Speed up your perception
  34. http://www.nytimes.com/2012/08/19/opinion/sunday/why-waiting-in-line-is-torture.html?pagewanted=all&_r=0
  35. http://answers.yahoo.com/question/index?qid=1005081200005
  36. nytimes.com/2004/02/27/nyregion/27BUTT.html
  37. Perception = Fast Experience
  38. http://www.fastcodesign.com/1669788/the-3-white-lies-behind-instagrams-lightning-speed
  39. How fast should I be? (@jeroentjepkema, MeasureWorks)
  40. "Speed is at its best when it creates the feeling that you don't have to work to achieve your goal"
  41. 0-0.3 sec. is instantaneous: I like it! (Source: Jakob Nielsen)
  42. 0.3-1 sec. is interaction: let's converse...
  43. [Build step repeating slide 42]
  44. 1-3 sec.: hmm, shall I click away?
  45. 3-10 sec.: only if the task/content is relevant (a small classifier sketch of these thresholds follows the transcript)
  46. 46. It’s (also) about the customer journey...
  47. 47. Provide relevant content to support task completion...
  48. 48. ...deliver it fast, focus on perception!
  49. 49. “Flow state”
  50. 50. Speed vs. Content
  51. 51. What’s your time to first tweet? https://blog.twitter.com/2012/improving-performance-on-twittercom
  52. 3 (performance) design questions to ask...
  53. Above the fold?
  54. Which priority?
  55. Improving which metrics?
  56. 3. Measure to make it stick... (trouble with stickiness? [Image: gum on a shoe])
  57. www.donothing.com
  58. Put performance in your SLA: service levels = your baseline
  59. A practical (and easy DIY) example:
  60. "Purchasing a book (customer journey) must be completed (metric: speed), with every page loading under 3 sec. (target), from any location in the Netherlands (user locations), for 90% of all users (percentile), every day between 6am and 12pm (service window), using IE9 and higher (user scenario), measured with Real User Monitoring (monitoring type)." Read more: Metrics 101, Velocity Conf 2010. (A sketch expressing this SLA as checkable data follows the transcript.)
  61. Repeatable, end-user focused & quantifiable
  62. Setting up performance monitoring
  63. [Diagram: the application delivery chain, from the (cloud) data center (storage, databases, mainframe, middleware, app and web servers, load balancers, network) across the internet (third-party/cloud services, content delivery networks, major ISPs, local ISPs, mobile carriers) to customers and internal users.]
  64. [Same diagram, annotated:] This is what you control...
  65. [Same diagram, annotated:] This is what you control... and this is what you're blamed for.
  66. Measuring performance is not easy...
  67. You need diagnostic details for the things you can change and/or control...
  68. ...and you need insight into the things you can't control but that do impact your bottom line.
  69. Measuring performance?
  70. Always outside-in, from the browser perspective...
  71. [Diagram: a user connecting over the internet to a datacenter with a load balancer, web server, app server, and database.]
  72. [Diagram build: synthetic monitoring measures from outside, across the internet into the datacenter.]
  73. [Diagram build: Real User Monitoring (RUM) measures in the user's browser.]
  74. [Diagram build: Application Performance Monitoring (APM) instruments the web, app, and database tiers.]
  75. [Diagram build: all feeds land in a data warehouse (DWH) that drives dashboards, reports, and alerts.]
  76. [Diagram summary, annotated: "end-to-end user experience", "from pageview to application code", "deep-dive diagnostics".]
  77. Synthetic monitoring = your heartbeat
  78. Ultimately, synthetic monitoring shows you whether your site is working or not... (best used for operations and service reports)
  79. Simulate business transactions
  80. Error detection & root cause analysis (a toy heartbeat sketch follows the transcript)
  81. However, the problem with synthetic is...
  82. ...it's one user, not your end user. [Chart: a steady synthetic heartbeat line vs. fluctuating real users.]
  83. Real User Monitoring = your real users
  84. Ultimately, Real User Monitoring shows you how many users are affected by bad performance... (best used for capacity management and conversion optimization)
  85. How a RUM tag works: 1. insert the tag (a .js file) into your (mobile) web pages; 2. pages are requested from the browser/device; 3. as pages execute, the tag collects performance metrics; 4. after onload, the tag sends a detailed report for further analysis. (A bare-bones tag sketch follows the transcript.)
  86. [Chart: page views vs. load-time breakdown over time: DNS resolution, TCP connection, SSL handshake, front-end time, DOM ready, and page load.]
  87. Application Performance Monitoring = your mechanic
  88. Ultimately, Application Performance Monitoring shows you how your application behaves under pressure... (best used for root cause analysis, profiling, and development)
  89. A performance baseline...
  90. [Chart: Bol.com, speed vs. engagement per month: non-bounce rate (%) and page views (#) per load-time bucket, from 0-0.5 sec. up to >10 sec. Data presented by MeasureWorks at Velocity Conference London 2013: http://bit.ly/MW-VEUrum]
  91. [Same chart, annotated for mobile non-bounce: baseline 2.7 sec., LD50 4.85 sec., poverty line 10.6 sec. A sketch of how such a baseline can be derived follows the transcript.]
  92. Building reports...
  93. [Sample service-level report against the competition: weekly average performance for you, your competitors, and the industry, with callouts such as "40% of your competitors are faster than you", an average competitor performance of 3.4 sec., and your ranking against competitors (#6), plus per-page KPIs for availability and load time (document complete and fully loaded).]
  94. [Performance trend report per week: speed (sec.), number of URLs, number of third-party objects, and page size (KB).]
  95. Revenue risked (performance/time period/percentiles), for 100,000 visits. Optimal flow (P25): page 1: 2.1 sec., 100% availability, 49% bounce rate, 51,000 potential conversions; page 2: 4.0 sec., 100%, 51%, 24,990; page 3: 2.0 sec., 100%, 60%, 9,996; page 4: 3.5 sec., 85%, 25%, 6,372. Actual flow (P85): page 1: 4.1 sec., 100%, 59%, 41,000 actual conversions; page 2: 6.3 sec., 100%, 54%, 18,860; page 3: 2.0 sec., 100%, 60%, 7,544; page 4: 4.7 sec., 85%, 53%, 3,013. Load-time bands: 0-2.7 sec., 2.7-3.6 sec., >3.6 sec. (The underlying arithmetic is sketched after the transcript.)
  96. Questions so far?
  97. Building your (monitoring) strategy...
  98. [Matrix: choosing tooling by task: control (quality vs. service levels), user impacted?, manage deliverables (application chain vs. business impact), resolution (root cause vs. uptime), with the need for details ranging from low to high.]
  99. [The same matrix mapped to tooling: synthetic monitoring, Real User Monitoring, and Application Performance Monitoring.]
  100. One dataset to rule them all...
  101. The "waiting" experience: 1. understand your customer; 2. prioritize content; 3. deliver fast; 4. measure to optimize; 5. repeat this for every iteration.
  102. Thanks! More questions? M: jtjepkema@measureworks.nl, T: @jeroentjepkema, W: www.measureworks.nl. View slides: bit.ly/MW-breakFAST
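
The sketches below illustrate ideas from the deck; all are TypeScript, and any names, endpoints, and thresholds not stated on a slide are assumptions. First, the response-time limits from slides 41-45 (0.3, 1, 3, and 10 seconds, after Jakob Nielsen) as a minimal classifier; the bucket names are illustrative:

```typescript
// Perception buckets based on the response-time limits on slides 41-45
// (0.3 s, 1 s, 3 s, 10 s). Names are illustrative, not an official API.
type Perception =
  | "instantaneous"       // <= 0.3 s: "I like it!"
  | "interactive"         // <= 1 s: the conversation keeps flowing
  | "click-away-risk"     // <= 3 s: "hmm, shall I click away?"
  | "relevance-required"; // up to ~10 s: only relevant tasks survive

function classifyResponseTime(seconds: number): Perception {
  if (seconds <= 0.3) return "instantaneous";
  if (seconds <= 1) return "interactive";
  if (seconds <= 3) return "click-away-risk";
  return "relevance-required";
}

console.log(classifyResponseTime(0.8)); // "interactive"
```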
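
Slide 51's "time to first tweet" is a custom, user-centric metric. A minimal sketch of capturing a similar metric with the standard User Timing API, assuming the page can tell when its first meaningful item has rendered; the metric name is a placeholder:

```typescript
// Browser-side: mark the moment the first meaningful content is usable,
// then measure from navigation start (the default start point).
performance.mark("first-item-rendered");
performance.measure("time-to-first-item", undefined, "first-item-rendered");

const [ttfi] = performance.getEntriesByName("time-to-first-item");
console.log(`time-to-first-item: ${Math.round(ttfi.duration)} ms`);
```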
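
Slide 60's SLA reads naturally as data. A sketch of that SLA as a checkable object, with illustrative field names and the slide's values:

```typescript
// Slide 60's SLA expressed as data. Field names are illustrative.
interface PerformanceSLA {
  journey: string;        // customer journey
  targetSeconds: number;  // per-page load-time target (metric: speed)
  percentile: number;     // share of users that must meet the target
  locations: string[];    // user locations
  serviceWindow: string;  // when the SLA applies
  userScenario: string;   // browser/device scope
  monitoring: "RUM" | "synthetic"; // monitoring type
}

const bookPurchase: PerformanceSLA = {
  journey: "Purchasing a book",
  targetSeconds: 3,
  percentile: 90,
  locations: ["any location in the Netherlands"],
  serviceWindow: "every day between 6am and 12pm",
  userScenario: "IE9 and higher",
  monitoring: "RUM",
};

// Repeatable, end-user focused, quantifiable: did at least
// `percentile`% of measured page loads meet the target?
function meetsSLA(loadTimesSec: number[], sla: PerformanceSLA): boolean {
  const within = loadTimesSec.filter((t) => t <= sla.targetSeconds).length;
  return within / loadTimesSec.length >= sla.percentile / 100;
}

console.log(meetsSLA([1.2, 2.8, 3.4, 2.1], bookPurchase)); // false (75% < 90%)
```

Encoding the SLA as data rather than prose is what makes it repeatable and quantifiable, as slide 61 asks.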
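
For the synthetic-monitoring heartbeat of slides 77-80, a toy check in Node 18+ (built-in fetch). It measures server response time rather than full page rendering, so treat it as a sketch, not a substitute for a real synthetic tool; the URL and threshold are placeholders:

```typescript
// A toy synthetic heartbeat: one scripted "user" polling a URL on a
// schedule, recording availability and response time.
async function heartbeat(url: string, thresholdMs: number): Promise<void> {
  const started = Date.now();
  try {
    const res = await fetch(url, { redirect: "follow" });
    const elapsed = Date.now() - started;
    const healthy = res.ok && elapsed <= thresholdMs;
    console.log(`${url} status=${res.status} time=${elapsed}ms healthy=${healthy}`);
    // if (!healthy) -> raise an alert (webhook, pager, ...)
  } catch (err) {
    console.error(`${url} unreachable:`, err); // availability failure
  }
}

// One pulse per minute: a steady heartbeat, one user, not your end users.
setInterval(() => heartbeat("https://www.example.com/", 3000), 60_000);
```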
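
For the RUM tag lifecycle on slides 85-86, a bare-bones sketch of steps 3 and 4: after onload, read the browser's standard Navigation Timing breakdown and beacon it to a collector. The "/rum-collect" endpoint is a placeholder:

```typescript
// Runs inside the tagged page. After the load event, collect the
// Navigation Timing breakdown and send it with a non-blocking beacon.
window.addEventListener("load", () => {
  // Defer one tick so loadEventEnd is populated.
  setTimeout(() => {
    const [nav] = performance.getEntriesByType(
      "navigation",
    ) as PerformanceNavigationTiming[];
    if (!nav) return;

    const report = {
      url: location.href,
      dnsMs: nav.domainLookupEnd - nav.domainLookupStart,
      tcpMs: nav.connectEnd - nav.connectStart,
      sslMs: nav.secureConnectionStart > 0
        ? nav.connectEnd - nav.secureConnectionStart
        : 0,
      ttfbMs: nav.responseStart - nav.requestStart,
      domReadyMs: nav.domContentLoadedEventEnd, // relative to navigation start
      pageLoadMs: nav.loadEventEnd,
    };

    navigator.sendBeacon("/rum-collect", JSON.stringify(report));
  }, 0);
});
```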
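
Slides 90-91 derive a baseline from RUM data. A sketch, assuming per-pageview load times and bounce flags: bucket page views by load time, compute the non-bounce rate per bucket, and find the "LD50", the first bucket where fewer than half of visitors stay:

```typescript
interface PageView { loadTimeSec: number; bounced: boolean; }

// Non-bounce rate per load-time bucket (0.5 s buckets, like the slide).
function nonBounceByBucket(
  views: PageView[],
  bucketSec = 0.5,
): Map<number, number> {
  const totals = new Map<number, { views: number; stays: number }>();
  for (const v of views) {
    const bucket = Math.floor(v.loadTimeSec / bucketSec) * bucketSec;
    const t = totals.get(bucket) ?? { views: 0, stays: 0 };
    t.views += 1;
    if (!v.bounced) t.stays += 1;
    totals.set(bucket, t);
  }
  return new Map(
    [...totals].map(([bucket, t]) => [bucket, t.stays / t.views]),
  );
}

// "LD50": the first bucket where fewer than half of visitors stay.
function ld50(rates: Map<number, number>): number | undefined {
  const buckets = [...rates.keys()].sort((a, b) => a - b);
  return buckets.find((b) => (rates.get(b) ?? 1) < 0.5);
}
```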
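
Slide 95's revenue-at-risk report boils down to funnel arithmetic: walk the funnel once at the optimal percentile (P25) and once at the actual one (P85), and compare the survivors. The figures below are taken from the slide:

```typescript
interface FunnelStep { bouncerate: number; availability: number; }

// Each step keeps (1 - bouncerate) * availability of its visitors.
function survivors(visits: number, steps: FunnelStep[]): number {
  return Math.floor(
    steps.reduce((v, s) => v * (1 - s.bouncerate) * s.availability, visits),
  );
}

// Bounce rates and availability per page, as listed on slide 95.
const optimalP25: FunnelStep[] = [
  { bouncerate: 0.49, availability: 1.0 },
  { bouncerate: 0.51, availability: 1.0 },
  { bouncerate: 0.60, availability: 1.0 },
  { bouncerate: 0.25, availability: 0.85 },
];
const actualP85: FunnelStep[] = [
  { bouncerate: 0.59, availability: 1.0 },
  { bouncerate: 0.54, availability: 1.0 },
  { bouncerate: 0.60, availability: 1.0 },
  { bouncerate: 0.53, availability: 0.85 },
];

const visits = 100_000;
console.log(survivors(visits, optimalP25)); // 6372 potential conversions
console.log(survivors(visits, actualP85)); // 3013 actual conversions
```

The gap between the two runs (here roughly 3,359 conversions) is the volume the report frames as revenue at risk from slower pages.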
