MeasureWorks - StartupBootCamp Amsterdam - Outrun your competition

Nobody likes waiting… especially (spoiled) online consumers, with all that content and competition just one click away. In this session Jeroen Tjepkema, founder of MeasureWorks, will explain the delicate balance between performance, usability and conversion. Based on real-world examples and user panel research, this session will provide actionable insights into what's needed to create a user experience that exceeds expectations. You'll learn how the mind of your user works, what keeps them clicking, which metrics to collect, and how to measure performance and user experience.


MeasureWorks - StartupBootCamp Amsterdam - Outrun your competition

  1. 1. @jeroentjepkema Founder & CEO
  2. 2. Slides: bit.ly/MW-SBCmnfc13
  3. 3. Content vs. Experience
  4. 4. How we perceive experience....
  5. 5. How we perceive experience.... 15%-20% worse than in reality
  6. 6. How we perceive experience.... 15%-20% worse than in reality Task completion has positive impact
  7. 7. How we perceive experience.... 15%-20% worse than in reality Task completion has positive impact Slow downs have more impact
  8. 8. How we perceive experience.... 15%-20% worse than in reality Task completion has positive impact Slow downs have more impact Always compared to past experiences
  9. 9. A simple online business model: Marketing (new visitors: search, tweets, mentions, ads seen) feeds Conversion Optimization (conversion rate, bounce rate, pages per visit, time on site), which drives Growth or Loss; Revenue = number of visits x conversion rate x order value
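The revenue arithmetic in this model is worth making explicit. A minimal sketch, with purely illustrative numbers (none of them come from the deck):

```typescript
// Illustrative numbers only: revenue in this model is
// visits x conversion rate x average order value.
const visits = 10_000;          // monthly visits
const conversionRate = 0.02;    // 2% of visits convert
const averageOrderValue = 45;   // euros per order

const orders = visits * conversionRate;       // 200 orders
const revenue = orders * averageOrderValue;   // 9,000 euros

console.log(`Orders: ${orders}, revenue: ${revenue} euros`);
```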
  10. 10. Consumers visit 18-23 websites before they finally convert Source: TNS/Nipo
  11. 11. Performance influence cycle, step 1: search/orientation phase (perception of experience). http://twinkle100.measureworks.nl
  12. 12. Performance influence cycle, all five steps: 1. search/orientation phase, 2. your website, 3. delivered experience, 4. stimulate content/conversion, 5. perception of experience. http://twinkle100.measureworks.nl
  13. 13. Fast = Engagement @jeroentjepkema, MeasureWorks
  14. 14. 0 sec. Source: Jakob Nielsen
  15. 15. 0-0.3 sec: Instantaneous, I like it! 1 sec: Interaction, let's converse... Source: Jakob Nielsen
  16. 16. 0-0.3 sec: Instantaneous, I like it! 1 sec: Interaction, let's converse... 3 sec: Mmm, shall I click away? Source: Jakob Nielsen
  17. 17. 0-0.3 sec: Instantaneous, I like it! 1 sec: Interaction, let's converse... 3 sec: Mmm, shall I click away? 10 sec: Only if the task/content is relevant. Source: Jakob Nielsen
  18. 18. 0:00 sec. Conversation starts
  19. 19. 0:00 sec. Conversation starts 2.5 sec. First reply
  20. 20. 0:00 sec. Conversation starts, 2.5 sec. First reply, 4.5 sec. Finish conversation
  21. 21. 0:00 sec. Conversation starts, 2.5 sec. First reply, 4.5 sec. Finish conversation; zones: comfort zone, tolerated, frustrated
  22. 22. The holy grail in experience is “flow state”
  23. 23. Flow is an “optimal experience” that is “intrinsically enjoyable”
  24. 24. Provide relevant content to support task completion
  25. 25. Deliver it fast, focus on perception
  26. 26. http://www.nytimes.com/2012/08/19/opinion/sunday/why-waiting-in-line-is-torture.html?pagewanted=all&_r=0
  27. 27. http://answers.yahoo.com/question/index?qid=1005081200005
  28. 28. http://amzn.to/1biSi6C
  29. 29. That’s not my website?
  30. 30. Over 60% of people who experienced a slow website ranked its design 2 points lower compared to a fast experience. (Chart: % of respondents who rated the actual website as slow, average or fast, plotted against design score, 1 = bad to 5 = beautiful.)
  31. 31. Mobile context?
  32. 32. Mobile context? Content: under time pressure
  33. 33. Mobile context? Content: under time pressure; Usability: (often) one handed
  34. 34. Mobile context? Content: under time pressure; Usability: (often) one handed; Speed: while on the move
  35. 35. Mobile context? Content: under time pressure; Usability: (often) one handed; Performance: while on the move
  36. 36. M-commerce?
  37. 37. Experiment 1: Mobile Shopping
  38. 38. Mobile shopping experience setup.... 100 mobile users Perform 2 tasks: ‣Buy a book ‣Buy a T-shirt Mobile browsers only via 3G, no WiFi
  39. 39. (Chart: average experience rating vs. average expectation rating, both on a 1-7 scale, for the tasks "find a website", "search for a book" and "order a book"; quadrant label: "Fix it fast".)
  40. 40. #1: Speed
  41. 41. Functional issues reported with Zalando, rounds 1 and 2 (tasks: 1. buy a T-shirt, 2. give feedback). Mobile-only research: task completion using only a smartphone, N = 100, users range from 20-65. (Charts: issues per shop (Zalando, HM, V&D, Tom Tailor), broken down into design, speed, mobile readiness and other.)
  42. 42. Nexus - Android - 3G Tested with webpagetest.org (3G traffic shaping, 2000Kbps Down, 1000 Kbps Up, 150ms latency)
  43. 43. Nexus - Android - 3G Tested with webpagetest.org (3G traffic shaping, 2000Kbps Down, 1000 Kbps Up, 150ms latency)
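To get a feel for why that 3G profile matters, here is a back-of-the-envelope estimate of the best-case load time it allows. The bandwidth and latency figures are the ones quoted above; the page weight and round-trip count are assumptions for illustration only:

```typescript
// Rough lower bound on load time under the 3G shaping used above
// (2000 Kbps down, 150 ms latency). pageWeightKB and roundTrips
// are illustrative assumptions, not measurements from these tests.
const bandwidthKbps = 2000;
const latencyMs = 150;
const pageWeightKB = 1500;   // assumed total page weight
const roundTrips = 20;       // assumed DNS/TCP/request round trips

const transferSec = (pageWeightKB * 8) / bandwidthKbps;  // ~6.0 s just moving bytes
const latencySec = (roundTrips * latencyMs) / 1000;      // ~3.0 s waiting on round trips

console.log(`Best case on this 3G profile: ~${(transferSec + latencySec).toFixed(1)} s`);
```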
  44. 44. #2: Usability
  45. 45. Mobile readiness: Bol.com, Selexyz, De Boekerij, ECI, Bruna, Cosmox
  46. 46. Task 1: buy a book. Mobile-only research: task completion using only a smartphone, N = 100, users range from 20-65. (Charts: 1. task results per shop (Bol.com, Selexyz, Bruna, other); 2. feedback on bol.com (design, speed, mobile readiness, other); 3. feedback from different stores.)
  47. 47. Mobile Readiness
  48. 48. - One touch? +
  49. 49. Login or not?
  50. 50. Mobile Search
  51. 51. Experiment 3: Mobile Banking
  52. 52. The mobile user experience… 50 users of ING, ABN and Rabobank. Tasks: rate the user experience; select home insurance.
  53. 53. There’s an app for that?!
  54. 54. Mobile satisfaction compared to desktop. (Chart: satisfaction per aspect: design, content, usability, speed; values of 80%, 74%, 77% and 23%.) Mobile banking research: focus on task completion, N = 50, users range from 20-65.
  55. 55. Mobile satisfaction compared to desktop, plus preference per type of task. (Charts: satisfaction per aspect as above; mobile task preference for check balance, transfer, 3rd-party payments and view history, with values 77%, 72%, 38%, 34%.) Mobile banking research: focus on task completion, N = 50, users range from 20-65.
  56. 56. Mobile satisfaction compared to desktop, preference per type of task, and mobile versus desktop usage frequency. (Charts: as above, plus change in behavior by frequency: daily, weekly, 2-weekly, monthly.) Mobile banking research: focus on task completion, N = 50, users range from 20-65.
  57. 57. Mobile websites?
  58. 58. Task satisfaction, desktop vs. mobile. (Chart: satisfaction with design, content, usability and speed; desktop 81%, 71%, 56%, 62% versus mobile 34%, 42%, 37%, 22%.) Mobile banking research: focus on task completion, N = 50, users range from 20-65.
  59. 59. What impacts mobile (shopping) experience?
  60. 60. What impacts mobile (shopping) experience? Day to day usage Screen size vs. Content Performance Mobile browsers
  61. 61. Outrun your competitors
  62. 62. Optimize all your touch points
  63. 63. Deliver the same experience to all your (mobile) users: regular website, mobile web, native application
  64. 64. Mobile first www.lukew.com
  65. 65. Mobile first: prioritize your business goals, deliver content accordingly. www.lukew.com
  66. 66. Design for (mobile) use cases
  67. 67. Focus on task completion
  68. 68. With mobile conditions in mind
  69. 69. Optimized for the right screen
  70. 70. Native app Mobile Web - Portrait Mobile Web - Landscape
  71. 71. (Touch) Screen
  72. 72. One device, one hand: 49% http://www.uxmatters.com/mt/archives/2013/02/how-do-users-really-hold-mobiledevices.php?goback=.gmp_72842.gde_72842_member_215909354
  73. 73. Golden touch http://www.uxmatters.com/mt/archives/2013/02/how-do-users-really-hold-mobiledevices.php?goback=.gmp_72842.gde_72842_member_215909354
  74. 74. Make it (feel) fast!
  75. 75. It only takes a second... (Chart: abandonment rate (%) by page load time band, 0-15 sec., across 200+ sites / 177+ million page views over 2 weeks, all browsers. Source: Gomez real user monitoring.)
  76. 76. ...to click away. (Chart: the same abandonment data, all browsers vs. iPhone Safari. Source: Gomez real user monitoring.)
  77. 77. Performance Analytics
  78. 78. A simple online business model: Marketing (new visitors: search, tweets, mentions, ads seen) feeds Conversion Optimization (conversion rate, bounce rate, pages per visit, time on site), which drives Growth or Loss; Revenue = number of visits x conversion rate x order value
  79. 79. Why did customers drop off? ‣ Price ‣ Functional errors? ‣ Performance issues?
  80. 80. Why did customers drop off? ‣ Price ‣ Functional errors? ‣ Performance issues? What’s the business impact? ‣ Lost customers? ‣ Revenue risked? ‣ In Euros?
  81. 81. We need context!
  82. 82. Complete Web Monitoring
  83. 83. Web Analytics Usability Performance (what did they do on the site?) (how did they interact with it?) (could they do what they wanted to?) Complete Web Monitoring VoC Social Media Competition (what were their motivations?) (what were they saying?) (what are they up to?)
  84. 84. “Hard” data Web Analytics Usability Performance (what did they do on the site?) (how did they interact with it?) (could they do what they wanted to?) Complete Web Monitoring VoC Social Media Competition (what were their motivations?) (what were they saying?) (what are they up to?) “Soft” data
  85. 85. “Hard” data Web Analytics Usability Performance (what did they do on the site?) (how did they interact with it?) (could they do what they wanted to?) Complete Web Monitoring VoC Social Media Competition (what were their motivations?) (what were they saying?) (what are they up to?) “Soft” data
  86. 86. #1. Collecting Performance Data
  87. 87. This is what you control: the (cloud) data center with its storage, DB servers, web servers, network, mainframe, middleware servers, app servers and load balancers. What you're blamed for: everything between your data center and your customers and internal users on the internet, such as third-party/cloud services, major and local ISPs, content delivery networks and mobile carriers.
  88. 88. Measuring performance?
  89. 89. Measuring performance? Outside-in, from the browser perspective...
  90. 90. Performance Measurement Toolkit
  91. 91. Benchmark & Optimization
  92. 92. Webpagetest.org
  93. 93. Webpagetest.org: object-level detail, optimization tips
  94. 94. Webpagetest.org
  95. 95. Ultimately, Real User benchmarking gives you periodic insight into real usage scenarios...
  96. 96. Uptime Monitoring
  97. 97. Simulate business transactions
  98. 98. Via multiple devices & browsers
  99. 99. Used for error detection & Root Cause Analysis
  100. 100. Used for error detection & Root Cause Analysis
  101. 101. Ultimately, synthetic monitoring shows you if your site’s working or not...
  102. 102. Real User Monitoring
  103. 103. But, synthetic isn’t enough...
  104. 104. But, synthetic isn't enough... (Chart: synthetic heartbeat vs. real users.)
  105. 105. How a browser RUM tag (tag.js) works: 1. Insert the tag (.js file) into your (mobile) web pages; 2. Pages are requested from the browser/device; 3. As pages execute, the tag collects performance metrics in the browser; 4. After onload, the tag sends a detailed report for further analysis.
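A minimal sketch of what such a tag could look like in the browser, using the Navigation Timing API mentioned on the next slide. The collector endpoint (/rum/collect) is hypothetical, and a real RUM tag (Soasta, New Relic, etc.) does considerably more (sampling, SPA support, custom timers):

```typescript
// Minimal browser RUM sketch: collect Navigation Timing metrics
// after onload and beacon them to a (hypothetical) collector.
window.addEventListener("load", () => {
  // Wait a tick so loadEventEnd has been populated.
  setTimeout(() => {
    const t = performance.timing; // Navigation Timing (Level 1)
    const report = {
      page: location.pathname,
      dns: t.domainLookupEnd - t.domainLookupStart,
      connect: t.connectEnd - t.connectStart,
      ttfb: t.responseStart - t.navigationStart,
      domComplete: t.domComplete - t.navigationStart,
      onload: t.loadEventEnd - t.navigationStart,
    };
    // Step 4: send the detailed report for further analysis.
    navigator.sendBeacon("/rum/collect", JSON.stringify(report));
  }, 0);
});
```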
  106. 106. Google Analytics: relies on the Navigation Timing API; custom variables can be added.
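If Google Analytics is the tag in use, the same Navigation Timing data can be reported as a user timing. A small sketch, assuming the standard analytics.js ga() snippet is already loaded on the page:

```typescript
// Assumes the standard analytics.js ga() snippet is already on the page.
declare const ga: (...args: unknown[]) => void;

window.addEventListener("load", () => {
  // Wait a tick so loadEventEnd has been filled in.
  setTimeout(() => {
    const t = performance.timing;
    const onloadMs = t.loadEventEnd - t.navigationStart;
    // ga('send', 'timing', category, variable, value in ms, label)
    ga("send", "timing", "Performance", "onload", onloadMs, location.pathname);
  }, 0);
});
```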
  107. 107. Soasta
  108. 108. Ultimately, Real User Monitoring shows you how many users are affected by bad performance...
  109. 109. Performance surveys
  110. 110. Expected Experience vs. Delivered Experience
  111. 111. (Chart: average experience rating vs. average expectation rating, both on a 1-7 scale, for the tasks "find a website", "search for a book" and "order a book"; quadrant label: "Fix it fast".)
  112. 112. Where UX meets Performance: Study on Car Configurators (Audi vs. BMW) http://bit.ly/MW-CC13 password: mob1l32013
  113. 113. Ultimately, performance surveys give you periodic insight into task completion issues... (20 people are enough to find 80% of common issues)
  114. 114. Summarized:
  115. 115. Performance Measurement toolkit, summarized:
     Uptime Monitoring: benefits are a heartbeat that runs without traffic, testing specific customer journeys, and detailed alerts including root cause analysis. Desktop/mobile site: Pingdom, Watchmouse, Alertsite. Mobile apps: Gomez, Keynote.
     Real User Monitoring: benefits are real usage information from all users, trending/optimization, and business impact. Desktop/mobile site: Soasta, New Relic, Google Analytics. Mobile apps: Gomez, New Relic, Localytics, Google Analytics.
     Benchmark & Optimize: benefits are object-level detail, testing of user scenarios with real devices/bandwidth, optimization details, and a competitive scan. Desktop/mobile site: Webpagetest, Browserstack. Mobile apps: Perfecto Mobile, Device Anywhere, Soasta.
     Performance surveys: benefits are soft data feedback, abandonment optimization, testing before go-live, and real users. Desktop/mobile site: Wufoo, Usabilla, Loop11, Crazyegg. Mobile apps: Reflector, UX recorder, Magitest.
  116. 116. #2. Performance Budget
  117. 117. What is your baseline:
  118. 118. What is your baseline: A pre-defined set of metrics
  119. 119. What is your baseline: A pre-defined set of metrics that describes normal behavior
  120. 120. What is your baseline: A pre-defined set of metrics that describes normal behavior in order to detect variances
  121. 121. What is your baseline: A pre-defined set of metrics that describes normal behavior in order to detect variances and to be comparable within historic context
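A baseline in that sense can be as simple as a mean and standard deviation over historical measurements, with anything far outside flagged as a variance. A minimal sketch; the two-standard-deviation threshold is an illustrative choice, not a rule from the deck:

```typescript
// Describe "normal" load time from historical samples and flag
// measurements that deviate from it.
function baseline(samples: number[]) {
  const mean = samples.reduce((a, b) => a + b, 0) / samples.length;
  const variance =
    samples.reduce((a, b) => a + (b - mean) ** 2, 0) / samples.length;
  return { mean, stdDev: Math.sqrt(variance) };
}

function isVariance(value: number, b: { mean: number; stdDev: number }): boolean {
  // Illustrative threshold: more than 2 standard deviations from normal.
  return Math.abs(value - b.mean) > 2 * b.stdDev;
}

// Last week's page load times (seconds) vs. today's measurement.
const history = [2.1, 2.3, 2.0, 2.4, 2.2, 2.5, 2.3];
const b = baseline(history);
console.log(isVariance(3.9, b)); // true: slower than normal behavior
```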
  122. 122. Then define service levels & thresholds...
  123. 123. Example service level: the customer journey "purchasing a book" must be completed, with every page loading under 4 sec. (metric: speed, target in seconds), using IE8 and higher (user scenario), from any location in the Netherlands (user locations), for 95% of all users (percentile), every day between 6am and 12pm (window), measured with Real User Monitoring (collection type). Source: Metrics 101, Velocityconf 2010
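Written down as data, such a service level becomes something a monitoring pipeline can check automatically. The object shape below is only an illustration of "define service levels & thresholds", not any particular tool's configuration format:

```typescript
// The service level from the slide, expressed as a checkable budget.
const bookPurchaseBudget = {
  customerJourney: "Purchasing a book",
  metric: "page load time",
  target: { maxSeconds: 4, perPage: true },
  userScenario: "IE8 and higher",
  userLocations: "any location in the Netherlands",
  percentile: 0.95,                           // must hold for 95% of all users
  window: "every day between 6am and 12pm",
  collectionType: "Real User Monitoring",
};

// A check a RUM pipeline could run against collected page timings:
function withinBudget(p95LoadSeconds: number): boolean {
  return p95LoadSeconds <= bookPurchaseBudget.target.maxSeconds;
}

console.log(withinBudget(3.6)); // true
console.log(withinBudget(4.8)); // false: budget exceeded
```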
  124. 124. #3. Performance Analysis
  125. 125. Service levels & KPIs: per URL (e.g. www.homepage.nl, /autoverzekeren, /zoeken, /contact), track availability, load time (document complete) and load time (fully loaded) week by week, and compare against the competition. Example reading: #6 is your ranking against competitors, 3.4 sec is the average performance of your competitors, and 40% of your competitors are faster than you.
  126. 126. Competitive analysis, weeks 33-40 (chart: URLs, 3rd-party URLs, speed in seconds and page size in KB over time):
     Your site: performance (FV) 3.3, 2.4, 2.2, 2.8, 3.3, 2.9, 3.1, 3.2 sec; URLs 67, 68, 65, 69, 80, 72, 69, 73; 3rd-party URLs 2, 2, 2, 2, 5, 3, 3, 3; page size 2000, 1800, 1850, 2100, 2400, 2500, 2500, 2450 KB.
     Your branche: performance (FV) 2.3, 2.0, 2.5, 2.8, 2.3, 2.4, 2.2, 2.5 sec; URLs 55, 59, 56, 62, 59, 60, 59, 61; 3rd-party URLs 2, 2, 2, 2, 3, 3, 3, 3; page size 1900, 1800, 1850, 1900, 2050, 1900, 1890, 1900 KB.
     Competitive ranking: #30-35, #15-20, #10-15, #15-20, #50-55, #20-25, #30-35, #30-35.
  127. 127. Performance variation (chart: load-time spread per travel site: Sunweb, D-reizen, Arke, Globe, Neckermann; values ranging from roughly 1.9 to 9.3 seconds).
  128. 128. Performance vs. Sentiment
  129. 129. Capacity management (chart: maximum number of pageviews per hour, May 2009 to December 2011, showing traffic realized, traffic forecast, max capacity and safety capacity; a Christmas/end-of-year peak approaching 1,200,000 pageviews per hour).
  130. 130. Revenue risked: 10,000 visitors through a four-page funnel, optimal vs. actual flow.
     Optimal flow: Page 1: 5.5 s, 100% availability, 70% bounce rate, 3,000 visitors left; Page 2: 13.0 s, 100%, 75% bounce, 750; Page 3: 2.0 s, 100%, 60% bounce, 300; Page 4: 3.5 s, 88.5% availability, 5% bounce, 252 conversions.
     Actual flow: Page 1: 8.0 s, 100% availability, 75% bounce rate, 2,500 visitors left; Page 2: 15.5 s, 100%, 80% bounce, 500; Page 3: 2.0 s, 100%, 60% bounce, 200; Page 4: 8.0 s, 88.5% availability, 5% bounce, 168 conversions.
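The funnel arithmetic behind that table is straightforward: each page keeps (1 - bounce rate) of its visitors, reduced further by availability. A sketch that reproduces the slide's numbers; the order value used to express the loss in euros is an assumption:

```typescript
// Visitors surviving each page = previous x (1 - bounce rate) x availability.
type Page = { bounceRate: number; availability: number };

function conversions(visitors: number, pages: Page[]): number {
  return Math.round(
    pages.reduce((v, p) => v * (1 - p.bounceRate) * p.availability, visitors)
  );
}

const optimal: Page[] = [
  { bounceRate: 0.70, availability: 1.0 },
  { bounceRate: 0.75, availability: 1.0 },
  { bounceRate: 0.60, availability: 1.0 },
  { bounceRate: 0.05, availability: 0.885 },
];
const actual: Page[] = [
  { bounceRate: 0.75, availability: 1.0 },
  { bounceRate: 0.80, availability: 1.0 },
  { bounceRate: 0.60, availability: 1.0 },
  { bounceRate: 0.05, availability: 0.885 },
];

const lost = conversions(10_000, optimal) - conversions(10_000, actual); // 252 - 168 = 84
const assumedOrderValue = 50; // euros, illustrative
console.log(`Revenue risked: ~${lost * assumedOrderValue} euros`);
```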
  131. 131. If anything else...
  132. 132. Build with the end-user in mind
  133. 133. Focus on a fast experience …without it, NO time for content
  134. 134. Deliver before the lights turn green!
  135. 135. Reading material: www.stevesouders.com www.leananalytics.com www.lukew.com
  136. 136. Thanks! More questions? M: jtjepkema@measureworks.nl T: @jeroentjepkema W: www.measureworks.nl
