Performance Impact
How Web speed affects online business KPIs
Today’s Hosts
Hooman Beheshti, VP Product, Strangeloop
Alistair Croll, Analyst, Bitcurrent; author of O’Reilly’s Complete Web Monitoring
10 s · 1 s · 100 ms · 10 ms · Zzz!
http://www.flickr.com/photos/spunter/393793587
http://www.flickr.com/photos/laurenclose/2217307446
Everything is interwoven.
We’re getting better
Impact of page load time on average daily searches per user
Impact of additional delay on business metrics
Shopzilla had another angle
Big, high-traffic site
100M impressions a day
8,000 searches a second
20-29M unique visitors a month
100M products
16 month re-engineering
Page load time from 6 seconds to 1.2 seconds
Uptime from 99.65% to 99.97%
10% of previous hardware needs
http://en.oreilly.com/velocity2009/public/schedule/detail/7709
5-12% increase in revenue
Transactional: Buy something (Amazon)
SaaS: Use an app (Salesforce)
Media: Click an ad (Google News)
Collaborative: Create content (Wikipedia)
Tying web latency to business outcomes
KPIs
http://www.flickr.com/photos/mrmoorey/160654236
Attention: new visitors, searches, tweets, mentions, ads seen
Engagement: time on site, pages per visit, number of visits, bounce rate
Conversion: conversion rate × order value, growth, loss
It’s time for an experiment
Visitor → Strangeloop → Web server: Strangeloop decides whether to optimize each visitor, serves either accelerated or unaccelerated (normal) content, inserts a segment marker, processes scripts, and the page the visitor receives sends segment data to Google Analytics.
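The split-and-tag flow above can be sketched in a few lines. This is a minimal illustration, not Strangeloop's actual implementation; the function names, the 50/50 random split, and the `perfSegment` marker variable are all hypothetical.

```python
import random

def optimize(html: str) -> str:
    # Placeholder for real front-end optimizations
    # (minification, image compression, script reordering, ...).
    return html

def handle_request(page_html: str) -> tuple[str, str]:
    """Randomly assign a visitor to the accelerated or unaccelerated
    segment, tag the page so analytics can tell the groups apart,
    and apply optimizations only for the accelerated group."""
    segment = "accelerated" if random.random() < 0.5 else "unaccelerated"
    if segment == "accelerated":
        page_html = optimize(page_html)
    # Insert a segment marker that the analytics script can report,
    # e.g. as a custom segment in Google Analytics.
    marker = f"<script>var perfSegment = '{segment}';</script>"
    return page_html.replace("</head>", marker + "</head>"), segment
```

The key design point is that both groups run the same analytics script; only the marker differs, so any difference in captured sessions reflects the optimization itself.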
What we learned
Traffic levels
Bounce rate
% New visitors
Average time on site
Pages per visit
Conversion rate & order value
Justifying an investment in performance
ROI (days) = Cost of performance enhancement / (Current daily orders × (Increased conversions + Increased order value))
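The payback formula above can be checked with a quick sketch. The numbers in the example are purely illustrative (they are not from the talk), and the formula reads each lift as extra revenue per current daily order.

```python
def payback_days(cost: float, daily_orders: float,
                 conversion_lift: float, order_value_lift: float) -> float:
    """Days until a performance investment pays for itself:
    ROI (days) = cost / (daily orders * (increased conversions
                                         + increased order value)).
    """
    extra_daily_revenue = daily_orders * (conversion_lift + order_value_lift)
    return cost / extra_daily_revenue

# Illustrative numbers: a $50,000 project, 1,000 orders a day,
# $2 extra per order from better conversions and $3 from higher
# order values.
print(payback_days(50_000, 1_000, 2.0, 3.0))  # 10.0 days
```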

Impact of web latency on conversion rates

Editor's Notes

  • #4 Once upon a time, performance was a dark art. We struggled to deliver “good enough” without really knowing why.
  • #5 We managed by anecdote. We were sure faster was better, but we couldn’t tie it to specific business outcomes.
  • #6 The notion that speed is good for users isn’t new. The concept of “Flow” – a state of heightened engagement that we experience when we’re truly focused on something – was first proposed by Mihaly Csikszentmihalyi.
  • #7 It turns out that attention and engagement drop off predictably. At ten milliseconds, we perceive a response as instantaneous – think clicking a button and seeing it change color. At 100 milliseconds, we can have a conversation with someone without noticing the delay (remember old transatlantic calls?). At a second, we’re still engaged, but aware of the delay. At ten seconds, we get bored and tune out, because other things come into our minds.
  • #8 How much was fast enough? It was anybody’s guess.
  • #9 And guess they did. This is Zona’s formula for patience, the basis for the “eight second rule.” Unfortunately, things like tenacity, importance, and natural patience aren’t concrete enough for the no-nonsense folks that run web applications.
  • #10 IT operators and marketers are completely different people. What convinces an IT person to fix performance doesn’t convince a marketer. They want to know how it will impact the business fundamentals.
  • #11 By now, we know that everything matters. Usability, page latency, visitor mindset, and even sentiment on social media platforms all contribute to the business results you get from a site.
  • #12 Fortunately, we’re getting better at linking performance to business outcomes.
  • #13 One example of this is performance experimentation that Google’s done. Google’s a perfect lab. Not only do they have a lot of traffic, they also have computing resources to do back-end analysis of large data sets. Plus, they’re not afraid of experimentation – in fact, they insist on it. So they tried different levels of performance and watched what happened to visitors.
  • #14 The results, which they presented at Velocity in May, were fascinating. There was a direct relationship between delay and the number of searches a user did each day – and to make matters worse, the numbers often didn’t improve even when the delay was removed. A 0.7% drop may not sound significant, but for Google it represents a tremendous amount of revenue.
  • #15 Microsoft’s Bing site is a good lab, too. They looked at key metrics, or KPIs, of their search site.
  • #16 They showed that as performance got worse, all key metrics did, too. Not just the number of searches, but also the revenue (earned when someone clicks) and refinement of searches.
  • #17 Shopzilla overhauled their entire site, dramatically reducing page load time, hardware requirements, and downtime.
  • #18 They saw a significant increase in revenues.
  • #19 The site improvement increased the number of Google clicks that turned into actual visits.
  • #20 It also affected search engine scores. By improving load time, search engines (in this case Google UK) “learned” that this was a good destination. That’s right – Google actually penalizes sites that are slow by giving them a lower page ranking.
  • #21 While this shows us metrics for large sites focused on sales and ad clicks, it doesn’t tell us about fundamentals. There are four fundamental site models, each of which has different business goals. An e-commerce site focused on transactions wants to convert visitors to buyers. A SaaS site wants to make subscribers renew. A media site wants to serve relevant ads and maximize searches or views. And so on.
  • #22 If we want to convince marketing, we need to measure business metrics.
  • #23 By tying performance and availability to Key Performance Indicators – KPIs – business and operations can finally have a conversation.
  • #24 Whether those KPIs are shopping cart abandonment
  • #25 Or visitor “bounce rate” (the number of visitors that leave immediately)
  • #26 Or just traffic.
  • #27 So what KPIs would we like to learn about? This is what web analytics folks work by, whether they’re running a media site, a SaaS platform, a transactional application, or a collaborative social network. It’s what the business cares about.
  • #28 Strangeloop agreed to set up an experiment using their technology which would help measure this.
  • #31 First, traffic. Although visitors were split evenly between optimized and unoptimized groups, the analytics captured many more optimized sessions. This may be because slower-loading pages failed to execute the analytics script, or because visitors abandoned the visit before the page had time to load.
  • #32 Unoptimized visitors are roughly 1% more likely to leave the site immediately, without proceeding to other pages.
  • #33 Strangely, the unoptimized group included a higher proportion of new visitors than the optimized one. This seems counter-intuitive and warrants further study.
  • #34 Optimized visitors spent more time on the site.
  • #35 And looked at more pages during their visit – if you’re a media property, this means more impressions for your advertisers.
  • #36 On a second e-commerce site running roughly the same experiment, conversions were 16% higher and orders were 5.5% higher.