Web Performance - A Whistlestop Tour


Slides from web performance talks at #DigiTalksChelt, Jan 2012

Notes, references and further reading in speakers notes

Statistics

Views

Total Views
7,194
Views on SlideShare
4,597
Embed Views
2,597

Actions

Likes
5
Downloads
79
Comments
0

15 Embeds 2,597

http://andydavies.me 1250
http://www.andysnotebook.com 1040
http://feeds.feedburner.com 75
http://127.0.0.1 65
http://andysnotebook.typepad.com 61
http://andysnotebook.com 51
http://lanyrd.com 27
http://a0.twimg.com 11
https://twitter.com 7
http://www.onlydoo.com 2
http://us-w1.rockmelt.com 2
http://cloud.feedly.com 2
https://www.linkedin.com 2
http://reader.aol.com 1
http://feedproxy.google.com 1

Upload Details

Uploaded as Apple Keynote

Usage Rights

CC Attribution License

  • Andy Davies

    Specialise in making web sites faster, more reliable, and helping them scale as they grow.

    Going to be talking about the first of these, web performance:

    - Why speed matters
    - How page load times can be measured
    - Some ways of improving performance

    Slides complete with presenter notes and further reading / references will go on SlideShare in the next day or so.

    If anyone has any questions or comments, feel free to send me a tweet...
  • First view 7.066s, repeat view 3.9s, start rendering 2.96s.
    Aptimize 2010 Web Performance Benchmark of Fortune 500 websites
    (http://www.aptimize.com/Upload/docs/2010-Website-Performance-Benchmarks.pdf)

    Do people think 7s is good, bad, or don't they know?

    Strangeloop Networks research into load times of Alexa Top 2000 ecommerce sites:
    - Average page load time 11.21s
    - Best 2.2s, worst 40.2s!
    - Only 13 loaded in less than 5s
    (http://www.strangeloopnetworks.com/resources/research/report-state-of-the-union-for-page-speed-and-website-performance/)

    Robert Miller's research from 1968 (often attributed to Jakob Nielsen):
    - 0.1s - feels instant
    - 1s - keeps the flow
    - over 10s - breaks the flow

    Our perception of load time is off:
    - Our expectation is faster
    - We perceive load time as 15% slower than reality
    - We recount the experience as 35% slower
    Stoyan Stefanov - Psychology of Performance
    (http://www.slideshare.net/stoyan/psychology-of-performance)
  • 57% of online consumers will abandon a site after waiting 3 seconds for a page to load.
    80% will not return.
    About half will tell others of their negative experience.
    (http://www.strangeloopnetworks.com/resources/infographics/web-performance-and-user-expectations/website-abandonment-happens-after-3-seconds/)

    74% of mobile users expect a site to load in 5s.
    (http://www.strangeloopnetworks.com/resources/infographics/mobile-infographics/mobile-load-time-vs-user-expectations/)
  • Google - 400ms delay - searches down 0.6%

    Bing - 1s delay - revenue per user down 2.8%

    Amazon - 100ms delay - sales down 1%
  • Shopzilla reduced load time by 5 secs = 12% increase in sales

    Mozilla reduced load time by 2.2 secs = 60 million extra downloads / year

    Yahoo traffic went up 9% for every 400ms improvement

    AOL users in the top 10% of load times viewed more pages than those in the lowest 10% (7.5 vs 5)

    Page load time is used as a ranking factor by Google.

    Can look at how speed affects conversions by using browser version as a proxy for performance - newer browsers tend to be faster.
  • Shopzilla: 5s reduction in load time, cut hardware costs in half

    Netflix: reduced outbound bandwidth by 43%

    Kerboodle: reduced hosting costs by over €30,000 per year
  • Waterfalls show a timeline of what the browser loaded, when it started, and how long it took.

    Built into some browsers:
    - WebKit Inspector (Safari / Chrome)
    - IE9 developer tools

    Add-ons for others:
    - Firebug

    Third-party tools for older IE versions:
    - Dynatrace AJAX Edition

    Hosted services:
    - webpagetest.org
    - pingdom.com

    It's a sample of one, and we are not our users...
  • Actual page load times from a real site; note the huge peak of over 10 seconds.

    Many external factors can affect page load time:

    - Browser
    - How they are connected: ADSL / mobile / public WiFi
    - Bandwidth
    - Latency
    - Anti-virus
    - Network kit
    - etc.

    These factors are beyond our control, but to get a true picture of page load times we must measure them in the visitor's browser.
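As a sketch of how in-browser measurement works: browsers that implement the W3C Navigation Timing API expose millisecond timestamps for each phase of the page load, which can be turned into metrics. The `loadMetrics` helper below is my own illustration, not part of any library; the field names come from the spec.

```javascript
// Derive basic load metrics from a W3C Navigation Timing object -
// window.performance.timing in supporting browsers (Chrome, Firefox 7+, IE9+).
// The helper is a hypothetical example; the timing field names are real.
function loadMetrics(t) {
  return {
    latency: t.responseStart - t.requestStart,   // request sent -> first byte back
    backEnd: t.responseEnd - t.navigationStart,  // time until the HTML had arrived
    frontEnd: t.loadEventStart - t.responseEnd,  // rendering, scripts, sub-resources
    total: t.loadEventStart - t.navigationStart  // overall page load time
  };
}

// In a browser: loadMetrics(window.performance.timing)
```

Real User Monitoring tools essentially do this at scale, beaconing the numbers back to a collection server.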
  • Page load time data collected by Google Analytics (by default since 16th Nov 2011).

    To increase the sample rate from 1% to 10%, add the following to the GA snippet:

    _gaq.push(['_setSiteSpeedSampleRate', 10]);

    Data comes from Google Toolbar users and modern browsers that support the W3C Navigation Timing API, i.e. Chrome, Firefox 7+, IE9+.

    Good description of how it works on Stack Overflow: http://stackoverflow.com/questions/6166074/how-does-gaq-push-trackpageloadtime-work

    Google Webmaster Tools also has some load time information...

    Google Toolbar can report page load times back to Google (the user has to enable it).

    May be useful for some historic data, but be very careful about reading too much into it.

    Sample rates can be low, and samples skewed - e.g. there is no toolbar for Chrome - and you can't match up performance data against individual pages and browsers.
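For reference, the command-queue set-up with the raised sample rate looks like this. 'UA-XXXXX-Y' is a placeholder property ID, and the standard asynchronous ga.js loader part of the snippet is unchanged and omitted here:

```javascript
// Asynchronous Google Analytics command queue with the site speed
// sample rate raised from the default 1% to 10%.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-Y']);        // placeholder property ID
_gaq.push(['_setSiteSpeedSampleRate', 10]);      // must be queued before _trackPageview
_gaq.push(['_trackPageview']);
```

The sample rate call has to come before `_trackPageview`, otherwise the default 1% rate is used for that page view.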
  • Some other Real User Monitoring tools:

    Commercial
    - New Relic (http://newrelic.com/)
    - WebTuna (http://www.webtuna.com)
    - Pion (http://www.atomiclabs.com/)
    - Gomez (http://www.compuware.com/application-performance-management/why-gomez.html)
    - Tealeaf (http://www.tealeaf.com/)
    - Oracle RUEI (http://www.oracle.com/technetwork/oem/uxinsight/index.html)

    DIY
    - Boomerang.js (https://github.com/yahoo/boomerang)
    - Episodes (http://stevesouders.com/episodes2/)
    - Jiffy (http://code.google.com/p/jiffy-web/wiki/Jiffy_js)
    - Pion (http://www.pion.org/) - open source version
  • Front-end: everything except the time the server takes to generate / serve the HTML page.

    In 2007, Steve Souders came up with the 80% figure while part of the Yahoo Exceptional Performance team.
    (http://developer.yahoo.com/blogs/ydn/posts/2007/03/high_performanc/)

    Joshua Bixby re-examined the claim in 2011 and concluded that it was around 97% on mobile devices.
    (http://www.webperformancetoday.com/2011/04/20/desktop-vs-mobile-web-page-load-speed/)

    Pages are getting larger and more complex:

    - 1995: 14.1k / 2.3 objects
    - 2010: 498k / 75 objects
    - 2012 (est.): 684k / 83 objects

    (http://www.strangeloopnetworks.com/resources/infographics/web-performance-and-ecommerce/web-pages-keep-getting-bigger/)
  • Some server performance issues can be improved by throwing hardware at them, e.g. faster processors, more RAM (databases love RAM), faster disks.

    But you should still look at optimising the back-end, as it's an important part of the picture, e.g. DB tuning, architecture, reverse proxies etc.

    Even without tuning, the server can still improve page load times by "flushing early".
  • Mike Belshe - "More Bandwidth Doesn't Matter (much)"
    (http://www.chromium.org/spdy/More_Bandwidth_Doesn_t_Matter_2_%282%29.pdf)

    - Bandwidth: throughput, how much can be downloaded in a given time
    - Latency: time between making the request and receiving the response
    - TCP slow-start: new connections take time to work up to 'max' throughput

    Connections aren't symmetric either, e.g. ADSL has faster download than upload, so requests are much slower to send (but fortunately generally smaller).
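A back-of-envelope model (my own simplification, not Belshe's) of why extra bandwidth shows diminishing returns: every request still pays a round trip, so latency dominates once the pipe is fast enough. The function and its parameters are hypothetical.

```javascript
// Crude lower-bound estimate: raw transfer time plus one round trip
// per request (ignores parallel connections and TCP slow-start, which
// make latency matter even more in practice).
function roughLoadTimeMs(pageBytes, bandwidthMbps, rttMs, requests) {
  const transferMs = (pageBytes * 8) / (bandwidthMbps * 1000); // pure download time
  const latencyMs = requests * rttMs;                          // one round trip each
  return transferMs + latencyMs;
}

// For a 498k page with 75 requests at 50ms RTT:
// going from 2 to 10 Mbps saves ~1.6s, while halving the RTT saves ~1.9s
```

Even this crude model reproduces the shape of Belshe's curve: bandwidth improvements flatten out, latency improvements keep paying.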
  • Minimise redirects

    Merge CSS

    Merge JS

    Sprite images (http://css-tricks.com/css-sprites/)

    Cache forever (http://calendar.perfplanet.com/2010/easy-cache-headers/)

    Data URIs (http://www.phpied.com/data-urls-what-are-they-and-how-to-use/) - Firefox, Chrome, IE8+; handy for mobile, but the user can't choose not to have images

    CSS3 / SVG / canvas instead of images
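To illustrate the data URI technique: the resource is embedded in the page or stylesheet itself, trading an extra HTTP request for roughly 33% base64 size overhead. This is my own sketch; the four bytes are just the PNG magic number standing in for a real image that a build step would read from disk.

```javascript
// Build a data URI from raw bytes (Node's Buffer used for base64 here;
// a browser would use btoa or, more likely, a build tool does this offline).
const bytes = Buffer.from([0x89, 0x50, 0x4e, 0x47]); // stand-in for real image data
const dataUri = 'data:image/png;base64,' + bytes.toString('base64');

// Usable anywhere a URL is, e.g. in CSS: background-image: url(data:image/png;base64,...)
```

Because the encoded image is inlined, it is cached with the HTML or CSS that contains it rather than independently, which is part of the trade-off.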
  • Compress (gzip / deflate) all text-based content - HTML, JS, CSS, XML, JSON (and other uncompressed components, e.g. ico).

    Minify JS and CSS (not everyone can get gzipped content).

    Optimise images: correct format + tools
    - jpg: lossy
    - png: lossless (often more efficient than gif; Photoshop screws up transparency)
    - gif: animated

    Great reference for image optimisation: http://www.bookofspeed.com/chapter5.html
    http://jpegmini.com is a new tool for crunching jpegs; uses perceptual encoding, good results.

    Use CSS / SVG / canvas instead of images, e.g. for rounded corners.

    Lean markup:
    - HTML: avoid DIVitis; HTML5 can be more brief
    - CSS: @stubbornella's OOCSS (https://github.com/stubbornella/oocss/wiki/faq)

    Serve static content from cookieless domains (but should CSS come from the same domain?)

    Remove superfluous headers.
  • Flush the page from the server in blocks, e.g. header, main content, sidebar, footer.

    Browsers wait until CSS files are downloaded before rendering.

    Try to use async JavaScript - some browsers can download / parse in parallel and just block on execution; others block during download and parsing.

    Perils of relying on 3rd-party hosted JavaScript - a single point of failure.
    (http://www.stevesouders.com/blog/2010/06/01/frontend-spof/)
    - Common 3rd-party scripts have been updated for this: Google Analytics, Facebook Like, Google +1, Twitter - StumbleUpon is still an issue.
    - OpenDNS blocked Google's jQuery CDN!

    Browsers have a limited number of simultaneous connections to the same host:
    - IE6: 2; modern browsers: 6; mobile: typically 4
    - Domain sharding, e.g. a separate hostname for images, can overcome this
    - Can even point them back to the same server, but LoveFilm had a problem with this
    (http://statichtml.com/2010/use-unique-ips-for-sharded-asset-hosts.html)

    iframes are expensive.

    Conditional comments block downloads until the main CSS arrives in IE.
    (http://www.phpied.com/conditional-comments-block-downloads/)

    Turn on HTTP/1.1 keep-alive so the browser can re-use connections.

    AJAX - POST vs GET.
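The async loading mentioned above is usually done with the classic script-injection pattern (the same idea the asynchronous GA snippet uses): inject a script element so the download doesn't block rendering. I've written it as a function taking the document so it can be tested; in a page you would call `loadScriptAsync(document, url)`, and the URL below is a placeholder.

```javascript
// Inject a <script> element before the first existing script tag.
// Dynamically inserted scripts don't block HTML parsing or rendering;
// the async flag also covers browsers/specs where ordering differs.
function loadScriptAsync(doc, url) {
  var s = doc.createElement('script');
  s.async = true;
  s.src = url;
  var first = doc.getElementsByTagName('script')[0];
  first.parentNode.insertBefore(s, first);
  return s;
}
```

Inserting before the first script tag (rather than appending to head or body) is the usual trick because a script element is guaranteed to exist at that point.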
  • Some things can be automated, e.g. compression, minification, merging, image compression, plus more sophisticated optimisations.

    Options range from plugins for WordPress and web servers, to appliances that sit in front of your server farm, through to cloud-based optimisation services.

    Others are a little more difficult, e.g. flushing early, improving back-end performance, asynchronously loading scripts etc.

    Automation products / tools...

    CMS plugins
    - W3 Total Cache for WordPress

    Server plugins
    - mod_pagespeed (Apache)
    - reduce requests (IIS)
    - Aptimize (IIS) (£)
    - Webo (£)

    Appliances
    - Strangeloop (£)
    - Traditional ADCs do some of this, e.g. compression

    Cloud services
    - Strangeloop (£)
    - Torbit (£)
    - Blaze.io (£)
    - Yotta (£)
    - Google's PageSpeed Service
  • Summary:

    - Performance matters - there is a business case for speed
    - Need to measure page load times for real users
    - Covered some ways of improving performance

    Next steps:

    There's lots of free information; the hard part is knowing what to implement / prioritise.

    "Takes one person. Takes 15 mins" - Steve Krug

    QUESTIONS...

    Slides, along with speaker notes and references, will go on SlideShare in the next few days:
    http://www.slideshare.net/andydavies

    Questions or comments? Grab me afterwards, or tweet or email me.

    If you've got a performance problem that needs looking at, hire me!

Web Performance - A Whistlestop Tour - Presentation Transcript

  • #DigiTalksChelt, Jan 2012 - Web Performance - A Whistle Stop Tour... @andydavies http://www.flickr.com/photos/blaxjax/170373495
  • 7s
  • 7s - Average page load time of Fortune 500 websites
  • “...up to 57% of online consumers will abandon a site after waiting 3 seconds for a page to load” Strangeloop Networks http://www.flickr.com/photos/macieklew/351554256
  • Slower sites have lower conversion rates http://www.flickr.com/photos/nelc/1208393486
  • Improving performance raises conversions http://www.flickr.com/photos/glasgows/444674524
  • Improving performance can save money http://www.flickr.com/photos/mjohnso/5437331304
  • Measuring load time using page waterfalls Great diagnostic tool, but it’s a sample of one and we are not our users!
  • Real User Page Load Times - histogram of visitors (%) against load time in seconds (buckets 1-10 and >10): 27%, 24%, 13%, 8%, 8%, 6%, 6%, 3%, 3%, 2%, 1%
  • Google Analytics: Measuring Real Users Sample data generated by modern browsers and Google Toolbar - older browsers may not be represented accurately!
  • Other tools for measuring page load time http://www.flickr.com/photos/wwarby/3297205226
  • 80% of page load time is down to the front-end (may be as high as 97% for mobile users)
  • But, don’t forget about server performance... http://www.flickr.com/photos/br1dotcom/4297736794
  • “More Bandwidth Doesn’t Matter (much)” - Mike Belshe. Page load time against bandwidth (1-10 Mbps): 3.11s, 1.95s, 1.63s, 1.50s, 1.44s, 1.41s, 1.39s, 1.38s, 1.37s, 1.36s
  • Cut down the number of requests http://www.flickr.com/photos/mic_n_2_sugars/564570276
  • Put your pages on a diet http://www.flickr.com/photos/europedistrict/4537909259
  • Understand how pages load
  • Don’t have to do it all by hand http://www.flickr.com/photos/simeon_barkas/2557059247
  • http://www.slideshare.net/andydavies @andydavies http://www.flickr.com/photos/auntiep/5024494612