Client-side Web Performance Optimization [paper]

An overview about how the speed of websites can be improved with often simple optimizations on the client-side.

Client-side Performance Optimizations

Lecture: Ultra-large-scale Sites, Prof. Walter Kriha
Student: Jakob Schröter, jakob.schroeter@hdm-stuttgart.de
January 2011 | WS 2010/2011
Computer Science and Media, Stuttgart Media University, Germany

This paper aims to show the importance of web performance and to sensitize web developers to care about the end-user's experience. Further, it provides an overview of basic optimization techniques on the client side, with real-world examples that show how powerful these often simple optimizations are.
Content

1 Why performance matters
2 The client-side
3 Analyze and measure
4 Basic optimization techniques
  4.1 HTTP requests are expensive
  4.2 Intelligent browser caching
  4.3 Shrink request size
  4.4 Image optimizations
  4.5 Loading resources & page rendering
  4.6 Domain sharding / CDN
  4.7 JS & CSS performance
5 Automation
6 Conclusion and a look into the future
References
1 Why performance matters

What counts towards the success of a website is that its users are satisfied and enjoy browsing through the site. Of course this mostly has to do with the quality of the content and the usefulness of the service. But the user experience is also an important factor for successful web projects – and that includes fast response times.

In a study, Amazon slowed down their website by 100 ms, with the result of a 1% drop in sales. Yahoo made similar experiments with 400 ms, resulting in a 5-9% drop in requests. Google increased the number of results per page from 10 to 30, which added 500 ms to the response time; after this change they measured a 20% drop in search requests. Further, they realized that it took a few days until they reached their old number of requests again. Shopzilla saved about 5 seconds of their total page loading time with performance optimizations, resulting in a 25% increase in requests, a 7-12% increase in revenue and a 50% reduction in hardware.

Amazon     +100 ms    1% drop in sales
Yahoo      +400 ms    5-9% drop in requests
Google     +500 ms    20% drop in requests
Bing       +2000 ms   4.3% drop in revenue/user [1]
Shopzilla  -5000 ms   25% increase in requests, 7-12% increase in revenue, 50% reduction in hardware [2]
Mozilla    -2200 ms   15.4% increase in downloads [3]

These numbers show that even minimal differences in the response time can have significant effects on the business. [4]

A look at the three response-time limits by usability guru Jakob Nielsen confirms this: users sense an interruption in what they are doing at response times higher than 0.1 seconds. The response time should be less than 1 second to allow good navigation and let users feel that they are in control. 10 seconds is an unacceptable response time; the user experience would be interrupted at a disturbingly high rate and the user would probably leave the site. By the way, these response-time limits haven't changed in 40 years. [5]

But it's not all about time - the user experience matters

To provide a good user experience it's worthwhile to ensure that the user doesn't notice that he's actually waiting for a page to load. Even if there is no way to avoid the loading time of e.g. a huge background image, there are plenty of ways to enhance the user experience, since the human sense of time is relative. For example, the background image can be loaded as the last resource, so the website in general is already usable before the image is loaded. The same works for JavaScript-enhanced elements: tests at Facebook showed that it's best to avoid white screens and to show content immediately, even if the functionality behind it isn't loaded yet [6]. This doesn't change anything about the total page loading time, but the perceived loading time will decrease since the user can already read the content and maybe jump to
another page via the navigation, even if the current page isn't completely loaded yet. The user experiences a faster response time. It also helps to show a notification like a progress bar if an action takes a few seconds, so the user knows that the computer is still working and hasn't crashed. Or why not just give the user something to do while he's waiting? For example, let him already add tags to the pictures he's currently uploading.

So keep in mind that, in the end, the perceived page loading time is what matters.

By the way, Google announced in April 2010 that from now on page speed will be a factor for ranking websites in the search results [7].
2 The client-side

A few years ago, talking about website performance meant optimizing the server side and reducing the generation time of the HTML output. But nowadays the server side doesn't seem to be the main problem: engineers at Yahoo found out that on average only 10-20% of the loading time is spent on the server side; 80-90% is spent on the client side, that is, in the user's browser [8] [9].

[Chart: average loading time of a website – 10-20% server-side, 80-90% client-side]

Waterfall chart of web.de, generated with webpagetest.org

The example waterfall chart above shows that just loading the resources on the client side takes considerably more time than generating the main HTML page on the server. So performance is not only the job of the people working on the backend – it's also an important topic for frontend engineers.
When working for the web one usually assumes that the client is thin. Nowadays this is only partially true for modern web applications. They use a lot of JavaScript and CSS to create a rich user interface. That means more and more logic lies on the client, and the server is sometimes only used for persistently saving the data. There no longer has to be an HTTP request for every user interaction.

The browser

Browsers have developed into pretty complex applications. They request the first URL, follow redirections, receive and parse the main HTML page, and after loading additional resources like CSS, JavaScript and images they render the content. This means rendering text in the defined CSS styles, rendering images, calculating the flow and so on. With the introduction of CSS3, the Canvas element and SVG, browsers are no longer simply drawing black text on a white screen – they come with support for rich graphic effects like drop shadows, transformations like rotations, and animations. So more and more computing power is needed to display a website. Browsers are now also able to play video and audio files by themselves. And the execution of JavaScript needs time too. With every release the browser manufacturers are working hard on their browsers' performance, trying to beat the competitors with compiling JavaScript engines, hardware-accelerated rendering and more.
3 Analyze and measure

Considering that 80-90% of the loading time is spent on the client side, it makes sense to have a look at how performance can be tuned here. But before implementing any optimizations it's important to analyze the bottlenecks and set up test cases to measure the performance before and after the optimizations. Not long ago browsers behaved like a black box – a web developer couldn't easily see what was going on after the user hits the enter button in the address bar. Today there are plenty of great tools available, just to name a few:

Firebug1 runs as a browser plugin for Firefox and, besides many handy debug tools for frontend developers, offers the ability to track for example all network requests or to profile JavaScript function calls.

1 http://getfirebug.com/
Yahoo YSlow2 is a plugin for Firebug which can test a website against many basic optimizations. It's a great tool to get started with. The tests are based on the best practices3 from the Yahoo Exceptional Performance team.

Google Page Speed4 is also a plugin for Firebug and is likewise based on performance rules. It additionally includes minifying of HTML, CSS and JavaScript files.

2 http://developer.yahoo.com/yslow/
3 http://developer.yahoo.com/performance/rules.html
4 http://code.google.com/speed/page-speed/docs/extension.html
Google Chrome Speed Tracer5 is an extension for Google Chrome and offers deep insight into what the browser is doing – from the loading of resources, JavaScript execution and CSS selector matching to paint processes and garbage collection.

Commercial tools are also available, e.g. HTTPWatch6 and dynaTrace7.

As explained in chapter 1, it's not only the loading time that matters. So only measuring the loading time, or e.g. the time until the browser's onload event is fired, isn't enough [10]. More important is the time it takes until the page is usable – that is, when the important content and the navigation are visible to the user.

The speed limiter function in OWASP WebScarab8 can emulate a slow internet connection to see e.g. how all the images get loaded one after another. In some cases it really helps to get a live impression of how the site is being loaded and in which order the content is shown to the user.

5 https://chrome.google.com/extensions/detail/ognampngfcbddbfemdapefohjiobgbdl
6 http://www.httpwatch.com/
7 http://www.dynatrace.com/
8 http://www.owasp.org/index.php/Category:OWASP_WebScarab_Project
It's also recommended to do remote testing from clients around the globe. For example, WebPagetest9 and Zoompf10 offer great services which combine tools like Google Page Speed. WebPagetest also allows the definition of a DOM element (which contains important content, e.g. the DIV of the main content) and tracks the time until that element is available in the DOM. "Slow-motion" videos can also be generated to get an inside look at the rendering process of the site.

Further, it's possible to track the page load times of real visitors. This can be done for example with the event tracking feature of Google Analytics11 or the Episodes12 framework; a minimal sketch follows below.

As so often, it's hard to get really precise test results, since many external factors interfere with the measured numbers. This includes server load, network latency, the workload of the client machine, differences between browsers and so on. Therefore the tests should always be run multiple times.

9 http://www.webpagetest.org/
10 http://zoompf.com/free
11 http://blog.yottaa.com/2010/10/how-to-measure-page-load-time-with-google-analytics/
12 http://stevesouders.com/episodes/
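As an illustration (not taken from the cited tools), the following sketch records a timestamp as early as possible in the document and reports the elapsed time on the window load event via Google Analytics event tracking. It assumes the asynchronous Google Analytics snippet (_gaq) is already included on the page; the category and action names are arbitrary examples.

    // as early as possible in the <head>, before any other resources are loaded
    var pageStartTime = new Date().getTime();

    // later, e.g. at the end of the <body>; a real setup would not simply
    // overwrite window.onload if other handlers are registered
    window.onload = function () {
        var loadTime = new Date().getTime() - pageStartTime;  // elapsed time in ms
        // report the timing as a Google Analytics event (the value must be an integer)
        _gaq.push(['_trackEvent', 'Performance', 'pageLoad',
                   document.location.pathname, loadTime]);
    };

Note that such a timestamp only approximates the real start of navigation; the Navigation Timing specification mentioned in chapter 6 aims to make more precise timestamps available in the browser.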
4 Basic optimization techniques

Many of the techniques described in the following are actually not new; some people were already discussing them a few years ago. But unfortunately many developers and top websites still don't profit from them.

4.1 HTTP requests are expensive

Every HTTP request requires time and uses a request slot, even if only a few bytes of data are transferred. Nowadays the transfer speed is not the biggest problem; once a connection is established, the transfer runs fast, even on mobile networks. Latency is the major challenge to deal with.

Keep in mind that every redirect causes a new request, and according to tests by Steve Souders most modern browsers don't cache redirects. So these additional requests also slow down a website [11].

Avoid HTTP requests when possible

Because every request always comes with latency, it's important to reduce the number of requests wherever possible. Many requests can be saved by smartly combining resource files. For example, all JavaScript files which are loaded on every page, e.g. base.js, dragndrop.js and animation.js, can be combined into one single file (this can be automated, see chapter 5). If there is a huge JavaScript file that is only needed on a few specific pages (e.g. uploader.js), it should be kept separate, because the code is not needed on the other pages. The same should be done with CSS files.

When further requests are made while the user interacts with the website, e.g. a form field offering an autosuggest function which calls the backend for JSON responses, it sometimes makes sense to deliver more results than the user needs in the first step, but which he may need later on (see the sketch at the end of this section).

Further, the number of requests can be reduced considerably by combining images into CSS sprites, as described in chapter 4.4.
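As a rough illustration of the autosuggest idea, the following hypothetical sketch requests more suggestions than are displayed and answers later keystrokes from a local cache instead of issuing new requests. The /suggest endpoint, its parameters and the limits are made up for this example.

    var suggestionCache = {};  // prefix -> array of suggestions already received

    function getSuggestions(prefix, callback) {
        // a longer prefix can often be answered from an already cached shorter one
        for (var cached in suggestionCache) {
            if (prefix.indexOf(cached) === 0) {
                callback(filterByPrefix(suggestionCache[cached], prefix));
                return;  // no additional HTTP request needed
            }
        }
        var xhr = new XMLHttpRequest();
        // ask the backend for 50 results although only 10 will be shown right away
        xhr.open('GET', '/suggest?limit=50&q=' + encodeURIComponent(prefix), true);
        xhr.onreadystatechange = function () {
            if (xhr.readyState === 4 && xhr.status === 200) {
                suggestionCache[prefix] = JSON.parse(xhr.responseText);
                callback(suggestionCache[prefix].slice(0, 10));
            }
        };
        xhr.send(null);
    }

    function filterByPrefix(list, prefix) {
        var result = [];
        for (var i = 0; i < list.length && result.length < 10; i++) {
            if (list[i].indexOf(prefix) === 0) { result.push(list[i]); }
        }
        return result;
    }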
4.2 Intelligent browser caching

The HTTP protocol specifies great possibilities for caching files on the client, and many of them are widely supported by modern browsers. This really isn't anything new, but it often seems to be forgotten by developers. It's not uncommon that developers or server administrators just set all HTTP headers to "no-cache" to disable the browser cache, so they don't have any trouble with cached but outdated content. But if the right HTTP headers are sent to the browser, client-side caching can boost the performance of a website just like caching does on the server side.

Not Modified header and the ETag

Webservers can be configured to send an ETag13 (entity tag) header in file responses. This ETag can e.g. be an MD5 hash of the file, so after modifying the file the ETag will be different. If the browser has saved a file with an ETag in its cache, it will add the ETag to the next request for the same file. The server then checks whether the ETag is still valid (i.e. the file hasn't changed). If it is valid, the server only sends a HTTP/1.x 304 Not Modified response, without sending the whole file content again, and the browser uses the cached file instead. If the file has changed, the server sends the whole file as usual. With this technique a lot of traffic can be saved, but there is still an HTTP request to be made.

Expires

The Expires header14 tells the browser the exact date and time when a file will expire. Until this date the browser doesn't make any new requests for this file. So this is perfect for static files which are known not to change during the time specified in the Expires header. If the project has e.g. a weekly release cycle every Wednesday, it's possible to set the static files to expire on that date. But mostly it's safer to use cache busters, as described in the following chapter.

Cache busters

After the release of a new version of the website it's very important that all users get the latest version of the resource files, like CSS and JS. Wrongly configured caching can become a huge problem: imagine what happens when users get the latest HTML page but are still using an outdated CSS or JavaScript file…

13 http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.19
14 http://www.w3.org/Protocols/rfc2616/rfc2616-sec14.html#sec14.21
Cache busters force the browser to reload the file. This is done by appending e.g. a version number, so the browser thinks it's a completely different file. Mod_rewrite15 can be used for this, so on the server side it can still be the same filename. This can look for example like this:

    /scripts/uploader-158846.js -> /scripts/uploader.js

or directory-based:

    /scripts/158846/uploader.js

When possible, cache busters as a GET parameter (uploader.js?158846) should be avoided, since some proxies are configured not to cache files with GET parameters [12].

Using Expires headers together with cache busters should be preferred over using ETags.

But unfortunately web developers still can't rely on caching as much as would be desirable. According to Yahoo!'s Exceptional Performance team, 40% to 60% of Yahoo!'s users have an empty-cache experience and about 20% of all page views are done with an empty cache. This surprising fact underlines the importance of keeping websites as lightweight as possible, as described in the following chapter. [13]

4.3 Shrink request size

Transferring bytes always consumes time; therefore the amount of data being transferred from the server to the client and vice versa should be as small as possible. Just one example: Google Maps increased their number of map requests by 30% after shrinking their total file size by 30% [14]. To achieve a minimal file size it helps to use light data formats like JSON instead of XML. Further great savings can be gained by minifying and compressing files:

Minifying

CSS and JavaScript files usually contain descriptive variable names, comments, whitespace and line breaks to be easily readable by humans. But the browser doesn't need all this information. By removing these characters the file size can be shrunk drastically.

There are a handful of tools available, like YUI Compressor16 and Dojo ShrinkSafe17, for minifying JS and CSS files. Even HTML files can be minified.

15 http://httpd.apache.org/docs/2.0/mod/mod_rewrite.html
16 http://developer.yahoo.com/yui/compressor/
17 http://shrinksafe.dojotoolkit.org/
Compressing

All modern browsers support compressed content18. This means all plain-text content like HTML, CSS, JS, JSON, XML etc. can be compressed on the server side, and the browser will decompress the content before using it. Of course binary files like images, PDF and SWF shouldn't be compressed again, since they are already compressed.

Compression can easily be enabled in the configuration of most webservers (e.g. by enabling mod_deflate in Apache). There's no need to change anything in the site's code; everything is done by the webserver. Dynamic content will be compressed on the fly, while static content will automatically be served as a cached compressed version. By the way, the additional computing power needed to compress and decompress the data generally doesn't cause any problems.

The following table demonstrates, as an example, what amount of data can be saved by minifying and compressing the HTML, CSS and JS files on hdm-stuttgart.de [15]:

       Original   Minified   Compressed   Minified + compressed
HTML   101 KB     97 KB      17 KB        16 KB
CSS    90 KB      68 KB      19 KB        14 KB
JS     243 KB     195 KB     73 KB        63 KB
Sum    434 KB     360 KB     109 KB       93 KB

So by minifying and compressing HTML, CSS and JS files, their transferred file size can be reduced by 341 KB – that is 79%!

Unfortunately, as of October 2010, still 47% of the top 1000 websites weren't using compression, although it's very easy to enable and has amazing potential to speed up websites! [16] For example the #9 top site of Germany19, Spiegel.de, could save a lot of traffic if they minified and compressed their content. Just to provide a rough number: on every visit with an empty cache, 436 KB of traffic could be saved. Projected onto their 380,000,000 page impressions per month20, and considering the fact from the previous chapter that 20% of all page impressions are done with an empty cache, this would add up to a saving of about 32 TB per month.

18 The community-driven website http://www.browserscope.org/ offers great comparisons between different browsers and versions.
19 according to http://www.alexa.com/topsites/countries/DE
20 http://www.spiegel.de/extra/0,1518,249387,00.html
4.4 Image optimizations

Use the right image format & dimensions

First, by choosing the optimal image format the file size can be reduced drastically while even providing better image quality. In general the JPG format should be used for images with a high number of colors. PNG is best for rendered text, e.g. headlines (which should better be plain text anyway), and of course for images with alpha transparency. Background images can usually be saved with a high compression rate.

It's not uncommon that images are delivered in a higher resolution than they are displayed at in the browser. They are then downscaled by the browser, which wastes network bandwidth and rendering time. Whenever possible, the width and height attributes of <img> tags should be set [17].

CSS Sprites

Like JS and CSS files, image files can also be combined to save an enormous number of requests. For this, one large image is composed of the single images. This image is defined as the background image of HTML elements, and with the CSS background-position property it is positioned so that only one single image is visible. [18] The best compression rate is achieved when combining images with similar colors.

Google, for example, combined 53 images into one file. Take a look at the CSS listing to see how a single image is selected out of the huge sprite:
    a.button {
        width: 13px;
        height: 13px;
        background: url(sprite.png) no-repeat;
        background-position: -19px -193px;
    }
    a.button:hover {
        background-position: -35px -193px;
    }

Remove meta data

Further, removing useless meta information like EXIF data can save quite a few bytes. Depending on the use case some meta information like copyright hints may be important, but in general this meta data is not needed for images on a webpage. [19]

With Smush.it21 Yahoo provides a great service for automatically optimizing images without quality loss. Google Page Speed also includes image optimizations like removing meta data. For example, 124 KB or 67% of the total image size could be saved by removing meta data on hdm-stuttgart.de, without any loss in image quality.

21 http://www.smushit.com/
4.5 Loading resources & page rendering

The general aim should be to show the most important content first and as fast as possible. For example, this could be the main article and the page navigation. The order in which the site resources like JavaScript, CSS and image files are loaded has a big influence on the rendering process in the browser. The network waterfall charts of e.g. Firebug can give helpful insights into how the browser loads the resources of the site.

Correct order of HTTP requests

A best practice is to put the CSS file (hopefully all files are combined into one) as the very first resource in the <head> tag. This helps to avoid strange content jumping while loading the page. Otherwise it can happen that the browser displays the content, for example, with the default font and later has to redraw it with the defined font – which is not only confusing for the user (the so-called "flash of unstyled content"), it also consumes needless computing power and slows down the rendering of the website.

Non-blocking JavaScript

The browser waits until all JavaScript files defined in the <head> section are loaded before it begins to render the page. <script> elements in the <body> section also block the browser from rendering all following HTML elements until the JavaScript file is loaded and executed. Further, all other resource downloads are paused while loading JavaScript files, since the loaded script could modify the DOM; luckily, new browsers are trying to fix this. But not only the loading of JavaScript blocks the page rendering: all rendering of the page content is also paused and new resource downloads are blocked while JavaScript is executing [10].

It's well known that this behavior isn't optimal, so the HTML5 specification offers async and defer attributes22 for <script> elements to define when the script should be loaded and executed. The async attribute tells the browser to load the script asynchronously and execute it as soon as it's loaded. That means the scripts will be executed in the order they finish loading, not in the order of the <script> elements; also, the DOM may not be complete when such a script is executed. With the defer attribute the script is downloaded without blocking and executed only after the document has been parsed; the order of execution is kept. Unfortunately these attributes are not yet well supported by the major browsers.

Nevertheless, common JavaScript libraries provide utilities for asynchronous JavaScript and CSS loading, e.g. YUI3 Get23. And with a few hacks the loading and also the parsing and execution of JavaScript can be controlled separately. It's worth taking a look at the JavaScript module ControlJS24 from Steve Souders which allows this, even though it's not yet recommended to use this tool in a production environment. [20]

22 http://www.whatwg.org/specs/web-apps/current-work/multipage/scripting-1.html
23 http://developer.yahoo.com/yui/3/get/
24 http://stevesouders.com/controljs/
The following figure shows how JavaScript files block the browser from fetching other resource files like images and block the page rendering. The green line indicates when the rendering process starts.

Waterfall chart without ControlJS (IE8) [20]

The next figure shows the same page using ControlJS. The user will see the page right after the HTML has been loaded and doesn't have to wait about 4 seconds until all JavaScript files were loaded. The images are also loaded first and in parallel to the JavaScript files.

Waterfall chart with ControlJS (IE8) [20]

In general, an eye should be kept on which files really need to be loaded to display the first state of the website, and the principle of progressive enhancement should be followed. That means allowing the browser to render the plain HTML page as fast as possible and enhancing it with JavaScript after the page has been rendered.

Frontend single points of failure

Just think about the following case: an external JavaScript file, e.g. from an advertisement company, is loaded in the <head> section of a site. What happens to the site when the external server is down? The whole site won't show up until the browser decides to time out the request! That's a single point of failure. [21] So it is wise not only from the performance point of view to load resources in a non-blocking way. This especially applies to external resources, so that third parties don't slow down one's own site (see the sketch below).
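A widely used way to do this is to inject the <script> element dynamically, so that its download does not block parsing and rendering; this is, for instance, the pattern used by the asynchronous Google Analytics snippet. The following is only a minimal sketch with a made-up third-party URL, not the ControlJS approach described above.

    // load a script without blocking parsing or rendering
    function loadScriptAsync(url, callback) {
        var script = document.createElement('script');
        script.src = url;
        script.async = true;              // hint for browsers that already support it
        if (callback) {
            script.onload = callback;     // note: older IE versions use onreadystatechange
        }
        document.getElementsByTagName('head')[0].appendChild(script);
    }

    // a third-party resource no longer becomes a frontend single point of failure
    loadScriptAsync('http://ads.example.com/ad.js');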
Intelligent pre/lazy-loading

It's possible to preload resources which will be used later on. For example, it can help to preload huge JavaScript files while the user enters his login data on the login page. After he has logged in, all needed JavaScript files are already in the browser's cache and can be used immediately. But of course it's important to start the preloading only after the current page has been rendered, so that e.g. the login page is already usable before the preloading starts.

Lazy-loading is also possible for content which is initially not visible to the user. A common use case is images which are currently not visible and only become visible after the user scrolls down. YouTube, for example, lazy-loads the thumbnails of the suggested clips only when the user scrolls down. Many JavaScript libraries like YUI provide easy functions25 to implement this behavior (a minimal sketch follows at the end of this section). Lazy-loading of other resources can also result in a huge performance boost.

Progressive rendering

An interesting, but depending on the application architecture sometimes difficult to implement, technique is to send the generated HTML code to the client as early as possible, even if it isn't completely generated yet. In PHP, for example, this can be done with the flush() function as shown in the example below. Browsers can already parse the first code lines and e.g. start loading CSS and JavaScript files while the rest of the document is still being generated on the server. [22] [19]

    <html>
    <head>
        <title>the page</title>
        <link rel="stylesheet" href="my.css" />
        <script src="my.js"></script>
    </head>
    <?php flush(); ?>
    <body>
    <div>site navigation</div>
    <div>main content</div>
    <?php flush(); ?>
    <div>some user comments</div>
    <div>some ads</div>
    ...

25 e.g. http://developer.yahoo.com/yui/3/imageloader/
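Coming back to lazy-loading: the following is a minimal sketch of the idea (not the YUI ImageLoader API). It assumes that images carry their real URL in a hypothetical data-src attribute and only a tiny placeholder in src; a production version would throttle the scroll handler and avoid overwriting existing handlers.

    // assign the real image URL only once the image scrolls into the viewport
    function loadVisibleImages() {
        var viewportHeight = document.documentElement.clientHeight || window.innerHeight;
        var images = document.getElementsByTagName('img');
        for (var i = 0; i < images.length; i++) {
            var img = images[i];
            var realSrc = img.getAttribute('data-src');
            if (realSrc && img.getBoundingClientRect().top < viewportHeight) {
                img.src = realSrc;               // trigger the actual download
                img.removeAttribute('data-src'); // make sure it is only loaded once
            }
        }
    }

    window.onscroll = loadVisibleImages; // simplistic: a real version would throttle this
    loadVisibleImages();                 // handle images already visible on page load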
4.6 Domain sharding / CDN

Browsers only allow a specific number (2-6) of parallel HTTP connections to the same host name. If all resource files are hosted under the same host name, it takes a while until all files are loaded, since there are only e.g. 2 download slots available. To improve concurrency it makes sense to split the resources across e.g. 2 additional (sub)domains.

A lightweight webserver26 can also be used for delivering static files like JavaScript, CSS and images. Such servers have faster response times and relieve the main application servers. Further, the static-file domain should be cookie-free, so the browser doesn't send a cookie with every request for static files. This saves traffic and computing power.

In addition, content delivery networks with servers around the globe provide better response times, since they choose the nearest server based on the location of the user.

4.7 JS & CSS performance

More and more websites are using JavaScript extensively – especially web 2.0 sites. Some of them put the whole creation of the DOM into the hands of JavaScript. It follows that the performance of JavaScript is becoming more important. All browser manufacturers are working hard to optimize their engines and speed up JavaScript execution. But web developers can also do a lot to optimize their code. If a website is heavily using JavaScript, it's worth following JavaScript best practices. [23]

The same applies to CSS – it might sound a bit beside the point, but for example some CSS selectors have considerably better performance than others. The most misunderstood fact is that browsers interpret CSS selectors from right to left, and not, as many people would guess, from left to right. One example:

    #myElement li a { color: red; }

At first glance this selector seems very efficient: get the element with the id myElement, search for descendants of type <li> and then apply the font color to all descendants of type <a>. Instead, the browser iterates over all <a> tags on the entire page, checks whether they are (possibly several levels up) a child of a <li> element and then checks whether that is in turn a child of the element with the id myElement. [19] From a performance point of view, the rule above would be faster when e.g. using only one class selector and applying this class name to all <a> elements:

    .myElement-li-a { color: red; }

Steve Souders created a test suite27 to compare the performance of one's own CSS selectors.

Further, the new CSS3 shadow and transform effects should be used with care; in some circumstances they can slow down a website dramatically.

26 e.g. http://www.nginx.org/ or http://www.lighttpd.net/
27 http://stevesouders.com/efws/css-selectors/tests.php
5 Automation

It's important that performance optimizations don't break the development process. Struggling with minified JavaScript and CSS files in the development environment is no fun at all, and manually minifying and combining them before every release is a time-consuming and error-prone job. Therefore the aim should be to integrate as much as possible into the deployment process. This also has advantages when working on a huge project with dozens of people, since it's hard to convince every developer to follow the optimization rules. Optimizations like minifying and combining CSS and JavaScript files can be done automatically during the deployment process (e.g. via Ant), so there is still a nice modular file structure in the development environment.

In addition, some companies such as Strangeloop28 or Blaze29 offer commercial out-of-the-box optimization tools. The trend goes towards transformation-based performance optimization: these tools automatically modify the HTML output and optimize resources without the need for far-reaching changes to the application. This is a challenge e.g. for sites relying heavily on JavaScript, Ajax and third-party content, but for simpler HTML sites and particularly smaller (e.g. private) sites this approach can be useful. Google also recently released their open-source Apache module mod_pagespeed30, which does performance optimizations like compressing, minifying, image optimization, combining of JS and CSS files and so on automatically, on the fly. The idea sounds very promising; the module even chooses the optimal optimizations depending on the user's browser. It's worth giving it a try. But some tests by Aaron Peters showed that the module can even slow down a website, since it consumes computing power on the server [24].

Further, automated performance tests could be set up to ensure e.g. that a new feature doesn't slow down the website dramatically. For example, the tool ShowSlow31 allows the tracking of YSlow, Page Speed and dynaTrace rankings over time.

28 http://www.strangeloopnetworks.com/
29 http://www.blaze.io/
30 http://code.google.com/speed/page-speed/docs/module.html
31 http://www.showslow.com/
6 Conclusion and a look into the future

Not only the end-user profits from snappy websites – often the servers and networks are relieved and bandwidth is saved. So performance optimizations are also worthwhile from an economical and ecological point of view. As mentioned in chapter 1, Shopzilla reduced their hardware by 50% after performance optimizations [2]. So with performance optimizations a lot of money can be saved, and also earned, e.g. when beating competitors' site speed and getting more satisfied customers. Fred Wilson, a New York based tech investor, said in March 2010 that he sees speed as the most important feature of a web application [25].

It's just the beginning

Comparable to Search Engine Optimization (SEO), a new industry specialized in performance optimization has grown: Web Performance Optimization (WPO) [26]. The establishment of the W3C Web Performance Working Group32 shows that there is an effort towards standardizing performance metrics in browsers, e.g. with the Navigation Timing33 specification.

Also think about performance on mobile devices. Mobile client-side web performance is already a big topic and will become as important as desktop web performance [27]. While there is a bunch of well-working tools like Firebug available for measuring desktop performance, there is still a lack of good tools for mobile browsers. And since browsers now support rich graphic effects like drop shadows, client-side performance will get even more attention in the future. For example, all major browser manufacturers are already working on hardware-accelerated website rendering.

Further interesting research is being done, like the Diffable34 project by Google, which aims to provide a tool that only downloads the deltas between cached static files and the updated ones. So when e.g. a new version of Google Maps is released, the browser only needs to download a diff file of maybe 20 KB instead of the full JavaScript file of 300 KB. [28] To further reduce the number of HTTP connections, the idea of Resource Packages came up. All resource files can be packed into one single ZIP file which is referenced in the document's <head> section, so the transfer can be done in one single data stream. The good news is that single files can be progressively accessed while the (huge) ZIP file is still loading, and the loading order can be defined. The idea sounds really promising and could replace CSS sprites, which are often difficult to maintain. [29] In the future, local storage could also be used to cache application data and gain better control over cached files on the client side.

32 http://www.w3.org/2010/webperf/
33 http://www.w3.org/TR/2010/WD-navigation-timing-20101207/
34 http://code.google.com/p/diffable/

But besides all benefits, website performance may not be the highest-priority optimization for every website. There is no one-click solution yet to perfectly boost the performance of any random website. Also, the performance best practices are constantly changing, so the best optimization rule for browser X may hurt performance in browser Y, or even in a newer version of browser X. Going into the details of website performance is a very complex, time-consuming and somewhat endless task when digging into micro-optimizations. On the other hand,
hardware is getting faster and browsers are continuously improving their performance, without the need to change anything on one's own website. So for smaller sites the ROI may not be worthwhile. Depending on e.g. the CMS or shop system the website is based on, it might also be tricky to implement even simple optimizations.

Nevertheless, every web developer should have basic knowledge about performance optimizations, and small sites too should at least adopt the basic optimizations like enabling compression. They are very easy to adopt and have a huge benefit. With basic knowledge and a little attention to performance, enormous bottlenecks can be avoided right from the launch of a site.

For further reading and the latest news it's worth taking a look at the blog35 of client-side performance guru Steve Souders.

35 http://www.stevesouders.com/blog/
References

[1] Eric Schurman and Jake Brutlag. (2009, June) Performance Related Changes and their User Impact. [Online]. http://www.slideshare.net/dyninc/the-user-and-business-impact-of-server-delays-additional-bytes-and-http-chunking-in-web-search-presentation
[2] Steve Souders. (2009, July) O'Reilly Radar: Velocity and the Bottom Line. [Online]. http://radar.oreilly.com/2009/07/velocity-making-your-site-fast.html
[3] Blake Cutler. (2010, March) Blog of Metrics: Firefox & Page Load Speed. [Online]. http://blog.mozilla.com/metrics/category/website-optimization/
[4] Website Optimization, LLC. (2008, May) The Psychology of Web Performance. [Online]. http://www.websiteoptimization.com/speed/tweak/psychology-web-performance/
[5] Jakob Nielsen. (2010, June) Website Response Times. [Online]. http://www.useit.com/alertbox/response-times.html
[6] Zizhuang Yang. (2009, August) Facebook: Every Millisecond Counts. [Online]. http://www.facebook.com/note.php?note_id=122869103919
[7] Amit Singhal and Matt Cutts. (2010, April) Official Google Webmaster Central Blog: Using site speed in web search ranking. [Online]. http://googlewebmastercentral.blogspot.com/2010/04/using-site-speed-in-web-search-ranking.html
[8] Tenni Theurer. (2006, November) Yahoo! User Interface Blog: Performance Research, Part 1: What the 80/20 Rule Tells Us about Reducing HTTP Requests. [Online]. http://www.yuiblog.com/blog/2006/11/28/performance-research-part-1/
[9] Steve Souders, High Performance Web Sites: Essential Knowledge for Frontend Engineers. O'Reilly, 2007.
[10] Steve Souders. (2010, September) High Performance Web Sites blog. [Online]. http://www.stevesouders.com/blog/2010/09/30/render-first-js-second/
[11] Steve Souders. (2010, July) High Performance Web Sites blog: Redirect caching deep dive. [Online]. http://www.stevesouders.com/blog/2010/07/23/redirect-caching-deep-dive/
[12] Steve Souders. (2008, August) High Performance Web Sites blog: Revving Filenames: don't use querystring. [Online]. http://www.stevesouders.com/blog/2008/08/23/revving-filenames-dont-use-querystring/
[13] Tenni Theurer. (2007, January) Yahoo! User Interface Blog: Performance Research, Part 2: Browser Cache Usage – Exposed! [Online]. http://yuiblog.com/blog/2007/01/04/performance-research-part-2/
[14] Stephen Shankland. (2008, May) CNET News: We're all guinea pigs in Google's search experiment. [Online]. http://news.cnet.com/8301-10784_3-9954972-7.html
[15] Jakob Schröter. (2010, January) Client-side Performance Optimizations. [Online]. http://www.slideshare.net/jakob.schroeter/clientside-performance-optimizations
[16] Joshua Bixby. (2010, October) Almost half of the top 1000 retail sites don't follow two easy performance best practices. Does yours? [Online]. http://www.webperformancetoday.com/2010/10/22/alexa-1000-performance-best-practices/
[17] Website Optimization, LLC. (2004, September) Size Images with Width and Height Attributes. [Online]. http://www.websiteoptimization.com/speed/tweak/size/
[18] Sven Lennartz. (2009, April) Smashing Magazine: The Mystery Of CSS Sprites: Techniques, Tools And Tutorials. [Online]. http://www.smashingmagazine.com/2009/04/27/the-mystery-of-css-sprites-techniques-tools-and-tutorials/
[19] Steve Souders, Even Faster Web Sites. O'Reilly, 2009.
[20] Steve Souders. (2010, December) High Performance Web Sites blog: ControlJS part 1: async loading. [Online]. http://www.stevesouders.com/blog/2010/12/15/controljs-part-1/
[21] Steve Souders. (2010, June) High Performance Web Sites blog: Frontend SPOF. [Online]. http://www.stevesouders.com/blog/2010/06/01/frontend-spof/
[22] Stoyan Stefanov. (2009, December) Progressive rendering via multiple flushes. [Online]. http://www.phpied.com/progressive-rendering-via-multiple-flushes/
[23] Nicholas C. Zakas, High Performance JavaScript. O'Reilly, 2010.
[24] Aaron Peters. (2010, December) Performance Calendar: Mod_Pagespeed Performance Review. [Online]. http://calendar.perfplanet.com/2010/mod_pagespeed-performance-review/
[25] Keir Whitaker. (2010, March) Think Vitamin: Fred Wilson's 10 Golden Principles of Successful Web Apps. [Online]. http://thinkvitamin.com/web-apps/fred-wilsons-10-golden-principles-of-successful-web-apps/
[26] Steve Souders. (2010, December) Performance Calendar: 2010 State of Performance. [Online]. http://calendar.perfplanet.com/2010/state-of-performance/
[27] Joshua Bixby. (2011, January) RCR Wireless News: Reader Forum: 2011 Web performance predictions for the mobile industry. [Online]. http://www.rcrwireless.com/article/20110103/READERFORUM/101229979/reader-forum-2011-web-performance-predictions-for-the-mobile-industry
[28] Steve Souders. (2010, July) Diffable: only download the deltas. [Online]. http://www.stevesouders.com/blog/2010/07/09/diffable-only-download-the-deltas/
[29] Alexander Limi. (2009, November) Making browsers faster: Resource Packages. [Online]. http://limi.net/articles/resource-packages/
[30] Stoyan Stefanov. (2010, November) Progressive Downloads and Rendering. [Online]. http://www.slideshare.net/stoyan/progressive-downloads-and-rendering
