Web Performance BootCamp 2013
These are the slides I used for my 1-day Performance Workshop at HTML5 DevConf this year.


  • It’s impossible to teach you what you need to know in one day!
  • Not database or middleware or hardware!
  • Why science? Experiment -> get numbers -> analyze -> see outcomes. Hypotheses, testing, …
  • There are few good books in this space.
  • I’ll stay around for questions afterwards.
  • Sources: Compuware, April 2011. “Google has started using web-site performance as a signal in its search engine rankings.” Source: Using site speed in web search ranking: http://googlewebmastercentral.blogspot.com/2010/04/using-site-speed-in-web-search-ranking.html Source: http://blog.yottaa.com/2010/11/secret-sauce-for-successful-web-site-web-performance-optimization-wpo
  • This diagram gives you an idea of the number of different use cases and actors across a large organization who are concerned with performance and the user experience. Every one of these people needs to know something about performance. It's really complicated in more ways than one!
  • It does not matter how fast the system is in the data center. There is no reason anything should be down. It could be slower, but not down. Globalization: the product is ready to be used anywhere.
  • Not latency or bandwidth or anything else – response time.
  • Robust statistics, John Tukey
  • 0-25, 25-50, 50-75, 75-100
  • Number of Jupiter's satellites: 63 or 67? Different sources show different numbers. Wikipedia says 63 or 67, depending on the article. The Caltech web page was showing 63.
  • Pattern: Your project is part of the system – optimize for the overall system performance, not just what you can see, even if that means your part is less-than-perfectly optimized.
  • 10-15 minutes on this slide
  • A simple OSI mnemonic
  • Observe how Google has a minimal page – it optimizes T4. Chinese search engines have very busy pages – but their users like them that busy.
  • Only the last formula matters – ignore the rest. T2 should be constant, right?? Green: CPU load increases. Red: the response time grows, then becomes unstable. At the end it stabilizes, but the server is not serving anybody. Non-linear response: trouble! We don't know the point where the server response becomes non-linear.
  • Layer 4 stuff. The number 3 in the time comes from differentiation.
  • Response times fluctuate – they are not deterministic
  • What do you do when there's an unmovable rock in your lawn? Mow around it! Compensate in some other part of the E2E. Think outside the box. Work with ISPs. Provide services where needed. Work with local ISPs.
  • You can do this for yourself on a smaller scale at WebPageTest.org
  • YSlow turns 5 today!
  • Competitive testing is an art in itself. Try to measure task sequences, not pages. Be prepared for lots of breakage. Be respectful. This is difficult and time-consuming!
  • Nothing we do about performance is more important than testing everything up front in a disciplined way. We need to establish the positive feedback loops that give developers the incentive to make it fast. Testing and the service-level requirements -> test -> refine -> service-level agreement process are the big wins here. Tools and education play a big part in this.
  • What's missing? Where are XPath and XQuery? RDF? "This specification is a kitchen sink full of technologies for the Web." – the WHATWG website
  • One big issue for HTML5 right now is the difference in performance between the different browsers’ implementations of the new specs.

Web Performance BootCamp 2013: Presentation Transcript

  • Web Performance Boot Camp Daniel Austin PayPal, Inc. HTML5 DevConf Oct. 24, 2013 V 2.2
  • Overture: Goals of the Class • Provide a basic understanding of Web performance for Architects, Developers, Designers, and Engineers • Empower YOU to identify and resolve performance problems and make your pages and applications faster!!! • Demonstrate and explain how to use common tools and techniques used in our industry to solve performance problems
  • Scope of Web Performance Anything that uses HTTP. Always from the End User's Point of View. Web Request/Response Only!
  • Current State of the Art • Web performance is both an Art and a Science (but it's not yet Engineering) • Multiple tools and methodologies, largely ad hoc, contend in the marketplace (but little of it is well thought out or based on scientific reasoning). • Things are getting better – W3C involvement and competitive pressures, as well as better infrastructure and the influx of new users in Asia, are driving more attention to performance. There's hope.
  • Reading List • Performance by Design - Daniel A. Menasce (Safari) • The Practical Performance Analyst - Neil J. Gunther • Elements of Networking Style - M.A. Padlipsky • High Performance Web Sites – Steve Souders
  • Tools Used in This Class • Excel (or similar spreadsheet program) • Online Testing Tools – webpagetest.org • Desktop Testing Tools – your browser, Firebug, Visual Round Trip Analyzer (VRTA), netmon, dig, ping, and others • Optional: R (and RStudio), Mathematica, SPSS
  • Class Structure Schedule • Start: 9:00 AM • Break: 10:30-10:45 • Lunch: 12:30-1:15 • Break: 2:30-2:45 • End: 4:00 PM Agenda • Section I – What Is Performance? • Section II – Performance Basics • Section III – The MPPC Model • Section IV – Tools & Testing • Section V – HTML5 & Performance • Section VI – Mobile Devices
  • Section I What is Performance?
  • What Problem Are We Trying To Solve? • World-class response times compared to our competitors • Reliable, predictable performance for users worldwide • Efficient use of resources: cost scales linearly with traffic • Delighted users!
  • Impact of Scalability on Business • Google – 500 ms reduces traffic to sites by 20% • Yahoo! – 400 ms reduces traffic by 5-9% • Amazon – 100 ms reduces revenue by 1% • Compuware – 1 sec delay reduces conversion by 7%
  • Who Needs Performance Anyway? Who needs ‘Performance’ measurements anyway?
  • A More Rational Approach?
  • Systemic Qualities In a Nutshell “Anything you can say about a black box – from the outside” • Systemic qualities are the "ilities" – physical features of the system such as capacity, performance, and scalability • The SQs correspond to different groups of stakeholders: users, developers, operators, organizations • SQs are the best measure of the quality of the user's experience of the system, regardless of the feature set
  • The Four Classes of Systemic Qualities Manifest Qualities - What the users see • Usability, Performance, Reliability, Availability, Accessibility Operational Qualities - What the system operators see • Throughput, Manageability, Security, Serviceability Developmental Qualities - What developers see • Buildability, Budgetability, Planability Evolutionary Qualities - How the system changes over time • Scalability, Maintainability, Extensibility, Reusability, Portability
  • The Manifest Systemic Qualities • Usability reflects the ease with which users can accomplish their goals • Performance reflects how much time users must wait for actions to complete • Reliability measures how often the system fails • Availability measures uptime vs. downtime • Accessibility measures the system's ability to serve users regardless of location or physical condition (including I18N and L10N)
  • Performance is a Balancing Act Performance isn’t everything; sometimes we’re called on to make choices about which systemic qualities have priority over others. Security v. performance is a common tradeoff – what would you choose?
  • What Is Performance? PERFORMANCE IS RESPONSE TIME PERFORMANCE IS RESPONSE TIME PERFORMANCE IS RESPONSE TIME PERFORMANCE IS RESPONSE TIME PERFORMANCE IS RESPONSE TIME
  • Section II Performance Basics
  • Scales of Measure
  • Statistics 101
  • Comparison of Mean, Median, and Mode Comparison of mean, median and mode of two log-normal distributions with different skewness.
  • Outliers, or Why We Use the Median • A: skewed to the left • B: skewed to the right • C: symmetrical
  • Statistical Distributions • Discrete or continuous? • Mean, median, sigma, 95%? • Is it reasonable? |mean – median| <= sigma • Does it correlate?
  • Understanding the Margin of Error The margin of error is a measure of how close the sample results are likely to be to the true value. • Margin of error at 99% confidence = 1.29/sqrt(n) • Margin of error at 95% confidence = 0.98/sqrt(n) • Margin of error at 90% confidence = 0.82/sqrt(n) (where n is the number of sample data points)
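
The slide's rule-of-thumb formulas translate directly into a few lines of R (R is on the class's optional tools list). This is only a sketch of the rule of thumb, not a general confidence-interval calculator; the function name and sample sizes are made up for illustration.

    # Margin-of-error rule of thumb from the slide, as an R sketch.
    # The constants 1.29, 0.98, 0.82 correspond to 99%, 95%, 90% confidence.
    margin_of_error <- function(n, confidence = 0.95) {
      k <- switch(as.character(confidence),
                  "0.99" = 1.29,
                  "0.95" = 0.98,
                  "0.9"  = 0.82,
                  stop("use 0.90, 0.95, or 0.99"))
      k / sqrt(n)
    }

    margin_of_error(100)         # ~0.098, roughly +/- 10% at 95% confidence
    margin_of_error(400, 0.99)   # ~0.065
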
  • 5 Number Reports A simple way of summarizing a sample that shows the shape of the distribution, extreme values, variance, and skewness. The five numbers: minimum, 1st quartile, median, 3rd quartile, maximum. This is how you get a sense of the data…
  • 5 Number Reports in Excel Excel functions: Min = MIN(Data Range) Q1 = QUARTILE(Data Range, 1) Q2 = QUARTILE(Data Range, 2) Q3 = QUARTILE(Data Range, 3) Max = MAX(Data Range)
  • 5 Number Reports in R Let's make a 5 number report in R: http://www.r-project.org/
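
A minimal sketch of that exercise in base R; the response-time values in rt are invented for illustration.

    # A 5-number report in R, assuming 'rt' holds response-time samples in ms
    # (made-up values).
    rt <- c(212, 198, 305, 250, 221, 890, 204, 233, 260, 215)

    fivenum(rt)    # min, lower hinge, median, upper hinge, max (Tukey's five numbers)
    summary(rt)    # min, 1st quartile, median, mean, 3rd quartile, max
    quantile(rt)   # the same quartiles via the quantile() function
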
  • Operational Research • Developed during WWII for managing armies and supply chains • A set of rules or 'laws' that describe the operational aspects of a system • Useful for understanding the performance of any system • Utilization Law • Forced Flow Law • Little's Law • Response Time Law
  • Resources and Queues Si: service time at resource i; Ri: queue residence time; Ni: queue length. In general, systems consist of many combined queues and resources.
  • The Utilization Law • The utilization (Ui) of resource i is the fraction of time that the resource is busy. • Xi: average throughput of queue i, i.e. average number of requests that complete from queue i per unit of time • Si: average service time of a request at queue i per visit to the resource Ui = Xi * Si
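
A one-line worked example of the Utilization Law in R; the throughput and service-time numbers are made up.

    # Utilization Law, Ui = Xi * Si, with made-up numbers:
    # a resource serving X = 120 requests/s with average service time S = 5 ms.
    X <- 120       # throughput, requests per second
    S <- 0.005     # service time, seconds per request
    U <- X * S     # utilization: fraction of time the resource is busy
    U              # 0.6, i.e. busy 60% of the time
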
  • Interactive Response Time Law (image courtesy Prof. Raj Jain)
  • Hands-on: Using the Response Time Law in the Real World Let's say Facebook's Web servers can process 10K 'like' requests/second, and the number of concurrent users is 600K. If each user waits 5s between requests, how long will each request take? R = (N/X) – Z
  • Hands-on: Using the Response Time Law in the Real World R = (N/X) – Z = (600,000 / 10,000 requests/s) – 5 s = 60 s – 5 s = 55 s. Facebook needs more servers!
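
The same calculation as an R sketch, so you can plug in your own N, X, and Z.

    # Response Time Law, R = N/X - Z, using the numbers from the exercise.
    N <- 600000     # concurrent users
    X <- 10000      # throughput, 'like' requests per second
    Z <- 5          # think time between requests, seconds
    R <- N / X - Z  # average response time, seconds
    R               # 55 seconds -- far too slow; Facebook needs more servers!
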
  • Antipattern: Keyhole optimization Problem: Optimizing your project Antipattern: Optimizing *your* project, at the expense of everyone else! Pattern: Your project is part of the system – optimize for the overall system performance, not just what you can see, even if that means your part is less-than-perfectly optimized.
  • Section III The MPPC Model
  • Dimensions of Performance • Geography, network location, bandwidth, transport type, browser/device type – RT varies by as much as 50% • Page composition – client-side rendering and execution effects (JS, CSS) • Network transport effects – # of connections, CDN use
  • Hardware and Routing
  • The OSI Stack Model
  • OSI Functionality Summary • Application – access network resources • Presentation – translate, compress, encrypt • Session – establish, manage, terminate sessions • Transport – reliable message delivery end to end • Network – transmit packets from source to destination • Data Link – organize bits into frames • Physical – transmit bits
  • All People Seem To Need Data Processing: Application = All, Presentation = People, Session = Seem, Transport = To, Network = Need, Data Link = Data, Physical = Processing
  • HTTP Connection Flow The client's perceived response time is made up of connection setup, handshake time, request transmission time, estimated server processing time, and response transmission time. The more HTTP requests & network round trips you require, the slower your performance will be: images, CSS, JS, DNS lookups, redirects, # of packets.
  • The MPPC Model Of Web Performance “Multiple Parallel Persistent Connections” Each request initiated by the user passes through four segments: T1 connection time (DNS/network resolution), T2 server duration, T3 transport/payload delivery time, and T4 browser rendering/page composition time. This entire cycle, steps 1-4, is repeated once for each external reference on the page, so for a given page the total time is T = Σ over the n+1 requests of (Δt1 + Δt2 + Δt3 + Δt4), where n is the number of external page requisites.
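
A small R sketch of the MPPC sum, applying the formula on the slide literally to some made-up per-request timings (real numbers would come from a waterfall or the W3C timing APIs). With parallel connections the segments overlap, so treat this as an upper bound rather than a prediction.

    # MPPC total-time sketch: T = sum over the n+1 requests of (t1 + t2 + t3 + t4).
    # The per-request timings below are invented; 4 rows = base page + n = 3
    # external requisites.
    page <- data.frame(
      t1 = c(120,  30,  30,  30),   # connection time (DNS + TCP + SSL), ms
      t2 = c(250,  80,  60, 120),   # server duration, ms
      t3 = c( 90,  40,  35,  70),   # transport / payload delivery, ms
      t4 = c( 60,  10,  10,  25)    # browser parse / render offset, ms
    )

    T_total <- sum(rowSums(page))   # total time under the slide's serial sum
    T_total                         # 1060 ms for this made-up page
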
  • T1 – Making the Connection T1 = DNS + TCP + SSL • Typically a larger part of the E2E than expected • Highly variable • SSL is slow!
  • Why DNS Matters • Nothing happens before DNS! • The user does not see anything on their page → waiting time • Homework Assignment: create a hosts file for yourself. Try your favorite sites without DNS! • DNS has a great impact on users' perceptions in HTTP applications
  • Interacting with DNS: dig time
  • T2 – The Server Duration • Model the server as a finite queue: let ρ = λ/μ (offered load) • U = utilization, a function of ρ and the queue size W • X = U * μ (throughput) • Navg = average number of requests in the system • … so T2 = Navg/X (the response time law)
  • T3 – TCP Transport Time • Single object: T3 = Sz/R + 2*RTT + T_idle • For persistent parallel connections: T3 = Σ(i=1..M+1) Si/Ri + ⌈M/(k*Nh)⌉ * 3*RTTi + T_idle … for 1 base HTML page with M objects, with Si bits, at bandwidth Ri, k connections per host, and Nh unique hostnames
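
An R sketch of the T3 estimate for persistent parallel connections, following the formula above; every input value (object sizes, bandwidth, connection count, RTT, idle time) is a made-up assumption.

    # T3 sketch for persistent parallel connections:
    # T3 = sum(S_i / R_i) + ceiling(M / (k * N_h)) * 3 * RTT + T_idle
    S      <- rep(50e3, 21)  # object sizes in bits: base page + M = 20 objects
    R      <- 5e6            # bandwidth, bits per second
    M      <- 20             # number of external objects
    k      <- 6              # parallel connections per host
    N_h    <- 2              # unique hostnames
    RTT    <- 0.08           # round-trip time, seconds
    T_idle <- 0.05           # idle time, seconds

    T3 <- sum(S / R) + ceiling(M / (k * N_h)) * 3 * RTT + T_idle
    T3                       # ~0.74 s under these assumptions
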
  • T4 – What the Browser Does T4 = Σ(i=1..n) t_off(i), where t_off = time offset to parse the HTML, JS, CSS, and establish the individual connections (to different hostnames) • T4 is especially significant for mobile devices!
  • Where are the delays?
  • Bandwidth Efficiency Bmax ≈ 1.22 * MSS / (RTT * sqrt(L)), where L is the packet loss rate
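
This appears to be the Mathis et al. TCP throughput ceiling, with the loss rate under the square root in the denominator. Here it is in R with made-up MSS, RTT, and loss values.

    # TCP throughput ceiling: Bmax ~= 1.22 * MSS / (RTT * sqrt(L))
    MSS <- 1460 * 8    # maximum segment size, bits
    RTT <- 0.08        # round-trip time, seconds
    L   <- 0.001       # packet loss rate (assumed)

    Bmax <- 1.22 * MSS / (RTT * sqrt(L))
    Bmax / 1e6         # ~5.6 Mbit/s: more raw bandwidth won't raise this ceiling
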
  • Bandwidth, Latency, and All That
  • More Bandwidth? Less Latency!
  • Hands-on: Testing DNS Response Times We'll use nslookup for this exercise 1. Run 10 nslookup commands for a site (e.g. www.facebook.com) 2. Observe the response time for the DNS lookup 3. Calculate statistics for the results • 5 number report (summary) • Sketch the distribution • What can you say about the response times for DNS?
  • Antipattern: That's Outside My Control It's never the case that there is 'nothing you can do' about a performance problem. Antipattern: avoiding solving a performance issue because you think it's outside your control. This path leads to despair. Pattern: compensate in some other part of the E2E. Think outside the box.
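
One way to do step 3 in R, assuming you recorded the lookup times by hand in milliseconds (the values below are invented).

    # Analyzing DNS lookup times from the nslookup exercise (made-up data).
    dns_ms <- c(12, 15, 11, 210, 13, 14, 12, 95, 13, 12)

    fivenum(dns_ms)                                    # the 5-number report
    mean(dns_ms); median(dns_ms)
    abs(mean(dns_ms) - median(dns_ms)) <= sd(dns_ms)   # the 'is it reasonable?' check
    hist(dns_ms, breaks = 10)  # sketch the distribution: expect a long right tail
                               # when some lookups miss the resolver cache
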
  • Section IV Tools and Testing
  • Let's Talk Tools • Site Performance Services – Gomez, Keynote, AlertSite, ThousandEyes: 'wholesale' testing – statistical data for many page views under different conditions; operational testing; best for understanding global and network effects • Page Analysis Tools – YSlow, MS Visual Round Trip Analyzer, HTTPWatch, many others, F12 in your browser: 'retail' testing – one page or app; diagnostic; best for functional testing
  • Commercial Testing Services • Gomez, AlertSite, and Keynote toolsets are similar in many ways • Synthetic test setup • Test nodes in large datacenters and/or end users' machines • Statistical data about response times
  • Performance Testing Locations Your Data Centers QA & Test
  • HTTP Object Model Web pages are made up of Page Objects (or Components). A Test is a sequence of one or more URLs for which HTTP requests will be made. A Monitor is a set of predefined Tests to be run at specific times and places. Each Page Object has 4 associated time segments: t1, t2, t3, t4.
  • Desktop Tools • Methodology – DOM crawler and packet sniffer – More accurate – Analyzes components – Stats view • Implements the 14 YSlow Rules – All browsers except IE – Mobile bookmarklet – Best tool for page analysis
  • Unix Performance Testing Tools • ping • nslookup • dig • traceroute • netstat
  • Task-Based Performance Thinking A state diagram of user tasks (Welcome, Inbox, Bulk, Read Message, Compose, Verify, Send Confirm) with transition probabilities between steps (e.g. P(A,B) = 0.5168) and response times per step (1400 ms, 2200 ms, 3200 ms). Exercise: What % of users follow the path A -> E -> F -> G?
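
A tiny R sketch of the exercise: the path probability is the product of the transition probabilities along A -> E -> F -> G. The three probabilities below are hypothetical placeholders, not the values from the diagram.

    # Path probability for A -> E -> F -> G (placeholder probabilities).
    p_AE <- 0.35    # Welcome -> Compose
    p_EF <- 0.65    # Compose -> Verify
    p_FG <- 0.58    # Verify -> Send Confirm

    p_path <- p_AE * p_EF * p_FG
    p_path          # ~0.13, i.e. about 13% of users follow this path
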
  • Testing Your Competitors for Fun & Profit
  • StormCat: Global Performance Testing • Cannot compare performance data out-of-region • There are many global factors involved in performance: bandwidth, ISP, infrastructure, secular cycles (weeks, holidays, usage patterns) • The best approach: use the 'StormCat' system! • Best case (Northern California high broadband @ 3 AM) • Worst case (rural Indonesia on VSNL @ 2 PM local) • Divide the range into 5 categories equally spaced between the best & worst: some locales will be in Cat I, some in Cat II, some in Cat III, etc.
  • Hands-on: Analyzing Waterfall Diagrams http://www.webpagetest.org • Choose a location and a browser and test: http://www.yahoo.com
  • Cached v. Uncached
  • Antipattern: Design-time Failure Performance is a design time activity! Anti-Pattern: Releasing a new or modified product without testing its performance “Bake it in up front!”
  • Section V HTML5 & Performance
  • A Federated Model for HTML Core HTML5 markup (this is XHTML 1.1) surrounded by Canvas 2D, Web Workers, HTML Media, Web Sockets, IndexedDB, and Web Storage. Source: Sergey Mavrody, c. 2013
  • The Co-Evolution of HTML, JS, CSS, and XML XML Core, XSLT, XSD, XPath/XQuery, the Document Object Model, JavaScript…, JSON… Source: Sergey Mavrody, c. 2013
  • Tower of Babel: A Problem We Have Yet to Solve
  • The Current Browser Landscape
  • HTML5 Performance
  • W3C Resource Timing
  • W3C Navigation Timing
  • Hands-on Exercise: Testing Performance the W3C Way • Use the Navlet: http://code.google.com/p/navlet/ Make a bookmark or favorite using the code
  • Antipattern: We’ll Be Done With This Soon Performance is an ongoing activity, not fire and forget! Antipattern: Not treating performance as a property of the system, or only testing at release time. Pattern: establishing a long-term performance management plan as part of your cycle.
  • Section VI Mobile Devices
  • The Big Picture – Mobile is Growing
  • Native Apps v. HTML5 v. Desktop • Native Apps will run ~ 5x faster than HTML5 • Roughly 10x slower than desktop • HTML5 on the mobile device can be 50x slower – 10x from the ARM chip – 5x from JavaScript
  • Mobile Apps Are Slow If you're designing for mobile, it's safe to assume you're going to incur 2000 ms of 3G latency.
  • Slow Compared to What? Since 2009, mobile browsers went from 30x to 5x slower than desktops – Better-than-Moore's-Law improvement (!) – JavaScript v. native code ~ 5x – 4G/LTE ~ 27% faster than 3G
  • Mobile Speeds by Example
  • Users Expect More. Now.
  • Mobile JavaScript Performance • That 5x is in the code interpretation? • Typed arrays • JSON layout • DOM Manipulation • Garbage Collection
  • How 3G/4G Networks Work
  • 3G to 4G Migration
  • HTML5: New Features for Mobile http://mobilehtml5.org/
  • Delay-tolerant Application Design • Plan for offline/intermittent connectivity • Caching local content – Local storage – Don't be afraid to use sessions – Use HTTP caching headers wisely • Always have failure modes built in
  • The Right Tool For the Right Job Source: Nick Zakas
  • Best Practices for Mobile • Tread lightly on the JavaScript • Don't touch the DOM! • CDNs are less effective due to network challenges • TTFB is not a good measure of server duration • Use Web Workers for preloading • Test performance on different transport types • Test battery consumption!
  • 4 Takeaways on Mobile Performance Mobile HTML apps are slow compared to native apps …but it's not all about JavaScript Mobile networking is a big challenge …so design for delay-tolerance HTML5 is designed for Mobile …so use it (wisely)! Use the right tool for the right job …including the right design patterns for Mobile
  • Tools for Mobile Testing • Speedtest/Ookla – Variability – Characteristics of different kinds of networks • iCurl – Simple HTTP Operations on your device • HTTPWatch Basic – Look at the Waterfall – Gather detailed data along with iCurl
  • Hands-on Exercise: Testing Mobile Performance
  • Antipattern: It's The Application, Stupid! T2 (server duration) ~ 35% of total E2E – more on mobile, however! Antipattern: failing to recognize that the distribution of the mobile E2E is very different from a desktop performance profile. Pattern: carefully analyze the MPPC numbers for your site and identify the problems that need to be solved and in what order.
  • Finale – Summing Up
  • The 7 Habits of Exceptional Performance 1. Make Performance a Priority 2. Test, Measure, Test Again 3. Learn about the Tools 4. Balance Performance with Features 5. Track Results Over Time 6. Set Targets 7. Ask Questions; Check It for Yourself! Thanks to Tenni Theurer
  • YSlow Rules! • Rule 1 - Make Fewer HTTP Requests • Rule 2 - Use a Content Delivery Network • Rule 3 - Add an Expires Header • Rule 4 - Gzip Components • Rule 5 - Put Stylesheets at the Top • Rule 6 - Put Scripts at the Bottom • Rule 7 - Avoid CSS Expressions • Rule 8 - Make JavaScript and CSS External • Rule 9 - Reduce DNS Lookups • Rule 10 - Minify JavaScript • Rule 11 - Avoid Redirects • Rule 12 - Remove Duplicate Scripts • Rule 13 - Configure ETags • Rule 14 - Make AJAX Cacheable Source: stevesouders.com
  • Every Tool Has Its Place in the Universe
  • The One Number of Truth “42”
  • About:HTML “…a single user-interface to many large classes of stored information such as reports, notes, data-bases, computer documentation and on-line systems help” WorldWideWeb: Proposal for a HyperText Project, Berners-Lee & Cailliau, 1990
  • Theme of the Work Ultimately, Performance is about Respect.
  • Thank You! Daniel Austin PayPal, Inc. HTML5 DevConf Oct. 24, 2013 @daniel_b_austin da@x.com