
Measuring web performance. Velocity EU 2011


How to measure web performance, covering different measurement techniques, the importance of context, real-user measurement, synthetic monitoring, and more.



  1. MEASURING WEB PERFORMANCE. Steve Thair, Seriti Consulting, @TheOpsMgr
  2. Every measurement of web performance you will ever make will be wrong.
  3. [image-only slide]
  4. “The human perception of duration is both subjective and variable” http://en.wikipedia.org/wiki/Time_perception
  5. “PERCEPTION IS VARIABLE…” Go read Stoyan’s talk! http://velocityconf.com/velocity2010/public/schedule/detail/13019
  6. Web Performance: Subjective versus Objective
  7. Subjective (“qualitative techniques”): case studies, focus groups, interviews, video analysis, surveys
  8. Objective (“quantitative techniques”): JavaScript timing, Navigation Timing, browser extensions, custom browsers, proxy timings, web server mods, network sniffing
  9. “I keep six honest serving-men (They taught me all I knew); Their names are What and Why and When and How and Where and Who.” Rudyard Kipling, “The Elephant’s Child”
  10. WHAT LEVEL DO YOU MEASURE? Journey / Page / Object
  11. CHOOSE YOUR METRIC! https://dvcs.w3.org/hg/webperf/raw-file/tip/specs/NavigationTiming/Overview.html
  12. 4 Key “Raw” Metrics • Time to First Byte (TTFB) • Render Start Time • DOMContentLoaded • Page (onLoad) Load Time (PLT)
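Three of these four raw metrics can be read straight from the Navigation Timing API covered later in the deck (render start is not exposed by the basic API). A minimal sketch, assuming a browser that supports `performance.timing`:

```javascript
// Minimal sketch: read three of the four "raw" metrics from the
// Navigation Timing API (render start is not exposed by this API).
window.addEventListener('load', function () {
  // loadEventEnd is only populated once onload has finished, so defer a tick.
  setTimeout(function () {
    var t = window.performance && window.performance.timing;
    if (!t) { return; }                                   // no Navigation Timing support

    var ttfb = t.responseStart - t.navigationStart;        // Time to First Byte
    var dcl  = t.domContentLoadedEventStart - t.navigationStart; // DOMContentLoaded
    var plt  = t.loadEventEnd - t.navigationStart;         // Page (onLoad) Load Time

    console.log('TTFB:', ttfb, 'DOMContentLoaded:', dcl, 'PLT:', plt);
  }, 0);
});
```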
  13. What about “Above the Fold” time?
     • How long to render the static content in the viewable area of the page?
     • Limitations of AFT: only applicable in a lab setting; does not reflect user-perceived latency based on functionality
     http://assets.en.oreilly.com/1/event/62/Above%20the%20Fold%20Time_%20Measuring%20Web%20Page%20Performance%20Visually%20Presentation.pdf
  14. [image-only slide]
  15. WHAT OTHER METRICS? Apdex, statistical metrics, counts/histograms, raw metrics
  16. Apdex(t) = (Satisfied Count + Tolerated Count / 2) / Total Samples
     • A number between 0 and 1 that represents “user satisfaction”
     • For technical reasons the “Tolerated” threshold is set to four times the “Satisfied” threshold, so if your “Satisfied” threshold (t) was 4 seconds then: 0 to 4 seconds = Satisfied; 4 to 16 seconds = Tolerated; over 16 seconds = Frustrated.
     http://apdex.org/
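The Apdex arithmetic is simple enough to sketch directly. The helper below is hypothetical (not part of apdex.org's tooling); it just applies the formula above to an array of page load times in seconds, using the slide's example threshold of t = 4 seconds:

```javascript
// Minimal sketch of the Apdex formula: (satisfied + tolerated/2) / total.
// Thresholds: satisfied <= t, tolerated <= 4t, frustrated beyond that.
function apdex(samplesSeconds, t) {
  var satisfied = 0, tolerated = 0;
  samplesSeconds.forEach(function (s) {
    if (s <= t) { satisfied++; }
    else if (s <= 4 * t) { tolerated++; }
  });
  return (satisfied + tolerated / 2) / samplesSeconds.length;
}

// With t = 4s: 2s and 3s are satisfied, 10s is tolerated, 20s is frustrated.
console.log(apdex([2, 3, 10, 20], 4)); // 0.625
```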
  17. PERFORMANCE IS MULTI-DIMENSIONAL: multiple metrics, for multiple URLs, from different locations, using different tools, across the lifecycle, over time
  18. The importance of CONTEXT
  19. Context: location; bandwidth (wired, WiFi, 3G); latency; operating system; cached objects; antivirus; add-ons & extensions; browser; device; resolution; time of day
  20. [image-only slide]
  21. Who? When? Across the SDLC (Develop, Build/CI, QA, Prod): User Experience Design (UX), Developers, Testers, Ops / WebOps, “The Boss”
  22. WHERE – DEPENDS ON THE HOW & WHY… Measurement points between the user and the server: web browser (“real user” or synthetic agent), WiFi or 3G smartphone, proxy server, the Internet, firewall / load-balancer, (reverse) proxy server, SPAN port or network tap (network “sniffer”), web server. The signal/noise ratio increases along this path, with user/browser metrics at one end and server-based metrics at the other.
  23. The Synthetic Versus Real-User Debate
  24. “…it’s a question of when, not if, active monitoring of websites for availability and performance will be obsolete.” - Pat Meenan
     “Because you’re skipping the ‘last mile’ between the server and the user’s browser, you’re not seeing how your site actually performs in the real world.” - Josh Bixby
     “You can have my active monitoring when you pry it from my cold, dead hands…” - Steve Thair
     http://blog.patrickmeenan.com/2011/05/demise-of-active-website-monitoring.html
     http://www.webperformancetoday.com/2011/07/05/web-performance-measurement-island-is-sinking/
     http://www.seriticonsulting.com/blog/2011/5/21/you-can-have-my-active-monitoring-when-you-pry-it-from-my-co.html
  25. Observational Study Versus Experiment
  26. Experiment versus Observational Study
     • Both typically have the goal of detecting a relationship between the explanatory and response variables.
     • Experiment: create differences in the explanatory variable and examine any resulting changes in the response variable (cause-and-effect conclusion).
     • Observational study: observe differences in the explanatory variable and notice any related differences in the response variable (association between variables).
     http://www.math.utah.edu/~joseph/Chapter_09.pdf
  27. Observational Study = Real-User
     • “Watching” what happens in a given population sample
     • We can only observe… and try to infer what is actually happening
     • Many “confounding variables”
     • A lot of noise relative to signal
     • Correlation
  28. Context (again): location; bandwidth (wired, WiFi, 3G); latency; operating system; cached objects; antivirus; add-ons & extensions; browser; device; resolution; time of day
  29. Observational Study = Real-User versus Experiment = Synthetic
     • Observational (real-user): “watching” what happens in a given population sample; we can only observe… and try to infer what is actually happening; many “confounding variables”; a lot of noise; correlation.
     • Experiment (synthetic): we “design” our experiment; we choose when, where, what, how, etc.; we control the variables (as much as possible); less noise; causation*.
     * OK, real “root cause” analysis will probably take a lot more investigation, I admit… but you get closer!
  30. So which one is better? Neither. Complementary, not competing. “…Ultimately I’d love to see a hybrid model where synthetic tests are triggered based on something detected in the data (slowdown, drop in volume, etc.) to validate the issue or collect more data.” - Pat Meenan
  31. Real-User Monitoring detects a change in a page’s performance → API call to a synthetic controlled test and compare to baseline → use RUM as a “reality check”. From observation… by controlling the variables… to experiment.
  32. Back to the “How”… Objective (“quantitative techniques”): JavaScript timing, Navigation Timing, browser extensions, custom browsers, proxy timings, web server mods, network sniffing
  33. 7 WAYS OF MEASURING WEB PERF
     1. JavaScript timing, e.g. Souders’ Episodes or Yahoo! Boomerang
     2. Navigation Timing, e.g. GA SiteSpeed
     3. Browser extension, e.g. HTTPWatch
     4. Custom browser, e.g. 3pmobile.com or (headless) PhantomJS.org
     5. Proxy timing, e.g. Charles proxy
     6. Web server mod, e.g. APM solutions
     7. Network sniffing, e.g. Atomic Labs Pion
  34. COMPARING METHODS…

| Metric | JavaScript | Navigation-Timing API | Browser Extension | Custom Browser | Proxy Debugger | Web Server Mod | Network Sniffing |
|---|---|---|---|---|---|---|---|
| Example Product | WebTuna | SiteSpeed | HTTPWatch | 3PMobile | Charles Proxy | APM Modules | Pion |
| "Blocked/Wait" | No | No | Yes | Yes | Yes | No | No |
| DNS | No | Yes | Yes | Yes | Yes | No | No |
| Connect | No | Yes | Yes | Yes | Yes | No | Yes |
| Time to First Byte | Partially | Yes | Yes | Yes | Yes | Yes | Yes |
| "Render Start" | No | No | Yes | Yes | No | No | No |
| DOMReady | Partially | Yes | Yes | Yes | No | No | No |
| "Page/HTTP Complete" | Partially | Yes | Yes | Yes | Yes | No | Partially |
| OnLoad Event | Yes | Yes | Yes | Yes | No | No | No |
| JS Execution Time | Partially | No | Yes | Yes | No | No | No |
| Page-Level | Yes | Yes | Yes | Yes | Partially | Partially | Partially |
| Object Level | No | No | Yes | Yes | Yes | Yes | Yes |
| Good for RUM? | Yes | Yes | Partially | No | No | Partially | Yes |
| Good for Mobile? | Partially | Partially | Partially | Partially | Partially | Partially | Partially |
| Affects Measurement | Yes | No | Yes | Yes | Yes | Yes | No |
  35. JAVASCRIPT TIMING – HOW IT WORKS: unload event: var start = new Date().getTime() → stick it in a cookie → load the next page → onLoad event: var end = new Date().getTime() → PLT = end - start → send a beacon: beacon.gif?time=plt
     https://dvcs.w3.org/hg/webperf/raw-file/tip/specs/NavigationTiming/Overview.html
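A minimal sketch of this flow, assuming a placeholder cookie name (`perf_start`) and beacon URL rather than any particular product's implementation:

```javascript
// Minimal sketch of cookie-based JavaScript timing.
window.addEventListener('unload', function () {
  // On the way out of page N, stamp the start time into a cookie...
  document.cookie = 'perf_start=' + new Date().getTime() + '; path=/';
});

window.addEventListener('load', function () {
  // ...and on page N+1's onLoad, read it back and compute the PLT.
  var match = document.cookie.match(/(?:^|; )perf_start=(\d+)/);
  if (!match) { return; }                       // first page in the journey
  var plt = new Date().getTime() - parseInt(match[1], 10);
  new Image().src = '/beacon.gif?time=' + plt;  // send the beacon
});
```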
  36. 36. PROS & CONS OF JAVASCRIPT TIMING Metric JavaScript • Pro’s Example Product WebTuna • Simple "Blocked/Wait" No • Episodes/Boomerang provide custom timing for DNS No developer instrumentation Connect No Time to First Byte Partially • Cons "Render Start" No DOMReady Partially • Relies on Javascript and Cookies "Page/HTTP Partially Complete" • Only accurate for 2 nd page in journey OnLoad Event Yes JS Execution Time Partially • Can only really get a “page load metric” and a Page-Level Object Level Yes No partial TTFB metric Good for RUM? Good for Mobile? Yes Partially • “Observer effect” (and Javascript can break!)Affects Measurement Yes (C) SERITI CONSULTING, 2011 08/11/2011 36
  37. NAVIGATION-TIMING – HOW IT WORKS: onLoad event: var now = new Date().getTime(); var plt = now - performance.timing.navigationStart; → send a beacon: beacon.gif?time=plt
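A minimal sketch of the same flow, again with a placeholder beacon URL:

```javascript
// Minimal sketch of the Navigation Timing approach shown above.
window.addEventListener('load', function () {
  if (!window.performance || !performance.timing) { return; } // no API support
  var now = new Date().getTime();
  var plt = now - performance.timing.navigationStart;  // full page load time
  new Image().src = '/beacon.gif?time=' + plt;          // send the beacon
});
```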
  38. NAVIGATION TIMING METRICS: https://dvcs.w3.org/hg/webperf/raw-file/tip/specs/NavigationTiming/Overview.html
  39. 39. PROS & CONS OF NAVIGATION-TIMING Metric Navigation- • Pro’s Timing API Example Product SiteSpeed • Even simpler! "Blocked/Wait" No • Lots more metrics DNS Yes Connect Yes • More accurate Time to First Byte Yes "Render Start" No • Cons DOMReady Yes "Page/HTTP • Need browser support for API Yes Complete" OnLoad Event Yes • IE9+ / Chrome 6+ / Firefox 7+ JS Execution Time No Page-Level Yes • Relies on Javascript (for querying API & beacon) Object Level No Good for RUM? Yes • “Observer effect” Good for Mobile? PartiallyAffects Measurement No • Page-level only (C) SERITI CONSULTING, 2011 08/11/2011 39
  40. A BIT MORE ABOUT GA SITESPEED…
     • Just add one line for basic, free, real-user monitoring: _gaq.push(['_setAccount', 'UA-12345-1']); _gaq.push(['_trackPageview']); _gaq.push(['_trackPageLoadTime']);
     • Sampling appears to vary (a lot!): 10% of page visits by design, but reported anywhere from 2% to 100%
     • Falls back to Google Toolbar if available (but NOT to JavaScript timing)
     • Will probably make you think perf is better than it really is…
  41. [image-only slide]
  42. [image-only slide]
  43. [image-only slide]
  44. BROWSER EXTENSION – HOW IT WORKS: write a browser extension… that subscribes to a whole lot of API event listeners… get your users to install it… send the timing back to a collector, e.g. showslow.com
     https://developer.mozilla.org/en/XPCOM_Interface_Reference
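As a rough illustration of the same idea, here is a sketch using today's WebExtension APIs rather than the XPCOM interfaces referenced on the slide; the collector URL is hypothetical and the extension would need the `webNavigation` permission:

```javascript
// background.js -- illustration only, using WebExtension APIs.
var navStart = {};   // tabId -> timestamp of the top-level navigation start

chrome.webNavigation.onBeforeNavigate.addListener(function (details) {
  if (details.frameId === 0) {                     // top-level frame only
    navStart[details.tabId] = details.timeStamp;
  }
});

chrome.webNavigation.onCompleted.addListener(function (details) {
  if (details.frameId !== 0 || !(details.tabId in navStart)) { return; }
  var plt = details.timeStamp - navStart[details.tabId];
  delete navStart[details.tabId];
  // Ship the measurement off to a collector (e.g. a showslow-style service).
  fetch('https://collector.example.com/beacon?plt=' + Math.round(plt));
});
```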
  45. 45. PROS & CONS OF BROWSER EXTENSIONS Metric Browser • Pros Extension • Very complete metrics Example Product HTTPWatch "Blocked/Wait" Yes • Object and Page level DNS Connect Yes Yes • No javascript (in the page at least)!!! Time to First Byte Yes • Great for continuous integration perf testing "Render Start" Yes DOMReady Yes • Cons "Page/HTTP Complete" Yes • Getting users to install it… OnLoad Event Yes JS Execution Time Yes • Not natively cross-browser Page-Level Yes Object Level Yes • Some browsers don’t support extensions Good for RUM? Good for Mobile? Partially Partially • Especially mobile browsers!Affects Measurement Yes • “Observer effect” (C) SERITI CONSULTING, 2011 08/11/2011 45
  46. CUSTOM BROWSER – HOW IT WORKS: take some open-source browser code (like WebKit or the Android Browser) → add custom instrumentation for performance measurement → get users to install it… → send the timing back to a collector, e.g. 3pmobile.com
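For the headless route, PhantomJS scripts are plain JavaScript; a minimal load-timing sketch along the lines of its well-known loadspeed example:

```javascript
// loadspeed.js -- run as: phantomjs loadspeed.js http://example.com/
var page = require('webpage').create();
var system = require('system');
var address = system.args[1];
var start = Date.now();

page.open(address, function (status) {
  if (status !== 'success') {
    console.log('FAILED to load ' + address);
  } else {
    console.log('Page load time: ' + (Date.now() - start) + ' ms');
  }
  phantom.exit();
});
```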
  47. 47. PROS & CONS OF CUSTOM BROWSER Metric Custom • Pros Browser Example Product 3PMobile • Great when you can’t use extensions / javascript / cookies "Blocked/Wait" Yes ie. For mobile performance e.g. 3Pmobile.com DNS Yes Connect Yes • Great for automation e.g. http://www.PhantomJS.org/ Time to First Byte Yes "Render Start" Yes • Good metrics (depending on OS API availability) DOMReady Yes "Page/HTTP • Cons Yes Complete" OnLoad Event Yes • Requires installation JS Execution Time Yes Page-Level Yes • Maintaining fidelity to “real browser” measurements Object Level Yes Good for RUM? No • “Observer Effect” (due to instrumentation code) Good for Mobile? PartiallyAffects Measurement Yes (C) SERITI CONSULTING, 2011 08/11/2011 47
  48. PROXY DEBUGGER – HOW IT WORKS: change the browser to use a debugging proxy, e.g. Charles or Fiddler → the debugging proxy records each request → export the data to a log
  49. 49. PROS & CONS OF PROXY DEBUGGER Metric Proxy • Pros Debugger Example Product Fiddler • One simple change to browser config Proxy "Blocked/Wait" Yes • No Javascript / Cookies DNS Yes Connect Yes • Can offer bandwidth throttling Time to First Byte Yes "Render Start" No • Cons DOMReady "Page/HTTP No • Proxies significantly impact HTTP traffic Yes Complete" • http://insidehttp.blogspot.com/2005/06/using-fiddler-for- OnLoad Event No JS Execution Time No performance.html Page-Level Partially Object Level Yes • No access to browser events Good for RUM? No • Concept of a “page” be problematic… Good for Mobile? PartiallyAffects Measurement Yes (C) SERITI CONSULTING, 2011 08/11/2011 49
  50. 6 Keep-Alive connections per SERVER versus 8 Keep-Alive connections TOTAL per PROXY (Firefox 7.0.1)
  51. WEB SERVER MOD – HOW IT WORKS: write a web server mod or ISAPI filter → start a timer on request → stop the timer on response → send the timing back to a collector, e.g. AppDynamics
     http://www.apachetutor.org/dev/request
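The slide's examples are native Apache modules and ISAPI filters; purely to illustrate the start-timer-on-request / stop-timer-on-response idea, here is a sketch of the same pattern as a plain Node.js HTTP server:

```javascript
// Illustration only: the same start/stop-timer idea as an Apache mod or
// ISAPI filter, sketched as a plain Node.js HTTP server wrapper.
var http = require('http');

http.createServer(function (req, res) {
  var start = Date.now();                   // start a timer on request
  res.on('finish', function () {            // stop the timer when the response is sent
    var serverTime = Date.now() - start;
    // Send the timing to a collector / log it for the APM back end.
    console.log(req.method, req.url, serverTime + ' ms');
  });
  res.end('Hello');                         // the application's real work goes here
}).listen(8080);
```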
  52. 52. PROS & CONS OF WEB SERVER MOD Metric Web Server • Pros Mod APM • Great for Application Performance Management (APM) Example Product Modules • Can be used in a “hybrid mode” with Javascript timing "Blocked/Wait" No DNS No • Measuring your “back-end” performance Connect No Time to First Byte Yes • Can be easy to deploy* "Render Start" No DOMReady No • Cons "Page/HTTP Complete" No • Limited metrics, ignores network RTT and only sees origin OnLoad Event No requests JS Execution Time No • “Observer Effect” (~5% server perf hit with APM?) Page-Level Partially Object Level Yes • Concept of a “page” be problematic… Good for RUM? Partially Good for Mobile? Partially • Can be a pain to deploy*Affects Measurement Yes (C) SERITI CONSULTING, 2011 08/11/2011 52
  53. NETWORK SNIFFING – HOW IT WORKS: create a SPAN port or network tap → promiscuous-mode packet sniffing → assemble TCP/IP packets into HTTP requests → assemble HTTP requests into “pages” → record the timing data in a database
  54. 54. PROS & CONS OF NETWORK SNIFFING Metric Network • Pros sniffing • No “observer effect” (totally “passive”) Example Product Pion "Blocked/Wait" No • Very common “appliance-based” RUM solution DNS No Connect Yes • Can be used in a “hybrid mode” with Javascript timing Time to First Byte "Render Start" Yes No • Can be easy to deploy* DOMReady No • Cons "Page/HTTP Partially Complete" • Limited metrics and only sees origin requests OnLoad Event No JS Execution Time No • Not “cloud friendly” at present Page-Level Partially Object Level Yes • Concept of a “page” be problematic… Good for RUM? Yes Good for Mobile? Partially • Can be a pain to deploy*Affects Measurement No (C) SERITI CONSULTING, 2011 08/11/2011 54
  55. SUMMARY
     • Performance is subjective (but we try to make it objective)
     • Performance is multi-dimensional
     • Context is critical
     • Observational studies AND experiments
     • Real User Monitoring AND Synthetic Monitoring
     • 7 different measurement techniques, each with pros & cons
  56. @LDNWEBPERF USER GROUP!
     • Join our London Web Performance Meetup: http://www.meetup.com/London-Web-Performance-Group/
     • Next Wednesday, 16th Nov, 7pm, London (Bank): WPO case study from www.thetimes.co.uk!
     • Follow us on Twitter @LDNWebPerf
     • #LDNWebPerf & #WebPerf
  57. QUESTIONS? http://mobro.co/TheOpsMgr
