Bigger, faster, and more engaging while on a budget!

Keeping a pulse on performance is important, and it proves difficult in a rapidly changing environment: one release to the next can have a significant impact. Introducing performance budgets provides a relatively simple safety net that catches unintended swings in resource usage or changes in page-load timing that might otherwise go unnoticed.

I’ll discuss how to create good performance budgets and why it’s important to have them in place, and tell a true story of how Zillow was able to respond quickly when its budgets were exceeded.

In telling this story, I will provide supporting evidence that “onload” isn’t a good measure of perceived performance, show real business intelligence data demonstrating that it’s possible to deliver content users want without hurting perceived performance, and tie these events together with RUM data that enabled the engineering team to make the best decision for the business and, ultimately, the customer.


  1. Bigger, Faster, and more Engaging while on a Budget! Nathan Bower (@nwbower)
  2. Evolution of Page Weight and Requests: 3X Increase in the Last 5 Years. Image credit: HTTPArchive.org (http://httparchive.org/trends.php)
  3. Reactive Measurements: TOTO. Image credit: http://www.spc.noaa.gov/faq/tornado/toto.htm
  4. Frequent Measurements. Image credit: http://history.msfc.nasa.gov/saturn_apollo/photos/images/10075947_jpg.jpg
  5. Visible Measurements with Alarm
  6. Measure and Trend What’s Important!
     Bigger: size, # of requests
     • All resource types by weight and # of requests: images, fonts, JavaScript, CSS, HTML, total size, total requests, etc.
     Faster: time
     • User-perceived performance, e.g. “AFT Hero Image”, “AFT Search Box”
     • SpeedIndex
     • TTFB
     • PageSpeed
  7. Add W3C User Timing Marks (http://www.w3.org/TR/user-timing/)
     • Measure the performance of the user experience:
       http://blog.patrickmeenan.com/2013/07/measuring-performance-of-user-experience.html
     • “AFT” notation (Above-the-Fold Time)
     • Hero image custom metrics:
       http://www.stevesouders.com/blog/2015/05/12/hero-image-custom-metrics/
     • Example markup:
       <img src="hero.jpg" onload="performance.mark('AFT Hero1')">
       <script>performance.mark('AFT Hero2')</script>
     (A sketch that reads these marks back appears after the slide list.)
  8. Tools for Measuring and Trending
     Synthetic measurements:
     • SpeedCurve.com
     • Independent WPT instance -> Graphite
     • PhantomJS w/ HAR -> HARStorage
     • WPT or PhantomJS w/ HAR -> Splunk
     Real User Monitoring (RUM):
     • Boomerang -> Splunk
     • Numerous companies offering RUM
     (A generic RUM beacon sketch follows the slide list.)
  9. Synthetic vs. RUM
     Synthetic measurements:
     • Consistency in device/network
     • Detailed resource analysis
     • Drawback: limited sample size and frequency
     Real User Monitoring (RUM):
     • Large sample
     • Real conditions
     • Not geographically bound
     • Drawback: limited resource detail (no sizes from W3C Resource Timing)
  10.-13. (Image-only slides; no transcript text.)
  14. Evaluate Performance Trade-offs
      More Engaging: conversion rates. Jakob Nielsen: “Increased conversion is one of the strongest ROI arguments for better user experience and more user research. Track over time, because it's a relative metric.”
      • Trend and correlate traffic patterns to performance: PV/session, session duration, bounce rate
      • Observe variations by build release
      • Heavier and slower modifications may improve engagement! Prove changes through A/B testing.
  15. What Else Should We Budget? (See the Navigation Timing breakdown sketch after the slide list.)
      • 3rd-party resources
      • Service response time
      • Database query time
      • DNS resolution
      • SSL negotiation
      • TCP connect
      • HTTP status codes
  16. Responsive Design: a mobile User Timing mark and a desktop User Timing mark. …wait, these are GOALS!
  17. Managing Performance Budgets (a budget-check sketch follows the slide list)
      • Prioritize user experience
      • Engage and revise alarms
      • Owner buy-in for goals
      • Set regression alarms
      • Optimization burnout
  18.-21. (Image-only slides; no transcript text.)
  22. New Feature: A/B Testing Showed Improved Engagement
  23. Business Intelligence: user engagement with vs. without the budget-exceeding change (11 days prior vs. 11 days post), percent change by device type
      Phone:  UUs +2.31%, Sessions +2.42%, Duration/Session +1.86%, PVs/Session -2.09%, Bounce Rate -1.20%, PVs/UU -1.99%
      Tablet: UUs -0.95%, Sessions -0.96%, Duration/Session +0.98%, PVs/Session -3.64%, Bounce Rate +0.52%, PVs/UU -3.65%
  24. (Image-only slide; no transcript text.)
  25. SpeedIndex Did Not Falter Despite a 4X Weight Increase
  26. RUM Confirmed a 10-Second Regression at the 75th Percentile for “Time to Onload” (see the percentile sketch after the slide list)
  27. Business metrics remained the same! No impact on user-perceived performance, even though measured performance changed in a big way!
  28. Lessons Learned
      • Increase the frequency of measurements: “quick response”!
      • Budgets enable a “call to action”!
      • Prioritize User Timing marks for user-perceived performance.
      • Excess resources on page load may not affect the user’s perception of performance.
  29. (Image-only slide; no transcript text.)
  30. Zillow Engineering Blog: engineering.zillow.com. Come join our team! Zillow.com/jobs
  31. (Image-only slide; no transcript text.)
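
The inline markup on slide 7 only sets the marks. The sketch below is not from the deck; it is a minimal illustration, using the standard W3C User Timing API, of how those "AFT Hero" marks can be read back once the page has loaded so they can be logged or beaconed.

    // Minimal sketch: read back the "AFT Hero" User Timing marks set by the
    // inline snippets on slide 7. Mark names mirror the slide's notation.
    window.addEventListener('load', function () {
      if (!(window.performance && performance.getEntriesByType)) { return; }

      performance.getEntriesByType('mark').forEach(function (mark) {
        // startTime is milliseconds relative to navigationStart.
        console.log(mark.name + ': ' + Math.round(mark.startTime) + ' ms');
      });
    });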
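
Slides 8 and 9 pair Boomerang with Splunk for RUM but don't show the wiring, so here is a hedged, generic sketch of the same idea: gather a few Navigation Timing and User Timing values in the browser and post them to a collection endpoint. The '/rum' URL and the payload field names are assumptions for illustration, not Boomerang's or Zillow's actual API.

    // Generic RUM beacon sketch (not Boomerang itself). '/rum' is a
    // hypothetical endpoint; payload fields are illustrative only.
    window.addEventListener('load', function () {
      setTimeout(function () {           // defer so loadEventEnd is populated
        if (!(window.performance && performance.timing)) { return; }

        var t = performance.timing;
        var hero = performance.getEntriesByName
          ? performance.getEntriesByName('AFT Hero1')[0]
          : null;

        var payload = JSON.stringify({
          page:    location.pathname,
          ttfb:    t.responseStart - t.navigationStart,
          onload:  t.loadEventEnd - t.navigationStart,
          aftHero: hero ? Math.round(hero.startTime) : null
        });

        if (navigator.sendBeacon) {
          navigator.sendBeacon('/rum', payload);   // survives navigation away
        } else {
          var xhr = new XMLHttpRequest();          // fallback for older browsers
          xhr.open('POST', '/rum', true);
          xhr.send(payload);
        }
      }, 0);
    });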
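
Several of the items on slide 15 (DNS resolution, TCP connect, SSL negotiation, service response time) can be derived directly from W3C Navigation Timing. A minimal sketch, assuming the classic performance.timing interface that was current at the time of the talk:

    // Connection-level breakdown from Navigation Timing, in milliseconds.
    function connectionBreakdown() {
      var t = performance.timing;
      return {
        dns:      t.domainLookupEnd - t.domainLookupStart,
        tcp:      t.connectEnd - t.connectStart,
        // secureConnectionStart is 0 when no TLS handshake took place.
        ssl:      t.secureConnectionStart ? t.connectEnd - t.secureConnectionStart : 0,
        ttfb:     t.responseStart - t.requestStart,   // server/service response time
        response: t.responseEnd - t.responseStart     // payload download time
      };
    }

    window.addEventListener('load', function () {
      console.log(connectionBreakdown());   // or add it to the RUM payload above
    });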
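
Slides 6 and 17 describe budgets with regression alarms but don't show a concrete form. This is one possible shape, with made-up thresholds: a table of per-metric budgets and a check that flags anything over budget so an alarm can fire (or a build can fail).

    // Budget-check sketch. Thresholds are made up for illustration and are
    // not Zillow's actual budgets.
    var BUDGETS = {
      totalKB:    1500,   // total page weight
      requests:   90,     // total request count
      aftHeroMs:  2500,   // user-perceived "AFT Hero" mark
      speedIndex: 3000
    };

    function checkBudgets(measured) {
      var failures = [];
      Object.keys(BUDGETS).forEach(function (metric) {
        if (measured[metric] != null && measured[metric] > BUDGETS[metric]) {
          failures.push(metric + ': ' + measured[metric] +
                        ' exceeds budget of ' + BUDGETS[metric]);
        }
      });
      return failures;   // non-empty => raise an alarm
    }

    // Example: numbers from one synthetic run.
    var failures = checkBudgets({ totalKB: 1780, requests: 84,
                                  aftHeroMs: 2300, speedIndex: 3400 });
    if (failures.length) {
      console.warn('Performance budget exceeded:\n' + failures.join('\n'));
    }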
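
The "75th percentile" figure on slide 26 is an aggregate over many RUM beacons. As a reminder of how such a number is typically computed, here is a nearest-rank percentile sketch over made-up onload samples:

    // Nearest-rank percentile over raw RUM samples (values are illustrative).
    function percentile(samples, p) {
      var sorted = samples.slice().sort(function (a, b) { return a - b; });
      var rank = Math.ceil((p / 100) * sorted.length) - 1;
      return sorted[Math.max(0, rank)];
    }

    var onloadSamples = [3200, 4100, 5600, 7800, 9400, 12200, 15800, 21000]; // ms
    console.log('75th percentile time-to-onload: ' +
                percentile(onloadSamples, 75) + ' ms');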
