Velocity 2013 London: Developer-Friendly Web Performance Testing in Continuous Integration

  1. Developer-Friendly Web Performance Testing in Continuous Integration
     Michael Klepikov, Google
     Velocity Conference, London, 2013
  2. Overview
     ● Web performance testing is hard
     ● On the sidelines of the web app lifecycle
     ● Move it to the mainstream
     ● Integrate with the main UI continuous build
     ● Make fixes cheaper
  3. Perf testing on the sidelines
     ● Many awesome perf test tools
       ○ WebPageTest.org
     ● Domain of a few experts
     ● High maintenance
     ● Hard to add to a regular continuous build
  4. Why hard to integrate?
     ● Focus on the browser only
       ○ No control of server start/stop
       ○ The continuous build (CB) must run at a specific changelist
     ● Own test scheduler (e.g. WebPageTest)
       ○ Impedance mismatch with the main CB scheduler
  5. Expensive to fix in production
     ● Complex end-to-end system
       ○ A regression could be anywhere, not just in the browser
     ● Hard to find the culprit change
       ○ 100s-1000s of developer changes + cherrypicks
     ● Production release logistics
  6. Dedicated Perf Tests Rot
     ● UI evolves
     ● Tests break
     ● Who will fix them?
  7. Selenium 2.0: WebDriver
     ● Pervasive adoption in web UI tests
     ● All major browsers, desktop and mobile
       ○ W3C standard
     ● No direct support for perf testing
       ○ Bad idea to measure around Selenium/WebDriver API calls (see the sketch below)
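To make that last point concrete, here is a minimal Java sketch of the anti-pattern (the page URL is hypothetical): timing around a WebDriver call measures RPC round-trips and harness overhead, and stops the clock before asynchronous work completes, so the number neither matches what users experience nor stays stable across runs.

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class NaiveTiming {
  public static void main(String[] args) {
    WebDriver driver = new ChromeDriver();
    // Anti-pattern: wall-clock timing around a WebDriver call.
    long start = System.currentTimeMillis();
    driver.get("https://example.com");  // hypothetical page under test
    long elapsed = System.currentTimeMillis() - start;
    // 'elapsed' mixes browser work with WebDriver RPC overhead, and
    // driver.get() returns before XHRs and JS-driven rendering finish,
    // so the measurement is both noisy and incomplete.
    System.out.println("Misleading 'load time': " + elapsed + " ms");
    driver.quit();
  }
}
```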
  8. Real User Monitoring (RUM)
     ● Self-instrumentation + reporting in JS
     ● Arguably more important than perf testing
     ● Track metrics that matter
     ● Interactive pages (page-load time is meaningless there)
     ● Easier to attribute than page-load time
  9. Perf tests that ride on UI tests
     ● UI tests trigger RUM
     ● Intercept the metrics
       ○ and the DevTools traces
     ● UI tests stay green
       ○ and keep the perf tests green with them
  10. Collect DevTools traces
      ● Selenium Logging API: LogType.PERFORMANCE (see the sketch below)
      ● Save them in the test results
      ● Compare before vs. after traces
        ○ Great aid for debugging regressions
      ● The test infrastructure does it transparently
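A sketch of that collection step with the Selenium Java bindings: enable the performance log, run the test as usual, then drain the log. Capability names have shifted across Selenium/ChromeDriver versions (newer Chrome expects "goog:loggingPrefs"), so treat this as era-appropriate rather than definitive.

```java
import java.util.logging.Level;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.logging.LogEntry;
import org.openqa.selenium.logging.LogType;
import org.openqa.selenium.logging.LoggingPreferences;
import org.openqa.selenium.remote.CapabilityType;
import org.openqa.selenium.remote.DesiredCapabilities;

public class DevToolsTrace {
  public static void main(String[] args) {
    // Ask ChromeDriver to record DevTools events as a "performance" log.
    LoggingPreferences prefs = new LoggingPreferences();
    prefs.enable(LogType.PERFORMANCE, Level.ALL);
    DesiredCapabilities caps = DesiredCapabilities.chrome();
    caps.setCapability(CapabilityType.LOGGING_PREFS, prefs);
    WebDriver driver = new ChromeDriver(caps);

    driver.get("https://example.com");  // run the UI test as usual

    // Drain the trace and save it with the test results; each entry's
    // message is a JSON-encoded DevTools event.
    for (LogEntry entry : driver.manage().logs().get(LogType.PERFORMANCE)) {
      System.out.println(entry.getMessage());
    }
    driver.quit();
  }
}
```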
  11. Intercept RUM metrics
      ● Parse them from the saved DevTools trace (see the sketch below)
        ○ …/report?a=10,b=220
      ● Store them in a database
        ○ Time series of time series: changelists, iterations
      ● Graph, auto-detect regressions
        ○ TSViewDB
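A sketch of that parsing step, assuming the slide's example beacon format (name=value pairs after /report?): scan the saved trace for Network.requestWillBeSent events and pull the metrics out of the beacon URL. The event shape follows Chrome's DevTools protocol; org.json is just one convenient parser.

```java
import java.util.HashMap;
import java.util.Map;
import org.json.JSONObject;

public class BeaconParser {
  // Extracts {a=10, b=220} from a DevTools event whose request URL looks
  // like .../report?a=10,b=220; returns null for non-beacon events.
  static Map<String, Long> parse(String logEntryMessage) {
    JSONObject message = new JSONObject(logEntryMessage).getJSONObject("message");
    if (!"Network.requestWillBeSent".equals(message.getString("method"))) {
      return null;
    }
    String url = message.getJSONObject("params")
        .getJSONObject("request").getString("url");
    int q = url.indexOf("/report?");
    if (q < 0) {
      return null;
    }
    Map<String, Long> metrics = new HashMap<>();
    for (String pair : url.substring(q + "/report?".length()).split(",")) {
      String[] kv = pair.split("=");
      metrics.put(kv[0], Long.parseLong(kv[1]));  // e.g. a -> 10
    }
    return metrics;  // ready to store as a time-series point
  }
}
```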
  12. Iterations
      ● Run many iterations of the same UI test
        ○ At the same changelist
      ● Statistically meaningful results
        ○ Keep in mind: the distribution is not normal! (see the sketch below)
      ● The test infrastructure does it transparently
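Because the distribution is not normal, a mean is easily dragged around by a long tail; a small sketch of summarizing iterations by their median instead (the sample values are made up):

```java
import java.util.Arrays;

public class IterationStats {
  // The median is robust to the skew and outliers typical of latency
  // samples; the mean of a long-tailed distribution mostly tracks its tail.
  static double median(long[] samples) {
    long[] sorted = samples.clone();
    Arrays.sort(sorted);
    int mid = sorted.length / 2;
    return (sorted.length % 2 == 1)
        ? sorted[mid]
        : (sorted[mid - 1] + sorted[mid]) / 2.0;
  }

  public static void main(String[] args) {
    // e.g. one metric collected over 9 iterations at one changelist
    long[] a = {10, 11, 10, 12, 10, 48, 11, 10, 11};
    // Prints 11.0, unswayed by the 48 ms outlier.
    System.out.println("median = " + median(a) + " ms");
  }
}
```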
  13. RUM metrics from UI tests
      ● Graphs!
      ● Detect regressions
      ● Drill down
  14. WebPageTest: results UI only
      ● Don't schedule tests on WPT
      ● Send perf test results to the WPT server
        ○ Awesome UI, lots of useful analysis for free
      ● Link from the main test results UI to the WPT page
      ● TSViewDB integration, with drill-down
  15. Performance Test Lifecycle
      Add instrumentation/reporting to your (web) app →
      continuous performance tests → regression alerts → fix →
      push to production → Real User Monitoring → (repeat)
  16. Perf tests run longer: two CBs
      ● Main CB produces green builds
      ● Perf CB runs against those green builds
      ● Bisect between green builds to find the culprit change
  17. Caveats
      ● Do not predict absolute production latency!
        ○ Ideally the server under test runs in isolation in the CB
        ○ Limited variety of hardware and networks
      ● Only detect regressions
        ○ E.g. a daily regression summary email
  18. Conclusions
      ● Use UI functional tests as a base
      ● Intercept RUM; run many iterations
      ● Run continuously!
      ● Auto-detect regressions; the changelist author fixes them!
      ● Debug production less!
  19. Q&A
      ● These slides: goo.gl/HdUCqL
      ● Intercept DevTools: youtu.be/0_kAPWSZNY4
      ● Send results to WPT: gist.github.com/klepikov
      ● TSViewDB: github.com/google/tsviewdb
