
SANER 2015 ERA track: Differential Flame Graphs

Flame graphs can be used to analyze software profiles. We introduce the differential flame graph, which can be used to detect and analyze regressions in those profiles.

  1. Understanding Software Performance Regressions Using Differential Flame Graphs. Cor-Paul Bezemer, Johan Pouwelse, Brendan Gregg. Delft University of Technology, 3/5/15.
  2. Regression Testing
  3. Regression Testing Cycle
     1) Write code
     2) Deploy to continuous integration system
     3) Test
     4) Deploy to production, or return to 1)
  4. Performance Regression Testing
     • Functional testing is difficult, but functionality is either OK or broken.
     • Performance testing is even more difficult:
       • It depends on the requirements of the application.
       • A library may be fast enough for one application while it is too slow for another.
       • How much performance regression is acceptable?
  5. (image slide)
  6. What to Measure
     • Built-in performance metrics (or counters):
       • Response time, throughput, queue lengths, utilization
       • Usually system-wide
     • Profilers:
       • Per-function info on execution time, bytes written, etc.
       • Higher granularity == more expensive
     • After measuring: tons of data! :) / :(
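As a concrete sketch of the profiler option above, Python's built-in cProfile records per-function execution time; `slow_sum` is a made-up stand-in for real application code:

```python
import cProfile
import io
import pstats

def slow_sum(n):
    # Toy workload standing in for real application code.
    total = 0
    for i in range(n):
        total += i * i
    return total

profiler = cProfile.Profile()
profiler.enable()
slow_sum(100_000)
profiler.disable()

# Collect per-function statistics, sorted by cumulative time.
stream = io.StringIO()
stats = pstats.Stats(profiler, stream=stream)
stats.sort_stats("cumulative").print_stats(5)
print(stream.getvalue())
```

Even this tiny run illustrates the slide's point: a profiler yields one row per function, and on a real system that quickly becomes "tons of data".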
  7. (image slide)
  8. Detecting I/O Regression
     • Previous work: “Detecting and analyzing I/O performance regressions”, C. Bezemer, E. Milon, A. Zaidman, J. Pouwelse, Journal of Software: Evolution and Process 26 (12), 1193-1212
     • An approach for finding out which function exhibits an I/O write regression across two software versions
     • The results were promising, but sometimes difficult to interpret
  9. Work in Progress: Visualization
  10. Flame Graphs (1)

      def a():
          if some_condition:
              b()
          ...

      def b():
          if some_condition:
              c()
          ...

      def c():
          ...

      def d():
          if some_condition:
              c()
          ...

      Stack trace, CPU time:
      a()              100
      a() → b()         25
      a() → b() → c()   10
      d()               50
      d() → c()         10
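The (stack trace, CPU time) table above maps directly onto the "folded" text format that Brendan Gregg's flame graph scripts consume: one line per stack, frames joined by semicolons, followed by the measured value. A minimal sketch using the slide's numbers:

```python
# The slide's (stack trace, CPU time) samples, keyed by call stack.
samples = {
    ("a",): 100,
    ("a", "b"): 25,
    ("a", "b", "c"): 10,
    ("d",): 50,
    ("d", "c"): 10,
}

# Folded format: "frame1;frame2;... value", one line per stack.
folded = [";".join(stack) + " " + str(time) for stack, time in samples.items()]
print("\n".join(folded))
```

Each folded line becomes one cell in the rendered flame graph; the stack depth gives the vertical position and the value gives the width.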
  11. Flame Graphs (2)
  12. Flame Graphs (3)
  13. Differential Flame Graphs (1)
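The differential idea can be approximated numerically: key two profiles by their folded stacks and subtract. A minimal sketch with hypothetical numbers (positive differences flag candidate regressions):

```python
# Hypothetical per-stack CPU times for two software versions.
before = {"a": 100, "a;b": 25, "a;b;c": 10, "d": 50}
after = {"a": 120, "a;b": 25, "a;b;c": 30, "d": 45}

# Per-stack difference: positive means the new version got slower.
diff = {stack: after.get(stack, 0) - before.get(stack, 0)
        for stack in set(before) | set(after)}
regressions = {stack: d for stack, d in diff.items() if d > 0}
print(sorted(regressions.items()))  # [('a', 20), ('a;b;c', 20)]
```

A differential flame graph renders exactly this per-stack difference, coloring cells by whether they got faster or slower, so the regression in a()→b()→c() stands out visually instead of being buried in two separate profiles.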
  14. Demo
  15. Differential Flame Graphs (2)
     • You can do this with performance profiles...
     • But also with other data? Ideas:
       • Website clickpaths
       • Parallel/distributed computing
       • ...
     • Let us know! http://corpaul.github.io/flamegraphdiff/
     • (They use this at Netflix, so it must be pretty cool!)
  16. Discussion
     • Data collection is difficult:
       • No out-of-the-box solutions
       • Expensive
     • Data variation between executions can be considerable, especially for 'non-stable' metrics (CPU, memory)
  17. c.bezemer@tudelft.nl
      http://corpaul.github.io/flamegraphdiff/
