An exploratory study of the state of practice of performance testing in Java-based open source projects

Presentation at the SPEC DevOps RG

  1. An exploratory study of the state of practice of performance testing in Java-based open source projects (Cor-Paul Bezemer, Philipp Leitner)
  2. Industrial applications depend more and more on open source software
  3. Is the dependency on open source software justified (or wise)?
  4. We know (more or less) how to assess quality of functionality...
  5. But there is no standardized way of assessing performance!
  6. What is the state of practice of performance testing in Java-based open source software?
  7. We conducted an exploratory study on 111 Java-based projects
  8. We conducted an exploratory study on 111 Java-based projects: selected by the occurrence of ‘bench’ or ‘perf’ in the src/test directory
  9. [Figure: four histograms showing the number of studied projects by total commits, individual contributors, stars, and watchers]
  10. We manually identified (performance) tests
      ● Following process:
        – Search the test files for performance-related terms (‘perf’, ‘bench’, ‘fast’, ‘speed’, etc.)
        – Manually identify performance tests
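A minimal sketch of the search step on the slide above, assuming a simple keyword scan over Java test files (this illustrates the idea, not the authors' actual tooling; the term list and entry point are invented):

```java
import java.io.IOException;
import java.nio.file.*;
import java.util.List;
import java.util.stream.Stream;

public class PerfTestFinder {
    // Performance-related terms from the slide; the full list is an assumption.
    private static final List<String> TERMS = List.of("perf", "bench", "fast", "speed");

    public static void main(String[] args) throws IOException {
        Path testDir = Paths.get(args.length > 0 ? args[0] : "src/test");
        try (Stream<Path> files = Files.walk(testDir)) {
            files.filter(p -> p.toString().endsWith(".java"))
                 .filter(PerfTestFinder::mentionsPerformanceTerm)
                 .forEach(p -> System.out.println("Candidate performance test: " + p));
        }
    }

    private static boolean mentionsPerformanceTerm(Path file) {
        try {
            String content = Files.readString(file).toLowerCase();
            return TERMS.stream().anyMatch(content::contains);
        } catch (IOException e) {
            return false; // unreadable files are simply skipped
        }
    }
}
```

The final classification still requires the manual pass from the slide, since a term like ‘fast’ also appears in plenty of purely functional tests.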
  11. We studied performance testing from five perspectives
      1. The developers who are involved
      2. The extent of performance testing
      3. The organization of performance tests
      4. The types of performance tests
      5. The tools used
  12. Perspective 1: Developers
      – Performance tests are usually done by a small group of developers (median 2 vs. median 9 developers)
      – Performance testers are usually core developers of the project
  13. Perspective 2: The extent of testing
      The performance test suite is usually small
        – A few hundred LOC compared to a few thousand LOC for the functional test suite
        – Exceptions are Hadoop and Deuce STM (Software Transactional Memory)
      Projects that claim to be the fastest/most efficient do not seem to take extra measures to support their claims
  14. Perspective 3: Organization
      There is no standardized way of organizing and conducting performance tests
        – Performance tests are scattered throughout the (test) code
        – Previous results and instructions for executing a performance test are included in code comments
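A hypothetical illustration of the pattern the slide above describes (the class name, instructions, and numbers are all invented): a performance test scattered into ordinary test code, with its run instructions and previous results surviving only as comments.

```java
// Hypothetical example of the organization anti-pattern; nothing here is
// taken from a real project in the study.
public class CacheTest {

    // To run: disable the other tests in this class, then execute with -Xmx2g.
    // Results on the maintainer's laptop (2016-03-01):
    //   put/get of 100k entries: ~480 ms
    public void testCachePerformance() {
        // ... timing loop lives here, next to ordinary unit tests ...
    }
}
```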
  15. Perspective 4: Types of performance tests
      Type 1: Performance smoke test (50% of the projects) ― test a particular use case
      Type 2: Microbenchmarks (32%) ― test smaller units of code
      Type 3: One-shot performance tests (15%) ― test a known buggy case
      Type 4: Performance assertions (5%) ― ‘4x as fast as ...’
      Type 5: Implicit performance tests (5%) ― ‘test’ using data that is generated as a side-effect (e.g., the execution time of a test)
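As a hedged sketch of what Type 1 and Type 4 can look like together, here is a plain JUnit 4 test that times one use case and guards it with a hard-coded budget. The workload (sorting one million ints) and the 500 ms threshold are invented for illustration; such absolute budgets are highly machine-dependent.

```java
import static org.junit.Assert.assertTrue;

import java.util.Arrays;
import java.util.Random;
import org.junit.Test;

public class SortPerformanceTest {

    @Test
    public void sortsOneMillionIntsWithinBudget() {
        int[] data = new Random(42).ints(1_000_000).toArray();

        long start = System.nanoTime();
        Arrays.sort(data); // the use case being smoke-tested (Type 1)
        long elapsedMs = (System.nanoTime() - start) / 1_000_000;

        // Type 4 flavor: a hard-coded performance assertion.
        assertTrue("sort took " + elapsedMs + " ms", elapsedMs < 500);
    }
}
```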
  16. Perspective 5: Tools
      Approach 1: Unit testing (51% of the projects)
        – Usually using JUnit instead of a ‘dedicated’ performance unit test framework
      Approach 2: Stand-alone performance tests (50%)
        – Custom written for the project
      Approach 3: Dedicated performance testing framework (16%)
        – Usually Caliper or JMH
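For Approach 3, a minimal JMH microbenchmark sketch; JMH is one of the two dedicated frameworks the study actually found in use, while the benchmarked operation (string concatenation) is invented for illustration:

```java
import org.openjdk.jmh.annotations.Benchmark;
import org.openjdk.jmh.annotations.Scope;
import org.openjdk.jmh.annotations.State;

@State(Scope.Thread)
public class ConcatBenchmark {

    private String a = "hello";
    private String b = "world";

    @Benchmark
    public String concat() {
        // Returning the result lets JMH consume it, preventing the JIT from
        // eliminating the benchmarked code as dead.
        return a + b;
    }
}
```

Unlike the hand-rolled timing in the JUnit sketch earlier, JMH handles warm-up, forking, and iteration management itself, which custom stand-alone tests otherwise have to reimplement.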
  17. Implications
      – There is a lack of a ‘killer app’ for performance testing
      – Writing performance tests is not a popular task in open source projects
      – Developers want support for quick-and-dirty performance testing
      – Performance testing is multi-faceted
      – Integration into standard CI frameworks is key
  18. Conclusion
      ● Open source developers do not seem very enthusiastic about writing performance tests
      ● There is a lack of a ‘killer app’ for performance testing
      What we are doing as researchers does not seem to reach practice!
      Cor-Paul Bezemer, bezemer@cs.queensu.ca
      http://sailhome.cs.queensu.ca/~corpaul/
