
Towards Holistic Continuous Software Performance Assessment

In agile, fast, and continuous development lifecycles, software performance analysis is fundamental to confidently release continuously improved software versions. Researchers and industry practitioners have identified the importance of integrating performance testing into agile development processes in a timely and efficient way. However, existing techniques are fragmented: they neither account for the heterogeneous skills of the users developing polyglot distributed software, nor for their need to automate performance practices as these are integrated into the whole lifecycle without breaking its intrinsic velocity. In this paper we present our vision for holistic continuous software performance assessment, which is being implemented in the BenchFlow tool. BenchFlow enables performance testing and analysis practices to be pervasively integrated into continuous development lifecycle activities. Users can specify performance activities (e.g., standard performance tests) by relying on an expressive Domain Specific Language for objective-driven performance analysis. Collected performance knowledge can thus be reused to speed up performance activities throughout the entire process.

My talk from the International Workshop on Quality-aware DevOps (QUDOS 2017). Cite us: http://dl.acm.org/citation.cfm?id=3053636

Transcript

  1. @QUDOS 2017: Towards Holistic Continuous Software Performance Assessment. Vincenzo Ferme, Cesare Pautasso. http://benchflow.inf.usi.ch
  2. Software Development Nowadays: Continuous Feedback
  3. Software Development Nowadays: Continuous Feedback, DevOps
  4. Continuous Feedback: Repo
  5. Continuous Feedback: Repo, CI Server
  6. Repo, CI Server: What about performance?
  7. Repo, CI Server: What about performance? Continuous Deployment
  8. Existing Performance Engineering Tools, particularly open-source ones
  11. Existing Performance Engineering Tools, particularly open-source ones: DataMill
  12. Existing Performance Engineering Tools, particularly open-source ones: DataMill, Cloud WorkBench
  13. Existing Performance Engineering Tools, particularly open-source ones: DataMill, Cloud WorkBench, inspectIT
  14. Limitations of Current Solutions: not integrated in CSA (e.g., they do not leverage Continuous Feedback). CSA: Continuous Software Assessment.
  15. Limitations of Current Solutions: not integrated in CSA (e.g., they do not leverage Continuous Feedback); rarely automating the end-to-end process. CSA: Continuous Software Assessment.
  16. DevOps: Professional Profiles Perspective: different professional profiles (e.g., Developers, Q/A and Operations Engineers)
  17. DevOps: Professional Profiles Perspective: different professional profiles (e.g., Developers, Q/A and Operations Engineers) with heterogeneous and cross-cutting skills
  18. Holistic Continuous Software Performance Assessment: Vision
  22. Approach Overview: Repo, CI Server
  23. Approach Overview: 3 Main Features
  24. Approach Overview: 3 Main Features: Performance Knowledge
  25. Approach Overview: 3 Main Features: Performance Knowledge, CSA Integration (DSL)
  26. Approach Overview: 3 Main Features: Performance Knowledge, CSA Integration (DSL), Objective-driven Tests
  27. Approach Overview: 3 Main Features: Performance Knowledge, CSA Integration (DSL), Objective-driven Tests, Fast Perf. Feedback
  28. Approach Overview: 3 Main Features: Performance Knowledge, CSA Integration (DSL), Objective-driven Tests, Fast Perf. Feedback (benchflow)
  29. Integration in Development Lifecycles (DSL)
  30. DSL Overview (Literature): Load Functions, Workloads, Simulated Users, Test Data, TestBed Management, Client-side Perf. Data Analysis, Definition of Configuration Tests
  31. Main DSL Features: Integration in CSA
  32. Main DSL Features: Integration in CSA; SUT-awareness (know your SUT)
  33. Main DSL Features: Integration in CSA; SUT-awareness (know your SUT); Collection and Analysis of Performance Data
  34. Main DSL Features: Integration in CSA; SUT-awareness (know your SUT); Collection and Analysis of Performance Data; Objective-Driven Performance Testing
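
Taken together, these features suggest a single declarative test definition. The following is a minimal sketch of what such a definition might look like; the syntax is assumed to be YAML (as in the DSL examples on the next slides), and every section name other than objective is a hypothetical illustration, not the actual BenchFlow DSL:

    # Sketch of a BenchFlow test definition; all keys except "objective"
    # are assumptions made for illustration
    sut:                          # SUT-awareness: declare what is tested
      name: service_A
      deployment: docker-compose.yml
    workload:                     # test driving, as in the DSL literature (slide 30)
      users: 50
      load_function: steady
    objective:                    # objective-driven testing (slides 35-55)
      type: configuration
    data_collection:              # collection and analysis of performance data
      monitors: [cpu]
      collectors: [stats, logs]
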
  35. Objective-Driven Performance Testing
  36. Objectives Taxonomy: Base Objectives (Test Types): standard performance tests, e.g., load test, stress test, spike test, and configuration test
  37. Objectives Taxonomy: Base Objectives (Test Types): standard performance tests, e.g., load test, stress test, spike test, and configuration test. Objectives: specific types of performance engineering activities, e.g., capacity planning and performance optimisations
  38. Objectives Taxonomy: Base Objectives (Test Types): standard performance tests, e.g., load test, stress test, spike test, and configuration test. Objectives: specific types of performance engineering activities, e.g., capacity planning and performance optimisations. Meta-Objectives: defined from already collected performance knowledge, e.g., comparing different systems using a benchmark
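
One way to read the taxonomy is that the objective's type selects the level: a base objective names a standard test type, an objective names a performance engineering activity, and a meta-objective additionally references previously collected performance knowledge. A hedged sketch; only the configuration type appears on the slides, so the other type names and keys are assumptions:

    # Base objective: a standard test type
    objective:
      type: load
    ---
    # Objective: a performance engineering activity
    objective:
      type: capacity_planning
    ---
    # Meta-objective: defined from already collected performance knowledge
    objective:
      type: benchmarking
      compare_with: [system_X, system_Y]   # hypothetical key
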
  39. Example: Configuration Test
      objective:
        type: configuration
        observation:
          - …
        exploration_space:
          - …
        termination_criteria:
          - …
  40. Example: Configuration Test (observation, for service_A and service_B)
      observation:
        service_A:
          - ram_avg
          - cpu_avg
          - response_time_90th_p
        service_B:
          - ram_avg
  41. Example: Configuration Test (exploration space, for service_A)
      exploration_space:
        service_A:
          resources:
            - memory:
                range: 1GB...5GB
                step: +1GB
            - cpus:
                range: 1...4
          environment:
            - SIZE_OF_THREADPOOL:
                range: 5...100
                step: +5
            - …
  42. Example: Configuration Test (termination criteria)
      termination_criteria:
        - max_exec_time = 1h
        - ...
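
Assembling the fragments from slides 39-42 yields one complete configuration-test definition. The nesting below is an assumption (the slides show the sections separately), and the compact inline notation is used only for brevity:

    objective:
      type: configuration
      observation:
        service_A: [ram_avg, cpu_avg, response_time_90th_p]
        service_B: [ram_avg]
      exploration_space:
        service_A:
          resources:
            - memory: { range: 1GB...5GB, step: +1GB }
            - cpus: { range: 1...4 }
          environment:
            - SIZE_OF_THREADPOOL: { range: 5...100, step: +5 }
      termination_criteria:
        - max_exec_time = 1h
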
  43. Example: Configuration Test (plot of the exploration space over the SIZE_OF_THREADPOOL, CPUs, and memory dimensions)
  46. Example: Configuration Test: models such as MARS, Kriging, … fitted over the exploration space (SIZE_OF_THREADPOOL, CPUs, memory) for the declared observation metrics (service_A: ram_avg, cpu_avg, response_time_90th_p; service_B: ram_avg)
  47. Objectives: capacity planning (also based on some constraints), e.g., CPU, RAM. Why? The cost of resources is important for the business.
  48. Objectives: capacity planning (also based on some constraints), e.g., CPU, RAM. Why? The cost of resources is important for the business. Performance optimisation based on some (resource) constraints, e.g., which configuration is optimal? Why? Responsiveness is important for the user.
  49. Example: Performance Optimisation (for service_A and service_B)
      optimisation_target:
        service_A:
          - min(response_time_90th_p)
        service_B:
          - min(memory)
        ...
  50. Example: Performance Optimisation: models such as MARS, Kriging, … over the exploration space (CPUs, memory, SIZE_OF_THREADPOOL), driven by the optimisation target (service_A: min(response_time_90th_p); service_B: min(memory))
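
By analogy with the configuration test, a complete performance-optimisation objective could combine the optimisation target of slide 49 with an exploration space like the one on slide 41; the type name and the combination itself are assumptions, not shown on the slides:

    objective:
      type: optimisation              # assumed type name
      optimisation_target:
        service_A:
          - min(response_time_90th_p)
        service_B:
          - min(memory)
      exploration_space:              # as in the configuration test (slide 41)
        service_A:
          resources:
            - memory: { range: 1GB...5GB, step: +1GB }
            - cpus: { range: 1...4 }
          environment:
            - SIZE_OF_THREADPOOL: { range: 5...100, step: +5 }
      termination_criteria:
        - max_exec_time = 1h
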
  52. Meta-Objectives: Regression: is the performance, capacity, or scalability still the same as previous tests show?
  53. Meta-Objectives: Regression: is the performance, capacity, or scalability still the same as previous tests show? What-If Analysis: what do we expect to happen to the output/dependent variables if we change some of the input/independent variables?
  54. Meta-Objectives: Regression: is the performance, capacity, or scalability still the same as previous tests show? What-If Analysis: what do we expect to happen to the output/dependent variables if we change some of the input/independent variables? Before-and-After: how has the performance changed given that some features have been added?
  55. Meta-Objectives: Regression: is the performance, capacity, or scalability still the same as previous tests show? What-If Analysis: what do we expect to happen to the output/dependent variables if we change some of the input/independent variables? Before-and-After: how has the performance changed given that some features have been added? Benchmarking: how does the performance of different systems compare?
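
The slides define meta-objectives only informally, so the following regression sketch is entirely hypothetical: it assumes a type name, a way to reference stored performance knowledge, and a tolerance for flagging a regression:

    objective:
      type: regression
      baseline:
        source: performance_knowledge    # previously collected results
        reference: last_green_build      # hypothetical baseline selector
      observation:
        service_A: [response_time_90th_p]
      fail_if:
        - response_time_90th_p > baseline * 1.10   # assumed 10% tolerance
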
  56. Fast Performance Feedback
  57. Different Types of Fast Feedback: evaluating if the system is ready for the defined performance test objectives and reaches the expected state
  58. Different Types of Fast Feedback: evaluating if the system is ready for the defined performance test objectives and reaches the expected state; reusing collected performance knowledge
  59. Reuse Performance Knowledge Along the Workflow/Pipeline, before the execution of a test: a CSA step may stop the pipeline
  60. Reuse Performance Knowledge Along the Workflow/Pipeline, before the execution of a test: a CSA step predicts the test outcome and may stop the pipeline
  61. Reuse Performance Knowledge Across Different Iterations of the Same Test, during the execution of a test (CSA steps running on the CI Server)
  62. Reuse Performance Knowledge Across Different Iterations of the Same Test, during the execution of a test: a CSA step reuses the results of the previous iteration
  64. Reuse Performance Knowledge Across Branches, after the execution of a test (branches feat_a and feat_b, each with a CSA step)
  65. Reuse Performance Knowledge Across Branches, after the execution of a test: a CSA step that passes on one branch is reused on the other
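
A sketch of how such a gating CSA step could be wired into a CI pipeline. The GitLab-CI-style layout, the benchflow command, and its flags are all hypothetical; the point is that stored performance knowledge lets the step predict an outcome and stop the pipeline before running the full test:

    # Hypothetical CI pipeline with a performance gate (GitLab-CI-style YAML)
    stages:
      - build
      - performance
      - deploy

    performance_gate:
      stage: performance
      script:
        # assumed CLI: predict the outcome from stored knowledge first,
        # then run the full test only if the prediction passes
        - benchflow run --definition configuration-test.yml --predict-first
      allow_failure: false   # a predicted regression stops the pipeline
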
  66. Highlights: Current Solutions + Limitations (recap of slide 15)
  67. Highlights: Current Solutions + Limitations; Approach Overview (recap of slides 15 and 28)
  68. Highlights: Current Solutions + Limitations; Approach Overview; Approach Details (recap of slides 15, 28, and 34)
  69. Thank you! benchflow: vincenzo.ferme@usi.ch, http://benchflow.inf.usi.ch
  70. Backup Slides
  71. BenchFlow Tool Overview (architecture diagram: Faban drivers and harness, TestExecution, services Serv. A … Serv. N in Containers on Servers; MONITOR with Adapters; COLLECTORS; Minio, Instance Database, Analyses; DATA TRANSFORMERS and ANALYSERS producing Performance Metrics and Performance KPIs)
  72. Docker Performance [IBM '14]: W. Felter, A. Ferreira, R. Rajamony, and J. Rubio. An updated performance comparison of virtual machines and Linux containers. IBM Research Report, 2014. "Our results show that containers result in equal or better performance than VMs in almost all cases." "Although containers themselves have almost no overhead, Docker is not without performance gotchas. Docker volumes have noticeably better performance than files stored in AUFS. Docker's NAT also introduces overhead for workloads with high packet rates. These features represent a tradeoff between ease of management and performance and should be considered on a case-by-case basis." BenchFlow configures Docker for performance by default.
  73. BenchFlow: System Under Test: the Docker Engine
  74. BenchFlow: System Under Test: the Docker Engine runs Containers
  75. BenchFlow: System Under Test: Docker Machine provides the Docker Engine, which runs Containers
  76. BenchFlow: System Under Test: + Docker Swarm
  77. BenchFlow: System Under Test: Docker Swarm manages the Servers
  78. BenchFlow: System Under Test: + Docker Compose
  79. BenchFlow: System Under Test: Docker Compose deploys the SUT's Deployment Conf.
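
A sketch of what the SUT's deployment configuration could look like for the service_A/service_B example, as a Docker Compose file. Image names and resource values are invented; the named volume and host networking reflect the performance defaults motivated on slide 72 (volumes beat AUFS-backed files; NAT adds overhead at high packet rates):

    # Hypothetical docker-compose.yml for the SUT of the running example
    version: "2.2"
    services:
      service_A:
        image: example/service-a:latest   # assumed image
        network_mode: host                # avoid Docker NAT overhead
        volumes:
          - data_a:/var/lib/service_a     # named volume instead of AUFS files
        mem_limit: 2g                     # one point of the exploration space
        cpus: 2
        environment:
          SIZE_OF_THREADPOOL: "50"
      service_B:
        image: example/service-b:latest   # assumed image
        network_mode: host
        mem_limit: 1g
    volumes:
      data_a:
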
  80. Server-side Data and Metrics Collection (architecture: Faban drivers and harness, TestExecution, Web Service, WfMS, and DBMS in Containers on Servers)
  81. Server-side Data and Metrics Collection: + MONITOR
  82. Server-side Data and Metrics Collection. Monitors' characteristics: RESTful services; lightweight (written in Go); as unintrusive to the SUT as possible. Examples of monitors: CPU usage, database state.
  83. Server-side Data and Metrics Collection: CPU Monitor and DB Monitor attached to the SUT (characteristics and examples as on slide 82)
  85. Server-side Data and Metrics Collection: + COLLECTORS (Minio, Instance Database, Analyses)
  86. Server-side Data and Metrics Collection. Collectors' characteristics: RESTful services; lightweight (written in Go); two types, online and offline; buffer data locally. Examples of collectors: container stats (e.g., CPU usage), database dump, application logs.
  87. Server-side Data and Metrics Collection: Stats Collector and DB Collector attached to the SUT (characteristics and examples as on slide 86)
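
A sketch of how monitors and collectors might be declared per SUT service in the test definition; the key names are assumptions derived from the characteristics listed on slides 82 and 86 (RESTful, lightweight Go services; online vs. offline collectors that buffer data locally):

    # Hypothetical data-collection section of a test definition
    data_collection:
      monitors:
        service_A: [cpu_usage]             # CPU usage monitor
        dbms: [db_state]                   # database state monitor
      collectors:
        service_A:
          - stats: { type: online }        # container stats during the test
          - logs: { type: offline }        # application logs after the test
        dbms:
          - db_dump: { type: offline, buffer: local }
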
  88. Performance Metrics and KPIs: a high-throughput distributed messaging system coordinates data collection and data transformation (TestExecution, Faban drivers and harness, Web Service, WfMS, DBMS, Stats Collector, DB Collector, Minio, Instance Database, Analyses)
  89. Performance Metrics and KPIs: COLLECT
  90. Performance Metrics and KPIs: COLLECT, Publish
  91. Performance Metrics and KPIs: COLLECT, Publish, Subscribe
  92. Performance Metrics and KPIs: COLLECT, Publish, Subscribe, Read
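
Finally, a sketch of the COLLECT/Publish/Subscribe/Read coordination in configuration form. The slides name the interaction steps but neither the topics nor the component bindings, so everything below is an illustrative assumption:

    # Hypothetical wiring of the metrics pipeline over the messaging system
    pipeline:
      collect:
        stats_collector: { publish_to: raw.stats }   # COLLECT, then Publish
        db_collector: { publish_to: raw.db }
      transform:
        subscribe_to: [raw.stats, raw.db]            # Subscribe to notifications
        read_from: minio                             # Read the collected payloads
        publish_to: metrics
      analyse:
        subscribe_to: [metrics]
        output: [performance_metrics, performance_kpis]
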
