Neotys PAC 2018 - Tingting Zong

The PAC aims to promote engagement between experts from around the world and to create relevant, value-added content sharing between members. For Neotys, it strengthens our position as a thought leader in load and performance testing.

Since its beginning, the PAC has been designed to connect performance experts during a single event. In June, over 24 hours, 20 participants convened to explore several topics on the minds of today's performance testers, such as DevOps, Shift Left/Right, Test Automation, Blockchain, and Artificial Intelligence.


  1. Integrate Performance Test with DevOps - Tingting Zong (Ruby)
  2. INTRODUCTION
  3. Hengtian Overview > Global Services. Hengtian is a technology services company in Hangzhou, China, created in 2004 as an alliance among State Street Corporation, Insigma Technology (a Global Outsourcing 100 company), and Zhejiang University. We provide offshore and onshore technology development, research, and consulting services. Hengtian has been certified for compliance with ISO 27001 and CMMI Level 5.
  4. Self Introduction. Holds a Master's degree in Computer Science from Zhejiang University and a Master's in Management of Innovation and Entrepreneurship from École Polytechnique. More than 8 years of experience in software performance engineering; currently Project Manager of the Performance Engineering Group and the Security Engineering Group at Hengtian Software. Over the past 5 years, my team has completed 250+ software performance testing and tuning projects and 100+ software security testing and tuning projects, gaining rich project experience. Four years ago, I began promoting the integration of performance testing and DevOps in the company, with gratifying results. http://peg.hengtiansoft.com/en/index.html
  5. TIANJI 3.0
  6. Basic Concept of Tianji (Dev, IT support, QA, DevOps, online, automation, integration) • Promote the standardization of the entire software delivery process • Connect with project management to build an integrated software production platform • A workflow engine integrates the project production flow with flexible configuration • Continuous integration and continuous testing support rapid submission and feedback to increase efficiency • Embedded automated testing and submit-node quality control improve quality assurance • Automated deployment and environment management enable seamless technical operations
  7. Tianji 3.0 – Hengtian unified software production management platform: software production process, project management & IT support, basic infrastructure (scalable, easy to manage, auto-backup), middleware plus process engine, HT Project Management Platform, Pscan
  8. Tianji CI/CD topology: Dev, Ops, and QA roles around design and coding, submit code, on-demand trigger, submit object package, deploy object, deploy and control; supporting pieces include the control-platform database, code library, object library, reports, and a control dashboard across Dev, QA, and Production working environments
  9. Platform views based on different roles. DEV (PMS, IDE, Sonar, Jenkins, QAMS): receive tasks, coding, submit via the Tianji Portal, trigger code scan, unit test, code review. QA: start test, pre performance test, pre security test, pre auto test, system classification, test project management, pass, automatic execution, release
  10. DevOps pipeline with quality gates: Coding → Code Upload → Code Scan (Sonar) → Unit Test → Code Analysis → Ready for Test → Deploy QA Environment → Function Test → Performance Code Scan (Pscan) → Online URL Based Performance Test → Performance Test and Tuning → Security Test → System Classification → Deploy
  11. Improvement of the development process • Unified code library and code specification • Framework and component standardization • Automatic build/deployment reduces communication costs and process dependencies • Quick feedback and quick fixes. Coding efficiency and development process management: standard version management; task-based autonomous branch management; implantable automated development tests; reduced communication costs at all stages. Foundations: HT standardization framework (Java/.Net/UI/iOS/Android/…), HT component library, code and version specifications, various types of best practices
  12. Improvement of the quality control process • Automatic code scanning • Self-service environment preparation • Self-service deployment • Implantable and repeatable automated execution. Test execution and process control: embedded access-condition node control; release and online controls and management; real-time status monitoring at all stages; reduced communication costs at all stages. Foundations: HT customized rule library (PMD/FindBugs/CheckStyle/FxCop for Java/.Net/iOS/Android/…), automation framework access (HT Selenium framework), HT production management platform, various types of best practices
  13. PSCAN
  14. Pscan analysis dimensions: bytecode files, code grammar abstract tree, database, data structure
  15. Pscan: import the object code, get the code performance analysis result
  16. Pscan: choose customized rules; add new rules
  17. Pscan - Review bytecode files • Analysis object: the compiled bytecode class file • Analysis steps: clearly define the rules; analyze the bytecode content of the sample code; write the detector; add the rules to the rules file
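The deck's detectors target compiled Java class files; Pscan's internals are not shown. As a hedged, language-shifted sketch of the same idea (inspect bytecode, write a detector, check it against a rule), Python's standard `dis` module can report which global names a function's bytecode loads:

```python
import dis

def called_globals(fn):
    """Detector sketch: collect the global names a function's bytecode references."""
    return {ins.argval for ins in dis.get_instructions(fn)
            if ins.opname == "LOAD_GLOBAL"}

def sample():
    print(len("hello"))

# A rule file could then flag discouraged globals (e.g. eval) found in the set.
assert {"print", "len"} <= called_globals(sample)
```

This mirrors the slide's workflow on Python bytecode only; a real Pscan detector would read Java `.class` files instead.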
  18. Pscan - Review bytecode files • Example: a bytecode file and its recognition rule
  19. Pscan - Review the code grammar abstract tree • Analysis object: the AST (Abstract Syntax Tree) • Analysis steps: clearly define the rules; list all the different ways of breaking each rule; analyze the AST characteristics of each variant; write rule code to capture those features
  20. Pscan - Review the code grammar abstract tree • Example: a while loop without curly braces vs. a while loop with curly braces (AST1 vs. AST2), and the corresponding recognition rule
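The brace example is Java-specific, so as an analogous sketch (assumed, not from the deck) here is an AST-based detector in Python for another rule the deck lists later, try..catch inside loops. The `handle` call in the sample is a hypothetical placeholder:

```python
import ast

SAMPLE = """
def process(items):
    for item in items:
        try:
            handle(item)
        except ValueError:
            pass
"""

def find_try_in_loops(source):
    """AST rule sketch: return line numbers of try blocks nested inside loops."""
    tree = ast.parse(source)
    hits = []
    for node in ast.walk(tree):
        if isinstance(node, (ast.For, ast.While)):
            for child in ast.walk(node):
                if isinstance(child, ast.Try):
                    hits.append(child.lineno)
    return hits

print(find_try_in_loops(SAMPLE))  # [4]
```

The structure matches the slide's steps: the rule is defined, the variants (for/while) are enumerated, and the detector captures the AST feature.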
  21. Pscan - Review the database • Analysis object: SQL statement templates • Analysis steps: extract the SQL statements executed in the code; construct a SQL statement template; review the SQL statement template; automated plus manual recognition
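The slide does not show how templates are built; a minimal sketch, assuming literal-masking is the normalization step, turns each captured statement into a template by replacing string and numeric literals with placeholders so that structurally identical queries collapse into one reviewable entry:

```python
import re

def sql_template(stmt):
    """Normalize a captured SQL statement into a template by masking literals."""
    stmt = re.sub(r"'[^']*'", "?", stmt)           # string literals -> ?
    stmt = re.sub(r"\b\d+(\.\d+)?\b", "?", stmt)   # numeric literals -> ?
    return re.sub(r"\s+", " ", stmt).strip()       # collapse whitespace

print(sql_template("SELECT * FROM orders WHERE id = 42 AND state = 'OPEN'"))
# SELECT * FROM orders WHERE id = ? AND state = ?
```

A production extractor would need a real SQL tokenizer; the regexes here are only illustrative.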
  22. Pscan - Review the database. Related rules: Select: use the distinct keyword carefully; use the union keyword with caution; reduce the amount of data in joined tables. Insert: avoid loop inserts; use insert into ... select for bulk inserts. Delete: avoid modifying or deleting too many rows at once; operate on data in batches.
  23. Pscan - Review data structures • Analysis object: code keywords • Analysis steps: static scan of the source code; rule identification; suggest a fix
  24. Pscan - Review data structures. Related rules: avoid unnecessary synchronization mechanisms; Hashtable, Vector, and StringBuffer are not recommended, replace them with HashMap, ArrayList, and StringBuilder respectively; avoid foreach traversal where a plain for loop is cheaper; avoid using reflection frequently; avoid declaring arrays as public static final; avoid calling toString() on an array; avoid try..catch inside loops; prefer a lazy loading strategy
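The keyword-scan step on the previous slide plus the replacement rules above can be sketched as a tiny scanner. This is an illustrative tool sketch, not Pscan's implementation; the sample Java snippet is invented:

```python
import re

# Rules from the slide: discouraged synchronized classes and their replacements
REPLACEMENTS = {
    "Hashtable": "HashMap",
    "Vector": "ArrayList",
    "StringBuffer": "StringBuilder",
}

def scan_java_source(source):
    """Keyword-based static scan: report (line, found, suggested) findings."""
    findings = []
    for lineno, line in enumerate(source.splitlines(), start=1):
        for bad, good in REPLACEMENTS.items():
            if re.search(rf"\b{bad}\b", line):
                findings.append((lineno, bad, good))
    return findings

JAVA = "Map<String, String> m = new Hashtable<>();\nStringBuffer sb = new StringBuffer();"
print(scan_java_source(JAVA))
# [(1, 'Hashtable', 'HashMap'), (2, 'StringBuffer', 'StringBuilder')]
```

Word-boundary matching keeps `HashMap` itself from triggering the `Hashtable` rule; a real scanner would work on tokens or the AST to avoid matches inside strings and comments.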
  25. Online URL Based Performance Test
  26. Online URL Based Performance Test
  27. Online URL Based Performance Test. Input: URL; output: report. Pipeline: URL analysis (format resolution, filtering out malicious requests), website crawling (set the crawl level, dynamic web crawling, dealing with anti-crawling measures), selecting key scenarios (assessment model), organizing test results (data processing)
  28. Online URL Based Performance Test
  29. URL Analysis • URL resolution • Exclude malicious requests • Define hierarchical crawling • Dynamic web crawling • Deal with anti-crawling strategies
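Two of the bullets above (excluding malicious requests, hierarchical crawling) can be sketched as a URL filter. The blocklist terms and depth limit are assumptions for illustration, not values from the deck:

```python
from urllib.parse import urlparse

# Assumed filter terms; a real deployment would configure these per project
UNSAFE_TERMS = ("logout", "delete", "admin")

def filter_urls(urls, max_depth=2):
    """Keep URLs within the crawl level and drop potentially destructive requests."""
    kept = []
    for url in urls:
        path = urlparse(url).path.strip("/")
        depth = len(path.split("/")) if path else 0
        if depth <= max_depth and not any(t in url.lower() for t in UNSAFE_TERMS):
            kept.append(url)
    return kept

urls = ["https://x.test/", "https://x.test/a/b",
        "https://x.test/a/b/c", "https://x.test/admin"]
print(filter_urls(urls))
# ['https://x.test/', 'https://x.test/a/b']
```

Here "hierarchical crawling" is interpreted as a path-depth cutoff; the deck does not define it precisely.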
  30. Online URL based load test strategy: get the list of test elements → generate test scenarios → generate test threads → call the performance test tool engine → perform periodic concurrent accesses in sequence, recording the relevant sampled data for each visit
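The steps above can be sketched with a thread pool. The `fetch` stub stands in for the real performance test tool engine (which the deck does not name), so no network traffic is involved:

```python
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url):
    """Stand-in for an HTTP GET issued by the load engine (hypothetical stub)."""
    start = time.perf_counter()
    time.sleep(0.01)  # simulated server latency
    return 200, time.perf_counter() - start

def run_load(urls, concurrency=4, rounds=2):
    """Periodic concurrent access: each round hits every URL and samples timings."""
    samples = []
    with ThreadPoolExecutor(max_workers=concurrency) as pool:
        for round_no in range(1, rounds + 1):
            for url, (status, elapsed) in zip(urls, pool.map(fetch, urls)):
                samples.append({"round": round_no, "url": url,
                                "status": status, "elapsed_s": elapsed})
    return samples

samples = run_load(["https://x.test/a", "https://x.test/b"])
print(len(samples))  # 2 URLs x 2 rounds = 4 samples
```

The per-visit sample dicts are what the following "test data analysis" slide would aggregate into response-time statistics.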
  31. Test data analysis
  32. SYSTEM CLASSIFICATION
  33. System Classification bands: To be considered (0–2.0 points); Recommended (2.0–4.0 points); Highly recommended (4.0–6.0 points); Must (6.0–10.0 points)
  34. System Classification > Influencing factors: object; volume of target daily active users; usage frequency of target daily active users; user traffic distribution; target error tolerance; target database data volume; typical resource-consuming / time-consuming operations
  35. System Classification > Influencing factor weights: G = γ_obj·g_obj + γ_dis·g_dis + γ_usr·g_usr + γ_fre·g_fre + γ_con·g_con + γ_err·g_err + γ_amt·g_amt, where, e.g., γ_obj represents the points that an element received and g_obj represents the weight of that element. The subscripts cover the factors listed on the previous slide: object, user traffic distribution, daily active user volume, usage frequency, resource consumption, error tolerance, and database data volume.
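Combining the weighted sum with the bands from slide 33 gives a small scorer. The weight values here are invented for illustration (the deck does not publish them); only the formula shape and the band thresholds come from the slides:

```python
# Per the slide, gamma_x is the points a factor received and g_x its weight.
WEIGHTS = {"obj": 0.20, "dis": 0.15, "usr": 0.20, "fre": 0.15,
           "con": 0.10, "err": 0.10, "amt": 0.10}  # assumed, sum to 1.0

def classification_score(points):
    """G = sum(gamma_x * g_x) over the seven influencing factors."""
    return sum(points[k] * WEIGHTS[k] for k in WEIGHTS)

def recommendation(score):
    """Map the score onto the bands from slide 33."""
    if score >= 6.0:
        return "Must"
    if score >= 4.0:
        return "Highly recommended"
    if score >= 2.0:
        return "Recommended"
    return "To be considered"

points = {k: 7 for k in WEIGHTS}  # every factor scored 7 of 10
g = classification_score(points)
print(g, recommendation(g))  # 7.0 Must
```

Because the assumed weights sum to 1, G stays on the slide's 0–10 point scale.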
  36. Performance Engineering Complexity Modeling
  37. Performance Engineering Complexity Modeling > Time Calculation: script preparation time, data preparation time, test execution time, tuning time, report time, reserve time. T_total = t_scripting + t_data_prepare + t_execution + t_tuning + t_report + t_extra
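The time model is a plain sum over six phase estimates; a minimal sketch, with illustrative hours that are not from the deck:

```python
PHASES = ("scripting", "data_prepare", "execution", "tuning", "report", "extra")

def total_test_time(hours):
    """T_total = t_scripting + t_data_prepare + t_execution + t_tuning + t_report + t_extra."""
    missing = set(PHASES) - hours.keys()
    if missing:
        raise ValueError(f"missing phase estimates: {sorted(missing)}")
    return sum(hours[p] for p in PHASES)

# Example estimates in hours (invented for illustration)
estimate = {"scripting": 16, "data_prepare": 8, "execution": 24,
            "tuning": 40, "report": 4, "extra": 8}
print(total_test_time(estimate))  # 100
```

Validating that every phase has an estimate guards against silently underestimating total effort when a phase is forgotten.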
  38. Methodology, across three stages: Test plan → Preparation → Design → Runtime → Analysis → Tuning → Validation. Activities: testing environment installation; test platform validation; test plan validation; test data generation; VU profile design; DB update procedure; test data validation; population & monitoring; single-user baseline; ramp-up load test; combination test; maximum load test; response time analysis; benchmark comparison; system metric analysis; tuning as required; report creation. Roles involved per stage: project manager, functional specialists, performance consultant, developers, server specialists, administrators
  39. APM
  40. Different kinds of probes: mobile terminal probe, JavaScript probe, web probe, JVM probe
  41. APM architecture (DA: data analyzer; DC: data collector). Collection agents (host agent, browser agent, app agent) on servers, browsers, and mobile devices feed collectors (DC-Server, DC-Browser, DC-App) behind HAProxy/SLB; data passes through a Kafka cluster to data receivers and data analyzers; metric data is stored in Elasticsearch and HDFS with a cache cluster; on top sit the page display, an Open API, and various reports
  42. Values and usage scenarios by role. Dev & QA: position and solve problems in advance during development testing, reducing cost; distributed tracing of microservices, the necessary tool for DevOps and microservices. Infra: automatically locate issues and root causes; link analysis. IT Manager: holographic investigation and efficient issue positioning; supports business-dimension holographic positioning (e.g. input a business number). Business Manager: visualization of IT value, with visible business processes and operations; quality control, since controllable code quality reduces outsourcing risk and finding problems in advance reduces costs
  43. Why full-link monitoring? What is the delay distribution of the program across the link? How is the user experience in the browser or on mobile? Does the program behave abnormally anywhere along the link? How responsive is the database? Finding the full link of a program response based on user IDs or related business IDs is the key to solving such problems, and associated integration of these data sources makes that possible
  44. Full-link monitoring: a user call process from an internet finance company
  45. Data collection: code probe. Inside the JVM, a Java agent hooks the class loader, and its engine rewrites A.class into an instrumented A'.class; the probe monitors the temporary data storage area and the running data area along the request/response path
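A real code probe rewrites bytecode at class-load time, as the slide shows for the JVM. As a much simpler analogy (assumed, not the vendor's implementation), a Python decorator can wrap a function so every call records a latency sample into a shared buffer:

```python
import functools
import time

TRACE = []  # collected samples, analogous to the agent's temporary data storage area

def probe(fn):
    """Wrap a function so each call records (name, elapsed seconds), like an injected probe."""
    @functools.wraps(fn)
    def wrapper(*args, **kwargs):
        start = time.perf_counter()
        try:
            return fn(*args, **kwargs)
        finally:
            TRACE.append((fn.__qualname__, time.perf_counter() - start))
    return wrapper

@probe
def handle_request():
    time.sleep(0.005)  # simulated work
    return "ok"

handle_request()
print(TRACE[0][0])  # handle_request
```

Recording in a `finally` block mirrors what a JVM probe must do: the sample is captured even when the wrapped call raises, so errors still show up in the trace.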
  46. Data collection: network packet analysis
  47. Advantages • Database connection pool monitoring: monitor the connection pool and discover connections that may never be released • Business monitoring support: after a user visit, the full-link response data can be aggregated into a logical wide table to efficiently generate business monitoring reports • Problem scene recovery: by analyzing the business data, the complete data at the time of the user request can be restored to find the performance problem • Exception awareness: the probe captures exceptions inside the program, aggregated over the full link, so likely failures can be understood quickly without reading logs • Network equipment fault awareness: when a failure occurs, it can be clearly determined whether the problem lies in the program or in nginx • Load balancer awareness: full-link tracing supports load balancers such as Nginx and Apache, so load balancer problems can be discovered • Hang warning: when situations such as code deadlock occur, an alarm can be raised according to a preset warning time
  48. Advantages: user-granularity fault location; full-data distributed tracing; early warning reduces the failure rate; suitable for container environments
  49. Example topology: per-node average response times of 217 ms, 328 ms, 125 ms, 23 ms, 27 ms, 287 ms, 0.95 ms, and 0.40 ms; several slow methods, one of which calls slow SQL; the core application shows an average error rate of 11.5%
  50. Slow method: 1. As shown above, the request response time is 2.8 s. 2. The request invokes the method shown above; the slowest method call took 2.7 s, so slow method calls are the main reason for the slow response.
  51. Method and call exception: 1. Under normal circumstances the response time of this request is fast (82.76 ms). 2. In the abnormal case the response time is 58.7 s. As the topology clearly shows, the slow response is caused by calls into many other applications: the average interface response time exceeds 5 s, and the complicated call relationships and slow interface calls are the main reasons for the slow response.
  52. Database connection problem: 1. The request response time is 16.7 s. 2. There were 12 database connections with a total response time of 15.0 s; the time consumed by database connections is the main reason for this slow response.
  53. Slow SQL: 1. As shown above, the request response time is 2.40 s. 2. The request executes the SQL above, which took 2.12 s; the slow SQL execution is the main reason for this slow request.
  54. Thank You!

