
Introduction to Performance Testing Part 1

During this session I would like to introduce the basic concepts of performance testing and to highlight the performance-testing activities we run at CTCo. The primary audience for this session is test engineers who have no experience in this activity yet and would like to gain some knowledge in this area.



  1. Introduction to Performance Testing, Part 1 (V1.1). Author: Vjacheslav Lukashevich. Edited by: Eugene Muran
  2. Part 1: Test types • Measurements and scenarios • Testing team • Test tools • Planning and implementing tests. Part 2: Tool configuration • Test run • Result analysis • Facts about development
  3. The fact is that…
  4. The fact is that… no one cares about software performance at the beginning
  5. And at the end we get something like this…
  6. Or even worse…
  7. SMALL REMARK: ALL OF THE ABOVE IS ALREADY IN THE PAST
  8. Nowadays
  9. When performance testing is a “must”: we expect large data volumes • we expect high load • inexperienced programmers • the customer requests it
  10. Question! What types of performance testing exist?
  11. Test types: performance testing • load testing • stress testing • endurance testing • scalability testing • recoverability testing. What we call it depends on the test target!
  12. Question! What will we measure during performance testing?
  13. Application “speed” performance characteristics: throughput – the number of actions per unit of time • response time – the time taken for an action to complete
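
As a small illustration of these two measurements, the sketch below times a batch of HTTP requests and derives both the average response time and the throughput. The target URL and request count are placeholders, not values from the slides:

```python
import time
import urllib.request

URL = "http://localhost:8080/"   # placeholder target, not from the slides
REQUESTS = 50

durations = []
start = time.perf_counter()
for _ in range(REQUESTS):
    t0 = time.perf_counter()
    urllib.request.urlopen(URL).read()          # one "action"
    durations.append(time.perf_counter() - t0)  # response time of this action
elapsed = time.perf_counter() - start

print(f"avg response time: {sum(durations) / len(durations):.3f} s")
print(f"throughput: {REQUESTS / elapsed:.1f} actions/s")
```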
  14. Application hardware characteristics. Primary resources: CPU (%) – total CPU usage on the host, CPU usage by the target process • memory usage – used memory (on the host or by a process), available memory • network – number of bytes sent/received per second through the network interface
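
One possible way to sample these primary resources from a script is the third-party psutil library (an assumption of this sketch, not a tool named in the slides); the PID of the target process is hypothetical:

```python
import psutil  # third-party library, assumed here; not named in the slides

PID = 1234                      # hypothetical PID of the target process
proc = psutil.Process(PID)

net0 = psutil.net_io_counters()
host_cpu = psutil.cpu_percent(interval=1.0)  # total CPU usage on the host (%)
proc_cpu = proc.cpu_percent(interval=1.0)    # CPU usage by the target process (%)
net1 = psutil.net_io_counters()
mem = psutil.virtual_memory()

print(f"host CPU: {host_cpu:.1f} %, process CPU: {proc_cpu:.1f} %")
print(f"used memory: {mem.used / 2**20:.0f} MiB, available: {mem.available / 2**20:.0f} MiB")
print(f"network over ~2 s: sent {net1.bytes_sent - net0.bytes_sent} B, "
      f"received {net1.bytes_recv - net0.bytes_recv} B")
```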
  15. Bottleneck. A “bottleneck” is the place where performance degradation occurs
  16. Scalability: the ability to handle additional workload when adding hardware (memory, CPU, storage) or adding additional hosts (machines)
  17. Performance test targets. Common targets: get performance statistics (application speed) • find bugs (software stability issues) • find bottlenecks • find out capacity • establish baselines. Other targets: scalability check • crash test • recovery test. Each test is a scenario or a set of scenarios
  18. Question! Who participates in performance testing?
  19. Relationship between performance testing and tuning (diagram: Performance Tests → Performance Characteristics → Performance Tuning). The tuning process is done when performance testing reveals unacceptable system or application characteristics, or to reduce the amount of resources being used
  20. Performance tuning is a cooperative effort: product vendors • architects • developers • testers • database administrators • system administrators • network administrators
  21. Performance testing activities: 1. Identify the test environment 2. Plan and design tests 3. Configure the test environment 4. Implement the test design 5. Execute the test 6. Analyze results, report, and retest
  22. Identifying the test environment: hardware. Ideally, the test environment must be as close to the production environment as possible
  23. Identifying the test environment: software – key components. Operating system (MS Windows, Linux, …) • application server (IBM WebSphere, BEA WebLogic, Tomcat, …) • database server (IBM DB2, Oracle, MS SQL Server, …)
  24. Identifying the test environment: software versions. For performance tests, the key software components in the environment must be as close as possible to the production environment – version number, patch/build number, configuration settings, Java version – since all of these can impact overall system performance
  25. Performance testing scenarios. A scenario is a sequence of steps in the application that represents a use case
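
To make this concrete, here is a minimal sketch of one scenario scripted as a sequence of steps; the base URL, endpoints, and form fields are invented for illustration:

```python
import urllib.parse
import urllib.request

BASE = "http://localhost:8080"   # hypothetical application under test

def run_scenario():
    """One use case: log in, open the document list, view one document."""
    opener = urllib.request.build_opener(urllib.request.HTTPCookieProcessor())
    # Step 1: log in (POST with form data)
    login = urllib.parse.urlencode({"user": "tester", "pass": "secret"}).encode()
    opener.open(BASE + "/login", data=login)
    # Step 2: open the document list (GET)
    opener.open(BASE + "/documents")
    # Step 3: view one document (GET)
    opener.open(BASE + "/documents/42")

run_scenario()
```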
  26. Question! What scenarios do we need to test?
  27. Plan and design tests: determining test scenarios. Key usage scenarios to include: most common/frequent • business-critical • performance-intensive • of technical concern • high-visibility/obvious • contractually obligated • of stakeholder concern
  28. Applying a usage pattern to test design (chart: “Daytime user activity” – user actions per hour over the day, 0:00–22:00, annotated with the peak load). Options: 1) emulate the exact usage pattern 2) run under maximum (peak) load 3) run under average load
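
The three options differ only in the load level fed to the tool; a small sketch deriving the peak and average loads from an hourly usage pattern (the numbers below are invented, not read from the chart):

```python
# Invented hourly usage pattern (user actions per hour), 0:00-23:00.
hourly_actions = [200, 150, 100, 100, 150, 400, 900, 1800, 3200, 4100,
                  4400, 4300, 3900, 4000, 4200, 3800, 3000, 2200, 1500,
                  1000, 700, 500, 350, 250]

peak_load = max(hourly_actions)                           # option 2: run at peak load
average_load = sum(hourly_actions) / len(hourly_actions)  # option 3: run at average load
print(f"peak: {peak_load} actions/h, average: {average_load:.0f} actions/h")
# Option 1 would replay hourly_actions itself as a time-varying load profile.
```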
  29. Plan and design tests: determining test scenarios. Most important: a performance scenario must be repeatable (!)
  30. Repeatable test scenario: data prerequisites and cleanup • test sequence • environment state
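
One common way to keep a scenario repeatable is to reset its data prerequisites around every run. The sketch below assumes a hypothetical fixture-reset endpoint; the real mechanism (SQL scripts, file restores, …) depends on the application:

```python
import urllib.request

BASE = "http://localhost:8080"

def reset_test_data():
    # Hypothetical admin endpoint that restores the database fixtures.
    urllib.request.urlopen(BASE + "/admin/reset-fixtures")

def repeatable_run(scenario, iterations):
    for _ in range(iterations):
        reset_test_data()   # same data prerequisites before every iteration
        scenario()          # same step sequence
        reset_test_data()   # cleanup, so the next run starts from a known state

# e.g. repeatable_run(run_scenario, iterations=3), reusing the scenario sketched above
```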
  31. Performance test plan. Test plan content: test environment • test data • test scenarios • performance measurements
  32. Test plan: test data and test scenarios. Test prerequisites: database • users • files (documents, images, …). Test parameters: scenario list • think time • load scheme • duration
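
Such test parameters are often captured in one small configuration block; an illustrative sketch with invented values:

```python
# Illustrative test parameters only; all values are invented, not from the slides.
TEST_PARAMETERS = {
    "scenarios": ["login_and_search", "create_document"],  # scenario list
    "think_time_s": 5,          # pause between user actions
    "load_scheme": {            # ramp up to a steady load
        "start_users": 1,
        "max_users": 50,
        "ramp_up_s": 300,
    },
    "duration_s": 3600,         # total test duration
}
```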
  33. TEST IMPLEMENTATION
  34. Environment setup: application deployment • database setup • users/access rights • system clock synchronization • performance monitoring services configuration
  35. Stable version. To avoid surprises during performance testing, it is preferable to select a stable application version that has been well tested by functional testing. Plan it and target for it!
  36. Perform smoke testing! Before you begin scripting, verify that the scenarios you are going to implement in your test script work at all!
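
A smoke check can be as simple as walking each scenario once and failing fast; a minimal sketch, assuming scenario functions like the one sketched earlier:

```python
def smoke_test(scenarios):
    """Run each scenario once before scripting load tests; fail fast."""
    for name, scenario in scenarios.items():
        try:
            scenario()
            print(f"SMOKE OK:   {name}")
        except Exception as exc:
            raise SystemExit(f"SMOKE FAIL: {name}: {exc}")

# e.g. smoke_test({"login_and_view_document": run_scenario})
```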
  37. TESTING TOOLS
  38. Two main purposes of test tools: load generation • measurements/statistics gathering. It is best if the tool can do both tasks in parallel
  39. Two ways to emulate a web application user: “play” with the browser (link/button clicks, form filling) • send POST/GET requests to the server
  40. What tools do we have for such testing? LoadRunner (different protocols: POST/GET, web browser emulation, GWT, Mobile, SAP) • JMeter (POST/GET) • Grinder (POST/GET). *This is only some of the available tooling
  41. Capture/replay. Pluses (+): fast and easy test creation. Minuses (−): unstable scripts • sensitivity to application changes • unnecessary requests in scripts • server-side overhead for replayed scripts
  42. Test debugging. Like any computer program, performance tests must themselves be tested. Possible checks: view the returned pages during replay in the test tool (in LoadRunner – visual test results) • test tool logs • server logs
  43. Content check: check for content that is only available when the system operates correctly
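
For example, a script can assert on a marker string that appears only on a correctly rendered page, so an error page cannot pass as a fast response. The URL and marker text below are invented:

```python
import urllib.request

def checked_request(url, marker):
    """Fail the step unless the response contains content that only a
    correctly working system would return (HTTP 200 alone is not enough)."""
    body = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    if marker not in body:
        raise AssertionError(f"content check failed for {url}")
    return body

# "Order history" is an invented marker; pick text unique to the correct page.
checked_request("http://localhost:8080/account", "Order history")
```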
  44. Books: Web Load for Dummies • Performance Testing Guidance for Web Applications – http://perftestingguide.codeplex.com/downloads/get/17955
