Performance Testing for Web Applications with JMeter (Codruta Salomia, Iasi Code Camp, 12 October 2013)



  1. Performance testing determines/validates: Speed, Stability, Scalability.
  2. What we measure: response times, throughput, resource utilization.
  3. Load testing: determine a system's behavior under normal conditions and under anticipated peak load conditions. Used to identify:
     - the maximum operating capacity
     - bottlenecks
     - which element is causing degradation
  4. Stress testing: load the system up to its break point. Used to identify the system's break point; it is important to monitor how the system breaks and recovers.
  5. Volume testing: testing a system with a certain amount of data, usually involving high data and throughput volumes. Endurance testing: testing a system under load for an extended period of time, to establish stability and behavior under sustained use.
  6. Goal: use JMeter to test the performance of my web application while having 100 concurrent logins. It is not the first login for the user, so some elements (images, CSS, JS) are already cached in the browser. 1. System Configuration
  7. 2. Scenario to record with JMeter: the user logs into the application.
     - Add a Thread Group to the Test Plan (the Test Plan contains the tests; a Thread Group represents the users): select the Test Plan, then Add -> Threads -> Thread Group.
     - Add a Recording Controller to the Thread Group (controllers group test items and apply logic to them): select the Thread Group, then Add -> Logic Controller -> Recording Controller.
  8. Add an HTTP Proxy Server to the WorkBench (the WorkBench is used for temporary items): Add -> Non-Test Elements -> HTTP Proxy Server, and configure it to use the Recording Controller.
  9. Configure your browser to use the proxy, and start the proxy server.
  10. Open the browser, enter the address of the login page, and log in with the user. The resulting HTTP requests are recorded into the Recording Controller. The login action for a permanent teacher consists of 3 HTTP requests:
      - accessing the login page
      - submitting the credentials (the effective login)
      - the redirect to a default page
      The login credentials are automatically saved into the second request, which is the effective login step.
  11. Obs 1: make sure you are really doing the action you think you are. Check that the login effectively succeeded: add a View Results Tree listener to the Recording Controller (listeners display result data): Add -> Listener -> View Results Tree. Re-run the test and look at the page you are redirected to.
  12. We can observe that, in fact, the login didn't succeed and we are still on the login page.
  13. Add an HTTP Cookie Manager to the Recording Controller and re-run the test (configuration elements can modify requests): Add -> Config Element -> HTTP Cookie Manager.
  14. Add listeners to see statistics about the execution. First install the JMeter Plugins:
      - from http://jmeter-plugins.org/downloads/all/ download JMeterPlugins-1.1.0.zip
      - copy the JMeterPlugins.jar file from JMeterPlugins-1.1.0.zip into JMETER_INSTALL_DIR/lib/ext (for example C:\jmeter\lib\ext on a Windows installation)
      - restart JMeter; the jp@gc-prefixed plugins appear in the menus
  15. Then add the listeners: select the Thread Group, then Add -> Listener, and add the following: Active Threads Over Time, Transactions per Second, Response Times vs Threads, Summary Report.
  16. In the Thread Group enter the following values and re-run the test: Number of concurrent users = 100, Ramp-up period = 10, Loop Count = 1.
  17. The ramp-up period tells JMeter how long to take to "ramp up" to the full number of threads. If 100 threads are used and the ramp-up period is 10 seconds, JMeter will take 10 seconds to get all 100 threads up and running; theoretically, each thread starts 0.1 (10/100) seconds after the previous one. The ramp-up period should be:
      - long enough to avoid too large a workload at the start of the test
      - short enough that the last threads start running before the first ones finish
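The ramp-up arithmetic above can be replayed in a few lines of Python, using the numbers from the slide (100 threads, 10-second ramp-up):

```python
threads = 100
ramp_up = 10.0  # seconds

# JMeter starts one new thread every ramp_up / threads seconds
interval = ramp_up / threads
start_times = [i * interval for i in range(threads)]

print(interval)                   # 0.1 s between consecutive thread starts
print(round(start_times[-1], 1))  # 9.9 — the last thread starts just before the 10 s mark
```

This also shows why the last bullet matters: if a login takes longer than 9.9 s, the first threads are still running when the last one starts, so full concurrency is reached.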
  18. Obs 2: make sure you really get the desired load. If we analyze the data, we see that the maximum number of concurrent threads is 57, not 100!
  19. Obs 3: make sure the results refer to the entire transaction. Looking at the other graphs/reports, we see that a separate line is generated for each of the three requests.
  20. In the Summary Report a line was generated for each child request, but in this case we are interested in the total time, which refers to the entire login transaction.
  21. We want 100 concurrent users and we want to evaluate the entire login transaction (which is composed of all three requests), so we make the following changes: in the Thread Group, check Forever for the loop count.
  22. Select the Recording Controller and change it into a Transaction Controller.
  23. Select the Transaction Controller, check Generate parent sample, and uncheck the second option, then re-run the test. The statistics in the reports now refer to the entire transaction.
  24. Wait until 100 concurrent threads are reached in the Active Threads Over Time report, then wait a while longer before stopping the execution.
  25. Observations (after stopping the execution):
      - the test ran for about 40 s
      - 100 concurrent users were reached after 10 s
      - the next 30 s were used to collect statistics in the various graphs; the number of users stayed at 100 until the test was manually stopped
      If you consider that the chosen period of time (40 s) is long enough to show significant trends, use this value as the execution time.
  26. In the Thread Group enter 40 in the Duration (seconds) field and re-run the tests.
  27. 3. Analyze the results
  28. Active Threads Over Time: how many active threads there are in each thread group during the test run. Theoretically, each thread starts 0.1 (10/100) seconds after the previous one; in practice, the start of each thread depends on factors such as the available resources (e.g. machine CPU) and the load created by the other threads, so differences may appear.
  29. Transactions per Second: the load generated by the 100 concurrent users during the 40 seconds produced a maximum of 16 transactions per second. Obs 4: 100 concurrent users/threads do not mean 100 concurrent transactions.
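The gap between 100 users and ~16 transactions/second can be sanity-checked with Little's law (concurrency ≈ throughput × response time). The response time below is an illustrative mid-range value, not a figure from the report:

```python
# Little's law: concurrent_users ≈ throughput * avg_response_time
users = 100
avg_response_time_s = 6.0  # illustrative: the observed times ranged from 1.5 s up to ~20 s

expected_tps = users / avg_response_time_s
print(round(expected_tps, 1))  # 16.7 — in the same range as the ~16 transactions/s observed
```

Each thread can only start a new login once its previous one finishes, so slow transactions cap the transaction rate well below the thread count.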
  30. The response times vary between 1.5 s with 15 active threads and almost 20 s when reaching 100 active threads.
  31. Response Time (time to last byte): the time from the moment the request was sent until the last resource finishes downloading. Response Latency (time to first byte): the time from the moment the request was sent until the first byte of the first resource is received back.
  32. (Diagram: Response Time = time to last byte; Response Latency = time to first byte.)
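The two definitions can be made observable with a minimal standard-library sketch: a toy local HTTP server (the URL, the payload, and the 0.2 s pause are all artificial) sends half the response, pauses, then sends the rest, so time-to-first-byte and time-to-last-byte clearly diverge:

```python
import threading
import time
from http.server import BaseHTTPRequestHandler, HTTPServer
from urllib.request import urlopen

class SlowHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "10")
        self.end_headers()
        self.wfile.write(b"hello")   # first bytes arrive immediately
        self.wfile.flush()
        time.sleep(0.2)              # artificial pause: simulates a slow response tail
        self.wfile.write(b"world")   # last bytes arrive 0.2 s later

    def log_message(self, *args):    # silence per-request logging
        pass

server = HTTPServer(("127.0.0.1", 0), SlowHandler)  # port 0 = pick a free port
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/"
start = time.perf_counter()
resp = urlopen(url)
resp.read(1)                                   # first byte received
latency = time.perf_counter() - start          # Response Latency (time to first byte)
resp.read()                                    # drain the rest of the body
response_time = time.perf_counter() - start    # Response Time (time to last byte)
server.shutdown()

print(latency < response_time)  # True — latency is always <= response time
```

In JMeter's reports the same relationship holds: Latency is measured up to the first byte, the sample's elapsed time up to the last.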
  33. Summary Report. Samples: the number of samples with the same label; a sample means one sampler call, in our case one login transaction. Average: the average time taken to receive the destination web page (where the user is redirected after login); in our case, 238 receive-time values were summed and divided by 238.
  34. Min and Max: the minimum and maximum time required to receive the destination web page. Std. Dev: how much the individual times deviate from the average receive time; the lower this value, the more consistent the timing pattern. Error %: the percentage of requests that failed. Avg. Bytes: the average size of the sample response, in bytes.
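These Summary Report columns can be reproduced with Python's statistics module. The sample times below are hypothetical, and it is assumed here that JMeter's Std. Dev is the population (divide-by-n) form:

```python
import statistics

# hypothetical per-login response times, in milliseconds
samples = [1500, 1800, 2100, 2500, 3200, 4100, 5300, 6800]

print(len(samples))                       # Samples
print(statistics.mean(samples))           # Average
print(min(samples), max(samples))         # Min / Max
print(round(statistics.pstdev(samples)))  # Std. Dev (population form, assumed)
```

A large Std. Dev relative to the Average, as in this toy data, is the signal the slide describes: the response times are inconsistent.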
  35. Throughput: calculated as requests per unit of time. The time is counted from the start of the first sample to the end of the last sample, including any intervals between samples, since the figure is supposed to represent the load on the server.
  36. In our case: Avg. Bytes = 5167.7 bytes ≈ 5.046 KB and KB/sec = 30.18, so Throughput = 30.18 / 5.046 ≈ 6 requests/second. One login transaction has an average response size of 5.046 KB and 30.18 KB/sec were recorded, which means there were about 6 requests per second.
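The slide's arithmetic can be replayed directly from the two Summary Report figures:

```python
avg_bytes = 5167.7   # Avg. Bytes column, in bytes
kb_per_sec = 30.18   # KB/sec column

avg_kb = avg_bytes / 1024          # average transaction size in KB
throughput = kb_per_sec / avg_kb   # login transactions per second
print(round(avg_kb, 2))      # 5.05 (the slide truncates this to 5.046)
print(round(throughput, 2))  # 5.98 — i.e. about 6 requests/second
```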
  37. Monitor the server resources: add a listener to the Thread Group via Add -> Listener -> jp@gc - PerfMon Metrics Collector. On the server, configure a Server Agent to provide the data for the report (see http://jmeter-plugins.org/wiki/PerfMonAgent/). Measured values: CPU between 77.37% and 100%; memory between 27% and 80%; network I/O between 0 and 19744 bytes/s.
  38. Summary of the method:
      - record the test case
      - check that the operation succeeds (are you really doing the action you think you are?)
      - add reports/graphs
      - check the number of concurrent users (are you really getting the desired load?)
      - check the transaction data in the graphs (do the results refer to the entire transaction?)
      - choose the execution time
      - re-run the tests and analyze the results
  39. Bibliography:
      1. http://jmeter-expert.blogspot.ro/
      2. http://www.slideshare.net/vivizam/how-to-analyze-reports-in-jmeter
      3. http://stackoverflow.com/questions/346788/difference-between-baseline-and-benchmark-in-performance-of-an-application
      4. http://www.performancetesting.co.za/Baseline%20Testing.htm
      5. https://en.wikipedia.org/wiki/Load_testing
      6. http://stackoverflow.com/questions/8397700/jmeter-response-time-calculation
      7. http://stackoverflow.com/questions/12303535/meaning-and-calculation-formulas-for-processing-time-latency-and-response-time
      8. http://blazemeter.com/blog/load-testing-mobile-apps-made-easy (mobile)
      9. https://a.blazemeter.com/node/add/jcs?status=trial&rnd=1380662288 (mobile)
      10. http://jmeter-plugins.org/wiki/PerfMonAgent/
  40. Please fill in your evaluation form. Performance Testing for Web Applications with JMeter. Codruta Salomia, codrutasalomia@yahoo.com, 12 October 2013.