Iasi Code Camp, 12 October 2013: Performance Testing for Web Applications with JMeter (Codruta Salomia)


  • 1. Performance testing determines/validates: speed, stability, scalability
  • 2. What we measure: response times, throughput, resource utilization
  • 3. Load testing: determine a system's behavior under normal conditions and under anticipated peak load. Used to identify: maximum operating capacity, bottlenecks, and which element is causing the degradation
  • 4. Stress testing: load the system beyond its normal capacity to identify its break point. It is important to monitor how the system breaks and how it recovers
  • 5. Volume testing: testing a system with a certain amount of data, usually involving high data and throughput volumes. Endurance testing: testing a system under load for an extended period of time to establish stability and behavior under sustained use
  • 6. Goal: use JMeter to test the performance of my web application under 100 concurrent logins. It is not the user's first login, so some elements (images, CSS, JS) are already cached in the browser. 1. System configuration
  • 7. 2. Scenario to record with JMeter: the user logs into the application. Add a Thread Group to the Test Plan (Test Plan: contains the tests; Thread Group: represents the users): select Test Plan, Add -> Threads -> Thread Group. Add a Recording Controller to the Thread Group (Controllers: group test items and apply logic to them): select Thread Group, Add -> Logic Controller -> Recording Controller
  • 8. Add an HTTP Proxy Server to the WorkBench (WorkBench: used for temporary items): Add -> Non-Test Elements -> HTTP Proxy Server, then configure it to use the Recording Controller
  • 9. Configure your browser to use the proxy and start the proxy server
  • 10. Open the browser, enter the address of the login page, and log in with the user. The HTTP requests are recorded into the Recording Controller. The login action for a permanent teacher consists of 3 HTTP requests: accessing the login page; submitting the credentials (the effective login); the redirect to a default page. The login credentials are automatically saved for the second request, which is the effective login step
  • 11. Obs 1: Make sure you are really doing the action you think you are. Check that the login effectively succeeded: add a View Results Tree listener to the Recording Controller (Listeners: display the collected data): Add -> Listener -> View Results Tree. Re-run the test and inspect the page we are redirected to
  • 12. We can observe that, in fact, the login did not succeed and we are still on the login page
  • 13. Add an HTTP Cookie Manager to the Recording Controller and re-run the test (Configuration elements can modify requests): Add -> Config Element -> HTTP Cookie Manager
  • 14. Add Listeners to see statistics of the execution. First install the JMeter Plugins: download the plugins archive and copy the JMeterPlugins.jar file into JMETER_INSTALL_DIR/lib/ext (e.g. c:\jmeter\lib\ext on a Windows installation). Restart JMeter; jp@gc-prefixed plugins appear in the menus
  • 15. Then add the Listeners: select Thread Group, Add -> Listener, and add the following listeners: Active Threads Over Time, Transactions per Second, Response Times vs Threads, Summary Report
  • 16. In the Thread Group enter the following values and re-run the test: Number of concurrent users = 100, Ramp-up period = 10, Loop Count = 1
  • 17. The ramp-up period tells JMeter how long to take to "ramp up" to the full number of threads chosen. If 100 threads are used and the ramp-up period is 10 seconds, JMeter will take 10 seconds to get all 100 threads up and running; theoretically, each thread starts 0.1 s (10/100) after the previous one. The ramp-up should be: long enough to avoid too large a workload at the start of the test, and short enough that the last threads start running before the first ones finish
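The ramp-up arithmetic above can be sketched in Python (a minimal illustration using the slide's values of 100 threads and a 10-second ramp-up):

```python
def thread_start_offsets(num_threads: int, ramp_up_seconds: float) -> list[float]:
    """Theoretical start offset (seconds) of each thread during ramp-up.

    JMeter spaces thread starts evenly: thread i starts roughly
    i * (ramp_up / num_threads) seconds after the test begins.
    """
    interval = ramp_up_seconds / num_threads
    return [i * interval for i in range(num_threads)]

offsets = thread_start_offsets(100, 10)
print(offsets[1])   # 0.1 -> each thread starts 0.1 s after the previous one
# offsets[-1] is ~9.9: the last thread starts just before the 10 s mark
```

In practice the spacing is only approximate, as the later slides on Active Threads Over Time point out.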
  • 18. Obs 2: Make sure you really get the desired load. If we analyze the data, we see that the maximum number of concurrent threads is 57, not 100!
  • 19. Obs 3: Make sure the results refer to the entire transaction. Looking at the other graphs/reports, we see that a line is generated for each of the three requests
  • 20. In the Summary Report a line was generated for each child request, but in this case we are interested in the total time, which refers to the entire login transaction
  • 21. We want 100 concurrent users and we want to evaluate the entire login transaction (composed of all three requests). So we make the following changes: in the Thread Group, check Forever for execution
  • 22. Select the Recording Controller and change it to a Transaction Controller
  • 23. Select the Transaction Controller, check Generate parent sample and uncheck the second option, then re-run the test. The statistics in the reports now refer to the entire transaction
  • 24. Wait until 100 concurrent threads are reached in the Active Threads Over Time report, then wait a while longer before stopping the execution
  • 25. Observations (after stopping the execution): the test ran for about 40 s; 100 concurrent users were reached after 10 s; the next 30 s were used to collect statistics in the different graphs. The number of users stayed at 100 until the test was manually stopped. If you consider the chosen period (40 s) long enough to show significant trends, use this value as the execution time
  • 26. In the Thread Group enter 40 in the Duration (seconds) field and re-run the tests
  • 27. 3. Analyze the results
  • 28. Active Threads Over Time: shows how many threads are active in each thread group during the test run. Theoretically each thread starts 0.1 s (10/100) after the previous one; in practice the start of each thread depends on factors such as available resources (e.g. machine CPU) and the load created by the other threads, so differences may appear
  • 29. Transactions per Second: the load generated by the 100 concurrent users during 40 seconds produced a maximum of 16 concurrent transactions per second. Obs 4: 100 concurrent users/threads do not mean 100 concurrent transactions
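One way to see why 100 threads produce far fewer than 100 transactions per second is Little's law (steady-state throughput = users / average response time). The 6-second response time below is an illustrative assumption, not a figure from the report:

```python
def throughput_tps(concurrent_users: int, avg_response_time_s: float) -> float:
    """Little's law: in steady state, throughput X = N / R,
    where N is the number of concurrent users and R the average
    time each one spends per transaction."""
    return concurrent_users / avg_response_time_s

# 100 users each taking ~6 s per login transaction (assumed value)
# can only complete about 16-17 transactions per second in total.
print(round(throughput_tps(100, 6.0), 1))  # 16.7
```

Each thread must wait for its own transaction to finish before starting the next one, so throughput is bounded by response time, not by the thread count alone.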
  • 30. The response times vary between 1.5 s with 15 active threads and almost 20 s when reaching 100 active threads
  • 31. Response Time (time to last byte): the time from the moment the request was sent until the last resource finishes downloading. Response Latency (time to first byte): the time from the moment the request was sent until the first byte of the first resource is received
  • 32. (Diagram: Response Time = time to last byte; Response Latency = time to first byte)
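The two metrics can be measured by hand against a throwaway local server, which makes the difference concrete (a minimal sketch; JMeter computes both for you):

```python
import http.client
import http.server
import threading
import time

class Handler(http.server.BaseHTTPRequestHandler):
    """Tiny local server returning a fixed 1 KB body."""
    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", "1024")
        self.end_headers()
        self.wfile.write(b"x" * 1024)
    def log_message(self, *args):
        pass  # keep the test output quiet

server = http.server.HTTPServer(("127.0.0.1", 0), Handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

conn = http.client.HTTPConnection("127.0.0.1", server.server_port)
t0 = time.perf_counter()
conn.request("GET", "/")
resp = conn.getresponse()
first = resp.read(1)                       # first byte of the body arrives
latency = time.perf_counter() - t0         # time to first byte
resp.read()                                # drain the remaining bytes
response_time = time.perf_counter() - t0   # time to last byte
server.shutdown()

print(latency <= response_time)  # True: latency is always <= response time
```

Latency reflects server processing plus network round trip; the gap between the two numbers grows with the size of the downloaded resources.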
  • 33. Summary Report: Samples: the number of samples with the same label; a sample means one sampler call, in our case one login transaction. Average: the average time taken to receive the destination web page (where the user is redirected after login); in our case the 238 receiving times were summed and divided by 238
  • 34. Min and Max: the minimum and maximum time required to receive the destination web page. Std. Dev.: how much the receiving times deviate from the average value; the smaller this value, the more consistent the time pattern. Error %: the percentage of failed requests. Avg. Bytes: the average size of the sample response in bytes
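These Summary Report columns are ordinary descriptive statistics over the per-sample times. A sketch with made-up sample times in milliseconds (the real report aggregates the 238 recorded transactions):

```python
import statistics

# Hypothetical per-sample login transaction times, in milliseconds.
samples_ms = [1500, 1800, 2100, 1700, 1900, 2000]

average = statistics.mean(samples_ms)       # "Average" column
minimum = min(samples_ms)                   # "Min" column
maximum = max(samples_ms)                   # "Max" column
std_dev = statistics.pstdev(samples_ms)     # "Std. Dev." (population form)
errors = 0                                  # count of failed samples
error_pct = 100 * errors / len(samples_ms)  # "Error %" column

print(minimum, maximum)  # 1500 2100
```

A small standard deviation relative to the average means the response times cluster tightly, which is what "consistent time pattern" refers to on the slide.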
  • 35. Throughput: calculated as requests per unit of time. The time is measured from the start of the first sample to the end of the last sample, including any intervals between samples, since it is meant to represent the load on the server
  • 36. In our case: Average Bytes = 5167.7 bytes ~ 5.046 KB; KB/sec = 30.18; Throughput = 30.18 / 5.046 ~ 6 requests/second. One transaction request has an average size of 5.046 KB and 30.18 KB/sec were registered, which means about 6 requests per second
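The arithmetic on this slide can be checked directly (values copied from the report figures above):

```python
avg_bytes = 5167.7                 # Avg. Bytes from the Summary Report
avg_kb = avg_bytes / 1024          # ~5.046 KB per login transaction
kb_per_sec = 30.18                 # KB/sec from the Summary Report
throughput = kb_per_sec / avg_kb   # requests (transactions) per second

print(round(throughput, 1))  # ~6.0 requests per second
```

Dividing the observed data rate by the average response size recovers the throughput figure, which is a handy sanity check on the report.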
  • 37. Monitor the server resources: add a listener to the Thread Group: Add -> Listener -> PerfMon Metrics Collector. On the server, configure a Server Agent to supply the data for the report. Results: CPU between 77.37% and 100%; memory between 27% and 80%; network I/O between 0 and 19744 bytes/s
  • 38. Summary of the workflow: record the test case; check the operation success (are you really doing the action you think you are?); add reports/graphics; check the number of concurrent users (are you really getting the desired load?); check the transaction data in the graphics (do the results refer to the entire transaction?); choose the execution time; re-run the tests and analyze the results
  • 40. Please fill in your evaluation form. Performance Testing for Web Applications with JMeter, Codruta Salomia, 12 October 2013