Paper presented at the International Conference on Computer and Information Science 2012 (ICCIS2012), part of the World Engineering, Science and Technology Congress 2012 (ESTCON2012)
Performance Testing: Analyzing Differences of Response Time between Performance Testing Tools
1. Performance Testing: Analyzing Differences of Response Time between Performance Testing Tools (ICCIS2012)
Muhammad Dhiauddin Mohamed Suffian
Faculty of IT & Multimedia Communication
Open University Malaysia
Fairul Rizal Fahrurazi
Product Quality & Reliability Engineering
MIMOS Berhad
2. Presentation Outline
• Background
• Related Works
• Overview of Performance Testing Tools
• Test Environment Setup
• Findings and Discussion
  – Result of Tool A
  – Result of Tool B
  – Result of Tool C
  – Comparison of Performance Test Results between Tools
  – Potential Reasons for Response Time Differences
• Conclusion
3. Background
• Several tool-related issues have been observed when conducting performance testing:
  – tool compatibility with the software under test
  – tool installation
  – tool setup
  – tool flexibility in testing both the client and server side
  – response time generated by the tools (research focus)
4. Related Works
• Most previous work comparing performance testing tools ignored the different results reported by each tool:
  – VCAA uses pricing and user friendliness as criteria to decide which tool to use
  – JDS mentions the ability to emulate a complex business process and to support an unlimited number of concurrent users
  – Testingreflections.com: accuracy of load and response time is something to evaluate against our particular application, not something to compare when deciding which tool to use or buy
• There is no work so far that explains why results differ across tools
• Should we have a framework or memorandum of understanding (MOU) on the uniformity of response time across all performance testing tools?
• Each tool claims to be better than the others, but none can justify its performance testing results against the real world.
This effort aims:
• to prove whether there is a difference in response time between different performance testing tools and identify potential reasons contributing to it
• to enlighten performance testers that no tool in the world can fully replace a human for performance testing
13. Findings and Discussion
Potential Reasons for Response Time Differences
• Some fundamental reasons:
  – how the load used for the performance test is captured and simulated
  – the method each tool uses to calculate the metrics it gathers
  – the language used to develop the tools
  – the architecture of the respective tools
• Tool A and Tool C:
  – Capturing and simulating the load plays the biggest role
  – Several extra items are recorded and simulated when generating user loads
  – Tool C by default uses the Internet Explorer browser when recording, and ASHX files (ASHX is a web handler file) appear in its recorded list as additional items; Tool A did not record them
  – JavaScript and CSS files show around 18 percent higher response time in Tool C than in Tool A, while the remaining file types, images (GIF, JPG) and HTML, show only a small variation of 1 to 4 percent
  – The method for calculating the metrics gathered by each tool also contributes to the variation in response time
  – The fundamental formula for calculating response time is identical: it is based on the last byte sent and the last byte received
  – However, Tool C introduces an inter-request delay, where some requests may have delays associated with them
  – Tool A does not automatically apply a delay; it is up to the user to configure one manually
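The delay handling described above is enough to make two tools disagree on the same run. A minimal sketch (illustrative only, not either tool's real implementation; all timestamps are made-up numbers) of how a shared base formula plus tool-specific inter-request delays yields different reported totals:

```python
# Sketch: same raw timestamps, different reported response times once one
# tool folds inter-request delays into its metric and the other does not.

def response_time(last_byte_sent, last_byte_received):
    """Shared base formula from the slides: last byte received minus last byte sent."""
    return last_byte_received - last_byte_sent

def report_tool_a(requests):
    # Tool-A-style: no automatic delay; raw per-request response times summed.
    return sum(response_time(sent, recv) for sent, recv, _ in requests)

def report_tool_c(requests):
    # Tool-C-style: an inter-request delay may be attached to some requests.
    return sum(response_time(sent, recv) + delay for sent, recv, delay in requests)

# (last_byte_sent, last_byte_received, inter_request_delay) in milliseconds
run = [(0.0, 120.0, 0.0), (150.0, 260.0, 15.0), (300.0, 390.0, 10.0)]
print(report_tool_a(run))  # 320.0
print(report_tool_c(run))  # 345.0
```

Even with an identical base formula, the tool that attaches delays reports a higher aggregate, which is one concrete way the observed variation can arise.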
• Architecture differs greatly:
  – Tool A and Tool C are developed in Java and require a JVM to run, so the Java heap size setting plays a role in generating the best user load without putting an extra burden on the client
  – Tool B's architecture relies on a web relay daemon facility that allows CORBA-based communication to be transmitted between machines while executing the performance test
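For the Java-based tools, the heap setting mentioned above is fixed at launch time via standard JVM flags. A hypothetical invocation (the jar name and its options are placeholders, not taken from the slides):

```shell
# Hypothetical example: raise the JVM heap so the load generator can
# simulate more virtual users without exhausting client memory.
# "load-tool.jar" and its --users/--rampup options are placeholders.
java -Xms512m -Xmx2048m -jar load-tool.jar --users 500 --rampup 60
```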
14. Conclusion
• Different performance testing tools do give different response times.
• The next critical research is on how each tool captures and simulates the load:
  – Continue analyzing each HTTP request and response with a tool available in the market, for example Wireshark
  – Fully understand, at the packet level, what is transferred and received by each tool and why items are included or excluded
• Currently, no tool can tell us whether an application is fast enough in terms of real-world user experience
• It is crucial for performance testers to understand that no tool can automate and reveal the full picture of how an application will perform in the real world
• It comes back to the human brain to analyze the information given; performance testing tools are just one of the tools that can be used to achieve that
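The packet-level analysis proposed for future work could start with Wireshark's command-line companion, tshark, run over a capture taken during each tool's test. A hypothetical command sketch (the .pcap file name is a placeholder):

```shell
# Hypothetical sketch: list every HTTP request and response in one tool's
# capture, with relative timestamps, so the transferred items (e.g. the
# extra ASHX files recorded by Tool C) can be compared across tools.
tshark -r tool_a_run.pcap -Y "http.request || http.response" \
       -T fields -e frame.time_relative -e http.request.uri -e http.response.code
```

Diffing this field listing between two tools' captures would show directly which requests one tool issues that the other omits.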