Response Time Accuracy: Open-Source or Proprietary Performance Test Tool
by Muhammad Dhiauddin Mohamed Suffian, Vivek Kumar & Redzuan Abdullah
Performance testing is a critical test approach in achieving full coverage of software product testing, apart from functionality and other non-functional aspects. Performance testing determines whether the software under test is capable of performing within an acceptable time according to the end-users' expectations. Our experiences in performance testing, particularly in the load testing of various types of software, ranging from mobile, desktop and web-based to cloud-based applications, and using different performance testing tools, have triggered us to ask which tool actually gives the most accurate results. The answer is not simple, and it is not the case that the proprietary tool always gives the right and accurate response times compared to open-source tools, or vice-versa. Thus, before answering the question, it is important to demonstrate that there are differences in response time between various performance testing tools and to understand the tools' behaviors, including their influence on machine resource utilization. We started off by planning the correct test setup for the study.

The test setup involved two identical machines: a client with the performance testing tools installed and a server hosting the static web pages. These machines are connected via a PC-to-PC LAN cable (commonly referred to as a cross cable). The reason for connecting them via a cross cable is to minimize the influence of network factors on the performance test results. The diagram for the setup is as follows:
Figure 1: Test setup for comparison of performance testing tools
As shown in the diagram, only three performance testing tools were selected and installed on the client machine. For the scope of the study, the tools are referred to only as Tool A, Tool B and Tool C. While the test was running for a particular tool, the services for the other remaining tools were stopped or disabled. As for the server, it only hosted static HTML pages with no cache and cookies. Running processes on this machine were kept to a minimum, i.e. unnecessary processes and server-side logging were disabled. Load testing was performed starting from 1 user access and increased to 100, 200, 300, 400 and 500 concurrent users with zero ramp-up time. The most important thing for this experiment is to reboot both the client and server machines for each test.
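As a concrete illustration of the load step described above, the following sketch shows how a minimal load driver could release a given number of virtual users at the same instant (i.e. zero ramp-up) against a static page and record each response time. It is only a sketch of the general technique; the target URL, class name and thread-pool approach are our own assumptions and are not taken from Tool A, Tool B or Tool C.

```java
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.CountDownLatch;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

/**
 * Minimal load-driver sketch (not Tool A, B or C): releases N virtual
 * users at once (zero ramp-up) against a static page and records the
 * response time of each request in milliseconds.
 */
public class SimpleLoadDriver {

    public static void main(String[] args) throws Exception {
        final String targetUrl = "http://192.168.0.2/index.html"; // hypothetical server address
        final int concurrentUsers = 100;                          // 1, 100, 200, 300, 400, 500 ...

        ExecutorService pool = Executors.newFixedThreadPool(concurrentUsers);
        final CountDownLatch startGate = new CountDownLatch(1);   // holds all users until released
        List<Future<Long>> results = new ArrayList<Future<Long>>();

        for (int i = 0; i < concurrentUsers; i++) {
            results.add(pool.submit(new Callable<Long>() {
                public Long call() throws Exception {
                    startGate.await();                            // wait until every user is ready
                    long start = System.nanoTime();
                    HttpURLConnection conn =
                            (HttpURLConnection) new URL(targetUrl).openConnection();
                    InputStream in = conn.getInputStream();       // issue the request
                    byte[] buffer = new byte[8192];
                    while (in.read(buffer) != -1) {
                        // drain the HTML page (embedded images would need extra requests)
                    }
                    in.close();
                    return (System.nanoTime() - start) / 1000000L;
                }
            }));
        }

        startGate.countDown();                                    // zero ramp-up: everyone starts now

        long totalMillis = 0;
        for (Future<Long> result : results) {
            totalMillis += result.get();                          // collect each user's response time
        }
        pool.shutdown();
        System.out.println("Average response time: "
                + (totalMillis / (double) concurrentUsers) + " ms");
    }
}
```

In practice, a real tool would also fetch the embedded images, decide whether to reuse connections, and time out slow requests, which already hints at why different tools can report different numbers for the same page.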
From the technical perspective, the hardware specification for both machines was as follows:

CPU/processor: Intel Pentium D 3.4 GHz
RAM/memory: 2 GB
Hard disk storage: 80 GB

The configuration of the software specification was as follows:

Client machine
Operating system: Windows XP SP2
Java JDK: JDK 1.6.0 update 21
Tools: Tool A (open-source); Tool B (open-source); Tool C (proprietary)

Server machine
Operating system: Windows Server 2003 Enterprise Edition SP1
Java JDK: JDK 1.6.0 update 21
Web server: Internet Information Services 6
HTML page size: 65.8 KB (page: 7 KB; image 1: 25.2 KB; image 2: 33.6 KB)

The test was started with Tool A, with one simulated user accessing the HTML page on the server machine over the cross cable. At the same time, we recorded CPU, memory and disk I/O utilization for both the client and server machines. The same approach was repeated three times per virtual user set until the number reached 500 virtual users. The same kind of load test was then performed with Tool B and Tool C. The metrics collected from this exercise were response time, average CPU utilization, memory utilization and disk I/O utilization. The details of the results obtained from this activity are listed below:
a) Tool A
No. of Concurrent Users | Response Time (seconds) | Client CPU | Client Memory | Client Disk I/O | Server CPU | Server Memory | Server Disk I/O
1 0.103 0% 3% 22% 21% 22% 0%
1 0.100 0% 3% 22% 23% 22% 0%
1 0.103 0% 3% 22% 20% 22% 0%
100 0.258 6% 3% 22% 65% 23% 0%
100 0.29 5% 3% 22% 65% 23% 0%
100 0.231 1% 3% 22% 67% 22% 0%
200 0.198 7% 3% 22% 71% 23% 1%
200 0.231 7% 3% 22% 83% 23% 2%
200 0.538 4% 3% 22% 93% 23% 0%
300 0.337 3% 3% 22% 94% 23% 0%
300 0.265 10% 3% 22% 95% 23% 0%
300 0.229 3% 4% 22% 97% 23% 3%
400 0.169 7% 4% 22% 93% 23% 0%
400 0.534 8% 3% 22% 92% 23% 3%
400 0.278 6% 3% 22% 95% 23% 2%
500 0.394 14% 4% 22% 96% 22% 8%
500 0.35 14% 4% 22% 97% 22% 9%
500 0.306 14% 4% 22% 95% 23% 10%
b) Tool B
No. of Concurrent Users | Response Time (seconds) | Client CPU | Client Memory | Client Disk I/O | Server CPU | Server Memory | Server Disk I/O
1 0.015 0% 4% 22% 17% 7% 2%
1 0.015 0% 4% 22% 17% 7% 2%
1 0.015 0% 4% 22% 17% 7% 2%
100 1.423 1% 4% 22% 99% 6% 12%
100 1.211 3% 4% 22% 99% 6% 7%
100 1.403 1% 4% 22% 99% 6% 48%
200 3.489 3% 4% 22% 99% 6% 55%
200 4.478 4% 4% 22% 99% 7% 53%
200 5.123 4% 4% 22% 98% 6% 31%
300 6.158 7% 4% 22% 99% 6% 14%
300 7.068 5% 4% 22% 99% 7% 33%
300 5.394 2% 4% 22% 99% 6% 69%
400 8.597 3% 4% 22% 99% 6% 32%
400 8.164 10% 4% 22% 97% 6% 32%
400 8.757 3% 4% 22% 98% 6% 36%
500 11.316 5% 4% 22% 98% 6% 47%
500 11.17 5% 4% 22% 98% 7% 27%
500 8.901 8% 4% 22% 97% 6% 28%
c) Tool C
No. of Concurrent Users | Response Time (seconds) | Client CPU | Client Memory | Client Disk I/O | Server CPU | Server Memory | Server Disk I/O
1 0.047 0% 3% 22% 35% 19% 3%
1 0.078 0% 3% 22% 38% 19% 3%
1 0.047 0% 3% 22% 25% 20% 3%
100 1.487 3% 3% 22% 100% 19% 59%
100 2.174 3% 3% 22% 100% 19% 14%
100 1.52 1% 3% 22% 100% 19% 22%
200 3.007 3% 3% 22% 100% 19% 27%
200 3.614 2% 3% 22% 100% 18% 15%
200 4.021 5% 3% 22% 100% 19% 38%
300 5.997 2% 3% 22% 100% 18% 22%
300 5.947 3% 3% 22% 100% 17% 10%
300 4.979 3% 3% 22% 100% 17% 18%
400 6.272 11% 3% 22% 100% 22% 15%
400 7.042 2% 3% 22% 100% 17% 6%
400 7.332 3% 3% 22% 100% 17% 27%
500 7.771 4% 4% 22% 88% 18% 7%
500 7.106 7% 4% 22% 100% 18% 24%
500 7.604 7% 4% 22% 100% 16% 10%
Based on the metrics collected from the three tools, we plotted graphs to compare and understand the trends of the response times. The graphs focus only on comparing the response times under different user loads for three rounds of performance testing. The graphs representing the results detailed in the previous tables are shown below:
Figure 2: Performance (Load) Test – Round 1
Figure 3: Performance (Load) Test – Round 2
Figure 4: Performance (Load) Test – Round 3
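The three figures plot each round separately. As a complementary way of reading the tables, the short sketch below averages the three samples recorded per load level; the hard-coded values are the Tool A figures from table (a), and the helper itself is our own illustration rather than output from any of the tools.

```java
import java.util.Arrays;

/**
 * Averages the three response-time samples recorded per load level.
 * The hard-coded values are the Tool A figures from table (a); the
 * same helper can be fed the Tool B and Tool C rows.
 */
public class ResponseTimeSummary {

    static double mean(double[] samples) {
        double sum = 0;
        for (double s : samples) {
            sum += s;
        }
        return sum / samples.length;
    }

    public static void main(String[] args) {
        int[] users = {1, 100, 200, 300, 400, 500};
        double[][] toolA = {
            {0.103, 0.100, 0.103},   // 1 user, rounds 1-3 (seconds)
            {0.258, 0.290, 0.231},   // 100 users
            {0.198, 0.231, 0.538},   // 200 users
            {0.337, 0.265, 0.229},   // 300 users
            {0.169, 0.534, 0.278},   // 400 users
            {0.394, 0.350, 0.306}    // 500 users
        };

        for (int i = 0; i < users.length; i++) {
            System.out.printf("Tool A, %3d users: samples %s -> mean %.3f s%n",
                    users[i], Arrays.toString(toolA[i]), mean(toolA[i]));
        }
    }
}
```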
From the graphs plotted, we could observe the differences in response time between the three different tools after running a performance (load) test against the same HTML web page. This result achieved our research objective, which was to prove that, when using different performance test tools, the response times we get are not similar or even close to each other, especially as the number of concurrent users increases. As mentioned previously, Tool A and Tool B are open-source tools while Tool C is a proprietary tool, which leads us to another research question: "Why do different performance testing tools give different response times when testing the same web page?" In this article, we have demonstrated and proved that differences do exist. In our next research effort we will try to answer why this is the case. Some fundamental understandings that are already in place concern the language in which the tools have been developed, the architecture of the respective tools, the recording mechanism of each tool, the way the load used for the performance test is captured and simulated, and the method of calculating the metrics gathered by each tool. Having this research in place will allow further understanding of the behavior of various performance tools and could help in selecting the most suitable tool for different contexts of testing.
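One of the factors listed above, the method of calculating metrics, is easy to illustrate in isolation: whether a tool reports the time to first byte or the time to download the complete page, and whether it averages all samples or discards outliers, can change the reported response time even for identical traffic. The sketch below contrasts such conventions on a purely hypothetical set of samples; it does not describe how Tool A, Tool B or Tool C actually calculate their results.

```java
import java.util.Arrays;

/**
 * Illustration of how the metric definition alone can change the reported
 * response time. All sample values are hypothetical and do not come from
 * Tool A, B or C.
 */
public class MetricDefinitionDemo {

    public static void main(String[] args) {
        // Hypothetical per-request samples (seconds) for one load level.
        double[] timeToFirstByte    = {0.12, 0.15, 0.11, 0.14, 0.90};
        double[] timeToFullDownload = {0.35, 0.40, 0.33, 0.38, 1.60};

        System.out.printf("Mean time to first byte:             %.3f s%n", mean(timeToFirstByte));
        System.out.printf("Mean time to full download:          %.3f s%n", mean(timeToFullDownload));
        System.out.printf("Full download, outlier-trimmed mean: %.3f s%n",
                trimmedMean(timeToFullDownload));
    }

    static double mean(double[] samples) {
        double sum = 0;
        for (double s : samples) {
            sum += s;
        }
        return sum / samples.length;
    }

    /** Drops the single slowest sample before averaging (one possible reporting convention). */
    static double trimmedMean(double[] samples) {
        double[] sorted = samples.clone();
        Arrays.sort(sorted);
        return mean(Arrays.copyOf(sorted, sorted.length - 1));
    }
}
```

Even on these made-up numbers, the three conventions yield clearly different figures, which is one plausible contributor to the differences observed between the tools.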
Biography
Dhiauddin is currently a Senior Engineer and Test Team Lead for the test department in one of the leading R&D agencies in Malaysia. He has almost 6 years of experience in the software/system development and software testing/quality assurance fields. With working experience in automotive, banking and research & development companies, he obtained his technical and management skills from various types of project. He holds an M.Sc. in Real Time Software Engineering from the Centre for Advanced Software Engineering (CASE®), Universiti Teknologi Malaysia. He is a Certified Tester - Foundation Level (CTFL) and Certified Tester Advanced Level – Test Manager (CTAL-TM) of the International Software Testing Qualification Board (ISTQB®). He also has vast knowledge in CMMI®, test processes and methodologies as well as the Software Development Life Cycle (SDLC). He has just completed his Six Sigma Green Belt project on the test defect prediction model. He has applied and managed various testing strategies in his projects, including functional, performance, security, usability and compatibility testing both for system test and system integration test. His interests are in software engineering and software testing, particularly in performance testing and test management.

Vivek Kumar works as a Senior Test Engineer at one of the most reputable R&D agencies in Malaysia. He has more than 7 years of experience in software QA, both in manual testing and in test automation. During his QA/testing career he has been working in various domains, such as mobile telecommunications, banking, agriculture, semantic technology, and ERP systems. He holds an Engineering degree in Electronics & Instrumentation from the Dr. B. R. Ambedkar University, Agra (formerly Agra University, Agra). He is a Certified Tester - Foundation Level (CTFL) from the International Software Testing Qualification Board (ISTQB®). He has undertaken and managed various projects from different domains and been involved in the complete project life cycle, starting from project kick-off to release to the customer. Vivek has sound knowledge of CMMI, STLC, SDLC, automation testing and automation frameworks (keyword driven and data driven) with Selenium, HP Quick Test Professional 10 and Rational Functional Tester. His tireless dedication to the advancement of software testing in general and to performance testing in particular is often referred to as a hobby and not only a job due to the enjoyment he gets from his efforts.

Redzuan is currently the Director of the test department in one of the leading R&D agencies in Malaysia. He has more than 14 years of working experience in software development, technical support and software testing in both local and multi-national organizations. Redzuan has vast knowledge in various programming languages, CMMI, Six Sigma, telecommunications and project management, with international certifications in TOGAF, ASP, Red Hat Certified Engineer (RHCE), British Standard Information Security Auditor (BS7799) and Six Sigma Green Belt, among others.