ISSN 1866-5705 · www.testingexperience.com · free digital version · print version 8,00 € · printed in Germany

The Magazine for Professional Testers, Issue 12: Open Source Tools, December 2010

© diego cervo - Fotolia.com
Response Time Accuracy: Open-Source or Proprietary Performance Test Tool

by Muhammad Dhiauddin Mohamed Suffian, Vivek Kumar & Redzuan Abdullah




Performance testing is a critical test approach for achieving full coverage of software product testing, alongside functionality and other non-functional aspects. Performance testing determines whether the software under test is capable of performing within an acceptable time according to the end-users' expectations. Our experiences in performance testing, particularly in load testing various types of software, ranging from mobile, desktop and web-based to cloud-based applications, and using different performance testing tools, have prompted us to ask which tool actually gives the most accurate results. The answer is not simple: it is not the case that the proprietary tool always gives more accurate response times than open-source tools, or vice versa. Thus, before answering the question, it is important to demonstrate that there are differences in response time between various performance testing tools and to understand the tools' behaviors, including their influence on machine resource utilization. We started off by planning the correct test setup for the study.

The test setup involved two identical machines: a client with the performance testing tools installed and a server hosting the static web pages. The machines were connected by a PC-to-PC LAN cable (commonly referred to as a cross cable) to minimize network factors influencing the performance test results. The diagram for the setup is as follows:




                 Figure 1: Test setup for comparison of performance testing tools


As shown in the diagram, only three performance testing tools were selected and installed on the client machine. For the scope of the study, the tools are referred to only as Tool A, Tool B and Tool C. While the test was running for a particular tool, the services for the other two tools were stopped or disabled. As for the server, it hosted only static HTML pages, with caching and cookies disabled. Running processes on this machine were kept to a minimum, i.e. unnecessary processes and server-side logging were disabled. Load testing was performed starting from 1 user and increased to 100, 200, 300, 400 and 500 concurrent users with zero ramp-up time. Most importantly, both the client and server machines were rebooted before each test.
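The article deliberately does not name the tools, but the procedure itself (all virtual users started at once with zero ramp-up, response times measured at the client) can be sketched in a few lines. The following Python script is purely an illustrative stand-in, not any of Tools A, B or C; the tiny in-process HTTP server merely plays the role of the IIS machine serving the static page.

```python
import threading
import time
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer
from statistics import mean

class PageHandler(BaseHTTPRequestHandler):
    """Stand-in for the static HTML page hosted on the server machine."""
    def do_GET(self):
        body = b"<html>" + b"x" * 1024 + b"</html>"
        self.send_response(200)
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):
        pass  # server-side logging stays disabled, as in the study

class TestServer(ThreadingHTTPServer):
    request_queue_size = 128  # tolerate a burst of simultaneous connects

def load_test(url, concurrent_users):
    """Start every virtual user at once (zero ramp-up time) and return
    the mean response time over all users, measured at the client."""
    times, lock = [], threading.Lock()

    def virtual_user():
        start = time.perf_counter()
        with urllib.request.urlopen(url) as response:
            response.read()
        with lock:
            times.append(time.perf_counter() - start)

    threads = [threading.Thread(target=virtual_user)
               for _ in range(concurrent_users)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    return mean(times)

if __name__ == "__main__":
    server = TestServer(("127.0.0.1", 0), PageHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    url = f"http://127.0.0.1:{server.server_port}/"
    for users in (1, 10, 50):  # the study went up to 500, three runs per level
        print(f"{users:3d} users: mean response time {load_test(url, users):.4f} s")
    server.shutdown()
```

Note that even in this simplified driver, choices such as when to start the clock and how to average per-user timings are left to the implementer, which already hints at why different tools may report different numbers.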

From the technical perspective, the hardware specification for both machines was as follows:

    CPU/processor: Intel Pentium D 3.4 GHz
    RAM/memory: 2 GB
    Hard disk storage: 80 GB

The software configuration was as follows:

    Client machine
    Operating system: Windows XP SP2
    Java JDK: JDK 1.6.0 update 21
    Tools: Tool A (open-source); Tool B (open-source); Tool C (proprietary)

    Server machine
    Operating system: Windows Server 2003 Enterprise Edition SP1
    Java JDK: JDK 1.6.0 update 21
    Web server: Internet Information Services 6
    HTML page size: 65.8 KB (page: 7 KB; image 1: 25.2 KB; image 2: 33.6 KB)

The test was started with Tool A, with one simulated user accessing the HTML page on the server machine over the cross cable. At the same time, we recorded CPU, memory and disk I/O utilization for both the client and server machines. Each virtual-user level was run three times, increasing the load until the number reached 500 virtual users. The same load tests were then performed with Tool B and Tool C. The metrics collected from this exercise were response time, average CPU utilization, memory utilization and disk I/O utilization. The detailed results are listed below:
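The article does not say how the CPU, memory and disk I/O columns in the tables below were collected (on Windows this is typically done by polling OS performance counters). Purely as an illustration, a background sampler that averages any such metric over a run could look like the sketch below; the `read_metric` callable is a hypothetical stand-in for the real counter source, not part of the study's tooling.

```python
import itertools
import threading
import time
from statistics import mean

class UtilizationSampler:
    """Polls a metric callable at a fixed interval while a test runs,
    then reports the average, mirroring how per-run CPU/memory/disk I/O
    averages are produced. `read_metric` is a hypothetical stand-in for
    whatever counter source is available."""

    def __init__(self, read_metric, interval=0.5):
        self.read_metric = read_metric
        self.interval = interval
        self.samples = []
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def _run(self):
        while not self._stop.is_set():
            self.samples.append(self.read_metric())
            self._stop.wait(self.interval)

    def __enter__(self):
        self._thread.start()
        return self

    def __exit__(self, exc_type, exc_val, exc_tb):
        self._stop.set()
        self._thread.join()

    def average(self):
        return mean(self.samples) if self.samples else 0.0

if __name__ == "__main__":
    # Fake CPU readings in place of a real counter source.
    readings = itertools.cycle([10.0, 20.0, 30.0, 40.0])
    with UtilizationSampler(lambda: next(readings), interval=0.01) as sampler:
        time.sleep(0.05)  # the load test for one run would execute here
    print(f"average utilization: {sampler.average():.1f}%")
```

The sampling interval itself influences the reported average, which is one more place where tooling choices can shift the numbers.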

          a)       Tool A


           Concurrent     Response Time       Resource Utilization (Client)           Resource Utilization (Server)
           Users          (seconds)           CPU          Memory       Disk I/O      CPU           Memory        Disk I/O
           1                0.103             0%           3%           22%           21%           22%           0%
           1                0.100             0%           3%           22%           23%           22%           0%
           1                0.103             0%           3%           22%           20%           22%           0%
           100              0.258             6%           3%           22%           65%           23%           0%
           100              0.29              5%           3%           22%           65%           23%           0%
           100              0.231             1%           3%           22%           67%           22%           0%
           200              0.198             7%           3%           22%           71%           23%           1%
           200              0.231             7%           3%           22%           83%           23%           2%
           200              0.538             4%           3%           22%           93%           23%           0%
           300              0.337             3%           3%           22%           94%           23%           0%
           300              0.265             10%          3%           22%           95%           23%           0%
           300              0.229             3%           4%           22%           97%           23%           3%
           400              0.169             7%           4%           22%           93%           23%           0%
           400              0.534             8%           3%           22%           92%           23%           3%
           400              0.278             6%           3%           22%           95%           23%           2%
           500              0.394             14%          4%           22%           96%           22%           8%
           500              0.35              14%          4%           22%           97%           22%           9%
           500              0.306             14%          4%           22%           95%           23%           10%

          b)       Tool B

           Concurrent     Response Time       Resource Utilization (Client)           Resource Utilization (Server)
           Users          (seconds)           CPU          Memory       Disk I/O      CPU           Memory        Disk I/O
           1                0.015             0%           4%           22%           17%           7%            2%
           1                0.015             0%           4%           22%           17%           7%            2%
           1                0.015             0%           4%           22%           17%           7%            2%
           100              1.423             1%           4%           22%           99%           6%            12%
           100              1.211             3%           4%           22%           99%           6%            7%
           100              1.403             1%           4%           22%           99%           6%            48%
           200              3.489             3%           4%           22%           99%           6%            55%
           200              4.478             4%           4%           22%           99%           7%            53%
           200              5.123             4%           4%           22%           98%           6%            31%

300                6.158               7%           4%           22%          99%           6%             14%
            300                7.068               5%           4%           22%          99%           7%             33%
            300                5.394               2%           4%           22%          99%           6%             69%
            400                8.597               3%           4%           22%          99%           6%             32%
            400                8.164               10%          4%           22%          97%           6%             32%
            400                8.757               3%           4%           22%          98%           6%             36%
            500                11.316              5%           4%           22%          98%           6%             47%
            500                11.17               5%           4%           22%          98%           7%             27%
            500                8.901               8%           4%           22%          97%           6%             28%

           c)         Tool C


           Concurrent     Response Time       Resource Utilization (Client)           Resource Utilization (Server)
           Users          (seconds)           CPU          Memory       Disk I/O      CPU           Memory        Disk I/O
            1                  0.047               0%           3%           22%          35%           19%            3%
            1                  0.078               0%           3%           22%          38%           19%            3%
            1                  0.047               0%           3%           22%          25%           20%            3%
            100                1.487               3%           3%           22%          100%          19%            59%
            100                2.174               3%           3%           22%          100%          19%            14%
            100                1.52                1%           3%           22%          100%          19%            22%
            200                3.007               3%           3%           22%          100%          19%            27%
            200                3.614               2%           3%           22%          100%          18%            15%
            200                4.021               5%           3%           22%          100%          19%            38%
            300                5.997               2%           3%           22%          100%          18%            22%
            300                5.947               3%           3%           22%          100%          17%            10%
            300                4.979               3%           3%           22%          100%          17%            18%
            400                6.272               11%          3%           22%          100%          22%            15%
            400                7.042               2%           3%           22%          100%          17%            6%
            400                7.332               3%           3%           22%          100%          17%            27%
            500                7.771               4%           4%           22%          88%           18%            7%
            500                7.106               7%           4%           22%          100%          18%            24%
            500                7.604               7%           4%           22%          100%          16%            10%

Based on the metrics collected from the three tools, we plotted graphs to compare and understand the trends of the response times. The graphs focus only on comparing the response times under different user loads for the three rounds of performance testing. The graphs representing the results detailed in the previous tables are shown below:




Figure 2: Performance (Load) Test – Round 1
Figure 3: Performance (Load) Test – Round 2




                                                                                                     Figure 4: Performance (Load) Test – Round 3
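The per-load averages behind these graphs can be recomputed directly from the result tables. The sketch below transcribes only the 1-, 100- and 500-user rounds for brevity:

```python
from statistics import mean

# Response times (seconds) for the three rounds, transcribed from the
# result tables; only the 1-, 100- and 500-user levels are included here.
results = {
    "Tool A": {1: [0.103, 0.100, 0.103], 100: [0.258, 0.290, 0.231],
               500: [0.394, 0.350, 0.306]},
    "Tool B": {1: [0.015, 0.015, 0.015], 100: [1.423, 1.211, 1.403],
               500: [11.316, 11.170, 8.901]},
    "Tool C": {1: [0.047, 0.078, 0.047], 100: [1.487, 2.174, 1.520],
               500: [7.771, 7.106, 7.604]},
}

# Print the mean response time per concurrency level for each tool.
print(f"{'Users':>6}  " + "  ".join(f"{tool:>8}" for tool in results))
for users in (1, 100, 500):
    row = "  ".join(f"{mean(results[tool][users]):8.3f}" for tool in results)
    print(f"{users:>6}  {row}")
```

Running this reproduces the divergence visible in the graphs: at 500 users Tool A averages about 0.35 s, against roughly 10.5 s for Tool B and 7.5 s for Tool C.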


From the graphs plotted, we could observe the differences in response time between the three tools after running a performance (load) test against the same HTML web page. This result achieved our research objective, which was to show that, when using different performance test tools, the response times obtained are not similar or even close to each other, especially as the number of concurrent users increases. As mentioned previously, Tool A and Tool B are open-source tools while Tool C is a proprietary tool, which leads us to another research question: "Why do different performance testing tools give different response times when testing the same web page?" In this article, we have demonstrated that differences do exist; in our next research effort we will try to answer why this is the case. Some fundamental factors we already suspect are the language in which each tool is implemented, the architecture of the respective tools, the recording mechanism of each tool, the way the load is captured and simulated for the performance test, and the method each tool uses to calculate the metrics it gathers. Having this research in place will allow a deeper understanding of the behavior of various performance tools and could help in selecting the most suitable tool for different testing contexts.
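One of these factors, the metric calculation method, is easy to illustrate: the same request yields three different "response times" depending on whether a tool stops its clock at connection establishment, at the first byte, or after the full download. The sketch below is a generic illustration of that gap and does not reflect the actual implementation of Tool A, B or C.

```python
import socket
import threading
import time

def serve_one_request(ready, port_box):
    """Minimal server: accept one request, delay before the headers and
    again before the body, so the three timing definitions visibly differ."""
    srv = socket.socket()
    srv.bind(("127.0.0.1", 0))
    srv.listen(1)
    port_box.append(srv.getsockname()[1])
    ready.set()
    conn, _ = srv.accept()
    conn.recv(1024)                        # read the request
    time.sleep(0.05)                       # server think time
    conn.sendall(b"HTTP/1.0 200 OK\r\nContent-Length: 10\r\n\r\n")
    time.sleep(0.05)                       # slow body transfer
    conn.sendall(b"0123456789")
    conn.close()
    srv.close()

def measure(port):
    """Return (connect, first-byte, total) times for a single request."""
    start = time.perf_counter()
    s = socket.socket()
    s.connect(("127.0.0.1", port))
    t_connect = time.perf_counter() - start
    s.sendall(b"GET / HTTP/1.0\r\n\r\n")
    s.recv(1)                              # block until the first byte
    t_first_byte = time.perf_counter() - start
    while s.recv(4096):                    # drain the rest of the response
        pass
    t_total = time.perf_counter() - start
    s.close()
    return t_connect, t_first_byte, t_total

if __name__ == "__main__":
    ready, port_box = threading.Event(), []
    threading.Thread(target=serve_one_request, args=(ready, port_box)).start()
    ready.wait()
    connect, first_byte, total = measure(port_box[0])
    print(f"connect: {connect:.3f}s  first byte: {first_byte:.3f}s  "
          f"total: {total:.3f}s")
```

A tool reporting time-to-first-byte here would show roughly half the figure of one reporting total download time, which is exactly the kind of definitional gap that could account for part of the differences observed.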




Biography

Dhiauddin is currently a Senior Engineer and Test Team Lead for the test department in one of the leading R&D agencies in Malaysia. He has almost 6 years of experience in the software/system development and software testing / quality assurance fields. With working experience in automotive, banking and research & development companies, he obtained his technical and management skills from various types of project. He holds an M.Sc. in Real Time Software Engineering from the Centre for Advanced Software Engineering (CASE®), Universiti Teknologi Malaysia. He is a Certified Tester - Foundation Level (CTFL) and Certified Tester Advanced Level – Test Manager (CTAL-TM) of the International Software Testing Qualifications Board (ISTQB®). He also has vast knowledge of CMMI®, test processes and methodologies as well as the Software Development Life Cycle (SDLC). He has just completed his Six Sigma Green Belt project on the test defect prediction model. He has applied and managed various testing strategies in his projects, including functional, performance, security, usability and compatibility testing, both for system test and system integration test. His interests are in software engineering and software testing, particularly performance testing and test management.

Vivek Kumar works as a Senior Test Engineer at one of the most reputable R&D agencies in Malaysia. He has more than 7 years of experience in software QA, in both manual testing and test automation. During his QA/testing career he has worked in various domains, such as mobile telecommunications, banking, agriculture, semantic technology, and ERP systems. He holds an Engineering degree in Electronics & Instrumentation from the Dr. B. R. Ambedkar University, Agra (formerly Agra University, Agra). He is a Certified Tester - Foundation Level (CTFL) of the International Software Testing Qualifications Board (ISTQB®). He has undertaken and managed various projects from different domains and been involved in the complete project life cycle, from project kick-off to release to the customer. Vivek has sound knowledge of CMMI, STLC, SDLC, automation testing and automation frameworks (keyword-driven and data-driven) with Selenium, HP Quick Test Professional 10 and Rational Functional Tester. His tireless dedication to the advancement of software testing in general, and performance testing in particular, is often referred to as a hobby and not only a job due to the enjoyment he gets from his efforts.

Redzuan is currently the Director of the test department in one of the leading R&D agencies in Malaysia. He has more than 14 years of working experience in software development, technical support and software testing in both local and multi-national organizations. Redzuan has vast knowledge of various programming languages, CMMI, Six Sigma, telecommunications and project management, with international certifications in TOGAF, ASP, Red Hat Certified Engineer (RHCE), British Standard Information Security Auditor (BS7799) and Six Sigma Green Belt, among others.





More Related Content

Viewers also liked

Administration (Eliot Horowitz)
Administration (Eliot Horowitz)Administration (Eliot Horowitz)
Administration (Eliot Horowitz)
MongoSF
 
Schema design with MongoDB (Dwight Merriman)
Schema design with MongoDB (Dwight Merriman)Schema design with MongoDB (Dwight Merriman)
Schema design with MongoDB (Dwight Merriman)
MongoSF
 
Implementing MongoDB at Shutterfly (Kenny Gorman)
Implementing MongoDB at Shutterfly (Kenny Gorman)Implementing MongoDB at Shutterfly (Kenny Gorman)
Implementing MongoDB at Shutterfly (Kenny Gorman)
MongoSF
 
MongoDB Replication (Dwight Merriman)
MongoDB Replication (Dwight Merriman)MongoDB Replication (Dwight Merriman)
MongoDB Replication (Dwight Merriman)
MongoSF
 
Real time ecommerce analytics with MongoDB at Gilt Groupe (Michael Bryzek & M...
Real time ecommerce analytics with MongoDB at Gilt Groupe (Michael Bryzek & M...Real time ecommerce analytics with MongoDB at Gilt Groupe (Michael Bryzek & M...
Real time ecommerce analytics with MongoDB at Gilt Groupe (Michael Bryzek & M...
MongoSF
 

Viewers also liked (15)

Performance Testing: Analyzing Differences of Response Time between Performan...
Performance Testing: Analyzing Differences of Response Time between Performan...Performance Testing: Analyzing Differences of Response Time between Performan...
Performance Testing: Analyzing Differences of Response Time between Performan...
 
Establishing A Defect Prediction Model Using A Combination of Product Metrics...
Establishing A Defect Prediction Model Using A Combination of Product Metrics...Establishing A Defect Prediction Model Using A Combination of Product Metrics...
Establishing A Defect Prediction Model Using A Combination of Product Metrics...
 
A Method for Predicting Defects in System Testing for V-Model
A Method for Predicting Defects in System Testing for V-ModelA Method for Predicting Defects in System Testing for V-Model
A Method for Predicting Defects in System Testing for V-Model
 
Breaking the Software - A Topic on Software Engineering & Testing
Breaking the Software -  A Topic on Software Engineering & TestingBreaking the Software -  A Topic on Software Engineering & Testing
Breaking the Software - A Topic on Software Engineering & Testing
 
Administration (Eliot Horowitz)
Administration (Eliot Horowitz)Administration (Eliot Horowitz)
Administration (Eliot Horowitz)
 
Schema design with MongoDB (Dwight Merriman)
Schema design with MongoDB (Dwight Merriman)Schema design with MongoDB (Dwight Merriman)
Schema design with MongoDB (Dwight Merriman)
 
Implementing MongoDB at Shutterfly (Kenny Gorman)
Implementing MongoDB at Shutterfly (Kenny Gorman)Implementing MongoDB at Shutterfly (Kenny Gorman)
Implementing MongoDB at Shutterfly (Kenny Gorman)
 
An Alternative of Secured Online Shopping System via Point-Based Contactless ...
An Alternative of Secured Online Shopping System via Point-Based Contactless ...An Alternative of Secured Online Shopping System via Point-Based Contactless ...
An Alternative of Secured Online Shopping System via Point-Based Contactless ...
 
MongoDB Replication (Dwight Merriman)
MongoDB Replication (Dwight Merriman)MongoDB Replication (Dwight Merriman)
MongoDB Replication (Dwight Merriman)
 
Real time ecommerce analytics with MongoDB at Gilt Groupe (Michael Bryzek & M...
Real time ecommerce analytics with MongoDB at Gilt Groupe (Michael Bryzek & M...Real time ecommerce analytics with MongoDB at Gilt Groupe (Michael Bryzek & M...
Real time ecommerce analytics with MongoDB at Gilt Groupe (Michael Bryzek & M...
 
Performance Testing Strategy for Cloud-Based System using Open Source Testing...
Performance Testing Strategy for Cloud-Based System using Open Source Testing...Performance Testing Strategy for Cloud-Based System using Open Source Testing...
Performance Testing Strategy for Cloud-Based System using Open Source Testing...
 
VIEWLEX # 08
VIEWLEX # 08VIEWLEX # 08
VIEWLEX # 08
 
Le microlearning est-il l'avenir de la formation ?
Le microlearning est-il l'avenir de la formation ?Le microlearning est-il l'avenir de la formation ?
Le microlearning est-il l'avenir de la formation ?
 
VIEWLEX # 04
VIEWLEX # 04VIEWLEX # 04
VIEWLEX # 04
 
20080618 Suivi Lecteurs Epsa
20080618 Suivi Lecteurs Epsa20080618 Suivi Lecteurs Epsa
20080618 Suivi Lecteurs Epsa
 

Similar to Testing Experience Magazine Vol.12 Dec 2010

Automated Software Testing Framework Training by Quontra Solutions
Automated Software Testing Framework Training by Quontra SolutionsAutomated Software Testing Framework Training by Quontra Solutions
Automated Software Testing Framework Training by Quontra Solutions
Quontra Solutions
 
75.bug tracking for improving software quality & reliability
75.bug  tracking  for improving software quality & reliability75.bug  tracking  for improving software quality & reliability
75.bug tracking for improving software quality & reliability
happiness09
 

Similar to Testing Experience Magazine Vol.12 Dec 2010 (20)

Load Runner
Load RunnerLoad Runner
Load Runner
 
SE2018_Lec 19_ Software Testing
SE2018_Lec 19_ Software TestingSE2018_Lec 19_ Software Testing
SE2018_Lec 19_ Software Testing
 
Performance testing and j meter
Performance testing and j meterPerformance testing and j meter
Performance testing and j meter
 
Cross-project defect prediction
Cross-project defect predictionCross-project defect prediction
Cross-project defect prediction
 
Automated Regression Testing for Embedded Systems in Action
Automated Regression Testing for Embedded Systems in ActionAutomated Regression Testing for Embedded Systems in Action
Automated Regression Testing for Embedded Systems in Action
 
Automation test framework with cucumber – BDD
Automation test framework with cucumber – BDDAutomation test framework with cucumber – BDD
Automation test framework with cucumber – BDD
 
AVG PC TuneUp Whitepaper 2015
AVG PC TuneUp Whitepaper 2015AVG PC TuneUp Whitepaper 2015
AVG PC TuneUp Whitepaper 2015
 
Testing & Improving Performance in IBM Cognos BI, Plus Automated Cognos Testi...
Testing & Improving Performance in IBM Cognos BI, Plus Automated Cognos Testi...Testing & Improving Performance in IBM Cognos BI, Plus Automated Cognos Testi...
Testing & Improving Performance in IBM Cognos BI, Plus Automated Cognos Testi...
 
Automated Test Framework with Cucumber
Automated Test Framework with CucumberAutomated Test Framework with Cucumber
Automated Test Framework with Cucumber
 
Performancetestingjmeter 121109061704-phpapp02
Performancetestingjmeter 121109061704-phpapp02Performancetestingjmeter 121109061704-phpapp02
Performancetestingjmeter 121109061704-phpapp02
 
Open Source Software Testing Tools
Open Source Software Testing ToolsOpen Source Software Testing Tools
Open Source Software Testing Tools
 
IRJET - A Valuable and Speculative Approach to Manage the Item Testing by usi...
IRJET - A Valuable and Speculative Approach to Manage the Item Testing by usi...IRJET - A Valuable and Speculative Approach to Manage the Item Testing by usi...
IRJET - A Valuable and Speculative Approach to Manage the Item Testing by usi...
 
Dive into Angular, part 5: Experience
Dive into Angular, part 5: ExperienceDive into Angular, part 5: Experience
Dive into Angular, part 5: Experience
 
WebSphere Technical University: Introduction to the Java Diagnostic Tools
WebSphere Technical University: Introduction to the Java Diagnostic ToolsWebSphere Technical University: Introduction to the Java Diagnostic Tools
WebSphere Technical University: Introduction to the Java Diagnostic Tools
 
Unit Testing Essay
Unit Testing EssayUnit Testing Essay
Unit Testing Essay
 
Automated Software Testing Framework Training by Quontra Solutions
Automated Software Testing Framework Training by Quontra SolutionsAutomated Software Testing Framework Training by Quontra Solutions
Automated Software Testing Framework Training by Quontra Solutions
 
Bug Tracking System
Bug Tracking SystemBug Tracking System
Bug Tracking System
 
Gajanan Bhat
Gajanan BhatGajanan Bhat
Gajanan Bhat
 
75.bug tracking for improving software quality & reliability
75.bug  tracking  for improving software quality & reliability75.bug  tracking  for improving software quality & reliability
75.bug tracking for improving software quality & reliability
 
Bug Tracking System
Bug Tracking SystemBug Tracking System
Bug Tracking System
 

Recently uploaded

+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
+971581248768>> SAFE AND ORIGINAL ABORTION PILLS FOR SALE IN DUBAI AND ABUDHA...
?#DUbAI#??##{{(☎️+971_581248768%)**%*]'#abortion pills for sale in dubai@
 
Artificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and MythsArtificial Intelligence: Facts and Myths
Artificial Intelligence: Facts and Myths
Joaquim Jorge
 

Recently uploaded (20)

2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...2024: Domino Containers - The Next Step. News from the Domino Container commu...
2024: Domino Containers - The Next Step. News from the Domino Container commu...
 
Apidays New York 2024 - The value of a flexible API Management solution for O...
Apidays New York 2024 - The value of a flexible API Management solution for O...Apidays New York 2024 - The value of a flexible API Management solution for O...
Apidays New York 2024 - The value of a flexible API Management solution for O...
 
How to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected WorkerHow to Troubleshoot Apps for the Modern Connected Worker
How to Troubleshoot Apps for the Modern Connected Worker
 
Boost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdfBoost Fertility New Invention Ups Success Rates.pdf
Boost Fertility New Invention Ups Success Rates.pdf
 
Advantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your BusinessAdvantages of Hiring UIUX Design Service Providers for Your Business
Advantages of Hiring UIUX Design Service Providers for Your Business
 
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot TakeoffStrategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
Strategize a Smooth Tenant-to-tenant Migration and Copilot Takeoff
 

Testing Experience Magazine Vol.12 Dec 2010

Testing Experience – The Magazine for Professional Testers
Issue 12 · December 2010 · Open Source Tools
ISSN 1866-5705 · www.testingexperience.com
Free digital version (print version 8,00 €, printed in Germany)
Cover photo: © diego cervo - Fotolia.com
Response Time Accuracy: Open-Source or Proprietary Performance Test Tool
by Muhammad Dhiauddin Mohamed Suffian, Vivek Kumar & Redzuan Abdullah

Performance testing is a critical test approach for achieving full coverage of software product testing, alongside functionality and other non-functional aspects. Performance testing determines whether the software under test is capable of performing within an acceptable time according to end-users' expectations. Our experience in performance testing, particularly in load testing various types of software, ranging from mobile, desktop and web-based to cloud-based applications and using different performance testing tools, led us to ask which tool actually gives the most accurate results. The answer is not simple, and it is not the case that the proprietary tool always gives more accurate response times than open-source tools, or vice versa. Before answering the question, it is important to demonstrate that there are differences in response time between various performance testing tools and to understand the tools' behavior, including their influence on machine resource utilization. We started by planning the correct test setup for the study.

The test setup involved two identical machines: a client with the performance testing tools installed, and a server hosting the static web pages. The machines were connected directly via a PC-to-PC LAN cable (commonly referred to as a crossover cable) to minimize network factors influencing the performance test results. The diagram for the setup is as follows:

Figure 1: Test setup for comparison of performance testing tools

As shown in the diagram, three performance testing tools were selected and installed on the client machine. For the scope of the study, the tools are referred to only as Tool A, Tool B and Tool C. While a test was running for one tool, the services of the other tools were stopped or disabled. The server hosted only static HTML pages, with no cache and no cookies. Running processes on this machine were kept to a minimum, i.e. unnecessary processes and server-side logging were disabled. Load testing started with 1 user and was increased to 100, 200, 300, 400 and 500 concurrent users with zero ramp-up time. Most importantly for this experiment, both the client and server machines were rebooted before each test.
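To make the load profile concrete (all users started with zero ramp-up, response time measured per request), the following is a minimal, self-contained harness in Python. It is purely illustrative and is not any of Tool A, B or C; the local test server, the ~7 KB page size and all names below are our own stand-ins for the setup described above.

```python
import http.server
import threading
import time
import urllib.request
from concurrent.futures import ThreadPoolExecutor
from statistics import mean

class _Page(http.server.BaseHTTPRequestHandler):
    # Stand-in for the static page hosted on the server machine (~7 KB of HTML).
    BODY = b"<html>" + b"x" * 7000 + b"</html>"

    def do_GET(self):
        self.send_response(200)
        self.send_header("Content-Length", str(len(self.BODY)))
        self.end_headers()
        self.wfile.write(self.BODY)

    def log_message(self, *args):
        # Server-side logging disabled, as in the study's setup.
        pass

def load_test(url, users):
    """Fire `users` concurrent GETs with zero ramp-up; return mean response time (s)."""
    def one_request(_):
        start = time.perf_counter()
        urllib.request.urlopen(url).read()
        return time.perf_counter() - start
    with ThreadPoolExecutor(max_workers=users) as pool:
        return mean(pool.map(one_request, range(users)))

# Local stand-in for the server machine (the study used a second physical host).
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), _Page)
threading.Thread(target=server.serve_forever, daemon=True).start()
url = f"http://127.0.0.1:{server.server_address[1]}/"

for users in (1, 10, 25):  # the study stepped up to 500; scaled down here
    print(f"{users:3d} users: mean response {load_test(url, users):.4f} s")
```

Even this toy harness illustrates the article's point indirectly: the measured "response time" includes whatever overhead the measuring code itself adds, which differs from tool to tool.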
From the technical perspective, the hardware specification for both machines was as follows:

CPU/processor: Intel Pentium D 3.4 GHz
RAM/memory: 2 GB
Hard disk storage: 80 GB

The software configuration was as follows:

Client machine
Operating system: Windows XP SP2
Java JDK: JDK 1.6.0 update 21
Tools: Tool A (open-source); Tool B (open-source); Tool C (proprietary)

Server machine
Operating system: Windows Server 2003 Enterprise Edition SP1
Java JDK: JDK 1.6.0 update 21
Web server: Internet Information Services 6
HTML page size: 65.8 KB (page: 7 KB; image 1: 25.2 KB; image 2: 33.6 KB)

The test started with Tool A and a single simulated user accessing the HTML page on the server machine over the crossover cable. At the same time, we recorded CPU, memory and disk I/O utilization for both the client and server machines. The same procedure was repeated three times per virtual-user level until the number reached 500 virtual users, and the same load tests were then performed with Tool B and Tool C. The metrics collected from this exercise were response time, average CPU utilization, memory utilization and disk I/O utilization.
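Recording CPU, memory and disk I/O utilization while a test runs amounts to polling the machine's counters on a background thread and averaging the samples afterwards. The sketch below shows that pattern with the Python standard library only; the probe function is pluggable and, as written, measures just this process's CPU share, so it is a deliberately crude stand-in for the per-machine monitoring the study used (the article does not name its monitoring tool).

```python
import threading
import time
from statistics import mean

class UtilizationSampler:
    """Polls a probe function at a fixed interval on a background thread,
    mirroring how utilization figures are recorded during a test run."""

    def __init__(self, probe, interval=0.05):
        self.probe, self.interval = probe, interval
        self.samples = []
        self._stop = threading.Event()
        self._thread = threading.Thread(target=self._run, daemon=True)

    def _run(self):
        while not self._stop.is_set():
            self.samples.append(self.probe())
            self._stop.wait(self.interval)

    def __enter__(self):
        self._thread.start()
        return self

    def __exit__(self, *exc):
        self._stop.set()
        self._thread.join()

    def average(self):
        return mean(self.samples) if self.samples else 0.0

# Illustrative probe: fraction of wall time this process has spent on CPU.
# A real monitor would report whole-machine CPU%, memory% and disk I/O instead.
_t0_wall, _t0_cpu = time.perf_counter(), time.process_time()
def cpu_fraction():
    wall = time.perf_counter() - _t0_wall
    return (time.process_time() - _t0_cpu) / wall if wall > 0 else 0.0

with UtilizationSampler(cpu_fraction) as sampler:
    sum(i * i for i in range(500_000))  # stand-in workload for a test run
print(f"samples: {len(sampler.samples)}, avg CPU fraction: {sampler.average():.2f}")
```

The same sampler object, handed a different probe per metric, covers all three utilization columns reported in the tables that follow.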
The details of the results obtained from this activity are listed below:

a) Tool A

Concurrent  Response    Client utilization          Server utilization
users       time (s)    CPU   Memory  Disk I/O      CPU   Memory  Disk I/O
1           0.103       0%    3%      22%           21%   22%     0%
1           0.100       0%    3%      22%           23%   22%     0%
1           0.103       0%    3%      22%           20%   22%     0%
100         0.258       6%    3%      22%           65%   23%     0%
100         0.290       5%    3%      22%           65%   23%     0%
100         0.231       1%    3%      22%           67%   22%     0%
200         0.198       7%    3%      22%           71%   23%     1%
200         0.231       7%    3%      22%           83%   23%     2%
200         0.538       4%    3%      22%           93%   23%     0%
300         0.337       3%    3%      22%           94%   23%     0%
300         0.265       10%   3%      22%           95%   23%     0%
300         0.229       3%    4%      22%           97%   23%     3%
400         0.169       7%    4%      22%           93%   23%     0%
400         0.534       8%    3%      22%           92%   23%     3%
400         0.278       6%    3%      22%           95%   23%     2%
500         0.394       14%   4%      22%           96%   22%     8%
500         0.350       14%   4%      22%           97%   22%     9%
500         0.306       14%   4%      22%           95%   23%     10%

b) Tool B

Concurrent  Response    Client utilization          Server utilization
users       time (s)    CPU   Memory  Disk I/O      CPU   Memory  Disk I/O
1           0.015       0%    4%      22%           17%   7%      2%
1           0.015       0%    4%      22%           17%   7%      2%
1           0.015       0%    4%      22%           17%   7%      2%
100         1.423       1%    4%      22%           99%   6%      12%
100         1.211       3%    4%      22%           99%   6%      7%
100         1.403       1%    4%      22%           99%   6%      48%
200         3.489       3%    4%      22%           99%   6%      55%
200         4.478       4%    4%      22%           99%   7%      53%
200         5.123       4%    4%      22%           98%   6%      31%
300         6.158       7%    4%      22%           99%   6%      14%
300         7.068       5%    4%      22%           99%   7%      33%
300         5.394       2%    4%      22%           99%   6%      69%
400         8.597       3%    4%      22%           99%   6%      32%
400         8.164       10%   4%      22%           97%   6%      32%
400         8.757       3%    4%      22%           98%   6%      36%
500         11.316      5%    4%      22%           98%   6%      47%
500         11.170      5%    4%      22%           98%   7%      27%
500         8.901       8%    4%      22%           97%   6%      28%

c) Tool C

Concurrent  Response    Client utilization          Server utilization
users       time (s)    CPU   Memory  Disk I/O      CPU   Memory  Disk I/O
1           0.047       0%    3%      22%           35%   19%     3%
1           0.078       0%    3%      22%           38%   19%     3%
1           0.047       0%    3%      22%           25%   20%     3%
100         1.487       3%    3%      22%           100%  19%     59%
100         2.174       3%    3%      22%           100%  19%     14%
100         1.520       1%    3%      22%           100%  19%     22%
200         3.007       3%    3%      22%           100%  19%     27%
200         3.614       2%    3%      22%           100%  18%     15%
200         4.021       5%    3%      22%           100%  19%     38%
300         5.997       2%    3%      22%           100%  18%     22%
300         5.947       3%    3%      22%           100%  17%     10%
300         4.979       3%    3%      22%           100%  17%     18%
400         6.272       11%   3%      22%           100%  22%     15%
400         7.042       2%    3%      22%           100%  17%     6%
400         7.332       3%    3%      22%           100%  17%     27%
500         7.771       4%    4%      22%           88%   18%     7%
500         7.106       7%    4%      22%           100%  18%     24%
500         7.604       7%    4%      22%           100%  16%     10%

Based on the metrics collected for the three tools, we plotted graphs to compare and understand the trends in response time. The graphs focus solely on comparing response times under different user loads across the three rounds of performance testing. The graphs representing the results detailed in the tables above are:

Figure 2: Performance (Load) Test – Round 1
Figure 3: Performance (Load) Test – Round 2
Figure 4: Performance (Load) Test – Round 3
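Averaging the three runs per load level makes the divergence between the tools easy to see numerically as well as graphically. The response times below are transcribed directly from the tables above; the computation is a short sketch, not part of the study's own tooling.

```python
from statistics import mean

# Response times (seconds) from the tables above: three runs per load level.
results = {
    "Tool A": {1: [0.103, 0.100, 0.103], 100: [0.258, 0.290, 0.231],
               200: [0.198, 0.231, 0.538], 300: [0.337, 0.265, 0.229],
               400: [0.169, 0.534, 0.278], 500: [0.394, 0.350, 0.306]},
    "Tool B": {1: [0.015, 0.015, 0.015], 100: [1.423, 1.211, 1.403],
               200: [3.489, 4.478, 5.123], 300: [6.158, 7.068, 5.394],
               400: [8.597, 8.164, 8.757], 500: [11.316, 11.170, 8.901]},
    "Tool C": {1: [0.047, 0.078, 0.047], 100: [1.487, 2.174, 1.520],
               200: [3.007, 3.614, 4.021], 300: [5.997, 5.947, 4.979],
               400: [6.272, 7.042, 7.332], 500: [7.771, 7.106, 7.604]},
}

# Mean response time per load level, one row per tool.
for tool, by_load in results.items():
    row = "  ".join(f"{users}:{mean(times):6.3f}s" for users, times in by_load.items())
    print(f"{tool}  {row}")
```

At 500 concurrent users the means are roughly 0.35 s (Tool A), 10.46 s (Tool B) and 7.49 s (Tool C): the same page, the same server, yet order-of-magnitude disagreement between tools.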
From the graphs plotted, we could observe the differences in response time between the three tools after running a performance (load) test against the same HTML web page. This result achieved our research objective, which was to show that, when using different performance test tools, the response times obtained are not similar or even close to each other, especially as the number of concurrent users increases. As mentioned previously, Tool A and Tool B are open-source tools while Tool C is a proprietary tool, which leads us to another research question: "Why do different performance testing tools give different response times when testing the same web page?". In this article we have demonstrated that differences do exist; in our next research effort we will try to answer why. Some fundamental candidate explanations are already in place: the language in which each tool is developed, the architecture of the respective tools, each tool's recording mechanism, how the load is captured and simulated for the performance test, and the method each tool uses to calculate the gathered metrics. Having this research in place will allow a further understanding of the behavior of various performance tools and could help in selecting the most suitable tool for different testing contexts.

Biography

Dhiauddin is currently a Senior Engineer and Test Team Lead for the test department in one of the leading R&D agencies in Malaysia. He has almost 6 years of experience in the software/system development and software testing/quality assurance fields. With working experience in automotive, banking and research & development companies, he obtained his technical and management skills from various types of projects. He holds an M.Sc. in Real Time Software Engineering from the Centre for Advanced Software Engineering (CASE®), Universiti Teknologi Malaysia. He is a Certified Tester - Foundation Level (CTFL) and Certified Tester Advanced Level – Test Manager (CTAL-TM) of the International Software Testing Qualification Board (ISTQB®). He also has vast knowledge of CMMI®, test processes and methodologies, as well as the Software Development Life Cycle (SDLC). He has just completed his Six Sigma Green Belt project on the test defect prediction model. He has applied and managed various testing strategies in his projects, including functional, performance, security, usability and compatibility testing for both system test and system integration test. His interests are in software engineering and software testing, particularly in performance testing and test management.

Vivek Kumar works as a Senior Test Engineer at one of the most reputable R&D agencies in Malaysia. He has more than 7 years of experience in software QA, both in manual testing and in test automation. During his QA/testing career he has worked in various domains, such as mobile telecommunications, banking, agriculture, semantic technology, and ERP systems. He holds an Engineering degree in Electronics & Instrumentation from the Dr. B. R. Ambedkar University, Agra (formerly Agra University, Agra). He is a Certified Tester - Foundation Level (CTFL) of the International Software Testing Qualification Board (ISTQB®). He has undertaken and managed various projects from different domains and been involved in the complete project life cycle, from project kick-off to release to the customer. Vivek has sound knowledge of CMMI, STLC, SDLC, automation testing and automation frameworks (keyword-driven and data-driven) with Selenium, HP QuickTest Professional 10 and Rational Functional Tester. His tireless dedication to the advancement of software testing in general, and to performance testing in particular, is often referred to as a hobby rather than just a job, due to the enjoyment he gets from his efforts.

Redzuan is currently the Director of the test department in one of the leading R&D agencies in Malaysia. He has more than 14 years of working experience in software development, technical support and software testing in both local and multi-national organizations. Redzuan has vast knowledge of various programming languages, CMMI, Six Sigma, telecommunications and project management, with international certifications in TOGAF, ASP, Red Hat Certified Engineer (RHCE), British Standard Information Security Auditor (BS7799) and Six Sigma Green Belt, among others.