Journal of Web Engineering & Technology
ISSN: 2455-1880(online)
Volume 3, Issue 2
www.stmjournals.com
Performance Testing on Web Application through HP
Load Runner with Parameterization and Customization
Jatin Aggarwal*, Arun Solanki
Department of Computer Science and Engineering, Gautam Buddha University, Greater Noida, India
Abstract
This paper presents an extension to the HP LoadRunner tool, an automated software testing tool used to carry out intensive performance testing. The extension provides functions that reduce the data management time spent by the client or user on scripting methods such as parameterization, correlation and validation checks. The proposed technique has been implemented in C; it reduces the manual effort of parameterizing values and saves user time. The paper also presents a brief comparison between the proposed scheme and the existing HP LoadRunner tool available in the market.
Keywords: Software testing, performance testing, HP loadrunner, automation testing, C
*Author for Correspondence E-mail: jatinatj12@gmail.com
INTRODUCTION
Software testing is an approach used to determine the completeness, correctness and quality of developed software [1]. Testing is driven entirely by the requirements, i.e. which parts or functionalities need to be tested and how much testing the software requires [2]. Testing can be performed at
different levels which are: unit testing,
integration testing, system testing, user-
acceptance testing, regression testing,
performance testing, and beta testing [3].
Performance testing is performed to test the
application’s stability, scalability as well as its
response time [4]. The different techniques of
performance testing are: load testing, stress
testing, spike testing, endurance testing, and
volume testing [5]. HP LoadRunner is an automated load testing tool used to conduct load testing before, during and after application deployment. Many companies and clients use LoadRunner as a benchmark for verifying the performance of the software they develop [6].
LITERATURE SURVEY
A search of the published literature yields the following works. Eljona Proko and Ilia Ninka discuss how performance test tools help developers discover performance bottlenecks in a system and choose a suitable platform for designing a web application [7]. Sharmila and Ramadevi presented performance testing concepts, objectives, techniques and available tools for testing the performance of web applications; they found that performance testing is used to determine the responsiveness, throughput, quality and scalability of a system under a given workload [8].
Vokolos and Weyuker discussed approaches to software performance testing; a case study describes their experience with the methodologies used to test the performance of a system deployed as a gateway in a large industrial client/server transaction processing application [9]. Korel and Al-Yami discussed automating the regression testing process, which involves testing the changed program with test cases in order to establish confidence that the system will perform according to the modified specification [10].
Yang and Pollock presented tools for structural load testing that take program code as input, automatically determine whether the program needs to be load tested and, if so, automatically generate test data for structural load testing of the system [11]. Krishnamurthy described how poor performance can adversely affect the profitability of enterprises that depend on web applications; thorough performance testing strategies are therefore vital for understanding whether an online system will meet its performance targets when deployed in the real world [12].
Sharma and Angmo surveyed various web automation testing tools, which helps in understanding automation testing as well as the tools available for it [13]. Ahlawat and Tyagi discussed three well-known load testing tools, WAPT, LoadUI and Loadster, and analysed them in terms of average response time and optimal response rate; the results of the analysis help in the adoption and use of these tools [14].
SYSTEM ARCHITECTURE
As can be seen in Figure 1, the architecture of
the proposed work is a three-tier architecture
with a graphical user interface (GUI), a central
processing unit and a database [15].
Graphical User Interface (GUI)
The user is provided with a GUI through which the necessary inputs are supplied to the system. The user enters the hard-coded values for the different features of the application.
Database
The database used in this system is a dynamic
database. The database is designed at runtime
based on the values entered by the user. The
database is displayed on the GUI as soon as
the user enters the necessary values [16].
Requirement Module
In this module, the user or client begins by defining the requirements, which include the performance test objective, performance requirements, workload profile and performance goals. These requirements are also referred to as SLAs (service level agreements).
Scenario Identification
In this module, the user identifies the scenarios on the basis of the requirements defined in the SLAs, i.e. which scenario would best test the functionality or features of the system for each requirement [17].
Script Customization
This module first generates the load test scripts and then enhances them with performance scripting methods such as parameterization, correlation and transactions.
Scenario Execution
In this module, the load scenarios identified on the basis of the requirements from the requirement module are executed.
Result Analysis
This module is responsible for generating the graphs and reports from the dump of raw data produced while executing the scenarios in the scenario execution module.
Fig. 1: System Architecture of Proposed Work.

PROCESS FLOW
The process flow of the proposed system is depicted in Figure 2. The system consists of several modules that interact with each other to complete the working of the system [18].
Step-1
The first module is initiated by defining the requirements. Requirements can be defined in two ways: functional or non-functional. Functional requirements specify what the system should do, while non-functional requirements describe how the system performs [20].
Step-2
The scenario identification module is responsible for identifying the functionality of the application that needs to be tested.
Fig. 2: Process Flow Diagram of Proposed Work.
Step-3
The script customization module is a sequential module. It starts with the creation of a new script in VuGen and then records the Web Tours application according to the requirements. After recording, a coded VuGen script is generated, which is then enhanced with performance scripting methods such as parameterization, correlation, transactions and rendezvous points. After all enhancements, the script is replayed and verified to check that it works as intended.
Step-4
The database used in this system is a dynamic database. Data-driven testing requires data files from which test data and/or output data can be read instead of repeatedly inserting the same hard-coded values.
Step-5
The execution module runs in the Controller component of LoadRunner. In this module, all the load scenarios are executed.
Step-6
The result analysis module takes the raw data
from the controller and generates the graphs
and reports.
Step-7
The final results of all the load scenarios run with the proposed algorithm are displayed on the screen.
METHODOLOGY OF PROPOSED
WORK
The existing parameterization scripting method in LoadRunner involves the creation of a parameter list. HP LoadRunner has a friendly user interface, provides a rich code editor and offers many other features that help automate any web application. LoadRunner can create a parameter list through data-driven testing, which reads the test data and/or output values from a data file (such as an Excel file) instead of reusing the same hard-coded values every time the test runs. This data file is created manually by the user, which takes a large amount of time; since testing must be completed within the allotted duration so that the product can be deployed on schedule, a more efficient way to parameterize the script during automation testing is needed. The proposed algorithm generates random string functions that can be called whenever required while building the load test script. The significance of this extension is clear from the following example: checking the payment functionality in the Web Tours application requires a first name, last name and address. If a load test is to check the performance of Myntra.com for 5000 clients, the tester would need to create a data file with a first name, last name and address for each of the 5000 clients, which requires a lot of time; this is only one such case, and there can be many more. The proposed system has been implemented step by step to clearly distinguish the different parts of the system. The proposed technique has been implemented in C; it reduces the manual effort of parameterizing values and saves user time. The extension generates random string functions for the first name (sFirst), last name (sLast) and address (sAddress). The results indicate that the proposed system is able to reduce the data management time taken by the customer or client while scripting.
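As an illustration of how the generated functions replace a manually built data file, the sketch below shows an Action fragment in which the generators are called and the resulting parameters are referenced in a form submission. This is a minimal sketch, not the authors' exact script: the generator names (gen_sFirst, gen_sLast, gen_sAddress), the request URL and the form field names are illustrative assumptions, since the paper only states that the parameters are exposed as sFirst, sLast and sAddress from method.h.

```c
/* Minimal sketch (assumed names, not the authors' exact script).
   gen_sFirst(), gen_sLast() and gen_sAddress() are generator functions from
   method.h; each call refreshes the corresponding parameter before the request. */
Action()
{
    gen_sFirst();          /* fills the {sFirst} parameter with a random first name */
    gen_sLast();           /* fills the {sLast} parameter                           */
    gen_sAddress();        /* fills the {sAddress} parameter                        */

    web_submit_data("reservations.pl",                        /* assumed step name  */
        "Action=http://localhost:1080/cgi-bin/reservations.pl",
        "Method=POST",
        "Mode=HTML",
        ITEMDATA,
        "Name=firstName", "Value={sFirst}",   ENDITEM,        /* assumed field name */
        "Name=lastName",  "Value={sLast}",    ENDITEM,        /* assumed field name */
        "Name=address1",  "Value={sAddress}", ENDITEM,        /* assumed field name */
        LAST);

    return 0;
}
```

With this approach no data file has to be authored, whether the scenario runs 50 or 5000 Vusers.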
PSEUDOCODE OF PROPOSED
WORK
The pseudocode of the proposed work is as
follows:
Algorithm1: /*Generating the Flight
Booking Load Test Script*/
Step1: Start VuGen and create a new script;
Step2: Start recording the application;
Step3: After recording the complete
application, develop the load test script;
Step4: Run the script to check whether it is working as intended;
Step5: Enhance the script through performance scripting methods such as parameterization, correlation and transactions;
Step6: Define the transactions by writing one line of code before the request is made to the server and closing the transaction when the request ends;
Step7: Parameterize the hard-coded values
either through creating the parameter list or by
generating the string random function;
Step8: Correlate the dynamic values in the
script either by automatic correlation or by
manual correlation;
Step9: Insert comments in the script while
recording or after recording;
Step10: Replay and verify the load test script;
Step11: Save the script;
Step12: Use controller for executing the
various load scenarios;
Step13: Analyze the result and generate graphs
as well as reports.
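For orientation, the script produced in Steps 1-5 is ordinary C code organised into VuGen's three default sections; a bare skeleton is sketched below. The section names are VuGen defaults, and the comments map them onto the steps above; no application-specific code is implied.

```c
/* Skeleton of a VuGen Web-HTTP/HTML script (default section names). */
vuser_init()        /* runs once per Vuser: setup, e.g. login                      */
{
    return 0;
}

Action()            /* repeated each iteration: the recorded business flow,
                       enhanced per Steps 5-10 (transactions, parameterization,
                       correlation, comments, replay verification)                 */
{
    return 0;
}

vuser_end()         /* runs once at the end: cleanup, e.g. sign-off                */
{
    return 0;
}
```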
Algorithm2: /*Generating Random String
Functions*/
Step1: Start with naming the function by
which it is called when needed;
Step2: Declare the variables and array which
are required with their data types, sizes, and
values;
Step3: Initialize the Temp_string variable with an empty (null) string using the strcpy function;
Step4: Use for loop from i=0 to 10;
Step5: Generate random numbers through
rand() function and store their remainder
values in Ret_value;
Step6: Concatenate the strings;
Step7: Then save the concatenated string
through lr_save_string function;
Step8: Terminate the function.
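A minimal C sketch of Algorithm 2 follows, written as a function that would live in method.h of the VuGen script. The 10-character length and the variable roles (Temp_string, Ret_value) follow the pseudocode; the character set and exact formatting are assumptions, since the paper does not list them. lr_save_string is the LoadRunner call that exposes the generated value as the {sFirst} parameter, and the standard C functions (strcpy, rand) are available inside VuGen.

```c
/* Sketch of Algorithm 2: generate a random first name and expose it as {sFirst}.
   Character set and length are assumptions based on Step 4 of the pseudocode. */
void gen_sFirst(void)
{
    char temp_string[16];                        /* Step 2: declare buffer             */
    char charset[] = "abcdefghijklmnopqrstuvwxyz";
    int  i, ret_value;

    strcpy(temp_string, "");                     /* Step 3: start from an empty string */

    for (i = 0; i < 10; i++) {                   /* Step 4: build 10 characters        */
        ret_value = rand() % 26;                 /* Step 5: remainder of rand()        */
        temp_string[i] = charset[ret_value];     /* Step 6: append the character       */
    }
    temp_string[10] = '\0';

    lr_save_string(temp_string, "sFirst");       /* Step 7: save as a parameter        */
}                                                /* Step 8: terminate                  */
```

Analogous functions for sLast and sAddress would differ only in the parameter name passed to lr_save_string and, for the address, perhaps in the length and character set used.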
IMPLEMENTATION AND
WORKING OF PROPOSED SYSTEM
The internal working of the system is discussed in this section. First, the implementation details of the system are presented, and then its working is described in detail.
Implementation
The system has been implemented in the C language. Both algorithms, i.e. the existing parameterization method and the proposed algorithm for generating string random functions, have been implemented in C. The tool used for performance testing is HP LoadRunner, an automated testing tool.
Working
This section explains the functioning of the designed system with the help of actual snapshots of the running system. HP LoadRunner (LR) is used to conduct load testing before, during, and after application deployment. The components of LoadRunner are as follows: the LR Virtual User Generator (VuGen) records virtual user (Vuser) scripts; the LR Controller creates, maintains and executes scenarios; LR Analysis provides graphs and reports. In this work, the Web Tours application that comes packaged with LoadRunner is used. HP Web Tours requires the bundled Web Tours Apache server to be running before it can work.
Defining the Requirements
For the Web Tours application, Table 1 lists the performance requirements. It contains the test case ID, test scenario description, expected result, actual result, and pass/fail status (if the actual result matches the expected result, the load scenario is considered PASS; otherwise FAIL).
Creating a New Script in VuGen
The window shown in Figure 3 is used to choose the protocol for creating a new script. As soon as the Create button is clicked, VuGen opens the IDE (integrated development environment), or code editor, in which the script files are empty except for the basic skeleton of the Action function. HP Web Tours requires the Web-HTTP/HTML protocol.
Table 1: Requirement Analysis.
Unique Test Case Id | Test Scenario Description | Expected Result | Actual Result | Pass/Fail
T1 | Run a load test with 5 Vusers for 10 min | Response time must be within 3 sec | - | -
T2 | Run a load test with 25 Vusers for 10 min | Response time must be within 3 sec | - | -
T3 | Run a load test with 50 Vusers for 10 min | Response time must be within 3 sec | - | -
Fig. 3: Creating a New Script.
Fig. 4: Recording the Application.
Recording the Application
Once VuGen has opened the code editor, recording is initiated by clicking the Record button on the toolbar. As soon as the user hits the Record button, a Start Recording window pops up, as shown in Figure 4. When the user clicks Start Recording, VuGen opens the HP Web Tours application and a floating recording bar appears, which provides numerous controls over the recording. The user records the complete script by exercising different functionalities of the application, such as login, flight booking, itinerary and sign-off.
Generating the Code Script
After all the functionalities of the application have been exercised, recording is stopped by clicking the Stop button on the floating bar, which generates the coded script shown in Figure 5. The process flow of the script can easily be followed in the step navigator on the left side of the editor.
Enhance the Script
When a script is recorded, it captures a single, linear flow through the subject application. The script can therefore be enhanced on the basis of the defined requirements through several techniques, which are:

Using Transactions
A transaction measures the time taken by the system for a specific request. Applying transactions is straightforward, as shown in Figure 6: a start-transaction marker is placed before the request and an end-transaction marker after it.
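The sketch below shows this pattern for a single request. The transaction name and the URL are illustrative placeholders, not taken from the paper; lr_start_transaction and lr_end_transaction are the LoadRunner calls that delimit the measured interval.

```c
/* Wrapping one request in a transaction (illustrative names and URL). The time
   elapsed between the two calls is reported by the Controller and Analysis as
   the "flight_booking" transaction. */
lr_start_transaction("flight_booking");

web_url("search_flights",                            /* assumed step name and URL */
    "URL=http://localhost:1080/cgi-bin/welcome.pl?page=search",
    "Resource=0",
    "Mode=HTML",
    LAST);

lr_end_transaction("flight_booking", LR_AUTO);       /* LR_AUTO: pass/fail status
                                                        taken from the request    */
```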
Fig. 5: Code Generation.
Fig. 6: Enhancing the Script by Using Transactions.
Fig. 7: Parameterized List through Excel File.
Parameterization
A parameter in VuGen is a container that holds a recorded value and substitutes different values for different users. It also helps decrease the script size. Parameterization can be done through various techniques; this research covers two of them:
• By Creating Parameter List
LoadRunner can create a parameter list through data-driven testing, which reads the test data and/or output values from a data file (such as an Excel file) instead of reusing the same hard-coded values each time the test runs, as shown in Figure 7.
• By Generating String Random Functions
This is the LoadRunner extension on which this research paper is based. The extension generates random string functions for the first name (sFirst), last name (sLast) and address (sAddress). The functions are created in a file called method.h, which can be seen on the left side of the window in Figure 8. This parameterization technique follows the algorithm for generating the functions given as Algorithm 2 in the pseudocode section.
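One supporting sketch, under the assumption that the generator functions rely on the C rand() call: seeding the random number generator once per Vuser (for example in vuser_init) keeps different Vusers from producing identical sequences of names. Combining the Vuser id with the clock, as below, is an illustrative choice rather than something specified in the paper.

```c
/* Illustrative per-Vuser seeding (assumes the generators in method.h use rand()).
   lr_whoami() returns the current Vuser id; mixing it with the clock gives each
   Vuser a distinct seed. */
vuser_init()
{
    int   vuser_id;
    char *vuser_group;
    int   scid;

    lr_whoami(&vuser_id, &vuser_group, &scid);
    srand((unsigned int)time(NULL) + (unsigned int)vuser_id);

    return 0;
}
```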
Fig. 8: Parameterization through Generating String Random Functions.
Correlation
Fig. 9: Correlating the Dynamic Value.
Correlation is the process of defining a relationship between two variables or entities. It is used to handle dynamic values, i.e. values that change every time the same steps are repeated, such as session IDs, cookies and dates. This research uses the web_reg_save_param function to correlate the Web Tours session ID, because its value is different every time the script is recorded or replayed, as shown in Figure 9.
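A minimal sketch of this pattern follows: web_reg_save_param is registered immediately before the request whose response contains the dynamic value, and the captured value is then referenced as a parameter in later requests. The step names, the left/right boundaries and the demo credentials are illustrative assumptions; the actual boundaries for the Web Tours session id depend on the recorded server response.

```c
/* Capture the session id from the next response (boundaries are assumptions;
   adjust LB/RB to match the actual response text). */
web_reg_save_param("SessionID",
    "LB=name=\"userSession\" value=\"",      /* left boundary  (assumed) */
    "RB=\"",                                 /* right boundary (assumed) */
    "NotFound=warning",
    LAST);

web_url("nav",                               /* request whose response contains it */
    "URL=http://localhost:1080/cgi-bin/nav.pl?in=home",
    "Resource=0",
    "Mode=HTML",
    LAST);

/* Later requests send the captured value instead of the hard-coded recorded one. */
web_submit_data("login.pl",
    "Action=http://localhost:1080/cgi-bin/login.pl",
    "Method=POST",
    "Mode=HTML",
    ITEMDATA,
    "Name=userSession", "Value={SessionID}", ENDITEM,
    "Name=username",    "Value=jojo",        ENDITEM,   /* demo credentials */
    "Name=password",    "Value=bean",        ENDITEM,
    LAST);
```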
Using Comments
Comments make the code understandable for anyone who refers to it later. They give information about a particular operation and separate two sections for clarity, as shown in Figure 10.
Fig. 10: Inserting Comments to the Script.
Fig. 11: Replay and Verify the Script.
Replay and Verify the Script
Once all the enhancements have been made to the script, the Replay button on the toolbar is clicked, as shown in Figure 11. The purpose of this run is to ensure that the script is working correctly.
Use Controller for Load Scenarios
The Controller is the component that controls the overall load test. It provides two types of scenarios: manual scenarios and goal-oriented scenarios; this research focuses on manual scenarios. Once the Play button is clicked, the Controller switches from the Design tab to the Run tab, as shown in Figure 12. Figure 12 shows the flight reservation load test with 5 Vusers for 10 min; similarly, load tests with 25 Vusers and 50 Vusers for 10 min each are executed. The top right of the window contains the scenario status block, which reports passed transactions, failed transactions, running Vusers, elapsed time, errors and so on.
Fig. 12: Flight Reservation Load Test with 5 Vusers for 10 min.
Fig. 13: Analysis Report of 5 Vusers.
Analyze the Results
HP Analysis is the component used to perform detailed investigation of a completed performance test. Analysis takes the raw data produced by the Controller during execution of the performance test; this dump contains all the data in raw format, which is parsed by HP Analysis to produce various graphs and reports. Once processing is complete, the main session window appears, as shown in Figure 13. Figure 13 shows the analysis report for the load test of 5 Vusers for 10 min; the analysis reports for the 25-Vuser and 50-Vuser load tests are generated in the same way.
RESULTS
The performance results and graphs of the load test scripts are presented in this section. First, the tools are analysed on different parameters such as cost, installation and stability. Then, test reports of the workload tests for minimal (5 Vusers), normal (25 Vusers) and peak (50 Vusers) loads are shown. Afterwards, the proposed work is compared against the systems already existing in the market in terms of parameterization and customization.
Comparison of HP LoadRunner and
JMeter
After analyzing and working on both the tools,
the following comparison has been established
as shown in Table 2.
Table 2 shows a descriptive comparison between HP LoadRunner and JMeter. On the basis of Table 2, each parameter is assigned a rank from 0 to 7; for example, LoadRunner has a higher execution speed than JMeter, so LoadRunner gets the higher rank of 5 and JMeter gets the lower rank of 2, and in a similar way values are assigned to all the parameters.
Graph 1 shows a graphical comparison between LoadRunner and JMeter. Each factor is compared and assigned a value from 0 to 7: procurement cost is 5 for LoadRunner and 2 for JMeter, installation is 4 for LoadRunner and 1 for JMeter, and so on, as shown in Graph 1. Hence, the overall scores for the two tools are:

For LoadRunner, the average parameter value is (5 + 4 + 3 + 5 + 5 + 6)/6 = 28/6 ≈ 4.67.

For JMeter, the average parameter value is (2 + 1 + 5 + 3 + 2 + 3)/6 = 16/6 ≈ 2.67.

Based on this comparison, HP LoadRunner is preferred, as it is considerably more stable than JMeter.
Table 2: Comparison between LoadRunner and JMeter.
Parameters | HP LoadRunner | JMeter | Comment
Procurement cost | 5 | 2 | LoadRunner is a licensed tool with a high support cost, while JMeter is a free open-source tool.
Installation | 4 | 1 | LoadRunner has high disk utilisation compared to JMeter.
Scripts | 3 | 5 | LoadRunner hides scripts in levels to improve readability, while JMeter shows the script depicting the HTTP flow of the scenario.
Result | 5 | 3 | LoadRunner offers excellent analysis and dynamic graph generation, while JMeter has limited graph generation.
Execution speed | 5 | 2 | LoadRunner has a higher execution speed than JMeter.
Analysis and monitoring | 6 | 3 | LoadRunner has an inbuilt analysis component, while JMeter is not as strong in this respect.
Graph 1: Graphical Comparison between Load Runner and JMeter.
Table 3: Average Response Time vs. Vusers (HP LoadRunner).
No. of Vusers | Login | Flight Booking | Select Flight | Fill Details | Signoff | - | Expected Result (sec) | Pass/Fail
5 | 0.382 | 0.528 | 0.341 | 0.317 | 0.331 | 0.309 | 3 | Pass
25 | 0.48 | 0.648 | 0.398 | 0.423 | 0.466 | 0.362 | 3 | Pass
50 | 0.652 | 0.852 | 0.565 | 0.523 | 0.6 | 0.541 | 3 | Pass
(Transaction columns give the average response time in seconds.)
Graph 2: Average Response Time vs. Vusers.
Test Reports of the Load Test Script
In this section, the summary report of the workload tests, i.e. minimal (5 Vusers), normal (25 Vusers) and peak (50 Vusers) loads, is presented in Table 3.

From this summary report it can easily be observed that as the number of Vusers increases, the average response time also increases. For example, the average response time for login is 0.382 sec for 5 Vusers, 0.48 sec for 25 Vusers, and 0.652 sec for 50 Vusers. The response time must remain within 3 sec for a transaction to pass, i.e. for the actual result to match the expected result.
Graph 2 shows graphically how the average response time of the transactions increases as the number of Vusers is increased. The maximum average response time for the flight reservation load test script is 0.852 sec (at 50 Vusers) and the minimum is 0.382 sec (at 5 Vusers). Hence, it is clear from the graph that as the load increases, the average response time also increases.
Comparison of Proposed System with the
Existing Systems
In this section, the proposed system is compared with the tools available in the market, namely the existing HP LoadRunner and JMeter, on factors such as the time spent managing data before and after the enhancement, the method of parameterization and customization, and the size of the data file, as shown in Table 4.

Table 4 presents the descriptive comparison between the proposed system and the existing systems on the following parameters: time, parameterization and customization, and size of file.
Table 4: Comparison of Proposed System with the Existing Systems.
Parameters | Existing HP LoadRunner | Proposed System | JMeter
Time | 3-4 h for 50 Vusers | 30 min | 5-6 h
Parameterization and customization | By creating the parameterized list for 50 Vusers | By generating string random functions | Somewhat complex to parameterize for each Vuser
Size of file | Approx. 1 GB | Reduced to 10 kB | High disk utilization
Graph 3: Comparison of Proposed System with the Existing Systems.
Graph 3 gives a graphical representation of the parameters compared between the proposed system and the existing systems. As can be seen from the graph, the proposed algorithm for generating string random functions reduces the time taken by the client to manage data while scripting the load test, makes parameterization and customization of the script easier than before, and reduces the size of the data file, which benefits the execution of load tests in the Controller.

In real-life scenarios, the workload is not limited to 50 Vusers. For example, assume the load test is to check the performance of Myntra.com for 5000 users. In a realistic situation, these 5000 clients will not all be on the login page but on various pages of the site. Hence, in order to parameterize all the hard-coded values for 5000 Vusers, the proposed algorithm can be used instead of creating a parameter list.
CONCLUSION
Performance testing is key to increasing an organization's profit and reducing risk in mission-critical software systems. Load testing tools are used to monitor the performance of a system. This work finds HP LoadRunner to be the strongest tool for load testing due to its robustness and stability. The tool provides all the performance scripting methods, such as parameterization, correlation and transactions, which help reduce human effort and save user time. However, it does not address the time spent by the user on data management while parameterizing the script. In response to this problem, this paper presented the generation of string random functions that can easily be called when needed while developing the load test script. On the basis of the reported comparison, the proposed parameterization technique outperforms the existing techniques.
REFERENCES
1. Padhy N, Mishra P, Panigrahi R. Software
Performance Testing Scheme Using
Virtualization Technology. IJCSEIT.
2012; 2(3): 43–58p.
2. Smita S, Sharma P. Use of Automation
Tools in Various Field: A Survey Paper.
IOSRJCE. 2014; 16(3): 18–21p.
3. Jain N, Srivastava V. Performance Testing
Techniques: A Survey Paper.
International Journal of Research in
Engineering and Technology (IJRET).
2013; 02(11): 116–119p.
4. Rajagopal R. Evaluating Load Generation
in Virtualized Environments for Software
Performance Testing. Int J Database
Manag Syst. 2011; 3(4).
5. Kaur N, Amandeep Kaur Mann. Survey
Paper on HP Load Runner Techniques.
IJSETR. 2013; 2(4).
6. Mansur M, Noor M, Sap M. Performance Testing and its Techniques: A Research Perspective. Proceedings of the Postgraduate Annual Research Seminar. 2011.
7. Singh K, Upadhyaya S. Designing
Approach Analysis on Small-Scale
Software Performance Testing Tools.
International Journal of Computer
Science. 2012; 9(3).
8. Vijayarani S, Nithya S. An Efficient
Scripting Algorithm for Performance
Testing. International Journal of
Computer Applications (IJCA). 2011;
32(7).
9. Chamatkar A, Butey PK. Implementation
of Different Performance Testing
Algorithms with Server Networks.
ICCUBEA. 2005; 374–378p.
10. Lata S, Ramaraj N. Algorithm for
Efficient Software Testing. Conference on
Computational Intelligence and
Multimedia Applications. 2007; 2: 66–70p.
11. Yu H, Huang X, Hu X, et al. An
Environment for Load Creation and
Supply for Testing Switching Software.
ICMeCG. 2010; 35–38p.
12. Yao J, Zhong R. Comparative Study of
Testing Tools: Apache Jmeter and Load
Runner. 2008; 1: 212–215p.
13. Florez G, Bridges S, Vaughn R. An
Improved Algorithm for Load Testing on
Various Performance Test Tools. NAFPIS.
2002; 457–462p.
14. Wang A, Yang Y, Yang Z. Research on
Testing Algorithms Based on
Parameterization Methods. WiCOM. 2008;
1–4p.
15. Yue J, Mao S, Li M, et al. A Comparative
Study of QTP and Load Runner
Automated Testing Tools and Their
Contributions to Software Project
Scenario. 22nd International Conference
on Geoinformatics. 2014.
16. Zeidat N, Zhao Z. A Comparative Study
on Load Testing Tools. ICTAI. 2011.
17. Ray S, Pakhira M. Software Testing
Framework Process. ICCT. 2011.
18. Dong X, Zhang Z. Research and
Implementation of Correlation Algorithm
with Time Constraints. ICCSS. 2014.
19. Thirumurgan S, Suresh L. Research of
Load Testing and Result Application
Based on Load Runner. Wireless, Mobile
and Multimedia Networks IET
International Conference. 2008.
20. Pakhira M. Analyzing and Testing Web
Application Performance. ICIT
International Conference. 2008.
More Related Content

What's hot

Bindu Chintalapudi - Software Testing -latest (1)
Bindu Chintalapudi - Software Testing -latest (1)Bindu Chintalapudi - Software Testing -latest (1)
Bindu Chintalapudi - Software Testing -latest (1)bindu chintalapudi
 
SUMMIT 09 - Startegic Choice Of Test Automation Framework
SUMMIT 09 - Startegic Choice Of Test Automation FrameworkSUMMIT 09 - Startegic Choice Of Test Automation Framework
SUMMIT 09 - Startegic Choice Of Test Automation FrameworkLavanya Lakshman
 
Cost Based Performance Modelling
Cost Based Performance ModellingCost Based Performance Modelling
Cost Based Performance ModellingEugene Margulis
 
Software Engineering unit 5
Software Engineering unit 5Software Engineering unit 5
Software Engineering unit 5Abhimanyu Mishra
 
SOFTWARE REQUIREMENT CHANGE EFFORT ESTIMATION MODEL PROTOTYPE TOOL FOR SOFTWA...
SOFTWARE REQUIREMENT CHANGE EFFORT ESTIMATION MODEL PROTOTYPE TOOL FOR SOFTWA...SOFTWARE REQUIREMENT CHANGE EFFORT ESTIMATION MODEL PROTOTYPE TOOL FOR SOFTWA...
SOFTWARE REQUIREMENT CHANGE EFFORT ESTIMATION MODEL PROTOTYPE TOOL FOR SOFTWA...ijseajournal
 
STATISTICAL ANALYSIS FOR PERFORMANCE COMPARISON
STATISTICAL ANALYSIS FOR PERFORMANCE COMPARISONSTATISTICAL ANALYSIS FOR PERFORMANCE COMPARISON
STATISTICAL ANALYSIS FOR PERFORMANCE COMPARISONijseajournal
 
SE18_Lec 01_Introduction to Software Engineering
SE18_Lec 01_Introduction to Software EngineeringSE18_Lec 01_Introduction to Software Engineering
SE18_Lec 01_Introduction to Software EngineeringAmr E. Mohamed
 
Software Process Models
 Software Process Models  Software Process Models
Software Process Models MohsinAli773
 
CONTROLLING INFORMATION FLOWS DURING SOFTWARE DEVELOPMENT
CONTROLLING INFORMATION FLOWS DURING SOFTWARE DEVELOPMENT CONTROLLING INFORMATION FLOWS DURING SOFTWARE DEVELOPMENT
CONTROLLING INFORMATION FLOWS DURING SOFTWARE DEVELOPMENT ijsptm
 
15 si(systems analysis and design )
15 si(systems analysis and design )15 si(systems analysis and design )
15 si(systems analysis and design )Nurdin Al-Azies
 
Automation Testing of Web based Application with Selenium and HP UFT (QTP)
Automation Testing of Web based Application with Selenium and HP UFT (QTP)Automation Testing of Web based Application with Selenium and HP UFT (QTP)
Automation Testing of Web based Application with Selenium and HP UFT (QTP)IRJET Journal
 
Evaluating the Quality of Software in ERP Systems Using the ISO 9126 Model
Evaluating the Quality of Software in ERP Systems Using the ISO 9126 Model Evaluating the Quality of Software in ERP Systems Using the ISO 9126 Model
Evaluating the Quality of Software in ERP Systems Using the ISO 9126 Model ijasa
 

What's hot (19)

Performance Testing: Analyzing Differences of Response Time between Performan...
Performance Testing: Analyzing Differences of Response Time between Performan...Performance Testing: Analyzing Differences of Response Time between Performan...
Performance Testing: Analyzing Differences of Response Time between Performan...
 
Bindu Chintalapudi - Software Testing -latest (1)
Bindu Chintalapudi - Software Testing -latest (1)Bindu Chintalapudi - Software Testing -latest (1)
Bindu Chintalapudi - Software Testing -latest (1)
 
SUMMIT 09 - Startegic Choice Of Test Automation Framework
SUMMIT 09 - Startegic Choice Of Test Automation FrameworkSUMMIT 09 - Startegic Choice Of Test Automation Framework
SUMMIT 09 - Startegic Choice Of Test Automation Framework
 
Cost Based Performance Modelling
Cost Based Performance ModellingCost Based Performance Modelling
Cost Based Performance Modelling
 
Software Engineering unit 5
Software Engineering unit 5Software Engineering unit 5
Software Engineering unit 5
 
RAD10987USEN.PDF
RAD10987USEN.PDFRAD10987USEN.PDF
RAD10987USEN.PDF
 
SOFTWARE REQUIREMENT CHANGE EFFORT ESTIMATION MODEL PROTOTYPE TOOL FOR SOFTWA...
SOFTWARE REQUIREMENT CHANGE EFFORT ESTIMATION MODEL PROTOTYPE TOOL FOR SOFTWA...SOFTWARE REQUIREMENT CHANGE EFFORT ESTIMATION MODEL PROTOTYPE TOOL FOR SOFTWA...
SOFTWARE REQUIREMENT CHANGE EFFORT ESTIMATION MODEL PROTOTYPE TOOL FOR SOFTWA...
 
Resume-Ramchandra Gupta
Resume-Ramchandra GuptaResume-Ramchandra Gupta
Resume-Ramchandra Gupta
 
STATISTICAL ANALYSIS FOR PERFORMANCE COMPARISON
STATISTICAL ANALYSIS FOR PERFORMANCE COMPARISONSTATISTICAL ANALYSIS FOR PERFORMANCE COMPARISON
STATISTICAL ANALYSIS FOR PERFORMANCE COMPARISON
 
Cots testing
Cots testingCots testing
Cots testing
 
Ia rm001 -en-p
Ia rm001 -en-pIa rm001 -en-p
Ia rm001 -en-p
 
Customizing iso 9126 quality model for evaluation of b2 b applications
Customizing iso 9126 quality model for evaluation of b2 b applicationsCustomizing iso 9126 quality model for evaluation of b2 b applications
Customizing iso 9126 quality model for evaluation of b2 b applications
 
Automation Testing Syllabus - Checklist
Automation Testing Syllabus - ChecklistAutomation Testing Syllabus - Checklist
Automation Testing Syllabus - Checklist
 
SE18_Lec 01_Introduction to Software Engineering
SE18_Lec 01_Introduction to Software EngineeringSE18_Lec 01_Introduction to Software Engineering
SE18_Lec 01_Introduction to Software Engineering
 
Software Process Models
 Software Process Models  Software Process Models
Software Process Models
 
CONTROLLING INFORMATION FLOWS DURING SOFTWARE DEVELOPMENT
CONTROLLING INFORMATION FLOWS DURING SOFTWARE DEVELOPMENT CONTROLLING INFORMATION FLOWS DURING SOFTWARE DEVELOPMENT
CONTROLLING INFORMATION FLOWS DURING SOFTWARE DEVELOPMENT
 
15 si(systems analysis and design )
15 si(systems analysis and design )15 si(systems analysis and design )
15 si(systems analysis and design )
 
Automation Testing of Web based Application with Selenium and HP UFT (QTP)
Automation Testing of Web based Application with Selenium and HP UFT (QTP)Automation Testing of Web based Application with Selenium and HP UFT (QTP)
Automation Testing of Web based Application with Selenium and HP UFT (QTP)
 
Evaluating the Quality of Software in ERP Systems Using the ISO 9126 Model
Evaluating the Quality of Software in ERP Systems Using the ISO 9126 Model Evaluating the Quality of Software in ERP Systems Using the ISO 9126 Model
Evaluating the Quality of Software in ERP Systems Using the ISO 9126 Model
 

Viewers also liked

Extending VuGen 11.5 with custom add-ins
Extending VuGen 11.5 with custom add-insExtending VuGen 11.5 with custom add-ins
Extending VuGen 11.5 with custom add-insstuartmoncrieff
 
Performance Testing in a Mobile World
Performance Testing in a Mobile WorldPerformance Testing in a Mobile World
Performance Testing in a Mobile Worldstuartmoncrieff
 
Introduction to performance testing
Introduction to performance testingIntroduction to performance testing
Introduction to performance testingRichard Bishop
 
Performance Testing in the Cloud
Performance Testing in the CloudPerformance Testing in the Cloud
Performance Testing in the Cloudstuartmoncrieff
 
Performance Test Plan - Sample 2
Performance Test Plan - Sample 2Performance Test Plan - Sample 2
Performance Test Plan - Sample 2Atul Pant
 
An Introduction to Performance Testing
An Introduction to Performance TestingAn Introduction to Performance Testing
An Introduction to Performance TestingSWAAM Tech
 

Viewers also liked (7)

Extending VuGen 11.5 with custom add-ins
Extending VuGen 11.5 with custom add-insExtending VuGen 11.5 with custom add-ins
Extending VuGen 11.5 with custom add-ins
 
How to start performance testing project
How to start performance testing projectHow to start performance testing project
How to start performance testing project
 
Performance Testing in a Mobile World
Performance Testing in a Mobile WorldPerformance Testing in a Mobile World
Performance Testing in a Mobile World
 
Introduction to performance testing
Introduction to performance testingIntroduction to performance testing
Introduction to performance testing
 
Performance Testing in the Cloud
Performance Testing in the CloudPerformance Testing in the Cloud
Performance Testing in the Cloud
 
Performance Test Plan - Sample 2
Performance Test Plan - Sample 2Performance Test Plan - Sample 2
Performance Test Plan - Sample 2
 
An Introduction to Performance Testing
An Introduction to Performance TestingAn Introduction to Performance Testing
An Introduction to Performance Testing
 

Similar to 1. PERFORMANCE TESTING ON WEB APPLICATION THROUGH HP LOAD1

Load Runner Methodology to Performance Testing
Load Runner Methodology to Performance TestingLoad Runner Methodology to Performance Testing
Load Runner Methodology to Performance Testingijtsrd
 
Performance testing interview questions and answers
Performance testing interview questions and answersPerformance testing interview questions and answers
Performance testing interview questions and answersGaruda Trainings
 
Unit Testing Essay
Unit Testing EssayUnit Testing Essay
Unit Testing EssayDani Cox
 
A STUDY OF FORMULATION OF SOFTWARE TEST METRICS FOR INTERNET BASED APPLICATIONS
A STUDY OF FORMULATION OF SOFTWARE TEST METRICS FOR INTERNET BASED APPLICATIONSA STUDY OF FORMULATION OF SOFTWARE TEST METRICS FOR INTERNET BASED APPLICATIONS
A STUDY OF FORMULATION OF SOFTWARE TEST METRICS FOR INTERNET BASED APPLICATIONSecij
 
Top 20 LoadRunner Interview Questions and Answers in 2023.pdf
Top 20 LoadRunner Interview Questions and Answers in 2023.pdfTop 20 LoadRunner Interview Questions and Answers in 2023.pdf
Top 20 LoadRunner Interview Questions and Answers in 2023.pdfAnanthReddy38
 
RESEARCH ON DISTRIBUTED SOFTWARE TESTING PLATFORM BASED ON CLOUD RESOURCE
RESEARCH ON DISTRIBUTED SOFTWARE TESTING  PLATFORM BASED ON CLOUD RESOURCERESEARCH ON DISTRIBUTED SOFTWARE TESTING  PLATFORM BASED ON CLOUD RESOURCE
RESEARCH ON DISTRIBUTED SOFTWARE TESTING PLATFORM BASED ON CLOUD RESOURCEijcses
 
Research Inventy : International Journal of Engineering and Science
Research Inventy : International Journal of Engineering and ScienceResearch Inventy : International Journal of Engineering and Science
Research Inventy : International Journal of Engineering and Scienceinventy
 
Silk Performer Presentation v1
Silk Performer Presentation v1Silk Performer Presentation v1
Silk Performer Presentation v1Sun Technlogies
 
A novel approach for evaluation of applying ajax in the web site
A novel approach for evaluation of applying ajax in the web siteA novel approach for evaluation of applying ajax in the web site
A novel approach for evaluation of applying ajax in the web siteeSAT Publishing House
 
Performance Testing
Performance TestingPerformance Testing
Performance TestingSelin Gungor
 
Test Case Optimization and Redundancy Reduction Using GA and Neural Networks
Test Case Optimization and Redundancy Reduction Using GA and Neural Networks Test Case Optimization and Redundancy Reduction Using GA and Neural Networks
Test Case Optimization and Redundancy Reduction Using GA and Neural Networks IJECEIAES
 
Determination of Software Release Instant of Three-Tier Client Server Softwar...
Determination of Software Release Instant of Three-Tier Client Server Softwar...Determination of Software Release Instant of Three-Tier Client Server Softwar...
Determination of Software Release Instant of Three-Tier Client Server Softwar...Waqas Tariq
 
What is Performance Testing?
What is Performance Testing?What is Performance Testing?
What is Performance Testing?QA InfoTech
 
Comparative Study on Different Mobile Application Frameworks
Comparative Study on Different Mobile Application FrameworksComparative Study on Different Mobile Application Frameworks
Comparative Study on Different Mobile Application FrameworksIRJET Journal
 

Similar to 1. PERFORMANCE TESTING ON WEB APPLICATION THROUGH HP LOAD1 (20)

Load Runner Methodology to Performance Testing
Load Runner Methodology to Performance TestingLoad Runner Methodology to Performance Testing
Load Runner Methodology to Performance Testing
 
Performance testing interview questions and answers
Performance testing interview questions and answersPerformance testing interview questions and answers
Performance testing interview questions and answers
 
Ka3517391743
Ka3517391743Ka3517391743
Ka3517391743
 
Unit Testing Essay
Unit Testing EssayUnit Testing Essay
Unit Testing Essay
 
A STUDY OF FORMULATION OF SOFTWARE TEST METRICS FOR INTERNET BASED APPLICATIONS
A STUDY OF FORMULATION OF SOFTWARE TEST METRICS FOR INTERNET BASED APPLICATIONSA STUDY OF FORMULATION OF SOFTWARE TEST METRICS FOR INTERNET BASED APPLICATIONS
A STUDY OF FORMULATION OF SOFTWARE TEST METRICS FOR INTERNET BASED APPLICATIONS
 
Top 20 LoadRunner Interview Questions and Answers in 2023.pdf
Top 20 LoadRunner Interview Questions and Answers in 2023.pdfTop 20 LoadRunner Interview Questions and Answers in 2023.pdf
Top 20 LoadRunner Interview Questions and Answers in 2023.pdf
 
Chapter 5 - Tools
Chapter 5 - ToolsChapter 5 - Tools
Chapter 5 - Tools
 
RESEARCH ON DISTRIBUTED SOFTWARE TESTING PLATFORM BASED ON CLOUD RESOURCE
RESEARCH ON DISTRIBUTED SOFTWARE TESTING  PLATFORM BASED ON CLOUD RESOURCERESEARCH ON DISTRIBUTED SOFTWARE TESTING  PLATFORM BASED ON CLOUD RESOURCE
RESEARCH ON DISTRIBUTED SOFTWARE TESTING PLATFORM BASED ON CLOUD RESOURCE
 
Research Inventy : International Journal of Engineering and Science
Research Inventy : International Journal of Engineering and ScienceResearch Inventy : International Journal of Engineering and Science
Research Inventy : International Journal of Engineering and Science
 
Q44098893
Q44098893Q44098893
Q44098893
 
Silk Performer Presentation v1
Silk Performer Presentation v1Silk Performer Presentation v1
Silk Performer Presentation v1
 
A novel approach for evaluation of applying ajax in the web site
A novel approach for evaluation of applying ajax in the web siteA novel approach for evaluation of applying ajax in the web site
A novel approach for evaluation of applying ajax in the web site
 
Performance Testing
Performance TestingPerformance Testing
Performance Testing
 
Test Case Optimization and Redundancy Reduction Using GA and Neural Networks
Test Case Optimization and Redundancy Reduction Using GA and Neural Networks Test Case Optimization and Redundancy Reduction Using GA and Neural Networks
Test Case Optimization and Redundancy Reduction Using GA and Neural Networks
 
Performance testing wreaking balls
Performance testing wreaking ballsPerformance testing wreaking balls
Performance testing wreaking balls
 
Determination of Software Release Instant of Three-Tier Client Server Softwar...
Determination of Software Release Instant of Three-Tier Client Server Softwar...Determination of Software Release Instant of Three-Tier Client Server Softwar...
Determination of Software Release Instant of Three-Tier Client Server Softwar...
 
QSpiders - Introduction to HP Load Runner
QSpiders - Introduction to HP Load RunnerQSpiders - Introduction to HP Load Runner
QSpiders - Introduction to HP Load Runner
 
What is Performance Testing?
What is Performance Testing?What is Performance Testing?
What is Performance Testing?
 
Comparative Study on Different Mobile Application Frameworks
Comparative Study on Different Mobile Application FrameworksComparative Study on Different Mobile Application Frameworks
Comparative Study on Different Mobile Application Frameworks
 
load_testing
load_testingload_testing
load_testing
 

1. PERFORMANCE TESTING ON WEB APPLICATION THROUGH HP LOAD1

  • 1. JoWET (2016) 1-14 © STM Journals 2016. All Rights Reserved Page 1 Journal of Web Engineering & Technology ISSN: 2455-1880(online) Volume 3, Issue 2 www.stmjournals.com Performance Testing on Web Application through HP Load Runner with Parameterization and Customization Jatin Aggarwal*, Arun Solanki Department of Computer Science and Engineering, Gautam Buddha University, Greater Noida, India Abstract This paper provides an extension or enhancement in HP loadrunner tool which is a software automated testing tool used to carry out the intensive performance testing. The extension aims in designing the functions which are used to reduce the data management time taken by the client or the user in different scripting methods like-parameterization, correlation, and validation checks. The proposed technique has been implemented in C. Proposed scheme reduces the human efforts in parameterizing the values and saves user time. The paper shows the brief comparison results between the proposed scheme and in the existing HP loadrunner tool which is available in the market. Keywords: Software testing, performance testing, HP loadrunner, automation testing, C *Author for Correspondence E-mail: jatinatj12@gmail.com INTRODUCTION Software testing is an approach which used to recognize the completeness, correctness and the quality of developed software [1]. Testing is completely based on the requirements i.e. which part or functionalities need to be tested, how much testing is required to test the software [2]. Testing can be performed at different levels which are: unit testing, integration testing, system testing, user- acceptance testing, regression testing, performance testing, and beta testing [3]. Performance testing is performed to test the application’s stability, scalability as well as its response time [4]. The different techniques of performance testing are: load testing, stress testing, spike testing, endurance testing, and volume testing [5]. HP loadrunner is an automated load testing tool. It is used to conduct load testing before, during, and after application deployment. Most of the companies or clients use loadrunner tool as a benchmark for cross verifying their performance of the developed software [6]. LITERATURE SURVEY A search of the published literature has following papers such as Eljona Proko and Ilia Ninka, give a discourse that performance test tools help developer to discover bottleneck in performance of the system and also to choose a suitable platform for outlining the web application [7]. Sharmila and Ramadevi proposed performance testing ideas, objectives, techniques and accessible tools for testing web applications performance. They had analyzed that performance testing was used to decide the responsiveness, throughput, quality assurance, and versatility of a system under a given workload [8]. Vokolos and Weyuker, gave a discourse on ways to deal with software performance testing. A contextual analysis depicts the experience that methodologies utilized for testing the performance of a framework utilized as a passage as a part of a substantial modern customer/server exchange handling application [9]. Korel and Al-Yami discussed automating the regression testing process which included testing the changed program with experiments keeping in mind the end goal to build up the certainty that the system will perform as indicated by the adjusted determination [10]. 
Yang and Pollock presented testing tools for structural load testing which took a program code as input, and naturally figured out if that program should be load tested, and assuming this was the case, consequently produced test information for basic load testing of the system [11]. Krishnamurthy portrayed poor execution could antagonistically affect the
  • 2. HP LoadRunner Tool Aggarwal and Solanki JoWET (2016) 1-14 © STM Journals 2016. All Rights Reserved Page 2 benefit of ventures that depended on the web applications. Accordingly, intensive performance testing strategies are vital for comprehension whether an online framework will meet its execution targets when conveyed in this present reality [12]. Sharma and Angmo, have talked about different web automation testing tools which will push us to comprehend the automation testing and additionally the apparatuses accessible for automation testing [13]. Ahlawat and Tyagi, discuss three well-known load testing tools i.e. WAPT, LOADUI and LOADSTER and their analysis has been made as far as normal response time and ideal response rate. After effects of the analysis help in adoption and utilization of these tools [14]. SYSTEM ARCHITECTURE As can be seen in Figure 1, the architecture of the proposed work is a three-tier architecture with a graphical user interface (GUI), a central processing unit and a database [15]. Graphical User Interface (GUI) The user is provided with a GUI to provide the necessary inputs to the system. The user is required to enter the hard-coded values for the different features of the application. Database The database used in this system is a dynamic database. The database is designed at runtime based on the values entered by the user. The database is displayed on the GUI as soon as the user enters the necessary values [16]. Requirement Module In this module, user or client needs to initiate with defining the requirements which include: performance test objective, performance requirements, work load profile and performance goals. This module is also known as SLA’s (Service Level Agreement). Scenario Identification In this module, user needs to identify the scenarios on the basis of requirements defined in the SLA’s i.e. for which requirement which scenario would be the best to test the functionality or features of the system [17]. Script Customization This module is responsible for generating the load test scripts first and then enhance it with the performance scripting methods such as parameterization, correlation, transactions etc. Scenario Execution In this module, load scenarios would be executed which was identified on the basis of requirements in the requirement module. Result Analysis This module is responsible for generating the graph and reports from the dump that consists of raw data generated while executing the scenarios in the scenario execution module. PROCESS FLOW The process flow of the proposed system has been depicted in Figure 2. The system consists of various different modules that interact with each other to complete the working of the system [18]. Fig. 1: System Architecture of Proposed Work.
  • 3. Journal of Web Engineering & Technology Volume 3, Issue 2 ISSN: 2455-1880(online) JoWET (2016) 1-14 © STM Journals 2016. All Rights Reserved Page 3 Graphical User Interface (GUI) The user is provided with a GUI to provide the necessary inputs to the system. The user is required to enter the hard-coded values for the different features of the application. Database The database used in this system is a dynamic database. The database is designed at runtime based on the values entered by the user. The database is displayed on the GUI as soon as the user enters the necessary values [19]. Requirement Module In this module, user or client needs to initiate with defining the requirements which include: performance test objective, performance requirements, work load profile and performance goals. This module is also known as SLA’s (Service Level Agreement). Scenario Identification In this module, user needs to identify the scenarios on the basis of requirements defined in the SLA’s i.e. for which requirement which scenario would be the best to test the functionality or features of the system. Script Customization This module is responsible for generating the load test scripts first and then enhance it with the performance scripting methods such as parameterization, correlation, transactions etc. Scenario Execution In this module, load scenarios would be executed which was identified on the basis of requirements in the requirement module. Result Analysis This module is responsible for generating the graph and reports from the dump that consists of raw data generated while executing the scenarios in the scenario execution module. PROCESS FLOW The process flow of the proposed system has been depicted in Figure 2. The system consists of various different modules that interact with each other to complete the working of the system. Step-1 This is the first module which is initiated by defining the requirements. Requirements can be defined in two ways: either functional or non-functional. Functional requirements are those which indicate something the framework ought to do while the non-functional requirements depict how the framework functions [20]. Step-2 The scenario identification module is responsible for checking the functionality of the application, which needs to be tested. Fig. 2: Process Flow Diagram of Proposed Work.
  • 4. HP LoadRunner Tool Aggarwal and Solanki JoWET (2016) 1-14 © STM Journals 2016. All Rights Reserved Page 4 Step-3 The script customization module is a sequential flow module. Firstly, it initiates with the creation of new script in VuGen and then record the web tours application according to the requirements. After recording, a coded script called Vugen script is generated which needs to be enhanced with the performance scripting methods like- parameterization, correlation, transactions, and rendezvous points. After all the enhancements, the script is replayed and verified to check whether it is working right or not. Step-4 The database used in this system is a dynamic database. Database driven testing requires the data files from which test data and/or output data could be read instead of inserting the same hardcoded values. Step-5 The execution module takes place in the controller component of the loadrunner. In this module, all the load scenarios are executed. Step-6 The result analysis module takes the raw data from the controller and generates the graphs and reports. Step-7 The final result of all the load scenarios detected by the proposed algorithm is displayed on the screen. METHODOLOGY OF PROPOSED WORK The present parameterization scripting method in loadrunner involves the creation of parameter list. HP loadrunner has friendly user interface, provides rich code editor and has many other functionalities which help to automate any web application. Loadrunner provides the feature to create the parameter list through data driven testing which requires any type of data file such as excel file which could insert the test data and/or output values instead of keep using the similar hardcoded values when the test runs each time. This excel file is created by the user manually which requires large amount of time. And it is very important to perform the testing under the given duration so as to deploy the product at the given time. Therefore, there is a need for more efficient algorithm to parameterize the script during automation testing. The proposed algorithm generates the arbitrary string random functions which can be effectively called when required in building up the load test script. The significance of providing this extension is very much needed and which will be cleared through the following example: for checking the payment functionality in the web tours application: first name, last name and address is required and assume if the load test is to check the performance of Myntra.com for 5000 clients, then the tester needs to create the data file for first name, last name and address for 5000 clients which requires a lot of time. This is only one such condition which is discussed, there can be numbers of such conditions. The proposed system has been implemented in a step by step manner to clearly distinguish between the different parts of the system. The proposed technique has been implemented in C. Proposed scheme reduces the human efforts in parameterizing the values and saves user time. Extension provides the generation of random string functions for: First name (as sFirst), last name (as sLast), address (as sAddress). The results indicate that the proposed system is able to diminish the data management time taken by the customer or the client while scripting. 
PSEUDOCODE OF PROPOSED WORK
The pseudocode of the proposed work is as follows:

Algorithm 1: /* Generating the Flight Booking Load Test Script */
Step 1: Start VuGen and create a new script;
Step 2: Start recording the application;
Step 3: After recording the complete application, develop the load test script;
Step 4: Run the script to verify that it works as intended;
Step 5: Enhance the script with performance scripting methods such as parameterization, correlation, and transactions;
Step 6: Define the transactions by inserting one line of code before the request is sent to the server and closing the transaction when the request ends;
Step 7: Parameterize the hard-coded values either by creating a parameter list or by generating a random string function;
Step 8: Correlate the dynamic values in the script either by automatic correlation or by manual correlation;
Step 9: Insert comments in the script, while recording or after recording;
Step 10: Replay and verify the load test script;
Step 11: Save the script;
Step 12: Use the Controller to execute the various load scenarios;
Step 13: Analyze the results and generate graphs as well as reports.

Algorithm 2: /* Generating Random String Functions */
Step 1: Name the function by which it will be called when needed;
Step 2: Declare the required variables and array with their data types, sizes, and values;
Step 3: Copy an empty (null) string into the Temp_string variable using the strcpy function;
Step 4: Use a for loop from i = 0 to 10;
Step 5: Generate random numbers with the rand() function and store their remainder values in Ret_value;
Step 6: Concatenate the strings;
Step 7: Save the concatenated string through the lr_save_string function;
Step 8: Terminate the function.

IMPLEMENTATION AND WORKING OF PROPOSED SYSTEM
This section discusses the internal working of the system: first the implementation details, and then the working of the system in detail.

Implementation
The system has been implemented in the C language. Both algorithms, i.e. the existing parameterization method and the proposed algorithm for generating random string functions, have been implemented in C. The tool used for performance testing is HP LoadRunner, an automated testing tool.

Working
This section explains the functioning of the designed system with the help of actual snapshots of the running system. HP LoadRunner (LR) is used to conduct load testing before, during, and after application deployment. The different components of LoadRunner are as follows: LR Virtual User Generator (VuGen), which records virtual user (Vuser) scripts; LR Controller, which creates, maintains, and executes scenarios; and LR Analysis, which provides graphs and reports. In this work, the Web Tours application that comes packaged with LoadRunner is utilized. HP Web Tours requires the Web Tours Apache server to be running before it can work.

Defining the Requirements
For the Web Tours application, Table 1 lists the performance requirements. It records the test case ID, the test scenario description, the expected result, the actual result, and the pass/fail status (if the actual result matches the expected result, the load scenario is considered PASS, otherwise FAIL).

Creating a New Script in VuGen
The window shown in Figure 3 is used to choose the protocol for creating a new script. As soon as the Create button is clicked, VuGen opens the IDE (Integrated Development Environment), i.e. the code editor, in which the script files are empty except for the basic skeleton of the Action function. HP Web Tours requires the Web-HTTP/HTML protocol.

Table 1: Requirement Analysis.
Unique Test Case Id | Test Scenario Description | Expected Result | Actual Result | Pass/Fail
T1 | Run a load test with 5 Vusers for 10 min | Response time must be within 3 sec | - | -
T2 | Run a load test with 25 Vusers for 10 min | Response time must be within 3 sec | - | -
T3 | Run a load test with 50 Vusers for 10 min | Response time must be within 3 sec | - | -
Fig. 3: Creating a New Script.
Fig. 4: Recording the Application.

Recording the Application
Once VuGen has opened the code editor, recording can be initiated by clicking the Recording button on the toolbar. As soon as the user hits the Recording button, a Start Recording window pops up, as shown in Figure 4. Once the user clicks the Start Recording button, VuGen opens the HP Web Tours application and a floating recording bar appears, which offers numerous controls over the recording. The user can record the complete script by performing different functionalities in the application, such as login, flight booking, itinerary, and sign-off.

Generating the Code Script
After simulating all the functionalities of the application, recording must be stopped by clicking the Stop button on the floating bar in order to generate the coded script, as shown in Figure 5. The process flow of the script can be easily understood by looking at the left side of the script, called the step navigator.

Enhance the Script
When a script is recorded, it covers a single, straight flow through the subject application. Hence, the script can be enhanced on the basis of the defined requirements through various techniques, which are:

Using Transactions: Transactions measure the time taken by the system for a specific request. Applying transactions is straightforward, as shown in Figure 6, with a start transaction and an end transaction marker (a minimal sketch follows below).
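As an illustration only, and not the recorded script itself, the following sketch shows how a start/end transaction pair can wrap a request and, optionally, how the measured duration can be checked against the 3-second target of Table 1. The transaction name, URL, and threshold handling are assumptions; lr_start_transaction, lr_end_transaction, lr_get_transaction_duration, and web_url are standard LoadRunner/VuGen calls.

    Action()
    {
        double elapsed;

        lr_start_transaction("home_page");              /* open the timer */

        /* Recorded request; the URL is illustrative. */
        web_url("home",
                "URL=http://localhost:1080/WebTours/",
                "Resource=0",
                "RecContentType=text/html",
                "Referer=",
                "Snapshot=t1.inf",
                "Mode=HTML",
                LAST);

        /* Optional explicit check against the 3-second target from Table 1;
           passing LR_AUTO instead lets LoadRunner decide pass/fail itself. */
        elapsed = lr_get_transaction_duration("home_page");
        if (elapsed > 3.0)
            lr_end_transaction("home_page", LR_FAIL);
        else
            lr_end_transaction("home_page", LR_PASS);

        return 0;
    }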
Fig. 5: Code Generation.
Fig. 6: Enhancing the Script by Using Transactions.
Fig. 7: Parameterized List through Excel File.

Parameterization
A parameter in VuGen is a container that holds a recorded value and substitutes a different value for each user. It also helps to reduce the script size. Parameterization can be done through various techniques; this research lays out two of them:

• By Creating a Parameter List: LoadRunner can create a parameter list through data-driven testing, which requires a data file (such as an Excel file) from which the test data and/or output values are read instead of reusing the same hard-coded values every time the test runs, as shown in Figure 7.

• By Generating Random String Functions: This is the extension of the LoadRunner tool on which this research paper is based. The extension generates random string functions for first name (as sFirst), last name (as sLast), and address (as sAddress). The functions are created in a file called method.h, which can be seen on the left side of the window. This parameterization technique follows the algorithm for generating the functions given in Algorithm 2 (a sketch is given below).
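The following is a minimal sketch of what method.h could contain. The lowercase character pool, the string lengths, and the shared helper generate_random_string are assumptions; only the parameter names sFirst, sLast, and sAddress come from the paper. strcpy, strcat, rand, and lr_save_string are all available to a VuGen C script, so the file is meant to be included from the Vuser script rather than compiled standalone.

    /* method.h - illustrative random-string parameterization helpers
       following Algorithm 2; not the authors' exact implementation. */

    void generate_random_string(char *param_name, int length)
    {
        char charset[] = "abcdefghijklmnopqrstuvwxyz";
        char temp_string[64];
        char piece[2];
        int  i, ret_value;

        strcpy(temp_string, "");                 /* Step 3: start from an empty string */

        for (i = 0; i < length && i < 63; i++) { /* Step 4: loop over the length       */
            ret_value = rand() % 26;             /* Step 5: remainder picks a letter   */
            piece[0] = charset[ret_value];
            piece[1] = '\0';
            strcat(temp_string, piece);          /* Step 6: concatenate                */
        }

        lr_save_string(temp_string, param_name); /* Step 7: expose as {param_name}     */
    }

    /* Wrappers matching the parameter names used in the flight booking script. */
    void sFirst()   { generate_random_string("sFirst",   8);  }
    void sLast()    { generate_random_string("sLast",    10); }
    void sAddress() { generate_random_string("sAddress", 20); }

Each call overwrites the corresponding parameter, so invoking the helpers at the start of every iteration yields fresh values for every Vuser without any data file.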
Fig. 8: Parameterization through Generating String Random Functions.

Correlation
Fig. 9: Correlating the Dynamic Value.
Correlation is a technique for defining the relationship between two variables or entities. It is done to handle dynamic values, i.e. values that change every time the same steps are repeated, for example session IDs, cookies, and dates. This research concentrates on the web_reg_save_param function, which is used to correlate the Web Tours application session ID, because that value is different every time the script is recorded, as shown in Figure 9 (a sketch is given after this subsection).

Using Comments
Comments make the code understandable for anybody who refers to it later. They give information about a particular operation and separate two sections for clarity, as shown in Figure 10.
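Returning to the correlation step, the following sketch shows how web_reg_save_param can capture the Web Tours session ID before the request whose response returns it. The parameter name, left/right boundaries, and URL are assumptions and must be taken from the actual recorded response; web_reg_save_param and web_url are standard VuGen calls.

    /* Register the capture BEFORE the request whose response contains the
       dynamic value; the boundaries below are illustrative. */
    web_reg_save_param("usersession",
                       "LB=name=\"userSession\" value=\"",
                       "RB=\"",
                       "ORD=1",
                       "Search=Body",
                       LAST);

    web_url("nav",
            "URL=http://localhost:1080/WebTours/nav.pl?in=home",
            "Resource=0",
            "RecContentType=text/html",
            "Referer=",
            "Snapshot=t2.inf",
            "Mode=HTML",
            LAST);

    /* Later requests then reference {usersession} in place of the
       hard-coded session ID that was captured at record time. */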
Fig. 10: Inserting Comments into the Script.
Fig. 11: Replaying and Verifying the Script.

Replay and Verify the Script
Once all the enhancements have been made to the script, click the Replay button on the toolbar, as shown in Figure 11. The purpose of this run is to guarantee that the script works correctly.

Use Controller for Load Scenarios
The Controller is the component that manages the overall load test. It provides two types of scenarios: manual scenarios and goal-oriented scenarios; this research focuses on manual scenarios. Once the Play button is clicked, the Controller switches from the Design tab to the Run tab, as shown in Figure 12. Figure 12 shows the flight reservation load test with 5 Vusers for 10 min; similarly, the load tests with 25 Vusers for 10 min and 50 Vusers for 10 min are executed. The top right of the window contains the scenario status block, which reports passed transactions, failed transactions, running Vusers, elapsed time, errors, etc.
Fig. 12: Flight Reservation Load Test with 5 Vusers for 10 min.
Fig. 13: Analysis Report of 5 Vusers.

Analyze the Results
HP Analysis is the component that performs a detailed investigation of a completed performance test. Analysis takes the raw information produced by the Controller during execution of the performance test. The dump contains all the data in raw form, which HP Analysis parses to produce various charts and reports. Once the operation completes, the main session window appears, as shown in Figure 13. Figure 13 shows the analysis report for the load test of 5 Vusers for 10 min; similarly, the analysis reports for the load tests with 25 Vusers and 50 Vusers are generated.

RESULTS
The performance results and graphs of the load test scripts are presented in this section. First, the tools are analysed on different parameters such as
cost, installation, and stability. Then, the test reports of the performed workload tests for minimal (5 Vusers), normal (25 Vusers), and peak (50 Vusers) loads are shown. Afterwards, the proposed work is compared against the systems already available in the market in terms of parameterization and customization.

Comparison of HP LoadRunner and JMeter
After analyzing and working with both tools, the comparison shown in Table 2 has been established. Table 2 gives a descriptive comparison between HP LoadRunner and JMeter. On the basis of Table 2, each parameter is assigned a rank from 0 to 7; for example, LoadRunner has a higher execution speed than JMeter, so LoadRunner gets the higher rank (5) and JMeter gets the lower rank (2), and in a similar way values are assigned to all the parameters. Graph 1 shows a graphical comparison between LoadRunner and JMeter. The factors are discussed, compared, and assigned values from 0 to 7, such as procurement cost (5 for LoadRunner and 2 for JMeter), installation (4 for LoadRunner and 1 for JMeter), and so on, as shown in Graph 1. Hence, the overall parameter values for the two tools are:

For LoadRunner, the average parameter value is (5 + 4 + 3 + 5 + 5 + 6) / 6 = 28 / 6 ≈ 4.67.
For JMeter, the average parameter value is (2 + 1 + 5 + 3 + 2 + 3) / 6 = 16 / 6 ≈ 2.67.

Based on this research, HP LoadRunner is preferred as it is considerably more stable than JMeter.

Table 2: Comparison between LoadRunner and JMeter.
Parameters | HP LoadRunner | JMeter | Comment
Procurement cost | 5 | 2 | LoadRunner is a licensed tool with a high support cost, while JMeter is a free open-source tool.
Installation | 4 | 1 | LoadRunner has high disk utilisation compared to JMeter.
Scripts | 3 | 5 | LoadRunner hides scripts in levels to improve readability, while JMeter shows the script depicting the HTTP flow of the scenario.
Result | 5 | 3 | LoadRunner offers excellent analysis and dynamic graph generation, while JMeter has limited graph generation.
Execution speed | 5 | 2 | LoadRunner has a higher execution speed than JMeter.
Analysis and monitoring | 6 | 3 | LoadRunner has a built-in Analysis component, while JMeter is not as strong in this respect.

Graph 1: Graphical Comparison between LoadRunner and JMeter.
Table 3: Average Response Time (sec) vs. Vusers (HP LoadRunner).
No. of Vusers | Login | Flight Booking | Select Flight | Fill Details | Signoff | - | Expected Result (sec) | Pass/Fail
5 | 0.382 | 0.528 | 0.341 | 0.317 | 0.331 | 0.309 | 3 | Pass
25 | 0.48 | 0.648 | 0.398 | 0.423 | 0.466 | 0.362 | 3 | Pass
50 | 0.652 | 0.852 | 0.565 | 0.523 | 0.6 | 0.541 | 3 | Pass

Graph 2: Average Response Time vs. Vusers.

Test Reports of the Load Test Script
In this section, the summary report of the performed workload tests, i.e. minimal (5 Vusers), normal (25 Vusers), and peak (50 Vusers) loads, is formulated as shown in Table 3. From this summary report it can easily be observed that as the number of Vusers increases, the average response time also increases. For example, the average response time for login is 0.382 sec for 5 Vusers, 0.48 sec for 25 Vusers, and 0.652 sec for 50 Vusers. The response time must be within 3 sec for a transaction to pass, i.e. for the actual result to match the expected result. Graph 2 shows graphically how the average response time of the transactions increases when the number of Vusers is increased. The maximum average response time for the flight reservation load test script is 0.852 sec, for 50 Vusers, and the minimum reported is 0.382 sec, for 5 Vusers. Hence, it is clear from the graph that as the load increases, the average response time also increases.

Comparison of Proposed System with the Existing Systems
In this section, the proposed system is compared with the tools available in the market. The factors compared are the time spent managing the data before and after the enhancement, the method of parameterization and customization, and the size of the data file; the comparison covers the proposed system and the existing systems, namely HP LoadRunner and JMeter, as shown in Table 4. Table 4 gives a descriptive comparison of the proposed system with the tools available in the market, i.e. the existing LoadRunner and JMeter, on the following parameters: time, parameterization and customization, and size of file.

Table 4: Comparison of Proposed System with the Existing Systems.
Parameters | Existing HP LoadRunner | Proposed System | JMeter
Time | 3–4 h for 50 Vusers | 30 min | 5–6 h
Parameterization and customization | By creating the parameter list for 50 Vusers | By generating random string functions | A little complex to parameterize for each Vuser
Size of file | Approx. 1 GB | Reduced to 10 kB | High disk utilization
Graph 3: Comparison of Proposed System with the Existing Systems.

Graph 3 shows a graphical representation of the parameters compared between the proposed system and the existing systems. As can be seen from the graph, the proposed algorithm for generating random string functions reduces the time taken by the client in managing the data while scripting the load test, makes the method of parameterizing and customizing the script easier than before, and reduces the size of the file, which benefits the execution of the load tests in the Controller. In real-life scenarios, the workload would not be just 50 Vusers. For example, assume the load test is meant to check the performance of Myntra.com for 5000 users. In a real situation, these 5000 clients would not all be at the login page but spread across various pages of the site. Hence, in order to parameterize all the hard-coded values for 5000 Vusers, the proposed algorithm can be applied rather than creating any parameter list.

CONCLUSION
Performance testing is key to increasing an organization's profit and reducing risk in mission-critical software systems. Load testing tools are used for monitoring the performance of a system. The analysis shows that HP LoadRunner is the best of the compared tools for load testing due to its robustness and stability. According to HP, the tool provides all the performance scripting methods, such as parameterization, correlation, and transactions. It also helps to reduce human effort and saves user time. However, it does not address the time the user spends on data management while parameterizing the script. In response to this problem, this research paper aimed at generating random string functions that can easily be called when needed while developing the load test script. Hence, it can be said that the proposed parameterization technique is better than the existing techniques.

REFERENCES
1. Padhy N, Mishra P, Panigrahi R. Software Performance Testing Scheme Using Virtualization Technology. IJCSEIT. 2012; 2(3): 43–58p.
2. Smita S, Sharma P. Use of Automation Tools in Various Field: A Survey Paper. IOSRJCE. 2014; 16(3): 18–21p.
3. Jain N, Srivastava V. Performance Testing Techniques: A Survey Paper. International Journal of Research in Engineering and Technology (IJRET). 2013; 2(11): 116–119p.
4. Rajagopal R. Evaluating Load Generation in Virtualized Environments for Software Performance Testing. Int J Database Manag Syst. 2011; 3(4).
5. Kaur N, Mann AK. Survey Paper on HP Load Runner Techniques. IJSETR. 2013; 2(4).
6. Mansur M, Noor M, Sap M. Performance Testing and its Techniques: A Research Perspective. Proceedings of the Postgraduate Annual Research Seminar. 2011.
7. Singh K, Upadhyaya S. Designing Approach Analysis on Small-Scale Software Performance Testing Tools. International Journal of Computer Science. 2012; 9(3).
8. Vijayarani S, Nithya S. An Efficient Scripting Algorithm for Performance Testing. International Journal of Computer Applications (IJCA). 2011; 32(7).
9. Chamatkar A, Butey PK. Implementation of Different Performance Testing Algorithms with Server Networks. ICCUBEA. 2005; 374–378p.
10. Lata S, Ramaraj N. Algorithm for Efficient Software Testing. Conference on Computational Intelligence and Multimedia Applications. 2007; 2: 66–70p.
11. Yu H, Huang X, Hu X, et al. An Environment for Load Creation and Supply for Testing Switching Software. ICMeCG. 2010; 35–38p.
12. Yao J, Zhong R. Comparative Study of Testing Tools: Apache Jmeter and Load Runner. 2008; 1: 212–215p.
13. Florez G, Bridges S, Vaughn R. An Improved Algorithm for Load Testing on Various Performance Test Tools. NAFPIS. 2002; 457–462p.
14. Wang A, Yang Y, Yang Z. Research on Testing Algorithms Based on Parameterization Methods. WiCOM. 2008; 1–4p.
15. Yue J, Mao S, Li M, et al. A Comparative Study of QTP and Load Runner Automated Testing Tools and Their Contributions to Software Project Scenario. 22nd International Conference on Geoinformatics. 2014.
16. Zeidat N, Zhao Z. A Comparative Study on Load Testing Tools. ICTAI. 2011.
17. Ray S, Pakhira M. Software Testing Framework Process. ICCT. 2011.
18. Dong X, Zhang Z. Research and Implementation of Correlation Algorithm with Time Constraints. ICCSS. 2014.
19. Thirumurgan S, Suresh L. Research of Load Testing and Result Application Based on Load Runner. Wireless, Mobile and Multimedia Networks, IET International Conference. 2008.
20. Pakhira M. Analyzing and Testing Web Application Performance. ICIT International Conference. 2008.

Cite this Article: Jatin Aggarwal, Arun Solanki. Performance Testing on Web Application through HP LoadRunner with Parameterization and Customization. Journal of Web Engineering & Technology. 2016; 3(2):