Performance Test Using Visual Studio 2010 
Presented By: Mohamed Tarek
Agenda – Day 1 
 Why Functional Testing is not enough? 
 Why Performance Test? 
 Difference between Performance, Load and Stress Testing 
 Performance testing process 
 Different Methods for Applying Load 
 How to Measure the System Performance? 
 Points to be considered before & during testing 
 Risks Addressed Through Performance Testing 
 Why VSTS (From Testing Perspective)? 
 Types of Tests 
 Web test 
 Load test 
 Ordered test
Agenda – Day 1 
Web Testing 
 How it works? 
 Creating a web test 
 Recording the web test 
 Editing the web test 
 Configuring the web test 
 Running the web test
Why Functional Testing is not enough
Why Performance Test (1)
Why Performance Test (2)
Difference between Performance, Load and 
Stress Testing 
 Performance testing - is about response times, time lapses, duration, etc. 
 Load testing - is about testing behavior under normal/peak workload conditions. Load testing is more 
about characterizing / simulating your actual workload. 
 Stress testing - is about surfacing issues under extreme conditions and resource failures.
Testing objective (Perf, Load, Stress) 
Common Action: Apply Load and Measure Performance – But With Different Objectives! 

Performance Testing: Measuring the system performance under constant load. 
Performance metrics: 
 Speed 
 Accuracy 
 Stability 

Load Testing: Measuring the system performance under incremental load to study the relation 
between the applied load and the system performance. 

Stress Testing: Applying incremental load on the system until it reaches a break point at which 
the system crashes, in order to avoid reaching this point in the future or to enhance the 
system performance.
Performance testing process 
 Identify Test Environment 
 Identify Performance Acceptance Criteria 
 Plan and Design Tests 
 Configure Test Environment 
 Implement Test Design (Configuration & Test Data) 
 Execute Tests 
 Analyze, Report and Retest
Different Methods for Applying Load 
 Constant number of users accessing the system concurrently 
 Variable number of users accessing the system concurrently 
 Constant size of data uploaded to the system by some users 
(Ex: Different types of files) 
 Variable size of data uploaded to the system by some users 
 Constant size of data downloaded from the system by some users 
 Variable size of data downloaded from the system by some users 
 Combination of all of the above
Required Settings for the applied load 
 If the load is Virtual users (VU): 
 Number of users 
 Start time 
 End time 
 Ramp-UP period (Time taken for generating all VUs) 
 If the load is data file: 
 File name 
 File path
How to Measure the System Performance 
Metrics used for measuring the system performance: System Speed, System Accuracy and System Stability. 

Speed – always measured by one of the following: 
 Avg. Page Load time 
 Avg. Transaction time 
 Avg. Response time 

Accuracy – can be measured using the average % of error for all requests: 
 %Error < 2%: High Accuracy 
 2% < %Error < 4%: Acceptable 
 %Error > 4%: Poor 

Stability – can be measured using the Standard Deviation (SD) and the Coefficient of Variation (CV): 
 CV = (SD / Avg. Response time) * 100% 
 CV < 5%: High Stability 
 5% < CV < 10%: Acceptable 
 CV > 10%: Poor
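To make the formulas concrete, here is a minimal sketch in plain C# (not tied to any VSTS API; the sample numbers are hypothetical) that computes the Coefficient of Variation and the error percentage from raw results, which can then be compared against the thresholds above.

```csharp
using System;
using System.Linq;

class PerformanceMetricsDemo
{
    static void Main()
    {
        // Hypothetical response times (seconds) collected for a request during a test run.
        double[] responseTimes = { 1.2, 1.4, 1.1, 1.3, 1.5, 1.2 };
        int totalRequests = 200;
        int failedRequests = 3;

        double avg = responseTimes.Average();
        // Population standard deviation of the response times.
        double sd = Math.Sqrt(responseTimes.Average(t => (t - avg) * (t - avg)));

        // Stability: CV = (SD / Avg. response time) * 100%  -> < 5% is high stability.
        double cv = sd / avg * 100.0;

        // Accuracy: % Error = failed requests / total requests * 100%  -> < 2% is high accuracy.
        double errorPercent = 100.0 * failedRequests / totalRequests;

        Console.WriteLine("Avg = {0:F2} s, SD = {1:F2} s, CV = {2:F1} %", avg, sd, cv);
        Console.WriteLine("% Error = {0:F1} %", errorPercent);
    }
}
```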
Example for Performance Testing (1) 
 Test objective: 
 We need to know the system performance during the rush hour 
 Test inputs and Prerequisites: 
1) Duration of the rush hour 
2) Number of users accessing the system concurrently at the rush hour 
3) Ramp-UP Period 
4) The commonly used scenarios 
5) The users distribution among the scenarios (i.e. % of users executing each scenario) 
6) Any input data such as user credentials, serial numbers, etc.
Example for Performance Testing (2) 
Test Results and Reports: 
1) Indication about the System Speed (Avg. Page time, Avg. Response time or Avg. Transaction time) 
2) Indication about the System Accuracy (Avg. % of Error) 
3) The Errors that occurred and the URLs of the failed requests 
4) Indication about the System Stability (Avg. Standard Deviation and Coefficient of Variation) 
5) Tabular Report for all the results logged during the test run (Optional) 
6) Graphical report for System Speed, Accuracy and Stability Vs Time (Optional)
Example for Load Testing (1) 
Test objective: 
We need to study the effect of increasing load on the system performance 
Test inputs and Prerequisites: 
1) Initial load to be applied on the system 
2) Maximum load to be applied on the system (Optional) 
3) Type of the load to be applied 
4) Step of the incremental load 
5) The commonly used scenarios 
6) The users distribution among the scenarios (i.e. % of users executing each scenario) 
7) Any input data such as user credentials, serial numbers, etc.
Example for Load Testing (2) 
Test Results and Reports: 
1) Tabular report indicating the effect of increasing load on the system speed 
2) Graphical chart indicating the effect of increasing load on the system speed 
3) Tabular report indicating the effect of increasing load on the system Accuracy 
4) Graphical chart indicating the effect of increasing load on the system Accuracy 
5) Tabular report indicating the effect of increasing load on the system Stability 
6) Graphical chart indicating the effect of increasing load on the system Stability 
7) List of the switching points at which the system performance was greatly impacted
Example For Stress Testing 
Test objective: 
We need to know the maximum load that we could apply to the system 
Test inputs and Prerequisites: 
1) Initial load to be applied on the system 
2) Defining the test exit criteria (System crash point) 
3) Type of the load to be applied 
4) Step of the incremental load 
5) The commonly used scenarios 
6) The users distribution among the scenarios (i.e. % of users executing each scenario) 
7) Any input data such as user credentials, serial numbers, etc. 
Test Results: The load at which the system crashes
Points to be considered before & during 
testing 
 The testing environment should simulate the production environment 
 In case the production environment cannot be simulated, the test results should be scaled 
accordingly 
 The network factor should be isolated, usually by preparing a closed test lab for the testing 
machine and the SUT 
 The testing tool should be compatible with the SUT 
 The firewall and antivirus should be disabled on both the testing machine and the SUT during 
recording and execution 
 Pop-ups and warnings should be disabled during recording and execution 
 While recording, suitable think time should be allowed between the different actions in the 
scenario
Risks Addressed Through Performance 
Testing - Speed 
 Speed-related risks are not confined to end-user satisfaction, although that is what most 
people think of first. Speed is also a factor in certain business- and data-related risks. Some of 
the most common speed-related risks that performance testing can address include: 
 Is the application fast enough to satisfy end users? 
 Is the business able to process and utilize data collected by the application before that data 
becomes outdated? (For example, end-of-month reports are due within 24 hours of the close 
of business on the last day of the month, but it takes the application 48 hours to process the 
data.) 
 Is the application capable of presenting the most current information (e.g., stock quotes) to 
its users? 
 Is a Web Service responding within the maximum expected response time before an error is 
thrown?
Risks Addressed Through Performance 
Testing - Scalability 
 Scalability risks concern not only the number of users an application can support, but also the volume 
of data the application can contain and process, as well as the ability to identify when an application is 
approaching capacity. Common scalability risks that can be addressed via performance testing include: 
 Can the application provide consistent and acceptable response times for the entire user base? 
 Can the application store all of the data that will be collected over the life of the application? 
 Are there warning signs to indicate that the application is approaching peak capacity? 
 Will the application still be secure under heavy usage? 
 Will functionality be compromised under heavy usage? 
 Can the application withstand unanticipated peak loads?
Risks Addressed Through Performance 
Testing – Stability (1) 
 Stability is a blanket term that encompasses such areas as reliability, uptime, and recoverability. 
Although stability risks are commonly addressed with high-load, endurance, and stress tests, stability 
issues are sometimes detected during the most basic performance tests. Some common stability risks 
addressed by means of performance testing include: 
 Can the application run for long periods of time without data corruption, slowdown, or servers needing 
to be rebooted? 
 If the application does go down unexpectedly, what happens to partially completed transactions? 
 When the application comes back online after scheduled or unscheduled downtime, will users still be 
able to see/do everything they expect?
Risks Addressed Through Performance 
Testing – Stability (2) 
 When the application comes back online after unscheduled downtime, does it resume at the correct 
point? In particular, does it not attempt to resume cancelled transactions? 
 Can combinations of errors or repeated functional errors cause a system crash? 
 Are there any transactions that cause system-wide side effects? 
 Can one leg of the load-balanced environment be taken down and still provide uninterrupted service to 
users? 
 Can the system be patched or updated without taking it down?
Why VSTS (From Testing Perspective) 
VSTS provides testers with many useful features, such as the following: 
 Many types of tests to satisfy testers' needs 
 A simple GUI for recording and configuring tests 
 Powerful analysis and reporting of test results 
 High flexibility in test management and control 
 Test scripting is allowed for advanced editing and control
Types of Tests in VSTS 
 VSTS provides many types of tests, such as: 
 Web test – used for functional testing of web applications 
 Load test – used for performance, load and stress testing of web applications 
 Ordered test – used for managing and controlling the order of test execution 
 Manual test 
 Unit test 
 Generic test 
 This training focuses only on the most important of these types: the Web test, the Load test 
and the Ordered test.
Web Testing - How it Works? 
 Web tests are used for testing the functionality of web applications, 
web sites, web services or a combination of all of these. 
 Web tests can be created by recording the interactions that are performed in the 
browser, which are normally a series of HTTP requests (GET/POST). 
 These requests can be played back to test the web application by setting validation 
points on the responses to validate them against the expected results 
Step 1: Record the user interactions [Test Scenario] on the web browser 
Step 2: Play back the recorded scenario after setting validation points on the responses 
Step 3: Get the test result [Either Success or Failure] after validating against the expected results
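As noted earlier, test scripting is allowed for advanced editing, so the same record/play-back model can also be expressed as a coded web test: a C# class deriving from WebTest whose yielded requests are played back one by one. The following is only an illustrative sketch (the URLs are hypothetical; in practice a recorded test is usually the starting point), with validation points and extraction rules shown in later sections.

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Minimal coded web test: each yielded WebTestRequest is replayed as an HTTP
// request against the web application when the test runs.
public class HomePageWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // Hypothetical URLs; a recorded web test contains the real request sequence.
        yield return new WebTestRequest("http://www.contoso.com/");
        yield return new WebTestRequest("http://www.contoso.com/products");
    }
}
```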
Web Testing - Creating a Web test 
1) From the File menu, select New -> Project -> Test 
Project, then specify the project name and path 
2) In the Solution Explorer panel, select the test project, right-click, 
and choose Add New Test from the context menu, which opens 
the Add New Test window that contains the different test type 
templates. 
3) Select the Web Test template from the list of different test types 
and specify the test name 
4) Once you select the Web Test and click OK, you can see the test 
added to the selected test project, and a new instance of a 
web browser opens.
Web Testing - Recording the Web test 
 When the browser opens to record the user interactions, it will contain the Web Test Recorder in the left 
pane, which is used for recording the test scenarios. 
 The recorder has five different options, discussed as follows: 
 Record: To start recording the web page requests. 
 Pause: Used to pause the recording; in some cases we may need to skip the recording of some actions 
that are not included in the test scenario, then continue the recording normally 
 Stop: This is to stop the recording session. Once we click the Stop button, the browser will close and 
the session will stop. 
 Add a Comment: This option is used for adding a comment to the current request in the recording. 
 Clear all requests: Used in case we need to clear all the recorded requests
Web Testing - Editing the Web test 
 After completing the recording of all requests, the Web Test editor window opens and shows 
the recording details. 
 The HTTP requests sent to the server usually contain parameters sent with them. 
 There are two types of these parameters: 
Types of Parameters 
 Form post parameters: These parameters are sent along with the request if the method used 
for the request is POST. All field entries made by the user in the web page (e.g. user name 
and password) are sent to the server as form POST parameters. 
 Query string parameters: These parameters are sent along with the request if the method 
used for the request is GET. The web page is retrieved from the server depending on these 
parameters.
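As a rough illustration of how the two parameter types appear when a recording is converted to a coded web test (a sketch only; the URLs and field names below are hypothetical, not taken from the slides):

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class ParameterTypesWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // GET request: query string parameters travel in the URL
        // (the Web Test editor lists them under Query String Parameters).
        WebTestRequest search = new WebTestRequest("http://www.contoso.com/search?q=laptops");
        search.Method = "GET";
        yield return search;

        // POST request: the user's field entries are sent in the request body
        // as Form POST parameters.
        WebTestRequest login = new WebTestRequest("http://www.contoso.com/login");
        login.Method = "POST";
        FormPostHttpBody body = new FormPostHttpBody();
        body.FormPostParameters.Add("UserName", "testuser");   // hypothetical field names
        body.FormPostParameters.Add("Password", "P@ssw0rd");
        login.Body = body;
        yield return login;
    }
}
```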
Web Testing - Editing the Web test (Web test 
properties) 
 There are some properties of the Web Test that we can set, such as user 
credentials or a description for the test. 
 There are also some properties that we can set for each 
request individually, such as the timeout or think times. 
 The Web test properties are as follows: 
 Description: To specify the description for the current test. 
 Name: Name of the current Web Test. 
 User Name: To specify the username of the test user, 
if we are using any user credentials for this test. 
 Password: This is the password for the username specified in the 
User Name field. 
 Proxy: We use this field to set the proxy server name to be used by the 
test. 
 Stop On Error: This tells the application whether to stop 
the test or continue in case of any errors.
Web Testing - Editing the Web test (Requests 
properties) 
 Request properties are as follows: 
------------------------------------------------- 
 Cache Control: This property simulates the caching behavior of the 
web pages. The value can be True or False. 
 Expected Response URL: This is set to the response URL that we expect 
after sending the current request. It is validated against the actual 
response URL. 
 Method: This property sets the request method used for the 
current request. It can be either GET or POST. 
 Think Time (Seconds): The think time taken by the user 
between actions (requests). This is not the exact time that the user 
spends thinking but just an estimate. This property is not very useful 
for a normal single web test but very useful in a load test, as it 
affects the system performance.
Web Testing - Editing the Web test (Requests 
properties) 
 Timeout (Seconds): This is the expiry time for the request, i.e. the maximum time allowed for the request to respond. 
If it doesn't return within this time limit, the request times out with an error. The default is 300. 
 Response Time Goal: Used to set the desired response time for this request so you can check whether the 
actual response time meets the desired one or not. The default value is 0, which means the property is not set. 
 URL: This is the URL address for the request 
 Parse Dependent Requests: This property can be set to True or False to parse the dependent requests within the 
requested page. For example, we may not be interested in collecting the details for the images loaded in the web 
page, so we can turn off the requests for loading the images by setting this to False. Only the main request details 
will be collected. 
 Record Results: Used if we need to collect the performance data for the HTTP request.
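In a coded web test the same request properties map to properties on WebTestRequest. A minimal sketch, assuming a hypothetical URL; only a few of the properties listed above are shown:

```csharp
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

public class RequestPropertiesWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest request = new WebTestRequest("http://www.contoso.com/report");
        request.Method = "GET";
        request.ThinkTime = 5;                  // Think Time: 5 s pause before the next request
        request.Timeout = 120;                  // Timeout: fail if no response within 120 s
        request.ParseDependentRequests = false; // skip images/CSS; collect only the main request
        request.ExpectedResponseUrl = "http://www.contoso.com/report";
        yield return request;
    }
}
```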
Web Testing - Editing the Web test 
(Extraction rules) 
 Extraction rules are useful for extracting the data or information from the 
HTTP response. 
 Normally in web applications many web forms depend on other web 
forms. It means that the request is based on the data collected from the 
previous request's response. 
 The data from the response has to be extracted and then passed to the 
next request in the form of values for query string parameters. 
 VSTS provides several built-in types of extraction rules. These help us to 
extract values based on the HTML tags or the different types of fields 
available in the web form.
Web Testing - Editing the Web test 
(Extraction rules) 
 The different types of Extraction rules: 
------------------------------------------------------ 
 Extract Attribute Value: This is to extract an attribute value from the response 
page based on the tag and the attribute name. The extracted value will be 
stored in a context parameter. The attribute could belong to an image, a link, etc. 
 Extract Form Field: To extract the value from any of the Form fields in the 
response. The field name is identified here. 
 Extract Text: This is to extract the text from the response page. The text is 
identified based on its starting and ending value with text Casing as optional. 
 We can add as many rules as we want, but we should make sure that the 
Context Parameter Names are unique across the application.
Web Testing - Editing the Web test 
(Extraction rules) 
 How to add an extraction rule to a Web test: 
------------------------------------------------------ 
 1) Open a Web test. 
 2) Select the request to which you want to add the extraction rule. 
 3) Right-click the request and select Add Extraction Rule. The Add 
Extraction Rule dialog box is displayed. 
 4) In the Add Extraction Rule dialog box, select Extract Attribute Value. 
 5) In the Properties, set the Context Parameter Name property to a 
descriptive name such as First Link.
Web Testing - Editing the Web test 
(Extraction rules) 
Example: 
Consider that the HTML element we want to extract from is 
<a href="http://www.contoso.com"> 
where a is referred to as the tag and href is the attribute of interest. 
6) Set the Attribute Name property to href and the Tag Name 
property to a. 
7) After running the test, the extracted value 
http://www.contoso.com will be stored in the context parameter 
First Link 
8) You can use this extracted value in later requests in the test
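The same example can be sketched in a coded web test using the ExtractAttributeValue rule; this is illustrative only (the URL is the hypothetical contoso.com one from the example above, and it assumes the page's first link is the one of interest):

```csharp
using System;
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;

public class ExtractionRuleWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest home = new WebTestRequest("http://www.contoso.com/");

        // Extract Attribute Value rule: take the href attribute of an <a> tag
        // and store it in the context parameter "First Link".
        ExtractAttributeValue extractLink = new ExtractAttributeValue();
        extractLink.TagName = "a";
        extractLink.AttributeName = "href";
        extractLink.ContextParameterName = "First Link";
        extractLink.Required = true;
        home.ExtractValues += new EventHandler<ExtractionEventArgs>(extractLink.Extract);
        yield return home;

        // The extracted value can then drive a later request.
        yield return new WebTestRequest(this.Context["First Link"].ToString());
    }
}
```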
Web Testing - Editing the Web test 
(Validation rules) 
 Validation rules are mainly used to validate certain data or text 
in the response against our expectations. 
 How to add a validation rule to the web test: 
--------------------------------------------------------------- 
 1) Right click on the request 
 2) Select the Add Validation Rule option, which opens the 
Add Validation Rule dialog 
 3) Select the type of validation rule required and fill in the 
parameters required for the rule. 
 VSTS provides a set of predefined validation rules.
Web Testing - Editing the Web test 
(Validation rules) 
The different types of validation rules: 
------------------------------------------------------ 
Form Field: Used to verify the existence of a form field with a certain value. 
The parameters are Form Field Name and Expected value. 
Find Text: This is to verify the existence of a specified text in the response. 
The parameters used for this are: 
Find Text: The text to search for 
Ignore Case: To determine whether the search will be case sensitive or not. 
[True: Ignore case / False: Don’t Ignore case] 
Pass If Text Found: To determine the acceptance criterion 
[True: The test will pass if the text is found / False: The test will pass if the text is 
not found] 
Use Regular Expression: Used if you need to search using a regular expression (i.e. a 
special sequence of characters)
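In a coded web test, the Find Text rule corresponds to the ValidationRuleFindText class; attaching it to a request's ValidateResponse event sets the validation point. A minimal sketch with a hypothetical URL and search text:

```csharp
using System;
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;

public class FindTextValidationWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest request = new WebTestRequest("http://www.contoso.com/checkout");

        // Find Text rule: the request fails if "Order confirmed" is not in the response.
        ValidationRuleFindText findText = new ValidationRuleFindText();
        findText.FindText = "Order confirmed";
        findText.IgnoreCase = true;            // case-insensitive search
        findText.PassIfTextFound = true;       // pass when the text is present
        findText.UseRegularExpression = false; // treat FindText as plain text
        request.ValidateResponse += new EventHandler<ValidationEventArgs>(findText.Validate);

        yield return request;
    }
}
```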
Web Testing - Editing the Web test 
(Validation rules) 
 Maximum Request Time: This is to verify whether the request finishes 
within the specified Maximum request Time. 
 The parameter is Max Request Time (milliseconds) 
 Required Attribute Value: This is similar to the Extract Attribute Value rule used in the 
extraction rules. The parameters used here are the same as those used in 
extraction rules, with two additional fields: 
 Expected Value: The expected value of the attribute 
 Index: The index is used here to indicate which occurrence of the string to 
validate. If this index is set to -1, then it checks any occurrence in the 
response. 
 The option Level that appears above the parameters of all validation rules 
is used in load testing to determine which level of rules should be 
validated
Web Testing - Editing the Web test 
(Validation rules) 
 Required Tag: To verify if the specified tag exists in the response. The parameters are: 
 Required Tag Name: The name of the Tag to be validated 
 Minimum Occurrences: Set the minimum number of occurrences, if needed 
 Response URL: Verify that the response URL is the same as the recorded response URL
Web Testing - Configuring the Web test 
 How to set the properties of the web test 
----------------------------------------------------------- 
 1) Open the “Test” menu from the menu bar 
 2) Select “Edit Test Run Configurations” 
 3) Select “Local Test Run” from the sub menu 
 4) The test configuration window will open 
 5) Select the “Web Test” from the left panel 
 6) Set the settings of the web test 
 This section describes all of the settings required for web testing. 
 These settings apply only to web testing, but some of the 
properties will be overridden in the case of load testing.
Web Testing - Configuring the Web test 
 The different properties of the web test: 
--------------------------------------------------------- 
 Number of run iterations: This is to set the number of times the test 
has to run. 
 This property does not apply to load tests, as the number of iterations 
is determined in the load test properties, which override this property. 
 You can also set the web test to run once for each row in the available data 
source 
 Browser type: This property is to set the type of browser to use for 
the web test. 
 Network type : This is to simulate the network type on which you 
want to run the web test 
 Think time: This is to simulate the think time between the requests 
that will be sent in the web test
Web Testing - Running the Web test 
 Once we've made all the required settings and finished recording all 
the required requests we can start running the test. 
 Use the Run Test option in the Web test editor toolbar to start 
running the test. 
 You can notice the test execution progress in the web test window. 
 After the execution completes, the result window displays success 
or failure against each request. 
 If any one of the requests in the test fails, the test result window will 
show the final result of the test as “Failed”
Web Testing - Running the Web test 
 If there are multiple requests in the test, we can view the detailed 
results of each request 
 Web Browser: This tab displays the same web page used by the 
request 
 Request: This tab contains all the information about the request 
like Headers, Cookies, Query String Parameters and Form Post 
Parameters. 
 Response: This tab shows the response for the requested web 
page represented in HTML 
 Context: All the run-time details assigned to the test can be viewed here. 
 Details: It shows the status of the validation rules that were 
executed during the test.
Contact Us 
Website: http://www.geekit.me 
E-Mail : info@geekit.me

More Related Content

What's hot

Performance testing presentation
Performance testing presentationPerformance testing presentation
Performance testing presentation
Belatrix Software
 
Getting start with Performance Testing
Getting start with Performance Testing Getting start with Performance Testing
Getting start with Performance Testing
Yogesh Deshmukh
 
Performance Testing Overview
Performance Testing OverviewPerformance Testing Overview
Performance Testing Overview
James Venetsanakos
 
Software testing performance testing
Software testing  performance testingSoftware testing  performance testing
Software testing performance testing
GaneshKumarKanthiah
 
Performance testing with Jmeter
Performance testing with JmeterPerformance testing with Jmeter
Performance testing with Jmeter
Prashanth Kumar
 
Introduction to performance testing
Introduction to performance testingIntroduction to performance testing
Introduction to performance testing
Tharinda Liyanage
 
An Introduction to Performance Testing
An Introduction to Performance TestingAn Introduction to Performance Testing
An Introduction to Performance Testing
SWAAM Tech
 
What is Performance Testing?
What is Performance Testing?What is Performance Testing?
What is Performance Testing?
QA InfoTech
 
Performance and load testing
Performance and load testingPerformance and load testing
Performance and load testingsonukalpana
 
Performance Test Plan - Sample 2
Performance Test Plan - Sample 2Performance Test Plan - Sample 2
Performance Test Plan - Sample 2Atul Pant
 
Performance Testing
Performance TestingPerformance Testing
Performance Testing
Selin Gungor
 
Infographic: Importance of Performance Testing
Infographic: Importance of Performance TestingInfographic: Importance of Performance Testing
Infographic: Importance of Performance Testing
KiwiQA
 
Performance Testing
Performance TestingPerformance Testing
Performance Testing
sharmaparish
 
Performance testing
Performance testingPerformance testing
Performance testing
Jyoti Babbar
 
Performance testing
Performance testingPerformance testing
Performance testing
Ranpreet kaur
 
Introduction to Performance Testing Part 1
Introduction to Performance Testing Part 1Introduction to Performance Testing Part 1
Introduction to Performance Testing Part 1
C.T.Co
 
How to start performance testing project
How to start performance testing projectHow to start performance testing project
How to start performance testing project
NaveenKumar Namachivayam
 
Continuous Performance Testing
Continuous Performance TestingContinuous Performance Testing
Continuous Performance Testing
Grid Dynamics
 
Performance testing
Performance testingPerformance testing
Performance testing
Chalana Kahandawala
 

What's hot (20)

Performance testing presentation
Performance testing presentationPerformance testing presentation
Performance testing presentation
 
Getting start with Performance Testing
Getting start with Performance Testing Getting start with Performance Testing
Getting start with Performance Testing
 
Performance Testing Overview
Performance Testing OverviewPerformance Testing Overview
Performance Testing Overview
 
Software testing performance testing
Software testing  performance testingSoftware testing  performance testing
Software testing performance testing
 
Performance testing
Performance testingPerformance testing
Performance testing
 
Performance testing with Jmeter
Performance testing with JmeterPerformance testing with Jmeter
Performance testing with Jmeter
 
Introduction to performance testing
Introduction to performance testingIntroduction to performance testing
Introduction to performance testing
 
An Introduction to Performance Testing
An Introduction to Performance TestingAn Introduction to Performance Testing
An Introduction to Performance Testing
 
What is Performance Testing?
What is Performance Testing?What is Performance Testing?
What is Performance Testing?
 
Performance and load testing
Performance and load testingPerformance and load testing
Performance and load testing
 
Performance Test Plan - Sample 2
Performance Test Plan - Sample 2Performance Test Plan - Sample 2
Performance Test Plan - Sample 2
 
Performance Testing
Performance TestingPerformance Testing
Performance Testing
 
Infographic: Importance of Performance Testing
Infographic: Importance of Performance TestingInfographic: Importance of Performance Testing
Infographic: Importance of Performance Testing
 
Performance Testing
Performance TestingPerformance Testing
Performance Testing
 
Performance testing
Performance testingPerformance testing
Performance testing
 
Performance testing
Performance testingPerformance testing
Performance testing
 
Introduction to Performance Testing Part 1
Introduction to Performance Testing Part 1Introduction to Performance Testing Part 1
Introduction to Performance Testing Part 1
 
How to start performance testing project
How to start performance testing projectHow to start performance testing project
How to start performance testing project
 
Continuous Performance Testing
Continuous Performance TestingContinuous Performance Testing
Continuous Performance Testing
 
Performance testing
Performance testingPerformance testing
Performance testing
 

Viewers also liked

Performance testing
Performance testing Performance testing
Performance testing
BugRaptors
 
Revised using rubrics to facilitate self-assessment and self-reflection
Revised  using rubrics to facilitate self-assessment and self-reflectionRevised  using rubrics to facilitate self-assessment and self-reflection
Revised using rubrics to facilitate self-assessment and self-reflectionJeremy
 
Ed8 Assessment of Learning 2
Ed8 Assessment of Learning 2 Ed8 Assessment of Learning 2
Ed8 Assessment of Learning 2
Eddie Abug
 
Assessment of learning2
Assessment of learning2Assessment of learning2
Assessment of learning2
niel lopez
 
Measurement and scaling techniques
Measurement  and  scaling  techniquesMeasurement  and  scaling  techniques
Measurement and scaling techniquesUjjwal 'Shanu'
 
Methods of data collection
Methods of data collection Methods of data collection
Methods of data collection PRIYAN SAKTHI
 

Viewers also liked (8)

Performance testing
Performance testing Performance testing
Performance testing
 
Revised using rubrics to facilitate self-assessment and self-reflection
Revised  using rubrics to facilitate self-assessment and self-reflectionRevised  using rubrics to facilitate self-assessment and self-reflection
Revised using rubrics to facilitate self-assessment and self-reflection
 
Ed8 Assessment of Learning 2
Ed8 Assessment of Learning 2 Ed8 Assessment of Learning 2
Ed8 Assessment of Learning 2
 
Assessment
AssessmentAssessment
Assessment
 
Assessment of learning2
Assessment of learning2Assessment of learning2
Assessment of learning2
 
Chapter 9-METHODS OF DATA COLLECTION
Chapter 9-METHODS OF DATA COLLECTIONChapter 9-METHODS OF DATA COLLECTION
Chapter 9-METHODS OF DATA COLLECTION
 
Measurement and scaling techniques
Measurement  and  scaling  techniquesMeasurement  and  scaling  techniques
Measurement and scaling techniques
 
Methods of data collection
Methods of data collection Methods of data collection
Methods of data collection
 

Similar to Performance Testing Using VS 2010 - Part 1

Performance testing interview questions and answers
Performance testing interview questions and answersPerformance testing interview questions and answers
Performance testing interview questions and answers
Garuda Trainings
 
Less11 3 e_loadmodule_1
Less11 3 e_loadmodule_1Less11 3 e_loadmodule_1
Less11 3 e_loadmodule_1
Suresh Mishra
 
Performance Testing using LoadRunner
Performance Testing using LoadRunnerPerformance Testing using LoadRunner
Performance Testing using LoadRunner
Kumar Gupta
 
Best Practices for Applications Performance Testing
Best Practices for Applications Performance TestingBest Practices for Applications Performance Testing
Best Practices for Applications Performance Testing
Bhaskara Reddy Sannapureddy
 
Performance testing Web Application - A complete Guide
Performance testing Web Application - A complete GuidePerformance testing Web Application - A complete Guide
Performance testing Web Application - A complete Guide
TestingXperts
 
PerformanceTestingWithLoadrunner
PerformanceTestingWithLoadrunnerPerformanceTestingWithLoadrunner
PerformanceTestingWithLoadrunnertechgajanan
 
Performance Testing With Loadrunner
Performance Testing With LoadrunnerPerformance Testing With Loadrunner
Performance Testing With Loadrunnervladimir zaremba
 
Some Commonly Asked Question For Software Testing
Some Commonly Asked Question For Software TestingSome Commonly Asked Question For Software Testing
Some Commonly Asked Question For Software Testing
Kumari Warsha Goel
 
System testing
System testingSystem testing
System testing
Abdullah-Al- Mahmud
 
Testing Types And Models
Testing Types And ModelsTesting Types And Models
Testing Types And Modelsnazeer pasha
 
Software reliability & quality
Software reliability & qualitySoftware reliability & quality
Software reliability & qualityNur Islam
 
Performance testing basics
Performance testing basicsPerformance testing basics
Performance testing basics
Charu Anand
 
Performance testing and j meter
Performance testing and j meterPerformance testing and j meter
Performance testing and j meter
Purna Chandar
 
Performancetestingjmeter 121109061704-phpapp02
Performancetestingjmeter 121109061704-phpapp02Performancetestingjmeter 121109061704-phpapp02
Performancetestingjmeter 121109061704-phpapp02
Shivakumara .
 
QSpiders - Introduction to HP Load Runner
QSpiders - Introduction to HP Load RunnerQSpiders - Introduction to HP Load Runner
QSpiders - Introduction to HP Load Runner
Qspiders - Software Testing Training Institute
 
Chaos Testing of Microservices - Shalamov Maksym
 Chaos Testing of Microservices  - Shalamov Maksym Chaos Testing of Microservices  - Shalamov Maksym
Chaos Testing of Microservices - Shalamov Maksym
Kuberton
 
System testing
System testingSystem testing
System testing
Sifat Hossain
 
Getting Started with Apache Jmeter
Getting Started with Apache JmeterGetting Started with Apache Jmeter
Getting Started with Apache Jmeter
Mindfire Solutions
 
Performance Requirement Gathering
Performance Requirement GatheringPerformance Requirement Gathering
Performance Requirement Gathering
Atul Pant
 
Performance Test Slideshow Recent
Performance Test Slideshow RecentPerformance Test Slideshow Recent
Performance Test Slideshow RecentFuture Simmons
 

Similar to Performance Testing Using VS 2010 - Part 1 (20)

Performance testing interview questions and answers
Performance testing interview questions and answersPerformance testing interview questions and answers
Performance testing interview questions and answers
 
Less11 3 e_loadmodule_1
Less11 3 e_loadmodule_1Less11 3 e_loadmodule_1
Less11 3 e_loadmodule_1
 
Performance Testing using LoadRunner
Performance Testing using LoadRunnerPerformance Testing using LoadRunner
Performance Testing using LoadRunner
 
Best Practices for Applications Performance Testing
Best Practices for Applications Performance TestingBest Practices for Applications Performance Testing
Best Practices for Applications Performance Testing
 
Performance testing Web Application - A complete Guide
Performance testing Web Application - A complete GuidePerformance testing Web Application - A complete Guide
Performance testing Web Application - A complete Guide
 
PerformanceTestingWithLoadrunner
PerformanceTestingWithLoadrunnerPerformanceTestingWithLoadrunner
PerformanceTestingWithLoadrunner
 
Performance Testing With Loadrunner
Performance Testing With LoadrunnerPerformance Testing With Loadrunner
Performance Testing With Loadrunner
 
Some Commonly Asked Question For Software Testing
Some Commonly Asked Question For Software TestingSome Commonly Asked Question For Software Testing
Some Commonly Asked Question For Software Testing
 
System testing
System testingSystem testing
System testing
 
Testing Types And Models
Testing Types And ModelsTesting Types And Models
Testing Types And Models
 
Software reliability & quality
Software reliability & qualitySoftware reliability & quality
Software reliability & quality
 
Performance testing basics
Performance testing basicsPerformance testing basics
Performance testing basics
 
Performance testing and j meter
Performance testing and j meterPerformance testing and j meter
Performance testing and j meter
 
Performancetestingjmeter 121109061704-phpapp02
Performancetestingjmeter 121109061704-phpapp02Performancetestingjmeter 121109061704-phpapp02
Performancetestingjmeter 121109061704-phpapp02
 
QSpiders - Introduction to HP Load Runner
QSpiders - Introduction to HP Load RunnerQSpiders - Introduction to HP Load Runner
QSpiders - Introduction to HP Load Runner
 
Chaos Testing of Microservices - Shalamov Maksym
 Chaos Testing of Microservices  - Shalamov Maksym Chaos Testing of Microservices  - Shalamov Maksym
Chaos Testing of Microservices - Shalamov Maksym
 
System testing
System testingSystem testing
System testing
 
Getting Started with Apache Jmeter
Getting Started with Apache JmeterGetting Started with Apache Jmeter
Getting Started with Apache Jmeter
 
Performance Requirement Gathering
Performance Requirement GatheringPerformance Requirement Gathering
Performance Requirement Gathering
 
Performance Test Slideshow Recent
Performance Test Slideshow RecentPerformance Test Slideshow Recent
Performance Test Slideshow Recent
 

Recently uploaded

De mooiste recreatieve routes ontdekken met RouteYou en FME
De mooiste recreatieve routes ontdekken met RouteYou en FMEDe mooiste recreatieve routes ontdekken met RouteYou en FME
De mooiste recreatieve routes ontdekken met RouteYou en FME
Jelle | Nordend
 
Into the Box 2024 - Keynote Day 2 Slides.pdf
Into the Box 2024 - Keynote Day 2 Slides.pdfInto the Box 2024 - Keynote Day 2 Slides.pdf
Into the Box 2024 - Keynote Day 2 Slides.pdf
Ortus Solutions, Corp
 
Cracking the code review at SpringIO 2024
Cracking the code review at SpringIO 2024Cracking the code review at SpringIO 2024
Cracking the code review at SpringIO 2024
Paco van Beckhoven
 
Globus Compute Introduction - GlobusWorld 2024
Globus Compute Introduction - GlobusWorld 2024Globus Compute Introduction - GlobusWorld 2024
Globus Compute Introduction - GlobusWorld 2024
Globus
 
Developing Distributed High-performance Computing Capabilities of an Open Sci...
Developing Distributed High-performance Computing Capabilities of an Open Sci...Developing Distributed High-performance Computing Capabilities of an Open Sci...
Developing Distributed High-performance Computing Capabilities of an Open Sci...
Globus
 
TROUBLESHOOTING 9 TYPES OF OUTOFMEMORYERROR
TROUBLESHOOTING 9 TYPES OF OUTOFMEMORYERRORTROUBLESHOOTING 9 TYPES OF OUTOFMEMORYERROR
TROUBLESHOOTING 9 TYPES OF OUTOFMEMORYERROR
Tier1 app
 
Lecture 1 Introduction to games development
Lecture 1 Introduction to games developmentLecture 1 Introduction to games development
Lecture 1 Introduction to games development
abdulrafaychaudhry
 
Field Employee Tracking System| MiTrack App| Best Employee Tracking Solution|...
Field Employee Tracking System| MiTrack App| Best Employee Tracking Solution|...Field Employee Tracking System| MiTrack App| Best Employee Tracking Solution|...
Field Employee Tracking System| MiTrack App| Best Employee Tracking Solution|...
informapgpstrackings
 
Climate Science Flows: Enabling Petabyte-Scale Climate Analysis with the Eart...
Climate Science Flows: Enabling Petabyte-Scale Climate Analysis with the Eart...Climate Science Flows: Enabling Petabyte-Scale Climate Analysis with the Eart...
Climate Science Flows: Enabling Petabyte-Scale Climate Analysis with the Eart...
Globus
 
OpenFOAM solver for Helmholtz equation, helmholtzFoam / helmholtzBubbleFoam
OpenFOAM solver for Helmholtz equation, helmholtzFoam / helmholtzBubbleFoamOpenFOAM solver for Helmholtz equation, helmholtzFoam / helmholtzBubbleFoam
OpenFOAM solver for Helmholtz equation, helmholtzFoam / helmholtzBubbleFoam
takuyayamamoto1800
 
Visitor Management System in India- Vizman.app
Visitor Management System in India- Vizman.appVisitor Management System in India- Vizman.app
Visitor Management System in India- Vizman.app
NaapbooksPrivateLimi
 
Large Language Models and the End of Programming
Large Language Models and the End of ProgrammingLarge Language Models and the End of Programming
Large Language Models and the End of Programming
Matt Welsh
 
Vitthal Shirke Microservices Resume Montevideo
Vitthal Shirke Microservices Resume MontevideoVitthal Shirke Microservices Resume Montevideo
Vitthal Shirke Microservices Resume Montevideo
Vitthal Shirke
 
GlobusWorld 2024 Opening Keynote session
GlobusWorld 2024 Opening Keynote sessionGlobusWorld 2024 Opening Keynote session
GlobusWorld 2024 Opening Keynote session
Globus
 
Beyond Event Sourcing - Embracing CRUD for Wix Platform - Java.IL
Beyond Event Sourcing - Embracing CRUD for Wix Platform - Java.ILBeyond Event Sourcing - Embracing CRUD for Wix Platform - Java.IL
Beyond Event Sourcing - Embracing CRUD for Wix Platform - Java.IL
Natan Silnitsky
 
Using IESVE for Room Loads Analysis - Australia & New Zealand
Using IESVE for Room Loads Analysis - Australia & New ZealandUsing IESVE for Room Loads Analysis - Australia & New Zealand
Using IESVE for Room Loads Analysis - Australia & New Zealand
IES VE
 
Paketo Buildpacks : la meilleure façon de construire des images OCI? DevopsDa...
Paketo Buildpacks : la meilleure façon de construire des images OCI? DevopsDa...Paketo Buildpacks : la meilleure façon de construire des images OCI? DevopsDa...
Paketo Buildpacks : la meilleure façon de construire des images OCI? DevopsDa...
Anthony Dahanne
 
Globus Connect Server Deep Dive - GlobusWorld 2024
Globus Connect Server Deep Dive - GlobusWorld 2024Globus Connect Server Deep Dive - GlobusWorld 2024
Globus Connect Server Deep Dive - GlobusWorld 2024
Globus
 
Strategies for Successful Data Migration Tools.pptx
Strategies for Successful Data Migration Tools.pptxStrategies for Successful Data Migration Tools.pptx
Strategies for Successful Data Migration Tools.pptx
varshanayak241
 
2024 RoOUG Security model for the cloud.pptx
2024 RoOUG Security model for the cloud.pptx2024 RoOUG Security model for the cloud.pptx
2024 RoOUG Security model for the cloud.pptx
Georgi Kodinov
 

Recently uploaded (20)

De mooiste recreatieve routes ontdekken met RouteYou en FME
De mooiste recreatieve routes ontdekken met RouteYou en FMEDe mooiste recreatieve routes ontdekken met RouteYou en FME
De mooiste recreatieve routes ontdekken met RouteYou en FME
 
Into the Box 2024 - Keynote Day 2 Slides.pdf
Into the Box 2024 - Keynote Day 2 Slides.pdfInto the Box 2024 - Keynote Day 2 Slides.pdf
Into the Box 2024 - Keynote Day 2 Slides.pdf
 
Cracking the code review at SpringIO 2024
Cracking the code review at SpringIO 2024Cracking the code review at SpringIO 2024
Cracking the code review at SpringIO 2024
 
Globus Compute Introduction - GlobusWorld 2024
Globus Compute Introduction - GlobusWorld 2024Globus Compute Introduction - GlobusWorld 2024
Globus Compute Introduction - GlobusWorld 2024
 
Developing Distributed High-performance Computing Capabilities of an Open Sci...
Developing Distributed High-performance Computing Capabilities of an Open Sci...Developing Distributed High-performance Computing Capabilities of an Open Sci...
Developing Distributed High-performance Computing Capabilities of an Open Sci...
 
TROUBLESHOOTING 9 TYPES OF OUTOFMEMORYERROR
TROUBLESHOOTING 9 TYPES OF OUTOFMEMORYERRORTROUBLESHOOTING 9 TYPES OF OUTOFMEMORYERROR
TROUBLESHOOTING 9 TYPES OF OUTOFMEMORYERROR
 
Lecture 1 Introduction to games development
Lecture 1 Introduction to games developmentLecture 1 Introduction to games development
Lecture 1 Introduction to games development
 
Field Employee Tracking System| MiTrack App| Best Employee Tracking Solution|...
Field Employee Tracking System| MiTrack App| Best Employee Tracking Solution|...Field Employee Tracking System| MiTrack App| Best Employee Tracking Solution|...
Field Employee Tracking System| MiTrack App| Best Employee Tracking Solution|...
 
Climate Science Flows: Enabling Petabyte-Scale Climate Analysis with the Eart...
Climate Science Flows: Enabling Petabyte-Scale Climate Analysis with the Eart...Climate Science Flows: Enabling Petabyte-Scale Climate Analysis with the Eart...
Climate Science Flows: Enabling Petabyte-Scale Climate Analysis with the Eart...
 
OpenFOAM solver for Helmholtz equation, helmholtzFoam / helmholtzBubbleFoam
OpenFOAM solver for Helmholtz equation, helmholtzFoam / helmholtzBubbleFoamOpenFOAM solver for Helmholtz equation, helmholtzFoam / helmholtzBubbleFoam
OpenFOAM solver for Helmholtz equation, helmholtzFoam / helmholtzBubbleFoam
 
Visitor Management System in India- Vizman.app
Visitor Management System in India- Vizman.appVisitor Management System in India- Vizman.app
Visitor Management System in India- Vizman.app
 
Large Language Models and the End of Programming
Large Language Models and the End of ProgrammingLarge Language Models and the End of Programming
Large Language Models and the End of Programming
 
Vitthal Shirke Microservices Resume Montevideo
Vitthal Shirke Microservices Resume MontevideoVitthal Shirke Microservices Resume Montevideo
Vitthal Shirke Microservices Resume Montevideo
 
GlobusWorld 2024 Opening Keynote session
GlobusWorld 2024 Opening Keynote sessionGlobusWorld 2024 Opening Keynote session
GlobusWorld 2024 Opening Keynote session
 
Beyond Event Sourcing - Embracing CRUD for Wix Platform - Java.IL
Beyond Event Sourcing - Embracing CRUD for Wix Platform - Java.ILBeyond Event Sourcing - Embracing CRUD for Wix Platform - Java.IL
Beyond Event Sourcing - Embracing CRUD for Wix Platform - Java.IL
 
Using IESVE for Room Loads Analysis - Australia & New Zealand
Using IESVE for Room Loads Analysis - Australia & New ZealandUsing IESVE for Room Loads Analysis - Australia & New Zealand
Using IESVE for Room Loads Analysis - Australia & New Zealand
 
Paketo Buildpacks : la meilleure façon de construire des images OCI? DevopsDa...
Paketo Buildpacks : la meilleure façon de construire des images OCI? DevopsDa...Paketo Buildpacks : la meilleure façon de construire des images OCI? DevopsDa...
Paketo Buildpacks : la meilleure façon de construire des images OCI? DevopsDa...
 
Globus Connect Server Deep Dive - GlobusWorld 2024
Globus Connect Server Deep Dive - GlobusWorld 2024Globus Connect Server Deep Dive - GlobusWorld 2024
Globus Connect Server Deep Dive - GlobusWorld 2024
 
Strategies for Successful Data Migration Tools.pptx
Strategies for Successful Data Migration Tools.pptxStrategies for Successful Data Migration Tools.pptx
Strategies for Successful Data Migration Tools.pptx
 
2024 RoOUG Security model for the cloud.pptx
2024 RoOUG Security model for the cloud.pptx2024 RoOUG Security model for the cloud.pptx
2024 RoOUG Security model for the cloud.pptx
 

Performance Testing Using VS 2010 - Part 1

  • 1. Performance Test Using Visual Studio 2010 Presented By: Mohamed Tarek
  • 2. Agenda – Day 1  Why Functional Testing is not enough ?  Why Performance Test ?  Difference between Performance, Load and Stress Testing  Performance testing process  Different Methods for Applying Load  How to Measure the System Performance ?  Points to be considered before & during testing  Risks Addressed Through Performance Testing  Why VSTS (From Testing Perspective) ?  Types of Tests  Web test  Load test  Ordered test
  • 3. Agenda – Day 1 Web Testing  How it works ?  Creating a web test  Recording the web test  Editing the web test  Configuring the web test  Running the web test
  • 4. Why Functional Testing is not enough
  • 7. Difference between Performance, Load and Stress Testing  Performance - is about response, time lapses, duration ... etc.  Load testing - is about test behavior under normal/peak workload conditions. Load is more about characterizing / simulating your actual workload.  Stress testing - is about surfacing issues under extreme conditions and resource failures.
  • 8. Testing objective (Perf , Load , Stress) Common Action: Apply Load and Measure Performance Performance Testing Load Testing Stress Testing Measuring the System Performance under constant load Performance metrics:  Speed Accuracy Stability But With Different Objectives… !! Measuring the System Performance under incremental load to study the relation between the applied load and the system performance Applying Incremental load on the system till it reach a break point at which the system crash in order to avoid reaching this point in the future or to Enhance the system performance)
  • 9. Performance testing process  Identify Test Environment  Identify Performance Acceptance Criteria  Plan and Design Tests  Configure Test Environment  Implement Test Design (Configuration & Test Data)  Execute Tests  Analyze , Report and Retest
  • 10. Different Methods for Applying Load  Constant number of users accessing the system concurrently  Variable number of users accessing the system concurrently  Constant size of data uploaded to the system by some users (Ex: Different types of files)  Variable size of data uploaded to the system by some users  Constant size of data downloaded from the system by some users  Variable size of data downloaded from the system by some users  Combination of all of the above
  • 11. Required Settings for the applied load  If the load is Virtual users (VU):  Number of users  Start time  End time  Ramp-UP period (Time taken for generating all VUs)  If the load is data file:  File name  File path
  • 12. How to Measure the System Performance Metrics used for measuring the system performance Speed Accuracy Stability Always measured by one of the following:  Avg. Page Load time  Avg. Transaction time  Avg. Response time Can be measured using the Standard Deviation and the coefficient of variation. CV=SD/Avg.Res.time*100% CV< 5% High Stability 5%<CV<10% Acceptable CV>10% Poor Can be measured using the Avg% of Error for all requests %Error<2% High Accuracy 2%<%Error<4% Acceptable %Error>4% Poor  System Speed  System Accuracy  System Stability
  • 13. Example for Performance Testing (1)  Test objective:  We need to know the system performance during the rush hour  Test inputs and Prerequisites: 1) Duration of the rush hour 2) Number of users accessing the system concurrently at the rush hour 3) Ramp-UP Period 4) The commonly used scenarios 5) The users distribution among the scenarios (i.e. % of users executing each scenario) 6)Any input data like user credentials, Serials,..etc
  • 14. Example for Performance Testing (2) Test Results and Reports: 1) Indication about the System Speed (Avg. Page time, Avg. Response time or Avg. Transaction time) 2) Indication about the System Accuracy(Avg. % of Error) 3) The Errors that occurred and the URL of the failed requests 4) Indication about the System Stability(Avg. Standard deviation and Coefficient of Variation) 5) Tabular Report for all the results logged during the test run (Optional) 6) Graphical report for System Speed, Accuracy and Stability Vs Time (Optional)
  • 15. Example for Load Testing (1) Test objective: We need to study the effect of increasing load on the system performance Test inputs and Prerequisites: 1) Initial load to be applied on the system 2) maximum load to be applied on the system (Optional) 3) Type of the load to be applied 4) Step of the incremental load 5) The commonly used scenarios 6) The users distribution among the scenarios (i.e. % of users executing each scenario) 7) Any input data like user credentials, Serials,..etc
  • 16. Example for Load Testing (2) Test Results and Reports: 1) Tabular report indicating the effect of increasing load on the system speed 2) Graphical chart indicating the effect of increasing load on the system speed 3) Tabular report indicating the effect of increasing load on the system Accuracy 4) Graphical chart indicating the effect of increasing load on the system Accuracy 5) Tabular report indicating the effect of increasing load on the system Stability 6) Graphical chart indicating the effect of increasing load on the system Stability 7) List of the switching points at which the system performance was greatly impacted
  • 17. Example For Stress Testing Test objective: We need to Know the maximum load that we could apply to the system Test inputs and Prerequisites: 1) Initial load to be applied on the system 2) Defining the test exit criteria (System crash point) 3) Type of the load to be applied 4) Step of the incremental load 5) The commonly used scenarios 6) The users distribution among the scenarios (i.e. % of users executing each scenario) 7)Any input data like user credentials, Serials,..etc Test Results: The Load at witch the system crash
  • 18. Points to be considered before & during testing  Testing Environment should simulate the production Environment  Incase of inability to simulate the production Environment, a scaling for the test results should be done  The network factor should be isolated usually by preparing a closed test lab for the testing machine and the SUT  The testing tool should be compatible with the SUT  The firewall and Antivirus should be disabled on both testing machine and SUT during recording and execution  Pop-Ups and warnings should be disabled during the recording and execution  While recording, Suitable thinking time should be considered between different actions in the scenario
  • 19. Risks Addressed Through Performance Testing - Speed  Speed-related risks are not confined to end-user satisfaction, although that is what most people think of first. Speed is also a factor in certain business- and data-related risks. Some of the most common speed-related risks that performance testing can address include:  Is the application fast enough to satisfy end users?  Is the business able to process and utilize data collected by the application before that data becomes outdated? (For example, end-of-month reports are due within 24 hours of the close of business on the last day of the month, but it takes the application 48 hours to process the data.)  Is the application capable of presenting the most current information (e.g., stock quotes) to its users?  Is a Web Service responding within the maximum expected response time before an error is thrown?
  • 20. Risks Addressed Through Performance Testing - Scalability  Scalability risks concern not only the number of users an application can support, but also the volume of data the application can contain and process, as well as the ability to identify when an application is approaching capacity. Common scalability risks that can be addressed via performance testing include:  Can the application provide consistent and acceptable response times for the entire user base?  Can the application store all of the data that will be collected over the life of the application?  Are there warning signs to indicate that the application is approaching peak capacity?  Will the application still be secure under heavy usage?  Will functionality be compromised under heavy usage?  Can the application withstand unanticipated peak loads?
  • 21. Risks Addressed Through Performance Testing – Stability (1)  Stability is a blanket term that encompasses such areas as reliability, uptime, and recoverability. Although stability risks are commonly addressed with high-load, endurance, and stress tests, stability issues are sometimes detected during the most basic performance tests. Some common stability risks addressed by means of performance testing include:  Can the application run for long periods of time without data corruption, slowdown, or servers needing to be rebooted?  If the application does go down unexpectedly, what happens to partially completed transactions?  When the application comes back online after scheduled or unscheduled downtime, will users still be able to see/do everything they expect?
  • 22. Risks Addressed Through Performance Testing – Stability (2)  When the application comes back online after unscheduled downtime, does it resume at the correct point? In particular, does it not attempt to resume cancelled transactions?  Can combinations of errors or repeated functional errors cause a system crash?  Are there any transactions that cause system-wide side effects?  Can one leg of the load-balanced environment be taken down and still provide uninterrupted service to users?  Can the system be patched or updated without taking it down?
  • 23. Why VSTS (From Testing Perspective) VSTS Provide Testers with many useful features such as the following:  Provide many types of tests to satisfy the Testers needs  Simple GUI for recording and configuring tests  Powerful in analyzing and Reporting test results  High flexibility in Test Management and control  Test scripting is allowed for Advanced Editing and control
• 24. Types of Tests in VSTS
 VSTS provides many types of tests, such as:
 Web test: used for functional testing of web applications
 Load test: used for performance, load and stress testing of web applications
 Ordered test: used for managing and controlling the order of test execution
 Manual test
 Unit test
 Generic test
 This training focuses only on the most important of these types.
• 25. Web Testing - How it Works ?
 Web tests are used for testing the functionality of web applications, web sites, web services, or a combination of these.
 Web tests can be created by recording the interactions performed in the browser, which are normally a series of HTTP requests (GET/POST).
 These requests can be played back to test the web application, with validation points set on the responses to validate them against the expected results.
 Step 1: Record the user interactions [Test Scenario] on the web browser
 Step 2: Play back the recorded scenario after setting validation points on the responses
 Step 3: Get the test result [Either Success or Failure] after validating against the expected results
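A recorded web test can also be generated as a coded web test in C#. The following is only a minimal sketch to show how a recorded scenario maps to a sequence of HTTP requests that are played back in order; the class name and URLs are hypothetical.

using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;

// Minimal coded web test: each yielded request is sent in order,
// just like the requests captured by the recorder.
public class SampleScenarioWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        // Step 1 of the scenario: open the home page (hypothetical URL)
        WebTestRequest homePage = new WebTestRequest("http://www.contoso.com/");
        yield return homePage;

        // Step 2 of the scenario: open the products page
        WebTestRequest productsPage = new WebTestRequest("http://www.contoso.com/products.aspx");
        yield return productsPage;
    }
}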
• 26. Web Testing - Creating a Web test
 1) From the FILE menu, select New -> Project -> Test Project, then specify the project name and path
 2) In the Solution Explorer panel, select the test project, right-click, and choose Add New Test from the context menu. This opens the Add New Test window, which contains the different test type templates.
 3) Select the Web Test template from the list of test types and specify the test name
 4) Once you select the Web Test and click OK, the test is added to the selected test project and a new instance of a web browser opens.
• 27. Web Testing - Recording the Web test
 When the browser opens to record the user interactions, it contains the Web Test Recorder in the left pane, which is used for recording the test scenarios.
 The recorder has five options:
 Record: Starts recording the web page requests.
 Pause: Pauses the recording; in some cases we may need to skip recording some actions that are not part of the test scenario and then continue recording normally.
 Stop: Stops the recording session. Once we click the Stop button, the browser closes and the session stops.
 Add a Comment: Adds a comment to the current request in the recording.
 Clear all requests: Used in case we need to clear out all the recorded requests.
• 28. Web Testing - Editing the Web test
 After completing the recording of all requests, the Web Test editor window opens and shows the recording details.
 The HTTP requests sent to the server usually carry parameters. There are two types of these parameters:
 Form post parameters: Sent along with the request if the request method is POST. All field entries made by the user in the web page are sent to the server as form POST parameters.
 Query string parameters: Sent along with the request if the request method is GET. The web page is retrieved from the server depending on these parameters. (Ex: User name and password)
 A coded sketch of both parameter types is shown below.
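In a coded web test the two parameter types look roughly as follows. This is a sketch with assumed URLs and field names; the statements belong inside GetRequestEnumerator() of a coded web test class like the one sketched under slide 25.

// GET request: parameters travel in the query string (assumed name/value)
WebTestRequest searchRequest = new WebTestRequest("http://www.contoso.com/search.aspx");
searchRequest.Method = "GET";
searchRequest.QueryStringParameters.Add("q", "performance", false, false);
yield return searchRequest;

// POST request: parameters travel in the form post body (assumed field names)
WebTestRequest loginRequest = new WebTestRequest("http://www.contoso.com/login.aspx");
loginRequest.Method = "POST";
FormPostHttpBody loginBody = new FormPostHttpBody();
loginBody.FormPostParameters.Add("UserName", "testuser");
loginBody.FormPostParameters.Add("Password", "P@ssw0rd");
loginRequest.Body = loginBody;
yield return loginRequest;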
• 29. Web Testing - Editing the Web test (Web test properties)
 There are properties of the Web test that we can set, such as user credentials or a description of the test.
 There are also properties that can be set for each request individually, such as the timeout or think time.
 The Web test properties are as follows:
 Description: The description of the current test.
 Name: The name of the current Web test.
 User Name: The username of the test user, if the test uses any user credentials.
 Password: The password for the username specified in the User Name field.
 Proxy: The proxy server name to be used by the test.
 Stop On Error: Tells the application whether to stop the test or continue in case of any errors.
• 30. Web Testing - Editing the Web test (Requests properties)
 The request properties are as follows:
 Cache Control: Simulates the caching behavior of the web pages. The value can be True or False.
 Expected Response URL: The response URL that we expect after sending the current request. It is validated against the actual response URL.
 Method: The request method used for the current request. It can be either GET or POST.
 Think Time (Seconds): The think time taken by the user between actions (requests). This is not the exact time that a user would spend thinking, but an estimate. This property is not very important for a single web test, but it is very important in a load test because it affects the measured system performance.
• 31. Web Testing - Editing the Web test (Requests properties)
 Timeout (Seconds): The expiry time for the request, i.e. the maximum time allowed for the request to respond. If it doesn't return within this limit, the request times out with an error. The default is 300.
 Response Time Goal: The desired response time for this request, so you can check whether the actual response time meets the desired one. The default value is 0, which means the property is not set.
 URL: The URL address of the request.
 Parse Dependent Requests: Can be set to True or False to parse the dependent requests within the requested page. For example, we may not be interested in collecting details for the images loaded in the web page, so we can turn off the requests for loading the images by setting this to False. Only the main request details will be collected.
 Record Results: Used if we need to collect the performance data for the HTTP request.
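Several of these properties are also exposed on the WebTestRequest object when the test is converted to a coded web test. The sketch below sets the properties discussed above; the URL and values are assumptions, and the statements belong inside GetRequestEnumerator() of a coded web test class like the one sketched under slide 25.

// Configure a single request (hypothetical URL and values)
WebTestRequest request = new WebTestRequest("http://www.contoso.com/products.aspx");
request.Method = "GET";
request.ThinkTime = 5;                   // seconds of simulated user think time after this request
request.Timeout = 300;                   // fail the request if no response arrives within 300 seconds
request.ResponseTimeGoal = 2;            // desired response time in seconds (0 means not set)
request.ExpectedResponseUrl = "http://www.contoso.com/products.aspx";
request.ParseDependentRequests = false;  // do not fetch images/CSS referenced by the page
request.Cache = true;                    // simulate browser caching of the page
yield return request;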
• 32. Web Testing - Editing the Web test (Extraction rules)
 Extraction rules are useful for extracting data or information from the HTTP response.
 In web applications, many web forms depend on other web forms; that is, a request is based on data collected from the previous request's response.
 The data from the response has to be extracted and then passed to the next request, for example as values for query string parameters.
 VSTS provides several built-in types of extraction rules. These help us extract values based on HTML tags or the different types of fields available in the web form.
• 33. Web Testing - Editing the Web test (Extraction rules)
 The different types of extraction rules:
 Extract Attribute Value: Extracts an attribute value from the response page based on the tag and the attribute name. The extracted value is stored in a context parameter. The attribute could belong to an image, a link, etc.
 Extract Form Field: Extracts the value of any of the form fields in the response. The field name is identified here.
 Extract Text: Extracts text from the response page. The text is identified by its starting and ending values, with case sensitivity optional.
 We can add as many rules as we want, but we should make sure that the context parameter names are unique across the test.
• 34. Web Testing - Editing the Web test (Extraction rules)
 How to add an extraction rule to a Web test:
 1) Open a Web test.
 2) Select the request to which you want to add the extraction rule.
 3) Right-click the request and select Add Extraction Rule. The Add Extraction Rule dialog box is displayed.
 4) In the Add Extraction Rule dialog box, select Extract Attribute Value.
 5) In the Properties, set the Context Parameter Name property to a descriptive name such as First Link.
• 35. Web Testing - Editing the Web test (Extraction rules)
 Example: Suppose the HTML we are trying to extract from is <a href="http://www.contoso.com">, where a is the tag and href is the attribute of interest.
 6) Set the Attribute Name property to href and the Tag Name property to a.
 7) After running the test, the extracted value http://www.contoso.com will be stored in the context parameter First Link.
 8) You can use this extracted value in later requests of the test.
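The same extraction rule can be expressed in a coded web test. The sketch below is only an illustration under assumptions: the URLs are hypothetical, and the context parameter is named FirstLink (no space) so it can be used as an identifier in code.

using System;
using System.Collections.Generic;
using Microsoft.VisualStudio.TestTools.WebTesting;
using Microsoft.VisualStudio.TestTools.WebTesting.Rules;

public class ExtractionWebTest : WebTest
{
    public override IEnumerator<WebTestRequest> GetRequestEnumerator()
    {
        WebTestRequest homePage = new WebTestRequest("http://www.contoso.com/");

        // Extract the href attribute of an <a> tag into the context parameter "FirstLink"
        ExtractAttributeValue extractLink = new ExtractAttributeValue();
        extractLink.TagName = "a";
        extractLink.AttributeName = "href";
        extractLink.ContextParameterName = "FirstLink";
        homePage.ExtractValues += new EventHandler<ExtractionEventArgs>(extractLink.Extract);
        yield return homePage;

        // The extracted value is now available in the test context and can drive the next request
        WebTestRequest followLink = new WebTestRequest(this.Context["FirstLink"].ToString());
        yield return followLink;
    }
}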
• 36. Web Testing - Editing the Web test (Validation rules)
 Validation rules are mainly used to validate certain data or text in the response against our expectations.
 How to add a validation rule to the web test:
 1) Right-click the request
 2) Select the Add Validation Rule option, which opens the Add Validation Rule dialog
 3) Select the type of validation rule required and fill in the parameters required for the rule
 VSTS provides a set of predefined validation rules.
• 37. Web Testing - Editing the Web test (Validation rules)
 The different types of validation rules:
 Form Field: Verifies the existence of a form field with a certain value. The parameters are Form Field Name and Expected Value.
 Find Text: Verifies the existence of a specified text in the response. The parameters used for this are:
 Find Text: The text to search for
 Ignore Case: Determines whether the search is case sensitive [True: ignore case / False: do not ignore case]
 Pass If Text Found: Determines the acceptance criterion [True: the test passes if the text is found / False: the test passes if the text is not found]
 Use Regular Expression: Used if you need to search for a regular expression (i.e. a special sequence of characters)
• 38. Web Testing - Editing the Web test (Validation rules)
 Maximum Request Time: Verifies whether the request finishes within the specified maximum request time. The parameter is Max Request Time (milliseconds).
 Required Attribute Value: Similar to the attribute rule used in the extraction rules. The parameters are the same as those used in extraction rules, with two additional fields:
 Expected Value: The expected value of the attribute
 Index: Indicates which occurrence of the string to validate. If the index is set to -1, any occurrence in the response is accepted.
 The Level option that appears above the parameters of all validation rules is used in load testing to determine which levels of rules are validated.
• 39. Web Testing - Editing the Web test (Validation rules)
 Required Tag: Verifies that the specified tag exists in the response. The parameters are:
 Required Tag Name: The name of the tag to be validated
 Minimum Occurrences: The minimum number of occurrences, if needed
 Response URL: Verifies that the response URL is the same as the recorded response URL
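In a coded web test, validation rules are attached to a request in much the same way as extraction rules. The sketch below wires up a Find Text rule and a Required Tag rule; the URL and expected text are assumptions, and the statements belong inside GetRequestEnumerator() of a coded web test class like the one sketched under slide 25.

// Inside GetRequestEnumerator() of a coded web test (hypothetical URL and expected text)
WebTestRequest homePage = new WebTestRequest("http://www.contoso.com/");

// Find Text: fail the request unless the response body contains "Welcome"
ValidationRuleFindText findText = new ValidationRuleFindText();
findText.FindText = "Welcome";
findText.IgnoreCase = true;
findText.UseRegularExpression = false;
findText.PassIfTextFound = true;
homePage.ValidateResponse += new EventHandler<ValidationEventArgs>(findText.Validate);

// Required Tag: fail the request unless at least one <title> tag is present
ValidationRuleRequiredTag requiredTag = new ValidationRuleRequiredTag();
requiredTag.RequiredTagName = "title";
requiredTag.MinOccurrences = 1;
homePage.ValidateResponse += new EventHandler<ValidationEventArgs>(requiredTag.Validate);

yield return homePage;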
• 40. Web Testing - Configuring the Web test
 How to set the properties of the web test:
 1) Open the “Test” menu from the menu bar
 2) Select “Edit Test Run Configurations”
 3) Select “Local Test Run” from the sub menu
 4) The test configuration window opens
 5) Select “Web Test” from the left panel
 6) Set the settings of the web test
 This section describes the settings required for web testing.
 These settings apply only to web testing, but some of the properties will be overridden in the case of a load test.
• 41. Web Testing - Configuring the Web test
 The different properties of the web test:
 Number of run iterations: The number of times the test has to run. This property does not apply to a load test, as the number of iterations is determined in the load test properties, which override this setting. You can also set the web test to run once for each row in the available data source.
 Browser type: The type of browser to simulate for the web test.
 Network type: The network type to simulate when running the web test.
 Think time: Controls the simulation of think time between the requests sent in the web test.
• 42. Web Testing - Running the Web test
 Once we have made all the required settings and finished recording all the required requests, we can start running the test.
 Use the Run Test option in the Web test editor toolbar to start running the test.
 You can observe the test execution progress in the web test window.
 After the execution completes, the result window displays success or failure against each request.
 If any one of the requests in the test fails, the test result window shows the final result of the test as “Failed”.
• 43. Web Testing - Running the Web test
 If there are multiple requests in the test, we can view the detailed results of each request:
 Web Browser: Displays the web page returned for the request
 Request: Contains all the information about the request, such as headers, cookies, query string parameters and form post parameters
 Response: Shows the response for the requested web page, represented as HTML
 Context: All the run-time details assigned to the test can be viewed here
 Details: Shows the status of the validation rules that were executed during the test
  • 44. Contact Us Website: http://www.geekit.me E-Mail : info@geekit.me