Best Practices in Performance Testing
Table of Contents
I) Test Planning/Test Strategy
II) Scripting (JMeter Checklist)
III) Test Execution & Analysis
A performance tester should be familiar with all of the checklist items below, seek the inputs they call for, and follow them diligently for
successful performance testing.
I) Test Planning/Test Strategy
Checkpoints
Entry Criteria
Have the project release timelines and scope been communicated? If not, obtain the details of the project scope, the types of tests to be
covered, the test cases/user traversal flows to be covered as part of scripting, and the execution timelines.
Have the test cases/user traversal flows been identified from the Requirement Document and is the screen flow for critical transactions
available for the same?
Is the production mix data available?
Are the application architecture, AppServer, WebServer and database details available? Are architecture diagrams (logical and physical, for both
production and test environments) and deployment diagrams available?
Test Plan or Strategy Document Contents - Checkpoints
Have the release, project(s), project manager(s) details been updated?
Has the Revision History section been updated with the latest version#, Date, Originator and Modification details?
Do all the links in the reference documentation section (Requirement Document) point to the latest version of the respective
documents?
Has the Project Roles & Responsibilities section been updated with all the project managers and lead engineer's names and phone numbers?
Does the Document Approvals / Signoff section include the QA/Client managers/relevant stakeholders involved in the project for the current
Project release?
Is the Introduction Section (Purpose, Project Summary) updated with details from the Requirement Documents as applicable to the
current Project release?
Is the correct System Architecture Diagram in place in the Architecture Overview Section?
Does your Test Strategy section cover all the different types of tests that are to be executed for the Project/Release?
Have all the test scenarios to be scripted been detailed out in the Test Cases section?
Is the Test Cases section in the agreed format?
Is the scope of testing in-sync with the project's scope for the project release (with respect to functionality, system upgrades, etc)?
Has the Assumptions/Constraints/Risks section been updated with proper data and ownership?
Are the dates in the Deliverable section reflective of the current project release schedule and is the right link provided to the test reports?
Does the Project Requirements section have the right link to the baseline reports?
Are the TPS requirements given accurately (2x, 3x, 4x) for the test sections (Capacity, Longevity, Max TPS, etc.)?
Has the Test Dependencies section been updated with the latest Production Mix data and TPS values?
Has the overall Production Mix % split been specified with accurate weightage in a multiple-scenario case?
Are the data requirements (User ID etc) given for all the applicable scenarios?
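The production-mix split above has to be turned into per-scenario virtual-user counts at scenario design time. A minimal Python sketch of that allocation, where the scenario names and the 100-user total are illustrative assumptions, not values from any plan:

```python
# Sketch: derive per-scenario Vuser counts from a production-mix % split.
# Scenario names and the user totals are illustrative only.

def allocate_vusers(total_users, mix):
    """Split total_users across scenarios by mix %, largest-remainder rounding."""
    raw = {name: total_users * pct / 100.0 for name, pct in mix.items()}
    counts = {name: int(v) for name, v in raw.items()}
    # Hand the leftover users to the scenarios with the largest fractional parts,
    # so the counts always sum back to total_users.
    leftover = total_users - sum(counts.values())
    for name in sorted(raw, key=lambda n: raw[n] - counts[n], reverse=True)[:leftover]:
        counts[name] += 1
    return counts

mix = {"Login": 20, "Search": 50, "Checkout": 30}   # must sum to 100
print(allocate_vusers(100, mix))  # {'Login': 20, 'Search': 50, 'Checkout': 30}
```

Largest-remainder rounding matters for odd totals: a 33/33/34 split of 10 users still yields exactly 10 Vusers overall.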
Check whether a consolidated (transaction-level) form of the modules' response times is needed; if not, avoid using it in the script. Otherwise,
manual work will be required on the final report, and the total row will need to be recalculated by hand.
Is the scripting tool & monitoring tool specified with correct version in the Test Tools sections?
Has the appserver, webserver, DB Server etc with sizing details been provided in the Environment/Server List section as applicable?
Has the Logs and Configurations section been updated with application logs, webserver logs and config file locations?
Are all the tests mentioned in the Test Strategy section explained in the Performance Tests section, with details about purpose, objectives, etc.?
Are the estimated/actual milestone dates for each phase of the project with the performance test deliverables mentioned accurately in the
Performance Testing Schedule?
Have the dates for the build schedule been updated in the Environment Schedule, if applicable?
Formatting Checks
Is the TOC updated to reflect the correct page numbers/section names and are the links on the TOC working correctly?
Are the header/footer appropriately placed as specified?
Has a spell check been performed?
Is the indentation uniform throughout the document?
Have page breaks been appropriately given to the starting of new sections, if applicable?
Is the font uniform throughout the document?
Have the section headers been formatted appropriately so as to reflect properly in the TOC?
Is the Test Plan named appropriately according to conventions, if applicable?
Have the document properties been updated?
Is the alignment proper throughout the document?
Has a Self Review been performed before sharing the Test Plan?
II) Scripting (JMeter Checklist)
Some general standards to follow in JMeter
1. Keep scripts modular: Create separate scripts for different functionalities or test scenarios. This will make it easier to manage and
maintain the scripts.
2. Use variables: Use variables for any test data that will change, such as user IDs, passwords, and URLs. This will make it easier to
update the test data later on.
3. Use Assertions: Use assertions to verify that the server's response is correct, but only for debugging purposes. This will help you
catch any errors that occur during testing.
4. Use correlation: Correlate any dynamic values that are returned by the server, such as session IDs or CSRF tokens. This will ensure
that subsequent requests are valid.
5. Use realistic load profiles: Use realistic load profiles to simulate the expected user behaviour. This will help you identify any
performance issues before they occur in production.
6. Use timers: Use timers to simulate user think time and pacing between requests. This will help you create more accurate load
profiles. Choose the right combination of timers (Constant Timer, Gaussian Random Timer, etc.) depending on the arrival rates to be modelled.
7. Use CSV data sets: Use CSV data sets to read test data from external files. This will make it easier to manage and maintain large
amounts of test data. Ensure all CSV data files are kept in the bin folder of the JMeter instance, and that no full paths are
used; this makes it easy to move scripts across Windows/Mac/Linux instances.
8. Use non-GUI mode: Run JMeter in non-GUI mode to reduce the overhead on the system and improve performance.
9. Use distributed testing: Use distributed testing to simulate larger loads and distribute the load across multiple machines.
10. Use version control: Use version control to manage and maintain scripts over time. This will help you track changes and revert to
previous versions if needed
11. Enable "Delay Thread creation until needed" in the Thread Group.
12. Use the WorkBench for rough activity, debugging and recording, so that it is not executed as part of the test.
13. Use "Patterns to Exclude". Avoid using "Patterns to Include" (check this while recording: in JMeter, "Patterns to Include" gives
better control over the recording capture). Always use the Recording Template and capture the request/response in XML files in View Results
Tree. Do this recording at least twice to ensure no request/response captures are missed.
14. Avoid GUI mode for large loads. Don't use listeners that are heavy.
15. Avoid View Results Tree and View Results in Table while executing a test. Use them only for debugging.
16. Avoid the use of Assertions in JMeter during load runs, as they are likely to be memory hoggers.
17. Use Loop Controllers for repeated samplers.
18. Use dynamic test data with CSV.
19. For saving listener output, use CSV instead of XML; disable XML output, as it is considered heavy.
20. Ensure all protocol, server and port number entries are cleared in the sub-requests; only the 'HTTP Request Defaults' should have these.
Note that 'HTTP Request Defaults' should use variables, and no hard-coding should be permitted there. This helps when
changing environments during the tests.
21. Ensure the 'Header Manager' also has no hard-coded values in it.
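The think-time and pacing guidance in item 6 can be sanity-checked outside JMeter. A small Python sketch of the behaviour behind the Constant and Gaussian Random timers, where the 3000 ms delay and 1000 ms deviation are example values only, not recommendations from this checklist:

```python
# Sketch of the think-time distributions behind JMeter's Constant Timer and
# Gaussian Random Timer; the 3000 ms delay and 1000 ms deviation are
# example values chosen for illustration.
import random

def constant_think_time(delay_ms=3000):
    """Constant Timer: a fixed pause before each sampler."""
    return delay_ms

def gaussian_think_time(delay_ms=3000, deviation_ms=1000, rng=random):
    """Gaussian Random Timer: a constant delay plus a normally distributed
    offset; clamped at 0 so the pause is never negative."""
    return max(0.0, delay_ms + rng.gauss(0, deviation_ms))

samples = [gaussian_think_time() for _ in range(10_000)]
print(f"mean think time ~ {sum(samples) / len(samples):.0f} ms")  # near 3000 ms
```

Simulating the distribution like this helps verify that the chosen timer settings actually produce the arrival rate you intend before a load test is run.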
Checkpoints
Recording Options
Decide on the appropriate sampler for scripting.
Uncheck the Auto Correlation option if it is checked. Sometimes it may be required, and this is left to the judgement of the scripter (auto-correlation
is not ideally recommended).
Add appropriate comments, while recording or after recording, that explain the function of every transaction. Always add an elaborate
description in the comments column, so that it is easy for anyone to follow or maintain the script in the future.
Keep all the values to be input, along with their field names, ready in another document (Notepad/WordPad/etc.); this prevents re-scripting the
flow due to a wrong input or a forgotten value (in the case of complex/very large workflows).
Decide the script name for the flow in advance.
Scripting Content
Check whether the script starts with an appropriate header which contains the following info:
• Copyright notice
• Script Name
• Author
• Version information
• Script Version
• Modified Log
• Associated files
• Script Overview
• Function TOC
• [URL TOC]
Use proper Hungarian notation for variable names.
All transactions should have checkpoints to verify page/function accuracy.
The checkpoints must be unique between pages and must distinguish one page from another.
All action and transaction names should follow a standard
format: ApplicationName_FunctionalityName_FunctionalityNumber_PageNumber_PageName/TransactionName
For example: Application name = PI
5 scripts exist: Associate, Supervisor, Workflow, QC and Index
PI_Associate_01_01_Process
PI_01_PreLogin
PI_02_Login
PI_03_Logout
PI_QC_04_02_WorkTab
As the above shows, common transactions across the application, like login and logout, must have a common transaction name
across all scripts.
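A naming convention like this is easier to enforce with a small helper. An illustrative Python sketch (the helper itself is not part of the original checklist; the PI examples mirror those above):

```python
# Sketch: build transaction names in the
# App_Functionality_FunctionalityNo_PageNo_PageName format described above.

def txn_name(app, functionality, func_no, page_no, page_name):
    """Scenario-specific transaction name, with zero-padded sequence numbers."""
    return f"{app}_{functionality}_{func_no:02d}_{page_no:02d}_{page_name}"

def common_txn_name(app, seq_no, page_name):
    """Common flows (login, logout, etc.) keep one shared name across scripts."""
    return f"{app}_{seq_no:02d}_{page_name}"

print(txn_name("PI", "Associate", 1, 1, "Process"))  # PI_Associate_01_01_Process
print(common_txn_name("PI", 2, "Login"))             # PI_02_Login
```

Generating names rather than typing them by hand keeps the zero-padding and ordering consistent across all scripts.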
Always parameterize URL, UID, and PWD.
PWD may need to be encrypted.
Parameterize the following values in the script depending upon the requirement.
• Input Data (Like User ID, Password, Date, Timings etc )
• Login server URL
• Application Server URL, Content
• Think Time
• Other Values applicable
Correlate the dynamic values in the script using manual correlation through the Regular Expression Extractor.
The best way to find correlatable values is to record the flow twice with different inputs (including UID, PWD and any other values that you feed into
the system) and then compare what changes. Anything that changes in the request and was not input by you requires correlation.
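The record-twice-and-compare technique can be mechanized with a simple diff of the two captures; each flagged parameter then becomes a candidate for a Regular Expression Extractor. A Python sketch, where the parameter names (q, sessionId, csrf) are hypothetical:

```python
# Sketch: compare the same request captured in two recordings to spot
# values that changed without being user input -> correlation candidates.
# The parameter names below (q, sessionId, csrf) are hypothetical.
from urllib.parse import parse_qsl

def correlation_candidates(query1, query2, user_inputs):
    """Parameters present in both captures whose values differ and that the
    tester did not type -> they must come from a server response."""
    p1, p2 = dict(parse_qsl(query1)), dict(parse_qsl(query2))
    return sorted(
        k for k in p1
        if k in p2 and p1[k] != p2[k] and k not in user_inputs
    )

rec1 = "q=laptops&sessionId=A17F&csrf=9d2c"
rec2 = "q=phones&sessionId=B03E&csrf=41aa"
# q changed because the tester typed it; sessionId/csrf changed on their own.
print(correlation_candidates(rec1, rec2, user_inputs={"q"}))  # ['csrf', 'sessionId']
```

The same idea extends beyond query strings to headers and POST bodies: anything that differs between the two captures without being typed by you needs a correlation rule.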
There should not be any orphan requests in the script
Ensure that if multiple Vusers cannot log in with the same ID, the uid/pwd data file setting is "unique".
Run the script for multiple iterations, with different data values in each iteration.
Maintain a tabular structure that will help you analyze what went wrong while trying to replay.
A script is COMPLETED only when it passes the conditions below, with all test data, standardizations, response handling, error handling,
etc. incorporated.
Here, multi-user refers to multiple Vusers.

Script | Single User, Single Iteration | Single User, Multi Iteration | Multi User, Multi Iteration
s1     | passed                        | failed                       | failed
s2     | passed                        | failed                       | failed
s3     |                               |                              |

Finally, all scripts should have a passed status in all the columns of the table above.
 Understand the nature of users getting into the system and choose appropriate controllers for implementing login. For example,
internet-facing end users are likely to log in, perform operations and log out. In this kind of scenario, use a Simple Controller for the
login. Back-office users are likely to log in at their shift start time, perform operations, and finally log out when the shift
ends. In this kind of scenario, use a Once Only Controller for the login, and the rest can sit in a Loop Controller.
 Always use conditional controllers, like If Controllers, to move to the next step only if the previous step's variables are available and
properly populated. This helps minimize erroneous requests being sent when there is no proper data in the previous
response.
 For if controllers, ensure “Interpret Condition as Variable Expression?” is deselected
 Note that whenever a series of If Controllers is used, ensure they are nested as deemed necessary for the situation.
 After using a normal Thread Group, once the multi-user tests pass, make use of the Ultimate Thread Group as needed.
 Trial the controls with dummy samplers to verify the flow of each request in the different flows, especially where a Throughput
Controller or similar is involved.
 Avoid using any dummy sampler in the script during test runs; a dummy sampler's response time will add to the overall response time
and skew the results. Prefer a Beanshell pre/post-processor; in the worst case, use a Beanshell sampler.
 Before starting any scripting work, set the JMeter heap size (up to 80% of your physical memory) to avoid issues while
running higher user counts.
 Always take a copy of your recordings and work on the copied set; keep all original sets in a secure place until the project ends.
III) Test Execution & Analysis
Entry Criteria
Test plan/Strategy should be completed and signed-off.
Test Data set up should be completed.
Make sure that the application is functionally stable. {ideally before the start of scripting}
Make sure that the scripts/scenarios have been "tested" to ensure no data / application / script issues exist.
The Scripts have been reviewed and signed off.
The environment is available exclusively for performance testing, or as agreed in the Test Plan.
Execution
Check the duration of the Test Run is setup as per requirement.
Check the runtime settings have been set up as per requirement.
Confirm the system utilization before commencement of the test; the utilization should be within the limits specified in the test plan.
Confirm all the log levels are set to ERROR/INFO/etc. as per the scenario specifications.
If the test requires specific configuration changes, ensure that the changes are made and documented.
Make sure the run-time settings are set appropriately and reviewed by another person.
Check whether there is enough disk space in the server file systems and the database.
Confirm that the load generator(s) and the machine hosting the results folder have adequate disk space.
Configure a monitoring tool (AppDynamics, New Relic, Wily Introscope, AWS CloudWatch, etc.) to collect data during the test run.
For Java applications, use JVisualVM to monitor the heap and GC. As a prerequisite, you need to work with the dev team to open the ports for a
remote connection.
Monitor the required metrics (e.g. via prstat, vmstat, GC logs, netstat) and collect the data from the server while the test is running (if required
and if you have access to get them).
Take the thread dump if it’s required. Also kill the process in case of any issue with the application
Monitor the test properly and check whether the requirements have been met or not
Before the actual execution, do a dummy run to verify that all scripts are working and there are no issues with user ramp-up.
Once the scenario (for the test run) is ready, save it locally and also take a backup on another machine.
Check the Load generator connectivity to application URL (whether any ports need to be opened) and network utilization before
commencement of Test.
Remove the log dump on the server (if required)
Check if there are any unnecessary processes running on the Controller and load generators which are, or may be, occupying memory during the test.
Ensure that there are no batch jobs or processes running on the server which may lead to incorrect results.
Do not open any application on the Controller or Load generator when the test is running
Do not try to add too many Vusers while the test is running; this leads to an LR crash.
Turn on the snapshot log level for time-consuming scenarios (e.g. endurance).
Look for any Cold Starts.
Take one API and request it at 10-, 15- and 20-minute intervals, 5 to 6 times, and observe the latency. We should obtain roughly the
same response time every time.
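The repeat-sampling stability check above can be expressed as a simple rule over the collected latencies. A Python sketch, where the ±20% tolerance is an assumed threshold rather than a value from this checklist:

```python
# Sketch: judge whether repeated samples of one API show stable latency.
# The 20% tolerance is an assumed threshold, chosen for illustration.

def latency_is_stable(samples_ms, tolerance=0.20):
    """True if every sample is within +/- tolerance of the median sample."""
    s = sorted(samples_ms)
    median = s[len(s) // 2]
    return all(abs(x - median) <= tolerance * median for x in samples_ms)

print(latency_is_stable([410, 395, 402, 420, 398, 405]))   # True
print(latency_is_stable([410, 395, 1900, 420, 398, 405]))  # False: cold start?
```

Comparing against the median rather than the mean keeps one outlier (such as a cold start) from masking itself by dragging the baseline up.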
Analysis
Analyze the application, web services logs for potential issues (connections, memory, cpu etc.)
Analyze the Garbage Collection logs for Memory leaks / excessive collections
Create a template to save time when opening the Analysis file in the analyzer (e.g. the percentile value, excluding think time, and the graphs
required).
Set the Granularity of graph to improve readability
Change the X-axis and Y axis values for graphs generated by analyzer (if required)
Confirm that the application processes have returned to normal after the test (check for any hanging issues).
Check for transactions with response times beyond acceptable limits.
Identify DB bottlenecks by listing the top queries that are the most time/resource consuming.
Check the overall TPS achieved for constant duration.
Analyze the system and application-level process CPU utilization to be within permissible limits
If the error rate is greater than the acceptable limit (5%?), determine whether the problem falls into the environment, application
or script category.
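The overall TPS and error-rate checks can be computed directly from a JTL-style result set. A minimal Python sketch with made-up sample rows, using the text's 5% placeholder as the acceptable limit:

```python
# Sketch: compute overall TPS and error rate from JTL-style sample rows
# (timestamp in ms, success flag). The rows below are made up for illustration.

ACCEPTABLE_ERROR_PCT = 5.0   # the "(5%?)" placeholder from the checklist

def summarize(rows):
    """Overall throughput and error rate over the span of the samples."""
    ts = [t for t, _ in rows]
    duration_s = max(1e-9, (max(ts) - min(ts)) / 1000.0)
    errors = sum(1 for _, ok in rows if not ok)
    return {
        "tps": len(rows) / duration_s,
        "error_rate_pct": 100.0 * errors / len(rows),
    }

rows = [(0, True), (500, True), (1000, False), (1500, True), (2000, True)]
s = summarize(rows)
print(f"TPS={s['tps']:.1f}, errors={s['error_rate_pct']:.0f}%")  # TPS=2.5, errors=20%
if s["error_rate_pct"] > ACCEPTABLE_ERROR_PCT:
    print("investigate: environment, application or script?")
```

In practice the rows would come from the saved .JTL/.CSV results file mentioned below, parsed with the csv module.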
Save the Analysis file in .JTL/.CSV format to save time for future use.
Once the template is applied and the required graphs are generated, save the report in HTML or Word format for reference.
More Related Content

Similar to Performance testing checklist.pdf

Test Director Ppt Training
Test Director Ppt TrainingTest Director Ppt Training
Test Director Ppt Trainingshrikantg
 
Best practice adoption (and lack there of)
Best practice adoption (and lack there of)Best practice adoption (and lack there of)
Best practice adoption (and lack there of)John Pape
 
Implementing test scripting Ian McDonald updated (minor changes) 26-04-2013
Implementing test scripting   Ian McDonald updated (minor changes) 26-04-2013Implementing test scripting   Ian McDonald updated (minor changes) 26-04-2013
Implementing test scripting Ian McDonald updated (minor changes) 26-04-2013Ian McDonald
 
Qtp Presentation
Qtp PresentationQtp Presentation
Qtp Presentationtechgajanan
 
The Importance of Performance Testing Theory and Practice - QueBIT Consulting...
The Importance of Performance Testing Theory and Practice - QueBIT Consulting...The Importance of Performance Testing Theory and Practice - QueBIT Consulting...
The Importance of Performance Testing Theory and Practice - QueBIT Consulting...QueBIT Consulting
 
Best practices for test case creation & maintenance
Best practices for test case creation & maintenanceBest practices for test case creation & maintenance
Best practices for test case creation & maintenance99tests
 
Introduction to testing.
Introduction to testing.Introduction to testing.
Introduction to testing.Jithinctzz
 
QTP Tutorial
QTP TutorialQTP Tutorial
QTP Tutorialpingkapil
 
Performance eng prakash.sahu
Performance eng prakash.sahuPerformance eng prakash.sahu
Performance eng prakash.sahuDr. Prakash Sahu
 
Test Automation Framework Design | www.idexcel.com
Test Automation Framework Design | www.idexcel.comTest Automation Framework Design | www.idexcel.com
Test Automation Framework Design | www.idexcel.comIdexcel Technologies
 
Maximizing SAP ABAP Performance
Maximizing SAP ABAP PerformanceMaximizing SAP ABAP Performance
Maximizing SAP ABAP PerformancePeterHBrown
 
Creating Test Scenarios Demystified_ Your Ultimate How-To Guide.pdf
Creating Test Scenarios Demystified_ Your Ultimate How-To Guide.pdfCreating Test Scenarios Demystified_ Your Ultimate How-To Guide.pdf
Creating Test Scenarios Demystified_ Your Ultimate How-To Guide.pdfkalichargn70th171
 

Similar to Performance testing checklist.pdf (20)

About Qtp_1 92
About Qtp_1 92About Qtp_1 92
About Qtp_1 92
 
About Qtp 92
About Qtp 92About Qtp 92
About Qtp 92
 
Test Director Ppt Training
Test Director Ppt TrainingTest Director Ppt Training
Test Director Ppt Training
 
Best practice adoption (and lack there of)
Best practice adoption (and lack there of)Best practice adoption (and lack there of)
Best practice adoption (and lack there of)
 
Implementing test scripting Ian McDonald updated (minor changes) 26-04-2013
Implementing test scripting   Ian McDonald updated (minor changes) 26-04-2013Implementing test scripting   Ian McDonald updated (minor changes) 26-04-2013
Implementing test scripting Ian McDonald updated (minor changes) 26-04-2013
 
Qtp Presentation
Qtp PresentationQtp Presentation
Qtp Presentation
 
Salesforce testing best_practices
Salesforce testing best_practicesSalesforce testing best_practices
Salesforce testing best_practices
 
Txet Document
Txet DocumentTxet Document
Txet Document
 
The Importance of Performance Testing Theory and Practice - QueBIT Consulting...
The Importance of Performance Testing Theory and Practice - QueBIT Consulting...The Importance of Performance Testing Theory and Practice - QueBIT Consulting...
The Importance of Performance Testing Theory and Practice - QueBIT Consulting...
 
Best practices for test case creation & maintenance
Best practices for test case creation & maintenanceBest practices for test case creation & maintenance
Best practices for test case creation & maintenance
 
Test automation
Test automationTest automation
Test automation
 
Introduction to testing.
Introduction to testing.Introduction to testing.
Introduction to testing.
 
Qa documentation pp
Qa documentation ppQa documentation pp
Qa documentation pp
 
QTP Tutorial
QTP TutorialQTP Tutorial
QTP Tutorial
 
Performance eng prakash.sahu
Performance eng prakash.sahuPerformance eng prakash.sahu
Performance eng prakash.sahu
 
Test Automation Framework Design | www.idexcel.com
Test Automation Framework Design | www.idexcel.comTest Automation Framework Design | www.idexcel.com
Test Automation Framework Design | www.idexcel.com
 
Hybrid framework
Hybrid frameworkHybrid framework
Hybrid framework
 
Maximizing SAP ABAP Performance
Maximizing SAP ABAP PerformanceMaximizing SAP ABAP Performance
Maximizing SAP ABAP Performance
 
MSSQL Queries.pdf
MSSQL Queries.pdfMSSQL Queries.pdf
MSSQL Queries.pdf
 
Creating Test Scenarios Demystified_ Your Ultimate How-To Guide.pdf
Creating Test Scenarios Demystified_ Your Ultimate How-To Guide.pdfCreating Test Scenarios Demystified_ Your Ultimate How-To Guide.pdf
Creating Test Scenarios Demystified_ Your Ultimate How-To Guide.pdf
 

Recently uploaded

Microscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxMicroscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxpurnimasatapathy1234
 
SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )Tsuyoshi Horigome
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxJoão Esperancinha
 
chaitra-1.pptx fake news detection using machine learning
chaitra-1.pptx  fake news detection using machine learningchaitra-1.pptx  fake news detection using machine learning
chaitra-1.pptx fake news detection using machine learningmisbanausheenparvam
 
Internship report on mechanical engineering
Internship report on mechanical engineeringInternship report on mechanical engineering
Internship report on mechanical engineeringmalavadedarshan25
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
Introduction to Microprocesso programming and interfacing.pptx
Introduction to Microprocesso programming and interfacing.pptxIntroduction to Microprocesso programming and interfacing.pptx
Introduction to Microprocesso programming and interfacing.pptxvipinkmenon1
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
GDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSCAESB
 
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerAnamika Sarkar
 
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfCCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdf
CCS355 Neural Network & Deep Learning UNIT III notes and Question bank .pdfAsst.prof M.Gokilavani
 
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝
Model Call Girl in Narela Delhi reach out to us at 🔝8264348440🔝soniya singh
 
Oxy acetylene welding presentation note.
Oxy acetylene welding presentation note.Oxy acetylene welding presentation note.
Oxy acetylene welding presentation note.eptoze12
 
Call Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call GirlsCall Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call Girlsssuser7cb4ff
 
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVHARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVRajaP95
 
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...
Gfe Mayur Vihar Call Girls Service WhatsApp -> 9999965857 Available 24x7 ^ De...srsj9000
 
HARMONY IN THE HUMAN BEING - Unit-II UHV-2
HARMONY IN THE HUMAN BEING - Unit-II UHV-2HARMONY IN THE HUMAN BEING - Unit-II UHV-2
HARMONY IN THE HUMAN BEING - Unit-II UHV-2RajaP95
 

Recently uploaded (20)

Microscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxMicroscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptx
 
SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )SPICE PARK APR2024 ( 6,793 SPICE Models )
SPICE PARK APR2024 ( 6,793 SPICE Models )
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
 
chaitra-1.pptx fake news detection using machine learning
chaitra-1.pptx  fake news detection using machine learningchaitra-1.pptx  fake news detection using machine learning
chaitra-1.pptx fake news detection using machine learning
 
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Serviceyoung call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
 
Internship report on mechanical engineering
Internship report on mechanical engineeringInternship report on mechanical engineering
Internship report on mechanical engineering
 
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Meera Call 7001035870 Meet With Nagpur Escorts
 
Introduction to Microprocesso programming and interfacing.pptx
Introduction to Microprocesso programming and interfacing.pptxIntroduction to Microprocesso programming and interfacing.pptx
Introduction to Microprocesso programming and interfacing.pptx
 
🔝9953056974🔝!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...
🔝9953056974🔝!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...🔝9953056974🔝!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...
🔝9953056974🔝!!-YOUNG call girls in Rajendra Nagar Escort rvice Shot 2000 nigh...
 
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service NashikCollege Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
College Call Girls Nashik Nehal 7001305949 Independent Escort Service Nashik
 
GDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentation
 
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Performance testing checklist.pdf

Best Practices in Performance Testing

Table of Contents
I) Test Planning/Test Strategy
II) Scripting (JMeter Checklist)
III) Test Execution & Analysis
BEST PRACTICES IN PERFORMANCE TESTING

A performance tester should have information on all of the checklist items below, and should seek the necessary inputs and follow these diligently for successful performance testing.

I) Test Planning/Test Strategy

Checkpoints

Entry Criteria
- Have the project release timelines and scope been communicated? If not, please obtain the details of the project scope, the types of tests to be covered, the test cases/user traversal flows to be covered as part of scripting, and the execution timelines.
- Have the test cases/user traversal flows been identified from the Requirement Document, and is the screen flow for critical transactions available?
- Is the production mix data available?
- Are the application architecture, AppServer, WebServer, and database details available? Are architecture diagrams (logical and physical, for both production and test environments) and deployment diagrams available?

Test Plan or Strategy Document Contents - Checkpoints
- Have the release, project(s), and project manager(s) details been updated?
- Has the Revision History section been updated with the latest version number, date, originator, and modification details?
- Do all the locations in the reference documentation section (Requirement Document) link to the latest version of the respective documents?
- Has the Project Roles & Responsibilities section been updated with all the project managers' and lead engineers' names and phone numbers?
- Does the Document Approvals/Signoff section include the QA/client managers and relevant stakeholders involved in the current project release?
- Is the Introduction section (Purpose, Project Summary) updated with details from the Requirement Documents as applicable to the current project release?
- Is the correct System Architecture Diagram in place in the Architecture Overview section?
- Does your Test Strategy section cover all the different types of tests to be executed for the project/release?
- Have all the test scenarios to be scripted been detailed in the Test Cases section?
- Is the Test Cases section in the format discussed?
- Is the scope of testing in sync with the project's scope for the release (with respect to functionality, system upgrades, etc.)?
- Has the Assumptions/Constraints/Risks section been updated with proper data and ownership?
- Are the dates in the Deliverables section reflective of the current project release schedule, and is the right link provided to the test reports?
- Does the Project Requirements section have the right link to the baseline reports?
- Are the TPS requirements given accurately (2x, 3x, 4x) for the relevant sections (Capacity, Longevity, Max TPS, etc.)?
- Has the Test Dependencies section been updated with the latest production mix data and TPS values?
- Has the overall production mix % split been specified with accurate weightage in a multiple-scenario case?
- Are the data requirements (user IDs, etc.) given for all the applicable scenarios?
- Check whether a consolidated (transaction-level) form of the response time of the modules is needed; if not, avoid using it in the script. Otherwise, manual work will be required on the final report and the total row will need to be calculated manually.
- Are the scripting tool and monitoring tool specified, with the correct versions, in the Test Tools section?
- Have the AppServer, WebServer, DB server, etc., with sizing details, been provided in the Environment/Server List section as applicable?
- Has the Logs and Configurations section been updated with the application log, webserver log, and config file locations?
- Are all the tests mentioned in the Test Strategy section explained in the Performance Tests section, with details about purpose, objectives, etc.?
- Are the estimated/actual milestone dates for each phase of the project, with the performance test deliverables, mentioned accurately in the Performance Testing Schedule?
- Have the dates for the build schedule been updated in the Environment Schedule, if applicable?
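When translating the TPS requirements above (2x/3x/4x of the production baseline) into a load model, Little's Law is a common rule of thumb: concurrent users = throughput × (response time + think time). The sketch below illustrates the arithmetic; the baseline TPS, response time, and think time are illustrative numbers, not values from this document.

```python
import math

# Sizing a load model from TPS targets using Little's Law:
#   concurrent users = throughput (TPS) * (response time + think time)
# All numbers below are illustrative assumptions.

def users_needed(target_tps: float, resp_time_s: float, think_time_s: float) -> int:
    """Virtual users needed to sustain target_tps at the given timings."""
    return math.ceil(target_tps * (resp_time_s + think_time_s))

baseline_tps = 50.0          # hypothetical production baseline
for factor in (2, 3, 4):     # the 2x/3x/4x requirements from the test plan
    tps = baseline_tps * factor
    print(f"{factor}x -> {tps:.0f} TPS -> {users_needed(tps, 1.5, 3.0)} users")
```

With a 1.5 s response time and 3 s think time, doubling the TPS target doubles the user count; this is why the test plan should state the multiplier explicitly for each test type.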
Formatting Checks
- Is the TOC updated to reflect the correct page numbers/section names, and are the links in the TOC working correctly?
- Are the header/footer appropriately placed as specified?
- Has a spell check been performed?
- Is the indentation uniform throughout the document?
- Have page breaks been given appropriately at the start of new sections, if applicable?
- Is the font uniform throughout the document?
- Have the section headers been formatted appropriately so that they reflect properly in the TOC?
- Is the Test Plan named appropriately according to conventions, if applicable?
- Have the document properties settings been modified?
- Is the alignment proper throughout the document?
- Has a self-review been performed before sharing the Test Plan?

II) Scripting (JMeter Checklist)

Some general standards to follow in JMeter:

1. Keep scripts modular: Create separate scripts for different functionalities or test scenarios. This will make it easier to manage and maintain the scripts.
2. Use variables: Use variables for any test data that will change, such as user IDs, passwords, and URLs. This will make it easier to update the test data later on.
3. Use assertions: Use assertions to verify that the server's response is correct, for debugging purposes only. This will help you catch errors that occur during testing.
4. Use correlation: Correlate any dynamic values returned by the server, such as session IDs or CSRF tokens. This will ensure that subsequent requests are valid.
5. Use realistic load profiles: Use realistic load profiles to simulate the expected user behaviour. This will help you identify performance issues before they occur in production.
6. Use timers: Use timers to simulate user think time and pacing between requests. This will help you create more accurate load profiles. Choose the right combination of timers (Constant Timer, Gaussian Random Timer, etc.) depending on the arrival rates to be modelled.
7. Use CSV data sets: Use CSV Data Set Config elements to read test data from external files. This will make it easier to manage and maintain large amounts of test data. Ensure all CSV data files are kept in the bin folder of the JMeter instance, and that no full paths are used. This helps when moving across Windows/Mac/Linux instances.
8. Use non-GUI mode: Run JMeter in non-GUI mode to reduce the overhead on the system and improve performance.
9. Use distributed testing: Use distributed testing to simulate larger loads and distribute the load across multiple machines.
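The pacing mentioned in point 6 can be derived from the throughput target: each thread must start an iteration every threads / target TPS seconds for the whole thread group to produce the target rate. A minimal sketch of that arithmetic, with purely illustrative numbers:

```python
# Deriving per-thread pacing (point 6) from a throughput target.
# Each thread must start an iteration every threads/target_tps seconds
# for the thread group as a whole to produce target_tps transactions/s.
# All numbers are illustrative assumptions.

def pacing_delay(threads: int, target_tps: float, avg_iteration_s: float) -> float:
    """Timer delay to add after each iteration, in seconds (never negative)."""
    interval = threads / target_tps
    return max(0.0, interval - avg_iteration_s)

# 100 threads aiming at 20 TPS -> one iteration per thread every 5 s;
# if an iteration already takes 1.2 s, the timer should add 3.8 s.
print(round(pacing_delay(100, 20.0, 1.2), 1))  # -> 3.8
```

Note the clamp to zero: if the iteration itself already takes longer than the interval, no timer can recover the target rate, and more threads are needed instead.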
10. Use version control: Use version control to manage and maintain scripts over time. This will help you track changes and revert to previous versions if needed.
11. Enable "Delay Thread creation until needed" in the Thread Group.
12. Use the WorkBench for rough activity, debugging, and recording, so that it is not executed.
13. Use "URL Patterns to Exclude". Avoid relying on "URL Patterns to Include" {check this while recording in JMeter; Patterns to Include can give better control over the recording capture}. Always use the Recording Template, and capture the request/response in XML files in the View Results Tree. Do this recording at least twice to ensure no request/response captures are missed.
14. Avoid GUI mode for large loads. Do not use listeners that are heavy.
15. Avoid the View Results Tree and View Results in Table listeners while executing a test. Use them only for debugging.
16. Avoid the use of assertions in JMeter during load runs, as they are likely to be memory hoggers.
17. Use Loop Controllers for repeated samplers.
18. Use dynamic test data with CSV files.
19. For saving listener output, use CSV instead of XML. Disable XML output, as it is considered heavy.
20. Ensure all protocol, server, and port number entries are cleared in the sub-requests; only the HTTP Request Defaults element should have them. Note that HTTP Request Defaults should use variables, and no hard-coding should be permitted there. This helps when changing environments during the tests.
21. Ensure the Header Manager also has no hard-coded values.

Checkpoints

Recording Options
- Decide on the appropriate sampler for scripting.
- Uncheck the Auto Correlation option, if it is checked. Sometimes this may be required and is left to the judgment of the scripter {Auto Correlation is not ideally recommended}.
- Add appropriate comments, while recording or after recording, which explain the function of every transaction.
- Always add an elaborate description in the comments column, so that it is easy for anyone to follow or maintain the script in the future.
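The idea behind point 20, keeping protocol, host, and port in one variable-driven place so that switching environments is a one-line change, can be sketched as follows. The environment names and hosts are hypothetical; in JMeter itself the mapping would live in HTTP Request Defaults backed by properties or a CSV, not in script code.

```python
# Sketch of point 20: drive protocol/host/port from one variable-backed
# mapping so that switching environments changes a single value.
# Environment names and hosts below are made up for illustration.

ENVIRONMENTS = {
    "test": {"protocol": "https", "host": "test.example.com", "port": 8443},
    "prod": {"protocol": "https", "host": "www.example.com",  "port": 443},
}

def base_url(env: str) -> str:
    """Assemble the base URL for the chosen environment."""
    e = ENVIRONMENTS[env]
    return f"{e['protocol']}://{e['host']}:{e['port']}"

print(base_url("test"))  # -> https://test.example.com:8443
```

Every sampler then references only the assembled base, which is what clearing the per-request server fields achieves in JMeter.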
- Keep all the values to be input, with their field names, ready in another document (Notepad/WordPad/etc.). This prevents re-scripting the flow due to a wrong input or a forgotten value (in the case of complex/very large workflows).
- Decide the script name for the flow in advance.
Scripting Content
- Check whether the script starts with an appropriate header which contains the following information:
  • Copyright notice
  • Script Name
  • Author
  • Version information
  • Script Version
  • Modified Log
  • Associated files
  • Script Overview
  • Function TOC
  • [URL TOC]
- Use proper Hungarian notation for variable names.
- All transactions should have checkpoints to verify page/function accuracy.
- The checkpoints must be unique between pages and must distinguish one page from another.
- All action and transaction names should follow a standard format:
  ApplicationName_FunctionalityName_FunctionalityNumber_PageNumber_PageName/TransactionName
  For example, with application name = PI and 5 scripts (Associate, Supervisor, Workflow, QC and Index):
  PI_Associate_01_01_Process
  PI_01_PreLogin
  PI_02_Login
  PI_03_Logout
  PI_QC_04_02_WorkTab
  Following this convention, common transactions across the application, such as login and logout, have a common transaction name across all scripts.
- Always parameterize URL, UID, and PWD.
- PWD may need to be encrypted.
- Parameterize the following values in the script, depending upon the requirement:
  • Input data (user ID, password, date, timings, etc.)
  • Login server URL
  • Application server URL, content
  • Think time
  • Other applicable values
- Correlate the dynamic values in the script manually, using the Regular Expression Extractor.
- The best way to figure out correlatable values is by scripting twice with different inputs (including UID, PWD, and any other inputs that you feed into the system) and then comparing what changes. Anything that changes in the request and is not input by you requires correlation.
- There should not be any orphan requests in the script.
- Ensure that if multiple virtual users cannot log in with the same ID, the UID/PWD data file setting is set to "unique".
- Run the script for multiple iterations, with different data values in each iteration.
- Maintain a tabular structure that will help you analyze what went wrong while trying to replay.
- A script is COMPLETE only when it passes the conditions below, with all test data, standardizations, response handling, error handling, etc. incorporated.
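The correlation step above, pulling a dynamic value out of one response so the next request can send it back, is what the Regular Expression Extractor does. A minimal sketch of the same idea in plain regex; the HTML fragment and the token name are made up for illustration:

```python
import re

# Sketch of what a Regular Expression Extractor does: extract a dynamic
# value (here a hypothetical CSRF token) from one response so that the
# next request can send it back. The response body below is made up.

response_body = '<input type="hidden" name="csrf_token" value="a1b2c3d4">'

match = re.search(r'name="csrf_token" value="([^"]+)"', response_body)
token = match.group(1) if match else None   # group 1 corresponds to template $1$
print(token)  # -> a1b2c3d4
```

Scripting the flow twice and diffing the recorded requests, as recommended above, tells you which values need an extractor like this.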
Here, "multi-user" refers to multiple virtual users.

Script | Single User, Single Iteration | Single User, Multi Iteration | Multi-User, Multi Iteration
s1     | passed                        | failed                       | failed
s2     | passed                        | failed                       | failed
s3     |                               |                              |

Finally, all scripts should have "passed" status in all the columns of the table above.

- Understand the nature of users getting into the system and choose appropriate controllers for implementing login. For example, internet-facing end users are more likely to log in, perform operations, and log out; in this kind of scenario, use a Simple Controller for the login. Back-office users are likely to log in at their shift start time, perform operations, and finally log out when the shift ends; in this kind of scenario, use a Once Only Controller for the login, and the rest can be in a Loop Controller.
- Always use conditional controllers, such as If Controllers, to move to the next step only if the previous step's variables are available and populated properly. This helps minimize erroneous requests being sent when there is no proper data available from the previous response.
- For If Controllers, ensure "Interpret Condition as Variable Expression?" is deselected.
- Note that whenever a series of If Controllers is used, see that they are nested as deemed necessary for the situation.
- After the normal Thread Group, once multi-user tests pass, make use of the Ultimate Thread Group as needed.
- Trial the controls with Dummy Samplers, in terms of the flow of each request in the different flows, in case Throughput Controllers etc. are involved.
- Avoid using any Dummy Sampler in the script in the actual test runs; the Dummy Sampler's response time will add to the overall response time and skew the result. Prefer a Beanshell pre/post-processor; in the worst-case scenario, use a Beanshell sampler.
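The If Controller guard described above can be sketched as a simple predicate: proceed only when the variable extracted from the previous response is usable. The variable names are hypothetical, and "NOT_FOUND" here stands for a default value that a tester commonly configures on the extractor so that misses are detectable (an assumption, not a JMeter built-in).

```python
# Sketch of the If Controller guard: run the next sampler only when the
# variable set by the previous extractor is non-empty and not the
# extractor's configured miss default. Names are hypothetical.

def should_proceed(jmeter_vars: dict, name: str) -> bool:
    """True when the named variable holds a usable extracted value."""
    value = jmeter_vars.get(name, "")
    return value not in ("", "NOT_FOUND")

jmeter_vars = {"orderId": "12345"}               # populated by a previous extractor
print(should_proceed(jmeter_vars, "orderId"))    # -> True
print(should_proceed(jmeter_vars, "csrfToken"))  # -> False
```

This is the logic the If Controller condition encodes; keeping it this strict is what prevents a chain of erroneous requests after one failed extraction.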
- Before starting any scripting work, set the JMeter heap size to 80% of your physical memory, to avoid issues while running higher user counts.
- Always take a copy of your recordings and work on the copied set; keep all original sets in a secure place until the project ends.
III) Test Execution & Analysis

Entry Criteria
- The Test Plan/Strategy should be completed and signed off.
- Test data setup should be completed.
- Make sure that the application is functionally stable {ideally before the start of scripting}.
- Make sure that the scripts/scenarios have been "tested" to ensure no data/application/script issues exist.
- The scripts have been reviewed and signed off.
- The environment is available independently for performance testing, or as discussed in the Test Plan.

Execution
- Check that the duration of the test run is set up as per the requirement.
- Check that the runtime settings have been set up as per the requirement.
- Confirm the system utilization before commencement of the test; the utilization should be within the limits specified in the test plan.
- Confirm all the log levels are set at ERROR/INFO/etc. as per the scenario specifications.
- If the test requires specific configuration changes, ensure that the changes are made and documented.
- Make sure that the runtime settings are set appropriately and reviewed by another person.
- Check whether there is enough disk space in the server file systems and the database.
- Confirm that the load generator(s), and the machine where the results folder is set, have adequate disk space.
- Configure AppDynamics/New Relic/Wily Introscope/AWS CloudWatch or any other monitoring tool to collect data during the test run.
- For Java applications, use JVisualVM to monitor the heap and GC. You need to work with the dev team to open the ports for remote connection as a prerequisite.
- Monitor the required processes (prstat, vmstat, GC, netstat) and collect the data from the server while the test is running (if required, and if you have access to them).
- Take a thread dump if required.
- Also kill the process in case of any issue with the application.
- Monitor the test properly and check whether the requirements have been met.
- Before the actual execution, do a dummy run to see that all scripts are working and there are no issues with the user ramp-up.
- Once the scenario (for the test run) is ready, save it locally and also take a backup of it on another machine.
- Check the load generator connectivity to the application URL (whether any ports need to be opened) and the network utilization before commencement of the test.
- Remove the log dump on the server (if required).
- Check if there are any unnecessary processes running on the controller and load generators which are, or may be, occupying memory during the test.
- Ensure that there are no batch jobs or processes running on the server which may lead to incorrect results.
- Do not open any application on the controller or load generator when the test is running.
- Do not try to add too many virtual users while the test is running; this can lead to a LoadRunner crash.
- Turn on the snapshot log level for time-consuming scenarios (e.g., endurance).
- Look for any cold starts: take one API and request it at 10, 15, and 20 minute intervals, 5 to 6 times, and observe the latency. We should obtain roughly the same response time every time.

Analysis
- Analyze the application and web service logs for potential issues (connections, memory, CPU, etc.).
- Analyze the garbage collection logs for memory leaks / excessive collections.
- Create a template to save time when opening the Analysis file in the analyzer (e.g., the percentile value, excluding think time, and the graphs required).
- Set the granularity of graphs to improve readability.
- Change the X-axis and Y-axis values for graphs generated by the analyzer (if required).
- Confirm that the application processes have returned to normal after the test (check for hanging issues, if any).
- Check for transactions having response time(s) beyond acceptable limits.
- Report the DB bottlenecks by providing the top queries which are most time/resource consuming.
- Check the overall TPS achieved for the constant-load duration.
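The cold-start check listed under Execution above can be sketched as a small probe: invoke the same operation several times, spaced out, and record each latency. Here `call` stands in for a real API request and `interval_s` for the 10-20 minute gap; the demo workload is a stand-in, not a real request.

```python
import time

# Sketch of the cold-start check: call the same operation repeatedly,
# spaced by interval_s, and record each latency. `call` is a stand-in
# for a real API request; intervals and workload are hypothetical.

def measure_latencies(call, repeats: int, interval_s: float) -> list:
    """Return elapsed seconds for each of `repeats` invocations of `call`."""
    latencies = []
    for i in range(repeats):
        start = time.perf_counter()
        call()
        latencies.append(time.perf_counter() - start)
        if i < repeats - 1:
            time.sleep(interval_s)
    return latencies

# Demo with a stand-in workload; a real check would issue an HTTP request,
# use interval_s of 600-1200 seconds, and then compare min vs. max latency.
samples = measure_latencies(lambda: sum(range(10_000)), repeats=3, interval_s=0.0)
print(len(samples), all(t >= 0 for t in samples))
```

If the first sample is consistently much slower than the rest, the application is paying a cold-start cost (lazy initialization, JIT warm-up, idle connection pools) that the test plan should account for.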
- Analyze the system- and application-level process CPU utilization to confirm it is within permissible limits.
- If the error rate is greater than acceptable limits (e.g., 5%), determine whether the problem falls into one of these categories: environment, application, or script.
- Save the Analysis file in .JTL/.CSV format to save time for future use.
- Once the template is applied and the required graphs are generated, save the report in HTML or Word format for reference.
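Several of the analysis checks above (overall TPS, error rate, percentile response times) can be computed directly from the JTL results file. The sketch below uses a simplified, hypothetical three-column JTL; real files carry more columns (label, threadName, responseCode, ...), and the nearest-rank percentile shown is one of several common definitions.

```python
import csv
import io
import math

# Sketch of post-run analysis on a JMeter JTL (CSV) results file:
# overall TPS, error rate, and a nearest-rank 90th-percentile elapsed
# time. The three-column layout below is simplified and hypothetical.

sample_jtl = """timeStamp,elapsed,success
1000,120,true
1200,80,true
1500,300,false
2000,90,true
3000,110,true
"""

rows = list(csv.DictReader(io.StringIO(sample_jtl)))
elapsed = sorted(int(r["elapsed"]) for r in rows)
duration_s = (3000 - 1000) / 1000.0               # last - first timestamp, ms -> s
tps = len(rows) / duration_s
error_rate = sum(r["success"] == "false" for r in rows) / len(rows)
p90 = elapsed[math.ceil(0.9 * len(elapsed)) - 1]  # nearest-rank percentile

print(f"TPS={tps:.1f} errors={error_rate:.0%} p90={p90}ms")
```

A small script like this doubles as the "template" mentioned above: it gives the same percentile and error-rate cut on every run, independent of the analyzer GUI.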