Software Testing - ISTQB Specialist
Performance Tester Exam Preparation
Chapter 2: Performance Measurement Fundamentals
Neeraj Kumar Singh

1 Basic Concepts
2 Performance Measurements
3 Performance Testing in SDLC
4 Performance Testing Tasks
5 Tools
Performance Measurement Fundamentals
Contents
2.1 Typical Metrics Collected in Performance Testing
2.2 Aggregating Results from Performance Testing
2.3 Key Sources of Performance Metrics
2.4 Typical Results of a Performance Test
Performance Measurement Fundamentals
Contents
2.1.1 Why Performance Metrics are Needed
2.1.2 Collecting Performance Measurements and Metrics
2.1.3 Selecting Performance Metrics
Typical Metrics Collected in Performance Testing
Why Performance Metrics are Needed
Accurate measurements, and the metrics derived from those measurements, are essential for defining the goals of performance testing and for evaluating its results. Performance testing should not be undertaken without first understanding which measurements and metrics are needed. The following project risks apply if this advice is ignored:
• It is unknown whether the levels of performance are acceptable to meet operational objectives
• The performance requirements are not defined in measurable terms
• It may not be possible to identify trends that could predict lower levels of performance
• The actual results of a performance test cannot be evaluated by comparing them to a baseline set of performance measures that defines acceptable and/or unacceptable performance
• Performance test results are evaluated based on the subjective opinion of one or more people
• The results provided by a performance test tool are not understood
Typical Metrics Collected in Performance Testing
Collecting Performance Measurements and Metrics
As with any form of measurement, it is possible to obtain and express metrics in precise ways. Therefore, any of the metrics and measurements described in this section can and should be defined to be meaningful in a particular context. This is a matter of performing initial tests and learning which metrics need to be further refined and which need to be added.
For example, the metric of response time will likely be included in any set of performance metrics. However, to be meaningful and actionable, the response time metric will need to be further defined in terms of time of day, number of concurrent users, the amount of data being processed, and so forth.
The metrics collected in a specific performance test will vary based on the:
• business context (business processes, customer and user behavior, and stakeholder expectations)
• operational context (technology and how it is used)
• test objectives
For example, the metrics chosen for the performance testing of an international e-commerce website will differ from those chosen for the performance testing of an embedded system used to control medical device functionality.
A common way to categorize performance measurements and metrics is to consider the technical environment, business environment, or operational environment in which the assessment of performance is needed.
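To illustrate, the response time metric above only becomes actionable once each measurement is grouped by its context, such as time of day and number of concurrent users. A minimal sketch in Python, using hypothetical sample data:

```python
from statistics import mean

# Hypothetical raw samples: (hour_of_day, concurrent_users, response_time_ms)
samples = [
    (9, 50, 120.0), (9, 50, 180.0), (9, 50, 150.0),
    (14, 200, 310.0), (14, 200, 450.0), (14, 200, 380.0),
]

# Group raw measurements by context so the derived metric stays meaningful
by_context = {}
for hour, users, rt in samples:
    by_context.setdefault((hour, users), []).append(rt)

for (hour, users), times in sorted(by_context.items()):
    print(f"{hour:02d}:00, {users} users: "
          f"mean={mean(times):.0f} ms, worst={max(times):.0f} ms")
```

The same raw numbers yield very different mean response times per context, which is why an ungrouped average would hide the peak-load behavior.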
Typical Metrics Collected in Performance Testing
Collecting Performance Measurements and Metrics
The categories of measurements and metrics included below are the ones commonly obtained from performance
testing.
Technical Environment
Performance metrics will vary by the type of the technical environment, as shown in the following list:
• Web-based
• Mobile
• Internet-of-Things (IoT)
• Desktop client devices
• Server-side processing
• Mainframe
• Databases
• Networks
• The nature of software running in the environment (e.g., embedded)
Typical Metrics Collected in Performance Testing
Collecting Performance Measurements and Metrics
The metrics include the following:
• Response time (e.g., per transaction, per concurrent user, page load times)
• Resource utilization (e.g., CPU, memory, network bandwidth, network latency, available disk space, I/O rate, idle and busy threads)
• Throughput rate of key transactions (i.e., the number of transactions that can be processed in a given period of time)
• Batch processing time (e.g., wait times, throughput times, database response times, completion times)
• Number of errors impacting performance
• Completion time (e.g., for creating, reading, updating, and deleting data)
• Background load on shared resources (especially in virtualized environments)
• Software metrics (e.g., code complexity)
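As one concrete case, the throughput rate in the list above can be derived from transaction completion timestamps. A minimal sketch with hypothetical data:

```python
# Hypothetical completion timestamps (seconds since test start) for one key transaction
completions = [0.4, 0.9, 1.1, 1.8, 2.2, 2.5, 3.1, 3.9, 4.4, 4.8]

# Overall throughput: transactions per second across the measured interval
overall = len(completions) / (max(completions) - min(completions))

# Per-second windows reveal whether throughput is steady or bursty
buckets = {}
for t in completions:
    buckets[int(t)] = buckets.get(int(t), 0) + 1

print(f"overall throughput: {overall:.2f} tx/s")
for sec in sorted(buckets):
    print(f"  window {sec}-{sec + 1}s: {buckets[sec]} tx")
```

Real performance test tools compute such rates automatically; the sketch only shows what "number of transactions in a given period of time" means in practice.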
Typical Metrics Collected in Performance Testing
Collecting Performance Measurements and Metrics
Business Environment
From the business or functional perspective, performance metrics may include the following:
• Business process efficiency (e.g., the speed of performing an overall business process including normal, alternate, and exceptional use case flows)
• Throughput of data, transactions, and other units of work performed (e.g., orders processed per hour, data rows added per minute)
• Service Level Agreement (SLA) compliance or violation rates (e.g., SLA violations per unit of time)
• Scope of usage (e.g., percentage of global or national users conducting tasks at a given time)
• Concurrency of usage (e.g., the number of users concurrently performing a task)
• Timing of usage (e.g., the number of orders processed during peak load times)
Typical Metrics Collected in Performance Testing
Collecting Performance Measurements and Metrics
Operational Environment
The operational aspect of performance testing focuses on tasks that are generally not considered to be user-facing
in nature. These include the following:
• Operational processes (e.g., the time required for environment start-up, backups, shutdown, and resumption times)
• System restoration (e.g., the time required to restore data from a backup)
• Alerts and warnings (e.g., the time needed for the system to issue an alert or warning)
Typical Metrics Collected in Performance Testing
Selecting Performance Metrics
It should be noted that collecting more metrics than required is not necessarily a good thing. Each metric chosen requires a means for consistent collection and reporting. It is important to define an obtainable set of metrics that supports the performance test objectives.
For example, the Goal-Question-Metric (GQM) approach is a helpful way to align metrics with performance goals. The idea is to first establish the goals, then ask questions that determine whether the goals have been achieved. Metrics are associated with each question to ensure the answer to the question is measurable.
It should be noted that the GQM approach doesn't always fit the performance testing process. For example, some metrics represent a system's health and are not directly linked to goals.
It is important to realize that, after the definition and capture of initial measurements, further measurements and metrics may be needed to understand true performance levels and to determine where corrective actions may be needed.
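The GQM chain described above can be represented directly: goals map to questions, and each question carries at least one metric that makes its answer measurable. A minimal sketch in which the goal, question, and metric names are all illustrative:

```python
# Illustrative GQM mapping; the goal, question, and metric names are hypothetical
gqm = {
    "Checkout responds quickly under peak load": {
        "Is the 90th-percentile response time within the agreed limit?":
            ["p90_response_time_ms"],
        "Does response time degrade as concurrent users increase?":
            ["response_time_per_user_level"],
    },
    "The system sustains the expected order volume": {
        "How many orders are processed per hour at peak?":
            ["orders_per_hour_at_peak"],
    },
}

# Each question must be backed by at least one metric to be answerable
for goal, questions in gqm.items():
    for question, metrics in questions.items():
        assert metrics, f"unmeasurable question under goal '{goal}': {question}"
```

Walking the structure goal by goal makes gaps visible: a question without a metric is exactly the "not defined in measurable terms" risk noted earlier.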
Performance Measurement Fundamentals
Aggregating Results from Performance Testing
The purpose of aggregating performance metrics is to be able to understand and express them in a way that
accurately conveys the total picture of system performance. When performance metrics are viewed at only the
detailed level, drawing the right conclusion may be difficult—especially for business stakeholders.
For many stakeholders, the main concern is that the response time of a system, web site, or other test object is
within acceptable limits.
Once a deeper understanding of the performance metrics has been achieved, the metrics can be aggregated so that:
• Business and project stakeholders can see the "big picture" status of system performance
• Performance trends can be identified
• Performance metrics can be reported in an understandable way
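Aggregation can be sketched as reducing raw per-transaction samples to a small summary that stakeholders can read at a glance; the response times here are hypothetical:

```python
from statistics import mean, median

# Hypothetical per-transaction response times (ms) from two test runs
runs = {
    "run_1": [110, 140, 135, 500, 125],
    "run_2": [115, 150, 145, 130, 520],
}

# Reduce detailed samples to a per-run summary suitable for reporting
summaries = {}
for name, times in runs.items():
    s = sorted(times)
    p95 = s[max(0, int(round(0.95 * len(s))) - 1)]  # simple nearest-rank percentile
    summaries[name] = {
        "mean_ms": round(mean(times)),
        "median_ms": median(times),
        "p95_ms": p95,
        "max_ms": s[-1],
    }

for name, summary in summaries.items():
    print(name, summary)
```

Note how the median and the high percentiles tell different stories: a single slow outlier barely moves the median but dominates the p95, which is why a summary should carry both.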
Performance Measurement Fundamentals
Key Sources of Performance Metrics
Metrics collection should have no more than a minimal impact on system performance (an impact known as the "probe effect"). In addition, the volume, accuracy, and speed with which performance metrics must be collected makes tool usage a requirement. While the combined use of tools is not uncommon, it can introduce redundancy in the usage of test tools and other problems.
There are three key sources of performance metrics:
Performance Test Tools
All performance test tools provide measurements and metrics as the result of a test. Tools may vary in the number of metrics shown, the way in which the metrics are shown, and the ability for the user to customize the metrics to a particular situation.
Some tools collect and display performance metrics in text format, while more robust tools collect and display performance metrics graphically in a dashboard format. Many tools offer the ability to export metrics to facilitate test evaluation and reporting.
Performance Measurement Fundamentals
Key Sources of Performance Metrics
Performance Monitoring Tools
Performance monitoring tools are often employed to supplement the reporting capabilities of performance test
tools. In addition, monitoring tools may be used to monitor system performance on an ongoing basis and to alert
system administrators to lowered levels of performance and higher levels of system errors and alerts. These tools
may also be used to detect and notify in the event of suspicious behavior (such as denial of service and distributed denial of service attacks).
Log Analysis Tools
There are tools that scan server logs and compile metrics from them. Some of these tools can create charts to provide a graphical view of the data. Errors, alerts, and warnings are normally recorded in server logs. These include:
• High resource usage, such as high CPU utilization, high levels of disk storage consumed, and insufficient bandwidth
• Memory errors and warnings, such as memory exhaustion
• Deadlocks and multi-threading problems, especially when performing database operations
• Database errors, such as SQL exceptions and SQL timeouts
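The core task of a log analysis tool can be sketched as scanning log lines and compiling counts of the error and warning categories listed above. The log format and messages here are hypothetical, since real formats vary by platform:

```python
import re
from collections import Counter

# Hypothetical server log lines; real log formats vary by platform
log_lines = [
    "2024-05-01 10:02:11 WARN  cpu utilization at 92%",
    "2024-05-01 10:02:15 ERROR SQLTimeoutException: query exceeded 30s",
    "2024-05-01 10:03:40 ERROR OutOfMemoryError: heap exhausted",
    "2024-05-01 10:04:02 WARN  disk usage at 97%",
    "2024-05-01 10:04:30 ERROR SQLTimeoutException: query exceeded 30s",
]

# Count entries by severity to summarize the log
severity = Counter()
for line in log_lines:
    m = re.search(r"\b(ERROR|WARN)\b", line)
    if m:
        severity[m.group(1)] += 1

print(dict(severity))
```

Grouping by message text instead of severity would reveal recurring problems (e.g., the repeated SQL timeout), which is the kind of chartable summary such tools provide.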
Performance Measurement Fundamentals
Typical Results of a Performance Test
In functional testing, particularly when verifying specified functional requirements or functional elements of user
stories, the expected results usually can be defined clearly and the test results interpreted to determine if the test
passed or failed. For example, a monthly sales report shows either a correct or an incorrect total.
Whereas tests that verify functional suitability often benefit from well-defined test oracles, performance testing often lacks this source of information. Not only are stakeholders often unable to articulate performance requirements, but many business analysts and product owners also struggle to elicit such requirements. Testers often receive limited guidance to define the expected test results.
When evaluating performance test results, it is important to look at the results closely. Initial raw results can be misleading, with performance failures hidden beneath apparently good overall results. For example, resource utilization may be well under 75% for all key potential bottleneck resources, while the throughput or response time of key transactions or use cases is an order of magnitude too slow.
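The evaluation pitfall above can be made concrete: a verdict must check every agreed measure, not just the headline resource figures. A minimal sketch with hypothetical results and thresholds:

```python
# Hypothetical measured results and agreed thresholds for a performance test
results = {"cpu_pct": 62, "memory_pct": 70, "p90_response_ms": 4800}
thresholds = {"cpu_pct": 75, "memory_pct": 75, "p90_response_ms": 500}

# A run passes only if every metric is within its threshold;
# healthy resource utilization alone does not make a pass
failures = {k: v for k, v in results.items() if v > thresholds[k]}
verdict = "PASS" if not failures else f"FAIL: {failures}"
print(verdict)
```

Here CPU and memory utilization look comfortable, yet the response time is roughly ten times its threshold, so the run fails, mirroring the example in the text.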
Performance Measurement Fundamentals
Sample Questions Pattern
Source: istqb.org
1. Which of the following is a true statement regarding tracking metrics for network latency during a performance
test?
Select ONE option.
Answer Set
a. High latency could indicate a network bandwidth problem that could negatively impact performance
b. Low latency could indicate a network bandwidth problem that could negatively impact performance
c. Network latency is difficult to track and should not be included in the performance metrics
d. Network latency is too variable to be useful during performance tuning
Performance Measurement Fundamentals
Sample Question
2. Should performance test results be aggregated?
Select ONE option.
Answer Set
a. Yes, this gives a better overall picture of the performance of the system and helps to identify trends
b. Yes, this is the best way to focus on the outliers in the performance metrics
c. No, the results should be analyzed individually so that all variations are understood
d. No, the results from each test should be reported and tracked separately
Performance Measurement Fundamentals
Sample Question
3. Which of the following is a failure that would typically be found by conducting a spike test?
Select ONE option.
Answer Set
a. The system performance gradually degrades
b. The system provides inconsistent responses to errors
c. The system handles a sudden burst of activity, but can’t resume a steady state
d. The system performs well for the expected load, but can’t scale to a larger load
 
Gen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfGen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfAddepto
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions
 
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024BookNet Canada
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brandgvaughan
 
Artificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning eraArtificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning eraDeakin University
 
SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024Lorenzo Miniero
 
Science&tech:THE INFORMATION AGE STS.pdf
Science&tech:THE INFORMATION AGE STS.pdfScience&tech:THE INFORMATION AGE STS.pdf
Science&tech:THE INFORMATION AGE STS.pdfjimielynbastida
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr LapshynFwdays
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...shyamraj55
 

Recently uploaded (20)

Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
APIForce Zurich 5 April Automation LPDG
APIForce Zurich 5 April  Automation LPDGAPIForce Zurich 5 April  Automation LPDG
APIForce Zurich 5 April Automation LPDG
 
Vertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering TipsVertex AI Gemini Prompt Engineering Tips
Vertex AI Gemini Prompt Engineering Tips
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024
 
Vulnerability_Management_GRC_by Sohang Sengupta.pptx
Vulnerability_Management_GRC_by Sohang Sengupta.pptxVulnerability_Management_GRC_by Sohang Sengupta.pptx
Vulnerability_Management_GRC_by Sohang Sengupta.pptx
 
Build your next Gen AI Breakthrough - April 2024
Build your next Gen AI Breakthrough - April 2024Build your next Gen AI Breakthrough - April 2024
Build your next Gen AI Breakthrough - April 2024
 
Pigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping ElbowsPigging Solutions Piggable Sweeping Elbows
Pigging Solutions Piggable Sweeping Elbows
 
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
Tech-Forward - Achieving Business Readiness For Copilot in Microsoft 365
 
Gen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdfGen AI in Business - Global Trends Report 2024.pdf
Gen AI in Business - Global Trends Report 2024.pdf
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food Manufacturing
 
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brand
 
Hot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort Service
Hot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort ServiceHot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort Service
Hot Sexy call girls in Panjabi Bagh 🔝 9953056974 🔝 Delhi escort Service
 
Artificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning eraArtificial intelligence in the post-deep learning era
Artificial intelligence in the post-deep learning era
 
SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024SIP trunking in Janus @ Kamailio World 2024
SIP trunking in Janus @ Kamailio World 2024
 
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
Science&tech:THE INFORMATION AGE STS.pdf
Science&tech:THE INFORMATION AGE STS.pdfScience&tech:THE INFORMATION AGE STS.pdf
Science&tech:THE INFORMATION AGE STS.pdf
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
 

Chapter 2 - Performance Measurement Fundamentals

  • 1. Performance Measurement Fundamentals
    Software Testing - ISTQB Specialist Performance Tester Exam Preparation, Chapter 2
    Neeraj Kumar Singh
    1 Basic Concepts | 2 Performance Measurements | 3 Performance Testing in SDLC | 4 Performance Testing Tasks | 5 Tools
  • 2. Performance Measurement Fundamentals - Contents
    2.1 Typical Metrics Collected in Performance Testing
    2.2 Aggregating Results from Performance Testing
    2.3 Key Sources of Performance Metrics
    2.4 Typical Results of a Performance Test
  • 3. Performance Measurement Fundamentals - Contents
    2.1.1 Why Performance Metrics are Needed
    2.1.2 Collecting Performance Measurements and Metrics
    2.1.3 Selecting Performance Metrics
  • 4. Typical Metrics Collected in Performance Testing - Why Performance Metrics are Needed
    Accurate measurements, and the metrics derived from them, are essential for defining the goals of performance testing and for evaluating its results. Performance testing should not be undertaken without first understanding which measurements and metrics are needed. Ignoring this advice creates the following project risks:
    - It is unknown whether the levels of performance are acceptable to meet operational objectives
    - The performance requirements are not defined in measurable terms
    - It may not be possible to identify trends that predict lower levels of performance
    - The actual results of a performance test cannot be evaluated against a baseline set of performance measures that defines acceptable and/or unacceptable performance
    - Performance test results are evaluated based on the subjective opinion of one or more people
    - The results provided by a performance test tool are not understood
  • 5. Performance Measurement Fundamentals - Contents
    2.1.1 Why Performance Metrics are Needed
    2.1.2 Collecting Performance Measurements and Metrics
    2.1.3 Selecting Performance Metrics
  • 6. Typical Metrics Collected in Performance Testing - Collecting Performance Measurements and Metrics
    As with any form of measurement, metrics can be obtained and expressed in precise ways. Any of the metrics and measurements described in this section therefore can and should be defined to be meaningful in a particular context. This is a matter of performing initial tests and learning which metrics need further refinement and which need to be added. For example, response time will likely appear in any set of performance metrics. However, to be meaningful and actionable, the response time metric must be further qualified in terms of time of day, number of concurrent users, amount of data being processed, and so forth.
    The metrics collected in a specific performance test will vary based on the:
    - business context (business processes, customer and user behavior, and stakeholder expectations)
    - operational context (technology and how it is used)
    - test objectives
    For example, the metrics chosen for the performance testing of an international e-commerce website will differ from those chosen for the performance testing of an embedded system used to control medical device functionality. A common way to categorize performance measurements and metrics is by the technical environment, business environment, or operational environment in which the assessment of performance is needed.
  • 7. Typical Metrics Collected in Performance Testing - Collecting Performance Measurements and Metrics
    The categories of measurements and metrics included below are the ones commonly obtained from performance testing.
    Technical Environment
    Performance metrics will vary by the type of the technical environment, as shown in the following list:
    - Web-based
    - Mobile
    - Internet-of-Things (IoT)
    - Desktop client devices
    - Server-side processing
    - Mainframe
    - Databases
    - Networks
    - The nature of software running in the environment (e.g., embedded)
  • 8. Typical Metrics Collected in Performance Testing - Collecting Performance Measurements and Metrics
    The metrics include the following:
    - Response time (e.g., per transaction, per concurrent user, page load times)
    - Resource utilization (e.g., CPU, memory, network bandwidth, network latency, available disk space, I/O rate, idle and busy threads)
    - Throughput rate of key transactions (i.e., the number of transactions that can be processed in a given period of time)
    - Batch processing time (e.g., wait times, throughput times, database response times, completion times)
    - Numbers of errors impacting performance
    - Completion time (e.g., for creating, reading, updating, and deleting data)
    - Background load on shared resources (especially in virtualized environments)
    - Software metrics (e.g., code complexity)
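    As an illustration (not part of the syllabus), metrics such as mean response time, a high percentile, and throughput could be derived from raw timing samples roughly as sketched below. All transaction names, timings, and the window length are made-up assumptions.

```python
from statistics import mean, quantiles

# Hypothetical raw measurements: (transaction name, response time in seconds).
samples = [("login", 0.42), ("login", 0.55), ("search", 1.10),
           ("search", 0.95), ("login", 0.48), ("search", 1.30)]
test_duration_s = 60.0  # assumed length of the measurement window

times = [t for _, t in samples]
throughput = len(samples) / test_duration_s           # transactions per second
p90 = quantiles(times, n=10, method="inclusive")[-1]  # 90th percentile

print(f"mean response time : {mean(times):.3f} s")
print(f"90th percentile    : {p90:.3f} s")
print(f"throughput         : {throughput:.2f} tx/s")
```

    Percentiles are usually more informative than the mean here, because a few very slow transactions can hide behind an acceptable average.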
  • 9. Typical Metrics Collected in Performance Testing - Collecting Performance Measurements and Metrics
    Business Environment
    From the business or functional perspective, performance metrics may include the following:
    - Business process efficiency (e.g., the speed of performing an overall business process including normal, alternate and exceptional use case flows)
    - Throughput of data, transactions, and other units of work performed (e.g., orders processed per hour, data rows added per minute)
    - Service Level Agreement (SLA) compliance or violation rates (e.g., SLA violations per unit of time)
    - Scope of usage (e.g., percentage of global or national users conducting tasks at a given time)
    - Concurrency of usage (e.g., the number of users concurrently performing a task)
    - Timing of usage (e.g., the number of orders processed during peak load times)
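    To make the SLA metric concrete, a compliance and violation rate could be computed from per-transaction results roughly like this. The SLA limit, timestamps, and response times below are illustrative assumptions, not values from the syllabus.

```python
# Hypothetical per-transaction results: (timestamp in s, response time in s).
# Assumed SLA: each transaction must complete within 2.0 seconds.
results = [(0, 1.2), (10, 2.5), (20, 1.8), (30, 0.9), (40, 3.1), (50, 1.1)]

sla_limit_s = 2.0
violations = [r for r in results if r[1] > sla_limit_s]
compliance = 1 - len(violations) / len(results)

# Violation rate per unit of time (here: per minute, over a 60 s window).
violations_per_minute = len(violations) / (60.0 / 60.0)
print(f"SLA compliance: {compliance:.1%}, "
      f"{violations_per_minute:.0f} violations/min")
```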
  • 10. Typical Metrics Collected in Performance Testing - Collecting Performance Measurements and Metrics
    Operational Environment
    The operational aspect of performance testing focuses on tasks that are generally not considered to be user-facing in nature. These include the following:
    - Operational processes (e.g., the time required for environment start-up, backups, shutdown and resumption times)
    - System restoration (e.g., the time required to restore data from a backup)
    - Alerts and warnings (e.g., the time needed for the system to issue an alert or warning)
  • 11. Performance Measurement Fundamentals - Contents
    2.1.1 Why Performance Metrics are Needed
    2.1.2 Collecting Performance Measurements and Metrics
    2.1.3 Selecting Performance Metrics
  • 12. Typical Metrics Collected in Performance Testing - Selecting Performance Metrics
    Collecting more metrics than required is not necessarily a good thing: each chosen metric requires a means for consistent collection and reporting. It is important to define an obtainable set of metrics that supports the performance test objectives. For example, the Goal-Question-Metric (GQM) approach is a helpful way to align metrics with performance goals. The idea is to first establish the goals, then ask questions that determine when the goals have been achieved. Metrics are associated with each question to ensure the answer to the question is measurable. Note that the GQM approach does not always fit the performance testing process; for example, some metrics represent a system's overall health and are not directly linked to goals. It is also important to realize that after the definition and capture of initial measurements, further measurements and metrics may be needed to understand true performance levels and to determine where corrective actions may be needed.
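    The GQM idea above can be sketched as a simple data structure: each goal is broken into questions, and each question is answered by one or more measurable metrics. Every goal, question, and metric name below is an illustrative assumption.

```python
# Minimal Goal-Question-Metric (GQM) sketch: goals drive questions,
# and each question must be answerable by at least one metric.
gqm = {
    "goal": "Checkout remains responsive under peak load",
    "questions": [
        {
            "question": "Is checkout response time acceptable at peak?",
            "metrics": ["90th percentile checkout response time",
                        "checkout error rate under peak load"],
        },
        {
            "question": "Does the system sustain the expected throughput?",
            "metrics": ["orders processed per hour at peak"],
        },
    ],
}

# Sanity check: no question may be left without a measurable answer.
assert all(q["metrics"] for q in gqm["questions"])
print(f"{len(gqm['questions'])} questions defined for goal: {gqm['goal']}")
```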
  • 13. Performance Measurement Fundamentals - Contents
    2.1 Typical Metrics Collected in Performance Testing
    2.2 Aggregating Results from Performance Testing
    2.3 Key Sources of Performance Metrics
    2.4 Typical Results of a Performance Test
  • 14. Performance Measurement Fundamentals - Aggregating Results from Performance Testing
    The purpose of aggregating performance metrics is to understand and express them in a way that accurately conveys the total picture of system performance. When performance metrics are viewed only at the detailed level, drawing the right conclusion may be difficult, especially for business stakeholders. For many stakeholders, the main concern is that the response time of a system, website, or other test object is within acceptable limits. Once a deeper understanding of the performance metrics has been achieved, the metrics can be aggregated so that:
    - Business and project stakeholders can see the "big picture" status of system performance
    - Performance trends can be identified
    - Performance metrics can be reported in an understandable way
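    As a sketch of such aggregation, detailed samples can be rolled up into one summary row per transaction, which is the kind of "big picture" view stakeholders typically want. The transaction names and timings are made-up assumptions.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical detailed results: (transaction name, response time in seconds).
raw = [("login", 0.4), ("login", 0.6), ("search", 1.2),
       ("search", 1.0), ("checkout", 2.1), ("checkout", 1.9)]

# Group the individual samples by transaction name.
by_tx = defaultdict(list)
for name, t in raw:
    by_tx[name].append(t)

# One aggregated summary line per transaction instead of raw samples.
for name, times in sorted(by_tx.items()):
    print(f"{name:8s} n={len(times)}  "
          f"mean={mean(times):.2f}s  max={max(times):.2f}s")
```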
  • 15. Performance Measurement Fundamentals - Contents
    2.1 Typical Metrics Collected in Performance Testing
    2.2 Aggregating Results from Performance Testing
    2.3 Key Sources of Performance Metrics
    2.4 Typical Results of a Performance Test
  • 16. Performance Measurement Fundamentals - Key Sources of Performance Metrics
    System performance should be no more than minimally impacted by the metrics collection effort (known as the "probe effect"). In addition, the volume, accuracy, and speed with which performance metrics must be collected make tool usage a requirement. While the combined use of tools is not uncommon, it can introduce redundancy in the usage of test tools and other problems. There are three key sources of performance metrics:
    Performance Test Tools
    All performance test tools provide measurements and metrics as the result of a test. Tools may vary in the number of metrics shown, the way in which the metrics are shown, and the ability for the user to customize the metrics to a particular situation. Some tools collect and display performance metrics in text format, while more robust tools collect and display performance metrics graphically in a dashboard format. Many tools offer the ability to export metrics to facilitate test evaluation and reporting.
  • 17. Performance Measurement Fundamentals - Key Sources of Performance Metrics
    Performance Monitoring Tools
    Performance monitoring tools are often employed to supplement the reporting capabilities of performance test tools. In addition, monitoring tools may be used to monitor system performance on an ongoing basis and to alert system administrators to lowered levels of performance and higher levels of system errors and alerts. These tools may also be used to detect and report suspicious behavior (such as denial-of-service and distributed denial-of-service attacks).
    Log Analysis Tools
    There are tools that scan server logs and compile metrics from them. Some of these tools can create charts to provide a graphical view of the data. Errors, alerts, and warnings are normally recorded in server logs. These include:
    - High resource usage, such as high CPU utilization, high levels of disk storage consumed, and insufficient bandwidth
    - Memory errors and warnings, such as memory exhaustion
    - Deadlocks and multi-threading problems, especially when performing database operations
    - Database errors, such as SQL exceptions and SQL timeouts
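    The core of such log analysis can be sketched as a severity count over log lines. The log format below is a made-up assumption; real server log formats vary widely, and production log analysis tools handle far more than this.

```python
import re
from collections import Counter

# Hypothetical server log lines (format is an assumption for illustration).
log_lines = [
    "2024-04-01 10:00:01 ERROR SQLTimeout on orders query",
    "2024-04-01 10:00:02 WARN  CPU utilization 97%",
    "2024-04-01 10:00:03 INFO  request served in 120 ms",
    "2024-04-01 10:00:04 ERROR OutOfMemory in worker-3",
]

# Count log entries by severity to surface errors and warnings.
severity = Counter()
for line in log_lines:
    m = re.search(r"\b(ERROR|WARN|INFO)\b", line)
    if m:
        severity[m.group(1)] += 1

print(dict(severity))  # e.g., how many ERROR vs. WARN entries appeared
```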
  • 18. Performance Measurement Fundamentals - Contents
    2.1 Typical Metrics Collected in Performance Testing
    2.2 Aggregating Results from Performance Testing
    2.3 Key Sources of Performance Metrics
    2.4 Typical Results of a Performance Test
  • 19. Performance Measurement Fundamentals - Typical Results of a Performance Test
    In functional testing, particularly when verifying specified functional requirements or functional elements of user stories, the expected results usually can be defined clearly and the test results interpreted to determine whether the test passed or failed. For example, a monthly sales report shows either a correct or an incorrect total. Whereas tests that verify functional suitability often benefit from well-defined test oracles, performance testing often lacks this source of information. Stakeholders are often poor at articulating performance requirements, and many business analysts and product owners are equally poor at eliciting them. As a result, testers often receive limited guidance for defining the expected test results.
    When evaluating performance test results, it is important to look at the results closely. Initial raw results can be misleading, with performance failures hidden beneath apparently good overall results. For example, resource utilization may be well under 75% for all key potential bottleneck resources, while the throughput or response time of key transactions or use cases is an order of magnitude too slow.
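    That last scenario can be sketched as a simple pass/fail check against explicit objectives: overall resource utilization looks healthy, yet one key transaction is far too slow. The objectives, transaction names, and observed values below are illustrative assumptions.

```python
# Assumed performance objectives (the test oracle that is often missing).
objectives = {"cpu_utilization_max": 0.75, "response_time_max_s": 2.0}

# Hypothetical observed results: utilization looks fine, but the "report"
# transaction is an order of magnitude slower than the objective allows.
observed = {
    "cpu_utilization": 0.55,
    "response_times_s": {"login": 0.8, "report": 21.0},
}

failures = []
if observed["cpu_utilization"] > objectives["cpu_utilization_max"]:
    failures.append("cpu_utilization")
for tx, t in observed["response_times_s"].items():
    if t > objectives["response_time_max_s"]:
        failures.append(f"response_time:{tx}")

# The overall verdict: a single hidden failure makes the whole test fail.
print("FAIL" if failures else "PASS", failures)
```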
  • 20. Performance Measurement Fundamentals
    Software Testing - ISTQB Specialist Performance Tester Exam Preparation, Chapter 2
    1 Basic Concepts | 2 Performance Measurements | 3 Performance Testing in SDLC | 4 Performance Testing Tasks | 5 Tools
  • 21. Performance Measurement Fundamentals - Sample Questions Pattern (Source: istqb.org)
  • 22. Performance Measurement Fundamentals - Sample Question
    1. Which of the following is a true statement regarding tracking metrics for network latency during a performance test? Select ONE option.
    a. High latency could indicate a network bandwidth problem that could negatively impact performance
    b. Low latency could indicate a network bandwidth problem that could negatively impact performance
    c. Network latency is difficult to track and should not be included in the performance metrics
    d. Network latency is too variable to be useful during performance tuning
  • 23. Performance Measurement Fundamentals - Sample Question
    2. Should performance test results be aggregated? Select ONE option.
    a. Yes, this gives a better overall picture of the performance of the system and helps to identify trends
    b. Yes, this is the best way to focus on the outliers in the performance metrics
    c. No, the results should be analyzed individually so that all variations are understood
    d. No, the results from each test should be reported and tracked separately
  • 24. Performance Measurement Fundamentals - Sample Question
    3. Which of the following is a failure that would typically be found by conducting a spike test? Select ONE option.
    a. The system performance gradually degrades
    b. The system provides inconsistent responses to errors
    c. The system handles a sudden burst of activity, but can't resume a steady state
    d. The system performs well for the expected load, but can't scale to a larger load