MEETING 14
SOFTWARE PRODUCTIVITY MEASUREMENT
SOFTWARE TESTING
By: Ajeng Savitri Puspaningrum, M.Kom
OBJECTIVE
• Learning How to Measure
Software Productivity
Intro
4
• How do we measure the progress of testing?
• When do we release the software?
• Why do we devote more time and resources to testing a
particular module?
• What is the reliability of software at the time of release?
• Who is responsible for the selection of a poor test suite?
• How many faults do we expect during testing?
• How much time and resources are required for software
testing?
• How do we know the effectiveness of a test suite?
5
• Sometimes testing techniques are confused with testing objectives.
• Testing techniques can be viewed as aids that help to ensure the
achievement of test objectives.
• To avoid such misunderstanding, a clear distinction
should be made between test-related measures
that evaluate the program under test,
based on the observed test outputs, and the measures
that evaluate the thoroughness of the test set.
6
SOFTWARE QUALITY
Factors affecting quality:
• Correctness: the software works properly and correctly
(no errors)
• Maintainability: easy to maintain; the mean time to change
(MTTC) is small
• Integrity: resists interference; good level of security
• Usability: easy to use
7
Software testing metrics help us measure and quantify
many aspects of testing, and may help us find answers
to such important questions.
Measure,
Measurement,
and Metrics
9
MEASURE, MEASUREMENT, AND METRICS
• A measure provides a quantitative indication of
the extent, amount, dimension, capacity, or size
of some attributes of a product or process.
• Measurement is the act of determining a
measure.
• The metric is a quantitative measure of the
degree to which a product or process possesses a
given attribute.
10
ILLUSTRATION
• A measure is the number of failures experienced during testing.
• Measurement is the way of recording such failures.
• A software metric may be the average number of failures experienced
per hour during testing.
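To make the distinction concrete, here is a minimal Python sketch with assumed failure data: logging the timestamps is the act of measurement, the resulting count is a measure, and failures per hour is the derived metric.

```python
# Illustrative data only: hours into testing at which failures were observed.
failure_times_h = [0.5, 1.2, 3.8, 4.1, 7.9]
total_testing_hours = 8.0

failure_count = len(failure_times_h)                     # a measure: one data point
failures_per_hour = failure_count / total_testing_hours  # a metric: derived quantity

print(failure_count)       # 5
print(failures_per_hour)   # 0.625
```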
11
MEASURE
• A measure has been established when a single
data point has been collected (e.g., the number of
errors uncovered within a single software
component).
• We measure quality attributes of software such as
testability, complexity, reliability, maintainability,
efficiency, portability, enhanceability, usability,
etc.
• Similarly, we may also like to measure size, effort,
development time, and resources for software.
12
MEASUREMENT
It can be defined as:
• The process by which numbers or
symbols are assigned to attributes
of entities in the real world in such a
way as to describe them according
to clearly defined rules
13
METRICS
It can be defined as:
• The continuous application of measurement-based techniques to the
software development process and its products to supply meaningful
and timely management information, together with the use of those
techniques to improve that process and its products
• Ejiogu defines a set of attributes that effective
software metrics should encompass.
• Software metrics are related to measures that, in turn,
involve numbers for quantification.
• These numbers are used to produce a better product
and improve its related process.
14
METRICS (cont.)
• Effective software metrics should be:
• Simple and computable: easy to learn how to derive the metric,
and its computation should not demand inordinate effort or time.
• Consistent and objective: they always yield unambiguous results.
• Consistent in units: the mathematical computation of the metric should use measures
that do not lead to bizarre combinations of units.
• Programming-language independent: based on the requirements model, the design model, or
the structure of the program itself, not on the vagaries of programming language
syntax or semantics.
• Actionable: the metric should provide information that can lead to a higher-
quality end product.
15
INDICATOR
• A software engineer collects measures and
develops metrics so that indicators will be
obtained.
• An indicator is a metric or combination of metrics
that provide insight into the software process, a
software project, or the product itself.
16
Key performance indicators (KPIs) are metrics that
are used to track performance and trigger
remedial actions when their values fall in a
predetermined range.
But how do you know that metrics
are meaningful in the first place?
17
SOFTWARE ANALYTICS
It can be defined as:
• The systematic computational analysis of software engineering data or
statistics to provide managers and software engineers with meaningful
insights and empower their teams to make better decisions
• Analytics can help developers predict the number of defects to
expect, where to test for them, and how much time it will take to fix
them.
• This allows managers and developers to create incremental schedules
that use these predictions to determine expected completion times.
18
SOFTWARE ANALYTICS (cont.)
Buse and Zimmermann suggest that analytics can help developers
make decisions regarding:
• Targeted testing: To help focus regression testing and integration
testing resources
• Targeted refactoring: To help make strategic decisions on how to
avoid large technical debt costs
• Release planning: To help ensure that market needs, as well as
technical features in software products, are taken into account
• Understanding customers: To help developers get actionable
information on product use by customers in the field during product
engineering
• Judging stability: To help managers and developers monitor the state
of the evolving prototype and anticipate future maintenance needs
• Targeting inspection: To help teams determine the value of individual
inspection activities, their frequency, and their scope
Product Metrics
20
PRODUCT METRICS
• These metrics provide information about the testing
status of a software product.
• The data for such metrics are also generated during
testing and may help us to know the quality of the
product.
• Some of the basic metrics are given as:
• Time interval between failures
• Number of failures experienced in a time interval
• Cumulative failures experienced up to a specified time
• Time of failure
• Estimated time for testing
• Actual testing time
21
PRODUCT METRICS (cont.)
• From the basic metrics above, we may derive some additional metrics,
as given below (a short computation sketch follows the list):
• % of time spent = (Actual time spent / Estimated testing time) × 100
• Average time interval between failures
• Maximum and minimum failures experienced in any time interval
• Average number of failures experienced in time intervals
• Time remaining to complete the testing
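As a hedged illustration, the sketch below computes these derived metrics; the estimated/actual hours and per-interval failure counts are invented for the example.

```python
# Assumed project figures.
estimated_hours = 160.0                  # estimated testing time
actual_hours = 120.0                     # actual time spent so far
failures_per_interval = [7, 4, 9, 2, 3]  # failures in successive test intervals

pct_time_spent = actual_hours / estimated_hours * 100                   # 75.0 %
avg_failures = sum(failures_per_interval) / len(failures_per_interval)  # 5.0
max_failures = max(failures_per_interval)                               # 9
min_failures = min(failures_per_interval)                               # 2
time_remaining = estimated_hours - actual_hours                         # 40.0 hours

print(pct_time_spent, avg_failures, max_failures, min_failures, time_remaining)
```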
22
METRICS FOR THE REQUIREMENTS MODEL
• These estimation metrics examine the requirements model with the
intent of predicting the “size” of the resultant system.
• Size is sometimes (but not always) an indicator of design complexity
and is almost always an indicator of increased coding, integration, and
testing effort.
• By measuring the characteristics of the requirements
model, it is possible to gain quantitative insight
into its specificity and completeness.
THE CHARACTERISTICS
Conventional Software.
• Specificity (lack of ambiguity),
completeness, correctness,
understandability, verifiability, internal and
external consistency, achievability,
concision, traceability, modifiability,
precision, and reusability.
• Electronically stored; executable or at least
interpretable; annotated by relative
importance; and stable, versioned,
organized, cross-referenced, and specified
at the right level of detail
Mobile Software.
• Number of static screen displays.
• Number of dynamic screen displays.
• Number of persistent data objects.
• Number of external systems interfaced.
• Number of static content objects.
• Number of dynamic content objects.
• Number of executable functions.
24
DESIGN METRICS FOR
CONVENTIONAL SOFTWARE
• Architectural design metrics focus on
characteristics of the program
architecture with an emphasis on the
architectural structure and the
effectiveness of modules or components
within the architecture.
• Software design complexity measures:
structural complexity, data complexity,
and system complexity.
25
DESIGN METRICS FOR
OBJECT-ORIENTED SOFTWARE
Nine distinct and measurable characteristics of an OO design:
• Size is defined by taking a static count of OO entities such as classes or operations, coupled with the depth
of an inheritance tree.
• Complexity is defined in terms of structural characteristics by examining how classes of an OO design are
interrelated to one another.
• Coupling is measured by counting physical connections between elements of the OO design.
• Sufficiency is “the degree to which an abstraction [class] possesses the features required of it…”.
• Completeness determines whether a class delivers the set of properties that fully reflect the needs of the
problem domain.
• Cohesion is determined by examining whether all operations work together to achieve a single, well-
defined purpose.
• Primitiveness is the degree to which an operation is atomic—that is, the operation cannot be constructed
out of a sequence of other operations contained within a class.
• Similarity determines the degree to which two or more classes are similar in terms of their structure,
function, behavior, or purpose.
• Volatility measures the likelihood that a change will occur.
26
DESIGN METRICS FOR
USER INTERFACE
Suggested interface metrics:
• Layout appropriateness: the relative position of entities within the interface
• Layout complexity: number of distinct regions defined for an interface
• Layout region complexity: average number of distinct links per region
• Recognition complexity: average number of distinct items the user must look at before making a navigation or data input decision
• Recognition time: average time (in seconds) that it takes a user to select the appropriate action for a given task
• Typing effort: average number of keystrokes required for a specific function
• Mouse pick effort: average number of mouse picks per function
• Selection complexity: average number of links that can be selected per page
• Content acquisition time: average number of words of text per Web page
• Memory load: average number of distinct data items that the user must remember to achieve a specific objective
27
DESIGN METRICS FOR
USER INTERFACE (CONT.)
Suggested graphic design metrics:
• Word count: total number of words that appear on a page
• Body text percentage: percentage of words that are body versus display text (e.g., headers)
• Emphasized body text percentage: the portion of body text that is emphasized
• Text positioning count: changes in text position from flush left
• Text cluster count: text areas highlighted with color, bordered regions, rules, or lists
• Link count: total links on a page
• Page size: total bytes for the page as well as elements, graphics, and style sheets
• Graphic percentage: percentage of page bytes that are for graphics
• Graphics count: total graphics on a page (not including graphics specified in scripts, applets, and objects)
• Color count: total colors employed
• Font count: total fonts employed (i.e., face + size + bold + italic)
28
DESIGN METRICS FOR
USER INTERFACE (CONT.)
Suggested content metrics:
• Page wait: average time required for a page to download at different connection speeds
• Page complexity: average number of different types of media used on a page, not including text
• Graphic complexity: average number of graphics media per page
• Audio complexity: average number of audio media per page
• Video complexity: average number of video media per page
• Animation complexity: average number of animations per page
• Scanned image complexity: average number of scanned images per page
29
DESIGN METRICS FOR
USER INTERFACE (CONT.)
Suggested navigation metrics:
• Page-linking complexity: number of links per page
• Connectivity: total number of internal links, not including dynamically generated links
• Connectivity density: connectivity divided by page count
30
DESIGN METRICS FOR SOURCE CODE
Halstead's measures are:
• n1 = number of distinct operators that appear in a program
• n2 = number of distinct operands that appear in a program
• N1 = total number of operator occurrences
• N2 = total number of operand occurrences
The length N can be estimated as
• N = n1 log2 n1 + n2 log2 n2
The program volume may be defined as
• V = N log2 (n1 + n2)
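A small sketch of these formulas, with assumed operator and operand counts for a toy program (in practice the counts would come from scanning the source code):

```python
import math

n1, n2 = 10, 14   # distinct operators / distinct operands (assumed)
N1, N2 = 45, 60   # total operator / operand occurrences (assumed)

N_est = n1 * math.log2(n1) + n2 * math.log2(n2)  # estimated program length N
N_obs = N1 + N2                                  # observed length N1 + N2
V = N_obs * math.log2(n1 + n2)                   # program volume

print(round(N_est, 1))  # ~86.5
print(N_obs)            # 105
print(round(V, 1))      # ~481.4
```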
31
METRICS FOR TESTING
Testing metrics fall into two broad categories:
(1) metrics that attempt to predict the likely number of tests required at various testing levels,
and
(2) metrics that focus on test coverage for a given component.
The following metrics consider aspects of encapsulation and inheritance (see the sketch after this list):
(1) Lack of cohesion in methods (LCOM).
(2) Percent public and protected (PAP).
(3) Public access to data members (PAD).
(4) Number of root classes (NOR).
(5) Fan-in (FIN).
(6) Number of children (NOC)
(7) Depth of the inheritance tree (DIT).
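Two of these metrics, depth of the inheritance tree (DIT) and number of children (NOC), are easy to compute for Python classes via introspection. The class names below are hypothetical; the deeper the tree and the wider the fan-out, the more test design effort the classes tend to demand.

```python
class Account:                     # hypothetical root class
    pass

class SavingsAccount(Account):     # direct child of Account
    pass

class YouthSavings(SavingsAccount):
    pass

def dit(cls) -> int:
    """Depth of the inheritance tree: ancestors of cls, excluding `object`."""
    return len(cls.__mro__) - 2

def noc(cls) -> int:
    """Number of children: count of direct subclasses."""
    return len(cls.__subclasses__())

print(dit(YouthSavings))  # 2 -> more inherited behavior to exercise in tests
print(noc(Account))       # 1 -> one direct child depends on Account's interface
```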
32
METRICS FOR MAINTENANCE
• IEEE Std. 982.1-2005 suggests a software maturity index (SMI) that
provides an indication of the stability of a software product (based on
changes that occur for each release of the product).
• The following information is determined:
• MT = number of modules in the current release
• Fc = number of modules in the current release that have been changed
• Fa = number of modules in the current release that have been added
• Fd = number of modules from the preceding release that were deleted in the current release
• The software maturity index is computed in the following manner:
SMI = [MT − (Fa + Fc + Fd)] / MT
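A minimal sketch of the SMI computation with assumed module counts; SMI approaches 1.0 as the product begins to stabilize:

```python
def smi(mt: int, fa: int, fc: int, fd: int) -> float:
    """Software maturity index: SMI = (MT - (Fa + Fc + Fd)) / MT,
    where mt = modules in the current release and
    fa/fc/fd = modules added / changed / deleted."""
    return (mt - (fa + fc + fd)) / mt

print(round(smi(mt=940, fa=40, fc=90, fd=12), 3))  # 0.849
```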
Process Metrics
34
PROCESS METRICS
• Process metrics are intended to improve the software process so
that errors are uncovered in the most effective manner
• Project metrics enable a software project manager to
(1) Assess the status of an ongoing project,
(2) Track potential risks,
(3) Uncover problem areas before they go “critical,”
(4) Adjust workflow or tasks, and
(5) Evaluate the project team’s ability to control the quality of software work products
35
DETERMINANTS FOR SOFTWARE QUALITY
AND ORGANIZATIONAL EFFECTIVENESS
• The process sits at the center of a triangle
connecting three factors that have a profound
influence on software quality and organizational
performance.
• The skill and motivation of people have been
shown to be the most influential factors in quality
and performance.
• The complexity of the product can have a
substantial impact on quality and team
performance.
• The technology (i.e., the software engineering
methods and tools) that populate the process also
has an impact.
36
SOFTWARE MEASUREMENT
1. Size-oriented software metrics:
derived by normalizing quality and/or
productivity measures by considering
the size of the software that has been
produced.
2. Function-oriented software metrics:
use a measure of the functionality
delivered by the application as a
normalization value. The computation
of a function-oriented metric is based
on characteristics of the software’s
information domain and complexity.
37
SIZE-ORIENTED SOFTWARE
METRICS
• Normalize quality and productivity measures by
the size of the software (LOC/KLOC), so that we get
(a computation sketch follows these lists):
• Errors per KLOC
• Defects per KLOC
• Rupiah (Rp) per LOC
• Documentation pages per KLOC
• Other metrics can be derived:
• Errors per person-month
• LOC per person-month
• Rupiah (Rp) per documentation page
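A hedged sketch of this normalization; all raw project figures below are invented for the example:

```python
# Assumed raw project data.
loc = 12_100                  # lines of code produced
errors_found = 134            # errors found before release
defects_reported = 29         # defects reported after release
effort_person_months = 24
doc_pages = 365

kloc = loc / 1000
print(round(errors_found / kloc, 2))      # errors per KLOC
print(round(defects_reported / kloc, 2))  # defects per KLOC
print(round(doc_pages / kloc, 2))         # documentation pages per KLOC
print(round(loc / effort_person_months))  # LOC per person-month
```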
38
FUNCTION-ORIENTED
SOFTWARE METRICS
• Measure indirectly.
• Emphasis on program functionality & utility.
• First proposed by Albrecht, with a method of
calculation called the FUNCTION POINT (FP)
• FP is derived by using empirical relationships based on
something measurable from the information domain
and related to the complexity of the software.
39
FUNCTION POINTS
Information Domain (a computation sketch follows this list):
• Number of user inputs: the number of user inputs required by the
application
• Number of user outputs: the sum of all outputs, both reports and
error messages (to the printer/screen)
• Number of user inquiries: an online input that results in an immediate online output
• Number of files: the number of logical files used by the application
• Number of external interfaces: all machine-readable interfaces
used to transfer information to other systems
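The slide lists only the information-domain counts. A common way to turn them into a number (not spelled out above, so treat it as background) multiplies each count by a complexity weight and applies a value adjustment, FP = count-total × (0.65 + 0.01 × ΣFi). The sketch below uses the widely published average-complexity weights; the sample counts and ratings are assumptions.

```python
# Average-complexity weights from the classic function-point scheme.
AVG_WEIGHTS = {
    "inputs": 4,       # user inputs
    "outputs": 5,      # user outputs (reports, error messages)
    "inquiries": 4,    # online inquiries
    "files": 10,       # internal logical files
    "interfaces": 7,   # external interfaces
}

def function_points(counts: dict, adjustment_factors: list) -> float:
    """counts: information-domain counts; adjustment_factors: fourteen 0-5 ratings."""
    count_total = sum(AVG_WEIGHTS[k] * counts[k] for k in AVG_WEIGHTS)
    return count_total * (0.65 + 0.01 * sum(adjustment_factors))

# Hypothetical application, with all 14 adjustment factors rated "average" (3).
counts = {"inputs": 12, "outputs": 8, "inquiries": 5, "files": 4, "interfaces": 2}
print(round(function_points(counts, [3] * 14), 1))  # 173.3
```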
40
METRICS FOR SOFTWARE QUALITY
• A quality metric that provides benefit at both the project and process
level is defect removal efficiency (DRE).
• In essence, DRE is a measure of the filtering ability of quality
assurance and control actions as they are applied throughout all
process framework activities.
• When considered for a project as a whole, DRE is defined in the
following manner:
DRE = E / (E + D)
where E is the number of errors found before delivery of the software to the end user and D is
the number of defects found after delivery.
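A one-line computation of the slide's formula, shown as a sketch with assumed counts:

```python
def dre(errors_before_delivery: int, defects_after_delivery: int) -> float:
    """Defect removal efficiency: DRE = E / (E + D)."""
    e, d = errors_before_delivery, defects_after_delivery
    return e / (e + d)

print(dre(95, 5))  # 0.95 -> 95% of all problems were filtered out before release
```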
UNITING OF METRICS IN SOFTWARE
PROCESS
[Figure: data collection gathers measures from the software engineering process, the software project, and the software product; measures are computed into metrics, and metrics are evaluated as indicators.]
References
01 Dooley, J. F. (2017). Software Development, Design and Coding. Apress.
02 Chopra, R. (2018). Software Testing: Principles and Practices. Mercury Learning & Information.
03 Majchrzak, T. A. (2012). Improving Software Testing: Technical and Organizational Developments. Springer Science & Business Media.
04 Myers, G. J., Sandler, C., & Badgett, T. (2012). The Art of Software Testing. John Wiley & Sons.
05 Singh, Y. (2011). Software Testing. Cambridge University Press.
THANK YOU