Software Testing
Dr. RAJIB MALL
Professor
Department Of Computer Science &
Engineering
IIT Kharagpur.
1
Faults and Failures
• A program may fail during testing:
▪A failure is a manifestation of a fault
(also called a defect or bug).
▪Mere presence of a fault may not
lead to a failure. 2
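A minimal sketch (my own illustration, not from the slides) of this distinction: the fault below is always present in the code, but a failure is observed only when a test input actually exercises it.

```c
#include <stdio.h>

/* Hypothetical example: the fault -- a missing guard for total == 0 --
 * is always in the code, but a failure (division by zero) occurs only
 * when a test actually supplies total = 0. */
int pass_percentage(int passed, int total)
{
    return (passed * 100) / total;   /* fault: no check for total == 0 */
}

int main(void)
{
    printf("%d\n", pass_percentage(40, 50)); /* 80 -- fault present, no failure */
    printf("%d\n", pass_percentage(0, 0));   /* this input triggers the failure  */
    return 0;
}
```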
Errors, Faults, Failures
• Programming is human effort-intensive:
▪ Therefore, inherently error prone.
• IEEE Std 1044 (1993) defined errors and
faults as synonyms.
• The IEEE revision of Std 1044 in 2010
introduced finer distinctions:
▪ For more expressive communication, it
distinguishes between errors and faults 3
(Figure: an error or mistake made in the specification, design, or code
introduces a fault — also called a defect or bug — which may later manifest
as a failure.)
4
(Figure: bug sources — specification & design vs. code.)
Error Tit-Bits
• Even experienced programmers
make many errors:
▪ Avg. 50 bugs per 1000 lines of source code
• Extensively tested software contains:
▪ About 1 bug per 1000 lines of
source code.
• Bug distribution:
▪ 60% spec/design, 40% implementation.
6
How are Bugs Reduced?
• Review
• Testing
• Formal specification and
verification
• Use of development process
7
Cost of Not Adequately Testing
• Can be enormous
• Example:
• Ariane 5 rocket self-destructed 37 seconds after launch
• Reason: A software exception bug went undetected…
– Conversion from 64-bit floating point to 16-bit signed integer
value had caused an exception
• The floating point number was larger than 32767
• Efficiency considerations had led to the disabling of the
exception handler.
• Total Cost: over $1 billion 8
How to Test?
• Input test data to the program.
• Observe the output:
▪Check if the program behaved as
expected.
9
Examine Test Result…
•If the program does not
behave as expected:
▪Note the conditions under
which it failed (Test report).
▪Later debug and correct.
10
Testing Facts
• Consumes the largest effort among
all development activities:
▪ Largest manpower among all roles
▪ Implies more job opportunities
• About 50% development effort
▪ But 10% of development time?
▪ How? 11
Testing Facts
• Testing is getting more complex and
sophisticated every year.
▪ Larger and more complex programs
▪ Newer programming paradigms
▪ Newer testing techniques
▪ Test automation 12
Testing Perception
• Testing is often viewed as not very
challenging --- less preferred by novices,
but:
▪ Over the years testing has taken a center
stage in all types of software development.
▪ “Monkey testing is passé” --- a large number of
innovations have taken place in the testing area ---
requiring testers to have good knowledge of
test techniques.
▪ Challenges of test automation 13
Testing Activities Now Spread Over Entire Life Cycle
14
Test How Long?
• One way:
▪Track the number of bugs detected against
testing time (figure: # Bugs vs. Time) and
stop when few new bugs are being found.
• Another way:
▪Seed bugs… run test cases
▪See if all (or most) are getting
detected
15
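A small sketch (made-up numbers, my own illustration) of the bug-seeding idea: if testing finds a certain fraction of the deliberately seeded bugs, the same fraction of the genuine bugs is assumed to have been found, which gives a rough estimate of how many remain.

```c
#include <stdio.h>

/* Illustrative sketch of bug seeding: the detection ratio on the seeded
 * bugs is used to estimate how many genuine (latent) bugs remain. */
int main(void)
{
    int seeded       = 20;   /* bugs deliberately planted            */
    int seeded_found = 15;   /* planted bugs detected by testing     */
    int real_found   = 45;   /* genuine bugs detected so far         */

    /* Estimated total genuine bugs = real_found * seeded / seeded_found */
    double est_total     = (double)real_found * seeded / seeded_found; /* 60 */
    double est_remaining = est_total - real_found;                     /* 15 */

    printf("estimated latent bugs remaining: %.0f\n", est_remaining);
    return 0;
}
```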
Verification versus Validation
• Verification is the process of
determining:
▪ Whether output of one phase of
development conforms to its previous phase.
• Validation is the process of
determining:
▪ Whether a fully developed system
conforms to its SRS document.
16
Verification versus Validation
• Verification is concerned with
phase containment of errors:
▪Whereas, the aim of validation is
that the final product is error
free.
17
Verification and Validation Techniques
• Review
• Simulation
• Unit testing
• Integration testing
• System testing
18
Verification:
▪ Are you building it right?
▪ Checks whether an artifact conforms to its previous artifact.
▪ Done by developers.
▪ Static and dynamic activities: reviews, unit testing.
Validation:
▪ Have you built the right thing?
▪ Checks the final product against the specification.
▪ Done by testers.
▪ Dynamic activities: execute the software and check against requirements.
19
Testing Levels
20
4 Testing Levels
• Software tested at 4 levels:
▪Unit testing
▪Integration testing
▪System testing
▪Regression testing
21
22
Test Levels
• Unit testing
▪ Test each module (unit, or component)
independently
▪ Mostly done by developers of the modules
• Integration and system testing
▪ Test the system as a whole
▪ Often done by separate testing or QA team
• Acceptance testing
▪ Validation of system functions by the
customer
Levels of Testing
(Figure: test levels mapped to development artifacts — acceptance testing
against what users really need, system testing against the requirements,
integration testing against the design, unit testing against the code;
regression testing is carried out during maintenance.)
Overview of Activities During System and
Integration Testing
• Tester:
▪ Test suite design
▪ Run test cases
▪ Check results to detect failures
▪ Prepare failure list
• Developer:
▪ Debug to locate errors
▪ Correct errors
24
Quiz 1
• As testing proceeds more and more bugs
are discovered.
▪ How to know when to stop testing?
• Give examples of the types of bugs
detected during:
▪ Unit testing?
▪ Integration testing?
▪ System testing? 25
Unit testing
• During unit testing, functions (or
modules) are tested in isolation:
▪What if all modules were to be tested
together (i.e. system testing)?
•It would become difficult to
determine which module has the
error. 26
Integration Testing
• After modules of a system have
been coded and unit tested:
▪Modules are integrated in
steps according to an integration plan
▪The partially integrated system is
tested at each integration step. 27
Integration and System Testing
• Integration test evaluates a group of
functions or classes:
▪ Identifies interface compatibility, unexpected
parameter values or state interactions, and
run-time exceptions
▪ System test tests working of the entire
system
• Smoke test:
▪ System test performed daily or several times
a week after every build.
28
Types of System Testing
• Based on the type of test:
▪ Functionality test
▪ Performance test
• Based on who performs testing:
▪ Alpha
▪ Beta
▪ Acceptance test 29
Performance test
• Determines whether a system or
subsystem meets its non-functional
requirements:
• Response times
• Throughput
• Usability
• Stress
• Recovery
• Configuration, etc. 30
User Acceptance Testing
• User determines whether the
system fulfills their requirements
▪Accepts or rejects delivered
system based on the test
results.
31
Who Tests Software?
• Programmers:
▪ Unit testing
▪ Test their own or other programmers’ code
• Users:
▪ Usability and acceptance testing
▪ Volunteers are frequently used to test beta
versions
• Test team:
▪ All types of testing except unit and acceptance
▪ Develop test plans and strategy 32
33
(Figure: the waterfall phases — feasibility study, requirements analysis,
design, coding, testing, maintenance — annotated with the V&V activities
applied to each: review, simulation, etc. in the early phases, testing by
developers during coding, and testing by testers during the testing phase.)
Pesticide Effect
• Errors that escape a fault detection
technique:
▪ Can not be detected by further
applications of that technique.
(Figure: a chain of FILTER stages — each fault detection technique filters
out some of the bugs, but bugs that escape a filter pass unhindered through
repeated applications of that same filter.)
34
Capers Jones Rule of Thumb
• Each of software review,
inspection, and test step will
find 30% of the bugs present.
(Capers Jones, in IEEE Computer, 1996)
35
Pesticide Effect
• Assume to start with 1000 bugs
• We use 4 fault detection
techniques :
▪ Each detects only 70% of the bugs existing
at that time
▪ How many bugs would remain at the end?
▪ 1000 × (0.3)^4 ≈ 8 bugs 36
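The same arithmetic written out as a tiny loop, purely as an illustration of the calculation on the slide:

```c
#include <stdio.h>

/* Start with 1000 bugs and apply four detection techniques, each removing
 * 70% of the bugs present at that point; 30% survive each pass. */
int main(void)
{
    double bugs = 1000.0;
    for (int pass = 1; pass <= 4; pass++) {
        bugs *= 0.3;                     /* 30% escape this technique */
        printf("after pass %d: %.1f bugs remain\n", pass, bugs);
    }
    /* prints 300.0, 90.0, 27.0, 8.1 -- about 8 bugs remain */
    return 0;
}
```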
Quiz
(Refer to the waterfall phases: feasibility study, requirements analysis,
design, coding, testing, maintenance.)
37
1. When is verification undertaken in the waterfall model?
2. When is testing undertaken in the waterfall model?
3. When is validation undertaken in the waterfall model?
38
Quiz: Solution
(Waterfall phases: feasibility study, requirements analysis, design, coding,
testing, maintenance; in the figure, the development phases are bracketed as
“Verification” and the testing phase as “Validation”.)
1. When is verification undertaken in the waterfall model?
2. When is testing undertaken in the waterfall model?
Ans: Coding phase and Testing phase
3. When is validation undertaken in the waterfall model?
V Life Cycle
Model
39
V Model
• It is a variant of the Waterfall
▪ Emphasizes verification and
validation (V&V) activities.
▪ V&V activities are spread over the
entire life cycle.
• In every phase of development:
▪ Testing activities are planned in
parallel with development. 40
(Figure: the V model — project planning at the start; requirements analysis &
specification paired with system testing; high-level design paired with
integration & testing; detailed design paired with unit testing; coding at the
bottom of the V; production, operation & maintenance at the end.)
41
V Model Steps
• Planning
• Requirements
Specification and
Analysis
• Design
• System and
acceptance testing
• Integration and
Testing 42
V Model: Strengths
• Starting from early stages of
software development:
▪ Emphasize planning for verification
and validation of the software
• Each deliverable is made testable
• Intuitive and easy to use 43
V Model Weaknesses
• Does not support overlapping of
phases
• Does not support iterations
• Not easy to handle later changes in
requirements
• Does not support any risk handling
method 44
When to Use V Model
• Natural choice for systems
requiring high reliability:
• Example: embedded control
applications:
▪ All requirements are known up-front
▪ Solution and technology are known 45
A Few More Basic
Concepts on
Testing…
46
How Many Latent Errors?
• Several independent studies ([Jones],
[Schroeder], etc.) conclude:
▪ About 85% of errors get removed by the end
of a typical testing process.
▪ Why not more?
▪ All practical test techniques are
basically heuristics… they help to
reduce bugs… but do not guarantee
complete bug removal… 47
Evolution of Test Automation
• 1960–1990: Manual test design and
manual execution
• 1990–2000: Manual test design with
automated execution (scripting,
capture and replay)
• 2000– : Automated test design and
execution (model-based testing)
48
Fault Model
• Types of faults possible in a
program.
• Some types can be ruled out:
▪For example, file-related
problems in a program not
using files. 49
Fault Model of an OO Program
(Figure: a fault model for OO programs — the fault categories shown include
structural faults, algorithmic faults, procedural faults, traceability faults,
and OO faults, manifesting as incorrect results or inadequate performance.)
50
Hardware Fault-Model
• Essentially only four types:
▪ Stuck-at 0
▪ Stuck-at 1
▪ Open circuit
▪ Short circuit
• Testing is therefore simple:
▪ Devise ways to test the presence of each
• Hardware testing is usually fault-
based testing.
51
Test Cases
• Each test case typically tries to
establish correct working of some
functionality:
▪Executes (covers) some program
elements.
▪For certain restricted types of
faults, fault-based testing can be
used.
52
Test Data versus Test Cases
• Test data:
▪ Inputs used to test the system
• Test cases:
▪ Inputs to test the system,
▪ State of the software, and
▪ The predicted outputs from the inputs
53
Test Cases and Test Suites
• A test case is a triplet [I,S,O]
▪I is the data to be input to the
system,
▪S is the state of the system at
which the data will be input,
▪O is the expected output of the
system. 54
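A minimal sketch (names and values invented for illustration) of the [I,S,O] triplet represented as a data structure:

```c
#include <stdio.h>

/* Illustrative only: a test case recorded as its [I, S, O] triplet --
 * input data, the state in which the input is applied, and the
 * expected output. */
struct test_case {
    const char *input;            /* I: data to be input to the system  */
    const char *required_state;   /* S: state at which the data is fed  */
    const char *expected_output;  /* O: expected output of the system   */
};

int main(void)
{
    struct test_case tc = {
        .input           = "renew book for 2 weeks",
        .required_state  = "book issued to a registered member",
        .expected_output = "due date extended by 14 days"
    };
    printf("I=%s | S=%s | O=%s\n",
           tc.input, tc.required_state, tc.expected_output);
    return 0;
}
```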
What are Negative Test Cases?
• Purpose:
▪ Helps to ensure that the application
gracefully handles invalid and unexpected
user inputs and the application does not
crash.
• Example:
▪ If the user types a letter in a numeric field, the
application should not crash; instead it should
display a message such as:
“incorrect data type, please enter a
number…” 55
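A small sketch of such a negative test in code; the parse_quantity function and its behaviour are assumptions made up for this illustration, not part of the slides.

```c
#include <stdio.h>
#include <ctype.h>

/* Negative test sketch: feed a letter into a field that expects a number
 * and check that the program rejects it gracefully instead of crashing. */
static int parse_quantity(const char *text, int *out)
{
    for (size_t i = 0; text[i] != '\0'; i++)
        if (!isdigit((unsigned char)text[i]))
            return 0;                       /* reject: not a number */
    return sscanf(text, "%d", out) == 1;
}

int main(void)
{
    int value;
    /* Negative test: invalid input must be rejected, not crash the program. */
    if (!parse_quantity("abc", &value))
        printf("incorrect data type, please enter a number...\n");
    return 0;
}
```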
Test Cases and Test Suites
•Test the software using a set
of carefully designed test
cases:
▪The set of all test cases is
called the test suite.
56
Test Execution Example: Return Book
• Test case [I,S,O]
1.Set the program in the required
state: Book record created, member
record created, Book issued
2.Give the defined input: Select the renew
book option and request renewal for a
further 2-week period.
3.Observe the output:
▪ Compare it to the expected output.
57
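The three steps rendered as a runnable sketch; the tiny in-memory “library” below is invented purely so the steps can execute, and is not the system the slide refers to.

```c
#include <assert.h>
#include <stdio.h>

/* Minimal stand-in for the library system, invented for this sketch. */
struct book { int issued; int due_in_days; };

static int renew_book(struct book *b, int extra_days)
{
    if (!b->issued) return 0;          /* cannot renew an unissued book */
    b->due_in_days += extra_days;
    return 1;
}

int main(void)
{
    /* 1. Set the program in the required state: book issued to a member. */
    struct book b = { .issued = 1, .due_in_days = 14 };

    /* 2. Give the defined input: renew for a further 2-week period. */
    int ok = renew_book(&b, 14);

    /* 3. Observe the output and compare it with the expected output. */
    assert(ok && b.due_in_days == 28);
    printf("renew-book test passed\n");
    return 0;
}
```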
Sample: Recording of Test Case & Results
Test Case number
Test Case author
Test purpose
Pre-condition:
Test inputs:
Expected outputs (if any):
Post-condition:
Test Execution history:
Test execution date
Test execution person
Test execution result(s): Pass/Fail
If failed: failure information
fix status
58
Test Team- Human Resources
• Test Planning: Experienced people
• Test scenario and test case design: Experienced
and test qualified people
• Test execution: semi-experienced to
inexperienced
• Test result analysis: experienced people
• Test tool support: experienced people
• May include external people:
▪ Users
▪ Industry experts 59
Why Design of Test Cases?
• Exhaustive testing of any non-trivial
system is impractical:
▪ Input data domain is extremely large.
• Design an optimal test suite, meaning:
▪ Of reasonable size, and
▪ Uncovers as many errors as possible. 60
Design of Test Cases
• If test cases are selected randomly:
▪ Many test cases would not contribute to
the significance of the test suite,
▪ Would only detect errors that are already
detected by other test cases in the suite.
• Therefore, the number of test cases in
a randomly selected test suite:
▪ Does not indicate the effectiveness of
testing.
61
Design of Test Cases
• Testing a system using a large
number of randomly selected test
cases:
▪Does not mean that most errors in
the system will be uncovered.
• Consider following example:
▪Find the maximum of two integers x
and y. 62
Design of Test Cases
• The code has a simple programming
error:
• if (x > y) max = x;
else max = x; /* should be max = y; */
• Test suite {(x=3,y=2);(x=2,y=3)} can
detect the bug,
• A larger test suite {(x=3,y=2);(x=4,y=3);
(x=5,y=1)} does not detect the bug. 63
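The same example as a runnable sketch, showing that the first suite exposes the fault while the larger second suite does not:

```c
#include <stdio.h>

/* The buggy function from the slide: it always returns x. */
static int buggy_max(int x, int y)
{
    int max;
    if (x > y) max = x;
    else       max = x;   /* fault: should be max = y */
    return max;
}

static int expected_max(int x, int y) { return (x > y) ? x : y; }

static void run(int x, int y)
{
    int got = buggy_max(x, y), want = expected_max(x, y);
    printf("max(%d,%d) = %d  %s\n", x, y, got,
           got == want ? "pass" : "FAIL -- bug detected");
}

int main(void)
{
    /* Suite 1 detects the bug: the second case has y > x. */
    run(3, 2); run(2, 3);
    /* Suite 2, though larger, misses it: x > y in every case. */
    run(3, 2); run(4, 3); run(5, 1);
    return 0;
}
```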
Test Plan
• Before testing activities start, a test
plan is developed.
• The test plan documents the following:
▪ Features to be tested
▪ Features not to be tested
▪ Test strategy
▪ Test suspension criteria and stopping criteria
▪ Test effort
▪ Test schedule 64
Design of Test Cases
• Systematic approaches are
required to design an effective
test suite:
▪Each test case in the suite
should target different faults.
65
Testing Strategy
• Test Strategy primarily addresses:
▪ Which types of tests to deploy?
▪ How much effort to devote to which
type of testing?
• Black-box: Usage–based testing (based
on customers’ actual usage pattern)
• White-box testing can be guided by
black box testing results 66
Consider Past Bug Detection Data…
(Chart — # of bugs detected at each stage: Reviews 10%, Unit test 40%,
System test 25%, Integration test 15%, Customer reported 10%.)
Quiz: How would you use
this for planning test
effort?
67
Consider Past Bug Detection Data…
(Chart — problems detected per technique: Test technique 1: 50%, Test
technique 2: 30%, Test technique 3: 10%, Customer reported: 10%.)
Quiz: How would you
use this for planning
test effort?
68
Distribution of Error Prone Modules
(Chart — # of customer-reported bugs for Release 1, by module M1 to M6.)
Quiz 6: How would you use
this for planning Release 2 testing?
69
Defect clustering: A few modules usually contain most defects…
Unit Testing
70
When and Why of Unit Testing?
• Unit testing carried out:
• After coding of a unit is complete
and it compiles successfully.
•Unit testing reduces
debugging effort substantially.
71
Why unit test?
• Without unit test:
▪ Errors become
difficult to track
down.
▪ Debugging cost
increases
substantially…
Failure
Unit Testing
• Testing of individual methods, modules,
classes, or components in isolation:
▪ Carried out before integrating with other
parts of the software being developed.
• Support required for unit testing:
▪ Driver
• Simulates the behavior of a function that calls
and possibly supplies some data to the function
being tested.
▪ Stub
• Simulates the behavior of a function that has
not yet been written.
(Figure: the driver calls the unit under test, which calls a stub.)
73
Unit Testing
(Figure: a DRIVER calls the PROCEDURE UNDER TEST, which calls a STUB; the
driver also provides access to nonlocal variables.)
74
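A minimal sketch of a driver and a stub in code; the unit and all names (compute_fine, days_overdue) are invented for this illustration only.

```c
#include <stdio.h>

/* Stub: simulates the behaviour of a function that is not yet written. */
static int days_overdue(int book_id)
{
    (void)book_id;
    return 5;                      /* canned value chosen for the test */
}

/* Unit under test: normally depends on days_overdue(). */
static int compute_fine(int book_id)
{
    return 2 * days_overdue(book_id);   /* fine: 2 currency units per day */
}

/* Driver: supplies test data to the unit and checks its output. */
int main(void)
{
    int fine = compute_fine(42);
    printf("compute_fine: %s\n", fine == 10 ? "pass" : "FAIL");
    return 0;
}
```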
Quiz
• Unit testing can be considered as
which one of the following types
of activities?
▪ Verification
▪ Validation
75
Design of Unit Test Cases
• There are essentially three main
approaches to design test cases:
▪Black-box approach
▪White-box (or glass-box)
approach
▪Grey-box approach 76
Black-Box Testing
• Test cases are designed using only
functional specification of the
software:
▪Without any knowledge of the
internal structure of the software.
• Black-box testing is also known as
functional testing.
(Figure: Input → Software → Output.)
77
What is Hard about BB Testing
• Data domain is large
• A function may take multiple
parameters:
▪ We need to consider the
combinations of the values of the
different parameters.
78
What’s So Hard About Testing?
• Consider int check_equal(int x, int y)
• Assuming a 64-bit computer:
▪ Input space = 2^128
•Assuming it takes 10 seconds to key in an integer
pair:
▪It would take well over a billion years to enter all
possible values!
▪Automatic testing has its own problems! 79
Solution
• Construct model of the data
domain:
▪ Called Domain based testing
▪ Select data based on the domain
model
80
Black-box Testing
81
Black Box Testing
• Considers the software as a black box:
▪Test data derived from the specification
• No knowledge of code necessary
• Also known as:
▪ Data-driven or
▪ Input/output driven testing
• The goal is to achieve the thoroughness
of exhaustive input testing:
▪With much less effort!!!!
82
(Figure: Input → System → Output.)
Black-Box Testing
• Scenario coverage
•Equivalence class partitioning
•Special value (risk-based) testing
▪Boundary value testing
▪Cause-effect (Decision Table)
testing
▪Combinatorial testing
▪Orthogonal array testing
83
Equivalence Class Testing
85
Equivalence Class Partitioning
• The input values to a program:
▪ Partitioned into equivalence classes.
• Partitioning is done such that:
▪Program behaves in similar ways to
every input value belonging to an
equivalence class.
▪At the least there should be as many
equivalence classes as scenarios.
86
Why Define Equivalence Classes?
• Premise:
▪ Testing the code with any one
representative value from an
equivalence class is
▪as good as testing it with any
other value from that
equivalence class. 87
(Figure: the input domain partitioned into classes E1, E2, E3.)
Equivalence Class Partitioning
• How do you identify equivalence
classes?
▪Identify scenarios
▪Examine the input data.
▪Examine output
•A few guidelines for determining the
equivalence classes can be given… 88
Guidelines to Identify Equivalence Classes
•If an input condition specifies a range, one valid and two
invalid equivalence classes are defined.
•If an input condition specifies a member of a set, one
valid and one invalid equivalence class are defined.
•If an input condition is Boolean, one valid and one invalid
class are defined.
•Examples:
•Area code: range --- value defined between 10000 and
90000
•Password: value --- a six-character string.
89
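A small sketch of the range guideline applied to the area-code example: one representative value from the valid class and from each of the two invalid classes. The classifier itself is written only for illustration.

```c
#include <stdio.h>

/* Range guideline: area code defined between 10000 and 90000 gives one
 * valid class (inside the range) and two invalid classes (below, above). */
static const char *classify_area_code(long code)
{
    if (code < 10000) return "invalid: below range";
    if (code > 90000) return "invalid: above range";
    return "valid";
}

int main(void)
{
    long representatives[] = { 5000, 45000, 95000 };  /* one per class */
    for (int i = 0; i < 3; i++)
        printf("%ld -> %s\n", representatives[i],
               classify_area_code(representatives[i]));
    return 0;
}
```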
Equivalence class partitioning: Example
• Given three sides, determine the
type of the triangle:
▪ Isosceles
▪ Scalene
▪ Equilateral, etc.
• Hint: the scenarios are expressed in terms
of the output in this case.
90
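A sketch of this example in code: one representative test per output-based equivalence class (equilateral, isosceles, scalene, and not a triangle). The classifier is an assumption written only for illustration.

```c
#include <stdio.h>

/* Output-based equivalence classes for the triangle-type problem. */
static const char *triangle_type(int a, int b, int c)
{
    if (a + b <= c || b + c <= a || a + c <= b) return "not a triangle";
    if (a == b && b == c) return "equilateral";
    if (a == b || b == c || a == c) return "isosceles";
    return "scalene";
}

int main(void)
{
    /* One representative from each equivalence class. */
    printf("%s\n", triangle_type(5, 5, 5));   /* equilateral    */
    printf("%s\n", triangle_type(5, 5, 8));   /* isosceles      */
    printf("%s\n", triangle_type(3, 4, 5));   /* scalene        */
    printf("%s\n", triangle_type(1, 2, 9));   /* not a triangle */
    return 0;
}
```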
Equivalence Partitioning
• First-level partitioning:
▪ Valid vs. Invalid test cases
(Figure: the input domain split into Valid and Invalid partitions.)
91
Equivalence Partitioning
• Further partition valid and invalid
test cases into equivalence classes
92
(Figure: the valid and invalid partitions subdivided into equivalence classes.)
Equivalence Partitioning
• Create a test case for at least one
value from each equivalence class
93
(Figure: at least one test case selected from each equivalence class.)