2. Topics include
Validation planning
Testing fundamentals
Test plan creation
Test-case generation
Black-box testing
White-box testing
Unit testing
Integration testing
System testing
Object-oriented testing
3. Verification vs. Validation
Two questions:
Are we building the right product? => Validation
Are we building the product right? => Verification
Resources: people, money, machines, materials
Building the right product => effectiveness: choosing effective goals and achieving them
Building the product right => efficiency: making the best use of resources in achieving goals
4. Verification &
Validation
Software V & V is a disciplined approach to assessing software
products throughout the SDLC.
V & V strives to ensure that quality is built into the software
and that the software satisfies business functional
requirements.
V & V ensures that the software conforms to its specification
and meets the needs of the customers.
V & V employs review, analysis, and testing techniques to
determine whether a software product and its intermediate
deliverables comply with requirements. These requirements
include both business functional capabilities and quality
attributes.
V & V provides management with insight into the state of the
project and the software products, allowing for timely changes
to the products or to the SDLC approach.
V & V is typically applied in parallel with software development
and support activities.
5. Verification involves checking that
The software conforms to its specification.
System meets its specified functional and non-functional
requirements.
“Are we building the product right ?”
Validation, a more general process, ensures that the
software meets the expectations of the customer.
“Are we building the right product?”
You can’t test in quality: if it’s not there before you begin
testing, it won’t be there when you’re finished testing.
6. Techniques of system checking & analysis
Software inspections
Concerned with analysis of static system representations to
discover problems (static verification), such as:
• Requirements documents
• Design diagrams
• Program source code
They do not require the system to be executed.
These techniques include program inspections, automated
source code analysis, and formal verification.
They cannot check non-functional characteristics of the
software, such as its performance and reliability.
7. Software testing
It involves executing an implementation of the software
with test data and examining the outputs of the software
and its operational behavior to check that it is performing
as required.
It is a dynamic technique of verification and validation.
The system is executed with test data and its operational
behaviour is observed.
Two distinct types of testing
Defect testing : to find inconsistencies between a program
and its specification.
Statistical testing: to test the program’s performance and
reliability and to check how it works under operational
conditions
8. Static and Dynamic V & V
[Figure: the requirements specification, high-level design, formal specification, detailed design, and program are subject to static verification; the prototype and the program are subject to dynamic validation.]
9. Software Testing
fundamentals
Testing is a set of activities that can be planned in
advance and conducted systematically.
Testing is the process of executing a program with the
intent of finding errors.
A good test case is one with a high probability of finding
an as-yet undiscovered error.
A successful test is one that discovers an as-yet-
undiscovered error.
10. Software testing principles
All tests should be traceable to customer
requirements.
Tests should be planned long before testing
begins.
The Pareto principle (80% of all errors will likely
be found in 20% of the code) applies to software
testing.
Testing should begin in the small and progress to
the large.
Exhaustive testing is not possible.
To be most effective, testing should be conducted
by an independent third party.
11. Software Testability Checklist
Operability: the better it works, the more efficiently it can be tested
Observability: what you see is what you test
Controllability: the better the software can be controlled, the more testing can be automated and optimized
Decomposability: by controlling the scope of testing, problems can be isolated and retested more quickly and intelligently
Simplicity: the less there is to test, the more quickly we can test
Stability: the fewer the changes, the fewer the disruptions to testing
Understandability: the more information known, the smarter the testing
12. V&V Vs. Debugging
Verification and validation
A process that establishes the existence of defects in a
software system.
The ultimate goal of the V&V process is to establish
confidence that the software system is “fit for purpose”.
Debugging
A process that locates and corrects these defects
[Figure: the debugging process — test results, the specification, and test cases feed into locating the error, designing the error repair, repairing the error, and re-testing the program.]
13. The defect testing process
[Figure: design test cases → prepare test data → run program with test data → compare results to test cases; the artifacts produced along the way are test cases, test data, test results, and test reports.]
Test data
Inputs which have been devised to test the system
Test cases
Inputs to test the system and the predicted outputs from
these inputs if the system operates according to its
specification
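The distinction above can be made concrete with a minimal sketch (the function and values are illustrative, not from the slides): a test case pairs test data (the input) with the output predicted from the specification.

```python
def absolute_value(x):
    """Unit under test: per its specification, returns |x|."""
    return x if x >= 0 else -x

# Each test case = (test data, predicted output per the specification).
test_cases = [(5, 5), (-3, 3), (0, 0)]

for test_data, predicted in test_cases:
    actual = absolute_value(test_data)
    status = "pass" if actual == predicted else "FAIL"
    print(f"input={test_data} expected={predicted} actual={actual} -> {status}")
```

The test data alone only exercises the system; it is the predicted output that turns a run into a verdict.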
14. Project Planning
Plan: Description
Quality plan: describes the quality procedures and standards that will be used in a project.
Validation plan: describes the approach, resources, and schedule used for system validation.
Configuration management plan: describes the configuration management procedures and structures to be used.
Maintenance plan: predicts the maintenance requirements of the system, maintenance costs, and effort required.
Staff development plan: describes how the skills and experience of the project team members will be developed.
17. Testing Process
Unit testing: individual components are tested independently, without other system components.
Module testing: related collections of dependent components (classes, ADTs, procedures, and functions) are tested, without other system modules.
Sub-system testing: modules are integrated into sub-systems and tested. The focus here should be on interface testing to detect module interface errors or mismatches.
System testing: testing of the system as a whole, validating functional and non-functional requirements and testing emergent system properties.
Acceptance testing: testing with customer data to check that the system is acceptable; also called alpha testing.
18. Component testing
Testing of individual program components
Usually the responsibility of the component developer
(except sometimes for critical systems)
Tests are derived from the developer’s experience
Integration testing
Testing of groups of components integrated to create
a system or sub-system
The responsibility of an independent testing team
Tests are based on a system specification
The testing process
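A developer-written component test of the kind described above might look like the following sketch, using Python's unittest module (the Stack class is a made-up example, not from the slides): one component is exercised in isolation, with tests drawn from the developer's knowledge of it.

```python
import unittest

class Stack:
    """Component under test: a simple LIFO stack."""
    def __init__(self):
        self._items = []
    def push(self, item):
        self._items.append(item)
    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

class StackTest(unittest.TestCase):
    def test_push_then_pop_returns_last_item(self):
        s = Stack()
        s.push(1)
        s.push(2)
        self.assertEqual(s.pop(), 2)

    def test_pop_on_empty_stack_raises(self):
        # The error path is tested too, not only the normal path.
        with self.assertRaises(IndexError):
            Stack().pop()

if __name__ == "__main__":
    unittest.main()
```

No other system component is involved, which is what distinguishes this from the integration tests an independent team later derives from the system specification.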
19. Acceptance Testing
Making sure the software works correctly for the
intended user in his or her normal work
environment.
Alpha test: a version of the complete software is
tested by the customer under the supervision of the
developer, at the developer’s site.
Beta test: a version of the complete software is
tested by the customer at his or her own site, without
the developer being present.
20. Black-box testing
Also known as behavioral or functional testing.
The system is a “black box” whose behavior can be
determined only by studying its inputs and the related outputs.
Knowing the specified function a product is to perform
and demonstrating correct operation based solely on its
specification without regard for its internal logic.
Focus is on the functional requirements of the software, i.e.,
the information domain, not the implementation of the
software; the control structure is disregarded.
The program test cases are based on the system
specification
It is performed during later stages of testing like in the
acceptance testing or beta testing.
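A minimal black-box sketch (the function and specification are illustrative): the tester knows only the specification "leap_year(y) is true iff y is divisible by 4, except for centuries not divisible by 400" and the function's signature, never its internals.

```python
def leap_year(y):
    # Treated as an opaque "black box" by the tester; its internals
    # play no part in designing the tests below.
    return y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)

# Test cases come straight from the specification, not from the code:
spec_cases = {2000: True, 1900: False, 2024: True, 2023: False}

for year, expected in spec_cases.items():
    assert leap_year(year) == expected, f"spec violated for {year}"
print("all specification-derived cases pass")
```

Because the cases derive from the specification alone, the same test set applies unchanged to any re-implementation of the function.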
21. Black-box testing
[Figure: input test data Ie enters the system and output test results Oe are observed; among the inputs are those causing anomalous behaviour, and among the outputs are those which reveal the presence of defects.]
22. Test are designed to answer the following questions:
How is functional validity tested?
How is system behavior and performance tested?
What classes of input will make good test cases?
Is the system particularly sensitive to certain input
values?
How are the boundaries of a data class isolated?
What data rates and data volume can the system
tolerate?
What effect will specific combinations of data have on
system operations?
23. Advantages of Black-box testing
Validates whether or not a given system conforms to
its software specification.
Introduces a series of inputs to a system and compares
the outputs to a pre-defined test specification.
Tests integration between individual system
components.
Tests are architecture-independent: they do not
concern themselves with how a given output is
produced, only with whether that output is the
desired and expected output.
Requires no knowledge of the underlying system; one
need not be a software engineer to design black-box
tests.
24. Disadvantages of Black-box testing
Offers no guarantee that every line of code has been
tested.
Being architecture-independent, it cannot determine
the efficiency of the code.
Will not find errors, such as memory leaks, that
are not explicitly and immediately exposed by the
application.
26. Equivalence Partitioning
Black-box technique that divides the input domain into
classes of data from which test cases can be derived.
An ideal test case uncovers a class of errors (e.g. incorrect
processing of all incorrect data) that might otherwise require many
arbitrary test cases to be executed before the general error is
observed.
Equivalence class guidelines:
If an input condition specifies a range, one valid and two invalid
equivalence classes are defined.
If an input condition requires a specific value, one valid and
two invalid equivalence classes are defined.
If an input condition specifies a member of a set, one valid
and one invalid equivalence class are defined.
If an input condition is Boolean, one valid and one invalid
equivalence class are defined.
28. Equivalence Partitioning (examples)
Input values: a valid range of 10000 to 99999 gives three equivalence classes: less than 10000 (e.g. 9999), between 10000 and 99999 (e.g. 10000, 50000, 99999), and more than 99999 (e.g. 100000).
Number of input values: a valid count of 4 to 10 gives three equivalence classes: fewer than 4 (e.g. 3), between 4 and 10 (e.g. 4, 7, 10), and more than 10 (e.g. 11).
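The slide's range example can be sketched in code: per guideline 1, a valid range of 10000 to 99999 yields one valid and two invalid equivalence classes, and one representative per class suffices (the classify helper is illustrative).

```python
LOW, HIGH = 10000, 99999  # valid input range from the slide's example

def classify(value):
    """Assign an input to its equivalence class."""
    if value < LOW:
        return "invalid: below range"
    if value > HIGH:
        return "invalid: above range"
    return "valid: within range"

# One representative per equivalence class is enough, since all members
# of a class are assumed to be processed the same way by the system.
representatives = {
    9999: "invalid: below range",
    50000: "valid: within range",
    100000: "invalid: above range",
}

for value, expected_class in representatives.items():
    assert classify(value) == expected_class
print("one test case per equivalence class covers the whole input domain")
```

Three test cases thus stand in for the roughly 100,000 arbitrary values a naive approach might try.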
29. Boundary Value Analysis (BVA)
Black-box technique that focuses on the boundaries of the
input domain rather than its center.
BVA guidelines:
If an input condition specifies a range bounded by values a and
b, test cases should include a and b, as well as values just above
and just below a and b.
If an input condition specifies a number of values, test
cases should exercise the minimum and maximum
numbers, as well as values just above and just below the
minimum and maximum values.
Apply guidelines 1 and 2 to output conditions: test cases
should be designed to produce the minimum and maximum
output reports.
If internal program data structures have boundaries (e.g. size
limitations), be certain to test those boundaries.
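Guideline 1 above can be sketched as a small generator of boundary test inputs (a minimal sketch; the helper name is illustrative):

```python
def boundary_values(a, b):
    """BVA test inputs for a valid integer range [a, b]: the bounds
    themselves plus the values just below and just above each bound."""
    return [a - 1, a, a + 1, b - 1, b, b + 1]

# For the equivalence-partitioning slide's range 10000..99999:
print(boundary_values(10000, 99999))
# -> [9999, 10000, 10001, 99998, 99999, 100000]
```

Note how BVA complements equivalence partitioning: the same three classes are covered, but the representatives are deliberately pushed to the class edges, where errors cluster.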
30. Comparison Testing
Also called back-to-back testing.
Black-box testing for safety-critical systems (such
as aircraft avionics or automobile braking systems) in
which independently developed implementations of
redundant systems are tested for conformance to the
specification.
Often equivalence class partitioning is used to
develop a common set of test cases for each
implementation.
31. Orthogonal Array Testing
Black-box technique that enables the design of a
reasonably small set of test cases that provide
maximum test coverage.
Focus is on categories of faulty logic likely to be
present in the software component (without
examining the code).
Priorities for assessing tests using an orthogonal
array:
Detect and isolate all single-mode faults
Detect all double-mode faults
Detect multimode faults
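The coverage property can be illustrated with the standard L4 orthogonal array (a sketch using well-known values, not material from the slides): three two-level factors are covered in four runs instead of all 2^3 = 8, yet every pair of factors still sees all four of its value combinations, which is what lets a small test set detect single- and double-mode faults.

```python
from itertools import combinations

# The classic L4(2^3) orthogonal array: 4 runs over 3 two-level factors.
L4 = [
    (0, 0, 0),
    (0, 1, 1),
    (1, 0, 1),
    (1, 1, 0),
]

# Every pair of factor columns covers all four combinations exactly once.
for i, j in combinations(range(3), 2):
    pairs = {(run[i], run[j]) for run in L4}
    assert pairs == {(0, 0), (0, 1), (1, 0), (1, 1)}
print("4 runs instead of 2**3 = 8, with full pairwise coverage")
```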
32. White-box or Glass-box testing
Knowing the internal workings of a product,
tests are performed to check the workings of all
independent logic paths.
It derives test cases that:
Guarantee that all independent paths within a module
have been exercised at least once.
Exercise all logical decisions on their true and false
sides.
Execute all loops at their boundaries and within their
operational bounds, and
Exercise internal data structures to ensure their
validity.
Techniques used: basis path testing and control
structure testing.
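A minimal white-box sketch (the function is made up for illustration): here the tester looks inside the code, sees one decision and therefore two independent paths, and writes tests that exercise the decision on both its true and false sides.

```python
def apply_discount(price, is_member):
    if is_member:              # decision point seen by the white-box tester
        return price * 0.9     # path 1: true branch
    return price               # path 2: false branch

assert apply_discount(100, True) == 90.0   # exercises the true side
assert apply_discount(100, False) == 100   # exercises the false side
print("both independent paths exercised at least once")
```

A black-box tester might happen to cover only one branch; the white-box view makes the second path, and the test it requires, explicit.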
34. Tests complete systems or subsystems
composed of integrated components
Integration testing should be black-box testing
with tests derived from the specification
Main difficulty is localising errors
Incremental integration testing reduces this
problem.
Incremental integration strategies include
Top-down integration
Bottom-up integration
Regression testing
Smoke testing
Integration Testing
35. Top-down testing
Start with high-level system and integrate from the
top-down replacing individual components by stubs
where appropriate
Bottom-up testing
Integrate individual components in levels until the
complete system is created
In practice, most integration involves a combination
of these strategies
Approaches to
integration testing
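The stub substitution used in top-down integration can be sketched as follows (all names are illustrative): the high-level component is tested before its lower-level dependency exists, by passing in a stub that returns a canned answer.

```python
def tax_rate_stub(region):
    """Stub standing in for the not-yet-integrated tax-lookup component."""
    return 0.25  # canned answer, regardless of region

def price_with_tax(price, region, tax_lookup):
    """High-level component under test; depends on a lower-level lookup."""
    return price * (1 + tax_lookup(region))

# The high-level logic can be integration-tested now; the real tax lookup
# later replaces the stub without any change to price_with_tax itself.
assert price_with_tax(100, "EU", tax_rate_stub) == 125.0
print("high-level component tested top-down via a stub")
```

Bottom-up testing inverts this: the low-level component is real and a throwaway test driver plays the role of the missing caller.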
37. Bottom-up testing
[Figure: level N components are tested first using test drivers, then combined into level N–1 components; the testing sequence proceeds upward through the levels.]
38. System Testing
Recovery testing
Checks the system’s ability to recover from failures.
Security testing
Verifies that system protection mechanisms prevent improper
penetration or data alteration
Stress testing
Program is checked to see how well it deals with abnormal
resource demands – quantity, frequency, or volume.
Performance testing
Designed to test the run-time performance of software,
especially real-time software.
39. Object-oriented Testing
The components to be tested are object classes
that are instantiated as objects
Object classes are larger-grained than individual functions,
so approaches to white-box testing have to be
extended
No obvious ‘top’ to the system for top-down
integration and testing
40. Acceptance Test
Format
Test Item List
Identification of Test-item
Testing Detail
Detailed testing procedure
Testing Result
Summary of testing-item
41. Test-item List
Item No. | Test Item | Sub-item No. | Test Sub-item | Level
SR-02 | Staff Review | SR-02-01 | Program Officer Review | A
SR-02 | Staff Review | SR-02-02 | Early Decline Report | A
Test Levels:
A: basic function, compulsory
B: enhanced function, compulsory
C: enhanced function, optional
42. Testing Details (SR-02 Staff Review)
Item No: SR-02-01 | Test Date:
Item: Staff Review | Sub-item: PO Review, Report: Early Decline
Precondition:
Test Procedure:
Test Standard:
Test Description:
Test Result and Conclusion: Passed / Failed
Sign of the Tester: | Sign of the Manager:
43. References
From Software Engineering: A Practitioner’s
Approach by Roger S. Pressman
– Chapter 17: Software Testing Techniques
• Software testing fundamentals
• Test-case design
• White-box testing: basis path and control structure testing
• Black-box testing
– Chapter 18: Software Testing Strategies
• A strategic approach to software testing
• Unit, integration, validation, and system testing
From Software Engineering by Ian Sommerville
– Part 5: Verification and Validation
• Chapter 19: Verification and Validation
• Chapter 20: Software Testing