Learn Software Testing
For Beginners
Introduction & Fundamentals
What is Quality?
What is Software Testing?
Why is testing necessary?
Who does the testing?
What has to be tested?
When is testing done?
How often to test?
What is the cost of Quality?
What are Testing Standards?
What is Quality?
• Quality is "fitness for use" (Joseph Juran)
• Quality is "conformance to requirements" (Philip B. Crosby)
• Quality of a product or service is its ability to satisfy the needs and expectations of the customer
Deming's Learning Cycle of Quality
"Inspection with the aim of finding the bad ones and throwing them out is too late, ineffective and costly. Quality comes not from inspection but from improvement of the process."
– Dr. W. Edwards Deming, founder of the quality revolution
Juran’s Perception of Quality
Most Common Software Problems
• Incorrect calculations
• Incorrect and ineffective data edits
• Incorrect matching and merging of data
• Data searches that yield incorrect results
• Incorrect processing of data relationships
• Incorrect coding / implementation of business rules
• Inadequate software performance
• Confusing or misleading data
• Poor software usability for end users
• Obsolete software
• Inconsistent processing
• Unreliable results or performance
• Inadequate support of business needs
• Incorrect or inadequate interfaces with other systems
• Inadequate performance and security controls
• Incorrect file handling
Objectives of testing
• Executing a program with the intent of finding an error.
• To check that the system meets its requirements and can be executed successfully in the intended environment.
• To check that the system is "fit for purpose".
• To check that the system does what it is expected to do.
Objectives of testing
• A good test case is one that has a high probability of finding an as-yet-undiscovered error.
• A successful test is one that uncovers an as-yet-undiscovered error.
• A good test is not redundant.
• A good test should be "best of breed".
• A good test should be neither too simple nor too complex.
Objective of a Software Tester
• Find bugs as early as possible and make sure they get fixed.
• Understand the application well.
• Study the functionality in detail to find where bugs are likely to occur.
• Study the code to ensure that each and every line of code is tested.
• Create test cases in such a way that testing uncovers hidden bugs and also ensures that the software is usable and reliable.
VERIFICATION & VALIDATION
Verification – typically involves reviews and meetings to evaluate documents, plans, code, requirements, and specifications. This can be done with checklists, issues lists, walkthroughs, and inspection meetings.
Validation – typically involves actual testing and takes place after verifications are completed.
The verification and validation process continues in a cycle until the software becomes defect-free.
TESTABILITY
• Operability
• Observability
• Controllability
• Decomposability
• Stability
• Understandability
Software Development Process Cycle
Plan → Do → Check → Act
• PLAN (P): Devise a plan. Define your objective and determine the strategy and supporting methods required to achieve it.
• DO (D): Execute the plan. Create the conditions and perform the necessary training to execute the plan.
• CHECK (C): Check the results. Determine whether work is progressing according to the plan and whether the expected results are being obtained.
• ACT (A): Take the necessary and appropriate action if checking reveals that the work is not being performed according to plan or not as anticipated.
QUALITY PRINCIPLES
Quality - the most important factor affecting an
organization’s long-term performance.
Quality - the way to achieve improved
productivity and competitiveness in any
organization.
Quality - saves. It does not cost.
Quality - is the solution to the problem, not a
problem.
Cost of Quality
Prevention Cost
Amount spent before the product is actually built. Cost incurred in establishing methods and procedures, training workers, acquiring tools and planning for quality.
Appraisal Cost
Amount spent after the product is built but before it is shipped to the user. Cost of inspection, testing, and reviews.
Failure Cost
Amount spent to repair failures. Cost associated with defective products that have been delivered to the user or moved into production, including the cost of repairing products to make them meet requirements.
Quality Assurance vs Quality Control
Quality Assurance:
• A planned and systematic set of activities necessary to provide adequate confidence that requirements are properly established and products or services conform to specified requirements.
• An activity that establishes and evaluates the processes used to produce the products.
Quality Control:
• The process by which product quality is compared with applicable standards, and the action taken when non-conformance is detected.
• An activity which verifies whether the product meets pre-defined standards.
Quality Assurance vs Quality Control
• QA helps establish processes; QC implements the process.
• QA sets up measurement programs to evaluate processes; QC verifies whether specific attributes are present in a specific product or service.
• QA identifies weaknesses in processes and improves them; QC identifies defects for the primary purpose of correcting them.
• QA is the responsibility of the entire team; QC is the responsibility of the tester.
• QA prevents the introduction of issues or defects; QC detects, reports and corrects defects.
• QA evaluates whether quality control is working, primarily to determine whether there is a weakness in the process; QC evaluates whether the application is working, primarily to determine whether there is a flaw or defect in the functionality.
Responsibilities of QA and QC
• QA improves the process, which applies to all products that will ever be produced by that process; QC improves the development of a specific product or service.
• QA personnel should not perform quality control unless they are doing it to validate that quality control is working; QC personnel may perform quality assurance tasks if and when required.
SEI – CMM
The Software Engineering Institute (SEI) developed the Capability Maturity Model (CMM).
CMM describes the prime elements – planning, engineering, and managing software development and maintenance.
CMM can be used for
• Software process improvement
• Software process assessment
• Software capability evaluations
The CMM is organized into five maturity levels:
Level 1 – Initial
Level 2 – Repeatable (disciplined process)
Level 3 – Defined (standard, consistent process)
Level 4 – Managed (predictable process)
Level 5 – Optimizing (continuous improvement process)
Phases of SDLC
• Requirement Specification and Analysis
• Design
• Coding
• Testing
• Implementation
• Maintenance
SOFTWARE DEVELOPMENT LIFE
CYCLE (SDLC)
Requirement Specification and Analysis
• User Requirement Specification (URS)
• Software Requirement Specification (SRS)
The output of the SRS is the input to the design phase.
Design
Two types of design:
• High Level Design (HLD)
• Low Level Design (LLD)
High Level Design (HLD)
• List of modules and a brief description of each module.
• Brief functionality of each module.
• Interface relationships among modules.
• Dependencies between modules (if A exists, B exists, etc.).
• Database tables identified, along with key elements.
• Overall architecture diagrams, along with technology details.
Low Level Design (LLD)
• Detailed functional logic of the module, in pseudo code.
• Database tables, with all elements, including their type and size.
• All interface details.
• All dependency issues.
• Error message listings.
• Complete inputs and outputs for a module.
The Design Process
Breaking the product down into independent modules, to arrive at the micro level.
Two different approaches are followed in designing:
• Top-Down Approach
• Bottom-Up Approach
Coding
Developers use the LLD document and
write the code in the programming language
specified.
Testing
The testing process involves development of
a test plan, executing the plan and
documenting the test results.
Implementation
Installation of the product in its operational
environment.
Maintenance
After the software is released and the client starts using it, the maintenance phase begins. Three things happen: bug fixing, upgrades, and enhancements.
Bug fixing – fixing bugs that arise from untested scenarios.
Upgrade – upgrading the application to newer versions of the software.
Enhancement – adding new features to the existing software.
SOFTWARE LIFE CYCLE MODELS
WATERFALL MODEL
V-PROCESS MODEL
SPIRAL MODEL
PROTOTYPE MODEL
INCREMENTAL MODEL
EVOLUTIONARY DEVELOPMENT
MODEL
Project Management
• Project Staffing
• Project Planning
• Project Scheduling
Project Staffing
• The project budget may not allow highly-paid staff to be used.
• Staff with the appropriate experience may not be available.
Project Planning
• Quality plan – Describes the quality procedures and standards used in a project.
• Validation plan – Describes the approach, resources and schedule used for system validation.
• Configuration management plan – Describes the configuration management procedures and structures to be used.
• Maintenance plan – Predicts the maintenance requirements, costs and effort of the system.
• Staff development plan – Describes how the skills and experience of the project team members will be developed.
Project Scheduling
• Bar charts and Activity Networks
• Scheduling problems
RISK MANAGEMENT
• Risk identification
• Risk Analysis
• Risk Planning
• Risk Monitoring
Risk – Type – Description
• Staff turnover (Project) – Experienced staff will leave the project before it is finished.
• Management change (Project) – There will be a change of organizational management, with different priorities.
• Hardware unavailability (Project) – Hardware which is essential for the project will not be delivered on schedule.
• Requirements change (Project & Product) – There will be a larger number of changes to the requirements than anticipated.
• Specification delays (Project & Product) – Specifications of essential interfaces are not available on schedule.
• Size underestimate (Project & Product) – The size of the system has been underestimated.
• CASE tool underperformance (Product) – CASE tools which support the project do not perform as anticipated.
• Technology change (Business) – The underlying technology on which the system is built is superseded by new technology.
• Product competition (Business) – A competitive product is marketed before the system is completed.
Configuration Management
[Diagram: an initial system evolves into multiple variants – PC, DEC, VMS, Unix, Sun, Workstation and Mainframe versions.]
Configuration Management (CM) Standards
• CM should be based on a set of standards, which are applied within an organization.
CM Planning
• Documents required for future system maintenance should be identified and included as managed documents.
• It defines the types of documents to be managed, and a document naming scheme.
Change Management
• Keeping track of and managing changes, and ensuring that they are implemented in the most cost-effective way.
Change Request Form
A part of the CM planning process. It records:
• the change required
• who suggested the change
• why the change was suggested
• the urgency of the change
• the change evaluation
• the impact analysis
• the change cost
• recommendations (from system maintenance staff)
VERSION AND RELEASE MANAGEMENT
• Devise an identification scheme for system versions, and plan when a new system version is to be produced.
• Ensure that version management procedures and tools are properly applied, and plan and distribute new system releases.
Versions / Variants / Releases
• Variant – An instance of a system which is functionally identical to, but non-functionally distinct from, other instances of the system.
• Version – An instance of a system which is functionally distinct in some way from other system instances.
• Release – An instance of a system which is distributed to users outside of the development team.
SOFTWARE TESTING LIFECYCLE – PHASES
• Requirements study
• Test Case Design and Development
• Test Execution
• Test Closure
• Test Process Analysis
Requirements study
• The testing cycle starts with the study of the client's requirements.
• Understanding the requirements is essential for testing the product.
Analysis & Planning
• Test objective and coverage
• Overall schedule
• Standards and Methodologies
• Resources required, including necessary training
• Roles and responsibilities of the team members
• Tools used
Test Case Design and Development
• Component Identification
• Test Specification Design
• Test Specification Review
Test Execution
• Code Review
• Test execution and evaluation
• Performance and simulation
Test Closure
• Test summary report
• Project De-brief
• Project Documentation
Test Process Analysis
Analysis of the test reports, and improvement of the application's performance by implementing new technology and additional features.
DIFFERENT LEVELS OF TESTING
Testing Levels
• Unit testing
• Integration testing
• System testing
• Acceptance testing
Unit testing
• The most 'micro' scale of testing.
• Tests done on particular functions or code modules.
• Requires knowledge of the internal program design and code.
• Done by programmers (not by testers).
Unit testing
Objectives:
• To test the function of a program or unit of code, such as a program or module
• To test internal logic
• To verify internal design
• To test path & condition coverage
• To test exception conditions & error handling
When: After modules are coded
Input: Internal Application Design; Master Test Plan; Unit Test Plan
Output: Unit Test Report
Who: Developer
Methods: White Box testing techniques; test coverage techniques
Tools: Debuggers; re-structurers; code analyzers; path/statement coverage tools
Education: Testing methodology; effective use of tools
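As a minimal sketch of the ideas above, a unit test exercises a single function in isolation, checking its internal logic, its boundary conditions, and its error handling. The `calculate_discount` function below is a hypothetical unit under test, not taken from the deck:

```python
import unittest

def calculate_discount(amount):
    # Hypothetical business rule: 10% discount on orders of 100 or more.
    if amount < 0:
        raise ValueError("amount must be non-negative")
    return amount * 0.9 if amount >= 100 else amount

class CalculateDiscountTest(unittest.TestCase):
    def test_below_threshold(self):
        # Path 1: no discount applied
        self.assertEqual(calculate_discount(50), 50)

    def test_at_threshold(self):
        # Boundary: discount starts exactly at 100
        self.assertEqual(calculate_discount(100), 90)

    def test_error_handling(self):
        # Exception condition: negative input is rejected
        with self.assertRaises(ValueError):
            calculate_discount(-1)
```

Run with `python -m unittest` to get a Unit Test Report in the sense of the table above.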
Incremental integration testing
• Continuous testing of an application as and when new functionality is added.
• The application's functional areas are required to be independent enough to work separately before completion of development.
• Done by programmers or testers.
Integration Testing
• Testing of combined parts of an application to determine their functional correctness.
• 'Parts' can be
  - code modules
  - individual applications
  - client/server applications on a network
Types of Integration Testing
• Big Bang testing
• Top Down Integration testing
• Bottom Up Integration testing
Integration testing
Objectives: To technically verify proper interfacing between modules, and within sub-systems
When: After modules are unit tested
Input: Internal & External Application Design; Master Test Plan; Integration Test Plan
Output: Integration Test Report
Who: Developers
Methods: White and Black Box techniques; problem / configuration management
Tools: Debuggers; re-structurers; code analyzers
Education: Testing methodology; effective use of tools
System Testing
Objectives:
• To verify that the system components perform control functions
• To perform inter-system tests
• To demonstrate that the system performs both functionally and operationally as specified
• To perform appropriate types of tests relating to Transaction Flow, Installation, Reliability, Regression etc.
When: After Integration Testing
Input: Detailed Requirements & External Application Design; Master Test Plan; System Test Plan
Output: System Test Report
Who: Development team and users
Methods: Problem / configuration management
Tools: Recommended set of tools
Education: Testing methodology; effective use of tools
Systems Integration Testing
Objectives:
• To test the co-existence of products and applications that are required to perform together in the production-like operational environment (hardware, software, network)
• To ensure that the system functions together with all the components of its environment as a total system
• To ensure that the system releases can be deployed in the current environment
When: After system testing; often performed outside of the project life-cycle
Input: Test Strategy; Master Test Plan; Systems Integration Test Plan
Output: Systems Integration Test Report
Who: System testers
Methods: White and Black Box techniques; problem / configuration management
Tools: Recommended set of tools
Education: Testing methodology; effective use of tools
Acceptance Testing
Objectives: To verify that the system meets the user requirements
When: After System Testing
Input: Business Needs & Detailed Requirements; Master Test Plan; User Acceptance Test Plan
Output: User Acceptance Test Report
Who: Users / end users
Methods: Black Box techniques; problem / configuration management
Tools: Compare, keystroke capture & playback, regression testing tools
Education: Testing methodology; effective use of tools; product knowledge; business release strategy
TESTING METHODOLOGIES AND TYPES
Testing methodologies
• Black box testing
• White box testing
• Incremental testing
• Thread testing
Black box testing
• No knowledge of internal design or code required.
• Tests are based on requirements and functionality.
White box testing
• Knowledge of the internal program design and code required.
• Tests are based on coverage of code statements, branches, paths and conditions.
BLACK BOX – TESTING TECHNIQUE
Black box testing attempts to find errors in the following categories:
• Incorrect or missing functions
• Interface errors
• Errors in data structures or external database access
• Performance errors
• Initialization and termination errors
Black box / Functional testing
• Based on requirements and functionality
• Not based on any knowledge of internal design or code
• Covers all combined parts of a system
• Tests are data driven
White box testing / Structural testing
• Based on knowledge of the internal logic of an application's code
• Based on coverage of code statements, branches, paths and conditions
• Tests are logic driven
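A black-box, data-driven test in the sense above can be sketched as a table of (input, expected output) pairs derived purely from the requirements; the implementation is treated as opaque. The `leap_year` function here is a hypothetical unit under test, not from the deck:

```python
def leap_year(y):
    # Implementation is irrelevant to the black-box tester;
    # only the requirement ("divisible by 4, except centuries
    # not divisible by 400") matters.
    return y % 4 == 0 and (y % 100 != 0 or y % 400 == 0)

# Data-driven cases taken from the requirements, not the code.
cases = [
    (2000, True),    # divisible by 400
    (1900, False),   # divisible by 100 but not 400
    (2024, True),    # divisible by 4
    (2023, False),   # not divisible by 4
]

for value, expected in cases:
    assert leap_year(value) == expected, f"failed for {value}"
```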
Functional testing
• Black box type testing geared to the functional requirements of an application.
• Done by testers.
System testing
• Black box type testing that is based on overall requirements specifications; covers all combined parts of the system.
End-to-end testing
• Similar to system testing; involves testing of a complete application environment in a situation that mimics real-world use.
Sanity testing
• Initial effort to determine whether a new software version is performing well enough to accept it for a major testing effort.
Regression testing
• Re-testing after fixes or modifications of the software or its environment.
Acceptance testing
• Final testing based on the specifications of the end-user or customer.
Load testing
• Testing an application under heavy loads.
• E.g. testing a web site under a range of loads to determine at what point the system's response time degrades or fails.
Stress Testing
• Testing under unusually heavy loads, heavy repetition of certain actions or inputs, input of large numerical values, large complex queries to a database, etc.
• The term is often used interchangeably with 'load' and 'performance' testing.
Performance testing
• Testing how well an application complies with performance requirements.
Install/uninstall testing
• Testing of full, partial or upgrade install/uninstall processes.
Recovery testing
• Testing how well a system recovers from crashes, hardware failures or other problems.
Compatibility testing
• Testing how well software performs in a particular hardware/software/OS/network environment.
Exploratory testing / ad-hoc testing
• Informal software testing that is not based on formal test plans or test cases; testers learn the software in its totality as they test it.
Comparison testing
• Comparing software strengths and weaknesses with competing products.
Alpha testing
• Testing done when development is nearing completion; minor design changes may still be made as a result of such testing.
Beta testing
• Testing when development and testing are essentially complete, and final bugs and problems need to be found before release.
Mutation testing
• Determining whether a set of test data or test cases is useful, by deliberately introducing various bugs.
• Re-testing with the original test data/cases to determine whether the bugs are detected.
White Box – Testing Technique
• All independent paths within a module are exercised at least once
• Exercise all logical decisions on their true and false sides
• Execute all loops at their boundaries and within their operational bounds
• Exercise internal data structures to ensure their validity
Loop Testing
This white box technique focuses on the validity of loop constructs. Four different classes of loops can be defined:
• simple loops
• nested loops
• concatenated loops
• unstructured loops
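For a simple loop with a maximum of n passes, a common loop-testing heuristic is to exercise 0, 1, 2, a typical number, n-1, n, and (where possible) n+1 iterations. A sketch, using a hypothetical `sum_first` loop rather than anything from the deck:

```python
def sum_first(values, limit):
    # Simple loop under test: sums at most `limit` leading items.
    total = 0
    count = 0
    for v in values:
        if count >= limit:
            break
        total += v
        count += 1
    return total

data = [1, 2, 3, 4, 5]
n = len(data)
# Exercise the loop at its boundaries and within its operational bounds:
# skip the loop, one pass, two passes, a typical count, n-1, n, and n+1.
for passes in (0, 1, 2, 3, n - 1, n, n + 1):
    expected = sum(data[:min(passes, n)])
    assert sum_first(data, passes) == expected
```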
Other White Box Techniques
Statement Coverage – execute all statements at least once.
Decision Coverage – execute each decision outcome (true and false) at least once.
Condition Coverage – execute each condition within a decision with all possible outcomes at least once.
Decision / Condition Coverage – execute each decision, and each condition within it, with all possible outcomes at least once.
Multiple Condition Coverage – execute all possible combinations of condition outcomes in each decision at least once.
Examples
Statement Coverage – Examples
  E.g. A + B

  If (A = 3) Then
    B = X + Y
  End-If

  While (A > 0) Do
    Read (X)
    A = A - 1
  End-While

Decision Coverage – Example
  If A < 10 or A > 20 Then
    B = X + Y

Condition Coverage – Example
  A = X
  If (A > 3) or (A < B) Then
    B = X + Y
  End-If

  While (A > 0) and (Not EOF) Do
    Read (X)
    A = A - 1
  End-While
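To make the distinction concrete, the decision `A < 10 or A > 20` from the example above can be exercised in Python. A single true case gives statement coverage; decision coverage additionally needs the false outcome of the whole decision; condition coverage needs each individual condition to take both outcomes across the test set:

```python
def classify(a, x, y):
    # Decision with two conditions, mirroring the slide's example.
    b = None
    if a < 10 or a > 20:
        b = x + y
    return b

# Statement coverage: one true case executes every statement.
assert classify(5, 1, 2) == 3        # a < 10 is True

# Decision coverage: add a case where the whole decision is False.
assert classify(15, 1, 2) is None    # both conditions False

# Condition coverage: a > 20 True (with a < 10 False) covers the
# remaining outcome of each condition.
assert classify(25, 1, 2) == 3
```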
Incremental Testing
• A disciplined method of testing the interfaces between unit-tested programs, as well as between system components.
• Involves adding unit-tested program modules or components one by one, and testing each resulting combination.
Two types of Incremental Testing
• Top-down – testing starts from the top of the module hierarchy and works down to the bottom. Modules are added in descending hierarchical order.
• Bottom-up – testing starts from the bottom of the hierarchy and works up to the top. Modules are added in ascending hierarchical order.
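In top-down integration, lower modules that are not yet integrated are conventionally replaced by stubs (bottom-up integration uses drivers instead). A sketch with hypothetical `invoice_total` and tax modules, purely for illustration:

```python
# Top-down integration sketch: the top module is real, the lower
# module is replaced by a stub until it is integrated.
def tax_stub(amount):
    # Stub standing in for the real tax module: returns a canned value.
    return 0.0

def invoice_total(amount, tax_fn):
    # Top-level module under integration test.
    return amount + tax_fn(amount)

# First exercise the top module against the stub...
assert invoice_total(100.0, tax_stub) == 100.0

# ...then swap in the real lower module when it is ready and re-test
# the combination.
def real_tax(amount):
    return amount * 0.2

assert invoice_total(100.0, real_tax) == 120.0
```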
Testing Levels / Techniques
• Unit Testing – White Box
• Integration Testing – White Box, Black Box, Incremental
• System Testing – Black Box, Thread
• Acceptance Testing – Black Box
Major Testing Types
• Stress / Load Testing
• Performance Testing
• Recovery Testing
• Conversion Testing
• Usability Testing
• Configuration Testing
Stress / Load Test
• Evaluates a system or component at or beyond the limits of its specified requirements.
• Determines the load under which it fails, and how.
Performance Test
• Evaluates the compliance of a system or component with specified performance requirements.
• Often performed using an automated test tool to simulate a large number of users.
Recovery Test
Confirms that the system recovers from expected or unexpected events without loss of data or functionality, e.g.
• Shortage of disk space
• Unexpected loss of communication
• Power-out conditions
Conversion Test
• Testing of code that is used to convert data from existing systems for use in newly replaced systems.
Usability Test
• Testing how easily users can learn and use the product.
Configuration Test
• Examines an application's requirements for pre-existing software, initial states and configuration in order to maintain proper functionality.
TEST PLAN
Objectives
• To create a set of testing tasks.
• To assign resources to each testing task.
• To estimate completion time for each testing task.
• To document testing standards.
A test plan is a document that describes the scope, approach, resources and schedule of intended test activities. It identifies the test items, the features to be tested, the testing tasks, task allotment, and risks requiring contingency planning.
Purpose of preparing a Test Plan
• To validate the acceptability of a software product.
• To help people outside the test group understand the 'why' and 'how' of product validation.
• A Test Plan should be
  - thorough enough (overall coverage of the tests to be conducted)
  - useful and understandable by people inside and outside the test group.
Scope
• The areas to be tested by the QA team.
• Specify the areas which are out of scope (screens, database, mainframe processes etc.).
Test Approach
• Details of how the testing is to be performed.
• Any specific strategy to be followed for testing (including configuration management).
Entry Criteria
Steps to be performed before the start of a test, i.e. pre-requisites. E.g.
• Timely environment set-up
• Starting the web server / app server
• Successful implementation of the latest build, etc.
Resources
List of the people involved in the project, their designations, etc.
Tasks / Responsibilities
Tasks to be performed and responsibilities assigned to the various team members.
Exit Criteria
Contains tasks like
• Bringing down the system / server
• Restoring the system to the pre-test environment
• Database refresh, etc.
Schedule / Milestones
Deals with the final delivery date and the various milestone dates.
Hardware / Software Requirements
• Details of the PCs / servers required to install the application or perform the testing.
• Specific software needed to get the application running or to connect to the database, etc.
Risks & Mitigation Plans
• List the possible risks during testing.
• Mitigation plans to implement in case a risk actually turns into reality.
Tools to be used
• List the testing tools or utilities.
• E.g. WinRunner, LoadRunner, TestDirector, Rational Robot, QTP.
Deliverables
• The various deliverables due to the client at various points of time, i.e. daily / weekly / start of the project / end of the project, etc.
• These include test plans, test procedures, test metrics, status reports, test scripts, etc.
References
• Procedures
• Templates (client-specific or otherwise)
• Standards / guidelines, e.g. Qview
• Project-related documents (RSD, ADD, FSD etc.)
Annexure
• Links to documents which have been / will be used in the course of testing, e.g. templates used for reports, test cases etc.
• Referenced documents can also be attached here.
Sign-off
• Mutual agreement between the client and the QA team.
• Both leads/managers sign their agreement on the Test Plan.
Good Test Plans
• Developed and reviewed early.
• Clear, complete and specific.
• Specify tangible deliverables that can be inspected.
• Staff know what to expect and when to expect it.
• Realistic quality levels for goals.
• Include time for planning.
• Can be monitored and updated.
• Include user responsibilities.
• Based on past experience.
• Recognize learning curves.
TEST CASES
A test case is defined as
• A set of test inputs, execution conditions and expected results, developed for a particular objective.
• Documentation specifying inputs, predicted results and a set of execution conditions for a test item.
• The specific inputs that will be tried and the procedures that will be followed when the software is tested.
• A sequence of one or more subtests executed as a sequence, where the outcome and/or final state of one subtest is the input and/or initial state of the next.
• A specification of the pretest state of the AUT (application under test) and its environment, and the test inputs or conditions; the expected result specifies what the AUT should produce from the test inputs.
Test Cases
Contents
• Test plan reference id
• Test case
• Test condition
• Expected behavior
Good Test Cases
• Find defects: have a high probability of finding a new defect.
• Produce an unambiguous, tangible result that can be inspected.
• Are repeatable and predictable.
• Are traceable to requirements or design documents.
• Push the system to its limits.
• Execution and tracking can be automated.
• Do not mislead.
• Are feasible.
Defect Life Cycle
What is a Defect?
A defect is a variance from a desired product attribute. Two categories of defects are
• Variance from product specifications – the product built varies from the product specified.
• Variance from customer/user expectations – something the user specified is not in the built product, or something not specified has been included.
Defect categories
• Wrong – The specifications have been implemented incorrectly.
• Missing – A specified requirement is not in the built product.
• Extra – A requirement incorporated into the product that was not specified.
Defect Log
1. Defect ID number
2. Descriptive defect name and type
3. Source of defect – test case or other source
4. Defect severity
5. Defect priority
6. Defect status (e.g. new, open, fixed, closed, reopened, rejected)
7. Date and time tracking for either the most recent status change, or for each change in status
8. Detailed description, including the steps necessary to reproduce the defect
9. Component or program where the defect was found
10. Screen prints, logs, etc. that will aid the developer in the resolution process
11. Stage of origination
12. Person assigned to research and/or correct the defect
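The log entries above map naturally onto a structured record. The sketch below uses illustrative field names (not a prescribed schema) to show how such a record might be captured and its status transitions tracked:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class Defect:
    defect_id: int                # 1. Defect ID number
    name: str                     # 2. Descriptive defect name and type
    source: str                   # 3. Test case or other source
    severity: str                 # 4. e.g. critical / major / minor
    priority: str                 # 5. e.g. high / medium / low
    status: str = "new"           # 6. new, open, fixed, closed, reopened, rejected
    updated: datetime = field(default_factory=datetime.now)  # 7. status-change time
    steps_to_reproduce: str = ""  # 8. detailed description
    component: str = ""           # 9. where the defect was found
    assignee: str = ""            # 12. person assigned to correct it

d = Defect(101, "Login button unresponsive", "TC-017", "major", "high")
d.status = "open"                 # a status transition recorded in the log
d.updated = datetime.now()
```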
Severity vs Priority
Severity
A factor that shows how bad the defect is, and the impact it has on the product.
Priority
Based on input from users about which defects are most important to them and should be fixed first.
Severity Levels
• Critical
• Major / High
• Average / Medium
• Minor / Low
• Cosmetic defects
Severity Level – Critical
• An installation process which does not load a component.
• A missing menu option.
• Security permission required to access a function under test.
• Functionality that does not permit further testing.
• Runtime errors, such as JavaScript errors.
• Functionality missed out / incorrectly implemented (major deviation from requirements).
• Performance issues (if specified by the client).
• Browser and operating system incompatibility issues, depending on the impact of the error.
• Dead links.
Severity Level – Major / High
• A reboot of the system.
• The wrong field being updated.
• An update operation that fails to complete.
• Performance issues (if not specified by the client).
• Mandatory validations missing for mandatory fields.
• Functionality incorrectly implemented (minor
deviation from requirements).
• Images or graphics missing in a way that hinders
functionality.
• Front end / home page alignment issues.
Severity Level – Average / Medium
• Incorrect or missing hot-key operation.
Severity Level – Minor / Low
• Misspelled or ungrammatical text
• Inappropriate or incorrect formatting (such as
text font, size, alignment, color, etc.)
• Screen layout issues
• Documentation errors
• Page titles missing
• Alt text missing for images
• Background color for pages other than the
home page
• Default values missing for required fields
• Cursor focus and tab flow on the page
• Images or graphics missing that do not hinder
functionality
Test Reports
8 INTERIM REPORTS
• Functional Testing Status
• Functions Working Timeline
• Expected Vs Actual Defects Detected Timeline
• Defects Detected Vs Corrected Gap Timeline
• Average Age of Detected Defects by Type
• Defect Distribution
• Relative Defect Distribution
• Testing Action
Functional Testing Status Report
The report shows the percentage of functions that are
• Fully tested
• Tested with open defects
• Not tested
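As a rough illustration (the counts are invented, not from the slides), the three percentages are simple shares of the total function count:

```python
def functional_status(fully_tested: int, open_defects: int, not_tested: int) -> dict:
    """Express each category as a percentage of all functions under test."""
    total = fully_tested + open_defects + not_tested
    return {
        "fully_tested": round(100 * fully_tested / total, 1),
        "tested_with_open_defects": round(100 * open_defects / total, 1),
        "not_tested": round(100 * not_tested / total, 1),
    }
```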
Functions Working Timeline
• The report compares the planned timeline for having
all functions working against the current status of
working functions.
• A line graph is an ideal format.
Expected Vs. Actual Defects Detected
• Compares the number of defects actually detected
against the number expected at the planning stage.
Defects Detected Vs. Corrected Gap
• A line graph showing the number of defects
uncovered versus the number of defects corrected
and accepted by the testing group.
Average Age Detected Defects by Type
• The average number of days defects remain open,
by severity type or level.
• The planning stage provides the acceptable number
of open days by defect type.
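Computing this report from a defect log is a small grouping exercise. A sketch, assuming each open defect is recorded as a (severity, date opened) pair (the input data is hypothetical):

```python
from collections import defaultdict
from datetime import date

def average_age_by_severity(open_defects, today: date) -> dict:
    """Average age in days of still-open defects, grouped by severity.

    open_defects: iterable of (severity, date_opened) tuples.
    """
    ages = defaultdict(list)
    for severity, opened in open_defects:
        ages[severity].append((today - opened).days)
    return {sev: sum(days) / len(days) for sev, days in ages.items()}
```

Comparing the result against the acceptable open days from the planning stage flags severity levels that are aging beyond plan.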
Defect Distribution
Shows defect distribution by function or module
and the number of tests completed.
Relative Defect Distribution
• Normalizes the defect level relative to the
previous reports generated.
• Normalizing by the number of functions or lines
of code gives a more accurate picture of defect
levels.
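Normalization matters because a large module naturally accumulates more defects than a small one. A minimal sketch (module names and sizes are invented) that converts raw counts into defects per KLOC:

```python
def defects_per_kloc(defect_counts: dict, kloc_by_module: dict) -> dict:
    """Normalize raw defect counts by module size (defects per 1000 lines of code)."""
    return {module: defect_counts[module] / kloc_by_module[module]
            for module in defect_counts}
```

In the test below, module A has four times the defects of module B, yet B has the higher relative defect density — the point of this report.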
Testing Action
The report shows
• Possible shortfalls in testing
• Number of severity-1 defects
• Priority of defects
• Recurring defects
• Tests behind schedule
… and other information that presents an accurate
testing picture.
METRICS
2 Types
• Product metrics
• Process metrics
Process Metrics
• Measure the characteristics of the methods,
techniques and tools used.
Product Metrics
• Measure the characteristics of the
documentation and code.
Test Metrics
User participation = user participation test time /
total test time
Paths tested = number of paths tested / total
number of paths
Acceptance criteria tested = acceptance criteria
verified / total acceptance criteria
Test cost = test cost / total system cost
Cost to locate defect = test cost / number of
defects located in testing
Detected production defects = number of defects
detected in production / application system size
Test automation = cost of manual test effort /
total test cost
CMM – Level 1 – Initial Level
The organization
• Does not provide a stable environment for
developing and maintaining software.
• In times of crisis, projects usually abandon
planned procedures and revert to coding and
testing.
CMM – Level 2 – Repeatable level
An effective management process is established,
which can be
• Practiced
• Documented
• Enforced
• Trained
• Measured
• Improved
CMM – Level 3 – Defined level
• A standard software engineering and management
process is defined for developing and maintaining
software.
• These processes are put together to make a
coherent whole.
CMM – Level 4 – Managed level
• Quantitative goals are set for both software
products and processes.
• The organizational measurement plan involves
determining the productivity and quality of all
important software process activities across all
projects.
CMM – Level 5 – Optimizing level
Emphasis is laid on
• Process improvement
• Tools to identify weaknesses in existing
processes
• Making timely corrections
Cost of Poor Quality
Total Quality Costs represent the difference
between the actual (current) cost of a product
or service and what the reduced cost would be
if there were no possibility of substandard
service, failure to meet specifications, failure
of products, or defects in their manufacture.
— Campanella, Principles of Quality Costs
Prevention of Poor Quality
COQ Process
1. Commitment
2. COQ Team
3. Gather data (COQ assessment)
4. Pareto analysis
5. Determine cost drivers
6. Process Improvement Teams
7. Monitor and measure
8. Go back to step 3
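Step 4, the Pareto analysis, identifies the "vital few" cost drivers that account for most of the quality cost. A sketch of that step (the cost categories and threshold are illustrative assumptions):

```python
def pareto(cost_by_category: dict, threshold: float = 0.8) -> list:
    """Return the categories that together account for at least
    `threshold` (default 80%) of total quality cost, largest first."""
    total = sum(cost_by_category.values())
    ranked = sorted(cost_by_category.items(), key=lambda kv: kv[1], reverse=True)
    vital, cumulative = [], 0.0
    for category, cost in ranked:
        vital.append(category)
        cumulative += cost
        if cumulative / total >= threshold:
            break
    return vital
```

The categories returned are the cost drivers (step 5) that process improvement teams (step 6) should tackle first.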
Generally missing:
“Wished I had understood that Cost of Quality
stuff better”
TESTING STANDARDS
External Standards
Familiarity with and adoption of industry test
standards from organizations.
Internal Standards
Development and enforcement of the test
standards that testers must meet.
IEEE STANDARDS
The Institute of Electrical and Electronics
Engineers has designed an entire set of standards
for software, to be followed by testers.
IEEE – Standard Glossary of Software Engineering
Terminology
IEEE – Standard for Software Quality Assurance Plan
IEEE – Standard for Software Configuration
Management Plan
IEEE – Standard for Software Test Documentation
IEEE – Recommended Practice for Software
Requirement Specification
IEEE – Standard for Software Unit Testing
IEEE – Standard for Software Verification and
Validation
IEEE – Standard for Software Reviews
IEEE – Recommended practice for Software
Design descriptions
IEEE – Standard Classification for Software
Anomalies
IEEE – Standard for Software Productivity
metrics
IEEE – Standard for Software Project
Management plans
IEEE – Standard for Software Management
IEEE – Standard for Software Quality Metrics
Methodology
Other standards…
ISO – International Organization for Standardization
Six Sigma – Zero Defect Orientation
SPICE – Software Process Improvement and
Capability Determination
NIST – National Institute of Standards and
Technology
www.softwaretestinggenius.com
A Storehouse of Vast
Knowledge on
Multiple-Answer Interview Questions / Quizzes as used by
Several MNCs to Evaluate New Testers
and
Hundreds of Interview Preparation Questions on
QuickTest Professional (QTP) , LoadRunner , Software
Testing & Quality Assurance
>>>>>>>>>>>>>> www.softwaretestinggenius.com <<<<<<<<<<<<<<
Thank You
>>>>>>>>>>>>>> www.softwaretestinggenius.com <<<<<<<<<<<<<<
More Related Content

Similar to Software testing for beginners

16103271 software-testing-ppt
16103271 software-testing-ppt16103271 software-testing-ppt
16103271 software-testing-pptatish90
 
Software Testing Fundamentals
Software Testing FundamentalsSoftware Testing Fundamentals
Software Testing FundamentalsChankey Pathak
 
Software quality assurance
Software quality assuranceSoftware quality assurance
Software quality assurancelokareminakshi
 
Skil storm testing at the speed of business 2
Skil storm testing at the speed of business 2Skil storm testing at the speed of business 2
Skil storm testing at the speed of business 2Glen Noesen
 
Software testing & Quality Assurance
Software testing & Quality Assurance Software testing & Quality Assurance
Software testing & Quality Assurance Webtech Learning
 
QA Best Practices at Atlogys - Tech Talk (Atlogys Academy)
QA Best Practices at Atlogys - Tech Talk (Atlogys Academy)QA Best Practices at Atlogys - Tech Talk (Atlogys Academy)
QA Best Practices at Atlogys - Tech Talk (Atlogys Academy)Atlogys Technical Consulting
 
An introduction to Software Testing and Test Management
An introduction to Software Testing and Test ManagementAn introduction to Software Testing and Test Management
An introduction to Software Testing and Test ManagementAnuraj S.L
 
Introduction to Software Testing
Introduction to Software TestingIntroduction to Software Testing
Introduction to Software TestingRajathi-QA
 
Software Quality and Testing_Se lect18 btech
Software Quality and Testing_Se lect18 btechSoftware Quality and Testing_Se lect18 btech
Software Quality and Testing_Se lect18 btechIIITA
 
Continuous Testing Landscape.pptx
Continuous Testing Landscape.pptxContinuous Testing Landscape.pptx
Continuous Testing Landscape.pptxMarc Hornbeek
 
SQA Lecture 01 (Introduction) - Testing and SQA
SQA Lecture 01 (Introduction) - Testing and SQASQA Lecture 01 (Introduction) - Testing and SQA
SQA Lecture 01 (Introduction) - Testing and SQAsunena224
 
Day 2 meet shilpa - measuring software quality-are you up-to-date on what an...
Day 2 meet shilpa  - measuring software quality-are you up-to-date on what an...Day 2 meet shilpa  - measuring software quality-are you up-to-date on what an...
Day 2 meet shilpa - measuring software quality-are you up-to-date on what an...XBOSoft
 
Quality Software
Quality SoftwareQuality Software
Quality SoftwareMarius Ghetie
 
Quality Management
Quality ManagementQuality Management
Quality ManagementBuchiri
 
Quality Mangt
Quality MangtQuality Mangt
Quality Mangtajithsrc
 
What is the Difference Between Software Testing and QA Testing.pptx
What is the Difference Between Software Testing and QA Testing.pptxWhat is the Difference Between Software Testing and QA Testing.pptx
What is the Difference Between Software Testing and QA Testing.pptxCalidad Infotech
 
Software Quality Assurance
Software Quality Assurance Software Quality Assurance
Software Quality Assurance ShashankBajpai24
 

Similar to Software testing for beginners (20)

16103271 software-testing-ppt
16103271 software-testing-ppt16103271 software-testing-ppt
16103271 software-testing-ppt
 
Software Testing Fundamentals
Software Testing FundamentalsSoftware Testing Fundamentals
Software Testing Fundamentals
 
Software quality assurance
Software quality assuranceSoftware quality assurance
Software quality assurance
 
Software Testing
Software TestingSoftware Testing
Software Testing
 
Skil storm testing at the speed of business 2
Skil storm testing at the speed of business 2Skil storm testing at the speed of business 2
Skil storm testing at the speed of business 2
 
Software testing & Quality Assurance
Software testing & Quality Assurance Software testing & Quality Assurance
Software testing & Quality Assurance
 
Quality Assurance and Testing services
Quality Assurance and Testing servicesQuality Assurance and Testing services
Quality Assurance and Testing services
 
QA Best Practices at Atlogys - Tech Talk (Atlogys Academy)
QA Best Practices at Atlogys - Tech Talk (Atlogys Academy)QA Best Practices at Atlogys - Tech Talk (Atlogys Academy)
QA Best Practices at Atlogys - Tech Talk (Atlogys Academy)
 
An introduction to Software Testing and Test Management
An introduction to Software Testing and Test ManagementAn introduction to Software Testing and Test Management
An introduction to Software Testing and Test Management
 
Introduction to Software Testing
Introduction to Software TestingIntroduction to Software Testing
Introduction to Software Testing
 
Software Quality and Testing_Se lect18 btech
Software Quality and Testing_Se lect18 btechSoftware Quality and Testing_Se lect18 btech
Software Quality and Testing_Se lect18 btech
 
Continuous Testing Landscape.pptx
Continuous Testing Landscape.pptxContinuous Testing Landscape.pptx
Continuous Testing Landscape.pptx
 
SQA Lecture 01 (Introduction) - Testing and SQA
SQA Lecture 01 (Introduction) - Testing and SQASQA Lecture 01 (Introduction) - Testing and SQA
SQA Lecture 01 (Introduction) - Testing and SQA
 
Day 2 meet shilpa - measuring software quality-are you up-to-date on what an...
Day 2 meet shilpa  - measuring software quality-are you up-to-date on what an...Day 2 meet shilpa  - measuring software quality-are you up-to-date on what an...
Day 2 meet shilpa - measuring software quality-are you up-to-date on what an...
 
Quality Software
Quality SoftwareQuality Software
Quality Software
 
Quality Management
Quality ManagementQuality Management
Quality Management
 
Ch27
Ch27Ch27
Ch27
 
Quality Mangt
Quality MangtQuality Mangt
Quality Mangt
 
What is the Difference Between Software Testing and QA Testing.pptx
What is the Difference Between Software Testing and QA Testing.pptxWhat is the Difference Between Software Testing and QA Testing.pptx
What is the Difference Between Software Testing and QA Testing.pptx
 
Software Quality Assurance
Software Quality Assurance Software Quality Assurance
Software Quality Assurance
 

Recently uploaded

Folding Cheat Sheet #4 - fourth in a series
Folding Cheat Sheet #4 - fourth in a seriesFolding Cheat Sheet #4 - fourth in a series
Folding Cheat Sheet #4 - fourth in a seriesPhilip Schwarz
 
Professional Resume Template for Software Developers
Professional Resume Template for Software DevelopersProfessional Resume Template for Software Developers
Professional Resume Template for Software DevelopersVinodh Ram
 
Unveiling the Future: Sylius 2.0 New Features
Unveiling the Future: Sylius 2.0 New FeaturesUnveiling the Future: Sylius 2.0 New Features
Unveiling the Future: Sylius 2.0 New FeaturesŁukasz Chruściel
 
Automate your Kamailio Test Calls - Kamailio World 2024
Automate your Kamailio Test Calls - Kamailio World 2024Automate your Kamailio Test Calls - Kamailio World 2024
Automate your Kamailio Test Calls - Kamailio World 2024Andreas Granig
 
Dealing with Cultural Dispersion — Stefano Lambiase — ICSE-SEIS 2024
Dealing with Cultural Dispersion — Stefano Lambiase — ICSE-SEIS 2024Dealing with Cultural Dispersion — Stefano Lambiase — ICSE-SEIS 2024
Dealing with Cultural Dispersion — Stefano Lambiase — ICSE-SEIS 2024StefanoLambiase
 
MYjobs Presentation Django-based project
MYjobs Presentation Django-based projectMYjobs Presentation Django-based project
MYjobs Presentation Django-based projectAnoyGreter
 
Software Project Health Check: Best Practices and Techniques for Your Product...
Software Project Health Check: Best Practices and Techniques for Your Product...Software Project Health Check: Best Practices and Techniques for Your Product...
Software Project Health Check: Best Practices and Techniques for Your Product...Velvetech LLC
 
React Server Component in Next.js by Hanief Utama
React Server Component in Next.js by Hanief UtamaReact Server Component in Next.js by Hanief Utama
React Server Component in Next.js by Hanief UtamaHanief Utama
 
BATTLEFIELD ORM: TIPS, TACTICS AND STRATEGIES FOR CONQUERING YOUR DATABASE
BATTLEFIELD ORM: TIPS, TACTICS AND STRATEGIES FOR CONQUERING YOUR DATABASEBATTLEFIELD ORM: TIPS, TACTICS AND STRATEGIES FOR CONQUERING YOUR DATABASE
BATTLEFIELD ORM: TIPS, TACTICS AND STRATEGIES FOR CONQUERING YOUR DATABASEOrtus Solutions, Corp
 
chapter--4-software-project-planning.ppt
chapter--4-software-project-planning.pptchapter--4-software-project-planning.ppt
chapter--4-software-project-planning.pptkotipi9215
 
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideBuilding Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideChristina Lin
 
What are the key points to focus on before starting to learn ETL Development....
What are the key points to focus on before starting to learn ETL Development....What are the key points to focus on before starting to learn ETL Development....
What are the key points to focus on before starting to learn ETL Development....kzayra69
 
Advancing Engineering with AI through the Next Generation of Strategic Projec...
Advancing Engineering with AI through the Next Generation of Strategic Projec...Advancing Engineering with AI through the Next Generation of Strategic Projec...
Advancing Engineering with AI through the Next Generation of Strategic Projec...OnePlan Solutions
 
Cloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStackCloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStackVICTOR MAESTRE RAMIREZ
 
Unveiling Design Patterns: A Visual Guide with UML Diagrams
Unveiling Design Patterns: A Visual Guide with UML DiagramsUnveiling Design Patterns: A Visual Guide with UML Diagrams
Unveiling Design Patterns: A Visual Guide with UML DiagramsAhmed Mohamed
 
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...stazi3110
 
办į†å­ĻäŊč¯(UQæ–‡å‡­č¯äšĻ)昆åŖĢ兰大å­Ļæ¯•ä¸šč¯æˆįģŠå•åŽŸį‰ˆä¸€æ¨Ąä¸€æ ˇ
办į†å­ĻäŊč¯(UQæ–‡å‡­č¯äšĻ)昆åŖĢ兰大å­Ļæ¯•ä¸šč¯æˆįģŠå•åŽŸį‰ˆä¸€æ¨Ąä¸€æ ˇåŠžį†å­ĻäŊč¯(UQæ–‡å‡­č¯äšĻ)昆åŖĢ兰大å­Ļæ¯•ä¸šč¯æˆįģŠå•åŽŸį‰ˆä¸€æ¨Ąä¸€æ ˇ
办į†å­ĻäŊč¯(UQæ–‡å‡­č¯äšĻ)昆åŖĢ兰大å­Ļæ¯•ä¸šč¯æˆįģŠå•åŽŸį‰ˆä¸€æ¨Ąä¸€æ ˇumasea
 
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑đŸŊ‍❤ī¸â€đŸ§‘đŸģ 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑đŸŊ‍❤ī¸â€đŸ§‘đŸģ 89...(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑đŸŊ‍❤ī¸â€đŸ§‘đŸģ 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑đŸŊ‍❤ī¸â€đŸ§‘đŸģ 89...gurkirankumar98700
 
英å›ŊUNå­ĻäŊč¯,北厉晎éĄŋ大å­Ļæ¯•ä¸šč¯äšĻ1:1åˆļäŊœ
英å›ŊUNå­ĻäŊč¯,北厉晎éĄŋ大å­Ļæ¯•ä¸šč¯äšĻ1:1åˆļäŊœč‹ąå›ŊUNå­ĻäŊč¯,北厉晎éĄŋ大å­Ļæ¯•ä¸šč¯äšĻ1:1åˆļäŊœ
英å›ŊUNå­ĻäŊč¯,北厉晎éĄŋ大å­Ļæ¯•ä¸šč¯äšĻ1:1åˆļäŊœqr0udbr0
 

Recently uploaded (20)

Folding Cheat Sheet #4 - fourth in a series
Folding Cheat Sheet #4 - fourth in a seriesFolding Cheat Sheet #4 - fourth in a series
Folding Cheat Sheet #4 - fourth in a series
 
Professional Resume Template for Software Developers
Professional Resume Template for Software DevelopersProfessional Resume Template for Software Developers
Professional Resume Template for Software Developers
 
Unveiling the Future: Sylius 2.0 New Features
Unveiling the Future: Sylius 2.0 New FeaturesUnveiling the Future: Sylius 2.0 New Features
Unveiling the Future: Sylius 2.0 New Features
 
Automate your Kamailio Test Calls - Kamailio World 2024
Automate your Kamailio Test Calls - Kamailio World 2024Automate your Kamailio Test Calls - Kamailio World 2024
Automate your Kamailio Test Calls - Kamailio World 2024
 
Dealing with Cultural Dispersion — Stefano Lambiase — ICSE-SEIS 2024
Dealing with Cultural Dispersion — Stefano Lambiase — ICSE-SEIS 2024Dealing with Cultural Dispersion — Stefano Lambiase — ICSE-SEIS 2024
Dealing with Cultural Dispersion — Stefano Lambiase — ICSE-SEIS 2024
 
MYjobs Presentation Django-based project
MYjobs Presentation Django-based projectMYjobs Presentation Django-based project
MYjobs Presentation Django-based project
 
Software Project Health Check: Best Practices and Techniques for Your Product...
Software Project Health Check: Best Practices and Techniques for Your Product...Software Project Health Check: Best Practices and Techniques for Your Product...
Software Project Health Check: Best Practices and Techniques for Your Product...
 
React Server Component in Next.js by Hanief Utama
React Server Component in Next.js by Hanief UtamaReact Server Component in Next.js by Hanief Utama
React Server Component in Next.js by Hanief Utama
 
BATTLEFIELD ORM: TIPS, TACTICS AND STRATEGIES FOR CONQUERING YOUR DATABASE
BATTLEFIELD ORM: TIPS, TACTICS AND STRATEGIES FOR CONQUERING YOUR DATABASEBATTLEFIELD ORM: TIPS, TACTICS AND STRATEGIES FOR CONQUERING YOUR DATABASE
BATTLEFIELD ORM: TIPS, TACTICS AND STRATEGIES FOR CONQUERING YOUR DATABASE
 
chapter--4-software-project-planning.ppt
chapter--4-software-project-planning.pptchapter--4-software-project-planning.ppt
chapter--4-software-project-planning.ppt
 
Hot Sexy call girls in Patel Nagar🔝 9953056974 🔝 escort Service
Hot Sexy call girls in Patel Nagar🔝 9953056974 🔝 escort ServiceHot Sexy call girls in Patel Nagar🔝 9953056974 🔝 escort Service
Hot Sexy call girls in Patel Nagar🔝 9953056974 🔝 escort Service
 
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideBuilding Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
 
What are the key points to focus on before starting to learn ETL Development....
What are the key points to focus on before starting to learn ETL Development....What are the key points to focus on before starting to learn ETL Development....
What are the key points to focus on before starting to learn ETL Development....
 
Advancing Engineering with AI through the Next Generation of Strategic Projec...
Advancing Engineering with AI through the Next Generation of Strategic Projec...Advancing Engineering with AI through the Next Generation of Strategic Projec...
Advancing Engineering with AI through the Next Generation of Strategic Projec...
 
Cloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStackCloud Management Software Platforms: OpenStack
Cloud Management Software Platforms: OpenStack
 
Unveiling Design Patterns: A Visual Guide with UML Diagrams
Unveiling Design Patterns: A Visual Guide with UML DiagramsUnveiling Design Patterns: A Visual Guide with UML Diagrams
Unveiling Design Patterns: A Visual Guide with UML Diagrams
 
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
Building a General PDE Solving Framework with Symbolic-Numeric Scientific Mac...
 
办į†å­ĻäŊč¯(UQæ–‡å‡­č¯äšĻ)昆åŖĢ兰大å­Ļæ¯•ä¸šč¯æˆįģŠå•åŽŸį‰ˆä¸€æ¨Ąä¸€æ ˇ
办į†å­ĻäŊč¯(UQæ–‡å‡­č¯äšĻ)昆åŖĢ兰大å­Ļæ¯•ä¸šč¯æˆįģŠå•åŽŸį‰ˆä¸€æ¨Ąä¸€æ ˇåŠžį†å­ĻäŊč¯(UQæ–‡å‡­č¯äšĻ)昆åŖĢ兰大å­Ļæ¯•ä¸šč¯æˆįģŠå•åŽŸį‰ˆä¸€æ¨Ąä¸€æ ˇ
办į†å­ĻäŊč¯(UQæ–‡å‡­č¯äšĻ)昆åŖĢ兰大å­Ļæ¯•ä¸šč¯æˆįģŠå•åŽŸį‰ˆä¸€æ¨Ąä¸€æ ˇ
 
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑đŸŊ‍❤ī¸â€đŸ§‘đŸģ 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑đŸŊ‍❤ī¸â€đŸ§‘đŸģ 89...(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑đŸŊ‍❤ī¸â€đŸ§‘đŸģ 89...
(Genuine) Escort Service Lucknow | Starting ₹,5K To @25k with A/C 🧑đŸŊ‍❤ī¸â€đŸ§‘đŸģ 89...
 
英å›ŊUNå­ĻäŊč¯,北厉晎éĄŋ大å­Ļæ¯•ä¸šč¯äšĻ1:1åˆļäŊœ
英å›ŊUNå­ĻäŊč¯,北厉晎éĄŋ大å­Ļæ¯•ä¸šč¯äšĻ1:1åˆļäŊœč‹ąå›ŊUNå­ĻäŊč¯,北厉晎éĄŋ大å­Ļæ¯•ä¸šč¯äšĻ1:1åˆļäŊœ
英å›ŊUNå­ĻäŊč¯,北厉晎éĄŋ大å­Ļæ¯•ä¸šč¯äšĻ1:1åˆļäŊœ
 

Software testing for beginners

  • 2. Introduction & Fundamentals What is Quality? What is Software Testing? Why testing is necessary? Who does the testing? What has to be tested? When is testing done? How often to test? What is cost of Quality? What are Testing Standards?
  • 3. What is Quality? īƒ˜ Quality is “fitness for use” - (Joseph Juran) īƒ˜ Quality is “conformance to requirements” - (Philip B. Crosby) īƒ˜ Quality of a product or service is its ability to satisfy the needs and expectations of the customer
  • 5. Deming’s Learning Cycle of Quality “Inspection with the aim of finding the bad ones and throwing them out is too late, ineffective and costly. Quality comes not from inspection but improvement of the process.” Dr. W. Edwards Deming Founder of the Quality Evolution
  • 7. Most Common Software problems īƒ˜ Incorrect calculation īƒ˜ Incorrect data edits & ineffective data edits īƒ˜ Incorrect matching and merging of data īƒ˜ Data searches that yields incorrect results īƒ˜ Incorrect processing of data relationship īƒ˜ Incorrect coding / implementation of business rules īƒ˜ Inadequate software performance
  • 8. īƒ˜ Confusing or misleading data īƒ˜ Software usability by end users & īƒ˜ Obsolete Software īƒ˜ Inconsistent processing īƒ˜ Unreliable results or performance īƒ˜ Inadequate support of business needs īƒ˜ Incorrect or inadequate interfaces īƒ˜ with other systems īƒ˜ Inadequate performance and security controls īƒ˜ Incorrect file handling
  • 9. Objectives of testing īƒ˜ Executing a program with the intent of finding an error. īƒ˜ To check if the system meets the requirements and be executed successfully in the Intended environment. īƒ˜ To check if the system is “ Fit for purpose”. īƒ˜ To check if the system does what it is expected to do.
  • 10. Objectives of testing īƒ˜ A good test case is one that has a probability of finding an as yet undiscovered error. īƒ˜ A successful test is one that uncovers a yet undiscovered error. īƒ˜ A good test is not redundant. īƒ˜ A good test should be “best of breed”. īƒ˜ A good test should neither be too simple nor too complex.
  • 11. Objective of a Software Tester īƒ˜ Find bugs as early as possible and make sure they get fixed. īƒ˜ To understand the application well. īƒ˜ Study the functionality in detail to find where the bugs are likely to occur. īƒ˜ Study the code to ensure that each and every line of code is tested. īƒ˜ Create test cases in such a way that testing is done to uncover the hidden bugs and also ensure that the software is usable and reliable
  • 12. VERIFICATION & VALIDATION Verification - typically involves reviews and meeting to evaluate documents, plans, code, requirements, and specifications. This can be done with checklists, issues lists, walkthroughs, and inspection meeting. Validation - typically involves actual testing and takes place after verifications are completed. Validation and Verification process continue in a cycle till the software becomes defects free.
  • 15. īƒ˜ PLAN (P): Device a plan. Define your objective and determine the strategy and supporting methods required to achieve that objective. īƒ˜ DO (D): Execute the plan. Create the conditions and perform the necessary training to execute the plan. īƒ˜ CHECK (C): Check the results. Check to determine whether work is progressing according to the plan and whether the results are obtained. īƒ˜ ACTION (A): Take the necessary and appropriate action if checkup reveals that the work is not being performed according to plan or not as anticipated.
  • 16. QUALITY PRINCIPLES Quality - the most important factor affecting an organization’s long-term performance. Quality - the way to achieve improved productivity and competitiveness in any organization. Quality - saves. It does not cost. Quality - is the solution to the problem, not a problem.
  • 17. Cost of Quality Prevention Cost Amount spent before the product is actually built. Cost incurred on establishing methods and procedures, training workers, acquiring tools and planning for quality. Appraisal cost Amount spent after the product is built but before it is shipped to the user. Cost of inspection, testing, and reviews.
  • 18. Failure Cost Amount spent to repair failures. Cost associated with defective products that have been delivered to the user or moved into production, costs involve repairing products to make them fit as per requirement.
  • 19. Quality Assurance Quality Control A planned and systematic set of activities necessary to provide adequate confidence that requirements are properly established and products or services conform to specified requirements. The process by which product quality is compared with applicable standards; and the action taken when non-conformance is detected. An activity that establishes and evaluates the processes to produce the products. An activity which verifies if the product meets pre- defined standards.
  • 20. Quality Assurance Quality Control Helps establish processes. Implements the process. Sets up measurements programs to evaluate processes. Verifies if specific attributes are in a specific product or Service Identifies weaknesses in processes and improves them. Identifies defects for the primary purpose of correcting defects.
  • 21. QA is the responsibility of the entire team. QC is the responsibility of the tester. Prevents the introduction of issues or defects Detects, reports and corrects defects QA evaluates whether or not quality control is working for the primary purpose of determining whether or not there is a weakness in the process. QC evaluates if the application is working for the primary purpose of determining if there is a flaw / defect in the functionalities. Responsibilities of QA and QC
  • 22. QA improves the process that is applied to multiple products that will ever be produced by a process. QC improves the development of a specific product or service. QA personnel should not perform quality control unless doing it to validate quality control is working. QC personnel may perform quality assurance tasks if and when required. Responsibilities of QA and QC
  • 23. SEI – CMM Software Engineering Institute (SEI) developed Capability Maturity Model (CMM) CMM describes the prime elements - planning, engineering, managing software development and maintenance CMM can be used for â€ĸ Software process improvement â€ĸ Software process assessment â€ĸ Software capability evaluations
  • 24. The CMM is organized into five maturity level Initial Level 1 Repeatable Level 2 Defined Level 3 Managed Level 4 Optimizing Level 5 Disciplined Process Standard Consistence Process Predictable Process Continuous Improvement Process
  • 25. Phases of SDLC â€ĸ Requirement Specification and Analysis â€ĸ Design â€ĸ Coding â€ĸ Testing â€ĸ Implementation â€ĸ Maintenance SOFTWARE DEVELOPMENT LIFE CYCLE (SDLC)
  • 26. Requirement Specification and Analysis User Requirement Specification (USR) Software Requirement Specification (SRS)
  • 27. The output of SRS is the input of design phase. Two types of design - High Level Design (HLD) Low Level Design (LLD) Design
  • 28. īƒ˜ List of modules and a brief description of each module. īƒ˜ Brief functionality of each module. īƒ˜ Interface relationship among modules. īƒ˜ Dependencies between modules (if A exists, B exists etc). īƒ˜ Database tables identified along with key elements. īƒ˜ Overall architecture diagrams along with technology details. High Level Design (HLD)
  • 29. īƒ˜ Detailed functional logic of the module, in pseudo code. īƒ˜ Database tables, with all elements, including their type and size. īƒ˜ All interface details. īƒ˜ All dependency issues īƒ˜ Error message listings īƒ˜ Complete input and outputs for a module. Low Level Design (LLD)
  • 30. Breaking down the product into independent modules to arrive at micro levels. 2 different approaches followed in designing – Top Down Approach Bottom Up Approach The Design process
  • 33. Coding Developers use the LLD document and write the code in the programming language specified. Testing The testing process involves development of a test plan, executing the plan and documenting the test results. Implementation Installation of the product in its operational environment.
  • 34. Maintenance After the software is released and the client starts using the software, maintenance phase is started. 3 things happen - Bug fixing, Upgrade, Enhancement Bug fixing – bugs arrived due to some untested scenarios. Upgrade – Upgrading the application to the newer versions of the software. Enhancement - Adding some new features into the existing software.
  • 35. SOFTWARE LIFE CYCLE MODELS WATERFALL MODEL V-PROCESS MODEL SPIRAL MODEL PROTOTYPE MODEL INCREMENTAL MODEL EVOLUTIONARY DEVELOPMENT MODEL
  • 36. Project Management īƒ˜ Project Staffing īƒ˜ Project Planning īƒ˜ Project Scheduling
  • 37. Project Staffing īƒ˜ Project budget may not allow to utilize highly – paid staff. īƒ˜ Staff with the appropriate experience may not be available.
  • 38. Project Planning Plan Description Quality plan Describes the quality procedures and standards used in a project. Validation plan Describes the approach, resources and schedule used for system validation. Configuration management plan Describes the configuration management procedures and structures to be used. Maintenance plan Predicts the maintenance requirements of the system/ maintenance costs and efforts required. Staff development plan Describes how the skills and experience of the project team members will be developed.
  • 39. Project Scheduling • Bar charts and Activity Networks • Scheduling problems
  • 40. RISK MANAGEMENT • Risk Identification • Risk Analysis • Risk Planning • Risk Monitoring
  • 41. • Staff turnover (Project) – experienced staff will leave the project before it is finished. • Management change (Project) – there will be a change of organizational management, with different priorities. • Hardware unavailability (Project) – hardware essential for the project will not be delivered on schedule. • Requirements change (Project & Product) – there will be a larger number of changes to the requirements than anticipated.
  • 42. • Specification delays (Project & Product) – specifications of essential interfaces are not available on schedule. • Size underestimate (Project & Product) – the size of the system has been underestimated. • CASE tool underperformance (Product) – CASE tools which support the project do not perform as anticipated. • Technology change (Business) – the underlying technology on which the system is built is superseded by new technology. • Product competition (Business) – a competitive product is marketed before the system is completed.
  • 43. Configuration Management [Diagram: an initial system branching into variants – PC, DEC, VMS, Unix, Sun, Mainframe and Workstation versions.]
  • 44. Configuration Management (CM) Standards • CM should be based on a set of standards, which are applied within an organization.
  • 45. CM Planning • Documents required for future system maintenance should be identified and included as managed documents. • The plan defines the types of documents to be managed and a document naming scheme.
  • 46. Change Management • Keeping track of changes and ensuring that they are implemented in the most cost-effective way.
  • 47. Change Request Form A part of the CM planning process. Records the change required: • change suggested by • reason why the change was suggested • urgency of the change. Records the change evaluation: • impact analysis • change cost • recommendations (system maintenance staff).
  • 48. VERSION AND RELEASE MANAGEMENT • Invent an identification scheme for system versions, and plan when a new system version is to be produced. • Ensure that version management procedures and tools are properly applied, and plan and distribute new system releases.
  • 49. Versions/Variants/Releases • Variant – an instance of a system which is functionally identical to, but non-functionally distinct from, other instances of the system. • Version – an instance of a system which is functionally distinct in some way from other system instances. • Release – an instance of a system which is distributed to users outside of the development team.
  • 50. SOFTWARE TESTING LIFECYCLE - PHASES • Requirements study • Test Case Design and Development • Test Execution • Test Closure • Test Process Analysis
  • 51. Requirements study • The testing cycle starts with the study of the client's requirements. • Understanding the requirements is essential for testing the product.
  • 52. Analysis & Planning • Test objective and coverage • Overall schedule • Standards and methodologies • Resources required, including necessary training • Roles and responsibilities of the team members • Tools used
  • 53. Test Case Design and Development • Component identification • Test specification design • Test specification review Test Execution • Code review • Test execution and evaluation • Performance and simulation
  • 54. Test Closure • Test summary report • Project de-brief • Project documentation Test Process Analysis • Analysis of the test reports, and improvement of the application's performance by implementing new technology and additional features.
  • 56. Testing Levels • Unit testing • Integration testing • System testing • Acceptance testing
  • 57. Unit testing • The most 'micro' scale of testing. • Tests done on particular functions or code modules. • Requires knowledge of the internal program design and code. • Done by programmers (not by testers).
  • 58. Unit testing Objectives • To test the function of a program or unit of code such as a program or module • To test internal logic • To verify internal design • To test path & condition coverage • To test exception conditions & error handling When • After modules are coded Input • Internal application design • Master test plan • Unit test plan Output • Unit test report
  • 59. Who • Developer Methods • White box testing techniques • Test coverage techniques Tools • Debug • Re-structure • Code analyzers • Path/statement coverage tools Education • Testing methodology • Effective use of tools
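As a sketch of the kind of test a developer writes at this level, here is a minimal Python `unittest` example. The `apply_discount` function is a hypothetical unit, not from the deck; note that one test case deliberately exercises an exception condition, one of the unit-testing objectives listed above.

```python
import unittest

def apply_discount(price, percent):
    """Hypothetical unit under test: apply a percentage discount."""
    if not 0 <= percent <= 100:
        raise ValueError("percent must be between 0 and 100")
    return round(price * (1 - percent / 100), 2)

class ApplyDiscountTest(unittest.TestCase):
    def test_normal_discount(self):
        self.assertEqual(apply_discount(100.0, 25), 75.0)

    def test_zero_discount(self):
        self.assertEqual(apply_discount(80.0, 0), 80.0)

    def test_invalid_percent_raises(self):
        # Exception conditions and error handling are unit-test objectives too.
        with self.assertRaises(ValueError):
            apply_discount(100.0, 150)

# Run the suite programmatically (in a project this would normally be
# driven by `python -m unittest`).
suite = unittest.defaultTestLoader.loadTestsFromTestCase(ApplyDiscountTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
```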
  • 60. Incremental integration testing • Continuous testing of an application as and when new functionality is added. • The application's functionality aspects are required to be independent enough to work separately before completion of development. • Done by programmers or testers.
  • 61. Integration Testing • Testing of combined parts of an application to determine their functional correctness. • 'Parts' can be: • code modules • individual applications • client/server applications on a network.
  • 62. Types of Integration Testing • Big Bang testing • Top Down Integration testing • Bottom Up Integration testing
  • 63. Integration testing Objectives • To technically verify proper interfacing between modules, and within sub-systems When • After modules are unit tested Input • Internal & external application design • Master test plan • Integration test plan Output • Integration test report
  • 64. Who • Developers Methods • White and black box techniques • Problem / configuration management Tools • Debug • Re-structure • Code analyzers Education • Testing methodology • Effective use of tools
  • 65. System Testing Objectives • To verify that the system components perform control functions • To perform inter-system tests • To demonstrate that the system performs both functionally and operationally as specified • To perform appropriate types of tests relating to transaction flow, installation, reliability, regression etc. When • After integration testing Input • Detailed requirements & external application design • Master test plan • System test plan Output • System test report
  • 66. Who • Development team and users Methods • Problem / configuration management Tools • Recommended set of tools Education • Testing methodology • Effective use of tools
  • 67. Systems Integration Testing Objectives • To test the co-existence of products and applications that are required to perform together in the production-like operational environment (hardware, software, network) • To ensure that the system functions together with all the components of its environment as a total system • To ensure that the system releases can be deployed in the current environment When • After system testing • Often performed outside of the project life-cycle Input • Test strategy • Master test plan • Systems integration test plan Output • Systems integration test report
  • 68. Who • System testers Methods • White and black box techniques • Problem / configuration management Tools • Recommended set of tools Education • Testing methodology • Effective use of tools
  • 69. Acceptance Testing Objectives • To verify that the system meets the user requirements When • After system testing Input • Business needs & detailed requirements • Master test plan • User acceptance test plan Output • User acceptance test report
  • 70. Who • Users / end users Methods • Black box techniques • Problem / configuration management Tools • Compare, keystroke capture & playback, regression testing Education • Testing methodology • Effective use of tools • Product knowledge • Business release strategy
  • 72. Testing methodologies • Black box testing • White box testing • Incremental testing • Thread testing
  • 73. Black box testing • No knowledge of internal design or code required. • Tests are based on requirements and functionality. White box testing • Knowledge of the internal program design and code required. • Tests are based on coverage of code statements, branches, paths and conditions.
  • 74. BLACK BOX - TESTING TECHNIQUE • Incorrect or missing functions • Interface errors • Errors in data structures or external database access • Performance errors • Initialization and termination errors
  • 75. Black box / Functional testing • Based on requirements and functionality • Not based on any knowledge of internal design or code • Covers all combined parts of a system • Tests are data driven
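Because black box tests are data driven, the test cases come from the stated requirement rather than from the code. A minimal sketch, assuming a made-up requirement ("password must be 8 to 16 characters") and an illustrative `password_ok` function:

```python
# Black-box, data-driven testing: boundary values are chosen purely
# from the requirement, with no knowledge of the implementation.

def password_ok(pw):
    # Illustrative implementation of the assumed requirement.
    return 8 <= len(pw) <= 16

# Boundary-value cases derived from the requirement text alone.
cases = [
    ("a" * 7, False),   # just below the lower boundary
    ("a" * 8, True),    # lower boundary
    ("a" * 16, True),   # upper boundary
    ("a" * 17, False),  # just above the upper boundary
]
for pw, expected in cases:
    assert password_ok(pw) == expected
```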
  • 76. White box testing / Structural testing • Based on knowledge of the internal logic of an application's code • Based on coverage of code statements, branches, paths, conditions • Tests are logic driven
  • 77. Functional testing • Black box type testing geared to the functional requirements of an application. • Done by testers. System testing • Black box type testing that is based on overall requirements specifications; covers all combined parts of the system. End-to-end testing • Similar to system testing; involves testing of a complete application environment in a situation that mimics real-world use.
  • 78. Sanity testing • Initial effort to determine if a new software version is performing well enough to accept it for a major testing effort. Regression testing • Re-testing after fixes or modifications of the software or its environment.
  • 79. Acceptance testing • Final testing based on specifications of the end-user or customer. Load testing • Testing an application under heavy loads. • E.g. testing of a web site under a range of loads to determine when the system response time degrades or fails.
  • 80. Stress testing • Testing under unusually heavy loads, heavy repetition of certain actions or inputs, input of large numerical values, large complex queries to a database etc. • Term often used interchangeably with 'load' and 'performance' testing. Performance testing • Testing how well an application complies with performance requirements.
  • 81. Install/uninstall testing • Testing of full, partial or upgrade install/uninstall processes. Recovery testing • Testing how well a system recovers from crashes, HW failures or other problems. Compatibility testing • Testing how well software performs in a particular HW/SW/OS/NW environment.
  • 82. Exploratory testing / ad-hoc testing • Informal SW testing that is not based on formal test plans or test cases; testers learn the SW in totality as they test it. Comparison testing • Comparing SW strengths and weaknesses to competing products.
  • 83. Alpha testing • Testing done when development is nearing completion; minor design changes may still be made as a result of such testing. Beta testing • Testing when development and testing are essentially completed and final bugs and problems need to be found before release.
  • 84. Mutation testing • To determine whether a set of test data or test cases is useful, by deliberately introducing various bugs. • Re-testing with the original test data/cases to determine if the bugs are detected.
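The idea can be sketched in a few lines of Python: plant a single seeded bug (a "mutant") and check whether the original test data catches it. The functions and test data below are illustrative.

```python
# Mutation-testing sketch: a seeded bug should be "killed" (detected)
# by the existing test data; if it survives, the test set is too weak.

def is_adult(age):
    return age >= 18

def is_adult_mutant(age):
    return age > 18        # seeded bug: >= mutated to >

# Original test data, including the boundary value 18.
test_data = [(17, False), (18, True), (30, True)]

def kills_mutant(fn):
    # The test set kills the mutant if any case fails against it.
    return any(fn(age) != expected for age, expected in test_data)

assert not kills_mutant(is_adult)      # original passes all cases
assert kills_mutant(is_adult_mutant)   # the boundary case detects the bug
```

Had the test data omitted the boundary case `(18, True)`, the mutant would have survived, signalling that the test set needs strengthening.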
  • 85. White Box - Testing
  • 86. White box - testing technique • All independent paths within a module have been exercised at least once • Exercise all logical decisions on their true and false sides • Execute all loops at their boundaries and within their operational bounds • Exercise internal data structures to ensure their validity
  • 87. Loop Testing This white box technique focuses on the validity of loop constructs. 4 different classes of loops can be defined: • simple loops • nested loops • concatenated loops • unstructured loops
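For a simple loop, the usual tactic is to exercise zero, one, typical and boundary iteration counts. A minimal sketch with an illustrative helper:

```python
# Loop-testing sketch for a simple loop: try 0, 1, typical, maximum
# and maximum+1 iterations (the helper function is illustrative).

def sum_first(values, n):
    """Sum the first n values (the loop under test)."""
    total = 0
    for i in range(min(n, len(values))):
        total += values[i]
    return total

data = [1, 2, 3, 4, 5]
assert sum_first(data, 0) == 0     # skip the loop entirely
assert sum_first(data, 1) == 1     # exactly one pass
assert sum_first(data, 2) == 3     # two passes
assert sum_first(data, 5) == 15    # maximum allowed passes
assert sum_first(data, 6) == 15    # one more than the maximum
```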
  • 88. Other White Box Techniques Statement Coverage – execute all statements at least once. Decision Coverage – execute each decision direction at least once. Condition Coverage – execute each condition with all possible outcomes at least once. Decision / Condition Coverage – execute each condition with all possible outcomes, and each decision direction, at least once. Multiple Condition Coverage – execute all possible combinations of condition outcomes in each decision, and invoke each point of entry, at least once. Examples …
  • 89. Statement Coverage – Examples
  Eg. A + B
  If (A = 3) Then
    B = X + Y
  End-If
  While (A > 0) Do
    Read (X)
    A = A - 1
  End-While-Do
  • 90. Decision Coverage – Example
  If A < 10 or A > 20 Then
    B = X + Y
  Condition Coverage – Example
  A = X
  If (A > 3) or (A < B) Then
    B = X + Y
  End-If-Then
  While (A > 0) and (Not EOF) Do
    Read (X)
    A = A - 1
  End-While-Do
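The first decision above can be restated in Python to show how test values are chosen for decision and condition coverage; the `classify` function and its return values are illustrative.

```python
# Coverage sketch for the decision: if a < 10 or a > 20.

def classify(a):
    if a < 10 or a > 20:
        return "outside"
    return "inside"

# Decision coverage: the decision as a whole takes both outcomes.
assert classify(5) == "outside"    # decision true (first condition true)
assert classify(15) == "inside"    # decision false

# Condition coverage: each atomic condition takes both outcomes.
# a=5  -> (a < 10) true,  (a > 20) not reached/false
# a=15 -> (a < 10) false, (a > 20) false
# a=25 -> (a < 10) false, (a > 20) true
assert classify(25) == "outside"
```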
  • 91. Incremental Testing • A disciplined method of testing the interfaces between unit-tested programs as well as between system components. • Involves adding unit-tested program modules or components one by one, and testing each resulting combination.
  • 92. Two types of Incremental Testing • Top-down – testing starts from the top of the module hierarchy and works down to the bottom. Modules are added in descending hierarchical order. • Bottom-up – testing starts from the bottom of the hierarchy and works up to the top. Modules are added in ascending hierarchical order.
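A minimal sketch of the top-down approach, assuming a hypothetical `order_total` top-level module that depends on a lower-level tax module. A stub stands in for the lower module until it is integrated:

```python
# Top-down incremental integration: test the top module first,
# replacing the not-yet-integrated lower module with a stub.

def tax_stub(amount):
    # Stub standing in for the real tax module (hypothetical interface).
    return 0.0

def order_total(items, tax_fn):
    """Top-level module under test; tax_fn is a lower-level dependency."""
    subtotal = sum(items)
    return subtotal + tax_fn(subtotal)

# Step 1: the top module is tested against the stub.
assert order_total([10.0, 5.0], tax_stub) == 15.0

# Step 2: the real lower module replaces the stub once it is integrated,
# and the combination is re-tested.
def real_tax(amount):
    return amount * 0.10   # assumed 10% tax rate

assert round(order_total([10.0, 5.0], real_tax), 2) == 16.5
```

Bottom-up integration works the other way around: the lower module would be tested first, exercised by a throwaway driver instead of a stub.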
  • 93. Testing Levels / Techniques • Unit Testing – White Box • Integration Testing – White Box, Incremental, Thread • System Testing – Black Box • Acceptance Testing – Black Box
  • 94. Major Testing Types • Stress / Load Testing • Performance Testing • Recovery Testing • Conversion Testing • Usability Testing • Configuration Testing
  • 95. Stress / Load Test • Evaluates a system or component at or beyond the limits of its specified requirements. • Determines the load under which it fails, and how.
  • 96. Performance Test • Evaluates the compliance of a system or component with specified performance requirements. • Often performed using an automated test tool to simulate a large number of users.
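In its simplest form, a performance check times an operation and compares the result against a stated requirement. A sketch, where both the operation and the 0.5 s budget are made-up examples:

```python
# Minimal performance check: measure elapsed time for one operation
# and assert it stays within the specified budget.

import time

def operation():
    # Stand-in for the real workload under test.
    return sum(range(100_000))

start = time.perf_counter()
operation()
elapsed = time.perf_counter() - start

budget_seconds = 0.5   # assumed performance requirement
assert elapsed < budget_seconds
```

Real load tools (LoadRunner and similar, as listed later in the deck) repeat this idea across many simulated concurrent users and report percentiles rather than a single timing.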
  • 97. Recovery Test Confirms that the system recovers from expected or unexpected events without loss of data or functionality. E.g. • Shortage of disk space • Unexpected loss of communication • Power-out conditions
  • 98. Conversion Test • Testing of code that is used to convert data from existing systems for use in the newly replaced systems.
  • 99. Usability Test • Tests how easily users can learn and use the product.
  • 100. Configuration Test • Examines an application's requirements for pre-existing software, initial states and configuration in order to maintain proper functionality.
  • 106. TEST PLAN Objectives • To create a set of testing tasks. • To assign resources to each testing task. • To estimate completion time for each testing task. • To document testing standards.
  • 107. • A document that describes the scope, approach, resources and schedule of intended test activities. • Identifies the test items, features to be tested, testing tasks, task allotment and risks requiring contingency planning.
  • 108. Purpose of preparing a Test Plan • Validate the acceptability of a software product. • Help people outside the test group understand the 'why' and 'how' of product validation. • A Test Plan should be: thorough enough (overall coverage of the tests to be conducted), and useful and understandable by people inside and outside the test group.
  • 109. Scope • The areas to be tested by the QA team. • Specify the areas which are out of scope (screens, database, mainframe processes etc.). Test Approach • Details on how the testing is to be performed. • Any specific strategy to be followed for testing (including configuration management).
  • 110. Entry Criteria Various steps to be performed before the start of a test, i.e. pre-requisites. E.g. • Timely environment set-up • Starting the web server / app server • Successful implementation of the latest build etc. Resources List of the people involved in the project, their designations etc.
  • 111. Tasks/Responsibilities Tasks to be performed and responsibilities assigned to the various team members. Exit Criteria Contains tasks like: • Bringing down the system / server • Restoring the system to the pre-test environment • Database refresh etc. Schedule / Milestones Deals with the final delivery date and the various milestone dates.
  • 112. Hardware / Software Requirements • Details of the PCs / servers required to install the application or perform the testing. • Specific software to get the application running or to connect to the database etc. Risks & Mitigation Plans • List of the possible risks during testing. • Mitigation plans to implement in case a risk actually turns into reality.
  • 113. Tools to be used • List of the testing tools or utilities, e.g. WinRunner, LoadRunner, TestDirector, Rational Robot, QTP. Deliverables • Various deliverables due to the client at various points of time, i.e. daily / weekly / start of the project / end of the project etc. • These include test plans, test procedures, test metrics, status reports, test scripts etc.
  • 114. References • Procedures • Templates (client specific or otherwise) • Standards / guidelines, e.g. Qview • Project related documents (RSD, ADD, FSD etc.)
  • 115. Annexure • Links to documents which have been / will be used in the course of testing, e.g. templates used for reports, test cases etc. • Referenced documents can also be attached here. Sign-off • Mutual agreement between the client and the QA team. • Both leads/managers sign their agreement on the Test Plan.
  • 116. Good Test Plans • Developed and reviewed early. • Clear, complete and specific. • Specify tangible deliverables that can be inspected. • Staff knows what to expect and when to expect it.
  • 117. Good Test Plans • Realistic quality levels for goals • Include time for planning • Can be monitored and updated • Include user responsibilities • Based on past experience • Recognize learning curves
  • 118. TEST CASES A test case is defined as: • A set of test inputs, execution conditions and expected results, developed for a particular objective. • Documentation specifying inputs, predicted results and a set of execution conditions for a test item.
  • 119. • Specific inputs that will be tried, and the procedures that will be followed, when the software is tested. • A sequence of one or more subtests executed as a sequence, because the outcome and/or final state of one subtest is the input and/or initial state of the next. • Specifies the pre-test state of the AUT and its environment, and the test inputs or conditions. • The expected result specifies what the AUT should produce from the test inputs.
  • 122. Test Case Contents • Test plan reference id • Test case • Test condition • Expected behavior
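The four fields listed above can be sketched as a simple record; the identifiers and wording below are illustrative, not from the deck.

```python
# One test case as a record holding the four fields from the slide.

test_case = {
    "test_plan_ref": "TP-2024-01",
    "test_case_id": "TC-042",
    "condition": "User submits the login form with an empty password",
    "expected_behavior": "Form is rejected with a 'password required' message",
}

# The record carries exactly the four listed fields.
assert set(test_case) == {
    "test_plan_ref", "test_case_id", "condition", "expected_behavior"
}
```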
  • 123. Good Test Cases Find Defects • Have a high probability of finding a new defect. • Give an unambiguous, tangible result that can be inspected. • Are repeatable and predictable.
  • 124. Good Test Cases • Traceable to requirements or design documents • Push systems to their limits • Execution and tracking can be automated • Do not mislead • Feasible
  • 125. Defect Life Cycle What is a Defect? A defect is a variance from a desired product attribute. The two categories of defects are: • Variance from product specifications • Variance from customer/user expectations
  • 126. Variance from product specification • The product built varies from the product specified. Variance from customer/user expectation • Something the user wanted is not in the built product, or something not specified has been included.
  • 127. Defect categories • Wrong – the specifications have been implemented incorrectly. • Missing – a specified requirement is not in the built product. • Extra – a requirement incorporated into the product that was not specified.
  • 128. Defect Log 1. Defect ID number 2. Descriptive defect name and type 3. Source of defect – test case or other source 4. Defect severity 5. Defect priority 6. Defect status (e.g. new, open, fixed, closed, reopen, reject)
  • 129. 7. Date and time tracking for either the most recent status change, or for each change in status. 8. Detailed description, including the steps necessary to reproduce the defect. 9. Component or program where the defect was found. 10. Screen prints, logs, etc. that will aid the developer in the resolution process. 11. Stage of origination. 12. Person assigned to research and/or correct the defect.
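One way to represent the defect-log fields listed above is a small record type; the field names follow the slide, while the example values and status vocabulary are typical assumptions rather than anything specified in the deck.

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class DefectRecord:
    defect_id: int
    name: str
    source: str                 # test case or other source
    severity: str               # e.g. "Critical", "Major", "Minor"
    priority: str               # e.g. "P1", "P2"
    status: str = "New"         # new, open, fixed, closed, reopen, reject
    description: str = ""       # steps needed to reproduce the defect
    component: str = ""         # component or program where found
    assigned_to: str = ""       # person researching/correcting the defect
    status_history: list = field(default_factory=list)

    def change_status(self, new_status):
        # Date/time tracking for each status change (item 7 above).
        self.status_history.append((new_status, datetime.now()))
        self.status = new_status

# Illustrative usage.
bug = DefectRecord(101, "Login button dead", "TC-17", "Critical", "P1")
bug.change_status("Open")
assert bug.status == "Open" and len(bug.status_history) == 1
```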
  • 130. Severity vs. Priority Severity • Factor that shows how bad the defect is and the impact it has on the product. Priority • Based on input from users regarding which defects are most important to them and should be fixed first.
  • 131. Severity Levels • Critical • Major / High • Average / Medium • Minor / Low • Cosmetic defects
  • 132. Severity Level – Critical • An installation process which does not load a component. • A missing menu option. • Security permission required to access a function under test. • Functionality does not permit further testing.
  • 133. • Runtime errors like JavaScript errors etc. • Functionality missed out / incorrect implementation (major deviation from requirements). • Performance issues (if specified by the client). • Browser incompatibility and operating system incompatibility issues, depending on the impact of the error. • Dead links.
  • 134. Severity Level – Major / High • Reboot of the system. • The wrong field being updated. • An update operation that fails to complete. • Performance issues (if not specified by the client). • Mandatory validations for mandatory fields.
  • 135. Severity Level – Average / Medium • Functionality incorrectly implemented (minor deviation from requirements). • Images, graphics missing which hinder functionality. • Front end / home page alignment issues. • Incorrect/missing hot key operation.
  • 136. Severity Level – Minor / Low • Misspelled or ungrammatical text • Inappropriate or incorrect formatting (such as text font, size, alignment, color, etc.) • Screen layout issues • Spelling mistakes / grammatical mistakes • Documentation errors
  • 137. • Page titles missing • Alt text for images • Background color for pages other than the home page • Default value missing for required fields • Cursor set focus and tab flow on the page • Images, graphics missing which do not hinder functionality
  • 138. Test Reports 8 INTERIM REPORTS • Functional Testing Status • Functions Working Timeline • Expected vs. Actual Defects Detected Timeline • Defects Detected vs. Corrected Gap Timeline • Average Age of Detected Defects by Type • Defect Distribution • Relative Defect Distribution • Testing Action
  • 139. Functional Testing Status Report Shows the percentage of the functions that are: • Fully tested • Tested with open defects • Not tested
  • 140. Functions Working Timeline • Shows the plan to have all functions working versus the current status of the functions working. • A line graph is an ideal format.
  • 141. Expected vs. Actual Defects Detected • Analysis of the number of defects being generated against the number of defects expected from the planning stage.
  • 142. Defects Detected vs. Corrected Gap A line graph format that shows the • Number of defects uncovered versus the number of defects being corrected and accepted by the testing group.
  • 143. Average Age of Detected Defects by Type • Average days of outstanding defects by severity type or level. • The planning stage provides the acceptable open days by defect type.
  • 144. Defect Distribution Shows defect distribution by function or module and the number of tests completed. Relative Defect Distribution • Normalizes the level of defects against the previous reports generated. • Normalizing over the number of functions or lines of code shows a more accurate level of defects.
  • 145. Testing Action Report shows • Possible shortfalls in testing • Number of severity-1 defects • Priority of defects • Recurring defects • Tests behind schedule …and other information that presents an accurate testing picture.
  • 147. Process Metrics • Measure the characteristics of the methods, techniques and tools.
  • 148. Product Metrics • Measure the characteristics of the documentation and code.
  • 149. Test Metrics User participation = user participation test time vs. total test time. Paths tested = number of paths tested vs. total number of paths. Acceptance criteria tested = acceptance criteria verified vs. total acceptance criteria.
  • 150. Test cost = test cost vs. total system cost. Cost to locate a defect = test cost / number of defects located in testing. Detected production defects = number of defects detected in production / application system size. Test automation = cost of manual test effort / total test cost.
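The ratio metrics above can be computed directly; the numbers plugged in below are made up purely for illustration.

```python
# Computing a few of the slide's test metrics with invented figures.

def ratio(numerator, denominator):
    return numerator / denominator

paths_tested = ratio(45, 60)               # paths tested vs. total paths
test_cost_share = ratio(20_000, 100_000)   # test cost vs. total system cost
cost_to_locate = 20_000 / 250              # test cost / defects located

assert round(paths_tested, 2) == 0.75      # 75% path coverage
assert test_cost_share == 0.2              # testing is 20% of system cost
assert cost_to_locate == 80.0              # 80 currency units per defect
```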
  • 151. CMM – Level 1 – Initial Level The organization • Does not have an environment for developing and maintaining software. • In times of crisis, projects usually stop using all planned procedures and revert to coding and testing.
  • 152. CMM – Level 2 – Repeatable Level An effective management process has been established, which can be • Practiced • Documented • Enforced • Trained • Measured • Improved
  • 153. CMM – Level 3 – Defined Level • A standard, defined software engineering and management process for developing and maintaining software. • These processes are put together to make a coherent whole.
  • 154. CMM – Level 4 – Managed Level • Quantitative goals are set for both software products and processes. • The organizational measurement plan involves determining the productivity and quality of all important software process activities across all projects.
  • 155. CMM – Level 5 – Optimizing Level Emphasis is laid on • Process improvement • Tools to identify weaknesses existing in processes • Making timely corrections
  • 156. Cost of Poor Quality Total Quality Costs represent the difference between the actual (current) cost of a product or service and what the reduced cost would be if there were no possibility of substandard service, failure to meet specifications, failure of products, or defects in their manufacture. Campanella, Principles of Quality Costs
  • 157. Prevention of Poor Quality
  • 159. COQ Process 1. Commitment 2. COQ Team 3. Gather data (COQ assessment) 4. Pareto analysis 5. Determine cost drivers 6. Process Improvement Teams 7. Monitor and measure 8. Go back to step 3 Generally Missing
  • 160. “Wished I had understood that Cost of Quality stuff better”
  • 161. TESTING STANDARDS External Standards • Familiarity with and adoption of industry test standards from external organizations. Internal Standards • Development and enforcement of the test standards that testers must meet.
  • 162. IEEE STANDARDS The Institute of Electrical and Electronics Engineers has designed an entire set of standards for software, to be followed by testers.
  • 163. IEEE – Standard Glossary of Software Engineering Terminology IEEE – Standard for Software Quality Assurance Plans IEEE – Standard for Software Configuration Management Plans IEEE – Standard for Software Test Documentation IEEE – Recommended Practice for Software Requirements Specifications
  • 164. IEEE – Standard for Software Unit Testing IEEE – Standard for Software Verification and Validation IEEE – Standard for Software Reviews IEEE – Recommended Practice for Software Design Descriptions IEEE – Standard Classification for Software Anomalies
  • 165. IEEE – Standard for Software Productivity Metrics IEEE – Standard for Software Project Management Plans IEEE – Standard for Software Maintenance IEEE – Standard for Software Quality Metrics Methodology
  • 166. Other standards… ISO – International Organization for Standardization Six Sigma – Zero-defect orientation SPICE – Software Process Improvement and Capability Determination NIST – National Institute of Standards and Technology