Software Testing Fundamentals
V Model
Left arm of the V (Verification): each development phase drives test planning for the corresponding test level
 URS → User Acceptance Test (UAT) planning
 SRS → System test planning
 HLD → Integration test planning
 LLD → Unit test planning
 Coding (base of the V)
Right arm of the V (Validation): test execution levels
 Unit testing
 Integration testing
 System testing
 User Acceptance Testing
Followed by delivery / production deployment, then maintenance and enhancement
Software Testing Definitions
The process of executing a program or part of a program
with the intent of finding errors (Myers)
Testing is the process of trying to discover every
conceivable fault or weakness in a work product (Myers)
The process of searching for errors (Kaner)
Testing is the process of evaluating or exercising a system
or system component by manual or automated means to
verify that the software meets specified requirements
(IEEE)
Role of a Tester
Assuring that the software meets the user’s needs
Ensuring the software can be used with negligible risk
This is achieved through
 Verification
 Validation
Verification
Verification
 It is the process of determining whether the products
of a given phase fulfill the specifications from the
previous phase
 Uses reviews, inspections, and demonstrations
throughout development to ensure the quality of the
product of that phase, including that it meets the
requirements from the previous phase
 “Are we building the product right?”
Validation
 The process of evaluating the software at the end of
development to ensure compliance with the specified
requirements
 Includes what is commonly thought of as testing and
comparing test results to expected results. Validation
occurs at the end of the development process.
 “Are we building the right product?”
Static & Dynamic Testing
Most of the Verification and Validation activities can be
classified as Static or Dynamic
Static testing (without executing any program)
 Requirement reviews
 Design reviews
 Code reviews
Dynamic testing
 Testing the software by executing the program
Characteristics of Static Testing
Static
 Do not observe system behavior
 Not looking for system failures
 Faults are directly detected
 Focus is on evaluating adherence to
 Standards,
 Guidelines and
 Processes
Characteristics of Dynamic Testing
contd.
Dynamic Testing
 The program is executed
 System behavior is observed
 Determine the existence of failures
 Reveals the presence of faults
White Box Testing (Code based testing)
A software testing technique that uses explicit knowledge
of the internal workings of the item being tested;
white box testing applies specific knowledge of the
programming code to examine outputs
Also known as glass box, structural, clear box and open
box testing
Advantages of white box testing
Helps to identify the following:
 Adherence to coding standards
 Adherence to coding guidelines
 Indentation
 Memory Leaks
 Logical complexity of the program
 Limitations of the program
Black Box Testing (Requirement based
testing)
A software testing technique whereby the expected
outcome of the software is verified by providing inputs,
without considering how the software program arrives at
those outputs.
The internal workings of the item being tested are not
known by the tester in black box testing.
The tester does not ever examine the programming code
and does not need any further knowledge of the program
other than its specifications.
Advantages of Black Box testing
The test is unbiased because the designer and the tester are
independent of each other.
The tester does not need knowledge of any specific
programming language(s).
The test is done from the point of view of the end user, not
the designer or programmer.
Test cases can be designed as soon as the specifications are
complete.
Conclusions
White box testing does not guarantee 100% conformance
to requirements
Black box testing does not concentrate on logic of the
program, but ensures conformance to requirements
Hence, both white box and black box testing are required to
ensure product quality
All types of testing, whether static or dynamic, white box
or black box are part of verification and validation
activities.
Let us see verification and validation activities.
Verification & Validation activities
Verification
 Requirement reviews
 Design reviews
 Code reviews
Validation
 Unit testing
 Module testing
 Integration testing
 System testing
 Regression testing
 User acceptance testing
 Field testing
Software Testing Life Cycle
[STLC]
STLC Activities
Test Requirements document
Test Planning
Test Design
Test Execution
Defect Tracking
Test Requirements Document
From the software requirement specification (SRS)
document, a list of testable requirements is extracted
and referred to as the Test Requirements document.
All non-technical and un-testable requirements are
excluded from this document.
The Test Requirements document is the basis for all further
testing activities
Test Planning
Mainly, Test Plan addresses
 Scope and objectives of testing
 Schedule, Resources and Reporting
 Types of testing and methodology
 Phases of testing applicable and scope of testing in each
phase
 Software and hardware requirements
 Identified risks and strategy for mitigating those risks
 Information regarding tools used through entire testing
life cycle
Test Design
Test Design is applicable to both white box and black box
testing
Test design activity involves designing test cases for a
given requirement (Black box testing) or for a given
program (white box testing).
Test case is defined as
 “a set of test inputs, execution conditions, and expected
results developed for a particular objective, such as to
exercise a particular program path or to verify
compliance with a specific requirement” [IEEE]
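A hypothetical illustration of a test case in this form (the user name and requirement ID are made up):
 Objective: verify that login is rejected for an invalid password (requirement R-12)
 Test inputs: user name “jsmith”, password “wrong”
 Execution conditions: user “jsmith” exists and the account is active
 Expected result: access is denied and an “invalid credentials” message is displayed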
Test Execution
Test execution involves
 Executing developed test cases on a piece of program
developed (Code based test cases) or on the entire
software application (Requirements based test cases)
 The status of each test case is updated during execution
 Possible states include
 Pass, Fail, Unable to test, deferred
 Test execution statistics are collected and analyzed for
test progress monitoring
Defect Tracking
When the actual result obtained from the software application
during testing deviates from the expected result written in the
test case, it is termed a “defect”.
The test case is failed and a defect is posted against the software.
The defect is fixed by the development team and the fix is
provided in a subsequent release.
The fix provided for the defect is validated and, if found to
be working, the test case passes and the defect is closed.
Posting, tracking, and closing of defects are done
in a defect tracking tool.
SDLC Vs STLC
Requirements Phase → Test Requirements document, Test Planning
Design Phase → Test Case Design
Coding Phase → Unit Test Execution
Deployment Phase → System Test Execution
Defect Tracking runs alongside test execution
Requirement Reviews
Requirement reviews
Requirement quality affects work performed in subsequent
phases of the system life cycle. Requirements of poor
quality
 Increase cost and schedule: effort is spent during design
and implementation trying to figure out what the
requirements are
 Decrease product quality: poor requirements cause the
wrong product to be delivered or de-scoping to meet
schedule or cost constraints
Requirement reviews contd.
Increase maintenance effort: lack of traceability increases
the effort to identify where changes are required, especially
as knowledgeable personnel leave
Create disputes with the customer/client: ambiguity causes
differences in expectations and contractual issues
Are a major cause of project failure: all of the above
Requirement Quality factors
Cohesive
Complete
Consistent
Feasible
Independent
Necessary
Unambiguous
Mandatory
Usable
Terse
Testable
Traceable
Non redundant
External observability
Metadata
Verifiable and validatable
Requirement characteristic: Cohesive
Does each requirement specify only one thing?
Do all parts of the requirement belong together?
Do all parts of a data requirement involve the same data
abstraction?
Do all parts of a functional requirement involve the same
functional abstraction?
Do all parts of an interface requirement involve the same
interface?
Do all parts of a quality requirement involve the same
quality factor or sub-factor?
Requirement characteristic: Complete
Is each requirement self contained with no missing
information?
Does each requirement contain all relevant information?
For example, does the requirement include all relevant
preconditions such as the relevant state of the application
or component?
Does each requirement need no further amplification or
clarification?
Does each requirement provide sufficient information to
avoid ambiguity?
Requirement characteristic: Complete
If the requirement is not a part of the current release, then
is it specified as completely and as thoroughly as is
currently known?
Is each identified “requirement” actually a single
requirement and not actually multiple requirements?
Is the use of conjunctions (“and” and “or”) restricted to
preconditions and invariants?
Requirement characteristic: Consistent
Is each requirement externally consistent with its
documented sources such as higher-level goals and
requirements?
Is each requirement externally consistent with all other
related requirements of the same type in the same
requirements specification? For example, two requirements
should neither be contradictory nor describe the same
concepts using different words.
Are the constituent parts of each requirement internally
consistent? For example, are all parts of a compound
precondition or post-condition consistent?
Requirement characteristic: Feasible
Can each requirement be implemented given the existing
hardware or software technology?
Can each requirement be implemented given the
endeavor’s budget?
Can each requirement be implemented given the
endeavor’s schedule?
Can each requirement be implemented given the
endeavor’s constraints on staffing (e.g., staff size,
expertise, and experience)?
Can each requirement be implemented given the
limitations of physics, chemistry, etc?
Requirement characteristic: Independent
The requirement does not rely on another requirement to
be fully understood.
Requirements that need proxies are not independent.
Parent requirements rely on their children to be fully
defined.
In testing, a parent is not satisfied until all its children are
met.
Why retain them? These may be source requirements that
must be retained.
Requirement characteristic: Independent
Also, using them to structure the proxies or children
improves understandability.
Example: "user friendly" can be used to assign, talk about,
or locate the group of proxies defining "user friendly" for
that particular project.
Requirement characteristic: Mandatory
Is each requirement essential to the success of the
application or component?
Is each requirement truly mandatory (i.e., a true
requirement that must be met and implemented)?
Is each requirement truly required by some stakeholder,
typically the customer or user organization?
Is each requirement free from unnecessary constraints
(e.g., architecture, design, implementation, testing, and
other technology decisions)?
Requirement characteristic: Mandatory
Does each requirement specify a “what” rather than a
“how”?
Is each requirement clearly differentiated from:
A “nice to have” item on someone’s wish list (i.e., gold-
plating)?
Constraints?
Requirement characteristic: Metadata
Individual requirements should have metadata (i.e.,
attributes or annotations) that characterizes them.
This metadata can include (but is not limited to)
 Acceptance criteria, Allocation, Assumptions,
Identification, Prioritization, Rationale, Schedule,
Status, and Tracing information
Requirement characteristic: Verifiability
Can each requirement be verified against its source?
Can each requirement be verified against its associated
standards (e.g., content and format), guidelines, and/or
templates?
Requirement characteristic:
Validatability
Is it possible to ensure that each requirement is actually
what the customer representatives really want and need?
Is it possible to ensure that each requirement is actually
what the user representatives really want and need?
Is it possible to ensure that each requirement is actually
what the marketing representatives really want and need?
Requirement characteristic:
External Observability
Does each requirement only specify behavior and/or
characteristics that are externally observable when treating
the application or component as a black box?
Does each requirement avoid specifying any internal
architecture, design, implementation, or testing decisions?
If a requirement does specify one or more internal
architecture, design, implementation, or testing decisions,
is the requirement clearly identified as a constraint rather
than as a pure requirement?
Requirement characteristic: Testable
It must be possible to prove that the object of the
requirement satisfies the requirement
Un-testable requirements can lead to disputes with the
client.
Example of an un-testable requirement
 “The system shall produce the ABC report in a
timely manner”
 “The system shall be written in the approved
language”
Requirement characteristic: Traceable
Examine the statement “The system shall calculate
retirement annuities and survivor benefits”
Observations:
 Two different requirements are clubbed together
 Distinctness cannot be maintained while reporting
 Can be decomposed as follows
 The system shall calculate
 A. Retirement annuities
 B. Survivor benefits
Requirement attributes
Unique identifier
Organizational information (for example, the
parents/children of the requirement, its category or type)
Method of validation
Item(s) that satisfy the requirement
Source of requirement (legal citation, business policy, etc.)
Association with the test plan/tests(s)
Requirement owners (subject matter expert, analyst)
Requirement status
Requirement attributes contd.
Requirement change history
WBS code
Risk
Priority
Cost (estimate and actual)
Degree of difficulty
Metrics
Justification for the requirement
Cross references to other requirements or documents
Comments
Case Study I:
Requirements review
Review the software requirement specification (SRS)
document for the marketing division of ABC Pharmaceuticals
and provide review comments in the enclosed template.
Categorize each review comment by appropriate severity
and category.
At the end, provide statistics of review comments in terms
of severity and category.
Design Review
Design reviews
Reviews for software design focus on data design,
architectural design and procedural design.
In general, there are two types of design reviews
 Preliminary design review
 Design walkthrough
Preliminary design review and design
walkthrough…
Preliminary design review
 Assesses the translation of requirements to the design
of data and architecture
Design walkthrough
 Concentrates on the procedural correctness of
algorithms as they are implemented within program
modules
Design review verifications…
Do designs satisfy all specified requirements for the
product?
Have all relevant standards and guidelines been applied or met?
Are product design and processing capabilities
compatible?
Are safety requirements met?
Design review verifications…
Do designs meet functional and operational requirements,
for example, performance and reliability requirements?
Is the design satisfactory for all the anticipated
environmental and load conditions?
Are components or service elements standardized and do
they provide reliability, availability and maintainability?
Design review verifications…
Are plans for implementing the design technically feasible (in
terms of purchasing, production, installation, inspection
and testing)?
Are the assumptions made during the design process valid?
Case Study II: Design review
Review the design specification document against the
requirements provided in the SRS for the marketing
division of ABC Pharmaceuticals and provide review
comments in the enclosed template.
Categorize each review comment by appropriate severity
and category.
At the end, provide statistics of review comments in terms
of severity and category
Code Reviews
Introduction :Code review
Code review is a phase in the computer program
development process.
It is an activity in which authors of code, peer reviewers,
and perhaps quality assurance reviewers get together to
review code.
The code is read line by line for
 real or potential flaws,
 consistency with the overall program design,
 comment quality, and
 adherence to coding standards
Advantages:Code review
Finding and correcting errors at this stage is relatively
inexpensive
Code reviews tend to reduce the more expensive process of
handling, locating, and fixing bugs during later stages of
development or after code delivery to users
Code review smoke test
The code review smoke test includes
 Does the code build correctly?
 Does the code execute as expected?
 Has the developer tested the code for positive
workflows?
 As a reviewer, do you understand the code?
Comments and coding conventions
Does the code respect project specific coding conventions?
Does the source file start with an appropriate header and copyright
information?
Are variable declarations properly commented?
Are units of numeric data properly commented?
Are units of numeric data clearly stated?
Are all functions, methods and classes documented?
Are complex algorithms, code optimizations adequately commented?
Does code that has been commented out have an explanation?
Are comments used to identify missing functionality or unresolved
issue in the code?
Error handling
Are assertions used everywhere data is expected to have a
valid value or range?
Are errors properly handled each time a function returns?
Are resources and memory released in all error paths?
Are all thrown exceptions handled properly?
Is the function caller notified when an error is detected?
Has error handling code been tested?
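As a hedged illustration of several of these points (checked returns, resources released on error paths, exceptions handled, caller notified), here is a minimal C++ sketch; readRatio and ratios.txt are made-up names:

#include <cstdio>
#include <stdexcept>

double readRatio(const char* path) {
    std::FILE* f = std::fopen(path, "r");
    if (!f)                                   // error checked each time a call returns
        throw std::runtime_error("cannot open file");
    double num = 0, den = 0;
    if (std::fscanf(f, "%lf %lf", &num, &den) != 2 || den == 0) {
        std::fclose(f);                       // resource released on the error path
        throw std::runtime_error("bad input");
    }
    std::fclose(f);
    return num / den;
}

int main() {
    try {
        std::printf("%f\n", readRatio("ratios.txt"));
    } catch (const std::exception& e) {       // thrown exception handled; caller notified
        std::fprintf(stderr, "error: %s\n", e.what());
    }
    return 0;
}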
Resource Leaks
Is allocated memory (non-garbage collected) freed?
Are all objects (Database connections, Sockets, Files, etc.)
freed even when an error occurs?
Is the same object released more than once?
Does the code accurately keep track of reference counting?
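A minimal sketch of how RAII addresses the leak checklist above: ownership objects release memory and close handles automatically on every exit path, including error paths. The process function and the data.txt file name are assumptions for illustration:

#include <fstream>
#include <memory>
#include <stdexcept>
#include <string>
#include <vector>

void process(const std::string& path) {
    std::ifstream in(path);                   // file closed automatically by its destructor
    if (!in)
        throw std::runtime_error("cannot open " + path);
    auto buffer = std::make_unique<std::vector<int>>();  // memory freed automatically
    int value;
    while (in >> value)
        buffer->push_back(value);
    // No explicit close/delete calls: destructors run on every exit
    // path, so nothing leaks and nothing is released twice.
}

int main() {
    try { process("data.txt"); } catch (const std::exception&) {}
    return 0;
}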
Thread safeness
Are all global variables thread-safe?
Are objects accessed by multiple threads thread-safe?
Are locks released in the same order they are obtained?
Is there any possible deadlock or lock contention?
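A minimal sketch of safe lock acquisition, assuming a hypothetical transfer operation guarded by two mutexes; std::scoped_lock (C++17) acquires both locks together, which avoids the classic two-lock deadlock that inconsistent lock ordering can cause:

#include <mutex>
#include <thread>

std::mutex m1, m2;
int shared_a = 0, shared_b = 0;

void transfer() {
    std::scoped_lock lock(m1, m2);  // deadlock-free acquisition of both locks
    ++shared_a;                     // shared data touched only under the locks
    --shared_b;
}

int main() {
    std::thread t1(transfer), t2(transfer);
    t1.join();
    t2.join();
    return 0;
}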
Control Structures
Are loop ending conditions accurate?
Is the code free of unintended infinite loops?
Performance
Do recursive functions run within a reasonable amount of
stack space?
Are whole objects duplicated when only references are
needed?
Does the code have an impact on size, speed, or memory
use?
Are you using blocking system calls when performance is
involved?
Is the code doing busy waits instead of using
synchronization mechanisms or timer events?
Functions
Are function parameters explicitly verified in the code?
Are arrays explicitly checked for out-of-bound indexes?
Are functions returning references to objects declared on
the stack?
Are variables initialized before they are used?
Does the code re-write functionality that could be achieved
by using an existing API?
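A minimal sketch of two of these checks, using a hypothetical elementAt helper: the parameter is verified explicitly, the array index is bounds-checked, and the result is returned by value rather than as a reference to a stack object:

#include <cstddef>
#include <stdexcept>
#include <vector>

int elementAt(const std::vector<int>& v, std::size_t i) {
    if (i >= v.size())                       // explicit out-of-bounds check
        throw std::out_of_range("index out of range");
    return v[i];                             // returned by value, not a reference to a local
}

int main() {
    std::vector<int> v{1, 2, 3};
    return elementAt(v, 2) == 3 ? 0 : 1;
}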
Bug fixes
Does a fix made to a function change the behavior of caller
functions?
Does the bug fix correct all the occurrences of the bug?
Case Study III
Review the code written in C++ for the marketing division
of ABC Pharmaceuticals and provide review comments in the
enclosed template.
Categorize each review comment by appropriate severity
and category.
At the end, provide statistics of review comments in terms
of severity and category. The categories can include
 Comments and coding conventions, Error handling,
Resource leaks, Control structures, Bug fixes,
Functions, Deviation from Req, Deviation from design.
White Box Testing
White Box Testing (Code based testing)
A software testing technique that uses explicit knowledge
of the internal workings of the item being tested;
white box testing applies specific knowledge of the
programming code to examine outputs
Examines the internal design of the program
Requires detailed knowledge about structure of the
program
Allows exhaustive testing of all the logical paths (i.e. each
line of code for each condition)
Also known as glass box, structural, clear box and open
box testing
Advantages of white box testing
Helps to identify the following:
 Adherence to coding standards
 Adherence to coding guidelines
 Indentation
 Memory Leaks
 Buffer overflows, stacks
 Logical complexity of the program
 Limitations of the program
Statement coverage
Statement Coverage
 Each statement in the program is executed at least once
 100% of the statements in the program should be
executed at-least once
Weakness: statement coverage is necessary but not
sufficient. When there is a decision, you have to ensure that
both of its outcomes are taken; statement coverage alone
does not guarantee this.
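As a hedged illustration, consider the hypothetical average function below: a single test achieves 100% statement coverage yet never exercises the decision’s false path, where count == 0 silently yields 0:

#include <cassert>

// Divides total by count, but the count == 0 path is never tested below.
int average(int total, int count) {
    int result = 0;
    if (count > 0)
        result = total / count;
    return result;  // count == 0 silently yields 0
}

int main() {
    // This single test executes every statement (100% statement
    // coverage) yet never exercises the count == 0 path.
    assert(average(10, 2) == 5);
    return 0;
}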
Branch/Decision Coverage
Statement coverage does not address all outcomes of
decisions.
Branches like If..Else, Do..While are to be evaluated for
both true and false
Test each condition for a true and a false value
That is, each branch direction must be traversed at-least
once
Ex: For the condition (A>=5) or (B<2) THEN X=1, the test cases are:
A=6 and B=4 …True (Here, A is true and B is false)
A=2 and B=3 … False (Here, A is false and B is false)
That is, check how many decisions are there. For each decision, write
one test case for true and one test case for false
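A minimal sketch of the slide’s example as compilable C++ (the decide function is a made-up wrapper around the condition):

#include <cassert>

// Implements: IF (A >= 5) OR (B < 2) THEN X = 1
int decide(int a, int b) {
    int x = 0;
    if (a >= 5 || b < 2)
        x = 1;
    return x;
}

int main() {
    // Branch/decision coverage: the decision as a whole is
    // evaluated both true and false at least once.
    assert(decide(6, 4) == 1);  // A=6, B=4 -> decision true
    assert(decide(2, 3) == 0);  // A=2, B=3 -> decision false
    return 0;
}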
Conditions Coverage
All the conditions should be executed at least once for both
false and true conditions.
True and false outcome of each condition in a decision
must be tested.
Do not look for combinations.
Example: For the condition (A>=5) or (B<2) THEN X=1, the test
cases are:
A=6 and B=3 …True (Here, A is true and B is False)
A=2 and B=1 … True (Here, A is false and B is true)
Condition/Decision coverage
Condition/Decision Coverage
 Condition coverage alone may not always result in
decision coverage. In such cases, go for combined
decision + condition coverage.
Multiple Condition Coverage:
 Go for combinations. For Example: For the condition
(A>=5) or (B<2) THEN X=1, the test cases are:
 A=6, B=1 (A true, B true)
 A=6, B=3 (A true, B false)
 A=2, B=1 (A false, B true)
 A=2, B=3 (A false, B false)
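Continuing the same hypothetical decide function, multiple condition coverage exercises every true/false combination of the two conditions:

#include <cassert>

// Same condition as before: (A >= 5) OR (B < 2) -> X = 1
int decide(int a, int b) {
    return (a >= 5 || b < 2) ? 1 : 0;
}

int main() {
    assert(decide(6, 1) == 1);  // A true,  B true
    assert(decide(6, 3) == 1);  // A true,  B false
    assert(decide(2, 1) == 1);  // A false, B true
    assert(decide(2, 3) == 0);  // A false, B false
    return 0;
}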
Path Coverage
Errors are sometimes revealed in a path including
combination of branches.
More general coverage requires executing all possible
paths, known as path coverage criteria.
The number of paths may be infinite if there are loops,
so 100% path coverage is generally impossible
White box testing steps
Examine the program logic
Design test cases to satisfy logic coverage criteria
Run the test cases
Compare the actual results obtained with expected results
in the test case
Report errors in case of deviation from expected results
Compare actual coverage to expected coverage
Cyclomatic Complexity
Cyclomatic complexity provides a quantitative measure of
the logical complexity of a program
Cyclomatic complexity gives the number of linearly
independent paths in the given program
Based on the cyclomatic complexity value obtained, the
decision whether or not to accept the program for testing
can be made
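As a hedged worked example, using the hypothetical decide function sketched earlier: cyclomatic complexity can be computed as V(G) = E - N + 2 for a control flow graph with E edges and N nodes, or more simply as the number of binary decisions plus one. The single if in decide gives V(G) = 1 + 1 = 2 when its compound condition is counted as one decision, or 3 when each of its two conditions is counted separately, so two to three independent paths should be exercised.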
Black Box Testing
(Requirement Based Testing)
Software Testing Phases
Software Testing Phases
Unit Testing
Module Testing
Integration Testing
System Testing
User Acceptance Testing
Field Testing
Test Case Design Techniques
Client Server Application Testing
Web Based Application Testing
Introduction to web applications
Web Technology
Web Architecture
HTML/DHTML
Web servers
Cookies
Types of testing applicable to web applications
Applicable types of testing
Unit testing
Page flow testing
Usability testing
Functional testing
Load testing
Performance testing
Data volume testing
Security testing
Regression testing
External testing
Connectivity testing
Stress testing
Unit Testing
Unit testing involves testing of the individual modules and
pages that make up the application
In general, unit tests check the behavior of a given page i.e.
does the application behave correctly and consistently
given either good or bad input
Some of the types of checking would include:
 Invalid input (missing input, out-of-bound input,
entering an integer when a float is expected and vice versa,
control characters in strings, etc.)
 Alternate input format (e.g., 0 instead of 0.0,
0.00000001 instead of 0, etc.)
Unit Testing
 Button click testing e.g., multiple clicking with and
without pauses between clicks.
 Immediate reload after button click prior to response
having been received.
 Multiple reloads in the same manner as above.
Random input and random click testing.
 This testing involves a user randomly pressing buttons
(including multiple clicks on "hrefs") and randomly
picking checkboxes and selecting them.
Unit Testing
There are two forms of output screen expected:
 An error page indicating the type of error encountered.
 A normal page showing either the results of the
operation or the normal next page where more options
may be selected.
“In no event should a catastrophic error occur”
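A hedged sketch of such input checks in C++, assuming a hypothetical parseAmount validator (web unit tests would drive the page itself, but the checks they exercise look like this):

#include <cassert>
#include <optional>
#include <sstream>
#include <string>

// Returns the parsed amount, or nothing if the input is invalid.
std::optional<double> parseAmount(const std::string& s) {
    std::istringstream in(s);
    double value;
    char extra;
    if (!(in >> value) || in >> extra)    // reject garbage and trailing characters
        return std::nullopt;
    if (value < 0 || value > 1'000'000)   // out-of-bound input (bounds are assumptions)
        return std::nullopt;
    return value;
}

int main() {
    assert(parseAmount("0") == parseAmount("0.0"));   // alternate formats agree
    assert(!parseAmount("12abc"));                    // control characters / garbage rejected
    assert(!parseAmount("-1"));                       // out of bounds rejected
    assert(parseAmount("3.5").value() == 3.5);
    return 0;
}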
Page Flow Testing
Page flow testing deals with ensuring that jumping to
random pages does not confuse the application.
Each page should typically check to ensure that it can only
be viewed via specific previous pages, and if the referring
page was not one of that set, then an error page should be
displayed.
A page flow diagram is a very useful aid for the tester to
use when checking for correct page flow within the
application.
Impact of page flow on security
Some aspects of page flow testing cross into security.
Some simple checks to consider are
 Forcing the application to move in an unnatural path.
 The application must resist, and display an appropriate
error message
Page flow testing : Details
 Log into the system and then attempt to jump to any
page in any order once a session has been established.
 Use bookmarks and set up temporary web pages to
redirect into the middle of an application using faked
session information
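A minimal sketch of a referrer check, with made-up page names: each page lists the pages allowed to precede it, and any other transition is sent to an error page:

#include <iostream>
#include <map>
#include <set>
#include <string>

// True if 'to' may be reached from 'from'; anything else gets an error page.
bool allowedTransition(const std::string& from, const std::string& to) {
    static const std::map<std::string, std::set<std::string>> allowed = {
        {"checkout", {"cart"}},
        {"confirm",  {"checkout"}},
    };
    auto it = allowed.find(to);
    return it != allowed.end() && it->second.count(from) > 0;
}

int main() {
    // Jumping straight to "confirm" from a bookmark must be rejected.
    std::cout << allowedTransition("cart", "checkout") << '\n';    // 1 (ok)
    std::cout << allowedTransition("bookmark", "confirm") << '\n'; // 0 (error page)
    return 0;
}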
Usability testing
Usability testing ensures that all pages present a cohesive
look to the user, including spelling, graphics, page size,
response time, etc
Examples of usability testing include:
 Spelling checks
 Graphical user interface checks (colors, dithering,
aliasing, size, etc.,)
 Adherence to web GUI Standards
 Meaningful error messages
 Accuracy of data displayed
Usability testing contd.
 Page Navigation
 Context sensitivity
 Editorial continuity
 Accessibility
 Accuracy of data in the database as a result of user
input
 Accuracy of data in the database as a result of external
factors (e.g. imported data)
 Meaningful help pages including context sensitive help
Functional Testing
Functional testing ensures
 Conformance to functional requirements of the
application
 Scenarios/Test cases are designed to find out
conformance to the requirements
 Whole business logic gets tested as part of the
functional testing
Load Testing
Load testing the application involves generation of varying
loads (in terms of concurrent users) against
 web server,
 the databases supporting the web server and
 the middleware/application server logic connecting
those pages to the databases
Load testing includes verification of data integrity on the
web pages, within the back end database and also the load
ramping or surges in activity against the application
Load Testing
"Does the site scale", "Is the site's response time
deterministic, etc.
Examples of load testing would include:
 Sustained low load test (50 users for around 48 hours).
 Sustained high load test (300+ users for 12 hours).
 Surge test (e.g. run 50 users, then surge to 500 users
and then return to 50, no memory leaks, lost users,
orphaned processes, etc., should be seen).
 The system should continue running with multiple
surges at various times during the day.
 This test should run for 48 hours.
Load Testing contd.
Load testing also serves to discover at what load the
application fails and where the saturation points are.
Performance Testing
Performance testing refers to the time the software takes
to process and present the requests made by the
end users
Performance depends on
 Speed of the network
 Hardware configuration of application server, web
server, database server and the client system (Processor,
RAM etc)
 Volume of data in the database
Data Volume Testing
Data volume testing involves testing the application under
data load, where large quantities of data are passed through
the system. (e.g. large number of items in
dropdown/combo boxes, large amount of data in text
boxes).
The performance of the application should be monitored during
this testing, since a slow database could significantly affect
response time; monitoring data must be collected throughout.
Data Volume Testing
This data can be used as a control set for contrasting
monitoring data from a live system and providing
predictive information indicating when major application
stress points may be encountered.
No errors should be seen on application pages or in error
logs for pages that are data intensive.
Security Testing
Security testing involves verifying whether both the
servers and the application are managing security correctly
Security from the server perspective
 Attempt to penetrate system security both internally and
externally to ensure the system that houses the
application is secure from both internal and external
attacks.
 Attempt to cause things like buffer overflow to result in
root access being given accidentally, (such code does
exist, but explaining it is beyond the scope of this
document)
Security Testing contd.
 Attempt to cause the application to crash by giving it
false or random information
 Ensure that the server OS is up to correct patch levels
from security viewpoint
 Ensure that the server is physically secure
Security Testing contd.
Application level security testing involves testing some or
all the following
 Unauthenticated access to the application
 Unauthorized access to the application
 Unencrypted data passing
 Protection of the data
 Log files should be checked to ensure they do not
contain sensitive information
Security Testing contd.
 Faked sessions. Session information must be
valid and secure. (e.g. a URL containing a
session identifier cannot be copied from one
system to another and then the application be
continued from the different system without
being detected)
 Multiple login testing by a single user from
several clients
Security Testing contd.
 Attempt to break into the application by running
username/password checks using password-cracking
program
 Security audit, e.g. examine log files, etc., no
sensitive information should be left in raw
text/human readable form in any log file
 Automatic logout after N minutes of inactivity with
positive feedback to the user
Regression Testing
Regression testing ensures that during the lifetime of the
application, any fixes do not break other parts of the
application
This type of testing typically involves running all the tests,
or a relevant subset of those tests when defect fixes are
made or new functionalities added
The regression tests must also be kept up to date with
planned changes in the application. As the application
evolves, so must the tests
External Testing
External testing deals with checking the effect of external
factors on the application. Examples of external factors are
the web server, the database server, the browser, network
connectivity issues, etc.
Examples of external testing are:
 Database unavailability test (e.g., is login or further access to
the application permitted should the database go into a
scheduled maintenance window)
 Database error detection and recovery test (e.g., simulate
loss of database connectivity, the application should detect
this, and report an error accordingly). The application should
be able to recover without human intervention when the
database returns
External Testing
 Database authentication test (check access privileges to
the database).
 Connection pooling test (ensure that database
connections are used sparingly, and will not run out
under load).
 Web page authentication test.
 Browser compatibility tests – for example, does the
application behave the same way on multiple browsers,
does the JavaScript work the same way, etc.,
Connectivity Testing
Connectivity testing involves determining if the servers
and clients behave appropriately under varying
circumstances
This testing is difficult to accomplish from a server
perspective since it is expected that the servers will be
operating with standby power supplies as well as being in a
highly available configuration
Thus the server tests need not be run using a power-off
scenario; simply removing the network connection to the
PC may be sufficient
Connectivity Testing contd.
Two aspects of connectivity testing
 Voluntary, where a user actively interacts with the
system in an unexpected way
 Involuntary, where the system acts in an unpredictable
manner
Connectivity Testing: Involuntary
Test:
 Forcing the browser to prematurely terminate during a page
load using a task manager to kill the browser, or hitting the
ESC key and reloading or revisiting the same page via a
bookmark.
Expectation:
 The testing should cover both a small delay (< 10secs) in
reinstating the browser as well as a long delay (> 10mins). In
the latter case the user should not be able to connect back to
the application without being redirected to the login page.
Connectivity Testing: Involuntary
Test:
 Simulation of Hub Failure between PC and the Web Server.
 This can be simulated by removing the network cable
from the PC, attempting to visit a page, aborting the
visit, and then reconnecting the cable.
 The test should use two time delays; the first should be
under 15 seconds, and the second delay around 15 minutes
before reconnecting.
 After reconnecting, attempt to reload the previous page
Expectation: The user should be able to continue with the
session unless a specified timeout has occurred in which case
the user should be redirected to a login page.
Connectivity Testing: Involuntary
Test: Web server on/off test.
 Shutdown the web server, then restart the server
Expectation:
 The user should be able to connect back to the
application without being redirected to the login page.
This will prove the statelessness of individual pages
Note:
 The shutdown is only for the web server. Do not
attempt this with an application server, as that is a
separate test
Connectivity Testing: Involuntary
Test: Database server on/off test.
 Shutdown the database server and restart it
Expectation: The user should be able to connect back to
the application without being redirected to the login page
It may be that a single transaction needs to be redone, and
the application should detect this and react accordingly
Connectivity Testing: Involuntary
Application server on/off test
 Shutdown the application server and restart it
 There are two possible outcomes for this depending on
how session management is implemented
 The first outcome is that the application redirects to
an error page indicating loss of connectivity, and the
user is requested to login and retry
 The second outcome is the application continues
normally since no session information was lost
because it was held in a persistent state that
transcends application server restarts
Connectivity Testing: Voluntary
Examples of voluntary connectivity testing include;
 Quit from session without the user saving state.
 Quit from session with the user saving state.
 Server – forced quit from session due to inactivity.
 Server – forced quit from session due to server
problem.
 Client forced quit from session due to visiting another
site in the middle of a session for a brief period of time.
 Client – forced quit from session due to visiting another
site/application for an extended period of time.
 Client – forced quit due to browser crashing
Extended Session Testing
Remaining in a session for an extended period of time while
clicking items to navigate the screens. The session must not be
terminated by the server except in the case of a deliberate
logout initiated by the user
Remaining on a single page for an extended length of time.
The session should be automatically terminated and the
next click by the user should take the user to a page
indicating why the session was terminated and the option
to log back into the system should be present.
The page may have timed redirect associated with it, and if
so, a page indicating a timed out session should be
displayed.
Extended Session Testing
The following must be tested
 The user's session should have been saved and may
optionally be restored on re login
 The user's state must reflect the last complete action the
user performed
 Leaving the application pages to visit another site or
application and then returning to original application
via a bookmark or the back button should result in a
restoration of state, and the application should continue
as if the person had not left
Power Hit/Reboot/Other Cycle Testing
Power Hit/Cycle testing involves determining if the servers
and clients act appropriately during the recovery process
 Client power off/on test
 Client hub power off/on test
 Client network connection removal/reinsertion test
 Server power off/on test
 Server Hub power off/on test
 Server network connection removal/reinsertion test
Standards Conformance Testing
Conformance to
 Web application standards
 Web user interface standards and guidelines
 Web Usability standards
 Web Security standards
 Domain specific standards (e.g. HL7, CCOW for
Healthcare, SOX for banking software, etc.)
Bug Life Cycle
Bug Life Cycle
Submitted → In work: the developer is solving the bug
In work → Solved: the bug is solved only by the developer
Solved → Validated: the bug is tested by the tester and closed here
Solved → Re-work: if the bug is not solved, it goes back to the developer
Submitted → Terminated: the bug is invalid
Submitted → Deferred: unable to fix in the current release
Terminated / Deferred → Reviewed: the bug is reviewed and closed by management
Bug life cycle [Notes]
The status “Submitted” or “Posted” is assigned to the
defect when the tester raises the defect.
In case the submitted bug is found to be invalid, the bug is
moved to “Terminated” state or “Rejected” state by the
development team.
The status of the bug is moved to “In work” by the
developer once the developer starts working on fixing the
defect.
Once the developer fixes the bug, the developer moves the
status of the defect to “Solved” state and the fix shall be
made available to the tester in the next release.
Bug life cycle [Notes] contd.
The tester tests the fix for the bug and if found to be
working fine, moves the status of the defect to “Validated”
state, otherwise puts it back to the developer and the status
of the bug is moved back to “In work”.
In case the development team is not in a position to fix the
defect in the current release, the development team moves
the status of the defect to “Deferred” state meaning it shall
be taken up for fixing in the next release.
Reporting Defects
Reporting defects: Attributes
Product name/Application name
Version
Module
Summary
Steps to reproduce
Impact
Database information
Severity
Priority
Browser (IE, NN, Mozilla)
Screen shots (if required and available)
Reproducible (Yes, No, Sporadic)
Type of bug (Performance, Functionality, User interface etc)
Phase of testing (Unit, System testing)
Details of the attributes
Product name/Application:
 Provide the name of the application being tested or
select it from a list
Version
 Provide the version of the application being tested or select
it from a list. Ex: version 1.0, 1.2 etc
Module
 Provide module of the application in which the bug
occurred or select it from a list
Details of the attributes contd.
Summary
 Provide a summary of the defect such that this summary,
when viewed, gives a sufficient picture of which team
and category the defect belongs to.
 Project Leads/Managers assign defect to different
individuals based on the details of the summary.
Steps to reproduce (Description)
 Provide step by step explanation of how you arrived at
the defect. The development team must be able to
reproduce the defect with these details.
Details of the attributes contd.
Impact
 Provide impact of the defect from the application and
end user’s perspective, being posted.
Database information
 Provide information on the database as to whether
 it is a new database,
 or a ported database,
 and if ported, from which previous release
Details of the attributes contd.
Severity
 Critical (The defect has severe impact on the end user’s
workflow)
 Serious (The defect has blocked workflow(s), but
alternatives are available)
 Minor (Does not block any user’s workflows. Trivial
error)
Priority
 High (Needs immediate fixing)
 Medium (Can be fixed with agreed time period)
 Low (can be fixed at convenience)
Details of the attributes contd.
Phase of testing
 Provide or select a phase of testing such as Unit testing,
Module testing, Integration testing, System testing

 This helps to analyze how many bugs were uncovered
during a particular phase of testing and facilitates
comparison of finding out defects across phases
Details of the attributes contd.
Reproducible
 This attribute generally has 3 options i.e. Yes, No, &
Sporadic
 Selecting Yes indicates that the defect is reproducible
by following the steps specified as part of the defect.
 Selecting No indicates that the defect is not
reproducible in a particular given sequence.
 Selecting “Sporadic” indicates the defect is
reproducible by following the steps specified but the
defect does not consistently appear
Details of the attributes contd.
Type of bug
 Provide or select the type of bug like whether defect
found falls into the category of Functionality,
Performance etc.
 General categories include Functionality, Performance,
Usability, Load, Volume, Stress, Security, User
interface
 These statistics help to understand how many
functional, performance, etc. defects appeared in the
release and give direction for identifying the bottlenecks
Details of the attributes contd.
Browser
 Provide or select the browser on which the software was
being used when the defect occurred. Ex: Internet
Explorer, Netscape Navigator, Mozilla etc.
Screenshots
 Attach screenshots of error messages or system crashes
while posting the defect. This helps the development
team to understand the defect better
Case Study
Study the following defects observed while testing a
software product and re-write them in proper format and
assign appropriate severity and priority to the defects.
Thank You
SW Testing Fundamentals

More Related Content

What's hot

Basic software-testing-concepts
Basic software-testing-conceptsBasic software-testing-concepts
Basic software-testing-concepts
medsherb
 
Testing concepts ppt
Testing concepts pptTesting concepts ppt
Testing concepts ppt
Rathna Priya
 

What's hot (19)

Testing automation in agile environment
Testing automation in agile environmentTesting automation in agile environment
Testing automation in agile environment
 
Elements of a Test Framework
Elements of a Test FrameworkElements of a Test Framework
Elements of a Test Framework
 
Intro to Manual Testing
Intro to Manual TestingIntro to Manual Testing
Intro to Manual Testing
 
Software Testing or Quality Assurance
Software Testing or Quality AssuranceSoftware Testing or Quality Assurance
Software Testing or Quality Assurance
 
Manual Testing Interview Questions | Edureka
Manual Testing Interview Questions | EdurekaManual Testing Interview Questions | Edureka
Manual Testing Interview Questions | Edureka
 
Unit tests benefits
Unit tests benefitsUnit tests benefits
Unit tests benefits
 
Quality Assurance Guidelines
Quality Assurance GuidelinesQuality Assurance Guidelines
Quality Assurance Guidelines
 
Automated Testing vs Manual Testing
Automated Testing vs Manual TestingAutomated Testing vs Manual Testing
Automated Testing vs Manual Testing
 
Manual Testing Material by Durgasoft
Manual Testing Material by DurgasoftManual Testing Material by Durgasoft
Manual Testing Material by Durgasoft
 
Code Review Tool Evaluation
Code Review Tool EvaluationCode Review Tool Evaluation
Code Review Tool Evaluation
 
Manual testing
Manual testingManual testing
Manual testing
 
Career in Software Testing | Skills Required for Software Test Engineer | Edu...
Career in Software Testing | Skills Required for Software Test Engineer | Edu...Career in Software Testing | Skills Required for Software Test Engineer | Edu...
Career in Software Testing | Skills Required for Software Test Engineer | Edu...
 
Software Testing Basic Concepts
Software Testing Basic ConceptsSoftware Testing Basic Concepts
Software Testing Basic Concepts
 
Software testing Training Syllabus Course
Software testing Training Syllabus CourseSoftware testing Training Syllabus Course
Software testing Training Syllabus Course
 
Agile test practices
Agile test practicesAgile test practices
Agile test practices
 
Building an Automation Framework
Building an Automation FrameworkBuilding an Automation Framework
Building an Automation Framework
 
Basic software-testing-concepts
Basic software-testing-conceptsBasic software-testing-concepts
Basic software-testing-concepts
 
Functional & Performance Test Automation with CI
Functional & Performance Test Automation with CI Functional & Performance Test Automation with CI
Functional & Performance Test Automation with CI
 
Testing concepts ppt
Testing concepts pptTesting concepts ppt
Testing concepts ppt
 

Similar to SW Testing Fundamentals

softwaretestingppt-FINAL-PPT-1
softwaretestingppt-FINAL-PPT-1softwaretestingppt-FINAL-PPT-1
softwaretestingppt-FINAL-PPT-1
FAIZALSAIYED
 
softwaretestingppt-120810095500-phpapp02 (1).pdf
softwaretestingppt-120810095500-phpapp02 (1).pdfsoftwaretestingppt-120810095500-phpapp02 (1).pdf
softwaretestingppt-120810095500-phpapp02 (1).pdf
BabaShaikh3
 
Glossary of Testing Terms and Concepts
Glossary of Testing Terms and ConceptsGlossary of Testing Terms and Concepts
Glossary of Testing Terms and Concepts
mqamarhayat
 
16103271 software-testing-ppt
16103271 software-testing-ppt16103271 software-testing-ppt
16103271 software-testing-ppt
atish90
 
Software testing sengu
Software testing  senguSoftware testing  sengu
Software testing sengu
Sengu Msc
 

Similar to SW Testing Fundamentals (20)

Mca se chapter_07_software_validation
Mca se chapter_07_software_validationMca se chapter_07_software_validation
Mca se chapter_07_software_validation
 
softwaretestingppt-FINAL-PPT-1
softwaretestingppt-FINAL-PPT-1softwaretestingppt-FINAL-PPT-1
softwaretestingppt-FINAL-PPT-1
 
Software Quality
Software Quality Software Quality
Software Quality
 
softwaretestingppt-120810095500-phpapp02 (1).pdf
softwaretestingppt-120810095500-phpapp02 (1).pdfsoftwaretestingppt-120810095500-phpapp02 (1).pdf
softwaretestingppt-120810095500-phpapp02 (1).pdf
 
Best software testing course
Best software testing courseBest software testing course
Best software testing course
 
Software Testing - SDLC Model
Software Testing - SDLC ModelSoftware Testing - SDLC Model
Software Testing - SDLC Model
 
System testing
System testingSystem testing
System testing
 
Software Testing PPT | Software All Testing
Software Testing PPT | Software All TestingSoftware Testing PPT | Software All Testing
Software Testing PPT | Software All Testing
 
testing.pptx
testing.pptxtesting.pptx
testing.pptx
 
Glossary of Testing Terms and Concepts
Glossary of Testing Terms and ConceptsGlossary of Testing Terms and Concepts
Glossary of Testing Terms and Concepts
 
Software_Testing_ppt.pptx
Software_Testing_ppt.pptxSoftware_Testing_ppt.pptx
Software_Testing_ppt.pptx
 
16103271 software-testing-ppt
16103271 software-testing-ppt16103271 software-testing-ppt
16103271 software-testing-ppt
 
Software testing ppt
Software testing pptSoftware testing ppt
Software testing ppt
 
Software testing sengu
Software testing  senguSoftware testing  sengu
Software testing sengu
 
Software testing
Software testingSoftware testing
Software testing
 
software testing technique
software testing techniquesoftware testing technique
software testing technique
 
Software testing & Quality Assurance
Software testing & Quality Assurance Software testing & Quality Assurance
Software testing & Quality Assurance
 
Software testing
Software testingSoftware testing
Software testing
 
Software testing strategies
Software testing strategiesSoftware testing strategies
Software testing strategies
 
Software-Testing-ppt.pptx
Software-Testing-ppt.pptxSoftware-Testing-ppt.pptx
Software-Testing-ppt.pptx
 

Recently uploaded

%+27788225528 love spells in Atlanta Psychic Readings, Attraction spells,Brin...
%+27788225528 love spells in Atlanta Psychic Readings, Attraction spells,Brin...%+27788225528 love spells in Atlanta Psychic Readings, Attraction spells,Brin...
%+27788225528 love spells in Atlanta Psychic Readings, Attraction spells,Brin...
masabamasaba
 
Abortion Pill Prices Boksburg [(+27832195400*)] 🏥 Women's Abortion Clinic in ...
Abortion Pill Prices Boksburg [(+27832195400*)] 🏥 Women's Abortion Clinic in ...Abortion Pill Prices Boksburg [(+27832195400*)] 🏥 Women's Abortion Clinic in ...
Abortion Pill Prices Boksburg [(+27832195400*)] 🏥 Women's Abortion Clinic in ...
Medical / Health Care (+971588192166) Mifepristone and Misoprostol tablets 200mg
 
Abortion Pills In Pretoria ](+27832195400*)[ 🏥 Women's Abortion Clinic In Pre...
Abortion Pills In Pretoria ](+27832195400*)[ 🏥 Women's Abortion Clinic In Pre...Abortion Pills In Pretoria ](+27832195400*)[ 🏥 Women's Abortion Clinic In Pre...
Abortion Pills In Pretoria ](+27832195400*)[ 🏥 Women's Abortion Clinic In Pre...
Medical / Health Care (+971588192166) Mifepristone and Misoprostol tablets 200mg
 
Love witchcraft +27768521739 Binding love spell in Sandy Springs, GA |psychic...
Love witchcraft +27768521739 Binding love spell in Sandy Springs, GA |psychic...Love witchcraft +27768521739 Binding love spell in Sandy Springs, GA |psychic...
Love witchcraft +27768521739 Binding love spell in Sandy Springs, GA |psychic...
chiefasafspells
 

Recently uploaded (20)

OpenChain - The Ramifications of ISO/IEC 5230 and ISO/IEC 18974 for Legal Pro...
OpenChain - The Ramifications of ISO/IEC 5230 and ISO/IEC 18974 for Legal Pro...OpenChain - The Ramifications of ISO/IEC 5230 and ISO/IEC 18974 for Legal Pro...
OpenChain - The Ramifications of ISO/IEC 5230 and ISO/IEC 18974 for Legal Pro...
 
WSO2CON 2024 - How to Run a Security Program
WSO2CON 2024 - How to Run a Security ProgramWSO2CON 2024 - How to Run a Security Program
WSO2CON 2024 - How to Run a Security Program
 
VTU technical seminar 8Th Sem on Scikit-learn
VTU technical seminar 8Th Sem on Scikit-learnVTU technical seminar 8Th Sem on Scikit-learn
VTU technical seminar 8Th Sem on Scikit-learn
 
%in Stilfontein+277-882-255-28 abortion pills for sale in Stilfontein
%in Stilfontein+277-882-255-28 abortion pills for sale in Stilfontein%in Stilfontein+277-882-255-28 abortion pills for sale in Stilfontein
%in Stilfontein+277-882-255-28 abortion pills for sale in Stilfontein
 
WSO2CON 2024 - Does Open Source Still Matter?
WSO2CON 2024 - Does Open Source Still Matter?WSO2CON 2024 - Does Open Source Still Matter?
WSO2CON 2024 - Does Open Source Still Matter?
 
%+27788225528 love spells in Atlanta Psychic Readings, Attraction spells,Brin...
%+27788225528 love spells in Atlanta Psychic Readings, Attraction spells,Brin...%+27788225528 love spells in Atlanta Psychic Readings, Attraction spells,Brin...
%+27788225528 love spells in Atlanta Psychic Readings, Attraction spells,Brin...
 
%in ivory park+277-882-255-28 abortion pills for sale in ivory park
%in ivory park+277-882-255-28 abortion pills for sale in ivory park %in ivory park+277-882-255-28 abortion pills for sale in ivory park
%in ivory park+277-882-255-28 abortion pills for sale in ivory park
 
tonesoftg
tonesoftgtonesoftg
tonesoftg
 
What Goes Wrong with Language Definitions and How to Improve the Situation
What Goes Wrong with Language Definitions and How to Improve the SituationWhat Goes Wrong with Language Definitions and How to Improve the Situation
What Goes Wrong with Language Definitions and How to Improve the Situation
 
WSO2CON 2024 - Building the API First Enterprise – Running an API Program, fr...
WSO2CON 2024 - Building the API First Enterprise – Running an API Program, fr...WSO2CON 2024 - Building the API First Enterprise – Running an API Program, fr...
WSO2CON 2024 - Building the API First Enterprise – Running an API Program, fr...
 
Abortion Pill Prices Boksburg [(+27832195400*)] 🏥 Women's Abortion Clinic in ...
Abortion Pill Prices Boksburg [(+27832195400*)] 🏥 Women's Abortion Clinic in ...Abortion Pill Prices Boksburg [(+27832195400*)] 🏥 Women's Abortion Clinic in ...
Abortion Pill Prices Boksburg [(+27832195400*)] 🏥 Women's Abortion Clinic in ...
 
WSO2Con204 - Hard Rock Presentation - Keynote
WSO2Con204 - Hard Rock Presentation - KeynoteWSO2Con204 - Hard Rock Presentation - Keynote
WSO2Con204 - Hard Rock Presentation - Keynote
 
MarTech Trend 2024 Book : Marketing Technology Trends (2024 Edition) How Data...
MarTech Trend 2024 Book : Marketing Technology Trends (2024 Edition) How Data...MarTech Trend 2024 Book : Marketing Technology Trends (2024 Edition) How Data...
MarTech Trend 2024 Book : Marketing Technology Trends (2024 Edition) How Data...
 
WSO2CON 2024 - Cloud Native Middleware: Domain-Driven Design, Cell-Based Arch...
WSO2CON 2024 - Cloud Native Middleware: Domain-Driven Design, Cell-Based Arch...WSO2CON 2024 - Cloud Native Middleware: Domain-Driven Design, Cell-Based Arch...
WSO2CON 2024 - Cloud Native Middleware: Domain-Driven Design, Cell-Based Arch...
 
Architecture decision records - How not to get lost in the past
Architecture decision records - How not to get lost in the pastArchitecture decision records - How not to get lost in the past
Architecture decision records - How not to get lost in the past
 
WSO2CON 2024 - WSO2's Digital Transformation Journey with Choreo: A Platforml...
WSO2CON 2024 - WSO2's Digital Transformation Journey with Choreo: A Platforml...WSO2CON 2024 - WSO2's Digital Transformation Journey with Choreo: A Platforml...
WSO2CON 2024 - WSO2's Digital Transformation Journey with Choreo: A Platforml...
 
Abortion Pills In Pretoria ](+27832195400*)[ 🏥 Women's Abortion Clinic In Pre...
Abortion Pills In Pretoria ](+27832195400*)[ 🏥 Women's Abortion Clinic In Pre...Abortion Pills In Pretoria ](+27832195400*)[ 🏥 Women's Abortion Clinic In Pre...
Abortion Pills In Pretoria ](+27832195400*)[ 🏥 Women's Abortion Clinic In Pre...
 
BUS PASS MANGEMENT SYSTEM USING PHP.pptx
BUS PASS MANGEMENT SYSTEM USING PHP.pptxBUS PASS MANGEMENT SYSTEM USING PHP.pptx
BUS PASS MANGEMENT SYSTEM USING PHP.pptx
 
AI & Machine Learning Presentation Template
AI & Machine Learning Presentation TemplateAI & Machine Learning Presentation Template
AI & Machine Learning Presentation Template
 
Love witchcraft +27768521739 Binding love spell in Sandy Springs, GA |psychic...
Love witchcraft +27768521739 Binding love spell in Sandy Springs, GA |psychic...Love witchcraft +27768521739 Binding love spell in Sandy Springs, GA |psychic...
Love witchcraft +27768521739 Binding love spell in Sandy Springs, GA |psychic...
 

SW Testing Fundamentals

  • 2. V Model Validation Verification LLD HLD System test planning Integration test planning Unit test planning Unit testing Integration Testing System Testing Coding Delivery production deployment Maintenance and enhancement URS UAT planning SRS User Acceptance Testing
  • 3. Software Testing Definitions The process of executing a program or part of a program with the intent of finding errors (Myers) Testing is the process of trying to discover every conceivable fault of weakness in a work product (Myers) The process of searching for errors (Kaner) Testing is the process of evaluating or exercising a system or system component by manual or automated means to verify that the software meets specified requirements (IEEE)
  • 4. Role of a Tester Assuring that the software meets user’s needs Software can be used with negligible risks This is achieved through  Verification  Validation
  • 5. Verification Verification  It is the process of determining whether or not the product of given phase fulfill the spec. from the previous phase  Uses reviews, inspections, and demonstrations throughout development to ensure the quality of the product of that phase, including that it meets the requirements from the previous phase  “Are we building the product right?”
  • 6. Validation  The process of evaluating the software at the end of development to ensure compliance with the specified requirements  Includes what is commonly thought of as testing and comparing test results to expected results. Validation occurs at the end of the development process.  “Are we building the right product?”
  • 7. Static & Dynamic Testing Most of the Verification and Validation activities can be classified as Static or Dynamic Static testing (without executing any program)  Requirement reviews  Design reviews  Code reviews Dynamic testing  Testing the software by executing the program
  • 8. Characteristics of Static Testing Static  Do not observe system behavior  Not looking for system failures  Faults are directly detected  Focus is on evaluating adherence to  Standards,  Guidelines and  Processes
  • 9. Characteristics of Dynamic Testing Dynamic Testing  The program is executed  System behavior is observed  Determines the existence of failures  Reveals the presence of faults
  • 10. White Box Testing (Code based testing) A software testing technique whereby explicit knowledge of the internal workings of the item being tested is used to design the tests White box testing uses specific knowledge of the programming code to examine outputs Also known as glass box, structural, clear box and open box testing
  • 11. Advantages of white box testing Helps to identify the following:  Adherence to coding standards  Adherence to coding guidelines  Indentation  Memory Leaks  Logical complexity of the program  Limitations of the program
  • 12. Black Box Testing (Requirement based testing) A software testing technique whereby the expected outcome of the software is verified by providing inputs without considering how the software program arrives at those outputs. The internal workings of the item being tested are not known by the tester in black box testing. The tester does not ever examine the programming code and does not need any further knowledge of the program other than its specifications.
  • 13. Advantages of Black Box testing The test is unbiased because the designer and the tester are independent of each other. The tester does not need knowledge of any specific programming language(s). The test is done from the point of view of the end user, not the designer or programmer. Test cases can be designed as soon as the specifications are complete.
  • 14. Conclusions White box testing does not guarantee 100% conformance to requirements Black box testing does not concentrate on the logic of the program, but ensures conformance to requirements Hence, both white box and black box testing are required to ensure product quality All types of testing, whether static or dynamic, white box or black box, are part of verification and validation activities. Let us now look at those activities.
  • 15. Verification & Validation activities Verification  Requirement reviews  Design reviews  Code reviews Validation  Unit testing  Module testing  Integration testing  System testing  Regression testing  User acceptance testing  Field testing
  • 16. Software Testing Life Cycle [STLC]
  • 17. STLC Activities Test Requirements document Test Planning Test Design Test Execution Defect Tracking
  • 18. Test Requirements Document From the software requirement specification (SRS) document, a list of testable requirements is extracted and referred to as the Test Requirements document. All non-technical and un-testable requirements are excluded from this document. The Test Requirements document is the base for further testing activities
  • 19. Test Planning Mainly, Test Plan addresses  Scope and objectives of testing  Schedule, Resources and Reporting  Types of testing and methodology  Phases of testing applicable and scope of testing in each phase  Software and hardware requirements  Identified risks and strategy for mitigating those risks  Information regarding tools used through entire testing life cycle
  • 20. Test Design Test Design is applicable to both white box and black box testing Test design activity involves designing test cases for a given requirement (black box testing) or for a given program (white box testing). A test case is defined as  “a set of test inputs, execution conditions, and expected results developed for a particular objective, such as to exercise a particular program path or to verify compliance with a specific requirement” [IEEE]
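As an illustration of the IEEE definition above, a test case can be modelled as a simple record. This is a minimal sketch in C++ (the language of Case Study III later in this deck); the field names and values are illustrative assumptions, not a standard schema.

    #include <iostream>
    #include <string>
    #include <vector>

    // Minimal sketch of a test case per the IEEE definition:
    // inputs, execution conditions, expected results, and an objective.
    struct TestCase {
        std::string id;                   // unique identifier, e.g. "TC-001"
        std::vector<std::string> inputs;  // test inputs
        std::string preconditions;        // execution conditions
        std::string expectedResult;       // expected result
        std::string objective;            // path or requirement exercised
    };

    int main() {
        TestCase tc{"TC-001",
                    {"username=alice", "password=<valid>"},
                    "User 'alice' exists and is active",
                    "Login succeeds and the home page is shown",
                    "Verify compliance with a hypothetical login requirement"};
        std::cout << tc.id << ": " << tc.objective << '\n';
    }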
  • 21. Test Execution Test execution involves  Executing the developed test cases on a piece of developed program (code based test cases) or on the entire software application (requirements based test cases)  The status of each test case is updated during execution  Possible states include  Pass, Fail, Unable to test, Deferred  Test execution statistics are collected and analyzed for test progress monitoring
  • 22. Defect Tracking When the actual result obtained from the software application during testing deviates from the expected result written in the test case, this is termed a “defect”. The test case is failed and a defect is posted against the software. The defect is fixed by the development team and the fix is provided in a subsequent release. The fix provided for the defect is validated and, if found to be working, the test case passes and the defect is closed. Posting, tracking and closing defects are done in a defect tracking tool.
  • 23. SDLC Vs STLC Requirements Phase → Test Requirements document; Design Phase → Test Planning; Coding Phase → Test Case Design and Unit Test Execution; Deployment Phase → System Test Execution; Defect Tracking runs alongside test execution
  • 25. Requirement reviews Requirement quality affects work performed in subsequent phases of the system life cycle. Requirements of poor quality  Increase cost and schedule: effort is spent during design and implementation trying to figure out what the requirements are  Decrease product quality: poor requirements cause the wrong product to be delivered or de-scoping to meet schedule or cost constraints
  • 26. Requirement reviews contd. Increase maintenance effort: lack of traceability increases the effort to identify where changes are required, especially as knowledgeable personnel leave Create disputes with the customer/client: ambiguity causes differences in expectations and contractual issues Are a major cause of project failure: all of the above
  • 29. Requirement characteristic: Cohesive Does each requirement specify only one thing? Do all parts of the requirement belong together: Do all parts of a data requirement involve the same data abstraction? Do all parts of a functional requirement involve the same functional abstraction? Do all parts of an interface requirement involve the same interface? Do all parts of a quality requirement involve the same quality factor or sub-factor?
  • 30. Requirement characteristic: Complete Is each requirement self contained with no missing information? Does each requirement contain all relevant information? For example, does the requirement include all relevant preconditions such as the relevant state of the application or component? Does each requirement need no further amplification or clarification? Does each requirement provide sufficient information to avoid ambiguity?
  • 31. Requirement characteristic: Complete If the requirement is not a part of the current release, then is it specified as completely and as thoroughly as is currently known? Is each identified “requirement” actually a single requirement and not actually multiple requirements? Is the use of conjunctions (“and” and “or”) restricted to preconditions and invariants?
  • 32. Requirement characteristic: Consistent Is each requirement externally consistent with its documented sources such as higher-level goals and requirements? Is each requirement externally consistent with all other related requirements of the same type or in the same requirements specification? For example, two requirements should neither be contradictory nor describe the same concepts using different words. Are the constituent parts of each requirement internally consistent? For example, are all parts of a compound precondition or post-condition consistent?
  • 33. Requirement characteristic: Feasible Can each requirement be implemented given the existing hardware or software technology? Can each requirement be implemented given the endeavor’s budget? Can each requirement be implemented given the endeavor’s schedule? Can each requirement be implemented given the endeavor’s constraints on staffing (e.g., staff size, expertise, and experience)? Can each requirement be implemented given the limitations of physics, chemistry, etc?
  • 34. Requirement characteristic: Independent The requirement does not rely on another requirement to be fully understood. Requirements that need proxies are not independent. Parent requirements rely on their children to be fully defined. In testing, a parent is not satisfied until all its children are met. Why retain them? These may be source requirements that must be retained.
  • 35. Requirement characteristic: Independent Also, using them to structure the proxies or children improves understandability. Example: "user friendly" can be used to assign, talk about, or locate the group of proxies defining "user friendly" for that particular project.
  • 36. Requirement characteristic: Mandatory Is each requirement essential to the success of the application or component? Is each requirement truly mandatory (i.e., a true requirement that must be met and implemented)? Is each requirement truly required by some stakeholder, typically the customer or user organization? Is each requirement free from unnecessary constraints (e.g., architecture, design, implementation, testing, and other technology decisions)?
  • 37. Requirement characteristic: Mandatory Does each requirement specify a “what” rather than a “how”? Is each requirement clearly differentiated from: A “nice to have” item on someone’s wish list (i.e., gold-plating)? Constraints?
  • 38. Requirement characteristic: Metadata Individual requirements should have metadata (i.e., attributes or annotations) that characterizes them. This metadata can include (but is not limited to)  Acceptance criteria, Allocation, Assumptions, Identification, Prioritization, Rationale, Schedule, Status, and Tracing information
  • 39. Requirement characteristic: Verifiability Can each requirement be verified against its source? Can each requirement be verified against its associated standards (e.g., content and format), guidelines, and/or templates?
  • 40. Requirement characteristic: Validatability Is it possible to ensure that each requirement is actually what the customer representatives really want and need? Is it possible to ensure that each requirement is actually what the user representatives really want and need? Is it possible to ensure that each requirement is actually what the marketing representatives really want and need?
  • 41. Requirement characteristic: External Observability Does each requirement only specify behavior and/or characteristics that are externally observable when treating the application or component as a black-box? Does each requirement avoid specifying any internal architecture, design, implementation, or testing decisions? If a requirement does specify one or more internal architecture, design, implementation, or testing decisions, is the requirement clearly identified as a constraint rather than as a pure requirement?
  • 42. Requirement characteristic: Testable Able to prove the object of the requirement satisfies the requirement Un-testable requirements can lead to disputes with the client. Example of an un-testable requirement  “The system shall produce the ABC report in a timely manner”  “The system shall be written in the approved language”
  • 43. Requirement characteristic: Traceable Examine the statement “The system shall calculate retirement annuities and survivor benefits” Observations:  Two different requirements clubbed together  Cannot maintain distinctness while reporting  Can be decomposed as under  The system shall calculate  A. Retirement annuities  B. Survivor benefits
  • 44. Requirement attributes Unique identifier Organizational information--for example, what are the parents/children of the requirement, its category or type Method of validation Item(s) that satisfy the requirement Source of requirement (legal citation, business policy, etc.) Association with the test plan/tests(s) Requirement owners (subject matter expert, analyst) Requirement status
  • 45. Requirement attributes contd. Requirement change history WBS code Risk Priority Cost (estimate and actual) Degree of difficulty Metrics Justification for the requirement Cross references to other requirements or documents Comments
  • 46. Case Study I: Requirements review Review the software requirement specification (SRS) document for marketing division of ABC pharmaceuticals and provide review comments in the enclosed template. Categorize each review comment by appropriate severity and category. At the end, provide statistics of review comments in terms of severity and category.
  • 48. Design reviews Reviews for software design focus on data design, architectural design and procedural design. In general, there are two types of design reviews  Preliminary design review  Design walkthrough
  • 49. Preliminary design review and design walkthrough… Preliminary design review  Assesses the translation of requirements to the design of data and architecture Design walkthrough  Concentrates on the procedural correctness of algorithms as they are implemented within program modules
  • 50. Design review verifications… Do designs satisfy all specified requirements for the product? Have all relevant standards, guidelines applied or met? Are product design and processing capabilities compatible? Are safety requirements met?
  • 51. Design review verifications… Do designs meet functional and operational requirements, for example performance and reliability requirements? Is the design satisfactory for all the anticipated environmental and load conditions? Are components or service elements standardized and do they provide reliability, availability and maintainability?
  • 52. Design review verifications… Are plans for implementing design technically feasible (in terms of purchasing, production, installation, inspection and testing) Are the assumptions made during the design process valid?
  • 53. Case Study II: Design review Review the Design specification document requirements provided in SRS for marketing division of ABC pharmaceuticals and provide review comments in the enclosed template. Categorize each review comment by appropriate severity and category. At the end, provide statistics of review comments in terms of severity and category
  • 55. Introduction: Code review Code review is a phase in the computer program development process. It is an activity in which authors of code, peer reviewers, and perhaps quality assurance reviewers get together to review code. The code is read line by line for  real or potential flaws,  consistency with the overall program design,  comment quality, and  adherence to coding standards
  • 56. Advantages: Code review Finding and correcting errors at this stage is relatively inexpensive Code reviews tend to reduce the more expensive process of handling, locating, and fixing bugs during later stages of development or after code delivery to users
  • 57. Code review smoke test The code review smoke test includes  Does the code build correctly?  Does the code execute as expected?  Has the developer tested the code for positive workflows?  As a reviewer, do you understand the code?
  • 58. Comments and coding conventions Does the code respect project specific coding conventions? Does the source file start with an appropriate header and copyright information? Are variable declarations properly commented? Are units of numeric data properly commented? Are units of numeric data clearly stated? Are all functions, methods and classes documented? Are complex algorithms and code optimizations adequately commented? Does code that has been commented out have an explanation? Are comments used to identify missing functionality or unresolved issues in the code?
  • 59. Error handling Are assertions used everywhere data is expected to have a valid value or range? Are errors properly handled each time a function returns? Are resources and memory released in all error paths? Are all thrown exceptions handled properly? Is the function caller notified when an error is detected? Has error handling code been tested?
  • 60. Resource Leaks Is allocated memory (non-garbage collected) freed? Are all objects (Database connections, Sockets, Files, etc.) freed even when an error occurs? Is the same object released more than once? Does the code accurately keep track of reference counting?
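In C++ (the language of Case Study III), the error-path and double-release questions above are commonly answered with RAII, so a reviewer can check that every resource is owned by an object. A minimal sketch, assuming a C-style FILE resource; the function and file name are hypothetical.

    #include <cstdio>
    #include <memory>
    #include <stdexcept>

    // Deleter for a C-style FILE*; unique_ptr skips it for null pointers.
    struct FileCloser { void operator()(std::FILE* f) const { std::fclose(f); } };

    // RAII: the file is closed exactly once on every path out of the
    // function, including exceptions, so no leak and no double release.
    void processFile(const char* path) {
        std::unique_ptr<std::FILE, FileCloser> f(std::fopen(path, "r"));
        if (!f) throw std::runtime_error("cannot open file");
        // ... read and process; early returns and throws are all safe ...
    }

    int main() {
        try { processFile("data.txt"); }  // hypothetical file name
        catch (const std::exception& e) { std::fprintf(stderr, "%s\n", e.what()); }
    }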
  • 61. Thread safeness Are all global variables thread-safe? Are objects accessed by multiple threads thread-safe? Are locks released in the same order they are obtained? Is there any possible deadlock or lock contention?
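For the lock-ordering question above, one thing a reviewer can look for is std::scoped_lock (C++17), which acquires several mutexes with a deadlock-avoidance algorithm so that no fixed order has to be enforced by hand. A minimal sketch with two hypothetical shared counters:

    #include <iostream>
    #include <mutex>
    #include <thread>

    std::mutex a, b;
    int shared1 = 0, shared2 = 0;

    // scoped_lock takes both mutexes deadlock-free: two threads locking
    // {a, b} in different textual order cannot deadlock each other.
    void transfer() {
        std::scoped_lock lk(a, b);
        ++shared1;
        --shared2;
    }

    int main() {
        std::thread t1(transfer), t2(transfer);
        t1.join();
        t2.join();
        std::cout << shared1 << ' ' << shared2 << '\n';  // prints: 2 -2
    }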
  • 62. Control Structures Are loop ending conditions accurate? Is the code free of unintended infinite loops?
  • 63. Performance Do recursive functions run within a reasonable amount of stack space? Are whole objects duplicated when only references are needed? Does the code have an impact on size, speed, or memory use? Are you using blocking system calls when performance is involved? Is the code doing busy waits instead of using synchronization mechanisms or timer events?
  • 64. Functions Are function parameters explicitly verified in the code? Are arrays explicitly checked for out-of-bound indexes? Are functions returning references to objects declared on the stack? Are variables initialized before they are used? Does the code re-write functionality that could be achieved by using an existing API?
  • 65. Bug fixes Does a fix made to a function change the behavior of caller functions? Does the bug fix correct all the occurrences of the bug?
  • 66. Case Study III Review the code written in C++ for marketing division of ABC pharmaceuticals and provide review comments in the enclosed template. Categorize each review comment by appropriate severity and category. At the end, provide statistics of review comments in terms of severity and category. The categories can include  Comments and coding conventions, Error handling, Resource leaks, Control structures, Bug fixes, Functions, Deviation from Req, Deviation from design.
  • 68. White Box Testing (Code based testing) A software testing technique whereby explicit knowledge of the internal workings of the item being tested is used to design the tests White box testing uses specific knowledge of programming code to examine outputs Examines the internal design of the program Requires detailed knowledge about the structure of the program Allows exhaustive testing of all the logical paths (i.e. each line of code for each condition) Also known as glass box, structural, clear box and open box testing
  • 69. Advantages of white box testing Helps to identify the following:  Adherence to coding standards  Adherence to coding guidelines  Indentation  Memory Leaks  Buffer overflows, stacks  Logical complexity of the program  Limitations of the program
  • 70. Statement coverage Statement Coverage  Each statement in the program is executed at least once  100% of the statements in the program should be executed at least once Weakness: it is necessary but not sufficient. When there is a decision, you have to ensure that each outcome takes the correct path, and statement coverage does not guarantee this.
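A small sketch of that weakness, using a hypothetical function (not from the slides): a single test executes 100% of the statements and passes, yet never exercises the branch direction where the fault would show up.

    #include <cassert>

    // Hypothetical: bulk discount intended for qty > 10,
    // but the decision was miscoded as qty >= 10.
    int price(int qty) {
        int p = qty * 5;
        if (qty >= 10)   // faulty decision: should be qty > 10
            p -= 10;
        return p;
    }

    int main() {
        // This one test executes every statement (100% statement
        // coverage) and passes, but the boundary fault at qty == 10
        // is never exercised; branch coverage would also demand a
        // test that takes the false path.
        assert(price(20) == 90);
        return 0;
    }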
  • 71. Branch/Decision Coverage Statement coverage does not address all outcomes of decisions. Branches like If..Else, Do..While are to be evaluated for both true and false Test each condition for a true and a false value That is, each branch direction must be traversed at least once Ex: For the condition (A>=5) or (B<2) THEN X=1, the test cases are: A=6 and B=4 …True (Here, A is true and B is false) A=2 and B=3 … False (Here, A is false and B is false) That is, check how many decisions there are. For each decision, write one test case for true and one test case for false
  • 72. Condition Coverage Each condition should be exercised at least once with both a true and a false value. The true and false outcome of each condition in a decision must be tested. Do not look for combinations. Example: For the condition (A>=5) or (B<2) THEN X=1, the test cases are: A=6 and B=3 …True (Here, A is true and B is false) A=2 and B=1 … True (Here, A is false and B is true)
  • 73. Condition/Decision coverage Condition/Decision Coverage  Condition coverage alone may not always result in decision coverage. In such cases, go in for decision + condition coverage. Multiple Condition Coverage:  Go for all combinations. For example, for the condition (A>=5) or (B<2) THEN X=1, the test cases are:  A=6, B=1 (both conditions true)  A=6, B=3  A=2, B=1  A=2, B=3
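The three criteria can be compared directly on the slide's predicate. A sketch that drives (A>=5) or (B<2) with each test set; the wrapper function is hypothetical.

    #include <iostream>

    // The predicate from the slides: IF (A >= 5) OR (B < 2) THEN X = 1.
    int decide(int a, int b) { return (a >= 5 || b < 2) ? 1 : 0; }

    int main() {
        // Decision coverage: the decision is true once and false once.
        std::cout << decide(6, 4) << '\n';  // true  (A true,  B false)
        std::cout << decide(2, 3) << '\n';  // false (A false, B false)

        // Condition coverage: each condition true and false once.
        // Note both decisions come out true, so these two tests alone do
        // not give decision coverage; hence condition/decision coverage.
        std::cout << decide(6, 3) << '\n';  // A true,  B false -> true
        std::cout << decide(2, 1) << '\n';  // A false, B true  -> true

        // Multiple condition coverage: all four combinations.
        std::cout << decide(6, 1) << decide(6, 3)
                  << decide(2, 1) << decide(2, 3) << '\n';  // prints: 1110
    }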
  • 74. Path Coverage Errors are sometimes revealed in a path including combination of branches. More general coverage requires executing all possible paths, known as path coverage criteria. Number of paths may be infinite if there are loops. 100% path coverage is impossible
  • 75. White box testing steps Examine the program logic Design test cases to satisfy logic coverage criteria Run the test cases Compare the actual results obtained with expected results in the test case Report errors in case of deviation from expected results Compare actual coverage to expected coverage
  • 76. Cyclomatic Complexity Cyclomatic complexity provides a quantitative measure of the logical complexity of the program Cyclomatic complexity provides the minimum number of independent paths in the given program Based on the cyclomatic complexity value obtained, the decision whether or not to accept the program for testing can be made
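For structured code, cyclomatic complexity can be read off as the number of binary decisions plus one (equivalently E − N + 2 over the control-flow graph). A hypothetical worked example:

    #include <iostream>

    // Decisions: the loop condition and the 'if' -> D = 2, so
    // V(G) = D + 1 = 3: at least three independent paths, and so
    // at least three test cases for full branch coverage.
    int sumPositive(const int* v, int n) {
        int sum = 0;
        for (int i = 0; i < n; ++i) {  // decision 1: loop condition
            if (v[i] > 0)              // decision 2: sign test
                sum += v[i];
        }
        return sum;
    }

    int main() {
        int data[] = {3, -1, 4};
        std::cout << sumPositive(data, 3) << '\n';  // prints: 7
    }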
  • 79. Software Testing Phases Unit Testing Module Testing Integration Testing System Testing User Acceptance Testing Field Testing
  • 80. Test Case Design Techniques
  • 83. Introduction to web applications Web Technology Web Architecture HTML/DHTML Web servers Cookies Types of testing applicable to web applications
  • 84. Applicable types of testing Unit testing Page flow testing Usability testing Functional testing Load testing Performance testing Data volume testing Security testing Regression testing External testing Connectivity testing Stress testing
  • 85. Unit Testing Unit testing involves testing of the individual modules and pages that make up the application In general, unit tests check the behavior of a given page i.e. does the application behave correctly and consistently given either good or bad input Some of the types of checking would include:  Invalid input (missing input, out-of-bound input, entering an integer when a float is expected and vice versa, control characters in strings, etc.)  Alternate input format (e.g., 0 instead of 0.0, 0.00000001 instead of 0, etc.)
  • 86. Unit Testing  Button click testing e.g., multiple clicking with and without pauses between clicks.  Immediate reload after button click prior to response having been received.  Multiple reloads in the same manner as above. Random input and random click testing.  This testing involves a user randomly pressing buttons (including multiple clicks on "hrefs") and randomly picking checkboxes and selecting them.
  • 87. Unit Testing There are two forms of output screen expected:  An error page indicating the type of error encountered.  A normal page showing either the results of the operation or the normal next page where more options may be selected. “In no event should a catastrophic error occur”
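A minimal sketch of the input checks listed in the unit testing slides above, assuming a hypothetical page-level validator for a quantity field that must hold an integer from 1 to 999:

    #include <cassert>
    #include <cctype>
    #include <string>

    // Hypothetical validator: rejects missing input, non-digits
    // (floats, control characters), and out-of-bound values.
    bool validQuantity(const std::string& s) {
        if (s.empty() || s.size() > 3) return false;  // missing / too long
        for (unsigned char c : s)
            if (!std::isdigit(c)) return false;       // '.', '\t', letters
        return std::stoi(s) >= 1;                     // 0 is out of bound
    }

    int main() {
        assert(validQuantity("42"));     // good input
        assert(!validQuantity(""));      // missing input
        assert(!validQuantity("1000"));  // out-of-bound input
        assert(!validQuantity("4.2"));   // float where an integer is expected
        assert(!validQuantity("4\t2"));  // control character in the string
        return 0;
    }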
  • 88. Page Flow Testing Page flow testing deals with ensuring that jumping to random pages does not confuse the application. Each page should typically check to ensure that it can only be viewed via specific previous pages, and if the referring page was not one of that set, then an error page should be displayed. A page flow diagram is a very useful aid for the tester to use when checking for correct page flow within the application.
  • 89. Impact of page flow on security Some aspects of page flow testing cross into security. Some simple checks to consider are  Forcing the application to move in an unnatural path.  The application must resist, and display appropriate error message
  • 90. Page flow testing : Details  Log into the system and then attempt to jump to any page in any order once a session has been established.  Use bookmarks and set up temporary web pages to redirect into the middle of an application using faked session information
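One way such a referring-page check is commonly implemented is a table of allowed predecessor pages per page, compared against the referrer of each request; a sketch with hypothetical page names:

    #include <iostream>
    #include <map>
    #include <set>
    #include <string>

    // Hypothetical page-flow table: each page lists the pages that may
    // legally precede it; anything else is redirected to an error page.
    const std::map<std::string, std::set<std::string>> kAllowedFrom = {
        {"/cart",     {"/catalog", "/item"}},
        {"/checkout", {"/cart"}},
        {"/confirm",  {"/checkout"}},
    };

    std::string serve(const std::string& page, const std::string& referrer) {
        auto it = kAllowedFrom.find(page);
        if (it != kAllowedFrom.end() && it->second.count(referrer) == 0)
            return "/error";  // jump into the middle of the flow: rejected
        return page;
    }

    int main() {
        std::cout << serve("/checkout", "/cart") << '\n';    // /checkout
        std::cout << serve("/confirm", "/catalog") << '\n';  // /error
    }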
  • 91. Usability testing Usability testing ensures that all pages present a cohesive look to the user, including spelling, graphics, page size, response time, etc Examples of usability testing include:  Spelling checks  Graphical user interface checks (colors, dithering, aliasing, size, etc.,)  Adherence to web GUI Standards  Meaningful error messages  Accuracy of data displayed
  • 92. Usability testing contd.  Page Navigation  Context sensitivity  Editorial continuity  Accessibility  Accuracy of data in the database as a result of user input  Accuracy of data in the database as a result of external factors (e.g. imported data)  Meaningful help pages including context sensitive help
  • 93. Functional Testing Functional testing ensures  Conformance to functional requirements of the application  Scenarios/Test cases are designed to find out conformance to the requirements  Whole business logic gets tested as part of the functional testing
  • 94. Load Testing Load testing the application involves generation of varying loads (in terms of concurrent users) against  web server,  the databases supporting the web server and  the middle ware/application server logic connecting those pages to the databases Load testing includes verification of data integrity on the web pages, within the back end database and also the load ramping or surges in activity against the application
  • 95. Load Testing Load testing answers questions such as “Does the site scale?” and “Is the site’s response time deterministic?” Examples of load testing would include:  Sustained low load test (50 users for around 48 hours).  Sustained high load test (300+ users for 12 hours).  Surge test (e.g. run 50 users, then surge to 500 users and then return to 50; no memory leaks, lost users, orphaned processes, etc. should be seen).  The system should continue running with multiple surges at various times during the day.  This test should run for 48 hours.
  • 96. Load Testing contd. Load testing is also used to discover at what load the application would fail and where the saturation points are.
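A scaled-down sketch of such a load profile: a pool of worker threads stands in for concurrent users, with a surge in the middle, while requests and failures are counted. hitApplication() is a stub standing in for a real HTTP request; a real run would also track response times and watch the server for leaks and orphaned processes.

    #include <atomic>
    #include <chrono>
    #include <iostream>
    #include <thread>
    #include <vector>

    std::atomic<long> requests{0}, failures{0};

    // Stub standing in for one HTTP request against the application.
    bool hitApplication() { return true; }

    // One simulated user issuing requests for a fixed duration.
    void user(int seconds) {
        auto end = std::chrono::steady_clock::now() + std::chrono::seconds(seconds);
        while (std::chrono::steady_clock::now() < end) {
            ++requests;
            if (!hitApplication()) ++failures;
            std::this_thread::sleep_for(std::chrono::milliseconds(100));
        }
    }

    void runPhase(int users, int seconds) {
        std::vector<std::thread> pool;
        for (int i = 0; i < users; ++i) pool.emplace_back(user, seconds);
        for (auto& t : pool) t.join();
    }

    int main() {
        runPhase(5, 2);   // sustained low load (durations scaled down)
        runPhase(50, 2);  // surge
        runPhase(5, 2);   // return to low load
        std::cout << requests << " requests, " << failures << " failures\n";
    }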
  • 97. Performance Testing Performance Testing refers to the response time by the software to process and present the requests made by the end users Performance depends on  Speed of the network  Hardware configuration of application server, web server, database server and the client system (Processor, RAM etc)  Volume of data in the database
  • 98. Data Volume Testing Data volume testing involves testing the application under data load, where large quantities of data are passed through the system (e.g. a large number of items in dropdown/combo boxes, a large amount of data in text boxes). Performance of the application should be monitored during this testing, since a slow database could significantly affect response time; monitoring data must be collected throughout.
  • 99. Data Volume Testing This data can be used as a control set for contrasting monitoring data from a live system and providing predictive information indicating when major application stress points may be encountered. No errors should be seen on application pages or in error logs for pages that are data intensive.
  • 100. Security Testing Security testing involves verifying whether both the servers and the application are managing security correctly Security from the server perspective  Attempt to penetrate system security both internally and externally to ensure the system that houses the application is secure from both internal and external attacks.  Attempt to cause things like buffer overflow to result in root access being given accidentally (such code does exist, but explaining it is beyond the scope of this document)
  • 101. Security Testing contd.  Attempt to cause the application to crash by giving it false or random information  Ensure that the server OS is up to the correct patch levels from a security viewpoint  Ensure that the server is physically secure
  • 102. Security Testing contd. Application level security testing involves testing some or all the following  Unauthenticated access to the application  Unauthorized access to the application  Unencrypted data passing  Protection of the data  Log files should be checked to ensure they do not contain sensitive information
  • 103. Security Testing contd.  Faked sessions. Session information must be valid and secure. (e.g. a URL containing a session identifier cannot be copied from one system to another and then the application be continued from the different system without being detected)  Multiple login testing by a single user from several clients
  • 104. Security Testing contd.  Attempt to break into the application by running username/password checks using password-cracking program  Security audit, e.g. examine log files, etc., no sensitive information should be left in raw text/human readable form in any log file  Automatic logout after N minutes of inactivity with positive feedback to the user
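The automatic-logout check above reduces to a last-activity timestamp per session; a minimal sketch, with the N-minute idle limit as a parameter (the Session type is hypothetical):

    #include <chrono>
    #include <iostream>

    using Clock = std::chrono::steady_clock;

    // Each authenticated request refreshes lastActivity; a request
    // arriving after the idle limit is rejected, and the application
    // should redirect the user to the login page with clear feedback.
    struct Session {
        Clock::time_point lastActivity = Clock::now();

        bool touch(std::chrono::minutes idleLimit) {
            auto now = Clock::now();
            if (now - lastActivity > idleLimit) return false;  // logged out
            lastActivity = now;
            return true;
        }
    };

    int main() {
        Session s;
        std::cout << (s.touch(std::chrono::minutes(10)) ? "ok" : "re-login") << '\n';
    }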
  • 105. Regression Testing Regression testing ensures that during the lifetime of the application, any fixes do not break other parts of the application This type of testing typically involves running all the tests, or a relevant subset of those tests when defect fixes are made or new functionalities added The regression tests must also be kept up to date with planned changes in the application. As the application evolves, so must the tests
  • 106. External Testing External testing deals with checking the effect of external factors on the application. Example of external factors would be the web server, the database server, the browser, network connectivity issues, etc. Examples of external testing are:  Database unavailability test (e.g., is login or further access to the application permitted should the database go into a scheduled maintenance window)  Database error detection and recovery test (e.g., simulate loss of database connectivity, the application should detect this, and report an error accordingly). The application should be able to recover without human intervention when the database returns
  • 107. External Testing  Database authentication test (check access privileges to the database).  Connection pooling test (ensure that database connections are used sparingly, and will not run out under load).  Web page authentication test.  Browser compatibility tests – for example, does the application behave the same way on multiple browsers, does the JavaScript work the same way, etc.,
  • 108. Connectivity Testing Connectivity testing involves determining if the servers and clients behave appropriately under varying circumstances This testing is difficult to accomplish from a server perspective since it is expected that the servers will be operating with standby power supplies as well as being in a highly available configuration Thus the server tests need not be run using a power-off scenario; simply removing the network connection to the PC may be sufficient
  • 109. Connectivity Testing contd. Two aspects of connectivity testing  Voluntary, where a user actively interacts with the system in an unexpected way  Involuntary, where the system acts in an unpredictable manner
  • 110. Connectivity Testing: Involuntary Test:  Forcing the browser to prematurely terminate during a page load using a task manager to kill the browser, or hitting the ESC key and reloading or revisiting the same page via a bookmark. Expectation:  The testing should cover both a small delay (< 10secs) in reinstating the browser as well as a long delay (> 10mins). In the latter case the user should not be able to connect back to the application without being redirected to the login page.
  • 111. Connectivity Testing: Involuntary Test:  Simulation of hub failure between the PC and the Web Server.  This can be simulated by removing the network cable from the PC, attempting to visit a page, aborting the visit, and then reconnecting the cable.  The test should use two time delays; the first should be under 15 seconds, and the second delay around 15 minutes before reconnecting.  After reconnecting, attempt to reload the previous page Expectation: The user should be able to continue with the session unless a specified timeout has occurred in which case the user should be redirected to a login page.
  • 112. Connectivity Testing: Involuntary Test: Web server on/off test.  Shutdown the web server, then restart the server Expectation:  The user should be able to connect back to the application without being redirected to the login page. This will prove the statelessness of individual pages Note:  The shutdown is only for the web server. Do not attempt this with an application server, as that is a separate test
  • 113. Connectivity Testing: Involuntary Test: Database server on/off test.  Shutdown the database server and restart it Expectation: The user should be able to connect back to the application without being redirected to the login page It may be that a single transaction needs to be redone, and the application should detect this and react accordingly
  • 114. Connectivity Testing: Involuntary Application server on/off test  Shutdown the application server and restart it  There are two possible outcomes for this depending on how session management is implemented  The first outcome is that the application redirects to an error page indicating loss of connectivity, and the user is requested to login and retry  The second outcome is the application continues normally since no session information was lost because it was held in a persistent state that transcends application server restarts
  • 115. Connectivity Testing: Voluntary Examples of voluntary connectivity testing include:  Quit from session without the user saving state.  Quit from session with the user saving state.  Server – forced quit from session due to inactivity.  Server – forced quit from session due to server problem.  Client – forced quit from session due to visiting another site in the middle of a session for a brief period of time.  Client – forced quit from session due to visiting another site/application for an extended period of time.  Client – forced quit due to browser crashing
  • 116. Extended Session Testing Remaining in a session for an extended period of time and clicking items to navigate the screen. The session must not be terminated by the server except in the case of a deliberate logout initiated by the user Remaining on a single page for an extended length of time. The session should be automatically terminated and the next click by the user should take the user to a page indicating why the session was terminated, and the option to log back into the system should be present. The page may have a timed redirect associated with it, and if so, a page indicating a timed out session should be displayed.
  • 117. Extended Session Testing The following must be tested  The user's session should have been saved and may optionally be restored on re login  The user's state must reflect the last complete action the user performed  Leaving the application pages to visit another site or application and then returning to original application via a bookmark or the back button should result in a restoration of state, and the application should continue as if the person had not left
  • 118. Power Hit/Reboot/Other Cycle Testing Power Hit/Cycle testing involves determining if the servers and clients act appropriately during the recovery process  Client power off/on test  Client hub power off/on test  Client network connection removal/reinsertion test  Server power off/on test  Server Hub power off/on test  Server network connection removal/reinsertion test
  • 119. Standards Conformance Testing Conformance to  Web application standards  Web user interface standards and guidelines  Web usability standards  Web security standards  Domain specific standards (e.g. HL7 and CCOW for healthcare, SOX for banking software, etc.)
  • 121. Bug Life Cycle States and transitions: Submitted → In work (the developer is solving the bug) → Solved (the bug is solved only by the developer) → Validated (the bug is tested by the tester and closed here), with Re-work sending a failed fix back to In work. A submitted bug found Invalid is Terminated (the bug is reviewed and closed by management); a bug that cannot be fixed in the current release is Deferred.
  • 122. Bug life cycle [Notes] The status “Submitted” or “Posted” is assigned to the defect when the tester raises the defect. In case the submitted bug is found to be invalid, the bug is moved to “Terminated” state or “Rejected” state by the development team. The status of the bug is moved to “In – work” by the developer once the developer starts working on fixing the defect. Once the developer fixes the bug, the developer moves the status of the defect to “Solved” state and the fix shall be made available to the tester in the next release.
  • 123. Bug life cycle [Notes] contd. The tester tests the fix for the bug and if found to be working fine, moves the status of the defect to “Validated” state, otherwise puts it back to the developer and the status of the bug is moved back to “In work”. In case the development team is not in a position to fix the defect in the current release, the development team moves the status of the defect to “Deferred” state meaning it shall be taken up for fixing in the next release.
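The transitions in the notes above can be captured as a small state machine, which is essentially what a defect tracking tool enforces. A sketch with the state names from the slides (the transition table is an interpretation of the notes, not any tool's actual schema):

    #include <iostream>
    #include <map>
    #include <set>

    enum class State { Submitted, InWork, Solved, Validated, Terminated, Deferred };

    // Legal transitions per the life cycle notes above.
    const std::map<State, std::set<State>> kNext = {
        {State::Submitted, {State::InWork, State::Terminated}},  // start or reject
        {State::InWork,    {State::Solved, State::Deferred}},    // fix or defer
        {State::Solved,    {State::Validated, State::InWork}},   // close or re-work
    };

    bool canMove(State from, State to) {
        auto it = kNext.find(from);
        return it != kNext.end() && it->second.count(to) != 0;
    }

    int main() {
        std::cout << canMove(State::Submitted, State::InWork) << '\n';  // 1
        std::cout << canMove(State::Submitted, State::Solved) << '\n';  // 0
    }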
  • 125. Reporting defects: Attributes Product name/Application Name Version Module Summary Steps to reproduce Impact Database information Severity Priority Browser (IE, NN, Mozilla) Screen shots (if required and available) Reproducible (Yes, No, Sporadic) Type of bug (Performance, Functionality, User interface etc) Phase of testing (Unit, Module, Integration, System testing)
  • 126. Details of the attributes Product name/Application:  Provide the name of the application being tested or select it from a list Version  Provide the version of the application being tested or select it from a list. Ex: version 1.0, 1.2 etc Module  Provide the module of the application in which the bug occurred or select it from a list
  • 127. Details of the attributes contd. Summary  Provide a summary of the defect such that this summary, when viewed, gives a sufficient picture of which team and category the defect belongs to.  Project Leads/Managers assign defects to different individuals based on the details of the summary. Steps to reproduce (Description)  Provide a step by step explanation of how you arrived at the defect. The development team must be able to reproduce the defect with these details.
  • 128. Details of the attributes contd. Impact  Provide the impact of the defect being posted, from the application and the end user’s perspective. Database information  Provide information on the database as to whether  it is a new database,  or a ported database,  and if ported, from which previous release
  • 129. Details of the attributes contd. Severity  Critical (The defect has severe impact on the end user’s workflow)  Serious (The defect has blocked workflow(s), but alternatives are available)  Minor (Does not block any user’s workflows. Trivial error) Priority  High (Needs immediate fixing)  Medium (Can be fixed within an agreed time period)  Low (Can be fixed at convenience)
  • 130. Details of the attributes contd. Phase of testing  Provide or select a phase of testing such as Unit testing, Module testing, Integration testing, System testing  This helps to analyze how many bugs were uncovered during a particular phase of testing and facilitates comparison of defects found across phases
  • 131. Details of the attributes contd. Reproducible  This attribute generally has 3 options i.e. Yes, No, & Sporadic  Selecting Yes indicates that the defect is reproducible by following the steps specified as part of the defect.  Selecting No indicates that the defect is not reproducible in a particular given sequence.  Selecting “Sporadic” indicates the defect is reproducible by following the steps specified but the defect does not consistently appear
  • 132. Details of the attributes contd. Type of bug  Provide or select the type of bug, i.e. whether the defect found falls into the category of Functionality, Performance etc.  General categories include Functionality, Performance, Usability, Load, Volume, Stress, Security, User interface  These statistics help to understand how many functional, performance, etc. defects appeared in the release and give direction to identify the bottlenecks
  • 133. Details of the attributes contd. Browser  Provide or select the browser on which the software was being used when the defect occurred. Ex: Internet Explorer, Netscape Navigator, Mozilla etc. Screenshots  Attach screenshots of error messages or system crashes while posting the defect. This helps the development team understand the defect better
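Taken together, the attributes above map naturally onto one record in a defect tracking tool. An illustrative sketch (field names are assumptions, not any particular tool's schema):

    #include <string>
    #include <vector>

    enum class Severity     { Critical, Serious, Minor };
    enum class Priority     { High, Medium, Low };
    enum class Reproducible { Yes, No, Sporadic };

    // Illustrative defect record combining the attributes discussed above.
    struct Defect {
        std::string product;            // application name
        std::string version;            // e.g. "1.2"
        std::string module;
        std::string summary;            // routes the defect to the right team
        std::string stepsToReproduce;   // must let developers reproduce it
        std::string impact;             // effect on the end user
        Severity severity;
        Priority priority;
        std::string browser;            // e.g. "Internet Explorer"
        Reproducible reproducible;
        std::string typeOfBug;          // Functionality, Performance, ...
        std::string phaseOfTesting;     // Unit, Module, Integration, System
        std::vector<std::string> screenshots;  // attachment paths
    };

    int main() { Defect d{}; (void)d; return 0; }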
  • 134. Case Study Study the following defects observed while testing a software product and re-write them in proper format and assign appropriate severity and priority to the defects.