INDIAN RAILWAYS INSTITUTE OF SIGNAL
ENGINEERING AND TELECOMMUNICATIONS
SECUNDERABAD - 500 017
Contents
1. Basics of Software Engineering
2. Software Quality Assurance Plan
3. Software Testing
4. Software Verification and Validation
No. of Pages : 40
No. of Sheets : 21
April - 2002
Prepared & Checked By
CHAPTER 1
BASICS OF SOFTWARE ENGINEERING
The influence of Software on the present day life is immense and is ever increasing.
Dramatic expansion in the availability of Computers at low cost has enabled the Society to
employ more and more Software based systems. In earlier days, it was thought that Software
could be developed only by a very limited group of highly Intelligent people. Mostly,
Programmes were not properly documented, and any wrong operation could cause
havoc. Gradually, it was realised that if a Software is built as per some set Rules and
Standards and the various Stages are properly Documented, the product would have
better Reliability and Maintainability and less Probability of Failure. Thus, the idea of
treating Software Development as an Engineering Subject came. It has now further been
established that like any other Engineering Topic, Software Development also can have
Metrics to find the Quality of the Product.
According to their applications, Software can be classified into various groups:
System Software         --- A collection of Programmes which service other
                            Software. It interacts with the Computer Hardware
                            and performs System Management.
Real Time Software      --- It Monitors, Analyzes and Controls External Events
                            as they occur. The Response Time for the Software
                            may be of the order of milliseconds or less.
Business Software       --- This type may have Interactive facilities for the
                            Users in a Business Environment.
Engineering and         --- It must handle Complex Numerical Data.
Scientific Software
Embedded Software       --- This resides inside ROM in Control Equipment
                            having an Embedded System and performs very
                            limited, dedicated functions.
PC Software             --- From Word Processing to Multimedia based
                            Graphics, a vast area.
Web based Software      --- Software in Web Pages accessed through the
                            Internet.
Artificial Intelligence --- It uses non-numerical Algorithms to solve Complex
Software                    Problems.
When a Software is developed, the Quality of the end Product is always important. The
Software Project Manager must continuously evaluate the Quality throughout the
Development Process. The Quality can be divided into two categories: Quality of Design,
which refers to the Characteristics that the Designers specify and Quality of
Conformance, which defines the degree to which the Design Specifications are
followed during the Development.
There are many Quality Criteria to be taken into account. They are:
• Usability ----- It is the Ease of use i.e. the Software must be User-friendly.
• Integrity ----- It is the capability of getting protected from any
  Unauthorized use. In these days of Hackers, it has become
  very important.
• Efficiency ----- The proper use of the Resources.
• Correctness ----- The degree to which the Software fulfils its
  Specifications.
• Reliability ----- The ability of not failing.
• Maintainability ----- The efforts needed to locate and rectify a Fault. It is often
  measured by the Mean Time To Change ( MTTC ).
• Flexibility ----- The ease of making desirable changes.
• Testability ----- The ease of Testing the Program.
• Portability ----- The effort needed to transfer a Program to a new
  Hardware and / or Software Environment.
• Reusability ----- The ease of reusing Software in a new context.
• Interoperability ----- The effort needed to couple the System to another one.
Relationships between the Criteria of a Software
( √ = Direct relationship, X = Inverse relationship, -- = None / neutral )

                   C   R   E   I   U   M   T   F   P   Re  In
CORRECTNESS        C
RELIABILITY        √   R
EFFICIENCY         --  --  E
INTEGRITY          --  --  X   I
USABILITY          √   √   X   √   U
MAINTAINABILITY    √   √   X   --  √   M
TESTABILITY        √   √   X   --  √   √   T
FLEXIBILITY        √   √   X   --  √   √   √   F
PORTABILITY        --  --  X   --  --  √   √   √   P
REUSABILITY        --  X   X   X   --  √   √   √   √   Re
INTEROPERABILITY   --  --  X   X   --  --  --  --  √   √   In
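The Maintainability criterion above is often tracked numerically from fault records. A minimal sketch of such a metric (the fault-log format, field meanings and sample values here are illustrative assumptions, not from the source):

```python
# Illustrative sketch: a simple maintainability metric computed from a
# hypothetical fault log. Each entry records when a fault was detected
# and when its fix was completed (both in elapsed hours).

def mean_time_to_repair(faults):
    """faults: list of (detected_hour, fixed_hour) pairs."""
    if not faults:
        return 0.0
    repair_times = [fixed - detected for detected, fixed in faults]
    return sum(repair_times) / len(repair_times)

fault_log = [(0, 4), (10, 12), (20, 26)]   # hours
print(mean_time_to_repair(fault_log))      # average repair time in hours
```

A falling value over successive releases would indicate improving maintainability; a rising one suggests faults are becoming harder to locate and rectify.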
Steps for Improving the Quality of a Software
However much care may have been taken to ensure good Quality Software
Development, it is always better to stick to the following Guidelines to improve the Quality of
the Software:
• Make it clear that the Management is committed to Quality.
• Form Quality Improvement Teams with each Department being represented.
• Find where lie the Current and Potential Problems.
• Evaluate the Cost of Quality.
• Raise the Quality Awareness among all Staff.
• Act to solve the Identified Problems.
• Establish a Committee for Zero-defects Program.
• Train the Supervisors to actively carry out their roles in Quality.
• Encourage individuals to establish Improvement Goals.
• Encourage Communication with the Management about obstacles.
• Recognize and appreciate Participants in Quality Improvement.
• Establish Quality Councils to aid communication.
• Keep score of achievements.
• Do it all over again to show that the process never ends.
Factors influencing the Development of a Software System
When a Software is Developed, Project Planners always try to produce a Reliable and good
Quality Software. But whether their efforts would succeed or not depends on many factors,
which influence the End- product. They are:
• Size of the Software. If the size is too big, the probability of Errors also increases.
• Percentage of new design and / or Code. If the new Design is proportionately more,
then much Effort would be needed to develop the Software.
• Complexity of the Software system. More Complexity would need more Effort, Time and Cost.
• Difficulty in Design and Coding.
• Quality of the Software. Better Quality needs more stringent Metrics.
• Programming Languages to be used. If some uncommon Language is used,
Programme development may not be easy.
• Security Classification Level of the Software. More Safety Critical Software needs
tougher Verification and Validation. This obviously needs more Time, Effort and Cost.
• The Target Machine for which the Software is being developed.
• Utilization of the target Hardware.
• Volatility of the Requirement. If the Requirement changes very often, the Development
Process gets affected.
• Personnel associated with the Software development. The Personnel must have
Knowledge, Enthusiasm and Involvement. They should also be properly Trained.
• Development Environment. The Developers should have a Transparent and
  supportive Development Environment.
• Computing resources available.
Software Development Life-Cycle Process
Software Development Process can be described as a collection of many Stages. Some of
these may be required during an intermediate Process and others may be spread over the
whole Development Process. The Total Development, Maintenance and Retirement
Processes are called the Software Life-Cycle Process.
There are in total 65 Processes in the whole Software Life-Cycle. The Processes are spread
over Pre-Development, Development and Post-Development phases. These phases
cover 33 Processes. There are, in addition, 17 Integral Processes, which are spread over
the whole Development Life-Cycle.
The following Chart can better describe the Life-Cycle.
SOFTWARE LIFE-CYCLE PROCESSES ( 65 )

  PROJECT MANAGEMENT PROCESSES ( 13 )
      Project Initiation ( 4 )
      Project Monitoring & Control ( 5 )
      Software Quality Management ( 4 )

  IDENTIFY LIFE-CYCLE MODEL

  PRE-DEVELOPMENT PROCESSES ( 8 )
      Concept Exploration ( 5 )
      System Allocation ( 3 )

  DEVELOPMENT PROCESSES ( 14 )
      Requirements ( 3 )
      Design
      Implementation ( 6 )

  POST-DEVELOPMENT PROCESSES ( 11 )
      Installation ( 4 )
      Operation & Support ( 3 )
      Maintenance ( 1 )
      Retirement ( 3 )

  INTEGRAL PROCESSES ( 17 )
      Verification and Validation ( 6 )
      Configuration Management ( 4 )
      Documentation Development ( 3 )
      Training ( 4 )
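The per-group counts in the chart can be cross-checked mechanically. A small sketch that encodes them as data and totals each phase (the dictionary layout is ours; the Development phase is left out because the chart does not give a legible subtotal for its Design group):

```python
# The life-cycle process counts from the chart, encoded as data so the
# per-phase subtotals can be verified mechanically.
life_cycle = {
    "Project Management": {"Project Initiation": 4,
                           "Monitoring & Control": 5,
                           "Quality Management": 4},
    "Pre-Development": {"Concept Exploration": 5,
                        "System Allocation": 3},
    "Post-Development": {"Installation": 4,
                         "Operation & Support": 3,
                         "Maintenance": 1,
                         "Retirement": 3},
    "Integral": {"Verification & Validation": 6,
                 "Configuration Management": 4,
                 "Documentation Development": 3,
                 "Training": 4},
}

subtotals = {phase: sum(groups.values()) for phase, groups in life_cycle.items()}
print(subtotals)
# Project Management = 13, Pre-Development = 8,
# Post-Development = 11, Integral = 17 -- matching the chart.
```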
Effective Project Management needs that the Software Development be managed through three groups of Processes:
A) Project Initiation Processes
Some of the Processes are to be followed during the Initiation of the Project Development.
They are to be properly Planned and performed before the actual Development starts.
Map Activities to the Software Life-cycle Model.
Allocate Project Resources.
Establish Project Environment.
Plan Project Management.
B) Project Monitoring and Control Processes
Throughout the Development Life-Cycle, the Software is to be continuously Monitored and
Controlled. The involved Processes are :
Perform Contingency Planning.
Manage the Projects.
Implement Problem Reporting System.
C) Software Quality Management Processes
The Quality Management must be done throughout the Development Life-Cycle for
producing a good Software. The Processes are:
Plan Software Quality Management.
Define Software Metrics.
Manage Software Quality.
Identify Quality Improvement needs.
Before actual Development of a Software, some Pre-Development Processes are to be
performed. These are again divided in two broad groups. The first group deals with the
Concept of the Software to be developed while the second group defines the System
Functions and Architecture.
A) Concept Exploration Processes
Identify Ideas and Needs.
Formulate Potential Approaches.
Conduct Feasibility Studies.
Plan System Transition, if applicable.
Refine and finalize the Idea or Need.
B) System Allocation Processes
Develop System Architecture.
Decompose System Requirements.
Actual Development of the Software can be divided into several Phases --- Requirement,
Design and Implementation.
A) Requirement Processes
This Phase describes the Requirements to be fulfilled by the Software to be developed. It
also defines the Interfaces.
Define and Develop Software Requirements
Define Interface Requirements.
Prioritize and Integrate Software Requirements.
B) Design Processes
This Phase defines the Design Flow and Interfaces. It also describes the Algorithm in detail.
Perform Architectural Design.
Design the Database, if applicable.
Select or Develop Algorithms.
Perform detailed Design.
C) Implementation Processes
This Phase defines the Coding Processes and how the Integration between various Modules
as well as Hardware and Software will be done.
Create Test Data.
Create Source Code.
Create Object Code.
Create Operating Documents.
After the Development of the Software come the Installation, Maintenance and Retirement
Processes. Software, though it does not age as such, needs Updates from time to
time. If any Alterations are done after Installation, the Software may have to go
through the whole Development Process again.
A) Installation Processes
Accept Software in Operational Environment.
B ) Operation and Support Processes
Operate the System.
Provide Technical Assistance and Consultancy.
Maintain Support Request Log.
C) Maintenance Processes
Reapply Software Life-cycle.
D) Retirement Processes
Notify the Users.
Conduct Parallel Operations, if applicable.
Retire the System.
There are some Processes that need to be pursued throughout the Software Development
Life-Cycle. The Integrity of the Software and the correctness of its output depend on these Processes.
A) Verification and Validation Processes
This Phase is very important to be assured that the Software has been developed in a
Proper Way and will perform Proper Jobs. This also covers Testing of the Software. The
Processes are:
Plan Verification and Validation.
Execute V & V Tasks.
Collect and analyze Metric Data.
Plan Software Testing.
Develop Software Testing Requirements.
Execute the Software Testing.
B) Software Configuration Management Processes
Plan Configuration Management.
Perform Configuration Identification.
Perform Configuration Control.
Perform Status Accounting.
C) Documentation Development Processes
Produce and distribute Documentation.
D) Training Processes
Plan the Training Program.
Develop Training Materials.
Validate the Training Program.
Implement the Training Program.
SOFTWARE DESIGN PRINCIPLES.
• Design process should not suffer from ‘ Tunnel Vision ’. A good designer should consider
  alternative approaches.
• Design should be traceable to the Analysis Model.
• Design should not reinvent the wheel. Reusable design components should always be
chosen as an alternative to reinvention.
• Design should minimize the Intellectual Distance between the Software and the
problem, as it exists in the real world.
• Design should exhibit uniformity and integration.
• Design should be structured to accommodate change.
• Design should be structured to degrade gently, even when unusual Data-events or
operating conditions are encountered.
• Design should be Assessed for Quality during its formation and not after the fact.
• Design should be Reviewed to minimize Conceptual Errors.
• Design is not Coding and vice versa.
WATERFALL MODEL FOR SOFTWARE DEVELOPMENT
This Model suggests a Systematic and Sequential approach to Software Development. It
assumes that the Developer has the capability and good Knowledge of the System.
The Model has the following activities: ---
• System Engineering and Modelling
Establishment of Requirements for all system elements and allocation of some
subset of these requirements to the Software Module is the starting point.
• Requirement Analysis
Any Software is built to process Data and Events, which constitute the Information
Domain of the System. The Software Engineer must understand the Information Domain and
the Required Function, Behaviour, Performance and Interfacing. Documentation and
Review of Requirements are to be done in this stage.
• Design
This stage focuses on Data Structure, Software Architecture, Interface
Representations and Algorithmic details. The Design is also documented and forms a part of
the Software Configuration.
• Code Generation
This stage translates the Design into a Machine Readable Form. This can be
performed better if the Design has been worked out in detail.
• Testing
It focuses on the Logical Internals of the Software. All Statements must be
tested. Tests must uncover Errors and ensure that actual Results agree with the Required
Results.
• Maintenance
It comprises regular checks for Errors and changes in the Software to
accommodate changes in its Environment.
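The strict stage-by-stage hand-off of the Waterfall model can be sketched as a simple sequential pipeline (the stage names come from the activities above; the function itself is a toy illustration, not a real process engine):

```python
# Sketch of the Waterfall model's sequential hand-off: each stage runs
# only after the previous one completes, in a fixed order.
STAGES = ["System Engineering", "Requirement Analysis", "Design",
          "Code Generation", "Testing", "Maintenance"]

def run_waterfall(stages):
    """Record the strict ordering; each stage consumes its predecessor's output."""
    completed = []
    for stage in stages:
        # In a real project each stage produces reviewed, documented
        # artifacts before the next one may begin.
        completed.append(stage)
    return completed

print(" -> ".join(run_waterfall(STAGES)))
```

The point of the sketch is what it forbids: there is no loop, so a requirement discovered during Testing cannot be accommodated without restarting from Requirement Analysis, which is the model's main weakness.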
SPIRAL MODEL FOR SOFTWARE DEVELOPMENT
The Spiral Model is an evolutionary Software Process Model that couples the
Iterative Nature of the Prototype Model with the Controlled and Systematic approach of
the Waterfall Model to provide the potential for Rapid Development. In this case, the
Software is developed in a Series of Incremental Releases.
This Model is divided into a number of specified Activities called Task Regions:
• Customer Communication --- Customer will have an effective communication with
the Developer at this stage to clarify regarding
System Requirements and Specifications.
• Planning --- In this stage the Resources, Time Schedule and
any other Project related information will be
decided.
• Risk Analysis --- Tasks related to Technical and Management Risks
are decided at this stage.
• Engineering --- At this stage, Tasks required to build one or more
Representations of the Application are defined.
• Construction and --- This stage covers the Tasks required to Construct,
Release Test, Install and Provide User Support including
Documentation and Training.
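Unlike the Waterfall, every incremental release of the Spiral traverses the same Task Regions again. A minimal sketch (the region names are taken from the list above; the loop structure and release counts are illustrative):

```python
# Sketch of the Spiral model: every incremental release traverses the
# same Task Regions, so planning and risk analysis recur on each cycle.
TASK_REGIONS = ["Customer Communication", "Planning", "Risk Analysis",
                "Engineering", "Construction and Release"]

def spiral(n_releases):
    """Return the (release, task_region) sequence for n incremental releases."""
    history = []
    for release in range(1, n_releases + 1):
        for region in TASK_REGIONS:
            history.append((release, region))
    return history

log = spiral(2)
# Risk Analysis is performed once per release, not once per project.
print(sum(1 for _, region in log if region == "Risk Analysis"))  # 2
```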
SPIRAL MODEL FOR SOFTWARE DEVELOPMENT
PLANNING RISK ANALYSIS
( Finding Objective ) (Analyzing Alternatives)
Gathering and Project
• Functionality should be separate from implementation.
• A model of the desired behavior of the system is to be developed.
• Context on which the system operates is to be defined. This covers Interaction with
other system components.
• A Cognitive Model, rather than a Design or Implementation Model is to be created.
• Specification must be tolerant to incompleteness and augmentation.
• Representation format and context should be relevant to the problem.
• Information contained in the Specification should be nested.
• Diagrams and other notational forms should be restricted in number and consistent in use.
• Representations should be revisable.
SOFTWARE SPECIFICATION REVIEW
The following questions are asked during Specification Review:
• Do the stated Goals and Objectives for the Software remain consistent with the
System Goals and Objectives?
• Is the Information flow and Structure adequately defined?
• Have the important Interfaces for all the System Elements been described?
• Are the Diagrams clear? Are they self-explanatory without text?
• Do the major Functions remain within the Scope of Specification?
• Is the behaviour of the Software consistent with the Inputs & Outputs?
• Are the Design Constraints realistic?
• Have the Technological Risks of Development been considered?
• Have Alternative Software Requirements been considered?
• Have the Validation Criteria been stated in details?
• Do any Inconsistencies, Omissions or Redundancies exist?
• Is the Customer Contact complete?
SOFTWARE REQUIREMENT ANALYSIS
It is a process of Discovery, Refinement, Modelling and Specification. Both the
Customer and the Designer take an active role in Requirement Analysis. Software
Requirement Analysis can be divided into five areas: Problem Recognition,
Evaluation and Synthesis, Modelling, Specification and Review.
A set of Operational Principles must be followed :--
• Information Domain of the problem must be represented and understood.
• Functions to be performed must be clearly defined.
• Models for depicting Information, Function and Behaviour must be of a Layered nature.
• Analysis process should move from Vital Information to the Implementation detail.
• Models can be Functional and Behavioural.
CHAPTER 2
SOFTWARE QUALITY ASSURANCE PLAN (SQAP)
Any Software Development Process must be continuously Monitored and Reported
by a Quality Assurance Team. The Scope and Activity of this Team are described in detail
by a Software Quality Assurance Plan, henceforth called SQAP.
The following questions should be addressed in this Plan:

• What is the intended use of the Software covered by this SQAP?
  How is the Software to be used? How Critical is this Software? Is it
  part of a larger system? If so, how is it related to that system?

• What is the scope of this SQAP?
  Who does it apply to? Who is the intended audience?

• Why is this SQAP being written?
  Is this Plan being written in response to an Internal or External
  Requirement? How will this Plan contribute to the Success of the
  Project?

• Which Software Items are covered by this SQAP?
  Specific names and abbreviations should be supplied for these items.

• Which portions of the Software Life-cycle apply to each of the
  Software Items mentioned in the SQAP?
  Name the Life-cycle Model to be used by this Project. For Enhancing,
  Modifying and Correcting the existing Products, define the Life-cycle
  Stages or Phases they will pass through before their Integration into
  the operational system.

• Why were the Documents that form the basis of this SQAP chosen?
  Describe the extent to which this SQAP is based on Standards and
  Guidelines.

• What are the deviations from the Documents?
  Record any deviations that reflect the criticality presented.
Documents used to develop the SQAP should be referred to in its Text. By definition,
these Documents originate outside the Project. Some may have confidentiality restrictions,
so that only Part of the Document is available to the Project. Some may have different
Update and Version Release Procedures. Some may be on Paper and some Electronic.
Some may be located with the project and some may be located elsewhere. Any special
arrangement to obtain the document is to be Identified and the Project must use the most
Current Official Version.
The Organizational Element(s) responsible for the Software Quality Assurance
Functions covered by the SQAP, should be Developers having Knowledge in Quality
Assurance Tools, Techniques, and Methodologies. The SQAP should state the
organizational and functional boundaries of the SQA organizational element.
The relationship between the Primary SQA Organizational Member and the Software
Development Member should be defined and explained. The most effective organization will
have a separate Quality Assurance (QA) Team, responsible to an SQA Organization, rather
than to the manager of the Software Development organization. SQA independence is
necessary because the QA manager must not have Development responsibilities, which can
otherwise tend to override the Quality concerns. SQA Organizational Elements should
share Quality Evaluation Records and related Information with Customer
Organizations to promote resolution of Quality Issues.
A pictorial Organizational Structure should be included with an Explanation
describing the Nature and Degree of Relationships with all Organizational Elements
responsible for Software Quality and Development. The explanation should include the following:
• A Description of each Element that Interacts with the SQA Element.
• The Organizational Element, Delegating Authority and Delegated Responsibilities of
  each Interacting Element.
• Reporting relationships among the Interacting Elements, identifying Dependence or
  Independence.
• Identification of the Organizational Element with Product Release Authority.
• Identification of the Organizational Element or Elements that approve the SQAP.
• The Reporting Chain for reducing Conflicts and the Method by which Conflicts are
to be Resolved among the elements.
• The Size of the SQA Element and the Amount of Effort dedicated to the Project,
  where the amount is less than 100%.
• An explanation of any Deviations from the Organizational Structure outlined in
existing SQA Policies, Procedures, or Standards.
The description of the Organizational Structure should be complete so that all the
Tasks addressed in the SQAP can be directly related to a responsible Organization.
This section describes (a) the Portion of the Software Life-cycle covered by the SQAP,
(b) the Tasks to be performed with special emphasis on Software Quality Assurance
Activities, and (c) the relationships between these Tasks and the planned Major
Check-points. The sequence of the tasks shall be indicated.
Some of these tasks consist of Planning Activities, while others, such as reviews and
tests, are directed towards the Software Product. All the Tasks may not be applicable to a
Specific Project, in which event, they may be omitted from the Project SQAP. Any omissions
or deviations should be explained in the appropriate section of the project SQAP. Any
additional tasks, should be included in the appropriate SQAP sections.
This section of the SQAP also should identify the Tasks associated with the
Publication, Distribution, Maintenance, and Implementation of the SQAP.
Some of these Tasks have a profound Influence on the Development Element, and
such Tasks should be performed by, or in close Association with, the Software Development
Element. Each identified Task should be defined together with the Entrance and Exit Criteria
required to Initiate and Terminate it. The output of each Task should be defined in
such a way that its achievement or termination can be Objectively Determined in a
Prescribed Manner.
It is strongly recommended that a Software Development Plan (SDP) or a Software
Project Management Plan (SPMP) be prepared. If either Document is not available, schedule
Information outlining the Development Cycle should be provided to include the Software
Quality Assurance Activities.
If two or more organizational elements share Responsibility for a Task, their
Respective Responsibilities should be Identified. Describe the procedure for resolving
issues between Organizational Elements sharing Responsibilities. The Management
position accountable for overall Software Quality should be identified. SQAP should also
indicate the Review and Approval Cycle, indicating Signature Authority, as required. It
should show the number of Controlled Copies and describe the Method of Control. It should
designate the personnel and organizational element responsible for distributing the SQAP
and describe the Methods and Responsibilities for the Approval, Distribution, and
Incorporation of Changes. The SQAP should identify the Organizational Elements
responsible for the Origination, Review, Verification, Approval, Maintenance, and Control of
the required Task Documentation.
The SQAP shall identify Documentation, whether Hardcopy or softcopy, that will be
prepared during the Development, Verification and Validation, Use, and Maintenance of the
Software. If there is no Independent Verification and Validation, then the Quality Assurance
Procedures that are to be used on the Project should be Identified. Also, all required Test
Documentation should be noted. The SQAP also shall identify the Specific Reviews,
Audits, and Associated Criteria required for each Document.
To ensure that the implementation of the software satisfies requirements, the
following documentation is required as a minimum.
• Software Requirements Specifications (SRS)
• Software Design Description (SDD)
• Software Verification and Validation Plan (SVVP)
• Software Verification and Validation Report (SVVR)
• User documentation
• Software Configuration Management Plan (SCMP)
This minimum set of documents has been found to be a solid foundation to assure
the Quality of a Software Development. These documents are equally usable in In-house
Developments with or without an Informal Contract. Other Documentation, such as Test
Plans and Database Design Information, should be provided, as applicable. The QA activities
associated with each Document Review must be scheduled in consonance with the
Development Life-cycle Phases of the project. Where the development project changes
Existing Software, the required Set of Documents may be a Subset of the Documents used
for the Original Development.
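The minimum document set above lends itself to a mechanical completeness check at review time. A small sketch (the required list is the manual's; the register contents and the check itself are illustrative assumptions):

```python
# Sketch: checking a project's document register against the SQAP's
# minimum required document set listed above.
REQUIRED = {"SRS", "SDD", "SVVP", "SVVR", "User documentation", "SCMP"}

def missing_documents(register):
    """Return required documents absent from the project register, sorted."""
    return sorted(REQUIRED - set(register))

project_docs = ["SRS", "SDD", "SCMP", "Test Plan"]
print(missing_documents(project_docs))  # ['SVVP', 'SVVR', 'User documentation']
```

Extra documents such as Test Plans are simply ignored by the check; only gaps in the mandatory set are reported.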
A brief description of each document follows.
Software Requirements Specification (SRS).
The SRS shall clearly and precisely describe each of the essential Requirements of the
Software and the External Interfaces. Each Requirement shall be defined such that its
Achievement is capable of being Objectively Verified and Validated by a prescribed Method.
The SRS is usually developed from One or More Documents, e.g., a User
Requirements Statement, Operational Requirements, Preliminary Hazard Analysis, Software
Product Definition, SRS Reverse Engineering Documentation, System-level Requirements
and Design Documentation. It specifies in detail the Requirements as agreed upon by the
Developer and the Requester or User. Where the Developer produces the SRS, the SQAP
should identify the Standards or Guides followed. The SQAP should clearly define the
methods to be used by the SQA Organizational Element to Verify and Validate the Data in the SRS.
Software Design Description (SDD).
The SDD is a Technical description of how the Software will meet the Requirements
defined in the SRS. It shall also describe the Components and Sub components of the
Software Design, including Databases and Internal Interfaces. The SDD shall be prepared
first as the Preliminary SDD and subsequently expanded to the Detailed SDD. It involves
Descriptions of the Operating Environment, Timing, System Throughput, Tables, Sizing,
Centralised or Distributed Processing, Extent of Parallelism, Client/Server, Reusable Objects
Library, Program Design Language (PDL), Prototypes, Modelling, Simulation, etc. The SQAP
should identify the Standards and Conventions that apply to the Content and Format of the SDD.
The SDD may be an evolving Document or a Document that is updated after each
significant Review. A new Version containing a more Detailed Design Description is
developed for each subsequent review. The SQAP should identify the Number and Purpose
of the SDD Documents.
The SDD is subject to the Preliminary Design Review (PDR) and the Critical Design
Review (CDR). The SDD should consist of following items:
• A textual description of the Component's Inputs, Outputs, Calling Sequence, Function or
  Task, and Algorithms.
• A list of other components called.
• A list of all calling components.
• Allowed and tolerable range of values for all inputs.
• Allowed and expected range of values for all outputs.
• Assumptions, Limitations, and Impacts.
The SQAP should clearly define the methods to be used by the SQA Organizational
Element to Verify and Validate the Data in the SDD.
Software verification and validation plan (SVVP)
The SVVP shall identify and describe the methods to be used:
• To verify that the Requirements in the SRS have been approved by an appropriate
authority, are implemented in the Design expressed in the SDD, which is further
implemented in the Code.
• To Validate that the Code, when executed, Complies with the Requirements expressed in the SRS.
The SVVP describes the overall Plan for the Verification and Validation of the
Software and could be produced and Reviewed incrementally. The Tasks, Methods, and
Criteria for V & V are described. SVVP might be used to Document the Procedures to be
followed for some of the Reviews. This section should explain the scope of Validation Testing
needed to ensure that the Reviewed Requirements are met, and explain the stages of Development
that require Customer Review and the extent of the Verification that will precede such a Review.
The contents of the SVVP will be Evaluated at the Software Verification and Validation Plan Review.
Software Verification and Validation Report (SVVR)
The SVVR shall describe the Results of the execution of the SVVP and summarises
the observed Status of the Software as a result of the Execution of the SVVP. The SVVR
should include the following information:
• Summary of all Life-cycle V & V Tasks.
• Summary of Task Results.
• Summary of Anomalies and Resolutions.
• Assessment of overall Software Quality.
• Summary from the Verification Strategy.
• Recommendations such as whether the Software is ready for operational use.
This section should explain why this style of Report was chosen and in what way it
satisfies the Criticality of the Product and gives Assurances to the Customer. The SQAP
should clearly define the Methods to be used by the SQA Organizational Element to assure
the Correctness and Completeness of the Data in the SVVR.

User Documentation
User Documentation shall Specify and Describe the Required Data and Control Inputs
and Input Sequences as well as Options, Program Limitations, and other activities or items
necessary for successful Execution of the Software. All Error Messages shall be
Identified and Corrective Actions are to be elaborately described. The User
Documentation section of the SQAP should describe the Software's Operational Use and
comprise the following items:
• User Instructions that contain an Introduction and a Description of the User's Interaction
with the System
• An Overview of the System, its Purpose, and Description.
• Input / Output Specifications.
• Samples of Original Source Documents and examples of all Input Formats
• Samples of all Outputs (Forms, Reports, or Displays).
• Instructions for Data Preparation, Keying, Verification, Proofing, and Correction.
Wherever Software is capable of damaging the User’s Assets (e.g., Database contents)
the User should be forewarned to avoid accidental damage.
• References to all Documents or Manuals intended for use by the Users.
• A Description of the system's limitations.
• A Description of all the Error Messages that may be encountered,
Along with Recommended Steps to recover from each Error.
• Procedures for Reporting and Handling problems encountered during the use of the
  Software.
• Menu Hierarchy and Navigation Methods.
• User Administration Activities ( Backup, Recovery, Batch Initiation, Access Control ).
Software Configuration Management Plan (SCMP)
The SCMP shall document the Methods to be used for Identifying the Software Items,
Controlling and Implementing Changes and Recording and Reporting the Change
Implementation Status. The SCMP of the SQAP should describe the Tasks, Methodology
and Tools required to Assure that adequate Software Configuration Management
(SCM) Procedures and Controls are Documented and are being Implemented
correctly. It is essential that one SCMP exist for each project. The SQAP should define the
extent to which the project requires Configuration Management.
The SCMP should describe the methods to be used for
• Identifying the Software Configuration Items.
• Controlling and Implementing Changes.
• Recording and Reporting Change and Problem Reports Implementation Status.
• Conducting Configuration Audits.
• Identifying Review and Approval Cycle as well as Signature Authority.
• Identifying the Personnel responsible for maintaining the Guidelines and distributing the SCMP.
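Status accounting, listed above, amounts to recording every change against a configuration item and being able to report its current state on demand. A toy sketch (the item names, change-request identifiers and state labels are invented for illustration):

```python
# Toy sketch of SCM status accounting: record change requests against
# configuration items and report each item's current state.
from collections import defaultdict

class StatusAccounting:
    def __init__(self):
        # item name -> chronological list of (change_request, state)
        self.history = defaultdict(list)

    def record(self, item, change, state):
        self.history[item].append((change, state))

    def current_state(self, item):
        # An item with no recorded changes is still at its baseline.
        return self.history[item][-1][1] if self.history[item] else "baseline"

log = StatusAccounting()
log.record("interlocking.c", "CR-001", "proposed")
log.record("interlocking.c", "CR-001", "approved")
log.record("interlocking.c", "CR-001", "implemented")
print(log.current_state("interlocking.c"))  # implemented
print(log.current_state("relay_driver.c"))  # baseline
```

The full history list, not just the latest state, is what supports the reporting and audit tasks listed above.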
Software Development Plan (SDP)
The SDP can be used as the highest-level Planning Document governing a Project,
or could be subordinate within a larger set of Plans. For example, several SDPs may be
written in support of a larger Project that is governed by a Software Project Management
Plan (SPMP). The SDP should identify all
Technical and Managerial Activities associated with Software Development. The SDP should
specify the following items, which should be reviewed and assessed by the SQA
Organizational Element:
• Description of Software Development.
• Software Development Organization responsibilities and interfaces.
• Process for managing the Software Development.
• Technical Methods, Tools and Techniques to be used in support of the Software
Development.
• Assignment of responsibility for each Software Development Activity.
• Schedule and Interrelationships among Activities.
• Formal Qualification Testing Organization and Approach.
• Software Product Evaluation during each Life-cycle Phase, including subcontractor
Products.
The SQAP should define the procedures for creating the data in the SDP and criteria
for updating and assuring its Quality. Any deviations from the Plan should be reconciled
between the SQA staff and the Software Development Manager. The Plan should be
updated and the latest version clearly identified.
Software Project Management Plan (SPMP)
The SPMP can be used in place of an SDP, or as a Plan that governs a larger project
having Sub-projects, each covered by SDPs. The SPMP should identify all Technical and
Managerial Activities associated with an instance of Software Development. The SPMP
should specify the following items:
• Description of Software Development.
• Software Development and Management Organizations' Responsibilities and Interfaces.
• Process for Managing the Software Development.
• Technical Methods, Tools, and Techniques to be used in support of the Software
Development.
• Assignment of Responsibility for each Activity.
• Schedule and Interrelationships among Activities.
• Process Improvement Activities.
• Goals Deployment Activities.
• Strategic Quality Planning Efforts triggered by Reviews.
• A list of Deliverables.
• Subcontractor(s) Project Management Plan(s).
The SQAP should define the procedures for creating the data in the SPMP and
criteria for updating and assuring its quality. Any deviations from the plan should be
reconciled between the SQA staff and the Software Project Manager.
Software Maintenance Manual (SMM)
A Maintenance Manual should contain Instructions for Software Product Support and
Maintenance, e.g., Procedures for Correcting Defects, Installation of Enhancements,
and Testing of all Changes. All Hardware and Software Configuration Specifications,
required to maintain the Software, should be described in Detail. Any Unusual Settings or
known Anomalies should be identified in order to aid in Efficient Maintenance. New Versions
of Software should be thoroughly tested prior to incorporation into Operational Systems.
Version control procedures should be reviewed and approved by SQA and SCM
Organizational Elements. The SQA Organizational Element should periodically Audit and
Validate the use of the Version Control Procedures as well as the Software Maintenance
Process and Procedures.
Additional suggested documentation
The Attributes, Context, and Environment of the Product could dictate Inclusion of
Additional Documents, such as, but not limited to, the following:
A User Requirements Statement can be used as a high-level document preceding
the approved SRS for a large development, in place of an SRS in cases where minor
changes are made to an operational system that has no SRS, or as a means of passing
requirements on to a supplier.
External Interface Specifications should be contained within the Software
Requirements Specifications, the System Design Document, or an Interface Control
Document (ICD). In situations where the detailed External Interface Specifications are not
available in the Design Documentation or ICD, a Separate External Interface Specifications
Document may be required that would provide lower-level detail.
The Internal Interface Specifications should contain information about files and
other connections among all the components within the system. Consideration should be
given to such subjects as Transfer of Control between Modules, Passing of Data between
Modules, Physical and Logical Interfaces, Common Data, Timing and Concurrency.
An Operations Manual should contain at least the following items:
• Operating instructions containing
• An Introduction
• Run Schedules
• Set-up Requirements / Procedures
• Run Control Procedures
• Error Procedures
• Security Procedures
• Distribution Procedures
• Backup and Recovery Procedures
• Restart procedures
• Termination Procedures
• Tutorial and Practice Procedures
It should, in addition, contain Specifications for the System, including Environmental
Requirements, Input / Output Specifications and Auditing Controls.
An Installation Manual should contain Instructions for the Installation of the
Software, for example, File Conversion Instructions, use of User-controlled Installation
Options, and Instructions for Performing an Installation Test. Installation procedures may be
performed through an Interactive Interface (i.e., menu driven).
The training manual should contain information necessary for training users and
operators of the system. It should contain, but is not limited to, the following:
How to use the System
How to prepare Input
Data Input Descriptions
Data Control Descriptions
How to run the System
Description of Output Data and Interpretations
Tutorial and Practice Exercises
How to get Help
The Development of Software Products that require Complex or Unfamiliar
Interactions with Users and Operators should include a Comprehensive Plan for Training.
The Training Plan should include the following:
• A description of the populations to be trained, the training objectives for each
population, and the content to be covered in the training.
• An estimate of the amount of resources necessary for training development, delivery,
and time expenditures.
• Procedures for evaluating the effectiveness of the training and for making modifications
to the training.
The Software Metrics Plan should address the way Product and Process Metrics will
be used to Manage the Development, Delivery, and Maintenance Processes. The Software
Metrics Plan should contain information on the following:
a) The Mission and Objectives of the Software Metrics Program.
b) The Quantitative Measures of the Quality of the Software Products and Processes.
c) How the Product or Process Metrics will be used to Identify and Measure Quality
Problems.
d) How Remedial Action will be taken if Product or Process Metric Levels grow Worse or
exceed established Target Levels.
e) Improvement Goals in Terms of the Product and Process Metrics.
f) How the Metrics will be used to determine how well the Development Process is
being carried out in terms of Milestones and In-process Quality Objectives being
met on schedule.
g) How the Metrics will be used to determine how Effective the Development Process
Improvement is at reducing the Probability that Faults are introduced or that any
Faults Introduced go Undetected.
h) A definition of the Metrics Reports that are generated, including their Frequency and
Reporting Periods, as well as which Element in the Software Organization uses these
Reports.
i) How to Validate the Software Quality Metric.
j) Data Collection Methodology, including Roles and Responsibilities, Retention, and
Storage of the collected Data.
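As an illustration of item b), a quantitative product measure in such a plan can be as simple as defect density per thousand lines of code. The sketch below is illustrative only; the function name and target level are assumptions, not standard figures.

```python
# Illustrative product metric for a Software Metrics Plan: defect
# density (defects per thousand lines of code, KLOC). The target
# level is an assumed figure, not a standard value.

def defect_density(defects_found, lines_of_code):
    """Return defects per KLOC."""
    return defects_found / (lines_of_code / 1000)

TARGET_LEVEL = 2.0  # assumed target: at most 2 defects per KLOC

density = defect_density(defects_found=9, lines_of_code=6000)
print(f"defect density: {density:.2f} per KLOC")
if density > TARGET_LEVEL:
    print("remedial action required")  # ties to item d) of the plan
```

A plan would also state who collects the defect and size counts and how often the metric is reported, per items h) and j) above.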
The Software Security Plan should address the way in which the Software and the
Data will be protected from unauthorised Access or Damage. The Software Security Plan
should contain Information on the following:
a. How the Data should be Classified and how this Classification will be Communicated.
b. How the Users of the Software Access the Application and how that Access is to be
Controlled.
c. Network Design.
d. User Identifications, Passwords, Security Logging, and Auditing.
e. Super-user Password Control and Protection of paths through the System.
f. Physical Security.
g. Virus Protection.
h. How Employees will be trained on Security Procedures and Practices.
i. The Method by which this Security Plan will be Audited for Compliance.
j. Disaster plan.
k. Whether the system provides file encryption and decryption capability.
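Two of the items above, password handling (item d) and security logging, can be sketched with the Python standard library. This is a hedged illustration, not a policy recommendation; the iteration count, salt size and log format are assumed values.

```python
# Sketch of two items a Software Security Plan might mandate:
# salted password hashing and security logging of login attempts.
# Standard library only; parameter values are illustrative.
import hashlib
import logging
import os

logging.basicConfig(level=logging.INFO)
audit = logging.getLogger("security-audit")

def hash_password(password, salt=None):
    """Hash a password with PBKDF2-HMAC-SHA256 and a random salt."""
    salt = salt or os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    return salt, digest

def verify_password(password, salt, digest):
    """Check a password and write a security-log entry for the attempt."""
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, 100_000)
    ok = candidate == digest
    audit.info("login attempt: %s", "success" if ok else "FAILURE")
    return ok

salt, digest = hash_password("s3cret")
print(verify_password("s3cret", salt, digest))  # True
print(verify_password("wrong", salt, digest))   # False
```

The audit log produced here is the raw material for item d) (security logging and auditing) and for the compliance audit of item i).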
Standards, Practices, Conventions and Metrics
The subjects covered shall include the basic technical, design, and programming
activities involved, such as documentation, variable and module naming, programming,
inspection, and testing. As a minimum, the following information shall be provided:
(1) Documentation Standards
(2) Logic Structure Standards
(3) Coding Standards
(4) Commentary Standards
(5) Testing Standards and Practices
(6) Selected Software Quality Assurance Product and Process Metrics such as:
(a) Branch Metric
(b) Decision Point Metric
(c) Domain Metric
(d) Error Message Metric
(e) Requirements Demonstration Metric.
The SQAP should identify or reference the Standards, Practices, Conventions and
Metrics to be used on the Project. As a minimum, the Information required should be
addressed for each life-cycle phase.
In addition, the Standards, Practices, and Conventions pertaining to Software
Documentation and the use of Metrics should be addressed. It is strongly recommended that
these issues be resolved in close co-operation with the Software Development Element. The
descriptions of the Standards, Practices, Conventions, and Metrics are often given in a
Standards and Procedures Manual. For the use of all of the above, the SQAP should state
how compliance is to be Monitored and Assured.
As a minimum, the following reviews shall be conducted in SQAP:
a) Software Requirements Review (SRR)
b) Preliminary Design Review (PDR)
c) Critical Design Review (CDR)
d) Software Verification and Validation Plan Review (SVVPR)
e) Functional audit
f) Physical audit
g) In-Process audits
h) Managerial reviews
i) Software Configuration Management Plan Review (SCMPR)
j) Post mortem review
This section shall identify all the tests not included in the SVVP for the software
covered by the SQAP and shall state the methods to be used. The SQAP shall include the
Specific Software Tests and Testing not addressed in the SVVP (or other test
documentation). The SQAP should identify and describe the Methods, Approaches, and
Techniques to be used. Test methods and techniques can be divided into two
classifications: Static Tests and Dynamic Tests. Static Testing Evaluates or Analyses the
Software without executing the Code.
Dynamic Testing executes the Code and encompasses such Methods and Techniques as
Unit Level Testing,
System Level Testing, and
Security Testing.
This section of the SQAP should include a description of the Test Planning, Test
Execution and Evaluation of the Results of Testing or include a Reference to the Appropriate
Section of the SQAP or to Standards and Procedures that describe the Testing Process. The
testing process described should adequately address the Preparation and Review of
Documentation associated with the Testing Process including
• Test Plans
• Test Design and Test Case Specifications
• Test Procedures and Test Instructions
• Test Schedule
• Test Reports and
• Test Incidents and their Resolution.
The SQAP should specify the manner in which Records will be kept. Also, it should state
how Records will be stored to protect them from Fire, Theft, or Environmental Deterioration.
The need for Training of Personnel designated to perform the Activities defined in the
SQAP should be assessed. It is suggested that a Document is created to define the Task,
Skill Requirement(s) of the Task and the Skills of the Personnel designated to perform the
Task. This will allow the Rapid Identification of Training Needs for each Task and each
Individual. Considerations for Training should include Special Tools, Techniques,
Methodologies and Computer Resources.
Once the Training Need is established, a Training Plan should be developed that
identifies the Training Activities Required to successfully implement the SQAP. Existing
Training Programs should be adapted or New Training Programs developed to meet these
Training Needs. Training Sessions should be scheduled for Personnel who will be assigned
to carry out the Tasks.
If Training for Non-SQAP related Software Activities is not covered in other Project
Documentation, the SQA Organizational Element, upon Completion of the other Project
Documentation Review, should recommend the Development and Inclusion of a Training
Programme into their Documentation.
Resources for SQAP
The four types of resources required to implement a SQAP are Personnel,
Equipment, Facilities and Tools. The Quantity and Quality of these Resources should be
made known to the Appropriate Level of Management. The responsible element should
identify the Job Classifications and Skill Levels of the Personnel required to implement and
maintain the SQAP, throughout the Life of the Project. It should identify the Hardware
needed to implement the SQAP and to support it throughout the Project, as well as
Estimates of Computer Time and Support required. Also, it should identify the Facilities
needed for Storage of Media and Records. When Resources are identified by an Element
other than the SQA element, the SQA Element should verify compliance with this Task. The
Tools required for Implementation should have been already identified in the SQAP itself.
CHAPTER 3
SOFTWARE TESTING
Testing of any Software is an integral part of the Software Development Life cycle
and is a Critical part of Software Quality Assurance. Nowadays, roughly 30 to 40% of
the total Software Development Cost is spent on Software Testing. In fact, for Safety-
critical Software, e.g. that used in Railway Interlocking, the Testing cost may be three to
five times all other costs taken together.
The main Objectives of Testing a Software are
• To find Errors and their Effects and rectify them.
• To identify the presence of any Critical Error.
• To Predict the Reliability of the Software.
The Testing Principles adopt the following Points:
• All the Tests must be Traceable to the Customer Requirements.
• Tests should be planned much before actual Testing begins.
• Initial Test Plan should cover Individual Modules and gradually focus should shift to a
Cluster of Modules and finally, the whole System is to be tested.
• Testing should preferably be conducted by a Third party.
Softwares are to be built keeping Testability in mind, which would allow the Testers to
generate Test Cases more efficiently. Care is to be taken to have a minimum of Bugs and,
even if some remain, Test Execution should not be blocked by them. The Software should have
Controllability, Simplicity, Stability and Understandability to have a good Testing Process.
The success of a Software Test Plan depends on ---
• User satisfaction -- Involving the User at key points ensures satisfaction,
• Management support -- Effective testing needs strong management support,
• Planning -- Proper planning assures a thorough testing,
• Training -- Proper training for the testers is essential for the success of
testing,
• Use of processes -- Choice of proper Processes brings stability and consistency,
• Tools -- Proper Tools are necessary to provide effective and efficient
testing,
• Efficiency -- It provides the maximum coverage and assessment of risks,
• Quality control -- It evaluates whether the test process has been performed
correctly.
The parties interested in Software testing include Software Customer, User,
Developer, Tester and Information Technology Manager. Software Testing is required to
protect the Software from Defects in Product Specifications and Variance from Customer
Expectations. A
Good Documentation must be kept regarding Testing Process as well as the Results.
Finally these are to be Verified and Validated.
The Factors associated with Software Testing are:
• Correctness -- It assures that the Data entered, processed and output are
accurate and complete.
• File Integrity -- It ensures that the correct File is used and the sequence of
storage and retrieval of Data is correct.
• Authorization -- It assures that the Data is processed in accordance with
intents of management.
• Audit Trail -- Data processing is to be supported by evidential matter to
substantiate the accuracy, completeness, timeliness and
authorization of the processing.
• Continuity of Processing -- It assures that the necessary Procedure and Information
are available to recover operations, in the case of integrity
being lost due to problems.
• Service Levels -- For achieving required service level, it is needed to match
user requirements with the available resources.
• Access Control -- It ensures that the application will be protected against,
accidental and intentional modification, damage or
destruction by unauthorized persons.
• Compliance -- It assures that the Software is developed according to the
organizational strategy, policy, procedures and standards.
• Reliability -- It assures that the Software will perform intended functions.
Checklist for testing tactics
The following checklist is to be used by the Software Test Team for getting
efficient and fruitful test results:
• Whether the Test Strategy is used as a Guide for developing the Test Tactics?
• Whether the Individuals associated with testing were properly identified and a
Strategy was maintained for their recruitment?
• Did the Management assign any Documented Responsibility to the Test Team?
• Had a Testing Plan been established, covering Objectives, Strategy, Tactics and
Schedule?
• Whether any provision for modifying the Test Plan and Evaluation was kept to tackle any
unforeseen Condition?
• Does the Test Team adequately represent User, Data Administrator, Internal Auditor,
Quality Assurance Staff, Management, Security Administrator and Professionally trained
Testers?
• Whether the idea of Removing Faults at an Early Stage is appreciated by the Test
Team?
• Will the Test Team do the Verification and Validation tasks as well?
• Will the Test Team develop a Tester’s Workbench?
• Has the Test Team identified a Source for Generic Test Tool?
Softwares are tested both for their Functions as well as their Structures.
Software Functional Testing
These tests are designed to ensure that the System Requirements and
Specifications are achieved. The process normally involves creating Test Conditions for
use in evaluating the correctness of the Application. The various types of Software Functional
Tests are:
• Requirements Test -- It sees that the User Requirements are fulfilled and Software
correctness is maintained over the whole process period. It
also ensures that the Process complies with the Policies and
Procedures of the Organization.
• Regression Test -- It assures that all aspects of an Application System remain
Functional after changes are made. It determines whether the System
Documentation, Test Data and Test Conditions remain current.
• Error-handling Test -- This helps the Tester to know how the Software behaves
when an Error occurs. It produces a representative Set of
Transactions containing errors and enters them into the System
to check whether the Application can identify the problems
and handle them properly.
• Manual-Support Test -- It verifies that the Manual-support procedures are properly
documented, Testers are properly trained and the Manual as
well as the Automated Segments are properly interfaced.
• Inter System Test -- It determines that the proper Parameters and Data are
correctly passed between Applications, proper Coordination
and Timing of Functions exist between various Applications.
• Control Test -- It ensures an adequate Audit Trail and an Efficient and
Effective Control over Processing.
• Parallel Test -- It conducts redundant processing to ensure that the
Application performs correctly.
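The Regression Test above amounts to rerunning a saved baseline of input/expected-output pairs after every change. A minimal sketch, in which the fare function, baseline values and helper names are all invented for illustration:

```python
# Minimal regression-test sketch: a baseline recorded from a previously
# accepted version of the Software is rerun after a change to confirm
# that existing behaviour is unchanged.

def compute_fare(distance_km):
    """Hypothetical function under test: flat rate plus a per-km charge."""
    return 10 + 2 * distance_km

# Baseline pairs (input, expected output) from the accepted version.
REGRESSION_BASELINE = [
    (0, 10),
    (5, 20),
    (100, 210),
]

def run_regression(fn, baseline):
    """Rerun every baseline case; return the cases whose output changed."""
    return [(x, expected, fn(x))
            for x, expected in baseline
            if fn(x) != expected]

failures = run_regression(compute_fare, REGRESSION_BASELINE)
print("regression failures:", failures)  # an empty list means no regression
```

A real suite would load the baseline from the project's test repository rather than hard-coding it.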
Software System Structure Testing
The objective of Software Structure Testing is to ensure that the designed product is
strong in Structure and will function in a correct manner. Structural Tests will test every
Branch and every Statement in the Software. Each Clause in every Condition is forced to
take on each possible value, in combination with those of other Clauses. Every Expression,
used in the Software, must have a variety of values during Tests such that none can be
replaced by a simpler Expression. All Paths of the Software must be Executed. The various
Structural tests are:
• Stress Test -- It determines that the System performs with expected Volumes
and that sufficient Resources are available for the System
Capacity. The Test Group creates the Test Data and Test Transactions.
• Execution Test -- It determines the performance of the System Structure, verifies
the optimum use of resources and determines the Response Time to
User requests.
• Recovery Test -- It ensures that operations can be continued after disaster. The
objectives include preservation of back-up data at a secure location
and documentation of the procedures.
• Operations Test -- It determines the completeness of Documentation, Support
system and Operator’s Training.
• Compliance Test -- It ensures that System Development and Maintenance
methodologies are properly followed. It also ensures
compliance with Departmental Standards, Procedures and
Guidelines.
• Security Test -- It ensures that the resources being protected are identified and
access is defined for each resource. It evaluates whether the designed
security has been followed and functions are as per the
Specifications.
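The requirement above that "Structural Tests will test every Branch and every Statement" can be sketched with a small made-up function: a structural suite must drive each condition both true and false at least once. The signal-classification function below is invented for illustration.

```python
# Branch-coverage sketch: three cases force every outcome of every
# branch in this two-branch function at least once.

def classify_signal(speed, track_clear):
    if not track_clear:      # branch 1
        return "danger"
    if speed > 100:          # branch 2
        return "caution"
    return "proceed"

branch_cases = [
    ((50, False), "danger"),   # branch 1 taken
    ((120, True), "caution"),  # branch 1 not taken, branch 2 taken
    ((50, True), "proceed"),   # branch 1 not taken, branch 2 not taken
]

for args, expected in branch_cases:
    assert classify_signal(*args) == expected, args
print("every branch exercised")
```

Statement coverage follows automatically here, since every statement lies on one of the three branch paths.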
Software Testing and Analyzing
Program Testing and Analysis are the most practical ways to verify that a Programme has
the features described in its Specification. Testing is a Dynamic approach, where Test Data is
executed to assess the availability of the required features. Analysis, on the other hand, is a
Static approach to verification, in which required features are detected by analyzing and not
executing the codes. Software Testing and Analyzing are basically of three types --
Functional, Structural and Error oriented.
• Functional (Function-driven) Testing and Analysis ensure that the major
characteristics of the Codes are covered.
• Structural Testing and Analysis ensure that various characteristics of the
Programme are adequately covered.
• Error oriented Testing and Analysis ensure that the range of Typical Errors is
covered.
In Function-driven Testing, the Test case Generator must create at least one Normal
Test Case for every specified Function. The Generator must also create one or more
abnormal Test Cases for every Function. The latter finds out any missing Error-handling
Functions. The Functional Testing requires the Test of not only the Functions implemented
by the whole Programme but also partial Programmes. Some Faults can remain undetected
when the Programme is tested as a whole and can only be detected by testing of Partial
Programmes.
Functional Testing and Analysis can be both Independent of and Dependent on
Specification. For Tests independent of Specifications, we have two types:
i) Testing based on the Interface: -
• Input Domain Testing, where Input Data is chosen to cover the entire domain including
Upper and Lower extremes as well as many mid range values. The Tester applies a Data
Input at its Highest and Lowest valid values. This Boundary Analysis should be
complemented with feeding Data having values just adjacent and beyond the Boundary
Values, for effectiveness.
• Equivalence Partitioning, where the full set of Input Data is divided into many Classes
needing equivalent treatment. Here, the Tester chooses Partitions such that every
Data value in a Partition Class has the Equivalent effect on Test Result.
• Syntax Checking, where the Programme must parse its Input and handle wrongly
formatted Data gracefully.
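Input Domain Testing and Equivalence Partitioning, as described above, can be sketched together. The validation function and its valid range (18 to 65) are assumptions made purely for illustration.

```python
# Input-domain testing sketch for a hypothetical validate_age() with
# the assumed valid domain 18..65 inclusive.

def validate_age(age):
    """Accept an integer age within the valid domain 18..65."""
    return isinstance(age, int) and 18 <= age <= 65

# Boundary analysis: values at, just inside, and just beyond each boundary.
boundary_cases = {
    17: False,  # just below lower boundary
    18: True,   # lower boundary
    19: True,   # just above lower boundary
    64: True,   # just below upper boundary
    65: True,   # upper boundary
    66: False,  # just beyond upper boundary
}

# Equivalence partitioning: one representative value per partition class.
partition_cases = {
    -5: False,   # partition: negative values
    40: True,    # partition: valid mid-range values
    200: False,  # partition: values far above the domain
}

for value, expected in {**boundary_cases, **partition_cases}.items():
    assert validate_age(value) == expected, value
print("all input-domain cases passed")
```

Any value inside a partition class is assumed to behave like its representative, which is why one case per class suffices once the boundaries are separately tested.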
ii) Testing based on the Functions: -
• Special Value Testing, where Test Data is chosen on the basis of Functional features to
be exercised.
• Output Domain Coverage, where Data is chosen such that extremes of each Output
Domain will be achieved.
In the Specification dependent Tests, the Techniques are Algebraic, Axiomatic,
State Machine Based and Decision Table Based.
Different types of Structural Software Testing during Test Phase
• Manual, Regression and Functional tests for Reliability
• Compliance test for Authorization, Security and Performance
• Functional tests for File Integrity, Audit Trail and Correctness
• Recovery test for Continuity of Processing
• Stress test for Service Level
• Inspections for Maintainability
• Disaster test for Portability
• Operations test for Ease of Operation.
Structural Analysis identifies Fault-prone Code to find out anomalous
Circumstances and generates Test Data to cover specific characteristics of the Programme
Structure. It covers:
• Complexity Measures -- It identifies the small Percentage of Code which contains the
largest number of Errors.
• Data Flow Analysis -- Data Flow is a Graphical Representation of Programme
Execution and gives information about Variable Definitions,
References and Indefiniteness.
• Symbolic Execution -- It takes as Input the Programme to be executed, a symbolic
Data Input and the Path to be followed. The Outputs are the
symbolic Output for the Computing job and the Condition
of the Path passed during Execution.
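A Complexity Measure of the kind mentioned above can be approximated mechanically. The sketch below counts decision points in Python source with the standard `ast` module and reports decision points plus one, a common approximation of cyclomatic complexity; a production tool would handle more node types.

```python
# Complexity-measure sketch: approximate cyclomatic complexity as
# (decision points + 1), counting decision nodes in the parse tree.
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.BoolOp, ast.ExceptHandler)

def cyclomatic_complexity(source):
    """Parse source code and count decision points plus one."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES)
                    for node in ast.walk(tree))
    return decisions + 1

sample = """
def f(x):
    if x > 0:
        for i in range(x):
            print(i)
    return x
"""
print(cyclomatic_complexity(sample))  # 3: one 'if', one 'for', plus 1
```

Ranking modules by such a score is one way to find the small percentage of code likely to contain the largest number of errors.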
Software Test Execution
Test Execution means Running or Executing Software Test Cases to find Software
Faults and showing that the Software works as Specified. It is largely Procedural and
depends only slightly on Theoretical or Mathematical foundations. The Software Test
Execution is done according to one of the five different Maturity Levels described by the
Software Engineering Institute (SEI).
Level 1 Organizations do not provide any Automated Tool for Software Testing and
the Test Documentation is also of a Poor Quality. The Professional Level of the Testers is
not high.
In Level 2, the Documentation is better, but there is still no Automated Tool for
Testing the Software. Some Capture Tools, working like a VCR or Tape Recorder, are used
to record the Strokes from the Keyboard or Clicks from the Mouse. Thus, this Level also cannot
produce a good quality Testing.
In Level 3 Organizations, every Testing Team Member precisely knows the
Procedure and the expected End Results. Comprehensive Test Plan, Test Case
Specification and Test Procedure Specification must be available. The Testers develop the
Test-scripts systematically from Specifications, according to the written Policy. Some
Software Metrics are collected, in the form of the Number of Test Cases Generated, the
Time Taken by the Test Cases to run, the Number of Failures, and which Cases have not
been Executed. Thus, Testers in Level 3 perform Efficient Testing.
Testers at Level 4 will be provided with an Automated, Specification-based Test
Case Generator and with some Test Evaluator Tool as well. In this case, the Testers
regularly report to the Management about Test Productivity and Quality. There will be
enough Metrics available to the Test Managers and they will have more confidence in
Predicting the Quality and expected completion of the Test.
Level 5 is at the summit and here Specifications also can be entered into a Tool in the
Integrated Test-Bench, and very little Manual intervention is further needed to produce a high
quality Test Result.
When Automated Test Tools are not available, Software Testers must themselves
decide whether the Software has passed or failed a Test Case by using their own
Judgement. The Time Taken for Tests is determined by the Number of Testers available, Time
available for Evaluation and how many Test Cases are to be Evaluated and with what
Confidence Level. This type of Test needs that the Tester must know the Expected Result
after the Execution of the Software. Comparison of the actual Output with the expected one
will determine whether the Software passed or failed. Prediction of expected result is time
consuming, needs thorough Knowledge of the System and is costly. If the time allowed for
Testing is not adequate, Testers perform some Domain Testing, which may not be able to
detect some hidden Failures. So, implementation of a Software Testing Tool will always be
beneficial.
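The manual pass/fail decision described above can be sketched as comparing recorded expected results against actual outputs. The function under test and the cases below are invented for illustration; the last expectation is deliberately wrong to show a FAIL verdict.

```python
# Sketch of manual test evaluation: the Tester records the expected
# result of every Test Case beforehand, executes the Software, and
# compares actual output with expected output to decide pass/fail.

def add(a, b):
    """Hypothetical function under test."""
    return a + b

def evaluate(fn, cases):
    """Return (inputs, expected, actual, verdict) for each case."""
    report = []
    for inputs, expected in cases:
        actual = fn(*inputs)
        verdict = "PASS" if actual == expected else "FAIL"
        report.append((inputs, expected, actual, verdict))
    return report

cases = [((1, 2), 3), ((0, 0), 0), ((2, 2), 5)]  # last expectation is wrong
for row in evaluate(add, cases):
    print(row)
```

Predicting the expected value for every case is exactly the time-consuming, knowledge-intensive step the text describes, which is what an automated test oracle is meant to replace.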
CHAPTER 4
SOFTWARE VERIFICATION AND VALIDATION
It is a disciplined approach for assessing Software products throughout the Life
cycle of the product. The Verification and Validation effort ensures that the Quality is built
in to the Software and that it satisfies User Requirements. These Tasks support each
other and combine to become a powerful tool and do the following :
• Discovers Errors as early as feasible in the Software Life Cycle.
• Ensures the required Software qualities are planned and built in the system.
• Predicts how well the Interim and Final products will satisfy User Requirements.
• Assures conformance to Standards.
• Confirms Safety and Security functions.
• Helps to prevent last minute problems at delivery.
• Provides a higher Confidence Level in the Reliability of the Software.
• Provides management with better Decision criteria.
• Reduces frequency of operational changes.
• Assesses impact of proposed changes on V &V activities.
Each project needs its own effective Software Verification & Validation Plan
tailored for the project. The Plan may be modified at times in response to major changes. V
& V Planning, which can be thought of as an integral part of overall Project Planning, may be
broken down in to the following steps :
• Identify the V &V Scope
• Establish specific objectives from the general project scope
• Analyze the project input prior to selecting the V & V Tools and Techniques and
preparing the Plan.
• Select Techniques and Tools
• Develop the V & V Plan
REQUIREMENT PHASE V&V
The Requirement Phase is the period during which, the Functional and Non-
functional capabilities of a Software product are defined and documented. This phase
describes and specifies :
• System Boundary
• Software Environment
• Software Functions
• Software Constraints
• Software Interfaces
• Software Data
• Software Algorithm
• Software States
• Software Error Conditions
• Software Standards
• Hardware Interfaces
Requirements Phase V & V
For Critical Software, this includes:
• Software Requirement Traceability Analysis
• Software Requirement Evaluation
• Software Requirement Interface Analysis
• Test Plan Generation covering both System Test and Acceptance Test
Software Requirements Traceability Analysis establishes that the SRS completely
satisfies all the capabilities specified in the Concept Document. It also ensures that all of the
necessary parts of the Software are specified. This Analysis identifies the
• Format of the Concept Document, along with the SRS Documentation and the releasing
Authority.
• Criteria for extracting and identifying discrete requirements
• Indexing and Cross-referencing Requirement Specifications
• Acceptance conditions for the SRS.
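The indexing and cross-referencing of requirements described above is often held as a traceability matrix mapping each requirement to the Test Cases that cover it. A minimal sketch, with all identifiers invented for illustration:

```python
# Requirements-traceability sketch: a matrix from requirement IDs to
# covering Test Case IDs. In practice this would be extracted from the
# SRS and the Test Design Specifications.

traceability = {
    "SRS-001": ["TC-101", "TC-102"],
    "SRS-002": ["TC-103"],
    "SRS-003": [],            # requirement with no covering test yet
}

def uncovered_requirements(matrix):
    """Return the requirement IDs that no Test Case traces to."""
    return [req for req, tests in matrix.items() if not tests]

print("uncovered requirements:", uncovered_requirements(traceability))
```

An uncovered requirement signals either an incomplete Test Plan or an SRS capability that was never carried forward, which is exactly what Traceability Analysis is meant to expose.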
Software Requirements Evaluation checks the SRS for Correctness, Consistency,
Completeness, Readability, Testability and Accuracy, i.e. it assesses the Technical Merits
of the Software Requirements. Depending upon the Scope and Complexity of the SRS,
there may be multiple Evaluations, each of which addresses a particular purpose and may
use specialized techniques and specific participants.
Software Requirement Interface Analysis evaluates the SRS with Hardware, User,
Operator and Software Requirements Documentation for Correctness, Consistency,
Completeness, Accuracy and Readability. It assures that all External Interfaces to the
Software and Internal Interfaces between Software Functions are completely and
correctly specified.
The System Test Plan has the primary goal of validating that the Software contains no
Defects or Omissions with respect to the Concept Document and the System Requirements
Specification. The Acceptance Test Plan validates that the Software complies with expectations
laid down in the Operational Concept, Functional Requirements and Quality Attributes.
Design Phase V & V
For Critical Software, this includes:
• Software Design Traceability Analysis
• Software Design Evaluation
• Software Design Interface Analysis
• Test Plan Generation including Component and Integration Tests
• Test Design Generations for Components, Integration, System and Acceptance Tests
During Design Phase, the designs for Architecture, Software Components,
Interfaces and Data are created, documented and verified to satisfy Requirements.
Removing Errors at this stage will substantially reduce Defects at the Coding stage. Design can
be a multiple-step process.
V & V tasks at this Phase consider the following :
• Responsibilities levied by the Project
• Design Methodology
• Design Standards
• Critical sections of the Design
• Design Assumptions needing Proofs
• Any Complex Algorithm needing extensive Analysis
• Resource Restrictions, if any
• Database Privacy with Security
• The Level of Design e.g. Safety Integrity Level (SIL) for Railway Signalling.
• Different approaches needed for Component and Integration Tests.
Software Design Interface Analysis considers the following :
• Is the use of Data items consistent and complete?
• Is the Interface correct, complete and necessary?
• Are the Data items used correctly in the Design elements?
• Are the System Resources being used properly in Data Transfer?
• Is the Interface Design understandable to the User?
• Will the Software detect User Errors and provide Help?
• Has a Prototype been developed for a Critical Interface? If so, has it been validated?
• Are the Interfaces designed for effective Configuration Management?
• Is an Interface needed where there is none?
Implementation Phase V & V
During this Phase a Software product is created from the Design Documents and then
debugged. The V & V Task at this stage is focussed on the Code and determines how well it
conforms to the Design Specifications. Quality of the Code can be checked in different
ways. The Programme Interfaces are analyzed and compared to the Interface Design Documentation.
For Critical Software, Implementation Phase V & V addresses the following topics :
• Source Code Traceability Analysis
• Source Code Evaluation
• Source Code Interface Analysis
• Source Code Documentation Evaluation
• Test Case Generation for Component, Integration, System and Acceptance Tests
• Test Procedure Generation including Component, Integration and System Tests
• Component Test Evaluation
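As one concrete illustration of Component Test Generation and Evaluation, a small component — here a hypothetical signalling helper, not taken from any real system — can be exercised by self-checking tests:

```python
import unittest

def signal_aspect(track_occupied: bool, route_locked: bool) -> str:
    """Hypothetical component: decide a signal aspect from two inputs."""
    if track_occupied or not route_locked:
        return "RED"      # fail-safe: any doubt keeps the signal at danger
    return "GREEN"

class SignalAspectTest(unittest.TestCase):
    """Component tests: each expected behaviour becomes one checkable case."""

    def test_fail_safe_when_track_occupied(self):
        self.assertEqual(signal_aspect(True, True), "RED")

    def test_fail_safe_when_route_unlocked(self):
        self.assertEqual(signal_aspect(False, False), "RED")

    def test_clear_only_when_safe(self):
        self.assertEqual(signal_aspect(False, True), "GREEN")
```

Running such a suite (e.g. with `python -m unittest`) gives the Component Test Evaluation an objective pass/fail record, which is what makes the test results auditable during V & V.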
When planning Source Code Evaluation, the points to be considered are :
• Using Criticality Analysis of the Code components to determine which Code to Evaluate
• Ensuring that the Coding Standard is understood
• Ensuring that the Coding Standards are available to the Staff before coding starts
• Defining how to evaluate Code Quality Attributes
• Identifying Code Analysis Tools that may be used
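The idea of a Code Analysis Tool checking conformance to a Coding Standard can be sketched as below. The two rules — a line-length limit and a ban on tab indentation — are illustrative assumptions, not any particular standard:

```python
# Toy static checker: flags lines that break two illustrative coding rules.
# Real tools (lint-style analysers) apply the project's actual Coding Standard.
MAX_LINE = 79

def check_source(lines):
    """Return a list of (line_number, message) violations."""
    violations = []
    for num, line in enumerate(lines, start=1):
        if len(line.rstrip("\n")) > MAX_LINE:
            violations.append((num, "line exceeds %d characters" % MAX_LINE))
        if "\t" in line:
            violations.append((num, "tab character used for indentation"))
    return violations

sample = ["x = 1\n", "\ty = 2\n", "z = " + "1" * 100 + "\n"]
for num, msg in check_source(sample):
    print("line %d: %s" % (num, msg))
```

Automating such checks frees the human evaluators to concentrate on the Code Quality Attributes that cannot be measured mechanically.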
Source Code Documentation Evaluation ensures that the Documents correctly
reflect the actual Source Code. Code-related Documents can
include the Programme Support Manuals, User Manuals and Operator’s Manuals. While
planning for this activity, the points to be considered are :
• Reviewing for new Functionality in addition to Technical Accuracy, Completeness and Readability
• Ensuring that the Documents are written for the appropriate level of Users
• Ensuring that Index, Table of Contents, Glossary, Heading, Format etc. are correct
• Making sure that the Documentation can handle the Correction Process
• Making sure that there will be follow up action for updating the Draft Documents
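Part of this evaluation — checking that the manuals still name the functions the code actually provides — can be mechanised as a cross-check. The function names and manual text below are hypothetical:

```python
import re

def documented_names(manual_text):
    """Extract function names mentioned in a manual in the form name()."""
    return set(re.findall(r"\b(\w+)\(\)", manual_text))

# Hypothetical example: names exported by the code vs. names in the User Manual.
code_functions = {"start_logging", "stop_logging", "rotate_log"}
manual = "Call start_logging() at boot and stop_logging() at shutdown."

undocumented = code_functions - documented_names(manual)   # in code, not in manual
stale = documented_names(manual) - code_functions          # in manual, not in code

print("Not in manual:", sorted(undocumented))
print("Stale entries:", sorted(stale))
```

Either mismatch set being non-empty signals that the Documents no longer reflect the Source Code and need the follow-up updating action described above.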
Besides the above phases, V & V Tasks are also spread over the Test, Installation as
well as Operation and Maintenance Phases.
These notes on Software Engineering were prepared with reference to the following books :
1. Software Engineering – A Practitioner’s Approach by Roger S. Pressman
2. Software Engineering – by Ian Sommerville
3. Software Engineering – IEEE Standards 1059 (V & V), 1074 (Life Cycle Processes) and 730 (Quality Assurance Plans)
Most of the typing was done by my wife Gouri Pal and daughters Kasturi & Poushali.