Software Independent Verification & Validation (IV&V) is a systems engineering process employing rigorous methodologies for evaluating the correctness and quality of the software product throughout the software life cycle.
In the 1990s, the Commanding General of the Army's Operational Test and Evaluation Agency noted that 90 percent of systems that were not ready for scheduled operational tests had been delayed by immature software.
Data provided by the Army's Software Metrics Newsletter "Insight," Winter 1997: http://www.armysoftwaremetrics.org/documents/INSIGHT/winter97.pdf
The Standish Group has examined 30,000 software projects in the US since 1994. This "CHAOS" research has revealed a decided improvement in IT project management with the implementation of standards and practices such as IV&V. This improvement correlates with the rise in project success depicted in the chart below:
Project Resolution History (1994-2000) The Standish Group International, Inc.: Extreme CHAOS (2001) - The 2001 update to the CHAOS report. http://www.standishgroup.com/sample_research/PDFpages/extreme_chaos.pdf
The Carnegie Mellon Software Engineering Institute [1] reports that 42 to 50 percent of software defects originate in the requirements phase.
The Defense Acquisition University's Program Manager magazine [2] reports a Department of Defense study finding that over 50 percent of all software errors originate in the requirements phase.
[1] Carnegie Mellon Software Engineering Institute, The Business Case for Requirements Engineering, RE'2003, 12 September 2003
[2] Defense Acquisition University, Program Manager magazine, Nov-Dec 1999, "Curing the Software Requirements and Cost Estimating Blues"
Early error detection and correction are vital: the cost to correct a software error multiplies as the development life cycle progresses, so finding and fixing errors early saves both money and time.
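The compounding effect can be sketched in a few lines. Note that the relative cost factors below are purely illustrative placeholders, not figures from the Dabney/Barber study or any NASA data; they only model the pattern that a defect grows more expensive to fix the later it is found.

```python
# Illustrative only: these relative cost factors are hypothetical, chosen
# solely to show how rework cost compounds across life-cycle phases.
RELATIVE_FIX_COST = {
    "requirements": 1,     # defect fixed in the phase where it was introduced
    "design": 5,
    "implementation": 10,
    "test": 20,
    "operations": 100,
}

def rework_cost(defects_found: dict[str, int], unit_cost: float = 1.0) -> float:
    """Total rework cost, given how many requirements-phase defects
    were caught in each (possibly later) phase."""
    return sum(
        count * RELATIVE_FIX_COST[phase] * unit_cost
        for phase, count in defects_found.items()
    )

# Catching ten requirements defects early versus in operations:
early = rework_cost({"requirements": 10})
late = rework_cost({"operations": 10})
print(early, late)  # 10.0 1000.0
```

Under these assumed factors, the same ten defects cost 100 times more to correct in operations than in the requirements phase, which is the economic argument for starting IV&V early.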
Direct Return on Investment of Software Independent Verification and Validation: Methodology and Initial Case Studies, James B. Dabney and Gary Barber, Assurance Technology Symposium, 5 June 2003.
Establish and apply criteria, tools, and methodology to evaluate and assess software risk and identify the appropriate level of IV&V
Task the NASA IV&V Facility in Fairmont, WV, to manage the performance of all IV&V for software in Provide Aerospace Products and Capabilities (PAPAC) programs and projects identified per the above criteria, and for any other safety-critical software (as defined in NASA-STD-8719.13B)
Require NASA programs and projects that contain mission or safety critical software to document decisions concerning the use of IV&V
Responsibilities are delineated for the Chief Safety and Mission Assurance Officer, Chief Engineer, Chief Information Officers, Mission Office Associate Administrators (AAs), Governing Program Management Councils (GPMCs), and IV&V.
GPMCs will review results of the software IV&V process to assure that it meets project needs
NPD 2820.1C, Software IV&V Policy, states: "Task the IV&V Facility in Fairmont, West Virginia to manage the performance of all IV&V for software identified per the established criteria, and for any other safety critical software (as defined in NASA-STD-8719.13)"
How is IV&V Started?
Independent Verification & Validation Overview
The Work Breakdown Structure presented on the following slides represents the nominal set of tasks that will be performed on some part of the software in each mission type.
Work Breakdown Structure Key
Human – Missions where human life is at risk (human-rated)
Robotic – Missions that are not human-rated
Instrument – Missions where NASA is responsible for only an instrument, not the complete mission
Data Analysis – Other systems that support the NASA mission, such as Integrated Financial Management
Work Breakdown Structure

IV&V Services                                         Human  Robotic  Instrument  Data Analysis
1.0 Phase Independent Support
1.1 Management and Planning of IV&V                     X       X         X            X
1.2 Issue and Risk Tracking                             X       X         X            X
1.3 Final Report Generation                             X       X         X            X
1.4 IV&V Tool Support                                   X       X         X            X
1.5 Management and Technical Review Support             X       X         X            X
1.6 Criticality Analysis                                X       X         X            X
1.7 Identify Process Improvement Opportunities
    in the Conduct of IV&V                              X       X         X            X
2.0 Concept Phase
2.1 Reuse Analysis                                      X       X         X            X
2.2 System Architecture Assessment                      X       X         X
2.3 System Requirements Review                          X       X
2.4 Concept Document Evaluation                         X
2.5 Software/User Requirements Allocation Analysis      X
2.6 Traceability Analysis                               X
Work Breakdown Structure (cont.)

IV&V Services                                         Human  Robotic  Instrument  Data Analysis
3.0 Requirements Phase
3.1 Traceability Analysis – Requirements                X       X         X            X
3.2 Software Requirements Evaluation                    X       X         X            X
3.3 Interface Analysis – Requirements                   X       X         X            X
3.4 System Test Plan Analysis                           X       X         X            X
3.5 Acceptance Test Plan Analysis                       X       X
3.6 Timing and Sizing Analysis                          X
4.0 Design Phase
4.1 Traceability Analysis – Design                      X       X         X            X
4.2 Software Design Evaluation                          X       X         X            X
4.3 Interface Analysis – Design                         X       X         X            X
4.4 Software FQT Plan Analysis                          X       X         X            X
4.5 Software Integration Test Plan Analysis             X       X         X            X
4.6 Database Analysis                                   X       X         X            X
4.7 Component Test Plan Analysis                        X       X
4.8 Data Flow Analysis                                  X
Work Breakdown Structure (cont.)

IV&V Services                                         Human  Robotic  Instrument  Data Analysis
5.0 Implementation Phase
5.1 Traceability Analysis – Code                        X       X         X            X
5.2 Source Code and Documentation Evaluation            X       X         X            X
5.3 Interface Analysis – Code                           X       X         X            X
5.4 System Test Case Analysis                           X       X         X            X
5.5 Software FQT Case Analysis                          X       X         X            X
5.6 Software Integration Test Case Analysis             X       X         X            X
5.7 Acceptance Test Case Analysis                       X       X
5.8 Software Integration Test Procedure Analysis        X       X
5.9 Software Integration Test Results Analysis          X       X
5.10 Component Test Case Analysis                       X       X
5.11 System Test Procedure Analysis                     X*
5.12 Software FQT Procedure Analysis                    X*
6.0 Test Phase
6.1 Traceability Analysis – Test                        X       X         X            X
6.2 Regression Test Analysis                            X
6.3 Simulation Analysis                                 X
6.4 System Test Results Analysis                        X       X
Work Breakdown Structure (cont.)

IV&V Services                                         Human  Robotic  Instrument  Data Analysis
6.0 Test Phase (cont.)
6.5 Software FQT Results Analysis                       X       X
7.0 Operations and Maintenance Phase
7.1 Operating Procedure Evaluation                      X
7.2 Anomaly Evaluation                                  X
7.3 Migration Assessment                                X
7.4 Retirement Assessment                               X
8.0 Optional Tasks
8.1 Acceptance Test Design Analysis
8.2 Acceptance Test Procedure Analysis
8.3 Acceptance Test Results Analysis
8.4 Algorithm Analysis
8.5 Audit Support
8.6 Component Test Design Analysis
8.7 Component Test Procedure Analysis
8.8 Component Test Results Analysis
8.9 Configuration Management Assessment
8.10 Disaster Recovery Plan Assessment
Work Breakdown Structure (cont.)

8.0 Optional Tasks (cont.)
8.11 Feasibility Study Evaluation
8.12 Independent Testing
8.13 Operational Evaluation
8.14 Performance Monitoring
8.15 Project Management Oversight Support
8.16 Security Assessment
8.17 Software FQT Design Analysis
8.18 Software Integration Test Design Analysis
8.19 System Test Design Analysis
8.20 Training Documentation Evaluation
8.21 User Documentation Evaluation
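The applicability matrix above is naturally a lookup from task to mission types. As a minimal sketch, the snippet below encodes a small sample of tasks from the tables (not the full WBS; the applicability shown follows the reconstructed matrix) and answers "which tasks apply to this mission type?".

```python
# Partial sample of the WBS applicability matrix: each task maps to the set
# of mission types it applies to (Human, Robotic, Instrument, Data Analysis).
# Only a few representative rows are encoded here for illustration.
WBS = {
    "2.1 Reuse Analysis": {"Human", "Robotic", "Instrument", "Data Analysis"},
    "2.3 System Requirements Review": {"Human", "Robotic"},
    "2.4 Concept Document Evaluation": {"Human"},
    "3.6 Timing and Sizing Analysis": {"Human"},
    "4.7 Component Test Plan Analysis": {"Human", "Robotic"},
}

def tasks_for(mission_type: str) -> list[str]:
    """Return the sampled WBS tasks applicable to a given mission type."""
    return sorted(task for task, kinds in WBS.items() if mission_type in kinds)

print(tasks_for("Robotic"))
```

A robotic mission picks up the all-mission tasks plus the Human/Robotic ones, while a data-analysis system receives only the tasks marked for all four columns, mirroring how the nominal task set narrows as mission criticality decreases.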
Software Independent Verification & Validation (IV&V) is a systems engineering process employing rigorous methods for evaluating the correctness and quality of the software product throughout the software life cycle
Software IV&V is executed across the full project life cycle
Software IV&V is an adaptive process based on the characteristics of a project
Software IV&V is a value-added approach to ensuring that software is fit for operations and meets its requirements for safety, availability, and function; IV&V shares with the project the common goal of mission success