  • 1. NASA Engineering of Systems Excellence Initiative
S. J. Kapurch, Program Executive Officer, Systems Engineering
7 Feb 2007
  • 2. Purpose
Overview
– OCE initiatives
– Policy in context of other efforts
Multi-faceted approach
– Highlight key elements
– Integration of related efforts
  • 3. The Environment
Acquisition streamlining of the 1990s – the good, the bad, and the ugly.
Nature abhors a vacuum – ISO 15288, IEEE 1220, EIA 632 = standards quagmire.
Maj. Gen. Craig Cooning, the Air Force's Director of Space Acquisition at the Pentagon: "We walked away from some of the things that served us very well. When you did away with that, you did away with the common language that engineers spoke. So, what we are trying to do is to reinstate those processes that served us well – not to go overboard – but to do it selectively." (21 October 2004)
Like other organizations, NASA tossed out many processes and procedures during acquisition streamlining.
  • 4. Systems Engineering Issues
Failure Review Boards have consistently pointed to lack of SE.
CAIB: "Organizational causes of this accident … cultural traits … detrimental to safety … allowed to develop including: … reliance on past success as a substitute for sound engineering practices …" – CAIB, Executive Summary
Other recent NASA projects:
– "… the root cause was not caught by the processes implemented by the project"
– "CDR was too high level to adequately assess design … too little time to perform an adequate assessment"
– "… Although training was widely available, poor requirements are still common"
– "… likely cause for the failure of the system was a faulty design … switches … improperly installed on a circuit board"
– "… the mishap occurred mainly because of failures in SE process … and is known to be the cause of several recent failures"
  • 5. Complexity Is a Major Issue
Integration of systems creates a major problem with complexity:
– As more systems are added, the interfaces grow in a non-linear fashion.
– Many of the existing systems were not built for these interfaces.
– Conflicting or missing interface standards make it hard to define interface interactions.
Systems engineering must deal with this complexity:
– End-to-end systems engineering is needed, including "reengineering" of legacy systems.
– Robust M&S, verification, and validation testing are a must.
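The non-linear growth in interfaces can be made concrete with the standard worst-case pairwise count. This is a simplifying assumption (real architectures rarely require every pair of systems to interface), sketched here in Python:

```python
def pairwise_interfaces(n_systems: int) -> int:
    """Worst-case number of point-to-point interfaces among n systems: n(n-1)/2."""
    return n_systems * (n_systems - 1) // 2

# Doubling the number of systems roughly quadruples the potential interfaces:
for n in (2, 4, 8, 16):
    print(f"{n:2d} systems -> {pairwise_interfaces(n):3d} potential interfaces")
```

The quadratic trend is why adding "just one more" legacy system to an integrated architecture can add many new interface definitions, not one.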
  • 6. Specific Needs
– Consistency in basic approach to systems engineering.
– Agency-wide framework of recognized best practices that guides the engineering of program and project products and capabilities.
– Common systems engineering terminology and definitions to enhance communication and collaboration among engineering teams across the Agency and with external partners and customers.
– Basis for assessing and continuously improving systems engineering capabilities.
  • 7. Response: Systems Engineering Excellence Initiative
Systems Engineering Working Group (SEWG)
– POC at Centers and MDs for SE efforts.
– Plan, develop, and execute the Initiative.
– Coordinate project products within members' Centers.
Engineering Management Board (EMB)
– Provide project oversight and approvals.
Mission Directorates & Centers
– Provide members of the SEWG.
– Provide needed support for reviews, pilots, and assessments.
– Verify suitability for accomplishing programs and projects.
Customers
– NASA engineering community
– Advanced technology teams
– Payload developers
– OCE
– MD and Center management
– Program and project managers
– External partners
  • 8. Response: The License
SEWG Charter: The SEWG is chartered by the EMB, in support of the Strategic Plan, to develop and implement a common framework for systems engineering in NASA.
Terms of Reference: "… This Framework will describe the requirements for SE processes required to engineer products and capabilities …"
  • 9. Goals
– Stimulate and enable the development and advancement of a sound systems engineering capability necessary for success in fulfilling NASA's missions.
– Ensure continuous improvement of the NASA engineering workforce through relevant education, training, and work experiences.
– Ensure sound and effective discipline and systems engineering Agency-wide.
– Provide value-added cross-cutting products and services that enable the infusion of technology, knowledge, and capabilities to support innovation in engineering and push the state of the art.
– Increase participation, membership, and leadership in recognized national and international engineering organizations.
– Integration of software.
  • 10. Expected Benefits
Enable and foster excellence in systems engineering capabilities to:
– Formulate feasible program and project concepts.
– Deliver required products and services to NASA customers.
– Make timely acquisition of enabling products and critical technologies.
– Reduce risk in system development and deployment.
Enable more effective communications and collaboration within NASA and with external partners and customers.
Conduct effective assessment and improvement of systems engineering capabilities.
Change the culture to represent the needs of one NASA, not the unique needs of a particular Center.
Develop strategic focus for advanced engineering environments.
  • 11. What Is Systems Engineering?
"Common people separated by a different language" – Winston Churchill
  • 12. [Figure: the DAU "Integrated Defense Acquisition, Technology, & Logistics Life Cycle Management Framework" wall chart (draft ver. 4.7, May 21, 2004), a classroom aid for Defense Acquisition University students. It gives a notional illustration of the interfaces among the three DoD decision support systems – Joint Capabilities Integration & Development; Defense Acquisition; and Planning, Programming, Budgeting & Execution – across the acquisition phases from Concept Refinement and Technology Development through System Development & Demonstration, Production & Deployment, and Operations & Support, with their associated milestones, reviews, and documentation.]
  • 13. Engineering Excellence Framework
Consistent approach at all levels.
[Diagram: capability as a three-dimensional volume spanned by Concepts & Processes, Tools & Methods, and Knowledge & Skill of Workforce.]
– Workforce: experienced, well trained, application.
– Continuous improvement: metrics.
A multi-dimensional problem requires a 3-D solution.
  • 14. And we will get it right laterAnd fix it with software
  • 15. Concepts & Processes
Develop a NASA systems engineering policy to:
– Provide consistency across the Agency.
– Advance practice in the Agency; take advantage of lessons learned from other organizations.
– Address findings and results from numerous studies and FRBs/mishaps (CAIB/Diaz, NIAT, others).
  • 16. Policy Approach
Develop requirements for the NPR:
– Detailed research into the Failure Review Board reports, NIAT, etc.
– Analyzed results of the pre-assessment study.
– Reviewed SE at other organizations such as LMI, Boeing, RAYCO, N-G, NGA, DoD, SMC, NRO, USAF, USN, INCOSE, NDIA, ….
Assess systems engineering best practices.
Evaluate systems engineering at NASA.
Integrate the best SE practices for NASA; modeled after and integrated with 7120.5C, the 8700 series, and the SW NPR.
Process: conduct four workshops; hold technical exchanges and conduct reviews (internal and external).
Products: the NPR, as well as others as required.
1. Make it a value-added proposition, not an overhead burden, to the projects, missions, Centers, and Agency practitioners.
2. Do it right up front, the first time, instead of "heroes saving the day."
  • 17. Objective of NPR
Develop an Agency NPR to:
– Transform systems engineering from a task performed by individuals with the title "systems engineer" to a logical systems approach performed by multi-discipline teams to engineer and integrate NASA's systems.
– Provide consistency across the Agency.
– Advance practice in the Agency; take advantage of lessons learned from Centers and other organizations.
– Address findings and results from numerous studies and FRBs/mishaps.
  • 18. Requirements Philosophy and Objectives
NASA's policy is to establish, document, and promulgate internal NASA requirements where necessary to fulfill the Agency's vision, mission, and external mandates. (NPD 1400)
Agency level:
– Ensure consistency.
– Ability to work between Centers.
Written requirements establish the baseline for:
– Performing activities.
– Measuring compliance and effectiveness of that performance.
– Capturing and disseminating corporate knowledge.
– Codifying lessons learned.
Verba volant – scripta manent (what is spoken flies – what is written remains).
  • 19. NPR 7123 Approach
[Diagram: development flow for the SE NPR. Off-Site 1: baseline team, define objectives, draft annotated outline. Off-Site 2 ("meat on the bones"): develop processes and models, update NPR 6105, run scenarios. Off-Site 3: partial draft, revised outline, draft appendices. Off-Site 4: review draft, complete sections. A 75% NPR draft then passes through Red and Comp teams and pre-NODIS preparation and briefings to an approved NPR in NODIS, with other products (updated standards, SE process elements, models) produced along the way.]
  • 20. Workshop I
Workshop I: senior experts; OCE approved the Workshop II attendees.
Captured valuable nuggets:
– DoD and industry are developing robust SE processes, enforced and supported by leadership.
– NASA has many disconnected pockets; each Center has its own way, but there are similarities and commonalities.
– NASA needs a top-down directed approach – an Enterprise (OCE) Architecture.
– It is important that PM, software, and systems engineering be well integrated – crucial for development of complex systems.
– Converging on a new enterprise process, framework, and consistent processes across NASA represents significant culture change; change of this magnitude takes time and persistence.
A change in culture is required that promotes and works towards using a systems approach by all disciplines, not just systems engineers!
Attendees (partial list): OSD – Mark Schaffer; USAF – Col. Mike Holbert; NRO – Rob Klotz; USAF SMC – Col. Rocky Dewan; NSSO – Capt. Marsden; USN – Zig Rafalek; Raytheon/NDIA – Robert Rossa; Lockheed – Paul Robataille; Boeing – Dev Banerjee; Northrop Grumman – James van Gaasbeek; INCOSE – John Snoderly; NASA HQ (OCE) – Rex Geveden; NASA ESMD – Ellen Stigberg; NASA HQ (OSMA) – Wilson Harkins; NASA Stennis – Christine Powell; NASA HQ (SOMD) – Stan Fishkind; NASA JPL – Steve Wall; NASA LaRC – Al Motley; NASA MSFC – Herb Shivers; NASA JSC – Linda Bromley; NASA NESC – Peggy Chun; NASA APIO – Steve Cavanaugh; consultants – Jerry Lake/SMi, Jalal Mapar/SAIC.
  • 21. Workshops II–IV
NASA-only deliberations.
Criteria for the NPR developed.
Preliminary and final results of the CMMI pre-assessment.
Adopted 17 processes as the basis for the engine.
First cycle of the validation process for the NPR:
– The NPR draft was tested against the four scenarios (representing types of projects/science).
– Evaluated how the NPR will apply to each type.
– Tests raised some fundamental questions that resulted in rewriting of sections: design reviews, applicability of concepts, product-line life cycle.
Context for the document and training to accompany the NPR:
– Make/buy discussions; leverage other activities (APPEL).
– Online.
  • 22. "What We Found": Lack of Uniform Understanding of SE in the Community at Large
– No single definition or agreement on the scope of SE.
– Lack of common understanding of how SE is implemented on programs: Is SE done by the systems engineer? Does the systems engineer lead the SE effort?
– No uniform understanding of what makes a good systems engineer.
– No consistent set of metrics or measures to quantify the value of SE.
– Cost and schedule estimation and risk management processes inconsistently aligned with SE processes.
– Resistance to harmonization of multiple standards and models.
– Multiple practitioner communities not aligned: software vs. hardware; information technology; aircraft vs. rocket developers; telecommunications; submarine propulsion vs. ship designers.
  • 23. NPR Criteria
1. Consistent with existing and emerging NASA policy.
2. Should have a track record of successful implementation to engineer systems.
3. Within the purview of SE.
4. Written at a level that states "what," not "how" (not overly prescriptive).
5. Within the scope of responsibility of [actors].
6. Must be a "requirement" denoted by a "shall," rather than guidance/informative.
7. Must be verifiable with objective evidence.
8. Allows appropriate tailoring (appendices to define tailoring).
9. Improves engineering performance.
10. Supports the NASA SE initiatives and ongoing improvement activities.
11. Reasonable confidence that projects can accomplish.
12. Necessary for consistency across the Agency.
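Criterion 6 lends itself to a toy illustration: a screening check that separates "shall" statements (requirements) from guidance. This is a sketch only, and the statement texts below are invented examples, not actual NPR language:

```python
import re

def is_requirement(statement: str) -> bool:
    """Criterion 6: a requirement is denoted by 'shall'; 'should'/'may' mark guidance."""
    return re.search(r"\bshall\b", statement, re.IGNORECASE) is not None

# Hypothetical statements for illustration only:
statements = [
    "The project shall develop a Systems Engineering Management Plan.",
    "The technical team should consider reuse of legacy components.",
]
for s in statements:
    label = "REQUIREMENT" if is_requirement(s) else "guidance"
    print(f"{label}: {s}")
```

A real verification matrix would also need criterion 7 (objective evidence for each "shall"), which no text filter can supply.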
  • 24. Key Strategies in NASA SE Policy
– Recognize differences in types of NASA projects.
– Define a standard design review approach within a common life cycle definition, based on product line.
– Standard SE process that can be applied to any system regardless of scope and scale.
  • 25. SE NPR
The NPR is a high-level NASA policy and requirements document to support program and project management.
– Process oriented.
– "What to do" vice "how to."
– Technical input.
– Flow down to Center directives.
  • 26. PARADIGM SHIFT REQUIRED: Profile of NPR Target Audience
[Diagram: adoption curve for software advances, running from Early Adopters through Progressive Users and Slow Adopters to Entrenched Resisters.]
  • 27. Scope: SE-Related Standards
[Diagram plotting level of detail against breadth of scope for SE standards:]
– CMMI (includes SW engineering)
– IEEE 1220, Application & Management of the SE Process
– ANSI/EIA 632, Processes for Engineering a System
– ISO/IEC 15288, System Life Cycle Processes
– MIL-STD-499B & EIA IS 632, Systems Engineering
– Envisioned NASA NPG
  • 28. 7123 Structure
Preface: Describes applicability, scope, authority, and references.
Prologue: Describes the purpose and vision for this SE NPR.
Chapter 1: Describes the SE framework and introduces the SEMP.
Chapter 2: Describes institutional and programmatic requirements, including roles and responsibilities.
Chapter 3: Describes the core set of common technical processes and requirements for engineering NASA system products throughout the product life cycle. (Appendix C contains supplemental amplifying material.)
Chapter 4: Describes activities and requirements to be accomplished by NASA technical teams or individuals (NASA employees and their service support contractors) when performing technical oversight of a prime or external contractor.
Chapter 5: Describes technical reviews throughout the SE life cycles, with clear differentiation between management reviews and engineering reviews, and exit and entrance criteria.
Chapter 6: Describes the SEMP, including its role, functions, and content. (Appendix D provides details of a generic SEMP and an annotated outline.)
Appendices: A, Definitions; B, Acronyms; C, Practices; D, SEMP; E, Hierarchy; F, Tailoring; G, Technical Reviews; H, Templates; I, Additional Reading; J, Index.
  • 29. Requirements Flowdown – 17 Common Technical Processes
System design processes (applied top-down to each WBS model):
1. Stakeholder Expectations Definition
2. Technical Requirements Definition
3. Logical Decomposition
4. Design Solution Definition
Product realization processes (applied bottom-up):
5. Product Implementation
6. Product Integration
7. Product Verification
8. Product Validation
9. Product Transition
Technical management processes:
10. Technical Planning
11. Requirements Management
12. Interface Management
13. Technical Risk Management
14. Configuration Management
15. Technical Data Management
16. Technical Assessment
17. Decision Analysis
Requirements flow down from the WBS model above (or from the user) to the WBS models below (or to the implementation process); realized end products flow back up from the WBS models below to the model above (or to the user).
  • 30. 17 Common Processes – Flow Example
[Figure: partial work breakdown of the Space Transportation System. The STS decomposes into the Orbiter, ET, and SRB; the Orbiter into Crew, Propulsion, Payload Bay, etc.; the SRB includes the Aft Skirt and Nose; the ET includes the O2 Tank and H2 Tank; the Crew element includes the Cabin, ECLSS, Galley, and Avionics; Avionics includes Computers, Transponder, and Antenna. System design is applied top-down through this hierarchy; product realization proceeds bottom-up.]
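The top-down/bottom-up flow can be sketched as a tree traversal (a hypothetical helper, not NASA code): a pre-order visit corresponds to top-down system design, a post-order visit to bottom-up product realization.

```python
# Illustrative sketch: design each WBS model before its children (top-down),
# realize it only after all its children are realized (bottom-up).
def design_then_realize(node, children):
    log = [("design", node)]                  # design this WBS model first
    for child in children.get(node, []):      # then recurse into lower WBS models
        log += design_then_realize(child, children)
    log.append(("realize", node))             # realize after all children exist
    return log

# Partial STS breakdown from the slide above.
tree = {
    "STS": ["Orbiter", "ET", "SRB"],
    "Orbiter": ["Crew", "Propulsion", "Payload Bay"],
    "ET": ["O2 Tank", "H2 Tank"],
}

order = design_then_realize("STS", tree)
```

Design starts at the top of the hierarchy and realization finishes there, matching the flow the figure shows.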
  • 31. Iterate Processes Throughout the Life Cycle
[Figure: across the generic system life cycle, from project formulation through implementation, the same system structure of WBS models recurs, and the top-down system design / bottom-up product realization cycle is applied repeatedly to each WBS model at every phase.]
  • 32. System of Interest & Hierarchy of Systems
[Figure: hierarchy of systems — Program, Project, Subsystem, Component — with the "system of interest" definable at each level.]
The systems engineering process can be applied at any level of the systems hierarchy.
  • 33. Engineering Excellence Framework
Consistent approach at all levels. Capability is a function of three dimensions:
– Concepts & Processes
– Tools & Methods
– Knowledge & Skill of the Workforce
Workforce: experienced, well trained, with application experience.
Continuous improvement: assessments.
A multi-dimensional problem requires a 3-D solution.
  • 34. Assessments
To establish a systems engineering capability assessment methodology that enables continuous process improvements in the engineering of systems Agency-wide:
– Metrics
– Maturity models
– Pilots
– Assessments
– Monitor
– Feedback results
– Updates as required
  • 35. Background: Pre-Assessment Response to Tasking
SEWG Assessment Sub-group
– Mission: To establish a systems engineering capability assessment methodology that enables continuous process improvements in the engineering of systems Agency-wide, with validation and documentation throughout the life cycle.
– Evaluate current SE and SWG capability assessment efforts for the CAIB/Diaz action.
The subgroup developed the approach, objectives, and plans to conduct SE pre-assessments in accordance with the SE Framework:
– Researched industry and other government agencies' approaches.
– Developed model selection methodology and criteria.
A tailored CMMI model was selected for the pre-assessments.
  • 36. Background: Models Examined
– ISO 15504: The international standard assessment methodology for systems engineering.
– EIA/IS 731: An Electronic Industries Alliance (EIA) standard that brings together the EPIC Systems Engineering Capability Maturity Model (SE-CMM) and the INCOSE Systems Engineering Capability Assessment Model (SECAM) into a single capability model, to minimize confusion within the industry and to relate the resulting capability model to the EIA-632 standard, Processes for Engineering a System.
– SE-CMM: The Carnegie Mellon University (CMU) Software Engineering Institute (SEI) capability maturity model for systems engineering.
– FAA-iCMM v2.0: The FAA's own CMMI-based model.
– CMMI v1.1 SE/SW: The latest CMU/SEI capability maturity model, which integrates systems engineering and software engineering.
  • 37. Objectives of the Pre-Assessment
– To acquire data to determine the state of systems engineering as practiced across the Agency, in terms of consistency across Centers, effectiveness, and efficiency.
– To determine gaps/deficiencies in the systems engineering discipline as practiced at the Agency, relative to the pre-assessment approach based on a tailored CMMI model.
– To form a baseline against which future assessments can determine overall trends and provide input/guidance for systems engineering process improvements.
– To determine whether the CMMI model meets future Agency systems engineering continuous improvement requirements.
  • 38. CMMI Appraisal Classes
Class A: SCAMPI
– Full, comprehensive method
– Thorough model coverage
– Provides maturity and/or capability levels
Class B
– Documentation and interviews
– Identifies gaps
– Estimate of goal satisfaction; no rating
Class C
– Documentation review
– Identifies gaps
– Estimate of goal satisfaction; no rating
  • 39. Overview of Pre-Assessment
Three projects were assessed at each Center during a five-day period.
A tailored CMMI model was used:
– 13 of 25 process areas
– Over 260 practices examined
– The focus was on systems engineering practices
Discovery-based with objective evidence:
– Findings were based on objective evidence answering "Can you demonstrate that you do a practice?" rather than just "Do you do a practice?"
– Evidence was provided by the project and examined by a pre-assessment team during interviews to demonstrate whether or not the practice was performed.
Rules for evidence and corroboration were relaxed from a formal appraisal, to enable completion with minimal impact to projects while still meeting NASA objectives.
Questions were asked of project leads and systems or other engineers to determine their perspectives on systems engineering, as a supplement to the tailored CMMI model discovery approach.
  • 40. Process Areas
Maturity level (focus): process areas. Quality increases with maturity level; level 1 is characterized by risk and rework.
– 5 Optimizing (Continuous Process Improvement): Organizational Innovation and Deployment; Causal Analysis and Resolution
– 4 Quantitatively Managed (Quantitative Management): Organizational Process Performance; Quantitative Project Management
– 3 Defined (Process Standardization): Organizational Process Focus; Organizational Process Definition; Organizational Training; Integrated Project Management; Risk Management; Requirements Development; Technical Solution; Product Integration; Verification; Validation; Decision Analysis and Resolution
– 2 Managed (Basic Project Management): Requirements Management; Project Planning; Project Monitoring and Control; Supplier Agreement Management; Measurement and Analysis; Process and Product Quality Assurance; Configuration Management
– 1 Performed: no assigned process areas (risk and rework)
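As an illustrative lookup (a sketch, not CMMI tooling), the staged levels and their focus areas from the table above can be encoded directly:

```python
# Illustrative sketch: CMMI staged maturity levels and their focus, as
# listed on the slide (level-1 "Performed" has no assigned process areas;
# it is characterized by risk and rework).
LEVEL_FOCUS = {
    5: ("Optimizing", "Continuous Process Improvement"),
    4: ("Quantitatively Managed", "Quantitative Management"),
    3: ("Defined", "Process Standardization"),
    2: ("Managed", "Basic Project Management"),
    1: ("Performed", "Risk & Rework"),
}

def is_quantitative(level):
    """Levels 4 and above rely on quantitative management of process performance."""
    return level >= 4
```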
  • 41. Detailed Analysis
– Tier One of the analysis results (a single overall radar chart) approximates an indicator of overall Agency performance.
– Tier Two (separate SP and combined GP radar charts) approximates how much of overall Agency performance is attributable to people actually doing the "right things" (the SPs), and how much of what they are doing was driven by structured prior planning (the GPs).
– Tier Three (two separate GG2 and GG3 radar charts) approximates how much of that structured prior planning was driven by project-level planning (GG2) and by Center-level or Agency-level policies and procedures (GG3).
– Tier Four used qualitative information on gaps and deficiencies, together with Pareto charts, as the basis for developing recommendations.
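The roll-up from practice ratings to tier indicators can be sketched as simple averaging (a hypothetical scoring scale; the pre-assessment's actual rating scheme is not specified on the slide):

```python
# Hypothetical sketch of the tiered roll-up: average 0-1 ratings for
# specific practices (SPs) and generic practices (GPs), then combine the
# two means into an overall Tier-One indicator.
def tier_scores(sp_ratings, gp_ratings):
    mean = lambda xs: sum(xs) / len(xs)
    sp = mean(sp_ratings)       # Tier Two: doing the "right things"
    gp = mean(gp_ratings)       # Tier Two: driven by structured prior planning
    overall = (sp + gp) / 2     # Tier One: overall performance indicator
    return {"overall": overall, "SP": sp, "GP": gp}

scores = tier_scores([1.0, 0.5, 0.75], [0.5, 0.5])
```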
  • 42. How To
Methodologies
– NASA BOK: update and expand the existing SP-6105
– Lessons learned
– References: best practices, standards, guides
– Templates
– E-Book
  • 43. Engineering Excellence Framework
Consistent approach at all levels. Capability is a function of three dimensions:
– Concepts & Processes
– Tools & Methods
– Knowledge & Skill of the Workforce
Workforce: experienced, well trained, with application experience.
Continuous improvement: assessments.
A multi-dimensional problem requires a 3-D solution.
  • 44. Office of the Chief Engineer Vision for Systems Engineering
Vision: A premier systems engineering capability, widely recognized for its leadership and expertise in the engineering of systems and subsystems, enabling NASA to provide leading-edge aerospace research, products, and services.
Mission: Develop and implement a common SE framework, and promote the environment for excellence and the revolutionary advancement of the systems engineering capability, to anticipate and meet the needs of NASA programs and projects.
  • 45. Questions