Focus on the nine I's (v9)


  1. 1. The Nine “I’s” of Program Success For any program to succeed, there are Nine Integration activities that must be in place and connected. Glen B. Alleman PrimePM Rick Price Lockheed Martin Tom Coonce Institute for Defense Analyses Government PMO Perspective Prime Contractor Perspective 1
  2. 2. The CPM Mission Statement … Share, promote, and advance the best of planning, control, and performance management for projects of all sizes and complexity. … Is Our Mission As Well The basis of this Mission is delivered through the Nine I's 2
  3. 3. We’re not here to show you HOW We’re here to show you WHY
  4. 4. Program Success  Program success is accomplishing the required objectives defined in the SOW on time and on cost  A program is a System Of Systems both technically and programmatically  Programmatic architecture, risk, and execution are just as important as technical performance 4
  5. 5. All the world's a system All the parts are separate All the parts are connected Practices Connect The Parts 5
  6. 6. Start with the end in Mind 6 Our sample program is a “sample return” program – Stardust to visit Comet Wild 2
  7. 7. We need the capability to …  Primary: collect dust particles from the comet  Secondary: Take pictures of the comet … – During our closest encounter, and – Get a picture of the nucleus  Secondary: Cometary Interstellar Dust Analyzer (CIDA) and Dust Flux Monitor Instrument (DFMI) – store the science data  Secondary: Analyze engineering and Doppler data 7
  8. 8. What could possibly go wrong?  Capturing and safely stowing particles  Surviving mission environments (including launch, comet encounter, and reentry)  Returning sample capsule to earth for scientific analysis  Must launch within 26 day launch window 8
  9. 9. Increasing the Probability of Program Success – Removing the Root Causes of Poor Program Performance through the IMP, IMS, and Integrated Risk Management:  Unrealistic performance expectations missing Measures of Effectiveness (MoE) and Measures of Performance (MoP)  Unrealistic cost and schedule estimates based on inadequate risk-adjusted growth models  Inadequate assessment of risk and unmitigated exposure to these risks without proper handling plans  Unanticipated technical issues without alternative plans and solutions to maintain effectiveness Diagram "borrowed" from Gary Bliss 9
  10. 10. The Nine “I’s” of Program Success 1. Integrated Master Plan . . . . . . . . . . . . . . . . . . (IMP) 2. Integrated Master Schedule . . . . . . . . . . . . . . (IMS) 3. Integrated Risk Management . . . . . . . . . . . . . (IRM) 4. Integrated Baseline Review . . . . . . . . . . . . . . . (IBR) 5. Integrated Team Structure . . . . . . . . . . . . . . . . (ITS) 6. Interface Control Document . . . . . . . . . . . . . . (ICD) 7. Integrated Program Management Report . .(IPMR) 8. Integrated Business Rhythm . . . . . . . . . . . . . (IBizR) 9. Integrated Supply Chain . . . . . . . . . . . . . . . . . . (ISC) 10
  11. 11. Let’s Start With Some Hard, Cold Facts 11
  12. 12. FACT  EVM is necessary but not sufficient for success  EVM measures cost and schedule performance FACT  The ANSI-748-B variance analysis guidance doesn’t speak to technical performance  Measures of Effectiveness (MoE), Measures of Performance (MoP), Key Performance Parameters (KPP), and Technical Performance Measures (TPM) are all Systems Engineering terms, not found in EVM Guidance 12
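The arithmetic behind those EVM cost and schedule measures is simple enough to sketch. The following is a minimal illustration, not material from the deck: the control-account numbers and the CPI-based EAC formula are assumptions chosen only to show how BCWS, BCWP, and ACWP roll up into CPI, SPI, and an estimate at completion.

```python
# Minimal sketch of the standard EVM indices. The control-account values
# below are hypothetical, chosen only to illustrate the arithmetic.

def evm_indices(bcws: float, bcwp: float, acwp: float, bac: float) -> dict:
    """Compute cost/schedule variances, indices, and a CPI-based EAC."""
    cpi = bcwp / acwp          # cost efficiency of work performed
    spi = bcwp / bcws          # schedule efficiency against the plan
    return {
        "CV": bcwp - acwp,     # cost variance
        "SV": bcwp - bcws,     # schedule variance
        "CPI": cpi,
        "SPI": spi,
        "EAC": bac / cpi,      # assumes future performance mirrors past CPI
    }

if __name__ == "__main__":
    # Hypothetical control account: $1.0M budgeted, 40% planned, 35% earned, $0.45M spent
    print(evm_indices(bcws=400_000, bcwp=350_000, acwp=450_000, bac=1_000_000))
```

Note that nothing in these indices says anything about technical performance, which is exactly the gap the slide points out.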
  13. 13. FACT  WBS is Paramount† – it shows: – what deliverables are needed for program success – who is needed to perform the work (Integrated Team Structure) – but not the risk by itself; we need to add risk  The IMP is the glue between the needed capabilities and the program implementation  The IMP shows what "done" looks like through measures of increasing maturity 13 † Borrowed from Gordon Kranz
  14. 14. FACT  All the programmatic parts are connected, just like the technical parts  The Programmatic Architecture is a System-of-Systems  You have to "break down" the problem the "Right way" – because everything is connected to everything else. 14
  15. 15. FACT  Risk management is how adults manage programs – Every “problem” in the past was either an unidentified or unmitigated risk  Everything has to be risk adjusted  Uncertainty drives Cost, Schedule, and Technical Performance  There are two types of uncertainty – Reducible (Epistemic) – Irreducible (Aleatory)  Not knowing the difference between these is a risk itself 15
  16. 16. PRINCIPLES Why we should be doing the Right thing
  17. 17. Risk Management – The Structure of the Nine "I's": SOW, SOO, ConOps  Technical and Operational Requirements from the SOW, SOO, and ConOps  WBS, CWBS & CWBS Dictionary  Integrated Master Plan (IMP)  Integrated Master Schedule (IMS)  Performance Measurement Baseline (PMB)  Earned Value Management System (EVMS)  Measures of Effectiveness (MoE), Measures of Performance (MoP), Measures of Progress, JROC Key Performance Parameters (KPP), Program-Specific Key Performance Parameters (KPP), Technical Performance Measures (TPM)  Objective status and essential views to support the proactive management processes needed to keep the program GREEN 17
  18. 18. Steps to Building a Risk-Tolerant Plan (PMB and Total Plan): 1. Capture all activities defined in the SOW, SOO, and CDRLs 2. Build the Integrated Master Plan 3. Sequence the activities from the IMP into the IMS 4. Estimate the duration of each activity 5. Assign resources to the Integrated Master Schedule 6. Build a Risk Register and quantify all the uncertainties 7. Set Management Reserve based on reducible uncertainties 8. Adjust the PMB for significant risks 9. Adjust the PMB for all uncertainties – reducible and irreducible 18
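As a hedged illustration of these steps, the sketch below builds a toy PMB in Python. Every activity, duration, cost rate, and risk is hypothetical, and sizing Management Reserve as the expected value of the reducible risks is only one of several defensible conventions.

```python
# Toy sketch of the risk-tolerant planning steps above. Activities, durations,
# and risks are hypothetical; a real program derives them from the SOW/SOO/CDRLs.

activities = [  # (name, deterministic duration in days, daily cost)
    ("Capture requirements", 20, 8_000),
    ("Design sample canister", 45, 12_000),
    ("Build & test canister", 60, 15_000),
]

risk_register = [  # reducible (epistemic) risks: (name, probability, cost impact)
    ("Supplier slips canister delivery", 0.30, 250_000),
    ("Thermal test failure forces redesign", 0.20, 400_000),
]

aleatory_duration_factor = 1.10   # irreducible variation handled as schedule margin

pmb_cost = sum(days * rate for _, days, rate in activities)
management_reserve = sum(p * impact for _, p, impact in risk_register)  # expected-value sizing
schedule_margin_days = sum(days for _, days, _ in activities) * (aleatory_duration_factor - 1)

print(f"PMB cost:            ${pmb_cost:,.0f}")
print(f"Management Reserve:  ${management_reserve:,.0f}  (from reducible risks)")
print(f"Schedule margin:     {schedule_margin_days:.0f} days (from irreducible variation)")
```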
  19. 19. Connecting the Dots Between IMP, IMS, and Risk – diagram linking the WBS, Program KPPs, TPMs, MoEs, MoPs, and EVM (ETC, EAC, Work Package Physical Percent Complete) to the IMP (PE, SA, AC) and IMS: technical and programmatic risks identified in the WBS, reducible uncertainty held in the Risk Register, irreducible uncertainty in reference classes, Schedule Margin per 81861, Cost Margin, risk retirement in the PMB, and MR and SR to cover unretired risk. 19
  20. 20. IMP Principles of the Integrated Master Plan
  21. 21. Integrated Master Plan (IMP) – context: SOW, SOO, ConOps, WBS, Technical and Operational Requirements, CWBS & CWBS Dictionary, Integrated Master Schedule (IMS), Technical Performance Measures, Earned Value Management System, Performance Measurement Baseline. The Integrated Master Plan (IMP) is an event-based plan consisting of a hierarchy of program events, with each event being supported by specific accomplishments, and each accomplishment associated with specific criteria to be satisfied for its completion. The IMP is normally part of the contract and thus contractually binding. The IMP is a narrative explaining the overall management of the program. Objective status and essential views support the proactive management processes needed to keep the program GREEN. 21
  22. 22. The Integrated Master Plan Tells Us Where The Program Is Going The Integrated Master Plan Is The Execution Strategy For The Successful Completion Of The Project 22
  23. 23. The IMP Demonstrates our understanding of the program’s requirements and the soundness of the approach represented by the plan The IMP is the single most important document to a program’s success 23
  24. 24. The IMP / IMS Structure – the IMP describes how program capabilities will be delivered and how these capabilities will be recognized as ready for delivery: Events or Milestones  Accomplishments  Criteria. The IMS adds the Work Packages and Tasks, plus Supplemental Schedules. 24
  25. 25. IMP PRACTICES Define increasing maturity for each program deliverable
  26. 26. IMP Building is a Full Contact Sport 26
  27. 27. Some Guidance 27
  28. 28. INCOSE VEE and the IMP – diagram of the systems engineering "V": interpret user needs, refine system performance specifications and environmental constraints (SRR); develop system functional specifications and the system verification plan (SFR); evolve functional performance specifications into CI functional ("design to") specifications and CI verification plans (PDR); evolve them into product ("build to") documentation and verification plans (CDR); fabricate, assemble, and unit test to the build-to documentation; verify individual CIs through DT&E; verify performance compliance to specifications through integrated DT&E (TRR); verify system functionality and constraint compliance through system DT&E; and demonstrate the system against specified user needs and environmental constraints (SVR, PRR) across system integration, system demonstration, and system development and demonstration. 28
  29. 29.  Program Event (PE) – A PE assesses readiness or completion as a measure of progress – First Flight Complete  Significant Accomplishment (SA) – The desired result(s) prior to or at completion of an event that demonstrate the level of the program's progress – Flight Test Readiness Review Complete  Accomplishment Criteria (AC) – Definitive evidence (measures or indicators) that verify a specific accomplishment has been completed – SEEK EAGLE Flight Clearance Obtained 29 F-22 Example
  30. 30. Quick View of Step-By-Step IMP: 1. Identify Program Events (PE) 2. Identify Significant Accomplishments (SA) 3. Identify Accomplishment Criteria (AC) 4. Identify Work Packages needed to complete the Accomplishment Criteria 5. Sequence the Work Packages (WP), Planning Packages (PP), and Summary Level Planning Packages (SLPP) in a logical network 6. Adjust the sequence of WPs, PPs, & SLPPs to mitigate major risks. 30
  31. 31. PEs assess the maturity of the program’s deliverables 31  Program Events are maturity assessment points in the program  They define what levels of maturity for the products and services are needed before proceeding to the next maturity assessment point  The entry criteria for each Event defines the units of measure for the successful completion of the Event  The example below is typical of the purpose of a Program Event The Critical Design Review (CDR) is a multi-disciplined product and process assessment to ensure that the system under review can proceed into system fabrication, demonstration, and test, and can meet the stated performance requirements within cost (program budget), schedule (program schedule), risk, and other system constraints. 1
  32. 32. SAs define the entry criteria for each Program Event 32 Preliminary Design Review Complete 2
  33. 33. ACs are the Exit Criteria for Work Packages that produce outcomes 33 Critical Design Review Complete 3
  34. 34. IMS PRINCIPLES The IMS shows how the work is sequenced to deliver the requirements
  35. 35. 35 Sow SOO ConOps WBS Techncial and Operational Requirements Techncial Performance Measures Earned Value Management System Performance Measurement Baseline CWBS & CWBS Dictionary Integrated Master Plan (IMP) Integrated Master Schedule (IMS) The Integrated Master Schedule (IMS) is an integrated, networked schedule containing all the detailed discrete work packages and planning packages (or lower level tasks or activities) necessary to support the events, accomplishments, and criteria of the IMP. The IMP events, accomplishments, and criteria are duplicated in the IMS. Detailed tasks are added to depict the steps required to satisfy criterion. The IMS should be directly traceable to the IMP and should include all the elements associated with development, production or modification, and delivery of the total product and program high level plan. Objective Status and Essential Views to support the proactive management processes needed to keep the program GREEN
  36. 36. Integrated Master Schedule (IMS)  The IMS describes the horizontal sequence of work activities performed to increase the maturity of the deliverables.  When each deliverable reaches its needed maturity it is considered complete.  The IMS functions as the program's "GPS" – Can we rely on what it's telling us to get where we want to go? 36
  37. 37. The IMS is Our GPS for Navigating the Program 37 http://www.youtube.com/watch?v=uwkaZTLpQ_c
  38. 38. The IMP/IMS provides Horizontal and Vertical traceability of progress to plan  Vertical traceability AC  SA  PE  Horizontal traceability WP  WP  AC Program Events Define the maturity of a Capability at a point in time. Significant Accomplishments Represent requirements that enable Capabilities. Accomplishment Criteria Exit Criteria for the Work Packages that fulfill Requirements. Work Package Work Package Work Package Work Package Work Package Work Package Work package 38
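One way to make that traceability concrete is to treat the IMP and IMS as linked data structures and check that every Accomplishment Criterion is closed by at least one work package. The sketch below is a minimal, hypothetical illustration; the PE, SA, AC, and WP names are invented and not taken from any real program.

```python
# Minimal sketch of vertical traceability (WP -> AC -> SA -> PE). The event,
# accomplishment, and criteria names are hypothetical placeholders.

imp = {
    "PE-01 Critical Design Review Complete": {
        "SA-01 Preliminary Design Validated": [
            "AC-01 Thermal analysis complete",
            "AC-02 Mass properties within margin",
        ],
    },
}

# IMS work packages, each tagged with the Accomplishment Criterion it closes
ims_work_packages = {
    "WP-101 Run thermal vacuum analysis": "AC-01 Thermal analysis complete",
    "WP-102 Update mass properties report": "AC-02 Mass properties within margin",
}

def untraceable_criteria(imp, work_packages):
    """Return every AC that no work package in the IMS rolls up to."""
    covered = set(work_packages.values())
    return [ac
            for sas in imp.values()
            for acs in sas.values()
            for ac in acs
            if ac not in covered]

print(untraceable_criteria(imp, ims_work_packages))  # [] means every AC is traceable
```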
  39. 39. IMS PRACTICES The IMS starts with vertical traceability, and only then links work packages horizontally
  40. 40. Steps to build the IMS: 1. Identify Program Events 2. Identify Significant Accomplishments 3. Identify Accomplishment Criteria 4. Identify Work Packages needed to complete the Accomplishment Criteria 5. Sequence the Work Packages (WP), Planning Packages (PP), and Summary Level Planning Packages (SLPP) in a logical network 6. Adjust the sequence of WPs, PPs, & SLPPs to mitigate major risks. 40
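Step 5 above, sequencing the work packages into a logical network, is what lets the IMS forecast a finish date. The sketch below runs a simple forward pass over a hypothetical four-package network; the durations and dependencies are invented for illustration only.

```python
# Minimal forward-pass sketch over a hypothetical work-package network
# (step 5 above). Durations and dependencies are illustrative only.
from functools import lru_cache

network = {  # work package -> (duration in days, predecessors)
    "WP-A Requirements":      (10, []),
    "WP-B Design":            (20, ["WP-A Requirements"]),
    "WP-C Long-lead procure": (35, ["WP-A Requirements"]),
    "WP-D Build & integrate": (25, ["WP-B Design", "WP-C Long-lead procure"]),
}

@lru_cache(maxsize=None)
def early_finish(wp: str) -> int:
    """Earliest finish of a work package given its predecessors (forward pass)."""
    duration, preds = network[wp]
    return duration + max((early_finish(p) for p in preds), default=0)

finish = max(early_finish(wp) for wp in network)
driving = max(network, key=early_finish)   # last work package on the longest path
print(f"Earliest program finish: day {finish}, driven by '{driving}'")
```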
  41. 41. Work is done in “packages” that produce measureable outcomes 41 4
  42. 42. Sequence Work Packages (AC’s) into an IMS for each Program Event 42 5
  43. 43. The Previous 6 Steps Result In A Credible IMP/IMS 43  The IMP is the “Outer Mold Line”, the Framework, the “Going Forward” Strategy for the Program.  The IMP describes the path to increasing maturity and the Events measuring that maturity.  The IMP tells us “How” the program will flow with the least risk, the maximum value, and the clearest visibility to progress.  The IMS tells us what work is needed to produce the product or service at the Work Package level.  A well integrated IMS provides accurate forecasting. Our Plan Tells Us “How” We are Going to Proceed The Schedule Tells Us “What” Work is Needed to Proceed
  44. 44. Horizontal and Vertical Traceability of the IMP/IMS – Integrated Master Schedule: work sequenced to produce outcomes for each WP.  Vertical traceability AC  SA  PE  Horizontal traceability WP  WP  AC. Program Events define the maturity of a Capability at a point in time. Significant Accomplishments represent requirements that enable Capabilities. Accomplishment Criteria are the exit criteria for the Work Packages that fulfill Requirements. 44
  45. 45. RISK PRINCIPLES Risk management is how adults manage projects – Tim Lister
  46. 46. Why Should We Care About Risk?  Deterministic plans (Performance Measurement Baselines) are ALWAYS WRONG … and usually woefully underestimated! Evidence – NASA has experienced an average schedule growth of 65% from PDR – NASA has experienced an average cost growth of 35% from PDR  If we want to meet technical, cost, and schedule targets, we must adjust our plans for risk 46
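As a rough illustration of those growth figures: a program that leaves PDR expecting 24 more months and $200M would, at the historical averages above, actually need on the order of 24 × 1.65 ≈ 40 months and $200M × 1.35 = $270M unless the plan is adjusted for risk. The program and dollar figures are hypothetical; only the growth percentages come from the slide.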
  47. 47. Copyright, Hugh Macleod, www.gapingvoid.com
  48. 48. Integrated Risk Management (IRM)  Risk Management is How Adults Manage Projects – Tim Lister  Risk is created through Uncertainty, which has two forms: – Irreducible Uncertainty – the natural variations in the underlying processes of the work activities and the technical performance. – Reducible Uncertainty – probabilistic events with consequences that impact the cost, schedule, or technical performance of the deliverables. 48
  49. 49. Integrated Risk Management (IRM) means Risks are Integrated with the Integrated Master Plan (IMP) and Integrated Master Schedule (IMS), Vertically and Horizontally Photo by, Col. Chris Hadfield, Mission Specialist STS-74. Commander ISS Expedition 35 49
  50. 50. IRM Risk Management is how Adults manage programs
  51. 51. The 1st Principle of Integrated Risk Management (IRM) 51 Start identifying programmatic and technical risks in the WBS
  52. 52. Connecting Risk Retirement with the work activities in the IMS  "Buying down" risk is planned in the IMS.  MoE, MoP, and KPP defined in the work package for the critical measure – weight.  If we can't verify we've succeeded, then the risk did not get reduced.  The risk may have gotten worse. [Chart: Risk CEV-037 – Loss of Critical Functions During Descent – planned risk-score burn-down from March 2005 through 2011 across force and moment wind tunnel tests, analytical model development and correlation, CEV block wind tunnel testing, flight application of spacecraft aerodynamics, in-flight development tests, and a damaged-TPS flight test; weight risk reduced from RED to YELLOW, then confirmed ready to fly – GREEN.] 52
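A risk register entry that supports this kind of buy-down needs to point at the IMS tasks that retire it. The sketch below shows one hypothetical way to structure such an entry in Python; the WBS element, dates, and planned scores are invented, and only the risk title is taken from the chart above.

```python
from dataclasses import dataclass, field

@dataclass
class BuyDownStep:
    ims_task: str          # IMS work package that retires part of the risk
    planned_date: str
    planned_score: int     # risk score expected after this step completes

@dataclass
class Risk:
    risk_id: str
    title: str
    wbs_element: str
    current_score: int
    buy_down_plan: list[BuyDownStep] = field(default_factory=list)

    def retired_as_planned(self) -> bool:
        """A risk only counts as retired if its final planned score is verified low."""
        return bool(self.buy_down_plan) and self.buy_down_plan[-1].planned_score <= 2

# Hypothetical entry patterned on the CEV-037 example in the chart above
cev_037 = Risk(
    risk_id="CEV-037",
    title="Loss of critical functions during descent",
    wbs_element="1.4.2",   # hypothetical WBS element
    current_score=20,
    buy_down_plan=[
        BuyDownStep("Conduct Block 1 wind tunnel test", "2006-04-03", 14),
        BuyDownStep("Correlate analytical model with test data", "2006-09-15", 8),
        BuyDownStep("In-flight development test", "2010-12-16", 2),
    ],
)
print(cev_037.retired_as_planned())  # True only when the plan drives the score down
```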
  53. 53. Beware the Black Swan 53
  54. 54. Hands ON Let’s put these Principles and Practices to work on Stardust Program
  55. 55. IMP Hands On
  56. 56. IMS Hands On
  57. 57. The Integrated Master Schedule – STARDUST master schedule, September 1996 through August 1999: program milestones (ATP, PDR/C/D, Flt Sys CDR, ARR, Risk Review, Start ATLO, Integrated System Test, Environmental Test Complete, Ship to Site, President's Review, Mission Success Review, Launch), systems engineering (requirements/ICDs, specs released, verification, system design/analysis), and subsystem flows (ACS and star camera integration and test, telecom, C&DH, EPS/avionics, solar arrays, batteries, PCA, SASU, harness, thermal control with TCS & MLI installation, structures and SRC separation qualification, mechanisms, propulsion, and software builds 1.1 through 4.0) feeding ATLO, with peer reviews, critical paths, and schedule margin called out in the legend. 57
  58. 58. IRM Hands On
  59. 59. ATLO Comparisons 59
  60. 60. Integrating the Nine "I's" with the information used by the program during its planning and execution processes provides visibility into the probability of success in ways not found by simply reporting data during review meetings. Six measures connect past performance, future risk, and technical progress:
1. Earned Value Management (EVM) provides visibility into the efficiency and effectiveness of executing work through the allocated budget for planned work. Measures of physical percent complete are used to forecast future performance of that budget.
2. Integrated Risk Management (IRM) starts with defining the classes of uncertainty involved in the work activities. These include naturally occurring variances (aleatory uncertainty) and event-based uncertainty (epistemic). Both uncertainties create risk to the program. Aleatory uncertainty is handled through margin; epistemic uncertainty is handled through risk retirement or management reserve. When these two risk types are combined with the three handling methods, an integrated view of programmatic and technical risk is available.
3. Technical Performance Measures (TPM) are attributes that determine how well a system or system element is satisfying or expected to satisfy a technical requirement or goal, by predicting the future value of a key technical performance parameter of the higher-level end product under development based on current assessments of products lower in the system structure.
4. Key Performance Parameters (KPP) represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessment, or termination of the program. These include JROC KPPs and integrated program KPPs.
5. Measures of Performance (MoP) characterize physical or functional attributes relating to the system operation, measured or estimated under specific conditions.
6. Measures of Effectiveness (MoE) are operational measures of success that are closely related to the achievement of the mission or operational objectives, evaluated in the operational environment under a specific set of conditions.
Nine Integrated Processes and their Artifacts needed to Increase the Probability of Program Success (PoPS):
1. Integrated Master Plan (IMP) – defines the measures of the increasing maturity of the deliverables and the processes needed to deliver them
2. Integrated Master Schedule (IMS) – defines the sequence of work needed to deliver the planned level of maturity for each deliverable
3. Integrated Risk Management (IRM) – identifies and handles aleatory and epistemic uncertainties in the program through margin, risk retirement activities, and management reserve
4. Integrated Baseline Review (IBR) – confirms the program has a credible Performance Measurement Baseline and an integrated risk management process
5. Integrated Team Structure (ITS) – enables horizontal and vertical integration of the work performed by functional elements across the product structure
6. Interface Control Document (ICD) – defines the technical and operational interfaces between each component or subsystem
7. Integrated Program Management Report (IPMR) – reports physical progress to plan using Earned Value Management and Estimates At Completion
8. Integrated Business Rhythm (IBizR) – assures all processes function correctly on a monthly or weekly basis
9. Integrated Supply Chain (ISC) – assures all suppliers of subsystems and components adhere to technical and operational specifications, in a timely manner, within budget
These Nine "I's", integrated with the six measures of performance, provide line-of-sight visibility to program performance. 61
  62. 62. Identify the Program Events 63 Actors Processes Outcomes Systems Engineer Define the process flow for product production from contract award to end of contract  Confirm Program Events represent the logical process flow for program maturity Program Manager Confirm customer is willing to accept the process flows developed by the IMP  Engage with contracts and customer for PE definition Project Engineer Identify interdependencies between program event work streams  Identify Value Stream components at the PE level before flowing them down to the SA level IMP/IMS Architect Capture Program Event contents for each ITS or work stream  Establish foundation for a structure to support the description of the increasing maturity as well as the flow of needed work. Copyright © 2012, Glen B. Alleman, Niwot Ridge, LLC 1
  63. 63. Identify the Significant Accomplishments (SA) for Each PE 64 Actors Processes Outcomes System Engineer Identify Integrated Team Structure (ITS) responsible for the SA's  Define the boundaries of these programmatic interfaces  Define technical and process risk categories and their bounds Technical Lead Confirm the sequence of SA's has the proper dependency relationships  Define how the product development flow improves maturity  Define technical risk drivers Project Engineer Confirm logic of SA's for project sequence integrity  Define how the program flow improves maturity Control Account Manager Validate SA outcomes in support of PE entry conditions  Confirm budget and resources are adequate for the defined work effort IMP/IMS Architect Assure the assessment points provide a logical flow of maturity at the proper intervals for the program  Maintain the integrity of the IMP, WBS, and IMS 2
  64. 64. Identify Accomplishment Criteria for each Significant Accomplishment 65 Actors Processes Outcomes CAM Define and sequence the contents of each Work Package and select the EV criteria for each Task needed to roll up the BCWP measurement  Establish ownership for the content of each Work Package and the Exit Criteria – the Accomplishment Criteria (AC) Project Engineer Identify the logical process flow of the Work Package to assure the least effort, maximum value and lowest risk path to the Program Event  Establish ownership for the process flow of the product or service Technical Lead Assure all technical processes are covered in each Work Package  Establish ownership for the technical outcome of each Work Package IMP/IMS Architect Confirm the process flow of the ACs can follow the DID 81650 structuring and Risk Assessment processes  Guide the development of outcomes for each Work Package to assure increasing maturity of the program 3
  65. 65. Identify the Work for Each AC in Work Packages 66 Actors Processes Outcomes Control Account Manager Identify or confirm the work activities in the Work Package represent the allocated work  Define the bounded work effort "inside" each Work Package Technical Lead Confirm this work covers the SOW and CDRLs  Define all work effort for 100% completion of the deliverable, visible in a single location – the Work Package  Confirm risk drivers and duration variances IMP/IMS Architect Assist in sequencing the work efforts in a logical manner  Develop the foundation of the maturity flow emerging from the contents of the Work Packages Earned Value Analyst Assign initial BCWS from the BOE to the Work Package  Confirmation of work effort against BOEs  Define the EVT for measuring progress to plan 4
  66. 66. Sequence Work Packages for each Significant Accomplishment (SA) 67 Actors Processes Outcomes Control Account Manager Define the order of the Work Packages needed to meet the Significant Accomplishments for each Program Event  Define the process flow of work and the resulting accomplishments.  Assure value is being produced at each SA and the AC’s that drive them IMP/IMS Architect Assure that the sequence of Work Packages adheres to the guidance provided by DCMA and the EVMS System description  Begin the structuring of the IMS for compliance and loading into the cost system Program Controls Staff Baseline the sequence of Work Packages using Earned Value Techniques (EVT) with measures of Physical Percent Complete  Develop insight to progress to plan with measures of physical progress for each Work Packages (EVT) 5
  67. 67. Assemble Final IMP/IMS 68 Actors Processes Outcomes IMP/IMS Architect Starting with the ACs under each SA, connect Work Packages in the proper order for each Program Event  Establish the Performance Measurement Baseline framework.  Identify MoE and MoP points in the IMP Program Manager Confirm the work efforts represent the committed activities for the contract  Review and approval of the IMS – ready for baseline.  Review and approve risk drivers and duration variance models Project Engineer Assess the product development flow for optimizations  Review and approval of the IMS – ready for baseline.  Identify risk drivers and their mitigations Systems Engineer Confirm the work process flows result in the proper products being built in the right order  Confirm risk drivers and duration variances.  Review and approval of the IMS – ready for baseline 6
  68. 68. Basic Themes for this Morning  Managing with "eyes wide open"  There are more problems than solutions – The 9 I's provide a comprehensive, structured plan of attack  Learning this requires hands-on work – Leave them with hands-on material  Building the IMP and IMS is straightforward – but don't short-change the effort – Lack of risk assessment is the root cause of disappointment – This means quantified risks in the Risk Register (use the IDA report picture) – Start building some risks that impact the IMS (cost and schedule) 69
  69. 69. Exit Criteria for this Morning  Learning Objectives – Understand that everything is connected as a "system." – We have to decompose the work starting with the WBS, to the IMP, to the IMS, … – How to build "credible" elements of the system 70
  70. 70. Timing  1st session – Principles – Practices – Show the hands-on that ties both principles and practices together for the coming session • Handouts • Teams built • Homework assigned  2nd session – Do the homework assigned to the teams 71
  71. 71. Take Homes for the Participants  We need Schedule Reserve just like Management Reserve  The current PMB is as good as it gets – Faster or slower costs money  Everything is a system – No disconnected data  You can't do this without an IMP – This tells you what "done" looks like  The IMS is the reflection of the entire program – If the initial steps are not done right, it's downhill from here – The IMS is the GPS for the program (All State ad) 72
  72. 72. The Importance of the IMP  The program uses the IMP/IMS to provide: – Up Front Planning and commitment from all participants – A balanced design discipline with risk mitigation activities – Integrated requirements including production and support – Management with an incremental verification for informed program decisions 73
  73. 73. Analogies  Baseball team  F-111 Illusion of Choice  Compatibility – not  The IMS is the reflection of everything else – once we get the WBS, the IMP, and the measures in place, the IMS is the GPS – can you trust your GPS?  Is everything risk adjusted? – we don't spend enough time identifying risk – especially the programmatic risks 74
  74. 74. Processes  Teams with rotation of the leaders  3 threads – Choose 3 products – Brief for each of the steps – Swap leaders  Connect the 6 "I's" through risk 75
  75. 75. Steps to Building a Risk-Tolerant Plan (PMB and Total Plan): 1. Capture all activities 2. Sequence these activities 3. Estimate activity durations and associated resources (initial PMB) 4. Build a Risk Register and adjust the PMB for significant risks and uncertainties 5. Set Management and Schedule Reserve 6. Verify the schedule is traceable horizontally and vertically 7. Confirm a valid critical path – the schedule matches the program 8. Assure that the ECD and Cost both have at least a 50% probability of success 9. Assure that the ECD and Total Cost both have at least a 70% probability of success 76
  76. 76. To Implement This Integrated Proposal 1. Mandate the IMP 2. Create more specific examples of how TPMs can be integrated into the IMP/IMS 3. Change the IPMR DID and/or Implementation Guide to a) Require PMBs be based (at least initially) on a resource-loaded schedule b) Require initial PMBs to be adjusted for uncertainty and to communicate probabilities of meeting cost and schedule targets (in proposals and semi-annually) c) Require MR and SR be based on a Monte Carlo simulation and to communicate probabilities of meeting cost and schedule targets (in proposals and semi-annually) d) Require the contractor to electronically submit a risk register (initially and semi-annually) e) Add probability statements to EACs and ECDs within monthly reports 77
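Item 3(c) asks for MR and SR sized by Monte Carlo simulation. The sketch below shows one minimal way to do that; the baseline values, targets, triangular uncertainty ranges, and discrete risks are all hypothetical, and a real analysis would draw them from the program's basis of estimate and risk register.

```python
import random

# Minimal Monte Carlo sketch for item 3(c): size reserves and report the
# probability of meeting cost/schedule targets. All distributions are hypothetical.

N = 20_000
baseline_cost, baseline_months = 180e6, 36
cost_target, schedule_target = 200e6, 40

discrete_risks = [  # (probability, cost impact $, schedule impact months)
    (0.30, 15e6, 3),
    (0.20, 25e6, 5),
]

hits_cost = hits_sched = 0
costs, months = [], []
for _ in range(N):
    cost = baseline_cost * random.triangular(0.95, 1.25, 1.05)      # aleatory cost variation
    sched = baseline_months * random.triangular(0.95, 1.30, 1.05)   # aleatory duration variation
    for p, dc, dm in discrete_risks:                                 # epistemic risk events
        if random.random() < p:
            cost, sched = cost + dc, sched + dm
    costs.append(cost)
    months.append(sched)
    hits_cost += cost <= cost_target
    hits_sched += sched <= schedule_target

costs.sort()
months.sort()
p70_cost, p70_sched = costs[int(0.7 * N)], months[int(0.7 * N)]
print(f"P(cost <= target)     = {hits_cost / N:.0%}")
print(f"P(schedule <= target) = {hits_sched / N:.0%}")
print(f"MR at 70% confidence  = ${p70_cost - baseline_cost:,.0f}")
print(f"SR at 70% confidence  = {p70_sched - baseline_months:.1f} months")
```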
  77. 77. How Program Success Principles Should Be Applied on DoD Programs (Principles / Practices / Processes):  Where are we going?  Integrated Master Plan  WBS, OBS, CAP, RMP  How do we get there?  Integrated Master Schedule  Sequence and budget the work  Do we have enough resources?  Resource-loaded IMS  PMB  What impediments will we encounter?  Risk-adjusted IMS  Risk register content assigned to the IMS  How do we measure progress?  Earned Value Management, Technical Performance Measures, Measures of Effectiveness, Measures of Performance, Technical Performance Parameters  MoE, MoP, KPP, TPMs embedded in the IMS 78
  78. 78. The State of DoD Program Management Data Flow Today – diagram with the following elements: Contract Requirements, Contract Ceiling, Systems Engineering Requirements and KPPs, Accounting and EVM System, Scheduling System, Systems Engineering System, Risk Management System, Government Program Management Office, manual sanity checks to ensure BCWS is consistent with the IMS, ad-hoc manual sanity checks, IMS schedule status and performance data (e.g., BEI and SRA results), technical progress on Key Performance Parameters, EVM performance, variance, and forecast data, cost, schedule, and technical risks, risk 5 x 5 matrix, completion date. 79
  79. 79. Questions Which the IPMR Should Answer for the PM to “Keep it Green” 1. At a summary level, what is the time-based networked plan of activities to provide required deliverables and end items? 2. What are the technical performance measures by WBS? 3. What are the interim technical performance criteria that permit assessments that technical scope of the program is being completed as planned? 4. What is the Work Breakdown Structure and does it cover all the required work? 5. What is the monthly manpower spend plan to deliver according to the Statement of Work? 6. What is the monthly Program Management Baseline that coincides with IMS after it has been adjusted for cost and schedule uncertainties? 7. What is the contractor’s projected probability of meeting the initial cost and schedule targets after taking into account known uncertainties? 8. What is the contractor’s initial probability of meeting initial Total Cost At Completion and its initial Management and Schedule Reserve taking into account known discrete risks that have not been mitigated? 9. What is the contractor’s initial probability of meeting the completion date taking into account known discrete risks that have not been mitigated? 80
  80. 80. Questions Which the IPMR Should Answer (Continued) 10. On a monthly basis, how has the contractor performed against his plan, specifically:  How did the contractor perform against his IMP/IMS? – Planned vs Actual Programmatic Deliverables (Program Events and Significant Accomplishments and Accomplishment Criteria) – Planned ranges vs actual TPM – Monthly BEI and CEI over time – Cumulative SPIt  How has the contractor performed against his original manpower spending plan (hours or FTE planned vs. monthly actuals)?  How has the contractor performed against his current financial plan, i.e., monthly and cumulative, BCWS, BCWP, ACWP, CPI, and SPI – Summary Level – At any indenture of the WBS – By Organization Breakdown Structure  Where has the contractor experienced problems? 81
  81. 81. Questions Which the IPMR Should Answer (Concluded) 11. On a semi-annual basis:  What are the projected cost and schedule outcomes to deliver the required final end items, assuming future performance is the same as the past?  What is the probability of meeting both cost and schedule targets to deliver the final end items, given the items on the risk register? – Best Case cost at completion and completion date, assuming low probabilities that risks within the risk register occur and/or successful risk mitigation strategies; – Worst Case cost at completion and completion date, assuming high probabilities that risks within the risk register occur and/or unsuccessful risk mitigation strategies; – Most Likely cost at completion and completion date, assuming "realistic" probabilities that the risks within the risk register will occur and a moderate number of risk mitigation strategies  What items on the risk list have the highest probability and associated impact that could jeopardize the program from meeting technical, cost, and schedule objectives? 82
  82. 82. STARDUST A Lesson in Managing Risk To achieve Mission Success 83
  83. 83. Douglas Isbell Headquarters, Washington, DC November 22, 1995 (Phone: 202/358-1753) RELEASE: 95-209 COMET SAMPLE RETURN MISSION PICKED AS NEXT DISCOVERY FLIGHT A spacecraft designed to gather samples of dust spewed from a comet and return the dust to Earth for detailed analysis has been selected to become the fourth flight mission in NASA's Discovery program. Science • Collect Dust Particles • Images • Closest Encounter • Transmit Real Time One Image as Near as Possible to Nucleus • Image Size is 150x150 pixels • 72 Images Centered At Time Of Closest Approach • CIDA and DFMI - Store Science Data • Dynamics Science - Analyze Engineering and Doppler Data 84
  84. 84. Common Risk Sources  Misunderstood or Poorly-defined Requirements  Requirements Changes (Creeping)  Non-Stable WBS (Chasing the Req. Changes)  Polishing the Cannonball ("Better is the enemy of good-enough")  Straining Existing Capability ("Watch your Margins")  Unrealistic/Optimistic Expectations ("Murphy Lives")  Personnel Shortfalls ("Many hands make light work")  Poor Metrics ("Ignorance is Bliss")  Not Watching Cost-to-Complete 85
  85. 85. Mission Risks  Must launch within 26 day launch window  Risk of Spacecraft/SRC single point failures  2.72 AU on solar power  Risk to capturing and safely stowing particles  Surviving mission environments (including launch, comet encounter, and reentry)  Returning sample capsule safely to earth for scientific analysis (high-speed reentry, SRC ballistic instability, parachute operations, recovery ops) 86
  86. 86. Risk Mitigation Strategies  Margins (technical, cost, & schedule)  Use of heritage components & design redundancies  Mission design flexibility (primary science sacred…secondary science tradeable)  Design for survivability  Test early, test often, “test-like-you-fly”, pay for test units 87
  87. 87. Original “Toaster Drawer” Design 88
  88. 88. Workshop Exercise 1) Choose a risk from the “Mission Risk List” 2) Choose affected WBS elements 3) Select event as a “prior to” for risk mitigation 4) Write a SOW paragraph (subcon) and simple ICD 5) Develop risk retirement plan 6) Develop appropriate IMP/IMS entries 7) Integrate KPPs, MOEs, MOPs into IMP/IMS criteria 89
  89. 89. “Clam Shell” Design 90
  90. 90. Test Schedule (insert) 91
  91. 91. Survivability Design (Kevlar) “Whipple Shields” 92
  92. 92. Survivability Hyper-velocity Test 93
