Integrated Master Plan Development

Building the Integrated Master Plan (and its Integrated Master Schedule) is a critical success factor in any project domain. It describes the increasing maturity of all deliverables in units of measure meaningful to the decision makers.
The IMP contains the Measures of Effectiveness and Measures of Performance. The IMS contains the Technical Performance Measures (as exit criteria for the Work Packages).

Risks and estimates are applied at all levels of the IMP and IMS, then definitized in the Performance Measurement Baseline once on contract.


  1. The Integrated Master Plan and Integrated Master Schedule. The Integrated Master Plan and Integrated Master Schedule together are one of the six elements of a credible Performance Measurement Baseline (PMB). The PMB is the source of data for the Program Manager. Some of this data is used in the Integrated Program Management Report (IPMR). Some is used in the assessment of the Program's risk. Some defines the deliverables and their Technical Performance Measures.
  2. The IMP tells us where the program is going. The Plan describes where we are going, the various paths we can take to reach our destination, and the progress or performance assessment points along the way to assure we are on the right path. These assessment points measure the "maturity" of the product or service against the planned maturity. This is the only real measure of progress – not the passage of time or consumption of money. The Integrated Master Plan (IMP) is a strategy for the successful completion of the project.
  4. Quick View of the IMP/IMS
     - Vertical traceability defines the increasing maturity of key deliverables
     - Horizontal traceability defines the work activities needed to produce this increasing maturity
     - Both are needed, but the vertical traceability is the starting point
     - Program Events, Significant Accomplishments, and Accomplishment Criteria must be defined before the horizontal work activities can be identified
     - For all IMP elements, Key Risks must be identified and assigned from Day One, even without mitigations
  5. The IMP/IMS is Needed on Both Sides of the Contract
     - Vertical traceability defines the increasing maturity of the program's deliverables, measured in Effectiveness (MoE) and Performance (MoP), for the Government.
     - Horizontal traceability defines the progress to plan for MoEs and MoPs, with tangible evidence for both the Government and the Contractor.
     - Vertical traceability provides the Government with insight into the progress of the MoEs, MoPs, and Technical Performance Measures.
     - Horizontal traceability provides insight into Cost and Schedule performance for the Contractor, reportable through the IPMR to the Government.
  6. Misconceptions of the IMP/IMS – why we don't need or want an IMP/IMS:
     - Only required for ACAT I programs
     - Too big and burdensome for our small dollar value program
     - Contractor spends B&P and program budget generating and maintaining the IMP, without measurable benefit
     - Doesn't apply on a services contract
     - Management tool, not a technical tool
     - Doesn't apply to technology programs
     - Doesn't apply to R&D efforts
     - Doesn't apply to the government
     - Help me get a waiver so I don't have to use an IMP/IMS
  7. Attributes of the IMP
     - Traceability
       - Expands and complies with the SOO, Performance Requirements, CWBS, and CSOW
       - Based on the customer's WBS
       - Is the basis of the IMS, cost reports, and award fees
     - Implements a measurable and trackable program
       - Accomplishes integrated product development
       - Integrates the functional activities of the program
       - Incorporates functional, lower level, and subcontractor IMPs
     - Provides for evaluation of Program Maturity
       - Provides insight into the overall effort
       - Level of detail is consistent with risk and complexity per Section L
       - Decomposes events into a logical series of accomplishments
       - Measurable criteria demonstrate completion and quality of accomplishments
  8. Attributes of the IMS
     - Integrated, networked, multi-layered schedule of the efforts required to achieve each IMP accomplishment
       - Detailed tasks and work to be completed
       - Calendar schedule shows work completion dates
       - Network schedule shows interrelationships and the critical path
       - Expanded granularity, frequency, and depth in risk areas
     - Resource loading
     - Correlates IMS work with IMP events
  9. The Importance of the IMP
     - The IMP/IMS is the single most important document for a program's success
       - It clearly demonstrates the provider's understanding of the program requirements and the soundness of the approach represented by the plan
     - The program uses the IMP/IMS to provide:
       - Up-front planning and commitment from all participants
       - A balanced design discipline with risk mitigation activities
       - Integrated requirements, including production and support
       - Management with incremental verification for informed program decisions
  10. Just a Reminder, before moving on. (Page 47 and page 317, Defense Acquisition Guide, January 10, 2012.)
  11. Our Goal is simple … How can we recognize the reality of the Program's current status and its future performance? (The top spins continuously while in a dream and stops spinning in the real world – Cobb's totem, Inception.)
  13. Principles of Building a Credible Integrated Master Plan. Building the IMP is a Systems Engineering activity. The Integrated Master Plan is the Program Architecture in the same way the hardware and software are the Product Architecture. A poor, weak, or unstructured Programmatic Architecture reduces visibility into the Product Architecture's performance measures of cost and schedule connected with Technical Performance Measures.
  14. Quick View of Building the IMP
     - Start with each Program Event and define the Significant Accomplishments and their entry and exit criteria to assess the needed maturity of the key deliverables
     - Arrange the Significant Accomplishments in the proper dependency order
     - Segregate these Significant Accomplishments into swim lanes for the IPTs
     - Define the dependencies between each SA
  15. A Critical Understanding of the IMP. The IMP defines the connections between the Product maturity – the Vertical – and the implementation of this Product maturity through the Functional activities – the Horizontal.
  16. Benefits of this Formality (Objective → Implementation)
     - Event-driven plan versus schedule-driven plan: the plan is based on completion of tasks, not passage of time → Separate the plan (IMP) from the schedule (IMS), but link elements with a numbering system
     - Condensed, easy-to-read "Plan" showing the "Events" rather than the work effort → Indentured, outline format, not text
     - Pre-defined entry and exit criteria for major Program Events → Significant Accomplishments for each key Program Event
     - Objective measures of progress and completion for each Accomplishment → Pre-defined Accomplishment Criteria (AC) for each Significant Accomplishment (SA)
     - Stable, contractual plan, flexible enough to portray program status → IMP is part of the contract; the IMS is a data item
     - Capture the essence of functional progress without mandating a particular process for performing the work → Split the IMP into Product and Process sections
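The table above links IMP elements to IMS tasks "with a numbering system." A minimal sketch of one such convention is below; the pattern of identifiers (e.g. "A01") mirrors the SA codes shown later in the WBS mapping slide, but the exact scheme is program-specific and the helper name is illustrative only.

```python
from typing import Optional

# Notional IMP/IMS numbering convention (illustrative; each program defines its own):
#   Program Event            -> "A"        (each PE mapped to a letter)
#   Significant Accomplishment -> "A01"    (PE letter + 2-digit SA index)
#   Accomplishment Criterion -> "A01a"     (SA id + letter)
#   IMS Work Package         -> "A01a-010" (AC id + task number)

def imp_id(pe: str, sa: Optional[int] = None, ac: Optional[str] = None,
           wp: Optional[int] = None) -> str:
    """Build a traceable identifier from the IMP level down to an IMS work package."""
    ident = pe
    if sa is not None:
        ident += f"{sa:02d}"
    if ac is not None:
        ident += ac
    if wp is not None:
        ident += f"-{wp:03d}"
    return ident

assert imp_id("A") == "A"
assert imp_id("A", 1) == "A01"
assert imp_id("A", 1, "a") == "A01a"
assert imp_id("A", 1, "a", 10) == "A01a-010"
```

With identifiers like these, every IMS task rolls up to its AC, SA, and PE by simple prefix matching, which is what makes the "separate but linked" plan and schedule workable.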
  17. Building the IMP Starts at the RFP with Systems Engineering Measures. (Diagram: the SOO, ConOps, SOW, technical and operational requirements, WBS/CWBS and CWBS Dictionary flow into the Integrated Master Plan (IMP) and Integrated Master Schedule (IMS), which, together with Risk Management and the Earned Value Management System, form the Performance Measurement Baseline. Measures of Effectiveness, Measures of Performance, TPMs and QBDs, JROC Key Performance Parameters, program-specific Key Performance Parameters, and Technical Performance Measures provide the objective status and essential views needed to support the proactive management processes that keep the program GREEN.)
  18. The IMP captures end-user requirements in terms of MoEs, MoPs, KPPs, and TPMs. (Diagram: the ConOps, SOW, WBS/CWBS, and technical requirements drive the Measures of Effectiveness (MoE), Key Performance Parameters (KPP), Measures of Performance (MoP), and Technical Performance Measures (TPM), which are captured in the Integrated Master Plan (IMP), Integrated Master Schedule (IMS), and Performance Measurement Baseline (PMB).)
  19. The IMP/IMS Structure. The IMP describes how program capabilities will be delivered and how these capabilities will be recognized as ready for delivery: Events or Milestones → Accomplishments → Criteria. The IMS below it holds the Work Packages and Tasks, with Supplemental Schedules (CAM Notebook).
  20. The IMP/IMS provides Horizontal and Vertical Traceability of progress to plan
     - Vertical traceability: AC → SA → PE
     - Horizontal traceability: WP → WP → AC
     - Program Events define the maturity of a Capability at a point in time.
     - Significant Accomplishments represent requirements that enable Capabilities.
     - Accomplishment Criteria are the exit criteria for the Work Packages that fulfill Requirements.
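A minimal sketch of this hierarchy as a data structure, assuming nothing beyond the PE → SA → AC → Work Package relationships named on the slide (the class and field names are illustrative, not the deck's notation):

```python
from dataclasses import dataclass, field
from typing import List, Tuple

@dataclass
class WorkPackage:                       # horizontal: sequenced work producing an outcome
    wp_id: str
    description: str
    predecessors: List[str] = field(default_factory=list)

@dataclass
class AccomplishmentCriterion:           # exit criteria for the work packages
    ac_id: str
    text: str
    work_packages: List[WorkPackage] = field(default_factory=list)

@dataclass
class SignificantAccomplishment:         # requirement-level step toward a capability
    sa_id: str
    text: str
    criteria: List[AccomplishmentCriterion] = field(default_factory=list)

@dataclass
class ProgramEvent:                      # maturity assessment point in time
    pe_id: str
    name: str
    accomplishments: List[SignificantAccomplishment] = field(default_factory=list)

def vertical_trace(pe: ProgramEvent) -> List[Tuple[str, str, str]]:
    """Enumerate every AC -> SA -> PE path, i.e. the vertical traceability."""
    return [(ac.ac_id, sa.sa_id, pe.pe_id)
            for sa in pe.accomplishments for ac in sa.criteria]
```

The horizontal trace lives in the predecessor links between Work Packages; the vertical trace is simply the containment path back up to the Program Event.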
  21. 21. The  IMP’s  role  during  ExecuLon   21   Program ExecutionPMB for IBRProposal SubmittalDRFP & RFP Performance Measurement Baseline Tasks (T) BOE % Complete Statement of Work Program Deliverables IMP Accomplishments (A) Criteria (C) EVMS Events (E) Budget Spreads by CA & WPCAIV Capabilities Based Requirements X BCWS = Probabilistic Risk Analysis = Time keeping and ODC = Technical Performance Measure BCWP ACWP Cost & Schedule Risk Model BCWS Decreasing technical and programmatic risk using Risk Management Methods IMS Physical % Complete Continuity and consistency from DRFP through Program Execution WBS 6.0  Build  IMP  
  22. The Grammar of the Integrated Master Plan (IMP)
     - IMP phrases are in the past tense – they describe what DONE looks like
     - IMS phrases are in the present tense – they describe the work needed to arrive at DONE
     - Naming pattern: Maturity (adjective, demonstrates maturity) + Product (noun, the step in the process or end item) + Product Action (verb) + Product State (verb, final status). Example: Preliminary Modeling and Simulation Design Complete.
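A toy sketch of enforcing this naming grammar with a verb dictionary, as the deck later recommends ("uses consistent language, style, format through verb dictionary"); the approved word list here is an example, not the deck's actual dictionary:

```python
# Illustrative check that an IMP entry name ends in a past-tense "done" state
# drawn from an approved verb dictionary (the words below are examples only).
APPROVED_STATES = {"complete", "completed", "verified", "approved",
                   "delivered", "established", "baselined", "conducted"}

def is_valid_imp_name(name: str) -> bool:
    """True if the entry reads as a past-tense statement of what DONE looks like."""
    last_word = name.strip().rstrip(".").split()[-1].lower()
    return last_word in APPROVED_STATES

assert is_valid_imp_name("Preliminary Modeling and Simulation Design Complete")
assert not is_valid_imp_name("Design the preliminary model")  # present tense: an IMS task, not an IMP entry
```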
  23. The 1st Principle – IMP Building is a Full Contact Sport
  24. Our First Approach to the IMP/IMS Paradigm
     - The 1st approach defines Program Events (PE), Significant Accomplishments (SA), and Accomplishment Criteria (AC), derived from the Work Breakdown Structure or the SOW
     - This 1st approach can be done in 6 easy steps:
       1. Identify the Program Events (PE) (as the Acquisition Guide tells us)
       2. Identify the Significant Accomplishments (SA) from the WBS deliverables
       3. Identify the Accomplishment Criteria (AC) needed to produce these deliverables
       4. Identify the Work Packages for each Accomplishment Criterion
       5. Sequence the Work Packages
       6. Assemble the IMP/IMS
  25. This approach doesn't give us visibility into what "done" looks like
     - We must measure increasing product maturity in units meaningful to the decision makers
     - We must see the risks before they arrive so we can take corrective action
  26. First Look at a Significant Accomplishment (SA)
     - SAs are interim and final steps to define, design, develop, verify, produce, and deploy the product or system.
     - SAs must occur in a manner that ensures a logical path is maintained throughout the development effort.
     - SAs are event related, not just time coincidental.
     - SAs should have one or more of the following characteristics:
       - Consists of a discrete step in the planned development process that must be complete prior to an event
       - Produces a desired result at a specified event that indicates a level of design maturity (or progress) directly related to each product and process
       - Defines interrelationships, interdependencies, or "hand-off" points of the different functional disciplines applied to the program
     - SAs must assess compliance with Measures of Effectiveness.
  27. First Look at an Accomplishment Criterion (AC)
     - ACs are definitive measures directly supporting successful completion of a Significant Accomplishment.
     - ACs show objective evidence of work progress (maturity of a product); i.e., they can be seen, read, demonstrated, or quantified. These results are usually incorporated into a report or document as evidence of accomplishment.
     - ACs are prerequisites for completion of an SA (i.e., exit criteria).
     - The questions that need to be asked repeatedly when developing ACs are:
       - How do I know when an accomplishment has been completed?
       - Is the criterion directly related to the accomplishment?
       - Is it proof?
       - What is the work product?
     - ACs must assess compliance with Measures of Performance.
  28. The IMP speaks to Measures of Effectiveness (MoE) and Measures of Performance (MoP)
     - This is where TPMs are connected with the MoEs and MoPs.
     - For each deliverable from the program, all the "measures" must be defined in units meaningful to the decision makers. Here are some "real" examples.
     - Measures of Effectiveness (MoE): (1) provide precision approach for a 200 FT / 0.5 NM decision height; (2) provide bearing and range to the AC platform; (3) provide AC surveillance to the GRND platform.
     - JROC Key Performance Parameters (KPP): Net Ready; Guidance Quality; Land Interoperability; Manpower; Availability.
     - Measures of Performance (MoP): Net Ready – IPv4/6 compliance, 1 Gb Ethernet; Guidance Quality – accuracy threshold p70 @ 6 m, integrity threshold 4 m @ 10^-6 per approach; Land Interoperability – processing capability meets LB growth matrix; Manpower – MTBC > 1000 hrs, MCM < 2 hrs; Availability – clear threshold > 99%, jam threshold > 90%.
     - Technical Performance Measures (TPM): Net Ready – standard message packets; Guidance Quality – multipath allocation budget, multipath bias protection; Land Interoperability – MOSA compliant, civil compliant; Manpower – operating elapsed time meters, standby elapsed time indicators; Availability – phase center variations.
     - Mission Capabilities and Operational Need (MoE, KPP); Technical Insight – risk-adjusted performance to plan (MoP, TPM).
  29. The 1st Problem with the Initial IMP/IMS Paradigm
     - There is no single source of guidance for constructing a credible IMP and IMS
       - A DoD Guidebook, but still in Version 0.9
       - A few DoD service pamphlets
       - A commercial guidebook
       - Many contractor guidelines
     - No definitive guidance from DoD on what makes a good IMP
     - No definitive DID requiring an IMP on specific programs
  30. The IMP is a good start, but we're really after the IMP Narratives
     - The IMP Narratives start with the Statement of Objectives (SOO) or Concept of Operations (ConOps)
       - These identify the top-level program objectives
       - They define the big picture and provide pre-award trade space
       - They provide the framework for the Contractor to develop the proposal through the IMP
     - Together with the Government's Executive Summary, they provide the contractor an understanding of what is needed and what is important
  31. The IMP Process Narrative. The IMP Process Narrative describes how the technical and business elements of the program will be conducted, monitored, and controlled. Narratives provide the customer visibility into the contractor's key functional processes and procedures, the relationships among these processes and procedures, and an overview of the effort required to implement them.
  32. IMP Narrative for PDR
     - Event description: PDR establishes the "design-to" allocated baseline to the subsystem level, assures this design meets the functional baseline, and assures system requirements have been properly allocated to the proper subsystems. PDR establishes the feasibility of the design approach to meet the technical requirements and provide acceptable interface relationships between the hardware and other interfacing items. Any changes to the requirements that have occurred since the System Requirements Review (SRR) will be verified at the PDR. PDR assures the design is verifiable, does not pose major IMS or cost risk, and is mature enough to advance to the detailed design phase (CDR).
     - PE maturity assessment shown in SAs: subsystem-level operational concepts defined; system-level interfaces baselined; supportability plans established; software requirements finalized; subsystem requirements finalized and allocated; system verification, validation, and certification plans updated; PDR subsystem design completed.
  33. What the IMP Narrative Tells Us
     - States the objective of the processes used to build the products described in the SOW
     - Provides the governing documents, compliance, and references for the process activities
     - Explains the process approach
       - Portrays the key activities of the approach
       - Illustrates the processes tailored for the specific program
     (Diagram: Inputs and Tools feed Activities 1 through 5, producing Outputs and Metrics.)
  34. Some IMP Guidance
  35. The Defense Acquisition Guide (DAG) Tells Us About the IMP
  36. The DAG Also Tells Us
  37. 5000.01 Part 2.B.3, Acquisition Strategies, Exit Criteria, and Risk Management: "Event-driven acquisition strategies and program plans must be based on rigorous, objective assessments of a program's status and the plans for managing risk during the next phase and the remainder of the program. The acquisition strategy and associated contracting activities must explicitly link milestone decision reviews to events and demonstrated accomplishments in development, testing, and initial production. The acquisition strategy must reflect the interrelationships and schedule of acquisition phases and events based on a logical sequence of demonstrated accomplishments, not on fiscal or calendar expediency."
  38. What makes a good Program Event (PE)?
     - Events are the conclusion of an interval of major program accomplishments (SA) with their criteria (AC)
     - IMP events represent key decision transition points between major activities distributed over the contract period
       - Each IMP event is a mini Authorization to Proceed (ATP) to the next Program Event (PE)
     - Some guidance for establishing program or product events:
       - Customer-given events
       - Key decisions needed
       - Risk mitigation events
       - DoD Systems Engineering Technical Review (SETR) guidance
  39. What makes a good Significant Accomplishment (SA)?
     - An IPT can manage it at a working level
     - Shows completion and results of discrete steps in the development process
     - Indicates maturity of the product through MoEs and MoPs
     - Its "significance" measures program event status
     - Relevant and logically linked to the proper PE
       - Just because the work occurs during the time frame for PE A doesn't mean it's logically linked to PE A
     - Uses consistent language, style, and format through a verb dictionary, for example:
       - Segment build 4 detailed design completed
       - Analysis of structural integrity completed
       - Structural integrity verified
  40. IMP Significant Accomplishments
     - SAs are NOT just a list of "things" to do before the Program Event (PE)
     - They are sequenced accomplishments, each of which leads to the PE, e.g. the Critical Design Review (CDR), each increasing the maturity of the deliverables:
       - SA #1 = CDR meeting conducted
       - SA #2 = CDR action item work-off plan established
       - SA #3 = 85% of drawings completed
       - SA #4 = CDR CDRLs delivered
       - SA #5 = Development environment operational
       - SA #6 = Critical methods analyses completed
       - SA #7 = RVTM approved
  41. What makes a good Accomplishment Criterion (AC)?
     - Provides objective, measurable, and explicit evidence of completion and closure of the work activities in the Work Packages that satisfies the Measure of Performance.
     - Defines the conditions for closing the Significant Accomplishment (SA).
     - Answers the question: how do we know when a Significant Accomplishment has been completed?
  42. What makes a Not So Good SA or AC?
     - Not significant
       - Too small to significantly contribute to successful event completion
       - Would lead to trivial tasks (e.g., 1-day duration)
     - Ambiguous
       - The reader can't tell what Done looks like
     - Wrong verb
       - Uses a verb that's not on the list (the verb dictionary)
       - Uses a listed verb incorrectly
       - Doesn't have a verb at all
     - Not measurable
       - Can't tell when we're done
     - Too many SAs or ACs
       - Confuses the reader and confuses the execution of the program
       - Dilutes the MoE and MoP
       - Reduces visibility into increasing maturity
  43. F-22 Example
     - Program Event (PE): a PE assesses readiness or completion as a measure of progress. Example: First Flight Complete.
     - Significant Accomplishment (SA): the desired result(s) prior to or at completion of an event that demonstrate the level of the program's progress. Example: Flight Test Readiness Review Complete.
     - Accomplishment Criterion (AC): definitive evidence (measures or indicators) that verifies a specific accomplishment has been completed. Example: SEEK EAGLE Flight Clearance Obtained.
  44. 44. !  PE’s  are  easy,  they  are  in  the  SETR  and  Integrated   LogisLcs  Lifecycle  Management  System   !  The  SA’s  can  be  defined  from  the  program’s   deliverables  –  these  may  not  be  obvious  but  can   be  discovered  with  a  Product  Development   Kaizen  process  (more  later)   !  It’s  the  AC’s  that  are  hard  part  –  the  ACs  must   represent  the  exit  criteria  for  the  series  of  Work   Packages  that  do  the  work  to  produce  the   product   44   Now  comes  the  Hard  Part   6.0  Build  IMP  
  45. 6 Steps to IMP Development. Let's start to build the IMP. This step-by-step process needs to be followed carefully. The IMP is constructed one Program Event at a time – left to right in time. To do otherwise allows confusion and disconnects between Program Events and dilutes our focus on defining what Done looks like for each Program Event.
  47. Quick View of the Step-by-Step IMP
     1. Identify the Program Events
     2. Identify the Significant Accomplishments
     3. Identify the Accomplishment Criteria
     4. Identify the Work Packages needed to complete the Accomplishment Criteria
     5. Sequence the Work Packages (WP), Planning Packages (PP), and Summary Level Planning Packages (SLPP) in a logical network
     6. Adjust the sequence of WPs, PPs, and SLPPs to mitigate major risks
  48. Step 1 – Identify the Program Events (Actors: Processes → Outcomes)
     - Systems Engineer: Define the process flow for product production from contract award to end of contract → Confirm Program Events represent the logical process flow for program maturity
     - Program Manager: Confirm the customer is willing to accept the process flows developed by the IMP → Engage with contracts and the customer on PE definition
     - Project Engineer: Identify interdependencies between program event work streams → Identify Value Stream components at the PE level before flowing them down to the SA level
     - IMP/IMS Architect: Capture Program Event contents for each IPT or work stream → Establish the foundation for a structure that supports the description of the increasing maturity as well as the flow of needed work
     (Copyright © 2012, Glen B. Alleman, Niwot Ridge, LLC)
  49. Outcomes of Step 1
     - Confirm the end-to-end description of the increasing maturity of the program's deliverables
     - Establish RFP or contract target dates for each Event
     - Socialize the language of speaking in "Events" rather than time and effort
  50. Events Define the Assessment of the Program's Maturity
     - Program Events are maturity assessment points in the program
     - They define what levels of maturity for the products and services are needed before proceeding to the next maturity assessment point
     - The entry criteria for each Event define the units of measure for the successful completion of the Event
     - The example below is typical of the purpose of a Program Event:
     "The Critical Design Review (CDR) is a multi-disciplined product and process assessment to ensure that the system under review can proceed into system fabrication, demonstration, and test, and can meet the stated performance requirements within cost (program budget), schedule (program schedule), risk, and other system constraints."
  51. Step 2 – Identify the Significant Accomplishments (SA) for Each Program Event (PE)
     - Systems Engineer: Identify the Integrated Product Teams (IPT) responsible for the SAs → Define the boundaries of these programmatic interfaces; define technical and process risk categories and their bounds
     - Technical Lead: Confirm the sequence of SAs has the proper dependency relationships → Define how the product development flow improves maturity; define technical risk drivers
     - Project Engineer: Confirm the logic of the SAs for project sequence integrity → Define how the program flow improves maturity
     - Control Account Manager: Validate SA outcomes in support of PE entry conditions → Confirm budget and resources are adequate for the defined work effort
     - IMP/IMS Architect: Assure the assessment points provide a logical flow of maturity at the proper intervals for the program → Maintain the integrity of the IMP, WBS, and IMS
  52. Outcomes of Step 2
     - The Significant Accomplishments are the "road map" to the increasing maturity of the program
     - The "Value Stream Map" resulting from the flow of SAs describes how the products or services move through the maturation process while reducing risk
     - The SA map is the path to "done"
  53. SAs define the entry criteria for each Program Event. (Example: Preliminary Design Review Complete.)
  54. Step 3 – Identify the Accomplishment Criteria (AC) for each Significant Accomplishment (SA)
     - CAM: Define and sequence the contents of each Work Package and select the EV criteria for each Task needed to roll up the BCWP measurement → Establish ownership for the content of each Work Package and its exit criteria, the Accomplishment Criteria (AC)
     - Project Engineer: Identify the logical process flow of the Work Packages to assure the least effort, maximum value, and lowest risk path to the Program Event → Establish ownership for the process flow of the product or service
     - Technical Lead: Assure all technical processes are covered in each Work Package → Establish ownership for the technical outcome of each Work Package
     - IMP/IMS Architect: Confirm the process flow of the ACs can follow the DI-MGMT-81650 structuring and risk assessment processes → Guide the development of outcomes for each Work Package to assure increasing maturity of the program
  55. Outcomes of Step 3
     - The definition of "done" emerges in the form of deliverables rather than measures of cost and the passage of time
     - At each Program Event, the increasing maturity of the deliverables is defined through the Measures of Effectiveness (MoE) and Measures of Performance (MoP)
  56. ACs are higher-fidelity descriptions of "Done" than SAs. (Example: Critical Design Review Complete.)
  57. Step 4 – Identify the Work for Each Accomplishment Criterion in Work Packages
     - Control Account Manager: Identify or confirm that the work activities in the Work Package represent the allocated work → Define the bounded work effort "inside" each Work Package
     - Technical Lead: Confirm this work covers the SOW and CDRLs → Make all work effort for 100% completion of the deliverable visible in a single location, the Work Package; confirm risk drivers and duration variances
     - IMP/IMS Architect: Assist in sequencing the work efforts in a logical manner → Develop the foundation of the maturity flow that starts to emerge from the contents of the Work Packages
     - Earned Value Analyst: Assign the initial BCWS from the BOE to each Work Package → Confirm the work effort against the BOEs; define the EVT for measuring progress to plan
  58. Outcomes of Step 4
     - The work that produces a measurable outcome is identified
     - This work is defined in each Work Package
     - The Accomplishment Criteria (AC) state explicitly what "done" looks like for this effort
     - With "done" stated, Measures of Performance and Measures of Effectiveness can be defined
  59. Work is done in "packages" that produce measurable outcomes. (Example: Launch Readiness Review Complete.)
  60. Step 5 – Sequence the Work Packages (ACs) for each Significant Accomplishment (SA)
     - Control Account Manager: Define the order of the Work Packages needed to meet the Significant Accomplishments for each Program Event → Define the process flow of work and the resulting accomplishments; assure value is being produced at each SA and the ACs that drive them
     - IMP/IMS Architect: Assure the sequence of Work Packages adheres to the guidance provided by DCMA and the EVMS System Description → Begin structuring the IMS for compliance and loading into the cost system
     - Program Controls Staff: Baseline the sequence of Work Packages using Earned Value Techniques (EVT) with measures of Physical Percent Complete → Develop insight into progress to plan with measures of physical progress for each Work Package (EVT)
  61. Outcomes of Step 5
     - Work Packages partition work efforts into "bounded" scope
     - Interdependencies constrained to Work Package boundaries prevent "spaghetti code" style schedule flow
     - Visibility of the increasing flow of maturity starts to emerge from the flow of Accomplishment Criteria (AC)
  62. Step 6 – Sequence the Work Packages (ACs) into an IMS for each Program Event
  63. Assemble the Final IMP/IMS
     - IMP/IMS Architect: Starting with the ACs under each SA, connect the Work Packages in the proper order for each Program Event → Establish the Performance Measurement Baseline framework; identify MoE and MoP points in the IMP
     - Program Manager: Confirm the work efforts represent the committed activities for the contract → Review and approve the IMS, ready for baseline; review and approve the risk drivers and duration variance models
     - Project Engineer: Assess the product development flow for optimizations → Review and approve the IMS, ready for baseline; identify risk drivers and their mitigations
     - Systems Engineer: Confirm the work process flows result in the proper products being built in the right order → Confirm risk drivers and duration variances; review and approve the IMS, ready for baseline
  64. Outcomes of Step 6
     - Both the maturity assessment criteria and the work needed to reach that level of maturity are described in a single location
     - Risks are integrated with the IMP and IMS at their appropriate levels:
       - Risks to Effectiveness – risk to the JROC KPPs
       - Risks to Performance – risk to the program KPPs and TPMs
     - Leading and lagging indicator data are provided through each measure to forecast future performance
  65. The Previous 6 Steps Result in a Credible IMP/IMS
     - The IMP is the "Outer Mold Line", the framework, the "going forward" strategy for the program.
     - The IMP describes the path to increasing maturity and the Events measuring that maturity.
     - The IMP tells us "how" the program will flow with the least risk, the maximum value, and the clearest visibility of progress.
     - The IMS tells us what work is needed to produce the product or service at the Work Package level.
     Our Plan tells us "how" we are going to proceed. The Schedule tells us "what" work is needed to proceed.
  66. Significant Accomplishments for an actual Program Event. (Example: Flight Test Ascent Aerodynamics Confirmed.)
  67. Horizontal and Vertical Traceability of the IMP/IMS
     - Vertical traceability: AC → SA → PE
     - Horizontal traceability: WP → WP → AC
     - Program Events define the maturity of a Capability at a point in time; Significant Accomplishments represent requirements that enable Capabilities; Accomplishment Criteria are the exit criteria for the Work Packages that fulfill Requirements.
     - In the Integrated Master Schedule, work is sequenced to produce the outcomes for each WP.
  68. 68. The  IMP’s  connecLon  to  the  WBS   !  Start  with  the  Significant  Accomplishments  and  sequence   them  to  the  maturity  flow  for  each  Program  Event   !  The  WBS  connecLons  then  become  orthogonal  to  this  flow   68   7.0  6  Steps   Program  Event   SRR   SDR   PDR   CDR   TRR   ATLO   Work  Breakdown   Structure   4.920-­‐SDAI   A01,  A02   B01   C01,  C02   D01   E01   F01   4.200-­‐Sys  Test   A05   B03,  B04   D02,  D03   E02   F02   4.300-­‐Radar   A03   B02   C03   E03   4.330-­‐O&C  Sys   A06,  A07   B05   C04   D04   E04   F03,  F04   4.400-­‐  I&T   A08   C05   E05,  E06   F05   4.500-­‐Support   A09   D05   E07   F06,  F07  
  75. Nuances of These 6 Steps. Building the Program Event, to Significant Accomplishment, to Accomplishment Criteria decomposition is straightforward. For each Program Event, simply identify the Significant Accomplishments needed for the entry and exit criteria, and the Accomplishment Criteria for the Work Packages that produce them. Yeah, right – no problem.
  77. Quick View of the Nuances
     - Unfortunately, building a credible IMP/IMS is a nuanced process, subject to many opportunities for diversions, blind alleys, and false starts
     - It is slightly counter-intuitive, compared to the traditional scheduling approach, to start with the vertical integration – but it is critical to start vertically
     - Success requires the full participation of Systems Engineering, the CAMs, and the Program Manager
     - Success requires everyone to understand the nuances of the IMP building effort
  78. The 1st Nuance – We need to change the Planning Paradigm
     - Beginner: Take the program events and use the WBS to define the SAs and ACs. Identify the tasks to produce the ACs and SAs. Collect them into Program Events. Organize the tasks by Work Package. Sequence the work packages (ACs).
     - Intermediate: Examine the exit criteria for the Program Events – what does Done look like? Ask what the entry criteria are for each Program Event. Build the ACs to support these entry criteria. Pull these together under each SA.
     - Advanced: Determine the technical and programmatic maturity for each Program Event from the Concept of Operations. Assess the SAs for each Integrated Product Team in terms of their stream's maturity at that point in the program. Sequence the SAs for each PE and assess the units of measure of "maturity". Build the ACs to support each SA's level of maturity.
     (Copyright © 2012, Glen B. Alleman, Niwot Ridge, LLC)
  79. The 2nd Nuance – We need to speak about Increasing Maturity
     - Beginner: The sequence of work closely matches the horizontal schedules of the past. Align this work with the WBS elements. Focus on the WBS Dictionary of the delivered products or services. No explicit TPM, MoE, or MoP elements.
     - Intermediate: The sequence of work is related to the Program Events, but essentially "hangs" from the PE to the SAs and then the ACs. All deliverables are visible, but their TPMs and other system measures are not stated in the IMP or its narrative.
     - Advanced: There is a narrative, in the form of SAs and ACs, that describes how the program moves from left to right along its maturity path. Risk buy-down and retirement are visible. Intermediate Technical Performance Measures, Measures of Effectiveness, and Measures of Performance are visible in the IMP.
     (Copyright © 2012, Glen B. Alleman, Niwot Ridge, LLC)
  80. The 3rd Nuance – Everything Foots and Ties to the IMP and IMS
     - Beginner: The IMS contains all the proper fields in columns and is horizontally linked. The WBS elements can be found for all work elements. CDRLs are visible, with their multiple delivery dates connected to each Program Event. The WBS is structured in a product manner, or possibly a functional manner, with some deliverables defined in the terminal nodes.
     - Intermediate: The WBS is properly formed inside each AC with incremental deliverables. The WBS numbers form a "well structured" tree, but it is still not "pure" in the sense of deliverables only, with no functional elements. Each column and each field can be "pivoted" to form a proper "tree" of value flow.
     - Advanced: The WBS is a "pure" Product Breakdown Structure (PBS) plus the services needed to produce those products. The WBS defines the structure of the delivered product or service. The vertical trace of the IMP describes the flow of increasing maturity of these products or services. The horizontal trace of the IMP describes the work to be done to produce this maturity.
  81. The 4th Nuance – The IMP/IMS is the Programmatic Architecture
     - Beginner: The IMP is built from the WBS for each Program Event. The IMP is seen as a compliance document that lists the Program Events and a "bunch of stuff" underneath.
     - Intermediate: The IMP is structured around separate Program Events, but below the SAs it looks like a "shop floor" schedule with little vertical connectivity.
     - Advanced: The IMP is built as a "value stream" flow for the program by the Systems Engineers. This programmatic architecture is built in the same way the technical system architecture is built. It is derived from the ConOps and the Tier 1 System Requirements. The IMP shows explicitly how these are supported in the flow of the SAs.
  82. The 5th Nuance – The IMP/IMS connects all the dots. (Diagram: Measures of Effectiveness, Measures of Performance, and Technical Performance Measures connect to Risk, with its aleatory and epistemic uncertainties, informed by reference classes, past performance, subject matter experts, the system architecture, and AHP.)
  83. Connecting the IMP to Program Performance Measures. Assembling the IMS from the IMP appears to be a straightforward process – detail the tasks that support the Accomplishment Criteria. But there are some critical steps that must be done in the right order to end up with a risk-tolerant IMS. Let's do this for our program.
  84. The Primary Role of the IMP is to describe what done looks like in MoEs and MoPs. (On 19 October 1899, Robert Goddard decided that he wanted to "fly without wings" to the Moon.)
  85. Quick View of the IMP/IMS Framework
     - Measures of increasing maturity for the key deliverables are the foundation for increasing the Probability of Program Success
     - Measures of Effectiveness (MoE) and Measures of Performance (MoP) are defined in the Integrated Master Plan (IMP) Narrative
     - Key Performance Parameters (KPP), both JROC and program specific, are needed
     - Technical Performance Measures (TPM) are needed for all key deliverables
  86. Components we'll meet along the way to our destination, the credible PMB. (Diagram: the Performance Measurement Baseline picture again – SOO, ConOps, SOW, WBS, technical and operational requirements, CWBS and CWBS Dictionary, IMP, IMS, Risk Management, and the Earned Value Management System, with Measures of Effectiveness, Measures of Performance, Measures of Progress, JROC and program-specific Key Performance Parameters, and Technical Performance Measures. The TPMs live here. Objective status and essential views support the proactive management processes needed to keep the program GREEN.)
  87. Components of our Final Destination. (Same PMB diagram, annotated with the definition of Technical Performance Measurement.) "Technical Performance Measurement (TPM) involves predicting the future values of a key technical performance parameter of the higher-level end product under development, based on current assessments of products lower in the system structure. Continuous verification of actual versus anticipated achievement for selected technical parameters confirms progress and identifies variances that might jeopardize meeting a higher-level end product requirement. Assessed values falling outside established tolerances indicate the need for management attention and corrective action. A well thought out TPM program provides early warning of technical problems, supports assessments of the extent to which operational requirements will be met, and assesses the impacts of proposed changes made to lower-level elements in the system hierarchy on system performance." (https://dap.dau.mil/acquipedia/Pages/Default.aspx)
  88. Identifying the TPMs starts with good Systems Engineering
  89. Policy Guidance for TPMs
     - EIA-632 – describes a technique of predicting the future value of a key technical performance parameter of the higher-level end product under development, based on current assessments of products lower in the system structure
     - INCOSE Systems Engineering Handbook, International Council on Systems Engineering
     - INCOSE Metrics Guidebook for Integrated Systems and Product Development
  90. The NDIA EVM Intent Guide Says: notice the inclusion of Technical along with Cost and Schedule. That's the next step in generating Value from Earned Value – EV MUST include the Technical Performance Measures.
  91. Some More Guidance: "Systems engineering uses technical performance measurements to balance cost, schedule, and performance throughout the life cycle. Technical performance measurements compare actual versus planned technical development and design. They also report the degree to which system requirements are met in terms of performance, cost, schedule, and progress in implementing risk handling. Performance metrics are traceable to user-defined capabilities." – Defense Acquisition Guide (https://dag.dau.mil/Pages/Default.aspx). In the end, it's all about Systems Engineering.
  92. More Guidance Can Be Found in …
  93. This Has All Been Said Before. We Just Weren't Listening … "… the basic tenets of the process are the need for seamless management tools that support an integrated approach … and 'proactive identification and management of risk' for critical cost, schedule, and technical performance parameters." – Secretary of Defense, Perry memo, May 1995. Why is this hard to understand? We seem to be focused on EV reporting, not the use of EV to manage the program. Getting the CPR out the door is treated as the end of Program Planning and Control's efforts, not the beginning. (TPM Handbook, 1984.)
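One common way to act on "EV must include the Technical Performance Measures" is to discount a work package's earned value by its TPM compliance. The sketch below is only an illustration of that idea under assumed threshold semantics; it is not a method prescribed by this deck or by the NDIA guide, and the numbers are invented.

```python
def tpm_compliance(current: float, required: float, higher_is_better: bool = True) -> float:
    """Fraction of the required TPM value achieved, clamped to [0, 1]."""
    ratio = current / required if higher_is_better else required / current
    return max(0.0, min(1.0, ratio))

def tpm_adjusted_bcwp(bcwp: float, compliances: list) -> float:
    """Discount earned value by the worst-performing TPM on the work package."""
    return bcwp * min(compliances) if compliances else bcwp

# Illustrative: a WP has earned $110k by schedule-based measures, but its
# availability TPM sits at 97% against a 99% requirement.
print(tpm_adjusted_bcwp(110_000.0, [tpm_compliance(0.97, 0.99)]))  # ~107,778
```

The effect is that BCWP cannot be claimed in full while a critical technical parameter lags its plan, which is the behavior the Perry memo's "integrated approach" is asking for.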
  94. A Final Reminder … TPMs are one source for Risk Management processes. (Risk Management Guide for DoD Acquisition, Sixth Edition (Version 1.0), August 2006.)
  95. Starting out on the Right Foot …
     - Most IMP/IMS literature states how to build an IMP from the RFP and contractual elements in simple, and maybe simple-minded, terms
       - Decompose the events into SAs, ACs, and their Tasks – sounds easy
     - This approach fails to provide advice on several things:
       - How to minimize the topological connections between Events
       - How to increase the concurrency between IPTs
       - How to increase the tolerance of the IMS to disruptive events
         - Known and knowable risk
         - Unknown and possibly unknowable risk
     - The construction of the IMS needs to take place in what seems to be reverse order:
       - Build the IMP as a Value Stream Map describing the increasing maturity – and therefore the increasing VALUE of the deliverables to the customer.
       - It's the delivery of Customer Value that inverts the management process and focuses on keeping the program GREEN as planned, which maximizes value to the customer (the Government).
  96. SETR Program Events. (https://acc.dau.mil/docs/technicalreviews/dod_tech_reviews.htm)
  97. Build PEs Left to Right. Start with SRR (or something on the left) and completely define its completion before moving to the next PE.
     - Define the SAs for an Event and construct a work flow of the activities needed to satisfy the SA
       - These activities are not yet tasks, so don't commit too soon to defining the detailed work
       - Isolate the SAs by event first – only work on one event at a time
     - Identify the participants in the work
       - What IPTs participate in this work?
       - What swim lanes are needed to isolate the IPTs?
     - Define the elements
       - Activities performed to satisfy the SA
       - Deliverables that result from these activities
     - These are still not Accomplishment Criteria (AC); that comes next
  98. The Accomplishment Criteria (AC)
     - A definitive measure or indicator that verifies completion of work for the accomplishment:
       - Completed work effort – Manufacturing Plan Completed
       - Confirmation of performance compliance – Flight Test Report Approved
       - Incremental verification – Maintenance Demonstration Completed
       - Completed critical process activities – Risk Management Plan Approved
  99. The Accomplishment Criteria (AC)
     - Define the measure by which an Accomplishment (SA) is considered "done"
     - Terms like complete, delivered, and closed have no "units of measure" in the context of a Significant Accomplishment (SA) and are open to interpretation
     - Terms like these are used to define the "exit criteria" (see the sketch after this list):
       - Measures of completion – 80% of drawings approved for release
       - Counts of available items – 75% of pin-outs assigned voltage
       - Fidelity of a design – outer mold line defined within 90% of target
       - Error bounds – spacecraft mass known to ±20%
       - Performance parameters – disconnect force within allowed limits
       - Maturity parameters – flight article successful in last 3 tests
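A minimal sketch of recording such quantified exit criteria so they can be checked mechanically. The criteria are the illustrative ones from this slide; the current values and the two threshold styles (minimum fraction, tolerance band) are assumptions for the example.

```python
from dataclasses import dataclass

@dataclass
class ExitCriterion:
    name: str
    current: float
    threshold: float
    kind: str                     # "at_least" or "within" (illustrative styles only)

    def met(self) -> bool:
        if self.kind == "at_least":   # e.g. fraction of drawings approved for release
            return self.current >= self.threshold
        if self.kind == "within":     # e.g. mass uncertainty within +/- X percent
            return abs(self.current) <= self.threshold
        raise ValueError(f"unknown criterion kind: {self.kind}")

criteria = [
    ExitCriterion("Drawings approved for release (fraction)", 0.82, 0.80, "at_least"),
    ExitCriterion("Pin-outs assigned voltage (fraction)",     0.71, 0.75, "at_least"),
    ExitCriterion("Spacecraft mass uncertainty (%)",          18.0, 20.0, "within"),
]
for c in criteria:
    print(f"{c.name}: {'MET' if c.met() else 'OPEN'}")
```

The point of writing exit criteria this way is that "done" becomes a pass/fail check against a number, not an interpretation of a word like "complete."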
  100. Two Types of Accomplishment Criteria: Entry and Exit
     - Entry Criteria – substantiate readiness for the review
     - Exit Criteria – substantiate successful completion of the review
     - Critical Design Review (CDR) example:
       - Are we ready for the Flight Test Readiness Review?
       - How do we know the FTRR was a success?
       - What did we learn from the FTRR that increases the maturity of the program's deliverables?
  101. The IMP Focuses Us on Measures of Effectiveness and Performance. (Diagram: the Mission Need flows through MoE → KPP → MoP → TPM. The acquirer defines the needs and capabilities in terms of operational scenarios; the supplier defines physical solutions that meet the needs of the stakeholders. MoEs are operational measures of success related to the achievement of the mission or operational objective being evaluated. MoPs are measures that characterize physical or functional attributes relating to the system operation. TPMs are measures used to assess design progress, compliance to performance requirements, and technical risks.)
  102. Measure of Effectiveness (MoE)
     - Measures of Effectiveness …
       - Are stated in units meaningful to the buyer,
       - Focus on capabilities independent of any technical implementation,
       - Are connected to mission success.
     - "The operational measures of success that are closely related to the achievement of the mission or operational objectives evaluated in the operational environment, under a specific set of conditions." – "Technical Measurement," INCOSE-TP-2003-020-01
     - MoEs belong to the End User.
  103. Measure of Performance (MoP)
     - Measures of Performance …
       - Are attributes that assure the system has the capability to perform,
       - Assess the system to assure it meets the design requirements needed to satisfy the MoE.
     - "Measures that characterize physical or functional attributes relating to the system operation, measured or estimated under specific conditions." – "Technical Measurement," INCOSE-TP-2003-020-01
     - MoPs belong to the Program – developed by the Systems Engineer, measured by the CAMs, and analyzed by PP&C.
104. 104. Key Performance Parameters (KPP): Both JROC and Program Specific
! Key Performance Parameters …
! Have a threshold or objective value,
! Characterize the major drivers of performance,
! Are considered Critical to Customer (CTC).
Represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessment, or termination of the program.
"Technical Measurement," INCOSE–TP–2003–020–01
The acquirer defines the KPPs during operational concept development – KPPs say what DONE looks like
104 9.0 Framework
105. 105. Technical Performance Measures (TPM) for Key Deliverables
! Technical Performance Measures …
! Assess design progress,
! Define compliance to performance requirements,
! Identify technical risk,
! Are limited to critical thresholds,
! Include projected performance.
Attributes that determine how well a system or system element is satisfying or expected to satisfy a technical requirement or goal.
"Technical Measurement," INCOSE–TP–2003–020–01
105 9.0 Framework
106. 106. What are Technical Performance Measures Really?
! TPMs are measures of the system technical performance that have been chosen because they are indicators of system success. They are based on the driving requirements or technical parameters of high risk or significance, e.g., mass, power, or data rate.
! TPMs are analogous to the programmatic measures of expected total cost or estimated time-to-completion. There is a required performance, a current best estimate, and a trend line.
! Actual versus planned progress of TPMs is tracked so the systems engineer or project manager can assess progress and the risk associated with each TPM.
! The final, delivered system value can be estimated by extending the TPM trend line and using the recommended contingency values for each project phase.
! The project life trend-to-date, current value, and forecast of all TPMs are reviewed periodically (typically monthly) and at all major milestone reviews.
106 9.0 Framework
107. 107. Tracking the Technical Performance Measures
! Tracking TPMs and comparing them to the resource growth provides an early warning system to detect deficiencies or excesses
! Reserve allocations narrow as design proceeds
! TPMs that violate reserve allocations, or have trends that do not meet the final performance, trigger corrective actions
107 9.0 Framework
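As an illustration of the trend-line idea on the last two slides, the sketch below fits a simple linear trend to a hypothetical mass TPM history, projects it to the delivery date, and flags a breach of the allocation once a phase contingency is added. The TPM, its values, the allocation, and the contingency percentage are all assumed for the example:

```python
# Minimal sketch of TPM trend tracking, assuming a mass TPM measured monthly.
# All values are hypothetical; a real program would pull these from its TPM log.

months  = [1, 2, 3, 4, 5, 6]                 # reporting periods to date
mass_kg = [410, 416, 419, 425, 428, 431]     # current best estimates each period
allocation_kg = 450                           # not-to-exceed allocation
contingency_pct = 0.03                        # phase-dependent contingency allowance
delivery_month = 12

# Ordinary least-squares slope and intercept (pure Python, no libraries needed)
n = len(months)
mean_x = sum(months) / n
mean_y = sum(mass_kg) / n
slope = sum((x - mean_x) * (y - mean_y) for x, y in zip(months, mass_kg)) \
        / sum((x - mean_x) ** 2 for x in months)
intercept = mean_y - slope * mean_x

forecast = intercept + slope * delivery_month
forecast_with_contingency = forecast * (1 + contingency_pct)

print(f"Trend: {slope:.1f} kg/month, forecast at delivery: {forecast:.0f} kg")
if forecast_with_contingency > allocation_kg:
    print("TPM breaches its allocation once contingency is applied -> corrective action")
else:
    print(f"Margin remaining: {allocation_kg - forecast_with_contingency:.0f} kg")
```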
108. 108. Notional TPM Design Margin Guidelines 108 9.0 Framework
109. 109. Sample IMP: A Flight Avionics System (Continued)
Hardware PDR – Purpose
! Ensure system hardware initial design has been updated, and meets functional and allocated performance requirements within program constraints.
! Operational security concept assessed.
! Ensure training requirements have been analyzed and their objectives have been defined for training missions.
! Confirm that training objectives and MTC design and integration conform to the Air Force syllabus and Ready Aircrew Program (RAP).
! Training plan will be updated.
Hardware PDR – Expectations
! Team agrees system hardware initial design has been updated and can proceed to the detailed design phase.
! Team agrees training plans and objectives correlate with the Air Force syllabus, RAP, and training planning; development can continue.
! System Specification and TTL requirements are traceable to the allocated hardware design.
109 9.0 Framework
110. 110. Sample IMP: A Flight Avionics System (Continued)
Hardware PDR – Entry Criteria
! Functional Baseline Authenticated (FBA)
! Initiate system hardware initial design and allocate functions to the appropriate Configuration Items
! All specifications updated and required documentation is made available, including anticipated lower level design documentation
! All SRR/SFR action items closed or dispositioned
110 9.0 Framework
111. 111. Sample IMP: A Flight Avionics System (Continued)
Hardware PDR – Accomplishments
! System hardware initial design complete.
– Functions allocated to one or more hardware configuration items and are traceable to the MTC SSS and TSSC SSS.
– Human, safety, R&M, EMI, operational security, instructor and operator interfaces, etc., design factors have been reviewed.
! Draft instructor and operator manuals reviewed
! Program risks updated, assessed, and reviewed.
– Mitigation plans in place.
! Program schedule and constraints updated and reviewed.
– Critical schedule path drivers reviewed.
! Design criteria for the simulation and database development reviewed and updated.
! Program processes and metrics reviewed.
! Test planning activities and relevant documentation reviewed by the test team.
111 9.0 Framework
112. 112. Sample IMP: A Flight Avionics System (Concluded)
Hardware PDR – Exit Criteria
! Hardware (ownership, visual, IOS, brief/debrief) design reviewed, allocated to a hardware configuration item, and updated to include instructor and operator interfaces, malfunction and control requirements, etc.
! RTM updated; MTC SSS, TSSC SSS, and TTL traceable to the allocated hardware design, to include ESOH requirements.
! Human, safety, R&M, EMI, operational security, instructor and operator interfaces, etc., design factors reviewed.
! MTC and TSSC allocated baselines established and controlled by appropriate level documentation for PDR.
! Draft instructor and operator manuals reviewed with user concurrence; satisfactory human factors design incorporated into the operator interfaces.
! Risk management and mitigation plans updated, in place, addressing ESOH plans and risks, and within program constraints.
! Risks assessed, understood, documented, and accepted by the team.
! Program schedule reviewed
– Critical path drivers identified
– IMS updated and reflects the critical paths
112 9.0 Framework
113. 113. One More IMP Sample: Preliminary Design Review
! (SA) System & Segment Requirements Updated & Allocated
– (AC) SRR / SDR Update Review Conducted
– (AC) Preliminary System Specification Documents (A011) Baselined
– (AC) Preliminary Spacecraft Segment Specification Baselined
– (AC) Preliminary Ground Segment Specification Baselined
– (AC) Preliminary Specification Tree Baselined
! (SA) Preliminary ICDs Baselined For Customer Review
– (AC) Preliminary Space-Ground ICD Baselined
! (SA) PDR System Design Completed
– (AC) Top Level System Architecture Updated
– (AC) PDR Level System Analyses Completed
– (AC) PDR Level Reliability / Availability Analysis Completed
– (AC) Preliminary System Level Risk Assessment Completed
– (AC) System Level Plans Updated For PDR
– (AC) Flight Long Lead Review Conducted
113 9.0 Framework
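A Program Event with its Significant Accomplishments and Accomplishment Criteria forms a simple hierarchy, so one way (an assumption of this write-up, not a structure the deck prescribes) to keep it machine-checkable is a small set of data classes. The example reuses a subset of the PDR sample above:

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class AccomplishmentCriterion:
    text: str
    complete: bool = False

@dataclass
class SignificantAccomplishment:
    text: str
    criteria: List[AccomplishmentCriterion] = field(default_factory=list)

    def is_complete(self) -> bool:
        # An SA is "done" only when every one of its ACs is done
        return all(ac.complete for ac in self.criteria)

@dataclass
class ProgramEvent:
    name: str
    accomplishments: List[SignificantAccomplishment] = field(default_factory=list)

    def is_complete(self) -> bool:
        return all(sa.is_complete() for sa in self.accomplishments)

# Subset of the PDR sample above
pdr = ProgramEvent("Preliminary Design Review", [
    SignificantAccomplishment("System & Segment Requirements Updated & Allocated", [
        AccomplishmentCriterion("SRR / SDR Update Review Conducted", complete=True),
        AccomplishmentCriterion("Preliminary Specification Tree Baselined"),
    ]),
    SignificantAccomplishment("PDR System Design Completed", [
        AccomplishmentCriterion("Top Level System Architecture Updated"),
    ]),
])

print(f"{pdr.name} complete? {pdr.is_complete()}")
```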
114. 114. The "But" for this Guidance
! With these samples and the SETR guidance we've just started
! The program needs to define program-specific events to assure the actual maturity measures are captured
– The IMP provides sufficient definition to track the step-by-step completion of the required accomplishments for each event and to demonstrate satisfaction of the completion criteria for each accomplishment. [AFMC PAMPHLET 63-5]
114 9.0 Framework
115. 115. IMP Verbs for Significant Accomplishments
Integrated Master Plan Allowable Verbs
– Allocated: Segment requirement is flowed down from the System Specification
– Completed: The subject item, data, document, or process is prepared or concluded, and reviewed and accepted by the responsible IPT. Supporting documentation is available through the IDE
– Conducted: The subject meeting or review has been held with all required participants. The charts or minutes are available through the IDE
– Defined: The subject configuration items, data, or document was submitted to the customer
– Established: The subject item is created and set in place in a manner consistent with its intended use, after review and acceptance by the IPT
– Released: Approved item for delivery to the intended customer or supplier; all internal distribution and sign-offs complete. An electronic version is made accessible on the IDE
– Reviewed: The subject item, data, document, or process is prepared or concluded, and documented for completion. Supporting documentation is available through the IDE
– Updated: The subject process, data, or document has been reevaluated using later information, and adjustments incorporated
– Validated: Requirements are validated, received contractor approvals, were distributed, and are available through the IDE
– Verified: Requirements are verified or processed in accordance with established practice
115 9.0 Framework
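Because SA and AC statements are expected to end in one of these allowable verbs, a draft IMP could be screened with a trivial check like the sketch below. The list of verbs comes from this slide, while the function name and the sample entries are illustrative assumptions:

```python
# Allowable IMP verbs from the slide above; the check itself is illustrative only.
ALLOWABLE_VERBS = {
    "allocated", "completed", "conducted", "defined", "established",
    "released", "reviewed", "updated", "validated", "verified",
}

def ends_with_allowable_verb(statement: str) -> bool:
    """Return True when an SA/AC statement ends with one of the allowable verbs."""
    last_word = statement.rstrip(". ").split()[-1].lower()
    return last_word in ALLOWABLE_VERBS

# Hypothetical draft entries
for entry in ["Space-Ground Interface Control Document Drafted",
              "SRR / SDR Update Review Conducted"]:
    flag = "OK " if ends_with_allowable_verb(entry) else "FIX"
    print(f"{flag}: {entry}")
```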
116. 116. The IMP Process Narrative
! Objective: a brief statement explaining why this process set is applied for this program
! Governing Documentation: lists of the guidance or compliance documents; e.g., specifications, manuals, and procedures, including company, government, and industry references
! Approach: a concise description of who owns each process, what the roles and responsibilities are, and the overall process, including a process flow diagram
116 9.0 Framework
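If a team wanted to keep its process narratives in a uniform, reviewable shape, a minimal structure such as the following could serve. The field names and the sample risk-management narrative are assumptions for illustration, not a format the deck mandates:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class ProcessNarrative:
    """The three sections the slide calls for in each IMP process narrative."""
    objective: str
    governing_documents: List[str]
    approach: str

    def ready_for_review(self) -> bool:
        # A narrative is reviewable only when all three sections are filled in
        return bool(self.objective and self.governing_documents and self.approach)

# Hypothetical narrative for a risk management process (document names invented)
narrative = ProcessNarrative(
    objective="Explain why disciplined risk management is applied on this program",
    governing_documents=["Company Risk Procedure RP-100", "Program Risk Management Plan"],
    approach="Risk owner assigned per IPT; monthly risk board; process flow diagram RM-1",
)
print("Ready for review:", narrative.ready_for_review())
```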
117. 117. Generic IMP Evaluation Criteria
! Do the Program Events and Accomplishments reflect the logical evolution and progress of the overall Program?
– Do program events and their definitions clearly demonstrate the maturity of the program over its life?
– Do the selected Accomplishments and associated Criteria identify meaningful and measurable progress toward the key goals of the Program?
! Do the Accomplishments for each event demonstrate a meaningful understanding of the program requirements, or are they tasks that anyone could do for any contract?
– Do they reflect your SOW requirements?
! Does the IMP structure readily map to the IPT structure such that each IPT can easily visualize the scope of their responsibility?
! When awarded, could the contractor use the Accomplishments as discrete activity cost accounts in their earned value system? Or are they level of effort in nature?
! Is sufficient visibility provided to identify and track the Program Risk Plan and associated risk mitigation accomplishments and/or contingencies?
– Do the criteria supporting the accomplishments include key performance requirements?
! Are IPT cross-dependencies and dependencies external to the Program appropriately reflected if they represent potential schedule or performance risks to the success of the Program?
! Is the submitted "Contract IMP" (the Product IMP that is to be included as part of the Program Contract) defined to the appropriate level?
– Do the Accomplishments and associated Criteria go down to a level sufficient to provide visibility into key subcontractor activities upon which the success of the Program may be dependent?
– Are the Events and Accomplishments included in the IMP at such a level as to make the maintenance of the 'Contractual IMP' practical, or does it include an unnecessary level of detail?
117 9.0 Framework
118. 118. Program Management Levels
! Tier 1 – Program Manager, Technical Leads, IPT Manager; technical performance goals
– IMP/IMS elements: Major Program Events (PE); Significant Accomplishments (SA)
– CWBS: Levels 1 & 2; links to CLINs; Levels 3 & 4; Control Packages; link to PBS; integrated EVMS
! Tier 2 – Control Account Managers; Product Work Plan; responsible organization elements
– IMP/IMS elements: Accomplishment Criteria (AC); Tasks (NA)
– CWBS: Level 5; Cost Account Package; cost collection level; links to WBS by OBS; resource summaries; early warning EVMS analysis
! Tier 3 – Work Package Manager
– IMP/IMS elements: Detailed plans; Work Packages
– CWBS: Earned value calculations
118 9.0 Framework
119. 119. Connecting the Components of the IMP/IMS
! The assembled IMP/IMS links all work activities vertically to the ACs, SAs, and PEs
[Diagram showing the IMP elements (Events, Accomplishments, Criteria, and process narratives) linked to the Integrated Master Schedule elements (control accounts, work packages, and work package tasks), with connections to customer requirements, the WBS, the IPT organization, supplemental schedules, risk and opportunity, performance analysis/management reviews, and the program performance management system]
119 9.0 Framework
  120. 120. 120   9.0  Framework  
121. 121. First Pass At Building The Integrated Master Plan
Many contractors already have work processes to do this. These steps are guidance for contractors new to this process.
With the RFP, the contractor should be capable of performing the following steps to create the Integrated Master Plan and Integrated Master Schedule.
We'll build the IMP/IMS from the point of view of the Government to compare against the contractor's IMP in the proposal.
10 V8.7
122. 122. Focused on the Air Vehicle 122 10. 1st Pass
123. 123. Quick View of 1st Pass for IMP
! Start with WBS
! Use Preliminary Design Review Program Event
! Identify subsystems for flight vehicle
! Identify Significant Accomplishments for PDR completion
! Identify Accomplishment Criteria for each sequence of Work Packages to produce the deliverables for PDR
123 10. 1st Pass
