
CPM-500(D): Implementing Technical Performance Measures, Lesson 3 (v1)


Integrating Technical Performance Measures with Cost and Schedule increases the probability of program success

  1. 1. PMI EVM Community of Practice, IPM 2011
     CPM-500(D): Implementing Technical Performance Measures
     Glen B. Alleman, DoD, 303 241 9633
  2. 2. Learning Objectives
     TLO #9: The student will understand the role of Technical Performance Measurement (TPM) in the project office.
       ELO #1: The student will recognize the policy requirements for Technical Performance Measures.
       ELO #2: The student will recognize the role of Integrated Baseline Reviews in confirming the entire technical scope of work has been planned.
       ELO #3: The student will recognize the role of the WBS in supporting Technical Performance Measure requirements.
     TLO #9: The student will understand the scope of DCMA’s (or other) TPM software management tool implementation.
       ELO #1: The student will recognize the benefits and challenges of Technical Performance Measure implementation.
       ELO #2: The student will recognize the use of control limit charts to track Technical Performance Measure metrics.
       ELO #3: The student will understand the methodology and approach used to show the effect of Technical Performance Measures on Earned Value.
  3. 3. To Achieve Success … We Need to … ©gapingvoid ltd
  4. 4. Increasing the Probability of Program Success Means …
     Building a Credible Performance Measurement Baseline
     [Diagram: Risk, Cost, IMP/IMS, PMB, SOW, WBS, TPM]
     This is actually harder than it looks!
  5. 5. A Core Problem With Earned Value
     Measures of progress must be in units meaningful to the stakeholders.
     Earned Value measures performance in units of “money” (BCWS, BCWP, ACWP).
     We need another measure of progress in units of TIME.
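The money-denominated quantities named on this slide (BCWS, BCWP, ACWP) drive the standard derived variances and indices. A minimal Python sketch, with purely illustrative numbers that come from no real program:

```python
# Conventional Earned Value quantities and the variances/indices derived
# from them. All numeric values below are illustrative examples only.

def ev_metrics(bcws: float, bcwp: float, acwp: float) -> dict:
    """Compute the standard EV variances and indices from the three
    money-denominated inputs: Budgeted Cost of Work Scheduled,
    Budgeted Cost of Work Performed, Actual Cost of Work Performed."""
    return {
        "cost_variance": bcwp - acwp,      # CV: negative means over cost
        "schedule_variance": bcwp - bcws,  # SV: negative means behind schedule
        "cpi": bcwp / acwp,                # Cost Performance Index
        "spi": bcwp / bcws,                # Schedule Performance Index
    }

m = ev_metrics(bcws=100_000.0, bcwp=90_000.0, acwp=95_000.0)
print(m)
```

Note that SV, despite its name, is still expressed in dollars, which is exactly the gap the slide points at: none of these numbers directly measures technical progress or time.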
  6. 6. Doing This Starts With Some Guidance
     Systems engineering uses technical performance measurements to balance cost, schedule, and performance throughout the life cycle. Technical performance measurements compare actual versus planned technical development and design. They also report the degree to which system requirements are met in terms of performance, cost, schedule, and progress in implementing risk handling. Performance metrics are traceable to user–defined capabilities.
     ― Defense Acquisition Guide
     In The End ― It’s All About Systems Engineering
  7. 7. Just a Reminder of the … Primary Elements of Earned Value
     [Diagram: Cost, Schedule, and Technical Performance axes, showing the funding margin for under performance, the schedule margin for under performance or schedule extension, and over cost / over schedule performance against the over target baseline (OTB)]
  8. 8. This Has All Been Said Before. We Just Weren’t Listening …
     “… the basic tenets of the process are the need for seamless management tools, that support an integrated approach … and ‘proactive identification and management of risk’ for critical cost, schedule, and technical performance parameters.”
     ― Secretary of Defense, Perry memo, May 1995
     TPM Handbook 1984
     Why Is This Hard to Understand?
     We seem to be focused on EV reporting, not the use of EV to manage the program. Getting the CPR out the door is the end of Program Planning and Control’s efforts, not the beginning.
  9. 9. The Gap Seems to Start With a Common Problem
     Many times, the information from Cost, Schedule, Technical Performance, and Risk Management gets mixed up when we try to put them together.
  10. 10. The NDIA EVM Intent Guide Says
     Notice the inclusion of Technical along with Cost and Schedule.
     The next step is generating Value from Earned Value:
     EV MUST include the Technical Performance Measures.
  11. 11. Back to Our Technical Performance Measures
     Technical Performance Measures do what they say: Measure the Technical Performance of the product or service produced by the program.
  12. 12. The real question? How fast can we safely go?
     Yes, the Units of Measure are MPH.
  13. 13. Measure of Effectiveness (MoE)
     The operational measures of success that are closely related to the achievement of the mission or operational objectives, evaluated in the operational environment under a specific set of conditions.
     Measures of Effectiveness …
       Are stated in units meaningful to the buyer,
       Focus on capabilities independent of any technical implementation,
       Are connected to mission success.
     MoE’s Belong to the End User
     ― “Technical Measurement,” INCOSE–TP–2003–020–01
  14. 14. Measure of Performance (MoP)
     Measures that characterize physical or functional attributes relating to the system operation, measured or estimated under specific conditions.
     Measures of Performance are …
       Attributes that assure the system has the capability to perform,
       Assessments of the system to assure it meets design requirements to satisfy the MoE.
     MoP’s belong to the Program – Developed by the Systems Engineer, Measured by CAMs, and Analyzed by PP&C
     ― “Technical Measurement,” INCOSE–TP–2003–020–01
  15. 15. Key Performance Parameters (KPP)
     Represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessment, or termination of the program.
     Key Performance Parameters …
       Have a threshold or objective value,
       Characterize the major drivers of performance,
       Are considered Critical to Customer (CTC).
     The acquirer defines the KPPs during operational concept development – KPPs say what DONE looks like.
     ― “Technical Measurement,” INCOSE–TP–2003–020–01
  16. 16. Technical Performance Measures (TPM)
     Attributes that determine how well a system or system element is satisfying or expected to satisfy a technical requirement or goal.
     Technical Performance Measures …
       Assess design progress,
       Define compliance to performance requirements,
       Identify technical risk,
       Are limited to critical thresholds,
       Include projected performance.
     ― “Technical Measurement,” INCOSE–TP–2003–020–01
  17. 17. Dependencies Between These Measures
     The Acquirer defines the needs and capabilities in terms of Operational Scenarios; the Supplier defines physical solutions that meet the needs of the Stakeholders.
     [Diagram: Mission Need → MoE → MoP → TPM, with KPP]
       MoE: Operational measures of success related to the achievement of the mission or operational objective being evaluated.
       MoP: Measures that characterize physical or functional attributes relating to the system operation.
       TPM: Measures used to assess design progress, compliance to performance requirements, and technical risks.
     ― “Coming to Grips with Measures of Effectiveness,” N. Sproles, Systems Engineering, Volume 3, Number 1, pp. 50–58
  18. 18. “Measures” of Technical Measures
       Achieved to Date – Measured technical progress or estimate of progress.
       Current Estimate – Value of a technical parameter that is predicted to be achieved.
       Milestone – Point in time when an evaluation of a measure is accomplished.
       Planned Value – Predicted value of the technical parameter.
       Planned Performance Profile – Profile representing the project time phased demonstration of a technical parameter.
       Tolerance Band – Management alert limits.
       Threshold – Limiting acceptable value of a technical parameter.
       Variances – Demonstrated technical variance; predicted technical variance.
     ― INCOSE Systems Engineering Handbook
  19. 19. A Familiar Graphic of TPMs
     [Chart: Mean Time Between Failure vs. Time (= Program Maturity), showing the Planned Profile and Planned Values at Milestones, the Achieved to Date and Current Estimate, the Variance between them, and the Upper Limit, Lower Limit, and Threshold]
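The attributes in the chart and the INCOSE table above (planned profile, achieved to date, tolerance band, threshold, variance) can be sketched as a small assessment routine. Everything here is a hypothetical illustration; the MTBF numbers are invented:

```python
from dataclasses import dataclass

# Hypothetical sketch of one point on a TPM planned performance profile,
# using the attribute names from the INCOSE table: planned value,
# achieved to date, tolerance band, and threshold.

@dataclass
class TpmPoint:
    milestone: str
    planned_value: float      # planned profile value at this milestone
    achieved_to_date: float   # measured technical progress

def assess(point: TpmPoint, tolerance: float, threshold: float) -> dict:
    """Compute the variance against the planned profile and check it
    against the management tolerance band and the limiting threshold."""
    variance = point.achieved_to_date - point.planned_value
    return {
        "milestone": point.milestone,
        "variance": variance,
        "within_tolerance": abs(variance) <= tolerance,
        "above_threshold": point.achieved_to_date >= threshold,
    }

# Mean Time Between Failure in hours, as in the chart (values invented)
pdr = TpmPoint("PDR", planned_value=500.0, achieved_to_date=460.0)
print(assess(pdr, tolerance=50.0, threshold=400.0))
```

A point can be under its planned value yet still inside the tolerance band and above the threshold, which is exactly the situation the chart's "Variance" arrow depicts.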
  20. 20. TPMs from an Actual Program: Chandra X–Ray Telescope
  21. 21. What Does a Real Technical Performance Measure Look Like?
     Not that bagels are not interesting in Lessons 1 and 2, but let’s get ready to look at a flying machine.
  22. 22. The WBS for a UAV – TPMs Start With the WBS
     1.1 Air Vehicle
     1.1.1 Sensor Platform
     1.1.2 Airframe
     1.1.3 Propulsion
     1.1.4 On Board Comm
     1.1.5 Auxiliary Equipment
     1.1.6 Survivability Modules
     1.1.7 Electronic Warfare Module
     1.1.8 On Board Application & System SW
     1.3 Mission Control / Ground Station SW
     1.3.1 Signal Processing SW
     1.3.2 Station Display
     1.3.3 Operating System
     1.3.4 ROE Simulations
     1.3.5 Mission Commands
  23. 23. What Do We Need to Know About This Program Through TPMs?
       What WBS elements represent the TPMs?
       What Work Packages produce these WBS elements?
       Where do these Work Packages live in the IMS?
       What are the Earned Value baseline values for these Work Packages?
       How are we going to measure all these variables?
       What does the curve look like for these measurements?
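The traceability these questions ask for (TPM to WBS element to Work Package to IMS task to EV baseline) can be sketched as a small lookup structure. Every identifier and budget value below is hypothetical, invented purely to show the shape of the linkage:

```python
# Hypothetical traceability record: which WBS element carries a TPM,
# which Work Package produces it, where that package lives in the IMS,
# and what its Earned Value baseline (BCWS) is. All values invented.

tpm_trace = {
    "vehicle_weight": {
        "wbs_element": "1.1.2 Airframe",
        "work_package": "WP-1120-03",   # hypothetical WP identifier
        "ims_task": "A1234",            # hypothetical IMS task ID
        "bcws": 250_000.0,              # budgeted cost of work scheduled
    },
}

def ev_baseline_for(tpm: str) -> float:
    """Answer one of the slide's questions: the EV baseline value of the
    Work Package that produces this TPM's WBS element."""
    return tpm_trace[tpm]["bcws"]

print(ev_baseline_for("vehicle_weight"))
```

The point of the structure is that each question on the slide becomes a single lookup rather than a hunt across disconnected documents.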
  24. 24. Verifying Each TPM – Evidence That We’re in Compliance
     CA – Do we know what we promised to deliver, now that we’ve won? – With our submitted ROM, what are the values we need to get through the Integrated Baseline Review? How do we measure weight for each program event?
     SFR – Can we proceed into preliminary design? – The contributors to the vehicle weight are confirmed and the upper limits defined in the product architecture and requirements flow down database (DOORS) into a model.
     SRR – Can we proceed into the System Development and Demonstration (SDD) phase? – Do we know all drivers of vehicle weight? Can we bound their upper limits? Can the subsystem owners be successful within these constraints using a high fidelity model?
     PDR – Can we start detailed design, and meet the stated performance requirements within cost, schedule, risk, and other constraints? – Does each subsystem designer have the target component weight and some confidence they can stay below the upper bound? Can this be verified in some tangible way, either through prior examples or a lab model?
     CDR – Can the system proceed to fabrication, demonstration, and test, within cost, schedule, risk, and other system constraints? – Do we know all we need to know to start the fabrication of the first articles of the flight vehicle? Some type of example, maybe a prototype, is used to verify we’re inside the lines.
     TRR – Is the system ready to proceed into formal test? – Does the assembled vehicle fall within the weight range limits for 1st flight – will this thing get off the ground?
  25. 25. Design Model TPM Trends & Responses
     [Chart: Vehicle weight Technical Performance Measure across the program events CA, SFR, SRR, PDR, CDR, TRR – from the ROM in the proposal, through the detailed design model, bench scale model measurement, prototype measurement, to the 1st article flight vehicle, with weight estimates of 28 kg, 26 kg, 25 kg, and 23 kg. EV taken, planned values met, tolerances kept, etc.]
     ― Dr. Falk chart, modified
  26. 26. The Assessment of Weight as a Function of Time
       At Contract Award there is a Proposal grade estimate of vehicle weight.
       At System Functional Review, the Concept of Operations is validated for the weight.
       At System Requirements Review the weight targets are flowed down to the subsystem components.
       At PDR the CAD model starts the verification process.
       At CDR actual measurements are needed to verify all models.
       At Test Readiness Review we need to know how much fuel to put on board for the 1st flight test.
  27. 27. The WBS for a UAV – Airframe Weight TPM (WBS 1.1.2 Airframe)
     The planned weight is 25 kg. The actual weight is 25.5 kg. Close to plan! So we are doing okay, right?
     Event: CA, SFR, SRR, PDR, CDR, TRR
     Planned Value: 28.0 kg, 27.0 kg, 26.0 kg, 25.0 kg, 24.0 kg, 23.0 kg
     Actual Value: 30.4 kg, 29.0 kg, 27.5 kg, 25.5 kg, –, –
     Assessed Risk: Moderate (>2.0 kg off target), Low (1–2 kg off target), Low (1–2 kg off target), Very Low (less than 1.0 kg off target), –, –
     Planned Method: ROM; “Similar to” model; Program–unique design model; Program–unique design model with validated data; Actual measurement of bench–test components; Actual measurement of prototype airframe
     Actual Method: ROM; ROM; ROM; “Similar to” Estimate; –, –
     Here’s the Problem!
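The assessed-risk bands in this table (more than 2.0 kg off target is Moderate, 1 to 2 kg is Low, under 1.0 kg is Very Low) can be sketched directly from the slide's planned and actual weights. The band function below is an assumption inferred from the table's labels:

```python
# Planned and actual airframe weights (kg) per program event, taken from
# the table above. The risk bands mirror the table's labels and are an
# inferred sketch, not an official scale.

planned = {"CA": 28.0, "SFR": 27.0, "SRR": 26.0,
           "PDR": 25.0, "CDR": 24.0, "TRR": 23.0}
actual = {"CA": 30.4, "SFR": 29.0, "SRR": 27.5, "PDR": 25.5}

def assessed_risk(event: str) -> str:
    """Band the weight variance against the TRR target profile."""
    off_target = actual[event] - planned[event]
    if off_target > 2.0:
        return "Moderate"
    if off_target >= 1.0:
        return "Low"
    return "Very Low"

for event in actual:
    delta = actual[event] - planned[event]
    print(f"{event}: {delta:+.1f} kg off plan -> {assessed_risk(event)}")
```

Run over the table, this reproduces the slide's Moderate / Low / Low / Very Low assessment, while also exposing the slide's real complaint: the actual value is converging on the planned value at each event, but the planned profile itself keeps descending toward the 23 kg TRR target.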
  28. 28. Raison d’être for Technical Performance Measures
     The real purpose of Technical Performance Measures is to reduce Programmatic and Technical RISK.
     [Diagram: Risk, Cost, IMP/IMS, PMB, SOW, WBS, TPM]
  29. 29. Buying Down Risk with TPMs
     [Chart: Risk CEV-037, “Loss of Critical Functions During Descent” – planned risk score burn-down over time, through steps such as correlating the analytical model, focus splinter reviews, force and moment wind tunnel tests, block wind tunnel testing, flight application of spacecraft, in-flight development tests, and a damaged TPS flight test]
       “Buying down” risk is planned in the IMS.
       MoE, MoP, and KPP are defined in the work package for the critical measure – weight.
       Weight risk reduced from RED to YELLOW; weight confirmed ready to fly – GREEN at this point.
       If we can’t verify we’ve succeeded, then the risk did not get reduced. The risk may have gotten worse.
  30. 30. Increasing the Probability of Success with Risk Management
     Going outside the TPM limits always means cost and schedule impacts.
     “Coloring Inside the Lines” means knowing how to keep the program GREEN, or at least stay close to GREEN.
     So much for our strategy of winning through technical dominance!
  31. 31. Connecting the EV Variables
     Integrating Cost, Schedule, and Technical Performance assures Program Management has the needed performance information to deliver on‒time, on‒budget, and on‒specification.
     Conventional Earned Value = Cost + Schedule; add the Technical Performance Measures.
     Cost Baseline:
       Master Schedule is used to derive the Basis of Estimate (BOE), not the other way around.
       Probabilistic cost estimating uses past performance and cost risk modeling.
       Labor, materiel, and other direct costs accounted for in Work Packages.
       Risk adjustments for all elements of cost.
     Technical Performance:
       Earned Value is diluted by missing technical performance.
       Earned Value is diluted by postponed features.
       Earned Value is diluted by non compliant quality.
       All these dilutions require adjustments to the Estimate at Complete (EAC) and the To Complete Performance Index (TCPI).
     Schedule Baseline:
       Requirements are decomposed into physical deliverables.
       Deliverables are produced through Work Packages.
       Work Packages are assigned to accountable managers.
       Work Packages are sequenced to form the highest value stream with the lowest technical and programmatic risk.
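One common way to express the "dilution" idea above, offered here as a sketch and not as the deck's prescribed method, is to discount BCWP by the fraction of planned technical performance actually achieved and then recompute the EAC from the adjusted CPI. All numbers are illustrative:

```python
# Sketch of folding a TPM compliance ratio into Earned Value.
# This is one possible adjustment scheme, not an official formula from
# the slides; all dollar amounts are invented for illustration.

def tpm_adjusted_bcwp(bcwp: float, achieved: float, planned: float) -> float:
    """Discount earned value by the TPM compliance ratio, capped at 1.0
    so over-performance does not inflate EV."""
    compliance = min(achieved / planned, 1.0)
    return bcwp * compliance

bac = 1_000_000.0      # Budget at Complete
bcwp_raw = 400_000.0   # earned value before the technical adjustment
acwp = 420_000.0       # actual cost of work performed

# Suppose only 90% of the planned technical performance is demonstrated.
bcwp_adj = tpm_adjusted_bcwp(bcwp_raw, achieved=0.90, planned=1.00)
cpi = bcwp_adj / acwp
eac = bac / cpi        # simple CPI-based Estimate at Complete
tcpi = (bac - bcwp_adj) / (bac - acwp)  # To Complete Performance Index
print(bcwp_adj, eac, tcpi)
```

Under this sketch the missing technical performance pushes the EAC up and the TCPI above 1.0, which is precisely the "adjustments to the EAC and TCPI" the slide calls for.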
  32. 32. TPM Checklist
     MoE:
       Traceable to needs, goals, objectives, and risks.
       Defined with associated KPPs.
       Each MoE independent from others.
       Each MoE independent of any technical solution.
       Address the required KPPs.
     MoP:
       Traceable to applicable MOEs, KPPs, system level performance requirements, and risks.
       Focused on technical risks and supports trades between alternative solutions.
       Provide insight into system performance.
       Decomposed, budgeted, and allocated to system elements.
       Assigned an “owner,” the CAM and Technical Manager.
     TPM:
       Traceable to applicable MoPs, system element performance, requirements, objectives, risks, and WBS elements.
       Further decomposed, budgeted, and allocated to lower level system elements in the WBS and IMS.
       Assigned an owner, the CAM and Work Package Manager.
       Sources of measure identified and processes for generating the measures defined.
       Integrated into the program’s IMS as part of the exit criteria for the Work Package.
  33. 33. Did We Accomplish the Learning Objectives?
     TLO #9: The student will understand the role of Technical Performance Measurement (TPM) in the project office.
       ELO #1 (policy requirements for TPM): Policies and supporting guidance, with links and reference numbers provided.
       ELO #2 (role of IBRs in confirming the entire technical scope of work has been planned): This is the first place where cost, schedule, and technical performance come together – in the Integrated Master Schedule (IMS).
       ELO #3 (role of the WBS in supporting TPM requirements): TPMs are first located in the WBS.
     TLO #9: The student will understand the scope of DCMA’s (or other) TPM software management tool implementation.
       ELO #1 (benefits and challenges of TPM implementation): Progress is measured in units of physical percent complete. TPMs are those units.
       ELO #2 (use of control limit charts to track TPM metrics): We’ve seen notional and actual charts.
       ELO #3 (methodology and approach used to show the effect of TPMs on earned value): The example of our “flying machine” connects the dots for TPMs, risk, cost, and schedule.
  34. 34. [Image–only slide]
  35. 35. Backup Materials
     Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information on it.
     ― Samuel Johnson
  36. 36. Many Sources for Connecting the Dots
       OMB Circular A–11, Section 300
       GAO Report 06–250
       DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08
       Interim Defense Acquisition Guidebook (DAG) 6/15/09
       Systems Engineering Plan (SEP) Preparation Guide 4/08
       WBS Handbook, Mil–HDBK–881A (WBS) 7/30/05
       Integrated Master Plan (IMP) & Integrated Master Schedule Preparation & Use Guide (IMS) 10/21/05
       Guide for Integrating SE into DOD Acquisition Contracts 12/06
       Defense Acquisition Program Support Methodology (DAPS) V2.0 3/20/09
       Standard for Application and Management of the SE Process (IEEE 1220)
       Processes for Engineering a System (ANSI/EIA–632)
       Capability Maturity Model Integration (CMMI®)
       Guide to the Project Management Institute Body of Knowledge (PMBOK Guide®), 4th Edition
       NASA EVM Guide NPG 9501.3
  37. 37. Office of Management and Budget, Circular No. A–11, Section 300
     Planning, Budgeting, Acquisition and Management of Capital Assets
     Section 300–5 – Performance–based acquisition management:
       Based on the EVMS standard
       Measure progress towards milestones:
         • Cost
         • Capability to meet specified requirements
         • Timeliness
         • Quality
  38. 38. Need: Accurate Performance Measurement
     GAO Report 06–250, “Information Technology: Improve the Accuracy and Reliability of Investment Information” – Findings and Recommendations:
       If EVM is not implemented effectively, decisions are based on inaccurate and potentially misleading information.
       Agencies are not measuring actual versus expected performance in meeting IT performance goals.
  39. 39. DOD Guides: Technical Performance
     Department of Defense guidelines for Technical Performance Measures:
       DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08
       Interim Defense Acquisition Guidebook (DAG) 6/15/09
       Systems Engineering Plan (SEP) Preparation Guide 4/08
       WBS Handbook, Mil–HDBK–881A (WBS) 7/30/05
       Integrated Master Plan (IMP) & Integrated Master Schedule Preparation & Use Guide (IMS) 10/21/05
       Guide for Integrating SE into DOD Acquisition Contracts (Integ SE) 12/06
       Defense Acquisition Program Support Methodology (DAPS) V2.0 3/20/09
  40. 40. DoD: TPMs in Technical Baselines and Reviews
     [Matrix mapping the DoD policies and guides (POL, DAG, SEP, WBS, DAPS, IMP/IMS, Systems Engineering, Integrated SE) to: the Functional (SFR), Allocated (PDR), and Product (CDR) technical baselines; event driven timing; success criteria of technical reviews; entry and exit criteria for technical reviews; and assessing technical maturity]
  41. 41. DoD: TPMs in Integrated Plans
     [Matrix mapping the DoD policies and guides (POL, DAG, SEP, WBS, DAPS, IMP/IMS, Systems Engineering, Integrated SE) to: an integrated SEP with IMP/IMS, TPMs, and EVM; an integrated WBS with the requirement specification, Statement of Work, and IMP/IMS/EVMS; and linking risk management, technical reviews, TPMs, EVM, WBS, and IMS]
  42. 42. Guidance in Standards, Models, and the Defense Acquisition Guide
       Processes for Engineering a System (ANSI/EIA–632)
       Standard for Application and Management of the SE Process (IEEE 1220)
       Capability Maturity Model Integration (CMMI®)
         – CMMI for Development, Version 1.2
         – CMMI for Acquisition, Version 1.2
         – Using CMMI to Improve Earned Value Management, 2002
       Guide to the Project Management Institute Body of Knowledge (PMBOK Guide®), 4th Edition
  43. 43. Technical Performance Measures (TPM) – More Sources
     IEEE 1220:
       Performance–based progress measurement.
       TPMs are key to progressively assess technical progress.
       Establish dates for checking progress and for meeting full conformance to requirements.
     EIA–632 Glossary:
       Predict future value of key technical parameters of the end system based on current assessments.
       Planned value profile is time–phased achievement:
         • Achievement to date
         • Technical milestone where TPM evaluation is reported
     CMMI for Development, Requirements Development:
       Specific Practice (SP) 3.3, Analyze Requirements.
       Typical work product: TPMs.
       Subpractice: Identify TPMs that will be tracked during development.
  44. 44. PMBOK® Guide Project Management Plan Performance Measurement Baseline: – Typically integrates scope, schedule, and cost parameters of a project – May also include technical and quality parameters 44/66
  45. 45. PMBOK® Guide Work Performance Measurements Used to produce project activity metrics Evaluate actual progress as compared to planned progress Include, but are not limited to: – Planned vs. actual technical performance – Planned vs. actual schedule performance, and – Planned vs. actual cost performance. 45/66
  46. 46. TPMs in DAG and DAPS
     Defense Acquisition Guide:
       Performance measurement of WBS elements, using objective measures – essential for EVM and Technical Assessment activities.
       Use TPMs and Critical Technical Parameters (CTP) to report progress in achieving milestones.
     DAPS:
       Use TPMs to determine whether % completion metrics accurately reflect quantitative technical progress and quality toward meeting Key Performance Parameters (KPP) and Critical Technical Parameters.
  47. 47. TPMs in DAG Compare the actual versus planned technical development and design Report progress in the degree to which system performance requirements are met. Plan is defined in terms of: – Expected performance at specific points • Defined in the WBS and IMS – Methods of measurement at those points – Variation limits for corrective action. 47/66
  48. 48. PMBOK® Guide Technical Performance Measurement Compares technical accomplishments… to … project management plan’s schedule of technical achievement Requires definition of objective quantifiable measures of technical performance which can be used to compare actual results against targets. Might include weight, transaction times, number of delivered defects, storage capacity etc. Deviation, such as demonstrating more or less functionality than planned at a milestone…forecast degree of success in achieving the project’s scope. 48/66
  49. 49. CMMI–ACQ – Acquisition Technical Management
     SP 1.3 Conduct Technical Reviews
     Typical supplier deliverables:
       Progress reports and process, product, and service level measurements
       TPMs
  50. 50. SMS Shall: Monitor Progress Against the Plan
     Monitoring – the contractor SHALL monitor progress against the plan to validate, approve, and maintain each baseline and functional architecture.
     Required product attributes – each documented assessment includes:
       TPMs and metrics
       Metrics and technical parameters for tracking that are critical indicators of technical progress and achievement
  51. 51. NASA EVM Guide: Technical Performance
     NASA EVM Guide NPG 9501.3:
       4.5 Technical Performance Requirements (TPR): when TPRs are used, appropriate and relevant metrics must be defined in the solicitation.
       Appendix A.7, 14.1 TPR: compares expected performance and physical characteristics with contractually specified values; the basis for reporting established milestones and progress toward meeting technical requirements.
  52. 52. Derivation and Flow Down of TPMs
     Document / Baseline → Parameter:
       IMP, Functional Baseline → Measures of Effectiveness (MoE)
       IMP, WBS, Functional Baseline → Measures of Performance (MoP)
       IMP, Allocated Baseline → Technical Performance Measure
       IMS → TPM Milestones and Planned Values
       Work Packages → TPM % Complete Criteria
     See next chart for linkage of technical baselines to technical reviews.
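The last row of the flow-down above, TPM % Complete Criteria attached to Work Packages, can be sketched as taking physical percent complete from the number of TPM-based exit criteria actually met. The Work Package identifier and criteria names below are hypothetical:

```python
# Sketch of TPM-based "% complete" criteria on a Work Package: physical
# percent complete is the fraction of TPM exit criteria met so far.
# The WP identifier and criterion names are invented for illustration.

work_package = {
    "id": "WP-1120-03",
    "exit_criteria": {  # TPM-derived criteria and whether each is met
        "weight model validated": True,
        "bench test at or under 26 kg": True,
        "prototype test at or under 25 kg": False,
        "1st article at or under 23 kg": False,
    },
}

def percent_complete(wp: dict) -> float:
    """Physical percent complete = met criteria / total criteria."""
    criteria = wp["exit_criteria"]
    met = sum(1 for done in criteria.values() if done)
    return 100.0 * met / len(criteria)

print(percent_complete(work_package))
```

Taking percent complete from verified technical criteria, rather than from money spent, is what makes the earned value "physical" in the sense the earlier slides argue for.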
  53. 53. Interesting Attributes of TPMs
       Achieved to Date (sounds like EV)
       Current Estimate (sounds like EAC/ETC)
       Milestone Planned (target) value (sounds like PV)
       Planned performance profile (sounds like a PMB)
       Tolerance band (sounds like reporting thresholds)
       Threshold (yep, just what we thought)
       Variance (sounds like variance!)