Cpm500 d _alleman__tpm jun 2010 lesson 3 (v2)

Updated Technical Performance Measure from IPM 2010 (PMI-CPM Conference). New Learning Objectives and improved details on units of measure.

Presentation Transcript

  • 500D CPM-500(D): Principles of Technical Management. Lesson D: Implementing Technical Performance Measurement. Glen B. Alleman, Lewis & Fowler, galleman@lewisandfowler.com, (303) 241 9633. 22nd Annual International IPM Conference, November 8–10, 2010, Bethesda, MD. Professional Education Program (Training Track) presented by PMI-College of Performance Management faculty. Rights Reserved 1/64
  • 500D The Purpose Of This Lesson Defines the term and associated concept of Technical Performance Measurement (TPM) on projects. Discusses the interrelations between TPM and Earned Value Management (EVM). Introduces the student to implementation of computer based TPM tools, such as those used by the Defense Contract Management Agency (DCMA). Rights Reserved 2/64
  • 500D Learning Objectives. TLO #9: The student will understand the role of Technical Performance Measurement (TPM) in the project office. ELO #1: The student will recognize the policy requirements for Technical Performance Measures. ELO #2: The student will recognize the role of Integrated Baseline Reviews in confirming the entire technical scope of work has been planned. ELO #3: The student will recognize the role of the WBS in supporting Technical Performance Measure requirements. TLO #9: The student will understand the scope of DCMA’s (or other) TPM software management tool implementation. ELO #1: The student will recognize the benefits and challenges of Technical Performance Measure implementation. ELO #2: The student will recognize the use of control limit charts to track Technical Performance Measure metrics. ELO #3: The student will understand the methodology and approach used to show the effect of Technical Performance Measures on Earned Value. Rights Reserved 3/64
  • 500D Can Earned Value Alone Get Us To Our Destination? How do we increase visibility into program performance? How do we reduce cycle time to deliver the product? How do we foster accountability? How do we reduce risk? How do we start our journey to success? Increasing the Probability of Success means we have to Connect The Dots Between EVM and TPM to Reach Our Destination Rights Reserved 4/64
  • 500D To Achieve Success … We Need to … ©gapingvoid ltd, www.gapingvoidgallery.com. Rights Reserved 5/64
  • 500D Increasing the Probability of Program Success Means … Building A Credible Performance Measurement Baseline. [Diagram: Risk, Cost, IMP/IMS, PMB, SOW, WBS, TPM.] This is actually harder than it looks! Rights Reserved 6/64
  • 500D Doing This Starts With Some Guidance. “Systems engineering uses technical performance measurements to balance cost, schedule, and performance throughout the life cycle. Technical performance measurements compare actual versus planned technical development and design. They also report the degree to which system requirements are met in terms of performance, cost, schedule, and progress in implementing risk handling. Performance metrics are traceable to user-defined capabilities.” ― Defense Acquisition Guide (https://dag.dau.mil/Pages/Default.aspx). In The End ― It’s All About Systems Engineering. Rights Reserved 7/64
  • 500D This Guidance Can Be Found in Many Sources. Rights Reserved 8/64
  • 500D Just A Reminder Of The … Primary Elements of Earned Value: Cost, Schedule, and Technical Performance. [Graphic: funding margin for an over target baseline (OTB); schedule margin for over or under cost and over or under schedule performance; schedule margin for underperformance or schedule extension.] Rights Reserved 9/64
  • 500D Previous Approaches Using EV Are Mostly Unsuccessful In Connecting These. Traditional approaches to program management are retrospective: the cost and schedule of Earned Value, Risk Management, and Systems Engineering. They report past performance, sometimes 30 to 60 days old, and variances are reported beyond the window of opportunity for correction. Rights Reserved 10/64
  • 500D This Has All Been Said Before. We Just Weren’t Listening … “… the basic tenets of the process are the need for seamless management tools, that support an integrated approach … and ‘proactive identification and management of risk’ for critical cost, schedule, and technical performance parameters.” ― Secretary of Defense, Perry memo, May 1995. TPM Handbook 1984. Why Is This Hard To Understand? We seem to be focused on EV reporting, not the use of EV to manage the program. Getting the CPR out the door is the end of Program Planning and Control’s efforts, not the beginning. Rights Reserved 11/64
  • 500D The Gap Seems To Start With A Common Problem. Many times, the information from Cost, Schedule, Technical Performance, and Risk Management gets mixed up when we try to put them together. Rights Reserved 12/64
  • 500D When We Put The Cart Before The Horse, We Discover … EVM really doesn’t do its job effectively. Most of the time EV has no measure of quality or compliance with technical requirements. EV measures progress to plan in units of “money,” not tangible value to the customer. Most EV System Descriptions fail to connect the dots between cost, schedule, and technical performance – even though instructed to do so in official guidance. Rights Reserved 13/64
  • 500D The NDIA EVM Intent Guide Says: notice the inclusion of Technical along with Cost and Schedule. That’s the next step in generating Value from Earned Value. EV MUST include the Technical Performance Measures. Rights Reserved
  • 500D Back To Our Technical Performance Measures. Technical Performance Measures do what they say: Measure the Technical Performance of the product or service produced by the program. Rights Reserved 15/64
  • 500D What’s Our Motivation for “Connecting the Dots?” TPMs are a set of measures that provide the supplier and acquirer with insight into progress to plan of the technical solution, the associated risks, and emerging issues. Technical Performance Measures provide program management with information to make better decisions, and increase the probability of delivering a solution that meets both the requirements and the mission need. We’ve been talking about this since as early as 1984, in the Technical Performance Measurement Handbook, Defense Systems Management College, Fort Belvoir, VA 22060. Rights Reserved 16/64
  • 500D Measure of Effectiveness (MoE): the operational measures of success that are closely related to the achievement of the mission or operational objectives, evaluated in the operational environment under a specific set of conditions. Measures of Effectiveness are stated in units meaningful to the buyer, focus on capabilities independent of any technical implementation, and are connected to mission success. MoEs belong to the End User. “Technical Measurement,” INCOSE–TP–2003–020–01. Rights Reserved 17/64
  • 500D Measure of Performance (MoP): measures that characterize physical or functional attributes relating to the system operation, measured or estimated under specific conditions. Measures of Performance are attributes that assure the system has the capability to perform, and assessments of the system to assure it meets design requirements to satisfy the MoE. MoPs belong to the Program: developed by the Systems Engineer, measured by CAMs, and analyzed by PP&C. “Technical Measurement,” INCOSE–TP–2003–020–01. Rights Reserved 18/64
  • 500D Key Performance Parameters (KPP): represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessment, or termination of the program. Key Performance Parameters have a threshold or objective value, characterize the major drivers of performance, and are considered Critical to Customer (CTC). The acquirer defines the KPPs during operational concept development. KPPs say what DONE looks like. “Technical Measurement,” INCOSE–TP–2003–020–01. Rights Reserved 19/64
  • 500D Technical Performance Measures (TPM): attributes that determine how well a system or system element is satisfying or expected to satisfy a technical requirement or goal. Technical Performance Measures assess design progress, define compliance to performance requirements, identify technical risk, are limited to critical thresholds, and include projected performance. “Technical Measurement,” INCOSE–TP–2003–020–01. Rights Reserved 20/64
  • 500D Dependencies Between Measures. The Acquirer defines the needs and capabilities in terms of operational scenarios; the Supplier defines physical solutions that meet the needs of the stakeholders. Mission Need → KPP / MoE → MoP → TPM. MoE: operational measures of success related to the achievement of the mission or operational objective being evaluated. MoP: measures that characterize physical or functional attributes relating to the system operation. TPM: measures used to assess design progress, compliance to performance requirements, and technical risks. “Coming to Grips with Measures of Effectiveness,” N. Sproles, Systems Engineering, Volume 3, Number 1, pp. 50–58. Rights Reserved 21/64
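The acquirer-to-supplier chain on this slide can be sketched as a simple traceability structure. This is an illustrative data model only; the class layout and the example measures (a surveillance mission need, an endurance MoP) are assumptions for the sketch, not content prescribed by the lesson:

```python
from dataclasses import dataclass, field

# Hypothetical model of the measure hierarchy:
# Mission Need (acquirer) -> MoE -> MoP -> TPM (program-level measure).

@dataclass
class TPM:
    name: str            # e.g. "Airframe weight"
    threshold: float     # limiting acceptable value

@dataclass
class MoP:
    name: str                              # physical/functional attribute
    tpms: list = field(default_factory=list)

@dataclass
class MoE:
    name: str                              # operational measure of success
    mops: list = field(default_factory=list)

@dataclass
class MissionNeed:
    statement: str
    moes: list = field(default_factory=list)

def trace(need: MissionNeed) -> list:
    """Walk the hierarchy and return (need, MoE, MoP, TPM) traceability rows."""
    rows = []
    for moe in need.moes:
        for mop in moe.mops:
            for tpm in mop.tpms:
                rows.append((need.statement, moe.name, mop.name, tpm.name))
    return rows

need = MissionNeed(
    "Persistent surveillance",
    moes=[MoE("Time on station",
              mops=[MoP("Vehicle endurance",
                        tpms=[TPM("Airframe weight", threshold=25.0)])])])

for row in trace(need):
    print(" -> ".join(row))
```

The point of the structure is the one the slide makes: every TPM should be reachable from a stakeholder need, so an orphaned TPM (or an MoE with no supporting TPMs) is detectable mechanically.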
  • 500D When Do We First Encounter The Technical Performance Measures? At the IBR of course … That’s when all the pieces come together. That’s when we can have line of sight from the requirements to the TPMs to the work needed to produce the deliverables. §5.3.1 of the 1 Sept 2010 version. Rights Reserved 22/64
  • 500D “Candidates” for Technical Measures (INCOSE Systems Engineering Handbook). Physical Size and Stability: useful life, weight, volumetric capacity. Functional Correctness: accuracy, power, performance. Supportability: maintainability, all the “ilities.” Dependability: reliability (Reliability = Mean Time Between Failures). Efficiency: utilization, response time, throughput. Suitability for Purpose: readiness. Rights Reserved 23/64
  • 500D “Measures” of Technical Measures (INCOSE Systems Engineering Handbook). Achieved to Date: measured technical progress or estimate of progress. Current Estimate: value of a technical parameter that is predicted to be achieved. Milestone: point in time when an evaluation of a measure is accomplished. Planned Value: predicted value of the technical parameter. Planned Performance Profile: profile representing the project’s time-phased demonstration of a technical parameter. Tolerance Band: management alert limits. Threshold: limiting acceptable value of a technical parameter. Variances: demonstrated technical variance and predicted technical variance. Rights Reserved 24/64
  • 500D A Familiar Graphic of TPMs. [Chart: Mean Time Between Failure vs. Time = Program Maturity, showing the Upper Limit, Lower Limit, Planned Profile, Planned Value, Current Estimate, Achieved to Date, Threshold, Variance, and Milestones.] Rights Reserved 25/64
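The attributes in the table and graphic above combine into a small status check for one milestone observation. A minimal sketch, assuming a lower-is-better parameter; the status labels and the example limits are illustrative, not from the deck:

```python
def assess_tpm(achieved, planned, tolerance, threshold, lower_is_better=True):
    """Return (variance, status) for one milestone observation of a TPM.

    achieved  - achieved-to-date value
    planned   - planned value from the planned performance profile
    tolerance - half-width of the tolerance band (management alert limits)
    threshold - limiting acceptable value of the parameter
    """
    variance = achieved - planned          # demonstrated technical variance
    breached = achieved > threshold if lower_is_better else achieved < threshold
    if breached:
        return variance, "BREACH"          # beyond the limiting acceptable value
    if abs(variance) > tolerance:
        return variance, "ALERT"           # outside the management alert limits
    return variance, "ON PLAN"

# Vehicle weight at PDR: planned 25.0 kg, achieved 25.5 kg (values from the
# UAV example later in the deck); tolerance +/-1.0 kg and threshold 28.0 kg
# are assumed limits for the sketch.
print(assess_tpm(25.5, 25.0, 1.0, 28.0))   # -> (0.5, 'ON PLAN')
```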
  • 500D A Simple Method of Assembling the TPMs: Select Technical Performance Parameters (MoE / MoP) → Define the planned progress for each TPM (KPP / TPM) → Assess the risk impact from this progress (Risks). Example parameters, each with a planned progress profile and an assessed risk: Weight, Speed, MTBF, Loiter Time. Rights Reserved 26/64
  • 500D TPMs from an Actual Program: James Webb Space Telescope. Rights Reserved 27/64
  • 500D TPMs from an Actual Program: Chandra X–Ray Telescope. Rights Reserved 28/64
  • 500D What Does A Real Technical Performance Measure Look Like? Not that bagels are not interesting in Lessons 1 and 2, but let’s get ready to look at a flying machine. Rights Reserved 29/64
  • 500D The WBS for a UAV: TPMs Start With The WBS.
    1.1 Air Vehicle
    1.1.1 Sensor Platform
    1.1.2 Airframe
    1.1.3 Propulsion
    1.1.4 On Board Comm
    1.1.5 Auxiliary Equipment
    1.1.6 Survivability Modules
    1.1.7 Electronic Warfare Module
    1.1.8 On Board Application & System SW
    1.3 Mission Control / Ground Station SW
    1.3.1 Signal Processing SW
    1.3.2 Station Display
    1.3.3 Operating System
    1.3.4 ROE Simulations
    1.3.5 Mission Commands
    Rights Reserved 30/64
  • 500D What Do We Need To Know About This Program Through TPMs What WBS elements represent the TPMs? What Work Packages produce these WBS elements? Where do these Work Packages live in the IMS? What are the Earned Value baseline values for these Work Packages? How are we going to measure all these variables? What does the curve look like for these measurements? Rights Reserved 31/64
  • 500D Let’s Connect The Dots. Technical and programmatic risks are connected to the WBS and IMS. BCWS is held at the Work Package and rolled to the Control Account. The IMS contains all the Work Packages, BCWS, and risk mitigation plans, and rolls to the Integrated Master Plan to measure increasing maturity. Named deliverables are defined in the WBS. TPMs are attached to each critical deliverable in the WBS and identified in each Work Package in the IMS, used to assess maturity in the IMP. The products and processes that produce them appear in a “well structured” decomposition in the WBS. [Diagram: Risk, Cost, IMP/IMS, PMB, SOW, WBS, TPM.] Rights Reserved 32/64
  • 500D Verifying Each TPM: evidence that we’re in compliance. How do we measure weight for each program event?
    CA: With our submitted ROM, what are the values we need to get through the Integrated Baseline Review? Do we know what we promised to deliver, now that we’ve won?
    SFR: Can we proceed into preliminary architecture and design? The contributors to the vehicle weight are confirmed and the upper limits defined in the product requirements flow-down database (DOORS) into a model.
    SRR: Can we proceed into the System Development and Demonstration (SDD) phase? Do we know all drivers of vehicle weight? Can we bound their upper limits? Can the subsystem owners be successful within these constraints using a high fidelity model?
    PDR: Can we start detailed design, and meet the stated performance requirements within cost, schedule, risk, and other constraints? Does each subsystem designer have the target component weight and some confidence they can stay below the upper bound? Can this be verified in some tangible way, either through prior examples or a lab model?
    CDR: Can the system proceed to fabrication, demonstration, and test, within cost, schedule, risk, and other system constraints? Do we know all we need to know to start fabrication of the first articles of the flight vehicle? Some type of example, maybe a prototype, is used to verify we’re inside the lines.
    TRR: Is the system ready to proceed into formal test? Does the assembled vehicle fall within the weight range limits for 1st flight – will this thing get off the ground?
    Rights Reserved 33/64
  • 500D TPM Trends & Responses. [Chart: Technical Performance Measure (Vehicle Weight, 28kg down through 26kg and 25kg to 23kg) across program events CA, SFR, SRR, PDR, CDR, TRR, with the measurement source maturing from ROM in Proposal, to Design Model, Detailed Design Model, Bench Scale Model Measurement, Prototype Measurement, and 1st Article Flight Vehicle. EV taken, planned values met, tolerances kept, etc.] Dr. Falk Chart – modified. Rights Reserved 34/64
  • 500D The Assessment Of Weight As A Function Of Time. At Contract Award there is a Proposal-grade estimate of vehicle weight. At System Functional Review, the Concept of Operations is validated for the weight. At System Requirements Review the weight targets are flowed down to the subsystem components. At PDR the CAD model starts the verification process. At CDR actual measurements are needed to verify all models. At Test Readiness Review we need to know how much fuel to put on board for the 1st flight test. Rights Reserved 35/64
  • 500D The WBS for a UAV: Airframe Weight TPM (1.1 Air Vehicle, 1.1.2 Airframe). The planned weight is 25kg. The actual weight is 25.5kg. Close to plan! So we are doing okay, right?
    Event: CA | SFR | SRR | PDR | CDR | TRR
    Planned Value: 28.0kg | 27.0kg | 26.0kg | 25.0kg | 24.0kg | 23.0kg
    Actual Value: 30.4kg | 29.0kg | 27.5kg | 25.5kg | – | –
    Assessed Risk: Moderate (>2.0kg off target) | Low (1–2 kg off target) | Low (1–2 kg off target) | Very Low (less than 1.0 kg off target) | – | –
    Planned Method: “Similar to” ROM | Estimate | Program-unique design model | Actual measurement of bench-test components | Actual measurement of prototype airframe | Program-unique design model with validated data
    Actual Method: “Similar to” ROM | Estimate | ROM | ROM | ROM | –
    Here’s the Problem. Rights Reserved 36/64
  • 500D Is This A Problem? You Bet’ya It’s A Problem! The measurement is close to the planned value, But the planned method of measurement is a program unique design model with validated data, But the actual method of measurement is a Rough Order of Magnitude estimate, No improvement in fidelity since the System Functionality Review (SFR), and The TPM provides no new information – so we’re probably late and don’t know it yet. Rights Reserved 37/64
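The check this slide describes is mechanical: a TPM value near plan is only meaningful if the measurement method has matured as planned. A sketch of that check, with an assumed fidelity ordering and method names paraphrased from the UAV weight example (both are assumptions, not the deck’s prescribed tooling):

```python
# Assumed fidelity ranking, lowest to highest. The real ranking would come
# from the program's TPM plan.
FIDELITY = ["ROM", "estimate", "design model",
            "bench measurement", "prototype measurement"]

def stale_tpms(plan):
    """plan: list of (event, planned_method, actual_method).

    Return the events where the actual measurement method has lower
    fidelity than the planned one, i.e. the TPM provides no new
    information even if its value looks close to plan.
    """
    rank = {m: i for i, m in enumerate(FIDELITY)}
    return [event for event, planned, actual in plan
            if actual is not None and rank[actual] < rank[planned]]

uav_weight = [
    ("CA",  "ROM",                   "ROM"),
    ("SFR", "estimate",              "estimate"),
    ("SRR", "design model",          "ROM"),
    ("PDR", "bench measurement",     "ROM"),
    ("CDR", "prototype measurement", None),   # event not yet reached
]
print(stale_tpms(uav_weight))   # -> ['SRR', 'PDR']
```

The flagged events are exactly the slide’s conclusion: no improvement in fidelity since SFR, so the program is probably late and doesn’t know it yet.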
  • 500D Raison d’être for Technical Performance Measures. The real purpose of Technical Performance Measures is to reduce Programmatic and Technical RISK. [Diagram: Risk, Cost, IMP/IMS, PMB, SOW, WBS, TPM.] Rights Reserved 38/64
  • 500D Buying Down Risk with TPMs. Risk: CEV-037 – Loss of Critical Functions During Descent. “Buying down” risk is planned in the IMS. MoE, MoP, and KPP are defined in the work package for the critical measure – weight. If we can’t verify we’ve succeeded, then the risk did not get reduced; the risk may have gotten worse. [Chart: Planned Risk Level declining over 2005–2011 through wind tunnel testing, analytical model development and correlation, and in-flight development tests; weight risk reduced from RED to YELLOW, then weight confirmed ready to fly – GREEN.] Rights Reserved 39/64
  • 500D Increasing the Probability of Success with Risk Management. Going outside the TPM limits always means cost and schedule impacts. “Coloring Inside the Lines” means knowing how to keep the program GREEN, or at least stay close to GREEN. So much for our strategy of winning through technical dominance. Rights Reserved 40/64
  • 500D Connecting the EV Variables. Integrating Cost, Schedule, and Technical Performance assures Program Management has the needed performance information to deliver on-time, on-budget, and on-specification. Conventional Earned Value = Cost + Schedule; add the Technical Performance Measures.
    Cost Baseline: The Master Schedule is used to derive the Basis of Estimate (BOE), not the other way around. Probabilistic cost estimating uses past performance and cost risk modeling. Labor, materiel, and other direct costs are accounted for in Work Packages, with risk adjustments for all elements of cost.
    Technical Performance: Earned Value is diluted by missing technical performance, by postponed features, and by non-compliant quality. All these dilutions require adjustments to the Estimate at Complete (EAC) and the To Complete Performance Index (TCPI).
    Schedule Baseline: Requirements are decomposed into physical deliverables. Deliverables are produced through Work Packages. Work Packages are assigned to an accountable manager and sequenced to form the highest value stream with the lowest technical and programmatic risk.
    Rights Reserved 41/64
  • 500D TPM Checklist.
    MoE: Traceable to needs, goals, objectives, and risks. Defined with associated KPPs. Each MoE independent from the others. Each MoE independent of any technical solution. Addresses the required KPPs.
    MoP: Traceable to applicable MoEs, KPPs, system-level performance requirements, and risks. Focused on technical risks and supports trades between alternative solutions. Provides insight into system performance. Decomposed, budgeted, and allocated to system elements. Assigned an “owner,” the CAM and Technical Manager.
    TPM: Traceable to applicable MoPs, system element performance, requirements, objectives, risks, and WBS elements. Further decomposed, budgeted, and allocated to lower level system elements in the WBS and IMS. Assigned an owner, the CAM and Work Package Manager. Sources of measure identified and processes for generating the measures defined. Integrated into the program’s IMS as part of the exit criteria for the Work Package.
    Rights Reserved 42/64
  • 500D Increasing the Probability of Program Success Means … Building A Credible Performance Measurement Baseline. [Diagram: Risk, Cost, IMP/IMS, PMB, SOW, WBS, TPM.] Using the Checklist: “Connect the Dots.” Rights Reserved 43/64
  • 500D Did We Accomplish the Learning Objectives?
    TLO #9: The student will understand the role of Technical Performance Measurement (TPM) in the project office.
    ELO #1 (recognize the policy requirements for TPM): policies and supporting guidance, with links and reference numbers provided.
    ELO #2 (recognize the role of IBRs in confirming the entire technical scope of work has been planned): this is the first place where cost, schedule, and technical performance come together, in the Integrated Master Schedule (IMS).
    ELO #3 (recognize the role of the WBS in supporting TPM requirements): TPMs are first located in the WBS.
    TLO #9: The student will understand the scope of DCMA’s (or other) TPM software management tool implementation.
    ELO #1 (recognize the benefits and challenges of TPM implementation): progress is measured in units of physical percent complete; TPMs are those units.
    ELO #2 (recognize the use of control limit charts to track TPM metrics): we’ve seen notional and actual charts.
    ELO #3 (understand the methodology and approach used to show the effect of TPMs on earned value): the example of our “flying machine” connects the dots for TPMs, risk, cost, and schedule.
    Rights Reserved 44/64
  • 500D Rights Reserved 45/64
  • 500D Backup Materials. “Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information on it.” — Samuel Johnson. Rights Reserved 46/64
  • 500D Many Sources for Connecting the Dots: OMB Circular A–11, Section 300; GAO Report 06–250; DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08; Interim Defense Acquisition Guidebook (DAG) 6/15/09; Systems Engineering Plan (SEP) Preparation Guide 4/08; WBS Handbook, Mil–HDBK–881A (WBS) 7/30/05; Integrated Master Plan (IMP) & Integrated Master Schedule Preparation & Use Guide (IMS) 10/21/05; Guide for Integrating SE into DOD Acquisition Contracts 12/06; Defense Acquisition Program Support Methodology (DAPS) V2.0 3/20/09; Standard for Application and Management of the SE Process (IEEE 1220); Processes for Engineering a System (ANSI/EIA–632); Capability Maturity Model Integration (CMMI®); Guide to the Project Management Institute Body of Knowledge (PMBOK Guide®), 4th Edition; NASA EVM Guide NPG 9501.3. Rights Reserved 47/64
  • 500D Office of Management and BudgetCircular No. A–11, Section 300 Planning, Budgeting, Acquisition and Management of Capital Assets Section 300–5 – Performance–based acquisition management – Based on EVMS standard – Measure progress towards milestones • Cost • Capability to meet specified requirements • Timeliness • Quality Rights Reserved 48/64
  • 500D Need: Accurate Performance Measurement. GAO Report 06–250, Information Technology: Improve the Accuracy and Reliability of Investment Information. Findings and Recommendations: 2. If EVM is not implemented effectively, decisions are based on inaccurate and potentially misleading information. 3. Agencies are not measuring actual versus expected performance in meeting IT performance goals. Rights Reserved 49/64
  • 500D DOD Guides: Technical Performance. Department of Defense guidelines for Technical Performance Measures: DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08; Interim Defense Acquisition Guidebook (DAG) 6/15/09; Systems Engineering Plan (SEP) Preparation Guide 4/08; WBS Handbook, Mil–HDBK–881A (WBS) 7/30/05; Integrated Master Plan (IMP) & Integrated Master Schedule Preparation & Use Guide (IMS) 10/21/05; Guide for Integrating SE into DOD Acquisition Contracts (Integ SE) 12/06; Defense Acquisition Program Support Methodology (DAPS) V2.0 3/20/09. Rights Reserved 50/64
  • 500D DoD: TPMs in Technical Baselines and Reviews. [Matrix mapping DoD policies and guides (POL, DAG, SEP, WBS, DAPS, IMP/IMS, Systems Engineering) to: the technical baselines of the IMP/IMS (Functional (SFR), Allocated (PDR), Product (CDR)); event-driven timing; success criteria of technical reviews; entry and exit criteria for technical reviews; and assessing technical maturity.] Rights Reserved 51/64
  • 500D DoD: TPMs in Integrated Plans. [Matrix mapping the same policies and guides to: an integrated SEP with IMP/IMS, TPMs, and EVM; an integrated WBS with the requirement specification and Statement of Work; and IMP/IMS/EVMS links among risk management, technical reviews, TPMs, EVM, WBS, and IMS.] Rights Reserved 52/64
  • 500D Guidance in Standards, Models, and Defense Acquisition Guide Processes for Engineering a System (ANSI/EIA–632) Standard for Application and Management of the SE Process (IEEE 1220) Capability Maturity Model Integration (CMMI®) – CMMI for Development, Version 1.2 – CMMI for Acquisition, Version 1.2 – Using CMMI to Improve Earned Value Management, 2002 Guide to the Project Management Institute Body of Knowledge (PMBOK Guide®), 4th Edition Rights Reserved 53/64
  • 500D Technical Performance Measures (TPM): More Sources.
    IEEE 1220: 6.8.1.5, Performance-based progress measurement: TPMs are key to progressively assessing technical progress; establish dates for checking progress and for meeting full conformance to requirements.
    EIA–632: Glossary, Requirements Development: predict future values of key technical parameters of the end system based on current assessments; the planned value profile is the time-phased achievement (achievement to date, technical milestone where TPM evaluation is reported).
    CMMI for Development: Specific Practice (SP) 3.3, Analyze Requirements; typical work product: TPMs; subpractice: identify TPMs that will be tracked during development.
    Rights Reserved 54/64
  • 500D PMBOK® Guide 10.5.1.1 Project Management Plan Performance Measurement Baseline: – Typically integrates scope, schedule, and cost parameters of a project – May also include technical and quality parameters Rights Reserved 55/64
  • 500D PMBOK® Guide 8.3.5.4 Work Performance Measurements Used to produce project activity metrics Evaluate actual progress as compared to planned progress Include, but are not limited to: – Planned vs. actual technical performance – Planned vs. actual schedule performance, and – Planned vs. actual cost performance. Rights Reserved 56/64
  • 500D TPMs in DAG and DAPS. Defense Acquisition Guide: performance measurement of WBS elements, using objective measures, is essential for EVM and Technical Assessment activities; use TPMs and Critical Technical Parameters (CTP) to report progress in achieving milestones. DAPS: use TPMs to determine whether % completion metrics accurately reflect quantitative technical progress and quality toward meeting Key Performance Parameters (KPP) and Critical Technical Parameters. Rights Reserved 57/64
  • 500D TPMs in DAG Compare the actual versus planned technical development and design Report progress in the degree to which system performance requirements are met. Plan is defined in terms of: – Expected performance at specific points • Defined in the WBS and IMS – Methods of measurement at those points – Variation limits for corrective action. Rights Reserved 58/64
  • 500D PMBOK® Guide 11.6.2.4 Technical Performance Measurement Compares technical accomplishments… to … project management plan’s schedule of technical achievement Requires definition of objective quantifiable measures of technical performance which can be used to compare actual results against targets. Might include weight, transaction times, number of delivered defects, storage capacity etc. Deviation, such as demonstrating more or less functionality than planned at a milestone…forecast degree of success in achieving the project’s scope. Rights Reserved 59/64
  • 500D CMMI–ACQ Acquisition Technical Management SP 1.3 Conduct Technical Reviews Typical supplier deliverables Progress reports and process, product, and service level measurements TPMs Rights Reserved 60/64
  • 500D SMS Shall: Monitor Progress Against the Plan 4.2.12.2 Monitoring – Contractor SHALL monitor progress against plan to validate, approve, and maintain each baseline and functional architecture 4.2.12.2.2 Required Product Attributes – Each documented assessment includes: – TPMs, metrics – Metrics and technical parameters for tracking that are critical indicators of technical progress and achievement Rights Reserved 61/64
  • 500D NASA EVM Guide: Technical Performance• NASA EVM Guide NPG 9501.3 – 4.5 Technical Performance Requirements (TPR): When TPRs are used, – appropriate and relevant metrics… – must be defined in the solicitation – Appendix A.7, 14.1 TPR • Compares: • Expected performance and • Physical characteristics • With contractually specified values. • Basis for reporting established milestones • Progress toward meeting technical requirements Rights Reserved 62/64
  • 500D Derivation and Flow Down of TPMs. Parameter → Document, Baseline, IMS, EVM: Measures Of Effectiveness (MOE) → IMP, Functional Baseline. Measures Of Performance (MOP) → IMP, WBS, Functional Baseline. Technical Performance Measure → IMP, Allocated Baseline. TPM Milestones And Planned Values → IMS. TPM % Complete Criteria → Work Packages. See next chart for linkage of technical baselines to technical reviews. Rights Reserved 63/64
  • 500D Interesting Attributes of TPMs Achieved to Date (sounds like EV) Current Estimate (sounds like EAC/ETC) Milestone Planned (target) value (sounds like PV) Planned performance profile (sounds like a PMB) Tolerance band (sounds like reporting thresholds) Threshold (yep, just what we thought) Variance (sounds like variance!) Rights Reserved 64/64