CPM-200C: Technical Performance Measures
Glen B. Alleman
Niwot Ridge, LLC
glen.alleman@niwotridge.com 303-241-9633
Date: May 2013
IPMC 2012
1
TLO #1: The student will recognize what a Technical Performance Measurement (TPM) is in
the context of a program’s Performance Measurement Baseline and DOD Guidance.
ELO #1.1: The student will be able to summarize the policy requirements for Technical
Performance Measures in government documents
ELO #1.2: The student will be able to identify the activities during the Integrated Baseline
Review (IBR) that confirm the TPMs are defined, assigned to WBS elements, and
identified and assigned to work in the IMS.
TLO #2: The student will be able to identify and select TPMs using the example Work
Breakdown Structure from the program.
ELO #2.1: The student will construct a set of TPMs from the provided WBS
ELO #2.2: With these constructed TPMs, the student will apply them to the WBS elements
of the notional program.
TLO #3: The student will learn how to prepare descriptions of the TPMs needed to assess the
physical “progress to plan” of the program.
ELO #3.1: The student will be able to define the upper and lower bounds for a TPM using
the WBS and Statement of Work for the notional program
TLO #4: The student will be able to judge the validity of a TPM from a list of TPMs from a
notional program.
ELO #4.1: The student will set up a control limit chart to track Technical Performance
Measure metrics for the assigned deliverable.
Learning Objectives
2
Components we’ll meet along the way to
our destination: the credible PMB
Objective Status and Essential Views to support the proactive management processes needed
to keep the program GREEN
Risk Management
SOW
SOO
ConOps
WBS
Technical and Operational
Requirements
CWBS &
CWBS Dictionary
Integrated Master Plan
(IMP)
Integrated Master Schedule
(IMS)
Measures of
Effectiveness
Measures of
Performance
Measures of
Progress
JROC
Key Performance Parameters
Program Specific
Key Performance Parameters
Technical Performance
Measures
Earned Value Management
System
TPMs Live
Here
Performance Measurement Baseline
TLO #1
3
Components of our Final Destination
SOW
SOO
ConOps
WBS
Technical and Operational
Requirements
Performance Measurement Baseline
CWBS &
CWBS Dictionary
Integrated Master Plan
(IMP)
Integrated Master Schedule
(IMS)
Earned Value Management
System
Technical Performance Measurement (TPM) involves predicting the future value of a
key technical performance parameter of the higher-level end product under
development, based on current assessments of products lower in the system structure.
Continuous verification of actual versus anticipated achievement for selected technical
parameters confirms progress and identifies variances that might jeopardize meeting a
higher-level end product requirement. Assessed values falling outside established
tolerances indicate the need for management attention and corrective action.
A well-thought-out TPM program provides early warning of technical problems, supports
assessments of the extent to which operational requirements will be met, and assesses
the impacts of proposed changes made to lower-level elements in the system hierarchy
on system performance.
Technical Performance
Measurement
TLO #1
4
https://dap.dau.mil/acquipedia/Pages/Default.aspx
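A minimal sketch, in Python, of the tolerance check described above: compare the current assessed value of a technical parameter against the planned value, the tolerance band, and the threshold, and flag the need for management attention. This is illustrative only; the function and parameter names are assumptions, not part of any DoD guidance.

```python
# Illustrative sketch only: flag a TPM whose assessed value falls outside the
# established tolerances, per the TPM definition above.
def assess_tpm(planned_value, current_estimate, tolerance, threshold, lower_is_better=True):
    """Return a status string for one technical parameter observation."""
    variance = current_estimate - planned_value
    breached = current_estimate > threshold if lower_is_better else current_estimate < threshold
    if breached:
        return "Threshold breached - corrective action required"
    if abs(variance) > tolerance:
        return "Outside tolerance band - management attention needed"
    return "Within tolerance - progressing to plan"

# Example: a weight TPM where lower is better (values are made up).
print(assess_tpm(planned_value=25.0, current_estimate=25.5, tolerance=0.4, threshold=28.0))
```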
Can Earned Value Alone Get Us To
Our Destination?
• How do we increase visibility into program performance?
• How do we reduce cycle time to deliver the product?
• How do we foster accountability?
• How do we reduce risk?
• How do we start our journey to success?
Increasing the Probability of Success means we have to
Connect The Dots Between EVM and TPM to Reach Our Destination 5
TLO #1: ELO #1.1
Increasing the Probability of
Program Success Means …
Risk
SOW
Cost
WBS
IMP/IMS
TPM
PMB
Building A Credible Performance Measurement Baseline
This is actually harder than it looks!
6
TLO #1: ELO #1.1
Identifying the TPMs starts with
good Systems Engineering
7
TLO #1: ELO #1.1
Policy Guidance for TPMs
• EIA-632 – TPM involves a technique of predicting
the future value of a key technical performance
parameter of the higher-level end product under
development, based on current assessments of
products lower in the system structure.
• INCOSE Systems Engineering Handbook.
International Council on Systems Engineering
• INCOSE Metrics Guidebook for Integrated System
and Product Development
8
TLO #1: ELO #1.1
The NDIA EVM Intent Guide Says
Notice the inclusion of Technical along with
Cost and Schedule
That’s the next step in generating Value from Earned Value
EV MUST include the Technical Performance Measures
9
TLO #1: ELO #1.1
Some More Guidance
Systems engineering uses technical performance
measurements to balance cost, schedule, and performance
throughout the life cycle. Technical performance
measurements compare actual versus planned technical
development and design. They also report the degree to
which system requirements are met in terms of performance,
cost, schedule, and progress in implementing risk handling.
Performance metrics are traceable to user–defined
capabilities.
― Defense Acquisition Guide
(https://dag.dau.mil/Pages/Default.aspx)
In The End ― It’s All About Systems Engineering
10
TLO #1: ELO #1.1
More Guidance Can Be Found in …
11
TLO #1: ELO #1.1
This Has All Been Said Before.
We Just Weren’t Listening…
… the basic tenets of the process are the need for
seamless management tools, that support an
integrated approach … and “proactive
identification and management of risk” for critical
cost, schedule, and technical performance
parameters.
― Secretary of Defense, Perry memo, May 1995
Why Is This Hard To Understand?
 We seem to be focused on EV reporting, not the use of EV to
manage the program.
 Getting the CPR out the door is treated as the end of Program Planning
and Control’s efforts, not the beginning.
TPM Handbook 1984
12
TLO #1: ELO #1.1
A Final Reminder …
13
TPMs are one source of Risk Management processes
Risk Management Guide for DOD Acquisition, Sixth Edition (Version 1.0), Aug 2006
TLO #1: ELO #1.1
We see TPMs starting with the
Integrated Baseline Review (IBR)
• Confirms that the Performance Measurement
Baseline (PMB) captures the entire technical scope
of work, which is scheduled to meet program
objectives, identifies risks, assigns resources to
meet requirements, and implements management
control processes.
14
TLO #1: ELO #1.2
Currently reported Earned Value data contains invaluable planning
and budgeting information … for program management; however, the
shortcomings of the system are its emphasis on retrospection and
lack of integration with technical achievement.
– Cmdr. Nicholas Pisano
Where can we find IBR
TPM Guidance?
• In the obvious place – the NDIA
Integrated Baseline Review Guide
15
TLO #1: ELO #1.2
Back To Our Technical
Performance Measures
Technical Performance Measures do what
they say:
Measure the Technical Performance
of the product or service produced by the
program.
TLO #2
16
The Primary Role of TPMs is to describe what done
looks like in units meaningful to the decision maker
On 19 October 1899, Robert Goddard decided that he wanted to "fly without wings" to the Moon.
TLO #2
17
To develop the TPMs we first need
to define some other measures
• Measures of Effectiveness (MoE)
• Measures of Performance (MoP)
• Key Performance Parameters (KPP)
– JROC KPPs
– Program KPPs
• Then we can define the Technical Performance
Measures (TPM)
18
Measure of Effectiveness (MoE)
Measures of Effectiveness …
• Are stated in units meaningful to the buyer,
• Focus on capabilities independent of any
technical implementation,
• Are connected to mission success.
The operational measures of success that are closely related to the
achievements of the mission or operational objectives evaluated in
the operational environment, under a specific set of conditions.
“Technical Measurement,” INCOSE–TP–2003–020–01
MoEs Belong to the End User
TLO #2
19
Measure of Performance (MoP)
Measures of Performance are …
• Attributes that assure the system has the
capability to perform,
• Assessment of the system to assure it meets
design requirements to satisfy the MoE.
Measures that characterize physical or functional attributes
relating to the system operation, measured or estimated
under specific conditions.
“Technical Measurement,” INCOSE–TP–2003–020–01
MoPs belong to the Program – Developed by the Systems
Engineer, Measured By CAMs, and Analyzed by PP&C
TLO #2
20
Key Performance Parameters (KPP)
Key Performance Parameters …
• Have a threshold or objective value,
• Characterize the major drivers of performance,
• Are considered Critical to Customer (CTC).
Represent the capabilities and characteristics so
significant that failure to meet them can be cause for
reevaluation, reassessing, or termination of the program
“Technical Measurement,” INCOSE–TP–2003–020–01
The acquirer defines the KPPs during the operational
concept development – KPPs say what DONE looks like
TLO #2
21
Technical Performance Measures
(TPM)
“Technical Measurement,” INCOSE–TP–2003–020–01
Technical Performance Measures …
• Assess design progress,
• Define compliance to performance requirements,
• Identify technical risk,
• Are limited to critical thresholds,
• Include projected performance.
Attributes that determine how well a system or system
element is satisfying or expected to satisfy a technical
requirement or goal
TLO #2
22
From Mission to TPM
“Coming to Grips with Measures of
Effectiveness,” N. Sproles, Systems
Engineering, Volume 3, Number 1, pp. 50–58
Mission Need → MoE → KPP → MoP → TPM
The Acquirer defines the Needs and Capabilities in terms of Operational Scenarios.
The Supplier defines Physical Solutions that meet the needs of the Stakeholders.
MoE – Operational measures of success related to the achievement of the mission or operational objective being evaluated.
MoP – Measures that characterize physical or functional attributes relating to the system operation.
TPM – Measures used to assess design progress, compliance to performance requirements, and technical risks.
TLO #2
23
One more reminder
24
IPMC November 2012, Gordon Kranz, PARCA
TLO #2: ELO #2.1
TPMs start with the WBS
25
TLO #2: ELO #2.1
• Each “end item” in the
WBS needs a TPM to
confirm the work
produced the planned
value.
• These measures can
be found in the WBS
Dictionary, the “exit
criteria” for the Work
Packages and other
engineering
documents
26
 This is where TPMs are connected with the MoEs and MoPs.
 For each deliverable from the program, all the “measures” must be defined in units meaningful to the decision makers.
 Here are some “real” examples.
Connecting the MoE, MoP, KPP, and TPMs

Mission Capabilities and Operational Need

Measures of Effectiveness (MoE)
1. Provide Precision Approach for a 200 FT/0.5 NM DH
2. Provide bearing and range to AC platform
3. Provide AC surveillance to GRND platform

JROC Key Performance Parameters (KPP)
1. Net Ready
2. Guidance Quality
3. Land Interoperability
4. Manpower
5. Availability

Technical Insight – Risk-adjusted performance to plan

Measures of Performance (MoP)
1. Net Ready – IPv4/6 compliance; 1Gb Ethernet
2. Guidance Quality – Accuracy threshold p70 @ 6M; Integrity threshold 4M @ 10^-6/approach
3. Land Interoperability – Processing capability meets LB growth matrix
4. Manpower – MTBC >1000 hrs; MCM < 2 hrs
5. Availability – Clear threshold >99%; Jam threshold >90%

Technical Performance Measures (TPM)
1. Net Ready – Standard message packets
2. Guidance Quality – Multipath allocation budget; Multipath bias protection
3. Land Interoperability – MOSA compliant; Civil compliant
4. Manpower – Operating elapsed time meters; Standby elapsed time indicators
5. Availability – Phase center variations
27
TLO #2: ELO #2.1
IDENTIFYING TECHNICAL
PERFORMANCE MEASURES
Technical performance measures must be technical and must describe
the performance in units of measure meaningful to the decision makers.
TPMs are not counts of things delivered. TPMs are not assessments of
completed work through effort or cost.
TPMs are measures of Technical Performance
TLO #3
28
Measuring Cost and Schedule but
NOT measuring Compliance with
Technical Performance Specifications
TLO #3
29
Measures of Technical Parameters
INCOSE Systems Engineering Handbook
Attribute – Description
Achieved to Date – Measured technical progress or estimate of progress in units of performance
Current Estimate – Value of a technical parameter that is predicted to be achieved now or in the future
Milestone – Point in time when an evaluation of the technical measure is accomplished
Planned Value – Predicted value of the technical parameter
Planned Performance Profile – Profile representing the project’s time-phased demonstration of a technical parameter
Tolerance Band – Management alert limits
Threshold – Limiting acceptable value of a technical parameter
Variances – Demonstrated technical variance; Predicted technical variance
TLO #3: ELO # 3.1
30
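The attribute set above maps naturally onto a small data structure. Below is a hedged Python sketch: the field names paraphrase the INCOSE table, the planned and achieved figures reuse the UAV airframe-weight example that appears later in this deck, and the current estimate of 23.8 kg is an assumed value for illustration only.

```python
# Illustrative only: the INCOSE technical-parameter attributes as a data structure.
from dataclasses import dataclass, field

@dataclass
class TechnicalParameter:
    name: str
    planned_profile: dict          # milestone -> planned value (time-phased demonstration)
    tolerance_band: float          # management alert limit around the planned value
    threshold: float               # limiting acceptable value of the parameter
    achieved_to_date: dict = field(default_factory=dict)   # milestone -> measured/estimated value
    current_estimate: float = None # value predicted to be achieved now or in the future

    def planned_value(self, milestone):
        return self.planned_profile[milestone]

    def demonstrated_variance(self, milestone):
        """Demonstrated technical variance at a completed milestone."""
        return self.achieved_to_date[milestone] - self.planned_profile[milestone]

    def predicted_variance(self, final_milestone):
        """Predicted technical variance against the end-state planned value."""
        return self.current_estimate - self.planned_profile[final_milestone]

weight = TechnicalParameter(
    name="Airframe weight (kg)",
    planned_profile={"CA": 28.0, "SFR": 27.0, "SRR": 26.0, "PDR": 25.0, "CDR": 24.0, "TRR": 23.0},
    tolerance_band=0.5, threshold=28.0,
    achieved_to_date={"CA": 30.4, "SFR": 29.0, "SRR": 27.5, "PDR": 25.5},
    current_estimate=23.8,
)
print(weight.demonstrated_variance("PDR"))   # 0.5 kg over plan at PDR
print(weight.predicted_variance("TRR"))      # 0.8 kg over the TRR planned value
```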
A Familiar Graphic for TPMs
[Chart: a Technical Performance Measure (here, Mean Time Between Failure) plotted against Time = Program Maturity, showing the Planned Value, Planned Profile, Current Estimate, Achieved to Date, Milestones, Threshold, Upper Limit, Lower Limit, and the Variance between planned and achieved]
31
TLO #3: ELO # 3.1
Simple Method of Assembling TPMs
1. Parameters – Select the Technical Performance Parameters (e.g., Weight, Speed, MTBF, Loiter Time, each with a target value), drawn from the MOEs and MOPs.
2. Progress – Define the planned progress for each TPM (the KPP / TPM planned values).
3. Risk – Assess the impact on Risk from this progress.
32
TLO #3: ELO # 3.1
TPMs from an Actual Program
James Webb Space Telescope
33
TLO #3: ELO # 3.1
TPMs from an Actual Program
Chandra X–Ray Telescope
34
TLO #3: ELO # 3.1
1.1 Air Vehicle
1.1.1 Sensor Platform
1.1.2 Airframe
1.1.3 Propulsion
1.1.4 On Board Comm
1.1.5 Auxiliary Equipment
1.1.6 Survivability
Modules
1.1.7 Electronic Warfare
Module
1.1.8 On Board
Application &
System SW
1.3 Mission Control /
Ground Station SW
1.3.1 Signal Processing
SW
1.3.2 Station Display
1.3.3 Operating System
1.3.4 ROE Simulations
1.3.5 Mission Commands
TPMs Start With The WBS
The WBS for a UAV
1.1.2 Airframe
35
TLO #3: ELO # 3.1
What Do We Need To Know About
This Program Through The TPMs?
• What WBS elements represent the TPMs?
• What Work Packages produce these WBS elements?
• Where do these Work Packages live in the IMS?
• What are the Earned Value baseline values for these
Work Packages?
• How are we going to measure all these variables?
• What does the curve look like for these
measurements?
36
TLO #3: ELO # 3.1
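One way to keep the answers to these questions in one place is a simple traceability record per TPM. The sketch below is hypothetical — the WBS numbers, Work Package identifiers, BCWS figures, and measurement methods are invented for illustration — but it shows the linkage the questions ask for: TPM to WBS element, to the Work Packages in the IMS, to their Earned Value baseline.

```python
# Hypothetical traceability record for one TPM (all identifiers and budget
# figures are illustrative only).
airframe_weight_tpm = {
    "tpm": "Airframe weight <= 23.0 kg at TRR",
    "wbs_element": "1.1.2 Airframe",
    "work_packages": {            # Work Packages in the IMS that produce the element
        "WP-1.1.2-010 Preliminary structural design": {"bcws": 250_000},
        "WP-1.1.2-020 Detailed design & mass model":  {"bcws": 400_000},
        "WP-1.1.2-030 First-article fabrication":     {"bcws": 600_000},
    },
    "measurement_method": "mass model at PDR, prototype weighing at CDR, flight article at TRR",
    "planned_profile_kg": {"CA": 28.0, "SFR": 27.0, "SRR": 26.0, "PDR": 25.0, "CDR": 24.0, "TRR": 23.0},
}

# EV money "covered" by this TPM: the baseline value of the work that must
# produce the technical result.
covered_bcws = sum(wp["bcws"] for wp in airframe_weight_tpm["work_packages"].values())
print(f"BCWS covered by the airframe-weight TPM: ${covered_bcws:,}")
```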
Verifying Each TPM at Each Program Event
Evidence that we’re in compliance

CA – Do we know what we promised to deliver, now that we’ve won?
With our submitted ROM, what are the values we need to get through the Integrated Baseline Review? How do we measure weight for each program event?

SFR – Can we proceed into preliminary design?
The contributors to the vehicle weight are confirmed and the upper limits defined in the product architecture and the requirements flow-down database (DOORS) into a model.

SRR – Can we proceed into the System Development and Demonstration (SDD) phase?
Do we know all the drivers of vehicle weight? Can we bound their upper limits? Can the subsystem owners be successful within these constraints, using a high-fidelity model?

PDR – Can we start detailed design, and meet the stated performance requirements within cost, schedule, risk, and other constraints?
Does each subsystem designer have the component weight target, and some confidence they can stay below the upper bound? Can this be verified in some tangible way, either through prior examples or a lab model?

CDR – Can the system proceed to fabrication, demonstration, and test, within cost, schedule, risk, and other system constraints?
Do we know all we need to know to start fabrication of the first articles of the flight vehicle? Some type of example, maybe a prototype, is used to verify we’re inside the lines.

TRR – Is the system ready to proceed into formal test?
Does the assembled vehicle fall within the weight range limits for 1st flight – will this thing get off the ground? 37
TLO #3: ELO # 3.1
What Do We Need To Know About
This Program Through TPMs
• What WBS elements represent the TPMs?
• What Work Packages produce these WBS
elements?
• Where do these Work Packages live in the IMS?
• What are the Earned Value baseline values for
these Work Packages?
• How are we going to measure all these variables?
• What does the curve look like for these
measurements?
38
TLO #3: ELO # 3.1
Let’s Connect The Dots
SOW – Named Deliverables defined in the WBS.
WBS – The Products and Processes that produce them, in a “well structured” decomposition in the WBS.
Cost – BCWS at the Work Package, rolled to the Control Account.
IMP/IMS – The IMS contains all the Work Packages, BCWS, and Risk mitigation plans, and rolls to the Integrated Master Plan to measure increasing maturity.
Risk – Technical and Programmatic Risks connected to the WBS and IMS.
TPM – TPMs attached to each critical deliverable in the WBS and identified in each Work Package in the IMS, used to assess maturity in the IMP.
PMB – All of these components connect to form the Performance Measurement Baseline.
39
TLO #3: ELO # 3.1
TPM Trends & Responses
Dr. Falk Chart – modified
EV Taken, planned values met, tolerances kept, etc.
[Chart: Technical Performance Measure – Vehicle Weight (roughly 23–28 kg) plotted against program events (CA, SFR, SRR, PDR, CDR, TRR), with the basis of each value: ROM in Proposal, Design Model, Bench Scale Model Measurement, Detailed Design Model, Prototype Measurement, Flight 1st Article]
40
TLO #3: ELO # 3.1
The Assessment Of Weight As A
Function Of Time
• At Contract Award there is a proposal-grade estimate of
vehicle weight.
• At System Functional Review, the Concept of Operations is
validated for the weight.
• At System Requirements Review the weight targets are
flowed down to the subsystem components.
• At PDR the CAD model starts the verification process.
• At CDR actual measurements are needed to verify all
models.
• At Test Readiness Review we need to know how much fuel
to put on board for the 1st flight test.
41
TLO #3: ELO # 3.1
1.1 Air Vehicle
1.1.1 Sensor Platform
1.1.2 Airframe
The WBS for a UAV – 1.1.2 Airframe: Airframe Weight TPM

Planned Value: CA 28.0 kg, SFR 27.0 kg, SRR 26.0 kg, PDR 25.0 kg, CDR 24.0 kg, TRR 23.0 kg
Actual Value: CA 30.4 kg, SFR 29.0 kg, SRR 27.5 kg, PDR 25.5 kg
Assessed Risk to TRR: CA Moderate (>2.0 kg off target), SFR Low (1–2 kg off target), SRR Low (1–2 kg off target), PDR Very Low (less than 1.0 kg off target)
Basis of the value: CA planned from a “similar to” program, SFR program-unique design, SRR actual measurement, PDR actual measurement

The planned weight is 25 kg. The actual weight is 25.5 kg.
Close to plan! So we are doing okay, right?
Here’s the Problem
42
TLO #3: ELO # 3.1
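The arithmetic behind “Here’s the Problem” can be made explicit with the table’s own numbers. A short sketch: the point variance at PDR is only 0.5 kg, but the weight reduction still required to reach the 23.0 kg TRR target is now 25% larger than the plan assumed.

```python
# Why "only 0.5 kg over plan at PDR" can still be a problem: measure progress
# against the remaining planned weight reduction, not just the point variance.
planned = {"CA": 28.0, "SFR": 27.0, "SRR": 26.0, "PDR": 25.0, "CDR": 24.0, "TRR": 23.0}
actual_pdr = 25.5
trr_target = planned["TRR"]

point_variance = actual_pdr - planned["PDR"]              # +0.5 kg over plan at PDR
planned_reduction_left = planned["PDR"] - trr_target      # 2.0 kg the plan still removes
required_reduction_left = actual_pdr - trr_target         # 2.5 kg we actually must remove

print(f"Point variance at PDR: {point_variance:+.1f} kg")
print(f"Planned reduction remaining: {planned_reduction_left:.1f} kg")
print(f"Required reduction remaining: {required_reduction_left:.1f} kg "
      f"({required_reduction_left / planned_reduction_left:.0%} of plan)")
# 125% of the planned remaining reduction is now needed - the risk to the TRR
# target is larger than the 0.5 kg variance suggests.
```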
Raison d'être for Technical
Performance Measures
The real purpose of
Technical Performance
Measures is to reduce
Programmatic and
Technical RISK
Risk
SOW
Cost
WBS
IMP/IMS
TPM
PMB
TLO #4
43
Can TPMs be our Leading Indicator?
TLO #4
44
Key Elements of Good TPMs
• Traceability – Requirements to WBS to TPMs to EV
control accounts.
• Impact – How much WBS work, and therefore EV
money, is covered by the TPM(s)? What is the effect?
• TPM Banding/Sensitivity – What banding (R/Y/G)
and sensitivity (EV impact) should be used for each
TPM?
• Technology Readiness Level – What’s the state of the
technology supporting the requirement(s) for which
the TPM is a metric?
Implementing Technical Performance Measurement, Mike Ferraro, General Engineer, PEO/SYSCOM Conference
45
TLO #4: ELO # 4.1
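Below is a hedged sketch of the banding and impact ideas above. The 5% / 10% band limits, the TPM list, and the BCWS figures are assumptions for illustration, not values from the cited presentation; the point is the mechanics of classifying each TPM R/Y/G against its tolerance and rolling up how much baseline budget sits behind each band.

```python
# Illustrative only: classify TPMs into R/Y/G bands and report the BCWS
# exposure behind each band.
def band(variance_pct, yellow_at=5.0, red_at=10.0):
    """Map a percent deviation from the planned value to a stoplight band."""
    v = abs(variance_pct)
    if v >= red_at:
        return "RED"
    if v >= yellow_at:
        return "YELLOW"
    return "GREEN"

tpms = [
    # (name, planned value, current estimate, BCWS covered) - made-up data
    ("Airframe weight (kg)",   25.0,  25.5, 1_250_000),
    ("Loiter time (hr)",       12.0,  10.5,   900_000),
    ("MTBF (hr)",            1000.0, 940.0,   600_000),
]

exposure = {"GREEN": 0, "YELLOW": 0, "RED": 0}
for name, plan, current, bcws in tpms:
    pct = 100.0 * (current - plan) / plan
    b = band(pct)
    exposure[b] += bcws
    print(f"{name:22s} {pct:+6.1f}%  {b}")

print("BCWS exposure by band:", exposure)
```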
Forecasting the Future
Stationary
• The underlying governing rule does not change
– The program baseline is stable
• Future behavior can be induced from the past
– Past performance will likely continue in the future
• We have a sufficient record of the past to
understand the underlying nature
– UN/CEFACT XML now gives us this
46
TLO #4: ELO # 4.1
Forecasting the Future with
statistical analysis of the Past
47
TLO #4: ELO # 4.1
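A minimal sketch of what “statistical analysis of the past” can look like for a single TPM, assuming the stationarity conditions on the previous slide hold: fit a least-squares trend to past observations and project it to a future milestone. The month numbers and weights below are illustrative, reusing the airframe-weight example.

```python
# Illustrative trend forecast for one TPM from past observations.
import numpy as np

# Months into the program vs. measured airframe weight (made-up data).
months  = np.array([0, 6, 12, 18])
weights = np.array([30.4, 29.0, 27.5, 25.5])

slope, intercept = np.polyfit(months, weights, 1)   # least-squares line
trr_month = 30
forecast_at_trr = slope * trr_month + intercept

print(f"Trend: {slope:+.2f} kg/month")
print(f"Forecast weight at TRR (month {trr_month}): {forecast_at_trr:.1f} kg vs. 23.0 kg target")
```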
Connecting the EV Variables
Integrating Cost, Schedule, and Technical Performance
assures Program Management has the needed performance information to deliver
on-time, on-budget, and on-specification.

Cost + Schedule = Conventional Earned Value – the Technical Performance Measures must be added to complete the picture.

Cost Baseline
 Master Schedule is used to
derive the Basis of Estimate
(BOE), not the other way
around.
 Probabilistic cost
estimating uses past
performance and cost risk
modeling.
 Labor, Materiel, and other
direct costs accounted for
in Work Packages.
 Risk adjustments for all
elements of cost.

Schedule Baseline
 Requirements are
decomposed into physical
deliverables.
 Deliverables are produced
through Work Packages.
 Work Packages are
assigned to an accountable
manager.
 Work Packages are
sequenced to form the
highest value stream with
the lowest technical and
programmatic risk.

Technical Performance
 Earned Value is diluted by
missing technical
performance.
 Earned Value is diluted by
postponed features.
 Earned Value is diluted by
non-compliant quality.
 All these dilutions require
adjustments to the
Estimate at Complete
(EAC) and the To Complete
Performance Index (TCPI).
48
TLO #4: ELO # 4.1
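The sketch below shows one possible way — an illustration, not a formula prescribed by the EVM guides — to make the “dilution” above concrete: discount the claimed BCWP by the fraction of the work’s TPMs that are within tolerance, then carry the adjusted value into the CPI-based EAC and TCPI. All figures are made up for illustration.

```python
# Illustrative adjustment of earned value for technical performance shortfall.
def adjusted_ev(bcwp_claimed, tpms_within_tolerance, tpms_total):
    """Discount claimed BCWP by the share of TPMs meeting their tolerance."""
    compliance = tpms_within_tolerance / tpms_total
    return bcwp_claimed * compliance

bac, acwp, bcwp_claimed = 10_000_000, 4_500_000, 5_000_000
bcwp = adjusted_ev(bcwp_claimed, tpms_within_tolerance=3, tpms_total=4)   # 3,750,000

cpi = bcwp / acwp                        # cost performance index on adjusted BCWP
eac = acwp + (bac - bcwp) / cpi          # classic CPI-based Estimate at Complete
tcpi = (bac - bcwp) / (bac - acwp)       # To Complete Performance Index

print(f"Adjusted BCWP: ${bcwp:,.0f}  CPI: {cpi:.2f}  EAC: ${eac:,.0f}  TCPI: {tcpi:.2f}")
```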
TPM Checklist
MoE
– Traceable to needs, goals, objectives, and risks.
– Defined with associated KPPs.
– Each MoE independent from the others.
– Each MoE independent of any technical solution.
– Address the required KPPs.

MoP
– Traceable to applicable MoEs, KPPs, system-level performance requirements, and risks.
– Focused on technical risks and supports trades between alternative solutions.
– Provides insight into system performance.
– Decomposed, budgeted, and allocated to system elements.
– Assigned an “owner,” the CAM and Technical Manager.

TPM
– Traceable to applicable MoPs, system element performance requirements, objectives, risks, and WBS elements.
– Further decomposed, budgeted, and allocated to lower-level system elements in the WBS and IMS.
– Assigned an owner, the CAM and Work Package Manager.
– Sources of measure identified and processes for generating the measures defined.
– Integrated into the program’s IMS as part of the exit criteria for the Work Package. 49
TLO #4: ELO # 4.1
Interesting Attributes of TPMs
• Achieved to Date (sounds like EV)
• Current Estimate (sounds like EAC/ETC)
• Milestone
• Planned (target) value (sounds like PV)
• Planned performance profile (sounds like a PMB)
• Tolerance band (sounds like reporting thresholds)
• Threshold (yep, just what we thought)
• Variance (sounds like variance!)
50
TLO #4: ELO # 4.1
Summary of TPM Benefits
• Measures results based on technical achievement
• Integrates technical performance with budget and
schedule – providing an integrated systems
approach to performance measurement
• Establishes the linkage between engineering and
cost/schedule considerations in program
management decision making
• Measures the programmatic impact of technical
variance in terms of risk
• Reduces subjectivity through the establishment of
a technical performance baseline
51
TLO #4: ELO # 4.1
52
DOD Guides:
Technical Performance
Department of Defense Guidelines for Technical Performance Measures
DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08
Interim Defense Acquisition Guidebook (DAG) 6/15/09
Systems Engineering Plan (SEP) Preparation Guide 4/08
WBS Handbook, Mil–HDBK–881A (WBS) 7/30/05
Integrated Master Plan (IMP) & Integrated Master Schedule Preparation &
Use Guide (IMS) 10/21/05
53
TLO #4: ELO # 4.1
TPMs in DAG and DAPS
Defense Acquisition Guide
• Performance measurement of WBS elements, using objective
measures:
– Essential for EVM and Technical Assessment activities
• Use TPMs and Critical Technical Parameters (CTP) to report
progress in achieving milestones
DAPS
• Use TPMs to determine whether % completion metrics
accurately reflect quantitative technical progress and quality
toward meeting Key Performance Parameters (KPP) and Critical
Technical Parameters
54
TPMs in DAG
• Compare the actual versus planned technical
development and design
• Report progress in the degree to which system
performance requirements are met.
• Plan is defined in terms of:
– Expected performance at specific points
• Defined in the WBS and IMS
– Methods of measurement at those points
– Variation limits for corrective action.
55
NASA EVM Guide:
Technical Performance
• NASA EVM Guide NPG 9501.3
– 4.5 Technical Performance Requirements (TPR): when TPRs are used,
appropriate and relevant metrics must be defined in the solicitation.
– Appendix A.7, 14.1 TPR: compares expected performance and physical
characteristics with contractually specified values; the basis for reporting
established milestones and progress toward meeting technical requirements.
56
Derivation and Flow Down of TPMs
Document / Baseline (IMS, EVM) → Parameter
– IMP, Functional Baseline → Measures of Effectiveness (MoE)
– IMP, WBS, Functional Baseline → Measures of Performance (MoP)
– IMS, Allocated Baseline → Technical Performance Measure
– IMS → TPM Milestones and Planned Values
– Work Packages → TPM % Complete Criteria
57