CPM-500(D) : Implementing Technical Performance
Measures
Glen B. Alleman
DoD Programs
glen.alleman@niwotridge.com
+1 303 241 9633
PMI EVM Community of Practice
IPM 2011
Learning Objectives
2/53
TLO #9: The student will understand the role of Technical Performance Measurement
(TPM) in the project office.
ELO #1: The student will recognize the policy requirements for Technical Performance
Measures.
ELO #2: The student will recognize the role of Integrated Baseline Reviews in confirming the
entire technical scope of work has been planned.
ELO #3: The student will recognize the role of the WBS in supporting Technical Performance
Measure requirements.
TLO #9: The student will understand the scope of DCMA’s (or other) TPM software
management tool implementation.
ELO #1: The student will recognize the benefits and challenges of Technical Performance
Measure implementation.
ELO #2: The student will recognize the use of control limit charts to track Technical
Performance Measure metrics.
ELO #3: The student will understand the methodology and approach used to show the
effect of Technical Performance Measure on Earned Value.
To Achieve Success …
3/53
We Need to …
©gapingvoid ltd www.gapingvoidgallery.com
Increasing the Probability of
Program Success Means …
Risk
SOW
Cost
WBS
IMP/IMS
TPM
PMB
Building A Credible Performance Measurement Baseline
This is actually harder than it looks!
4/53
A Core Problem With Earned
Value
 Earned Value measures
performance in units
of “money” (BCWS,
BCWP, ACWP).
 We need another
measure of progress in
units of TIME.
5/53
Measures Of Progress Must Be In Units
Meaningful To The Stakeholders
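One common way to get that time-denominated measure is the Earned Schedule technique: find the point on the planned BCWS curve where today's cumulative BCWP was supposed to occur, and read progress off the calendar. The slide does not prescribe this method, so the sketch below is a hedged illustration with hypothetical numbers.

```python
# A minimal sketch (not from the slides) of restating progress in units of
# TIME: Earned Schedule finds where on the planned BCWS curve the cumulative
# BCWP "should" have occurred. All data below are hypothetical.
from bisect import bisect_right

bcws = [100, 250, 450, 700, 1000]   # planned value, cumulative, by month
bcwp_to_date = 380                  # earned value, cumulative, at month 4

def earned_schedule(bcws_cum, bcwp):
    """Months of schedule actually earned: whole months where the plan was
    fully met, plus a linear fraction into the next month."""
    n = bisect_right(bcws_cum, bcwp)          # whole months fully earned
    if n == 0:
        return bcwp / bcws_cum[0]
    if n == len(bcws_cum):
        return float(n)
    prior = bcws_cum[n - 1]
    return n + (bcwp - prior) / (bcws_cum[n] - prior)

es = earned_schedule(bcws, bcwp_to_date)      # ~2.65 months earned
sv_time = es - 4                              # vs. 4 actual months: -1.35
print(f"Earned Schedule = {es:.2f} months, SV(t) = {sv_time:.2f} months")
```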
Doing This Starts With Some Guidance
6/53
Systems engineering uses technical performance
measurements to balance cost, schedule, and performance
throughout the life cycle. Technical performance
measurements compare actual versus planned technical
development and design. They also report the degree to
which system requirements are met in terms of performance,
cost, schedule, and progress in implementing risk handling.
Performance metrics are traceable to user–defined
capabilities.
― Defense Acquisition Guide
(https://dag.dau.mil/Pages/Default.aspx)
In The End ― It’s All About Systems Engineering
Just A Reminder Of The …
Primary Elements of Earned Value
[Diagram: three overlapping elements – Cost, Technical Performance, and Schedule – with the overlaps labeled: funding margin for underperformance; schedule margin for an over target baseline (OTB); schedule margin for underperformance or schedule extension; over cost or underperformance; over cost or over schedule; over schedule or underperforming]
7/53
This Has All Been Said Before.
We Just Weren’t Listening…
… the basic tenets of the process are the need for
seamless management tools, that support an
integrated approach … and “proactive
identification and management of risk” for critical
cost, schedule, and technical performance
parameters.
― Secretary of Defense, Perry memo, May 1995
Why Is This Hard To Understand?
 We seem to be focused on EV reporting, not the use of EV to
manage the program.
 Getting the CPR out the door is the end of Program Planning
and Control’s efforts, not the beginning.
8/53
TPM Handbook 1984
The Gap Seems To Start With A
Common Problem
Many Times, The Information from Cost, Schedule, Technical
Performance, and Risk Management Gets Mixed Up When We
Try to Put Them Together
9/53
The NDIA EVM Intent Guide Says
Notice the inclusion of Technical along with
Cost and Schedule
That’s the next step in generating Value from Earned Value
EV MUST include the Technical Performance Measures
10/53
Back To Our Technical
Performance Measures
Technical Performance Measures do what
they say:
Measure the Technical Performance
of the product or service produced by the
program.
11/53
The real question?
How fast can we safely go?
Yes, the Units of Measure are MPH
12/53
Measure of Effectiveness (MoE)
Measures of Effectiveness …
 Are stated in units meaningful to the buyer,
 Focus on capabilities independent of any
technical implementation,
 Are connected to the mission success.
The operational measures of success that are closely related to the
achievements of the mission or operational objectives evaluated in
the operational environment, under a specific set of conditions.
“Technical Measurement,” INCOSE–TP–2003–020–01
MoE’s Belong to the End User
13/53
Measure of Performance (MoP)
Measures of Performance are …
 Attributes that assure the system has the
capability to perform,
 Assessment of the system to assure it meets
design requirements to satisfy the MoE.
Measures that characterize physical or functional attributes
relating to the system operation, measured or estimated
under specific conditions.
“Technical Measurement,” INCOSE–TP–2003–020–01
MoP’s belong to the Program – Developed by the Systems
Engineer, Measured By CAMs, and Analyzed by PP&C
14/53
Key Performance Parameters (KPP)
Key Performance Parameters …
 Have a threshold or objective value,
 Characterize the major drivers of performance,
 Are considered Critical to Customer (CTC).
Represent the capabilities and characteristics so
significant that failure to meet them can be cause for
reevaluation, reassessing, or termination of the program
“Technical Measurement,” INCOSE–TP–2003–020–01
The acquirer defines the KPPs during the operational
concept development – KPPs say what DONE looks like
15/53
Technical Performance Measures (TPM)
“Technical Measurement,” INCOSE–TP–2003–020–01
Technical Performance Measures …
 Assess design progress,
 Define compliance to performance requirements,
 Identify technical risk,
 Are limited to critical thresholds,
 Include projected performance.
Attributes that determine how well a system or system
element is satisfying or expected to satisfy a technical
requirement or goal
16/53
Dependencies Between These Measures
17/53
“Coming to Grips with Measures of Effectiveness,” N. Sproles,
Systems Engineering, Volume 3, Number 1, pp. 50–58
[Diagram: the Mission Need flows down to the MoE and KPP, which flow down to the MoP and TPM. The Acquirer defines the needs and capabilities in terms of Operational Scenarios; the Supplier defines physical solutions that meet the needs of the Stakeholders.]
MoE: operational measures of success related to the achievement of the mission or operational objective being evaluated.
MoP: measures that characterize physical or functional attributes relating to the system operation.
TPM: measures used to assess design progress, compliance to performance requirements, and technical risks.
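The dependency chain in this diagram can be made concrete as a small data model. A hedged sketch, with illustrative names and fields (none of this is prescribed by the deck):

```python
# An illustrative model of the traceability chain: the acquirer's MoEs flow
# down to supplier MoPs, which are assessed by TPMs. Names are hypothetical.
from dataclasses import dataclass

@dataclass
class MoE:                      # acquirer-owned, stated in mission terms
    statement: str

@dataclass
class MoP:                      # supplier-owned, physical/functional terms
    attribute: str
    satisfies: MoE

@dataclass
class TPM:                      # assesses design progress toward an MoP
    name: str
    assesses: MoP
    threshold: float            # limiting acceptable value

moe = MoE("Surveil the target area for 8 hours unrefueled")
mop = MoP("Air vehicle endurance (hours)", satisfies=moe)
tpm = TPM("Airframe weight (kg)", assesses=mop, threshold=26.0)
```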
“Measures” of Technical Measures
INCOSE Systems Engineering Handbook
Achieved to Date: measured technical progress or estimate of progress
Current Estimate: value of a technical parameter that is predicted to be achieved
Milestone: point in time when an evaluation of a measure is accomplished
Planned Value: predicted value of the technical parameter
Planned Performance Profile: profile representing the project’s time-phased demonstration of a technical parameter
Tolerance Band: management alert limits
Threshold: limiting acceptable value of a technical parameter
Variances: demonstrated technical variance; predicted technical variance
18/53
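These attributes map naturally onto a record with two computed variances, matching the handbook's distinction between demonstrated (achieved vs. planned) and predicted (current estimate vs. planned) variance. A minimal sketch; the field names and the alert rule are my assumptions:

```python
# A hedged sketch of the INCOSE attribute set as a data record. The two
# variances follow the definitions above; the alert rule is illustrative.
from dataclasses import dataclass

@dataclass
class TpmObservation:
    milestone: str            # point in time where the measure is evaluated
    planned_value: float      # predicted value of the technical parameter
    achieved_to_date: float   # measured technical progress
    current_estimate: float   # value predicted to be achieved
    tolerance_band: float     # management alert limit (+/- around plan)
    threshold: float          # limiting acceptable value

    @property
    def demonstrated_variance(self) -> float:
        return self.achieved_to_date - self.planned_value

    @property
    def predicted_variance(self) -> float:
        return self.current_estimate - self.planned_value

    def alert(self) -> bool:
        """True when achievement drifts outside the tolerance band."""
        return abs(self.demonstrated_variance) > self.tolerance_band

obs = TpmObservation("PDR", planned_value=25.0, achieved_to_date=25.5,
                     current_estimate=24.2, tolerance_band=1.0, threshold=28.0)
print(obs.demonstrated_variance, obs.alert())   # 0.5, False
```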
A Familiar Graphic of TPMs
[Chart: a TPM profile plotting Mean Time Between Failure against Time = Program Maturity, showing the Planned Value, Planned Profile, Current Estimate, Achieved to Date, Variance, Milestones, Threshold, and the Upper and Lower Limits of the tolerance band]
19/53
TPMs from an Actual Program
Chandra X–Ray Telescope
20/53
What Does A Real Technical
Performance Measure Look Like?
Not that the bagels in Lessons 1 and
2 weren’t interesting, but let’s get
ready to look at a flying machine.
21/53
1.1 Air Vehicle
1.1.1 Sensor Platform
1.1.2 Airframe
1.1.3 Propulsion
1.1.4 On Board Comm
1.1.5 Auxiliary Equipment
1.1.6 Survivability
Modules
1.1.7 Electronic Warfare
Module
1.1.8 On Board
Application &
System SW
1.3 Mission Control /
Ground Station SW
1.3.1 Signal Processing
SW
1.3.2 Station Display
1.3.3 Operating System
1.3.4 ROE Simulations
1.3.5 Mission Commands
TPMs Start With The WBS
The WBS for a UAV
1.1.2 Airframe
22/53
What Do We Need To Know About
This Program Through TPMs
 What WBS elements represent the TPMs?
 What Work Packages produce these WBS elements?
 Where do these Work Packages live in the IMS?
 What are the Earned Value baseline values for these
Work Packages?
 How are we going to measure all these variables?
 What does the curve look like for these
measurements?
23/53
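Answering these questions amounts to building a traceability record from each TPM to its WBS element, the Work Packages that produce it, their place in the IMS, and their EV baseline values. An illustrative sketch with hypothetical identifiers and numbers:

```python
# An illustrative linkage (names and values are hypothetical) from a TPM to
# its WBS element, the producing Work Packages, their IMS tasks, and their
# Earned Value baseline.
from dataclasses import dataclass

@dataclass
class WorkPackage:
    wbs_element: str        # e.g. "1.1.2 Airframe"
    ims_task_id: str        # where the package lives in the IMS
    bcws: float             # its Earned Value baseline (budgeted cost)

@dataclass
class TpmLink:
    tpm_name: str
    work_packages: list[WorkPackage]

airframe_weight = TpmLink(
    "Airframe weight (kg)",
    [WorkPackage("1.1.2 Airframe", ims_task_id="IMS-0420", bcws=1_250_000.0)],
)
```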
Verifying Each TPM
Evidence that we’re in compliance
CA
Do we know what we promised to
deliver, now that we’ve won?
With our submitted ROM, what are the values we need to get
through the Integrated Baseline Review? How do we measure
weight at each program event?
SFR
Can we proceed into preliminary
design?
The contributors to the vehicle weight are confirmed and the
upper limits defined in the product architecture and
requirements flow down database (DOORS) into a model.
SRR
Can we proceed into the System
Development and Demonstration
(SDD) phase?
Do we know all the drivers of vehicle weight? Can we bound their
upper limits? Can the subsystem owners be successful within
these constraints, using a high-fidelity model?
PDR
Can we start detailed design, and
meet the stated performance
requirements within cost, schedule,
risk, and other constraints?
Does each subsystem designer have the component weight
target, and some confidence they can stay below the upper
bound? Can this be verified in some tangible way, either
through prior examples or a lab model?
CDR
Can the system proceed to
fabrication, demonstration, and test,
within cost, schedule, risk, and other
system constraints?
Do we know all we need to know to start the fabrication of
the first articles of the flight vehicle? Some type of example,
maybe a prototype, is used to verify we’re inside the lines.
TRR
Is the system ready to
proceed into formal test?
Does the assembled vehicle fall within the weight range limits
for 1st flight – will this thing get off the ground?
24/53
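The table above is, in effect, a gate checklist: each program event pairs a question with the evidence that answers it for the weight TPM. A compact restatement (the dictionary layout is mine; the wording paraphrases the table):

```python
# An illustrative restatement of the gate table: each review pairs its gate
# question with the evidence that answers it for the weight TPM.
TPM_GATES = {
    "CA":  ("What did we promise to deliver?",
            "ROM values to carry into the Integrated Baseline Review"),
    "SFR": ("Can we proceed into preliminary design?",
            "Weight contributors and upper limits in the requirements database"),
    "SRR": ("Can we proceed into SDD?",
            "Bounded weight drivers in a high-fidelity model"),
    "PDR": ("Can we start detailed design?",
            "Component weight targets verified by prior examples or a lab model"),
    "CDR": ("Can we proceed to fabrication, demonstration, and test?",
            "Prototype evidence that we're inside the lines"),
    "TRR": ("Is the system ready for formal test?",
            "Assembled vehicle within weight range limits for first flight"),
}
```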
TPM Trends & Responses
Dr. Falk Chart – modified
EV Taken, planned values met, tolerances kept, etc.
25/53
[Chart: Technical Performance Measure – Vehicle Weight, 23kg to 28kg, plotted across the program events CA, SFR, SRR, PDR, CDR, and TRR, with the measurement basis maturing from the ROM in the Proposal, to a Design Model, a Bench Scale Model Measurement, a Detailed Design Model, a Prototype Measurement, and the Flight 1st Article]
The Assessment Of Weight As A
Function Of Time
 At Contract Award there is a Proposal-grade estimate of
vehicle weight.
 At System Functional Review, the Concept of Operations is
validated for the weight.
 At System Requirements Review the weight targets are
flowed down to the subsystem components.
 At PDR the CAD model starts the verification process.
 At CDR actual measurements are needed to verify all
models.
 At Test Readiness Review we need to know how much
fuel to put on board for the 1st flight test.
26/53
The WBS for a UAV: 1.1 Air Vehicle, 1.1.1 Sensor Platform, 1.1.2 Airframe
Airframe Weight TPM for WBS element 1.1.2 Airframe:

Event | Planned Value | Actual Value
CA    | 28.0kg        | 30.4kg
SFR   | 27.0kg        | 29.0kg
SRR   | 26.0kg        | 27.5kg
PDR   | 25.0kg        | 25.5kg
CDR   | 24.0kg        | –
TRR   | 23.0kg        | –

Assessed Risk to TRR: Moderate (>2.0kg off target); Low (1–2kg off target); Low (1–2kg off target); Very Low (less than 1.0kg off target)
Planned Method (maturing by event): “Similar to” Estimate (ROM); Program-unique design model; Program-unique design model with validated data; Actual measurement of bench-test components; Actual measurement of prototype airframe
Actual Method (to date): “Similar to” Estimate (ROM); ROM; ROM; ROM

The planned weight is 25kg. The actual weight is 25.5kg.
Close to plan! So we are doing okay, right?
27/53
Here’s the Problem
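The problem, made concrete: by the table's risk bands the 0.5kg variance at PDR looks Very Low, but the actual verification method is still a ROM while the plan called for a validated model, so the number carries far more uncertainty than the plan assumed. A hedged sketch of both checks; the method-maturity ranking is my illustrative assumption:

```python
# A hedged sketch of the slide's point: closeness to plan is misleading when
# the verification method lags the plan. Risk bands come from the table; the
# method-maturity ranking is an illustrative assumption.
MATURITY = ["ROM", "design model", "validated model",
            "bench measurement", "prototype measurement"]

def risk_to_trr(kg_off_target: float) -> str:
    """Risk bands as given in the table."""
    if kg_off_target > 2.0:
        return "Moderate"
    if kg_off_target >= 1.0:
        return "Low"
    return "Very Low"

planned_kg, actual_kg = 25.0, 25.5
planned_method, actual_method = "validated model", "ROM"

print(risk_to_trr(abs(actual_kg - planned_kg)))   # "Very Low" by value alone...
method_lag = MATURITY.index(planned_method) - MATURITY.index(actual_method)
print(f"...but the measurement method lags the plan by {method_lag} levels")
```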
Raison d'être for Technical
Performance Measures
The real purpose of
Technical Performance
Measures is to reduce
Programmatic and
Technical RISK
Risk
SOW
Cost
WBS
IMP/IMS
TPM
PMB
28/53
Buying Down Risk with TPMs
 “Buying down” risk is
planned in the IMS.
 MoE, MoP, and KPP
defined in the work
package for the critical
measure – weight.
 If we can’t verify we’ve
succeeded, then the
risk did not get
reduced.
 The risk may have
gotten worse.
29/53
Risk: CEV-037 - Loss of Critical Functions During Descent
Planned Risk Level Planned (Solid=Linked, Hollow =Unlinked, Filled=Complete)
RiskScore
24
22
20
18
16
14
12
10
8
6
4
2
0
Conduct Force and Moment Wind
Develop analytical model to de
Conduct focus splinter review
Conduct Block 1 w ind tunnel te
Correlate the analytical model
Conduct w ind tunnel testing of
Conduct w ind tunnel testing of
Flight Application of Spacecra
CEV block 5 w ind tunnel testin
In-Flight development tests of
Damaged TPS flight test
31.Mar.05
5.Oct.05
3.Apr.06
3.Jul.06
15.Sep.06
1.Jun.07
1.Apr.08
1.Aug.08
1.Apr.09
1.Jan.10
16.Dec.10
1.Jul.11
Weight risk
reduced from
RED to Yellow
Weight confirmed
ready to fly – it’s
GREEN at this point
Increasing the Probability of
Success with Risk Management
 Going outside the TPM
limits always means
cost and schedule
impacts.
 “Coloring Inside the Lines” means knowing how to keep the program GREEN, or at least stay close to GREEN.
30/53
So much for our strategy of winning
through technical dominance
Connecting the EV Variables
31/53
Integrating Cost, Schedule, and Technical Performance
Assures Program Management has the needed performance information to deliver
on‒time, on‒budget, and on‒specification
[Diagram: Cost + Schedule = Conventional Earned Value; Technical Performance Measures are the missing third element]
Cost Baseline
 Master Schedule is used to derive the Basis of Estimate (BOE), not the other way around.
 Probabilistic cost estimating uses past performance and cost risk modeling.
 Labor, Materiel, and other direct costs are accounted for in Work Packages.
 Risk adjustments for all elements of cost.
Technical Performance
 Earned Value is diluted by missing technical performance.
 Earned Value is diluted by postponed features.
 Earned Value is diluted by non-compliant quality.
 All these dilutions require adjustments to the Estimate at Complete (EAC) and the To Complete Performance Index (TCPI).
Schedule Baseline
 Requirements are decomposed into physical deliverables.
 Deliverables are produced through Work Packages.
 Work Packages are assigned to an accountable manager.
 Work Packages are sequenced to form the highest value stream with the lowest technical and programmatic risk.
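One simple way to express the dilution described above: credit BCWP only to the extent the TPM is met, then recompute the standard EAC and TCPI from the adjusted figure. The deck says EV is diluted but does not prescribe a formula, so the multiplicative adjustment below is an assumption; the EAC and TCPI formulas are the conventional ones.

```python
# A minimal sketch, assuming a simple multiplicative dilution of earned value
# by TPM compliance. EAC and TCPI are the standard EVM formulas.
def diluted_ev(bcwp: float, tpm_compliance: float) -> float:
    """tpm_compliance in [0, 1]: fraction of technical performance achieved."""
    return bcwp * tpm_compliance

bac, bcwp, acwp = 1000.0, 480.0, 520.0             # hypothetical cumulative values
bcwp_adj = diluted_ev(bcwp, tpm_compliance=0.90)   # 432.0

cpi = bcwp_adj / acwp                 # ~0.83 cost efficiency to date
eac = acwp + (bac - bcwp_adj) / cpi   # ~1203.7 at current efficiency
tcpi = (bac - bcwp_adj) / (bac - acwp)  # ~1.18 efficiency needed to finish on BAC
print(f"CPI={cpi:.2f}  EAC={eac:.1f}  TCPI={tcpi:.2f}")
```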
TPM Checklist
MoE
 Traceable to needs, goals, objectives, and risks.
 Defined with associated KPPs.
 Each MoE independent from the others.
 Each MoE independent of any technical solution.
 Address the required KPPs.
MoP
 Traceable to applicable MoEs, KPPs, system-level performance requirements, and risks.
 Focused on technical risks and supports trades between alternative solutions.
 Provides insight into system performance.
 Decomposed, budgeted, and allocated to system elements.
 Assigned an “owner,” the CAM and Technical Manager.
TPM
 Traceable to applicable MoPs, system element performance, requirements, objectives, risks, and WBS elements.
 Further decomposed, budgeted, and allocated to lower-level system elements in the WBS and IMS.
 Assigned an owner, the CAM and Work Package Manager.
 Sources of measure identified and processes for generating the measures defined.
 Integrated into the program’s IMS as part of the exit criteria for the Work Package.
32/53
Did We Accomplish the Learning
Objectives?
33/53
TLO #9: The student will understand the role of Technical Performance Measurement (TPM) in the project
office.
ELO #1: The student will recognize the policy
requirements for TPM.
Policies and supporting guidance, with links and
reference numbers provided.
ELO #2: The student will recognize the role of IBRs in
confirming the entire technical scope of work has
been planned.
This is the first place where cost, schedule and
technical performance come together – in the
Integrated Master Schedule (IMS)
ELO #3: The student will recognize the role of the
WBS in supporting TPM requirements.
TPMs are first located in the WBS
TLO #9: The student will understand the scope of DCMA’s (or other) TPM software management tool
implementation.
ELO #1: The student will recognize the benefits and
challenges of TPM implementation.
Progress is measured in units of physical percent
complete. TPMs are those units.
ELO #2: The student will recognize the use of control
limit charts to track TPM metrics.
We’ve seen notional and actual charts
ELO #3: The student will understand the
methodology and approach used to show the effect
of TPMs on earned value.
The example of our “flying machine” connects the
dots for TPMs, risk, cost, and schedule.
34/53
Backup Materials
Knowledge is of two kinds. We know a
subject ourselves, or we know where
we can find information on it
— Samuel Johnson
35/53
Many Sources for Connecting the Dots
 OMB Circular A–11, Section 300
 Interim Defense Acquisition Guidebook (DAG) 6/15/09
 GAO Report 06–250
 Systems Engineering Plan (SEP) Preparation Guide 4/08
 DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08
 WBS Handbook, Mil–HDBK–881A (WBS) 7/30/05
 Integrated Master Plan (IMP) & Integrated Master Schedule Preparation & Use Guide (IMS) 10/21/05
 Guide for Integrating SE into DOD Acquisition Contracts 12/06
 Defense Acquisition Program Support Methodology (DAPS) V2.0 3/20/09
 Guide to the Project Management Institute Body of Knowledge (PMBOK Guide®), 4th Edition
 Standard for Application and Management of the SE Process (IEEE 1220)
 Capability Maturity Model Integration (CMMI®)
 Processes for Engineering a System (ANSI/EIA–632)
 IEEE 1220: 6.8.1.5
 NASA EVM Guide NPG 9501.3
36/53
Office of Management and
Budget
Circular No. A–11, Section 300
 Planning, Budgeting, Acquisition and Management
of Capital Assets
 Section 300–5
– Performance–based acquisition management
– Based on EVMS standard
– Measure progress towards milestones
• Cost
• Capability to meet specified requirements
• Timeliness
• Quality
37/53
Need: Accurate Performance
Measurement
GAO Report 06–250 Findings and Recommendations
Information Technology: Improve the Accuracy and Reliability of Investment Information
2. If EVM is not implemented effectively, decisions are based on inaccurate and potentially misleading information.
3. Agencies are not measuring actual versus expected performance in meeting IT performance goals.
38/53
38/53
DOD Guides:
Technical Performance
Department of Defense Guidelines for Technical Performance Measures
DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08
Interim Defense Acquisition Guidebook (DAG) 6/15/09
Systems Engineering Plan (SEP) Preparation Guide 4/08
WBS Handbook, Mil–HDBK–881A (WBS) 7/30/05
Integrated Master Plan (IMP) & Integrated Master Schedule Preparation &
Use Guide (IMS) 10/21/05
Guide for Integrating SE into DOD Acquisition Contracts (Integ SE) 12/06
Defense Acquisition Program Support Methodology (DAPS) V2.0 3/20/09
39/53
DoD: TPMs in Technical Baselines and Reviews
DoD Policy or Guide: POL | DAG | SEP | WBS | IMP/IMS | Integrated Systems Engineering | DAPS
Technical Baselines (IMP/IMS): Functional (SFR), Allocated (PDR), Product (CDR) ✓✓✓
Event driven timing ✓✓✓✓✓✓✓
Success criteria of technical review ✓✓✓✓✓✓✓
Entry and exit criteria for technical reviews ✓✓✓✓
Assess technical maturity ✓✓✓✓✓
40/53
DoD: TPMs in Integrated Plans
DoD Policy or Guide: POL | DAG | SEP | WBS | IMP/IMS | Integrated Systems Engineering | DAPS
Integrated SEP with IMP/IMS, TPMs, EVM ✓✓✓✓✓
Integrated WBS with Requirement Specification, Statement of Work, IMP/IMS/EVMS ✓✓✓✓✓
Link risk management, technical reviews, TPMs, EVM, WBS, IMS ✓✓✓
41/53
Guidance in Standards, Models,
and Defense Acquisition Guide
 Processes for Engineering a System (ANSI/EIA–632)
 Standard for Application and Management of the SE
Process (IEEE 1220)
 Capability Maturity Model Integration (CMMI®)
– CMMI for Development, Version 1.2
– CMMI for Acquisition, Version 1.2
– Using CMMI to Improve Earned Value Management,
2002
 Guide to the Project Management Institute Body of
Knowledge (PMBOK Guide®), 4th Edition
42/53
Technical Performance
Measures (TPM)
More Sources
IEEE 1220: 6.8.1.5,
Performance–based progress
measurement
EIA–632: Glossary CMMI for Development
Requirements Development
TPMs are key to
progressively assess technical
progress
Predict future value of key
technical parameters of the
end system based on current
assessments
Specific Practice (SP) 3.3,
Analyze Requirements
Typical work product:
TPMs
Establish dates for
– Checking progress
– Meeting full
conformance to
requirements
Planned value profile is
time–phased achievement
projected
• Achievement to date
• Technical milestone where
TPM evaluation is reported
Subpractice:
Identify TPMs that will be
tracked during development
43/53
PMBOK® Guide
 10.5.1.1 Project Management Plan
 Performance Measurement Baseline:
– Typically integrates scope, schedule, and cost
parameters of a project
– May also include technical and quality parameters
44/53
PMBOK® Guide
 8.3.5.4 Work Performance Measurements
 Used to produce project activity metrics
 Evaluate actual progress as compared to planned
progress
 Include, but are not limited to:
– Planned vs. actual technical performance
– Planned vs. actual schedule performance, and
– Planned vs. actual cost performance.
45/53
TPMs in DAG and DAPS
Defense Acquisition Guide
 Performance measurement of WBS elements, using
objective measures:
– Essential for EVM and Technical Assessment activities
 Use TPMs and Critical Technical Parameters (CTP) to
report progress in achieving milestones
DAPS
 Use TPMs to determine whether % completion metrics
accurately reflect quantitative technical progress and
quality toward meeting Key Performance Parameters
(KPP) and Critical Technical Parameters
46/53
TPMs in DAG
 Compare the actual versus planned technical
development and design
 Report progress in the degree to which system
performance requirements are met.
 Plan is defined in terms of:
– Expected performance at specific points
• Defined in the WBS and IMS
– Methods of measurement at those points
– Variation limits for corrective action.
47/53
PMBOK® Guide
 11.6.2.4 Technical Performance Measurement
 Compares technical accomplishments… to … project
management plan’s schedule of technical
achievement
 Requires definition of objective quantifiable
measures of technical performance which can be
used to compare actual results against targets.
 Might include weight, transaction times, number of
delivered defects, storage capacity etc.
 Deviation, such as demonstrating more or less
functionality than planned at a milestone…forecast
degree of success in achieving the project’s scope.
48/53
CMMI–ACQ
 Acquisition Technical Management
 SP 1.3 Conduct Technical Reviews
 Typical supplier deliverables
 Progress reports and process, product, and
service level measurements
 TPMs
49/53
SMS Shall:
Monitor Progress Against the Plan
 4.2.12.2 Monitoring
– Contractor SHALL monitor progress against plan to
validate, approve, and maintain each baseline and
functional architecture
 4.2.12.2.2 Required Product Attributes
– Each documented assessment includes:
– TPMs, metrics
– Metrics and technical parameters for tracking that
are critical indicators of technical progress and
achievement
50/53
NASA EVM Guide:
Technical Performance
• NASA EVM Guide NPG 9501.3
– 4.5 Technical Performance Requirements (TPR): When
TPRs are used,
– appropriate and relevant metrics…
– must be defined in the solicitation
– Appendix A.7, 14.1 TPR
• Compares:
• Expected performance and
• Physical characteristics
• With contractually specified values.
• Basis for reporting established milestones
• Progress toward meeting technical requirements
51/53
Derivation and Flow Down of TPMs
See next chart for linkage of technical baselines to technical reviews
Document, Baseline | IMS, EVM Parameter
 IMP, Functional Baseline | Measures Of Effectiveness (MOE) ✓
 IMP, WBS, Functional Baseline | Measures Of Performance (MOP) ✓
 IMP, Allocated Baseline | Technical Performance Measure ✓
 IMS | TPM Milestones And Planned Values ✓
 Work Packages | TPM % Complete Criteria ✓
52/53
Interesting Attributes of TPMs
 Achieved to Date (sounds like EV)
 Current Estimate (sounds like EAC/ETC)
 Milestone
 Planned (target) value (sounds like PV)
 Planned performance profile (sounds like a PMB)
 Tolerance band (sounds like reporting
thresholds)
 Threshold (yep, just what we thought)
 Variance (sounds like variance!)
53/53