1. Connecting Quality with Performance Measures
Patrick K. Barker, MCR, LLC, pbarker@mcri.com, (703) 898–6354
Glen B. Alleman, Lewis & Fowler, galleman@lewisandfowler.com, (303) 241–9633
1/30
2. Connect Quality with Performance
Knowing what DONE looks like begins with the Integrated Master Plan.
Recognizing what DONE looks like when it arrives means measuring the planned Technical Performance.
Measuring Physical Percent Complete tells us how far we have moved toward DONE by calculating the “Earned Value” we’ve achieved.
Connecting Earned Value, Technical Performance, and Physical Percent Complete in the IMP establishes a credible measure of Progress to Plan.
Without these four elements we don’t have a clue what DONE looks like or if we’ll ever get there as planned.
2/30
3. Evidence of DONE, Measures of DONE, and Beneficial Outcomes
– IMP Accomplishment Criteria (AC). Measure of DONE: the collection of TPM, EV, and PPC measures defined as “exit criteria” for the Work Packages performing the work. Beneficial Outcome: the IMP then defines the measurable path to the planned maturity.
– Technical Performance Measure (TPM). Measure of DONE: the actual technical performance measured at a defined date compared to planned technical performance. Beneficial Outcome: evidence that the product or service is meeting the planned maturity or quality.
– Physical Percent Complete (PPC). Measure of DONE: unit of measure to assess the planned technical performance, evidenced by tangible outcomes. Beneficial Outcome: evidence of completion of the planned work in units meaningful to the customer.
– Earned Value (EV). Measure of DONE: cost and schedule needed to deliver the planned technical performance on baseline. Beneficial Outcome: the performance measured in units of dollars.
3/30
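The relationship between the PPC and EV definitions above can be sketched in code: Earned Value is the tangible, evidence-based percent complete priced at the Work Package's planned budget. This is a minimal illustration; the function name and dollar figures are assumptions, not from the slides.

```python
# Illustrative sketch: Earned Value (BCWP) for a Work Package is the
# Physical Percent Complete (PPC) applied to its planned budget (BAC).
# Names and numbers are hypothetical.
def earned_value(ppc: float, bac: float) -> float:
    """EV in dollars: tangible progress (PPC) priced at the planned budget."""
    if not 0.0 <= ppc <= 1.0:
        raise ValueError("PPC must be between 0 and 1")
    return ppc * bac

# A Work Package with a $200,000 budget, 40% physically complete by evidence:
ev = earned_value(0.40, 200_000)
print(ev)  # 80000.0 dollars of Earned Value
```

The point of the guard clause: PPC must come from tangible evidence bounded by the plan, never a number above 100% or below zero.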
4. No Matter How Great and Destructive Your Problems May Seem Now, Remember, You’ve Probably Only Seen the Tip of Them
4/30
6. Technical Performance Measures …
State how well the program is achieving the
planned performance requirements at the
planned delivery time.
Use actual or predicted values from:
– Engineering measurements
– Tests
– Experiments
– Prototypes
For Example:
– Response time
– Flight range
– Power consumption
– Static takeoff weight
– Product Quality
6/30
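Each TPM above is assessed the same way: compare the actual or predicted value against the planned value at the planned time, within a tolerance band. A minimal sketch, with an assumed parameter (static takeoff weight) and assumed tolerances:

```python
# Illustrative sketch: assess a TPM at a planned date by comparing the
# measured value against the planned value within a tolerance band.
# The parameter, planned value, and band are assumptions for illustration.
def tpm_within_tolerance(actual: float, planned: float, band: float) -> bool:
    """True when the measured value lies inside planned +/- band."""
    return abs(actual - planned) <= band

# Static takeoff weight (lb): planned 15,000 with an assumed +/- 500 lb band.
print(tpm_within_tolerance(15_300, 15_000, 500))  # True: within tolerance
print(tpm_within_tolerance(15_800, 15_000, 500))  # False: breaches the band
```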
8. Physical Percent Complete Requires a Tangible Measure of “Progress to Plan”
Tangible evidentiary materials measure progress to plan.
The pre–defined existence of this evidence, in meaningful units of measure, is established before starting work.
Progress is defined in these same units of measure.
No Hand Waving Allowed – “Show Me The Money”
8/30
9. IMP
Describes how the capabilities will be delivered and how these capabilities will be recognized.
– Events and Milestones: major program milestones or assessment events that substantiate system maturity (initial, progress, or final). These milestones or assessment events deliver the specific capabilities for the system on planned dates.
– Accomplishment: a specified result, substantiating a Milestone or Event, that indicates maturity or progress for each product or process.
– Criteria: definitive measures substantiating the Accomplishment maturity level; completion of specific work that ensures closure of a specified Accomplishment.
IMS
– Work Packages and Tasks, Supplemental Schedules: work activities performed to produce the deliverables that fulfill the requirements that enable the capabilities.
9/30
10. Measure the progress to plan using Physical % Complete at the Accomplishment Criteria (AC) and CWBS level
– Program Events, Significant Accomplishments, and Accomplishment Criteria: completed SAs are entry criteria for Program Events; SAs define the Accomplishment Criteria, which describe increasing product maturity as 0/100 per EVMS SD guidance.
– Statement of Work: CDRLs and Deliverables are aligned with the Accomplishment Criteria and document the product maturity aligned with the SOW and CWBS.
– CWBS: the work structure is aligned to the SOW; Tasks are contained in Work Packages, which group the work necessary to mature products by CWBS. Completed Work Packages are exit criteria for Tasks.
10/30
11. [Figure: the Brilliant Eyes program WBS (0.0 Total Program through 1.5 Risk, including the Space Vehicle Segment, software, IA&T, sensor payload, insertion vehicle, survivability, prototype lot, spares, technology and producibility, aerospace ground equipment, launch support, ECOs, and other government costs) displayed as a linear list, beside a Technical Parameter Value profile (e.g., vehicle weight, lb) showing the planned profile, tolerance band, achieved to date, variation, threshold, current estimate, and milestones]
A WBS can be displayed as a “linear” list. Program cost is often calculated by summing the costs of all items on that list, but that ignores correlation and basic math: distributions don’t add that way.
Technical performance measurement has often been treated as an engineering function only, not typically linked with EVM, although it shares a common, almost identical, language structure with EVM.
The schedule network is nonlinear, and program duration cannot be calculated by adding together the durations of all activities in the network. And since time is money, it is important for schedule slips to reflect cost.
11/30
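The claim that "distributions don't add that way" can be made concrete with a small Monte Carlo sketch: summing each WBS item's most-likely cost understates the total when the item distributions are skewed. The three triangular distributions below are assumptions for illustration, not program data.

```python
import random

# Sketch of the point above: a "linear list" sum of most-likely costs versus
# a Monte Carlo sample of the total. Item distributions are assumed
# triangular (low, most-likely, high) in $M; values are illustrative.
random.seed(1)
wbs_items = [
    (8, 10, 16),   # right-skewed: more upside cost risk than downside
    (4, 5, 9),
    (18, 20, 30),
]

point_sum = sum(ml for _, ml, _ in wbs_items)  # naive linear-list total

totals = sorted(
    sum(random.triangular(lo, hi, ml) for lo, ml, hi in wbs_items)
    for _ in range(10_000)
)
p80 = totals[int(0.80 * len(totals))]  # 80th-percentile program cost

print(f"most-likely sum: {point_sum} $M, 80% confidence: {p80:.1f} $M")
```

Because each distribution carries more upside risk than downside, the 80%-confidence total sits well above the naive sum of most-likely values.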
12. Neither cost performance nor schedule performance nor technical
performance measurement can be fully evaluated by itself.
12/30
13. How to Define DONE
Determine Physical Percent Complete based on performance measurement.
– Compare actual performance against the integrated baseline’s planned performance: the integrated cost, schedule, and technical goals.
This measurement is possible within a performance measurement system.
– Use the organization’s internal management control system to provide decision makers with specific performance information about progress and expenditures against the planned progress and expenditures.
In many projects where EVM systems are implemented, what a CAM determines (“eyeballing”) as physical percent complete often differs from what is reported via the EV techniques.
– This disconnect in the measurement of “done–ness” is a clear sign that a cost, schedule, and/or technical performance problem is going to materialize before the PM can properly react.
13/30
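The "eyeball versus EV" disconnect above is easy to surface mechanically: compare the two percent-complete figures per Work Package and flag divergence. The package names, figures, and 10-point threshold below are hypothetical.

```python
# Illustrative check of the "done-ness" disconnect: compare the CAM's
# eyeballed percent complete with the EV-technique value and flag packages
# diverging beyond an assumed 10-point threshold. Data is hypothetical.
def doneness_disconnects(packages, threshold=0.10):
    """Return package names whose eyeballed and EV percent complete diverge."""
    return [name for name, eyeballed, ev_pct in packages
            if abs(eyeballed - ev_pct) > threshold]

status = [  # (work package, CAM eyeball, EV-reported)
    ("Chassis", 0.50, 0.48),   # close agreement: no flag
    ("Array",   0.70, 0.45),   # EV lags the eyeball by 25 points: flag
]
print(doneness_disconnects(status))  # ['Array']
```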
16. Making Trades Between Cost, Schedule, and Performance Baselines is a Ponzi Scheme
When we’re on baseline, the algebraic relationship between C, S, and P means that when there is a change, everyone loses.
16/30
17. A Really Simple Example of Physical Percent Complete
Painting a room is a simple example for focusing on the notion of percent complete. But how about a flight avionics system? What are the measures of physical percent complete?
‒ Design basis complete and verified
‒ Hardware platform increasing capabilities
‒ Software system increasing capabilities
‒ Integration and test of 1st flight article
‒ Operational verification and validation complete
Where can we look for these measures of physical percent complete?
It’s Obvious – in the Integrated Master Plan’s Accomplishment Criteria
17/30
19. The notion of measuring physical percent complete is at the core of every project management method.
It answers the question of what DONE is in units of measure meaningful to the program.
No stretching the truth is allowed once we start using Technical Performance Measures with tangible evidence.
This helps us determine answers to questions that include, but are not limited to:
– What does DONE look like for today, for this week, or for this month?
– What does DONE look like for entry/exit into the technical review?
– What does DONE look like for quality control?
– What does DONE look like for the customer?
19/30
20. The Structure of a Credible IMP
The increasing maturity of a product or service is described through Events or Milestones, Accomplishments, Criteria, and their Work Packages or Planning Packages.
Each Event or Milestone represents the availability of one or more capabilities. The presence of these capabilities is measured by the Accomplishments and their Criteria. Accomplishments are the pre–conditions for the maturity assessment of the product or service at each Event or Milestone.
This hierarchy decomposes the System Capabilities into Requirements, Work Packages, and the activities which produce deliverables. It also describes increasing program maturity resulting from the activities contained in the Work Packages.
Performance of the work activities, Work Packages, Criteria, Accomplishments, and Events or Milestones is measured in units of “physical percent complete” by connecting Earned Value with Technical Performance Measures.
– Program Events: define the availability of a Capability at a point in time.
– Accomplishments: represent requirements that enable Capabilities.
– Criteria: represent Work Packages that fulfill Requirements.
– Work Packages: each WP must have a measure of Physical Percent Complete.
20/30
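The hierarchy above implies a roll-up: each Work Package's Physical Percent Complete, weighted by its budget, rolls up through Criteria and Accomplishments to an Event's maturity. A minimal sketch of that roll-up, with hypothetical structure and numbers:

```python
# Sketch of the IMP hierarchy above: Physical Percent Complete rolls up from
# Work Packages through Criteria to an Event, weighted by budget (BAC).
# The structure and dollar figures are illustrative assumptions.
def rollup(children):
    """Budget-weighted percent complete: [(ppc, bac), ...] -> (ppc, bac)."""
    total_bac = sum(bac for _, bac in children)
    earned = sum(ppc * bac for ppc, bac in children)
    return earned / total_bac, total_bac

# Two Criteria, each satisfied by Work Packages carrying a PPC measure:
criterion_1 = rollup([(1.00, 100), (0.50, 100)])  # -> (0.75, 200)
criterion_2 = rollup([(0.25, 200)])               # -> (0.25, 200)

event_ppc, _ = rollup([criterion_1, criterion_2])
print(f"Event maturity: {event_ppc:.0%}")  # 50%
```

Budget weighting keeps the roll-up honest: a small, finished Work Package cannot mask a large, unstarted one.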
22. IEEE 1220 (6.8.1.5), EIA–632 (Glossary), and CMMI for Development on TPMs
– IEEE 1220, 6.8.1.5: performance–based progress measurement. TPMs are key to progressively assess technical progress. Establish dates for checking progress and for meeting full conformance to requirements.
– EIA–632, Glossary: predict future value of key technical parameters of the end system based on current assessments. The planned value profile is time–phased achievement projected: achievement to date, and the technical milestone where TPM evaluation is reported.
– CMMI for Development: Requirements Development, Specific Practice (SP) 3.3, Analyze Requirements. Typical work product: Technical Performance Measures. Subpractice: identify TPMs that will be tracked during development.
22/30
24. EV TPMs, what they tell the PM, and their derived measurements
– Requirements: how well the system is maturing compared to expectations. Derived: number of requirements approved, percentage requirements growth, number of TBD/TBR closures per plan, estimated impact of changes, defect profile, defect density, cycle time for requirements changes.
– System Definition Change Backlog: whether changes are being made in a timely manner. Derived: approval rates, closure rates, cycle times, priority density.
– Interface: risk associated with interface development and maturity. Derived: number of interfaces approved, percent interface growth, number of TBD/TBR closures per plan, estimated impact of interface changes, defect profile, defect density, cycle time for interface changes.
– Requirement Validation: whether requirements are being validated with applicable stakeholders appropriate to the level of requirement. Derived: requirements validation rate, percent requirements validated.
– Requirement Verification: whether requirements are being verified appropriate to the level of requirement. Derived: requirements verification rate, percent requirements verified.
– Work Product Approval: work progress and approval efficiency. Derived: approval rate, distribution of dispositions, approval rate performance.
– Technical/Design Review Action Closure: progress in closing significant action items from technical reviews. Derived: closure rates, action item closure performance, variance from thresholds.
– Technology Maturity: maturity of key components, subsystems, and elements relative to expectations. Derived: component–subsystem–element–system TRL, technology opportunity exposure, technology obsolescence exposure.
– Technical Risk Handling: effectiveness of the risk management process. Derived: percentage of risk handling actions closed on time, percent overdue, percent of risks meeting handling expectations.
– Technical Staffing and Skills: adequacy of technical effort and dynamics of the actual staffing mix. Derived: technical effort staffing, variance, efficiency by labor category.
– Technical Performance: current performance status, projections, and associated risk. Derived: delta performance (planned versus actual), delta performance to thresholds and objectives.
24/30
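Two of the derived measurements above (percent requirements verified, TBD/TBR closure per plan) reduce to simple ratios; the counts below are hypothetical:

```python
# Illustrative computation of two derived measurements from the table above.
# All counts are assumptions for illustration.
def percent(part: int, whole: int) -> float:
    """Part-of-whole expressed as a percentage."""
    return 100.0 * part / whole

reqs_verified, reqs_total = 180, 240        # requirements verified to date
tbd_closed, tbd_planned_closed = 12, 20     # TBD/TBR closures vs. plan

print(f"requirements verified: {percent(reqs_verified, reqs_total):.1f}%")       # 75.0%
print(f"TBD/TBR closure vs plan: {percent(tbd_closed, tbd_planned_closed):.1f}%")  # 60.0%
```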
25. Completion Criteria & Taking “Value” for Work Packages
Control Account: WP1: Chassis, WP2: Array
Chassis Work Package, value taken per month against pre–defined completion criteria:
– June: current 100, cumulative 100. Criteria: requirement quality (drawings) completed; TPM 2, 5, and 6 within PDR tolerances via basic performance model.
– July: current 150, cumulative 250. Criteria: design 80% complete.
– August: current 200, cumulative 450. Criteria: all TPM within pre–build tolerances via refined performance model; TPM 1 and 6 verified via bench testing.
– Sept: current 250, cumulative 700. Criteria: build complete; all TPM validated by actual measurement.
25/30
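The monthly values in the Chassis Work Package above accumulate mechanically; a quick sketch confirms the cumulative column (100, 250, 450, 700) from the monthly "value taken" figures:

```python
from itertools import accumulate

# Sketch of the Chassis Work Package above: value is taken each month only
# when that month's completion criteria are met, and it accumulates toward
# the Work Package's total. Monthly values are the slide's own figures.
months = ["June", "July", "August", "Sept"]
monthly_value = [100, 150, 200, 250]
cumulative = list(accumulate(monthly_value))

print(dict(zip(months, cumulative)))
# {'June': 100, 'July': 250, 'August': 450, 'Sept': 700}
```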
26. Using Periodic Program Events to Calibrate “Doneness”
INPUT: program requirements & scope, Level 1 WBS, program cost & schedule estimate, IPT–level risks (10’s), CA–level risks (100’s), IBR guidance.
PROCESS: program–level risk assessment of the technical baseline.
OUTPUT: rollup of quantified process & product risks.
[Figure: cumulative distribution function of total cost ($250,000–$650,000) and a Monte Carlo completion probability table (FIST GEO-1, 2,000 samples, completion standard deviation 71.23 days, 95% confidence interval 3.12 days) mapping cumulative probabilities from 0.05 to 1.00 onto completion dates from 10/21/2010 to 3/16/2012]
When it comes right down to it, the IBR is all about the technical baseline. IBRs should accompany each major baseline evolution.
In the end, the question implicitly asked of the PM each time is: “What is your confidence level in your program’s ability to hit the cost, schedule, and scope targets associated with your technical baseline?”
26/30
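The confidence-level question above is answered by reading a percentile off the simulated distribution. A minimal sketch, drawing total-cost samples from an assumed triangular distribution (the distribution shape, bounds, and sample count are illustrative, not the program's data):

```python
import random

# Sketch of the IBR question above: given Monte Carlo samples of total cost
# (here drawn from an assumed triangular distribution), report the cost that
# can be quoted at an 80% confidence level. Parameters are illustrative.
random.seed(7)
samples = sorted(random.triangular(300_000, 650_000, 400_000)
                 for _ in range(2_000))

def cost_at_confidence(sorted_samples, level):
    """Value below which `level` of the simulated outcomes fall."""
    return sorted_samples[int(level * len(sorted_samples)) - 1]

p80 = cost_at_confidence(samples, 0.80)
print(f"80% confidence total cost: ${p80:,.0f}")
```

The same lookup against a distribution of completion dates yields the completion probability table shown in the figure.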
27. Connecting Risk Management with Measures of DONE
[Figure: risk burn–down for Risk CEV-037, “Loss of Critical Functions During Descent,” plotting the planned risk level against dated risk–response milestones from 31.Mar.05 to 1.Jul.11, including: correlate the analytical model; conduct focus splinter review; develop analytical model to de…; conduct Force and Moment wind…; conduct Block 1 wind tunnel te…; conduct wind tunnel testing of…; flight application of spacecra…; CEV Block 5 wind tunnel testin…; in–flight development tests of…; damaged TPS flight test. Legend: solid=linked, hollow=unlinked, filled=complete]
Risk Response Milestone Dates and Risk IDs are traceable between the RM Tool and the IMS.
27/30
29. A Credible Performance Measurement System
Assures the information needed to deliver on‒time, on‒budget, and on‒specification is available to the decision maker.
Conventional Earned Value = Cost + Schedule; a credible system adds Technical Performance Measures.
– Cost Baseline: the Master Schedule is used to derive the Basis of Estimate (BOE), not the other way around. Probabilistic cost estimating uses past performance and cost risk modeling. Labor, materiel, and other direct costs are accounted for in Work Packages. Risk adjustments are made for all elements of cost.
– Technical Performance: Earned Value is diluted by missing technical performance, by postponed features, and by non–compliant quality. All these dilutions require adjustments to the Estimate at Complete (EAC) and the To Complete Performance Index (TCPI).
– Schedule Baseline: requirements are decomposed into physical deliverables. Deliverables are produced through Work Packages. Work Packages are assigned to an accountable manager and sequenced to form the highest value stream with the lowest technical and programmatic risk.
29/30
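The dilution adjustment above can be shown numerically with the standard EVM formulas: when reported Earned Value (BCWP) is written down for non-compliant quality, the EAC grows and the TCPI rises. The dollar figures are hypothetical.

```python
# Sketch of the dilution adjustment noted above, using the standard EVM
# formulas for EAC and TCPI. The numbers are illustrative assumptions.
def eac(bac, bcwp, acwp):
    """Estimate at Complete, assuming current cost efficiency (CPI) persists."""
    cpi = bcwp / acwp
    return bac / cpi

def tcpi(bac, bcwp, acwp):
    """Cost efficiency needed on remaining work to finish on budget (BAC)."""
    return (bac - bcwp) / (bac - acwp)

bac, acwp = 1_000, 400
reported_bcwp = 450   # EV as reported, before any quality adjustment
adjusted_bcwp = 360   # EV after deducting value for non-compliant work

print(round(eac(bac, reported_bcwp, acwp)), round(tcpi(bac, reported_bcwp, acwp), 2))  # 889 0.92
print(round(eac(bac, adjusted_bcwp, acwp)), round(tcpi(bac, adjusted_bcwp, acwp), 2))  # 1111 1.07
```

A TCPI above 1.0 says the remaining work must be performed more efficiently than planned, which is exactly the early warning the slide argues diluted EV conceals.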
30. Connecting Measures of Cost, Schedule, and Technical Performance means the difference between …
Measuring Actual Physical Percent Complete against the Planned Physical Percent Complete, at the time that Physical Percent Complete was planned to be achieved, is the foundation of determining a program’s performance. Doing anything else leads to disappointment.
30/30