For credible decisions to be made, we need confidence intervals on all the numbers we use to make decisions.
These confidence intervals come from the underlying statistics and the related probabilities.
Statistical forecasting, using time series analysis of past performance, is mandatory for any credible discussion of project performance in the future.
3. Top Level Theme
Earned Value metrics are lagging, linear representations of project performance
SPI and CPI are cumulative to date, with the underlying variances washed out
The current period is a one-period extension of a linearized accumulation
No information about the dynamics of the past is available
Forecasts of the future are linear projections from the current base point measurement
No statistical inference or probabilistic projections are provided
4. Top Level Theme (Continued)
“All models are wrong, some are useful”
Time Series Analysis: Forecasting and Control, George E. P. Box and Gwilym M. Jenkins, Holden-Day, 1976
Box actually said …
5. Top Level Themes (Concluded)
For credible decisions to be made, we need confidence intervals on all the numbers we use to make decisions.
These confidence intervals come from the underlying statistics and the related probabilities.
Statistical forecasting, using time series analysis of past performance, is mandatory for any credible discussion of project performance in the future.
7. All Program Activities have Naturally Occurring Uncertainty
Naturally occurring uncertainty, and the risk that results from it, impacts the probability of a successful outcome
What is the probability of making a desired completion date or cost target?
The statistical behavior of these activities, their arrangement in a network of activities, and the correlation between their behaviors create risk
Adding margin protects the outcome from the impact of this naturally occurring uncertainty
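The completion-date question above can be sketched with a small Monte Carlo run over a hypothetical serial network of activities. All durations, the network shape, and the target date below are invented for illustration; real inputs would come from the schedule and its risk register.

```python
import random

# Three serial activities with triangular duration distributions,
# stored as (min, most likely, max) in days -- illustrative numbers only.
ACTIVITIES = [(8, 10, 15), (18, 20, 30), (4, 5, 9)]
TARGET = 40  # assumed desired completion, in days

def simulate_once():
    # random.triangular takes (low, high, mode)
    return sum(random.triangular(lo, hi, mode) for lo, mode, hi in ACTIVITIES)

def probability_of_meeting(target, trials=20_000):
    """Fraction of simulated project durations at or under the target."""
    hits = sum(simulate_once() <= target for _ in range(trials))
    return hits / trials

random.seed(1)
p = probability_of_meeting(TARGET)
```

A real analysis would also model correlation between activities, which this serial sketch deliberately omits.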
9. Earned Value Numbers
CPI / SPI: that’s it, that’s all we’ve got
Cumulative to date
Current period
Cumulative values wipe out the underlying variance
Current period performance is not adjusted for past variance
Neither cumulative nor current period values are adjusted for risk
10. Technical Performance Numbers
Technical Performance Measures
Measures of Performance
Measures of Effectiveness
Key Performance Parameters
JROC
Program specific
11.
Our goal in forecasting the future is to produce a forecast with a range of confidence intervals
14. Some Principles
Reliable forecasting is a critical component of project planning, controlling, and risk management.
An At-Completion forecast made before the project starts is the basis of credible project management
Execution-phase forecasting serves as a leading indicator of project success
15. Some Important Terms
The term confidence interval applies to interval estimates for fixed but unknown parameters
The term prediction interval is an interval estimate for an (unknown) future value
A prediction interval consists of an upper and a lower limit at a prescribed probability, which are referred to as prediction bounds
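The distinction between the two terms can be made concrete with a normal-theory sketch. The data and the 95% level are invented for illustration; the point is that the prediction interval is always wider, because it carries the future observation’s own variance on top of the uncertainty in the mean.

```python
import statistics as st

# Hypothetical monthly cost-variance observations (not from the deck)
x = [1.2, 0.8, 1.5, 0.9, 1.1, 1.3, 0.7, 1.0]
n = len(x)
mean, sd = st.mean(x), st.stdev(x)

t = 2.365  # two-sided 95% t-critical value for df = n - 1 = 7

# Confidence interval: where the *fixed but unknown* mean lies
ci = (mean - t * sd / n**0.5, mean + t * sd / n**0.5)

# Prediction interval: where a *single future observation* will fall
pi = (mean - t * sd * (1 + 1/n)**0.5, mean + t * sd * (1 + 1/n)**0.5)
```

The prediction bounds `pi` strictly contain the confidence bounds `ci` for the same data and probability level.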
16. A core limitation of the current approach to performance forecasting
The standard EVM technique for forecasting the final cost at completion is not applicable to forecasting the project duration at completion†
This leads to inconsistent assumptions about the relationships between past performance and future performance
Using CPI creates a forecast in which future performance is assumed to be the same as past performance
† Short 1993, Vandevoorde and Vanhoucke 2006, Leach 2005, and Lipke 2003.
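On the cost side, the technique in question is the CPI-based independent EAC. A minimal sketch (all budget figures invented) makes the built-in assumption visible: dividing remaining budget by past efficiency presumes the future will perform exactly like the past.

```python
# Standard CPI-based cost forecast; all figures are hypothetical.
BAC = 1_000_000   # Budget At Completion
ACWP = 400_000    # Actual Cost of Work Performed to date
BCWP = 350_000    # Budgeted Cost of Work Performed (earned value)

CPI = BCWP / ACWP        # 0.875: running over cost
IEAC = BAC / CPI         # independent EAC, assuming past efficiency persists
```

The analogous division using SPI fails for duration, since SPI is forced back toward 1.0 as BCWP approaches BAC late in the project, which is one reason the footnoted authors reject it for schedule forecasting.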
17. Forecasting Methods
Probabilistic Forecasting – explicit uncertainties in project performance and errors in measurement provide prediction bounds on the predicted values
Integrative Forecasting Methodologies – collect all relevant information from different sources in a mathematically correct manner
Consistent Forecasting Methodologies – methods that can be applied to both cost and schedule performance forecasts
18. Elements of a good forecast
Timely
Reliable
Accurate
Meaningful
Easy to use
Actionable
19. Three types of forecasts
Judgmental – subjective analysis of subjective inputs; the Management EAC is a judgmental forecast
Associative Models – analyze historical data to reveal relationships between (easily or in-advance) observable quantities and forecast quantities, then use those relationships to make predictions
Time Series – objective analysis of historical data, assuming the future will be like the past
20. Forecasting Techniques
Moving average – takes the values of a chosen number of previous periods and uses their mean as the estimate for the next period
Weighted average – same as the moving average, except that greater weight is given to more recent periods
Linear regression – a mathematical algorithm uses past data to fit a line showing the direction of the data; the line can then be carried forward to create a forecast
Exponential smoothing – combines the most recent actual figure with the previous period’s forecast in order to forecast an upcoming period
Regression analysis – equations used to analyze the relationship between a dependent variable and one or more independent variables
Time series – assumes recent history is a good predictor of the near future
Trend analysis – finds trends in historical data that can be used to make forecasts
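Three of the techniques above can be sketched in a few lines each, applied here to a hypothetical monthly CPI series (all numbers invented for illustration):

```python
# Hypothetical monthly CPI observations, oldest first
cpi = [0.95, 0.92, 0.97, 0.90, 0.88, 0.91]

def moving_average(series, n):
    """Forecast the next period as the mean of the last n observations."""
    return sum(series[-n:]) / n

def weighted_average(series, weights):
    """Like the moving average, but recent periods weigh more.
    weights are oldest-first and should sum to 1."""
    recent = series[-len(weights):]
    return sum(w * v for w, v in zip(weights, recent))

def exponential_smoothing(series, alpha):
    """Blend each actual with the prior forecast:
    F(t+1) = alpha * A(t) + (1 - alpha) * F(t)."""
    forecast = series[0]  # seed the first forecast with the first actual
    for actual in series[1:]:
        forecast = alpha * actual + (1 - alpha) * forecast
    return forecast

ma = moving_average(cpi, 3)                  # mean of the last 3 months
wa = weighted_average(cpi, [0.2, 0.3, 0.5])  # heaviest weight on the latest
es = exponential_smoothing(cpi, alpha=0.4)
```

The choice of window `n`, the weights, and `alpha` are tuning decisions; none of these simple techniques produces the prediction bounds the deck argues for, which is why they feed into, rather than replace, a probabilistic analysis.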
22. We’ve got the start of the data capture process
UN/CEFACT is a place, not the name of an XML protocol
Cost and schedule information is available at the Control Account level on a monthly basis
To build the Control Account data stream, the Work Packages are used
Use the Work Package information as well
24. An Example of Additional Insight Gained By Comparing Schedules Against Planned Program Deliverables
[Chart: milestone counts (rate and cumulative) by fiscal year, FY05–FY13, comparing the 3/05 Plan (Rev. A), the 5/07 Plan (Rev. E), and the 12/08 Preliminary Rev. H; actual plans of the James Webb Space Telescope (JWST)]
25. JWST Schedules and Deliverables by Subsystems
[Chart: milestone timelines by subsystem, FY05–FY12, comparing the 3/05 Plan (Rev. A), the 5/07 Plan (Rev. E), and the 12/08 Preliminary Rev. H. Subsystems and milestones shown: Spacecraft (PDR, CDR, Struct Compl, Prop I&T), Sunshield (PDR, CDR, Test Compl, I&T Compl), OTE (OTE PDR, Start Polish, OTE CDR, 1st Mirror Delivery, Final Mirror Delivery), and ISIM (STR PDR, STR CDR; ETU and FM units for NIRSpec, MIRI, NIRCam, FGS, and I&T)]
Exhibit 30: Deliverables Planned vs Actuals
26. Probabilities of Success Results
Case 2: Baseline with Uncertainty, Discrete Risks @ 25% Probability (target: 30 Sept 2022)
Probability of meeting both cost and schedule targets: 34.8%
Probability of meeting the targeted schedule: 50.8%
Probability of meeting the targeted cost: 57.4%
Exhibit 31: Probability of Meeting Cost and Schedule Targets
27. Current Execution Index (CEI)
• CEI is a forecast schedule execution metric for a specific near-term window
• The goal of this metric is to measure how well the near-term schedule represents what actually takes place through execution
• The calculation is made by capturing a forward-looking snapshot of what is forecast to finish in the near-term window, then comparing that snapshot to what has actually finished once the window has been executed
• CEI is NOT an actual-to-baseline comparison; CEI is an actual-to-forecast comparison
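The snapshot-and-compare described above can be sketched as a set comparison. The task IDs and the ratio form of the metric are illustrative assumptions; in practice the snapshot comes from the IMS at the start of the window.

```python
# Forward-looking snapshot taken at the start of the near-term window
forecast_to_finish = {"T1", "T2", "T3", "T4", "T5"}

# Status after the window has been executed
actually_finished = {"T1", "T3", "T4", "T7"}

# Only tasks that were in the forecast snapshot count toward CEI --
# it is an actual-to-forecast comparison, not actual-to-baseline,
# so T7 (finished but never forecast for this window) is excluded.
cei = len(forecast_to_finish & actually_finished) / len(forecast_to_finish)
# 3 of the 5 forecast finishes actually finished
```

A CEI well below 1.0 over successive windows signals that near-term forecasts consistently overstate what execution delivers.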
28. Baseline Execution (BE) Chart
• Incomplete tasks, as well as tasks completed late, add risk to executing program milestones and events
• Used with the CEI and VI measures, these indicators can help the team determine the level of confidence in the Prime’s ability to build a realistic, executable schedule
41. Each status period counts the number of finishes moved to forward periods and adds them to the planned finishes
This increase is the debt of work added to that period
Late Finishes are derived directly from the IMS
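The counting rule above can be sketched as a per-period tally. Period numbers and counts are hypothetical; real values come from comparing successive IMS status snapshots.

```python
# Baseline finishes planned in each status period (hypothetical)
planned_finishes = {1: 4, 2: 5, 3: 6}

# Finishes that slipped out of earlier periods into these periods
slipped_into = {2: 2, 3: 3}

# Each period's load is its planned finishes plus the slipped-in finishes;
# the slipped-in count is the "debt of work" added to that period.
load = {p: planned_finishes[p] + slipped_into.get(p, 0)
        for p in planned_finishes}
# period 2 now carries 5 + 2 = 7 finishes; the extra 2 are its work debt
```

A growing debt in successive periods is the leading indicator: work keeps rolling forward faster than it is being retired.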
42. Leading Indicators come from integrating EVM and SE†
† Systems Engineering Applied Leading Indicators: Enabling Assessment of Acquisition Technical Performance, Paul Montgomery and Ron Carlson, Graduate School of Engineering & Applied Sciences, Naval Postgraduate School, NPS-AM-10-175, 24 September 2010.