Integrated Performance Management 2014
A Study on Informing Earned Value by Objectively Assessing Accomplishments at Work Performance Level †

Glen B. Alleman
Niwot Ridge, L.L.C.
+1 303 241 9633
glen.alleman@niwotridge.com

David Walden
Sysnovation, LLC
+1 952 807 1388
Dave@sysnovation.com

Gordon M. Kranz
Deputy Director, EVM PARCA (OSD)
+1 703 697 3703
gordon.m.kranz.civ@mail.mil

† Page 1 of EIA-748-C
Workshop Topics
• Study Overview and Preliminary Findings
• Discussion of Preliminary Findings
• Summary and Wrap-up
STUDY OVERVIEW AND PRELIMINARY FINDINGS

Our study shows that connecting the objective assessment of accomplishments at the work performance level with Earned Value (BCWP) is being done in many ways across many firms. The study has collected this information into a cohesive set of findings, with the intent to develop a guide for government program managers.
Moving EVM from Task Completion to Measuring Design Effectiveness

What People Do (Task Completion)
– Drawings Completed
– Lines of Code Written
– Work Products Produced
– Reviews Completed

What the System Does (Measuring Design Effectiveness)
– Critical TPM Achievement
– System Capabilities Met
– Quality of Work Products
– System Under Review Acceptable

Progress is measured by the effectiveness of outcomes to the end user.
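To make the contrast concrete, here is a minimal, hypothetical sketch (not taken from any interviewee's system) of the difference between claiming BCWP from task counts alone and tempering that claim with critical TPM achievement. The work package budget and the linear weighting rule are illustrative assumptions only.

```python
# Illustrative only: BCWP claimed from task completion vs. BCWP informed by
# critical TPM achievement. Budget and weighting rule are hypothetical; they
# are not prescribed by EIA-748 or by this study.

def bcwp_task_based(budget, tasks_done, tasks_total):
    """Claim earned value purely from counts of completed tasks."""
    return budget * tasks_done / tasks_total

def bcwp_tpm_informed(budget, tasks_done, tasks_total, tpm_actual, tpm_plan):
    """Cap the task-based claim by the fraction of the planned TPM value
    actually achieved (assumes higher-is-better; invert for weight, etc.)."""
    achievement = min(tpm_actual / tpm_plan, 1.0)
    return bcwp_task_based(budget, tasks_done, tasks_total) * achievement

budget = 100_000  # work package budget, hypothetical
print(bcwp_task_based(budget, tasks_done=8, tasks_total=10))           # 80000.0
print(bcwp_tpm_informed(budget, 8, 10, tpm_actual=70, tpm_plan=100))   # 56000.0
```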
Major Study Phases

[Timeline chart, Aug 2014 – Feb 2015: Research, Interviews, On-Site Follow-up Visits, Analysis, and Initial / Updated / Final Documents phases, with IPMC and EOC milestones.]
Interview Summary Information
Interview Details
• 10 phone interviews
• 12 participants
• 3 on-site follow-ups
Interviewee Demographics
• Sector
– 6 Government
– 6 Industry
• Roles
– 4 EV
– 4 PM
– 4 SE/PE
• Domains
– Military, Space, Intel, IT, C4I, Facilities, Medical
Interview Questions

• Goal: to identify ways to inform program performance with technical performance
• Assumption: effective program management, systems engineering, and earned value are being done; we just need them better aligned
• Question #1 - Of the things you use to track technical performance to plan, which provide the most benefit for you, and why?
• Question #2 - How do you ensure technical performance informs your program performance reporting?
• Question #3 - We did a study that used TPMs to inform program performance reporting. Do you use TPMs, and if so, how do you use them?
• Question #4 - What else do you use to inform your program performance reporting? Do you have any suggestions for additional technical or programmatic areas to explore as part of the study effort?
Interview Summaries

• Most (but not all) interviewees used some form of TPMs to track technical performance, but not all use TPMs to influence Earned Value
• Some use CDRL quality to track technical performance
• Other areas
  – Deferred functionality and rework
  – Risk-adjusted EAC
  – Requirements artifacts
  – "…ilities"
  – Technical maturity (TRL)
Identified New Areas to Explore

• Near-term ideas to explore
  – BCWP "Gold Card" guidance (3)
  – Quality non-conformances/escapes (4)
  – Use of QBDs (Quantifiable Backup Data) (8)
  – Break scope growth into new work and unplanned work (9)
• Long-term ideas to explore
  – Agile & EVM (3) (9)
  – Improving the timeliness of EV data while still reflecting accurate technical status
    • Weekly EV (1); early-look data (3); leading indicators (?)
  – Automated traceability of EV data (2)
  – Process metrics (4)
  – Self-service of EV data (5)
  – Predictive BCWP and CPI/SPI (7) (see the sketch below)
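For readers less familiar with the indices behind the "Predictive BCWP and CPI/SPI" item, the sketch below shows the standard EVM identities that any predictive approach would build on; the numbers are hypothetical and not from any interviewee's data.

```python
# Standard EVM identities (not specific to any interviewee's tooling).
# All values are hypothetical, in consistent units (e.g., $K).
bcws, bcwp, acwp, bac = 120.0, 100.0, 110.0, 1000.0

cpi = bcwp / acwp                  # cost performance index
spi = bcwp / bcws                  # schedule performance index
eac = acwp + (bac - bcwp) / cpi    # one common independent EAC formula

print(f"CPI={cpi:.2f}  SPI={spi:.2f}  EAC={eac:.0f}")
```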
Proposed Focus Areas

1. TPMs, MOPs, MOEs
2. Requirements/Design Maturity
3. Risk
4. Staffing
5. Agile
6. Data Item/Deliverables (e.g., CDRLs) Quality
Proposed Guidance Framework

• Intent of the Guide
  – Leverage best practices from the field to improve the connection between objective measures and Earned Value
• Audience for the Guide
  – Systems engineers, program managers, and program performance leaders
• Structure of the Guide
  – Focus on the 6 areas of the study
  – Additional areas from this workshop and other inputs
• Ground rules for the Guide's development
  – Practicum from users of Earned Value and program performance management
DISCUSSION OF PRELIMINARY FINDINGS

With these preliminary results, the purpose of this workshop is to gather opinions, ideas, and improvements that can go into the preliminary guidance to be produced in February 2015.
Workshop Discussion Topics
• Feedback on the Interview Findings
• Feedback on the Proposed Focus Areas
• Feedback on the Guidance Framework
SUMMARY AND WRAP-UP
Thank you for your participation in this workshop!
BACKUP SLIDES
Value of TPMs in Informing Program Performance (#1)

• Most (but not all) interviewees used some form of TPMs to track technical performance
• Not all used TPMs to influence program performance reporting
• Identified some additional ideas to explore
  – Guidance on getting TPMs to "threshold" or "plan" (1)
  – Use TPMs across several categories ("pairs" or "trends" of TPMs), not single TPMs (4)
  – TPMs derived during analysis, modeling & simulation, test results, and extrapolations from existing data (2) (6)
  – Compliance-to-standards checks integrated with EVM (6)
Value of CDRL Quality in Informing Program Performance (#2)

• Some interviewees mentioned CDRL quality being used to track technical performance
• Identified some additional ideas to explore
  – Standards set the CDRL quality (1) (5)
  – Many tools have compliance checks that could be integrated into EV (5)
  – SE evaluates quality of CDRLs (8)
Other Areas Identified in Research (#3)
Supporting Areas Identified in Previous Studies

• Deferred Functionality
• Rework
• Risk
  – Risk-adjusted EAC (4)
  – Asymmetrical SRA (4)
  – RYG program status (9)
  – Medical device risk per ISO 14971 (9)
• Integration & Test
  – SE integration planning (9)
• IBR/Baseline Establishment
  – Two-phased IBR (2)
  – Add IMS field to define "done" (8)
Other Areas Identified in Research (#3)
Supporting Key SE Processes

• Requirements Artifacts (e.g., Compliance, TBDs)
  – Track TBD/TBR burn-down (4)
  – RYG on requirements (9)
• System Architecture/Design Artifacts (e.g., DoDAF)
• "ilities" Artifacts (e.g., RMA, Affordability)
  – Assurance functions (5)
  – Analysis (9)
• Technical Maturity (e.g., TRLs, IRLs, SRLs)
• Technical Review Artifacts (e.g., PDR, FCA, MS-B)
  – Meaningful design reviews (4)
  – Importance of subsystem reviews (4) (9)
• Facilities & Equipment (e.g., Test Equip, GFE, Enabling Systems)
• Human Resources (e.g., Staffing, Capabilities)
  – Head count vs. load-on plan (4)
  – Resource availability (5)
  – Staffing reduction at end of program (6)
Guidance Area #1: TPMs, MOPs, MOEs

• Goal
  – Create guidance on how measures such as TPMs can be used to inform program performance reporting
• Candidate areas to explore
  – Guidance on what TPMs, MOPs, and MOEs are and where they come from (derived during analysis, modeling & simulation, test results, and extrapolations from existing data), and how to look at TPMs across several categories ("pairs" or "trends" of TPMs), not just single TPMs
  – Guidance on how to link them into the IMP/IMS
  – Guidance on how EV could be used to support assessing technical decisions on getting TPMs to "threshold" or "plan" (i.e., "getting to green") (pervasive?) (see the sketch below)
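A minimal sketch of the "getting to green" idea for a single TPM: compare the current measured value against its plan and threshold values and report a status that could then inform or gate a BCWP claim. The RYG bands and the higher-is-better assumption are illustrative, not a prescribed method.

```python
# Hypothetical TPM status check against plan and threshold values.
# Assumes higher-is-better; invert the comparisons for lower-is-better TPMs
# such as weight or power consumption.

def tpm_status(actual, plan, threshold):
    """Return a simple RYG status for a single TPM."""
    if actual >= plan:
        return "GREEN"    # at or better than plan
    if actual >= threshold:
        return "YELLOW"   # between threshold and plan ("getting to green")
    return "RED"          # below threshold

# Example: planned data rate 100 Mb/s, threshold 80 Mb/s, measured 85 Mb/s.
print(tpm_status(actual=85, plan=100, threshold=80))  # YELLOW
```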
Guidance Area #2: Requirements/Design Maturity

• Goal
  – Create guidance on how the system's technical maturity can be used to inform program performance reporting
• Candidate areas to explore
  – Guidance on how the following might influence performance claims
    • Requirements TBDs/TBRs (incl. burn-down to plan) (see the sketch below)
    • Design decisions to be made
    • Quality checks built into the tools
    • Defect containment/escapes
    • TRLs, IRLs, SRLs
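As one illustration of tracking TBD/TBR burn-down to plan, the sketch below compares the actual count of open TBD/TBR items each reporting period against a planned burn-down; the counts and the variance convention are hypothetical.

```python
# Hypothetical TBD/TBR burn-down vs. plan by reporting period.
planned_open = [40, 32, 24, 16, 8, 0]   # planned open TBD/TBR items per period
actual_open  = [40, 35, 30, 22]         # actuals to date

for period, actual in enumerate(actual_open):
    variance = planned_open[period] - actual   # negative => behind plan
    print(f"Period {period}: plan {planned_open[period]:2d}, "
          f"actual {actual:2d}, variance {variance:+d}")
```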
Guidance Area #3: Risk

• Goal
  – Create guidance on how risks can be used to inform program performance reporting
• Candidate areas to explore
  – Guidance on how reducible risks can be used to inform performance claims
    • Dealing with specific program risks
    • RYG overall program status
  – Guidance on how irreducible risks can be used to inform performance claims
    • Guidance on how to create a risk-adjusted PMB and perform an SRA
    • Guidance on risk-adjusted EAC (see the sketch below)
  – Safety-critical applications (e.g., medical, 178 SW, ChemDemil, Nuclear) with a specific, stringent set of requirements (pervasive?) (move?)
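A minimal sketch of one way a risk-adjusted EAC could be produced: start from the deterministic EAC and sample a small risk register, which also hints at the asymmetric distributions an SRA would use. The register, probabilities, and triangular impacts are illustrative assumptions, not a prescribed method.

```python
# Hypothetical risk-adjusted EAC: deterministic EAC plus Monte Carlo sampling
# of a small risk register (probability of occurrence, triangular cost impact).
import random

deterministic_eac = 1_100.0  # e.g., ACWP + (BAC - BCWP) / CPI, in $K
risk_register = [            # (probability, (min, most_likely, max) impact in $K)
    (0.4, (20, 50, 120)),
    (0.2, (10, 30, 90)),
]

def one_trial():
    total = deterministic_eac
    for prob, (low, mode, high) in risk_register:
        if random.random() < prob:
            total += random.triangular(low, high, mode)
    return total

trials = sorted(one_trial() for _ in range(10_000))
p50, p80 = trials[len(trials) // 2], trials[int(len(trials) * 0.8)]
print(f"Risk-adjusted EAC: P50 ~ {p50:.0f}, P80 ~ {p80:.0f}")
```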
Guidance Area #4: Staffing

• Goal
  – Create guidance on how staffing can be used to inform program performance reporting
• Candidate areas to explore
  – Guidance on how the following might influence performance claims
    • Staffing ramp-up at the start of the program (see the sketch below)
    • Turn-over during the life of the program
    • Staff release at the end of the program
  – Guidance on how key staff turn-over can lead to future program performance issues
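To illustrate comparing actual head count against the load-on plan during ramp-up, the sketch below flags periods where staffing lags the plan by more than an assumed tolerance; the plan, actuals, and 10% tolerance are hypothetical.

```python
# Hypothetical staffing ramp-up: compare actual head count to the load-on plan.
planned_fte = [10, 18, 25, 30, 30, 28]   # planned FTEs per period
actual_fte  = [10, 15, 20, 24]           # actuals to date
TOLERANCE = 0.10                         # flag shortfalls worse than 10%

for period, have in enumerate(actual_fte):
    need = planned_fte[period]
    if have < need * (1 - TOLERANCE):
        print(f"Period {period}: {have} FTEs vs plan of {need} -- staffing risk")
```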
Guidance Area #5: Agile

• Goal
  – Create guidance on how agile performance can inform EV performance
• Candidate areas to explore
  – Guidance on how the following might influence performance claims
    • Sprint incomplete work claims and replanning (see the sketch below)
    • Decision backlogs
  – Coordinate with Government/Industry working groups
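One common convention for keeping sprint-level claims objective is that only work accepted as "done" earns value, while incomplete stories earn nothing and are replanned. The sketch below assumes that convention and a hypothetical sprint backlog; it is an illustration, not guidance from the study.

```python
# Hypothetical sprint-level EV claim: only stories accepted as done earn value;
# partially complete stories claim nothing and return to the backlog.
sprint_backlog = [   # (story id, budgeted value, accepted as done?)
    ("story-101", 40, True),
    ("story-102", 24, True),
    ("story-103", 32, False),   # incomplete -> replanned, no EV claimed
]

bcws = sum(budget for _, budget, _ in sprint_backlog)
bcwp = sum(budget for _, budget, done in sprint_backlog if done)
print(f"Sprint BCWS={bcws}, BCWP={bcwp}, work to replan={bcws - bcwp}")
```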
Guidance Area #6: CDRLs

• Goal
  – Create guidance on how data item quality can be used to inform program performance reporting
• Candidate areas to explore
  – Guidance on how to objectively define the characteristics of the CDRL so that you can accurately claim performance and understand producer and receiver roles
  – Guidance on early and frequent engagement by both CDRL producer and receiver
  – Guidance on how to plan for CDRL rework
  – Guidance on how the following might influence performance claims
    • Descriptions/checklists of CDRL quality (industry, government) (see the sketch below)
    • Use of external standards to set the CDRL quality
    • Compliance checks that could be integrated into EV
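A minimal sketch of gating a CDRL performance claim on an objective quality checklist; the checklist items and the all-or-nothing claiming rule are assumptions for illustration only.

```python
# Hypothetical CDRL quality gate: claim performance only when the agreed
# checklist (e.g., derived from the governing standard or DID) is satisfied.
def cdrl_claim(budget, checklist):
    """checklist: dict of quality criterion -> bool, objectively verified."""
    return budget if all(checklist.values()) else 0.0

checklist = {
    "conforms to required format": True,
    "all required sections present": True,
    "traceable to requirements": False,   # open finding from receiver review
}
print(cdrl_claim(budget=15_000, checklist=checklist))  # 0.0 until findings close
```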
Deliverables Maturation

[Timeline chart, Aug 2014 – Feb 2015, with IPMC, GK Mtg, GB Mtg, and EOC milestones:]
• Guidance PPT / TOC: Draft 1, Draft 2, Final
• Final Briefing: Draft 1, Draft 2, Final
• Final Report: TOC, Draft, Final
• Executive Summary: Draft, Final


Editor's Notes

  • #2 We want to thank all of you for joining us in today's webinar. We'll be presenting a quick overview of the material published in Measurement News around building a credible Performance Measurement Baseline and assessing program performance in ways beyond just the Earned Value numbers. We'll show you the steps needed to develop that PMB and assess program progress using the measures contained in the IMP and IMS.
  • #5 One of the key conclusions of this study was initial confirmation that EVM data can be improved by shifting the focus from measuring "what people do" to "what the system does." This results in a shift from measuring task completion (e.g., input performance, such as drawings completed or reviews held) to measuring design effectiveness (e.g., outcome performance, such as whether the drawings and systems being reviewed represent a viable technical solution).