This document discusses implementing Technical Performance Measures (TPM) on projects. It begins by outlining learning objectives related to the role and requirements of TPM. It then explains why TPM are needed alongside earned value measures, which capture only cost and schedule, to measure technical progress as well. The document provides examples of how TPM can be defined and measured for Work Breakdown Structure elements and used to track and reduce risk over time. It emphasizes that integrating TPM with cost and schedule measures gives program management the performance information needed to deliver projects on time, on budget, and in compliance with technical requirements.
Showing how to Increase the Probability of Project Success by applying the ... – Glen Alleman
All projects ‒ Traditional and Agile ‒ operate in the presence of uncertainty that creates risk.
Five Immutable Principles and their supporting Processes and Practices can be used to increase the probability of success in the presence of these uncertainties.
Cost and schedule growth for complex projects is created when unrealistic technical performance expectations, unrealistic cost and schedule estimates, inadequate risk assessments, unanticipated technical issues, and poorly performed, ineffective risk management contribute to project technical and programmatic shortfalls.
Getting To Done - A Master Class Workshop – Glen Alleman
The Principles, Processes, Practices, and Tools to Increase the Probability of Successfully Completing Projects On-Time, On-Budget, and with the Needed Capabilities.
Root cause analysis of why many DOD programs fail to deliver required capabilities within the planned time and budget has shown causes for failure begin with the buyer not knowing what “done looks like” before releasing the Request for Proposal (RFP). These are corrected with better guidance for preparing Measures of Effectiveness, Measures of Performance, and Key Performance Parameters in the RFP.
The notion of integrating cost, schedule, technical performance, and risk is possible in theory. In practice, care is needed to assure credible information is provided to the Program Manager.
The role of Risk Assessment and Risk Management is to continuously Identify, Analyze, Plan, Track, Control, and Communicate the risks associated with a project.
Webster's definition of risk is the possibility of suffering a loss. Risk in itself is not bad; risk is essential to progress, and failure is often a key part of learning. Managing risk is a key part of success.
This document describes the foundations for conducting a risk assessment of a large-scale system development project. Such a project will likely include the procurement of Commercial Off The Shelf (COTS) products as well as their integration with legacy systems.
Niwot Ridge
Establishing Schedule Margin Using Monte Carlo Simulation – Glen Alleman
The first-order goal is to develop a resource-loaded, risk-tolerant Integrated Master Schedule, derived from the Integrated Master Plan, that clearly shows the increasing maturity of the program's deliverables through vertical and horizontal traceability to the program's requirements.
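As a minimal sketch of the Monte Carlo idea described here (the three-point task estimates and the simple serial network are invented assumptions, not from the source), schedule margin can be sized as the gap between a confidence-level duration and the deterministic duration:

```python
import random

# Hypothetical serial tasks: (most likely, minimum, maximum) durations in days.
# Triangular distributions are a common choice when only three-point
# estimates exist; a real IMS network has branching and merge bias.
tasks = [(10, 8, 15), (20, 15, 30), (12, 10, 18)]

def simulate_once():
    # Sample each task duration and sum along the (serial) path.
    return sum(random.triangular(lo, hi, mode) for mode, lo, hi in tasks)

trials = sorted(simulate_once() for _ in range(10_000))
deterministic = sum(mode for mode, _, _ in tasks)
p80 = trials[int(0.80 * len(trials))]  # 80th-percentile completion duration

# Schedule margin sized to protect an 80% confidence completion date.
print(f"deterministic: {deterministic:.1f} days, P80: {p80:.1f} days, "
      f"margin: {p80 - deterministic:.1f} days")
```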
The question – What Does Done Look Like? – was asked every week on the program that changed my life as a Program Manager. Rocky Flats Environmental Technology Site (RFETS) was the marketing term for the third-worst toxic waste site on the planet. RFETS was a nuclear bomb manufacturing plant, built in 1951, operating until 1989, and closed in 2005. I served as the VP of Program Management of the ITC (Information Technology and Communications) group, providing ERP, purpose-built IT, voice, and data systems for 5,000 employees and contractors of the Bomb Factory.
Delivering programs with less capability than promised, while exceeding planned cost and duration, distorts decision making, drives cost growth in other programs, undermines the Federal government's credibility with taxpayers, and erodes public support for these programs.
Many reasons have been hypothesized and documented for cost and schedule growth. The authors review some of these reasons and propose that government and contractors use the historical variability of past programs to establish cost and schedule estimates at the outset, then periodically update these estimates with up-to-date risks to increase the probability of program success. For this to happen, the authors recommend changes to estimating, acquisition, and contracting processes.
From WBS to Integrated Master Schedule – Glen Alleman
A step-by-step guide to increasing the probability of program success, starting with the WBS, developing the Integrated Master Plan and Integrated Master Schedule, risk-adjusting the IMS, and measuring progress to plan in units of measure meaningful to the decision makers.
IMP & WBS - Getting Both Right is Paramount – Glen Alleman
WBS is the starting point for program success. It tells us what DONE looks like in terms of deliverables.
Integrated Master Plan (IMP) tells us how the increasing maturity of the deliverables will be assessed at each Program Event.
Integrated Master Schedule (IMS) tells us the order of the Work Packages needed to produce this increasing maturity.
Control Account Plan (CAP) defines the authorized scope, budget, and period of performance for the work that produces the deliverables defined in the WBS, assessed in the IMP, and sequenced in the IMS.
Cost and schedule growth for federal programs is created by unrealistic technical performance expectations, unrealistic cost and schedule estimates, inadequate risk assessments, unanticipated technical issues, and poorly performed, ineffective risk management, all contributing to program technical and programmatic shortfalls.
Capabilities-Based Planning identifies the capabilities needed to accomplish a mission or fulfill a business strategy. Only when capabilities are defined can we start requirements elicitation.
The management of risk is a critical success factor of any project or program. This document is a collection of risk management categories that are used to ask the question “did you think about this risk and its impact on our probability of success?”
Earned Value Management Meets Big Data – Glen Alleman
The Earned Value Management System (EVMS) maintains period-by-period data in its underlying databases. The contents of the Earned Value repository can be considered BIG DATA, characterized by three attributes: 1) Volume: large amounts of data; 2) Variety: data comes from different sources, including traditional databases, documents, and complex records; 3) Velocity: the content is continually updated by absorbing other data collections, previously archived data, and data streamed from external sources.
With this time series in the repository, trends, cost and schedule forecasts, and the confidence levels of those forecasts can be calculated using statistical techniques such as the Autoregressive Integrated Moving Average (ARIMA) algorithm provided by the R programming system. ARIMA provides a statistically informed Estimate At Completion (EAC) and Estimate To Complete (ETC), revealing underlying trends not visible through standard EVM reporting calculations. With ARIMA in place, plus additional data from risk, technical performance, and the Work Breakdown Structure, Principal Component Analysis can be used to identify the drivers of unanticipated changes in the EAC.
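The abstract describes this workflow in R; as a hedged illustration only (the monthly actuals below are invented, and Python's statsmodels stands in for the R ARIMA implementation), a statistically informed ETC and EAC can be sketched like this:

```python
import numpy as np
from statsmodels.tsa.arima.model import ARIMA

# Invented monthly ACWP (Actual Cost of Work Performed) increments, in $K.
acwp = np.array([120, 135, 128, 150, 162, 158, 171, 180, 176, 190], float)

# Fit a simple ARIMA(1,1,1); real order selection would scan AIC values.
model = ARIMA(acwp, order=(1, 1, 1)).fit()

# Forecast the remaining 6 reporting periods of cost increments.
forecast = model.forecast(steps=6)

# ETC = sum of forecast increments; EAC = actuals to date + ETC.
etc = forecast.sum()
eac = acwp.sum() + etc
print(f"ETC: {etc:.0f} $K, EAC: {eac:.0f} $K")
```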
The Role of the Architect in ERP and PDM System Deployment – Glen Alleman
The architect’s role in the development of an ERP or PDM system is to maintain the integrity of the vision statement produced by the owners, users, and funders of the system.
Notes on IT Programmatic Risk in 5 Not So Easy Pieces – Glen Alleman
Risk management in the IT business is similar to risk management in most domains. Here's a starting point for understanding the steps needed to manage risk.
To achieve success in any project domain, measures of progress to plan are needed in units meaningful to the decision makers. These include cost, schedule, and technical performance.
Performance-Based Planning in a Nutshell (V5) – Glen Alleman
Step-by-step activities to increase the probability of success for all projects, no matter the project domain. These principles and practices can be found in all successful projects.
Chapter 0 of Performance Based Project Management (sm) – Glen Alleman
Defining the needed capabilities is the critical success factor for project success. The capabilities state what Done looks like in units of measure meaningful to the decision makers.
Five Immutable Principles of Digital Transformation Project Success – Glen Alleman
Successful Digital Transformation projects are fraught with technical, cost, and schedule risks.
These Five Principles of Project Success have been shown to increase the Probability of Project Success (PoPS).
EVMS with Technical Performance Measures – Glen Alleman
Assessing the general health of the program to establish an early warning system for unexpected problems starts by integrating Earned Value metrics and Technical Performance Measures for key program deliverables.
We cannot determine the Value of something unless we know its cost. But determining Value requires tangible measures that can be compared against that cost. In the Systems Engineering paradigm, these are the Measures of Effectiveness, Measures of Performance, Technical Performance Measures, and Key Performance Parameters.
Informing Program Performance with Technical Performance Measures (TPMs) – Glen Alleman
This Handbook provides guidance to Government Program Managers and Program Performance Analysts for identifying, integrating, managing and assessing the impacts on program performance from Technical Performance Measures (TPMs).
EIA-748-C asks us to objectively assess accomplishments at the work performance level. Section 3.8 of 748-C also tells us Earned Value is a direct measurement of the quantity of work accomplished, while the quality and technical content of work is controlled by other processes. To provide visibility into integrated cost, schedule, and technical performance, we need more than CPI and SPI. We need measures of increasing technical performance.
Return on Investment for a Design for Reliability Program – Accendo Reliability
Last year we presented a paper on Design for Reliability (DFR), reviewing the benefits of a good DFR program, including some of the essential building blocks of DFR and pointing out some erroneous practices in use today.
We discussed a good DFR Program having the following attributes:
1. Setting Goals at the beginning of the program and then developing a plan to meet the goals.
2. Having the reliability goals being driven by the design team with the reliability team acting as mentors.
3. Providing metrics so that you have checkpoints on where you are against your goals.
4. Writing a Reliability Plan (not only a test plan) to drive your program.
A good DFR program must choose the best tools from each area of the product life cycle:
• Identify
• Design
• Analyze
• Verify
• Validate
• Monitor and Control
The DFR Program must then integrate the tools together effectively.
Since then, we have developed a method to calculate the Return on Investment (ROI) from a Design for Reliability program, also known as the DFR ROI, which we discuss in this paper.
There are a number of factors involved in calculating the ROI for your DFR program, including:
1) Improved Warranty Rate (derived from your Reliability Maturity Level)
2) Current Warranty Rate
3) Cost per Repair
4) Cost of New Reliability Program
5) Savings from Losing a Customer
6) Volume
In this paper, we will show you how to calculate each of these to derive your DFR ROI.
Recent College of Performance Management webinar on using Technical Performance to inform Earned Value Management: six steps to building a credible Performance Measurement Baseline that connects the dots between all the elements of the program.
Planning projects usually starts with tasks and milestones. The planner gathers this information from the participants – customers, engineers, subject matter experts – and arranges it in the form of activities and milestones. PMBOK defines "project time management" in this manner. The activities are then sequenced according to the project's needs and mandatory dependencies.
Increasing the Probability of Project Success – Glen Alleman
Risk Management is essential for development and production programs. Information about key cost, performance and schedule attributes are often uncertain or unknown until late in the program.
Risk issues that can be identified early in the program, which may potentially impact the program, termed Known Unknowns, can be alleviated with good risk management. -- Effective Risk Management 2nd Edition, Page 1, Edmund Conrow, American Institute of Aeronautics and Astronautics, 2003
From Principles to Strategies for Systems Engineering – Glen Alleman
From principles to strategies: how to apply the principles, practices, and processes of Systems Engineering to solve complex technical, operational, and organizational problems.
Building a Credible Performance Measurement Baseline – Glen Alleman
Establishing a credible Performance Measurement Baseline, with a risk-adjusted Integrated Master Plan and Integrated Master Schedule, starts with the WBS and connects technical measures of progress to Earned Value.
Starting with the development of a Rough Order of Magnitude (ROM) estimate of work and duration, this approach creates the Product Roadmap and Release Plan and the Product and Sprint Backlogs, executes and statuses the Sprint, and informs the Earned Value Management System using Physical Percent Complete as the measure of progress to plan.
Program Management Office, Lean Software Development, and Six Sigma – Glen Alleman
Successfully combining a PMO, Agile, and Lean/Six Sigma starts with understanding what benefit each paradigm brings to the table. Architecting a solution for the enterprise requires assembling a "system" of processes, people, and principles – all sharing the goal of business improvement.
This resource document describes the Program Governance Road map for product development, deployment, and sustainment of products and services in compliance with CMS guidance, ITIL IT management, CMMI best practices, and other guidance to assure high quality software is deployed for sustained operational success in mission critical domains.
Increasing the Probability of Project Success with Five Principles and Practices – Glen Alleman
There are many approaches to managing projects in every domain.
This seminar lays the foundations for increasing the probability of project success, no matter the domain, the technology, or the approach to delivering the project's outcomes.
The principles of this approach are immutable.
The practices for implementing the principles are universally applicable.
Each chart in this presentation contains guidance that can be applied to your project, no matter the domain.
In our short hour here, we're going to cover a lot of material.
The bibliography contains the supporting materials we can tailor to your individual project.
Seven Habits of a Highly Effective Agile Project Manager – Glen Alleman
Recent neurological studies indicate that the role of emotion in human cognition is essential; emotions are not a luxury. Instead, emotions play a critical role in rational decision–making, in perception, in human interaction, and in human intelligence. Habits are the intersection of knowledge, skill, and desire.
The 5 Immutable Principles of Project Management – Glen Alleman
Software development methods are sometimes confused with Project Management principles. There are 5 irreducible principles used to manage projects, no matter the domain or context. We need to assure our development work is guided by these 5 Project Management principles.
CPM-500(D): Implementing Technical Performance Measures – Lesson 3 (v1)
1. PMI EVM Community of Practice, IPM 2011
CPM-500(D): Implementing Technical Performance Measures
Glen B. Alleman, DoD Programs
glen.alleman@niwotridge.com, +1 303 241 9633
2. Learning Objectives
TLO #9: The student will understand the role of Technical Performance Measurement (TPM) in the project office.
ELO #1: The student will recognize the policy requirements for Technical Performance Measures.
ELO #2: The student will recognize the role of Integrated Baseline Reviews in confirming the entire technical scope of work has been planned.
ELO #3: The student will recognize the role of the WBS in supporting Technical Performance Measure requirements.
TLO #9: The student will understand the scope of DCMA's (or other) TPM software management tool implementation.
ELO #1: The student will recognize the benefits and challenges of Technical Performance Measure implementation.
ELO #2: The student will recognize the use of control limit charts to track Technical Performance Measure metrics.
ELO #3: The student will understand the methodology and approach used to show the effect of Technical Performance Measures on Earned Value.
4. Increasing the Probability of Program Success Means …
Building a credible Performance Measurement Baseline.
[Diagram: the Performance Measurement Baseline (PMB) connecting Risk, Cost, IMP/IMS, SOW, WBS, and TPM.]
This is actually harder than it looks!
5. A Core Problem With Earned Value
Measures of progress must be in units meaningful to the stakeholders.
Earned Value measures performance in units of "money" (BCWS, BCWP, ACWP). We need another measure of progress in units of TIME.
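Those three "money" measures drive the standard indices. A minimal sketch with invented numbers (the deck does not give these formulas here; they are the conventional EIA-748-style calculations referenced later in the lesson):

```python
# Conventional Earned Value indices from the three "money" measures.
bcws = 1_000.0  # Budgeted Cost of Work Scheduled (planned value to date)
bcwp = 900.0    # Budgeted Cost of Work Performed (earned value to date)
acwp = 1_100.0  # Actual Cost of Work Performed to date
bac = 5_000.0   # Budget At Completion (invented for the example)

cpi = bcwp / acwp  # Cost Performance Index: value earned per dollar spent
spi = bcwp / bcws  # Schedule Performance Index: value earned vs. planned
eac = bac / cpi    # one common EAC form; assumes the current CPI persists
tcpi = (bac - bcwp) / (eac - acwp)  # efficiency needed to achieve that EAC

print(f"CPI={cpi:.2f}  SPI={spi:.2f}  EAC={eac:,.0f}  TCPI={tcpi:.2f}")
```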
6. Doing This Starts With Some Guidance
Systems engineering uses technical performance measurements to balance cost, schedule, and performance throughout the life cycle. Technical performance measurements compare actual versus planned technical development and design. They also report the degree to which system requirements are met in terms of performance, cost, schedule, and progress in implementing risk handling. Performance metrics are traceable to user-defined capabilities.
― Defense Acquisition Guide (https://dag.dau.mil/Pages/Default.aspx)
In The End ― It's All About Systems Engineering
7. Just a Reminder of the Primary Elements of Earned Value
[Diagram: the three axes of Earned Value – Cost, Schedule, and Technical Performance – showing funding margin for under performance, schedule margin for under performance or schedule extension, and the Over Target Baseline (OTB) for over cost or over schedule conditions.]
8. This Has All Been Said Before. We Just Weren't Listening …
… the basic tenets of the process are the need for seamless management tools, that support an integrated approach … and "proactive identification and management of risk" for critical cost, schedule, and technical performance parameters.
― Secretary of Defense Perry memo, May 1995; TPM Handbook, 1984
Why Is This Hard To Understand?
We seem to be focused on EV reporting, not the use of EV to manage the program. Getting the CPR out the door is treated as the end of Program Planning and Control's efforts, not the beginning.
9. The Gap Seems to Start With a Common Problem
Many times, the information from Cost, Schedule, Technical Performance, and Risk Management gets mixed up when we try to put them together.
10. The NDIA EVM Intent Guide Says
Notice the inclusion of Technical along with Cost and Schedule. That's the next step in generating Value from Earned Value: EV MUST include the Technical Performance Measures.
11. Back to Our Technical Performance Measures
Technical Performance Measures do what they say: measure the technical performance of the product or service produced by the program.
12. The Real Question?
How fast can we safely go? Yes, the units of measure are MPH.
13. Measure of Effectiveness (MoE)
The operational measures of success that are closely related to the achievement of the mission or operational objectives, evaluated in the operational environment under a specific set of conditions.
Measures of Effectiveness …
Are stated in units meaningful to the buyer,
Focus on capabilities independent of any technical implementation,
Are connected to mission success.
MoEs belong to the End User.
"Technical Measurement," INCOSE–TP–2003–020–01
14. Measure of Performance (MoP)
Measures that characterize physical or functional attributes relating to the system operation, measured or estimated under specific conditions.
Measures of Performance are …
Attributes that assure the system has the capability to perform,
Assessments of the system to assure it meets design requirements to satisfy the MoE.
MoPs belong to the Program – developed by the Systems Engineer, measured by CAMs, and analyzed by PP&C.
"Technical Measurement," INCOSE–TP–2003–020–01
15. Key Performance Parameters (KPP)
Represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessment, or termination of the program.
Key Performance Parameters …
Have a threshold or objective value,
Characterize the major drivers of performance,
Are considered Critical to Customer (CTC).
The acquirer defines the KPPs during operational concept development – KPPs say what DONE looks like.
"Technical Measurement," INCOSE–TP–2003–020–01
16. Technical Performance Measures (TPM)
Attributes that determine how well a system or system element is satisfying, or is expected to satisfy, a technical requirement or goal.
Technical Performance Measures …
Assess design progress,
Define compliance to performance requirements,
Identify technical risk,
Are limited to critical thresholds,
Include projected performance.
"Technical Measurement," INCOSE–TP–2003–020–01
17. Dependencies Between These Measures
The Acquirer defines the needs and capabilities in terms of operational scenarios; the Supplier defines the physical solutions that meet the needs of the Stakeholders.
[Diagram: the Mission Need and KPP flow down to MoE, then MoP, then TPM.]
MoE: operational measures of success related to the achievement of the mission or operational objective being evaluated.
MoP: measures that characterize physical or functional attributes relating to the system operation.
TPM: measures used to assess design progress, compliance to performance requirements, and technical risks.
"Coming to Grips with Measures of Effectiveness," N. Sproles, Systems Engineering, Volume 3, Number 1, pp. 50–58
18. "Measures" of Technical Measures
Achieved to Date – Measured technical progress or estimate of progress
Current Estimate – Value of a technical parameter that is predicted to be achieved
Milestone – Point in time when an evaluation of a measure is accomplished
Planned Value – Predicted value of the technical parameter
Planned Performance Profile – Profile representing the project's time-phased demonstration of a technical parameter
Tolerance Band – Management alert limits
Threshold – Limiting acceptable value of a technical parameter
Variances – Demonstrated technical variance; predicted technical variance
Source: INCOSE Systems Engineering Handbook
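These attributes map naturally onto a small data structure. A sketch (the class and field names are mine, following the INCOSE terms in the table above; the usage values echo the UAV weight example later in the deck):

```python
from dataclasses import dataclass, field

@dataclass
class TechnicalPerformanceMeasure:
    """INCOSE-style TPM attributes; field names follow the table above."""
    name: str
    planned_profile: dict[str, float]   # milestone -> planned value
    tolerance_band: float               # management alert limit (+/-)
    threshold: float                    # limiting acceptable value
    achieved_to_date: dict[str, float] = field(default_factory=dict)
    current_estimate: float | None = None

    def variance(self, milestone: str) -> float:
        """Demonstrated technical variance at a milestone."""
        return (self.achieved_to_date[milestone]
                - self.planned_profile[milestone])

weight = TechnicalPerformanceMeasure(
    name="Vehicle Weight (kg)",
    planned_profile={"PDR": 25.0, "CDR": 24.0},
    tolerance_band=1.0,
    threshold=28.0,
    achieved_to_date={"PDR": 25.5},
)
print(weight.variance("PDR"))  # 0.5 kg over the planned profile
```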
19. A Familiar Graphic of TPMs
[Chart: a TPM planned performance profile plotted against time (program maturity), showing the Planned Value, Current Estimate, Achieved to Date, upper and lower tolerance limits, the Threshold, the Variance, and Milestones – illustrated with Mean Time Between Failure.]
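Reading that chart programmatically – a hedged sketch of the alerting logic the graphic implies (the function name, limits, and the "lower is better" orientation are assumptions for the example):

```python
def assess_tpm(achieved: float, planned: float,
               tolerance: float, threshold: float) -> str:
    """Classify an achieved-to-date value against the planned profile.

    Assumes a lower-is-better parameter (e.g., weight); invert the
    threshold test for parameters like Mean Time Between Failure,
    where higher is better.
    """
    if achieved > threshold:
        return "BREACH: beyond the limiting acceptable value"
    if abs(achieved - planned) > tolerance:
        return "ALERT: outside the management tolerance band"
    return "ON TRACK: within tolerance of the planned profile"

print(assess_tpm(achieved=25.5, planned=25.0, tolerance=1.0, threshold=28.0))
```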
20. TPMs from an Actual Program
Chandra X-Ray Telescope
21. What Does a Real Technical Performance Measure Look Like?
Not that bagels are not interesting in Lessons 1 and 2, but let's get ready to look at a flying machine.
22. The WBS for a UAV – TPMs Start With The WBS
1.1 Air Vehicle
1.1.1 Sensor Platform
1.1.2 Airframe
1.1.3 Propulsion
1.1.4 On Board Comm
1.1.5 Auxiliary Equipment
1.1.6 Survivability Modules
1.1.7 Electronic Warfare Module
1.1.8 On Board Application & System SW
1.3 Mission Control / Ground Station SW
1.3.1 Signal Processing SW
1.3.2 Station Display
1.3.3 Operating System
1.3.4 ROE Simulations
1.3.5 Mission Commands
23. What Do We Need to Know About This Program Through TPMs?
What WBS elements represent the TPMs?
What Work Packages produce these WBS elements?
Where do these Work Packages live in the IMS?
What are the Earned Value baseline values for these Work Packages?
How are we going to measure all these variables?
What does the curve look like for these measurements?
24. Verifying Each TPM – Evidence That We're in Compliance
CA – Do we know what we promised to deliver, now that we've won? With our submitted ROM, what are the values we need to get through the Integrated Baseline Review? How do we measure weight for each program event?
SFR – Can we proceed into preliminary design? The contributors to the vehicle weight are confirmed, and the upper limits defined in the product architecture and requirements flow-down database (DOORS) feed into a model.
SRR – Can we proceed into the System Development and Demonstration (SDD) phase? Do we know all the drivers of vehicle weight? Can we bound their upper limits? Can the subsystem owners be successful within these constraints, using a high-fidelity model?
PDR – Can we start detailed design and meet the stated performance requirements within cost, schedule, risk, and other constraints? Does each subsystem designer have the component weight target, with some confidence they can stay below the upper bound? Can this be verified in some tangible way, either through prior examples or a lab model?
CDR – Can the system proceed to fabrication, demonstration, and test within cost, schedule, risk, and other system constraints? Do we know all we need to know to start fabrication of the first articles of the flight vehicle? Some type of example, maybe a prototype, is used to verify we're inside the lines.
TRR – Is the system ready to proceed into formal test? Does the assembled vehicle fall within the weight range limits for first flight – will this thing get off the ground?
25. Design Model – TPM Trends & Responses
[Chart: the Vehicle Weight Technical Performance Measure across program events CA, SFR, SRR, PDR, CDR, and TRR. Planned values step down from 28 kg through 26 kg, 25 kg, and 23 kg as the measurement basis matures from the ROM in the proposal, to a detailed design model, a bench scale model measurement, a prototype measurement, and the flight first article. EV is taken when planned values are met and tolerances kept. (Dr. Falk chart, modified)]
26. The Assessment of Weight as a Function of Time
At Contract Award, there is a proposal-grade estimate of vehicle weight.
At System Functional Review, the Concept of Operations is validated for the weight.
At System Requirements Review, the weight targets are flowed down to the subsystem components.
At PDR, the CAD model starts the verification process.
At CDR, actual measurements are needed to verify all models.
At Test Readiness Review, we need to know how much fuel to put on board for the first flight test.
27. The WBS for a UAV – Airframe Weight TPM
1.1 Air Vehicle → 1.1.2 Airframe. The planned weight is 25 kg; the actual weight is 25.5 kg. Close to plan! So we are doing okay, right?
CA: Planned 28.0 kg; Actual 30.4 kg; Assessed Risk to TRR: Moderate (>2.0 kg off target); Planned Method: ROM; Actual Method: ROM
SFR: Planned 27.0 kg; Actual 29.0 kg; Assessed Risk: Low (1–2 kg off target); Planned Method: "similar to" estimate; Actual Method: ROM
SRR: Planned 26.0 kg; Actual 27.5 kg; Assessed Risk: Low (1–2 kg off target); Planned Method: program-unique design model; Actual Method: ROM
PDR: Planned 25.0 kg; Actual 25.5 kg; Assessed Risk: Very Low (less than 1.0 kg off target); Planned Method: program-unique design model with validated data; Actual Method: "similar to" estimate
CDR: Planned 24.0 kg; Planned Method: actual measurement of bench-test components
TRR: Planned 23.0 kg; Planned Method: actual measurement of prototype airframe
Here's the Problem: the Actual Method lags the Planned Method at every event.
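The "Assessed Risk to TRR" row applies fixed bands to the weight variance. A sketch of that classification using the table's own bands and values (only the function name is mine):

```python
def assessed_risk_to_trr(actual_kg: float, planned_kg: float) -> str:
    """Risk bands from the slide: more than 2.0 kg off target is Moderate,
    1-2 kg off is Low, under 1.0 kg off target is Very Low."""
    off_target = abs(actual_kg - planned_kg)
    if off_target > 2.0:
        return "Moderate"
    if off_target >= 1.0:
        return "Low"
    return "Very Low"

# (actual, planned) weights at each program event, from the table above.
events = {"CA": (30.4, 28.0), "SFR": (29.0, 27.0),
          "SRR": (27.5, 26.0), "PDR": (25.5, 25.0)}
for event, (actual, planned) in events.items():
    print(event, assessed_risk_to_trr(actual, planned))
```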
28. Raison d'être for Technical Performance Measures
The real purpose of Technical Performance Measures is to reduce programmatic and technical RISK.
[Diagram: the same PMB elements as slide 4 – Risk, Cost, IMP/IMS, PMB, SOW, WBS, TPM.]
29. Buying Down Risk with TPMs
"Buying down" risk is planned in the IMS. MoE, MoP, and KPP are defined in the work package for the critical measure – weight.
[Chart: risk burn-down for Risk CEV-037 – Loss of Critical Functions During Descent. The risk score (0–24 scale) steps down from 2005 to 2011 as risk-reduction work completes: correlating the analytical model, focus splinter reviews, force-and-moment and block wind tunnel testing, in-flight development tests, and a damaged TPS flight test. Annotations: weight risk reduced from RED to YELLOW; weight confirmed ready to fly – GREEN at this point. Legend: Planned Risk Level; Planned (solid = linked, hollow = unlinked, filled = complete).]
If we can't verify we've succeeded, then the risk did not get reduced. The risk may have gotten worse.
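The slide's caveat – if success can't be verified, the risk didn't come down – can be expressed as a simple check of verified risk scores against the planned burn-down (milestone names and scores below are invented, on the chart's 0–24 scale):

```python
# Planned vs. verified risk scores at successive milestones (invented data).
planned = {"wind tunnel block 1": 16, "prototype weight model": 10,
           "first flight": 4}
verified = {"wind tunnel block 1": 16, "prototype weight model": 14,
            "first flight": None}  # None = reduction not yet verified

for milestone, plan in planned.items():
    actual = verified[milestone]
    if actual is None:
        # No verification evidence: treat the buy-down as NOT achieved.
        print(f"{milestone}: unverified - risk remains at prior level or worse")
    elif actual > plan:
        print(f"{milestone}: risk not bought down as planned ({actual} > {plan})")
    else:
        print(f"{milestone}: on plan ({actual} <= {plan})")
```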
30. Increasing the Probability of Success with Risk Management
Going outside the TPM limits always means cost and schedule impacts. "Coloring inside the lines" means knowing how to keep the program GREEN, or at least stay close to GREEN.
So much for our strategy of winning through technical dominance.
31. Connecting the EV Variables
Integrating Cost, Schedule, and Technical Performance assures Program Management has the needed performance information to deliver on-time, on-budget, and on-specification. Conventional Earned Value = Cost + Schedule; adding Technical Performance Measures completes the picture.
Cost Baseline: the Master Schedule is used to derive the Basis of Estimate (BOE), not the other way around. Probabilistic cost estimating uses past performance and cost risk modeling. Labor, materiel, and other direct costs are accounted for in Work Packages. Risk adjustments are made for all elements of cost.
Technical Performance: Earned Value is diluted by missing technical performance, by postponed features, and by non-compliant quality. All these dilutions require adjustments to the Estimate at Completion (EAC) and the To Complete Performance Index (TCPI).
Schedule Baseline: requirements are decomposed into physical deliverables. Deliverables are produced through Work Packages. Work Packages are assigned to an accountable manager and sequenced to form the highest value stream with the lowest technical and programmatic risk.
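One illustrative way to act on the "dilution" point – a sketch of mine, not a method prescribed by the deck or by EIA-748 – is to cap a work package's BCWP by the fraction of its planned technical performance actually achieved:

```python
def tpm_adjusted_bcwp(bcwp: float, achieved: float, planned: float,
                      lower_is_better: bool = True) -> float:
    """Scale earned value by technical performance achievement.

    Illustrative only: for lower-is-better parameters such as weight,
    the achievement ratio is planned/achieved, so work that is "done"
    but misses its technical target earns proportionally less.
    """
    if lower_is_better:
        ratio = planned / achieved if achieved > 0 else 0.0
    else:
        ratio = achieved / planned if planned > 0 else 0.0
    return bcwp * min(1.0, ratio)

# A work package claims $200K earned, but the airframe is 25.5 kg against
# a 25.0 kg plan (values from the UAV example above).
print(f"adjusted BCWP: {tpm_adjusted_bcwp(200_000, 25.5, 25.0):,.0f}")
```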
32. TPM Checklist
MoE:
Traceable to needs, goals, objectives, and risks.
Defined with associated KPPs.
Each MoE independent from the others.
Each MoE independent of any technical solution.
Addresses the required KPPs.
MoP:
Traceable to applicable MoEs, KPPs, system-level performance requirements, and risks.
Focused on technical risks and supporting trades between alternative solutions.
Provides insight into system performance.
Decomposed, budgeted, and allocated to system elements.
Assigned an "owner," the CAM and Technical Manager.
TPM:
Traceable to applicable MoPs, system element performance requirements, objectives, risks, and WBS elements.
Further decomposed, budgeted, and allocated to lower-level system elements in the WBS and IMS.
Assigned an owner, the CAM and Work Package Manager.
Sources of measure identified, and processes for generating the measures defined.
Integrated into the program's IMS as part of the exit criteria for the Work Package.
33. Did We Accomplish the Learning Objectives?
TLO #9: The student will understand the role of Technical Performance Measurement (TPM) in the project office.
ELO #1 (policy requirements for TPM): policies and supporting guidance, with links and reference numbers, provided.
ELO #2 (role of IBRs in confirming the entire technical scope of work has been planned): this is the first place where cost, schedule, and technical performance come together – in the Integrated Master Schedule (IMS).
ELO #3 (role of the WBS in supporting TPM requirements): TPMs are first located in the WBS.
TLO #9: The student will understand the scope of DCMA's (or other) TPM software management tool implementation.
ELO #1 (benefits and challenges of TPM implementation): progress is measured in units of physical percent complete. TPMs are those units.
ELO #2 (use of control limit charts to track TPM metrics): we've seen notional and actual charts.
ELO #3 (methodology and approach used to show the effect of TPMs on earned value): the example of our "flying machine" connects the dots for TPMs, risk, cost, and schedule.
35. Backup Materials
Knowledge is of two kinds. We know a subject ourselves, or we know where we can find information on it.
— Samuel Johnson
36. Many Sources for Connecting the Dots
OMB Circular A–11, Section 300
GAO Report 06–250
DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08
Interim Defense Acquisition Guidebook (DAG) 6/15/09
Systems Engineering Plan (SEP) Preparation Guide 4/08
WBS Handbook, MIL–HDBK–881A (WBS) 7/30/05
Integrated Master Plan (IMP) & Integrated Master Schedule Preparation & Use Guide (IMS) 10/21/05
Guide for Integrating SE into DOD Acquisition Contracts 12/06
Defense Acquisition Program Support Methodology (DAPS) V2.0 3/20/09
Guide to the Project Management Institute Body of Knowledge (PMBOK Guide®), 4th Edition
Standard for Application and Management of the SE Process (IEEE 1220)
Capability Maturity Model Integration (CMMI®)
Processes for Engineering a System (ANSI/EIA–632)
NASA EVM Guide NPG 9501.3
37. Office of Management and Budget
Circular No. A–11, Section 300: Planning, Budgeting, Acquisition and Management of Capital Assets
Section 300–5:
– Performance-based acquisition management
– Based on the EVMS standard
– Measure progress towards milestones:
• Cost
• Capability to meet specified requirements
• Timeliness
• Quality
38. Need: Accurate Performance Measurement
GAO Report 06–250, Information Technology: Improve the Accuracy and Reliability of Investment Information. Findings and recommendations:
2. If EVM is not implemented effectively, decisions are based on inaccurate and potentially misleading information.
3. Agencies are not measuring actual versus expected performance in meeting IT performance goals.
39. DOD Guides: Technical Performance
Department of Defense guidelines for Technical Performance Measures:
DoDI 5000.02, Operation of the Defense Acquisition System (POL) 12/08
Interim Defense Acquisition Guidebook (DAG) 6/15/09
Systems Engineering Plan (SEP) Preparation Guide 4/08
WBS Handbook, MIL–HDBK–881A (WBS) 7/30/05
Integrated Master Plan (IMP) & Integrated Master Schedule Preparation & Use Guide (IMS) 10/21/05
Guide for Integrating SE into DOD Acquisition Contracts (Integ SE) 12/06
Defense Acquisition Program Support Methodology (DAPS) V2.0 3/20/09
40. DoD: TPMs in Technical Baselines and Reviews
[Matrix: the rows below, mapped against the DoD policies and guides from slide 39 – POL, DAG, SEP, WBS, IMP/IMS, Integrated Systems Engineering, DAPS.]
Technical baselines: Functional (SFR), Allocated (PDR), Product (CDR)
Event-driven timing
Success criteria of technical review
Entry and exit criteria for technical reviews
Assess technical maturity
41. DoD: TPMs in Integrated Plans
[Matrix: the rows below, mapped against the same DoD policies and guides as slide 40.]
Integrated SEP with: IMP/IMS, TPMs, EVM
Integrated WBS with: requirement specification, Statement of Work, IMP/IMS/EVMS
Link risk management, technical reviews, TPMs, EVM, WBS, IMS
42. Guidance in Standards, Models, and the Defense Acquisition Guide
Processes for Engineering a System (ANSI/EIA–632)
Standard for Application and Management of the SE Process (IEEE 1220)
Capability Maturity Model Integration (CMMI®):
– CMMI for Development, Version 1.2
– CMMI for Acquisition, Version 1.2
– Using CMMI to Improve Earned Value Management, 2002
Guide to the Project Management Institute Body of Knowledge (PMBOK Guide®), 4th Edition
43. Technical Performance Measures (TPM): More Sources
IEEE 1220: 6.8.1.5 – Performance-based progress measurement. TPMs are key to progressively assessing technical progress. Establish dates for checking progress and for meeting full conformance to requirements.
EIA–632: Glossary – Predict the future value of key technical parameters of the end system based on current assessments. The planned value profile is the time-phased achievement, including achievement to date and the technical milestone where each TPM evaluation is reported.
CMMI for Development – Requirements Development, Specific Practice (SP) 3.3, Analyze Requirements. Typical work product: TPMs. Subpractice: identify TPMs that will be tracked during development.
44. PMBOK® Guide
10.5.1.1 Project Management Plan
Performance Measurement Baseline:
– Typically integrates scope, schedule, and cost parameters of a project
– May also include technical and quality parameters
45. PMBOK® Guide
8.3.5.4 Work Performance Measurements
Used to produce project activity metrics that evaluate actual progress compared to planned progress. These include, but are not limited to:
– Planned vs. actual technical performance
– Planned vs. actual schedule performance
– Planned vs. actual cost performance
46. TPMs in DAG and DAPS
Defense Acquisition Guide: performance measurement of WBS elements, using objective measures, is essential for EVM and Technical Assessment activities. Use TPMs and Critical Technical Parameters (CTP) to report progress in achieving milestones.
DAPS: use TPMs to determine whether percent-complete metrics accurately reflect quantitative technical progress and quality toward meeting Key Performance Parameters (KPP) and Critical Technical Parameters.
47. TPMs in DAG
Compare the actual versus planned technical development and design. Report progress in the degree to which system performance requirements are met.
The plan is defined in terms of:
– Expected performance at specific points, defined in the WBS and IMS
– Methods of measurement at those points
– Variation limits for corrective action
48. PMBOK® Guide
11.6.2.4 Technical Performance Measurement
Compares technical accomplishments … to … the project management plan's schedule of technical achievement. Requires definition of objective, quantifiable measures of technical performance which can be used to compare actual results against targets. Might include weight, transaction times, number of delivered defects, storage capacity, etc. Deviation, such as demonstrating more or less functionality than planned at a milestone … forecasts the degree of success in achieving the project's scope.
49. CMMI–ACQ
Acquisition Technical Management, SP 1.3 Conduct Technical Reviews.
Typical supplier deliverables: progress reports and process, product, and service level measurements; TPMs.
50. SMS Shall: Monitor Progress Against the Plan
4.2.12.2 Monitoring
– The contractor SHALL monitor progress against plan to validate, approve, and maintain each baseline and functional architecture.
4.2.12.2.2 Required Product Attributes
– Each documented assessment includes TPMs and metrics: the metrics and technical parameters for tracking that are critical indicators of technical progress and achievement.
51. NASA EVM Guide: Technical Performance
NASA EVM Guide NPG 9501.3
– 4.5 Technical Performance Requirements (TPR): when TPRs are used, appropriate and relevant metrics must be defined in the solicitation.
– Appendix A.7, 14.1 TPR: compares expected performance and physical characteristics with contractually specified values; the basis for reporting established milestones and progress toward meeting technical requirements.
52. Derivation and Flow Down of TPMs
Document / Baseline – Parameter:
IMP, Functional Baseline – Measures of Effectiveness (MoE)
IMP, WBS, Functional Baseline – Measures of Performance (MoP)
IMP, Allocated Baseline – Technical Performance Measures
IMS – TPM milestones and planned values
Work Packages – TPM % complete criteria
See the next chart for linkage of technical baselines to technical reviews.
53. Interesting Attributes of TPMs
Achieved to Date (sounds like EV)
Current Estimate (sounds like EAC/ETC)
Milestone
Planned (target) Value (sounds like PV)
Planned Performance Profile (sounds like a PMB)
Tolerance Band (sounds like reporting thresholds)
Threshold (yep, just what we thought)
Variance (sounds like variance!)