This document discusses cost and schedule overruns that are common in publicly-funded programs. It proposes establishing cost and schedule estimates based on the historical variability of similar past programs, and updating estimates periodically with risks, to increase the probability of on-target program delivery. Specific changes are recommended to how initial estimates are created and agreed upon, including developing activity-based estimates adjusted for issues on past programs. Programs should be budgeted at a 70% probability of meeting cost and schedule targets using this joint confidence level approach. This involves changes to estimating, acquisition, and contracting processes.
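The joint confidence level (JCL) idea above can be sketched numerically: simulate cost and schedule outcomes together, then find the fraction of runs that meet both targets. A minimal sketch, where the distributions, the shared risk driver, and all parameter values are illustrative assumptions, not figures from the source:

```python
import random

random.seed(1)

def simulate(n=100_000):
    """Draw cost ($M) and schedule (months) outcomes that share a risk driver."""
    runs = []
    for _ in range(n):
        driver = random.gauss(0, 1)  # common driver couples cost and schedule
        cost = 100 * (1 + 0.15 * driver + 0.10 * random.gauss(0, 1))
        months = 36 * (1 + 0.12 * driver + 0.08 * random.gauss(0, 1))
        runs.append((cost, months))
    return runs

def jcl(runs, cost_target, sched_target):
    """Fraction of runs meeting BOTH targets -- the joint confidence level."""
    hits = sum(1 for c, m in runs if c <= cost_target and m <= sched_target)
    return hits / len(runs)

runs = simulate()
# Budgeting at a 70% JCL means choosing the target pair so jcl(...) >= 0.70
print(jcl(runs, cost_target=115, sched_target=41))
```

Budgeting at the 70% level then amounts to walking the cost and schedule targets up until the joint fraction clears 0.70.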
Notes on IT programmatic risk in 5 not so easy pieces, by Glen Alleman
Risk management in the IT business is similar to risk management in most domains. Here's a starting point for understanding the steps needed to manage risk.
This document discusses the importance of continuous risk management for project success. It outlines five key concepts for effective risk management: 1) hoping is not a strategy, 2) single point estimates are inaccurate, 3) integrating cost, schedule, and technical performance is essential, 4) a formal risk management model is needed, and 5) risk communication is critical. The document emphasizes that risk management requires identifying risks early, quantifying their potential impacts, and developing mitigation plans. An effective risk management process is proactive rather than reactive and considers uncertainties as well as known risks.
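The "single point estimates are inaccurate" point is commonly operationalized with three-point (PERT) estimates, which make the spread around a most-likely value explicit. A small sketch; the task values are hypothetical:

```python
def pert_mean(optimistic, most_likely, pessimistic):
    """PERT (beta) expected value: weights the most-likely value 4x."""
    return (optimistic + 4 * most_likely + pessimistic) / 6

def pert_stddev(optimistic, pessimistic):
    """Conventional PERT spread approximation."""
    return (pessimistic - optimistic) / 6

# A task estimated as a single point of "10 days" can hide a long right tail:
o, m, p = 8, 10, 20
print(pert_mean(o, m, p))   # expected value sits above the single-point estimate
print(pert_stddev(o, p))
```

With a skewed range like this, the expected value is noticeably higher than the single-point "most likely" number, which is exactly the bias the five concepts warn about.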
The question – what does Done Look Like? – was asked every week on the program that changed my life as a Program Manager. Rocky Flats Environmental Technology Site (RFETS) was the marketing term for the 3rd worst toxic waste site on the planet. RFETS was a nuclear bomb manufacturing plant, built in 1951, operating until 1989, and closed in 2005. I served as the VP of Program Management of the ITC (Information Technology and Communications) group, providing ERP, purpose-built IT, voice, and data systems for 5,000 employees and contractors of the Bomb Factory.
Cost and schedule growth for federal programs is created by unrealistic technical performance expectations, unrealistic cost and schedule estimates, inadequate risk assessments, unanticipated technical issues, and poorly performed, ineffective risk management, all contributing to program technical and programmatic shortfalls.
Managing risk with deliverables planning, by Glen Alleman
This document discusses managing risk through continuous risk management (CRM). It introduces the five principles of risk management and outlines the CRM process, which includes identifying risks, analyzing and prioritizing them, planning mitigations, tracking mitigation progress and risks, making decisions based on risk data, and communicating throughout the project. The presentation provides examples of risk statements, evaluation criteria, classification approaches, and integrating risks and mitigation plans into project schedules. The goal of CRM is to continually identify, assess, and mitigate risks to improve project outcomes.
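The CRM loop described above (identify, analyze and prioritize, plan, track, control, communicate) is usually anchored by a risk register. A minimal sketch, with hypothetical risks and a simple probability-times-impact exposure score as one possible evaluation criterion:

```python
from dataclasses import dataclass

@dataclass
class Risk:
    statement: str      # "If <condition>, then <consequence>" form
    probability: float  # 0..1
    impact: int         # 1 (negligible) .. 5 (severe)
    mitigation: str = ""
    status: str = "open"

    @property
    def score(self) -> float:
        """Exposure: probability times impact, used to prioritize."""
        return self.probability * self.impact

register = [
    Risk("If the COTS vendor slips, then integration start delays 2 months", 0.4, 4),
    Risk("If requirements stay ambiguous, then rework grows", 0.6, 3),
]

# Analyze & prioritize: rank by exposure so mitigation planning targets the worst first
for r in sorted(register, key=lambda r: r.score, reverse=True):
    print(f"{r.score:.1f}  {r.statement}")
```

Tracking and controlling then become updates to `probability`, `impact`, and `status` as mitigations are worked, with the ranking re-run at each review.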
Risk Management is a critical success factor for all project work.
Risk identification, quantitative and qualitative analysis, and risk response planning and execution are provided in this presentation.
With uncertainty comes opportunity. But if a project manager is consumed with managing the risks, there is little time to manage the opportunities. Good risk management is not about fear of failure; it is about removing barriers to success. This is when opportunity management emerges.
Root cause analysis of why many DOD programs fail to deliver required capabilities within the planned time and budget has shown causes for failure begin with the buyer not knowing what “done looks like” before releasing the Request for Proposal (RFP). These are corrected with better guidance for preparing Measures of Effectiveness, Measures of Performance, and Key Performance Parameters in the RFP.
This document discusses principles of effective risk management for projects. It emphasizes the importance of clearly defining requirements and success criteria before releasing requests for proposals. This includes quantifying measures of effectiveness and performance for different use scenarios. Effective risk management also requires developing a funded implementation plan informed by historical risks and uncertainties. The document outlines key data and processes needed to reduce risks and increase the probability of a project's success, including defining requirements, developing plans and schedules, identifying risks and adjustments needed to plans. It discusses uncertainties from both known and unknown sources that can impact cost, schedule and performance.
Showing how to Increase the Probability of Project Success by applying the ..., by Glen Alleman
All projects ‒ Traditional and Agile ‒ operate in the presence of uncertainty that creates risk.
Five Immutable Principles and their supporting Processes and Practices can be used to increase the probability of success in the presence of these uncertainties.
The role of Risk Assessment and Risk Management is to continuously Identify, Analyze, Plan, Track, Control, and Communicate the risks associated with a project.
Webster's definition of risk is the possibility of suffering a loss. Risk in itself is not bad. Risk is essential to progress, and failure is often a key part of learning. Managing risk is a key part of success.
This document describes the foundations for conducting a risk assessment of a large-scale system development project. Such a project will likely include the procurement of Commercial Off The Shelf (COTS) products as well as their integration with legacy systems.
This document provides sample questions and answers that a Control Account Manager (CAM) could expect to encounter during an interview related to their Earned Value Management responsibilities. It includes background on topics like required training, the program organization structure, work authorization processes, performance measurement baseline planning, and earned value measurement. Sample questions are provided on each topic along with potential answers the CAM could provide to demonstrate their knowledge and management of their control accounts.
When contractually required, DOD acquisition contractors are obligated to submit IPMRs electronically in accordance with (IAW) DID 81861. This data is necessary but not sufficient for successfully managing a program. This presentation is an overview of the Essential Views needed for that success.
The notion of integrating cost, schedule, technical performance, and risk is possible in theory. In practice, care is needed to assure credible information is provided to the Program Manager.
The document discusses risk management on high technology programs. It outlines the agenda for a 4-hour session which will cover the principles of risk management, introduce Continuous Risk Management (CRM), illustrate each CRM process area with examples, and familiarize participants with identifying risks. The document then discusses the five principles of risk management and explains concepts like the Mission-Oriented Success Analysis and Improvement Criteria (MOSAIC) framework for assessing project risk.
Risk Management is essential to the success of all project work. Information about key project cost, performance, and schedule attributes is often unknown until the project is underway and changes are occurring during execution.
Increasing the probability of program success, by Glen Alleman
Program Success starts and ends with Process. Along the way, people and tools are needed, but process is the foundation of program success. These processes start with the Concept of Operations, describing what Capabilities are needed by the stakeholder to accomplish the mission of the program. Assessment of progress to plan must be made in units of measure meaningful to the decision makers. Measures of Effectiveness are defined by the Government. Measures of Performance and Technical Performance Measures are defined by Industry.
Measurement of project quality performance, by K.K. PATHAK
The document discusses measuring project quality performance for the construction industry. It outlines the need to measure quality, identifies measurable quality parameters, and describes how to score and evaluate projects based on these parameters. Key points include:
- Measuring quality allows understanding of standards, trends, strengths/weaknesses to enhance quality.
- Parameters include implementation of quality plans, product quality, record availability, open issues, repair costs, amounts withheld.
- Projects are scored and graded (A, B, C) based on criteria for each parameter, with weights assigned.
- A sample score sheet and quality trends report are shown as examples of the measurement system.
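The weighted scoring and grading scheme above can be sketched as follows. The parameter names, weights, ratings, and grade cut-offs here are illustrative assumptions; the source names the parameters and the A/B/C grades but does not give the actual weights or bands:

```python
# Hypothetical weights per quality parameter (must sum to 1.0)
WEIGHTS = {
    "quality_plan_implementation": 0.30,
    "product_quality": 0.30,
    "record_availability": 0.15,
    "open_issues": 0.15,
    "repair_costs": 0.10,
}

def project_score(ratings: dict) -> float:
    """Weighted sum of per-parameter ratings, each rated 0-100."""
    return sum(WEIGHTS[p] * ratings[p] for p in WEIGHTS)

def grade(score: float) -> str:
    """Illustrative grade bands -- the source gives no cut-offs."""
    return "A" if score >= 85 else "B" if score >= 70 else "C"

ratings = {"quality_plan_implementation": 90, "product_quality": 80,
           "record_availability": 75, "open_issues": 60, "repair_costs": 70}
s = project_score(ratings)
print(s, grade(s))
```

Scoring every project the same way is what makes the trend reports comparable across projects and over time.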
Cost and schedule growth for complex projects is created when unrealistic technical performance expectations, unrealistic cost and schedule estimates, inadequate risk assessments, unanticipated technical issues, and poorly performed, ineffective risk management contribute to project technical and programmatic shortfalls.
Building a risk-tolerant integrated master schedule, by Glen Alleman
Traditional approaches to planning, scheduling, and managing technical performance are not adequate to defend against these disruptions. This paper outlines the six steps for building a risk-tolerant schedule, using a field-proven approach.
This document discusses implementing technical performance measures (TPM) on projects. It begins by outlining several learning objectives related to understanding the role and requirements of TPM. It then discusses that TPM are needed in addition to earned value measures, which only measure cost and schedule, to also measure technical progress. The document provides examples of how TPM can be defined and measured for work breakdown structure elements and used to track and reduce risk over time. It emphasizes that integrating TPM with cost and schedule measures provides program management with the necessary performance information to deliver projects on time, on budget, and meeting technical requirements.
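One common way to operationalize a TPM, as described above, is to compare the achieved value of a technical parameter against a planned maturation profile at each review, with a tolerance band that triggers corrective action. A sketch with a hypothetical vehicle-weight TPM; the milestones, planned values, and tolerance are assumptions for illustration:

```python
# TPM example: vehicle weight (kg) -- lower is better.
# Planned profile: where the design should be at each review.
plan = {"SRR": 1250, "PDR": 1180, "CDR": 1120, "TRR": 1080}
tolerance = 40  # kg above plan before the TPM flags a variance

def tpm_status(milestone: str, achieved: float) -> str:
    """Compare achieved value to the planned profile at a milestone."""
    variance = achieved - plan[milestone]
    if variance <= 0:
        return "on track"
    if variance <= tolerance:
        return "watch"
    return "variance -- corrective action / risk entry required"

print(tpm_status("PDR", 1175))
print(tpm_status("CDR", 1190))
```

Plotting the achieved values against this profile over time is what lets the TPM show technical progress, and risk retirement, alongside the cost and schedule picture from earned value.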
Getting To Done - A Master Class Workshop, by Glen Alleman
The Principles, Processes, Practices, and Tools to increase the probability of successfully completing projects On-Time, On-Budget, and with the Needed Capabilities.
Increasing the Probability of Success with Continuous Risk Management, by Glen Alleman
This document summarizes an article from The Measurable News publication that discusses increasing the probability of program success through continuous risk management. It describes how identifying, analyzing, planning, tracking and controlling risk on complex systems can help assess the maturity of an existing risk management process and determine actions needed to improve it. The article provides examples of root cause analysis, assumptions, and sources of risk, and argues that separating risks into aleatory and epistemic categories helps better measure the impacts of each type of risk. Continuous risk management is presented as a way to produce risk-informed decisions that can address issues leading to cost overruns, schedule delays, and shortfalls in technical performance.
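The aleatory/epistemic separation argued for above can be made concrete: aleatory (irreducible) variability is modeled as a distribution on the work itself, while epistemic risks are discrete probability-of-occurrence events that can be bought down with knowledge or mitigation. A minimal sketch; all distribution parameters and risk probabilities are hypothetical:

```python
import random

random.seed(7)

def sample_duration():
    """One simulated program duration (months) mixing both risk types."""
    # Aleatory: natural variability of the work, never reducible to zero
    base = random.triangular(10, 18, 12)  # low, high, mode
    # Epistemic: discrete risks that may or may not occur; each is reducible
    # by buying knowledge (prototypes, tests) or by mitigation
    if random.random() < 0.30:            # 30% chance supplier qualification fails
        base += 3
    if random.random() < 0.10:            # 10% chance of redesign
        base += 6
    return base

samples = sorted(sample_duration() for _ in range(50_000))
print("P50:", samples[len(samples) // 2])
print("P80:", samples[int(len(samples) * 0.8)])
```

Keeping the two categories separate in the model is what lets the impact of each be measured on its own: retiring an epistemic risk visibly pulls in the P80, while the aleatory spread only narrows through margin and schedule reserve.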
Estimating cost and schedule for Monte Carlo simulation, by Glen Alleman
This document discusses different approaches to estimating cost and schedule for Monte Carlo simulation projects: Baby Bear, Mamma Bear, and Papa Bear.
The Baby Bear approach involves assigning low, medium, or high confidence intervals. The Mamma Bear approach builds on Baby Bear by making narrow adjustments based on past performance. The Papa Bear approach performs in-depth reference class assessments and identifies cost and schedule risk drivers.
It emphasizes that good estimates require understanding the underlying probability distributions of random project data. Correlations between factors like cost, schedule, and technical performance must also be considered. The different approaches can significantly impact total estimated project costs.
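The correlation point matters in practice: if cost elements are sampled independently, the tails of the total are understated. A sketch of correlated sampling via a shared factor, comparing the P80 total with and without correlation; the element baselines, spreads, and correlation value are illustrative assumptions:

```python
import random

random.seed(3)

def total_cost(rho, n=100_000):
    """Sum two cost elements whose uncertainties share a common factor rho."""
    totals = []
    for _ in range(n):
        z = random.gauss(0, 1)  # shared driver
        a = 50 + 10 * (rho * z + (1 - rho**2) ** 0.5 * random.gauss(0, 1))
        b = 70 + 15 * (rho * z + (1 - rho**2) ** 0.5 * random.gauss(0, 1))
        totals.append(a + b)
    totals.sort()
    return totals

for rho in (0.0, 0.8):
    t = total_cost(rho)
    p80 = t[int(len(t) * 0.8)]
    print(f"rho={rho}: P80 total = {p80:.1f}")
```

The mean total is the same either way; only the spread changes, so the correlated case needs a larger number to reach the same confidence level, which is why ignoring correlation can significantly understate total estimated project costs.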
The document discusses risk management in construction projects. It outlines that risks can significantly impact cost, time, quality and success. It describes various risk management models and processes including risk identification, assessment, response, and monitoring. Effective risk management requires a systematic process and participation across the supply chain. Key factors that influence construction project risks are also examined.
A Possible Cure for Unanticipated EAC Growth, by Glen Alleman
Delivering programs with less capability than promised, while exceeding planned cost and duration, distorts decision making, contributes to cost growth on other programs, undermines the Federal government's credibility with taxpayers, and feeds the public's negative view of these programs.
Many reasons have been hypothesized and documented for cost and schedule growth. The authors review some of these reasons, and propose that government and contractors use the historical variability of past programs to establish cost and schedule estimates at the outset and periodically update these estimates with current risks, to increase the probability of program success. For this to happen, the authors recommend changes to estimating, acquisition, and contracting processes.
Portfolio management and the PPBE process at the Department of Energy white p..., by p6academy
This document discusses using portfolio management tools to improve the Planning, Programming, Budgeting, and Evaluation (PPBE) process for the National Nuclear Security Administration (NNSA). It describes how NNSA implemented Primavera Portfolio Management (PPM) to better track budgets at lower levels and make more informed decisions. PPM allows NNSA to group work into portfolios based on scope, location, and appropriation. This provides transparency into total costs and helps justify budget requests to Congress. The new system addresses issues found in a government audit and recommendations to better account for infrastructure and production costs across the nuclear security enterprise.
We have lots of data. Let's use it to create more credible estimates and help tame the growth beast. In spite of the estimating community's efforts to provide credible estimates, government programs still seem to deliver less than promised, cost more than planned, and take longer than needed. When estimates are consistently biased low:
- Decisions of choice are distorted
- Cost growth causes more growth as programs are stretched out to fund portfolios with fixed budgets
- Taxpayers become more cynical and negative about government
- The estimating community’s credibility is seriously questioned
This summary provides an overview of a study analyzing the performance of over 1,000 demand-side management (DSM) programs from 2010-2013:
- The study found that over 40% of total program spending during this period went to programs that did not meet their stated energy savings goals, and this proportion increased to 47% in 2013. Certain regions and program categories were less likely to reliably meet targets.
- Programs in Canada and the Midwest US saw larger increases in the percentage of their budgets spent on underperforming programs compared to other regions. Aggressive savings targets in these areas made goals more difficult to attain.
- Across both the residential and non-residential sectors, certain program categories consistently failed to meet their savings goals.
This document provides a summary of the preface to the GAO Cost Estimating and Assessment Guide. It discusses three key points:
1) The guide was developed to establish a consistent cost estimating methodology across the federal government based on best practices. It aims to help generate reliable cost estimates to support budgeting and oversight.
2) Accurate cost estimating is critical given increasing budget pressures as resources become scarce. Reliable estimates are needed to ensure programs can deliver as promised and within budget.
3) The guide compiles best practices from both government and industry to develop and maintain cost estimates throughout a program's lifecycle. It serves as a tool for auditors to evaluate the economy, efficiency, and effectiveness of government programs.
In December 2015, the UK's National Audit Office reported that one-third of UK government projects were either unachievable or in doubt. This would have come as a great surprise to most commentators, in particular that two-thirds of the projects were forecast to be on track! It is not immediately clear from the report on what basis these projects are secure about their on-time delivery. How can a project forecast with confidence that it will deliver on time?
WSARA: Baselining Programs Early Compounds the Problems, by Pete Modigliani
The document discusses issues with a section of the Weapon System Acquisition Reform Act (WSARA) that requires major defense acquisition programs to establish cost and schedule baselines early, at Milestone A. [1] Baselining programs so early compounds problems, as estimates at Milestone A are developed years before technologies are mature or requirements are solidified. [2] Exceeding the initial baseline estimates by 25% now triggers a Nunn-McCurdy breach, which could terminate the program. [3] This motivates program managers to inflate estimates and increases costs, risks, and bureaucracy.
NASA is required to regularly report cost and schedule performance data to Congress, the Office of Management and Budget (OMB), and the Government Accountability Office (GAO) for its major projects. The types of reports include baseline reports, current estimate reports, threshold reports if growth exceeds certain levels, and rebaseline reports if costs grow by 30% or more. NASA manages this reporting by linking it to agency policies and using a single standardized data tracking system to provide consistent information across all reports on project-managed costs, unfunded enhancements held by program offices, and other agency-managed costs.
BNamericas Project Risk Analytics - June 2015 Inaugural reportnatans
The document summarizes the findings of a report analyzing cost overruns and delays in the top 100 infrastructure projects in Latin America with a total original cost estimate of $242 billion. It finds that the projects have collectively experienced cost increases of over $133 billion (51% overrun) and delays in 58% of the projects. The largest cost overruns and delays have occurred in large, complex projects like refineries and hydroelectric dams. Later stage projects have seen greater increases to cost estimates, calling into question the traditional view that uncertainty decreases as projects progress. Financial management issues were the most commonly cited factor for cost increases.
To deliver projects on time and on budget, strong project planning and resource management is needed. Statistics show the average project experiences 24% cost and time overruns. Organizations that invest in project planning enjoy a 90% success rate for IT projects and save 12-19% of original budgets. Good project planning focuses on information management - assembling, analyzing, and disseminating information so that interdependencies and timelines are clear. An accurate plan concentrates on critical tasks that could delay the project. Risks should be actively identified and managed with contingencies built into plans. Forecasting allows advance warning of busy periods to move staff or prioritize tasks, improving predictions of resource needs and reducing variation between planned and actual resources.
Does Better Scheduling Drive Execution Success?Acumen
This white paper outlines the statistics that support the relationship between better schedules and better execution. Based on an ongoing research study conducted by Acumen, this paper is the hard proof of a need for better planning across all industries.
Why Scheduling Mustn't Be Allowed to Become an Extinct ScienceAcumen
The document discusses the importance of effective project planning and scheduling. It argues that [1] unrealistic project plans and [2] inability to execute plans according to schedule are the main causes of project failure. It advocates for top-down, deliverable-based planning with uncertainty analysis to create realistic plans. It also recommends visualizing projects in phases and disciplines to aid understanding, and using metrics and trend analysis to continuously improve plans and monitor performance. The goal is to tie effective planning to successful execution to reduce project failures.
Estimate costs in fragile and transitional contexts - July - 2014Abdulrahman Ismail
The document discusses estimating costs for construction projects in fragile and transitional contexts using the three-point estimating technique. It explains that the technique uses optimistic, most likely, and pessimistic estimates to define a cost range and improve accuracy. As an example, the document applies the technique to estimate costs for a basic education project in Sudan. It calculates optimistic, most likely, and pessimistic cost scenarios accounting for risks like instability and variables like inflation. The final cost estimate of $157,192 is close to the most likely amount, demonstrating how the simple three-point technique can reliably estimate costs while considering the uncertainties of fragile contexts.
The document discusses various methods for estimating project times and costs, including top-down and bottom-up approaches. Top-down approaches include consensus methods, ratio methods, and the apportion method. Bottom-up approaches include the template method, parametric procedures applied to specific tasks, and detailed estimates for work packages. Estimates should be refined as risks and details become clearer, and contingency funds and time buffers can be added to offset uncertainty.
This document discusses approaches to managing uncertainty in projects. It begins by defining key terms like uncertainty, ambiguity, complexity, and risk. It describes exploring the project environment to understand factors that contribute to uncertainty. Approaches to managing uncertainty include gathering more information, preparing for multiple potential outcomes, using set-based design to explore alternatives, and building resilience to adapt to changes. Metrics and visual controls help monitor uncertainty and track the effectiveness of response plans.
C:\documents and settings\ckampschulte\desktop\in sync risk range analysisInSync Conference
This document summarizes and compares two capital projects, Case A and Case B, in terms of their risk management approaches. Case A took a single point estimate approach which resulted in fixed contingency amounts, while Case B took a range analysis workshop approach to identify and quantify risks. Case B developed cost and schedule ranges and analyzed key risk drivers, allowing them to model outcomes more accurately. Best practice for risk analysis involves taking a risk-driven, integrated cost-schedule approach to understand contingency needs and prioritize top risks.
This document discusses the key components of establishing an effective monitoring and evaluation (M&E) system for international development programs, including: 1) conducting a causal analysis to identify the problem, its causes, and desired outcomes of the program; 2) developing a logical framework (logframe) that outlines the goals, objectives, indicators, and assumptions of the program; and 3) creating an indicator matrix that defines each indicator and outlines the data collection methods, responsibilities, analysis, and use of data. The document emphasizes starting M&E planning early and involving stakeholders to ensure the feasibility and ownership of the M&E system.
The document discusses recommendations from two groups, Group A and Group B, on several project management topics:
1. Training - Establishing core competencies and training for project managers while allowing flexibility for contractors.
2. Tailoring - Providing guidance and incentives for tailoring projects to ensure costs and schedules are optimized while still meeting requirements.
3. Performance measurement - Developing tailored performance measurement tools based on project size and complexity rather than one-size-fits-all earned value systems.
4. Risk management - Standardizing practices around contingency and risk tracking and sharing lessons learned across projects.
Visual controls are used in lean environments to illustrate processes and compare actual performance to expected performance. Task boards and burn charts are examples of visual controls that can be used to track work status and project velocity. Effective use of visual controls provides transparency into processes and tasks.
Planning projects usually starts with tasks and milestones. The planner gathers this information from the participants – customers, engineers, subject matter experts. This information is usually arranged in the form of activities and milestones. PMBOK defines “project time management” in this manner. The activities are then sequenced according to the projects needs and mandatory dependencies.
Increasing the Probability of Project SuccessGlen Alleman
This document discusses principles and practices for increasing the probability of project success by managing risk from uncertainty. It defines risk as the effect of uncertainty on objectives. There are two types of uncertainty - epistemic (reducible) and aleatory (irreducible). Risk from epistemic uncertainty can be reduced through work on the program, while risk from aleatory uncertainty requires establishing margins. The document argues that effective risk management is needed to deliver capabilities on time and budget by identifying risks, understanding their interactions and impacts, and implementing risk handling strategies. This increases the likelihood of project success by preventing problems, improving quality, enabling better resource use, and promoting teamwork.
Process Flow and Narrative for Agile+PPMGlen Alleman
This document describes how an organization integrates agile software development practices with earned value management (EVM) to provide program status updates. It outlines a process that begins with developing a rough order of magnitude estimate of features needed. These features are then prioritized, mapped to a product roadmap and product backlog. Stories are developed from features and estimated, and tasks are estimated in hours. Physical percent complete data from tasks in Rally is used to calculate EVM metrics to inform stakeholders.
From Principles to Strategies for Systems EngineeringGlen Alleman
From Principles to Strategies How to apply Principles, Practices, and Processes of Systems Engineering to solve complex technical, operational,
and organizational problems
Building a Credible Performance Measurement BaselineGlen Alleman
The document discusses establishing a credible Performance Measurement Baseline (PMB) for programs by integrating technical and programmatic plans. It recommends starting with a Work Breakdown Structure (WBS) that identifies system elements, associated risks, and processes to produce outcomes. An Integrated Master Plan (IMP) should then define how system elements mature at Program Events, with Measures of Effectiveness (MOEs) and Measures of Performance (MOPs) assigned. Finally, an Integrated Master Schedule (IMS) should arrange tasks to increase technical maturity, identify reducible and irreducible risks, and establish a risk-adjusted PMB to increase the probability of program success. Connecting these elements through the WBS, IMP and IMS
Integrated master plan methodology (v2)Glen Alleman
The document describes a methodology for developing an Integrated Master Plan (IMP). It outlines five conditions an IMP must meet, five steps in the development process, five common questions about IMP development, five common mistakes, and provides five templates/samples for key IMP sections. The methodology is intended to help program and project teams create effective IMPs that integrate execution plans and align with contractual requirements.
Capabilities‒Based Planning the capabilities needed to accomplish a mission or fulfill a business strategy
Only when capabilities are defined can we start with requirements elicitation
Starting with the development of a Rough Order of Magnitude (ROM) estimate of work and duration, creating the Product Roadmap and Release Plan, the Product and Sprint Backlogs, executing and statusing the Sprint, and informing the Earned Value Management Systems, using Physical Percent Complete of progress to plan.
Program Management Office Lean Software Development and Six SigmaGlen Alleman
Successfully combining a PMO, Agile, and Lean / 6 starts with understanding what benefit each paradigm brings to the table. Architecting a solution for the enterprise requires assembling a “Systems” with processes, people, and principles – all sharing the goal of business improvement.
This resource document describes the Program Governance Road map for product development, deployment, and sustainment of products and services in compliance with CMS guidance, ITIL IT management, CMMI best practices, and other guidance to assure high quality software is deployed for sustained operational success in mission critical domains.
The document discusses the development of an Integrated Master Plan (IMP) as the basis for an Integrated Master Schedule (IMS) for a program. It outlines a 6-step process for developing the IMP and IMS that includes understanding requirements, developing a product structure, forming integrated product teams, creating the IMP, creating the IMS, and developing the basis of estimate. It describes artifacts like the product tree, work breakdown structure, statement of work, and their relationships. It also outlines responsibilities of the program management team, integrated product team leads, and program planning and controls.
This document discusses the need for an underlying theory of software project management that can better handle uncertainties. It argues traditional, linear project management models are not well-suited for today's complex, rapidly changing software projects. Adaptive control theory may provide a better model than traditional approaches. Adaptive control systems and agile development processes use feedback loops and emergent solutions to adjust to changes in dynamics, disturbances, or other unplanned events, similar to how project management needs to respond.
Increasing the Probability of Project Success with Five Principles and PracticesGlen Alleman
There are many approaches to managing projects in every domain.
This seminar lays the foundations for increasing the probability of project success, no matter the domain, what technology, what approach to delivering the outcomes of the project.
The principles of this approach are immutable.
The practices for implementing the principles are universally applicable.
Each chart in this presentation, contains guidance that can be applied to your project, no matter the domain.
In our short hour here, we’re going to cover a lot of material.
The bibliography contains the supporting materials we can tailor to your individual project
Generating privacy-protected synthetic data using Secludy and MilvusZilliz
During this demo, the founders of Secludy will demonstrate how their system utilizes Milvus to store and manipulate embeddings for generating privacy-protected synthetic data. Their approach not only maintains the confidentiality of the original data but also enhances the utility and scalability of LLMs under privacy constraints. Attendees, including machine learning engineers, data scientists, and data managers, will witness first-hand how Secludy's integration with Milvus empowers organizations to harness the power of LLMs securely and efficiently.
Your One-Stop Shop for Python Success: Top 10 US Python Development Providersakankshawande
Simplify your search for a reliable Python development partner! This list presents the top 10 trusted US providers offering comprehensive Python development services, ensuring your project's success from conception to completion.
Connector Corner: Seamlessly power UiPath Apps, GenAI with prebuilt connectorsDianaGray10
Join us to learn how UiPath Apps can directly and easily interact with prebuilt connectors via Integration Service--including Salesforce, ServiceNow, Open GenAI, and more.
The best part is you can achieve this without building a custom workflow! Say goodbye to the hassle of using separate automations to call APIs. By seamlessly integrating within App Studio, you can now easily streamline your workflow, while gaining direct access to our Connector Catalog of popular applications.
We’ll discuss and demo the benefits of UiPath Apps and connectors including:
Creating a compelling user experience for any software, without the limitations of APIs.
Accelerating the app creation process, saving time and effort
Enjoying high-performance CRUD (create, read, update, delete) operations, for
seamless data management.
Speakers:
Russell Alfeche, Technology Leader, RPA at qBotic and UiPath MVP
Charlie Greenberg, host
Dandelion Hashtable: beyond billion requests per second on a commodity serverAntonios Katsarakis
This slide deck presents DLHT, a concurrent in-memory hashtable. Despite efforts to optimize hashtables, that go as far as sacrificing core functionality, state-of-the-art designs still incur multiple memory accesses per request and block request processing in three cases. First, most hashtables block while waiting for data to be retrieved from memory. Second, open-addressing designs, which represent the current state-of-the-art, either cannot free index slots on deletes or must block all requests to do so. Third, index resizes block every request until all objects are copied to the new index. Defying folklore wisdom, DLHT forgoes open-addressing and adopts a fully-featured and memory-aware closed-addressing design based on bounded cache-line-chaining. This design offers lock-free index operations and deletes that free slots instantly, (2) completes most requests with a single memory access, (3) utilizes software prefetching to hide memory latencies, and (4) employs a novel non-blocking and parallel resizing. In a commodity server and a memory-resident workload, DLHT surpasses 1.6B requests per second and provides 3.5x (12x) the throughput of the state-of-the-art closed-addressing (open-addressing) resizable hashtable on Gets (Deletes).
Ivanti’s Patch Tuesday breakdown goes beyond patching your applications and brings you the intelligence and guidance needed to prioritize where to focus your attention first. Catch early analysis on our Ivanti blog, then join industry expert Chris Goettl for the Patch Tuesday Webinar Event. There we’ll do a deep dive into each of the bulletins and give guidance on the risks associated with the newly-identified vulnerabilities.
Taking AI to the Next Level in Manufacturing.pdfssuserfac0301
Read Taking AI to the Next Level in Manufacturing to gain insights on AI adoption in the manufacturing industry, such as:
1. How quickly AI is being implemented in manufacturing.
2. Which barriers stand in the way of AI adoption.
3. How data quality and governance form the backbone of AI.
4. Organizational processes and structures that may inhibit effective AI adoption.
6. Ideas and approaches to help build your organization's AI strategy.
Freshworks Rethinks NoSQL for Rapid Scaling & Cost-EfficiencyScyllaDB
Freshworks creates AI-boosted business software that helps employees work more efficiently and effectively. Managing data across multiple RDBMS and NoSQL databases was already a challenge at their current scale. To prepare for 10X growth, they knew it was time to rethink their database strategy. Learn how they architected a solution that would simplify scaling while keeping costs under control.
Northern Engraving | Nameplate Manufacturing Process - 2024Northern Engraving
Manufacturing custom quality metal nameplates and badges involves several standard operations. Processes include sheet prep, lithography, screening, coating, punch press and inspection. All decoration is completed in the flat sheet with adhesive and tooling operations following. The possibilities for creating unique durable nameplates are endless. How will you create your brand identity? We can help!
Driving Business Innovation: Latest Generative AI Advancements & Success StorySafe Software
Are you ready to revolutionize how you handle data? Join us for a webinar where we’ll bring you up to speed with the latest advancements in Generative AI technology and discover how leveraging FME with tools from giants like Google Gemini, Amazon, and Microsoft OpenAI can supercharge your workflow efficiency.
During the hour, we’ll take you through:
Guest Speaker Segment with Hannah Barrington: Dive into the world of dynamic real estate marketing with Hannah, the Marketing Manager at Workspace Group. Hear firsthand how their team generates engaging descriptions for thousands of office units by integrating diverse data sources—from PDF floorplans to web pages—using FME transformers, like OpenAIVisionConnector and AnthropicVisionConnector. This use case will show you how GenAI can streamline content creation for marketing across the board.
Ollama Use Case: Learn how Scenario Specialist Dmitri Bagh has utilized Ollama within FME to input data, create custom models, and enhance security protocols. This segment will include demos to illustrate the full capabilities of FME in AI-driven processes.
Custom AI Models: Discover how to leverage FME to build personalized AI models using your data. Whether it’s populating a model with local data for added security or integrating public AI tools, find out how FME facilitates a versatile and secure approach to AI.
We’ll wrap up with a live Q&A session where you can engage with our experts on your specific use cases, and learn more about optimizing your data workflows with AI.
This webinar is ideal for professionals seeking to harness the power of AI within their data management systems while ensuring high levels of customization and security. Whether you're a novice or an expert, gain actionable insights and strategies to elevate your data processes. Join us to see how FME and AI can revolutionize how you work with data!
Fueling AI with Great Data with Airbyte WebinarZilliz
This talk will focus on how to collect data from a variety of sources, leveraging this data for RAG and other GenAI use cases, and finally charting your course to productionalization.
zkStudyClub - LatticeFold: A Lattice-based Folding Scheme and its Application...Alex Pruden
Folding is a recent technique for building efficient recursive SNARKs. Several elegant folding protocols have been proposed, such as Nova, Supernova, Hypernova, Protostar, and others. However, all of them rely on an additively homomorphic commitment scheme based on discrete log, and are therefore not post-quantum secure. In this work we present LatticeFold, the first lattice-based folding protocol based on the Module SIS problem. This folding protocol naturally leads to an efficient recursive lattice-based SNARK and an efficient PCD scheme. LatticeFold supports folding low-degree relations, such as R1CS, as well as high-degree relations, such as CCS. The key challenge is to construct a secure folding protocol that works with the Ajtai commitment scheme. The difficulty, is ensuring that extracted witnesses are low norm through many rounds of folding. We present a novel technique using the sumcheck protocol to ensure that extracted witnesses are always low norm no matter how many rounds of folding are used. Our evaluation of the final proof system suggests that it is as performant as Hypernova, while providing post-quantum security.
Paper Link: https://eprint.iacr.org/2024/257
HCL Notes und Domino Lizenzkostenreduzierung in der Welt von DLAUpanagenda
Webinar Recording: https://www.panagenda.com/webinars/hcl-notes-und-domino-lizenzkostenreduzierung-in-der-welt-von-dlau/
DLAU und die Lizenzen nach dem CCB- und CCX-Modell sind für viele in der HCL-Community seit letztem Jahr ein heißes Thema. Als Notes- oder Domino-Kunde haben Sie vielleicht mit unerwartet hohen Benutzerzahlen und Lizenzgebühren zu kämpfen. Sie fragen sich vielleicht, wie diese neue Art der Lizenzierung funktioniert und welchen Nutzen sie Ihnen bringt. Vor allem wollen Sie sicherlich Ihr Budget einhalten und Kosten sparen, wo immer möglich. Das verstehen wir und wir möchten Ihnen dabei helfen!
Wir erklären Ihnen, wie Sie häufige Konfigurationsprobleme lösen können, die dazu führen können, dass mehr Benutzer gezählt werden als nötig, und wie Sie überflüssige oder ungenutzte Konten identifizieren und entfernen können, um Geld zu sparen. Es gibt auch einige Ansätze, die zu unnötigen Ausgaben führen können, z. B. wenn ein Personendokument anstelle eines Mail-Ins für geteilte Mailboxen verwendet wird. Wir zeigen Ihnen solche Fälle und deren Lösungen. Und natürlich erklären wir Ihnen das neue Lizenzmodell.
Nehmen Sie an diesem Webinar teil, bei dem HCL-Ambassador Marc Thomas und Gastredner Franz Walder Ihnen diese neue Welt näherbringen. Es vermittelt Ihnen die Tools und das Know-how, um den Überblick zu bewahren. Sie werden in der Lage sein, Ihre Kosten durch eine optimierte Domino-Konfiguration zu reduzieren und auch in Zukunft gering zu halten.
Diese Themen werden behandelt
- Reduzierung der Lizenzkosten durch Auffinden und Beheben von Fehlkonfigurationen und überflüssigen Konten
- Wie funktionieren CCB- und CCX-Lizenzen wirklich?
- Verstehen des DLAU-Tools und wie man es am besten nutzt
- Tipps für häufige Problembereiche, wie z. B. Team-Postfächer, Funktions-/Testbenutzer usw.
- Praxisbeispiele und Best Practices zum sofortigen Umsetzen
Programming Foundation Models with DSPy - Meetup SlidesZilliz
Prompting language models is hard, while programming language models is easy. In this talk, I will discuss the state-of-the-art framework DSPy for programming foundation models with its powerful optimizers and runtime constraint system.
Page 1 of 13
A Cure For Unanticipated Cost and Schedule Growth
Thomas J. Coonce
Thomas J. Coonce, LLC
tom.coonce@verizon.net
+1 703 362 2568
Glen B. Alleman
Niwot Ridge LLC
glen.alleman@niwotridge.com
+1 303 241 9633
ABSTRACT
Delivering programs with less capability than promised, while exceeding planned costs and durations,
distorts decision making, drives cost growth in other programs, undermines the Federal
government’s credibility with taxpayers, and contributes to the public’s negative perception of these
programs.
Many reasons for cost and schedule growth have been hypothesized and documented. The authors
review some of these reasons and propose that government and contractors use the historical variability of
past programs to establish cost and schedule estimates at the outset, and periodically update these
estimates with current risks, to increase the probability of program success. For this to happen, the
authors recommend changes to estimating, acquisition, and contracting processes.
INTRODUCTION
Publicly funded programs have consistently delivered less capability than originally planned, cost
more than budgeted, and finished late. Because public funds are limited, new programs cannot be started, or
worse, they are started and then must be stretched out to fund the overruns of programs already in
development. In short, the entire portfolio of programs is affected when most of the programs require more
funds and take longer than planned. Funding authorizers become skeptical of new proposals because the
estimates are not deemed credible. Choices among alternatives are distorted because the estimates are in error, or
worse, decision-makers know the numbers are biased low and make best guesses, which result in poor
choices. Furthermore, excessive growth in public programs damages the credibility of the estimating
community and contributes to the public’s negative perception of these programs.
This paper discusses the never-ending problem of cost and schedule growth in publicly funded
programs and offers some recommendations to improve performance within the current Department of
Defense (DOD) acquisition environment. This paper highlights the magnitude of cost and schedule growth
within the DOD and the National Aeronautics and Space Administration (NASA) and summarizes some of
the known and hypothesized reasons for growth. The paper also summarizes some of the initiatives that
have been put in place to address estimating issues, but suggests more can be done: improve the estimates,
communicate them effectively, and set them high enough to reflect historical experience so as
to minimize or eliminate growth in future programs.
The authors recommend specific changes to the way initial estimates are created and agreed to. In
particular, this paper discusses the value of developing activity-based estimates adjusted to account for
issues that have occurred on past similar programs.
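The activity-based adjustment recommended above can be sketched in a few lines. The activity names, bottom-up costs, and adjustment factors below are illustrative assumptions, not data from any actual program:

```python
# Hypothetical activity-based estimate: each activity carries a bottom-up
# cost plus an adjustment factor derived from issues observed on similar
# past programs (all figures illustrative).
activities = [
    # (name, bottom-up cost $M, historical adjustment factor)
    ("Requirements & design", 20.0, 1.10),  # past programs grew ~10% here
    ("Software development",  45.0, 1.35),  # integration issues drove ~35% growth
    ("Hardware fabrication",  30.0, 1.20),
    ("Test & verification",   25.0, 1.25),
]

raw_total = sum(cost for _, cost, _ in activities)
adjusted_total = sum(cost * factor for _, cost, factor in activities)

print(f"Raw bottom-up estimate: ${raw_total:.1f}M")       # $120.0M
print(f"History-adjusted total: ${adjusted_total:.1f}M")  # $150.0M
```

The point of the sketch is that the adjustment is applied per activity, where the historical evidence lives, rather than as a single top-level markup.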
The authors propose that programs be budgeted at the 70 percent probability of meeting both cost and
schedule targets – the Joint Confidence Level [1]. They show how this can be done, including active input from
the contracting community, and how contract management can be improved through the use of risk
registers and periodic probability statements about Estimates At Completion (EAC) and Estimated
Completion Dates (ECD). They also enumerate the implementation challenges, and how they can be
overcome.
[1] The Joint Confidence Level connects Schedule, Cost, Risk, and Uncertainty in a single assessment of the probability
of program success, http://www.nasa.gov/pdf/724371main_76646-Risk_Analysis_Brochure-Final6.pdf
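A Joint Confidence Level can be illustrated with a small Monte Carlo sketch. The distributions, correlation, and dollar figures below are illustrative assumptions, not data from any program: sample correlated cost and schedule outcomes, then search for a budget-and-date pair that is jointly met in 70 percent of trials.

```python
import math
import random

random.seed(1)

# Hypothetical program: lognormal cost (around $100M) and schedule (around
# 48 months), positively correlated because a shared risk driver (e.g.,
# scope growth) pushes both up together.
N = 20_000
samples = []
for _ in range(N):
    z_common = random.gauss(0, 1)                     # shared risk driver
    z_cost = 0.8 * z_common + 0.6 * random.gauss(0, 1)
    z_sched = 0.8 * z_common + 0.6 * random.gauss(0, 1)
    cost = 100.0 * math.exp(0.25 * z_cost)            # $M
    sched = 48.0 * math.exp(0.15 * z_sched)           # months
    samples.append((cost, sched))

def joint_confidence(budget, target_months):
    """Fraction of trials in which BOTH the cost and schedule targets are met."""
    hits = sum(1 for c, s in samples if c <= budget and s <= target_months)
    return hits / len(samples)

# Walk a simple diagonal of budget/date pairs until the joint confidence
# first reaches 70%.
for k in range(100, 200):
    budget, target = float(k), 48.0 * k / 100.0
    if joint_confidence(budget, target) >= 0.70:
        print(f"~70% JCL at budget ${budget:.0f}M and {target:.1f} months")
        break
```

Note that budgeting cost and schedule separately at their 70th percentiles would overstate confidence: the joint probability of meeting both targets is lower than either marginal, which is exactly what the JCL captures.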
COST AND SCHEDULE GROWTH – A PERSISTENT PROBLEM THROUGH THE DECADES
“In 1982, an unnamed witness at a House Armed Services Committee hearing stated, ‘Enough material has been
written on the subject of cost growth during the last ten years to fill a Minuteman silo.’ Unfortunately, cost
growth is still with us. In the decade since that testimony, enough additional information on cost growth has
been written to fill a second Minuteman silo.” [2]
Table 1 summarizes cost growth associated with NASA and DOD research programs [3]. Although this
information is somewhat dated, the growth trend has not been materially reduced since the time this study
was completed. Figure 1.0 below illustrates the cost and schedule growth of 30 NASA missions that were
developed in the 2007 – 2009 time frame [4].
Table 1.0. Cost Growth Experience, 2004 NASA Study showing increasing growth over the decades in spite of all
efforts to control this growth through improved insight and management processes.
Figure 1.0 – Cost and Schedule Growth of 30 NASA Missions, 2007 - 2009
[2] Cost Growth in DOD Major Programs: A Historical Perspective, Col. Harry Calcutt, April 1993,
http://www.dtic.mil/dtic/tr/fulltext/u2/a276950.pdf
[3] Unpublished 2004 NASA study performed by Drs. Joseph Hamaker and Mathew Schaffer.
[4] Unpublished 2009 NASA study.
[Figure 1.0 data, recovered from the chart:
Development Cost Growth – from Phase B Start: 42%; from PDR: 29%; from CDR: 21%.
Phase B/C/D Schedule Growth – from Phase B Start: 29%; from PDR: 23%; from CDR: 19%.]

[Table 1.0 data:]

Study                   Cost/Budget Growth       Percent of Projects Which
                        Average      Median      Experienced Growth
NASA in the 90s         36%          26%         78%
NASA in the 70s         43%          26%         75%
NASA in the 80s (GAO)   83%          60%         89%
DoD RDT&E               45%          27%         76%
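The historical growth averages in Table 1.0 can be applied directly as adjustment factors, which is the simplest form of the paper's recommendation. A minimal sketch (the $200M point estimate is an illustrative assumption; the growth figures are from Table 1.0):

```python
# Average historical cost growth by program class (Table 1.0).
historical_growth = {
    "NASA 1970s": 0.43,
    "NASA 1980s (GAO)": 0.83,
    "NASA 1990s": 0.36,
    "DoD RDT&E": 0.45,
}

def adjusted_estimate(point_estimate, program_class):
    """Scale a single-point estimate by the average historical growth
    of analogous past programs."""
    return point_estimate * (1.0 + historical_growth[program_class])

# A hypothetical $200M point estimate for a DoD RDT&E-like effort:
print(f"{adjusted_estimate(200.0, 'DoD RDT&E'):.1f}")  # 290.0
```

A single top-level factor like this is crude; the activity-based approach discussed earlier applies the same idea at the level where the historical issues actually occurred.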
Reasons for Cost and Schedule Growth
As stated earlier, volumes have been written about programs failing to provide promised capabilities at
the agreed cost and projected completion dates. Most of these studies attempted to categorize the root
causes, so corrective actions could be taken. Some of the causes are related to estimating and acquisition
initiation activities while others focused on issues associated with management and / or contract execution.
One of the more comprehensive studies was performed by Colonel Harry Calcutt in 1993 [5]. Following are
Col. Calcutt’s reasons for growth, organized into five categories.
Requirements:
o Poor initial requirement definition
o Poor performance/cost trade-off during development
o Changes in quantity requirements
Estimating:
o Errors due to limitations in estimating procedures
o Failure to understand and account for technical risks
o Poor inflation estimates
o Top down pressure to reduce estimates
o Lack of valid independent cost estimates
Program Management:
o Lack of program management expertise
o Mismanagement/human error
o Over optimism
o Schedule concurrency
o Program stretch outs to keep production lines open
Contracting:
o Lack of competition
o Contractor buy-in
o Use of wrong type of contract
o Inconsistent contract management/admin procedures
o Too much contractor oversight
o Waste
o Excess profits
o Contractors overstaffed
o Contractor indirect costs unreasonable
o Taking too long to resolve undefinitized contracts
Budgeting:
o Funding instabilities caused by trying to fund too many programs
o Funding instabilities caused by congressional decisions
o Inefficient production rates due to stretching out programs
o Failure to fund for management reserves
o Failure to fund programs at most likely cost
5 Ibid., Calcutt, p. 17.
A second study, a recent report from DOD's Office of Performance Assessment and Root Cause Analyses
(PARCA), categorized growth into two broad categories:6
Inception:
o Unrealistic performance expectations
o Unrealistic baseline estimates for cost or schedule
o Immature technologies or excessive manufacturing or integration risk
Execution:
o Unanticipated design, engineering, manufacturing, or technology integration issues
o Changes in procurement quantities
o Inadequate program funding or funding instability
o Poor performance by government or contractor personnel
INITIATIVES TO REDUCE COST AND SCHEDULE GROWTH
Over the last five decades, the DOD, NASA, and other civilian agencies have implemented numerous
policy and regulation changes to reduce growth. This paper does not attempt to enumerate all the initiatives
nor assess their effectiveness in combating growth. However, this paper does address those initiatives
dealing with estimating and the processes used to establish and manage contractor baseline estimates.
A few of the more important initiatives to improve the ability of the cost estimating community to
produce better estimates were:
Need for independent estimating at all levels;
Standardized Work Breakdown Structures (WBS) for military systems e.g., MIL-STD 881, and
similar structures for NASA and the National Reconnaissance Office;
Routine cost collection of analogous historical systems upon which to base future estimates;
Use of a Cost Analysis Requirements Description (CARD) (or similar documents) to be used as a
basis for the estimates;
Use of technical and program management subject matter expertise to assist in developing
estimates; and
Policy requirements to budget to higher confidence levels (80 percent cost confidence level within
DOD and 70 percent joint cost and schedule confidence level for NASA projects).
These initiatives are believed to have helped estimators create more credible estimates. Using the actual
experience of similar past projects has been particularly useful, as it helps estimators explain the basis for
their estimates and provides leadership with a reference by which to judge proposed new systems.
However, these data are product-based (based on the systems' WBS), while Program Management Offices
(PMOs) often think in terms of the materiel developers' (contractors') activities, so they find the product-based
estimates difficult to consume, particularly when the independent estimates are higher than those the
PMO receives from one or more contractors. PMOs frequently discount the independent estimates in favor
of the contractor-developed estimates they find easier to understand.
Figure 2 depicts the traditional method in which the cost estimating community develops its estimates.
The chart shows that the cost estimate is a function of the value of the cost-driving parameter as well as the
variability of the costs that have been incurred for the product in the past. The figure illustrates that the
higher the input parameter, the higher the cost, but that the estimated cost also depends on the point within
the distribution of cost values that have been incurred on past similar programs. Cost estimators prefer to
provide estimates at the mean or higher for a given WBS element. Estimates created in this way are
considered standard, acceptable practice and result in credible estimates. However, as noted earlier,
the estimates are not always in the same language as the PMO or the contractors and thus are difficult to
consume.
6 Report to Congress on Performance Assessment and Root Cause Analyses, Office of the Under Secretary of Defense for Acquisition, Technology and Logistics, March 2014, p. 7, http://www.acq.osd.mil/parca/docs/2014-parca-report-to-congress.pdf
Figure 2.0 – Traditional method of estimating by the WBS product or process defines the variances on the cost
estimate compared to the drivers of the cost, in this example the "weight" of the product being produced. The weight
becomes a "technical performance measure." Being outside the weight bounds will cost more for the product to be
produced.
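The parametric relationship Figure 2 describes can be sketched as follows. The cost estimating relationship (CER) coefficients and the lognormal scatter below are hypothetical stand-ins, not values from this paper; the point is that the estimate depends both on the driver (weight) and on where in the historical distribution the estimator chooses to sit (mean, 70th percentile, etc.):

```python
import random

random.seed(1)

def parametric_cost(weight_kg, n_trials=20000):
    """Hypothetical CER: cost scales with weight, with lognormal scatter
    representing the variability of past programs around the regression line."""
    point = 2.5 * weight_kg ** 0.9          # illustrative CER value, $K
    trials = sorted(point * random.lognormvariate(0.0, 0.25)
                    for _ in range(n_trials))
    mean = sum(trials) / n_trials
    p70 = trials[int(0.70 * n_trials)]      # 70th-percentile estimate
    return point, mean, p70

point, mean, p70 = parametric_cost(1000.0)
# With right-skewed (lognormal) scatter the mean and the 70th percentile
# both sit above the regression-line point value, which is why estimators
# prefer to quote the mean or higher for a given WBS element.
print(mean > point, p70 > mean)
```

A usage note: swapping the driver (weight) for any other cost-driving parameter leaves the structure unchanged; only the CER and the historical scatter differ.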
Figure 3 depicts an activity-based estimate built up using the schedule of activities and the resources
associated with those activities. This is known as a cost- or resource-loaded schedule, based upon a set
of expected activity durations and costs for the resources. This is the type of estimate that contractors
submit in response to RFPs and that is used to set contract baselines for reporting progress to the PMOs. When
independent WBS-based estimates are higher than those submitted by contractors, PMOs find it nearly
impossible to analyze where the extra cost should be added to the contractors' activity-based estimates.
Thus, the WBS-based estimates are not as "consumable" as ones that are based on a set of generally agreed
activities.
Figure 3.0 – Sample activity-based estimate derived from the Integrated Master Schedule (IMS). Costs are assigned
for each work element in the IMS, usually within a Work Package, using the WBS number to identify this unique
assignment. These costs are then "rolled up" to the Control Account and then to the program at the top. These costs
are reported in DI-MGMT-81861's IPMR Format 1.
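The roll-up Figure 3 describes can be sketched with hypothetical work packages. The WBS numbers and dollar values below are illustrative only; the mechanism shown is that work-package costs aggregate to Control Accounts via the WBS numbering, and Control Accounts sum to the program total:

```python
from collections import defaultdict

# Hypothetical work-package costs ($K) keyed by WBS number; the first two
# levels of the WBS number identify the owning Control Account.
work_packages = {
    "1.1.1": 120.0,
    "1.1.2": 80.0,
    "1.2.1": 200.0,
    "1.2.2": 150.0,
}

control_accounts = defaultdict(float)
for wbs, cost in work_packages.items():
    ca = ".".join(wbs.split(".")[:2])   # e.g. "1.1.1" rolls up to "1.1"
    control_accounts[ca] += cost

program_total = sum(control_accounts.values())  # top-of-program cost
print(dict(control_accounts), program_total)    # {'1.1': 200.0, '1.2': 350.0} 550.0
```

The same roll-up is what makes the activity-based and WBS-based views reconcilable: each assignment carries a WBS number, so the same costs can be summed either way.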
A POSSIBLE SOLUTION TO SOLVE COST AND SCHEDULE GROWTH
While activity-based estimates are easier to explain, they still need to be credible; to be so, they too
must be informed by the variance of the activity durations and costs from similar historical programs. The
variance of the cost associated with an activity-based plan has two components – costs that vary with the
passage of time and costs that are independent of time. A good example of a time-independent cost is
material. An example of a time-dependent cost would be systems engineering or program
management, which must be in place as long as the program is under development. Similar to the variance
in the cost of each WBS element, each activity has a variance in duration and in the cost associated with those
durations (both time-dependent and time-independent). In the end, the cost and cost variance of a given
system aggregated by WBS must be the same as the cost and cost variance collected by
activities. This concept is depicted in Figure 4.0 below.
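A minimal sketch of the two cost components, with hypothetical rates and durations: for each activity, the time-independent cost (e.g., material) is fixed, while the time-dependent cost (e.g., program management) scales with the sampled duration, so duration variance flows directly into cost variance:

```python
import random
import statistics

random.seed(2)

# Hypothetical activities:
#   (name, (material $K, burn rate $K/month, duration months, duration spread))
activities = [
    ("design",    (100.0, 40.0, 6.0, 1.5)),
    ("build",     (400.0, 60.0, 9.0, 2.0)),
    ("integrate", ( 50.0, 55.0, 4.0, 1.0)),
]

def one_trial():
    """One sampled program cost: fixed material plus rate x sampled duration."""
    total = 0.0
    for _, (material, rate, dur, spread) in activities:
        duration = max(0.5, random.gauss(dur, spread))
        total += material + rate * duration   # time-independent + time-dependent
    return total

trials = [one_trial() for _ in range(20000)]
print(round(statistics.mean(trials)), round(statistics.stdev(trials)))
```

Because the same activity costs carry WBS numbers, summing these trials by WBS element or by activity yields the same total cost distribution, which is the point Figure 4 makes.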
It should be noted that the variances of cost and schedule reflect the variation that occurs in the "normal"
process of developing similar programs. This is referred to as the natural uncertainty, and
estimates of future programs must take it into account. Some researchers have labeled this uncertainty
"irreducible" because a manager cannot take specific actions to reduce the "risks"; rather, the manager
must simply recognize that this variation occurs as part of the normal process of development.7 The amount of
7 Irreducible Uncertainty – a Fact of Life for Reserve Estimates, John D. Wright, SPE Annual Technical Conference and Exhibition, 5-8 October, Denver, Colorado, Society of Petroleum Engineers, 2003.
uncertainty that "should" be included has been discussed within the community over the past ten years.
The Weapon Systems Acquisition Reform Act of 2009 specified that all major programs within DOD
should be budgeted at the 80 percent cost confidence level. That policy for the DOD was recently rescinded
in favor of point estimates that reflect "… a confidence level such that it provides a high degree of
confidence that the program can be completed without the need for significant adjustment to program
budgets."8 NASA requires all programs to be budgeted such that the programs possess a 70 percent
probability of achieving both cost and schedule targets. Assuming cost and schedule are independent and
identically distributed, this implies an 84 percent confidence level for cost separately.
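The arithmetic behind the 84 percent figure follows directly from the independence assumption just stated: the joint probability is the product of the two separate probabilities, so each separate level is the square root of the joint level:

```python
import math

joint = 0.70
# If cost and schedule confidence levels are independent and equal,
# joint = p * p, so each separate confidence level is sqrt(joint).
p_separate = math.sqrt(joint)
print(round(p_separate, 2))  # 0.84
```

The same calculation run in reverse shows why two "merely 50/50" targets are jointly much worse than 50/50: 0.5 × 0.5 gives only a 25 percent joint probability of success.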
Figure 4.0 – Cost Distributions Are The Same Regardless of Estimating Technique
Some have advocated that estimates just need to be conservative. Others have argued that probabilistic
estimating should not be done at all, since the historical data are scant and therefore the assessed values reflect
the estimator's judgment, which could be biased and difficult to explain. Yet others have advocated
budgeting programs at high confidence levels (above 80 percent) and perhaps raising them higher if the growth is
not contained. Still others have advocated that budgeting programs at high confidence levels is inefficient,
as certain programs will be over-budgeted and other worthy programs will not be started. Others have
argued that high confidence levels will only result in higher costs over time, as program managers will
expand requirements to use up any funds that appear to be in excess of those required.
The authors assert that it is not possible to predict in advance which of these scenarios will occur unless
setting the budget at some confidence level is tried. The authors observe that the community has never
really tried budgeting programs at confidence levels that are based on well-understood historical data.
(NASA has only recently done so, and it is too early to conclude whether this has had a positive impact on
cost and schedule growth.) Anecdotal information suggests that some programs within the DOD have been
budgeted at a stated 50% confidence level, yet experienced rather significant overruns in costs and
Innovative Design Manufacturing Research Center (IdMRC) – Cost Modeling, 24th International Forum on COCOMO, November 2009.
A Framework for Considering Uncertainty in Quantitative Life Cycle Cost Estimation, Proceedings of the ASME 2009 Design Engineering Technical Conferences & Computers and Information Engineering Conference, IDETC/CIE 2009.
8 FY2011 Annual Report on Cost Assessment Activities, Director, Cost Assessment and Program Evaluation (CAPE), February 2012, pp. 7-8.
completion dates in excess of plan. The authors take the position that programs should be budgeted at a
higher confidence level (above 80 percent for cost and schedule separately) using agreed-to historical data,
and then observe whether this controls the unanticipated growth.
IMPLEMENTATION DETAILS
Setting budgets at a high confidence level will necessitate modifications in estimating methods
(noted above), the contracting process, and implementation. These are described below.
The first step requires that government Program Management Offices (PMOs) take a more active role in clarifying
requirements and developing the top-level plan (cost-loaded schedule), with an associated statement about
the probability of success (technical, cost, and schedule). This top-level plan ought to form the basis for the
first draft of the annual resources required. The second major step involves getting feedback from the
contracting community concerning the probability of delivering program capabilities for the targeted cost
and delivery date. The government will do this by issuing a Request for Information (RFI). PMOs will then
analyze industry's feedback; modify the requirements, budget, or time (or all three); and then issue a Request
for Proposal (RFP). Finally, along with their technical approach, contractors will submit their cost-loaded
plans, risk registers, and probability statements in response to RFPs.
The processes to complete the first major step – development of the RFI – are depicted in Figure 5.0
below and described in the following paragraphs.
Figure 5.0 – Develop draft plan and obtain industry feedback. This feedback will confirm whether the government-generated
cost-phased estimate is credible given the past experience of industry. This approach is used in Japan's
defense acquisition system. It engages the government and the providers in a dialogue in search of a phased plan that
can increase the probability of success for the program.
The first step in this process (1.0) is to develop an Integrated Master Plan (IMP). This is a top-level
picture of the general sequence of activities needed to mature the desired capabilities. It should
be developed per the DOD's Integrated Master Plan and Integrated Master
Schedule Preparation and Use Guide.9
The second step is to develop a top-level Integrated Master Schedule. This schedule is notional but
reflects the PMO's thoughts on how the system capabilities ought to be developed. By its nature, the
schedule is high level, but it still includes a logical structure of activities. It ought to reflect the PMO's
thoughts about which activities can be performed in parallel and which must be serial. The activity
durations are based on the durations of similar past programs.
In step 3, the PMO adds the resources (costs) associated with the high-level schedule developed in
step 2. Ideally, the costs associated with the activities will be available from databases. However, if they are
unavailable, the PMO should estimate them. Some of the cost information could come from the
independent estimates already developed, transformed into numbers of people, labor rates, and material costs.
Ideally, these discrete estimates ought to be equal to or higher than the mean of the historical data.
The fourth step is to create a risk register (RR) that will be used, along with the top-level cost-loaded
schedule developed in step 3, to develop cost and schedule confidence levels (step 5) for the PMO's draft
plan. The risk register should contain, at a minimum, information about the variation of the activity
durations and associated costs from the historical database. As discussed earlier, this uncertainty represents
the "natural variation" associated with developing similar programs, i.e., the irreducible risks. The risk
register should also include the specific discrete risks that the PMO feels are unique to the program. Each such
uncertainty (risk) should have an associated probability of occurrence and a cost and/or schedule
consequence if it does occur.
In the fifth step, the PMO should use appropriate Monte Carlo simulation software, with the risk
register developed in step 4 and the top-level cost-loaded plan developed in step 3, to develop cost and
schedule confidence levels for the targeted budget and development duration.
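Steps 3 through 5 can be sketched as a small Monte Carlo model. The activity durations, burn rates, and discrete risks below are hypothetical stand-ins for a real cost-loaded schedule and risk register; the structure, irreducible duration variance plus probabilistic discrete risks, is what the text describes:

```python
import random

random.seed(3)

# Irreducible uncertainty: serial activities as
# (most-likely duration months, duration spread, burn rate $K/month) -- hypothetical.
activities = [(6.0, 1.0, 50.0), (9.0, 2.0, 70.0), (5.0, 1.0, 60.0)]

# Discrete risks from the risk register:
# (probability of occurrence, cost consequence $K, schedule consequence months)
risks = [(0.30, 300.0, 2.0), (0.10, 800.0, 4.0)]

def one_trial():
    """One sampled program outcome: (total cost $K, total duration months)."""
    months, cost = 0.0, 0.0
    for dur, spread, rate in activities:
        d = max(0.5, random.gauss(dur, spread))
        months += d
        cost += rate * d                   # time-dependent cost follows duration
    for p, cost_hit, sched_hit in risks:
        if random.random() < p:            # the discrete risk occurs this trial
            cost += cost_hit
            months += sched_hit
    return cost, months

trials = [one_trial() for _ in range(20000)]
target_cost, target_months = 1800.0, 26.0
joint = sum(c <= target_cost and m <= target_months
            for c, m in trials) / len(trials)
print(joint >= 0.70)   # does this budget/duration meet a 70% joint confidence level?
```

In step 6 the PMO would move `target_cost` and `target_months` until `joint` reaches the desired level, which is exactly the adjustment the next paragraph describes.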
In the sixth step, the PMO adjusts cost and/or program duration to achieve a "reasonable" confidence level.
The authors believe that the PMO should select a targeted cost with at least a 70 percent joint cost and schedule
confidence level (above 80 percent for cost and schedule separately).
Once the RFI has been issued, contractors are expected to review and comment on the PMO’s draft plan
and answer the following questions:
1. Is the top-level plan logical given the technical challenges and capabilities required? If not,
a. What other activities should be included or dropped?
b. What changes in logic are required?
2. Are activity durations "consistent" with (within the family of) their experience?
3. Are the costs "consistent" with (within the family of) their experience?
4. Is the PMO’s perspective on risks realistic? If not,
a. Which risks are overstated?
b. Which risks are understated?
c. Which risks were missed? What is the contractor's assessment of probabilities and
consequences for those risks? (These should be included in their risk register.)
Contractors are expected to provide an updated high-level cost-loaded schedule, updated risk register,
and associated probability statement of achieving the desired capabilities within the desired budget and
duration. These submissions are not considered proposals; rather, they are considered inputs to the PMO on
how the contractor might proceed, along with the associated confidence levels for the stated resources.
Upon receipt of the contractors’ responses to the RFI, the PMO modifies the top-level plan and creates a
Request for Proposal (RFP) that reflects the contractors’ concerns and notional plans. This process is
depicted in Figure 6.0 below.
9 Integrated Master Plan and Integrated Master Schedule Preparation and Use Guide, October 21, 2005.
Figure 6.0 – The process to improve the Request for Proposals (RFP) starts with an assessment of the responses
from the Request for Information, where contractors provided their cost estimates.
In step eight, the PMO assesses the contractors' concerns and evaluates their top-level plans, risk
registers, and confidence levels, including the risks and their reduction through the handling plans. The PMO
evaluates and synthesizes the concerns expressed about the capabilities, the top-level plan, and the risk registers,
and revises the PMO plan, including the budget profile (step 9). Certain capabilities might be dropped or
revised to reduce the development risk. The PMO will re-run its Monte Carlo simulations to determine
the confidence levels for the original budget and duration (step 10). The PMO increases the budget and/or development
period in order to achieve at least a 70 percent joint cost and schedule confidence level (step 11). Once this is done,
the PMO prepares and issues the RFP (step 12).
Contractors develop their responses to the RFPs in the normal fashion, but will include their final risk
registers along with their implementation plans, contained in a schedule model instrumented with Monte
Carlo simulation software. The response will include probability statements about achieving the cost
and schedule targets. This response and its deliverables should be similar to the cost-loaded schedule
submitted in response to the RFI. This process is depicted in Figure 7.0 below.
[Figure 6 content – 8.0 Assess Responses to RFI → 9.0 Update IMP/IMS, Risk Register, and/or Modify Requirements → 10.0 Re-create Joint Probability Distribution → 11.0 Decide on Targeted Cost and Completion Date → 12.0 Create Request for Proposal (RFP). Inputs: multiple responses, updated Risk Register (RR), updated program capabilities (requirements), updated cost-loaded IMS.]
Figure 7.0 – Contractor process to generate responses to RFPs using the earlier cost estimates developed and shared
by all the participants in the acquisition process. The government now has a credible estimate of the cost, schedule, and
technical performance and makes the selection on “Best Value.”
The next step is the selection of the winning contractor. In addition to other selection criteria, the
government will include "Implementation Plan Realism" as part of its selection criteria. Inclusion of the
cost-loaded schedule and the contractors' risk registers will permit the government evaluators to assess the
credibility of the contractors' plans and perspectives on risks. Plans which reflect documented actual
historical experience, control for the discrete risks (i.e., include resources in their plans to address or control
them), and possess a joint cost and schedule confidence level in excess of 60 percent should be awarded
higher scores.10
At the Integrated Baseline Review, the contractor will:
a) Show how progress will be objectively measured in units of measure meaningful to the decision
makers, through Technical Performance Measures derived from the Integrated Master Plan. It is
essential that technical progress be the driver of the reported cost and schedule progress. Without
progress measures that are meaningful to the PMO, the contractor will be unable to propose
corrective actions to keep the contract effort “on-track”; and
b) Develop a Performance Measurement Baseline (PMB) such that it has at least a 50 percent
probability of meeting both cost and schedule targets, and demonstrate that the contractor possesses
sufficient Management and Schedule Reserves (schedule margin) to maintain a minimum of a 60
percent joint probability of success. If the contractor proposes a PMB with less than a 50 percent
joint probability of meeting cost and schedule targets in its baseline, the contract should be
amended (reduce capabilities, increase cost and/or schedule) or be cancelled. Contracts should not
be started if the PMB has less than a 50/50 chance of success.
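The IBR check in (b) can be sketched against simulated outcomes. The PMB targets, reserve amounts, and outcome distributions below are hypothetical; the mechanism shown is evaluating the PMB alone against the 50/50 floor, and the PMB plus reserves against the 60 percent joint threshold:

```python
import random

random.seed(4)

# Hypothetical simulated contract outcomes: (cost $K, duration months).
trials = [(random.gauss(1000.0, 120.0), random.gauss(20.0, 2.5))
          for _ in range(20000)]

def joint_confidence(cost_target, months_target):
    """Fraction of trials meeting BOTH the cost and the schedule target."""
    return sum(c <= cost_target and m <= months_target
               for c, m in trials) / len(trials)

pmb = joint_confidence(1100.0, 22.0)               # PMB targets alone
with_reserves = joint_confidence(1100.0 + 150.0,   # plus Management Reserve
                                 22.0 + 2.0)       # plus schedule margin
# Contract should not start below a 50/50 chance; reserves must hold >= 60%.
print(pmb >= 0.50, with_reserves >= 0.60)
```

If `pmb` came out below 0.50 here, the text's prescription applies: reduce capabilities, increase cost and/or schedule, or do not start the contract.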
10 The selection of the joint confidence level could vary, but 60 percent is chosen as an absolute
minimum. In order to tame cost and schedule growth, contracted efforts ought to have a better than 50/50
chance of meeting targets.
[Figure 7 content – 13.0 Create Contractor Version of IMP → 14.0 Develop Summary-Level IMS → 15.0 Develop Detailed Cost- or Resource-Loaded IMS → 16.0 Create Updated Risk Register (RR) → 17.0 Create Joint Probability Distribution → 18.0 Specify Joint Confidence Level of Targeted Cost and Completion Date → 19.0 Complete Preparation of Proposal Response to RFP. Inputs: contractor historical activity durations and associated costs, contractor Risk Register (RR), updated program capabilities (requirements).]
After the contract is awarded, the PMO and the contractor must actively manage the risks and attempt to
develop the desired capabilities while maintaining a high probability of meeting cost and schedule targets.
The PMO must do its part by holding firm on requirement changes – a well-known cause of cost and
schedule growth. The contractor must communicate progress based on the pre-approved TPMs and
evolving risks. The contractor should submit updated plans to the PMO every six months, along with its
risk register, and communicate the probabilities associated with its "Best Case", "Worst Case", and "Most
Likely" Estimates at Completion and the Estimated Completion Date. These terms mean little to the PMO
or other stakeholders without associated probability statements and the drivers of cost and schedule from
the risk register content. This information should allow the contractor and PMO to focus on the future risks
and revise their plans to keep the contracts on track.
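The probability statements that give "Best Case", "Worst Case", and "Most Likely" their meaning can be sketched as percentiles of the simulated Estimate at Completion (EAC) distribution. The EAC distribution and the specific percentile choices below are hypothetical, but each label carries an explicit probability, which is what the paragraph above calls for:

```python
import random

random.seed(5)

# Hypothetical simulated Estimates at Completion ($K) from the contractor's
# instrumented schedule model.
eacs = sorted(random.gauss(1200.0, 150.0) for _ in range(20000))

def percentile(samples, p):
    """Value below which fraction p of the sorted samples fall."""
    return samples[int(p * (len(samples) - 1))]

best = percentile(eacs, 0.10)         # "Best Case": only 10% chance of doing better
most_likely = percentile(eacs, 0.50)  # "Most Likely": the median outcome
worst = percentile(eacs, 0.90)        # "Worst Case": 90% chance of coming in under
print(best < most_likely < worst)
```

Reporting the three values with their percentiles, rather than as bare labels, is what lets the PMO and other stakeholders judge how much confidence each estimate actually carries.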
THE CHALLENGES
There are several challenges that must be addressed in order to implement these recommendations.
These are briefly discussed below.
First, the estimating community must broaden its data collection effort to capture the durations and
associated costs of common high-level development activities. Some of these data may already exist within
the native schedules contained in the Earned Value Management Central Repository (EVM-CR).
Comparison of initial and final schedules could provide useful analogies for future systems. All programs,
regardless of size, have some form of schedule, and the community should make a concerted effort to
collect, store, and share these data. The authors assert that the contracting communities, particularly those
involved in the construction industry, have done a better job than the government of collecting and
effectively using these data.
Second, PMOs must take more active roles in creating realistic plans for their programs. As noted above,
this means PMOs must create Integrated Master Plans – the architectural documents for developing the required
capabilities. They also need to assure themselves that issued RFPs possess an "achievable" outcome within
the resources available. This means that PMOs must develop a top-level cost-loaded plan and compute
probabilities for meeting the cost and schedule targets. Given limited internal resources, PMOs will likely
need to obtain outside assistance to perform this function. This is an additional expense to the PMO, but
one that is easily justified.
Third, asking the contracting community for its top-level picture of probabilities in advance of the
RFP might come at a cost. Seriously interested vendors will likely make the investment to develop a
summary-level plan, risk register, and model to calculate probabilities, but some may not. PMOs might have
to contract for systems engineering support to provide these perspectives, but the cost should be minimal –
less than four staff-months worth of effort each. This is a small price to pay to obtain RFP responses that have
a higher probability of success.
Fourth, the IPMR Data Item Description (DID) will need to be updated to require: a) submission of the
IMP in responses to RFPs, b) EAC and ECD probability statements in Format 5, and c) inclusion of risk
registers and instrumented schedule models every six months. Requesting the IMP within the RFP should
never be optional, as it provides the architectural framework for delivering the desired capabilities. As noted
earlier, it provides the foundation for the contractors' plans – the cost- or resource-loaded schedule.
Including the probability statements in Format 5 appears to be fairly straightforward and could be
done with minimal expense, but it will be of marginal value to the PMO without the accompanying schedule
model and associated risk registers. A PMO should be able to see the impact on the cost and schedule
outcomes if changes in schedule logic are made or if it has a different perspective on risks. PMOs (or
other stakeholders) should be able to independently exercise the contractors' models. However, this
requirement could be negotiated in such a way that the contractor is merely required to develop the
model and be ready to address PMO "what-if" questions.
CONCLUSION
This paper discusses the perennial problem of cost and schedule growth in publicly funded programs
and offers recommendations to improve performance within the DOD environment. This paper proposes
that programs be budgeted (with active inputs from industry partners) at a 70 percent joint cost and
schedule confidence level and that contracts be periodically reassessed to ensure high probabilities of
success.
The authors observe that few programs have been budgeted for high confidence levels and cost growth
continues. They assert that setting a relatively high probability of success is necessary to attempt to control
growth and restore credibility with authorizers and taxpayers. The authors also address a number of the
challenges to implement their recommendations and offer suggestions to overcome them.