What is Risk?
Glen B. Alleman, Thomas J. Coonce, and Rick A. Price
Cost and schedule growth for federal programs is created by unrealistic technical performance expectations,
unrealistic cost and schedule estimates, inadequate risk assessments, unanticipated technical issues, and poorly
performed and ineffective risk management, all contributing to program technical and programmatic shortfalls,
shown in Figure 1 and documented in a wide range of sources. [12], [13], [16], [18], [25], [30], [31], [38], [57], [58],
[59], [62], [67], [69]
Figure 1 ‒ Four summary sources of program cost and schedule overruns and technical shortfalls are from a more detailed
assessment of the root cause of program cost, schedule, and performance shortfalls [32].
Risk is the effect of uncertainty on objectives. Uncertainty is a state or condition that involves a deficiency of information and
leads to inadequate or incomplete knowledge or understanding. In the context of risk management, uncertainty exists whenever
the knowledge or understanding of an event, consequence, or likelihood is inadequate or incomplete
‒ ISO 31000:2009, ISO 17666:2016, and ISO 11231:2010
Risk is Uncertainty that Matters [29]
Risk can be the potential consequence of a specific outcome that affects the system’s ability to meet cost,
schedule, and/or technical objectives. Risk has three primary components: [32]
 Probability of the activity or event occurring or not occurring, described by a Probability Distribution Function.
 Consequence or effect resulting from the activity or event occurring or not occurring, described by a Probability
Distribution Function.
 Root Cause (condition and activity) of a future outcome, which, when reduced or eliminated, will prevent
occurrence, non‒occurrence, or recurrence of the cause of the risk.
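These components can be sketched as a small Monte Carlo model. Below is a minimal illustration in Python; the triangular distributions and dollar values are invented assumptions for the sketch, not figures from this paper:

```python
import random

random.seed(1)

# Hypothetical risk: probability of occurrence drawn from a subjective
# triangular PDF, consequence (cost impact, $K) from another triangular PDF.
N = 10_000
exposures = []
for _ in range(N):
    p_occur = random.triangular(0.2, 0.6, 0.4)  # probability the event occurs
    impact = random.triangular(50, 400, 150)    # $K consequence if it occurs
    # Sample occurrence, then accumulate the realized impact (0 if no event)
    exposures.append(impact if random.random() < p_occur else 0.0)

expected_exposure = sum(exposures) / N
print(f"Expected risk exposure: ${expected_exposure:.0f}K")
```

Sampling both the probability and the consequence from distributions, rather than using point values, is what lets the model express the "described by a Probability Distribution Function" language above.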
For the program manager, there are three risk categories that must be identified and handled:
 Technical ‒ risks that may prevent the end item from performing as intended or from meeting performance
expectations. Measures of Effectiveness, Measures of Performance, Technical Performance Measures, and Key
Performance Parameters describe the measures of these expectations.
 Programmatic ‒ risks that affect the cost and schedule measures of the program. Programmatic risks are
within the control or influence of the Program Management or Program Executive Office, through managerial
actions applied to the work activities contained in the Integrated Master Schedule. [17]
 Business ‒ risks that originate outside the program office and are not within the control or influence of the
Program Manager or Program Executive Office.
Uncertainty comes from the lack of information to describe a current state or to predict future states, preferred outcomes, or the
actions needed to achieve them. [51], [70] This uncertainty can originate from the naturally (randomly) occurring processes of
the program (Aleatory Uncertainty), or it can originate from the lack of knowledge about the future outcomes of the work on
the program (Epistemic Uncertainty). [37], [60]
Risks, Their Sources, and Their Handling Strategies
Risk identification during the early design phases of complex systems is commonly implemented but often fails to identify events and
circumstances that challenge program performance. Inaccurate cost and schedule estimates are usually blamed
for cost and schedule overruns, but the true root cause is often the realization of programmatic risks. A deeper understanding of
the risk identification trends and biases pervasive during system design and development is needed, for it would lead to
improved execution of existing identification processes and methods. [52], [53], [54]
Risk management means building a model of the risk, the impact of the risk on the program, and a model for
handling the risk. Since it is a risk, the corrective or preventive action has not occurred yet. [19], [35], [46], [60],
[66]
Probabilistic Risk Assessment (PRA) is the basis of these models and provides the Probability of Program Success.
Probabilities result from uncertainty and are central to the analysis of the risk: scenarios and model assumptions, with
model parameters based on current knowledge of the behavior of the system under a given set of uncertainty
conditions. [43]
The source of uncertainty must be identified, characterized, and the impact on program success modeled and understood, so
decisions can be made about corrective and preventive actions needed to increase the Probability of Program Success.
Since risk is the outcome of Uncertainty, distinguishing between the types of uncertainty in the definition and
management of risk on complex systems is useful when building risk assessment and management models. [8], [25],
[27], [35], [43], [71]
 Epistemic uncertainty ‒ from the Greek επιστηµη (episteme), meaning knowledge ‒ is the uncertainty due to a lack
of knowledge of quantities or processes of the system or the environment. Epistemic uncertainty is represented
by ranges of values for parameters, a range of workable models, the level of model detail, multiple expert
interpretations, and statistical confidence. Epistemic uncertainty derives from a lack of knowledge about the
appropriate value to use for a quantity that is assumed to have a fixed value in the context of a particular
analysis. The accumulation of information and the implementation of actions reduce epistemic uncertainty and
thereby eliminate or reduce the likelihood and/or impact of risk. This uncertainty is modeled as a subjective assessment
of the probability of our knowledge and the probability of occurrence of an undesirable event.
Incomplete knowledge about some characteristics of the system or its environment is the primary source of Epistemic
uncertainty.
 Aleatory uncertainty ‒ from the Latin alea (a single die) ‒ is the inherent variation associated with a
physical system or the environment. Aleatory uncertainty arises from inherent randomness, natural
stochasticity, or environmental or structural variation across space and time in the properties or behavior of the
system under study. [1] The accumulation of more data or additional information cannot reduce aleatory
uncertainty. This uncertainty is modeled as a stochastic process of an inherently random physical model. The
projected impact of the risk produced by Aleatory uncertainty can be managed through cost, schedule, and/or
technical margin.
Naturally occurring variations associated with the physical system are primary sources of Aleatory uncertainty.
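One common way to keep the two types separate in a model is a two-loop ("double") Monte Carlo, in the spirit of the double Monte Carlo method cited above [1]: the outer loop samples the epistemic (reducible) parameter uncertainty, the inner loop samples the aleatory (irreducible) scatter. A minimal sketch, with all numbers illustrative:

```python
import random
import statistics

random.seed(2)

# Outer loop: epistemic uncertainty -- we do not know the true mean task
# duration, so we sample it from a range reflecting our lack of knowledge.
# Inner loop: aleatory uncertainty -- natural variation around that mean,
# which no amount of additional information can remove.
outer_means = []
for _ in range(200):                                # epistemic samples
    true_mean = random.uniform(10.0, 14.0)          # reducible: range narrows with study
    inner = [random.gauss(true_mean, 2.0) for _ in range(500)]  # irreducible scatter
    outer_means.append(statistics.mean(inner))

# The spread of outer_means mixes both kinds of uncertainty; the uniform
# range could be narrowed by buying knowledge, the sigma=2.0 scatter could not.
print(f"mean of means: {statistics.mean(outer_means):.1f}")
```

Keeping the loops separate is what allows the analyst to report which part of the output spread is reducible and which part only margin can absorb.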
There is a third uncertainty found on some programs in the Federal Systems community that is not addressed by
this White Paper, since this uncertainty is not correctable.
 Ontological Uncertainty ‒ uncertainty attributable to the complete lack of knowledge of the states of a system. This is
sometimes labeled an Unknowable Risk. Ontological uncertainty cannot be measured directly.
Ontological uncertainty creates risk from inherent variations and incomplete information that are not knowable.
Separating Aleatory and Epistemic Uncertainty for Risk Management
Aleatory is mentioned 55 times and Epistemic 120 times in NASA/SP-2011-3421, Probabilistic Risk Assessment Procedure Guide
for NASA Managers and Practitioners. [42]
Knowing the percentage of reducible uncertainty versus irreducible uncertainty is needed to construct a credible risk model.
Without this separation ‒ without knowing which uncertainty is reducible and which is irreducible ‒ the design of the
corrective and preventive actions needed to increase the probability of program success is inhibited.
Separating the types of uncertainty serves to increase the clarity of risk communication, making it clear which type
of uncertainty can be reduced and which types cannot be reduced. For the latter (irreducible risk), only margin can
be used to protect the program from the uncertainty. [9], [37]
As uncertainty increases, the ability to precisely measure the uncertainty is reduced to the point where a direct estimate of
the risk can no longer be assessed through a mathematical model. While a decision in the presence of uncertainty
must still be made, deep uncertainty and poorly characterized risks lead to the absence of data and risk models in
the Defense and Space Acquisition domain. [39], [42]
Epistemic Uncertainty Creates Reducible Risk
The risk created by Epistemic Uncertainty represents resolvable knowledge, with elements expressed as
probabilistic uncertainty of a future value related to a loss in a future period of time. [3], [72] Awareness of this lack
of knowledge provides the opportunity to reduce this uncertainty through direct corrective or preventive actions.
[36]
Epistemic uncertainty, and the risk it creates, is modeled by defining the probability that the risk will occur, the
time frame in which that probability is active, the probability of an impact or consequence from the risk when
it does occur, and finally, the probability of the residual risk when the handling of that risk has been applied.
Epistemic uncertainty statements define and model these event‒based risks:
 If‒Then ‒ if we miss our next milestone, then the program will fail to achieve its business value during the next
quarter.
 Condition‒Concern ‒ our subcontractor has not provided enough information for us to status the schedule, and
our concern is that the schedule is slipping and we do not know it.
 Condition‒Event‒Consequence ‒ our status shows there are some tasks behind schedule, so we could miss our
milestone, and the program will fail to achieve its business value in the next quarter.
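An event-based risk of the If‒Then form can be modeled directly from the elements named above ‒ probability of occurrence, impact distribution, and residual probability after handling. A sketch with invented numbers standing in for a real risk register entry:

```python
import random

random.seed(3)

# Hypothetical If-Then risk: "if we miss the milestone, then we lose the
# quarter's business value." All numbers are illustrative assumptions.
p_occur = 0.30               # probability the milestone slips (epistemic estimate)
p_residual = 0.10            # probability remaining after the handling plan
impact_days = (15, 60, 30)   # min, max, most-likely schedule impact if realized

def expected_impact(p, trials=20_000):
    """Monte Carlo estimate of expected schedule impact at occurrence probability p."""
    total = 0.0
    for _ in range(trials):
        if random.random() < p:
            total += random.triangular(*impact_days)
    return total / trials

before = expected_impact(p_occur)
after = expected_impact(p_residual)
print(f"expected impact before handling: {before:.1f} days, after: {after:.1f} days")
```

The difference between `before` and `after` is one way to quantify the value of the handling plan when deciding whether the buy-down effort is worth its cost.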
For these types of risks, an explicit or an implicit risk handling plan is needed. The word handling is used with
special purpose: we handle risks in a variety of ways, and mitigation is one of those ways. In order to mitigate the risk,
new effort (work) must be introduced into the schedule. We are buying down the risk, or we are retiring the risk, by
spending money and/or consuming time to reduce the probability of the risk occurring. Or we could be spending
money and consuming time to reduce the impact of the risk when it does occur. In both cases, actions are taken to
address the risk.
Reducible Cost Risk
Reducible cost risk is often associated with unidentified reducible Technical risks and with changes in technical
requirements and their propagation that impact cost. [7] Understanding the uncertainty in cost estimates supports
decision making for setting targets and contingencies, risk treatment planning, and the selection of options in the
management of program costs. Before reducible cost risk can be addressed, the cost structure must be understood.
Cost risk analysis goes beyond capturing the cost of WBS elements in the Basis of Estimate and the Cost Estimating
Relationships. This involves:
 Development of quantitative modelling of integrated cost and schedule, incorporating the drivers of reducible
uncertainty in quantities, rates, and productivities, and the recording of these drivers in the Risk Register.
 Determining how cost and schedule uncertainty can be integrated in the analysis of the cost risk model.
 Performing sensitivity analysis to provide understanding of the effects of reducible uncertainty and the
allocation of contingency amounts across the program.
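A minimal version of such a quantitative model treats cost as driven by uncertain quantities and rates. The drivers and triangular ranges below are illustrative stand-ins for Risk Register entries, not data from any program:

```python
import random
import statistics

random.seed(4)

# Sketch of a driver-based cost model: cost = quantity * rate + material,
# with reducible uncertainty on each driver (all values illustrative).
N = 10_000
totals = []
for _ in range(N):
    labor_hours = random.triangular(9_000, 13_000, 10_000)
    labor_rate = random.triangular(95, 130, 110)            # $/hour
    material = random.triangular(400_000, 700_000, 500_000)  # $ material cost
    totals.append(labor_hours * labor_rate + material)

# Percentiles of the simulated total support target and contingency setting
p50 = statistics.quantiles(totals, n=100)[49]
p80 = statistics.quantiles(totals, n=100)[79]
print(f"P50 ${p50/1e6:.2f}M, P80 ${p80/1e6:.2f}M -> contingency ${(p80 - p50)/1e6:.2f}M")
```

Reading contingency off the P50-to-P80 gap is one common convention; the confidence level chosen is a management decision, not an output of the model.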
Reducible Schedule Risk
While there is significant variability, for every 10% in Schedule Growth there is a corresponding 12% Cost Growth. [10]
Schedule Risk Analysis (SRA) is an effective technique to connect the risk information of program activities to the
baseline schedule, providing sensitivity information on individual program activities to assess the potential impact
of uncertainty on the final program duration and cost.
Schedule risk assessment is performed in 4 steps:
1. Baseline Schedule ‒ construct a credible activity network compliant with GAO‒16‒89G, “Schedule
Assessment Guide: Best Practices for Project Schedules.”
2. Define Reducible Uncertainties ‒ for activity durations and cost distributions from the Risk Register, and
assign these to the work activities affected by the risk and/or the work activities assigned to reduce the risk.
3. Run Monte‒Carlo Simulations ‒ for the schedule using the assigned Probability Distribution Functions
(PDFs), using the Min/Max values of the distribution, for each work activity in the IMS.
4. Interpret Simulation Results ‒ using data produced by the Monte Carlo Simulation, including at least: [61]
 Criticality Index (CI): measures the probability that an activity is on the critical path. [14], [15]
 Significance Index (SI): measures the relative importance of an activity. [63], [64]
 Schedule Sensitivity Index (SSI): measures the relative importance of an activity taking the CI into account.
[49], [63], [64]
 Cruciality Index (CRI): measures the correlation between the activity duration and the total program
duration. [63], [64]
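Steps 2 through 4 can be illustrated on a toy two-path network. The activities, their triangular duration PDFs, and the Criticality Index computation below are all invented for illustration:

```python
import random

random.seed(5)

# Toy 4-activity network: A->B->D and A->C->D. Duration PDFs (min, max, mode)
# stand in for Risk Register entries assigned to each activity.
acts = {"A": (5, 12, 8), "B": (10, 25, 15), "C": (12, 20, 16), "D": (4, 9, 6)}
N = 5_000
crit_b = crit_c = 0
durations = []
for _ in range(N):
    d = {k: random.triangular(*v) for k, v in acts.items()}
    path_b = d["A"] + d["B"] + d["D"]
    path_c = d["A"] + d["C"] + d["D"]
    durations.append(max(path_b, path_c))   # program finish = longest path
    if path_b >= path_c:                    # which path was critical this trial?
        crit_b += 1
    else:
        crit_c += 1

# Criticality Index: fraction of trials in which each path drove the finish
print(f"CI(B)={crit_b/N:.2f}  CI(C)={crit_c/N:.2f}")
print(f"mean program duration={sum(durations)/N:.1f}")
```

On a real IMS the same logic runs over the full activity network; activities with a high CI are the ones whose reducible uncertainty most deserves handling effort.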
Reducible Technical Risk
Technical risk is the impact on a program, system, or entire infrastructure when the outcomes from engineering
development do not work as expected, do not provide the needed technical performance, or create higher than
planned risk to the performance of the system. Failure to identify or properly manage this technical risk results in
performance degradation, security breaches, system failures, increased maintenance time, a significant amount
of technical debt 1 and additional cost and time for end item delivery for the program.
Reducible Cost Estimating Risk
Reducible cost estimating risk is dependent on technical, schedule, and programmatic risks, which must be
assessed to provide an accurate picture of program cost. Cost risk estimating assessment addresses the cost,
schedule, and technical risks that impact the cost estimate. To quantify these cost impacts from the reducible risk,
the sources of risk need to be identified. This assessment is concerned with three sources of risk and ensures that the
model calculating the cost also accounts for these risks: [23]
 The risk inherent in the cost estimating method ‒ the Standard Error of the Estimate (SEE), confidence intervals,
and prediction intervals.
 The risk inherent in technical and programmatic processes ‒ the technology’s maturity, design and engineering,
integration, manufacturing, schedule, and complexity. [68]
 The risk inherent in the correlation between WBS elements, which decides to what degree one WBS element’s
change in cost is related to another and in which direction. WBS elements within federal systems have positive
correlations with each other, and the cumulative effect of this positive correlation increases the range of the
costs. 2
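The effect of positive correlation on the cost range can be demonstrated with two hypothetical WBS elements; the means, spreads, and correlation value below are assumptions chosen for the sketch:

```python
import random
import statistics

random.seed(6)

# Two WBS elements, each Normal(100, 15) in $K. With positive correlation,
# the spread of the total grows even though the mean total does not change.
def total_cost_sd(rho, n=20_000):
    """Standard deviation of the summed cost for a given correlation rho."""
    totals = []
    for _ in range(n):
        z1 = random.gauss(0, 1)
        # Build a second standard normal with correlation rho to the first
        z2 = rho * z1 + (1 - rho**2) ** 0.5 * random.gauss(0, 1)
        totals.append((100 + 15 * z1) + (100 + 15 * z2))
    return statistics.stdev(totals)

sd_indep = total_cost_sd(0.0)
sd_corr = total_cost_sd(0.6)
print(f"sd independent: {sd_indep:.1f}, sd with rho=0.6: {sd_corr:.1f}")
```

Assuming independence when the elements are in fact positively correlated understates the cost range, which is why the correlation structure belongs in the cost model.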
Unidentified reducible Technical Risks are often associated with Reducible Cost and Schedule risk.
Aleatory Uncertainty Creates Irreducible Risk
Aleatory uncertainty and the risk it creates come not from the lack of information, but from the naturally
occurring processes of the system. For aleatory uncertainty, more information cannot be bought, nor can specific risk
reduction actions be taken to reduce the uncertainty and resulting risk. The objective of identifying and managing
aleatory uncertainty is to be prepared to handle the impacts when the risk is realized.
The method for handling these impacts is to provide margin for this type of risk, including cost, schedule, and technical margin.
Using the NASA definition, Margin is the difference between the maximum possible value and the maximum
expected value, and is separate from Contingency. Contingency is the difference between the current best estimate
and the maximum expected estimate. For systems under development, the technical resources and the technical
performance values carry both margin and contingency.
Schedule Margin should be used to cover the naturally occurring variances in how long it takes to do the work.
Cost Margin is held to cover the naturally occurring variances in the price of something being consumed in the
program. Technical margin is intended to cover the naturally occurring variation of technical products.
Aleatory uncertainty and the resulting risk are modeled with a Probability Distribution Function (PDF) that describes
the possible values the process can take and the probability of each value. The PDF for the possible durations of the
work in the program can be determined. Knowledge can be bought about the aleatory uncertainty
1
Technical debt is a term used in the Agile community to describe the eventual consequences of deferring complexity or implementing incomplete changes. As a
development team progresses there may be additional coordinated work that must be addressed in other places in the schedule. When these changes do not
get addressed within the planned time or get deferred for a later integration, the program accumulates a debt that must be paid at some point in the future.
2
The use of the Design Structure Matrix provides visibility and modeling of these dependencies. Many models consider the dependencies as static or fixed at
some value. But the dependencies are dynamic, driven by nonstationary stochastic processes that evolve as the program evolves.
through Reference Class Forecasting and past performance modeling. [11], [21], [71] This new information then
allows us to update ‒ adjust ‒ our estimates: past performance on similar work provides information about our future
performance. But the underlying process is still random, and the new information simply creates a new aleatory
uncertainty PDF.
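A sketch of that Reference Class Forecasting update: fit a new duration PDF from the actual-to-planned ratios of similar past work. The ratio data below are invented for illustration:

```python
import statistics

# Hypothetical reference class: actual/planned duration ratios observed on
# similar past work (illustrative numbers, not real program data).
ratios = [1.05, 1.20, 0.95, 1.35, 1.10, 1.25, 1.15, 1.40, 1.00, 1.30]

mu = statistics.mean(ratios)       # typical growth factor in this class
sigma = statistics.stdev(ratios)   # scatter of the class

planned_duration = 100  # days, the deterministic estimate for the new work
# The updated aleatory PDF for the new work, sketched as a Normal, is
# Normal(planned * mu, planned * sigma): better centered, still random.
print(f"adjusted mean: {planned_duration * mu:.0f} days, sd: {planned_duration * sigma:.1f} days")
```

The update shifts and shapes the PDF using outside-view data, but, as the text notes, the result is still a PDF: the variation it describes cannot be removed, only planned for with margin.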
The first step in handling Irreducible Uncertainty is the creation of Margin ‒ schedule margin, cost margin, and technical
margin ‒ to protect the program from the risk of irreducible uncertainty. Margin is defined as the allowance in budget,
programmed schedule … to account for uncertainties and risks. [44]
Margin needs to be quantified by:
 Identifying WBS elements that contribute to margin.
 Identifying uncertainty and risk that contributes to margin.
Irreducible Schedule Risk
Programs are over budget and behind schedule, to some extent because uncertainties are not accounted for in
schedule estimates. Research and practice are now addressing this problem, often by using Monte Carlo methods to
simulate the effect of variances in work package costs and durations on total cost and date of completion. However,
many such program risk approaches ignore the significant impact of probabilistic correlation on work package cost
and duration predictions. [5], [33]
Irreducible schedule risk is handled with Schedule Margin, which is defined as the amount of added time needed
to achieve a significant event with an acceptable probability of success. [2] Significant events are major contractual
milestones or deliverables. [48]
With minimal or no margins in schedule, technical, or cost present to deal with unanticipated risks, successful acquisition is
susceptible to cost growth and cost overruns.
The Program Manager owns the schedule margin. It does not belong to the client, nor can it be negotiated away
by the business management team or the customer. This is the primary reason to CLEARLY identify the Schedule
Margin in the Integrated Master Schedule. [17] It is there to protect the program deliverable(s). Schedule margin is
not allocated to over‒running tasks; rather, it is planned to protect the end item deliverables.
The schedule margin should protect the delivery date of major contract events or deliverables. This is done with
a Task in the IMS that has no budget (BCWS). The duration of this Task is derived from Reference Classes or a Monte
Carlo Simulation of the aleatory uncertainty that creates risk to the event or deliverable. [11], [47]
The IMS, with schedule margin to protect against the impact of aleatory uncertainty, represents the most likely and realistic risk‒
based plan to deliver the needed capabilities of the program.
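Sizing that margin task from a Monte Carlo run is often done by taking the gap between the deterministic finish and a chosen confidence level (P80 in this sketch); all values below are illustrative:

```python
import random
import statistics

random.seed(7)

# Deterministic finish from the baseline IMS (working days, illustrative)
deterministic_finish = 120

# Monte Carlo finishes from the aleatory duration PDF of the driving path
N = 10_000
finishes = [random.triangular(110, 160, 118) for _ in range(N)]

# Schedule margin sized as the gap to the 80th-percentile finish
p80 = statistics.quantiles(finishes, n=100)[79]
margin = p80 - deterministic_finish
print(f"P80 finish: {p80:.0f} days -> schedule margin task of {margin:.0f} days")
```

The confidence level (P70, P80, etc.) is a program decision balancing delivery assurance against the cost of holding margin; the simulation only quantifies the trade.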
Irreducible Technical Risk
The last 10 percent of the technical performance generates one‒third of the cost and two‒thirds of the problems ‒ Norman
Augustine’s 15th Law. [6]
Using the NASA definition ‒ Margin is the difference between the maximum possible value and the maximum
expected value, while Contingency is the difference between the current best estimate and the
maximum expected estimate ‒ for systems under development, the technical outcome and technical
performance values both carry margin and contingency.
Technical Margin and Contingency serve several purposes:
 Describe the need for and use of resource margins and contingency in system development.
 Define and distinguish between margins and contingency.
 Demonstrate that, historically, resource estimates grow as designs mature.
 Provide a representative margin depletion table showing prudent resource contingency as a function of
program phase.
For any system, in any stage of its development, there is a maximum possible, maximum expected, and current
best estimate for every technical outcome. The current best estimate of a technical performance changes as the
development team improves the design and the understanding of that design matures.
For a system in development, most technical outcomes should carry both margin and contingency. As designs
mature, the estimate of any technical resource usually grows. This is true historically and, independent of exactly
why, development programs must plan for it to occur. [41]
The goal of Technical Margin (unlike Cost and Schedule Margin) is to reduce the margins (for example, Size, Weight,
and Power) to as close to zero as possible, to maximize mission capabilities. The technical growth and its handling
include:
 Expected technical growth ‒ contingency accounts for expected growth.
Recognize that mass growth is historically inevitable. As systems mature through their development life cycle, a
better understanding of the design emerges, from conceptual to actual designs. Requirements changes often
increase resource use.
 Unplanned technical growth ‒ margins account for unexpected growth.
Recognize that all system development is challenging. Programs encounter “unknown unknowns” with the use of
new technology that is difficult to gauge; these develop into uncertainties in the design and execution of the
program, all the way to manufacturing variations.
Ontological Uncertainty
On the scale of uncertainty ‒ from random naturally occurring processes (aleatory) to the lack of knowledge
(epistemic) ‒ Ontological uncertainty lies at the far end of this continuum and is a state of complete ignorance. Not
only are the uncertainties not known, the uncertainties may not be knowable. While the truth is out there, 3 it cannot
be accessed because it is simply not known where to look in the first instance. NASA calls it operating outside the
experience base, where things are done for the first time and operate in a state of ignorance.
Management of uncertainty is the critical success factor of effective program management. Complex programs
and the organizations that deliver them can involve people with different genders, personality types,
cultures, first languages, social concerns, and/or work experiences.
Such differences can lead to ontological uncertainty and semantic uncertainty. Ontological uncertainty involves
different parties in the same interactions having different conceptualizations about what kinds of entities inhabit
their world, what kinds of interactions these entities have, and how the entities and their interactions change as a result
of these interactions. Semantic uncertainty involves different participants in the same interactions giving different
meanings to the same term, phrase, and/or actions. Ontological uncertainty and semantic uncertainty can lead to
intractable misunderstandings between people. [22], [55]
When new technologies are introduced into these complex organizations, concepts, principles, and techniques
may be fundamentally unfamiliar and carry a higher degree of ontological uncertainty than more mature
technologies. A subtler problem is that these uncertainties are often implicit rather than explicit and become an invisible
and unquestioned part of the system. Since epistemic and ontological uncertainty represent our lack of knowledge,
reducing the risks produced by this uncertainty requires improvement in our knowledge of the system of
interest, or avoiding situations that increase these types of uncertainty. To reduce Epistemic and Ontological
uncertainty there must be a reduction in the uncertainty of the model of system behavior (ontology) or in the
model’s parameters (epistemology).
Call To Action
A great deal of work has been done on improving risk identification and analysis. The knowledge created by these efforts
provides a more complete list of risks and better estimates of their likelihood and consequences, but the fruits of this effort have
NOT been powerful enough to change the written guidance of DOD and other agencies.
Successful risk management can be applied to increase the probability of program success, but only if it can be
assured that:
 The models of Aleatory and Epistemic uncertainty properly represent the actual uncertainties on the program.
 The corrective and preventive actions for the risks created by Epistemic uncertainty, and the margins used for the
Aleatory uncertainties, will reduce risk as well as increase the probability of program success.
In 2002, Claude Bolton 4 said there are still too many surprises when using traditional metrics for poorly performing and
failing programs, in a briefing to Army senior leadership. [28] Past efforts to build dashboards and provide guidance to
the services (USAF, Navy, and Army) have fallen into disuse.
3
Apologies to Mulder and Scully and the X‒Files
4
Retired United States Air Force Major General who served as United States Assistant Secretary of the Army for Acquisition, Logistics, and Technology from 2002
to 2008.
Two Critical Steps start the path toward Increasing Probability of Program Success:
item 3 (fix the budgeting process) and item 6 (align the Technical Plan with the budget), from the tables that follow.
Table 1 ‒ Corrective and preventive actions for programmatic root causes reducing probability of program success
1. Cause: Government fails to define what Done looks like, in units of measure meaningful to the decision makers, prior to the RFP.
Effect: Systems are delivered that don’t provide the desired capabilities, arrive too late, and cost more than budgeted.
Corrective Action: Government defines the MoEs for the program prior to issuing the RFP.

2. Cause: Different participants in the acquisition process impose conflicting demands on programs. [24]
Effect: Participants do what is in their own best interest versus the best interest of technical, cost, and schedule performance for the program.
Corrective Action: Agreed-upon MoPs defined at the IBR for each program deliverable.

3. Cause: Budget processes force funding decisions to be made in advance of program decisions. [12], [69]
Effect: The budget process encourages undue optimism that creates program risk for cost, schedule, and technical performance.
Corrective Action: Government provides a risk-adjusted, cost-loaded Integrated Master Plan prior to Milestone‒B/KDP-B.

4. Cause: Program managers’ short tenures, limitations in experience, and lack of formal training. [4]
Effect: The program manager focuses on minimizing risk to Keep the Program Sold and on creating the opportunity for career advancement.
Corrective Action: Program Manager assigned from the start of development (KDP-B) to completion.
Table 2 ‒ Corrective and preventive actions for cost, schedule, and technical root causes reducing probability of program success
A fundamental responsibility in the acquisition of major systems is to ensure that visibility of progress is sufficient to reliably
assess the results being obtained. [56]
Without the integration of the risk adjusted cost, schedule, and technical performance measures, and preventive and corrective
actions needed to reduce or eliminate the impact of these risks, the probability of program success is not likely to meet the
needs of those paying for the outcomes.
5. Cause: Schedule and Cost Margin not assigned to protect key contractual deliverables or end items. [10], [34]
Effect: Lack of cost and schedule margin causes the program to under-deliver technically, go over budget, and deliver late.
Corrective Action: Maintain the required schedule and cost margin to protect key contractual deliverables.

6. Cause: Technical Plan not aligned with the Budget and Spend Plan, with missing Schedule Margin and risk buy-down plan. [69]
Effect: Lack of integrated cost, schedule, and technical progress gives wrong information about future performance.
Corrective Action: Develop a resource-loaded Integrated Master Schedule, aligned with the Technical Plan and Budget, adjusted with Margin at the IBR.

7. Cause: Alternative Points of Integration not considered in the IMS when a risk becomes an issue. [50]
Effect: No Plan B means the Probability of Success is reduced.
Corrective Action: Have a Plan B on baseline in the form of alternative points of integration.

8. Cause: Cost, Schedule, and Technical Uncertainties impact the Probability of Program Success without handling plans. [26]
Effect: Without a credible risk strategy, there is no credible plan for success.
Corrective Action: Develop risk models for integrated cost, schedule, and technical performance.

9. Cause: Missing Resource-Loaded Schedule to assure staff, facilities, and other resources are available when needed. [45]
Effect: If resources are not available when needed, there is no credible plan for success.
Corrective Action: Develop and maintain a resource-loaded schedule.

10. Cause: Propagation of Aleatory and Epistemic Uncertainty not considered in the risk profile or handling plans. [2], [20], [40], [65]
Effect: Without understanding the dynamics of risk and its propagation, there is no credible plan to Keep the Program Green.
Corrective Action: Model the network of activities to reveal risk propagation using DSM.
Bibliography
[1] Ali,T., Boruah, H., and Dutta, P., “Modeling Uncertainty in Risk Assessment UsingDouble Monte Carlo
Method,” International Journal of Engineering and Innovation Technology (IJEIT), Volume 1, Issue4, April
2012.
[2] Alleman, G. B., Coonce, T. J., and Price, R. A., “Buildinga CrediblePerformance Measurement Baseline,”
The Measureable News, 2014,Issue4.
[3] Apeland, S., Aven, T., and Nilsen,T., “QuantifyingUncertainty under a Predictive, Epistemic Approach to
Risk Analysis,”
[4] Arena, M. V., et. al.,“Managing Perspectives Pertainingto Root CauseAnalyses of Nunn-McCrudy
Breaches, Volume 4: Program Manager Tenure, Oversightof Acquisition Category II Programs,and Framing
Assumptions”RAND MG1171/4, Rand Corporation,2013.
[5] Arízaga, J. F. O, “A Methodology for ProjectRisk AnalysisUsingBayesian Belief Networks Within A Monte
Carlo Simulation Environment,” PhD Dissertation,University of Maryland,2007.
[6] Atin, A., “Project Risk Propagation Modelingof Engineering, Procurement and Construction,”Ph. D. Thesis
Wayne State University,2016.
[7] Becerril,L., Sauer, M., and Lindemann, U., “Estimating the effects of Engineering Changes in Early Stage
Product Development,” 18th International Dependency and Structural Modeling Conference, August 29‒30,
2016.
[8] Bartolomei, J. E., et. al.,“Engineering Systems Matrix: An OrganizingFramework for ModelingLarge-Scape
Complex Systems,” Systems Engineering, 15.1, 2012,pp. 41‒61.
[9] Beer, M., “Alternative Approaches to the Treatment of Epistemic Uncertainties in Risk As sessment,”
Institute for Risk and reliability, LeibnizUniversitätHannover, 2016.
[10] Blickstein,I.,et. al., “Root CauseAnalyses of Nunn‒McCurdy Breaches: Volume 2,” Rand Corporation,
MG1171z2.
[11] Bordley, R. F., “Reference Class Forecasting:ResolvingIts Challengeto Statistical Modeling,” The American
Statistician, 68:4, 221‒229,2014.
[12] Charette, R., Dwinnell, L., and McGarry, J., “Understanding the Roots of Process Performance Failure,”
CROSSTALK: The Journal of Defense Software Engineering, August 2004, pp. 18‒24.
[13] Clemen, R. T., Making Hard Decisions: An Introduction to Decision Analysis, Second Edition, Duxbury Press,
1996.
[14] Covert, R., “Analytic Method for Probabilistic Cost and Schedule Risk Analysis, Final Report,” prepared for
the National Aeronautics and Space Administration (NASA) Office of Program Analysis and Evaluation
(PA&E) Cost Analysis Division (CAD), 5 April 2013.
[15] Covert, R., “Final Report Briefing, Analytic Methods for Cost and Schedule Risk Analysis,” March 4, 2013.
[16] Davis, D., and Philip, A. S., “Annual Growth of Contract Costs for Major Programs in Development and Early
Production,” Acquisition Policy Analysis Center, Performance Assessments and Root‒Cause Analyses, OUSD
AT&L, March 21, 2016.
[17] DOD, “Integrated Program Management Report (IPMR),” DI‒MGMT‒81861, June 20, 2012.
[18] van Dorp, J. R., and Duffey, M. R., “Statistical Dependence in Risk Analysis for Project Networks Using
Monte Carlo Methods,” International Journal of Production Economics, 58, pp. 17‒29, 1999.
[19] Faber, M. H., “On the Treatment of Uncertainties and Probabilities in Engineering Decision Analysis,”
Journal of Offshore Mechanics and Arctic Engineering, 127, pp. 243‒248.
[20] Fang, C., Marle, F., and Xio, M., “Applying Importance Measures to Risk Analysis in Engineering Project
Using a Risk Network Model,” IEEE Systems Journal, 2016.
[21] Flyvbjerg, B., “From Nobel Prize to Project Management: Getting Risk Right,” Project Management Journal,
Vol. 37, No. 3, August 2006, pp. 5‒15.
[22] Fox, S., “Ontological Uncertainty and Semantic Uncertainty in Global Network Organizations,” VTT Working
Papers 102, 2008.
[23] Galway, L. A., “Subjective Probability Distribution Elicitation in Cost Risk Analysis: A Review,” RAND
Corporation, TR-410-AF, 2007.
[24] Gerstein, D. M., et al., “Developing a Risk Assessment Methodology for the National Aeronautics and Space
Administration,” RAND Corporation, RR-1537, 2015.
[25] Grady, J. O., Systems Requirements Analysis, Academic Press, 2006.
[26] Hamaker, J. W., “But What Will It Cost? The History of NASA Cost Estimating,” Readings in Program Control,
pp. 25‒37, 1994.
[27] Helton, J. C., and Burmaster, D. E., “Treatment of Aleatory and Epistemic Uncertainty in Performance
Assessments for Complex Systems,” Reliability Engineering and System Safety, Vol. 54, No. 2‒3, pp. 91‒94,
1996.
[28] Higbee, J., “Program Success Probability, Tasking from ASA (ALT), Claude Bolton, March 2002,” Defense
Acquisition University, 2 June 2004.
[29] Hillson, D., and Simon, P., Practical Risk Management: The ATOM Method, Management Concepts, 2007.
[30] Hofbauer, et al., “Cost and Time Overruns for Major Defense Acquisition Programs: An Annotated Brief,”
Center for Strategic and International Studies, Washington, DC, 2011.
[31] Hulett, D. T., and Hillson, D., “Branching Out: Decision Trees Offer a Realistic Approach to Risk Analysis,” PM
Network, May 2006, p. 43.
[32] IDA, “Root Cause of Nunn‒McCurdy Breaches ‒ A Survey of PARCA Root Cause Analyses, 2010‒2011,
Interim Report,” IDA Paper P‒4911, August 2012.
[33] Karim, A. B. A., “The Development of an Empirical-Based Framework for Project Risk Management,” Ph.D.
Thesis, University of Manchester, 2014.
[34] Kerzner, H., Project Management: A Systems Approach to Planning, Scheduling and Controlling, John Wiley
& Sons, 1998.
[35] der Kiureghian, A., “Aleatory or Epistemic? Does It Matter?” Special Workshop on Risk Acceptance and Risk
Communication, Stanford University, March 26‒27, 2007.
[36] Kwak, Y. H., and Smith, B. M., “Managing Risks in Mega Defense Acquisition Projects: Performance, Policy,
and Opportunities,” International Journal of Project Management, 27, pp. 812‒820, 2009.
[37] Laitonen, J., Losoi, H., Losoi, M., and Ylilammi, K., “Tools for Analyzing Epistemic Uncertainties in
Probabilistic Risk Analysis, Final Report,” Mat-2.4177 Seminar on Case Studies in Operations Research,
24 May 2013.
[38] Lorell, M. A., Lowell, J. F., and Younossi, O., “Implementation Challenges for Defense Space Programs,”
MG‒431‒AF, RAND Corporation, 2006.
[39] Lempert, R., and Collins, M., “Managing the Risk of Uncertain Threshold Responses: Comparison of Robust,
Optimum, and Precautionary Responses,” Risk Analysis, 27 August 2007, pp. 1009‒1026, RAND Corporation.
[40] Maier, M., and Rechtin, E., The Art of Systems Architecting, Third Edition, 2009.
[41] NASA, “Margins and Contingency Module, Space Systems Engineering,” Version 1.0, Lisa Guerra, NASA
Exploration Systems Mission Directorate, Department of Aerospace Engineering at the University of Texas
at Austin, 2008.
[42] NASA, “Probabilistic Risk Assessment Procedure Guide for NASA Managers and Practitioners,” NASA/SP‒
2011‒3421, Second Edition, December 2011.
[43] NASA, “NASA Risk Management Handbook,” NASA/SP‒2011‒3422, 2011.
[44] NASA, “NPR 7120.5E, NASA Space Flight Program and Project Management Handbook,” NASA/SP‒2014‒
3705, September 2014.
[45] NDIA, “Planning & Scheduling Excellence Guide (PASEG),” National Defense Industrial Association,
Procurement Division ‒ Program Management Systems Committee (PMSC), June 22, 2012.
[46] DePasquale, D., and Charania, A. C., “I-RaCM: A Fully Integrated Risk and Life Cycle Cost Model,” American
Institute of Aeronautics and Astronautics, 2008.
[47] Pawlikowski, E., Loverro, D., and Cristler, T., “Disruptive Challenges, New Opportunities, and New
Strategies,” Strategic Studies Quarterly, Maxwell Air Force Base, Spring 2012, pp. 27‒54.
[48] Petković, I., “Risk Management Using Dependency Structure Matrix,” Workshop in Opatija, September 2‒8,
2012, German Academic Exchange Service.
[49] PM Knowledge Center, “Schedule Risk Analysis: How to Measure Your Baseline Schedule Sensitivity?”
https://goo.gl/jpMxuy
[50] Price, R. A., and Alleman, G. B., “Critical Thinking on Critical Path Analysis,” College of Performance
Management, EVM World 2016.
[51] Public Works and Government Services Canada, “Project Complexity and Risk Assessment Manual,”
Version 5, May 2015.
[52] Raiffa, H., Decision Analysis: Introductory Lectures on Choice Under Uncertainty, Addison Wesley, 1968.
[53] Razaque, A., Bach, C., Salama, N., and Alotaibi, A., “Fostering Project Scheduling and Controlling Risk
Management,” International Journal of Business and Social Science, Vol. 3, No. 14, Special Issue, July 2012.
[54] Reeves, J. D., “The Impact of Risk Identification and Trend on Space Systems Project Performance,” Ph.D.
Thesis, School of Engineering and Applied Science, The George Washington University, May 19, 2013.
[55] Riesch, H., “Levels of Uncertainty,” in Handbook of Risk Theory: Epistemology, Decision Theory, Ethics,
and Social Implications of Risk, Roeser, S., and Hillerbrand, R., Editors, Springer, 2011.
[56] Salado, A., and Nilchiani, R. R., “The Concept of Problem Complexity,” Conference on Systems Engineering
Research (CSER 2014), Procedia Computer Science, 28, 2014.
[57] Scheinin, W., “Start Early and Often: The Need for Persistent Risk Management in the Early Acquisition
Phases,” Paper to the Aerospace Corporation / Air Force SMC / NASA Seventh Symposium on Space
Systems Engineering & Risk Management Conference, 27‒29 February 2008.
[58] Schwartz, M., “Reform of the Defense Acquisition System,” United States Senate Committee on Armed
Services, Congressional Research Service, April 30, 2014.
[59] SMC, “Space and Missile Systems Center Risk Management Process Guide,” Version 2.0, 5 September
2014.
[60] Siu, N., and Marble, J., “Aleatory vs. Epistemic Uncertainties: Principles and Challenges,” ASME PVP
Conference, Baltimore, MD, July 20, 2011.
[61] USAF, “Air Force Instruction 63‒101/20‒101, Integrated Life Cycle Management,” 9 May 2014.
[62] U.S. Congress, Redesigning Defense: Planning the Transition to the Future U.S. Defense Industrial Base,
Office of Technology Assessment, Government Printing Office, July 1991.
[63] Vanhoucke, M., Measuring Time: Improving Project Performance Using Earned Value Management,
Springer, 2009.
[64] Vanhoucke, M., Integrated Project Management Sourcebook: A Technical Guide to Project Scheduling, Risk,
and Control, Springer, 2016.
[65] Vidal, L., and Marle, F., “Modeling Project Complexity,” International Conference on Engineering Design,
2007.
[66] Vrouwenvelder, A. C. W. M., “Uncertainty Analysis for Flood Defense Systems in the Netherlands,”
Proceedings, ESREL, 2003.
[67] Williams, T., “The Nature of Risk in Complex Projects,” Project Management Journal, 48(4), pp. 55‒66, 2017.
[68] Wood, H. L., and Ashton, P., “The Factors of Project Complexity,” 18th CIB World Building Congress, Salford,
United Kingdom, 10 May 2010.
[69] Younossi, O., et al., “Improving the Cost Estimation of Space Systems: Past Lessons and Future
Recommendations,” RAND Project Air Force, MG‒690.
[70] Zack, M. H., “Managing Organizational Ignorance,” Knowledge Directions, Volume 1, Summer 1999,
pp. 36‒49.
[71] Zarikas, V., and Kitsos, C. P., “Risk Analysis with Reference Class Forecasting Adopting Tolerance Regions,” in
Theory and Practice of Risk Assessment, Chapter 18, Springer International, June 2015.
[72] Zhang, Y., Bai, S., and Guo, Y., “The Application of Design Structure Matrix in Project Schedule
Management,” 2010 International Conference on E-Business and E-Government, pp. 2813‒2816.

More Related Content

What's hot

Risk management
Risk managementRisk management
Risk management
Abhi Kalyan
 
Project risk management
Project risk managementProject risk management
Project risk management
Aswin prakash i , Xantus Technologies
 
Risk Assessment: Creating a Risk Matrix
Risk Assessment: Creating a Risk MatrixRisk Assessment: Creating a Risk Matrix
Risk Assessment: Creating a Risk Matrix
EtQ, Inc.
 
Risk assessment-training
Risk assessment-trainingRisk assessment-training
Risk assessment-training
Ishah Khaliq
 
Risk Management Plan In Business PowerPoint Presentation Slides
Risk Management Plan In Business PowerPoint Presentation Slides Risk Management Plan In Business PowerPoint Presentation Slides
Risk Management Plan In Business PowerPoint Presentation Slides
SlideTeam
 
Risk Identification Process PowerPoint Presentation Slides
Risk Identification Process PowerPoint Presentation SlidesRisk Identification Process PowerPoint Presentation Slides
Risk Identification Process PowerPoint Presentation Slides
SlideTeam
 
Project Risk Management
Project Risk ManagementProject Risk Management
Project Risk Management
Russell Taylor BEng CEng FAPM FIET
 
Chap 11.7 Monitor Risks
Chap 11.7 Monitor RisksChap 11.7 Monitor Risks
Chap 11.7 Monitor Risks
Anand Bobade
 
Risk Analysis and Management
Risk Analysis and ManagementRisk Analysis and Management
Risk Analysis and Management
Genie
 
Risk assessment tools and techniques
Risk assessment tools and techniquesRisk assessment tools and techniques
Risk assessment tools and techniques
Ahmad Azwang Aisram Omar
 
Risk Management Best Practices
Risk Management Best PracticesRisk Management Best Practices
Risk Management Best Practices
PMILebanonChapter
 
Life Saving Rules - HSE
Life Saving Rules - HSELife Saving Rules - HSE
Life Saving Rules - HSE
Razi Uddin Farooqi
 
Step by step guide on project risk management
Step by step guide on project risk managementStep by step guide on project risk management
Step by step guide on project risk management
PMC Mentor
 
Risk management
Risk management Risk management
Risk management
Sushrut Gautam
 
Risk Management
Risk ManagementRisk Management
Risk Management 101
Risk Management 101Risk Management 101
Risk Management 101
Wil Rickards
 
risk assessment
 risk assessment risk assessment
risk assessment
Said Al Jaafari
 
Risk assessment and management
Risk assessment and managementRisk assessment and management
Risk assessment and management
TaekHyeun Kim
 
Project Risk Powerpoint Presentation Slides
Project Risk Powerpoint Presentation SlidesProject Risk Powerpoint Presentation Slides
Project Risk Powerpoint Presentation Slides
SlideTeam
 
Risk Management
Risk ManagementRisk Management
Risk Management
cgeorgeo
 

What's hot (20)

Risk management
Risk managementRisk management
Risk management
 
Project risk management
Project risk managementProject risk management
Project risk management
 
Risk Assessment: Creating a Risk Matrix
Risk Assessment: Creating a Risk MatrixRisk Assessment: Creating a Risk Matrix
Risk Assessment: Creating a Risk Matrix
 
Risk assessment-training
Risk assessment-trainingRisk assessment-training
Risk assessment-training
 
Risk Management Plan In Business PowerPoint Presentation Slides
Risk Management Plan In Business PowerPoint Presentation Slides Risk Management Plan In Business PowerPoint Presentation Slides
Risk Management Plan In Business PowerPoint Presentation Slides
 
Risk Identification Process PowerPoint Presentation Slides
Risk Identification Process PowerPoint Presentation SlidesRisk Identification Process PowerPoint Presentation Slides
Risk Identification Process PowerPoint Presentation Slides
 
Project Risk Management
Project Risk ManagementProject Risk Management
Project Risk Management
 
Chap 11.7 Monitor Risks
Chap 11.7 Monitor RisksChap 11.7 Monitor Risks
Chap 11.7 Monitor Risks
 
Risk Analysis and Management
Risk Analysis and ManagementRisk Analysis and Management
Risk Analysis and Management
 
Risk assessment tools and techniques
Risk assessment tools and techniquesRisk assessment tools and techniques
Risk assessment tools and techniques
 
Risk Management Best Practices
Risk Management Best PracticesRisk Management Best Practices
Risk Management Best Practices
 
Life Saving Rules - HSE
Life Saving Rules - HSELife Saving Rules - HSE
Life Saving Rules - HSE
 
Step by step guide on project risk management
Step by step guide on project risk managementStep by step guide on project risk management
Step by step guide on project risk management
 
Risk management
Risk management Risk management
Risk management
 
Risk Management
Risk ManagementRisk Management
Risk Management
 
Risk Management 101
Risk Management 101Risk Management 101
Risk Management 101
 
risk assessment
 risk assessment risk assessment
risk assessment
 
Risk assessment and management
Risk assessment and managementRisk assessment and management
Risk assessment and management
 
Project Risk Powerpoint Presentation Slides
Project Risk Powerpoint Presentation SlidesProject Risk Powerpoint Presentation Slides
Project Risk Powerpoint Presentation Slides
 
Risk Management
Risk ManagementRisk Management
Risk Management
 

Similar to What is risk?

Risk management of the performance measurement baseline
Risk management of the performance measurement baselineRisk management of the performance measurement baseline
Risk management of the performance measurement baseline
Glen Alleman
 
Building Risk Tolerance into the Program Plan and Schedule
Building Risk Tolerance into the Program Plan and ScheduleBuilding Risk Tolerance into the Program Plan and Schedule
Building Risk Tolerance into the Program Plan and Schedule
Glen Alleman
 
Increasing the Probability of Success with Continuous Risk Management
Increasing the Probability of Success with Continuous Risk ManagementIncreasing the Probability of Success with Continuous Risk Management
Increasing the Probability of Success with Continuous Risk Management
Glen Alleman
 
Managing in the presence of uncertainty
Managing in the presence of uncertaintyManaging in the presence of uncertainty
Managing in the presence of uncertainty
Glen Alleman
 
Risk Management
Risk ManagementRisk Management
Risk Management
Hinal Lunagariya
 
Project/Program Risk management
Project/Program Risk managementProject/Program Risk management
Project/Program Risk management
Shan Sokhanvar (CISM, AWS-SAP, PMP, MCTS)
 
Quantification of Risks in Project Management
Quantification of Risks in Project ManagementQuantification of Risks in Project Management
Quantification of Risks in Project Management
Venkatesh Ganapathy
 
Practical Application Of System Safety For Performance Improvement
Practical Application Of System Safety For Performance ImprovementPractical Application Of System Safety For Performance Improvement
Practical Application Of System Safety For Performance Improvement
Albert V. Condello III CSP CHMM
 
Control only.pdf
Control only.pdfControl only.pdf
Control only.pdf
NmnKmr2
 
Increasing the Probability of Success with Continuous Risk Management
Increasing the Probability of Success with Continuous Risk ManagementIncreasing the Probability of Success with Continuous Risk Management
Increasing the Probability of Success with Continuous Risk Management
Glen Alleman
 
Risk management(software engineering)
Risk management(software engineering)Risk management(software engineering)
Risk management(software engineering)
Priya Tomar
 
RISK TEMPLATE FORMATE GOOD-ALIU OLAB.pdf
RISK TEMPLATE FORMATE GOOD-ALIU OLAB.pdfRISK TEMPLATE FORMATE GOOD-ALIU OLAB.pdf
RISK TEMPLATE FORMATE GOOD-ALIU OLAB.pdf
olabisiali
 
Risk Management Methodologies in Construction Industries
Risk Management Methodologies in Construction IndustriesRisk Management Methodologies in Construction Industries
Risk Management Methodologies in Construction Industries
IRJET Journal
 
Arif Mammadov risk managment.pptx
Arif Mammadov risk managment.pptxArif Mammadov risk managment.pptx
Arif Mammadov risk managment.pptx
ArifMamedov5
 
Managing cost and schedule risk
Managing cost and schedule riskManaging cost and schedule risk
Managing cost and schedule risk
Glen Alleman
 
Building a risk tolerant integrated master schedule
Building a risk tolerant integrated master scheduleBuilding a risk tolerant integrated master schedule
Building a risk tolerant integrated master schedule
Glen Alleman
 
Risk management
Risk managementRisk management
Risk management
Emad Nassar
 
Adopting the Quadratic Mean Process to Quantify the Qualitative Risk Analysis
Adopting the Quadratic Mean Process to Quantify the Qualitative Risk AnalysisAdopting the Quadratic Mean Process to Quantify the Qualitative Risk Analysis
Adopting the Quadratic Mean Process to Quantify the Qualitative Risk Analysis
Ricardo Viana Vargas
 
Risk management (final review)
Risk management (final review)Risk management (final review)
Risk management (final review)
Glen Alleman
 
Chapter 1 risk management (3)
Chapter 1  risk management (3)Chapter 1  risk management (3)
Chapter 1 risk management (3)
rafeeqameen
 

Similar to What is risk? (20)

Risk management of the performance measurement baseline
Risk management of the performance measurement baselineRisk management of the performance measurement baseline
Risk management of the performance measurement baseline
 
Building Risk Tolerance into the Program Plan and Schedule
Building Risk Tolerance into the Program Plan and ScheduleBuilding Risk Tolerance into the Program Plan and Schedule
Building Risk Tolerance into the Program Plan and Schedule
 
Increasing the Probability of Success with Continuous Risk Management
Increasing the Probability of Success with Continuous Risk ManagementIncreasing the Probability of Success with Continuous Risk Management
Increasing the Probability of Success with Continuous Risk Management
 
Managing in the presence of uncertainty
Managing in the presence of uncertaintyManaging in the presence of uncertainty
Managing in the presence of uncertainty
 
Risk Management
Risk ManagementRisk Management
Risk Management
 
Project/Program Risk management
Project/Program Risk managementProject/Program Risk management
Project/Program Risk management
 
Quantification of Risks in Project Management
Quantification of Risks in Project ManagementQuantification of Risks in Project Management
Quantification of Risks in Project Management
 
Practical Application Of System Safety For Performance Improvement
Practical Application Of System Safety For Performance ImprovementPractical Application Of System Safety For Performance Improvement
Practical Application Of System Safety For Performance Improvement
 
Control only.pdf
Control only.pdfControl only.pdf
Control only.pdf
 
Increasing the Probability of Success with Continuous Risk Management
Increasing the Probability of Success with Continuous Risk ManagementIncreasing the Probability of Success with Continuous Risk Management
Increasing the Probability of Success with Continuous Risk Management
 
Risk management(software engineering)
Risk management(software engineering)Risk management(software engineering)
Risk management(software engineering)
 
RISK TEMPLATE FORMATE GOOD-ALIU OLAB.pdf
RISK TEMPLATE FORMATE GOOD-ALIU OLAB.pdfRISK TEMPLATE FORMATE GOOD-ALIU OLAB.pdf
RISK TEMPLATE FORMATE GOOD-ALIU OLAB.pdf
 
Risk Management Methodologies in Construction Industries
Risk Management Methodologies in Construction IndustriesRisk Management Methodologies in Construction Industries
Risk Management Methodologies in Construction Industries
 
Arif Mammadov risk managment.pptx
Arif Mammadov risk managment.pptxArif Mammadov risk managment.pptx
Arif Mammadov risk managment.pptx
 
Managing cost and schedule risk
Managing cost and schedule riskManaging cost and schedule risk
Managing cost and schedule risk
 
Building a risk tolerant integrated master schedule
Building a risk tolerant integrated master scheduleBuilding a risk tolerant integrated master schedule
Building a risk tolerant integrated master schedule
 
Risk management
Risk managementRisk management
Risk management
 
Adopting the Quadratic Mean Process to Quantify the Qualitative Risk Analysis
Adopting the Quadratic Mean Process to Quantify the Qualitative Risk AnalysisAdopting the Quadratic Mean Process to Quantify the Qualitative Risk Analysis
Adopting the Quadratic Mean Process to Quantify the Qualitative Risk Analysis
 
Risk management (final review)
Risk management (final review)Risk management (final review)
Risk management (final review)
 
Chapter 1 risk management (3)
Chapter 1  risk management (3)Chapter 1  risk management (3)
Chapter 1 risk management (3)
 

More from Glen Alleman

Managing risk with deliverables planning
Managing risk with deliverables planningManaging risk with deliverables planning
Managing risk with deliverables planning
Glen Alleman
 
A Gentle Introduction to the IMP/IMS
A Gentle Introduction to the IMP/IMSA Gentle Introduction to the IMP/IMS
A Gentle Introduction to the IMP/IMS
Glen Alleman
 
Increasing the Probability of Project Success
Increasing the Probability of Project SuccessIncreasing the Probability of Project Success
Increasing the Probability of Project Success
Glen Alleman
 
Process Flow and Narrative for Agile+PPM
Process Flow and Narrative for Agile+PPMProcess Flow and Narrative for Agile+PPM
Process Flow and Narrative for Agile+PPM
Glen Alleman
 
Practices of risk management
Practices of risk managementPractices of risk management
Practices of risk management
Glen Alleman
 
Principles of Risk Management
Principles of Risk ManagementPrinciples of Risk Management
Principles of Risk Management
Glen Alleman
 
Deliverables Based Planning, PMBOK® and 5 Immutable Principles of Project Suc...
Deliverables Based Planning, PMBOK® and 5 Immutable Principles of Project Suc...Deliverables Based Planning, PMBOK® and 5 Immutable Principles of Project Suc...
Deliverables Based Planning, PMBOK® and 5 Immutable Principles of Project Suc...
Glen Alleman
 
From Principles to Strategies for Systems Engineering
From Principles to Strategies for Systems EngineeringFrom Principles to Strategies for Systems Engineering
From Principles to Strategies for Systems Engineering
Glen Alleman
 
NAVAIR Integrated Master Schedule Guide guide
NAVAIR Integrated Master Schedule Guide guideNAVAIR Integrated Master Schedule Guide guide
NAVAIR Integrated Master Schedule Guide guide
Glen Alleman
 
Building a Credible Performance Measurement Baseline
Building a Credible Performance Measurement BaselineBuilding a Credible Performance Measurement Baseline
Building a Credible Performance Measurement Baseline
Glen Alleman
 
Integrated master plan methodology (v2)
Integrated master plan methodology (v2)Integrated master plan methodology (v2)
Integrated master plan methodology (v2)
Glen Alleman
 
IMP / IMS Step by Step
IMP / IMS Step by StepIMP / IMS Step by Step
IMP / IMS Step by Step
Glen Alleman
 
DHS - Using functions points to estimate agile development programs (v2)
DHS - Using functions points to estimate agile development programs (v2)DHS - Using functions points to estimate agile development programs (v2)
DHS - Using functions points to estimate agile development programs (v2)
Glen Alleman
 
Making the impossible possible
Making the impossible possibleMaking the impossible possible
Making the impossible possible
Glen Alleman
 
Heliotropic Abundance
Heliotropic AbundanceHeliotropic Abundance
Heliotropic Abundance
Glen Alleman
 
Capabilities based planning
Capabilities based planningCapabilities based planning
Capabilities based planning
Glen Alleman
 
Process Flow and Narrative for Agile
Process Flow and Narrative for AgileProcess Flow and Narrative for Agile
Process Flow and Narrative for Agile
Glen Alleman
 
Building the Performance Measurement Baseline
Building the Performance Measurement BaselineBuilding the Performance Measurement Baseline
Building the Performance Measurement Baseline
Glen Alleman
 
Program Management Office Lean Software Development and Six Sigma
Program Management Office Lean Software Development and Six SigmaProgram Management Office Lean Software Development and Six Sigma
Program Management Office Lean Software Development and Six Sigma
Glen Alleman
 
Policy and Procedure Rollout
Policy and Procedure RolloutPolicy and Procedure Rollout
Policy and Procedure Rollout
Glen Alleman
 

More from Glen Alleman (20)

Managing risk with deliverables planning
Managing risk with deliverables planningManaging risk with deliverables planning
Managing risk with deliverables planning
 
A Gentle Introduction to the IMP/IMS
A Gentle Introduction to the IMP/IMSA Gentle Introduction to the IMP/IMS
A Gentle Introduction to the IMP/IMS
 
Increasing the Probability of Project Success
Increasing the Probability of Project SuccessIncreasing the Probability of Project Success
Increasing the Probability of Project Success
 
Process Flow and Narrative for Agile+PPM
Process Flow and Narrative for Agile+PPMProcess Flow and Narrative for Agile+PPM
Process Flow and Narrative for Agile+PPM
 
Practices of risk management
Practices of risk managementPractices of risk management
Practices of risk management
 
Principles of Risk Management
Principles of Risk ManagementPrinciples of Risk Management
Principles of Risk Management
 
Deliverables Based Planning, PMBOK® and 5 Immutable Principles of Project Suc...
Deliverables Based Planning, PMBOK® and 5 Immutable Principles of Project Suc...Deliverables Based Planning, PMBOK® and 5 Immutable Principles of Project Suc...
Deliverables Based Planning, PMBOK® and 5 Immutable Principles of Project Suc...
 
From Principles to Strategies for Systems Engineering
From Principles to Strategies for Systems EngineeringFrom Principles to Strategies for Systems Engineering
From Principles to Strategies for Systems Engineering
 
NAVAIR Integrated Master Schedule Guide guide
NAVAIR Integrated Master Schedule Guide guideNAVAIR Integrated Master Schedule Guide guide
NAVAIR Integrated Master Schedule Guide guide
 
Building a Credible Performance Measurement Baseline
Building a Credible Performance Measurement BaselineBuilding a Credible Performance Measurement Baseline
Building a Credible Performance Measurement Baseline
 
Integrated master plan methodology (v2)
Integrated master plan methodology (v2)Integrated master plan methodology (v2)
Integrated master plan methodology (v2)
 
IMP / IMS Step by Step
IMP / IMS Step by StepIMP / IMS Step by Step
IMP / IMS Step by Step
 
DHS - Using functions points to estimate agile development programs (v2)
DHS - Using functions points to estimate agile development programs (v2)DHS - Using functions points to estimate agile development programs (v2)
DHS - Using functions points to estimate agile development programs (v2)
 
Making the impossible possible
Making the impossible possibleMaking the impossible possible
Making the impossible possible
 
Heliotropic Abundance
Heliotropic AbundanceHeliotropic Abundance
Heliotropic Abundance
 
Capabilities based planning
Capabilities based planningCapabilities based planning
Capabilities based planning
 
Process Flow and Narrative for Agile
Process Flow and Narrative for AgileProcess Flow and Narrative for Agile
Process Flow and Narrative for Agile
 
Building the Performance Measurement Baseline
Building the Performance Measurement BaselineBuilding the Performance Measurement Baseline
Building the Performance Measurement Baseline
 
Program Management Office Lean Software Development and Six Sigma
Program Management Office Lean Software Development and Six SigmaProgram Management Office Lean Software Development and Six Sigma
Program Management Office Lean Software Development and Six Sigma
 
Policy and Procedure Rollout
Policy and Procedure RolloutPolicy and Procedure Rollout
Policy and Procedure Rollout
 

Recently uploaded

Generative AI Deep Dive: Advancing from Proof of Concept to Production
Generative AI Deep Dive: Advancing from Proof of Concept to ProductionGenerative AI Deep Dive: Advancing from Proof of Concept to Production
Generative AI Deep Dive: Advancing from Proof of Concept to Production
Aggregage
 
How to Get CNIC Information System with Paksim Ga.pptx
How to Get CNIC Information System with Paksim Ga.pptxHow to Get CNIC Information System with Paksim Ga.pptx
How to Get CNIC Information System with Paksim Ga.pptx
danishmna97
 
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
GraphSummit Singapore | Graphing Success: Revolutionising Organisational Stru...
Neo4j
 
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdfFIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
FIDO Alliance Osaka Seminar: The WebAuthn API and Discoverable Credentials.pdf
FIDO Alliance
 
Microsoft - Power Platform_G.Aspiotis.pdf
Microsoft - Power Platform_G.Aspiotis.pdfMicrosoft - Power Platform_G.Aspiotis.pdf
Microsoft - Power Platform_G.Aspiotis.pdf
Uni Systems S.M.S.A.
 
What is Risk?

What is Risk?
Glen B. Alleman, Thomas J. Coonce, and Rick A. Price

Cost and schedule growth for federal programs is created by unrealistic technical performance expectations, unrealistic cost and schedule estimates, inadequate risk assessments, unanticipated technical issues, and poorly performed and ineffective risk management, all contributing to program technical and programmatic shortfalls, shown in Figure 1 and documented in a wide range of sources. [12], [13], [16], [18], [25], [30], [31], [38], [57], [58], [59], [62], [67], [69]

Figure 1 ‒ Four summary sources of program cost and schedule overruns and technical shortfalls, from a more detailed assessment of the root causes of program cost, schedule, and performance shortfalls [32].

Risk is the effect of uncertainty on objectives. Uncertainty is a state or condition that involves a deficiency of information and leads to inadequate or incomplete knowledge or understanding. In the context of risk management, uncertainty exists whenever the knowledge or understanding of an event, consequence, or likelihood is inadequate or incomplete.
‒ ISO 31000:2009, ISO 17666:2016, and ISO 11231:2010

Risk is Uncertainty that Matters. [29]

Risk is the potential consequence of a specific outcome that affects the system's ability to meet cost, schedule, and/or technical objectives. Risk has three primary components: [32]
- Probability of the activity or event occurring or not occurring, described by a Probability Distribution Function.
- Consequence or effect resulting from the activity or event occurring or not occurring, described by a Probability Distribution Function.
- Root Cause (condition and activity) of a future outcome, which, when reduced or eliminated, will prevent occurrence, non‒occurrence, or recurrence of the cause of the risk.
For the program manager, there are three risk categories that must be identified and handled:
- Technical ‒ risks that may prevent the end item from performing as intended or from meeting performance expectations. Measures of Effectiveness, Measures of Performance, Technical Performance Measures, and Key Performance Parameters describe the measures of these expectations.
- Programmatic ‒ risks that affect the cost and schedule measures of the program. Programmatic risks are within the control or influence of Program Management or the Program Executive Office, through managerial actions applied to the work activities contained in the Integrated Master Schedule. [17]
- Business ‒ risks that originate outside the program office and are not within the control or influence of the Program Manager or Program Executive Office.
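The three components and three categories above can be captured in a minimal risk-register sketch. This is an illustrative model, not from the paper: the class name, fields, and numbers are invented, and a point probability stands in for the full Probability Distribution Function the authors call for.

```python
import random
from dataclasses import dataclass

@dataclass
class Risk:
    name: str
    category: str             # "Technical", "Programmatic", or "Business"
    root_cause: str           # condition and activity producing the risk
    p_occur: float            # point probability of occurrence (the paper calls
                              # for a full Probability Distribution Function)
    consequence_days: tuple   # (min, most_likely, max) schedule impact PDF

    def expected_impact(self, trials: int = 10_000) -> float:
        """Monte Carlo estimate of the expected schedule impact in days."""
        lo, ml, hi = self.consequence_days
        total = 0.0
        for _ in range(trials):
            if random.random() < self.p_occur:
                total += random.triangular(lo, hi, ml)  # mode is the 3rd argument
        return total / trials

random.seed(1)
r = Risk("Late subcontractor data", "Programmatic",
         "Subcontractor status reporting is incomplete",
         p_occur=0.30, consequence_days=(5, 10, 25))
print(round(r.expected_impact(), 1))   # ~ p_occur x mean of the triangular PDF
```

A real register would hold one such entry per identified risk, each tied to handling actions in the IMS.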
Uncertainty comes from the lack of information needed to describe a current state or to predict future states, preferred outcomes, or the actions needed to achieve them. [51], [70] This uncertainty can originate from the naturally (randomly) occurring processes of the program (Aleatory Uncertainty), or from the lack of knowledge about the future outcomes of the work on the program (Epistemic Uncertainty). [37], [60]

Risks, Their Sources, and Their Handling Strategies

Risk identification during the early design phases of complex systems is commonly implemented but often fails to identify events and circumstances that challenge program performance. Inefficiencies in cost and schedule estimates are usually held accountable for cost and schedule overruns, but the true root cause is often the realization of programmatic risks. A deeper understanding of the risk identification trends and biases pervasive during system design and development is needed, for it would lead to improved execution of existing identification processes and methods. [52], [53], [54]

Risk management means building a model of the risk, of the impact of the risk on the program, and of the handling of the risk ‒ since it is a risk, the corrective or preventive action has not occurred yet. [19], [35], [46], [60], [66] Probabilistic Risk Assessment (PRA) is the basis of these models and provides the Probability of Program Success. Probabilities result from uncertainty and are central to the analysis of the risk: scenarios and model assumptions, with model parameters based on current knowledge of the behavior of the system, describe the system under a given set of uncertainty conditions. [43] The sources of uncertainty must be identified and characterized, and their impact on program success modeled and understood, so decisions can be made about the corrective and preventive actions needed to increase the Probability of Program Success.
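A PRA-style roll-up can be sketched in a few lines. This is a hedged illustration with invented risks and probabilities, and it assumes the risks are independent, which real assessments must verify:

```python
# Illustrative PRA-style roll-up: the Probability of Program Success is the
# probability of surviving every identified risk event (independence assumed).
risks = {"interface rework": 0.20, "vendor slip": 0.15, "test failure": 0.10}

p_success = 1.0
for name, p_occur in risks.items():
    p_success *= (1.0 - p_occur)   # survive this risk

print(round(p_success, 3))  # 0.80 * 0.85 * 0.90 = 0.612
```

Even modest individual probabilities compound: three 10-20% risks already pull the program below a 62% chance of success, which is why the sources of uncertainty must be modeled explicitly.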
Since risk is the outcome of Uncertainty, distinguishing between the types of uncertainty in the definition and management of risk on complex systems is useful when building risk assessment and management models. [8], [25], [27], [35], [43], [71]
- Epistemic uncertainty ‒ from the Greek επιστήμη (episteme), meaning knowledge ‒ is uncertainty due to a lack of knowledge of quantities or processes of the system or the environment. Epistemic uncertainty is represented by ranges of values for parameters, a range of workable models, the level of model detail, multiple expert interpretations, and statistical confidence. It derives from a lack of knowledge about the appropriate value to use for a quantity that is assumed to have a fixed value in the context of a particular analysis. The accumulation of information and the implementation of actions reduce epistemic uncertainty, eliminating or reducing the likelihood and/or impact of the risk. This uncertainty is modeled as a subjective assessment of the probability of our knowledge and of the occurrence of an undesirable event. Incomplete knowledge about some characteristics of the system or its environment is the primary source of Epistemic uncertainty.
- Aleatory uncertainty ‒ from the Latin alea (a single die) ‒ is the inherent variation associated with a physical system or the environment. Aleatory uncertainty arises from inherent randomness, natural stochasticity, and environmental or structural variation across space and time in the properties or behavior of the system under study. [1] The accumulation of more data or additional information cannot reduce aleatory uncertainty. This uncertainty is modeled as a stochastic process of an inherently random physical model. The projected impact of the risk produced by Aleatory uncertainty is managed through cost, schedule, and/or technical margin.
Naturally occurring variations associated with the physical system are the primary sources of Aleatory uncertainty.

There is a third uncertainty found on some programs in the Federal Systems community that is not addressed by this White Paper, since this uncertainty is not correctable:
- Ontological Uncertainty ‒ is attributable to the complete lack of knowledge of the states of a system. This is sometimes labeled an Unknowable Risk. Ontological uncertainty cannot be measured directly; it creates risk from inherent variations and incomplete information that are not knowable.

Separating Aleatory and Epistemic Uncertainty for Risk Management

Aleatory is mentioned 55 times and Epistemic 120 times in NASA/SP-2011-3421, Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners. [42] Knowing the percentage of reducible uncertainty versus irreducible uncertainty is needed to construct a credible risk model. Without this separation ‒ without knowing which uncertainty is reducible and which is irreducible ‒ the design of the corrective and preventive actions needed to increase the probability of program success is inhibited.
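The separation can be sketched numerically. In this hedged illustration (all durations and probabilities are invented), aleatory variation is an irreducible duration PDF that only margin can absorb, while the probability of an epistemic risk event can be bought down by corrective action:

```python
import random

def simulate_duration(p_epistemic: float, trials: int = 20_000) -> float:
    """Mean task duration (days) under both uncertainty types."""
    random.seed(42)                            # common random numbers for comparison
    total = 0.0
    for _ in range(trials):
        d = random.triangular(18, 30, 20)      # aleatory: natural variation, irreducible
        if random.random() < p_epistemic:      # epistemic: hypothetical interface defect
            d += 15                            # rework when the risk is realized
        total += d
    return total / trials

before = simulate_duration(0.40)   # no handling plan in place
after = simulate_duration(0.05)    # corrective action bought down the probability
print(round(before - after, 1))    # days of exposure removed by handling the risk
```

Note that the triangular spread remains in both runs: no amount of knowledge removes it, which is why it must be covered by margin rather than by corrective action.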
Separating the types of uncertainty serves to increase the clarity of risk communication, making it clear which type of uncertainty can be reduced and which cannot. For the latter (irreducible risk), only margin can be used to protect the program from the uncertainty. [9], [37]

As uncertainty increases, the ability to precisely measure the uncertainty is reduced to the point where a direct estimate of the risk can no longer be assessed through a mathematical model. While a decision in the presence of uncertainty must still be made, deep uncertainty and poorly characterized risks lead to the absence of data and risk models in the Defense and Space Acquisition domain. [39], [42]

Epistemic Uncertainty Creates Reducible Risk

The risk created by Epistemic Uncertainty represents resolvable knowledge, with elements expressed as probabilistic uncertainty of a future value related to a loss in a future period of time. [3], [72] Awareness of this lack of knowledge provides the opportunity to reduce the uncertainty through direct corrective or preventive actions. [36]

Epistemic uncertainty, and the risk it creates, is modeled by defining the probability that the risk will occur, the time frame in which that probability is active, the probability of an impact or consequence from the risk when it does occur, and finally, the probability of the residual risk after the handling of that risk has been applied.

Epistemic uncertainty statements define and model these event‒based risks:
- If‒Then ‒ if we miss our next milestone, then the program will fail to achieve its business value during the next quarter.
- Condition‒Concern ‒ our subcontractor has not provided enough information for us to status the schedule, and our concern is that the schedule is slipping and we do not know it.
- Condition‒Event‒Consequence ‒ our status shows there are some tasks behind schedule, so we could miss our milestone, and the program will fail to achieve its business value in the next quarter.
For these types of risks, an explicit or implicit risk handling plan is needed. The word handling is used with special purpose: we handle risks in a variety of ways, and mitigation is one of those ways. In order to mitigate the risk, new effort (work) must be introduced into the schedule. We are buying down the risk, or retiring the risk, by spending money and/or consuming time to reduce the probability of the risk occurring. Or we could be spending money and consuming time to reduce the impact of the risk when it does occur. In both cases, actions are taken to address the risk.

Reducible Cost Risk

Reducible cost risk is often associated with unidentified reducible Technical risks, and with changes in technical requirements and their propagation that impact cost. [7] Understanding the uncertainty in cost estimates supports decision making for setting targets and contingencies, risk treatment planning, and the selection of options in the management of program costs. Before cost risk can be reduced, the cost structure must be understood. Cost risk analysis goes beyond capturing the cost of WBS elements in the Basis of Estimate and the Cost Estimating Relationships. It involves:
- Development of quantitative modeling of integrated cost and schedule, incorporating the drivers of reducible uncertainty in quantities, rates, and productivities, and the recording of these drivers in the Risk Register.
- Determining how cost and schedule uncertainty can be integrated in the analysis of the cost risk model.
- Performing sensitivity analysis to provide understanding of the effects of reducible uncertainty and the allocation of contingency amounts across the program.

Reducible Schedule Risk

While there is significant variability, for every 10% in Schedule Growth there is a corresponding 12% Cost Growth.
[10]

Schedule Risk Analysis (SRA) is an effective technique to connect the risk information of program activities to the baseline schedule, providing sensitivity information on individual program activities to assess the potential impact of uncertainty on the final program duration and cost.

Schedule risk assessment is performed in 4 steps:
1. Baseline Schedule ‒ Construct a credible activity network compliant with GAO‒16‒89G, "Schedule Assessment Guide: Best Practices for Project Schedules."
2. Define Reducible Uncertainties ‒ for activity durations and cost distributions from the Risk Register, and assign these to the work activities affected by the risk and/or the work activities assigned to reduce the risk.
3. Run Monte Carlo Simulations ‒ for the schedule, using the assigned Probability Distribution Functions (PDFs), with the Min/Max values of the distribution, for each work activity in the IMS.
4. Interpret Simulation Results ‒ using data produced by the Monte Carlo Simulation, including at least: [61]
- Criticality Index (CI): Measures the probability that an activity is on the critical path. [14], [15]
- Significance Index (SI): Measures the relative importance of an activity. [63], [64]
- Schedule Sensitivity Index (SSI): Measures the relative importance of an activity taking the CI into account. [49], [63], [64]
- Cruciality Index (CRI): Measures the correlation between the activity duration and the total program duration. [63], [64]

Reducible Technical Risk

Technical risk is the impact on a program, system, or entire infrastructure when the outcomes from engineering development do not work as expected, do not provide the needed technical performance, or create higher than planned risk to the performance of the system. Failure to identify or properly manage this technical risk results in performance degradation, security breaches, system failures, increased maintenance time, a significant amount of technical debt, 1 and additional cost and time for end item delivery of the program.

Reducible Cost Estimating Risk

Reducible cost estimating risk depends on the technical, schedule, and programmatic risks, which must be assessed to provide an accurate picture of program cost. Cost risk estimating assessment addresses the cost, schedule, and technical risks that impact the cost estimate. To quantify the cost impacts of these reducible risks, the sources of risk need to be identified. This assessment is concerned with three sources of risk, and ensures that the model calculating the cost also accounts for these risks: [23]
- The risk inherent in the cost estimating method: the Standard Error of the Estimate (SEE), confidence intervals, and prediction intervals.
- The risk inherent in the technical and programmatic processes: the technology's maturity, design and engineering, integration, manufacturing, schedule, and complexity. [68]
- The risk inherent in the correlation between WBS elements, which determines to what degree one WBS element's change in cost is related to another's, and in which direction. WBS elements within federal systems have positive correlations with each other, and the cumulative effect of this positive correlation increases the range of the costs. 2

Unidentified reducible Technical Risks are often associated with reducible Cost and Schedule risk.

Aleatory Uncertainty Creates Irreducible Risk

Aleatory uncertainty, and the risk it creates, comes not from the lack of information, but from the naturally occurring processes of the system. For aleatory uncertainty, more information cannot be bought, nor can specific risk reduction actions be taken to reduce the uncertainty and the resulting risk. The objective of identifying and managing aleatory uncertainty is to be prepared to handle the impacts when a risk is realized. The method for handling these impacts is to provide margin for this type of risk, including cost, schedule, and technical margin.

Using the NASA definition, Margin is the difference between the maximum possible value and the maximum expected value, and is separate from Contingency. Contingency is the difference between the current best estimate and the maximum expected estimate. For systems under development, the technical resources and the technical performance values carry both margin and contingency.

Schedule Margin should be used to cover the naturally occurring variances in how long it takes to do the work. Cost Margin is held to cover the naturally occurring variances in the price of something being consumed in the program. Technical Margin is intended to cover the naturally occurring variation of technical products.
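Steps 3 and 4 of the Schedule Risk Analysis described earlier can be sketched with a toy network. Everything here is illustrative (the three activities and their PDFs are invented): the sketch runs a Monte Carlo over a two-path merge and estimates a Criticality Index and an 80th-percentile finish date.

```python
import random

random.seed(7)
TRIALS = 20_000
ci_a = 0                                # trials where the path through A is critical
durations = []
for _ in range(TRIALS):
    a = random.triangular(8, 16, 10)    # activity A (days): min, max, most likely
    b = random.triangular(9, 14, 11)    # activity B, in parallel with A
    c = random.triangular(4, 8, 5)      # activity C, after both A and B
    finish = max(a, b) + c
    durations.append(finish)
    if a >= b:                          # A drives the merge point this trial
        ci_a += 1
durations.sort()
p80 = durations[int(0.8 * TRIALS)]      # 80th-percentile program finish
print(f"CI(A) = {ci_a / TRIALS:.2f}, P80 finish = {p80:.1f} days")
```

A full SRA would run this over every activity in the IMS and also report the SI, SSI, and CRI measures listed above.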
Aleatory uncertainty and the resulting risk are modeled with a Probability Distribution Function (PDF) that describes the possible values the process can take and the probability of each value. The PDF for the possible durations of the work in the program can be determined. Knowledge can be bought about the aleatory uncertainty

1 Technical debt is a term used in the Agile community to describe the eventual consequences of deferring complexity or implementing incomplete changes. As a development team progresses, there may be additional coordinated work that must be addressed in other places in the schedule. When these changes do not get addressed within the planned time, or get deferred for later integration, the program accumulates a debt that must be paid at some point in the future.
2 The use of a Design Structure Matrix provides visibility into, and modeling of, these dependencies. Many models consider the dependencies as static or fixed at some value, but the dependencies are dynamic, driven by nonstationary stochastic processes that evolve as the program evolves.
through Reference Class Forecasting and past performance modeling. [11], [21], [71] This new information allows us to update ‒ adjust ‒ our model: past performance on similar work provides information about our future performance. But the underlying process is still random, and the new information simply creates a new aleatory uncertainty PDF.

The first step in handling Irreducible Uncertainty is the creation of Margin: Schedule margin, Cost margin, and Technical Margin to protect the program from the risk of irreducible uncertainty. Margin is defined as the allowance in budget, programmed schedule … to account for uncertainties and risks. [44] Margin needs to be quantified by:
- Identifying the WBS elements that contribute to margin.
- Identifying the uncertainty and risk that contribute to margin.

Irreducible Schedule Risk

Programs are over budget and behind schedule, to some extent because uncertainties are not accounted for in schedule estimates. Research and practice are now addressing this problem, often by using Monte Carlo methods to simulate the effect of variances in work package costs and durations on total cost and date of completion. However, many such program risk approaches ignore the significant impact of probabilistic correlation on work package cost and duration predictions. [5], [33]

Irreducible schedule risk is handled with Schedule Margin, which is defined as the amount of added time needed to achieve a significant event with an acceptable probability of success. [2] Significant events are major contractual milestones or deliverables. [48] With minimal or no margins in schedule, technical, or cost present to deal with unanticipated risks, successful acquisition is susceptible to cost growth and cost overruns.

The Program Manager owns the schedule margin. It does not belong to the client, nor can it be negotiated away by the business management team or the customer.
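Sizing schedule margin from a Monte Carlo, as described above, can be sketched as the gap between a chosen confidence level of the simulated finish and the deterministic plan. The tasks, PDFs, and the 80% confidence target are invented for illustration:

```python
import random

random.seed(11)
tasks = [(9, 20, 12), (4, 10, 5), (14, 28, 18)]    # (min, max, most_likely) days
TRIALS = 20_000
totals = sorted(
    sum(random.triangular(lo, hi, ml) for lo, hi, ml in tasks)
    for _ in range(TRIALS)
)
deterministic = sum(ml for _, _, ml in tasks)       # 35 days; ignores variation
p80 = totals[int(0.8 * TRIALS)]                     # 80% confidence finish date
margin = p80 - deterministic                        # schedule margin task duration
print(f"deterministic={deterministic} days, P80={p80:.1f}, margin={margin:.1f} days")
```

The margin value would become the duration of the unbudgeted margin task placed in front of the protected milestone in the IMS.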
This is the primary reason to CLEARLY identify the Schedule Margin in the Integrated Master Schedule. [17] It is there to protect the program deliverable(s). Schedule margin is not allocated to over‒running tasks; rather, it is planned to protect the end item deliverables.

The schedule margin should protect the delivery date of major contract events or deliverables. This is done with a Task in the IMS that has no budget (BCWS). The duration of this Task is derived from Reference Classes or from a Monte Carlo Simulation of the aleatory uncertainty that creates risk to the event or deliverable. [11], [47] The IMS, with schedule margin to protect against the impact of aleatory uncertainty, represents the most likely and realistic risk‒based plan to deliver the needed capabilities of the program.

Irreducible Technical Risk

The last 10 percent of the technical performance generates one‒third of the cost and two‒thirds of the problems.
‒ Norman Augustine's 15th Law [6]

Using the NASA definitions ‒ Margin is the difference between the maximum possible value and the maximum expected value, and Contingency is the difference between the current best estimate and the maximum expected estimate ‒ then for systems under development, the technical outcome and technical performance values carry both margin and contingency.

Technical Margin and Contingency serve several purposes:
- Describe the need for and use of resource margins and contingency in system development.
- Define and distinguish between margins and contingency.
- Demonstrate that, historically, resource estimates grow as designs mature.
- Provide a representative margin depletion table showing prudent resource contingency as a function of program phase.

For any system, in any stage of its development, there is a maximum possible, a maximum expected, and a current best estimate for every technical outcome.
The current best estimate of a technical performance measure changes as the development team improves the design and the understanding of that design matures. For a system in development, most technical outcomes should carry both margin and contingency. As designs mature, the estimate of any technical resource usually grows. This is true historically and, independent of exactly why, development programs must plan for it to occur. [41]
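The NASA-style margin and contingency definitions quoted above reduce to simple arithmetic. The mass values here are invented for illustration:

```python
# Hedged sketch of the margin/contingency arithmetic (values are illustrative):
# margin      = maximum possible  - maximum expected   (covers unplanned growth)
# contingency = maximum expected  - current best estimate (covers expected growth)
current_best_estimate = 950.0   # kg, mass of the current design
maximum_expected = 1050.0       # kg, estimate grown for expected design maturation
maximum_possible = 1150.0       # kg, e.g. a launch-vehicle capability limit

contingency = maximum_expected - current_best_estimate
margin = maximum_possible - maximum_expected

print(f"contingency = {contingency} kg, margin = {margin} kg")
```

As the design matures and the current best estimate grows toward the maximum expected value, contingency is consumed; margin is held against growth beyond that.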
The goal of Technical Margin (unlike Cost and Schedule Margin) is to reduce the margins (for example, Size, Weight, and Power) to as close to zero as possible, to maximize mission capabilities. Technical growth and its handling include:
- Expected technical growth ‒ contingency accounts for expected growth:
  - Recognize that mass growth is historically inevitable.
  - As systems mature through their development life cycle, a better understanding of the design emerges, from conceptual to actual designs.
  - Requirements changes often increase resource use.
- Unplanned technical growth ‒ margins account for unexpected growth:
  - Recognize that all system development is challenging.
  - Programs encounter "unknown unknowns" with the use of new technology that is difficult to gauge; these develop into uncertainties in the design and execution of the program, all the way to manufacturing variations.

Ontological Uncertainty

On the scale of uncertainty ‒ from random naturally occurring processes (aleatory) to the lack of knowledge (epistemic) ‒ Ontological uncertainty lies at the far end of this continuum and is a state of complete ignorance. Not only are the uncertainties not known, the uncertainties may not be knowable. While the truth is out there, 3 it cannot be accessed because it is simply not known where to look in the first instance. NASA calls this operating outside the experience base, where things are done for the first time and operate in a state of ignorance.

Management of uncertainty is the critical success factor of effective program management. Complex programs, and the organizations that deliver them, can involve people with different genders, personality types, cultures, first languages, social concerns, and/or work experiences. Such differences can lead to ontological uncertainty and semantic uncertainty.
Ontological uncertainty involves different parties in the same interactions having different conceptualizations about what kinds of entities inhabit their world, what kinds of interactions these entities have, and how the entities and their interactions change as a result of these interactions. Semantic uncertainty involves different participants in the same interactions giving different meanings to the same term, phrase, and/or actions. Ontological uncertainty and semantic uncertainty can lead to intractable misunderstandings between people. [22], [55]

When new technologies are introduced into these complex organizations, concepts, principles, and techniques may be fundamentally unfamiliar and carry a higher degree of ontological uncertainty than more mature technologies. A subtler problem is that these uncertainties are often implicit rather than explicit, and become an invisible and unquestioned part of the system.

Since epistemic and ontological uncertainty represent our lack of knowledge, reducing the risks produced by this uncertainty requires improvement in our knowledge of the system of interest, or avoidance of situations that increase these types of uncertainty. To reduce Epistemic and Ontological uncertainty there must be a reduction in the uncertainty of the model of system behavior (ontology) or in the model's parameters (epistemology).

Call To Action

A great deal of work has been done on improving risk identification and analysis. The knowledge created by these efforts provides a more complete list of risks and better estimates of their likelihood and consequences, but the fruits of this effort have NOT been powerful enough to change the written guidance of DOD and other agencies.

Successful risk management can be applied to increase the probability of program success, but only if it can be assured that:
- The models of Aleatory and Epistemic uncertainty properly represent the actual uncertainties on the program.
 The corrective and preventive actions for the risks created by Epistemic uncertainty, and the margins used for the Aleatory uncertainties, will reduce risk as well as increase the probability of program success.
In 2002, Claude Bolton 4 said there are still too many surprises, using traditional metrics, for poorly performing and failing programs, in a briefing to Army senior leadership. [28] Past efforts to build dashboards and provide guidance to the services (USAF, Navy, and Army) have fallen into disuse.
3 Apologies to Mulder and Scully and the X‒Files.
4 Retired United States Air Force Major General who served as United States Assistant Secretary of the Army for Acquisition, Logistics, and Technology from 2002 to 2008.
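The first assurance above ‒ that the models of Aleatory and Epistemic uncertainty represent the actual program uncertainties ‒ can be illustrated with a double-loop (nested) Monte Carlo sketch in the spirit of the double Monte Carlo method of [1]: the outer loop samples the epistemic parameter (an uncertain mean task duration), and the inner loop samples the aleatory variability around it. All distributions and numbers here are hypothetical illustrations, not values from this paper.

```python
import random
import statistics

random.seed(1)

# Epistemic uncertainty: the true mean task duration is not known;
# elicitation gives only a range (hypothetical numbers, in days).
MEAN_LOW, MEAN_HIGH = 20.0, 30.0

# Aleatory uncertainty: natural variability around the (unknown) mean,
# modeled here with a normal spread for simplicity (hypothetical).
SIGMA = 4.0

def simulate_completion(n_outer=200, n_inner=500):
    """Double-loop Monte Carlo: the outer loop draws an epistemic
    parameter, the inner loop draws aleatory variability around it.
    Returns one 85th-percentile completion estimate per outer draw."""
    p85_estimates = []
    for _ in range(n_outer):
        true_mean = random.uniform(MEAN_LOW, MEAN_HIGH)    # epistemic draw
        durations = sorted(
            max(0.0, random.gauss(true_mean, SIGMA))        # aleatory draws
            for _ in range(n_inner)
        )
        p85_estimates.append(durations[int(0.85 * n_inner)])
    return p85_estimates

estimates = simulate_completion()
# The spread of the P85 estimates across outer draws is driven by
# epistemic uncertainty; the gap above the mean within one outer draw
# is the margin needed against aleatory variability.
print(f"P85 range across epistemic samples: "
      f"{min(estimates):.1f} .. {max(estimates):.1f} days")
print(f"mean P85: {statistics.mean(estimates):.1f} days")
```

Collapsing the two loops into one would hide which part of the spread can be reduced by buying knowledge (epistemic) and which part can only be protected with margin (aleatory).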
Two critical steps start the path toward increasing the probability of program success: 3. Fix the budgeting process, and 6. Align the Technical Plan with the Budget.
Table 1 ‒ Corrective and preventive actions for programmatic root causes reducing the probability of program success
1. Cause: Government fails to define what Done looks like, in units of measure meaningful to the decision makers, prior to RFP.
   Effect: Systems are delivered that don’t provide the desired capabilities, arrive too late, and cost more than budgeted.
   Corrective Action: Government defines the MoEs for the program prior to issuing the RFP.
2. Cause: Different participants in the acquisition process impose conflicting demands on programs. [24]
   Effect: Participants do what is in their own best interest versus the best interest of technical, cost, and schedule performance for the program.
   Corrective Action: Agreed-upon MoPs defined at IBR for each program deliverable.
3. Cause: Budget processes force funding decisions to be made in advance of program decisions. [12], [69]
   Effect: The budget process encourages undue optimism that creates program risk for cost, schedule, and technical performance.
   Corrective Action: Government provides a risk-adjusted, cost-loaded Integrated Master Plan prior to Milestone B/KDP‒B.
4. Cause: Program managers’ short tenures, limitations in experience, and lack of formal training. [4]
   Effect: The program manager focuses on minimizing risk to Keep the Program Sold and on creating the opportunity for career advancement.
   Corrective Action: Program Manager assigned from start of development (KDP‒B) to completion.
Table 2 ‒ Corrective and preventive actions for cost, schedule, and technical root causes reducing the probability of program success
A fundamental responsibility in the acquisition of major systems is to ensure that visibility of progress is sufficient to reliably assess the results being obtained.
[56] Without the integration of the risk-adjusted cost, schedule, and technical performance measures, and the preventive and corrective actions needed to reduce or eliminate the impact of these risks, the probability of program success is not likely to meet the needs of those paying for the outcomes.
5. Cause: Schedule and Cost Margin not assigned to protect key contractual deliverables or end items. [10], [34]
   Effect: Lack of cost and schedule margin causes the program to under-deliver technically, go over budget, and deliver late.
   Corrective Action: Maintain the required schedule and cost margin to protect key contractual deliverables.
6. Cause: Technical Plan not aligned with Budget and Spend Plan, with missing Schedule Margin and risk buy-down plan. [69]
   Effect: Lack of integrated cost, schedule, and technical progress gives wrong information about future performance.
   Corrective Action: Develop a resource-loaded Integrated Master Schedule, aligned with the Technical Plan and Budget, adjusted with Margin at the IBR.
7. Cause: Alternative Points of Integration not considered in the IMS when risk becomes an issue. [50]
   Effect: No Plan B means the Probability of Success is reduced.
   Corrective Action: Have a Plan B on baseline in the form of alternative points of integration.
8. Cause: Cost, Schedule, and Technical Uncertainties impact the Probability of Program Success without handling plans. [26]
   Effect: Without a credible risk strategy, there is no credible plan for success.
   Corrective Action: Develop risk models for integrated cost, schedule, and technical performance.
9. Cause: Missing Resource-Loaded Schedule to assure staff, facilities, and other resources are available when needed. [45]
   Effect: If resources are not available when needed, there is no credible plan for success.
   Corrective Action: Develop and maintain a resource-loaded schedule.
10. Cause: Propagation of Aleatory and Epistemic Uncertainty not considered in the risk profile or handling plans. [2], [20], [40], [65]
   Effect: Without understanding the dynamics of risk and its propagation, there is no credible plan to Keep the Program Green.
   Corrective Action: Model the network of activities to reveal risk propagation using a DSM.
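The last corrective action above calls for modeling the network of activities with a Design Structure Matrix (DSM) to reveal risk propagation. A minimal sketch of that idea follows; the activities, coupling strengths, and local risk values are hypothetical, chosen only to show how a modest upstream risk compounds downstream.

```python
# Minimal DSM risk-propagation sketch (all values hypothetical).
# DSM[i][j] = strength with which a risk realized in activity j
# propagates to activity i.
ACTIVITIES = ["Design", "Software", "Integration", "Test"]
DSM = [
    # Design  Software  Integration  Test   (source activity j)
    [0.0,     0.0,      0.0,         0.0],  # Design
    [0.6,     0.0,      0.0,         0.0],  # Software depends on Design
    [0.3,     0.5,      0.0,         0.0],  # Integration depends on both
    [0.0,     0.2,      0.7,         0.0],  # Test depends on SW and Integration
]

# Local (un-propagated) risk exposure of each activity, on a 0..1 scale.
local_risk = [0.30, 0.10, 0.05, 0.05]

def propagate(dsm, local, rounds=10):
    """Iteratively add upstream contributions: an activity's exposure is
    its local risk plus the coupling-weighted exposure of its predecessors,
    capped at 1.0. Converges quickly for an acyclic DSM."""
    exposure = list(local)
    for _ in range(rounds):
        exposure = [
            min(1.0, local[i] + sum(dsm[i][j] * exposure[j]
                                    for j in range(len(local))))
            for i in range(len(local))
        ]
    return exposure

for name, e in zip(ACTIVITIES, propagate(DSM, local_risk)):
    print(f"{name:12s} total exposure: {e:.2f}")
```

In this toy network, Test carries little local risk yet ends with the highest total exposure, because it inherits the Design risk through two paths ‒ exactly the visibility the DSM-based corrective action is meant to provide.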
Bibliography
[1] Ali, T., Boruah, H., and Dutta, P., “Modeling Uncertainty in Risk Assessment Using Double Monte Carlo Method,” International Journal of Engineering and Innovative Technology (IJEIT), Volume 1, Issue 4, April 2012.
[2] Alleman, G. B., Coonce, T. J., and Price, R. A., “Building a Credible Performance Measurement Baseline,” The Measurable News, 2014, Issue 4.
[3] Apeland, S., Aven, T., and Nilsen, T., “Quantifying Uncertainty under a Predictive, Epistemic Approach to Risk Analysis.”
[4] Arena, M. V., et al., “Management Perspectives Pertaining to Root Cause Analyses of Nunn-McCurdy Breaches, Volume 4: Program Manager Tenure, Oversight of Acquisition Category II Programs, and Framing Assumptions,” RAND MG-1171/4, RAND Corporation, 2013.
[5] Arízaga, J. F. O., “A Methodology for Project Risk Analysis Using Bayesian Belief Networks Within a Monte Carlo Simulation Environment,” Ph.D. Dissertation, University of Maryland, 2007.
[6] Atin, A., “Project Risk Propagation Modeling of Engineering, Procurement and Construction,” Ph.D. Thesis, Wayne State University, 2016.
[7] Becerril, L., Sauer, M., and Lindemann, U., “Estimating the Effects of Engineering Changes in Early Stage Product Development,” 18th International Dependency and Structural Modeling Conference, August 29‒30, 2016.
[8] Bartolomei, J. E., et al., “Engineering Systems Matrix: An Organizing Framework for Modeling Large-Scale Complex Systems,” Systems Engineering, 15(1), 2012, pp. 41‒61.
[9] Beer, M., “Alternative Approaches to the Treatment of Epistemic Uncertainties in Risk Assessment,” Institute for Risk and Reliability, Leibniz Universität Hannover, 2016.
[10] Blickstein, I., et al., “Root Cause Analyses of Nunn‒McCurdy Breaches, Volume 2,” RAND Corporation, MG-1171z2.
[11] Bordley, R. F., “Reference Class Forecasting: Resolving Its Challenge to Statistical Modeling,” The American Statistician, 68:4, pp. 221‒229, 2014.
[12] Charette, R., Dwinnell, L., and McGarry, J., “Understanding the Roots of Process Performance Failure,” CROSSTALK: The Journal of Defense Software Engineering, August 2004, pp. 18‒24.
[13] Clemen, R. T., Making Hard Decisions: An Introduction to Decision Analysis, Second Edition, Duxbury Press, 1996.
[14] Covert, R., “Analytic Method for Probabilistic Cost and Schedule Risk Analysis, Final Report,” prepared for the National Aeronautics and Space Administration (NASA) Office of Program Analysis and Evaluation (PA&E), Cost Analysis Division (CAD), 5 April 2013.
[15] Covert, R., “Final Report Briefing: Analytic Methods for Cost and Schedule Risk Analysis,” March 4, 2013.
[16] Davis, D., and Philip, A. S., “Annual Growth of Contract Costs for Major Programs in Development and Early Production,” Acquisition Policy Analysis Center, Performance Assessments and Root-Cause Analyses, OUSD AT&L, March 21, 2016.
[17] DOD, “Integrated Program Management Report (IPMR),” DI‒MGMT‒81861, June 20, 2012.
[18] van Dorp, J. R., and Duffey, M. R., “Statistical Dependence in Risk Analysis for Project Networks Using Monte Carlo Methods,” International Journal of Production Economics, 58, pp. 17‒29, 1999.
[19] Faber, M. H., “On the Treatment of Uncertainties and Probabilities in Engineering Decision Analysis,” Journal of Offshore Mechanics and Arctic Engineering, 127, pp. 243‒248.
[20] Fang, C., Marle, F., and Xie, M., “Applying Importance Measures to Risk Analysis in Engineering Project Using a Risk Network Model,” IEEE Systems Journal, 2016.
[21] Flyvbjerg, B., “From Nobel Prize to Project Management: Getting Risks Right,” Project Management Journal, Vol. 37, No. 3, August 2006, pp. 5‒15.
[22] Fox, S., “Ontological Uncertainty and Semantic Uncertainty in Global Network Organizations,” VTT Working Papers 102, 2008.
[23] Galway, L. A., “Subjective Probability Distribution Elicitation in Cost Risk Analysis: A Review,” RAND Corporation, TR-410-AF, 2007.
[24] Gerstein, D. M., et al., “Developing a Risk Assessment Methodology for the National Aeronautics and Space Administration,” RAND Corporation, RR-1537, 2015.
[25] Grady, J. O., Systems Requirements Analysis, Academic Press, 2006.
[26] Hamaker, J. W., “But What Will It Cost? The History of NASA Cost Estimating,” Readings in Program Control, pp. 25‒37, 1994.
[27] Helton, J. C., and Burmaster, D. E., “Treatment of Aleatory and Epistemic Uncertainty in Performance Assessments for Complex Systems,” Reliability Engineering and System Safety, Vol. 54, No. 2‒3, pp. 91‒94, 1996.
[28] Higbee, J., “Program Success Probability: Tasking from ASA (ALT), Claude Bolton, March 2002,” Defense Acquisition University, 2 June 2004.
[29] Hillson, D., and Simon, P., Practical Project Risk Management: The ATOM Methodology, Management Concepts, 2007.
[30] Hofbauer, et al., “Cost and Time Overruns for Major Defense Acquisition Programs: An Annotated Brief,” Center for Strategic and International Studies, Washington, DC, 2011.
[31] Hulett, D. T., and Hillson, D., “Branching Out: Decision Trees Offer a Realistic Approach to Risk Analysis,” PM Network, May 2006, p. 43.
[32] IDA, “Root Cause of Nunn‒McCurdy Breaches ‒ A Survey of PARCA Root Cause Analyses, 2010‒2011,” Interim Report, IDA Paper P‒4911, August 2012.
[33] Karim, A. B. A., “The Development of an Empirical-Based Framework for Project Risk Management,” Ph.D. Thesis, University of Manchester, 2014.
[34] Kerzner, H., Project Management: A Systems Approach to Planning, Scheduling and Controlling, John Wiley & Sons, 1998.
[35] der Kiureghian, A., “Aleatory or Epistemic? Does It Matter?” Special Workshop on Risk Acceptance and Risk Communication, March 26‒27, 2007, Stanford University.
[36] Kwak, Y. H., and Smith, B. M., “Managing Risks in Mega Defense Acquisition Projects: Performance, Policy, and Opportunities,” International Journal of Project Management, 27, pp. 812‒820, 2009.
[37] Laitonen, J. (Project Manager), Losoi, H., Losoi, M., and Ylilammi, K., “Tools for Analyzing Epistemic Uncertainties in Probabilistic Risk Analysis, Final Report,” Mat-2.4177 Seminar on Case Studies in Operations Research, 24 May 2013.
[38] Lorell, M. A., Lowell, J. F., and Younossi, O., “Implementation Challenges for Defense Space Programs,” MG‒431‒AF, RAND Corporation, 2006.
[39] Lempert, R., and Collins, M., “Managing the Risk of Uncertain Threshold Responses: Comparison of Robust, Optimum, and Precautionary Responses,” Risk Analysis, 27 August 2007, pp. 1009‒1026, RAND Corporation.
[40] Maier, M., and Rechtin, E., The Art of Systems Architecting, Third Edition, 2009.
[41] NASA, “Margins and Contingency Module, Space Systems Engineering,” Version 1.0, Lisa Guerra, NASA Exploration Systems Mission Directorate, Department of Aerospace Engineering at the University of Texas at Austin, 2008.
[42] NASA, “Probabilistic Risk Assessment Procedures Guide for NASA Managers and Practitioners,” NASA/SP‒2011‒3421, Second Edition, December 2011.
[43] NASA, “NASA Risk Management Handbook,” NASA/SP‒2011‒3422, 2011.
[44] NASA, “NPR 7120.5E, NASA Space Flight Program and Project Management Handbook,” NASA/SP‒2014‒3705, September 2014.
[45] NDIA, “Planning & Scheduling Excellence Guide (PASEG),” National Defense Industrial Association, Procurement Division ‒ Program Management Systems Committee (PMSC), June 22, 2012.
[46] DePasquale, D., and Charania, A. C., “I-RaCM: A Fully Integrated Risk and Life Cycle Cost Model,” American Institute of Aeronautics and Astronautics, 2008.
[47] Pawlikowski, E., Loverro, D., and Cristler, T., “Disruptive Challenges, New Opportunities, and New Strategies,” Strategic Studies Quarterly, Maxwell Air Force Base, Spring 2012, pp. 27‒54.
[48] Petković, I., “Risk Management Using Dependency Structure Matrix,” Workshop in Opatija, September 2‒8, 2012, German Academic Exchange Service.
[49] PM Knowledge Center, “Schedule Risk Analysis: How to Measure Your Baseline Schedule Sensitivity?” https://goo.gl/jpMxuy
[50] Price, R. A., and Alleman, G. B., “Critical Thinking on Critical Path Analysis,” College of Performance Management, EVM World 2016.
[51] Public Works and Government Services Canada, “Project Complexity and Risk Assessment Manual,” May 2015, Version 5.
[52] Raiffa, H., Decision Analysis: Introductory Lectures on Choices Under Uncertainty, Addison-Wesley, 1968.
[53] Razaque, A., Bach, C., Salama, N., and Alotaibi, A., “Fostering Project Scheduling and Controlling Risk Management,” International Journal of Business and Social Science, Vol. 3, No. 14, Special Issue, July 2012.
[54] Reeves, J. D., “The Impact of Risk Identification and Trend on Space Systems Project Performance,” Ph.D. Thesis, School of Engineering and Applied Science, The George Washington University, May 19, 2013.
[55] Riesch, H., “Levels of Uncertainty,” in Handbook of Risk Theory: Epistemology, Decision Theory, Ethics, and Social Implications of Risk, Roeser, S., and Hillerbrand, R., Editors, Springer, 2011.
[56] Salado, A., and Nilchiani, R. R., “The Concept of Problem Complexity,” Conference on Systems Engineering Research (CSER 2014), Procedia Computer Science, 28.
[57] Scheinin, W., “Start Early and Often: The Need for Persistent Risk Management in the Early Acquisition Phases,” Aerospace Corporation / Air Force SMC / NASA Seventh Symposium on Space Systems Engineering & Risk Management Conference, 27‒29 February 2008.
[58] Schwartz, M., “Reform of the Defense Acquisition System,” United States Senate Committee on Armed Services, Congressional Research Service, April 30, 2014.
[59] SMC, “Space and Missile Systems Center Risk Management Process Guide,” Version 2.0, 5 September 2014.
[60] Siu, N., and Marble, J., “Aleatory vs. Epistemic Uncertainties: Principles and Challenges,” ASME PVP Conference, Baltimore, MD, July 20, 2011.
[61] USAF, “Air Force Instruction 63‒101/20‒101, Integrated Life Cycle Management,” 9 May 2014.
[62] U.S. Congress, Redesigning Defense: Planning the Transition to the Future U.S. Defense Industrial Base, Office of Technology Assessment, Government Printing Office, July 1991.
[63] Vanhoucke, M., Measuring Time: Improving Project Performance Using Earned Value Management, Springer, 2009.
[64] Vanhoucke, M., Integrated Project Management Sourcebook: A Technical Guide to Project Scheduling, Risk, and Control, Springer, 2016.
[65] Vidal, L., and Marle, F., “Modeling Project Complexity,” International Conference on Engineering Design, 2007.
[66] Vrouwenvelder, A. C. W. M., “Uncertainty Analysis for Flood Defense Systems in the Netherlands,” Proceedings, ESREL, 2003.
[67] Williams, T., “The Nature of Risk in Complex Projects,” Project Management Journal, 48(4), pp. 55‒66, 2017.
[68] Wood, H. L., and Ashton, P., “The Factors of Project Complexity,” 18th CIB World Building Congress, Salford, United Kingdom, 10 May 2010.
[69] Younossi, O., et al., “Improving the Cost Estimation of Space Systems: Past Lessons and Future Recommendations,” RAND Project Air Force, MG‒690.
[70] Zack, M. H., “Managing Organizational Ignorance,” Knowledge Directions, Volume 1, Summer 1999, pp. 36‒49.
[71] Zarikas, V., and Kitsos, C. P., “Risk Analysis with Reference Class Forecasting Adopting Tolerance Regions,” in Theory and Practice of Risk Assessment, Chapter 18, Springer International, June 2015.
[72] Zhang, Y., Bai, S., and Guo, Y., “The Application of Design Structure Matrix in Project Schedule Management,” 2010 International Conference on E-Business and E-Government, pp. 2813‒2816.