Introduction to Monte Carlo Analysis for Software Development - Troy Magennis
Forecasting and managing software development project risks & uncertainty. Monte Carlo analysis is the tool of choice for managing risk in many fields where risk is an inherent part of doing business. This paper examines how to use Monte Carlo techniques to understand and leverage risk in software development projects and teams.
IS EARNED VALUE + AGILE A MATCH MADE IN HEAVEN?
Increasing the Probability of Program Success requires connecting the dots between EV and Agile Development.
Presented at
The Nexus of Agile Software Development and
Earned Value Management, OSD-PARCA,
February 19 – 20, 2015
Institute for Defense Analyses, Alexandria, VA
The resources listed here are the starting point for anyone interested in applying the principles developed in this briefing for integrating Agile with Earned Value Management projects.
Performance-Based Project Management® is a deliverables-based approach to project success. Deliverables start with the needed capabilities that the project produces to meet the mission objectives or fulfill a business case.
These deliverables fulfill the requirements, assessed through Measures of Effectiveness and Measures of Performance.
In Agile, Story Points are used as measures of effort. In Earned Value there is no concept of Story Points, rather Dollars and Hours are the measures of effort and duration for the work.
When using Agile on EVM projects, each unit of measure has value to the benefits produced through the integration, IF there is a proper segregation of these concepts.
Forecasting Cost and Schedule Performance - Glen Alleman
For credible decisions to be made, we need confidence intervals on all the numbers we use to make decisions.
These confidence intervals come from the underlying statistics and the related probabilities.
Statistical forecasting, using time series analysis of past performance, is mandatory for any credible discussion of project performance in the future.
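As a hedged illustration of putting confidence intervals on the numbers used for decisions, the sketch below computes an approximate 95% interval on the mean Cost Performance Index (CPI) from past performance and turns it into an Estimate At Completion (EAC) range. All figures are hypothetical, and the normal approximation is a simplification (a t-distribution would be more precise for small samples):

```python
import statistics

# Hypothetical monthly CPI observations from past performance reports
cpi_history = [0.92, 0.88, 0.95, 0.90, 0.85, 0.91, 0.89, 0.93]

bac = 10_000_000  # Budget At Completion, dollars (assumed)

mean_cpi = statistics.mean(cpi_history)
stdev_cpi = statistics.stdev(cpi_history)
n = len(cpi_history)

# ~95% confidence interval on mean CPI (normal approximation)
half_width = 1.96 * stdev_cpi / n ** 0.5
cpi_low, cpi_high = mean_cpi - half_width, mean_cpi + half_width

# EAC = BAC / CPI: a lower CPI implies a higher EAC
eac_point = bac / mean_cpi
eac_range = (bac / cpi_high, bac / cpi_low)

print(f"CPI: {mean_cpi:.3f} (95% CI {cpi_low:.3f}-{cpi_high:.3f})")
print(f"EAC: ${eac_point:,.0f} (range ${eac_range[0]:,.0f}-${eac_range[1]:,.0f})")
```

The point here is the shape of the answer: an EAC stated as a range with a confidence level, rather than a single number.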
Starting with an EIA–748–C compliant Earned Value Management System, integrating an Agile Software Development Lifecycle (Agile) is straightforward when there is a Bright Line between the Performance Measurement Baseline (PMB) and the Sprints and Tasks of the Agile Software Development Process.
Every tool, process, and practice has a dark side. Knowing these is a Critical Success Factor to the integration of EVM and Agile at the desired Maturity Level.
Both Earned Value Management and Agile have Dark Sides. Things that are not talked about in public.
But when they are Integrated, each provides a solution for the problems of the other.
Assessing the current and desired maturity for Agile and EVM is the starting point for integrating these two processes.
The Project Breathalyzer provides program managers with a quick look at software project health. It identifies software projects that "should not be on the road." The Breathalyzer determines whether key program elements exist, without which the program is not likely to succeed.
Integrated Agile with EVM -- Executive Overview - Glen Alleman
Earned Value Management and Agile Software Development have much in common. Most important, progress to plan is measured by Physical Percent Complete, with tangible evidence of working products at the end of each planned period of performance.
For software-intensive systems of systems, agile development provides powerful tools for producing working software on frequent boundaries, gaining the customer feedback needed to assure the program is going in the right direction.
Monte Carlo Simulation for Agile Development - Glen Alleman
Managing in the presence of uncertainty requires making decisions with models of that uncertainty. Monte Carlo simulation and related approaches are the basis for making informed decisions in the presence of uncertainty.
Making decisions in the presence of uncertainty requires estimating the impact of the outcomes of those decisions. Here's a collection of resources that can be used to guide that process.
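A minimal sketch of the Monte Carlo approach described above, applied to schedule forecasting: resample historical weekly throughput to simulate many possible project outcomes, then read off completion dates at chosen confidence levels. The backlog size and throughput samples are illustrative assumptions:

```python
import random

random.seed(7)

# Hypothetical historical throughput: stories completed per week
throughput_samples = [3, 5, 2, 6, 4, 4, 7, 3, 5, 4]
backlog = 120          # stories remaining (assumed)
trials = 10_000

def weeks_to_finish(backlog, samples):
    """Simulate one project outcome by resampling past throughput."""
    done, weeks = 0, 0
    while done < backlog:
        done += random.choice(samples)
        weeks += 1
    return weeks

outcomes = sorted(weeks_to_finish(backlog, throughput_samples) for _ in range(trials))

# Percentiles give confidence levels rather than a single-point estimate
for p in (50, 80, 95):
    print(f"P{p}: finish within {outcomes[int(trials * p / 100) - 1]} weeks")
```

The design choice worth noting: the simulation produces a distribution of outcomes, so the forecast can be stated as "P80: done within N weeks" instead of a deterministic date.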
Avoid software project horror stories - check the reality value of the estimate - Harold van Heeringen
Many large software projects turn into software horror stories, resulting in newspaper headlines and even political issues. Often, the project costs and schedule were estimated with unrealistic optimism, using immature estimation techniques. A relatively simple way to avoid many problems is to perform a reality check on the estimate. This presentation was given at the conference of the International Cost Estimating and Analysis Association (ICEAA 2014), June 2014 (Denver, USA).
The management of software development is fraught with risk: technical risk, market risk, requirements risk, and financial risk. This paper describes nine key management principles for guiding the development of a software project. These principles are not original; they are taken directly from the work of Norm Brown, the founder and Executive Director of the Software Program Managers Network (SPMN).
Start with defining the deliverables that produce the capabilities needed for project success. Then the work that is needed, the order of that work, and the defined outcomes of that work become obvious. Sequence that work, assign durations and resources, and you've generated the plan and schedule for success.
The question – what does Done look like? – was asked every week on the program that changed my life as a Program Manager. Rocky Flats Environmental Technology Site (RFETS) was the marketing term for the 3rd worst toxic waste site on the planet. RFETS was a nuclear bomb manufacturing plant, built in 1951, operating until 1989, and closed in 2005. I served as the VP of Program Management of the ITC (Information Technology and Communications) group, providing ERP, purpose-built IT, voice, and data systems for 5,000 employees and contractors of the Bomb Factory.
SOLVING PROJECT RESOURCE ALLOCATION PROBLEMS WITH AEROSPACE ERP - Kevin West
Defence manufacturing is all about project manufacturing and project accounting. And that means enterprise resource planning (ERP) software for defence manufacturing must include robust functionality for project management and specifically project cost allocation.
Earned Value Management Meets Big Data - Glen Alleman
The Earned Value Management System (EVMS) maintains period-by-period data in its underlying databases. The contents of the Earned Value repository can be considered Big Data, characterized by three attributes: 1) Volume: large amounts of data; 2) Variety: data comes from different sources, including traditional databases, documents, and complex records; 3) Velocity: the content is continually being updated by absorbing other data collections, through previously archived data, and through streamed data from external sources.
With this time-series information in the repository, trends, cost and schedule forecasts, and confidence levels for these performance estimates can be calculated using statistical techniques such as the Autoregressive Integrated Moving Average (ARIMA) algorithm provided by the R programming system. ARIMA provides statistically informed Estimates At Completion (EAC) and Estimates To Complete (ETC), revealing underlying trends not available through standard EVM reporting calculations.
With ARIMA in place and additional data from risk, technical performance and the Work Breakdown Structure, Principal Component Analysis can be used to identify the drivers of unanticipated EAC.
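A real analysis would use ARIMA (for example, R's forecast package, or statsmodels in Python). As a much simpler stand-in that shows the idea of a time-series-informed EAC, the sketch below fits a linear trend to hypothetical period-by-period cumulative cost (ACWP) and earned value (BCWP), projects the period when earned value reaches BAC, and evaluates the cost trend there. All data is illustrative:

```python
# Simplified time-series EAC: linear-trend extrapolation as a stand-in
# for an ARIMA forecast. Hypothetical cumulative data by period, in $K.
acwp = [100, 230, 390, 540, 710, 880]   # actual cost of work performed
bcwp = [90, 200, 330, 460, 600, 740]    # budgeted cost of work performed
bac = 1500                               # budget at completion, $K

def linear_fit(ys):
    """Least-squares slope/intercept over periods 1..n."""
    n = len(ys)
    xs = range(1, n + 1)
    mx = sum(xs) / n
    my = sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
        sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

ev_slope, ev_icept = linear_fit(bcwp)
ac_slope, ac_icept = linear_fit(acwp)

# Period at which the earned-value trend reaches BAC
finish_period = (bac - ev_icept) / ev_slope
# EAC = actual-cost trend evaluated at that period
eac = ac_slope * finish_period + ac_icept

print(f"Projected completion at period {finish_period:.1f}, EAC = ${eac:,.0f}K")
```

With these numbers the trend-based EAC agrees closely with the classic BAC/CPI formula, which is what one would expect when performance is steady; ARIMA earns its keep when performance is drifting or seasonal.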
Managing in the Presence of Uncertainty - Glen Alleman
Uncertainty is the source of risk. Uncertainty comes in two types, aleatory and epistemic. It is important to understand both and deal with both in distinct ways, in order to produce a credible risk handling strategy.
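The aleatory/epistemic distinction above can be sketched in a small simulation. Aleatory uncertainty (irreducible natural variability) is modeled as a duration distribution; epistemic uncertainty (a risk event that may or may not occur) as a discrete probability with an impact. All probabilities, durations, and impacts below are illustrative assumptions:

```python
import random

random.seed(11)

def simulated_duration():
    # Aleatory: natural variability in task duration,
    # triangular(low=20, high=45, mode=30) days
    duration = random.triangular(20, 45, 30)
    # Epistemic: assumed 25% chance a risk event occurs, adding 15 days.
    # Unlike aleatory variability, this probability can be bought down.
    if random.random() < 0.25:
        duration += 15
    return duration

trials = 20_000
results = sorted(simulated_duration() for _ in range(trials))
p80 = results[int(trials * 0.8) - 1]
print(f"P80 duration: {p80:.1f} days")
```

This mirrors the handling strategies: aleatory variability can only be absorbed with schedule margin, while the epistemic risk can be reduced by action before it occurs.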
Recent College of Performance Management webinar on using Technical Performance to inform Earned Value Management: six steps to building a credible Performance Measurement Baseline that connects the dots between all the elements of the program.
Defining business value in units meaningful to the business and connecting these to the measures of performance for the project that produce this business value.
Methods of Forecasting for Capacity Management - Precisely
Forecasting is the process of making statements about events in the future. Events related to capacity management are typically things like the state of resource consumption, service levels, and computing environment changes at future points in time. Making statements or predictions about these future events requires analysis of information to determine a future state. Knowing what information is needed to make accurate forecasts is a critical step for any analysis.
Forecasts are made to answer questions. Understanding the questions, and things that affect answers to those questions, is the first step to creating an accurate forecast. Required accuracy of a forecast should determine which methods are used to create it. Assumptions can be made to limit the amount of data and time required for creating forecasts. Validating forecast accuracy, after events happen, is an important part of continually improving future forecasts, and building credibility. This webinar describes the important task of forecasting as it relates to capacity management.
This presentation covers the following topics:
• Why do we forecast?
• Forecasting scenarios
• Forecasting Techniques
• Forecasting and Virtualization
• Summary
Earned Value Management involves more than just cost and schedule. Six business systems, including EVM, are the basis of credible program performance management.
Here's a suggestion of how to "connect the dots."
Building a Credible Performance Measurement Baseline - Glen Alleman
Establishing a credible Performance Measurement Baseline, with a risk-adjusted Integrated Master Plan and Integrated Master Schedule, starts with the WBS and connects Technical Measures of progress to Earned Value.
Earned Value Management is more than people, processes, and tools. It is an integrated service that provides actionable information to decision makers. Webinar, EcoSys, October 22, 2014.
Planning projects usually starts with tasks and milestones. The planner gathers this information from the participants – customers, engineers, subject matter experts – and it is usually arranged in the form of activities and milestones. PMBOK defines "project time management" in this manner. The activities are then sequenced according to the project's needs and mandatory dependencies.
Increasing the Probability of Project Success - Glen Alleman
Risk management is essential for development and production programs. Information about key cost, performance, and schedule attributes is often uncertain or unknown until late in the program.
Risk issues that can be identified early in the program, which may potentially impact the program, termed Known Unknowns, can be alleviated with good risk management. -- Effective Risk Management 2nd Edition, Page 1, Edmund Conrow, American Institute of Aeronautics and Astronautics, 2003
Cost and schedule growth on complex projects is created when unrealistic technical performance expectations, unrealistic cost and schedule estimates, inadequate risk assessments, unanticipated technical issues, and poorly performed, ineffective risk management contribute to project technical and programmatic shortfalls.
From Principles to Strategies for Systems Engineering - Glen Alleman
From Principles to Strategies: how to apply the principles, practices, and processes of Systems Engineering to solve complex technical, operational, and organizational problems.
Capabilities-Based Planning defines the capabilities needed to accomplish a mission or fulfill a business strategy. Only when capabilities are defined can we start requirements elicitation.
The integration starts with developing a Rough Order of Magnitude (ROM) estimate of work and duration, creating the Product Roadmap and Release Plan and the Product and Sprint Backlogs, executing and statusing each Sprint, and informing the Earned Value Management System using Physical Percent Complete as the measure of progress to plan.
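One common way to connect these artifacts (a sketch under stated assumptions, not the only scheme): each work package maps to a set of stories, Physical Percent Complete is taken from accepted stories only (no credit for work in progress), and earned value is BCWP = BAC × percent complete. Work package names and figures below are hypothetical:

```python
# Informing EVM from Agile artifacts: Physical Percent Complete from
# accepted stories per work package, rolled into earned value (BCWP).
work_packages = {
    "WP-01 Telemetry ingest": {"bac": 400_000, "stories_total": 40, "stories_accepted": 26},
    "WP-02 Operator console": {"bac": 250_000, "stories_total": 25, "stories_accepted": 10},
}

results = {}
for name, wp in work_packages.items():
    pct = wp["stories_accepted"] / wp["stories_total"]  # physical % complete
    bcwp = wp["bac"] * pct                              # earned value, dollars
    results[name] = bcwp
    print(f"{name}: {pct:.0%} complete, BCWP = ${bcwp:,.0f}")
```

Counting only accepted stories is what keeps the Bright Line between the Agile process and the PMB: dollars are earned on tangible evidence of working products, not on effort expended.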
Program Management Office, Lean Software Development, and Six Sigma - Glen Alleman
Successfully combining a PMO, Agile, and Lean / Six Sigma starts with understanding what benefit each paradigm brings to the table. Architecting a solution for the enterprise requires assembling a "system" of processes, people, and principles – all sharing the goal of business improvement.
This resource document describes the Program Governance Road map for product development, deployment, and sustainment of products and services in compliance with CMS guidance, ITIL IT management, CMMI best practices, and other guidance to assure high quality software is deployed for sustained operational success in mission critical domains.
1. Big Data Meets Earned Value Management

We have lots of data. How can we use it to make predictive and prescriptive forecasts of future performance to increase the Probability of Program Success?

Glen B. Alleman
Thomas J. Coonce
2. The Killer Question For Every Manager Of A Complex, High-Risk Program Is …

… How can I see an unanticipated Estimate At Completion (EAC) coming before it's too late?

"What's in Your Estimate at Completion?", Pat Barker and Roberta Tomasini, Defense AT&L, March–April 2014
3. +Here’s WHY We Need Better Ways To
Forecast Estimate At Complete …
3
… the root cause starts on day one,
with a less than credible PMB.
42%
29%
21%
0%
10%
20%
30%
40%
50%
60%
From
Phase B
Start
From
PDR
From
CDR
DevelopmentCostGrowth
29%
23%
19%
0%
10%
20%
30%
40%
50%
60%
From
Phase B
Start
From
PDR
From
CDR
PhaseB/C/DScheduleGrowth
4. Three Types Of Data Are Available In Big Data Repositories

§ Descriptive – looking at the past, we can learn what happened, but it's too late to take corrective action.
§ Predictive – using past performance, we can answer the question of what will happen if we do nothing but the same as we've done in the past.
§ Prescriptive – past performance data used to make predictions and suggest decision options to take advantage of those predictions.

Prescriptive analytics not only anticipates what will happen and when it will happen, but why it will happen.
5. Descriptive Analytics

§ Descriptive Analytics – condensing big data into smaller, useful nuggets of information.
§ Most raw Earned Value data is not suitable for human consumption, since it is reported by WBS without connectivity to the product or programmatic topology.
§ Descriptive data summarizes what happened in the past, often 45 days in the past.
§ Correlations between WBS elements are not defined, nor are correlations between risk, technical performance, or Systems Engineering attributes – MOE, MOP, KPP†

The EVM repositories provide the raw material for Descriptive Analytics through the IPMR (DI-MGMT-81861) submittals.

† The Defense Acquisition Guide defines how to apply Measures of Effectiveness, Measures of Performance, Technical Performance Measures, and Key Performance Parameters to assess program performance.
6. DAU Gold Card's EAC Formula Uses Predictive Analytics, But …

§ Past variances are wiped out with "Cumulative to Date" data
§ No adjustment for risk
§ Not statistically corrected for past performance
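The Gold Card formula referenced here is commonly given as the CPI-based EAC, EAC = ACWP + (BAC − BCWP) / CPI, with CPI computed from cumulative-to-date totals. A minimal sketch, with hypothetical numbers:

```python
def eac_cpi(acwp_cum, bcwp_cum, bac):
    """CPI-based Estimate At Completion: remaining budgeted work
    (BAC - BCWP) assumed to burn at the cumulative cost efficiency CPI."""
    cpi = bcwp_cum / acwp_cum            # cumulative-to-date index
    return acwp_cum + (bac - bcwp_cum) / cpi

# Hypothetical program: $1,000 BAC, $400 earned for $500 spent.
print(eac_cpi(acwp_cum=500, bcwp_cum=400, bac=1000))  # 500 + 600/0.8 = 1250.0
```

Because only the running totals enter the calculation, two programs with very different period-by-period variance histories but the same cumulative totals produce the same EAC, which is exactly the complaint in the bullets above.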
7. Prescriptive Analytics

§ Is a type of Predictive Analytics
§ Used when we need to prescribe an action, so leadership can take the data and act.
§ Prescriptive analytics doesn't predict one future outcome, but multiple outcomes based on the decision maker's actions.
§ Prescriptive analytics requires a predictive model with two additional components:
  § Actionable data.
  § A feedback system that tracks the outcome produced by the action taken.
8. Prescriptive Analytics Is The Foundation For Corrective Actions

§ Prescriptive Analytics is about making decisions based on data.
§ Prescriptive analytics requires a predictive model with two components:
  § Actionable data
  § Feedback from those actions
§ Prescriptive models predict the possible consequences based on different choices of action.

Milestones are rocks on the side of the road. The Roman milestone was a measure back to Rome. You only know that distance after you pass the milestone.
9. There Is Untapped Value In An Earned Value Data Repository

§ Most data is of little value at the detail level, since it is uncorrelated in the reporting process
  § Making correlations between cause and effect is difficult for humans, but statistical process algorithms can do this for us
  § With correlated data in hand, we can start generating descriptive analytics
§ But drivers of variance are not visible in the repository
  § Variances from the past can be calculated, but are not used in future forecasts
§ There is no built-in mechanism to see patterns in the data
  § Standard tools produce linear, non-statistical, non-risk-adjusted forecasts

To extract this value, we need to overcome some limitations in today's repositories.
10. All Programmatic Forecasting Is Probabilistic, Driven By Underlying Statistical Processes

If we make forecasts about program performance that are not statistically and risk adjusted, we're gonna get wet.
11. Schedule, Related Cost, And Technical Elements Are Probabilistic

The IMS doesn't help us much either, since the correlative drivers are themselves non-linear stochastic processes.

A stochastic process is a collection of random variables used to represent the evolution of some random value, or system, over time.
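The definition above can be made concrete with a minimal Monte Carlo sketch. The random-walk model of CPI below is an assumption for illustration only, not a claim about how program cost efficiency actually evolves; each sampled path is one realization of the stochastic process.

```python
import random

def simulate_cpi_paths(n_paths=1000, n_periods=24, start=1.0, vol=0.02, seed=7):
    """Sample many CPI trajectories from a simple random-walk model.
    Each path is one realization of the underlying stochastic process."""
    rng = random.Random(seed)
    paths = []
    for _ in range(n_paths):
        cpi, path = start, []
        for _ in range(n_periods):
            cpi += rng.gauss(0.0, vol)   # period-to-period random shock
            path.append(cpi)
        paths.append(path)
    return paths

paths = simulate_cpi_paths()
finals = sorted(p[-1] for p in paths)
# Empirical 80% interval on CPI at month 24, taken across the sampled paths
print(finals[len(finals) // 10], finals[9 * len(finals) // 10])
```

The point of the slide follows directly: a single deterministic IMS date or CPI number hides this whole distribution of outcomes.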
12. The Ability To Forecast Future Performance Starts With A Tool That Provides …

§ Forecasting of future performance, using a time series of the past and the Autoregressive Integrated Moving Average (ARIMA) algorithm
§ Confidence intervals on these forecasts, derived from past performance
§ Correlation between the time series elements (CPI, SPI, WBS element)
§ Deeper correlations between these Earned Value elements and risk retirement, increases in effectiveness and performance, and any other recorded measure of the program.

http://cran.us.r-project.org/

"The combination of some data and an aching desire for an answer does not ensure that a reasonable answer can be extracted from a given body of data." – John Tukey
13. A Quick Look At Where We're Going, Starting With Forecasting CPI/SPI

§ We have a time series of CPI and SPI in the repository
§ What possible behaviors in the future can we discover from the past behavior?
§ The R code on the top answers that in 4 lines.

If we want to credibly forecast the future with the past, we'll need better tools. We've got the data; we just need to use it.
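The slide's R code is not reproduced in this transcript. As a stand-in, here is a pure-Python AR(1) fit (a deliberately simpler cousin of ARIMA) applied to a hypothetical CPI series, with a rough forecast band; the sqrt(h) band is a simplification, not the exact AR(1) interval.

```python
import math

def ar1_fit(series):
    """Least-squares fit of the AR(1) model x[t] = c + phi * x[t-1] + e."""
    x, y = series[:-1], series[1:]
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    var = sum((a - mx) ** 2 for a in x)
    phi = cov / var                       # lag-1 persistence
    c = my - phi * mx                     # intercept
    resid = [b - (c + phi * a) for a, b in zip(x, y)]
    sigma = math.sqrt(sum(r * r for r in resid) / max(n - 2, 1))
    return c, phi, sigma

# Hypothetical monthly CPI series pulled from the repository
cpi = [1.02, 1.00, 0.99, 0.97, 0.98, 0.95, 0.94, 0.94, 0.92, 0.91]
c, phi, sigma = ar1_fit(cpi)
last = cpi[-1]
for h in range(1, 4):                     # 3-month-ahead forecast
    last = c + phi * last
    half = 1.96 * sigma * math.sqrt(h)    # rough 95% band (simplification)
    print(f"month +{h}: {last:.3f} ± {half:.3f}")
```

This is the shape of the answer the deck is after: not a single EAC number, but a forecast with a confidence interval around it.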
14.

§ The units of measure for Earned Value Management are Dollars
§ Cumulative indices wipe out all the variances
§ Forecasts of future performance are not statistically adjusted
§ There is no correlative information on the drivers of variances
§ None of these forecasts use the risk register to adjust their value
15. Since ARIMA Is A Well-Traveled Path, We Need More And Better Tools

§ The Earned Value Management performance measures need to be connected to:
  § Risk retirement and buy-down status
  § Technical Performance Measure compliance
  § Measures of Effectiveness and Measures of Performance
  § Work Breakdown Structure correlations for each work activity
§ Correlation between performance and work performed is available in the repository
  § We're missing the tool to reveal these correlations, drivers, and corrective actions to keep the program GREEN

To provide better forecasts of EAC, we need more data. CPI/SPI needs to be augmented with technical program data.
16. We Need More Power To See Into The Future And Take Corrective Actions

"We need more power, Mr. Scott." "I'm giving her all she's got, Captain!"
17. Principal Component Analysis (PCA) Gets More Power From Our Data

Principal component analysis (PCA) is a statistical procedure that uses an orthogonal transformation to convert a set of observations of possibly correlated variables into a set of values of linearly uncorrelated variables called principal components.

We want to convert a larger set of program performance variables – SPI, CPI, Risk Retirements, TPM, MOE, MOP, KPP, Staffing, and others – into a small set of drivers of variance.

PCA can provide visibility into the connections between EAC growth and the source of that growth in a statistically sound manner, not currently available with IPMR reporting using CPI/SPI.
18. What Can PCA Tell Us?

§ If data lies in a high-dimensional space (more than just CPI/SPI), then a large amount of data is required to learn distributions or decision rules.
§ For each WBS element there are 9 dimensions (CPI, SPI, WBS, TPM, MOE, MOP, KPP, Risk, Staffing Profiles).
§ Each dimension has 36 levels (36 months of data).
§ We could produce a 9-dimension scatter plot for the 36 months of data, and it'd look like a big blob.
§ We need to know: what are the drivers in this blob of data?

With "all" the data in a single place – which it is not – we need a way to reduce the dimensionality to enable analysis.
19. From 2 Dimensions (SPI/CPI) To 8 Dimensions And Back Again

§ Two components, for example – SPI and CPI
§ Discover the correlation between these two data samples
§ Locate in the individual samples the time the drivers started impacting the program
§ Extend this to 8 dimensions
§ Similar to Joint Confidence Level, but with actual data

PCi = a1X1 + a2X2 + a3X3 + … + a8X8
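The two-dimensional case on this slide can be sketched directly: with only CPI and SPI, the first principal component has a closed form from the 2×2 covariance matrix. The monthly values below are hypothetical, chosen so the two indices degrade together.

```python
import math

def first_pc(xs, ys):
    """First principal component of 2-D samples, via the closed-form
    leading eigenvector of the 2x2 covariance matrix."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cx = [x - mx for x in xs]
    cy = [y - my for y in ys]
    sxx = sum(a * a for a in cx) / n
    syy = sum(b * b for b in cy) / n
    sxy = sum(a * b for a, b in zip(cx, cy)) / n
    # Largest eigenvalue of [[sxx, sxy], [sxy, syy]]
    tr, det = sxx + syy, sxx * syy - sxy * sxy
    lam = tr / 2 + math.sqrt(tr * tr / 4 - det)
    # Corresponding eigenvector (sxy, lam - sxx), normalized to unit length
    vx, vy = sxy, lam - sxx
    norm = math.hypot(vx, vy)
    return lam, (vx / norm, vy / norm)

# Hypothetical monthly (CPI, SPI) pairs that drift downward together
cpi = [1.02, 1.00, 0.98, 0.96, 0.95, 0.93]
spi = [1.01, 1.00, 0.99, 0.96, 0.94, 0.93]
lam, (vx, vy) = first_pc(cpi, spi)
print(f"leading variance {lam:.5f}, direction ({vx:.2f}, {vy:.2f})")
```

A direction with roughly equal CPI and SPI weights says one shared driver, not two independent ones, is moving both indices; extending the same idea to 8 dimensions is the slide's PCi = a1X1 + … + a8X8.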
20. Program Performance Dimensions

PCA data can be simple 2-dimensional – CPI/SPI – or more complex, representing other "attributes" driving EAC.

Variable | Information that may drive Unanticipated EAC
CPI/SPI  | CPI for the program, time phased by reporting period
TPM      | Technical Performance Measures, with control bands as the program moves left to right. These can be any measure of technical compliance: Weight; Throughput; Information Assurance validation; any of the JROC KPPs
Risk     | Risk retirement buy-down plan; risk handling and planned reduction
Margin   | Cost and schedule margin burn-down to plan
21. Call To Action For Increased Visibility Into Unanticipated EAC Growth Using BIG Data

§ Normalize data in the Central Repository in preparation for analysis
§ Apply ARIMA to normalized data to forecast CPI, SPI, and calculated EAC
§ Adjust ARIMA parameters using past performance compliance
§ Integrate external data with EV repository data to build correlations for the EAC forecasts
§ Apply Principal Component Analysis (PCA) to identify correlated drivers of EAC growth
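The five steps above can be sketched as a pipeline skeleton. Every function here is a hypothetical stub: naive placeholders stand in for the real repository access, ARIMA, and PCA machinery, and are meant only to show how the pieces would connect.

```python
def normalize(raw_records):
    """Step 1: put repository records on a common basis for analysis.
    Here, derive CPI from hypothetical bcwp/acwp fields."""
    return [dict(r, cpi=r["bcwp"] / r["acwp"]) for r in raw_records]

def arima_forecast(series):
    """Steps 2-3: placeholder for an ARIMA fit tuned on past performance;
    here, a naive last-value carry-forward."""
    return series[-1]

def join_external(records, risk_register):
    """Step 4: attach external (risk) data to EV records for correlation."""
    return [dict(r, risk=risk_register.get(r["wbs"], 0.0)) for r in records]

def pca_drivers(records):
    """Step 5: placeholder for PCA over the joined dimensions; here, rank
    fields by their spread as a crude stand-in for 'drivers of variance'."""
    fields = ("cpi", "risk")
    spread = {f: max(r[f] for r in records) - min(r[f] for r in records)
              for f in fields}
    return sorted(fields, key=spread.get, reverse=True)

raw = [{"wbs": "1.1", "bcwp": 90, "acwp": 100},
       {"wbs": "1.2", "bcwp": 80, "acwp": 110}]
records = join_external(normalize(raw), {"1.1": 0.2, "1.2": 0.6})
print(arima_forecast([r["cpi"] for r in records]), pca_drivers(records))
```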