Building the Integrated Master Plan (IMP) and its Integrated Master Schedule (IMS) is a critical success factor in any project domain. The IMP describes the increasing maturity of all deliverables in units of measure meaningful to the decision makers.
The IMP contains the Measures of Effectiveness and Measures of Performance. The IMS contains the Technical Performance Measures (as exit criteria for the Work Packages).
Risk and estimates are applied at all levels of the IMP and IMS, then definitized in the Performance Measurement Baseline on contract.
The Integrated Master Plan and Integrated Master Schedule – Glen Alleman
The Integrated Master Plan (IMP) and Integrated Master Schedule (IMS) provide a strategy for the incremental delivery of program outcomes through increasing maturity assessments with Measures of Effectiveness, Measures of Performance, Technical Performance Measures, and Key Performance Parameters.
These assessments ensure the needed capabilities of the project are met at each assessment point, confirming physical percent complete as planned in the Integrated Master Plan.
IMP & WBS – Getting Both Right is Paramount – Glen Alleman
WBS is the starting point for program success. It tells us what DONE looks like in terms of deliverables.
Integrated Master Plan (IMP) tells us how the increasing maturity of the deliverables will be assessed at each Program Event.
Integrated Master Schedule (IMS) tells us the order of the Work Packages needed to produce this increasing maturity.
Control Account Plan (CAP) defines the authorized scope, budget, and period of performance for the work that produces the deliverables defined in the WBS, assessed in the IMP, and sequenced in the IMS.
From WBS to Integrated Master Schedule – Glen Alleman
A step-by-step guide to increasing the Probability of Program Success, starting with the WBS, developing the Integrated Master Plan and Integrated Master Schedule, risk-adjusting the IMS, and measuring progress to plan in units of measure meaningful to the decision makers.
Earned Value Management: useful yet deceptive in project control
Basics course video: http://prof.planner.teachable.com/p/evm-basics/
Advanced-level course: http://www.slideshare.net/MohamedMaged8/contracts-classification
For more: https://www.facebook.com/groups/prof.cost.engineers/
Getting To Done – A Master Class Workshop – Glen Alleman
The Principles, Processes, Practices, and Tools to increase the probability of successfully completing projects On-Time, On-Budget, and with the Needed Capabilities.
The document has been developed keeping in mind the common challenges that a planner may face while developing a schedule. I have also tried to cover the areas required for effective earned value calculation. The document has been prepared assuming the reader has a basic understanding of Primavera P6.
Promo_EPC project rule of credit and progress measurement – ignitetribes
Project progress monitoring and control is one of the most important tasks of construction project management, yet many planners and project managers struggle with it. The hardest part of project controls is accurate performance measurement of work accomplished.
Taking time out to establish repeatable rules of credit can remove 75% of the performance measurement "guessing game" from the equation.
In this book we don't just explain what a rule of credit is; we also provide plenty of examples of how to establish weighted milestones. It's ready to plug and play in your project controls measurement.
Log in to ignitetribes.com to purchase the book.
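The weighted-milestone mechanics the book describes can be sketched in a few lines; the milestone names and weights below are invented for illustration and are not taken from the book.

```python
# Illustrative sketch of a weighted-milestone rule of credit.
# Each milestone earns a fixed share of an activity's budget when completed,
# replacing subjective "percent complete" guesses with repeatable rules.

RULE_OF_CREDIT = {          # hypothetical rule for a piping work package
    "drawings_issued": 0.15,
    "material_on_site": 0.25,
    "installed": 0.45,
    "tested": 0.10,
    "handed_over": 0.05,
}

def earned_progress(completed_milestones):
    """Physical percent complete = sum of weights for milestones achieved."""
    return sum(RULE_OF_CREDIT[m] for m in completed_milestones)

progress = earned_progress(["drawings_issued", "material_on_site", "installed"])
print(f"Physical % complete: {progress:.0%}")   # prints 85% for these three
```

The weights must sum to 1.0 so that completing every milestone earns exactly the activity's full budget.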
A presentation proposing one method of integrating and managing a mega-project portfolio through the use of a KIM schedule without losing interproject relationships key to critical path calculation.
RCF Method-1 uses P6 as the only tool required to manage, execute and control the project schedule regardless of its daunting size. Here is a proposal on a workable method that will support accurate, quick date analysis and timely decision making.
Project Portfolio Management is managing several projects and coordinating them to achieve specific organizational objectives.
Examples show typical tasks of Portfolio Managers, the key indicators, challenges, and tools they use to reach strategic objectives. The samples from the project portfolio management simulation SimulTrain(R)+ explain how to manage complex portfolios and adapt to a changing environment.
Planning and time schedule management
Lecture video: https://www.youtube.com/watch?v=HiGNZeLQ9Po
Content:
1- Planning and scheduling
2- Time schedule development
3- Resource and cost loading
4- Time schedule submittal
5- Review and approval
6- Update and reporting
7- Delay quantification approaches
8- Mitigation and action plans
McLachlan Lister provides a range of management consulting and project management services. These are offered either discretely or as an integrated service – you control the depth of our relationship.
Project risk analysis methodology and how RiskyProject software can be used for quantitative project risk analysis.
For more information on how to perform schedule risk analysis using RiskyProject software, please visit the Intaver Institute web site: http://www.intaver.com.
About Intaver Institute.
Intaver Institute Inc. develops project risk management and project risk analysis software. Intaver's flagship product is RiskyProject: project risk management software. RiskyProject integrates with Microsoft Project, Oracle Primavera, other project management software or can run standalone. RiskyProject comes in three configurations: RiskyProject Lite, RiskyProject Professional, and RiskyProject Enterprise.
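Independent of RiskyProject's actual implementation, the core of quantitative schedule risk analysis is a Monte Carlo pass over the activity network. Here is a minimal sketch with assumed three-point estimates for two sequential activities:

```python
import random

# Minimal Monte Carlo schedule risk sketch (assumed triangular estimates,
# two sequential activities). Commercial tools do this at full network
# scale, with correlations and discrete risk events.

activities = [                      # (optimistic, most likely, pessimistic) days
    ("design", 10, 15, 25),
    ("build",  20, 30, 50),
]

def simulate(n=10_000, seed=42):
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        # random.triangular takes (low, high, mode)
        totals.append(sum(rng.triangular(lo, hi, ml)
                          for _, lo, ml, hi in activities))
    totals.sort()
    return totals

totals = simulate()
p80 = totals[int(0.8 * len(totals))]   # 80% confidence finish duration
print(f"P80 duration: {p80:.1f} days")
```

The P80 value, not the single-point "most likely" sum of 45 days, is what a risk-adjusted baseline would commit to.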
Earned schedule's role in performance reporting, and other important delay indicators.
Video: https://www.youtube.com/watch?v=FbA6RWB1gDM&feature=youtu.be
The full course: https://www.luqmanacademy.com/course?course=project-control-using-evm_399sl6015424f8aba9
Video: https://twitter.com/magedkom/status/1354678096683618305?s=20
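For readers new to the metric, earned schedule (ES) converts schedule variance into time units by finding when the planned value curve equals today's earned value. A small sketch with assumed cumulative data:

```python
# Earned Schedule sketch (illustrative data). ES is the time at which the
# planned value (PV) curve reaches the earned value (EV) achieved to date,
# so schedule performance is expressed in time units, not dollars.

pv = [100, 250, 450, 700, 1000]   # cumulative planned value per period (assumed)
ev_now = 520                       # cumulative earned value at actual time AT
at = 4                             # actual time: periods elapsed

def earned_schedule(pv, ev):
    """Whole periods where PV <= EV, plus linear interpolation into the next."""
    c = sum(1 for v in pv if v <= ev)            # complete periods earned
    if c == len(pv):
        return float(c)
    prev = pv[c - 1] if c else 0
    return c + (ev - prev) / (pv[c] - prev)      # fractional period

es = earned_schedule(pv, ev_now)
spi_t = es / at                                   # time-based schedule index
print(f"ES = {es:.2f} periods, SPI(t) = {spi_t:.2f}")
```

With these assumed numbers the program has earned 3.28 periods of planned work in 4 periods, i.e. it is running about 0.72 periods behind.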
In this presentation you will discover how the PMO is vital to delivering real business results to companies seeking to maximize return on their investments and accelerate performance.
Primavera P6 18.8 Planning and Scheduling Guide r3 – Matiwos Tsegaye
This manual was developed to assist construction professionals in understanding the basic principles of planning, and to guide them in how a plan is developed using the recent edition, Primavera Professional P6 18.8.
Project Controls Expo, 09 Nov 2011, London – DELAY AND FORENSIC ANALYSIS By Ro... – Project Controls Expo
Delay in Construction Contracts:
• On-going phenomenon
• Introduction of Critical Path Method (‘CPM’)
• Prospective or retrospective analysis
• Observational or modelled
• Dynamic or static
• Common methodologies
The Impedance Mismatch in Integrated Engineering Design Systems is an issue in the integration of commercial off-the-shelf (COTS) components. It is a member of the impedance mismatch problems found when COTS components are assembled into systems. The mismatch occurs when the event, control sequence, or data semantics of two or more participating application domains are mismatched.
During the system integration process the impedance mismatch must be addressed through some means: either an integration layer which hides the mismatch, or an integrating service, such as CORBA, which facilitates the impedance adaptation between the applications.
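A minimal example of the "integration layer which hides the mismatch" is an adapter that translates one component's data semantics into another's. The record fields and units here are invented for illustration:

```python
# Adapter sketch for a data-semantics mismatch between two COTS components.
# Hypothetical component A reports lengths in millimetres under "len_mm";
# component B expects metres under "length_m". The adapter hides the
# mismatch, as an integration layer (or a service like CORBA) would
# at larger scale.

def from_component_a(record: dict) -> dict:
    """Translate component A's event record into component B's schema."""
    return {
        "part_id": record["id"],
        "length_m": record["len_mm"] / 1000.0,   # unit-semantics adaptation
    }

a_event = {"id": "P-42", "len_mm": 2500}
print(from_component_a(a_event))   # {'part_id': 'P-42', 'length_m': 2.5}
```

The same pattern extends to event and control-sequence mismatches, where the adapter reorders or buffers messages rather than renaming fields.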
Slides as given for the Feb. 12, 2014 talk at Bay Area Software Testers.
(BTW, I failed to give credit for the "Stand Back!" t-shirt image; it was from the XKCD t-shirt here: http://store-xkcd-com.myshopify.com/products/try-science)
I also forgot the reference to the paper on Fibonacci numbers in planning poker affecting estimates: http://simula.no/publications/Simula.simula.1282/simula_pdf_file
Delivering programs with less capability than promised, while exceeding the cost and planned durations, distorts decision making, contributes to increasing cost growth to other programs, undermines the Federal government’s credibility with taxpayers and contributes to the public’s negative support for these programs.
Many reasons have been hypothesized and documented for cost and schedule growth. The authors review some of these reasons, and propose that government and contractors use the historical variability of the past programs to establish cost and schedule estimates at the outset and periodically update these estimates with up-to-date risks, to increase the probability of program success. For this to happen, the authors recommend changes to estimating, acquisition and contracting processes.
A brief reflection on different ways to keep dogs from fouling the street. These range from notices issued by the authorities, whether appealing to the law or rewarding positive behaviour, to signs made by private individuals.
When we hear that a proposed process, tool, method, or any idea is all about "risk management," check to see if it covers these areas. If not, the suggestion is not really about risk management.
Building a Credible Performance Measurement Baseline – Glen Alleman
Establishing a credible Performance Measurement Baseline, with a risk adjusted Integrated Master Plan and Integrated Master Schedule, starts with the WBS and connects Technical Measures of progress to Earned Value
EIA-748-C asks us to “objectively assess accomplishments at the work performance level.” As well §3.8 of 748-C tells us “Earned Value is a direct measurement of the quantity of work accomplished. The quality and technical content of work performed is controlled by other processes.”
IS EARNED VALUE + AGILE A MATCH MADE IN HEAVEN?
Increasing the Probability of Program Success requires connecting the dots between EV and Agile Development.
Presented at
The Nexus of Agile Software Development and
Earned Value Management, OSD-PARCA,
February 19 – 20, 2015
Institute for Defense Analysis, Alexandria, VA
EIA-748-C asks us to objectively assess accomplishments at the work performance level. As well, §3.8 of 748-C tells us Earned Value is a direct measurement of the quantity of work accomplished; the quality and technical content of work are controlled by other processes. To provide visibility into integrated cost, schedule, and technical performance, we need more than CPI and SPI. We need measures of increasing technical performance.
There are four major questions that need answers when applying agile software development to DOD development programs:
1. How can Agile Development methods increase the Probability of Program Success (PoPS) on Earned Value programs?
2. How can Agile development be integrated with the FAR / DFAR and OMB mandates for program performance measures using Earned Value?
3. What are the “touch” points (or possible collision points) between Agile and EIA-748-C?
4. What are the measures of success for Agile methods in the context of EIA-748-C?
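The questions above hinge on EV mechanics, so as a reminder of those mechanics, here is a sketch pairing CPI/SPI with a Technical Performance Measure. Discounting EV by TPM compliance is one illustrative convention, not a prescription of EIA-748-C, and all figures are assumed.

```python
# Sketch: pairing CPI/SPI with a Technical Performance Measure (TPM).
# Tempering earned value by TPM achievement is one illustrative way to
# surface technical shortfall alongside cost and schedule efficiency.

bcws, bcwp, acwp = 1000.0, 900.0, 950.0   # assumed cumulative EV data
tpm_actual, tpm_planned = 42.0, 50.0      # e.g. achieved vs planned throughput

cpi = bcwp / acwp                          # cost efficiency (EV / actual cost)
spi = bcwp / bcws                          # schedule efficiency (EV / planned)
tpm_compliance = tpm_actual / tpm_planned  # technical performance to plan
adjusted_bcwp = bcwp * tpm_compliance      # EV tempered by technical maturity

print(f"CPI={cpi:.2f} SPI={spi:.2f} TPM-adjusted EV={adjusted_bcwp:.0f}")
```

With these assumed numbers, CPI and SPI alone show a mildly lagging program, while the TPM-adjusted view reveals a larger gap driven by technical shortfall.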
How Should We Estimate Agile Projects? (CAST) – Glen Alleman
“Why do so many big projects overspend and overrun? They’re managed as if they were merely complicated when in fact they are complex. They’re planned as if everything was known at the start when in fact they involve high levels of uncertainty and risk.” ‒ Architecting Systems: Concepts, Principles and Practice, Hillary Sillitto
Planning projects usually starts with tasks and milestones. The planner gathers this information from the participants – customers, engineers, subject matter experts – and it is usually arranged in the form of activities and milestones. The PMBOK defines "project time management" in this manner. The activities are then sequenced according to the project's needs and mandatory dependencies.
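Once activities are sequenced by their mandatory dependencies, a forward pass gives each activity's earliest finish and the project duration. A minimal sketch with an assumed four-activity network:

```python
# Forward-pass sketch over sequenced activities (assumed example network).
# Each activity starts when all its mandatory predecessors finish; the
# project duration is the longest (critical) path through the network.

activities = {   # name: (duration, [predecessors])
    "A": (3, []),
    "B": (5, ["A"]),
    "C": (2, ["A"]),
    "D": (4, ["B", "C"]),
}

def early_finishes(acts):
    """Earliest finish for each activity via a recursive forward pass."""
    memo = {}
    def ef(name):
        if name not in memo:
            dur, preds = acts[name]
            memo[name] = dur + max((ef(p) for p in preds), default=0)
        return memo[name]
    for n in acts:
        ef(n)
    return memo

duration = max(early_finishes(activities).values())
print(f"Project duration: {duration}")   # A -> B -> D: 3 + 5 + 4 = 12
```

Here the critical path runs through A, B, and D; activity C carries float and can slip without moving the finish date.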
Increasing the Probability of Project Success – Glen Alleman
Risk Management is essential for development and production programs. Information about key cost, performance and schedule attributes are often uncertain or unknown until late in the program.
Risk issues that can be identified early in the program, which may potentially impact the program, termed Known Unknowns, can be alleviated with good risk management. -- Effective Risk Management 2nd Edition, Page 1, Edmund Conrow, American Institute of Aeronautics and Astronautics, 2003
Cost and schedule growth for complex projects is created when unrealistic technical performance expectations, unrealistic cost and schedule estimates, inadequate risk assessments, unanticipated technical issues, and poorly performed, ineffective risk management contribute to project technical and programmatic shortfalls.
From Principles to Strategies for Systems Engineering – Glen Alleman
From Principles to Strategies: how to apply the Principles, Practices, and Processes of Systems Engineering to solve complex technical, operational, and organizational problems.
Capabilities-Based Planning defines the capabilities needed to accomplish a mission or fulfill a business strategy. Only when capabilities are defined can we start requirements elicitation.
Starting with the development of a Rough Order of Magnitude (ROM) estimate of work and duration, we create the Product Roadmap and Release Plan and the Product and Sprint Backlogs, execute and status the Sprint, and inform the Earned Value Management System using Physical Percent Complete as progress to plan.
Program Management Office, Lean Software Development, and Six Sigma – Glen Alleman
Successfully combining a PMO, Agile, and Lean / Six Sigma starts with understanding what benefit each paradigm brings to the table. Architecting a solution for the enterprise requires assembling a "system" of processes, people, and principles – all sharing the goal of business improvement.
This resource document describes the Program Governance Roadmap for product development, deployment, and sustainment of products and services in compliance with CMS guidance, ITIL IT management, CMMI best practices, and other guidance, to assure high-quality software is deployed for sustained operational success in mission-critical domains.
Increasing the Probability of Project Success with Five Principles and Practices – Glen Alleman
There are many approaches to managing projects in every domain.
This seminar lays the foundations for increasing the probability of project success, no matter the domain, what technology, what approach to delivering the outcomes of the project.
The principles of this approach are immutable.
The practices for implementing the principles are universally applicable.
Each chart in this presentation contains guidance that can be applied to your project, no matter the domain.
In our short hour here, we’re going to cover a lot of material.
The bibliography contains the supporting materials we can tailor to your individual project.
Seven Habits of a Highly Effective Agile Project Manager – Glen Alleman
Recent neurological studies indicate that the role of emotion in human cognition is essential; emotions are not a luxury. Instead, emotions play a critical role in rational decision–making, in perception, in human interaction, and in human intelligence. Habits are the intersection of knowledge, skill, and desire.
Elevating Tactical DDD Patterns Through Object Calisthenics – Dorra BARTAGUIZ
After immersing yourself in the blue book and its red counterpart, attending DDD-focused conferences, and applying tactical patterns, you're left with a crucial question: How do I ensure my design is effective? Tactical patterns within Domain-Driven Design (DDD) serve as guiding principles for creating clear and manageable domain models. However, achieving success with these patterns requires additional guidance. Interestingly, we've observed that a set of constraints initially designed for training purposes remarkably aligns with effective pattern implementation, offering a more ‘mechanical’ approach. Let's explore together how Object Calisthenics can elevate the design of your tactical DDD patterns, offering concrete help for those venturing into DDD for the first time!
GDG Cloud Southlake #33: Boule & Rebala: Effective AppSec in SDLC using Deplo... – James Anderson
Effective Application Security in Software Delivery lifecycle using Deployment Firewall and DBOM
The modern software delivery process (or the CI/CD process) includes many tools, distributed teams, open-source code, and cloud platforms. Constant focus on speed to release software to market, along with the traditional slow and manual security checks has caused gaps in continuous security as an important piece in the software supply chain. Today organizations feel more susceptible to external and internal cyber threats due to the vast attack surface in their applications supply chain and the lack of end-to-end governance and risk management.
The software team must secure its software delivery process to avoid vulnerability and security breaches. This needs to be achieved with existing tool chains and without extensive rework of the delivery processes. This talk will present strategies and techniques for providing visibility into the true risk of the existing vulnerabilities, preventing the introduction of security issues in the software, resolving vulnerabilities in production environments quickly, and capturing the deployment bill of materials (DBOM).
Speakers:
Bob Boule
Robert Boule is a technology enthusiast with PASSION for technology and making things work along with a knack for helping others understand how things work. He comes with around 20 years of solution engineering experience in application security, software continuous delivery, and SaaS platforms. He is known for his dynamic presentations in CI/CD and application security integrated in software delivery lifecycle.
Gopinath Rebala
Gopinath Rebala is the CTO of OpsMx, where he has overall responsibility for the machine learning and data processing architectures for Secure Software Delivery. Gopi also has a strong connection with our customers, leading design and architecture for strategic implementations. Gopi is a frequent speaker and well-known leader in continuous delivery and integrating security into software delivery.
GraphRAG is All You Need? LLM & Knowledge Graph – Guy Korland
Guy Korland, CEO and Co-founder of FalkorDB, will review two articles on the integration of language models with knowledge graphs.
1. Unifying Large Language Models and Knowledge Graphs: A Roadmap.
https://arxiv.org/abs/2306.08302
2. Microsoft Research's GraphRAG paper and a review paper on various uses of knowledge graphs:
https://www.microsoft.com/en-us/research/blog/graphrag-unlocking-llm-discovery-on-narrative-private-data/
Essentials of Automations: Optimizing FME Workflows with ParametersSafe Software
Are you looking to streamline your workflows and boost your projects’ efficiency? Do you find yourself searching for ways to add flexibility and control over your FME workflows? If so, you’re in the right place.
Join us for an insightful dive into the world of FME parameters, a critical element in optimizing workflow efficiency. This webinar marks the beginning of our three-part “Essentials of Automation” series. This first webinar is designed to equip you with the knowledge and skills to utilize parameters effectively: enhancing the flexibility, maintainability, and user control of your FME projects.
Here’s what you’ll gain:
- Essentials of FME Parameters: Understand the pivotal role of parameters, including Reader/Writer, Transformer, User, and FME Flow categories. Discover how they are the key to unlocking automation and optimization within your workflows.
- Practical Applications in FME Form: Delve into key user parameter types including choice, connections, and file URLs. Allow users to control how a workflow runs, making your workflows more reusable. Learn to import values and deliver the best user experience for your workflows while enhancing accuracy.
- Optimization Strategies in FME Flow: Explore the creation and strategic deployment of parameters in FME Flow, including the use of deployment and geometry parameters, to maximize workflow efficiency.
- Pro Tips for Success: Gain insights on parameterizing connections and leveraging new features like Conditional Visibility for clarity and simplicity.
We’ll wrap up with a glimpse into future webinars, followed by a Q&A session to address your specific questions surrounding this topic.
Don’t miss this opportunity to elevate your FME expertise and drive your projects to new heights of efficiency.
Smart TV Buyer Insights Survey 2024 by 91mobiles.pdf – 91mobiles
91mobiles recently conducted a Smart TV Buyer Insights Survey in which we asked over 3,000 respondents about the TV they own, aspects they look at on a new TV, and their TV buying preferences.
Accelerate your Kubernetes clusters with Varnish Caching – Thijs Feryn
A presentation about the usage and availability of Varnish on Kubernetes. This talk explores the capabilities of Varnish caching and shows how to use the Varnish Helm chart to deploy it to Kubernetes.
This presentation was delivered at K8SUG Singapore. See https://feryn.eu/presentations/accelerate-your-kubernetes-clusters-with-varnish-caching-k8sug-singapore-28-2024 for more details.
Software Delivery At the Speed of AI: Inflectra Invests In AI-Powered Quality – Inflectra
In this insightful webinar, Inflectra explores how artificial intelligence (AI) is transforming software development and testing. Discover how AI-powered tools are revolutionizing every stage of the software development lifecycle (SDLC), from design and prototyping to testing, deployment, and monitoring.
Learn about:
• The Future of Testing: How AI is shifting testing towards verification, analysis, and higher-level skills, while reducing repetitive tasks.
• Test Automation: How AI-powered test case generation, optimization, and self-healing tests are making testing more efficient and effective.
• Visual Testing: Explore the emerging capabilities of AI in visual testing and how it's set to revolutionize UI verification.
• Inflectra's AI Solutions: See demonstrations of Inflectra's cutting-edge AI tools like the ChatGPT plugin and Azure Open AI platform, designed to streamline your testing process.
Whether you're a developer, tester, or QA professional, this webinar will give you valuable insights into how AI is shaping the future of software delivery.
Builder.ai Founder Sachin Dev Duggal's Strategic Approach to Create an Innova... – Ramesh Iyer
In today's fast-changing business world, companies that fail to adapt and embrace new ideas struggle to keep up with the competition. However, fostering a culture of innovation takes much work: it takes vision, leadership, and a willingness to take risks in the right proportion. Sachin Dev Duggal, co-founder of Builder.ai, has perfected the art of this balance, creating a company culture where creativity and growth are nurtured at each stage.
Epistemic Interaction – tuning interfaces to provide information for AI support – Alan Dix
Paper presented at SYNERGY workshop at AVI 2024, Genoa, Italy. 3rd June 2024
https://alandix.com/academic/papers/synergy2024-epistemic/
As machine learning integrates deeper into human-computer interactions, the concept of epistemic interaction emerges, aiming to refine these interactions to enhance system adaptability. This approach encourages minor, intentional adjustments in user behaviour to enrich the data available for system learning. This paper introduces epistemic interaction within the context of human-system communication, illustrating how deliberate interaction design can improve system understanding and adaptation. Through concrete examples, we demonstrate the potential of epistemic interaction to significantly advance human-computer interaction by leveraging intuitive human communication strategies to inform system design and functionality, offering a novel pathway for enriching user-system engagements.
Neuro-symbolic is not enough, we need neuro-*semantic* – Frank van Harmelen
Neuro-symbolic (NeSy) AI is on the rise. However, simply machine learning on just any symbolic structure is not sufficient to really harvest the gains of NeSy. These will only be gained when the symbolic structures have an actual semantics. I give an operational definition of semantics as “predictable inference”.
All of this illustrated with link prediction over knowledge graphs, but the argument is general.
Key Trends Shaping the Future of Infrastructure.pdf – Cheryl Hung
Keynote at DIGIT West Expo, Glasgow on 29 May 2024.
Cheryl Hung, ochery.com
Sr Director, Infrastructure Ecosystem, Arm.
The key trends across hardware, cloud, and open source: exploring how these areas are likely to mature and develop over the short and long term, and considering how organisations can position themselves to adapt and thrive.
Kubernetes & AI – Beauty and the Beast!?! @KCD Istanbul 2024 – Tobias Schneck
As AI technology pushes into IT, I was wondering, as an “infrastructure container Kubernetes guy”, how does this fancy AI technology get managed from an infrastructure operations view? Is it possible to apply our lovely cloud-native principles as well? What benefits could both technologies bring to each other?
Let me take these questions and give you a short journey through existing deployment models and use cases for AI software. With practical examples, we discuss what cloud/on-premise strategy we may need to apply it to our own infrastructure, from an enterprise perspective. I want to give an overview of infrastructure requirements and technologies, and what could be beneficial or limiting for your AI use cases in an enterprise environment. An interactive demo will give you some insights into the approaches I already have working for real.
1. The Integrated Master Plan and Integrated Master Schedule
The Integrated Master Plan and Integrated Master Schedule are one of the six elements of a Credible Performance Measurement Baseline (PMB). The PMB is the source of data for the Program Manager. Some of this data is used in the Integrated Program Management Report (IPMR). Some is used in the assessment of the Program's risk. Some defines the deliverables and their Technical Performance Measures.
V8.7
2. The IMP tells us: where is the program going?
The Plan describes where we are going, the various paths we can take to reach our destination, and the progress or performance assessment points along the way to assure we are on the right path.
These assessment points measure the "maturity" of the product or service against the planned maturity. This is the only real measure of progress – not the passage of time or consumption of money.
The Integrated Master Plan (IMP) is a strategy for the successful completion of the project.
5.0 Start with the IMP
4. Quick View of IMP/IMS
! Vertical traceability defines the increasing maturity of key deliverables
! Horizontal traceability defines the work activities needed to produce this increasing maturity
! Both are needed, but the vertical traceability is the starting point
! Program Events, Significant Accomplishments, and Accomplishment Criteria must be defined before the horizontal work activities can be identified
! For all IMP elements, Key Risks must be identified and assigned from Day One, even without mitigations
5. The IMP/IMS is Needed on Both Sides of the Contract
! Vertical traceability defines the increasing maturity of the program's deliverables, measured in Effectiveness (MoE) and Performance (MoP) for the Government.
! Horizontal traceability defines the progress to plan for MoEs and MoPs with tangible evidence for both the Government and the Contractor.
! Vertical traceability provides the Government with insight into the progress of the MoEs and MoPs and Technical Performance Measures.
! Horizontal traceability provides insight into Cost and Schedule performance for the Contractor, reportable through the IPMR to the Government.
6. Misconceptions of the IMP / IMS
Why we don't need/want an IMP/IMS:
! Only required for ACAT I programs
! Too big and burdensome for our small dollar value program
! Contractor spends B&P and program budget generating and maintaining the IMP, without measurable benefit
! Doesn't apply on a services contract
! Management tool, not a technical tool
! Doesn't apply to technology programs
! Doesn't apply to R&D efforts
! Doesn't apply to the government
! Help me get a waiver so I don't have to use an IMP/IMS
7. Attributes of the IMP
! Traceability
– Expands and complies with the SOO, Performance Requirements, CWBS, and CSOW
– Based on the customer's WBS
– Is the basis of the IMS, cost reports, and award fees
! Implements a measurable and trackable program
– Accomplishes integrated product development
– Integrates the functional activities of the program
– Incorporates functional, lower level, and S/C IMPs
! Provides for evaluation of Program Maturity
– Provides insight into the overall effort
– Level of detail is consistent with risk and complexity per §L
– Decomposes events into a logical series of accomplishments
– Measurable criteria demonstrate completion / quality of accomplishments
8. Attributes of the IMS
! Integrated, networked, multi-layered schedule of efforts required to achieve each IMP accomplishment
– Detailed tasks and work to be completed
– Calendar schedule shows work completion dates
– Network schedule shows interrelationships and critical path
– Expanded granularity, frequency, and depth of risk areas
! Resource loading
! Correlates IMS work with IMP events
9. The Importance of the IMP
! The IMP/IMS is the single most important document to a program's success
– It clearly demonstrates the provider's understanding of the program requirements and the soundness of the approach as represented by the plan
! The program uses the IMP/IMS to provide:
– Up front planning and commitment from all participants
– A balanced design discipline with risk mitigation activities
– Integrated requirements including production and support
– Management with an incremental verification for informed program decisions
10. Just a reminder, before moving on
Page 47, Defense Acquisition Guide, January 10, 2012
Page 317, Defense Acquisition Guide, January 10, 2012
11. Our Goal is simple …
How can we recognize the Reality of the Program's current status and its future performance?
The top spins continuously while in a dream – stops spinning in the real world – Cobb's totem, Inception
13. Principles of Building a Credible Integrated Master Plan
Building the IMP is a Systems Engineering activity. The Integrated Master Plan is the Program Architecture in the same way the hardware and software are the Product Architecture. Poor, weak, or unstructured Programmatic Architecture reduces visibility to the Product Architecture's performance measures of cost and schedule connected with Technical Performance Measures.
14. Quick View of Building the IMP
! Start with each Program Event and define the Significant Accomplishments and their entry and exit criteria to assess the needed maturity of the key deliverables
! Arrange the Significant Accomplishments in the proper dependency order
! Segregate these Significant Accomplishments into swim lanes for IPTs
! Define the dependencies between each SA
6.0 Build IMP
15. A Critical Understanding of the IMP
The IMP defines the connections between the Product maturity – Vertical – and the implementation of this Product maturity through the Functional activities – the Horizontal.
16. Benefits of this Formality
Objective → Implementation
! Event driven plan versus schedule driven plan → Plan is based on completion of tasks, not passage of time
! Separate the plan (IMP) from the schedule (IMS) → Link elements with a numbering system
! Condensed, easy to read "Plan" showing the "Events" rather than the work effort → Indentured, outline format – not text
! Pre-defined entry and exit criteria for major Program Events → Significant Accomplishments for each key Program Event
! Objective measures of progress and completion for each Accomplishment → Pre-defined Accomplishment Criteria (AC) for each Significant Accomplishment (SA)
! Stable, contractual plan, flexible enough to portray program status → IMP part of the contract, IMS is a data item
! Capture essence of the functional progress without mandating a particular process for performing the work → Split IMP into Product and Process
17. Risk Management – Building the IMP Starts at the RFP with Systems Engineering Measures
[Diagram: The SOO, ConOps, and SOW flow through the Technical and Operational Requirements and the CWBS & CWBS Dictionary into the Integrated Master Plan (IMP), the Integrated Master Schedule (IMS), the Earned Value Management System, and the Performance Measurement Baseline – providing objective status and essential views to support the proactive management processes needed to keep the program GREEN. Measures of Effectiveness, Measures of Performance, TPMs and QBDs, JROC Key Performance Parameters, Program Specific Key Performance Parameters, and Technical Performance Measures connect through the WBS / CWBS.]
18. IMP captures end user requirements in terms of MOEs, MOPs, KPPs, and TPMs
[Diagram: The SOW, ConOps, WBS / CWBS, and Technical Requirements feed the Measures of Effectiveness (MOE), Measures of Performance (MOP), Key Performance Parameters (KPP), and Technical Performance Measures (TPM), which flow into the Integrated Master Plan (IMP), the Integrated Master Schedule (IMS), and the Performance Measurement Baseline (PMB).]
19. The IMP / IMS Structure
IMP – Describes how program capabilities will be delivered and how these capabilities will be recognized as ready for delivery:
– Events or Milestones
– Accomplishments
– Criteria
IMS:
– Work Packages and Tasks
– Supplemental Schedules (CAM Notebook)
6.0 Build IMP
20. The IMP/IMS provides Horizontal and Vertical Traceability of progress to plan
! Vertical traceability: AC → SA → PE
! Horizontal traceability: WP → WP → AC
Program Events define the maturity of a Capability at a point in time. Significant Accomplishments represent requirements that enable Capabilities. Accomplishment Criteria are the exit criteria for the Work Packages that fulfill Requirements.
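The traceability described above can be sketched as a small data model: Program Events (PE) decompose into Significant Accomplishments (SA), which decompose into Accomplishment Criteria (AC), each satisfied by Work Packages (WP). This is a minimal, illustrative sketch – the class and field names are assumptions, not a mandated schema.

```python
# Minimal sketch of the PE -> SA -> AC -> WP hierarchy (illustrative names).
from dataclasses import dataclass, field

@dataclass
class WorkPackage:
    name: str

@dataclass
class AccomplishmentCriterion:
    text: str
    work_packages: list = field(default_factory=list)

@dataclass
class SignificantAccomplishment:
    text: str
    criteria: list = field(default_factory=list)

@dataclass
class ProgramEvent:
    name: str
    accomplishments: list = field(default_factory=list)

def vertically_traceable(pe: ProgramEvent) -> bool:
    """True when every SA has at least one AC and every AC has at least
    one Work Package -- the AC -> SA -> PE vertical trace is unbroken."""
    return all(
        bool(sa.criteria) and all(ac.work_packages for ac in sa.criteria)
        for sa in pe.accomplishments
    )

pdr = ProgramEvent("PDR", [
    SignificantAccomplishment(
        "Subsystem requirements finalized and allocated",
        [AccomplishmentCriterion("RVTM approved", [WorkPackage("Build RVTM")])],
    )
])
```

A PE whose SAs lack criteria or whose ACs have no supporting Work Packages fails the check – the same gap the vertical trace is meant to expose.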
21. The IMP's role during Execution
[Diagram: Continuity and consistency from DRFP & RFP, through Proposal Submittal and the PMB for IBR, into Program Execution. Capabilities Based Requirements and the Statement of Work define the Program Deliverables in the WBS. The IMP's Events (E), Accomplishments (A), and Criteria (C) drive the IMS Tasks (T). BOE budget spreads by Control Account and Work Package (CAIV) produce BCWS; Physical % Complete and Technical Performance Measures produce BCWP; timekeeping and ODC produce ACWP. Probabilistic Risk Analysis and the Cost & Schedule Risk Model decrease technical and programmatic risk using Risk Management Methods, all within the Performance Measurement Baseline and EVMS.]
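The earned-value quantities in the diagram above (BCWS, BCWP, ACWP) relate through simple arithmetic: earned value (BCWP) comes from physical percent complete, never from passage of time. A minimal sketch, with illustrative function names:

```python
# Earned value arithmetic sketch. BCWP is taken from Physical % Complete,
# not from calendar time -- the point the slide makes about "done".
def bcwp(bcws: float, physical_pct_complete: float) -> float:
    """Budgeted Cost of Work Performed (earned value)."""
    return bcws * physical_pct_complete

def spi(bcwp_value: float, bcws_value: float) -> float:
    """Schedule Performance Index; > 1.0 means ahead of plan."""
    return bcwp_value / bcws_value

def cpi(bcwp_value: float, acwp_value: float) -> float:
    """Cost Performance Index; > 1.0 means under cost."""
    return bcwp_value / acwp_value

# A Work Package with $100,000 BCWS, physically 40% complete:
earned = bcwp(bcws=100_000.0, physical_pct_complete=0.40)  # 40_000.0
```

With these three numbers a Control Account Manager can report progress to plan in units meaningful to the decision makers rather than in hours consumed.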
22. The Grammar of the Integrated Master Plan (IMP)
! IMP phrases are in the past tense – they describe what DONE looks like
! IMS phrases are in the present tense – they describe the work needed to arrive at DONE
Maturity | Product | Action | Product State
Adjective | Noun | Verb | Verb
Demonstrates maturity | End item | Step in the process | Final status
Preliminary | Model/Sim | Design | Complete
→ "Preliminary Modeling and Simulation Design Complete"
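The grammar rule above can be checked mechanically: an IMP entry must end in an approved past-tense verb from the program's verb dictionary. A minimal sketch – the verb list here is a tiny illustrative sample, not the program's official dictionary:

```python
# Sketch of an IMP grammar check against a verb dictionary (illustrative list).
IMP_VERBS = {
    "complete", "completed", "established", "baselined",
    "verified", "approved", "conducted", "delivered", "operational",
}

def valid_imp_phrase(phrase: str) -> bool:
    """An IMP entry must end with an approved past-tense / state verb."""
    words = phrase.lower().rstrip(".").split()
    return bool(words) and words[-1] in IMP_VERBS

print(valid_imp_phrase("Segment build 4 detailed design completed"))  # True
print(valid_imp_phrase("Complete the design"))                        # False
```

The second phrase fails because it is an imperative work statement (IMS style), not a past-tense statement of done (IMP style).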
23. 1st Principle – IMP Building is a Full Contact Sport
24. Our First Approach to the IMP / IMS Paradigm
! The 1st approach defines Program Events (PE), Significant Accomplishments (SA), and Accomplishment Criteria (AC), derived from the Work Breakdown Structure or the SOW
! This 1st approach can be done in 6 easy steps:
1. Identify the Program Events (PE) (as the ACQ Guide tells us)
2. Identify the Significant Accomplishments (SA) from the WBS deliverables
3. Identify the Accomplishment Criteria (AC) needed to produce these deliverables
4. Identify the Work Packages for each Accomplishment Criteria
5. Sequence the Work Packages
6. Assemble the IMP/IMS
25. This approach doesn't give us visibility into what "done" looks like
! We must measure increasing product maturity in units meaningful to the decision makers
! We must see the risks before they arrive so we can take corrective action
26. First Look at a Significant Accomplishment (SA)
! SAs are interim and final steps to define, design, develop, verify, produce, and deploy the product or system.
! SAs must occur in a manner that ensures a logical path is maintained throughout the development effort.
! SAs are event related and not just time coincidental.
! SAs should have one or more of the following characteristics:
– Consists of a discrete step in the process of planned development that must be complete prior to an event
– Produces a desired result at a specified event that indicates a level of design maturity (or progress) directly related to each product and process
– Defines interrelationships, interdependencies, or "hand-off" points of different functional disciplines applied to the program
SAs must assess compliance with Measures of Effectiveness.
27. First look at an Accomplishment Criteria (AC)
! ACs are definitive measures directly supporting successful completion of a significant accomplishment.
! ACs show objective evidence of work progress (maturity of a product); i.e., they can be seen, read, demonstrated, or quantified. These results are usually incorporated into a report or document as evidence of accomplishment.
! ACs are prerequisites for completion of an SA (i.e., exit criteria).
! The questions that need to be repeatedly asked when developing ACs are:
– How do I know when an accomplishment has been completed?
– Is the criteria directly related to the accomplishment?
– Is it proof?
– What is the work product?
ACs must assess compliance with Measures of Performance.
28. The IMP speaks to Measures of Effectiveness (MoE) and Measures of Performance (MoP)
! This is where TPMs are connected with the MoEs and MoPs
! For each deliverable from the program, all the "measures" must be defined in units meaningful to the decision makers.
! Here are some "real" examples:
Measures of Effectiveness (MoE) – Mission Capabilities and Operational Need:
1. Provide Precision Approach for a 200 FT/0.5 NM DH
2. Provide bearing and range to AC platform
3. Provide AC surveillance to GRND platform
JROC Key Performance Parameters (KPP):
1. Net Ready
2. Guidance Quality
3. Land Interoperability
4. Manpower
5. Availability
Measures of Performance (MoP):
1. Net Ready – IPv4/6 compliance; 1Gb Ethernet
2. Guidance quality – Accuracy threshold p70 @ 6M; Integrity threshold 4M @ 10^-6 per approach
3. Land interoperability – Processing capability meets LB growth matrix
4. Manpower – MTBC > 1000 hrs; MCM < 2 hrs
5. Availability – Clear threshold > 99%; Jam threshold > 90%
Technical Performance Measures (TPM) – Technical Insight: risk adjusted performance to plan:
1. Net Ready – Standard message packets
2. Guidance Quality – Multipath allocation budget; Multipath bias protection
3. Land Interoperability – MOSA compliant; Civil compliant
4. Manpower – Operating elapsed time meters; Standby elapsed time indicators
5. Availability – Phase center variations
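Measures like the ones above only become actionable when each has an explicit threshold and a pass/fail assessment. A minimal sketch, reusing the slide's Manpower and Availability thresholds (the dictionary structure and function names are illustrative assumptions):

```python
# Sketch: assess TPM actuals against thresholds (values from the slide's
# examples; the structure itself is illustrative, not a mandated format).
import operator

TPM_THRESHOLDS = {
    # measure: (comparison, threshold value)
    "MTBC_hours":         (operator.gt, 1000),   # Mean Time Between Critical failures
    "MCM_hours":          (operator.lt, 2),      # Mean Corrective Maintenance time
    "clear_availability": (operator.gt, 0.99),   # Clear threshold > 99%
    "jam_availability":   (operator.gt, 0.90),   # Jam threshold > 90%
}

def assess(actuals: dict) -> dict:
    """Pass/fail per TPM, in units meaningful to the decision makers."""
    return {
        measure: compare(actuals[measure], limit)
        for measure, (compare, limit) in TPM_THRESHOLDS.items()
        if measure in actuals
    }

status = assess({"MTBC_hours": 1250, "MCM_hours": 2.5})
# status == {"MTBC_hours": True, "MCM_hours": False}
```

A failing measure here is exactly the kind of leading indicator the IMP is meant to surface before the Program Event, not after.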
29. The 1st Problem with the Initial IMP/IMS Paradigm
! There is no single source of guidance for constructing a credible IMP and IMS
– A DOD Guidebook, but still in Version 0.9
– A few DOD service pamphlets
– A commercial guidebook
– Many contractor guidelines
! No definitive guidance from DoD on what makes a good IMP
! No definitive DID requiring an IMP on specific programs
30. The IMP is a good start, but we're really after the IMP Narratives
! The IMP Narratives start with the Statement of Objectives (SOO) or Concept of Operations (ConOps)
– These identify the top level program objectives
– They define the big picture and provide pre-award trade space
– They provide the framework for the Contractor to develop the proposal through the IMP
! Together with the Government's Executive Summary, this provides the contractor an understanding of what is needed and what is important
31. The IMP Process Narrative
The IMP Process Narrative describes how the technical and business elements of the program will be conducted, monitored, and controlled. Narratives provide the customer visibility into the contractor's key functional processes and procedures, the relationship of these processes and procedures, and an overview of the effort required to implement them.
32. IMP Narrative for PDR
Program Event: PDR
Event Description: PDR establishes the "design-to" allocated baseline to the subsystem level, assures this design meets the functional baseline, and assures system requirements have been properly allocated to the proper subsystem. PDR establishes the feasibility of the design approach to meet the technical requirements and provide acceptable interface relationships between the hardware and other interfacing items. Any changes to the requirements that have occurred since the System Requirements Review (SRR2) will be verified at the PDR. PDR assures the design is verifiable, does not pose major IMS or Cost risk, and is mature enough to advance to the detailed design phase – CDR.
PE Maturity Assessment shown in SAs:
! Subsystem level operational concepts defined
! System level interfaces baselined
! Supportability plans established
! Software requirements finalized
! Subsystem requirements finalized & allocated
! System verification, validation & certification plans updated
! PDR subsystem design completed
33. What does the IMP Narrative Tell Us?
! States the objective of the processes used to build the products described in the SOW
! Provides the governing documents, compliance, and reference for the process activities
! Explains the process approach
– Portrays the key activities of the approach
– Illustrates the processes tailored for the specific program
[Diagram: Inputs, Tools, and Metrics feed Activities 1–5, producing Outputs.]
37. 5000.01 Part 2.B.3 Acquisition Strategies, Exit Criteria, and Risk Management
"Event driven acquisition strategies and program plans must be based on rigorous, objective assessments of a program's status and the plans for managing risk during the next phase and the remainder of the program. The acquisition strategy and associated contracting activities must explicitly link milestone decision reviews to events and demonstrated accomplishments in development, testing, and initial production. The acquisition strategy must reflect the interrelationships and schedule of acquisition phases and events based on logical sequence of demonstrated accomplishments not on fiscal or calendar expediency."
38. What makes a good Program Event (PE)?
! Events are the conclusion of an interval of major program accomplishments (SA) with their criteria (AC)
! IMP events represent key decision transition points between major activities distributed over the contract period
– The IMP is a Mini Authorization to Proceed (ATP) to the next Program Event (PE)
! Some guidance for establishing program/product events:
– Customer Given Events
– Key Decisions Needed
– Risk Mitigation Events
– DOD Systems Engineering Technical Review (SETR) Guidance
39. What makes a good Significant Accomplishment (SA)?
! IPT can manage it at a working level
! Shows completion and results of discrete steps in the development process
! Indicates maturity of the product through MoEs and MoPs
! Its "significance" measures program event status
! Relevant and logically linked to the proper PE
– Just because the work occurs during the time-frame for PE A doesn't mean it's logically linked to PE A
! Uses consistent language, style, and format through a verb dictionary, for example:
– Segment build 4 detailed design completed
– Analysis of structural integrity completed
– Structural integrity verified
40. IMP Significant Accomplishments
! SAs are NOT just a list of "things" to do before the Program Event (PE)
! They are sequenced accomplishments, each of which leads to the PE, e.g. Critical Design Review (CDR), each increasing the maturity of the deliverables
– SA #1 = CDR meeting conducted
– SA #2 = CDR action item work-off plan established
– SA #3 = 85% drawings completed
– SA #4 = CDR CDRLs delivered
– SA #5 = Development environment operational
– SA #6 = Critical methods analyses completed
– SA #7 = RVTM approved
41. What makes a good Accomplishment Criteria (AC)?
! Provides objective, measurable, and explicit evidence of completion and closure of the work activities in Work Packages that satisfies the Measure of Performance.
! Defines conditions for closing the Significant Accomplishment (SA).
! Answers the question: how do we know when a Significant Accomplishment has been completed?
42. What makes a Not So Good SA or AC?
! Not Significant
– Too small to significantly contribute to successful event completion
– Would lead to trivial tasks (e.g., 1 day duration)
! Ambiguous
– Reader can't tell what Done looks like
! Wrong verb
– Uses a verb that's not on the list (Dictionary)
– Uses a listed verb incorrectly
– Doesn't have a verb at all
! Not measurable
– Can't tell when we're done
! Too Many SAs or ACs
– Confuses the reader and confuses the execution process of the program
– Dilutes the MoE and MoP
– Reduces visibility into increasing maturity
43. F-22 Example
! Program Event (PE)
– A PE assesses the readiness or completion as a measure of progress
– First Flight Complete
! Significant Accomplishment (SA)
– The desired result(s), prior to or at completion of an event, that demonstrate the level of the program's progress
– Flight Test Readiness Review Complete
! Accomplishment Criteria (AC)
– Definitive evidence (measures or indicators) that verify a specific accomplishment has been completed
– SEEK EAGLE Flight Clearance Obtained
44. Now comes the Hard Part
! PEs are easy; they are in the SETR and Integrated Logistics Lifecycle Management System
! The SAs can be defined from the program's deliverables – these may not be obvious but can be discovered with a Product Development Kaizen process (more later)
! It's the ACs that are the hard part – the ACs must represent the exit criteria for the series of Work Packages that do the work to produce the product
45. 6 Steps to IMP Development
Let's start to build the IMP. This step-by-step process needs to be followed carefully. The IMP is constructed one Program Event at a time – Left to Right in time. To do otherwise allows confusion and disconnection between Program Events to occur and dilutes our focus on defining what Done looks like for each Program Event.
V8.7
47. Quick View of Step-By-Step IMP
1. Identify Program Events
2. Identify Significant Accomplishments
3. Identify Accomplishment Criteria
4. Identify Work Packages needed to complete the Accomplishment Criteria
5. Sequence the Work Packages (WP), Planning Packages (PP), and Summary Level Planning Packages (SLPP) in a logical network
6. Adjust the sequence of WPs, PPs, & SLPPs to mitigate major risks
7.0 6 Steps
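Step 5 above sequences the Work Packages into a logical network. One way to sketch that sequencing is a topological sort over WP dependencies – a minimal illustration with made-up Work Package names, not a scheduling-tool replacement:

```python
# Sketch of Step 5: order Work Packages so every predecessor finishes first.
# Work Package names and dependencies are illustrative.
from graphlib import TopologicalSorter  # stdlib, Python 3.9+

# each WP maps to the set of WPs that must complete before it can start
dependencies = {
    "WP-Design": set(),
    "WP-Build":  {"WP-Design"},
    "WP-Test":   {"WP-Build"},
    "WP-Review": {"WP-Design"},
}

order = list(TopologicalSorter(dependencies).static_order())
# "WP-Design" always precedes "WP-Build", which precedes "WP-Test"
```

A cycle in the dependencies raises `graphlib.CycleError` – the programmatic equivalent of a schedule logic error that Step 6's risk-driven adjustment would need to resolve before baselining.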
49. Outcomes of Step 1
! Confirm the end to end description of the increasing maturity of the program's deliverables
! Establish the RFP or Contract target dates for each Event
! Socialize the language of speaking in "Events" rather than time and efforts
50. Events Define the Assessment of the Program's Maturity
! Program Events are maturity assessment points in the program
! They define what levels of maturity for the products and services are needed before proceeding to the next maturity assessment point
! The entry criteria for each Event define the units of measure for the successful completion of the Event
! The example below is typical of the purpose of a Program Event:
The Critical Design Review (CDR) is a multi-disciplined product and process assessment to ensure that the system under review can proceed into system fabrication, demonstration, and test, and can meet the stated performance requirements within cost (program budget), schedule (program schedule), risk, and other system constraints.
51. Identify the Significant Accomplishments (SA) for Each Program Event (PE) – Actors, Processes, and Outcomes
! System Engineer – Identify the Integrated Product Teams (IPT) responsible for the SAs
– Define the boundaries of these programmatic interfaces
– Define technical and process risk categories and their bounds
! Technical Lead – Confirm the sequence of SAs has the proper dependency relationships
– Define how the product development flow process improves maturity
– Define technical risk drivers
! Project Engineer – Confirm the logic of SAs for project sequence integrity
– Define how the program flow improves maturity
! Control Account Manager – Validate SA outcomes in support of PE entry conditions
– Confirm budget and resources are adequate for the defined work effort
! IMP/IMS Architect – Assure the assessment points provide a logical flow of maturity at the proper intervals for the program
– Maintain the integrity of the IMP, WBS, and IMS
52. Outcomes of Step 2
! The Significant Accomplishments are the "road map" to the increasing maturity of the program
! The "Value Stream Map" resulting from the flow of SAs describes how the products or services move through the maturation process while reducing risk
! The SA map is the path to "done"
53. SAs define the entry criteria for each Program Event
[Example: Preliminary Design Review Complete]
54. Identify Accomplishment Criteria (AC) for each Significant Accomplishment (SA) – Actors, Processes, and Outcomes
! CAM – Define and sequence the contents of each Work Package and select the EV criteria for each Task needed to roll up the BCWP measurement
– Establish ownership for the content of each Work Package and the Exit Criteria – the Accomplishment Criteria (AC)
! Project Engineer – Identify the logical process flow of the Work Packages to assure the least effort, maximum value, and lowest risk path to the Program Event
– Establish ownership for the process flow of the product or service
! Technical Lead – Assure all technical processes are covered in each Work Package
– Establish ownership for the technical outcome of each Work Package
! IMP/IMS Architect – Confirm the process flow of the ACs can follow the DID 81650 structuring and Risk Assessment processes
– Guide the development of outcomes for each Work Package to assure increasing maturity of the program
55. Outcomes of Step 3
! The definition of "done" emerges in the form of deliverables rather than measures of cost and passage of time.
! At each Program Event, the increasing maturity of the deliverables is defined through the Measures of Effectiveness (MoE) and Measures of Performance (MoP)
56. ACs are higher fidelity descriptions of "Done" than SAs
[Example: Critical Design Review Complete]
57. Identify the Work for Each Accomplishment Criteria in Work Packages – Actors, Processes, and Outcomes
! Control Account Manager – Identify or confirm the work activities in the Work Package represent the allocated work
– Define the bounded work effort "inside" each Work Package
! Technical Lead – Confirm this work covers the SOW and CDRLs
– Define all work effort for 100% completion of the deliverable, visible in a single location – the Work Package
– Confirm risk drivers and duration variances
! IMP/IMS Architect – Assist in sequencing the work efforts in a logical manner
– Develop the foundation of the maturity flow starting to emerge from the contents of the Work Packages
! Earned Value Analyst – Assign initial BCWS from the BOE to each Work Package
– Confirm the work effort against the BOEs
– Define the EVT for measures of progress to plan
58. Outcomes of Step 4
! The work that produces a measurable outcome is identified
! This work is defined in each Work Package
! The Accomplishment Criteria (AC) state explicitly what "done" looks like for this effort
! With "done" stated, Measures of Performance and Measures of Effectiveness can be defined
59. Work is done in "packages" that produce measurable outcomes
[Example: Launch Readiness Review Complete]
60. Sequence Work Packages (ACs) for each Significant Accomplishment (SA) – Actors, Processes, and Outcomes
! Control Account Manager – Define the order of the Work Packages needed to meet the Significant Accomplishments for each Program Event
– Define the process flow of work and the resulting accomplishments
– Assure value is being produced at each SA and the ACs that drive them
! IMP/IMS Architect – Assure that the sequence of Work Packages adheres to the guidance provided by DCMA and the EVMS System Description
– Begin the structuring of the IMS for compliance and loading into the cost system
! Program Controls Staff – Baseline the sequence of Work Packages using Earned Value Techniques (EVT) with measures of Physical Percent Complete
– Develop insight into progress to plan with measures of physical progress for each Work Package (EVT)
61. Outcomes of Step 5
! Work Packages partition work efforts into "bounded" scope
! Interdependencies constrained to Work Package boundaries prevent "spaghetti code" style schedule flow
! Visibility of the increasing flow of maturity starts to emerge from the flow of Accomplishment Criteria (AC)
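The Physical Percent Complete measures referenced above roll up from Work Packages to their Significant Accomplishment by weighting each WP by its budget (BCWS). A minimal sketch with illustrative numbers:

```python
# Sketch: roll up Physical % Complete from Work Packages to an SA,
# weighted by each WP's BCWS. Values are illustrative.
def rollup_percent_complete(work_packages):
    """work_packages: list of (bcws, physical_pct_complete) tuples."""
    total_bcws = sum(bcws for bcws, _ in work_packages)
    earned = sum(bcws * pct for bcws, pct in work_packages)
    return earned / total_bcws

# One WP done, one half done, one not started:
sa_pct = rollup_percent_complete([(100.0, 1.0), (50.0, 0.5), (50.0, 0.0)])
# (100 + 25 + 0) / 200 = 0.625
```

Weighting by budget rather than by task count keeps the rollup honest: a large unfinished Work Package cannot be hidden behind many small completed ones.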
63. Assemble the Final IMP/IMS – Actors, Processes, and Outcomes
! IMP/IMS Architect – Starting with the ACs under each SA, connect the Work Packages in the proper order for each Program Event
– Establish the Performance Measurement Baseline framework
– Identify MoE and MoP points in the IMP
! Program Manager – Confirm the work efforts represent the committed activities for the contract
– Review and approve the IMS – ready for baseline
– Review and approve the risk drivers and duration variance models
! Project Engineer – Assess the product development flow for optimizations
– Review and approve the IMS – ready for baseline
– Identify risk drivers and their mitigations
! Systems Engineer – Confirm the work process flows result in the proper products being built in the right order
– Confirm risk drivers and duration variances
– Review and approve the IMS – ready for baseline
64. Outcomes of Step 6
! Both the maturity assessment criteria and the work needed to reach that level of maturity are described in a single location
! Risks are integrated with the IMP and IMS at their appropriate levels
– Risks to Effectiveness – risk to JROC KPPs
– Risks to Performance – risk to program KPPs and TPMs
! Leading and lagging indicator data are provided through each measure to forecast future performance
65. The Previous 6 Steps Result In A Credible IMP/IMS
! The IMP is the "Outer Mold Line", the Framework, the "Going Forward" Strategy for the Program.
! The IMP describes the path to increasing maturity and the Events measuring that maturity.
! The IMP tells us "How" the program will flow with the least risk, the maximum value, and the clearest visibility to progress.
! The IMS tells us what work is needed to produce the product or service at the Work Package level.
Our Plan Tells Us "How" We are Going to Proceed. The Schedule Tells Us "What" Work is Needed to Proceed.
67. Horizontal and Vertical Traceability of the IMP/IMS
Integrated Master Schedule – work sequenced to produce the outcomes for each WP.
! Vertical traceability: AC → SA → PE
! Horizontal traceability: WP → WP → AC
Program Events define the maturity of a Capability at a point in time. Significant Accomplishments represent requirements that enable Capabilities. Accomplishment Criteria are the exit criteria for the Work Packages that fulfill Requirements.
68. The IMP's connection to the WBS
! Start with the Significant Accomplishments and sequence them to the maturity flow for each Program Event
! The WBS connections then become orthogonal to this flow
Work Breakdown Structure vs. Program Event (SRR | SDR | PDR | CDR | TRR | ATLO):
4.920-SDAI: A01, A02 | B01 | C01, C02 | D01 | E01 | F01
4.200-Sys Test: A05 | B03, B04 | – | D02, D03 | E02 | F02
4.300-Radar: A03 | B02 | C03 | – | E03 | –
4.330-O&C Sys: A06, A07 | B05 | C04 | D04 | E04 | F03, F04
4.400-I&T: A08 | – | C05 | – | E05, E06 | F05
4.500-Support: A09 | – | – | D05 | E07 | F06, F07
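The WBS-to-Program-Event cross-reference above is just a pivotable mapping. A minimal sketch that answers "which WBS elements must mature before a given event?" – the dictionary uses a subset of the slide's identifiers; the function name is an illustrative assumption:

```python
# Sketch: pivot the WBS x Program Event cross-reference from the slide.
# Only a subset of the slide's rows is reproduced here for illustration.
wbs_to_events = {
    "4.920-SDAI":     {"SRR": ["A01", "A02"], "SDR": ["B01"], "PDR": ["C01", "C02"]},
    "4.200-Sys Test": {"SRR": ["A05"], "SDR": ["B03", "B04"]},
    "4.300-Radar":    {"SRR": ["A03"], "SDR": ["B02"], "PDR": ["C03"]},
}

def wbs_for_event(event: str):
    """WBS elements with at least one SA feeding the given Program Event."""
    return sorted(wbs for wbs, events in wbs_to_events.items() if event in events)

print(wbs_for_event("PDR"))  # ['4.300-Radar', '4.920-SDAI']
```

The same mapping read the other way gives each Control Account Manager the list of events their WBS element must support – the orthogonality the slide describes.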
75. Nuances Of These 6 Steps
Building the Program Event, to Significant Accomplishment, to Accomplishment Criteria decomposition is straightforward. For each Program Event, simply identify the needed Significant Accomplishments for the entry and exit criteria, and the Accomplishment Criteria for the Work Packages that produce the AC.
Yeah right, no problem.
V8.7
77. Quick View of the Nuances
! Unfortunately building a credible IMP/IMS is a nuanced process, subject to many opportunities for diversions, blind alleys, and false starts
! It is slightly counterintuitive from the traditional scheduling approach to start with the vertical integration – but it is critical to start vertically
! Success requires the full participation of Systems Engineering, CAMs, and the Program Manager
! Success requires everyone to understand the nuances of the IMP building efforts
8.0 Nuances
80. The 3rd Nuance: Everything Foots and Ties to the IMP & IMS
Beginner:
! The IMS contains all the proper fields in columns and is horizontally linked
! The WBS elements can be found for all work elements
! CDRLs are visible and their multiple delivery dates connected to each Program Event
! WBS is structured in a product manner or possibly a functional manner with some deliverables defined in the terminal nodes
Intermediate:
! The WBS is properly formed inside each AC with incremental deliverables
! WBS numbers form a "well structured" tree, but still not "pure" in the sense of deliverables only, no functional elements
! Each column and each field can be "pivoted" to form a proper "tree" of value flow
Advanced:
! The WBS is a "pure" Product Breakdown Structure (PBS) plus the services needed to produce those products
! The WBS defines the structure of the delivered product or service
! The Vertical trace of the IMP describes the flow of increasing maturity of these products or services
! The Horizontal trace of the IMP describes the work to be done to produce this maturity
81. The 4th Nuance: IMP/IMS is the Programmatic Architecture
Beginner
! The IMP is built from the WBS for each Program Event.
! The IMP is seen as a compliance document that lists the Program Events and a "bunch of stuff" underneath.
Intermediate
! The IMP is structured around separate Program Events, but below the SAs it looks like a "shop floor" schedule with little vertical connectivity.
Advanced
! The IMP is built as a "value stream" flow for the program by the Systems Engineers
! This programmatic architecture is built in the same way the technical system architecture is built
! It is derived from the ConOps and Tier 1 System Requirements
! The IMP shows explicitly how these are supported in the flow of the SAs
82. The 5th Nuance: The IMP/IMS Connects All the Dots
The dots being connected: Measures of Effectiveness, Measures of Performance, Technical Performance Measures, Risk (aleatory uncertainty and epistemic uncertainty), Reference Classes, Past Performance, SME judgment, System Architecture, and AHP.
83. Connecting the IMP to Program Performance Measures
Assembling the IMS from the IMP appears to be a straightforward process – detail the tasks that support the Accomplishment Criteria. But there are some critical steps that must be done in the right order to end up with a risk tolerant IMS. Let's do this for our program.
84. The Primary Role of the IMP is to Describe What Done Looks Like in MoEs and MoPs
On 19 October 1899, Robert Goddard decided that he wanted to "fly without wings" to the Moon.
9.0 Framework
85. Quick View of the IMP/IMS Framework
! Measures of increasing maturity for the key deliverables are the foundation for increasing the Probability of Program Success
! Measures of Effectiveness (MoE) and Measures of Performance (MoP) are defined in the Integrated Master Plan (IMP) Narrative
! Key Performance Parameters (KPP) – both JROC and program specific – are needed
! Technical Performance Measures (TPM) are needed for all key deliverables
86. Components We'll Meet Along the Way to Our Destination: the Credible PMB
Objective: status and essential views to support the proactive management processes needed to keep the program GREEN.
Components: Risk Management; SOW; SOO; ConOps; WBS; Technical and Operational Requirements; CWBS & CWBS Dictionary; Integrated Master Plan (IMP); Integrated Master Schedule (IMS); Measures of Effectiveness; Measures of Performance; Measures of Progress; JROC Key Performance Parameters; Program Specific Key Performance Parameters; Technical Performance Measures; Earned Value Management System (the TPMs live here); Performance Measurement Baseline.
87. Components of Our Final Destination
SOW; SOO; ConOps; WBS; Technical and Operational Requirements; Performance Measurement Baseline; CWBS & CWBS Dictionary; Integrated Master Plan (IMP); Integrated Master Schedule (IMS); Earned Value Management System.
Technical Performance Measurement (TPM) involves predicting the future values of a key technical performance parameter of the higher-level end product under development, based on current assessments of products lower in the system structure. Continuous verification of actual versus anticipated achievement for selected technical parameters confirms progress and identifies variances that might jeopardize meeting a higher-level end product requirement. Assessed values falling outside established tolerances indicate the need for management attention and corrective action.
A well thought out TPM program provides early warning of technical problems, supports assessments of the extent to which operational requirements will be met, and assesses the impacts of proposed changes made to lower-level elements in the system hierarchy on system performance.
― Technical Performance Measurement, https://dap.dau.mil/acquipedia/Pages/Default.aspx
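The tolerance-band logic in the TPM definition above can be sketched in a few lines. This is a minimal illustration, with invented parameter names and values, not a tool from the source:

```python
# Sketch of a TPM tolerance-band check (hypothetical names and values).
# A TPM has a planned value and a tolerance band at each assessment point;
# an actual value outside the band flags the need for corrective action.

def tpm_status(actual, planned, tolerance):
    """Return 'on-track' if actual is within planned +/- tolerance."""
    deviation = actual - planned
    if abs(deviation) <= tolerance:
        return "on-track"
    return "breach-high" if deviation > 0 else "breach-low"

# Example: a spacecraft mass TPM, planned 1200 kg with a +/-5% band at CDR.
planned_kg = 1200.0
band_kg = 0.05 * planned_kg                     # 60 kg tolerance
print(tpm_status(1245.0, planned_kg, band_kg))  # on-track (within band)
print(tpm_status(1275.0, planned_kg, band_kg))  # breach-high (management attention)
```

The same check, run at each assessment point, is what turns a TPM into the "early warning" the text describes.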
89. Policy Guidance for TPMs
! EIA-632 – TPM involves a technique of predicting the future value of a key technical performance parameter of the higher-level end product under development, based on current assessments of products lower in the system structure.
! INCOSE Systems Engineering Handbook, International Council on Systems Engineering
! INCOSE Metrics Guidebook for Integrated Systems and Product Development
90. The NDIA EVM Intent Guide Says
Notice the inclusion of Technical along with Cost and Schedule.
The next step is generating Value from Earned Value: EV MUST include the Technical Performance Measures.
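One way to fold TPMs into Earned Value is to discount the claimed BCWP by the fraction of TPMs currently inside their tolerance bands. This is a sketch of that idea, not the NDIA Intent Guide's prescription; all names and numbers are illustrative:

```python
# Sketch: TPM-informed earned value. Claimed BCWP is discounted by the
# fraction of TPMs within tolerance, so technical shortfalls reduce EV.
# This is one illustrative scheme, not a mandated formula.

def technical_percent(tpms):
    """tpms: list of (actual, planned, tolerance). Fraction within band."""
    in_band = sum(1 for actual, planned, tol in tpms
                  if abs(actual - planned) <= tol)
    return in_band / len(tpms)

def adjusted_bcwp(claimed_bcwp, tpms):
    """Discount claimed BCWP by the technical-compliance fraction."""
    return claimed_bcwp * technical_percent(tpms)

tpms = [(98.0, 100.0, 5.0),    # within band
        (70.0, 100.0, 10.0),   # breached
        (101.0, 100.0, 2.0)]   # within band
print(adjusted_bcwp(900.0, tpms))  # 900 * 2/3 = 600.0
```

The point of the sketch: a program claiming 900 units of earned value with a breached TPM reports less value than its schedule alone suggests.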
91. Some More Guidance
Systems engineering uses technical performance measurements to balance cost, schedule, and performance throughout the life cycle. Technical performance measurements compare actual versus planned technical development and design. They also report the degree to which system requirements are met in terms of performance, cost, schedule, and progress in implementing risk handling. Performance metrics are traceable to user-defined capabilities.
― Defense Acquisition Guide (https://dag.dau.mil/Pages/Default.aspx)
In The End ― It's All About Systems Engineering
93. This Has All Been Said Before. We Just Weren't Listening…
… the basic tenets of the process are the need for seamless management tools, that support an integrated approach … and "proactive identification and management of risk" for critical cost, schedule, and technical performance parameters.
― Secretary of Defense, Perry memo, May 1995
Why Is This Hard To Understand?
! We seem to be focused on EV reporting, not the use of EV to manage the program.
! Getting the CPR out the door is the end of Program Planning and Control's efforts, not the beginning.
TPM Handbook 1984
94. A Final Reminder …
TPMs are one source of Risk Management processes.
― Risk Management Guide to DOD Acquisition, Sixth Edition (Version 1.0), Aug 2006
95. Starting Out on the Right Foot …
! Most IMP/IMS literature states how to build an IMP from the RFP and contractual elements in simple, and maybe simple minded, terms
– Decompose the events into SAs, ACs, and their Tasks – sounds easy
! This approach fails to provide advice for several things:
– How to minimize the topological connections between Events
– How to increase the concurrency between IPTs
– How to increase the tolerance of the IMS to disruptive events
• Known and knowable risk
• Unknown and possibly unknowable risk
! The construction of the IMS needs to take place in what seems to be a reverse order
– Build the IMP as a Value Stream Map describing the increasing maturity – and therefore the increasing VALUE – of the deliverables to the customer.
– It's the delivery of Customer Value that inverts the management process and focuses on keeping the program GREEN as planned, which maximizes value to the customer (the Government).
96. SETR Program Events
https://acc.dau.mil/docs/technicalreviews/dod_tech_reviews.htm
97. Build PEs Left to Right
Start with SRR (or something on the left) and completely define its completion before moving to the next PE.
! Define the SAs for an Event and construct a work flow of the activities needed to satisfy the SA
– These activities are not yet tasks, so don't commit too soon to defining the detailed work
– Isolate the SAs by event first – only work on one event at a time
! Identify the participants in the work
– What IPTs participate in this work?
– What swim lanes are needed to isolate the IPTs?
! Define the elements
– Activities performed to satisfy the SA
– Deliverables that result from these activities
! These are still not Accomplishment Criteria (AC) – those come next
98. The Accomplishment Criteria (AC)
! A definitive measure or indicator that verifies completion of work for the accomplishment
– Completed work effort
• Manufacturing Plan Completed
– Confirmation of performance compliance
• Flight Test Report Approved
– Incremental verification
• Maintenance Demonstration Completed
– Completed critical process activities
• Risk Management Plan Approved
99. The Accomplishment Criteria (AC)
! Defines the measure by which a Significant Accomplishment (SA) is considered "done"
! Terms like complete, delivered, and closed have no "units of measure" in the context of a Significant Accomplishment (SA) and are open to interpretation
! Terms like …
– Measures of completion – 80% of drawings approved for release
– Counts of available items – 75% of pin-outs assigned voltage
– Fidelity of a design – outer mold line defined within 90% of target
– Error bounds – spacecraft mass known to ±20%
– Performance parameters – disconnect force within allowed limits
– Maturity parameters – flight article successful in last 3 tests
… are used to define the "exit criteria"
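Quantified exit criteria of this kind can be checked mechanically. A minimal sketch, with hypothetical criterion names and thresholds (not from any actual program):

```python
# Sketch: evaluating quantified Accomplishment Criteria.
# Each criterion is (actual value, required minimum); an SA is "done"
# only when no criterion falls below its required minimum.

criteria = {
    "drawings_released_pct": (82.0, 80.0),  # 80% of drawings released
    "pinouts_assigned_pct":  (71.0, 75.0),  # 75% of pin-outs assigned
    "tests_passed_in_a_row": (3.0, 3.0),    # last 3 tests successful
}

def unmet(criteria):
    """Return names of criteria whose actual value is below the minimum."""
    return [name for name, (actual, required) in criteria.items()
            if actual < required]

print(unmet(criteria))  # ['pinouts_assigned_pct'] -> SA not yet "done"
```

Because every criterion carries a unit and a threshold, "done" becomes a computable answer rather than an interpretation.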
100. 2 Types of Accomplishment Criteria: Entry and Exit
! Entry Criteria – substantiate readiness for the review
! Exit Criteria – substantiate successful completion of the review
! Critical Design Review (CDR) example
– Are we ready for the Flight Test Readiness Review?
– How do we know the FTRR was a success?
– What did we learn from the FTRR that increases the maturity of the program's deliverables?
101. The IMP Focuses Us on Measures of Effectiveness and Performance
The Acquirer defines the needs and capabilities in terms of operational scenarios (Mission Need → MoE → KPP). The Supplier defines physical solutions that meet the needs of the stakeholders (MoP → TPM).
MoE: Operational measures of success related to the achievement of the mission or operational objective being evaluated.
MoP: Measures that characterize physical or functional attributes relating to the system operation.
TPM: Measures used to assess design progress, compliance to performance requirements, and technical risks.
102. Measure of Effectiveness (MoE)
! Measures of Effectiveness …
! Are stated in units meaningful to the buyer,
! Focus on capabilities independent of any technical implementation,
! Are connected to mission success.
The operational measures of success that are closely related to the achievement of the mission or operational objectives evaluated in the operational environment, under a specific set of conditions. ― "Technical Measurement," INCOSE–TP–2003–020–01
MoEs Belong to the End User
103. Measure of Performance (MoP)
! Measures of Performance are …
! Attributes that assure the system has the capability to perform,
! Assessments of the system to assure it meets design requirements to satisfy the MoE.
Measures that characterize physical or functional attributes relating to the system operation, measured or estimated under specific conditions. ― "Technical Measurement," INCOSE–TP–2003–020–01
MoPs belong to the Program – developed by the Systems Engineer, measured by CAMs, and analyzed by PP&C
104. Key Performance Parameters (KPP) – Both JROC and Program Specific
! Key Performance Parameters …
! Have a threshold or objective value,
! Characterize the major drivers of performance,
! Are considered Critical to Customer (CTC).
Represent the capabilities and characteristics so significant that failure to meet them can be cause for reevaluation, reassessment, or termination of the program. ― "Technical Measurement," INCOSE–TP–2003–020–01
The acquirer defines the KPPs during operational concept development – KPPs say what DONE looks like
105. Technical Performance Measures (TPM) for Key Deliverables
! Technical Performance Measures …
! Assess design progress,
! Define compliance to performance requirements,
! Identify technical risk,
! Are limited to critical thresholds,
! Include projected performance.
Attributes that determine how well a system or system element is satisfying, or expected to satisfy, a technical requirement or goal. ― "Technical Measurement," INCOSE–TP–2003–020–01
106. What Are Technical Performance Measures, Really?
! TPMs are measures of the system's technical performance that have been chosen because they are indicators of system success. They are based on the driving requirements or technical parameters of high risk or significance – e.g., mass, power, or data rate.
! TPMs are analogous to the programmatic measures of expected total cost or estimated time-to-completion. There is a required performance, a current best estimate, and a trend line.
! Actual versus planned progress of TPMs is tracked so the systems engineer or project manager can assess progress and the risk associated with each TPM.
! The final, delivered system value can be estimated by extending the TPM trend line and using the recommended contingency values for each project phase.
! The project-life trend-to-date, current value, and forecast of all TPMs are reviewed periodically (typically monthly) and at all major milestone reviews.
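The trend-line extrapolation described above can be sketched with an ordinary least-squares fit. The monthly mass-margin values here are invented for illustration:

```python
# Sketch: extend a TPM trend line to estimate the value at completion.
# Plain least-squares fit; the history values are illustrative only.

def fit_line(xs, ys):
    """Ordinary least-squares fit; returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical TPM history: mass margin (%) observed at months 1..4
months  = [1.0, 2.0, 3.0, 4.0]
margins = [20.0, 18.0, 17.0, 15.0]

slope, intercept = fit_line(months, margins)
forecast = slope * 10.0 + intercept   # project the trend to month 10
print(round(forecast, 2))             # 5.5% margin projected at completion
```

Comparing the projected value against the required performance plus the phase's contingency allowance is what turns the trend line into a forecast of whether the deliverable will close.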
107. Tracking the Technical Performance Measures
! Tracking TPMs and comparing them to the resource growth provides an early warning system to detect deficiencies or excesses
! Reserve allocations narrow as design proceeds
! TPMs that violate reserve allocations, or have trends that do not meet the final performance, trigger corrective actions
109. Sample IMP: A Flight Avionics System (Continued)
Hardware PDR – Purpose
! Ensure the system hardware initial design has been updated and meets functional and allocated performance requirements within program constraints.
! Operational security concept assessed.
! Ensure training requirements have been analyzed and their objectives have been defined for training missions.
! Confirmation that training objectives and MTC design and integration conform to the Air Force syllabus and Ready Aircrew Program (RAP).
! Training plan will be updated.
Hardware PDR – Expectations
! Team agrees the system hardware initial design has been updated and can proceed to the detailed design phase.
! Team agrees training plans and objectives correlate with the Air Force syllabus, RAP, and training planning, and that development can continue.
! System Specification and TTL requirements are traceable to the allocated hardware design.
110. Sample IMP: A Flight Avionics System (Continued)
Hardware PDR – Entry Criteria
! Functional Baseline Authenticated (FBA)
! Initiate system hardware initial design and allocate functions to the appropriate Configuration Items
! All specifications updated and required documentation made available, including anticipated lower level design documentation
! All SRR/SFR action items closed or dispositioned
111. Sample IMP: A Flight Avionics System (Continued)
Hardware PDR – Accomplishments
! System hardware initial design complete.
– Functions allocated to one or more hardware configuration items and traceable to the MTC SSS and TSSC SSS.
– Human, safety, R&M, EMI, operational security, instructor and operator interfaces, etc., design factors have been reviewed.
! Draft instructor and operator manuals reviewed
! Program risks updated, assessed, and reviewed.
– Mitigation plans in place.
! Program schedule and constraints updated and reviewed.
– Critical schedule path drivers reviewed.
! Design criteria for the simulation and database development reviewed and updated.
! Program processes and metrics reviewed.
! Test planning activities and relevant documentation reviewed by the test team.
112. Sample IMP: A Flight Avionics System (Concluded)
Hardware PDR – Exit Criteria
! Hardware (ownership, visual, IOS, brief/debrief) design reviewed, allocated to a hardware configuration item, and updated to include instructor and operator interfaces, malfunction and control requirements, etc.
! RTM updated; MTC SSS, TSSC SSS, and TTL traceable to the allocated hardware design, to include ESOH requirements.
! Human, safety, R&M, EMI, operational security, instructor and operator interfaces, etc., design factors reviewed.
! MTC and TSSC allocated baselines established and controlled by appropriate level documentation for PDR.
! Draft instructor and operator manuals reviewed with user concurrence, incorporating satisfactory human factor design into the operator interfaces.
! Risk management and mitigation plans updated, in place, addressing ESOH plans and risks, and within program constraints.
! Risks assessed, understood, documented, and accepted by the team.
! Program schedule reviewed
– Critical path drivers identified
– IMS updated and reflects critical paths
113. One More IMP Sample: Preliminary Design Review
! (SA) System & Segment Requirements Updated & Allocated
– (AC) SRR / SDR Update Review Conducted
– (AC) Preliminary System Specification Documents (A011) Baselined
– (AC) Preliminary Spacecraft Segment Specification Baselined
– (AC) Preliminary Ground Segment Specification Baselined
– (AC) Preliminary Specification Tree Baselined
! (SA) Preliminary ICDs Baselined for Customer Review
– (AC) Preliminary Space–Ground ICD Baselined
! (SA) PDR System Design Completed
– (AC) Top Level System Architecture Updated
– (AC) PDR Level System Analyses Completed
– (AC) PDR Level Reliability / Availability Analysis Completed
– (AC) Preliminary System Level Risk Assessment Completed
– (AC) System Level Plans Updated for PDR
– (AC) Flight Long Lead Review Conducted
114. The "But" for this Guidance
! With these samples and the SETR guidance we've just started
! The program needs to define program-specific events to assure the actual maturity measures are captured
– The IMP provides sufficient definition to track the step-by-step completion of the required accomplishments for each event and to demonstrate satisfaction of the completion criteria for each accomplishment. [AFMC PAMPHLET 63-5]
115. IMP Verbs for Significant Accomplishments
Integrated Master Plan Allowable Verbs
Allocated: Segment requirement is flowed down from the System Specification.
Released: Approved item for delivery to the intended customer or supplier; all internal distribution and sign-offs complete. An electronic version is made accessible on the IDE.
Completed: The subject item, data, document, or process is prepared or concluded, and reviewed and accepted by the responsible IPT. Supporting documentation is available through the IDE.
Reviewed: The subject item, data, document, or process is prepared or concluded, and documented for completion. Supporting documentation is available through the IDE.
Conducted: The subject meeting or review has been held with all required participants. The charts or minutes are available through the IDE.
Updated: The subject process, data, or document has been reevaluated using later information, and adjustments incorporated.
Defined: The subject configuration items, data, or document was submitted to the customer.
Validated: Requirements are validated, received contractor approvals, were distributed, and are available through the IDE.
Established: The subject item is created and set in place in a manner consistent with its intended use, after review and acceptance by the IPT.
Verified: Requirements are verified or processed in accordance with established practice.
116. The IMP Process Narrative
! Objective: a brief statement explaining why this process set is applied for this program
! Governing Documentation: lists of the guidance or compliance documents, e.g., specifications, manuals, and procedures, including company, government, and industry references
! Approach: a concise description of who owns each process, what the roles and responsibilities are, and the overall process, including a process flow diagram
117. Generic IMP Evaluation Criteria
! Do the Program Events and Accomplishments reflect the logical evolution and progress of the overall Program?
– Do program events and their definitions clearly demonstrate the maturity of the program over its life?
– Do the selected Accomplishments and associated Criteria identify meaningful and measurable progress toward the key goals of the Program?
! Do the Accomplishments for each event demonstrate a meaningful understanding of the program requirements, or are they tasks that anyone could do for any contract?
– Do they reflect your SOW requirements?
! Does the IMP structure readily map to the IPT structure such that each IPT can easily visualize the scope of their responsibility?
! When awarded, could the contractor use the Accomplishments as discrete activity cost accounts in their earned value system? Or are they level of effort in nature?
! Is sufficient visibility provided to identify and track the Program Risk Plan and associated risk mitigation accomplishments and/or contingencies?
– Do the criteria supporting the accomplishments include key performance requirements?
! Are IPT cross-dependencies, and dependencies external to the Program, appropriately reflected if they present potential schedule or performance risks to the success of the Program?
! Is the submitted "Contract IMP" (the Product IMP that is to be included as part of the Program Contract) defined to the appropriate level?
– Do the Accomplishments and associated Criteria go down to a level sufficient to provide visibility into key subcontractor activities upon which the success of the Program may be dependent?
– Are the Events and Accomplishments included in the IMP at such a level as to make the maintenance of the "Contractual IMP" practical, or does it include an unnecessary level of detail?
118. Program Management Levels
Program Levels | IMP/IMS Elements | CWBS
Tier 1: Program Manager, Technical Leads, IPT Manager, technical performance goals | Major Program Events (PE); Significant Accomplishments (SA) | Levels 1 & 2, links to CLINs; Levels 3 & 4, Control Packages, link to PBS, integrated EVMS
Tier 2: Control Account Managers, Product Work Plan, responsible organization elements | Accomplishment Criteria (AC); Tasks (NA) | Level 5, Cost Account Package, cost collection level, links to WBS by OBS, resource summaries, early warning EVMS analysis
Tier 3: Work Package Manager, detailed plans | Work Package | Earned Value calculations
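The tiered structure in the table implies a mechanical rollup: earned value is computed at Work Packages (Tier 3), summed into Control Accounts (Tier 2), and summed again to the program level (Tier 1). A minimal sketch, with invented account names and numbers:

```python
# Sketch: rolling Work Package earned value up to Control Accounts and
# the program level. Names and values are illustrative only.

work_packages = [
    {"control_account": "CA-100", "bcws": 50.0, "bcwp": 45.0, "acwp": 55.0},
    {"control_account": "CA-100", "bcws": 30.0, "bcwp": 30.0, "acwp": 28.0},
    {"control_account": "CA-200", "bcws": 40.0, "bcwp": 32.0, "acwp": 40.0},
]

def rollup(wps):
    """Sum BCWS/BCWP/ACWP from Work Packages into their Control Accounts."""
    accounts = {}
    for wp in wps:
        acct = accounts.setdefault(wp["control_account"],
                                   {"bcws": 0.0, "bcwp": 0.0, "acwp": 0.0})
        for key in ("bcws", "bcwp", "acwp"):
            acct[key] += wp[key]
    return accounts

accounts = rollup(work_packages)
program_bcwp = sum(a["bcwp"] for a in accounts.values())
schedule_variance = program_bcwp - sum(a["bcws"] for a in accounts.values())
print(program_bcwp, schedule_variance)  # 107.0 -13.0
```

Because each Work Package carries its Control Account, the same data answers Tier 3, Tier 2, and Tier 1 questions without re-entry.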
119. Connecting the Components of the IMP/IMS
! The assembled IMP/IMS links all work activities vertically to the ACs, SAs, and PEs
The connected components: Customer Requirements flow into the IMP (Events (E) → Accomplishments → Criteria, supported by Process Narratives), which drives the Integrated Master Schedule (IMS) (Control Accounts → Work Packages → Work Package Tasks) and the WBS. The organization's IPTs, performance analysis / management reviews, supplemental schedules, and risk and opportunity all feed the Program Performance Management System.
121. First Pass at Building the Integrated Master Plan
Many contractors already have work processes to do this. These steps are guidance for contractors new to this process. With the RFP, the contractor should be capable of the following steps to create the Integrated Master Plan and Integrated Master Schedule. We'll build the IMP/IMS from the point of view of the Government, to compare with the contractor's IMP in the proposal.
123. Quick View of the 1st Pass for the IMP
! Start with the WBS
! Use the Preliminary Design Review Program Event
! Identify subsystems for the flight vehicle
! Identify Significant Accomplishments for PDR completion
! Identify Accomplishment Criteria for each sequence of Work Packages to produce the deliverables for PDR
10. 1st Pass