Canon SBA

  • Observations are easy: you can always say true things. The challenge is relevancy. My observations in the first half of this briefing address: Circumstances – what is going on in the M&S community of practice that might be of interest to JSMARTS? Enterprise Context – what is the big picture? JSMARTS Program – what is evident to the relatively casual observer regarding JSMARTS? Challenges – what is hard, and why?
  • Simply put, all simulations are at some level wrong, since they are abstractions of reality. The M&S VV&A process is focused on understanding the degree and nature of the abstraction, which can be called the conceptual model.
  • There is no magical or mathematical way to 'correctly' quantify risk. This does not mean that risk cannot be quantified, however. Risk assessment becomes an objective evaluation of all the recognized risks, such as the potential impact on human lives, equipment, resources, decision making, and time. These factors can be quantified by subjective judgments obtained by consensus from recognized 'experts'. Simply put, risk impact and risk probability have a cumulative effect, so quantifying risk requires criteria for quantifying both. At the staff-officer level, the key points to remember are: risk assessment is based on the CUSTOMER's needs; risk management is determined by optimizing the resources available; and risk prioritization is based on the customer's determination of high-interest items. Bottom line: the management of risk is based on impact and resources from the customer's perspective. ---------------------- Instructor Notes ------------------------ In common terms, risk equates to the product of impact and probability. The graphic depicts this concept: as the probability of an occurrence increases, the probability wedge moves to the right; as the impact of the occurrence grows more serious, the impact wedge moves to the left. As either increases, the overall level of risk increases. [Additionally, the 'proof' required to ensure that the risk is minimized adds to the overall potential cost of assurance (VV&A).] A further major influence on the cost of risk is the "WHEN" of VV&A.
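The impact-times-probability relationship in the note above can be sketched in code. This is an illustrative sketch only: the 1-5 ordinal scales, the binning thresholds, and the example ratings are assumptions for demonstration, not values from the briefing.

```python
# Illustrative sketch of risk scoring as the product of impact and
# probability. The 1-5 ordinal scales and thresholds are hypothetical.

def risk_score(impact: int, probability: int) -> int:
    """Combine an impact rating and a probability rating (each 1-5),
    e.g. from consensus expert judgment, into a single risk score."""
    if not (1 <= impact <= 5 and 1 <= probability <= 5):
        raise ValueError("ratings must be on a 1-5 ordinal scale")
    return impact * probability

def risk_level(score: int) -> str:
    """Bin a 1-25 score into the low/medium/high levels used when
    deciding how much VV&A effort to apply (thresholds illustrative)."""
    if score >= 15:
        return "high"
    if score >= 6:
        return "medium"
    return "low"

# Hypothetical consensus judgment for one M&S use:
score = risk_score(impact=4, probability=2)
print(score, risk_level(score))  # 8 medium
```

Because both factors multiply, reducing either the probability or the impact of a wrong answer reduces the overall risk, which is the lever the VV&A effort works against.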
  • Although the requirement for VV&A may be relatively new, there are some outstanding efforts in progress from which we can extract valuable lessons. In pursuing any VV&A effort, a primary concern is to optimize funds by making the effort as cost-effective as possible. Five key ideas form the foundation of the entire VV&A process described in this tutorial. They provide a way of looking at the VV&A problem and leveraging required products and processes to gain benefits that extend beyond mere compliance with directives. These benefits include a better understanding of the fundamental problem risks and a means of controlling VV&A costs. These ideas evolved during work on a variety of programs that were trying to address a VV&A problem related to their use of M&S. They underlie the process and methods documented in various publications on VV&A, available on the World Wide Web or by contacting the agencies directly. Their ideas and processes are to be addressed and incorporated into the next revision of the DMSO VV&A Recommended Practices Guide [which should be published in 1999].

    1. Simulation Based Acquisition (SBA) - Verification, Validation and Accreditation of Models and Simulations - Key Concepts From a Practitioner's Perspective. 19 May 2005. Mr. Patrick M. Cannon, P.E., The AEgis Technologies Group, www.AEgisTG.com, 631 Discovery Drive, Huntsville, AL 32806. Phone: (256) 922-0802; Fax: (256) 922-0904; email: PCannon@AEgisTG.com
    2. Outline
       • SBA Background
       • M&S Technology to Support SBA
       • VV&A of M&S for SBA
         ◦ VV&A Background
         ◦ Cost of VV&A
       • VV&A Procedural Recommendations
         ◦ Risk-focused Strategy
         ◦ Managed Investment Strategy
       • Conclusions
    3. Some SBA History
       • Vice President Gore's National Performance Review
         ◦ Cut delivery time for new systems by 25%
       • Department of Defense
         ◦ Stretched the goal to a 50% reduction of cycle time
         ◦ Set a goal to reduce Total Ownership Cost
       • Defense Systems Affordability Council
         ◦ Recognized the potential of M&S
         ◦ Set SBA as one of the top initiatives to realize the stretch goal
    4. Definition(s) of SBA
       • SBA definitions are numerous and generally unrevealing. Elements include:
         ◦ Methodology or practice
         ◦ Supporting systems engineering
         ◦ Sometimes convolved with a program or organizational unit
       • Several aliases include:
         ◦ SBA (Simulation Based Acquisition)
         ◦ SEBA (Synthetic Environments Based Acquisition)
         ◦ MSAM (Modeling and Simulation for Affordable Manufacturing)
         ◦ SMART (Simulation and Modeling for Analysis, Requirements and Training)
         ◦ … and variations thereof
    5. Significant Attributes of 'SBA' (as a handle)
       • SBA is not:
         ◦ only simulation,
         ◦ exclusively simulation-based,
         ◦ just for acquisition, or
         ◦ a program or a single organization's prerogative
       • SBA is:
         ◦ a concept-of-operations,
         ◦ for an entire enterprise,
         ◦ supportive of capabilities management, the materiel systems engineering life-cycle, and collaborative operations, wherein…
         ◦ collaborative representations support a shared worldview
    6. SBA - Not Just Technology [diagram] Culture (an evolved acquisition culture with enabled integrated process teams and changing roles and responsibilities), Process (an iterative spiral acquisition process with rapid evaluation of multiple options and electronic exchange of system models across requirements, functional design, and implementation design), and Technology (an integrated engineering and management enterprise with an information repository, an integrated design data schema, a common project data repository format shared by government and industry, seamless integration of engineering disciplines, and user-transparent, web-style access for collaborative distributed engineering). Collectively, these were to facilitate an unprecedented quality of enterprise-wide, collaborative decision making across the acquisition life-cycle: from operational need through requirements elicitation and analysis, functional design and analysis, HME/HW/SW implementation and development, system integration and test, training and operations support, maintenance and logistics, and modification and upgrade.
    7. Outline
       • SBA Background
       • M&S Technology to Support SBA
       • VV&A of M&S for SBA
         ◦ VV&A Background
         ◦ Cost of VV&A
       • VV&A Procedural Recommendations
         ◦ Risk-focused Strategy
         ◦ Managed Investment Strategy
       • Conclusions
    8. Definitions of M&S
       • We propose the broadest reasonable definitions of M&S:
       • Referent – n. Something referenced or singled out for attention; a designated object, real or imaginary, or any set of such objects
       • Model – n. A representation of some referent
       • Simulation – n. Mechanization of a model's evolution through time
    9. Scope of Consideration
       • All of modeling and simulation
         ◦ All kinds / types of simulation
         ◦ All phases of the M&S life-cycle
         ◦ All domains of application and kinds of uses
         ◦ All aspects of the industrial and business environment
       • All interested parties
       • All legitimate M&S-related concerns
    10. Implementation - Live, Virtual, Constructive
       • Live simulation: a simulation involving real people operating real systems.
       • Virtual simulation: a simulation involving real people operating simulated systems.
       • Constructive simulation: models and simulations that involve simulated people operating simulated systems.
       Reference: DoD 5000.59-P, "Modeling and Simulation Master Plan," October 1995
    11. Technology Circumstances - Net Assessment
       • M&S technology is generally available, robust, and relevant to SBA
       • Risk accrues from the application of available technologies in the SBA enterprise collaborative operational environment
       • Special concerns, which together bear on composability, interoperability, and reuse, are:
         ◦ Conceptual modeling
         ◦ Architecture management
         ◦ Data management
         ◦ Standards
    12. Outline
       • SBA Background
       • M&S Technology to Support SBA
       • VV&A of M&S for SBA
         ◦ VV&A Background
         ◦ Cost of VV&A
       • VV&A Procedural Recommendations
         ◦ Risk-focused Strategy
         ◦ Managed Investment Strategy
       • Conclusions
    13. VV&A Definitions [Reference: DoDD 5000.59]
       • VERIFICATION – the process of determining that a model [or simulation] implementation accurately represents the developer's conceptual description and specification. … Is it what was intended?
       • VALIDATION – the process of determining the degree to which a model [or simulation] is an accurate representation of the real world from the perspective of the intended uses of the model or simulation. … How well does it represent what I care about?
       • ACCREDITATION – the official certification that a model or simulation is acceptable for use for a specific purpose. … Should my organization endorse its use?
    14. VV&A vs IV&V
       • Provides independent evaluation / assessment of:
         ◦ Are we building the product right? = Verification
         ◦ Are we building the right product? = Validation
       • Verification (Are we building the product right?)
         ◦ The process of determining whether the products of a given phase of the software development life-cycle fulfill the requirements established during the previous phase
         ◦ The product is internally complete, consistent, and correct, and will support the next phase
       • Validation (Are we building the right product?)
         ◦ The process of evaluating software throughout its development to ensure compliance with software requirements. This process ensures:
           - Expected behavior when subjected to anticipated events
           - No unexpected behavior when subjected to unanticipated events
           - The system performs to the customer's expectations under all operational conditions
       Source: IV&V Overview Briefing, NASA IV&V Facility, 100 University Dr., Fairmont, WV 26554
    15. In a Perfectly Engineered World [diagram: requirements analysis flows down through software requirements analysis, software analysis, and software coding to source and executable code; unit, component, configuration-item, and system integration and test build back up to the integrated software; software V&V spans the entire process]
    16. In a Perfectly Engineered (M&S) World [diagram: the same V-model, with M&S requirements and model/sim requirements analysis at the top, M&S VV&A spanning the process, and 'real world' simulation validation closing the loop from the integrated simulation]
    17. In a Perfectly Engineered (M&S) World
       • But whose requirements?
         ◦ The M&S developer's? [system requirements, software requirements]
         ◦ The M&S user's? [accreditation requirements]
       • Requirements met ~ valid M&S: the M&S would be suitable for use
       • Requirements not met ~ invalid M&S: the M&S would not be suitable for use
    18. In the Real (M&S) World
       • Systems engineering is not perfect
       • Requirements aren't right (and sometimes not even articulated)
       • Consequently…
         ◦ Stated requirements may be met… and the M&S may not be credible for the problem at hand.
         ◦ Stated requirements may not be met… and the M&S may be sufficiently credible for the problem at hand.
    19. M&S VV&A Focus
       • Simply put, because every model or simulation is an abstraction of reality, at some level they are all wrong.
       • The VV&A process should therefore focus on accreditation, based on the credibility of the M&S implementation and the risks incurred if the M&S is used!
    20. The Fundamental Questions: How good is it? [VV] Is it good enough? [&A]
    21. Outline
       • SBA Background
       • M&S Technology to Support SBA
       • VV&A of M&S for SBA
         ◦ VV&A Background
         ◦ Cost of VV&A
       • VV&A Procedural Recommendations
         ◦ Risk-focused Strategy
         ◦ Managed Investment Strategy
       • Conclusions
    22. What's the Cost?
       • M&S VV&A activities compete with development, operations, and maintenance activities for scarce program resources:
         ◦ Dollars
         ◦ Staff
         ◦ Facilities
         ◦ Equipment
         ◦ Information
    23. So Answer the Question! I don't exactly know, but I know it depends!
    24. What Does M&S Cost in DoD? Results of a DoD Modeling and Simulation Survey*
       * "The Workshop on Foundations for Modeling and Simulation (M&S) Verification and Validation (V&V) in the 21st Century", better known as "Foundations '02", sponsored by the Defense Modeling and Simulation Office and held October 22-24, 2002 at the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland.
    25. V&V Cost Factor 1: Availability of Information about the M&S
       • The quality of M&S documentation affects the cost of V&V (especially verification)
         ◦ e.g., no software design documentation means you have to reverse-engineer the code to do verification
       • Three cost drivers:
         ◦ Cost of buying information about the model
         ◦ Cost of reconstructing unavailable information
         ◦ Cost difference incurred when forced to replace a relatively "cheap" V&V technique with a more expensive one
    26. V&V Cost Factor 2: Availability of Information about the Referent (Validation Data)
       • M&S validation requires data on the dynamic behavior of the system being modeled
         ◦ Test costs are the biggest driver of validation data-collection cost
       • Test data may not be available for M&S validation:
         ◦ Insufficient instrumentation
         ◦ Test events are not generally run just for M&S validation purposes
         ◦ Program sensitivities may preclude release of data
         ◦ Classification issues may get in the way
         ◦ Data collected may not be suitable for validation
       • If the program "doesn't need it", they won't measure it
    27. V&V Cost Factor 3: Application Complexity
       • V&V requirements may be difficult to separate out for highly integrated simulations
         ◦ Integrated live, virtual, and constructive simulations, for example
       • If the M&S are only part of the analysis process, V&V requirements may be subjective at best
         ◦ For example, accuracy requirements for simulation federates may be difficult to quantify
         ◦ Subjective at best, political at worst
    28. V&V Cost Factor 4: Application Risk
       • Most VV&A processes used in DoD are based on risk assessment
         ◦ VV&A activities can provide estimates of residual risk, since no model can be completely verified and validated
         ◦ Usually subjective judgments of risk based on expert opinion
       • High-risk applications require more V&V resources than lower-risk applications
         ◦ Both the impact and the probability of wrong answers must be evaluated to determine V&V resource requirements
    29. V&V Cost Factor 4: Application Risk [chart]
    30. V&V Cost Factor 5: Practitioner Experience
       • Experience under the SMART* VV&A program executed by the Joint Accreditation Support Activity:
         ◦ Analyzed the VV&A of five models that varied between 30,000 and 100,000 lines of code
         ◦ V&V tasking and the resources expended on each were about the same
       • This conclusion may not be applicable everywhere, but the level of experience of the practitioner is likely to be a significant driver of VV&A cost requirements.
       * Susceptibility Model Assessment with Range Test
    31. V&V Cost Factor 6: Accreditation Authority Data Requirements
       • "Sponsor" accreditation data requirements may not be driven by technical issues
         ◦ May not always seem logical to the M&S developer, the VV&A practitioner, or anyone else for that matter
       • Often driven by:
         ◦ Policy
         ◦ Previous experience
         ◦ Politics
         ◦ Preconceived opinions about M&S (for or against) on the part of the program manager
         ◦ Funding
       • All of these are subjective at best and inherently non-measurable
    32. V&V Cost Function? Cost of a particular V or V activity = cost of evaluation (What V&V methodologies does my enterprise have? Mature techniques and tools) + cost of required real-system information (What do I know about the real system? System knowledge, system data) + cost of required model information (What do I know about the model? Model documentation)
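The additive cost function on this slide can be mechanized as a small estimator. The dollar figures below are made-up placeholders; only the three-term structure comes from the slide.

```python
# Minimal sketch of the slide's additive V&V cost function:
# activity cost = evaluation cost + real-system information cost
#                 + model information cost.
from dataclasses import dataclass

@dataclass
class VVCostInputs:
    evaluation: float          # cost of evaluation (techniques and tools)
    system_information: float  # cost of required real-system information
    model_information: float   # cost of required model information

def vv_activity_cost(c: VVCostInputs) -> float:
    """Total cost of one verification or validation activity."""
    return c.evaluation + c.system_information + c.model_information

# Hypothetical estimate for one validation activity:
estimate = vv_activity_cost(VVCostInputs(25_000.0, 40_000.0, 10_000.0))
print(estimate)  # 75000.0
```

The decomposition makes the cost drivers from factors 1 and 2 explicit: poor model documentation inflates the third term, and scarce referent data inflates the second.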
    33. One Other Issue: M&S Task Accounting
       • How should M&S V&V costs be accounted for?
         ◦ Is verification counted as software development, but validation as V&V?
         ◦ Is a development-team peer review with outside experts an "SME" review and part of verification? Or part of development?
         ◦ Is pre-test prediction part of validation or a test cost?
         ◦ If post-test analysis is useful for validation, does it get charged as V&V or as a test cost?
       • For simulations of real objects that are under development, which costs go into the "item" development bin, the "M&S" development bin, the "V&V" bin, etc.?
       • The better the software/simulation development practices followed, the harder it is to sort out development costs from V&V costs
         ◦ Good systems engineering produces good V&V information
    34. Outline
       • SBA Background
       • M&S Technology to Support SBA
       • VV&A of M&S for SBA
         ◦ VV&A Background
         ◦ Cost of VV&A
       • VV&A Procedural Recommendations
         ◦ Risk-focused Strategy
         ◦ Managed Investment Strategy
       • Conclusions
    35. Any SBA Activity Needs a Deliberate VV&A Program
       • Modeling and simulation tools are CRITICAL RESOURCES for analysis, test, and evaluation
         ◦ Supporting key decisions (technical, cost, and schedule)
         ◦ Risks inherent in key decisions mandate risk management
       • The NEED exists to regularize the verification and validation of modeling and simulation tools (data, models, test beds) and activities (studies, exercises)
       • A DELIBERATE PROGRAM of activity to establish confidence in the operation of models and simulations, and in the significance of modeling and simulation results, is imperative
    36. Optimizing Risk & VV&A
       RISK = PROBABILITY x IMPACT
       Risk analysis keeps the VV&A effort focused on problem credibility requirements: determining the best use of all the resources available. [chart: as the risk level rises from low through medium to high, driven by both the probability and the impact of risk, the potential cost of VV&A rises from $ to $$$$; high credibility requires "more" VV&A, nominal credibility "less"]
    37. Quantifying Risk
       • Three areas are typically used to identify and quantify risks:
         ◦ Problem categories
         ◦ Impact of occurrence
         ◦ Probability of occurrence
       • Use current organizational risk approaches if they are well defined
    38. Problem Categories and Levels of Impact*
       Each problem category is assigned an impact level of NEGLIGIBLE, MARGINAL, CRITICAL, or CATASTROPHIC:
       • Personnel safety: less than minor injury / minor injury / severe injury / death
       • Equipment safety: small-scale minor damage / broad-scale minor damage / broad-scale major damage / major equipment loss with broad-scale major damage
       • Environment damage: some, trivial / minor / major (Love Canal) / severe (Chernobyl)
       • Cost: <20% cost growth / 20% to 50% cost growth / funds reduction, 50% to 100% cost growth / loss of program funds, 100% cost growth
       • Schedule: republish schedules / slip causes internal turmoil / slip causes cost impact / slip reduces DoD capabilities
       • Political (three levels listed): local embarrassment ($200 hammer) / significant (Tailhook '91) / national or international (Watergate)
       • Operational: minimal additional casualties / small number of additional casualties / large number of combat deaths / widespread additional combat deaths
       • Occupational illness: minor or small scale / minor and small scale / severe or broad scale / severe and broad scale
       * Consistent with DoD Military System Safety Standard MIL-STD-882C
    39. Problem Categories and Levels of Impact* (same table as slide 38)
    40. Probability of Occurrence**
       • Frequent (10^-2): likely to occur frequently over the lifetime of an item; widely experienced across a number of items
       • Probable (10^-3): will occur several times in the life of an item; will occur frequently
       • Occasional (10^-4): likely to occur some time in the life of an item; will occur several times
       • Remote (10^-5): unlikely but possible to occur in the life of an item; unlikely, but can reasonably be expected to occur
       • Impossible (10^-6): so unlikely that it can be assumed the occurrence may not be experienced; unlikely to occur, but possible
       ** The number of items should be specified
    41. Mapping M&S to Risk Areas [table: the phenomenology and environmental models (the IRI-95 ionosphere model, the GRAM-95 neutral atmosphere model, and the Crane rain RF attenuation model) are each rated Low, Medium, or High on problem categories, impact of occurrence, and probability of occurrence]
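A mapping like the one on this slide, from an impact level and a probability level to a low/medium/high risk area, can be mechanized as a lookup. The additive index and the thresholds below are illustrative only; they follow the general spirit of MIL-STD-882-style hazard matrices, not the briefing's actual cell assignments.

```python
# Hypothetical risk-area lookup combining the impact levels of slide 38
# with the probability levels of slide 40. Thresholds are illustrative.

IMPACTS = ["negligible", "marginal", "critical", "catastrophic"]
PROBABILITIES = ["impossible", "remote", "occasional", "probable", "frequent"]

def risk_area(impact: str, probability: str) -> str:
    i = IMPACTS.index(impact)             # 0 (least severe) .. 3 (worst)
    p = PROBABILITIES.index(probability)  # 0 (least likely) .. 4 (most)
    index = i + p                         # simple additive hazard index
    if index >= 6:
        return "high"
    if index >= 3:
        return "medium"
    return "low"

print(risk_area("catastrophic", "frequent"))  # high
print(risk_area("marginal", "remote"))        # low
```

In practice, an organization would replace the additive index with an explicit matrix of expert-assigned cells, but the lookup structure is the same.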
    42. V&V Activities as a Function of Risk [table] The level of risk determines how much evidence is required for each V&V item. For each M&S development process item (e.g., software development results, software management resources and artifacts, software V&V results) and each V&V activity (validate the conceptual model, validate the M&S application, verify the M&S design, verify the M&S implementation), the table specifies what is needed when risk is low, medium, or high: ranging from "any one" source of evidence at low risk, through "any two", to "any three" or "required" at high risk. Typical evidence sources include simulation support plans, configuration management plans, V&V plans, and accreditation plans; specifications, requirements trace reports, and functional flow diagrams; CM databases, SCRs, CCB minutes, and software design documentation; SPCR logs, test reports, verification reports, and usage history; and conceptual model review reports. For application validation, sources include SMEs, scientific theory and accepted algorithms, laboratory and developmental tests, system operational tests, engineering data, training test results, historical values, and previously and separately validated simulations or data. Implementation verification techniques include audit, desk checking, face validation, inspection, reviews, Turing tests, and walkthroughs.
    43. Outline
       • SBA Background
       • M&S Technology to Support SBA
       • VV&A of M&S for SBA
         ◦ VV&A Background
         ◦ Cost of VV&A
       • VV&A Procedural Recommendations
         ◦ Risk-focused Strategy
         ◦ Managed Investment Strategy
       • Conclusions
    44. Putting the A in Front of V&V!
       • Individual M&S and agency accreditation plans may be unique, but the M&S V&V activities selected for execution should provide essential, fundamental information about the simulation to support M&S accreditation decisions.
       • The VV&A goal is to establish that the M&S produces realistic, unbiased, credible measures of performance when operated within a specific domain of scenario and environmental conditions, so that it is acceptable (accredited) for use.
       • As a consequence, accreditation must be the primary objective in the definition of the M&S V&V activities.
       The planning for A focuses V&V execution.
    45. Requirements Flow-Down
       • M&S VV&A programs can be defined top-down:
         ◦ Accreditation-decision information needs drive V&V data products
         ◦ Data requirements are contingent on accreditation scope
         ◦ Requirements are flowed down to V&V activities
         ◦ …in a perfect world…
       • And executed from the bottom up
    46. Managing the VV&A Investment
       • As in M&S itself, the specification of the scope and detail of accreditation is problematic
         ◦ Managed investment addresses the problem of specifying the scope and detail of M&S VV&A activity
         ◦ It allows near-optimal investment in assessment activities and products in an economically constrained environment
       • A managed investment means deliberate, progressive, marginal investment in information valuable for supporting accreditation decisions
       While no a priori determination will provide closed-form guidance on "how much is enough", the proposed approach allows investment in VV&A activities and products to be made in a technically complex, dynamic, time-distributed, economically constrained environment.
    47. Using a Managed Investment Strategy
       • Managed investment is the execution of a carefully selected subset of V&V activities:
         ◦ Offering the "best return on investment" by providing the essential information from V&V findings
         ◦ Providing the evidence required to support the decisions of the accreditation authorities
       • This approach treats cost as an independent variable during the selection and execution of VV&A assessment activities
       • An optimal subset of VV&A activities can then be chosen based upon the:
         ◦ Information and data needs of the accreditation authority
         ◦ Realities of the program (schedule)
         ◦ Fixed resources (budget) available for V&V activities
         ◦ Risk tolerance of the program
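The subset selection described above is, at heart, a budget-constrained optimization. A greedy value-per-dollar heuristic is one simple stand-in for the "near-optimal" selection the briefing describes; the activity names, values, and costs below are invented for illustration.

```python
# Sketch of a managed-investment selection: choose candidate V&V
# activities that maximize accreditation-evidence value within a fixed
# budget. Greedy value-per-dollar is a heuristic, not an exact optimum.

def select_activities(candidates, budget):
    """candidates: list of (name, evidence_value, cost) tuples.
    Returns the names chosen, highest value per dollar first."""
    chosen, spent = [], 0.0
    for name, value, cost in sorted(
        candidates, key=lambda c: c[1] / c[2], reverse=True
    ):
        if spent + cost <= budget:
            chosen.append(name)
            spent += cost
    return chosen

# Hypothetical candidate activities for one accreditation effort:
candidates = [
    ("face validation by SMEs", 8.0, 20_000.0),
    ("code walkthrough",        5.0, 15_000.0),
    ("benchmark vs. test data", 9.0, 60_000.0),
    ("requirements trace",      4.0, 10_000.0),
]
print(select_activities(candidates, budget=50_000.0))
```

A real program would derive the value scores from the accreditation authority's information needs and the program's risk tolerance, and might use an exact knapsack solver instead of the greedy pass.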
    48. What Is the VV&A ‘Evaluation Activity Space’? <ul><li>Another key VV&A strategy concept that supports program definition is a familiar one: </li></ul><ul><ul><li>It is the systems engineer's multi-dimensional view of the enterprise, whose dimensions exhaust the important attributes of the conceptual space </li></ul></ul><ul><li>The recommended “evaluation space” dimensions consist of: </li></ul><ul><ul><li>Unit-under-test (UUT) </li></ul></ul><ul><ul><li>V&V activity </li></ul></ul><ul><ul><li>V&V agent </li></ul></ul><ul><ul><li>… that yield a V&V product </li></ul></ul><ul><li>The VV&A program domain-of-interest comprises the most cost-effective set of cells in the Evaluation Activity Space </li></ul><ul><li>This explicit activity domain-of-interest assures complete, systematic evaluation and intelligent choices within each dimension (UUT, Activity, Agent, & Product) </li></ul>[Diagram: Evaluation Activity Space — Unit Under Test × V&V Activity × V&V Agent, yielding V&V Product(s)]
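The Evaluation Activity Space described above can be pictured as a grid of cells, each pairing a UUT, a V&V activity, and a V&V agent, and yielding a V&V product. The sketch below is a minimal illustration of that structure; all names are hypothetical placeholders.

```python
# Hypothetical sketch of the VV&A 'Evaluation Activity Space': each cell
# pairs a unit-under-test, a V&V activity, and a V&V agent, and yields a
# V&V product. All names here are illustrative placeholders.
from dataclasses import dataclass
from itertools import product

@dataclass(frozen=True)
class Cell:
    uut: str        # unit under test
    activity: str   # V&V activity applied to the UUT
    agent: str      # organization performing the activity

    def product_name(self) -> str:
        """The V&V product a cell yields, e.g. a report."""
        return f"{self.activity} report on {self.uut} by {self.agent}"

uuts = ["flight model", "sensor model"]
activities = ["verification", "validation"]
agents = ["developer", "independent agent"]

# Enumerate the full space; a program's domain-of-interest would then be
# the most cost-effective subset of these cells.
space = [Cell(u, a, g) for u, a, g in product(uuts, activities, agents)]
```

Making the space explicit in this way is what allows the systematic, per-dimension choices the slide describes: every (UUT, activity, agent) combination is visible, so none is omitted by accident.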
    49. Managed Investment Applied To M&S VV&A
    50. M&S VV&A Program Specification <ul><li>M&S VV&A program specification includes: </li></ul>[Diagram: M&S VV&A Program Plan — Product Specification, UUT–Activity Matrix, Schedule, Resource Plan, M&S Accreditation Requirements Specifications, Accreditation Product DIDs]
    51. Outline <ul><li>SBA Background </li></ul><ul><li>M&S Technology to Support SBA </li></ul><ul><li>VV&A of M&S for SBA </li></ul><ul><ul><li>VV&A Background </li></ul></ul><ul><ul><li>Cost of VV&A </li></ul></ul><ul><li>VV&A Procedural Recommendations </li></ul><ul><ul><li>Risk-focused Strategy </li></ul></ul><ul><ul><li>Managed Investment Strategy </li></ul></ul><ul><li>Conclusions </li></ul>
    52. Lessons Learned <ul><li>Accreditation is the key to making VV&A cost-effective </li></ul><ul><li>Accreditation agents are key </li></ul><ul><ul><li>Help the accreditation authority articulate requirements </li></ul></ul><ul><ul><li>Communicate those requirements to the M&S developers </li></ul></ul><ul><ul><li>Develop the accreditation recommendations </li></ul></ul><ul><li>V&V agents are key </li></ul><ul><ul><li>Help discern necessary V&V activities based on accreditation requirements </li></ul></ul><ul><ul><li>Focus the activities to reduce costs </li></ul></ul><ul><ul><li>Ensure appropriate documentation is produced to support accreditation </li></ul></ul>
    53. Some Challenges Remain <ul><li>Cost and resource requirements for M&S V&V are not well understood </li></ul><ul><ul><li>Meaningful cost metrics are not widely shared within M&S communities </li></ul></ul><ul><ul><li>Much more information about cost and resource requirements needs to be collected </li></ul></ul><ul><ul><li>More reliable cost estimation processes are needed </li></ul></ul><ul><li>Limitations in the data required for effective V&V have to be addressed </li></ul><ul><ul><li>Management processes should be common across simulation applications </li></ul></ul><ul><ul><li>They should address required data and detailed characterization of associated uncertainties and errors, simulation/software artifacts, etc. </li></ul></ul>* The Workshop on “Foundations for Modeling and Simulation (M&S) Verification and Validation (V&V) in the 21st Century”, better known as “Foundations ’02”, sponsored by the Defense Modeling and Simulation Office and held October 22–24, 2002, at the Johns Hopkins University Applied Physics Laboratory in Laurel, Maryland.
    54. Some Challenges Remain <ul><li>Effective communication is a problem </li></ul><ul><ul><li>There continue to be differences in the details of terminology, concepts, and V&V paradigms among various M&S communities </li></ul></ul><ul><ul><li>Excessive use of acronyms makes it difficult to communicate easily across community boundaries </li></ul></ul><ul><li>M&S V&V needs to employ more formal (repeatable and rigorous) methods to facilitate better judgments about the appropriateness of simulation capabilities for intended uses </li></ul><ul><li>Advances in M&S frameworks and theory can enhance V&V capabilities, and are essential for increasing automated V&V techniques </li></ul>
    55. LIVE THE VISION – by Bill Waite <ul><li>Conceive in general </li></ul><ul><ul><li>Be liberal in imagination, because … </li></ul></ul><ul><ul><li>The big picture matters </li></ul></ul><ul><li>Execute in particular </li></ul><ul><ul><li>Everything should follow from the vision, but … </li></ul></ul><ul><ul><li>Only the ‘real work’ counts </li></ul></ul>“Imagination is more important than knowledge.” – Albert Einstein
