Updm Group Sar Example Brainstorm5(2010 02 24)
  • Of course, these simplified inputs and outputs decompose into much more complicated sets of products, and the analyses themselves require much more examination. The point, however, is that a JCIDS CBA is not really different from any other analysis. It must specify the issues, estimate our current and projected abilities, and recommend actions. Note that your CBA may not include an FSA. The current trend in JCIDS for jointly-initiated assessments is to do an FAA and an FNA and then produce a “Joint Capabilities Document” (JCD), which is sent to the JROC. If the JROC opts to act on the needs identified in the assessment, it will assign a sponsor (typically a Service) to do one or more FSAs. This also means that you may do a CBA which consists of nothing but an FSA. In these cases, you will have to rely on someone else’s FAA and FNA, including repairing any defects and reacting to subsequent changes in guidance.
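The FAA, FNA, FSA flow described in the note above can be sketched as a simple three-stage pipeline. This is an illustrative sketch only; the class and field names below are assumptions for illustration, not JCIDS terminology.

```python
from dataclasses import dataclass

# Illustrative sketch of the JCIDS CBA flow described above.
# Class names mirror the three analyses; field names are assumptions.

@dataclass
class FunctionalAreaAnalysis:      # FAA: specify the military problems to study
    guidance: list
    def run(self):
        return [f"problem derived from {g}" for g in self.guidance]

@dataclass
class FunctionalNeedsAnalysis:     # FNA: assess current ability, recommend needs
    problems: list
    def run(self):
        return [f"need: address {p}" for p in self.problems]

@dataclass
class FunctionalSolutionAnalysis:  # FSA: recommend solutions to the needs
    needs: list
    def run(self):
        return [f"solution candidate for {n}" for n in self.needs]

problems = FunctionalAreaAnalysis(["capstone doctrine"]).run()
needs = FunctionalNeedsAnalysis(problems).run()
solutions = FunctionalSolutionAnalysis(needs).run()
```

As the note says, a given CBA may run only a slice of this pipeline (e.g., an FSA alone, consuming someone else’s FAA/FNA outputs).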
  • There are five types of sememes: two denotational and three connotational, with connotational sememes occurring only in phrase units (they do not reflect the denotation):
    • Denotational 1: Primary denotation, for example “head” (body).
    • Denotational 2: Secondary denotation by resemblance with another denotation: “head” (ship).
    • Connotational 1: High position, as the role or function of “head” in the operation of the human body.
    • Connotational 2: Emotive, e.g., the meaning in “honey”.
    • Connotational 3: Evaluative, e.g., the meaning in “sneak”: move silently and secretly for a bad purpose.
    Denotation is the specific, literal image, idea, concept, or object that a sign refers to. Connotation is the figurative cultural assumptions that the image implies or suggests. It involves emotional overtones, subjective interpretation, socio-cultural values, and ideological assumptions. Examples:
    • Stop sign. Denotation: stop (even without words, we recognize the meaning from the shape and color). Connotation: risk (accident or ticket).
    • Health club ad. Denotation: fit person in foreground --> you could look like this. Connotation: fit person in background --> you could pick up a date like this in our club.
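The five sememe types above form a small taxonomy (two denotational, three connotational), which can be written down directly. The enum layout and example pairs below are an illustrative restatement of the note, not a standard linguistic API.

```python
from enum import Enum

# The five sememe types from the note above, as a small taxonomy.
class SememeType(Enum):
    DENOTATIONAL_PRIMARY = "primary denotation"           # "head" (body)
    DENOTATIONAL_SECONDARY = "denotation by resemblance"  # "head" (ship)
    CONNOTATIONAL_ROLE = "role/function"                  # "head" as high position
    CONNOTATIONAL_EMOTIVE = "emotive"                     # "honey"
    CONNOTATIONAL_EVALUATIVE = "evaluative"               # "sneak"

# Word/sense pairs taken from the examples in the note.
EXAMPLES = {
    SememeType.DENOTATIONAL_PRIMARY: ("head", "part of the body"),
    SememeType.DENOTATIONAL_SECONDARY: ("head", "front part of a ship"),
    SememeType.CONNOTATIONAL_ROLE: ("head", "high position or role"),
    SememeType.CONNOTATIONAL_EMOTIVE: ("honey", "affectionate overtone"),
    SememeType.CONNOTATIONAL_EVALUATIVE: ("sneak", "move secretly for a bad purpose"),
}

denotational = [t for t in SememeType if t.name.startswith("DENOTATIONAL")]
connotational = [t for t in SememeType if t.name.startswith("CONNOTATIONAL")]
```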
  • The 6 C’s Completeness, Consistency, Correctness, Conservation, Compliance (to Rules/policy), Conformance (to contracts/I/F)
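The 6 C’s above read as a review checklist, which could be run mechanically against a model description. A minimal sketch follows; the check bodies are placeholder assumptions (three are stubbed out), and only the six names come from the note.

```python
# Minimal sketch of a model-review checklist built from the 6 C's above.
# The check functions are illustrative placeholders, not a real validator.

SIX_CS = {
    "Completeness":  lambda model: bool(model.get("views")),
    "Consistency":   lambda model: len(set(model.get("names", []))) == len(model.get("names", [])),
    "Correctness":   lambda model: model.get("reviewed", False),
    "Conservation":  lambda model: True,   # placeholder: nothing lost across refinements
    "Compliance":    lambda model: True,   # placeholder: rules/policy check
    "Conformance":   lambda model: True,   # placeholder: contracts/interface check
}

def review(model):
    """Return the names of the C's the model fails."""
    return [name for name, check in SIX_CS.items() if not check(model)]

failures = review({"views": ["OV-1"], "names": ["SAR", "SAR"], "reviewed": True})
# "Consistency" fails because the name list contains a duplicate.
```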
  • Work organization occupies the highest level because it provides the structure within which the modeling language and its heuristics can be effectively applied, measured, and refined. While a given process may be able to accommodate many variations in heuristics and language, these elements are the foundation of any methodology, as they directly impact the process and its artifacts.
  • Transcript

    • 1. Search & Rescue (SAR) Sample – Annex C Brainstorming Session Five Leonard F. Levine (Initial POC) [email_address] UPDM Group 24 February 2010
    • 2. 03 March 2010 Agenda
    • 3. Why Upgrade?
      • Scope of Revisions in UPDM 2.0. Reflect DoDAF 2.0 and new metamodel for UPDM 2.0. IDEAS. MoDAF. NAF.
      • Clarify Formal Normative Description with Informal Examples
      • Educate
      • User Requirements
      • Testable
      • More Real World Problems
      • Detailed Systems Engineering or Enterprise Architecture examples (handover)
      • Separate “Part” or “Volume”
      • Team Formation & Expertise
      • Prioritization
      • Next Steps and Action Items
      • Need to Clarify Methodology embedded in UPDM in the Example
      • Atego (Artisan) has upgraded sample from 1.0
    • 4. Scope of Revisions in UPDM 2.0.
      • Reflect DoDAF 2.0 and new metamodel for UPDM 2.0. IDEAS. MoDAF. NAF.
          • DoDAF now has 52 pre-canned models (vice views) and custom/user-defined views (capturing custom relationships).
          • Should we even try to do 1 example of each model/view?
            • Need to be consistent with other views in sample
              • Some examples naturally won’t have SOAML
        • SysML – How much more detailed than Enterprise Architecture concepts should the examples go?
        • Timing with completion of metamodel before details of sample can be drafted
      • More SOAML? Bit of BPMN?
      • SOPES
      • DNDF?
      • Consistency. Sample must be consistent, conservation issues with more detailed BPMN, SysML issues. Correlation between methodologies & frameworks. Across various layers. Show hand-off/handover between EA & Sys Eng, Soft Eng?
      • Some items out of scope (too much detail) could be shared via external websites, bodies, etc. Training courses…
      • We are setting the boundaries of an AV-1 for the Sample! Is it one or many scenarios that we’re doing? Or something in-between?
      • Lifecycle? Large SAR EA example. Subset scenario such as requirement for new system.
    • 5. OV-1 for the Sample
      • Graphics?
      • Scenario?
    • 6. Educate
      • Role of Education in a Standard?
        • Is it a tutorial?
    • 7. Sample Clarifies Formal Normative Description
      • Role of Informal (non-normative) Examples or Samples
    • 8. User Requirements of SAR Sample
      • Who are the users? (That is, the intended readership?)
        • Vendors?
        • End Users of Vendors’ Tools?
        • OMG Membership?
        • ISO Reviewers?
        • Configuration Managers? 
        • Architects & Designers?
        • Program Managers?
      • Tentative consensus: All of the above.
    • 9. Shall we continue to use SAR, another domain, and/or a combination?
      • Does someone want to raise an alternative?
        • Graham: Too much invested to switch
        • Moe: Supplement with DoD and/or US Government GIG approach. How much extra work? Command & control? At least honorable mention of OV’s for multinational coordination. Key phrase “full spectrum dominance”. Stay away from SV’s?
          • Len: Does USCG use the GIG idea for SAR? USGS? Volunteer to do a little research… Look at GIG / NCOW (net-centric operations & warfare). Look for CONOPS or high-level design?
          • Antoine: Watch out for extra work, time delays, and consistency and coordination.
          • Len (private): Avoid DoD (& MOD) politics on what the GIG really is.
      • We may need a formal or informal vote. Ask the co-chairs to advise on issue. E-mail to Jim, Graham, Matthew. Group of volunteers only. Architects?
      • ACTION DEFERRED UNTIL Next Session (2010-03-03)
    • 10. Testable
      • OMG Model Interoperability Working Group?
      • Others?
    • 11. Real World Problems
      • More real world architectural problems or goals documented in the literature of the US, UK, Canadian, and other Coast Guards -- as well as international maritime authorities.
      • See example from Antoine of Mega
      • Consensus : We want to concentrate on real world problems.
    • 12. More Detail
      • Detailed Systems Engineering?
        • Flows, Constraints?
        • Dynamics: state machines, activity diagrams?
        • RECONFIRM: Consensus: We need a detailed and large set of examples because UPDM (DoDAF, MoDAF) has a rich set of notations in the current model, a rich language. We need to show how to express competencies (one example per competency).
        • Enterprise Architecture examples?
      • DoD or MoD Samples?
    • 13. Format: Separate Part & Volume
      • Consensus on Separate Part & Volume
        • Ease of Editing & Publishing
        • Ease of Configuration Management
        • Problem of Consistency?
      • Color?
        • Non-normative
          • Discriminatory?
    • 14. SAR Team: Formation & Expertise
      • Permanent Team Chair & Co-chair (Matthew?)
      • Expert in using chosen modeling tool
      • Functional Expertise: e.g., Service-Oriented Expertise
      • SME (SAR Expertise)
      • End User Point of View
      • Architect
      • Other?
      • Teleconferencing & Whiteboarding
      • Call for Volunteers
      • Matthew Hause
      • Volunteers:
      • Primary: Lars, Len, Moe, Antoine
      • Secondary (less time commitment): Graham
    • 15. Use of Individual Upgrades to/since UPDM 1.0 SAR Sample
      • Atego (Artisan) How extensive? Very.
      • Mega. Antoine has some extensions from Mega.
      • Services from Lars-Olof.
      • Services also from Graham.
      • Annotated Bibliography with Text / Graphic Excerpts from Len (about 100 pp.)
    • 16. Prioritization
      • 1. Form Team for SAR Example Upgrade.
      • 2. Determine & clearly articulate objectives (see Modeling Guidance).
      • 3. Draft Unified Example and then specializations.
      • 4. Write Tech Note to Governments on why one unified example is or is not possible.
    • 17. Modeling Guidance (Methodology)
      • Need to Clarify Methodology embedded in UPDM in the Example (not questioning MoDAF or DoDAF or NAF… internal methodologies)
        • Physical and UPDM Elements? What represents what?
          • Physical: How do I do a communications network? Just a set of connections? Architecture versus science/engineering. We should include at least 1 network SAR example in UPDM 2.0.
        • Modeling Guidance (rather than Method versus methodology)
        • Refer back to the heuristics of guidelines. Example: the implicit relationship between OV-5 (Activity Model) and SV-4, a semantic relationship. But the metamodel is supposed to take care of this? We should provide high-level guidance. Most end users are not going back to the metamodel (associated with frameworks). Remember, in MoDAF, some “technical users” use the metamodel frequently. Enterprise goal such as increasing air traffic by 80%. Hard to explain “goal”, for example. Could be clarified in SAR. Could be done in UPDM – whether in UPDM L0 and L1.
        • Here’s why we are showing this example in the first place. (Len started this in UPDM 1.0 intro paragraphs.) Why should be deeper and include modeling guidance.
        • Need a paragraph by e-mail from
          • Moe
          • Antoine
          • Other
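The implicit semantic relationship between OV-5 activities and SV-4 system functions mentioned above is the kind of rule that modeling guidance could make mechanically checkable. A minimal sketch, assuming a hand-maintained mapping; all element names here are made up for illustration.

```python
# Sketch of a traceability check for the implicit OV-5 <-> SV-4 relationship
# discussed above. Element names and the mapping are illustrative assumptions.

ov5_activities = {"Search Area", "Locate Survivor", "Recover Survivor"}

# Which SV-4 system functions support which OV-5 operational activities.
sv4_support = {
    "Radar Scan":    {"Search Area"},
    "Beacon Homing": {"Locate Survivor"},
}

def unsupported_activities(activities, support_map):
    """Return operational activities with no supporting system function."""
    supported = set().union(*support_map.values()) if support_map else set()
    return activities - supported

gaps = unsupported_activities(ov5_activities, sv4_support)
# "Recover Survivor" has no SV-4 function tracing to it.
```

A real metamodel-backed tool would derive the mapping from model relationships rather than a hand-built dictionary; the point is only that the guidance can be stated as a check.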
    • 18. Modeling Guidance (Methodology)
      • Also, different vendors may have slightly different takes, guidance, or implementation-related approaches to methodology.
      • DISCUSS: Moe (and Dave McD?): Different acquisition processes (e.g., US DoD’s JCIDS) require different modeling approaches. See “UPDM/DoDAF 2.0: Its Place and Role in Defense Acquisition” by Clarence Moreland at the end of this presentation.
        • DOTMLPF (Doctrine, Organization, Training, Materiel, Leadership and Education, Personnel, and Facilities) (US) vs. DLOD (Defence Lines of Development) (UK): http://www.aof.mod.uk/aofcontent/strategic/guide/sg_howacqworks.htm
          • The subjects are, however, much the same.
          • Can one unified example cover both?
        • Ref to Moe’s Triangle. Upper levels (e.g., architectures) capture the highest-level enterprise structure and activities. The lowest level captures LOEs, the individual level, data types (e.g., integers), and specific functions on a radar screen (de-cluttering) implemented in low-level software.
          • Low level may not understand high level concepts and vice versa. (Radar operation vs. net centric vision)
    • 19. Modeling Guidance (Methodology)
        • DISCUSS: Methodology structure. 3 components: work organization, modeling heuristics, & language. JCIDS/DoDAF dictates inputs such as task lists, doctrines, and standards, and dictates outputs such as the views: OV’s, AV’s, TV’s, etc. (In reality, most in DoD use JCIDS & DoDAF together.) 3 major analytical areas: functional area analysis, primarily by government (capstone documents, requirements…); functional needs analysis, by government & civilian (FFRDCs such as MITRE, …); and functional solution analysis (industry).
          • Len: Although DoDAF is “view neutral” and even supports User Defined Views, there are a certain number of minimal views required for registration and comparison in the DoD (see purpose of DARS and associated registries).
          • Language continuum (see Moe's slides) from high-level lexicon to lower-level grammars (morphemes/sememes, lexemes, tokens)
    • 20. Modeling Guidance (Methodology)
      • What is In Bounds? And What is Out of Bounds?
        • Out of Bounds
          • Ontology. Metadata modeling such as Chris Partridge’s xxx and/or the IDEAS Group is NOT part of end-user concerns for methodology in the UPDM 2.0 SAR Example.
            • [EDIT] Rationale.
        • Questionable
          • UML 2 “methodology”. The spec is quoted as implying no development process. However, is the OO approach useful? Are standard static & dynamic analyses useful? Etc.
        • In Bounds
          • [EDIT]
    • 21. DoDAF v. MoDAF: Can we have a Unified Example?
      • Can one example do both? Probably not, as things stand today (3 Feb 10).
        • Two examples needed because of naming conventions and different concepts.
        • Example: ‘capability’ is defined similarly but used very differently. Organization types.
      • Issue: Reluctance of the DM2 TWG to accommodate these differences.
      • Suggestion – constructing example of dissonance in charts/models between DoDAF and MoDAF and bringing back to DoD / MoD principals.
        • Trace differences back to “authoritative sources”.
          • Example use of Performer in DoD. Can aliasing be done better?
      • Goal (for Example Point of View): Try to make DoDAF/MoDAF modeling for end users closer together (fewer obvious differences).
    • 22. Next Steps and Action Items
      • Next Steps
        • Continue brainstorming session
        • Session 5: Next Wednesday 03 March 2010 another 45 minutes!
      • Action Items
        • Matthew to send invitations to join SAR Example Team including call for chair/co-chair. Done.
        • Len to send this revised presentation to Moe, who will send it out to the group. Do again.
        • Group to review slides & conduct e-mail threads as we feel need
          • SAR and/or GIG – see page xxx for notes
          • *Methodology 15 min – done, see pages 15-16 for notes
          • Reminder (Moe, Antoine, Others), please provide feedback on page 15.  ***NEED MORE FEEDBACK*** 
    • 23. References
        • DLOD (Defence Lines of Development) (UK): http://www.aof.mod.uk/aofcontent/strategic/guide/sg_howacqworks.htm
        • DOTMLPF (Doctrine, Organization, Training, Materiel, Leadership and Education, Personnel, and Facilities) (US) [ Edit ] (relating to methodology); JCIDS; DoD 5002 (acquisition)
        • IBM. Developing Object-Oriented Software: An Experience-Based Approach [add isbn, etc.] – as applied to system architecture.
        • Add about half dozen (6) primary References from Len’s SAR bibliography?
          • Len to send clean copy of full references for posting on Wiki.
        • Need to REVIEW above. Don’t let references drive example & methodology choices…
    • 24. Preparation – “Run Up” to OMG Tech Meeting, Jacksonville, 22 March 2010
      • Goal: Hand over to a permanent “SAR” team by end of this meeting
      • Meet on Sunday the 21st, Face-to-Face – the volunteers and brainstorming? Evening? Len to volunteer. Conflict with Architecture Ecosystem.
      • Brief the whole UPDM Group on consensus and items in progress. Len to volunteer as long as it does not imply that I will chair the permanent team.
    • 25. Backup
      • Presentation from Clarence Moreland (Moe) on Methodology
    • 26. UPDM/DoDAF 2.0: Its Place and Role in Defense Acquisition. Clarence Moreland, Atego Inc. All Rights Reserved
    • 27. JCIDS Analysis Process. The FAA synthesizes existing guidance to specify the military problems to be studied. The FNA then examines those problems, assesses how well the DoD can address them given its current program, and recommends needs the DoD should address. The FSA takes this assessment as input and generates recommendations for solutions to the needs. Architectures are utilized throughout. Clarence Moreland, Atego Inc. All Rights Reserved
    • 28. Methodology
      • Modeling Language
        • The language or notation used to convey ideas in both the problem domain (analysis) and the solution domain (design)
      • Modeling Heuristics
        • Describes how the modeling language can be used in specific situations
      • Work Organization
        • A framework for organizing and performing development work (the process)
      Assume that any (good) methodology minimally has these three components. Clarence Moreland, Atego Inc. All Rights Reserved
    • 29. Hierarchy of Frameworks and MS&A
      [Diagram: a layered stack relating resolution (force/SoS level down to system/subsystem/component, increasing aggregation vs. increasing resolution) to the functions supported (test & evaluation, operational requirements development, effectiveness analysis, tactics development, mission planning & rehearsal, design, manufacturing cost, tech requirements development) and to the methodology stack (architecture framework; enterprise processes, CDRL & WBS; modeling guidelines; language & notation – UML/SysML/IDEF/BPMN; C, C++, C#, Java, Ada), with DOTMLPF/DLOD at the top and the language continuum (grammars: morphemes, sememes, lexemes, tokens; lexicon, heuristics, data dictionary; denotations (two levels), connotations (three levels)) at the bottom. Clarence Moreland, Atego Inc. All Rights Reserved]
    • 30. Systems are Amalgamated Architectures; Systems of Systems are Emergent Architectures
      • Information Hiding
        • - Higher levels defined w/o knowledge of lower level implementation
      • Separation of Concerns
        • - Policy separated from mechanism
        • - Lower levels only recognize context imposed by contract with higher level
      • Recursive structures enable commonality and consistency: at each level of the hierarchy a consistent set of policies & profiles is applied.
      • Verified architectural principles affirmed at one level are captured, codified & reified at lower levels via the profiles, patterns, and policies institutionalized within the repository’s data dictionary.
      • Variability is explicitly managed at different levels of refinement/abstraction via recursive application of standards, policies, principles, patterns, and idioms.
      [Diagram: DARS, Components, MS&A; AF – DoDAF; Process – JCIDS; Procedure – Mil Std 499; Language(s) – UML/SysML/BPMN; SoS, Acquisition, Systems, with policy applied at each level. Clarence Moreland, Atego Inc. All Rights Reserved]
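The information-hiding and separation-of-concerns principles on this slide can be sketched in code: the higher level is written against an abstract contract, and lower-level mechanisms are interchangeable behind it. A minimal sketch; the Transport/link names are illustrative assumptions, not anything from the UPDM model.

```python
from abc import ABC, abstractmethod

# Higher level defined without knowledge of lower-level implementation:
# it only sees the contract (Transport), never the mechanism.

class Transport(ABC):
    """Contract imposed by the higher level on any lower-level link."""
    @abstractmethod
    def send(self, message: str) -> str: ...

class SatelliteLink(Transport):
    """One lower-level mechanism, hidden behind the contract."""
    def send(self, message: str) -> str:
        return f"sat:{message}"

class RadioLink(Transport):
    """An alternate mechanism honoring the same contract."""
    def send(self, message: str) -> str:
        return f"radio:{message}"

def dispatch(link: Transport, message: str) -> str:
    # Policy (what to send) is separated from mechanism (how it is sent);
    # the lower level only recognizes the context the contract imposes.
    return link.send(message)
```

Swapping `SatelliteLink` for `RadioLink` changes nothing at the higher level, which is the commonality/consistency point the slide makes about recursive application of contracts at each layer.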
    • 31. Methodology Structure
      [Diagram: JCIDS/DoDAF inputs (task lists, conditions, doctrine, standards) drive work organization and modeling heuristics through Functional Area Analysis and Functional Needs Analysis: CONOPS development, operational scenarios definition, architecture framework definition, gap analysis, and constructive iteration over “As Is” activity chains (Find, Track, Fix, Target, Access, Engage), with integrated experiments, performance definition, measurement, and notation, mapped to views such as OV-1, OV-2, OV-3, OV-4, OV-5, OV-6/OV-6c, SV-1, SV-2, SV-4, SV-6, SV-7, SV-10c, class & state diagrams, and system performance. Clarence Moreland, Atego Inc. All Rights Reserved]
    • 32. Add input from Antoine
      • As requested during our previous meeting, I am proposing two questions related to modeling techniques using UPDM.
      • 1. How to represent the following sentence: “Support a 30% increase of air traffic”. Is this represented as an objective, a capability, or an activity?
      • 2. How to represent a local network or the BGAN network of Inmarsat? Is it represented as an artifact or as a resource architecture?
    • 33.
      • Antoine wants to add more questions.
        • Keep these types of questions in mind when we actually do the examples.
        • Part of the SAR example should ask these questions.
    • 34. Maritime SAR Configuration (Antoine Lonjon)
    • 35. Blank
    • 36. THANK YOU!
