
Reuse versus Reinvention: How Will Formal Methods Deal with Composable Systems?



  1. Reuse Versus Reinvention: How Will Formal Methods Deal with Composable Systems?
     Mary Ann Malloy, PhD, [email_address]
     LFM 2008 Conference
  2. Composable C2 needs Formal Methods!
  3. As a public interest company, MITRE works in partnership with the U.S. government to address issues of critical national importance.
  4. Disclaimers
     - The views, opinions, and conclusions expressed here are those of the presenter and should not be construed as an official position of MITRE or the United States Department of Defense (DoD).
     - All information presented here is UNCLASSIFIED, is technically accurate, contains no critical military technology, and is not subject to export controls.
  5. What you WILL NOT take away today
     - A “shrink-wrapped” solution to DoD’s emergent testing challenges.
  6. What you WILL take away today
     - An understanding of what “service-orientation” and “composability” mean.
     - Insight into how DoD is trying to build composable systems but may not be testing them appropriately, nor learning from the testing it does do.
     - Ideas for formal methods and testing investigation paths that may improve the state of composable systems verification and testing.
  7. Overview
     - Background
     - Examples
     - Recent Work
     - Challenges
     - Summary
  8. BACKGROUND
  9. Who uses services?
     Answer: YOU do! ... and DoD wants to!
     Online examples: banking, directions, travel, eCommerce, news & information.
  10. What is a service?
      “A mechanism to enable access to one or more capabilities, where the access is provided using a prescribed interface and is exercised consistent with constraints and policies as specified by the service description.” (DoD Net-Centric Services Strategy, May 2007)
      Characteristics of services:
      - Modular and composable, much like “lego” building blocks
      - Network-accessible
      - Reusable
      - Standards-based
      - Distributed capabilities
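The DoD definition on this slide (access to capabilities through a prescribed interface, exercised under the constraints and policies of a service description) can be sketched in miniature. This is a hedged illustration only: the `ServiceDescription` type, the `invoke` helper, and the `max_payload` policy are invented names, not anything from the strategy document.

```python
from dataclasses import dataclass, field

@dataclass
class ServiceDescription:
    """A service description: a prescribed interface plus usage policies."""
    name: str
    operations: dict                                  # operation name -> capability
    policies: dict = field(default_factory=dict)      # e.g. {"max_payload": 16}

def invoke(service: ServiceDescription, operation: str, payload: bytes):
    """Access a capability only through the prescribed interface,
    consistent with the constraints stated in the description."""
    if operation not in service.operations:
        raise ValueError(f"{operation!r} is not in the interface of {service.name}")
    limit = service.policies.get("max_payload")
    if limit is not None and len(payload) > limit:
        raise ValueError("payload exceeds the policy limit")
    return service.operations[operation](payload)

# A tiny example capability exposed as a service.
echo = ServiceDescription(
    name="echo",
    operations={"echo": lambda p: p.decode()},
    policies={"max_payload": 16},
)
```

Callers never touch the capability directly; everything flows through `invoke`, which is what makes the constraints enforceable.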
  11. What DoD sees... and wants!
      [Screenshot] Worldwide threats and incidents (airport, chemical, bridge, railway, bombs, etc.), with links to related news stories and a searchable database; for example, a listing of bomb-related events between 14 Feb 08 and 15 Feb 08, with the ability to sort by type of incident, date, location, etc.
  12. What is “service-oriented architecture”?
      “A paradigm for organizing and utilizing distributed capabilities that may be under the control of different ownership domains.” (OASIS Reference Model for Service-Oriented Architecture, October 2006)
      - An architectural style based on flexibly linked software components that leverage web standards and services
        - NOT a product
        - NOT a bunch of web services
  13. What is composability?
      Composable solutions, the desired end-state of a full-scale SOA implementation, are the direction in which DoD, federal stakeholders, and commercial enterprises are evolving their automation assets.
      What is a composable system?
      - One that consists of recombinant atomic behaviors (components) selected and assembled to satisfy specific [new] processing requirements.
      - A design principle dealing with interrelated components that do not make assumptions about the compositions that may employ them; they are “fit for the unforeseen.” (proposed definition)
      NOTE: Composability is meaningful at many layers of abstraction.
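The proposed definition above (recombinant atomic behaviors that make no assumptions about the compositions employing them) can be sketched as ordinary functions plus a generic combinator. The component names (`normalize`, `geo_tag`) and the event fields are invented for illustration.

```python
# Each component is a pure function over a shared record shape and makes
# no assumptions about which composition will employ it.
def normalize(event: dict) -> dict:
    """Canonicalize the incident type."""
    return {**event, "type": event["type"].lower()}

def geo_tag(event: dict) -> dict:
    """Mark whether the event carries location data."""
    return {**event, "region": "located" if "lat" in event else "unknown"}

def compose(*components):
    """Assemble recombinant components into a new processing pipeline."""
    def pipeline(event: dict) -> dict:
        for component in components:
            event = component(event)
        return event
    return pipeline

# Two different compositions reuse the same atomic behaviors.
threat_feed = compose(normalize, geo_tag)
quick_look = compose(normalize)
```

Because neither component knows about `threat_feed` or `quick_look`, both stay “fit for the unforeseen”: a third composition could be assembled tomorrow without touching them.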
  14. Testing principles
      Testing for composability:
      - Ensure individual processing elements do not make undue assumptions about the composition
        - Code analysis or inspection for “hidden assumptions” or “out of band” dependencies
      Testing the composition:
      - Validate that the chosen composition of individual elements performs the desired functions.
      - The “composition layer” is an additional one that must be tested separately.
        - A composition can be VALID yet still not do anything USEFUL with respect to the relevant CONTEXT.
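The slide's distinction between a VALID composition and a USEFUL one can be made concrete with a toy structural check. The `Component` record and the `parse`/`render` example are invented; the point they illustrate is only that a type-level validity check cannot catch a composition that does no useful work in context.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class Component:
    fn: Callable
    in_type: type
    out_type: type

def composition_is_valid(chain, in_type, out_type) -> bool:
    """Structural validity: each element's output type feeds the next
    element's input type, end to end."""
    t = in_type
    for c in chain:
        if c.in_type is not t:
            return False
        t = c.out_type
    return t is out_type

# A VALID composition that is not USEFUL: it parses a value and
# immediately re-renders it, doing no real work for any mission context.
parse = Component(fn=int, in_type=str, out_type=int)
render = Component(fn=str, in_type=int, out_type=str)
useless = [parse, render]
```

`composition_is_valid(useless, str, str)` passes, yet a context-level test ("does this answer the operational question?") would still fail it, which is why the composition layer needs its own tests.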
  15. EXAMPLES
  16. Typical DoD approach to testing compositions
      [Flow diagram: START HERE at a Capability Need, which drives a Community Information Exchange Vocabulary; that drives the Data Needed and the Service Needed; Service Implementations enable a Capability Demonstration, which enables Capability Delivery... DECLARE SUCCESS!]
  17. Better DoD example: Net-Centric Diplomacy **
      General findings from the testing of the NCD initiative of Horizontal Fusion:
      - Many different types of interrelated testing are needed.
      - Exhaustive testing is impossible:
        - Testing must still be iterative
        - It is time-consuming!
        - Operationally specific test cases are needed
      - Performance testing must focus on service dependencies vice the user interface.
      - The number of requests that will cause a web service to fail is far lower than for a web server.
      - “Few realize the complexity that must be taken into account when attempting to quantitatively measure performance and reliability when dealing with web services.” (Derik Pack, SPAWAR System Center, 2005)
      ** See http://www.dtic.mil/ndia/2005systems/thursday/pack2.pdf
  18. Better DoD example: Net-Centric Diplomacy, concluded
      Testing was conducted until “error thresholds” were reached:
      - Round-trip time (90 sec)
      - Error rate (15%)
      Specific findings:
      - A mean of 3.06 connections per second could be achieved
      - WSDLs define interfaces, but not valid service use
      Is this practicable across all of DoD? DoD may need to stand up multiple access points for heavily used services and compositions, and the “sweet spot” will likely differ in times of war vice times of peace.
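A test harness driven "until error thresholds are reached," as described for the NCD testing, might look like the sketch below. The 90-second round-trip and 15% error thresholds come from the slide; everything else (the `send_request` callable, the `min_samples` warm-up) is an assumption made for illustration, not the actual SPAWAR harness.

```python
def run_until_threshold(send_request, max_rtt=90.0, max_error_rate=0.15,
                        min_samples=20):
    """Drive a service until one of the NCD-style error thresholds is hit:
    a round trip slower than max_rtt seconds, or an error rate above
    max_error_rate. send_request() must return (rtt_seconds, ok)."""
    errors, total = 0, 0
    while True:
        rtt, ok = send_request()
        total += 1
        if not ok:
            errors += 1
        if rtt > max_rtt:
            return ("rtt_threshold", total)
        # Wait for a minimum sample size before judging the error rate.
        if total >= min_samples and errors / total > max_error_rate:
            return ("error_threshold", total)
```

In a real harness, `send_request` would wrap an actual web-service call and the loop would also ramp the request rate, which is how a figure like "3.06 connections per second" would be located.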
  19. RECENT WORK
  20. What DoD must create to “get there...”
      - Loosely coupled, relevant, “right-sized” services that can be leveraged across continuously changing processes.
      - New governance that can deal with complex management of distributed, loosely coupled, dynamically composable services.
      - A better understanding of maintenance implications:
        - How long does it take? How will other components or clients be impacted?
        - Components with low or unknown MTBF should be highly accessible and easily replaceable... can this be automated?
      - A rapidly deployable, virtual, continuous test environment
        - Examples provided on the next two slides...
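The "can this be automated?" question about replacing low-MTBF components admits at least a naive sketch: keep a registry of replacements keyed by interface, and swap when observed MTBF drops below a floor. All names here (`maybe_replace`, the registry layout, the 100-hour floor) are hypothetical illustrations, not an existing DoD mechanism.

```python
def maybe_replace(component, registry, mtbf_floor=100.0):
    """Swap out a component whose observed MTBF (hours) has fallen below
    the floor, provided a replacement implementing the same interface is
    registered; otherwise keep it for a human to look at."""
    if (component["observed_mtbf"] < mtbf_floor
            and component["interface"] in registry):
        return registry[component["interface"]]
    return component

# Hypothetical registry keyed by interface name.
registry = {
    "track_feed": {"name": "track_feed_v2", "interface": "track_feed",
                   "observed_mtbf": 5000.0},
}
flaky = {"name": "track_feed_v1", "interface": "track_feed",
         "observed_mtbf": 12.0}
```

The hard parts the sketch hides are exactly the slide's concerns: how long a swap takes, and how clients of the old component are impacted.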
  21. ... something like ELBA? **
      1) Developers provide design-level specifications of model and policy documents (as input to Cauldron) and a test plan (XTBL).
      2) Cauldron creates a provisioning and deployment plan for the application.
      3) Mulini generates a staging plan from the input components referred to from the XTBL (dashed arrows).
      4) Deployment tools deploy the application and monitoring tools to the staging environment.
      5) The staging is executed.
      6) Data from monitoring tools is gathered for analysis.
      7) After analysis, developers adjust deployment specifications, or possibly even policies, and repeat the process.
      ** See http://www.static.cc.gatech.edu/systems/projects/Elba/pub/200604_NOMS_mulini.pdf
  22. ... or STARSHIP?
      - Key component of the Electronic Proving Ground Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance (C4ISR) tool kit for live distributed test environments.
      - Provides a “threads-based” composable environment to plan, generate planning documents, verify configuration, initialize, execute, synchronize, monitor, control, and report the status of a sequence of activities.
      - Freely available and customizable to any problem domain.
      - Complexity may be a barrier.
      POC: Ms. Janet McDonald, (520) 538-3575 (DSN 879), [email_address], PM ITTS IMO, Ft. Huachuca
  23. What MITRE is doing
      - c2c-composable-c2-list and Community Share site
      - Composable C2 is a “Grand Challenge Problem” within the 2009 MITRE Innovation Program (MIP): how to build reconfigurable components that can be mashed together in an agile fashion
        - Visualization and Analysis
        - Info Sharing
        - Interoperability and Integration
        - Resource Management to enable composability (of people, organizations, networks, sensors, platforms...)
        - Acquisition and Systems Engineering
        - Collaborative and Distributed C2
      - Example proposal: Web Service Process Optimization
        - “Our hypothesis is that web service optimization can be realized through machine learning techniques and statistical methods. In this research we intend to find a computational solution to the problem of creating and maintaining web service processes.”
  24. What MITRE is doing, concluded
      - Resources for Early and Agile Testing
        - Recently showed how low-cost simulation games can create a simple, “good-enough” simulation capability to evaluate new concepts early in development and expose the most challenging issues.
        - REACT “Online”: a composed testing environment for composed solutions!
          - A loosely coupled simulation capability delivering dynamic flexibility for “quick look” experiments
      - A series of brainstorming sessions on Composable C2
        - “Static” vs. “dynamic” composability vis-à-vis legacy systems
        - Do services derived from a “proven capability” have lower, or non-existent, testing requirements?
  25. CHALLENGES
  26. Practical challenges
      - Can we “right-size” testing as “fit-for-composition”?
        - Is composability binary? A sliding scale? When is it [not] OK to use “lower-rated” components?
        - Can we characterize the right amount of testing based on the anticipated longevity of the composition? Other factors?
      - What metadata must be exposed to assess the contextual validity of components in a composition?
        - Should WSDL be enriched? Supplemented?
        - Can what constitutes a valid composition be expressed as rules? How narrowly or broadly?
      - What thresholds and metrics are required? Nice to have?
        - Performance thresholds? Ongoing component health?
      - Can we “borrow” ideas from other composability abstractions for applicability here?
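One way to picture the slide's idea of expressing "what constitutes a valid composition" as rules over enriched service metadata: keep machine-readable ratings alongside each service description and evaluate a rule list against any proposed composition. The catalog entries, the `composability_rating` field, and the rule itself are all hypothetical, invented only to show the shape of such a check.

```python
# Hypothetical metadata that might supplement a WSDL: ratings and usage
# attributes beyond the bare interface definition.
CATALOG = {
    "track_store": {"composability_rating": 3, "max_clients": 10},
    "map_render": {"composability_rating": 1, "max_clients": 2},
}

# Rules as data: (description, predicate over a proposed composition).
RULES = [
    ("all components rated at or above the mission risk level",
     lambda comp, risk: all(CATALOG[s]["composability_rating"] >= risk
                            for s in comp)),
]

def composition_ok(composition, risk_level):
    """Return the descriptions of every rule the composition violates;
    an empty list means the composition passes the rule set."""
    return [desc for desc, pred in RULES if not pred(composition, risk_level)]
```

Because the rules are data rather than code, the "how narrowly or broadly?" question becomes a question about which predicates the community agrees to publish.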
  27. Levels of composability testing?
      [Chart: amount of testing plotted against risk (e.g., loss of life), with an “as-is for composable C2” point marked]
      Can we “rate” the composability of components? For a composition that will only be used a few times, can we tolerate higher risk?
  28. “Pressure points” for formalisms
      - How can the lessons learned from the past inform the way ahead for extending formal methods to testing and verification of composable systems?
      - Can we derive principles to compose systems in methodical, rather than ad-hoc, ways that will produce more satisfactory results?
      - How can we handle partial and incremental specifications?
      - How can we cope when building a composition with parts that make incompatible assumptions about their mutual interactions?
      - What kinds of automated checking and analysis can we support?
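The "incompatible assumptions about mutual interactions" pressure point lends itself to a simple automated check: have each part declare its assumptions about shared interaction points, then scan every pair for disagreements. The declaration format and the units example below are invented for illustration; real formalisms would check far richer properties (protocols, timing, state).

```python
def incompatible_assumptions(parts):
    """Find pairs of parts whose declared assumptions about shared
    interaction points disagree (e.g. units, encodings, frames)."""
    conflicts = []
    for i, a in enumerate(parts):
        for b in parts[i + 1:]:
            # Only keys both parts declare can conflict.
            shared = a["assumes"].keys() & b["assumes"].keys()
            for key in shared:
                if a["assumes"][key] != b["assumes"][key]:
                    conflicts.append((a["name"], b["name"], key))
    return conflicts

# Two parts that agree on nothing about range units: a classic
# (hypothetical) composition mismatch.
sensor = {"name": "sensor", "assumes": {"range_units": "meters"}}
fuser = {"name": "fuser", "assumes": {"range_units": "feet"}}
```

Even this crude pairwise scan illustrates the payoff of explicit assumption metadata: the mismatch is caught before the composition is ever deployed, rather than surfacing as a runtime failure.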
  29. Take-away points
      - DoD will continue to deploy composed solutions to realize its SOA vision.
      - Current testing focuses more on the level of service provided and less on how reliably the capability is delivered or whether it actually meets the need.
      - Different levels of testing are probably appropriate for different contexts (“static” versus “dynamic,” use frequency, loss-of-life consequences).
      - Automated environments are needed to test composed solutions targeted for rapid deployment.
  30. Pointers to more information
      - www.thedacs.com
        - Data & Analysis Center for Software: a repository of documents and tools in research areas including testing and reuse
      - www.peostri.army.mil/PRODUCTS/STARSHIP/
        - Starship II homepage
      - Search terms for the latest results: “composable systems,” “composability,” “web service testing,” “testing composable,” “composable C2”
  31. Composable C2 needs Formal Methods!
  32. QUESTIONS & COMMENTS
  33. ACRONYMS
      - C2 = Command & Control
      - C4ISR = Command, Control, Communications, Computers, Intelligence, Surveillance and Reconnaissance
      - DoD = Department of Defense
      - MTBF = mean time between failures
      - NCD = Net-Centric Diplomacy
      - PEO STRI = Program Executive Office for Simulation, Training & Instrumentation
      - SOA = service-oriented architecture
      - WSDL = Web Services Description Language
  34. BACKUPS
  35. Observations about compositions
      Technical realm:
      - Solutions are built from primitive and composite components and connectors.
      - Components and connectors can be described by interface and protocol specifications.
      - Common patterns provide abstractions we may be able to exploit in design, development, analysis, and testing.
      Semantic realm:
      - To deliver meaningful capability, the components must be composable with regard to their underlying ideas.
  36. “Lines of Evolution” vision for DoD systems
      1) A single system with a non-flexible hierarchical structure
      2) A system consisting of several independently functional but integrated components
      3) A capability is realized through a pre-defined orchestration of services
      4) Reusable, “mobile” services
      5) Services are orchestrated on an ad-hoc basis to deliver a capability... and then disappear
  37. Another view: stages of SOA adoption **
      1) Business Process Understanding: How is the work done?
      2) IT Assessment: What IT assets exist supporting the business process?
      3) SOA Design/Determination: What should be a service?
      4) SOA Enablement (Java EE, .NET, federated data services): How will application and data services be developed and deployed?
      5) Infrastructure (ESB, Registry, Management) and Governance: How will services, applications, and people interact and communicate?
      6) Process Orchestration/Composition: How will business processes and rules be developed and deployed?
      (Annotation on the original slide: “DoD is lurking around here”)
      ** Mark Driver, Optimizing Open Source and SOA Strategies, Gartner Application Architecture, Development & Integration Summit 2007, http://www.gartner.com/it/page.jsp?id=506878&tab=overview
