Software Architecture Review Board for Flight Systems: Presentation Transcript

  • 1. NASA Project Management Challenge 2010, Feb. 9-10, 2010
    Software Architecture Review Board for Flight Systems
    Daniel Dvorak, JPL Systems & Software Division, SARB Team Lead, Daniel.L.Dvorak@jpl.nasa.gov, 818.393.1986
    Michael Aguilar, NESC Software Discipline Expert, NASA Technical Fellow in Software, Michael.L.Aguilar@nasa.gov, 301.388.0156
    12/02/2009 Software Architecture Review Board Used with Permission 1
  • 2. Motivating Questions
    • What is the purpose of a software architecture review?
    • What kinds of problems are found?
    • What are the benefits?
    • What is software architecture, anyway?
    • How do you evaluate architecture?
    • Are there impediments to good software architecture within NASA?
    • What’s the relationship between systems architecture and software architecture?
    Software Architecture Review Board 2
  • 3. Overview
    Goals
    • Help NASA missions achieve higher reliability and cost savings
    • Manage flight software complexity through better software architecture
    Approach
    • Create a NASA-wide review process and software architecture review board (SARB)
    • Engage with flight projects in the formative stages of software architecture
    Plan
    • Prepare introductory document, review checklist, sample problem statement, and sample report
    • Educate team on process
    • Practice on flown missions
    • Conduct real reviews
    Software Architecture Review Board 3
  • 4. Outline
    • Origin of this task
    • Architecture reviews: history and benefits
    • Architecture description
    • Architecture review process
    • Issues found in architecture reviews
    • How you can help
    Software Architecture Review Board 4
  • 5. Origin
    • NASA OCE study on flight software complexity involved flight software engineers from GSFC, JPL, JSC, MSFC, and APL
    • The study recommended formation of a NASA software architecture review board
    • Why? Because …
      – It saves projects time and money (AT&T and Lucent experience)
      – Weak software architecture is a contributor to problems on NASA missions (though rarely recognized as such)
      – Good architecture is the best defense against unnecessary complexity. “Point of view is worth 80 IQ points” (Alan Kay)
    Software Architecture Review Board 5
  • 6. NESC Support
    • NASA Chief Engineer Michael Ryschkewitsch was briefed on findings and recommendations of the Flight Software Complexity study on 4/23/2009
    • One recommendation was to establish a NASA software architecture review board (SARB)
    • Michael Aguilar, NESC Software Discipline Expert and NASA Technical Fellow in Software, volunteered to support the board as a technical discipline team (TDT)
    • (Other recommendations are being followed up on. For example, static analysis of software code is now a NASA requirement.)
    Software Architecture Review Board 6
  • 7. Systems vs. Software Architecture
    • Architecture provides a unifying vision
    • Systems architecture is comprehensive: flight and ground systems, spacecraft instruments and subsystems, hardware, software, etc.
    • Software controls most of the system behavior
    • Thus, the architecture of behavior is in the software architecture
    • Note: Growth in size and complexity of flight software is a reflection of its role in meeting increasingly ambitious mission & system requirements
    Software Architecture Review Board 7
  • 8. Architecture Reviews
    • Description
    • History
    • Benefits
  • 9. What is an Architecture Review?
    “Architectural Reviews are formal reviews held to evaluate how well a proposed architecture meets the needs and operational concept of the system under development. By focusing on the architecture and ops concepts, they identify mismatches early in the life cycle. An architectural review board or panel (an expert, non-advocate group) usually conducts the review.”
    Best Practices Clearinghouse, Defense Acquisition University, bcph.dau.mil
    • DAU lists architecture reviews as a best practice having the most supporting evidence
    Software Architecture Review Board 9
  • 10. History: Software Architecture Reviews
    AT&T Bell Labs was developing software-intensive systems for telephony in the 1960’s. By the 1990’s AT&T had a standing Architecture Review Board that examined proposed software architectures for projects, in depth, and pointed out problem areas for rework.
      – The board members were experts in architecture & system analysis
      – They could spot common problems a mile away
      – The review was invited and the board provided constructive feedback
      – It helped immensely to avoid big problems
    Software Architecture Review Board 10
  • 11. History: Benefits of Architecture Reviews (1 of 2)
    • “Architecture reviews tend to increase quality, control cost, and decrease budget risk.” [Bass, Clements, and Kazman, Software Architecture in Practice, 1998]
    • “In our experience, the average [architecture] review pays back at least twelve times its cost.” [Daniel Starr and Gus Zimmerman, STQE Magazine, July/August 2002]
    • Beneficial side effects:
      – Cross-organizational learning is enhanced
      – Architectural reviews get management attention without personal retribution
      – Architectural reviews assist organizational change
      – Greater opportunities exist to find different defects in integration and system tests
      [Maranzano et al, IEEE Software, March/April 2005]
    Software Architecture Review Board 11
  • 12. History: Benefits of Architecture Reviews (2 of 2)
    “Project teams found that the preparation for the review helped them get a clearer understanding of their projects. In several reviews the people on the project team asked more questions than the reviewers. Often, this was one of the few opportunities for the project team members to have in-depth discussions of the technical issues about the project. The review served as a catalyst to bring them together.”
    AT&T: “Best Current Practices: Software Architecture Validation”, 1991
    “The architecture review process has helped train people to become better architects and helped establish a consistent view across our companies, both of what architecture is and what good architecture’s characteristics are.”
    Maranzano et al, “Architecture Reviews: Practice and Experience”, IEEE Software, March/April 2005
    Software Architecture Review Board 12
  • 13. Mission & Charter of SARB
    Mission: Manage flight software complexity through better software architecture
    Charter
    • Provide constructive feedback to flight projects in the formative stages of software architecting
    • Focus on architectural improvements to reduce and/or better manage complexity in requirements, analysis, design, implementation, verification, and operations
    • Spread best architectural practices, principles, and patterns across flight software centers
    • Contribute to NASA Lessons Learned
    Software Architecture Review Board 13
  • 14. Architecture Description
    • Architecture reviews help carry out our mission
    • A review requires an architecture description
    • So …
      – What is architecture?
      – How should an architecture be described?
  • 15. What is an Architecture Description?
    IEEE 1471-2000: Recommended Practice for Architecture Description of Software-Intensive Systems
    • Every system …
      – has an architecture (documented or not)
      – has stakeholders
    • Every stakeholder has concerns
    • An architecture is described by an architecture description
    • An architecture description …
      – identifies stakeholders
      – is organized by views that address stakeholders’ concerns
      – provides rationale
    Software Architecture Review Board 15
  • 16. Conceptual Framework for Architecture
    ANSI/IEEE 1471-2000: Recommended Practice for Architecture Description of Software-Intensive Systems
    [Slide shows the standard’s conceptual-framework class diagram: a System inhabits an Environment, fulfills one or more Missions, and has an Architecture; the Architecture is described by an Architecture Description, which provides rationale, identifies the system’s Stakeholders, selects Viewpoints addressing their Concerns, and is organized by Views, each of which conforms to a Viewpoint. A sketch of these relationships appears below.]
    Software Architecture Review Board 16
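    To make the relationships on slide 16 concrete, here is a minimal, illustrative sketch in Python (not part of the original presentation). The class and field names are assumptions chosen for this example; IEEE 1471 defines concepts, not code.

    ```python
    # Illustrative sketch of the IEEE 1471-2000 conceptual framework (slide 16).
    # All names below are assumptions for this example, not part of the standard.
    from dataclasses import dataclass, field
    from typing import List


    @dataclass
    class Concern:
        """An interest a stakeholder has in the system (e.g., operability, verifiability)."""
        description: str


    @dataclass
    class Stakeholder:
        """Anyone with an interest in the system; every stakeholder has concerns."""
        name: str
        concerns: List[Concern] = field(default_factory=list)


    @dataclass
    class Viewpoint:
        """Identifies the concerns to be addressed and the modeling conventions used to address them."""
        name: str
        covered_concerns: List[Concern] = field(default_factory=list)
        modeling_techniques: List[str] = field(default_factory=list)


    @dataclass
    class View:
        """The result of applying a viewpoint to a particular system; conforms to that viewpoint."""
        viewpoint: Viewpoint
        content: str  # models, diagrams, or text produced per the viewpoint's conventions


    @dataclass
    class ArchitectureDescription:
        """Identifies stakeholders, is organized by views, and provides rationale."""
        system: str
        stakeholders: List[Stakeholder]
        views: List[View]
        rationale: str


    if __name__ == "__main__":
        # Hypothetical usage: one stakeholder, one concern, one viewpoint, one view.
        operability = Concern("Spacecraft must be operable by a small flight-ops team")
        ops_team = Stakeholder("Mission operations", [operability])
        ops_viewpoint = Viewpoint("Operational", [operability], ["activity diagrams", "command/telemetry dictionaries"])
        ops_view = View(ops_viewpoint, "Uplink/downlink flow and fault-response behavior ...")
        ad = ArchitectureDescription(
            system="Example flight software",
            stakeholders=[ops_team],
            views=[ops_view],
            rationale="Views chosen to cover stakeholder concerns per IEEE 1471",
        )
        print(f"{ad.system}: {len(ad.views)} view(s) addressing {len(ad.stakeholders)} stakeholder(s)")
    ```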
  • 17. Scope of Architectural Concerns
    • Requirements Complexity: Architecture should address non-functional rqmts
      – performance, availability, maintainability, modifiability, security, testability, operability, etc.
      – Watch for unsubstantiated or ambiguous rqmts (see the sketch after this slide)
    • System-Level Analysis & Design: Architecture should address analyzability
      – “Point of view is worth 80 IQ points”
    • Flight Software Complexity: Architecture should address principles of design
      – Identify and follow architectural principles
      – Leverage appropriate architectural patterns
    • Verification & Validation Complexity: Architecture should address verifiability
      – Design can simplify or complicate verification
    • Operations Complexity: Architecture should address operability
      – Inadequate design complicates operations
      – Operational workarounds raise risk
    Software Architecture Review Board 17
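    To illustrate the difference between an ambiguous non-functional requirement and one a reviewer can actually check, here is a small hedged sketch (not from the presentation). The attribute names, thresholds, and example values are all invented for this example; the point is only that a quality-attribute requirement stated with a testable figure of merit can be mapped to the architecture and verified.

    ```python
    # Illustrative only: a quality-attribute requirement expressed with a testable
    # figure of merit. All field names, thresholds, and values are invented.
    from dataclasses import dataclass


    @dataclass
    class QualityRequirement:
        attribute: str      # e.g., "availability", "operability"
        stimulus: str       # scenario under which the requirement applies
        measure: str        # the figure of merit (what gets measured)
        threshold: float    # testable pass/fail value
        units: str

        def is_met(self, observed: float) -> bool:
            """A requirement stated this way can be verified against an observation."""
            return observed <= self.threshold


    # Ambiguous form (hard to review): "The flight software shall recover from faults quickly."
    # Testable form:
    fault_recovery = QualityRequirement(
        attribute="availability",
        stimulus="single recoverable fault during cruise",
        measure="time to restore nominal command/telemetry",
        threshold=60.0,
        units="seconds",
    )

    print(fault_recovery.is_met(observed=45.0))   # True: figure of merit satisfied
    print(fault_recovery.is_met(observed=120.0))  # False: requirement not met
    ```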
  • 18. What is Architecture?
    “Architecture is the fundamental organization of a system, embodied in its components, their relationships to each other and the environment, and the principles governing its design and evolution.”
    IEEE Standard 1471: Recommended Practice for Architectural Description of Software-Intensive Systems
    • where:
      – fundamental = essential, unifying concepts and principles
      – system = application, system, platform, system-of-systems, enterprise, product line, …
      – environment = developmental, operational, programmatic, … context
    Software Architecture Review Board 18
  • 19. Two Notions of “Architecture”
    • architecture — What gets built
      – Describing components and interfaces
      – The details of assembly and integration
    • Architecture — Why it gets built the way it does
      – Identifying properties of interest beyond just the requirements, and from all essential points of view
      – Defining abstractions and other patterns of design that give the design shape and reflect fundamental concepts of the domain
      – Guiding design and maintaining principles throughout lifecycle
      – Building on a body of experience and refining concepts as necessary
    Architecting is about managing complexity
    Source: Bob Rasmussen, JPL
    Software Architecture Review Board 19
  • 20. The Review Process
    • AT&T Review Process
      • Principles
      • Participants
      • Implementation
      • Artifacts
    • Issues found
  • 21. What review process to use?
    • For reviews within NASA, expert opinion favors the AT&T process over ATAM
      – David Garlan (CMU), Gus Zimmerman (Lucent), Katy Weiss (JPL)
    • Reasons given in favor of the AT&T process:
      – Review team is composed of external experts, whereas the ATAM review team is all stakeholders (potentially biased)
      – Better transfer: encourages cross-fertilization of ideas/standards/approaches across the organization
      – Better residuals: reusable checklists, standard templates, shared understandings about common architectural approaches
      – ATAM is not quantitative enough to characterize quality attribute requirements with testable figures of merit, and then map those testable requirements directly into the structures in the architecture that accomplish them
      – ATAM prioritizes scenarios by voting among all stakeholders, thus allowing a serious issue, identified by an expert, to be ranked low
    Software Architecture Review Board 21
  • 22. AT&T Review Process: Principles
    • A clearly defined problem statement (success criteria) drives the system architecture
      – The project writes it and the review team uses it
      – Often requires iteration before it’s good enough
    • Product line and business application projects require a system architect at all phases
      – Architect is an explicitly defined role
    • Independent experts conduct reviews
      – Chosen reviewers are recognized experts and are independent of the project
    • Reviews are open processes
      – Open approach to identifying issues and strengths
      – Issues written on 5x8 “snow cards” and displayed on wall
    • Conduct reviews for the project’s benefit
      – Reactions to issues are project management responsibility
      – Review team does not discuss issues later without project consent
    Software Architecture Review Board 22
  • 23. AT&T Review Process: Participants
    • Review client
      – Pays for system development, or is review’s sponsor
    • Project members
      – Creators, contributors, and users of the architecture
      – One member selected to be contact for logistics and artifacts
    • Project management
      – All managers responsible for project’s success
      – Management nominates project for review
    • Review team
      – Subject matter experts selected on basis of expertise, independence from project, and interpersonal skills
    • Architecture review board
      – Standing board that oversees and adjusts review process
    • Review angel
      – ARB member with managerial experience to address political issues
    • ARB chair
      – Strong process advocate; ensures effectiveness; secures support
    Software Architecture Review Board 23
  • 24. AT&T Review Process: Implementation
    • Phase 1: Screening
      – Project and ARB determine if a review would benefit the project
      – If so, ARB selects a review angel
    • Phase 2: Preparation
      – ARB selects a review team, including a review leader
      – Project prepares a clear problem statement (success criteria) and docs
    • Phase 3: Review Meeting
      – Typical review is 2 days plus ½ day of caucus plus a 1-hour readout
      – Issues prioritized: management alert, critical, major, minor
    • Phase 4: Follow-Up
      – Review team delivers report to project within 15 days
      – If there are management alert(s), project must respond within two weeks
    Software Architecture Review Board 24
  • 25. AT&T Review Process: Artifacts
    • Architecture review checklist
      – Questions that architects and reviewers should consider
      – Evolves over time according to serious and prevalent issues
      – “An accumulated institutional knowledge repository”
    • Input to the review
      – Problem statement (success criteria), system requirements, functional requirements, architectural specification, other informational docs
    • Output from the review
      – Set of issues (often in the form of snow cards)
      – A review report
      – Optional management alert letter
    Software Architecture Review Board 25
  • 26. Kinds of Issues Found (AT&T, Lucent)
    • Problem Definition (10-18%): Problem isn’t completely or clearly defined
    • Product Architecture and Design (29-49%): Proposed solution doesn’t adequately solve the problem
    • Technology (3-14%): Inadequate languages, tools and/or components to build system
    • Domain Knowledge (2-5%): Team lacks adequate knowledge or experience
    • Process (4-19%): Process isn’t systematic, complete, or designed to make development manageable
    • Management Controls (14-26%): Inadequate management monitoring capabilities, staffing, controls, and decision-making mechanisms
    Software Architecture Review Board 26
  • 27. Example Issues from within NASA
    • Boxes-and-lines diagrams lack clear semantics
    • Flight software design details that are unnecessarily coupled to hardware details
    • Lots of software “gadgets” but little in the way of abstractions tailored for the problem domain
    • Excessive cross-strapping of hardware that complicates software without much reliability benefit
    • Underestimation of time needed to adequately test redundancy management
    • Fault protection design that doesn’t scale well
    • Fault protection designed only to handle a laundry list of faults; lack of defensive mindset
    Software Architecture Review Board 27
  • 28. Impediments to Software Architecture within NASA
    • Inappropriate modeling techniques
      – “Software architecture is just boxes and lines”
      – “Software architecture is just code modules”
      – “A layered diagram says it all”
    • Misunderstanding about role of architecture in product lines and architectural reuse
      – “A product line is just a reuse library”
    • Impoverished culture of architecture design
      – No standards for arch description and analysis
      – Architecture reviews are not productive
      – Architecture is limited to one or two phases
      – Lack of architecture education among engineers
    • Failure to take architecture seriously
      – “We always do it that way. It’s cheaper/easier/less risky to do it the way we did it last time.”
      – “They do it a certain way ‘out there’ so we should too.”
      – “We need to reengineer it from scratch because the mission is different from all others.”
    As presented by Prof. David Garlan (CMU) at NASA Planetary Spacecraft Fault Management Workshop, 4/15/08
    Software Architecture Review Board 28
  • 29. What an architecture review is NOT
    An architecture review is …
    • not a gate, not a mandatory review (in AT&T)
    • not a pass/fail judgment
    • not an audit for a cancellation decision
    • not an evaluation of architect’s performance
    • not a tutorial
    • not a code review
    Software Architecture Review Board 29
  • 30. Timeline of SARB Activities
    • April 2009: Mike Ryschkewitsch approves recommendation for SARB; Mike Aguilar funds it as a NESC Technical Discipline Team
    • May 2009: Kickoff telecon with FSW Complexity team + Mike Aguilar
    • June 2009 – Jan. 2010: Team formation and preparation phase
      – prepare charter
      – select review board members
      – define review and reporting process
      – identify focus areas for evaluation
      – get exposure to several flight software architectures
      – conduct mock reviews
    • Feb. 2010: Conduct first real review (goal)
    Software Architecture Review Board 30
  • 31. Current Team Members (as of 12/3/2009)
    • Michael Aguilar – OCE/NESC
    • Dan Dvorak – JPL
    • Michel Ingham – JPL
    • Lou Hallock (leaving) – GSFC
    • Michael Blau (coming) – GSFC
    • John Weir – MSFC
    • Leann Thomas (observing) – MSFC
    • Helen Neighbors – JSC
    • Kevin Balon – APL
    • Steve Williams (observing) – APL
    Software Architecture Review Board 31
  • 32. How You Can Help
    Projects:
    • Identify candidate projects for review
    People:
    • Identify subject matter experts who may serve on reviews
    • Suggest potential review board members
    Architectural Concerns:
    • Improve our architecture checklist
      – Offer your strongly felt architectural issues
      – Negative examples are among the best teachers
    Software Architecture Review Board 32
  • 33. Questions? “Great goulash, Stan. That reminds me, are you still in charge of our system architecture?” Software Architecture Review Board 33
  • 34. Backup
    • What about ground software?
    • SARB telecon topics
    • “Early warning signs”
    • Mindset for reviews
    • Open questions
  • 35. What About Ground Software?
    • Initial focus is on flight software because the recommendation came from a FSW study
    • Ideally, GSW should be included, but …
      – Architectural concerns will be somewhat different
      – Need different expertise on the review board
      – Need additional funds
    • One way to “ease into” GSW is to review:
      – End-to-end information flow
      – Flight/ground operations paradigm and uplink/downlink
    Software Architecture Review Board 35
  • 36. SARB Telecon Topics (1 of 2)
    • 5/18/09: Pre-planning: NESC intro, identify board members and projects
    • 6/01/09: Kickoff: charter, questions, speakers
    • 6/08/09: Handbook for real-time analysis (Mike Aguilar)
    • 6/15/09: Architecture review checklist (Lou Hallock)
    • 6/22/09: Architecture docs and review process (Katy Weiss)
    • 6/29/09: Altair FSW architecture study (Lore Williams)
    • 7/13/09: General discussion of problems seen in FSW
    • 7/20/09: cFE/CFS software architecture (Charlie Wildermann)
    • 7/27/09: Observations on weak s/w architecture (Bob Rasmussen)
    • 8/10/09: Space Shuttle Main Engine: s/w architecture (Andy Young)
    • 8/17/09: Examples of weak s/w architecture (all)
    • 8/24/09: Discuss architecture review checklist (Lou Hallock)
    Software Architecture Review Board 36
  • 37. SARB Telecon Topics (2 of 2)
    • 8/31/09: Overview of IEEE 1471-2000 “Recommended Practice for Architectural Description of Software-Intensive Systems” (Jeff Estefan)
    • 9/21/09: Ares I upper stage flight computer software architecture (J. Weir)
    • 10/05/09: Four documents we need to prepare (Dan Dvorak)
    • 10/26/09: The architecture “problem statement” (Ken Costello)
    • 11/02/09: Revisiting the architecture review checklist (Lou Hallock)
    • 11/30/09: SDO Software Architecture (Manuel Maldonado, Mark Walters)
    • 12/14/09: Architecture review overview document (H. Neighbors, J. Weir)
    • TBD: MLAS software architecture (Mike Aguilar)
    • TBD: Experiences in adapting cFE/CFS for LRO (Mike Blau)
    • TBD: MSAP Software Architecture (David Hecox or Bob Denise)
    • TBD: Architectural issues in fault protection (Dan Dvorak)
    Software Architecture Review Board 37
  • 38. What to look for: Early Warning Signs (1 of 2)
    1. The architecture is forced to match the current organization.
    2. Top-level architectural components number more than 25.
    3. One requirement drives the rest of the design.
    4. The architecture depends on alternatives in the OS.
    5. Proprietary components are used when standard components would do.
    6. Component definition comes from the hardware division.
    7. There is redundancy not needed for reliability (e.g., two databases, two start-up routines, two error-locking procedures).
    8. The design is exception driven; that is, the emphasis is on extensibility and not core commonalities.
    9. Development unit is unable to identify a system architect.
    Software Architecture Review Board 38
  • 39. What to look for: Early Warning Signs (2 of 2)
    10. Difficulty identifying the architecture’s stakeholders
    11. The architecture provides no guidance on how to code an interaction between two components (silent or a plethora of choices)
    12. Architecture documentation consists of class diagrams and nothing else
    13. Architecture documentation is a large stack of documents automatically produced by some tool, but which no human has ever seen
    14. Documents provided are old, not kept up to date
    15. A designer or developer, when asked to describe the architecture, is either unable to or describes a much different architecture than what the architect presented
    (from “Evaluating Software Architectures”, by Clements, Kazman, and Klein, 2002)
    Software Architecture Review Board 39
  • 40. Mindset for Reviews
    • Architecture addresses non-functional requirements
      – Quality attributes such as verifiability, operability, maintainability, interoperability, adaptability, reusability, scalability, portability, etc.
    • Architecture is about principles
      – “Software architecture involves the description of elements from which systems are built, interactions among those elements, patterns that guide their composition, and constraints on these patterns.” [Shaw & Garlan, 1996]
      – Fred Brooks writes emphatically that a system’s conceptual integrity is of overriding importance and that systems without it fail [Brooks 1975, Bass 2003]
    • A major goal of a review is to elicit architectural risks
    • Architecture is more than high-level design
    Software Architecture Review Board 40
  • 41. Open Questions
    • Should the board’s scope include aeronautics in addition to aerospace?
    • How do we measure effectiveness of reviews?
    • What kind of follow-up should we do with projects?
    • Should architecture reviews be made part of standard project reviews?
    • How do we keep findings confidential to projects but also report upward to OCE/NESC?
    Software Architecture Review Board 41
  • 42. About Software Architecture: Definitions
    • Software architecture = {elements, form, rationale} [Perry & Wolf, 1992]
    • “Software architecture involves the description of elements from which systems are built, interactions among those elements, patterns that guide their composition, and constraints on these patterns.” [Shaw & Garlan, 1996]
    • “An architecture is a specification of the components of a system and the communication between them. Systems are constrained to conform to an architecture.” [Luckham et al, 1995]
    • “An architecture is … a set of constraints on a design. The utility of an architecture is its ability to simplify the process of generating a design through the imposition of carefully chosen constraints.” [Gat, 1998]
    Software Architecture Review Board 42
  • 43. Architecture versus Design
    • Architecture is design, but not all design is architectural.
      – Architects intentionally limit their focus and avoid the details of how elements do what they do.
        • Detailed designs and implementation details are left to downstream engineers/experts.
      – Downstream engineers are expected to respect the architecture to ensure properties promised by the architect are present in the product.
    Source: Anthony J. Lattanze and David Garlan
    Software Architecture Review Board 43
  • 44. Two Sources of Software Complexity
    Sw complexity = Essential complexity + Incidental complexity
    • Essential complexity
      – comes from problem domain and mission requirements
      – Can reduce it only by descoping
      – Can move it (e.g. to ops), but can’t remove it
    • Incidental complexity
      – comes from choices about architecture, design, implementation, including avionics
      – Can reduce it by making wise choices
    Software Architecture Review Board 44
  • 45. Miscellaneous
    • Architecture is concepts, patterns, interaction, and principles of design − the things you care so much about that they are the last thing you would change. They guide the design.
    • There’s a real opportunity for the SARB to document and spread good design patterns and practices
    • Software and hardware must support the system architecture
      – Historically, software has been made to conform to the selected hardware
      – Now, due to growth in software complexity, hardware decisions should be balanced with impacts on software
    Software Architecture Review Board 45
  • 46. Summary of IEEE 1471
    • An architecture should address a system’s stakeholders’ concerns
    • Architecture descriptions are inherently multi-view (no single view adequately captures all stakeholder concerns)
    • It separates the notion of view from viewpoint
      – a viewpoint identifies the set of concerns and the representations/modeling techniques, etc. used to describe the architecture to address those concerns
      – a view is the result of applying a viewpoint to a particular system
    • It establishes content requirements for architecture descriptions
    • It provides guidance for capturing architecture rationale
    Software Architecture Review Board 46