1. Evaluating Software Architectures
Stakeholders, Metrics, Results, Migration Strategies
Ingolf H. Krueger
ikrueger@ucsd.edu
Department of Computer Science & Engineering
University of California, San Diego
La Jolla, CA 92093-0114, USA

California Institute for Telecommunications and Information Technologies
La Jolla, CA 92093-0405, USA
2. Overview
• Background: Why Evaluate Software Architectures?
• Goal-Orientation
• Stakeholders and Architectural Views
• Evaluation Results
• Evaluation Methods and Metrics
• Migration Strategies
• Summary and Outlook
CSE/CAL·(IT)2
© Ingolf H. Krueger
March 06, 2003 2
3. The Role of SW-Architecture
• Bounds leeway for implementation
• Fosters (or impedes!) system quality
• Supports the critical system services
• Defines starting point for
– Change management
– Product families
– Division of work
4. Why Evaluate an Architecture?
• Stakeholders have different views/opinions on
– what the system does
– how the system should be structured
– what documentation should look like
– how the company/suppliers should conduct their business
– …
• Architecture Evaluation
– brings different stakeholders together
– forum for voicing concerns (can, but need not, be related to
the software architecture under consideration)
– forum for establishing common understanding of important
system aspects across development/business teams
– means for risk identification and analysis
5. Why Evaluate an Architecture?
• Find errors early: 55% of all errors are made during requirements capture and analysis, but less than 10% are detected there
• Find out if architecture is adequate wrt.
– desirable/mandatory requirements
– critical use cases
– anticipated changes/extensions
– budget constraints
• Find out if state of the art development and documentation
approaches were used
• Find interfaces for coupling existing architecture with legacy or
new neighboring systems
• Decide among several competing architectural alternatives
• Determine migration path towards target architecture
6. Overview
• Background: Why Evaluate Software Architectures?
• Goal-Orientation
• Stakeholders and Architectural Views
• Evaluation Results
• Evaluation Methods and Metrics
• Migration Strategies
• Summary and Outlook
7. Goal-Orientation
• First Step of Architecture Evaluation:
determine the specific goals for the system under
consideration
• Prioritize goals!
• What are the critical properties/requirements the
architecture must support?
• Is the architecture suitable wrt. these
goals/properties/requirements?
• Determine the points in the architecture that influence
the critical goals
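The goal-prioritization step above can be sketched as plain data; the goal names and decision points below are hypothetical examples, not taken from the slides:

```python
# A minimal sketch of recording prioritized system goals together with the
# points in the architecture that influence them (all names hypothetical).
from dataclasses import dataclass, field

@dataclass
class Goal:
    name: str
    priority: int                                  # 1 = highest
    decision_points: list = field(default_factory=list)

goals = [
    Goal("extensibility", 1, ["middleware interface", "service registry"]),
    Goal("short time-to-market", 2, ["use of off-the-shelf components"]),
]

# Walk the goals in priority order, most critical first.
for g in sorted(goals, key=lambda g: g.priority):
    print(g.name, "->", g.decision_points)
```

The point of the structure is that every goal carries its architectural "hot spots" with it, so the evaluation can focus discussion on exactly those points.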
8. Overview
• Background: Why Evaluate Software Architectures?
• Goal-Orientation
• Stakeholders and Architectural Views
• Evaluation Results
• Evaluation Methods and Metrics
• Migration Strategies
• Summary and Outlook
9. Who is involved?
Customer End User
Manager
Administrator
Marketing
Maintainer
Tester
Operator
Implementer
Evaluation Team
Architect
…
10. Architectural Aspects: What to Look at?
see also: [RUP98]
(Figure: architectural views and the aspects each addresses)
• logical view: functional requirements; models of structure and behavior; ...
• implementation view: source code organization; file structure; configuration information; ...
• process view: aspects of distribution and concurrency; response times; throughput; ...
• deployment view: mapping of executables to processors; system platform; installation; ...
• service view
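The view/concern pairing can be written down as a simple lookup table; the sketch below just transcribes the aspects named on this slide:

```python
# The architectural views from the slide (after [RUP98]) mapped to the
# aspects each view addresses, as plain data.
views = {
    "logical view": ["functional requirements",
                     "models of structure and behavior"],
    "implementation view": ["source code organization", "file structure",
                            "configuration information"],
    "process view": ["distribution and concurrency", "response times",
                     "throughput"],
    "deployment view": ["mapping of executables to processors",
                        "system platform", "installation"],
}

def concerns_for(view):
    """Look up which aspects a given architectural view addresses."""
    return views.get(view, [])
```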
11. Key Elements of Architecture Evaluation
• Determine goals for the evaluation
– why does it happen?
– who has an interest in it?
• Build understanding of the application domain
– where are the difficulties in building this and similar applications?
– are standard solutions available/used?
• Build domain model if it doesn’t exist already
• Determine and prioritize goals for the system/organization
• Identify and prioritize scenarios (“critical”, “medium”, “low”)
• Play through scenarios, record adequacy of the architectural decisions
• Discuss alternative architectures and migration strategies
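The scenario-prioritization and play-through steps can be sketched as follows; the scenario names and priorities are invented for illustration:

```python
# Hypothetical sketch of the scenario play-through step: scenarios are
# tagged "critical"/"medium"/"low", visited in priority order, and the
# adequacy of the architectural decisions is recorded per scenario.
PRIORITY_ORDER = {"critical": 0, "medium": 1, "low": 2}

scenarios = [
    ("add a new application service", "critical"),
    ("replace the GUI toolkit", "medium"),
    ("port to a new OS version", "low"),
]

findings = {}
for name, prio in sorted(scenarios, key=lambda s: PRIORITY_ORDER[s[1]]):
    # In a real evaluation this is a workshop discussion; here we only
    # record a placeholder verdict and free-form notes per scenario.
    findings[name] = {"priority": prio, "adequate": None, "notes": []}
```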
12. Overview
• Background: Why Evaluate Software Architectures?
• Goal-Orientation
• Stakeholders and Architectural Views
• Evaluation Results
• Evaluation Methods and Metrics
• Migration Strategies
• Summary and Outlook
13. Example: Architecture Evaluation
Build Understanding of the Existing Architecture
(Figure: layered view of the existing architecture, top to bottom: GUI; GUI-Coupling; Application, Legacy, and Server components; Middleware; Hardware Abstraction; Hardware)
14. Example: Architecture Evaluation
• Evaluation goals:
– Determine if the prototype architecture can be transferred to production
– Determine adequate means for architecture documentation
• Prioritized system goals:
– extensibility (applications/internal services)
– understandability
– short time-to-market
– support for emerging standards
• Domain model
• Alternative architectures
– use of off-the-shelf vs. proprietary middleware
– distribution of application “knowledge” between client and host of
service
15. Example: Results of Architecture Evaluation
• Overview documentation of the Software Architecture
– Short description of the system
– Domain model
– Essential use cases (including exceptions)
– Component structure
(possibly taken from the domain model)
– Interaction patterns
– Other relevant views
(process view, distribution view, ... – see above)
• Prioritized list of quality attributes / goals and rationale
• Rationale for the architecture’s adequacy
• Discussion of alternative architectures
• Risk/Cost/Tradeoff Analysis
16. Overview
• Background: Why Evaluate Software Architectures?
• Goal-Orientation
• Stakeholders and Architectural Views
• Evaluation Results
• Evaluation Methods and Metrics
• Migration Strategies
• Summary and Outlook
17. Evaluation Methods: Example
ATAM: Architecture Tradeoff Analysis Method*
• Presentation
1. Present ATAM
2. Present business drivers
3. Present architecture
• Investigation and Analysis
4. Identify architectural approaches
5. Generate quality-attribute utility tree
6. Analyze architectural approaches
• Testing
7. Brainstorm and prioritize scenarios
8. Analyze architectural approaches
• Reporting
9. Present the results
*P. Clements, R. Kazman, M. Klein: Evaluating Software Architectures. Methods and Case Studies, Addison Wesley, 2002
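The nine ATAM steps and their four activity groups, as listed above, can be captured as plain data, e.g. as a checklist for the evaluation team:

```python
# The nine ATAM steps from the slide, grouped by activity.
ATAM_STEPS = {
    "Presentation": ["Present ATAM",
                     "Present business drivers",
                     "Present architecture"],
    "Investigation and Analysis": ["Identify architectural approaches",
                                   "Generate quality-attribute utility tree",
                                   "Analyze architectural approaches"],
    "Testing": ["Brainstorm and prioritize scenarios",
                "Analyze architectural approaches"],
    "Reporting": ["Present the results"],
}

# Sanity check: the method defines exactly nine steps.
assert sum(len(steps) for steps in ATAM_STEPS.values()) == 9
```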
18. Evaluation Methods: Example
ATAM: Architecture Tradeoff Analysis Method*
(Figure, excerpt of a utility tree: the root node "Utility" branches into quality attributes such as Extensibility, Availability, and Usability; each attribute is refined into concrete, measurable scenarios, e.g. "offline-time/year < 5 sec." and "handle DOS attacks")
*P. Clements, R. Kazman, M. Klein: Evaluating Software Architectures. Methods and Case Studies, Addison Wesley, 2002
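A utility tree of this shape is naturally a nested structure; the sketch below assumes, for illustration, that the two example scenarios refine the Availability attribute:

```python
# A sketch of a quality-attribute utility tree: quality attributes under
# the root, each refined into measurable leaf scenarios. The assignment
# of the two scenarios to Availability is an assumption for illustration.
utility_tree = {
    "Extensibility": [],
    "Availability": ["offline-time/year < 5 sec.", "handle DOS attacks"],
    "Usability": [],
}

def scenarios(tree):
    """Flatten the tree into the list of concrete leaf scenarios."""
    return [leaf for leaves in tree.values() for leaf in leaves]
```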
19. Evaluation Methods: Example
ATAM: Architecture Tradeoff Analysis Method*
• Phase 0:
– Create evaluation team
– Form partnership between evaluation organization and customer
• Phase 1:
– Steps 1-6 (architecture-centric)
• Phase 2:
– Steps 7-9 (stakeholder-centric)
• Phase 3:
– Produce final report
– Plan follow-on actions
*P. Clements, R. Kazman, M. Klein: Evaluating Software Architectures. Methods and Case Studies, Addison Wesley, 2002
20. Watchlist*
Be wary if…
• Architecture follows customer’s organization
• Complex: too many components on every hierarchical
level (rule of thumb: ≤7±2)
• Unbalanced set of requirements
• Architecture depends on specifics of an operating system
• Standards and standard components neglected
• Architecture follows hardware design
• Inappropriate redundancy to cover indecision
*P. Clements, R. Kazman, M. Klein: Evaluating Software Architectures. Methods and Case Studies, Addison Wesley, 2002
21. Watchlist*
Be wary if…
• Exceptions drive architecture
• No architect exists
• Stakeholders difficult to identify
• Architecture = class diagrams
• Outdated documentation
• Disparity in perception of the architecture between
developers and architects
*P. Clements, R. Kazman, M. Klein: Evaluating Software Architectures. Methods and Case Studies, Addison Wesley, 2002
22. Metrics: Measure and Control Progress and Quality
• “You can’t control what you can’t measure”
– is this true for software development?
• Metrics define how characteristics of a software system are
measured
• Examples:
– interface complexity
(# ports/component, # messages/protocol, …)
– structural complexity
(# components/modules, # components/hierarchical level, # levels, #
data classes, height of inheritance tree…)
– behavioral complexity
(# calls to other components, # states, # transitions, # synchronization
points, # choice-points, # loops, …)
– test complexity
(# branches covered, # boundary conditions covered, # data values
covered, # loop iterations covered, …)
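Two of the structural metrics listed above can be computed mechanically from a component model; the component names and the model shape below are hypothetical:

```python
# Hypothetical sketch: computing "# components per hierarchical level"
# from a simple parent -> children component model, and flagging levels
# that exceed the 7±2 rule of thumb mentioned in the watchlist.
architecture = {
    "system": ["gui", "application", "middleware", "hardware_abstraction"],
    "application": ["billing", "reporting", "legacy_adapter"],
}

def components_per_level(arch):
    """Structural-complexity metric: number of components per level."""
    return {parent: len(children) for parent, children in arch.items()}

def overloaded_levels(arch, limit=9):       # 7 + 2
    """Levels whose component count exceeds the rule-of-thumb limit."""
    return [p for p, n in components_per_level(arch).items() if n > limit]
```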
23. Metrics: Measure and Control Progress and Quality
• Metrics serve as tools for complexity estimation
• Different techniques for complexity estimation can differ widely in
their predictions for the same system
• Initial estimates are almost always wrong
– iterative updating required
• Use of prototypes to determine adequacy may be preferable
24. Overview
• Background: Why Evaluate Software Architectures?
• Goal-Orientation
• Stakeholders and Architectural Views
• Evaluation Results
• Evaluation Methods and Metrics
• Migration Strategies
• Summary and Outlook
25. Migration Strategies
• What is the improvement?
• What will it cost?
• What qualifications are needed?
• How long will it take?
• What are the intermediate stages?
• …
(Figure: the existing layered architecture from slide 13 next to the target architecture: GUI; GUI-Coupling Services, Application Services, and Legacy Systems side by side; Middleware; Hardware Abstraction; Hardware)
26. Migration Strategies
• Consider sequence of manageable, smaller migration
steps as opposed to one “big bang”
– depends on application context
– influential factors:
time-to-market, competitive advantage, maintenance pressures
• Sketch out architectures of intermediate steps
• Discuss “top” scenarios for each intermediate
architecture
• Balance
– cost and benefit
– available technologies/standards and product delivery schedules
– capabilities of development team and ambition
– clarity and performance
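The idea of a sequence of manageable steps, each balanced for cost and benefit, can be sketched as follows; the step names and numbers are invented for illustration:

```python
# A sketch of comparing intermediate migration stages by accumulated
# cost and benefit (all step names and figures are hypothetical).
steps = [
    {"name": "introduce middleware layer", "cost": 3, "benefit": 2},
    {"name": "wrap legacy systems as services", "cost": 2, "benefit": 3},
    {"name": "split application into services", "cost": 4, "benefit": 4},
]

def cumulative(steps):
    """Running (name, total cost, total benefit) after each stage."""
    total_cost = total_benefit = 0
    out = []
    for s in steps:
        total_cost += s["cost"]
        total_benefit += s["benefit"]
        out.append((s["name"], total_cost, total_benefit))
    return out
```

Unlike a "big bang", each intermediate stage delivers part of the benefit early and can be re-evaluated against the "top" scenarios before the next step is taken.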
27. Overview
• Background: Why Evaluate Software Architectures?
• Goal-Orientation
• Stakeholders and Architectural Views
• Evaluation Results
• Evaluation Methods and Metrics
• Migration Strategies
• Summary and Outlook
28. Summary and Outlook
• Architecture Evaluation
– brings different stakeholders together
– increases understanding of goals and their architectural
(mis)representation across team boundaries
– exposes risks
• Result should provide
– Overview documentation of the Software Architecture
– Prioritized list of quality attributes / goals and rationale
– Rationale for the architecture’s adequacy
– Discussion of alternative architectures
– Risk/Cost/Tradeoff Analysis
• Metrics aid in measuring and controlling progress only if estimates are regularly updated and compared with reality
• Migration strategies enable the transition from the current architecture to the target architecture
– orient along goals
– “big bang” vs. sequence of small steps