Capitol Tech U Doctoral Presentation - April 2024.pptx
1. Software Management Renaissance
Content
• Conventional Software Management:
  – The waterfall model
  – Conventional software management performance
• Evolution of Software Economics:
  – Software economics
  – Pragmatic software cost estimation
• Improving Software Economics
• The Old Way and the New:
  – The principles of conventional software engineering
  – Principles of modern software management
  – Transitioning to an iterative process
Dr. P. KUPPUSAMY
2. Three Analyses of Conventional Software
1. Software development is still highly unpredictable. Only about
10% of software projects are delivered successfully within initial
budget and schedule estimates.
2. Management discipline is more important in success or failure
than technology.
3. The level of software scrap and rework is revealing of an
immature process.
3. Conventional Software Model - Waterfall Model
System requirements → Software requirements → Analysis → Program design → Coding → Testing → Operation and maintenance
4. Waterfall Model
• Requirements (software & hardware): defines the needed information, functions, behavior, performance, and interfaces.
• Design: data structures, software architecture, interface representations, algorithmic details.
• Coding: source code, database, user documentation.
• Testing: exercise the designed software with suitable test data.
5. Waterfall Strengths
• Easy to understand, easy to use
• Provides structure to inexperienced staff
• Milestones are well understood
• Sets requirements stability
• Good for management control (plan, staff, track)
• Works well when quality is more important than cost or schedule
6. Waterfall Deficiencies
• All requirements must be known up front
• Deliverables created for each phase are considered frozen, which inhibits flexibility
• Can give a false impression of progress
• Does not reflect the problem-solving nature of software development, in which phases must be iterated
• Integration is one big bang at the end
• Little opportunity for the customer to preview the system
7. When to use the Waterfall Model
• Requirements are very well known
• Product definition is stable
• Technology is understood
• New version of an existing product
• Porting an existing product to a new platform
8. Conventional Software Management Performance
1. Finding and fixing a software problem after delivery costs 100 times more than finding and fixing it during the early design phases.
2. You can compress a software development schedule by up to 25% of nominal, but no more.
3. For every $1 you spend on development, you will spend $2 on
maintenance.
4. Software development and maintenance costs are primarily a
function of the number of source lines of code.
9. Conventional Software Management Performance
5. Variations among people account for the biggest differences in software productivity.
6. The overall ratio of software to hardware costs is still
growing. In 1955 it was 15:85; in 1985, 85:15.
7. Only about 15% of software development effort is
devoted to programming.
8. Walkthroughs catch 60% of the errors.
9. 80% of the contribution comes from 20% of contributors.
10. Improvements to the basic waterfall model that would eliminate most of the development risks are as follows:
• Complete program design before analysis and coding begin (program design comes first).
• Maintain current and complete documentation (document the design).
• Do the job twice, if possible (do it N times).
• Plan, control, and monitor testing.
• Involve the customer.
11. In Practice: Expenditures by Activity
• Management: 5%
• Requirements: 5%
• Design: 10%
• Code and unit testing: 30%
• Integration and test: 40%
• Deployment: 5%
• Environment: 5%
14. In Practice: Late Risk Resolution
• There are four distinct periods of risk exposure, where risk is defined as "the probability of missing a cost, schedule, feature, or quality goal".
15. Requirements-Driven Functional
Decomposition
• The software development process has been requirements-driven.
• This approach depends on specifying requirements completely and clearly before other development activities begin.
• It treats all requirements as equally important.
• Specification of requirements is a difficult and important part of the software development process.
17. Adversarial Stakeholder Relationships
• The contractor prepared a draft contract-deliverable document that captured an intermediate artifact and delivered it to the customer for approval.
• The customer was expected to provide comments (within 15 to 30 days).
• The contractor integrated these comments and submitted a final version for approval (within 15 to 30 days).
18. Adversarial Stakeholder Relationships
• Project stakeholders:
  – the project sponsor and project team
  – support staff
  – customers
  – users
  – suppliers
  – opponents to the project
24. Evolution of Software Economics
• Most software cost models can be abstracted into a function of five basic parameters:
  – Size: the size of the end product (in human-generated components), typically quantified in terms of the number of source instructions or the number of function points required to develop the required functionality.
  – Process: the process used to produce the end product, in particular the ability of the process to avoid non-value-adding activities (rework, bureaucratic delays, communications overhead).
  – Personnel: the capabilities of the software engineering personnel, particularly their experience with both the computer science issues and the application domain issues of the project.
  – Environment: the tools and techniques available to support efficient software development and to automate the process.
  – Quality: the required quality of the product, including its features, performance, reliability, and adaptability.
• The relationship between these parameters and the estimated cost is:
  Effort = (Personnel)(Environment)(Quality)(Size^Process)
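The abstract relationship above can be sketched in code. This is an illustrative sketch only: the function name, the multiplier values, and the exponents are invented for demonstration (calibrated models such as COCOMO derive them from historical project data). The key point is that Size appears as the base of an exponent, so the process exponent controls whether cost scales better or worse than linearly with size.

```python
def estimate_effort(size, process_exponent, personnel, environment, quality):
    """Abstract cost-model form from the slide:
    Effort = (Personnel)(Environment)(Quality)(Size ** Process).

    An exponent > 1.0 models a diseconomy of scale (effort grows
    faster than size); < 1.0 models an economy of scale.
    """
    return personnel * environment * quality * (size ** process_exponent)

# Hypothetical comparison: the same 100-unit product under an immature
# process (exponent 1.2) vs. a more mature one (exponent 1.05).
immature = estimate_effort(100, 1.2, personnel=1.0, environment=1.0, quality=1.0)
mature = estimate_effort(100, 1.05, personnel=1.0, environment=1.0, quality=1.0)
print(round(immature, 1), round(mature, 1))  # → 251.2 125.9
```

In this invented comparison the process improvement alone roughly halves the estimated effort, which is the diseconomy-of-scale argument the following slides develop.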
25. Three Generations of Software Economics
(The original figure plots cost against software size for each generation; ROI stands for return on investment.)

1960s-1970s: waterfall model, functional design, diseconomy of scale
• Environments/tools: custom
• Size: 100% custom
• Process: ad hoc
• Typical project performance: predictably bad (always over budget, over schedule)

1980s-1990s: process improvement, encapsulation-based, diseconomy of scale
• Environments/tools: off-the-shelf, separate
• Size: 30% component-based, 70% custom
• Process: repeatable
• Typical project performance: unpredictable (infrequently on budget, on schedule)

2000 and on: iterative development, component-based, return on investment
• Environments/tools: off-the-shelf, integrated
• Size: 70% component-based, 30% custom
• Process: managed/measured
• Typical project performance: predictable (usually on budget, on schedule)
26. 1. Conventional: 1960s and 1970s, craftsmanship.
Organizations used custom tools, custom processes, and virtually all custom
components built in primitive languages. Project performance was highly
predictable in that cost, schedule, and quality objectives were almost always
underachieved.
2. Transition: 1980s and 1990s, software engineering.
Organizations used more-repeatable processes and off-the-shelf tools, and mostly (>70%) custom components built in higher-level languages.
Some of the components (<30%) were available as commercial products, including the operating system, database management system, networking, and graphical user interface.
During the 1980s, some organizations began achieving economies of scale, but
with the growth in applications complexity (primarily in the move to distributed
systems), the existing languages, techniques, and technologies were just not
enough to sustain the desired business performance.
3. Modern practices: 2000 and later, software production.
It is rooted in the use of managed and measured processes, integrated
automation environments, and mostly (70%) off-the-shelf components.
Perhaps as few as 30% of the components need to be custom built.
With advances in software technology and integrated production environments,
these component-based systems can be produced very rapidly.
29. Three generations of software economics
Diseconomy of scale is an economic concept referring to a situation in which economies of scale no longer function for a firm. Under this principle, rather than experiencing continued decreasing costs and increasing output, a firm sees an increase in marginal costs when output is increased.
COTS, short for commercial off-the-shelf, describes software or hardware products that are ready-made and available for sale to the general public. For example, Microsoft Office is a COTS product: a packaged software solution for businesses.
30. Issues in Software Cost Estimation Models
• There is a lack of well-documented case studies of projects that used an iterative development approach.
• Project data are inconsistent and incompatible across sources; it is hard enough to collect a homogeneous set of project data within one organization.
• Cost model vendors claim that their tools are suitable for estimating iterative development projects, but their underlying empirical project databases contain few modern iterative development success stories.
• To avoid these issues, a pragmatic software cost estimation approach is used.
31. Pragmatic Software Cost Estimation
The predominant cost estimation process: the software manager, software architecture manager, software development manager, and software assessment manager weigh risks, options, trade-offs, and alternatives with the cost modelers, who produce the cost estimate.
32. Pragmatic Software Cost Estimation
• A good estimate has the following attributes:
  – It is conceived and supported by the project manager, architecture team, development team, and test team accountable for performing the work.
  – It is accepted by all stakeholders as ambitious but realizable.
  – It is based on a well-defined software cost model with a credible basis.
  – It is based on a database of relevant project experience that includes similar processes, technologies, environments, quality requirements, and people.
  – It is defined in enough detail that its key risk areas are understood and the probability of success can be objectively assessed.
• The ideal estimate would be derived from a mature cost model by an experienced team using the same mature processes and tools.
33. Improving Software Economics
• Five basic parameters of the software cost model:
1. Reducing the size or complexity of what needs to be
developed
2. Improving the development process
3. Using more-skilled personnel and better teams (not
necessarily the same thing)
4. Using better environments (tools to automate the process)
5. Trading off or backing off on quality thresholds
34. Important Trends in Improving Software Economics
• Size (abstraction and component-based development technologies): higher-order languages (C++, Java, Visual Basic, etc.); object orientation in analysis, design, and programming; reuse; commercial components
• Process (methods and techniques): iterative development; process maturity models; architecture-first development; acquisition reform
• Personnel (people factors): training and personnel skill development; teamwork; win-win cultures
• Environment (automation technologies and tools): integrated tools (visual modeling, compiler, editor, etc.); open systems; hardware platform performance; automation of coding, documents, testing, and analyses
• Quality (performance, reliability, accuracy): hardware platform performance; demonstration-based assessment; statistical quality control
35. Reducing Software Product Size
"The most significant way to improve affordability and return on investment is usually to produce a product that achieves the design goals with the minimum amount of human-generated source material."
Reuse, object-oriented technology, automatic code production, and higher-order programming languages are all focused on achieving a given system with fewer lines of human-specified source directives.
36. Reducing Software Product Size Based on Programming Languages - To Improve Software Economics

LANGUAGE       SLOC PER UFP
Assembly       320
C              128
Fortran 77     105
Cobol 85       91
Ada 83         71
C++            56
Ada 95         55
Java           55
Visual Basic   35

UFP stands for universal function points. The basic units of function points are external user inputs, external outputs, internal logic data groups, external data interfaces, and external inquiries.
SLOC (source lines of code) metrics are useful estimators for software after a candidate solution is formulated and an implementation language is known.
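The table can be read as a set of expansion factors: multiplying a function-point count by a language's SLOC-per-UFP ratio gives a rough size estimate. A minimal sketch (the 200-UFP system is an invented example):

```python
# SLOC-per-UFP expansion factors, copied from the table above.
SLOC_PER_UFP = {
    "Assembly": 320, "C": 128, "Fortran 77": 105, "Cobol 85": 91,
    "Ada 83": 71, "C++": 56, "Ada 95": 55, "Java": 55, "Visual Basic": 35,
}

def estimated_sloc(function_points: int, language: str) -> int:
    """Rough product size: function points x language expansion factor."""
    return function_points * SLOC_PER_UFP[language]

# The same 200-UFP system is almost 6x smaller in Java than in assembly,
# which is the slide's argument for higher-order languages.
print(estimated_sloc(200, "Assembly"))  # → 64000
print(estimated_sloc(200, "Java"))      # → 11000
```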
37. Reducing Software Product Size Using Object-Oriented Methods to Improve Software Economics
• "An object-oriented model of the problem and its solution encourages a common vocabulary between the end users of a system and its developers, thus creating a shared understanding (teamwork and interpersonal communication) of the problem being solved."
• "The use of continuous integration creates opportunities to recognize risk early and make incremental corrections without destabilizing the entire development effort." This aspect of object-oriented technology enables an architecture-first process, in which integration is an early and continuous life-cycle activity.
• "An object-oriented architecture provides a clear separation of concerns among disparate elements of a system, creating firewalls that prevent a change in one part of the system from rending the fabric of the entire architecture." This feature of object-oriented technology is crucial to the supporting languages and environments available to implement object-oriented architectures.
38. Reducing Software Product Size - Reuse
Cost and schedule investments are necessary to achieve reusable components. As the number of projects using a reusable component grows, so do the up-front development cost and schedule resources:
• 1-project solution: $N and M months
• 2-project solution: 50% more cost and 100% more time
• 5-project solution: 125% more cost and 150% more time
• Many-project solution: operates with high value per unit investment, typical of commercial products
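The figure's point can be made numerically: a reusable component costs more up front, but its amortized cost per project falls as more projects use it. A small sketch using the slide's percentages (the base cost of 100 units is an arbitrary stand-in for $N):

```python
def cost_per_project(base_cost: float, extra_cost_pct: float, num_projects: int) -> float:
    """Amortized development cost per project sharing a reusable component."""
    total = base_cost * (1 + extra_cost_pct / 100)
    return total / num_projects

N = 100.0  # cost of the 1-project (non-reusable) solution, arbitrary units
print(cost_per_project(N, 0, 1))    # → 100.0  (1-project solution)
print(cost_per_project(N, 50, 2))   # → 75.0   (2-project solution: +50% cost)
print(cost_per_project(N, 125, 5))  # → 45.0   (5-project solution: +125% cost)
```

Schedule follows the same pattern with larger premiums (+100% and +150% in the figure), which is why reuse investments pay off only when several projects genuinely share the component.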
39. Reducing Software Cost by Reusing Commercial Components - To Improve Software Economics

Approach: commercial components
• Advantages: predictable license costs; broadly used, mature technology; dedicated support organization; hardware/software independence; rich functionality
• Disadvantages: frequent upgrades; up-front license fees; recurring maintenance fees; dependency on the vendor; run-time efficiency sacrifices; functionality constraints; integration not always trivial; no control over upgrades and maintenance; unnecessary features that consume extra resources; often inadequate reliability and stability; multiple-vendor incompatibility

Approach: custom development
• Advantages: complete change freedom; smaller, often simpler implementations; often better performance; control of development and enhancement
• Disadvantages: expensive, unpredictable development; unpredictable availability date; undefined maintenance model; often immature and fragile; single-platform dependency; drain on expert resources
40. Improving Software Processes - To Improve Software Economics
Three levels of process and their attributes:

Metaprocess (subject: line of business)
• Objectives: line-of-business profitability; competitiveness
• Audience: acquisition authorities, customers, organizational management
• Metrics: project predictability; revenue, market share
• Concerns: bureaucracy vs. standardization
• Time scale: 6 to 12 months

Macroprocess (subject: project)
• Objectives: project profitability; risk management; project budget, schedule, quality
• Audience: software project managers, software engineers
• Metrics: on budget, on schedule; major milestone success; project scrap and rework
• Concerns: quality vs. financial performance
• Time scale: 1 to many years

Microprocess (subject: iteration)
• Objectives: resource management; risk resolution; milestone budget, schedule, quality
• Audience: subproject managers, software engineers
• Metrics: on budget, on schedule; major milestone progress; release/iteration scrap and rework
• Concerns: content vs. schedule
• Time scale: 1 to 6 months
42. Improving Team Effectiveness Using Five Staffing Principles
1. The principle of top talent: use better and fewer people.
   – There is a natural team size for most jobs; being larger or smaller than that natural size is bad for team dynamics because it puts too little or too much pressure on individuals to perform.
2. The principle of job matching: fit the tasks to the skills and motivation of the people available.
   – There are countless cases of great software engineers being promoted prematurely into positions for which they were unskilled and unqualified.
3. The principle of career progression: an organization does best in the long run by helping its people to self-actualize.
   – Good performers usually self-actualize in any environment. Organizations can help or hinder employee self-actualization, but organizational energy benefits average and below-average performers the most.
   – Organizational training programs are typically strategic undertakings with educational value. Project training programs are purely tactical, intended to be useful and applied the day after training ends.
43. Improving Team Effectiveness Using Five Staffing Principles
4. The principle of team balance: select people who will complement and harmonize with one another.
   • Software team balance has many dimensions, and when a team is unbalanced in any one of them, the project becomes seriously at risk. These dimensions include:
     – Raw skills: intelligence, objectivity, creativity, organization, analytical thinking
     – Psychological makeup: leaders and followers, risk takers and conservatives, visionaries and nitpickers, cynics and optimists
     – Objectives: financial, feature set, quality, timeliness
5. The principle of phase-out: keeping a misfit on the team doesn't benefit anyone.
   – A misfit gives you a reason to find a better person or to live with fewer people.
   – A misfit demotivates other team members, will not self-actualize, and disrupts the team balance in some dimension.
   – Misfits are obvious, and it is almost never right to procrastinate in weeding them out.
44. Improving Team Effectiveness
Important project manager skills:
• Hiring skills. Few decisions are as important as hiring decisions. Placing the right person in the right job seems obvious but is surprisingly hard to achieve.
• Customer-interface skill. Avoiding adversarial relationships among stakeholders is a prerequisite for success.
• Decision-making skill. A good leader's decision-making skill seems obvious despite its intangible definition.
• Team-building skill. Teamwork requires that a manager establish trust, motivate progress, exploit eccentric prima donnas, transition average people into top performers, eliminate misfits, and consolidate diverse opinions into a team direction.
• Selling skill. Successful project managers must sell all stakeholders (including themselves) on decisions and priorities, sell candidates on job positions, sell changes to the status quo in the face of resistance, and sell achievements against objectives. In practice, selling requires continuous negotiation, compromise, and empathy.
46. Achieving Required Software Quality at Minimum Cost
Key practices that improve overall software quality:
• Focusing on driving requirements and critical use cases early in the life cycle, focusing on requirements completeness and traceability late in the life cycle, and focusing throughout the life cycle on a balance among requirements evolution, design evolution, and plan evolution.
• Using metrics and indicators to measure the progress and quality of an architecture as it evolves from a high-level prototype into a fully compliant product.
• Providing integrated life-cycle environments that support early and continuous configuration control, change management, rigorous design methods, document automation, and regression test automation.
• Using visual modeling and higher-level languages that support architectural control, abstraction, reliable programming, reuse, and self-documentation.
• Gaining early and continuous insight into performance issues through demonstration-based evaluations.
47. Achieving Required Software Quality at Minimum Cost
Conventional project performance assessment proceeded as follows:
• Project inception: the proposed design was asserted to be low risk with adequate performance margin.
• Initial design review: optimistic assessments of adequate design margin were based mostly on paper analysis or rough simulation of the critical threads. In most cases, the actual application algorithms and database sizes were fairly well understood. However, the infrastructure (including the operating system overhead, the database management overhead, and the interprocess and network communications overhead) and all the secondary threads were typically misunderstood.
• Mid-life-cycle design review: the assessments started whittling away at the margin, as early benchmarks and initial tests began exposing the optimism inherent in earlier estimates.
• Integration and test: serious performance problems were uncovered, necessitating fundamental changes in the architecture. The underlying infrastructure was usually the scapegoat, but the real culprits were immature use of the infrastructure, immature architectural solutions, or poorly understood early design trade-offs.
48. Principles of Conventional Software Engineering
1. Make quality #1. Quality must be quantified, and mechanisms must be put into place to motivate its achievement.
2. High-quality software is possible. Techniques that have been demonstrated to increase quality include involving the customer, prototyping, simplifying design, conducting inspections, and hiring the best people.
3. Give products to customers early. No matter how hard you try to learn users' needs during the requirements phase, the most effective way to determine real needs is to give users a product and let them play with it.
4. Determine the problem before writing the requirements. When faced with what they believe is a problem, most engineers rush to offer a solution. Before you try to solve a problem, be sure to explore all the alternatives, and don't be blinded by the obvious solution.
5. Evaluate design alternatives. After the requirements are agreed upon, you must examine a variety of architectures and algorithms. You certainly do not want to use an "architecture" simply because it was used in the requirements specification.
6. Use an appropriate process model. Each project must select a process that makes the most sense for that project on the basis of corporate culture, willingness to take risks, application area, volatility of requirements, and the extent to which requirements are well understood.
7. Use different languages for different phases. Our industry's eternal thirst for simple solutions to complex problems has driven many to declare that the best development method is one that uses the same notation throughout the life cycle. There is no reason to use Ada for requirements, design, and code unless Ada were optimal for all these phases.
8. Minimize intellectual distance. To minimize intellectual distance, the software's structure should be as close as possible to the real-world structure.
9. Put techniques before tools. An undisciplined software engineer with a tool becomes a dangerous, undisciplined software engineer.
10. Get it right before you make it faster. It is far easier to make a working program run fast than it is to make a fast program work. Don't worry about optimization during initial coding.
49. Principles of Conventional Software Engineering
11. Inspect code. Inspecting the detailed design and code is a much better way to find errors than testing.
12. Good management is more important than good technology. The best technology will not compensate for poor management, and a good manager can produce great results even with meager resources. Good management motivates people to do their best, but there are no universal "right" styles of management.
13. People are the key to success. Highly skilled people with appropriate experience, talent, and training are key. The right people with insufficient tools, languages, and process will still succeed; the wrong people with appropriate tools, languages, and process will probably fail.
14. Follow with care. Just because everybody is doing something does not make it right for you. It may be right, but you must carefully assess its applicability to your environment. Object orientation, measurement, reuse, process improvement, CASE, prototyping: all of these might increase quality, decrease cost, and increase user satisfaction. The potential of such techniques is often oversold, and benefits are by no means guaranteed or universal.
15. Take responsibility. When a bridge collapses we ask, "What did the engineers do wrong?" Even when software fails, we rarely ask this. The fact is that in any engineering discipline, the best methods can be used to produce awful designs, and the most antiquated methods can be used to produce elegant designs.
16. Understand the customer's priorities. It is possible the customer would tolerate 90% of the functionality delivered late if they could have 10% of it on time.
17. The more they see, the more they need. The more functionality (or performance) you provide a user, the more functionality (or performance) the user wants.
18. Plan to throw one away. One of the most important critical success factors is whether or not a product is entirely new. Such brand-new applications, architectures, interfaces, or algorithms rarely work the first time.
19. Design for change. The architectures, components, and specification techniques you use must accommodate change.
20. Design without documentation is not design. I have often heard software engineers say, "I have finished the design. All that is left is the documentation."
50. Principles of Conventional Software Engineering
21. Use tools, but be realistic. Software tools make their users more efficient.
22. Avoid tricks. Many programmers love to create programs with tricks: constructs that perform a function correctly but in an obscure way. Show the world how smart you are by avoiding tricky code.
23. Encapsulate. Information hiding is a simple, proven concept that results in software that is easier to test and much easier to maintain.
24. Use coupling and cohesion. Coupling and cohesion are the best ways to measure software's inherent maintainability and adaptability.
25. Use the McCabe complexity measure. Although there are many metrics available to report the inherent complexity of software, none is as intuitive and easy to use as Tom McCabe's.
26. Don't test your own software. Software developers should never be the primary testers of their own software.
27. Analyze causes of errors. It is far more cost-effective to reduce the effect of an error by preventing it than it is to find and fix it. One way to do this is to analyze the causes of errors as they are detected.
28. Realize that software's entropy increases. Any software system that undergoes continuous change will grow in complexity and become more and more disorganized.
29. People and time are not interchangeable. Measuring a project solely by person-months makes little sense.
30. Expect excellence. Your employees will do much better if you have high expectations for them.
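Principle 25 refers to McCabe's cyclomatic complexity, which counts the independent paths through a program (decision points + 1). The sketch below is a deliberately simplified keyword counter, not a real analyzer; production tools compute the metric from the control-flow graph or the AST:

```python
import re

def approx_cyclomatic_complexity(source: str) -> int:
    """Approximate McCabe cyclomatic complexity of Python source.

    Simplification: count branching keywords and boolean operators,
    then add 1 for the single entry path.
    """
    decision_pattern = r"\b(if|elif|for|while|and|or|except)\b"
    return len(re.findall(decision_pattern, source)) + 1

snippet = """
def classify(n):
    if n < 0:
        return "negative"
    elif n == 0:
        return "zero"
    return "positive"
"""
print(approx_cyclomatic_complexity(snippet))  # → 3
```

A common rule of thumb flags functions scoring above roughly 10 as candidates for decomposition.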
51. Principles of Modern Software Management
• The central design element: an architecture-first approach (design and integration first, then production and test)
• The risk management element: an iterative life-cycle process (risk control through ever-increasing function, performance, and quality)
• The technology element: component-based development (object-oriented methods, rigorous notations, visual modeling)
• The control element: a change management environment (metrics, trends, process instrumentation)
• The automation element: round-trip engineering (complementary tools, integrated environments)
52. Principles of Modern Software Management
1. Base the process on an architecture-first approach. This requires that a
demonstrable balance be achieved among the driving requirements, the
architecturally significant design decisions, and the life-cycle plans before the
resources are committed for full-scale development.
2. Establish an iterative life-cycle process that confronts risk early. It is not
possible to define the entire problem, design the entire solution, build the
software, then test the end product in sequence. Instead, an iterative process
that refines the problem understanding, an effective solution, and an effective
plan over several iterations encourages a balanced treatment of all stakeholder
objectives. Major risks must be addressed early to increase predictability and
avoid expensive downstream scrap and rework.
3. Transition design methods to emphasize component-based development.
Moving from a line-of-code mentality to a component-based mentality is
necessary to reduce the amount of human-generated source code and custom
development. A component is a cohesive set of preexisting lines of code, either
in source or executable format, with a defined interface and behavior.
4. Establish a change management environment. The dynamics of iterative development, including concurrent workflows by different teams working on shared artifacts, necessitate objectively controlled baselines.
53. Principles of Modern Software Management
5. Round-trip engineering: It is the environment support necessary to automate
and synchronize engineering information in different formats (such as
requirements, specifications, design model, source code, executable code, test
cases). Without substantial automation of this bookkeeping, change management,
documentation, and testing, it is difficult to reduce iteration cycles to manageable
time frames in which change is encouraged rather than avoided. Change freedom
is a necessity in an iterative process, and establishing an integrated environment
is crucial.
6. Capture design artifacts in rigorous, model-based notation. It supports the
evolution of semantically rich graphical and textual design notations. Visual
modeling with rigorous notations and a formal machine-processable language
provides for far more objective measures than the traditional approach of human
review and inspection of ad hoc design representations in paper documents.
7. Instrument the process for objective quality control and progress assessment.
Life-cycle assessment of the progress and the quality of all intermediate products
must be integrated into the process. The best assessment mechanisms are well-defined measures derived directly from the evolving engineering artifacts and
integrated into all activities and teams.
54. Principles of Modern Software Management
8. Use a demonstration-based approach to assess intermediate artifacts. Transitioning the current state-of-the-product artifacts (whether the artifact is an early prototype, a baseline architecture, or a beta capability) into an executable demonstration of relevant scenarios stimulates earlier convergence on integration, a more tangible understanding of design trade-offs, and earlier elimination of architectural defects.
9. Plan intermediate releases in groups of usage scenarios with evolving levels of
detail. It is essential that the software management process drive toward early
and continuous demonstrations within the operational context of the system,
namely its use cases. The evolution of project increments and generations must
be commensurate with the current level of understanding of the requirements and
architecture. Cohesive usage scenarios are then the primary mechanism for
organizing requirements, defining iteration content, assessing implementations,
and organizing acceptance testing.
10. Establish a configurable process that is economically scalable. No single
process is suitable for all software developments. A pragmatic process
framework must be configurable to a broad spectrum of applications. The
process must ensure that there is economy of scale and return on investment by
exploiting a common process spirit, extensive process automation, and common
architecture patterns and components.
55. COCOMO II Model Parameters
• Application precedentedness: Domain experience is a critical factor in
understanding how to plan and execute a software development project.
• For unprecedented systems, one of the key goals is to confront risks and
establish early precedents, even if they are incomplete or experimental. This
is one of the primary reasons that the software industry has moved to an
iterative life-cycle process.
• Early iterations in the life cycle establish precedents from which the product,
the process, and the plans can be elaborated in evolving levels of detail.
• Process flexibility: Development of modern software is characterized by such a
broad solution space and so many interrelated concerns that there is a paramount
need for continuous incorporation of changes.
• These changes may be inherent in the problem understanding, the solution space, or the
plans. Project artifacts must be supported by an efficient change management
environment commensurate with project needs. Both a rigid process and a chaotically
changing process are destined for failure except on the most trivial projects. A
configurable process that allows a common framework to be adapted across a range of
projects is necessary to achieve a software return on investment.
56. COCOMO II Model Parameters
• Architecture risk resolution: The project team develops and stabilizes an
architecture before developing all the components that make up the entire suite
of application components.
• An architecture-first and component-based development approach forces the
infrastructure, common mechanisms, and control mechanisms to be elaborated early
in the life cycle and drives all component make/buy decisions into the architecture
process.
• This approach initiates integration activity early in the life cycle as the verification
activity of the design process and products. It also forces the development
environment for life-cycle software engineering to be configured and exercised early
in the life cycle, thereby ensuring early attention to testability and a foundation for
demonstration-based assessment.
• Team cohesion: Successful teams are cohesive, and cohesive teams are
successful. Cohesive teams share common objectives and priorities.
• Cohesive teams avoid sources of project turbulence and entropy that may result from
difficulties in synchronizing project stakeholder expectations. While there are many
reasons for such turbulence, one of the primary reasons is miscommunication,
particularly exchanging information solely through paper documents that present
engineering information subjectively.
57. COCOMO II Model Parameters
• Advances in technology (such as programming languages, UML, and visual
modeling) have enabled more rigorous and understandable notations for
communicating software engineering information, particularly in the requirements
and design artifacts that previously were ad hoc and based completely on paper
exchange. These model-based formats have also enabled the round-trip engineering
support needed to establish change freedom sufficient for evolving design
representations.
• Software process maturity: The Software Engineering Institute's Capability Maturity
Model (CMM) is a well-accepted benchmark for software process assessment.
• Just as domain experience is crucial for avoiding application risks and exploiting
the available domain assets and lessons learned, software process maturity is crucial
for avoiding software development risks and exploiting the organization's software
assets and lessons learned.
58. Basic COCOMO - The Functions
person-months = a * (KLOC)^b
duration-in-months = c * (person-months)^d
KLOC ... thousands (kilo) of lines of code
a, b, c, d depend on the project type
59. Basic COCOMO - Tables
project type     a     b     c     d
organic          2.4   1.05  2.5   0.38
semi-detached    3.0   1.12  2.5   0.35
embedded         3.6   1.20  2.5   0.32
60. Basic COCOMO - Example
• You must decide on:
– LOC: 50,000 lines of code (50 KLOC)
– project type: semi-detached
• person-months: 3.0 * (50)^1.12 ≈ 240
• duration-in-months: 2.5 * (240)^0.35 ≈ 17
• number of people: 240 / 17 ≈ 14
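The Basic COCOMO calculation above can be sketched in Python. This is a minimal illustration, not part of the original slides; the coefficients come from the Basic COCOMO table, and the function name is arbitrary:

```python
# Basic COCOMO coefficients: project type -> (a, b, c, d)
COEFFICIENTS = {
    "organic":       (2.4, 1.05, 2.5, 0.38),
    "semi-detached": (3.0, 1.12, 2.5, 0.35),
    "embedded":      (3.6, 1.20, 2.5, 0.32),
}

def basic_cocomo(kloc, project_type):
    """Return (effort in person-months, duration in months, average staff)."""
    a, b, c, d = COEFFICIENTS[project_type]
    effort = a * kloc ** b        # person-months = a * (KLOC)^b
    duration = c * effort ** d    # duration-in-months = c * (person-months)^d
    return effort, duration, effort / duration

# The slide's example: 50 KLOC, semi-detached
effort, duration, staff = basic_cocomo(50, "semi-detached")
print(round(effort), round(duration), round(staff))  # -> 240 17 14
```

Rounding the raw results reproduces the slide's figures of roughly 240 person-months, 17 months, and 14 people.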
61. COCOMO II
• COCOMO II Effort Equation
• The COCOMO II model makes its estimates of required
effort (measured in Person-Months, PM) based primarily
on your estimate of the software project's size (as
measured in thousands of SLOC, KSLOC):
Effort = 2.94 * EAF * (KSLOC)^E
• where
EAF is the Effort Adjustment Factor derived from the
Cost Drivers
E is an exponent derived from the five Scale Drivers
63. • As an example, a project with all Nominal Cost
Drivers and Scale Drivers would have an EAF of
1.00 and exponent, E, of 1.0997. Assuming
that the project is projected to consist of
8,000 source lines of code, COCOMO II
estimates that 28.9 Person-Months of effort is
required to complete it:
Effort = 2.94 * (1.0) * (8)^1.0997 = 28.9 Person-Months
64. Effort Adjustment Factor
• The Effort Adjustment Factor in the effort equation is
simply the product of the effort multipliers corresponding
to each of the cost drivers for your project.
For example, if your project is rated Very High for
Complexity (effort multiplier of 1.34), Low for Language
& Tools Experience (effort multiplier of 1.09), and all of the
other cost drivers are rated Nominal (effort multiplier
of 1.00), the EAF is the product of 1.34 and 1.09:
Effort Adjustment Factor = EAF = 1.34 * 1.09 * 1.00 = 1.46
Effort = 2.94 * (1.46) * (8)^1.0997 = 42.3 Person-Months
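The effort calculations above can be reproduced with a short Python sketch (an illustration only; the coefficient 2.94, nominal exponent 1.0997, and multipliers 1.34 and 1.09 are taken from the slides, and the function name is arbitrary):

```python
def cocomo2_effort(ksloc, eaf=1.0, e=1.0997):
    """COCOMO II effort equation: Effort = 2.94 * EAF * (KSLOC)^E."""
    return 2.94 * eaf * ksloc ** e

# Nominal 8-KSLOC project (EAF = 1.00, E = 1.0997):
nominal = cocomo2_effort(8)
# Very High Complexity (1.34) and Low Language & Tools Experience (1.09):
adjusted = cocomo2_effort(8, eaf=1.34 * 1.09)
print(round(nominal, 1), round(adjusted, 1))  # -> 28.9 42.3
```

Both results match the worked examples: about 28.9 person-months for the all-Nominal project and 42.3 with the adjusted EAF.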
65. COCOMO II Schedule Equation
• The COCOMO II schedule equation predicts the number of
months required to complete your software project. The
duration of a project is based on the effort predicted by the
effort equation:
Duration = 3.67 * (Effort)^SE
where
Effort is the effort from the COCOMO II effort equation
SE is the schedule equation exponent derived from
the five Scale Drivers
66. COCOMO II Schedule Equation
• Continuing the example, and substituting the
exponent of 0.3179 that is calculated from the
Scale Drivers, yields an estimate of just over a
year and an average staffing of between 3 and
4 people:
Duration = 3.67 * (42.3)^0.3179 = 12.1 months
Average staffing = (42.3 Person-Months) /
(12.1 Months) = 3.5 people
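The schedule calculation can be sketched the same way (an illustration; SE = 0.3179 and the 42.3 person-month effort come from the example above, and the function name is arbitrary):

```python
def cocomo2_duration(effort_pm, se=0.3179):
    """COCOMO II schedule equation: Duration = 3.67 * (Effort)^SE."""
    return 3.67 * effort_pm ** se

duration = cocomo2_duration(42.3)   # months to complete the project
staffing = 42.3 / duration          # average people = effort / duration
print(round(duration, 1), round(staffing, 1))  # -> 12.1 3.5
```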
67. SCED Cost Driver
• The COCOMO cost driver for Required Development Schedule (SCED) is
unique and requires a special explanation.
The SCED cost driver accounts for the observation that a project
developed on an accelerated schedule will require more effort than a
project developed on its optimum schedule. A SCED rating of Very Low
corresponds to an Effort Multiplier of 1.43 (in the COCOMO II.2000 model)
and means that you intend to finish your project in 75% of the optimum
schedule (as determined by a previous COCOMO estimate). Continuing the
example used earlier, but assuming that SCED has a rating of Very Low,
COCOMO produces these estimates:
Duration = 75% * 12.1 Months = 9.1 Months
Effort Adjustment Factor = EAF = 1.34 * 1.09 * 1.43 = 2.09
Effort = 2.94 * (2.09) * (8)^1.0997 = 60.4 Person-Months
Average staffing = (60.4 Person-Months) / (9.1 Months) = 6.7 people
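The SCED-adjusted figures can be checked with a few lines of Python (an illustration; the multipliers 1.34, 1.09, and 1.43, the nominal 12.1-month schedule, and the 75% compression all come from the example above):

```python
eaf = 1.34 * 1.09 * 1.43           # EAF including SCED = Very Low (1.43)
effort = 2.94 * eaf * 8 ** 1.0997  # person-months for the 8-KSLOC project
duration = 0.75 * 12.1             # 75% of the nominal 12.1-month schedule
staffing = effort / duration       # average people over the compressed schedule
print(round(effort, 1), round(duration, 1), round(staffing, 1))  # -> 60.4 9.1 6.7
```

Note the tradeoff the model encodes: compressing the schedule to 9.1 months raises total effort from 42.3 to 60.4 person-months and nearly doubles average staffing.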