SOFTWARE METRICS, MEASUREMENT, AND ANALYTICAL METHODS

A Bipartite Empirically Oriented Metrics Process for Agile Software Development

David Heimann, University of Massachusetts, Boston
Peter Hennessey and Ashok Tripathi, Brooks Software

An agile process has two primary actors. One is the development team, which commits to develop software for a specific small set of requirements in a limited amount of time, and then develops and tests this software. The other is overall project management, which hands off the requirements and appropriate components to the teams and interfaces with stakeholders such as customers. In a similar vein, metrics for an agile process should have two parts: one for the development team and one for project management.

This article outlines metrics for an agile process being used at Brooks Automation. The process uses lightweight metrics at the development team level, where the focus is on developing working code, and more heavyweight metrics at the project management level, where the focus is on delivering a quality release to the customer.

The authors describe the process and carry out a goal-question-metric (GQM) analysis to determine the goals and questions for such a process. They then examine the specific metrics used in the process, identifying them as either team-related or project-management-related; compare their scope to that shown by the GQM analysis; and identify issues in the existing metrics. Approaches to rectify those issues are then described.

Key words: agile, information systems, information technology, metrics, process improvement, project management, software, software development

SQP References
A Practical Process to Establish Software Metrics, vol. 8, issue 2, Linda Westfall
The Uses and Abuses of Software Metrics, vol. 6, issue 2, Pat Cross
Software Quality Metrics—From Theory to Implementation, vol. 5, issue 3, Daniel Galin

INTRODUCTION
As a primary supplier of Real Time Enterprise software to the world’s foremost high-technology manufacturing environments, Brooks Software, a division of Brooks Automation, has adopted agile principles and metrics into its continuous improvement culture. Recent development efforts have shown a need to frequently reassess requirements for the intended software product and, consequently, replan the project, leading to significant product redesign and refactoring. Brooks recognized that an agile development paradigm would allow these significant changes to occur robustly. With such a process, the evolutions could be done deliberately, planned, and under organizational control.

Brooks Software has had several successful commercial software product releases using the software product life cycle and agile development processes. In addition to increased market and customer responsiveness, improvements in
on-time delivery (OTD) predictability were significant. Defect reduction has also seen a 50 percent year-over-year improvement.

This article outlines a procedure for establishing metrics and metrics processes for an empirical (agile) process, with emphasis on using this procedure to specify improved metrics. Since an empirical process implements changes based on observations taken as the process evolves, with agile processes such as extreme programming (XP) being a specific form of empirical process, metrics assume an important role in such processes. In the procedure the authors use a lightweight metrics process at the development team level, where the emphasis is on simple data collection with only brief analyses carried out, and a more heavyweight process at the project management level, where more substantial data collection and management is done and deeper and more sophisticated analyses are carried out.

LITERATURE REVIEW
Agile methods originated in the mid-1990s as a counterbalance to the more formal, plan-driven methods that developed in the decade or two previously. Beck (2000) covers these kinds of development, especially XP. Goldratt (1992) discusses lean methods and theory of constraints, which also began to be applied to software life cycles and methods.

Boehm and Turner (2004) present criteria to determine when agile development is preferable and when plan-driven development is preferable. Manhart and Schneider (2004) carried out a case study using goal-question-metric (GQM) to select certain agile principles (automated unit test, test first) to combine with more traditional principles (documentation, coding rules) to implement in an embedded-software environment. Jalote et al. (2004) discuss the concept of timeboxing: the breaking up of a large project into small “timeboxes” that can then be carried out in separate steps (pipelined execution).

Williams et al. (2004) have developed a theoretical framework to evaluate XP and have applied it to an IBM-based case study. Cohn and Ford (2003) address the organizational aspects of the transition from a plan-driven to an agile process.

Heimann et al. (2005) identify three agile software development efforts described in the literature (Wood and Kleb (2003), an implementation of agility in a scientific software environment at NASA-Langley; Poole and Huisman (2001), an implementation in a maintenance environment at Iona; and Blotner (2002), an implementation in a start-up environment at Sabrix) and describe how the bipartite metrics approach would work for each case. In this article the authors take the approach in the aforementioned paper, extend and develop it further, and implement it in an ongoing environment, that is, the Brooks Agile Development Process (ADP).

Cao et al. (2004) describe the need for overlaying agile principles on top of XP practices as the development methodology in order to work effectively in large-scale and complex software product development.

In Software Quality Professional, Gatlin (2003) discusses a case where a metrics implementation ran into difficulties, examines the lessons learned, and mentions common metrics and the weaknesses they can run into. Cross (2004) mentions various statistical pitfalls to watch out for in using software metrics, including confusions between discrete and continuous data and between accuracy and precision. Westfall (2006) outlines 12 steps in selecting, designing, and implementing software metrics to ensure understanding of their definition and purpose.

METRICS IMPLEMENTATION CONSIDERATIONS FOR AGILE PROCESSES

Empirical Nature of the Agile Process and Resulting Metrics
Plan-driven processes define the steps and procedures in advance for the development work to follow. Requirements are also assumed to be defined in advance and unchanging thereafter, except in special cases. There are fixed artifacts to produce, detailed documentation to produce and follow, and tests to be conducted after development is complete.

The agile development process, on the other hand, is more empirical. While some procedures are defined in advance, many are formulated as development proceeds, and even for predefined procedures, plenty of opportunity exists for flexibility and iteration. There is frequent opportunity to reassess, redesign, refactor, and replan. Plans for any stage of a project are vaguely
specified at first and gradually become more detailed as the stage approaches, becoming specific only when the stage is actually reached.

Metrics for an agile process should reflect this empirical nature. Comparison of work done against work planned must adjust to a moving rather than a fixed “work planned” target. Measures of testing accomplished must similarly reflect that test cases planned will rapidly change. Defect counts must take into account the amount of change taking place and the resulting complexity in order to be properly evaluated. Resource allocation for a given stage must fit into wide estimation intervals at first, narrowing down only as the stage approaches actuality.

Bipartite Nature of Metrics—Within/Among Agile Development Teams
The core of the agile software development effort is a development team—a small group of people who operate for a limited time to carry out a specific body of work. The teams are coordinated through an overall project management function, which parcels out work to the teams and integrates their work products into an overall release for customer delivery.

This leads to a dichotomy of interests between the teams and project management. Much of the information needed by project management is generated during the detail-oriented product generation effort by the teams. The teams, however, do not want to capture this information if it adversely affects their ability to complete their assignments in a timely fashion.

What is necessary, therefore, is to provide an approach where a team can generate and capture the necessary metric data as automatically as possible in developing its work products while enabling project management to access the resulting data for their more complex needs.

Therefore, the best way to organize metrics is as a bipartite collection, with one set of metrics focused on the development teams and the other focused on project management. The team metrics should be brief, so as to minimize the reporting burden on the teams; focused, in order to provide the teams with readily usable information for their development tasks; and short-term and rapid responding, to coincide with the short life of the team’s iterative cycle. Project management metrics should be comprehensive, so as to encompass the project’s entire complexity; integrative, to help with amalgamating the results of the development teams into a unified product with many interfaces; and long-term, to enable tracking of many small development teams throughout the project’s life. Figure 1 shows the relationships among the teams, coordination and management function, and metrics and databases (note that the team-generated metrics connect into a central database from which project management develops its metrics). Figure 2 shows the development process as it moves from project management to the development team and back again.

[Figure 1: Distribution of metrics and databases in an agile software development project. Each team feeds its team metrics into a central metrics database, from which project management derives its project management metrics for the customer/stakeholder.]

THE BROOKS AGILE DEVELOPMENT PROCESS
The ADP leverages the theory of constraints concept, which advocates keeping each development cycle short. If one considers that the set of requirements committed and under construction is work-in-progress (WIP), the concept of the theory of constraints is to keep the WIP backlog low, working on the highest-priority requirements first. The development activity continues on the highest-priority requirement until the product engineering team, which plays the role of a customer proxy, satisfactorily validates it from the end-user perspective. The aim is to complete work as quickly as possible.
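To make the WIP-limiting idea concrete, here is a minimal sketch, assuming a simple priority-ordered backlog with a fixed cap. The Requirement class, the pull_next function, and the cap of three items are hypothetical illustrations of the theory-of-constraints pull described above, not part of the ADP tooling.

# Hypothetical sketch (not ADP tooling): pull the highest-priority requirement
# into work only while the work-in-progress (WIP) count stays under a low cap.
from dataclasses import dataclass
from typing import List, Optional

@dataclass
class Requirement:
    name: str
    priority: int          # 1 = highest business priority
    in_progress: bool = False
    done: bool = False

def pull_next(backlog: List[Requirement], wip_limit: int = 3) -> Optional[Requirement]:
    """Start the highest-priority pending requirement only if WIP is below the cap."""
    wip = sum(1 for r in backlog if r.in_progress and not r.done)
    if wip >= wip_limit:
        return None            # keep the WIP backlog low: finish work before starting more
    pending = [r for r in backlog if not r.in_progress and not r.done]
    if not pending:
        return None
    nxt = min(pending, key=lambda r: r.priority)   # highest priority first
    nxt.in_progress = True
    return nxt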
[Figure 2: Conceptual view of the agile development process (ADP). Elements shown: project and product management (pending features, story writing, release planning) and software development (story elaboration, iteration planning, story development and construction); requirements, design and planning, code review and unit test, end-user usability, system verification, package, and complete features, with error feedback loops during validation and verification.]

The goals of the product release, captured in a project contract, are decomposed into manageable iterations through elaboration. Requirements are specified in terms of stories, which encapsulate functionality definition in a form that can be easily communicated, validated with the customer and related stakeholders, and implemented in no more than two person-weeks of development effort. An example story is provided in Figure 3.

Figure 3: Snippet of a sample story (high-level story definition):
“Users will double click the Application Launcher icon on their desktop, or access it through the Windows Start menu. They will be presented with the standard login dialog for the aforementioned Application. If the user is able to log in successfully, the Application Launcher application will open and present the list of subapplications that are available for that user. The user may click and select any of the subapplications from the list; this action should invoke the subapplication. The user will NOT be required to log in again as long as the Application Launcher is still running. If the user closes the Application Launcher, they will be required to log in again the next time they open the Application Launcher. The user name and password will NOT be stored locally on the client machine.”

A product release consists of multiple features, and each feature is associated with a priority (see Figure 4). For example, in the case of F2P2, F2 represents feature no. 2 and P2 is the priority level of feature no. 2. Each feature (FxPx) is broken down into one or more stories such that each story can be completed within a two-week construction cycle.

Stories represent the fundamental tracking unit for the WIP. By limiting construction to a two-week construction cycle within an overall six-week development cycle, the stories are “batched” for each iteration cycle and can be effectively managed. The iterations are organized as shown in Figure 5. Each six-week development cycle is completed in two iteration cycles, with the first cycle being used for planning and construction (development) and the second being used for development validation, verification (testing), and advance planning for the next iteration.

On average, seven to eight iterations are sufficient to provide overall functionality for a release, with two to three of these iterations reserved for backend bug fixes, final testing, and packaging. All the iterations are three calendar weeks long, with activities for previous and current iterations taking place in parallel. Hence, a typical release is about 24 calendar weeks long. Once feature freeze is reached, iteration cycles drop down to weekly and/or daily product builds, and this is followed through the final testing and packaging phases of the development process.
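As an illustration of the release, feature, and story breakdown shown in Figure 4 below, the following minimal Python sketch models a release as prioritized features decomposed into stories scoped to at most two person-weeks; the class and field names are hypothetical, not ADP artifacts.

# Hypothetical sketch of the Release -> Feature (FxPx) -> Story breakdown
# described above; names are illustrative only.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Story:
    story_id: str                 # for example "S1"
    effort_person_weeks: float    # stories are scoped to <= 2 person-weeks

@dataclass
class Feature:
    number: int                   # feature no. (Fx)
    priority: int                 # feature priority (Px)
    stories: List[Story] = field(default_factory=list)

    @property
    def label(self) -> str:
        """FxPx naming used in the article: feature 2 with priority 2 -> 'F2P2'."""
        return f"F{self.number}P{self.priority}"

@dataclass
class Release:
    features: List[Feature] = field(default_factory=list)

    def oversized_stories(self) -> List[str]:
        """Flag stories that exceed the two person-week scoping guideline."""
        return [s.story_id
                for f in self.features
                for s in f.stories
                if s.effort_person_weeks > 2]

# Example: feature no. 2 with priority 2 ("F2P2") holding one 1.5 person-week story.
release = Release([Feature(2, 2, [Story("S1", 1.5)])])
assert release.features[0].label == "F2P2"
assert release.oversized_stories() == []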
[Figure 4: High-level view: Release—Features—Stories. A release is composed of prioritized features (F1P1, F2P2, ..., F5P5), and each feature (FxPx) is broken down into stories (S1, S2, ..., S10). F1: feature no. 1; P1: feature priority no. 1; S1: story no. 1.]

Figure 5: Three-week ADP iterative cycle (activity view), with Iteration A (previous), Iteration B (current), and Iteration C (next) running in parallel:

Week 1 (Plan), days 1-5:
  Iteration A: Construct—build media; Verify—build validation and end-user usability validation; fix issues (if necessary); Plan—update completed stories.
  Iteration B: Plan—initial story elaboration and story estimation; Refine—revise stories based on iteration A results (if necessary); Finalize—record stories.
  Iteration C: Plan—leftover stories from iteration B.
Week 2 (Develop and Verify), days 6-10:
  Iteration A: Verify—bug-fix verification (if necessary); hand off updated stories to QA; updated story verification.
  Iteration B: Construct—bug fixes for iteration A; develop iteration B stories.
  Iteration C: Develop—new stories for features, or prioritize the list of bugs.
Week 3 (Develop and Verify), days 11-15:
  Iteration A: Verify—bug-fix verification (if necessary); updated story verification.
  Iteration B: Construct—bug fixes for iteration A (if necessary); develop iteration B stories.
  Iteration C: Develop—new stories for features, or prioritize the list of bugs.

Story elaboration and iteration allow flexibility to schedule specific stories for each iteration cycle based on business priority, technical risk, and the need to include learning cycles and refactoring turns. Figure 2 summarizes this process.

ADP METRICS GOALS AND QUESTIONS
In keeping with the GQM paradigm (Basili, Caldiera, and Rombach 1994), the authors identify four goals for the metrics process, with a number of questions and consequent metrics for each goal.

The ADP process is guided by the following metrics goals: 1) projected project completion against targets; 2) quality of completed work; 3) ability to change content or priorities to address current business needs; and 4) ability to merge iteration-cycle products into a seamless overall release.

Each of these goals gives rise to a number of metrics questions:

1. Project-completion questions
   • What stories has the team completed?
   • How is the team doing compared to its task commitments?
   • Is the team on schedule relative to the current release backlog?
   • How fast is a given team, or the project as a whole, completing story development?

2. Level-of-quality questions
   • How well does the team product from the current iteration fulfill the current requirements?
   • How well does the overall product fulfill the current requirements?
   • For a given team, how well is its developed product passing the quality assurance (QA) tests specific to its product?
   • Overall, how well is the completed work passing overall integration/system tests?
   • Does the developed product possess systemic integrity and fitness for customer demonstration and eventual delivery?

3. Ability-to-change questions
   • How have requirements changed thus far over the release lifetime?
   • How are the current requirements matching up to customer (or market) expectations?

4. Ability-to-integrate questions
   • Does each team know the inputs from and outputs to other product components that they need to address during their iteration?
   • Have all the interactions among the various components and the various teams been accounted for, both in development planning and in integration and system testing?
   • Does system testing cover all the code, interactions, and functionality?

ADP METRICS
Following are the important metrics that were selected to measure and track within the ADP process. While each can address multiple goals, the metrics are shown with the primary goals they address.

For each metric the authors identify whether the metric is calculated by the development team (such as the defects-found-fixed or story points metrics) or by project management (such as the on-time delivery or velocity metrics). The authors also identify whether the metric is a traditional metric or one developed to fit within an agile process such as ADP.

The use and consequent benefits of metrics within ADP are continuing to evolve. The authors’ plan is to report additional metrics-related information in future publications as the metrics evolve and more insights can be drawn from them.

Metrics for Goal 1: Projected Completion
To answer the question, “Are we on schedule?” an estimate of effort, anticipated productivity, and workload must be made and a schedule created. To carry out the assessment of productivity and workload necessary for this, one must measure the “mass” of a story. Brooks measures this by the “story point.” This is an agile-related measure and is calculated by the software development team in conjunction with the project management team. A story has a certain number of story points, obtained by multiplying a measure of the story’s size by a measure of its risk. As shown in Figure 6, the size of a story is given by a category of the effort needed for its development. A size can vary from a value of 1 through 10. The risk of a story is given by a category of its difficulty due to such factors as the novelty of a story or the lack of experience of its resources (a resource is a talent in such areas as development, documentation, verification, and so on). A risk can vary from a value of 1 to 3. Therefore, a story can have a value between 1 and 30 story points. Stories for all the iterations are tracked to provide clear visibility of the current status to both the development and project management teams.

Figure 6: Guidelines for determining story size and risk (ADP feature analysis, size and risk estimation table)

  Size legend:  1 = 1-2 days of effort;  2 = 3-5 days;  3 = 6-10 days;  4 = 11-15 days;  5 = 16-20 days;
                6 = 21-25 days;  7 = 26-30 days;  8 = 31-35 days;  9 = 36-40 days;  10 = 41-45 days
  Risk legend:  1 = Low (no unknowns, have prior experience);  2 = Medium (well understood, limited unknowns);
                3 = High (no experience, high unknowns)

For the overall release at a project management level, a key delivery metric is the organization’s ability to meet OTD to customer expectations. This is a traditional measure and is calculated by project management. This measurement tracks actual delivery against committed delivery within a given tolerance, usually 5 percent, with durations being measured in calendar workdays (see Appendix). During the design and planning phase the engineering team establishes the transition of a target date to a commit date. For ADP projects, past velocity rates and the complexity level of the current project are used to extrapolate project completion dates.
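A minimal sketch of these two Goal 1 measures follows, assuming the size and risk categories of Figure 6 and reading on-time delivery as actual duration falling within the committed duration plus the stated tolerance. The exact formulas are in the article’s Appendix (not reproduced here), and the function names are illustrative.

# Illustrative sketch only; the authoritative definitions are in the Appendix.
# Assumes size in 1..10 and risk in 1..3 per Figure 6, and treats on-time
# delivery as actual workdays <= committed workdays * (1 + tolerance).

def story_points(size: int, risk: int) -> int:
    """Story points = size category (1-10) x risk category (1-3), giving 1-30."""
    if not 1 <= size <= 10:
        raise ValueError("size category must be 1..10")
    if not 1 <= risk <= 3:
        raise ValueError("risk category must be 1..3")
    return size * risk

def on_time(committed_workdays: int, actual_workdays: int, tolerance: float = 0.05) -> bool:
    """On-time delivery check against the committed duration, usually within 5 percent."""
    return actual_workdays <= committed_workdays * (1 + tolerance)

# Example drawn from Figure 8: size 8 with risk 2 -> 16 story points.
assert story_points(8, 2) == 16
# A 100-workday commitment delivered in 104 workdays is within the 5 percent tolerance.
assert on_time(100, 104)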
To continually forecast and refine estimates, a key productivity metric of the ADP is the velocity rate. This is an agile measure and is calculated by project management. The velocity is the rate at which story points are being completed per assigned capacity of resource (see Appendix).

Velocity measures the effectiveness of the project team in delivering functional units, as defined by story points. Measurement of the actual number of story points that can be produced by a project team of a given level of resource capacity, that is, velocity, enables management to assess team effectiveness. Hence, it ensures a balanced team in the project roles, allows monitoring of the process using theory of constraints principles for optimization, and provides the basis for forward estimation using historical data. The unit for velocity is “story points generated per resource capacity.” For example, if four developers generate 24 story points in three weeks, the velocity is two story points per labor-week. Note that the resource capacity can vary from one iteration to another.

Figure 7 shows a sample velocity calculation over a six-iteration period, with both immediate and cumulative velocities shown in the accompanying graph.

Figure 7: Sample ADP velocity monitor

  Iteration   Resource capacity (C)   Story points (SS * RS)   Velocity ([SS * RS]/C)
  i1                  52.00                    24                      0.46
  i2                  35.70                    26                      0.73
  i3                 134.10                    71                      0.53
  i4                  76.95                    41                      0.53
  i5                  72.00                    45                      0.63
  i6                  90.00                    68                      0.76
  Overall velocity                                                     0.60

  (The accompanying graph plots iteration velocity against cumulative velocity, the latter running 0.46, 0.57, 0.55, 0.54, 0.56, and 0.60 across iterations i1 through i6.)

A constituent element of velocity is the complexity factor. This is an agile-related measure and is calculated by project management. The complexity factor is a measurement of risk (see Figure 8 and Appendix).

The complexity factor shows the contribution of the risk within the total sizing of story points. It is the reciprocal of the size-weighted risk, and is used to compare the contribution of risk between modules of similar story-point values. Within modules of similar story points, those with low complexity factors have higher risks and lower sizes, while those with high complexity factors have lower risks and higher sizes, so that those with low complexity factors tend to be deserving of special scrutiny.

Figure 8: Sample feature analysis, story points, and complexity factor calculation (product version feature analysis snippet)

  Feature ID       Priority   High-level feature requirement description                    Size (SS)   Risk (RS)   Points (SS * RS)
  Project Name_1      1       Upgrade product to utilize the Microsoft .Net 2.0 framework       8           2             16
                              recommended by Microsoft, which contains many critical bug
                              fixes and enhancements; .Net 2.0 migration is required to
                              use Visual Studio 2005.
  Project Name_2      2       Define the details to support versioning for the business         5           3             15
                              logic component. Detailed design will be developed based on
                              the outcome of the prototype. Versioning includes document
                              states such as active, pending, archived, and off-line.
  Project Name_3      3       Support dynamic Web service configuration for external            3           1              3
                              interfaces. Adapt to current Web service standards.
  Project Name_4      4       Add visualization to view various different types of              2           2              4
                              activities in a Pareto-type chart.
  Project Name_5      5       Incorporate the system user authentication and security           7           3             21
                              model into the login application.
  Project Name_6      6       Define the requirements to support application support            3           2              6
                              for multiple user roles.
  Total                                                                                        28          13             65
  Complexity factor: sum of size / sum of points = 28/65 = 0.43
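The following sketch reproduces the velocity and complexity-factor arithmetic as it can be read from Figures 7 and 8 above: velocity as story points completed per unit of resource capacity, per iteration and cumulatively, and complexity factor as the sum of sizes divided by the sum of story points. The formal definitions are in the Appendix, and the function names are illustrative.

# Illustrative sketch of the velocity and complexity-factor calculations as read
# from Figures 7 and 8; the formal definitions are in the article's Appendix.
from typing import List, Tuple

def iteration_velocity(story_points: float, resource_capacity: float) -> float:
    """Velocity for one iteration: story points completed per unit of resource capacity."""
    return story_points / resource_capacity

def cumulative_velocity(history: List[Tuple[float, float]]) -> float:
    """Cumulative velocity over (capacity, story_points) pairs for past iterations."""
    total_capacity = sum(c for c, _ in history)
    total_points = sum(p for _, p in history)
    return total_points / total_capacity

def complexity_factor(sizes: List[int], risks: List[int]) -> float:
    """Sum of sizes divided by sum of story points (size * risk), per Figure 8."""
    points = [s * r for s, r in zip(sizes, risks)]
    return sum(sizes) / sum(points)

# Figure 7 data: (resource capacity, story points) for iterations i1..i6.
history = [(52.00, 24), (35.70, 26), (134.10, 71), (76.95, 41), (72.00, 45), (90.00, 68)]
assert round(iteration_velocity(24, 52.00), 2) == 0.46    # iteration i1
assert round(cumulative_velocity(history), 2) == 0.60     # overall velocity

# Figure 8 data: sizes and risks of the six sample features -> 28/65 = 0.43.
assert round(complexity_factor([8, 5, 3, 2, 7, 3], [2, 3, 1, 2, 3, 2]), 2) == 0.43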
The complexity factor also addresses Goal 1: Projected project completion, as well as indirectly addressing Goal 2: Level of quality of completed work.

Velocity rates may change over time from project to project for similar components. The complexity factor helps identify the contribution of the risk aspect with respect to the corresponding velocity trends. Figure 8 shows a sample calculation of the complexity factor.

Issues for Goal 1: Projected Completion

Contending with moving planned-production targets
Velocity shows the production of stories, which can be used to compare actual production against planned production. However, in an agile development process “planned” is not a static, unchanging benchmark against which one can compare production. As requirements change, features change, and, consequently, stories continue to be created and to evolve. To keep the “planned” picture up to date, one needs to track the stories in real time, adjusting them and the underlying requirements as they change, as well as estimating and forecasting them into the future based on past and current customer volatility. Such a level of story creation ahead of time, which may not be needed, goes against the principles of the theory of constraints. Keeping the release cycles short and maintaining a robust customer proxy role within the project team can help mitigate this concern to a large extent. Note, however, that within an iteration the production target is unchanging, so that the development team has a fixed target on which to focus. This is the heart of the bipartite process, where the development teams are provided with a fixed “waterfall-type” environment, while project management exercises agility in responding and adjusting to changes in customer requirements.

Interface between requirements (customer input) and product backlog
Agile processes need a requirements-management system, ironically more than plan-based processes do. This is especially true to link changes in requirements to maintenance of the product backlog list used to assign stories to teams in the iterations. This is also true in order to properly evaluate the impact of requirements changes and additions, so that the ongoing customer presence can be properly informed and give their proper input to the agile process. Within the agile development process this is performed by a customer proxy role that continually manages the requirements with the agile project team. Additionally, one of the primary functions of the customer proxy role is to validate the developed product from all the iterations for its usability from an end-user perspective. Developed product (stories) can be refactored to ensure end-user usability criteria are met.

Metrics for Goal 2: Product Quality
The metrics addressing the quality of the completed work include measuring the tests run for each build, including confidence levels over time for test coverage. Using Brooks Software’s Temporal reporting technologies against ClearQuest unified change management (UCM) repositories allows project teams and management to visualize these defect rates and the found-fixed ratio. Both of these are traditional measures and are calculated by the team. The software quality assurance team supporting the project collects the underlying data.

A quantity used to evaluate test quality is a weighted quality percentage (see Figure 9 and Appendix). This is an agile-related measure, since it takes into account the age of the build on which a test was last run, and is calculated by the team using test-case weightings assigned by QA and project management based on the highest risk to the project deliverables. The range of the quality percentage is from –100 percent to 100 percent, where –100 percent is complete failure of the full suite of tests on the latest build and 100 percent is complete success of the full suite of tests on the latest build.

Each test case in a build is assigned a level from 1 to 4, with level 1 being the highest, and each level is assigned a point value. The considerations for the weightings are based on the criticality of the test. For example, a level 1 test may cover a server component where a defect would have severe consequences, while a level 4 test may cover a client function that has a low frequency of use. The point values of failed test cases are subtracted from those of passed cases to obtain an adjusted point value. These values are divided by the total point value of the test cases to obtain the weighted quality percentage for
a given build. In the example of Figure 9, the weighted quality percentage is 75 percent.

As the build frequency increases and the full suite of tests grows, quick turnaround time and cost considerations may not allow the full suite of tests to be run for each build. One, therefore, must use test results from earlier builds, with a resulting loss in confidence as to whether those results are still appropriate. To take this “aging” into account the authors have established the “weighted quality percentage with confidence loss.” This is obtained by multiplying the adjusted point value for each test case by the timeliness value of the most recent build for which the test case was run. The timeliness value of a test case is 100 percent for the latest build and proportionally drops for subsequent builds (for example, if the test case was last run on the seventh build and the current build is the 10th, the timeliness value is 70 percent). The metric gives quality assurance and project management a gauge of the potential risk of a regression due to a code change not explicitly covered in the tests executed for the build. The weighted quality percentage with confidence loss is the sum of the timeliness-adjusted point values for all test cases divided by the total point value of all test cases. In the example of Figure 9, the weighted quality percentage with confidence loss is 52 percent.
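A minimal sketch of the weighted quality percentage, with and without confidence loss, as it can be read from the Figure 9 sample that follows. The level-to-point mapping (10, 8, 5, and 1 for levels 1 through 4) and the timeliness rule (last build tested divided by the current build number) are taken from that sample rather than from the Appendix, and the function name is illustrative.

# Illustrative sketch of the weighted quality percentage, with and without
# confidence loss, as read from the Figure 9 sample. The level->point values
# and the timeliness rule are taken from that sample, not from the Appendix.
from typing import List, Tuple

LEVEL_POINTS = {1: 10, 2: 8, 3: 5, 4: 1}   # point values used in the Figure 9 sample

def weighted_quality(tests: List[Tuple[int, str, int]], current_build: int) -> Tuple[float, float]:
    """tests: (level, result in {'Pass', 'Fail', 'Void'}, last build tested).
    Returns (weighted quality %, weighted quality % with confidence loss)."""
    max_score = raw_total = weighted_total = 0.0
    for level, result, last_build in tests:
        value = LEVEL_POINTS[level]
        max_score += value
        if result == "Pass":
            raw = value
        elif result == "Fail":
            raw = -value                            # failures subtract their point value
        else:                                       # 'Void': not counted
            raw = 0
        timeliness = last_build / current_build     # 100% on the latest build, aging otherwise
        raw_total += raw
        weighted_total += raw * timeliness
    return 100 * raw_total / max_score, 100 * weighted_total / max_score

# Three-test example in the spirit of Figure 9 (the current build is the 10th):
tests = [(1, "Pass", 7), (2, "Fail", 8), (3, "Void", 0)]
pct, pct_with_loss = weighted_quality(tests, current_build=10)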

Figure 9: Sample weighted quality percentage

  Story   Level   Value   Last test result   Raw score   Last build tested   Confidence weight   Weighted score
    A       1      10         Pass               10              7                  0.7                 7
    B       2       8         Pass                8              8                  0.8                 6.4
    C       3       5         Void                0              0                  0                   0
    D       4       1         Pass                1              8                  0.8                 0.8
    E       3       5         Pass                5              2                  0.2                 1
    F       2       8         Fail               -8              8                  0.8                -6.4
    G       3       5         Pass                5              9                  0.9                 4.5
    H       4       1         Pass                1              5                  0.5                 0.5
    I       3       5         Pass                5             10                  1                   5
    J       2       8         Pass                8              5                  0.5                 4
    K       1      10         Pass               10              7                  0.7                 7
    L       2       8         Pass                8              4                  0.4                 3.2
    M       3       5         Pass                5             10                  1                   5
    N       2       8         Pass                8              9                  0.9                 7.2
    O       4       1         Pass                1              7                  0.7                 0.7
    P       3       5         Fail               -5              7                  0.7                -3.5
    Q       2       8         Pass                8              5                  0.5                 4
    R       1      10         Pass               10              7                  0.7                 7
    S       3       5         Pass                5              9                  0.9                 4.5
    T       1      10         Pass               10              7                  0.7                 7

  Maximum possible score: 126    Weighted score: 95    Weighted score with confidence loss: 65
  Weighted quality %: 75.4%      Weighted quality % with confidence loss: 51.5%

Note that with the weighted quality percentage with confidence loss, the negative impact of a failed test on the metric value lessens as the build on which the test occurs recedes further into the past without a retest taking place. It is therefore important that QA retest a failed test as soon as possible, especially for level 1 tests.

Weighted quality percentages (with or without confidence loss) may be very low in the first builds, when tests tend to fail more often. By release time, however, they must be near 100 percent in order to comply with product release readiness policies, with explicit explanations of any workarounds agreed to by product stakeholders and management.

A metric used for product after shipment to the customer is voice of the customer (VOC). This is an agile-related measure, tracked by the marketing and customer support groups and maintained by project management. The VOC metrics are grades of 1 to 5, with 5 representing the best answer, based on survey results calculated on a spreadsheet (see Figure 10) for the suite of products and versions delivered to the particular customer. To increase the quantitative sophistication of this metric, since it represents a key customer-satisfaction value, and to prevent an individual low score from being ignored because of an overall high average, any response or any product with a 3 or less triggers a root cause follow-up with the customer. Questions used in VOC metrics include:
A Bipartite Empirically Oriented Metrics Process for Agile Software Development

   • “Q1: Please select the rating that best describes        automation and product documentation to only occur
     your opinion of the way your issue or question           when the requirements are stable.
     was resolved.                                               Automation of tests helps to reduce testing com-
   • Q2: How would you rate the speed that you                plexity by routinely carrying out straightforward tests
     received the solution to resolve your issue or           such as regression or duration tests, allowing one to
     answer to your question?                                 dedicate more resources to less routine tests.
                                                                 Establishing metrics for code and functional cover-
   • Q3: How satisfied were you with the profession-          age is a future need that will allow the understanding
     alism and courtesy of the support representative         of how much test coverage exists for each build.
     who assisted you with your issue?
   • Q4: “How satisfied were you with the overall support you received, based on your last interaction?”

Figure 10 Sample voice of the customer (VOC) metric (© 2007, ASQ)

    Project          Score
    ProductA V2.3    4
    ProductB V4.0    5
    ProductB V4.1    5
    ProductB V4.2    2
    ProductC V1.0    5
    ProductC V1.1    3
    Overall          4.0

Issues for Goal 2: Product Quality

Testing first of code at the unit level
A key issue is the timeliness and complexity of the testing regimen for the produced code representing the stories. To fulfill agility, an important aspect is the “test first” principle, that is, carrying out unit tests when or before writing the code. Unit tests are developed in conjunction with building the automated regression suite that is executed post-build for each iteration. Unit test harnesses are checked in upon code completion and integrated into the regression suite. This ensures the code is stable and functional for every build.

Impact of changing requirements on testing
The coordination between story, integration, system, and regression testing becomes more complex when requirements and stories are changing over time. The ADP process allows the customer proxy or requirements engineer to lock a requirement down for each iteration cycle by flagging whether the story delivered in the iteration is likely to need refactoring; that is, a story is initially flagged for refactoring, and the step of no longer flagging it for refactoring locks down the requirement. This allows an explicit decision for a requirements freeze to occur at a level of granularity that can be effectively managed. Story validation occurs for every change in the requirements by the customer proxy/requirements engineer to ensure the requirements meet their intended use. This allows for downstream activities such as building full test suites.

Metrics for Goal 3: Ability to Change Content or Priorities
The ADP process does not bind requirements to a release until the iteration planning activities. This allows flexibility to recast priorities for all of the iterations. The cost of this flexibility is that it requires dedicated effort and resources for ongoing product (requirements) engineering throughout the project life span. Each iteration has an iteration plan that defines the prioritized set of requirements (story points) to be addressed in each three-week cycle. Each cycle is planned based on current business conditions and is short enough that change is minimal. Requirements that do not get successfully implemented in the iteration are candidates for the next cycle, based on overall priorities.

Currently, initial release plan stories and actual completed stories are tracked. Unlike traditional processes, the planned stories vary over time as requirements change in keeping with the changing marketplace requirements. Therefore, once they are defined for implementation in a given iteration, actual stories are tracked. The difference between the initially planned story points and the actual story points to be developed is called the story point variance (see Appendix). This is an agile-related metric that is calculated by project management.
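As a minimal sketch (not the authors' tooling), the story point variance defined in the Appendix can be computed directly from an iteration's planned and actual story point totals; the example numbers below are hypothetical:

```python
def story_point_variance(planned_points: float, actual_points: float) -> float:
    """Story point variance = (PS - AS) / PS, as defined in the Appendix.

    A positive value means fewer story points were completed than initially planned.
    """
    if planned_points <= 0:
        raise ValueError("planned_points must be positive")
    return (planned_points - actual_points) / planned_points

# Hypothetical iteration: 80 story points initially planned, 68 actually completed.
print(f"{story_point_variance(80, 68):.0%}")  # 15%
```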



The purpose of the variance between the initially planned stories and the actual developed stories is to help determine a project management buffer for future releases. Knowledge of this historical variance helps in meeting the OTD metric discussed earlier. Within a project, ADP measures the actual stories completed for delivery to the customer.

Issues for Goal 3: Ability to Change Content or Priorities
Currently, the suite of project management metrics for goal 3 is not as rich as those for goals 1 and 2. Capturing requirements changes at the various levels would allow for better forecasting of how much the requirements can be expected to change over the project's life. Story point variance is currently being employed in this regard. Other metrics under consideration include:
   • Market changes: The market requirements are allocated to a project contract that defines the high-level targets for each release. By exercising version and configuration control on the project contract, one can monitor its evolution over time due to market changes and consequent changes in requirements.
   • Changes in the project backlog: Within iteration planning, the project backlog is the source of stories to be developed within each cycle. Within an iteration, the two-to-three-week development cycle allows the iteration to have its own planning process to allow for adaptation and change. Through change control on the project backlog over the various iterations, one can keep track of changes in the project backlog as the iterations are carried out. While market changes and story point variance focus on the inflow to the project backlog, this metric focuses on the outflow from the backlog. The project backlog and its administration are a core concept of the ADP and allow for a high level of agility and change control.

These metrics are agile-related and can be maintained by project management. They may require somewhat more of a workload than the current story-tracking metric from the project management team. Automated database systems can help in this regard, particularly a requirements-traceability tool and a good software configuration management (SCM) tool.

Metrics for Goal 4: Ability to Merge Iteration Cycles Into a Seamless Overall Release
For a commercial application supplier like Brooks Software, it is important that software releases produced with an agile process also include whole-product readiness: integration with other software and hardware components such as operating system and infrastructure configurations, documentation, installation, upgrade, and other elements that meet customer expectations and maximize the user experience.

Overall shipment readiness is determined by the engineering and launch readiness process, which ensures all product elements are available to ship through measurement of out-of-box quality (OBQ), including the measurement of post-release customer-reported defects (see Figure 11 and Appendix). This is a traditional measure and is calculated by project management. It measures overall product quality as experienced by the customer, going beyond code-defect rates and ensuring that whole-product packaging objectives are being met. Figure 11 shows an example OBQ graph. The bar graph shows that installation and configuration defects occur most frequently, and the cumulative line graph shows that the degree of imbalance is approximately 40/60 (that is, the top 40 percent of the categories accounts for 60 percent of the total defects).

Figure 11 Sample OBQ measure: a Pareto chart of post-release defects by category (installation, configuration, documentation, packaging, upgrading), with defect counts on the left axis and cumulative percent on the right axis. (© 2007, ASQ)
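The 40/60 observation for Figure 11 follows from a standard Pareto calculation: sort the defect categories by count and accumulate their percentages. The sketch below, using made-up counts, shows how such a cumulative line can be derived; it is an illustration, not the authors' OBQ reporting tool:

```python
from typing import Dict, List, Tuple

def obq_pareto(defects_by_category: Dict[str, int]) -> List[Tuple[str, int, float]]:
    """Return (category, count, cumulative percent) rows, sorted by descending count."""
    total = sum(defects_by_category.values())
    ordered = sorted(defects_by_category.items(), key=lambda kv: kv[1], reverse=True)
    cumulative = 0
    rows = []
    for category, count in ordered:
        cumulative += count
        rows.append((category, count, 100.0 * cumulative / total))
    return rows

# Hypothetical post-release defect counts by category.
sample = {"Installation": 13, "Configuration": 11, "Documentation": 7,
          "Packaging": 5, "Upgrading": 4}
for category, count, cum_pct in obq_pareto(sample):
    print(f"{category:15s} {count:3d}  {cum_pct:5.1f}%")
# Here the top two of five categories (40 percent) account for 60 percent of defects.
```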




Issues for Goal 4: Ability to Merge Iteration Cycles Into a Seamless Overall Release
This is an area for which it is difficult to establish quantitative metrics; it relies more on a peer and stakeholder review process to ensure product readiness. This is addressed in the areas of customer acceptance testing and installation and delivery sign-off. Earlier in the development cycle, other processes can be developed to address the various questions in this goal area:
   • Does each team know the inputs from and outputs to other product components that they need to address during their iteration? This can be addressed through peer review of the team's plan for story development and tracked by various input-output analyses among stories. Traceability and configuration management tools can be useful here.
   • Have all the interactions among the various components and the various teams been accounted for, both in development planning and in integration and system testing? This can be similarly addressed through peer reviews of the development plans and the system-test plans, and can be similarly tracked by input-output and traceability analyses.
   • Does system testing cover all the code, interactions, and functionality? Currently, the full set of stories is tested. This is addressed in system test planning. In addition, the authors are considering adding metrics that quantify coverage of code and functions. This will help detect defects relating to unplanned functionality in the code that is not fully captured in the stories and design specifications, as well as “dead code,” that is, hard-to-reach portions of the code that may not be covered unless the test effort specifically focuses on that part of the code.

The authors are also looking to increase the level of sophistication in this goal area, as well as to develop a number of specific measures.

Next Steps
Future papers will discuss advances in these areas:
   1. Additional bipartite team and project metrics for process improvement
   2. Further sophistication in the data collection system for the bipartite agile metrics
   3. Further implementation of process improvements incorporating the bipartite approach

Another area of longer-term interest for Brooks is scaling the agile process to support distributed teams. With advances in software engineering environments and with the advent of collaboration systems from suppliers like Microsoft, IBM, and Oracle, the benefits of the ADP agile process may be scaled to benefit dispersed global teams.

SUMMARY AND CONCLUSIONS
An agile process can be established that optimizes around short planning cycles and the theory of constraints. It is consistent with proven advances in other domains, such as product life-cycle management and manufacturing, in which the discipline of removing waste from all processes has proven necessary for sustaining growth. For commercial software companies where time-to-market, building it right, and cost pressures drive management investment decisions, agile has moved from being a fad to being a trend.

The authors have established a bipartite suite of metrics: some, such as defect rates, defects found and fixed, and weighted quality percentage, oriented to the development team; others, such as velocity, complexity factor, OTD, VOC, and story point variance, oriented to project management; and others, in particular story points, oriented to both the development team and project management. Many of these metrics, such as story points, story point variance, velocity, complexity factor, weighted quality percentage, and VOC, have been created to address the specifics of an agile development process such as ADP.

In particular, the velocity metric has been instrumental in providing a meaningful way to predict the availability timeframe of future product version releases. The velocity metric is used as a throughput rate, which is generally a measure of effort in the traditional sense. The differentiating factor of this metric is that velocity also captures a historical level of difficulty.
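As an illustration of this kind of projection (a sketch under assumed numbers, not the authors' planning tool, and treating velocity simply as story points completed per iteration), the remaining story points divided by historical velocity give an estimate of the iterations, and hence the calendar time, to a release:

```python
import math
from datetime import date, timedelta

def projected_release_date(remaining_story_points: float,
                           velocity_per_iteration: float,
                           iteration_length_weeks: int = 3,
                           start: date = date(2007, 1, 8)) -> date:
    """Estimate a release date by dividing remaining work by historical velocity."""
    iterations_needed = math.ceil(remaining_story_points / velocity_per_iteration)
    return start + timedelta(weeks=iterations_needed * iteration_length_weeks)

# Hypothetical backlog of 120 remaining story points and a historical velocity of
# 28 story points per three-week iteration: five iterations, or 15 weeks, remain.
print(projected_release_date(remaining_story_points=120, velocity_per_iteration=28))
```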


Hence, using velocity to determine the future release timeframe has proven more beneficial and accurate than traditional throughput measures.

In addition, the complexity factor metric has provided valuable information about the level of difficulty of a given product version. Using historical complexity factor information, an appropriate level of risk management planning can be put in place for a given product version. Paying too much or too little attention to various forms of risk can be a key ingredient in delayed software delivery and/or a lack of the needed level of quality in the released software product.

All of the metrics identified in this article have provided Brooks Software with invaluable insights and measurable tools for effectively managing both the project and the product portfolio that best meets the needs of its customers. Metrics and the associated historical information collected thus far have proved to be very helpful to both the project team and the management team. The ADP process and its associated metrics have evolved over a three-year period, and this journey will continue over time to adapt the process further, thereby advancing the discipline of software engineering in an agile environment.

REFERENCES
Basili, V., G. Caldiera, and H. D. Rombach. 1994. The goal question metric approach. In Encyclopedia of Software Engineering. New York: John Wiley and Sons.
Beck, K. 2000. Extreme Programming Explained: Embrace Change. Reading, Mass.: Addison Wesley Longman.
Blotner, J. 2002. Agile techniques to avoid firefighting at a start-up. OOPSLA 2002 Practitioners Reports, Seattle, November.
Boehm, B., and R. Turner. 2004. Balancing agility and discipline: A guide for the perplexed. Reading, Mass.: Addison-Wesley.
Cao, L., K. Mohan, P. Xu, and R. Balasubramaniam. 2004. How extreme does extreme programming have to be? Adapting XP practices to large-scale projects. In Proceedings of the 37th International Conference on System Sciences, Hawaii, January 5-8.
Cohn, M., and D. Ford. 2003. Introducing an agile process to an organization. IEEE Computer (June).
Cross, P. 2004. The uses and abuses of software metrics. Software Quality Professional 6, no. 2 (March).
Evans, B., and L. Scheinkopf. 2004. Faster, cheaper, better products: Tips and tricks from insiders for freeing up the product development logjam. Medical Device and Diagnostics Industry Journal (January).
Galin, D. 2003. Software quality metrics: From theory to implementation. Software Quality Professional 5, no. 3 (June).
Goldratt, E. M. 1992. The goal: A process of ongoing improvement. Great Barrington, Mass.: North River Press.
Heimann, D., N. Ashrafi, W. Koehler, J. P. Kuilboer, S. Mathiyalakan, and F. Waage. 2005. Metrics and databases for agile software development projects – A research proposition. International Conference on Agility, Helsinki, 27-28 July.
Hennessey, P. R. 2005. Can ISO, CMMI and agile co-exist? Boston Software Process Improvement Network, March. Available at: http://www.boston-spin.org/slides/050-Mar2005-talk.pdf.
Jalote, P., A. Palit, P. Kurien, and V. T. Peethamber. 2004. Timeboxing: A process model for iterative software development. Journal of Systems and Software 70: 117-127.
Manhart, P., and K. Schneider. 2004. Breaking the ice for agile development of embedded software: An industry experience report. In Proceedings of the 26th International Conference on Software Engineering (ICSE '04).
Paulk, M. 2001. Extreme programming from a CMM perspective. IEEE Software (November/December).
Poole, C., and J. W. Huisman. 2001. Using extreme programming in a maintenance environment. IEEE Software (November/December).
Westfall, L. 2006. A practical process to establish software metrics. Software Quality Professional 8, no. 2 (March).
Williams, L., W. Krebs, L. Layman, and A. Anton. 2004. Toward a framework for evaluating extreme programming. Technical report TR-2004-02. Raleigh, N.C.: North Carolina State University.
Wood, W., and W. L. Kleb. 2003. Exploring XP for scientific research. IEEE Software (May/June).

BIOGRAPHIES
David I. Heimann is a professor in the Management Science and Information Systems Department of the College of Management at the University of Massachusetts, Boston. He has a doctorate in computer science from Purdue University. He has held positions in government and industry, performing activities in reliability modeling, simulation, database management systems, probabilistic modeling, software analysis, and software process improvement. Among his publications are articles in several volumes of the “Proceedings of the Annual Reliability and Maintainability Symposium,” as well as a book-length article on system availability in “Advances in Computers.” He can be reached by e-mail at David.Heimann@umb.edu.

Peter R. Hennessey is director of central engineering programs for Brooks Software. He has more than 20 years of experience developing and delivering automation and business software to some of the world's most advanced manufacturing technology companies. At Brooks Software, Hennessey works across the engineering community to establish product and engineering processes, methods, and architectures to maintain Brooks' worldwide operational and product leadership. He has a master's degree in information systems from Northeastern University and a bachelor's degree in economics from the University of Massachusetts, Amherst.



Ashok R. Tripathi is a development manager for Brooks Software and a certified project management professional with more than 15 years of software product life-cycle experience. During his time at Brooks, Tripathi has led several product development teams to build real-time software enterprise application suites for several large advanced manufacturing customers worldwide. Tripathi is also a holder of a U.S. patent on efficient methods to collect data from semiconductor equipment. He is currently pursuing an MBA and has a bachelor's degree in industrial engineering from Arizona State University.
Appendix

$$\mathrm{OTD} = \frac{\text{actual available-to-ship date} - \text{planned commit date}}{\text{project duration}}$$

Metric: True or false: “Is OTD ≤ a given tolerance level?”

$$\text{Velocity} = \frac{\text{sum of actual story points}}{\text{sum of capacity}}, \qquad V = \frac{\sum_s S_s R_s}{\sum_i C_i}$$

where $s$ is a story, $S_s$ the size of story $s$, $R_s$ the risk for story $s$, and $C_i$ the capacity of resource $i$. Note that story points = story size × complexity risk.

The complexity factor is defined by the following:

$$C = \frac{\sum_s S_s}{\sum_s S_s R_s}$$

where $s$ is a story, $S_s$ the size of story $s$, and $R_s$ the risk for story $s$.

$$\text{Weighted quality percentage} = \frac{\sum (\text{test case value} \times \text{test-result weighting})}{\sum \text{test case values}}$$

$$\text{Weighted quality percentage with confidence loss} = \frac{\sum (\text{test case value} \times \text{test-result weighting} \times \text{build timeliness})}{\sum \text{test case values}}$$

$$\text{Story point variance} = \frac{PS - AS}{PS}$$

where $PS$ is the total initially planned story points and $AS$ is the total actual completed story points.

OBQ = percent of defect categories per month (installation, packaging, documentation, software, etc.)
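For readers who want to experiment with these definitions, the following sketch implements them directly from the formulas above. The data structures, field names, and example values are hypothetical, and dates are reduced to day counts for the OTD calculation:

```python
from dataclasses import dataclass
from typing import Iterable

@dataclass
class Story:
    size: float   # S_s, story size
    risk: float   # R_s, complexity risk

def story_points(stories: Iterable[Story]) -> float:
    """Sum of S_s * R_s over all stories (story points = size * complexity risk)."""
    return sum(s.size * s.risk for s in stories)

def velocity(stories: Iterable[Story], capacities: Iterable[float]) -> float:
    """V = sum of actual story points / sum of capacity."""
    return story_points(stories) / sum(capacities)

def complexity_factor(stories: Iterable[Story]) -> float:
    """C = sum of story sizes / sum of story points, per the Appendix definition."""
    stories = list(stories)
    return sum(s.size for s in stories) / story_points(stories)

def otd(actual_ship_day: int, planned_commit_day: int, project_duration_days: int) -> float:
    """OTD = (actual available-to-ship date - planned commit date) / project duration."""
    return (actual_ship_day - planned_commit_day) / project_duration_days

def weighted_quality_percentage(test_case_values: Iterable[float],
                                result_weightings: Iterable[float]) -> float:
    """Sum(test case value * test-result weighting) / Sum(test case values)."""
    values = list(test_case_values)
    weightings = list(result_weightings)
    return sum(v * w for v, w in zip(values, weightings)) / sum(values)

# Hypothetical iteration: three stories, two resources' capacity, three test cases.
stories = [Story(size=5, risk=1.2), Story(size=3, risk=1.0), Story(size=8, risk=1.5)]
print(velocity(stories, capacities=[10, 12]))   # ~0.95 story points per unit of capacity
print(complexity_factor(stories))               # ~0.76
print(otd(actual_ship_day=95, planned_commit_day=90, project_duration_days=90))  # ~0.056
print(weighted_quality_percentage([3, 5, 2], [1.0, 0.5, 1.0]))                   # 0.75
```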



                                                                                                                        www.asq.org 49

More Related Content

What's hot

International Journal of Business and Management Invention (IJBMI)
International Journal of Business and Management Invention (IJBMI)International Journal of Business and Management Invention (IJBMI)
International Journal of Business and Management Invention (IJBMI)inventionjournals
 
Archibald di filippo_comprehensive_plc_model_final
Archibald di filippo_comprehensive_plc_model_finalArchibald di filippo_comprehensive_plc_model_final
Archibald di filippo_comprehensive_plc_model_finalsansharmajs
 
Ch23-Software Engineering 9
Ch23-Software Engineering 9Ch23-Software Engineering 9
Ch23-Software Engineering 9Ian Sommerville
 
PROJECT PLANNINGMEASURES IN CMMI
PROJECT PLANNINGMEASURES IN CMMIPROJECT PLANNINGMEASURES IN CMMI
PROJECT PLANNINGMEASURES IN CMMIIJSEA
 
Software re engineering
Software re engineeringSoftware re engineering
Software re engineeringdeshpandeamrut
 
FISHBONE ANALYSIS ON WASTES IN SOFTWARE DEVELOPMENT USING THE LEAN I.T. PRINC...
FISHBONE ANALYSIS ON WASTES IN SOFTWARE DEVELOPMENT USING THE LEAN I.T. PRINC...FISHBONE ANALYSIS ON WASTES IN SOFTWARE DEVELOPMENT USING THE LEAN I.T. PRINC...
FISHBONE ANALYSIS ON WASTES IN SOFTWARE DEVELOPMENT USING THE LEAN I.T. PRINC...ecij
 
Se solns 9th edition
Se solns 9th editionSe solns 9th edition
Se solns 9th editionrajabaidyo
 
Architectured Centered Design
Architectured Centered DesignArchitectured Centered Design
Architectured Centered DesignGlen Alleman
 
Bryan.moser
Bryan.moserBryan.moser
Bryan.moserNASAPMC
 
D0365030036
D0365030036D0365030036
D0365030036theijes
 
An Approach of Improve Efficiencies through DevOps Adoption
An Approach of Improve Efficiencies through DevOps AdoptionAn Approach of Improve Efficiencies through DevOps Adoption
An Approach of Improve Efficiencies through DevOps AdoptionIRJET Journal
 
Reengineering framework for open source software using decision tree approach
Reengineering framework for open source software using decision tree approachReengineering framework for open source software using decision tree approach
Reengineering framework for open source software using decision tree approachIJECEIAES
 
STRATEGIES TO REDUCE REWORK IN SOFTWARE DEVELOPMENT ON AN ORGANISATION IN MAU...
STRATEGIES TO REDUCE REWORK IN SOFTWARE DEVELOPMENT ON AN ORGANISATION IN MAU...STRATEGIES TO REDUCE REWORK IN SOFTWARE DEVELOPMENT ON AN ORGANISATION IN MAU...
STRATEGIES TO REDUCE REWORK IN SOFTWARE DEVELOPMENT ON AN ORGANISATION IN MAU...ijseajournal
 
2004 Team Center Training Presentation On Project
2004 Team Center Training Presentation On Project2004 Team Center Training Presentation On Project
2004 Team Center Training Presentation On ProjectSam Ha
 

What's hot (19)

International Journal of Business and Management Invention (IJBMI)
International Journal of Business and Management Invention (IJBMI)International Journal of Business and Management Invention (IJBMI)
International Journal of Business and Management Invention (IJBMI)
 
Archibald di filippo_comprehensive_plc_model_final
Archibald di filippo_comprehensive_plc_model_finalArchibald di filippo_comprehensive_plc_model_final
Archibald di filippo_comprehensive_plc_model_final
 
Ch23-Software Engineering 9
Ch23-Software Engineering 9Ch23-Software Engineering 9
Ch23-Software Engineering 9
 
4213ijsea08
4213ijsea084213ijsea08
4213ijsea08
 
PROJECT PLANNINGMEASURES IN CMMI
PROJECT PLANNINGMEASURES IN CMMIPROJECT PLANNINGMEASURES IN CMMI
PROJECT PLANNINGMEASURES IN CMMI
 
Chapter 2
Chapter 2Chapter 2
Chapter 2
 
Software re engineering
Software re engineeringSoftware re engineering
Software re engineering
 
Road map to cmm
Road map to cmmRoad map to cmm
Road map to cmm
 
FISHBONE ANALYSIS ON WASTES IN SOFTWARE DEVELOPMENT USING THE LEAN I.T. PRINC...
FISHBONE ANALYSIS ON WASTES IN SOFTWARE DEVELOPMENT USING THE LEAN I.T. PRINC...FISHBONE ANALYSIS ON WASTES IN SOFTWARE DEVELOPMENT USING THE LEAN I.T. PRINC...
FISHBONE ANALYSIS ON WASTES IN SOFTWARE DEVELOPMENT USING THE LEAN I.T. PRINC...
 
Se solns 9th edition
Se solns 9th editionSe solns 9th edition
Se solns 9th edition
 
Architectured Centered Design
Architectured Centered DesignArchitectured Centered Design
Architectured Centered Design
 
Project plan
Project planProject plan
Project plan
 
Bryan.moser
Bryan.moserBryan.moser
Bryan.moser
 
Unified Process
Unified Process Unified Process
Unified Process
 
D0365030036
D0365030036D0365030036
D0365030036
 
An Approach of Improve Efficiencies through DevOps Adoption
An Approach of Improve Efficiencies through DevOps AdoptionAn Approach of Improve Efficiencies through DevOps Adoption
An Approach of Improve Efficiencies through DevOps Adoption
 
Reengineering framework for open source software using decision tree approach
Reengineering framework for open source software using decision tree approachReengineering framework for open source software using decision tree approach
Reengineering framework for open source software using decision tree approach
 
STRATEGIES TO REDUCE REWORK IN SOFTWARE DEVELOPMENT ON AN ORGANISATION IN MAU...
STRATEGIES TO REDUCE REWORK IN SOFTWARE DEVELOPMENT ON AN ORGANISATION IN MAU...STRATEGIES TO REDUCE REWORK IN SOFTWARE DEVELOPMENT ON AN ORGANISATION IN MAU...
STRATEGIES TO REDUCE REWORK IN SOFTWARE DEVELOPMENT ON AN ORGANISATION IN MAU...
 
2004 Team Center Training Presentation On Project
2004 Team Center Training Presentation On Project2004 Team Center Training Presentation On Project
2004 Team Center Training Presentation On Project
 

Similar to Agile Metrics article

Lightweight Processes: A Definition
Lightweight Processes: A DefinitionLightweight Processes: A Definition
Lightweight Processes: A DefinitionGlen Alleman
 
DEVOPS ADOPTION IN INFORMATION SYSTEMS PROJECTS; A SYSTEMATIC LITERATURE REVIEW
DEVOPS ADOPTION IN INFORMATION SYSTEMS PROJECTS; A SYSTEMATIC LITERATURE REVIEWDEVOPS ADOPTION IN INFORMATION SYSTEMS PROJECTS; A SYSTEMATIC LITERATURE REVIEW
DEVOPS ADOPTION IN INFORMATION SYSTEMS PROJECTS; A SYSTEMATIC LITERATURE REVIEWijseajournal
 
A Systematic Study On Agile Software Development Methodlogies And Practices
A Systematic Study On Agile Software Development Methodlogies And PracticesA Systematic Study On Agile Software Development Methodlogies And Practices
A Systematic Study On Agile Software Development Methodlogies And PracticesSean Flores
 
An Introduction to Agile Software Development
An Introduction to Agile Software DevelopmentAn Introduction to Agile Software Development
An Introduction to Agile Software DevelopmentSerena Software
 
Improvement opportunity in agile methodology and a survey on the adoption rat...
Improvement opportunity in agile methodology and a survey on the adoption rat...Improvement opportunity in agile methodology and a survey on the adoption rat...
Improvement opportunity in agile methodology and a survey on the adoption rat...Alexander Decker
 
Agile Project Management Methods of IT Projects
Agile Project Management Methods of IT ProjectsAgile Project Management Methods of IT Projects
Agile Project Management Methods of IT ProjectsGlen Alleman
 
A Survey Of Agile Development Methodologies
A Survey Of Agile Development MethodologiesA Survey Of Agile Development Methodologies
A Survey Of Agile Development MethodologiesAbdul Basit
 
Performance assessment and analysis of development and operations based autom...
Performance assessment and analysis of development and operations based autom...Performance assessment and analysis of development and operations based autom...
Performance assessment and analysis of development and operations based autom...IJECEIAES
 
Software Development Life Cycle: Traditional and Agile- A Comparative Study
Software Development Life Cycle: Traditional and Agile- A Comparative StudySoftware Development Life Cycle: Traditional and Agile- A Comparative Study
Software Development Life Cycle: Traditional and Agile- A Comparative Studyijsrd.com
 
Integrated Analysis of Traditional Requirements Engineering Process with Agil...
Integrated Analysis of Traditional Requirements Engineering Process with Agil...Integrated Analysis of Traditional Requirements Engineering Process with Agil...
Integrated Analysis of Traditional Requirements Engineering Process with Agil...zillesubhan
 
Evolvea Frameworkfor SelectingPrime Software DevelopmentProcess
Evolvea Frameworkfor SelectingPrime Software DevelopmentProcessEvolvea Frameworkfor SelectingPrime Software DevelopmentProcess
Evolvea Frameworkfor SelectingPrime Software DevelopmentProcessIJMER
 
A Comparative Analysis Of Various Methodologies Of Agile Project Management V...
A Comparative Analysis Of Various Methodologies Of Agile Project Management V...A Comparative Analysis Of Various Methodologies Of Agile Project Management V...
A Comparative Analysis Of Various Methodologies Of Agile Project Management V...Brittany Allen
 
Abb case study 1
Abb case study 1Abb case study 1
Abb case study 1apn18
 
Metrics in Agile: SCRUM, XP and Agile Methods
Metrics in Agile: SCRUM, XP and Agile MethodsMetrics in Agile: SCRUM, XP and Agile Methods
Metrics in Agile: SCRUM, XP and Agile MethodsMihir Thuse
 
Metrics in Agile: Scrum, XP and other agile methods
Metrics in Agile: Scrum, XP and other agile methodsMetrics in Agile: Scrum, XP and other agile methods
Metrics in Agile: Scrum, XP and other agile methodsMihir Thuse
 
STATISTICAL ANALYSIS OF METRICS FOR SOFTWARE QUALITY IMPROVEMENT
STATISTICAL ANALYSIS OF METRICS FOR SOFTWARE QUALITY IMPROVEMENT STATISTICAL ANALYSIS OF METRICS FOR SOFTWARE QUALITY IMPROVEMENT
STATISTICAL ANALYSIS OF METRICS FOR SOFTWARE QUALITY IMPROVEMENT ijseajournal
 
How Should We Estimate Agile Software Development Projects and What Data Do W...
How Should We Estimate Agile Software Development Projects and What Data Do W...How Should We Estimate Agile Software Development Projects and What Data Do W...
How Should We Estimate Agile Software Development Projects and What Data Do W...Glen Alleman
 

Similar to Agile Metrics article (20)

Lightweight Processes: A Definition
Lightweight Processes: A DefinitionLightweight Processes: A Definition
Lightweight Processes: A Definition
 
DEVOPS ADOPTION IN INFORMATION SYSTEMS PROJECTS; A SYSTEMATIC LITERATURE REVIEW
DEVOPS ADOPTION IN INFORMATION SYSTEMS PROJECTS; A SYSTEMATIC LITERATURE REVIEWDEVOPS ADOPTION IN INFORMATION SYSTEMS PROJECTS; A SYSTEMATIC LITERATURE REVIEW
DEVOPS ADOPTION IN INFORMATION SYSTEMS PROJECTS; A SYSTEMATIC LITERATURE REVIEW
 
A Systematic Study On Agile Software Development Methodlogies And Practices
A Systematic Study On Agile Software Development Methodlogies And PracticesA Systematic Study On Agile Software Development Methodlogies And Practices
A Systematic Study On Agile Software Development Methodlogies And Practices
 
An Introduction to Agile Software Development
An Introduction to Agile Software DevelopmentAn Introduction to Agile Software Development
An Introduction to Agile Software Development
 
Improvement opportunity in agile methodology and a survey on the adoption rat...
Improvement opportunity in agile methodology and a survey on the adoption rat...Improvement opportunity in agile methodology and a survey on the adoption rat...
Improvement opportunity in agile methodology and a survey on the adoption rat...
 
Agile Project Management Methods of IT Projects
Agile Project Management Methods of IT ProjectsAgile Project Management Methods of IT Projects
Agile Project Management Methods of IT Projects
 
A Survey Of Agile Development Methodologies
A Survey Of Agile Development MethodologiesA Survey Of Agile Development Methodologies
A Survey Of Agile Development Methodologies
 
Performance assessment and analysis of development and operations based autom...
Performance assessment and analysis of development and operations based autom...Performance assessment and analysis of development and operations based autom...
Performance assessment and analysis of development and operations based autom...
 
Software Development Life Cycle: Traditional and Agile- A Comparative Study
Software Development Life Cycle: Traditional and Agile- A Comparative StudySoftware Development Life Cycle: Traditional and Agile- A Comparative Study
Software Development Life Cycle: Traditional and Agile- A Comparative Study
 
Integrated Analysis of Traditional Requirements Engineering Process with Agil...
Integrated Analysis of Traditional Requirements Engineering Process with Agil...Integrated Analysis of Traditional Requirements Engineering Process with Agil...
Integrated Analysis of Traditional Requirements Engineering Process with Agil...
 
ETPM5
ETPM5ETPM5
ETPM5
 
Agile development
Agile developmentAgile development
Agile development
 
Software models
Software modelsSoftware models
Software models
 
Evolvea Frameworkfor SelectingPrime Software DevelopmentProcess
Evolvea Frameworkfor SelectingPrime Software DevelopmentProcessEvolvea Frameworkfor SelectingPrime Software DevelopmentProcess
Evolvea Frameworkfor SelectingPrime Software DevelopmentProcess
 
A Comparative Analysis Of Various Methodologies Of Agile Project Management V...
A Comparative Analysis Of Various Methodologies Of Agile Project Management V...A Comparative Analysis Of Various Methodologies Of Agile Project Management V...
A Comparative Analysis Of Various Methodologies Of Agile Project Management V...
 
Abb case study 1
Abb case study 1Abb case study 1
Abb case study 1
 
Metrics in Agile: SCRUM, XP and Agile Methods
Metrics in Agile: SCRUM, XP and Agile MethodsMetrics in Agile: SCRUM, XP and Agile Methods
Metrics in Agile: SCRUM, XP and Agile Methods
 
Metrics in Agile: Scrum, XP and other agile methods
Metrics in Agile: Scrum, XP and other agile methodsMetrics in Agile: Scrum, XP and other agile methods
Metrics in Agile: Scrum, XP and other agile methods
 
STATISTICAL ANALYSIS OF METRICS FOR SOFTWARE QUALITY IMPROVEMENT
STATISTICAL ANALYSIS OF METRICS FOR SOFTWARE QUALITY IMPROVEMENT STATISTICAL ANALYSIS OF METRICS FOR SOFTWARE QUALITY IMPROVEMENT
STATISTICAL ANALYSIS OF METRICS FOR SOFTWARE QUALITY IMPROVEMENT
 
How Should We Estimate Agile Software Development Projects and What Data Do W...
How Should We Estimate Agile Software Development Projects and What Data Do W...How Should We Estimate Agile Software Development Projects and What Data Do W...
How Should We Estimate Agile Software Development Projects and What Data Do W...
 

Recently uploaded

New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024BookNet Canada
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubKalema Edgar
 
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Alan Dix
 
Unlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power SystemsUnlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power SystemsPrecisely
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking MenDelhi Call girls
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreternaman860154
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr LapshynFwdays
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonetsnaman860154
 
Build your next Gen AI Breakthrough - April 2024
Build your next Gen AI Breakthrough - April 2024Build your next Gen AI Breakthrough - April 2024
Build your next Gen AI Breakthrough - April 2024Neo4j
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Scott Keck-Warren
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...shyamraj55
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Allon Mureinik
 
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024BookNet Canada
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxnull - The Open Security Community
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024BookNet Canada
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationSafe Software
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphNeo4j
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhisoniya singh
 

Recently uploaded (20)

New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding Club
 
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...Swan(sea) Song – personal research during my six years at Swansea ... and bey...
Swan(sea) Song – personal research during my six years at Swansea ... and bey...
 
Unlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power SystemsUnlocking the Potential of the Cloud for IBM Power Systems
Unlocking the Potential of the Cloud for IBM Power Systems
 
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
08448380779 Call Girls In Diplomatic Enclave Women Seeking Men
 
Presentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreterPresentation on how to chat with PDF using ChatGPT code interpreter
Presentation on how to chat with PDF using ChatGPT code interpreter
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
 
How to convert PDF to text with Nanonets
How to convert PDF to text with NanonetsHow to convert PDF to text with Nanonets
How to convert PDF to text with Nanonets
 
Build your next Gen AI Breakthrough - April 2024
Build your next Gen AI Breakthrough - April 2024Build your next Gen AI Breakthrough - April 2024
Build your next Gen AI Breakthrough - April 2024
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
Automating Business Process via MuleSoft Composer | Bangalore MuleSoft Meetup...
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)
 
Vulnerability_Management_GRC_by Sohang Sengupta.pptx
Vulnerability_Management_GRC_by Sohang Sengupta.pptxVulnerability_Management_GRC_by Sohang Sengupta.pptx
Vulnerability_Management_GRC_by Sohang Sengupta.pptx
 
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
 
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
#StandardsGoals for 2024: What’s new for BISAC - Tech Forum 2024
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food Manufacturing
 
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry InnovationBeyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
Beyond Boundaries: Leveraging No-Code Solutions for Industry Innovation
 
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge GraphSIEMENS: RAPUNZEL – A Tale About Knowledge Graph
SIEMENS: RAPUNZEL – A Tale About Knowledge Graph
 
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | DelhiFULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
FULL ENJOY 🔝 8264348440 🔝 Call Girls in Diplomatic Enclave | Delhi
 

Agile Metrics article

  • 1. SOFTWARE METRICS, MEASUREMENT, AND ANALYTICAL METHODS An agile process has two primary actors. One A Bipartite is the development team, which commits to develop software for a specific small set of requirements in a limited amount of time, and then develops and tests this software. The Empirically other is overall project management, which hands off the requirements and appropriate components to the teams and interfaces with stakeholders such as customers. In a similar Oriented vein, metrics for an agile process should have two parts: one for the development team and one for project management. Metrics This article outlines metrics for an agile process being used at Brooks Automation. The process uses lightweight metrics at the development team level, where the Process for focus is on developing working code, and more heavyweight metrics at the project management level, where the focus is on delivering a quality release to the customer. Agile Software The authors describe the process and carry out a goal-question-metric (GQM) analy- sis to determine the goals and questions Development for such a process. They then examine the specific metrics used in the process, identifying them as either team-related or project-management-related; compare their scope to that shown by the GQM analysis; and identify issues in the existing metrics. Approaches to rectify those issues are then DaviD Heimann described. University of Massachusetts, Boston Key words: agile, information systems, information technology, metrics, process improvement, project management, soft- Peter Hennessey anD asHok triPatHi ware, software development Brooks Software SQP References A Practical Process to Establish Software Metrics vol. 8, issue 2 INTRODUCTION Linda Westfall As a primary supplier of Real Time Enterprise software to the The Uses and Abuses of Software Metrics world’s foremost high-technology manufacturing environments, vol. 6, issue 2 Brooks Software, a division of Brooks Automation, has adopted Pat Cross agile principles and metrics into its continuous improvement Software Quality Metrics—From Theory to culture. Recent development efforts have shown a need to fre- Implementation quently reassess requirements for the intended software product vol. 5, issue 3 Daniel Galin and, consequently, replan the project, leading to significant product redesign and refactoring. Brooks recognized that an agile development paradigm would allow these significant chang- es to occur robustly. With such a process, the evolutions could be done deliberately, planned, and under organizational control. Brooks Software has had several successful commercial software product releases using the software product life cycle and agile development processes. In addition to increased market and customer responsiveness, improvements in 36 SQP VOL. 9, NO. 2/© 2007, ASQ
  • 2. A Bipartite Empirically Oriented Metrics Process for Agile Software Development on-time delivery (OTD) predictability were signifi- entific software environment at NASA-Langley; Poole cant. Defect reduction has also seen a 50 percent and Huisman (2001), an implementation in a main- year-over-year improvement. tenance environment at Iona; and Blotner (2002), an This article outlines a procedure for establishing implementation in a start-up environment at Sabrix) metrics and metrics processes for an empirical (agile) and describe how the bipartite metrics approach process, with emphasis on using this procedure to would work for each case. In this article the authors specify improved metrics. Since an empirical process take the approach in the aforementioned paper, extend implements changes based on observations taken and develop it further, and implement it in an ongoing as the process evolves, with agile processes such as environment, that is, the Brooks Agile Development extreme programming (XP) being a specific form of Process (ADP). empirical process, metrics assume an important role Cao et al. (2004) describe the need for overlaying in such processes. In the procedure the authors use a agile principles on top of XP practices as the develop- lightweight metrics process at the development team ment methodology to work effectively in a large-scale level, where the emphasis is on simple data collec- and complex software product development. tion with only brief analyses carried out, and a more In Software Quality Professional, Gatlin (2003) heavyweight process at the project management level, discusses a case where a metrics implementation ran where more substantial data collection and manage- into difficulties, examines the lessons learned, and ment is done and deeper and more sophisticated mentions common metrics and the weaknesses they analyses are carried out. can run into. Cross (2004) mentions various statisti- cal pitfalls to watch out for in using software metrics, LITERATURE REVIEW including confusions between discrete and continuous data and between accuracy and precision. Westfall Agile methods originated in the mid-1990s as a coun- (2006) outlines 12 steps in selecting, designing, and terbalance to the more formal, plan-driven methods implementing software metrics to ensure understand- that developed in the decade or two previously. Beck ing of their definition and purpose. (2000) covers these kinds of development, especially XP. Goldratt (1992) discusses lean methods and the- ory of constraints, which also began to be applied to METRICS IMPLEMENTATION software life cycles and methods. CONSIDERATIONS FOR Boehm and Turner (2004) present criteria to determine when agile development is preferable and AGILE PROCESSES when plan-driven development is preferable. Manhart and Schneider (2004) carried out a case study using goal-question-metric (GQM) to select certain agile Empirical Nature of the Agile principles (automated unit test, test first) to combine Process and Resulting Metrics with more traditional principles (documentation, cod- Plan-driven processes define the steps and proce- ing rules) to implement in an embedded-software dures in advance for the development work to follow. environment. Jalote et al. (2004) discuss the concept Requirements are also assumed to be defined in of timeboxing: the breaking up of a large project into advance and unchanging thereafter, except in special small “timeboxes” that can then be carried out in cases. 
There are fixed artifacts to produce, detailed separate steps (pipelined execution). documentation to produce and follow, and tests to be Williams et al. (2004) have developed a theoretical conducted after development is complete. framework to evaluate XP and have applied it to an The agile development process, on the other hand, IBM-based case study. Cohn and Ford (2003) address is more empirical. While some procedures are defined the organizational aspects of the transition from a in advance, many are formulated as development pro- plan-driven to an agile process. ceeds, and even for predefined procedures, plenty of Heimann et al. (2005) identify three agile software opportunity exists for flexibility and iteration. There development efforts described in the literature (Wood is frequent opportunity to reassess, redesign, refactor, and Kleb (2003); an implementation of agility in a sci- and replan. Plans for any stage of a project are vaguely www.asq.org 37
specified at first and gradually become more detailed as the stage approaches, becoming specific only when the stage is actually reached.

Metrics for an agile process should reflect this empirical nature. Comparison of work done against work planned must adjust to a moving rather than a fixed "work planned" target. Measures of testing accomplished must similarly reflect that the test cases planned will change rapidly. Defect counts must take into account the amount of change taking place and the resulting complexity in order to be properly evaluated. Resource allocation for a given stage must fit into wide estimation intervals at first, narrowing down only as the stage approaches actuality.

Bipartite Nature of Metrics—Within/Among Agile Development Teams

The core of the agile software development effort is a development team—a small group of people who operate for a limited time to carry out a specific body of work. The teams are coordinated through an overall project management function, which parcels out work to the teams and integrates their work products into an overall release for customer delivery.

This leads to a dichotomy of interests between the teams and project management. Much of the information needed by project management is generated during the detail-oriented product generation effort by the teams. The teams, however, do not want to capture this information if it adversely affects their ability to complete their assignments in a timely fashion.

What is necessary, therefore, is to provide an approach where a team can generate and capture the necessary metric data as automatically as possible in developing its work products while enabling project management to access the resulting data for its more complex needs.

Therefore, the best way to organize metrics is as a bipartite collection, with one set of metrics focused on the development teams and the other focused on project management. The team metrics should be brief, so as to minimize the reporting burden on the teams; focused, in order to provide the teams with readily usable information for their development tasks; and short-term and rapid responding, to coincide with the short life of the team's iterative cycle. Project management metrics should be comprehensive, so as to encompass the project's entire complexity; integrative, to help with amalgamating the results of the development teams into a unified product with many interfaces; and long-term, to enable tracking of many small development teams throughout the project's life.

Figure 1 shows the relationships among the teams, the coordination and management function, and the metrics and databases (note that the team-generated metrics connect into a central database from which project management develops its metrics). Figure 2 shows the development process as it moves from project management to the development team and back again.

[Figure 1: Distribution of metrics and databases in an agile software development project. Each development team feeds its team metrics into a central metrics database, from which project management derives its project management metrics and reports to the customer/stakeholder.]

[Figure 2: Conceptual view of the agile development process (ADP). Pending features flow from project and product management through story writing, release planning, story elaboration, and iteration planning into story development; construction (design and planning, code review and unit test, package) feeds system verification and end-user usability validation, with requirements and construction errors looping back until the feature is complete.]
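As a rough illustration of the bipartite data flow just described (Figure 1), the sketch below models lightweight team-generated records feeding a shared store that project management can query for heavier, release-level rollups. The record fields, class names, and the rollup shown are hypothetical illustrations for this sketch, not the actual Brooks schema or tooling.

```python
from dataclasses import dataclass, field
from typing import Dict, List

@dataclass
class TeamMetricRecord:
    """One lightweight, team-generated observation (hypothetical fields)."""
    team: str
    iteration: str
    stories_completed: int
    defects_found: int
    defects_fixed: int

@dataclass
class MetricsDatabase:
    """Central store from which project management derives its heavier metrics."""
    records: List[TeamMetricRecord] = field(default_factory=list)

    def capture(self, record: TeamMetricRecord) -> None:
        # Teams push records as a side effect of normal work; no extra analysis here.
        self.records.append(record)

    def release_rollup(self) -> Dict[str, int]:
        # Project-management view: aggregate across all teams and iterations.
        return {
            "stories_completed": sum(r.stories_completed for r in self.records),
            "defects_open": sum(r.defects_found - r.defects_fixed for r in self.records),
        }

db = MetricsDatabase()
db.capture(TeamMetricRecord("Team A", "i1", stories_completed=4, defects_found=6, defects_fixed=5))
db.capture(TeamMetricRecord("Team B", "i1", stories_completed=3, defects_found=2, defects_fixed=2))
print(db.release_rollup())   # {'stories_completed': 7, 'defects_open': 1}
```

The split is visible in the sketch: the team-side capture call does no analysis, while all aggregation lives on the project management side.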
THE BROOKS AGILE DEVELOPMENT PROCESS

The ADP leverages the theory of constraints concept, which advocates keeping each development cycle short. If one considers the set of requirements committed and under construction to be work-in-progress (WIP), the idea behind the theory of constraints is to keep the WIP backlog low, working on the highest-priority requirements first. The development activity continues on the highest-priority requirement until the product engineering team, which plays the role of a customer proxy, satisfactorily validates it from the end-user perspective. The aim is to complete work as quickly as possible.

The goals of the product release, captured in a project contract, are decomposed into manageable iterations through elaboration. Requirements and features are specified in terms of stories, which encapsulate a functionality definition in a form that can be easily communicated, validated with the customer and related stakeholders, and implemented in no more than two person-weeks of development effort. An example story is provided in Figure 3.

Figure 3   Snippet of a sample story

High-Level Story Definition: Users will double click the Application Launcher icon on their desktop, or access it through the Windows Start menu. They will be presented with the standard login dialog for the aforementioned Application. If the user is able to log in successfully, the Application Launcher application will open and present the list of subapplications that are available for that user. The user may click and select any of the subapplications from the list; this action should invoke the subapplication. The user will NOT be required to log in again as long as the Application Launcher is still running. If the user closes the Application Launcher, they will be required to log in again the next time they open the Application Launcher. The user name and password will NOT be stored locally on the client machine.

A product release consists of multiple features, and each feature is associated with a priority (see Figure 4). For example, in the case of F2P2, F2 represents feature no. 2 and P2 is the priority level of feature no. 2. Each feature (FxPx) is broken down into one or more stories such that each story can be completed within a two-week construction cycle.

[Figure 4: High-level view of Release—Features—Stories. A release is made up of prioritized features F1P1, F2P2, ..., F5P5, and each feature is broken into stories S1, S2, ... (F1: feature no. 1, P1: feature priority no. 1, S1: story no. 1).]

Stories represent the fundamental tracking unit for the WIP. By limiting construction to a two-week construction cycle within an overall six-week development cycle, the stories are "batched" for each iteration cycle and can be effectively managed. The iterations are organized as shown in Figure 5. Each six-week development cycle is completed in two iteration cycles, with the first cycle being used for planning and construction (development) and the second being used for development validation, verification (testing), and advance planning for the next iteration.

[Figure 5: Three-week ADP iterative cycle. A day-by-day, three-week calendar of planning, construction, verification, and validation activities running in parallel for the previous, current, and next iterations.]

On average, seven to eight iterations are sufficient to provide the overall functionality for a release, with two to three of these iterations reserved for back-end bug fixes, final testing, and packaging. All the iterations are three calendar weeks long, with activities for previous and current iterations taking place in parallel. Hence, a typical release is about 24 calendar weeks long. Once feature freeze is reached, iteration cycles drop down to weekly and/or daily product builds, and this is followed through the final testing and packaging phases of the development process.
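As a concrete, purely illustrative sketch of the Release—Feature—Story decomposition just described, the classes below capture a prioritized feature such as F2P2 being broken into stories small enough for one two-week construction cycle; the class names, field names, and the ten-workday limit are assumptions of this sketch, not Brooks artifacts.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Story:
    name: str
    effort_days: float   # estimated development effort

@dataclass
class Feature:
    number: int          # the "F2" in F2P2
    priority: int        # the "P2" in F2P2
    stories: List[Story] = field(default_factory=list)

    def add_story(self, story: Story, construction_days: int = 10) -> None:
        # Each story must be completable within one two-week construction cycle.
        if story.effort_days > construction_days:
            raise ValueError(f"{story.name} exceeds the construction cycle; split it further")
        self.stories.append(story)

@dataclass
class Release:
    features: List[Feature] = field(default_factory=list)

    def backlog_by_priority(self) -> List[Feature]:
        # Keep WIP low by tackling the highest-priority features first.
        return sorted(self.features, key=lambda f: f.priority)

f2p2 = Feature(number=2, priority=2)
f2p2.add_story(Story("Application Launcher login", effort_days=8))
```

Sorting the backlog by feature priority mirrors the theory-of-constraints rule of working the highest-priority requirements first.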
Story elaboration and iteration allow the flexibility to schedule specific stories for each iteration cycle based on business priority, technical risk, and the need to include learning cycles and refactoring turns. Figure 2 summarizes this process.

ADP METRICS GOALS AND QUESTIONS

In keeping with the GQM paradigm (Basili, Caldiera, and Rombach 1994), the authors identify four goals for the metrics process, with a number of questions and consequent metrics for each goal.

The ADP process is guided by the following metrics goals: 1) projected project completion against targets; 2) quality of completed work; 3) ability to change content or priorities to address current business needs; and 4) ability to merge iteration-cycle products into a seamless overall release.

Each of these goals gives rise to a number of metrics questions:

1. Project-completion questions
• What stories has the team completed?
• How is the team doing compared to its task commitments?
• Is the team on schedule relative to the current release backlog?
• How fast is a given team, or the project as a whole, completing story development?

2. Level-of-quality questions
• How well does the team product from the current iteration fulfill the current requirements?
• How well does the overall product fulfill the current requirements?
• For a given team, how well is its developed product passing the quality assurance (QA) tests specific to its product?
• Overall, how well is the completed work passing the overall integration/system tests?
• Does the developed product possess systemic integrity and fitness for customer demonstration and eventual delivery?

3. Ability-to-change questions
• How have requirements changed thus far over the release lifetime?
• How are the current requirements matching up to customer (or market) expectations?

4. Ability-to-integrate questions
• Does each team know the inputs from and outputs to other product components that they need to address during their iteration?
• Have all the interactions among the various components and the various teams been accounted for, both in development planning and in integration and system testing?
• Does system testing cover all the code, interactions, and functionality?

ADP METRICS

Following are the important metrics that were selected to measure and track within the ADP process. While each can address multiple goals, the metrics are shown with the primary goals they address.

For each metric the authors identify whether the metric is calculated by the development team (such as the defects-found-fixed or story points metrics) or by project management (such as the on-time delivery or velocity metrics). The authors also identify whether the metric is a traditional metric or one developed to fit within an agile process such as ADP.

The use and consequent benefits of metrics within ADP are continuing to evolve. The authors plan to report additional metrics-related information in future publications as the metrics evolve and more insights can be drawn from them.

Metrics for Goal 1: Projected Completion

To answer the question, "Are we on schedule?" an estimate of effort, anticipated productivity, and workload must be made and a schedule created. To carry out the assessment of productivity and workload necessary for this, one must measure the "mass" of a story. Brooks measures this by the "story point." This is an agile-related measure and is calculated by the software development team in conjunction with the project management team. A story has a certain number of story points, obtained by multiplying a measure of the story's size by a measure of its risk. As shown in Figure 6, the size of a story is given by a category of the effort needed for its development. A size can vary from a value of 1 through 10. The risk of a story is given by a category of its difficulty due to such factors as the novelty of the story or the lack of experience of its resources (a resource is a talent in such areas as development, documentation, verification, and so on). A risk can vary from a value of 1 to 3. Therefore, a story can have a value between 1 and 30 story points. Stories for all the iterations are tracked to provide clear visibility of the current status to both the development and project management teams.

For the overall release at the project management level, a key delivery metric is the organization's ability to meet OTD to customer expectations. This is a traditional measure and is calculated by project management. This measurement tracks actual delivery against committed delivery within a given tolerance, usually 5 percent, with durations being measured in calendar workdays (see Appendix). During the design and planning phase the engineering team establishes the transition of a target date to a commit date. For ADP projects, past velocity rates and the complexity level of the current project are used to extrapolate project completion dates.

Figure 6   Guidelines for determining story size and risk

ADP feature analysis: size and risk estimation table

Size legend (effort needed for development):
  1    1-2 days effort
  2    3-5 days effort
  3    6-10 days effort
  4    11-15 days effort
  5    16-20 days effort
  6    21-25 days effort
  7    26-30 days effort
  8    31-35 days effort
  9    36-40 days effort
  10   41-45 days effort

Risk legend:
  1    Low—No unknowns, have prior experience
  2    Medium—Well understood, limited unknowns
  3    High—No experience, high unknowns
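As a minimal sketch of this arithmetic, the snippet below maps an effort estimate onto the Figure 6 size legend and multiplies by the risk category to obtain story points; the function and constant names are illustrative only.

```python
# Size buckets from Figure 6: (upper bound in effort days, size category).
SIZE_LEGEND = [(2, 1), (5, 2), (10, 3), (15, 4), (20, 5),
               (25, 6), (30, 7), (35, 8), (40, 9), (45, 10)]

RISK_LEGEND = {"low": 1, "medium": 2, "high": 3}   # Figure 6 risk categories

def story_size(effort_days: float) -> int:
    """Map an effort estimate onto the 1-10 size scale of Figure 6."""
    for upper_bound, size in SIZE_LEGEND:
        if effort_days <= upper_bound:
            return size
    raise ValueError("Effort beyond 45 days: split the story before estimating it")

def story_points(effort_days: float, risk: str) -> int:
    """Story points = size (1-10) x risk (1-3), giving a value between 1 and 30."""
    return story_size(effort_days) * RISK_LEGEND[risk]

print(story_points(8, "medium"))   # size 3 x risk 2 = 6 story points
```

For example, an eight-day, medium-risk story falls into size category 3 and carries 6 story points.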
To continually forecast and refine estimates, a key productivity metric of the ADP is the velocity rate. This is an agile measure and is calculated by project management. The velocity is the rate at which story points are being completed per assigned capacity of resource (see Appendix).

Velocity measures the effectiveness of the project team in delivering functional units, as defined by story points. Measuring the actual number of story points that can be produced by a project team of a given level of resource capacity, that is, velocity, enables management to assess team effectiveness. Hence, it ensures a balanced team in the project roles and allows monitoring of the process using theory of constraints principles for optimization, as well as providing the basis for forward estimation using historical data. The unit for velocity is "story points generated per resource capacity." For example, if four developers generate 24 story points in three weeks, the velocity is two story points per labor-week. Note that the resource capacity can vary from one iteration to another.

Figure 7 shows a sample velocity calculation over a six-iteration period, with both immediate and cumulative velocities shown in the accompanying graph.

Figure 7   Sample ADP velocity monitor

ADP Velocity Monitor
Iteration   Resource capacity (C)   Story points (SS * RS)   Velocity ([SS * RS]/C)
i1          52.00                   24                       0.46
i2          35.70                   26                       0.73
i3          134.10                  71                       0.53
i4          76.95                   41                       0.53
i5          72.00                   45                       0.63
i6          90.00                   68                       0.76
Overall velocity                                             0.60

[The accompanying graph plots the iteration velocity against the cumulative velocity, which runs 0.46, 0.57, 0.55, 0.54, 0.56, and 0.60 across iterations i1 through i6.]

A constituent element of velocity is the complexity factor. This is an agile-related measure and is calculated by project management. The complexity factor is a measurement of risk (see Figure 8 and Appendix).

The complexity factor shows the contribution of the risk within the total sizing of story points. It is the reciprocal of the size-weighted average risk, and is used to compare the contribution of risk between modules of similar story-point values. Within modules of similar story points, those with low complexity factors have higher risks and lower sizes, while those with high complexity factors have lower risks and higher sizes, so that those with low complexity factors tend to be deserving of special scrutiny.

Figure 8   Sample feature analysis, story points, and complexity factor calculation

Product Version—Feature Analysis (snippet); for each feature, SS is the size, RS the risk, and SS * RS the story points:
• Project Name_1 (priority 1): Upgrade the product to utilize the Microsoft .Net 2.0 framework, which is recommended by Microsoft and contains many critical bug fixes and enhancements; .Net 2.0 migration is required to use Visual Studio 2005. Size 8, risk 2, points 16.
• Project Name_2 (priority 2): Define the details to support versioning for the business logic component; the detailed design will be developed based on the outcome of the prototype. Versioning includes document states such as active, pending, archived, and off-line. Size 5, risk 3, points 15.
• Project Name_3 (priority 3): Support dynamic Web service configuration for external interfaces; adopt current Web service standards. Size 3, risk 1, points 3.
• Project Name_4 (priority 4): Add visualization to view various different types of activities in a Pareto-type chart. Size 2, risk 2, points 4.
• Project Name_5 (priority 5): Incorporate the system user authentication and security model into the login application. Size 7, risk 3, points 21.
• Project Name_6 (priority 6): Define the requirements to support multiple user roles. Size 3, risk 2, points 6.
Totals: size 28, risk 13, points 65. Complexity factor = sum of size / sum of points = 28/65 = 0.43.
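To make the arithmetic behind Figures 7 and 8 concrete, the sketch below recomputes the per-iteration and overall velocities from (story points, capacity) pairs and the complexity factor from (size, risk) pairs. It is an illustrative reconstruction, not the spreadsheet Brooks uses, and rounding may differ slightly from the published figures.

```python
# (story points completed, resource capacity) per iteration, taken from Figure 7.
ITERATIONS = [(24, 52.00), (26, 35.70), (71, 134.10),
              (41, 76.95), (45, 72.00), (68, 90.00)]

def velocity(points: float, capacity: float) -> float:
    """Velocity = story points completed per unit of assigned resource capacity."""
    return points / capacity

for n, (points, capacity) in enumerate(ITERATIONS, start=1):
    print(f"i{n}: velocity = {velocity(points, capacity):.2f}")

total_points = sum(p for p, _ in ITERATIONS)
total_capacity = sum(c for _, c in ITERATIONS)
print(f"overall velocity = {total_points / total_capacity:.2f}")   # 0.60, as in Figure 7

# (size, risk) per feature, taken from Figure 8; story points for a feature are size * risk.
FEATURES = [(8, 2), (5, 3), (3, 1), (2, 2), (7, 3), (3, 2)]

def complexity_factor(sized_features) -> float:
    """Complexity factor = sum of sizes / sum of story points (see the Appendix)."""
    return sum(s for s, _ in sized_features) / sum(s * r for s, r in sized_features)

print(f"complexity factor = {complexity_factor(FEATURES):.2f}")   # 0.43, as in Figure 8
```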
The complexity factor also addresses Goal 1: Projected project completion, as well as indirectly addressing Goal 2: Level of quality of completed work.

Velocity rates may change over time from project to project for similar components. The complexity factor helps identify the contribution of the risk aspect with respect to the corresponding velocity trends. Figure 8 shows a sample calculation of the complexity factor.

Issues for Goal 1: Projected Completion

Contending with moving planned-production targets
Velocity shows the production of stories, which can be used to compare actual production against planned production. However, in an agile development process "planned" is not a static, unchanging benchmark against which one can compare production. As requirements change, features change and, consequently, stories continue to be created and to evolve. To keep the "planned" picture up to date, one needs to track the stories in real time, adjusting them and the underlying requirements as they change, as well as estimating and forecasting them into the future based on past and current customer volatility. Such a level of story creation ahead of time, which may not be needed, goes against the principles of the theory of constraints. Keeping the release cycles short and maintaining a robust customer proxy role within the project team can help mitigate this concern to a large extent. Note, however, that within an iteration the production target is unchanging, so that the development team has a fixed target on which to focus. This is the heart of the bipartite process, where the development teams are provided with a fixed "waterfall-type" environment, while project management exercises agility in responding and adjusting to changes in customer requirements.

Interface between requirements (customer input) and product backlog
Agile processes need a requirements-management system, ironically more than plan-based processes do. This is especially true in order to link changes in requirements to maintenance of the product backlog list used to assign stories to teams in the iterations. It is also true in order to properly evaluate the impact of requirements changes and additions, so that the ongoing customer presence can be properly informed and give its proper input to the agile process. Within the agile development process this is performed by a customer proxy role that continually manages the requirements with the agile project team. Additionally, one of the primary functions of the customer proxy role is to validate the developed product from all the iterations for its usability from an end-user perspective. The developed product (stories) can be refactored to ensure that end-user usability criteria are met.

Metrics for Goal 2: Product Quality

To address the quality of the completed work, the metrics include measuring the tests run for each build, including confidence levels over time for test coverage. Using Brooks Software's Temporal reporting technologies against ClearQuest unified change management (UCM) repositories allows project teams and management to visualize defect rates and the found-fixed ratio. Both of these are traditional measures and are calculated by the team. The software quality assurance team supporting the project collects the underlying data.

A quantity used to evaluate test quality is the weighted quality percentage (see Figure 9 and Appendix). This is an agile-related measure, since it takes into account the age of the build on which a test was last run, and it is calculated by the team using test-case weightings assigned by QA and project management based on the highest risk to the project deliverables. The range of the quality percentage is from –100 percent to 100 percent, where –100 percent is complete failure of the full suite of tests on the latest build and 100 percent is complete success of the full suite of tests on the latest build.
Each test case in a build is assigned a level from 1 to 4, with level 1 being the highest, and each level is assigned a point value. The considerations for the weightings are based on the criticality of the test. For example, a level 1 test may cover a server component where a defect would have severe consequences, while a level 4 test may cover a client function that has a low frequency of use. The point values of failed test cases are subtracted from those of passed cases to obtain an adjusted point value. These values are divided by the total point value of the test cases to obtain the weighted quality percentage for a given build. In the example of Figure 9, the weighted quality percentage is 75 percent.

As the build frequency increases and the full suite of tests grows, quick turnaround time and cost considerations may not allow the full suite of tests to be run for each build. One, therefore, must use test results from earlier builds, with a resulting loss in confidence as to whether those results are still appropriate. To take this "aging" into account the authors have established the "weighted quality percentage with confidence loss." This is obtained by multiplying the adjusted point value for each test case by the timeliness value of the most recent build for which the test case was run. The timeliness value of a test case is 100 percent for the latest build, and it drops proportionally for subsequent builds (for example, if the test case was last run on the seventh build and the current build is the 10th, the timeliness value is 70 percent). The metric gives QA and project management a gauge of the potential risk of a regression due to a code change not explicitly covered in the tests executed for the build. The weighted quality percentage with confidence loss is the sum of the timeliness-adjusted point values for all test cases divided by the total point value of all test cases. In the example of Figure 9, the weighted quality percentage with confidence loss is 52 percent.

Note that with the weighted quality percentage with confidence loss, the negative impact of a failed test on the metric value lessens as the build on which the test occurred recedes further into the past without a retest taking place. It is therefore important that QA retest a failed test as soon as possible, especially for level 1 tests.

Weighted quality percentages (with or without confidence loss) may be very low in the first builds, when tests tend to fail more often. By release time, however, they must be near 100 percent in order to comply with product release readiness policies, with explicit explanations of any workarounds agreed to by product stakeholders and management.

Figure 9   Sample weighted quality percentage

Story   Level   Value   Last test result   Raw score   Last build tested   Confidence weight   Weighted score
A       1       10      Pass                10          7                   0.7                  7
B       2        8      Pass                 8          8                   0.8                  6.4
C       3        5      Void                 0          0                   0                    0
D       4        1      Pass                 1          8                   0.8                  0.8
E       3        5      Pass                 5          2                   0.2                  1
F       2        8      Fail                -8          8                   0.8                 -6.4
G       3        5      Pass                 5          9                   0.9                  4.5
H       4        1      Pass                 1          5                   0.5                  0.5
I       3        5      Pass                 5          10                  1                    5
J       2        8      Pass                 8          5                   0.5                  4
K       1       10      Pass                10          7                   0.7                  7
L       2        8      Pass                 8          4                   0.4                  3.2
M       3        5      Pass                 5          10                  1                    5
N       2        8      Pass                 8          9                   0.9                  7.2
O       4        1      Pass                 1          7                   0.7                  0.7
P       3        5      Fail                -5          7                   0.7                 -3.5
Q       2        8      Pass                 8          5                   0.5                  4
R       1       10      Pass                10          7                   0.7                  7
S       3        5      Pass                 5          9                   0.9                  4.5
T       1       10      Pass                10          7                   0.7                  7

Maximum possible score: 126
Weighted score: 95 (weighted quality %: 75.4%)
Weighted score with confidence loss: 65 (weighted quality % with confidence loss: 51.5%)
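The Figure 9 arithmetic can be reproduced with a short script. The sketch below hard-codes the level-to-point mapping shown in the figure (level 1 = 10, 2 = 8, 3 = 5, 4 = 1 points) and assumes the current build is build 10, so that the confidence weight is simply last-build/current-build; it is illustrative rather than the team's actual tooling.

```python
LEVEL_POINTS = {1: 10, 2: 8, 3: 5, 4: 1}   # test-case point values by level, as in Figure 9

# (level, result, build on which the test last ran) for stories A through T in Figure 9.
TEST_CASES = [
    (1, "pass", 7), (2, "pass", 8), (3, "void", 0), (4, "pass", 8), (3, "pass", 2),
    (2, "fail", 8), (3, "pass", 9), (4, "pass", 5), (3, "pass", 10), (2, "pass", 5),
    (1, "pass", 7), (2, "pass", 4), (3, "pass", 10), (2, "pass", 9), (4, "pass", 7),
    (3, "fail", 7), (2, "pass", 5), (1, "pass", 7), (3, "pass", 9), (1, "pass", 7),
]

RESULT_SIGN = {"pass": 1, "fail": -1, "void": 0}

def weighted_quality(cases, current_build, with_confidence_loss=False):
    """Passed points minus failed points over total points, optionally aged by build."""
    total_points = sum(LEVEL_POINTS[level] for level, _, _ in cases)
    score = 0.0
    for level, result, last_build in cases:
        adjusted = RESULT_SIGN[result] * LEVEL_POINTS[level]
        timeliness = last_build / current_build if with_confidence_loss else 1.0
        score += adjusted * timeliness
    return 100.0 * score / total_points

print(f"{weighted_quality(TEST_CASES, current_build=10):.1f}%")                             # 75.4%
print(f"{weighted_quality(TEST_CASES, current_build=10, with_confidence_loss=True):.1f}%")  # 51.5%
```

Both printed values match the percentages reported in Figure 9.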
A metric used for the product after shipment to the customer is voice of the customer (VOC). This is an agile-related measure, tracked by the marketing and customer support groups and maintained by project management. The VOC metrics are grades of 1 to 5, with 5 representing the best answer, based on survey results calculated on a spreadsheet (see Figure 10) for the suite of products and versions delivered to the particular customer. To increase the quantitative sophistication of this metric, since it represents a key customer-satisfaction value, and to prevent an individual low score from being ignored because of an overall high average, any response or any product with a score of 3 or less triggers a root cause follow-up with the customer.

Questions used in the VOC metrics include:
• Q1: Please select the rating that best describes your opinion of the way your issue or question was resolved.
• Q2: How would you rate the speed with which you received the solution to resolve your issue or the answer to your question?
• Q3: How satisfied were you with the professionalism and courtesy of the support representative who assisted you with your issue?
• Q4: How satisfied were you with the overall support you received based on your last interaction?

Figure 10   Sample voice of the customer (VOC) metric

Voice of the customer (VOC)
Project         Score
ProductA V2.3   4
ProductB V4.0   5
ProductB V4.1   5
ProductB V4.2   2
ProductC V1.0   5
ProductC V1.1   3
Overall         4.0

Issues for Goal 2: Product Quality

Testing first of code at the unit level
A key issue is the timeliness and complexity of the testing regimen for the produced code representing the stories. To fulfill agility, an important aspect is the "test first" principle, that is, carrying out unit tests when or before writing the code. Unit tests are developed in conjunction with building the automated regression suite that is executed post-build for each iteration. Unit test harnesses are checked in upon code completion and integrated into the regression suite. This ensures the code is stable and functional for every build.

Impact of changing requirements on testing
The coordination between story, integration, system, and regression testing becomes more complex when requirements and stories are changing over time. The ADP process allows the customer proxy or requirements engineer to lock a requirement down for each iteration cycle by flagging whether the story delivered in the iteration is likely to need refactoring; that is, a story is initially flagged for refactoring, and the step of no longer flagging it for refactoring locks down the requirement. This allows an explicit decision for requirements freeze to occur at a level of granularity that can be effectively managed. Story validation occurs for every change in the requirements by the customer proxy/requirements engineer to ensure the requirements meet their intended use. This allows downstream activities such as building full test automation and product documentation to occur only when the requirements are stable.

Automation of tests helps to reduce testing complexity by routinely carrying out straightforward tests such as regression or duration tests, allowing one to dedicate more resources to less routine tests.

Establishing metrics for code and functional coverage is a future need that will allow an understanding of how much test coverage exists for each build.

Metrics for Goal 3: Ability to Change Content or Priorities

The ADP process does not bind requirements to a release until the iteration planning activities. This allows the flexibility to recast priorities for all the iterations. The cost of this flexibility is that it requires dedicated effort and resources for ongoing product (requirements) engineering throughout the project life span. Each iteration has an iteration plan that defines the prioritized set of requirements (story points) to be addressed in each three-week cycle. Each cycle is planned based on current business conditions and is short enough that change is minimal. Requirements that do not get successfully implemented in the iteration are candidates for the next cycle based on overall priorities.

Currently, the initial release plan stories and the actual completed stories are tracked. Unlike in traditional processes, the planned stories vary over time as requirements change in keeping with changing marketplace requirements. Therefore, once they are defined for implementation in a given iteration, the actual stories are tracked. The difference between the initially planned story points and the actual story points to be developed is called the story point variance (see Appendix). This is an agile-related metric that is calculated by project management.
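A minimal sketch of the story point variance as defined in the Appendix, (PS − AS) / PS, follows; the numbers are invented for illustration.

```python
def story_point_variance(planned_points: float, actual_points: float) -> float:
    """Story point variance = (PS - AS) / PS, per the Appendix definition."""
    return (planned_points - actual_points) / planned_points

# Illustrative numbers only: 275 story points initially planned, 240 actually completed.
variance = story_point_variance(275, 240)
print(f"{variance:.0%} of the initially planned story points were not delivered as planned")
```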
The purpose of the variance between the initially planned stories and the actual developed stories is to help determine a project management buffer for future releases. Knowledge of this historical variance helps in keeping up with the desire to meet the OTD metric discussed earlier. Within a project, ADP measures the actual stories completed for delivery to the customer.

Issues for Goal 3: Ability to Change Content or Priorities

Currently, the suite of project management metrics for goal 3 is not as rich as those for goals 1 and 2. Capturing requirements changes at the various levels would allow for better forecasting of how much the requirements can be expected to change over the project's life. Story point variance is currently being employed in this regard. Other metrics under consideration include:

• Market changes: The market requirements are allocated to a project contract that defines the high-level targets for each release. By exercising version and configuration control on the project contract one can monitor its evolution over time due to market changes and consequent changes in requirements.

• Changes in the project backlog: Within iteration planning, the project backlog is the source of stories to be developed within each cycle. Within an iteration, the two-to-three-week development cycle allows the iteration to have its own planning process to allow for adaptation and change. Through change control on the project backlog over the various iterations, one can keep track of changes in the project backlog as the iterations are carried out. While market changes and story point variance focus on the inflow to the project backlog, this metric focuses on the outflow from the backlog. The project backlog and its administration are a core concept of the ADP and allow for a high level of agility and change control.

These metrics are agile-related and can be maintained by project management. They may require somewhat more of a workload from the project management team than the current story-tracking metric. Automated database systems can help in this regard, particularly a requirements-traceability tool and a good software configuration management (SCM) tool.

Metrics for Goal 4: Ability to Merge Iteration Cycles Into a Seamless Overall Release

For a commercial application supplier like Brooks Software it is important that software releases produced with an agile process also include whole-product readiness, which covers integration with other software and hardware components such as operating system and infrastructure configurations, documentation, installation, upgrade, and other elements that meet customer expectations and maximize the user experience.

Overall shipment readiness is determined by the engineering and launch readiness process, which ensures all product elements are available to ship, through measurement of out-of-box quality (OBQ), which includes the measurement of post-release customer-reported defects (see Figure 11 and Appendix). This is a traditional measure and is calculated by project management. It measures overall product quality as experienced by the customer, going beyond code-defect rates and ensuring whole-product packaging objectives are being met. Figure 11 shows an example OBQ graph. The bar graph shows that installation and configuration defects occur most frequently, and the cumulative line graph shows that the degree of imbalance is approximately 40/60 (that is, the top 40 percent of the categories accounts for 60 percent of the total defects).

[Figure 11: Sample OBQ measure. A Pareto chart of post-release defects per category (installation, configuration, documentation, packaging, upgrade), with defect counts as bars and the cumulative percentage as a line.]
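The Pareto view in Figure 11 can be recomputed from a simple tally of post-release defects by category, as in the sketch below; the counts are invented for illustration, and only the category names come from the figure.

```python
from collections import Counter

# Hypothetical post-release defect counts per whole-product category (Figure 11 categories).
defects = Counter({"installation": 13, "configuration": 9, "documentation": 7,
                   "packaging": 5, "upgrade": 3})

total = sum(defects.values())
cumulative = 0
for category, count in defects.most_common():
    cumulative += count
    print(f"{category:<13} {count:>3} defects, cumulative {100 * cumulative / total:.0f}%")
```

With these invented counts, the top two of the five categories (40 percent) account for roughly 60 percent of the defects, matching the imbalance described for the figure.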
Issues for Goal 4: Ability to Merge Iteration Cycles Into a Seamless Overall Release

This is an area for which it is difficult to establish quantitative metrics, and it relies more on a peer and stakeholder review process to ensure product readiness. This is addressed in the areas of customer acceptance testing and installation and delivery sign-off. Earlier in the development cycle, other processes can be developed to address the various questions in this goal area:

• Does each team know the inputs from and outputs to other product components that they need to address during their iteration? This can be addressed through peer review of the team's plan for story development and tracked by various input-output analyses among stories. Traceability and configuration management tools can be useful here.

• Have all the interactions among the various components and the various teams been accounted for, both in development planning and in integration and system testing? This can be similarly addressed through peer reviews of the development plans and the system-test plans and can be similarly tracked by input-output and traceability analyses.

• Does system testing cover all the code, interactions, and functionality? Currently, the full set of stories is tested. This is addressed in system test planning. In addition, the authors are considering adding metrics that quantify coverage of code and functions. This will help detect defects relating to unplanned functionality in the code that is not fully captured in the stories and design specifications, as well as "dead code," that is, hard-to-reach portions of the code that may not be covered unless the test effort specifically focuses on that part of the code.

The authors are also looking to increase the level of sophistication in this goal area, as well as to develop a number of specific measures.

Next Steps
Future papers will discuss advances in these areas:
1. Additional bipartite team and project metrics for process improvement
2. Further sophistication in the data collection system for the bipartite agile metrics
3. Further implementation of process improvements incorporating the bipartite approach

Another area of longer-term interest for Brooks is scaling the agile process to support distributed teams. With advances in software engineering environments and with the advent of collaboration systems from suppliers like Microsoft, IBM, and Oracle, the benefits of the ADP agile process may be able to be scaled to benefit dispersed global teams.

SUMMARY AND CONCLUSIONS

An agile process can be established that optimizes around short planning cycles and the theory of constraints. It is consistent with proven advances in other domains, such as product life-cycle management and manufacturing, in which the discipline to remove waste from all processes has proven to be required for sustaining growth. For commercial software companies where time-to-market, building it right, and cost pressures drive management investment decisions, agile has moved from being a fad to being a trend.

The authors have established a bipartite suite of metrics, with some, such as defect rates, defects found and fixed, and weighted quality percentage, oriented to the development team; others, such as velocity, complexity factor, OTD, VOC, and story point variance, oriented to project management; and others, in particular story points, oriented to both the development team and project management. Many of these metrics, such as story points, story point variance, velocity, complexity factor, weighted quality percentage, and VOC, have been created to address the specifics of an agile development process such as ADP.

In particular, the velocity metric has been instrumental in providing a meaningful way to predict the release timeframe of future product versions. The velocity metric is used as a throughput rate, which is generally a measure of effort in the traditional sense. The differentiating factor of this metric is that velocity also captures a historical level of difficulty that is not captured in traditional throughput rates.
Hence, using velocity to determine future release timeframes has been more beneficial and accurate than using traditional throughput measures.

In addition, the complexity factor metric has provided valuable information about the level of difficulty for a given product version. Using historical complexity factor information, an appropriate level of risk management planning can be put in place for a given product version. Paying too much or too little attention to various forms of risk can be a key ingredient in delayed software delivery and/or a lack of the needed level of quality in the released software product.

All of the metrics identified in this article have provided Brooks Software invaluable insights and measurable tools for effectively managing both the project and the product portfolio that best meets the needs of its customers. The metrics and the associated historical information collected thus far have proved to be very helpful to both the project team and the management team. The ADP process and its associated metrics have evolved over a three-year period, and this journey will continue over time to adapt the process further, thereby advancing the discipline of software engineering in an agile environment.

REFERENCES

Basili, V., G. Caldiera, and H. D. Rombach. 1994. The goal question metric approach. Encyclopedia of Software Engineering. New York: John Wiley and Sons.

Beck, K. 2000. Extreme programming explained: Embrace change. Reading, Mass.: Addison Wesley Longman.

Blotner, J. 2002. Agile techniques to avoid firefighting at a start-up. OOPSLA 2002 Practitioners Reports, Seattle, November.

Boehm, B., and R. Turner. 2004. Balancing agility and discipline: A guide for the perplexed. Reading, Mass.: Addison-Wesley.

Cao, L., K. Mohan, P. Xu, and R. Balasubramaniam. 2004. How extreme does extreme programming have to be? Adapting XP practices to large-scale projects. In Proceedings of the 37th International Conference on System Sciences, Hawaii, January 5-8.

Cohn, M., and D. Ford. 2003. Introducing an agile process to an organization. IEEE Computer (June).

Cross, P. 2004. The uses and abuses of software metrics. Software Quality Professional 6, no. 2 (March).

Evans, B., and L. Scheinkopf. 2004. Faster, cheaper, better products: Tips and tricks from insiders for freeing up the product development logjam. Medical Device and Diagnostics Industry Journal (January).

Galin, D. 2003. Software quality metrics: From theory to implementation. Software Quality Professional 5, no. 3 (June).

Goldratt, E. M. 1992. The goal: A process of ongoing improvement. Great Barrington, Mass.: North River Press.

Heimann, D., N. Ashrafi, W. Koehler, J. P. Kuilboer, S. Mathiyalakan, and F. Waage. 2005. Metrics and databases for agile software development projects - A research proposition. International Conference on Agility, Helsinki, 27-28 July.

Hennessey, P. R. 2005. Can ISO, CMMI and agile co-exist? Boston Software Process Improvement Network, March. Available at: http://www.boston-spin.org/slides/050-Mar2005-talk.pdf.

Jalote, P., A. Palit, P. Kurien, and V. T. Peethamber. 2004. Timeboxing: A process model for iterative software development. Journal of Systems and Software 70: 117-127.

Manhart, P., and K. Schneider. 2004. Breaking the ice for agile development of embedded software: An industry experience report. In Proceedings of the 26th International Conference on Software Engineering (ICSE'04).

Paulk, M. 2001. Extreme programming from a CMM perspective. IEEE Software (November/December).

Poole, C., and J. W. Huisman. 2001. Using extreme programming in a maintenance environment. IEEE Software (November/December).

Westfall, L. 2006. A practical process to establish software metrics. Software Quality Professional 8, no. 2 (March).

Williams, L., W. Krebs, L. Layman, and A. Anton. 2004. Toward a framework for evaluating extreme programming, technical report TR-2004-02. Raleigh, N.C.: North Carolina State University.

Wood, W., and W. L. Kleb. 2003. Exploring XP for scientific research. IEEE Software (May/June).

BIOGRAPHIES

David I. Heimann is a professor in the Management Science and Information Systems Department of the College of Management at the University of Massachusetts, Boston. He has a doctorate in computer science from Purdue University. He has held positions in government and industry, performing activities in reliability modeling, simulation, database management systems, probabilistic modeling, software analysis, and software process improvement. Among his publications are articles in several volumes of the "Proceedings of the Annual Reliability and Maintainability Symposium," as well as a book-length article on system availability in "Advances in Computers." He can be reached by e-mail at David.Heimann@umb.edu.

Peter R. Hennessey is director of central engineering programs for Brooks Software. He has more than 20 years of experience developing and delivering automation and business software to some of the world's most advanced manufacturing technology companies. At Brooks Software, Hennessey works across the engineering community to establish product and engineering processes, methods, and architectures to maintain Brooks' worldwide operational and product leadership. He has a master's degree in information systems from Northeastern University and a bachelor's degree in economics from the University of Massachusetts, Amherst.
Ashok R. Tripathi is a development manager for Brooks Software and a certified project management professional with more than 15 years of software product life-cycle experience. During his time at Brooks, Tripathi has led several product development teams building real-time software enterprise application suites for several large advanced manufacturing customers worldwide. Tripathi also holds a U.S. patent on efficient methods to collect data from semiconductor equipment. He is currently pursuing an MBA and has a bachelor's degree in industrial engineering from Arizona State University.

Appendix

On-time delivery (OTD):
    OTD = (actual available-to-ship date − planned commit date) / (project duration)
Metric: true or false, "Is OTD ≤ a given tolerance level?"

Velocity (sum of actual story points divided by sum of capacity):
    V = Σ_s (S_s × R_s) / Σ_i C_i
where s is a story, S_s the size of story s, R_s the risk for story s, and C_i the capacity of resource i. Note that story points = story size × complexity risk.

The complexity factor is defined by the following:
    C = Σ_s S_s / Σ_s (S_s × R_s)
where s is a story, S_s the size of story s, and R_s the risk for story s.

Weighted quality percentage:
    Σ (test case values × test-result weighting) / Σ (test case values)

Weighted quality percentage with confidence loss:
    Σ (test case values × test-result weighting × build timeliness) / Σ (test case values)

Story point variance:
    (PS − AS) / PS
where PS is the total initially planned story points and AS is the total actual completed story points.

Out-of-box quality (OBQ): percentage of defects per category per month (installation, packaging, documentation, software, etc.)
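As a closing illustration, the sketch below applies the OTD definition above with a simplified workday count (weekdays only, no holidays); the dates and the 5 percent tolerance are illustrative only.

```python
from datetime import date, timedelta

def workdays_between(start: date, end: date) -> int:
    """Weekdays (Mon-Fri) from start (exclusive) to end (inclusive); ignores holidays."""
    return sum(1 for n in range(1, (end - start).days + 1)
               if (start + timedelta(days=n)).weekday() < 5)

def on_time(project_start: date, planned_commit: date, actual_ship: date,
            tolerance: float = 0.05) -> bool:
    """OTD = (actual available-to-ship date - planned commit date) / project duration."""
    slip = workdays_between(planned_commit, actual_ship)
    duration = workdays_between(project_start, planned_commit)
    return slip / duration <= tolerance

# Illustrative dates only: a roughly 24-week project that ships four workdays late.
print(on_time(project_start=date(2007, 1, 1),
              planned_commit=date(2007, 6, 15),
              actual_ship=date(2007, 6, 21)))   # True: 4 / 119 is within the 5 percent tolerance
```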