CAUSAL ANALYSIS AND RESOLUTION

           Cyrus Fakharzadeh

        USC Computer Science




CS577b 3/20/00                 1
Outline

• Definitions
• Defect analysis review
• Sample causal analysis exercises
• Defect prevention KPA
Definitions

• Causal analysis: the analysis of defects to determine their underlying root cause
• Causal analysis meeting: a meeting, conducted after completing a specific task, to analyze defects uncovered during the performance of that task
Defect Analysis

• Defect: any flaw in the specification, design, or implementation of a product
• Facilitate process improvement through defect analysis:
  – defect categorization to identify where work must be done and to predict future defects
  – causal analysis to prevent problems from recurring
Fault Distributions

Phase             Fault Origin   Fault Detection   Cost per Fault
Requirements           10%             3%             ~1 KDM
Design                 40%             5%             ~1 KDM
Coding                 50%             7%             ~1 KDM
Functional Test                       25%             ~6 KDM
System Test                           50%            ~12 KDM
Field Use                             10%            ~20 KDM

KDM = kilo Deutsche Marks
Fault Distributions (cont.)

Fault introduction distribution (all levels): Requirements 10%, Design 40%, Coding 50%

Fault detection distribution by process maturity level:

Level   Requirements   Design   Coding   Functional Test   System Test   Field Use
5            5%          20%      40%          20%             10%          <5%
4            3%          12%      30%          30%             20%           5%
3            0%           2%      20%          38%             32%           8%
2            0%           0%       3%          30%             50%          17%
1            0%           0%       2%          15%             50%          33%

Relative fault cost by detection phase: 1, 1, 1, 6, 12, 20
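Combining a level's detection distribution with the relative fault costs gives a rough expected cost per fault. A minimal sketch, treating the level-5 field-use entry of "<5%" as 5% so the row sums to 100%:

```python
# Expected relative cost per fault: a maturity level's detection
# distribution weighted by the relative fault cost per phase
# (Req, Design, Coding, Functional Test, System Test, Field Use).
relative_cost = [1, 1, 1, 6, 12, 20]

detection = {  # fraction of faults detected in each phase
    5: [0.05, 0.20, 0.40, 0.20, 0.10, 0.05],  # "<5%" taken as 5%
    1: [0.00, 0.00, 0.02, 0.15, 0.50, 0.33],
}

def expected_cost(level):
    return sum(p * c for p, c in zip(detection[level], relative_cost))

print(round(expected_cost(1), 2))  # 13.52
print(round(expected_cost(5), 2))  # 4.05
```

Under these distributions, a level-1 process pays roughly 3× more per fault than a level-5 process.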
Sample Defect Data

• Defect data should be collected by:
  – detection activity
  – when detected
  – introduction phase
  – type
  – mode
• A defect introduction and removal matrix can be generated and used for defect prevention, to help answer “what are high-leverage opportunities for defect prevention / cost containment?”
Defect Flow Tracking

• A defect introduction and removal matrix can be generated and used as a basis for defect analysis and prevention.

Percentage of defects, by phase injected:

Phase detected        Requirements   Preliminary design   Detailed design   Code/unit test   Total
Requirements               37%                                                                 8%
Preliminary design         22%              38%                                               16%
Detailed design            15%              18%                34%                            17%
Code/unit test              7%              24%                28%               43%          25%
Integration testing         7%               9%                14%               29%          14%
System testing             11%              12%                24%               29%          19%
Total                     100%             100%               100%              100%         100%
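One way to read the matrix for high-leverage opportunities is to compute, per injection phase, the share of defects that escape past code/unit test into integration or system testing. A sketch using the percentages above:

```python
# Defect introduction/removal matrix: injection phase -> detection
# phase -> percentage of that injection phase's defects.
matrix = {
    "Requirements":       {"Requirements": 37, "Preliminary design": 22,
                           "Detailed design": 15, "Code/unit test": 7,
                           "Integration testing": 7, "System testing": 11},
    "Preliminary design": {"Preliminary design": 38, "Detailed design": 18,
                           "Code/unit test": 24, "Integration testing": 9,
                           "System testing": 12},
    "Detailed design":    {"Detailed design": 34, "Code/unit test": 28,
                           "Integration testing": 14, "System testing": 24},
    "Code/unit test":     {"Code/unit test": 43, "Integration testing": 29,
                           "System testing": 29},
}

late = ("Integration testing", "System testing")
# Percentage of each phase's injected defects found only in late testing.
escape = {inj: sum(found.get(p, 0) for p in late)
          for inj, found in matrix.items()}
print(escape)  # e.g. 18% of requirements defects escape to late testing
```

High escape percentages (here, defects injected during coding) point at the removal activities worth strengthening first.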
Causal Analysis

• Data on defects is collected and categorized
• Trace each defect to its underlying cause
• Isolate the vital few causes
  – Pareto principle: 80% of defects are traceable to 20% of all possible causes
• Move to correct the problems that caused the defects
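Isolating the vital few can be done mechanically: sort causes by defect count and keep those within the first 80% of all defects. A minimal sketch with hypothetical cause names and counts:

```python
# Pareto sketch: rank causes by count, then take the leading
# causes whose cumulative count stays within 80% of the total.
from itertools import accumulate

counts = {"missing requirement": 40, "interface misuse": 25,
          "logic slip": 15, "typo": 10, "build config": 6, "other": 4}

ranked = sorted(counts.items(), key=lambda kv: kv[1], reverse=True)
total = sum(counts.values())
cum = list(accumulate(n for _, n in ranked))
vital_few = [cause for (cause, _), c in zip(ranked, cum) if c <= 0.8 * total]
print(vital_few)  # the top 3 of 6 causes cover 80% of defects
```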
Causal Analysis Form Fields

• Post-inspection example:
  – Moderator, Date
  – Subject, Subject type
  – Item number
  – Assigned to
  – Defect category (interface, requirements, design, code, other)
  – Item description
  – Probable cause
  – Suggestions for eliminating probable cause
  – Action taken
  – Number of hours to take corrective action
Causal Analysis Example

[Pareto chart: defect frequency (0–1000) by defect category: Correctness, Clarity, Completeness, Consistency, Compliance, Maintainability, Functionality, Interface, Performance, Testability, Reusability, Traceability]
Typical Analysis Steps

1. Sort the data by defect origin and count the number in each group. Arrange the totals in descending order of total hours.
2. Calculate the average fix time for each of the totals in step 1.
3. For the top two or three totals in step 1, count the defects sorted by defect type and multiply by the appropriate average fix times. Limit the number of types to the largest totals plus a single total for all others.
4. Add up the defects in each module. Get totals for the five most frequently changed modules plus a single total for all others.
5. Review the defect reports for the defects included in the largest totals from steps 3 and 4. Summarize the defect-report suggestions for how the defects might have been prevented or found earlier.
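Steps 1 and 2 can be sketched as follows; the defect records and field names are hypothetical:

```python
# Step 1: group defects by origin and rank origins by total fix
# hours (descending). Step 2: average fix time per origin.
from collections import defaultdict

defects = [  # (origin, type, fix_hours) -- illustrative data
    ("design", "interface", 8), ("design", "logic", 4),
    ("code", "logic", 2), ("code", "typo", 1),
    ("requirements", "omission", 11),
]

hours = defaultdict(list)
for origin, _type, h in defects:
    hours[origin].append(h)

by_total = sorted(hours.items(), key=lambda kv: sum(kv[1]), reverse=True)
for origin, hs in by_total:
    avg = sum(hs) / len(hs)  # average fix time for this origin
    print(f"{origin}: total={sum(hs)}h avg={avg:.1f}h")
```

Steps 3–5 then drill into the top origins the same way, keyed by defect type and by module.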
Causal Analysis Exercise #1

The following defect data is from a completed project. Another project with the same generic component types is being planned, with no reuse. Use causal analysis to identify the highest risks and make suggestions for the new project.

Component   Type                 Rework hours
C           hardware interface   25
B           communication         3
B           communication         6
B           hardware interface   15
B           hardware interface   18
A           communication         4
A           logic                12
B           logic                 5
A           logic                12
A           logic                14
B           user interface       19
C           logic                20
A           user interface       23
C           user interface       42
Exercise #1 Answer

• Determine the defect types and components that contribute the most rework:
• TYPE
  – user interface       84 hours
  – logic                63 hours
  – hardware interface   58 hours
  – communication        13 hours
  => concentrate on the user interface (resolve risk early, allocate resources, user prototyping, inspections, etc.)
• COMPONENT
  – C   87 hours
  – B   66 hours
  – A   65 hours
  => concentrate on component C
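The answer can be checked by aggregating the exercise data directly (note the hardware-interface rows, 25 + 15 + 18, sum to 58 hours):

```python
# Aggregate rework hours by defect type and by component using
# the exercise #1 data table.
from collections import Counter

data = [  # (component, type, rework hours)
    ("C", "hardware interface", 25), ("B", "communication", 3),
    ("B", "communication", 6), ("B", "hardware interface", 15),
    ("B", "hardware interface", 18), ("A", "communication", 4),
    ("A", "logic", 12), ("B", "logic", 5), ("A", "logic", 12),
    ("A", "logic", 14), ("B", "user interface", 19),
    ("C", "logic", 20), ("A", "user interface", 23),
    ("C", "user interface", 42),
]

by_type, by_component = Counter(), Counter()
for comp, typ, hours in data:
    by_type[typ] += hours
    by_component[comp] += hours

print(by_type.most_common())       # user interface leads with 84 hours
print(by_component.most_common())  # component C leads with 87 hours
```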
Causal Analysis Exercise #2

Analyze the following defect data. Produce three Pareto column charts (or tables in descending order) showing: 1) the distribution of defect origins; 2) an effort-weighted distribution of defect origins showing the normalized hours to fix defects; 3) effort-weighted distributions of defect types for the top two defect origins from chart #1. Make summary suggestions for the development process.

Weighting factors (normalized cost to fix defect types if not found until testing; e.g., it takes 14 times as much effort to fix a specification defect in the test phase as in the specification phase):

Specification    14
Design            6.2
Code              2.5
Documentation     1
Other             1
Operator          1

Defect #   Origin          Type
1          Documentation   Standards
2          Code            Logic
3          Documentation   Process Comm.
4          Design          S/W Interface
5          Code            Computation
6          Code            Logic
7          Specification   User Interface
8          Design          Process Comm.
9          Specification   Functionality
10         Code            Logic
11         Design          User Interface
12         Code            Logic
13         Design          H/W Interface
14         Other           Process Comm.
15         Code            Computation
16         Environment     Standards Support
17         Other           Process Comm.
18         Specification   Functionality
19         Code            Computation
20         Code            Logic
Exercise #2 Answers

Chart 1: defect origins

Defect Origin         # of defects
Code                  8
Design                4
Specification         3
Documentation         2
Other                 2
Environment Support   1

Chart 2: effort-weighted defect origins

Defect Origin         # of defects   Weight   Total weight
Specification         3              14       42
Design                4              6.2      24.8
Code                  8              2.5      20
Documentation         2              1        2
Other                 2              1        2
Environment Support   1              1        1

Chart 3: effort-weighted defect types for the top two origins

Code Defect Type   # of defects   Weight   Total wt.
Logic              5              2.5      12.5
Computation        3              2.5      7.5

Design Defect Type   # of defects   Weight   Total wt.
S/W Interface        1              6.2      6.2
Process Comm.        1              6.2      6.2
User Interface       1              6.2      6.2
H/W Interface        1              6.2      6.2
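Chart 2, the effort-weighted origins, follows directly from multiplying each origin's defect count by its weight:

```python
# Effort-weighted Pareto for exercise #2: defect count per origin
# times the normalized cost-to-fix weight, ranked descending.
counts = {"Code": 8, "Design": 4, "Specification": 3,
          "Documentation": 2, "Other": 2, "Environment Support": 1}
weights = {"Specification": 14, "Design": 6.2, "Code": 2.5,
           "Documentation": 1, "Other": 1, "Environment Support": 1}

weighted = sorted(((o, n * weights[o]) for o, n in counts.items()),
                  key=lambda kv: kv[1], reverse=True)
print(weighted)  # Specification 42, Design 24.8, Code 20, ...
```

The raw counts put Code first, but the weighted view flips the priority to Specification defects, which is the point of weighting.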
Level 4 Relationship to Level 5 KPAs

• Data analysis from Level 4 activities enables focusing the performance of Defect Prevention (DP), Technology Change Management (TCM), and Process Change Management (PCM)
Defect Prevention
    The purpose of Defect Prevention is to identify the cause of defects and prevent them
    from recurring.

     Defect Prevention involves analyzing defects that were encountered in the past and
    taking specific actions to prevent the occurrence of those types of defects in the
    future. The defects may have been identified on other projects as well as in earlier
    stages or tasks of the current project. Defect prevention activities are also one
    mechanism for spreading lessons learned between projects.

    Trends are analyzed to track the types of defects that have been encountered and to
    identify defects that are likely to recur. Based on an understanding of the project's
    defined software process and how it is implemented (as described in the Integrated
    Software Management and Software Product Engineering key process areas), the root
    causes of the defects and the implications of the defects for future activities are
    determined.

     Both the project and the organization take specific actions to prevent recurrence of
    the defects.



Defect Prevention (DP)
ETVX Diagram

ENTRY
1. Policy for organization to perform DP activities (C1)
2. Policy for projects to perform DP activities (C2)
3. Organization-level team exists to coordinate DP activities (Ab1)
4. Project-level team exists to coordinate DP activities (Ab2)
5. Adequate resources/funding (Ab3)
6. Training for members of the S/W engineering group and related groups (Ab4)
7. Procedures for Ac1, Ac3, Ac6, & Ac7

TASK
1. Develop project’s DP plan (Ac1)
2. Team has kick-off meeting to prepare for DP activities (Ac2)
3. Conduct causal analysis meetings (Ac3)
4. Conduct coordination meetings to review the implementation of action proposals from the causal analysis meetings (Ac4)
5. Document and track DP data (Ac5)
6. Revise the organization’s standard process resulting from DP actions (Ac6)
7. Revise the project’s defined process resulting from DP actions (Ac7)
8. Provide feedback to developers on the status and results of DP actions (Ac8)

EXIT
1. DP activities are planned (G1)
2. Common causes of defects are sought out and identified (G2)
3. Common causes of defects are prioritized and systematically eliminated (G3)

VERIFICATION
1. Reviews with senior management (V1)
2. Reviews with project manager (V2)
3. Reviews/audits by SQA (V3)
4. Measurement of status of DP activities (M1)
Defect Prevention Policies

• Organization defect prevention policy should state:
  – Long-term plans and commitments are established for funding, staffing, and other resources for defect prevention.
  – The resources needed are allocated for the defect prevention activities.
  – Defect prevention activities are implemented across the organization to improve the software processes and products.
  – The results of the defect prevention activities are reviewed to ensure the effectiveness of those activities.
  – Management and technical actions identified as a result of the defect prevention activities are addressed.

• Project defect prevention policy should state:
  – Defect prevention activities are included in each project's software development plan.
  – The resources needed are allocated for the defect prevention activities.
  – Project management and technical actions identified as a result of the defect prevention activities are addressed.
DP Tools and Training

• Tools:
  – statistical analysis tools
  – database systems
  – other
• Examples of DP training:
  – defect prevention methods
  – conduct of task kick-off meetings
  – conduct of causal analysis meetings, and
  – statistical methods (e.g., cause/effect diagrams and Pareto analysis)
DP Project Activities

• Project plan for defect prevention:
  1. Identifies the defect prevention activities (e.g., task kick-off and causal analysis meetings) that will be held.
  2. Specifies the schedule of defect prevention activities.
  3. Covers the assigned responsibilities and resources required, including staff and tools.
  4. Undergoes peer review.
• Kick-off meetings are held to familiarize the members of the team with the details of the implementation of the process, as well as any recent changes to the process.
• Causal analysis meetings are held.
Causal Analysis Procedures

• Causal analysis meeting procedure typically specifies:

1. Each team that performs a software task conducts causal analysis meetings.
   – A causal analysis meeting is conducted shortly after the task is completed.
   – Meetings are conducted during the software task if and when the number of defects uncovered warrants the additional meetings.
   – Periodic causal analysis meetings are conducted after software products are released to the customer, as appropriate.
   – For software tasks of long duration, periodic in-process defect prevention meetings are conducted, as appropriate. (An example of a long-duration task is a level-of-effort customer support task.)
2. The meetings are led by a person trained in conducting causal analysis meetings.
3. Defects are identified and analyzed to determine their root causes. (An example of a method to determine root causes is cause/effect diagrams.)
Causal Analysis Procedures (cont.)

4. The defects are assigned to categories of root causes. Examples of defect root cause categories include:
   – inadequate training,
   – breakdown of communications,
   – not accounting for all details of the problem, and
   – making mistakes in manual procedures (e.g., typing).
5. Proposed actions to prevent the future occurrence of identified defects and similar defects are developed and documented. Examples of proposed actions include modifications to:
   – the process,
   – training,
   – tools,
   – methods,
   – communications, and
   – software work products.
6. Common causes of defects are identified and documented. Examples of common causes include:
   – frequent errors made in invoking a certain system function, and
   – frequent errors made in a related group of software units.
7. The results of the meeting are recorded for use by the organization and other projects.
DP Team Activities

• Each of the teams assigned to coordinate defect prevention activities meets on a periodic basis to review and coordinate implementation of action proposals from the causal analysis meetings. The teams involved may be at the organization or project level.

• Team activities include:

1. Review the output from the causal analysis meetings and select action proposals that will be addressed.
2. Review action proposals that have been assigned to them by other teams coordinating defect prevention activities in the organization and select action proposals that will be addressed.
3. Review actions taken by the other teams in the organization to assess whether these actions can be applied to their activities and processes.
4. Perform a preliminary analysis of the action proposals and set their priorities. Priority is usually nonrigorous and is based on an understanding of:
   – the causes of defects,
   – the implications of not addressing the defects,
   – the cost to implement process improvements to prevent the defects, and
   – the expected impact on software quality.
   An example of a technique used to set priorities for the action proposals is Pareto analysis.
DP Team Activities (cont.)

5. Reassign action proposals to teams at another level in the organization, as appropriate.
6. Document their rationale for decisions and provide the decision and the rationale to the submitters of the action proposals.
7. Assign responsibility for implementing the action items resulting from the action proposals.
   – Implementation of the action items includes making immediate changes to the activities that are within the purview of the team and arranging for other changes.
   – Members of the team usually implement the action items, but, in some cases, the team can arrange for someone else to implement an action item.
8. Review results of defect prevention experiments and take actions to incorporate the results of successful experiments into the rest of the project or organization, as appropriate. Examples of defect prevention experiments include:
   – using a temporarily modified process, and
   – using a new tool.
9. Track the status of the action proposals and action items.
DP Team Activities (cont.)

10. Document software process improvement proposals for the organization's standard software process and the projects' defined software processes, as appropriate. The submitters of the action proposal are designated as the submitters of the software process improvement proposals.
11. Review and verify completed action items before they are closed.
12. Ensure that significant efforts and successes in preventing defects are recognized.
DP Documentation and Tracking Activities

Activity 5: Defect prevention data are documented and tracked across the teams coordinating defect prevention activities.

1. Action proposals identified in causal analysis meetings are documented. Examples of data in the description of an action proposal include:
   – originator of the action proposal,
   – description of the defect,
   – description of the defect cause,
   – defect cause category,
   – stage when the defect was injected,
   – stage when the defect was identified,
   – description of the action proposal, and
   – action proposal category.

2. Action items resulting from action proposals are documented. Examples of data in the description of an action item include:
   – the person responsible for implementing it,
   – a description of the areas affected by it,
   – the individuals who are to be kept informed of its status,
   – the next date its status will be reviewed,
   – the rationale for key decisions,
   – a description of implementation actions,
   – the time and cost for identifying the defect and correcting it, and
   – the estimated cost of not fixing the defect.
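The action-proposal fields listed above could be captured as a simple record type for documentation and tracking; the field names here are illustrative, not prescribed by the KPA:

```python
# A record type mirroring the example action-proposal data fields.
# Field names are hypothetical; adapt to the organization's forms.
from dataclasses import dataclass

@dataclass
class ActionProposal:
    originator: str
    defect_description: str
    cause_description: str
    cause_category: str
    stage_injected: str
    stage_identified: str
    proposal: str
    proposal_category: str

p = ActionProposal(
    originator="j.doe",
    defect_description="off-by-one in pagination",
    cause_description="unclear spec for boundary case",
    cause_category="communications",
    stage_injected="design",
    stage_identified="code/unit test",
    proposal="add boundary-case checklist to design reviews",
    proposal_category="process",
)
print(p.cause_category)  # communications
```

Storing proposals this way makes the tracking and feedback activities below (status profiles, frequency distributions by category) straightforward queries.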
DP Feedback

• Feedback is needed on the status and results of the organization's and project's defect prevention activities on a periodic basis.

The feedback provides:

1. A summary of the major defect categories.
2. The frequency distribution of defects in the major defect categories.
3. Significant innovations and actions taken to address the major defect categories.
4. A summary status of the action proposals and action items.

Examples of means to provide this feedback include:
   – electronic bulletin boards,
   – newsletters, and
   – information flow meetings.
DP Measurements

• Examples:
  – the cumulative costs of defect prevention activities (e.g., holding causal analysis meetings and implementing action items)
  – the time and cost for identifying the defects and correcting them, compared to the estimated cost of not correcting the defects
  – profiles measuring the number of action items proposed, open, and completed
  – the number of defects injected in each stage, cumulatively, and over releases of similar products, and the total number of defects
DP Management Reviews

• DP reviews cover:

1. A summary of the major defect categories and the frequency distribution of defects in these categories.
2. A summary of the major action categories and the frequency distribution of actions in these categories.
3. Significant actions taken to address the major defect categories.
4. A summary status of the proposed, open, and completed action items.
5. A summary of the effectiveness of and savings attributable to the defect prevention activities.
6. The actual cost of completed defect prevention activities and the projected cost of planned defect prevention activities.
References

• Defect Prevention (DP)

Inderpal Bhandari, Michael Halliday, et al., "A Case Study of Software Process Improvement During Development," IEEE Transactions on Software Engineering, Vol. 19, No. 12, December 1993, pp. 1157-1170.

R. Chillarege and I. Bhandari, "Orthogonal Defect Classification -- A Concept for In-Process Measurements," IEEE Transactions on Software Engineering, Vol. 18, No. 11, November 1992, pp. 943-955.

Julia L. Gale, Jesus R. Tirso, and C. Art Burchfield, "Implementing the Defect Prevention Process in the MVS Interactive Programming Organization," IBM Systems Journal, Vol. 29, No. 1, 1990, pp. 33-43.

C.L. Jones, "A Process-Integrated Approach to Defect Prevention," IBM Systems Journal, Vol. 24, No. 2, 1985, pp. 150-167.

Juichirou Kajihara, Goro Amamiya, and Tetsuo Saya, "Learning from Bugs," IEEE Software, Vol. 10, No. 5, September 1993, pp. 46-54.

R.G. Mays, C.L. Jones, G.J. Holloway, and D.P. Studinski, "Experiences with Defect Prevention," IBM Systems Journal, Vol. 29, No. 1, 1990, pp. 4-32.

Norman Bridge and Corinne Miller, "Orthogonal Defect Classification Using Defect Data to Improve Software Development," Proceedings of the 7th International Conference on Software Quality, Montgomery, Alabama, 6-8 October 1997, pp. 197-213.

DP and Causal Analysis Guideline

  • 1. CAUSAL ANALYSIS AND RESOLUTION
       Cyrus Fakharzadeh, USC Computer Science
  • 2. Outline
       - Definitions
       - Defect analysis review
       - Sample causal analysis exercises
       - Defect prevention KPA
  • 3. Definitions
       - Causal analysis: the analysis of defects to determine their underlying root cause
       - Causal analysis meeting: a meeting, conducted after completing a specific task, to analyze defects uncovered during the performance of that task
  • 4. Defect Analysis
       - Defect: any flaw in the specification, design, or implementation of a product
       - Facilitate process improvement through defect analysis:
         - defect categorization to identify where work must be done and to predict future defects
         - causal analysis to prevent problems from recurring
  • 5. Fault Distributions (KDM = kilo Deutsche Marks)

                         Requirements  Design   Coding   Functional Test  System Test  Field Use
       Fault origin          10%        40%      50%          -               -            -
       Fault detection        3%         5%       7%         25%             50%          10%
       Cost per fault       ~1 KDM     ~1 KDM   ~1 KDM      ~6 KDM         ~12 KDM      ~20 KDM
  • 6. Fault Distributions (cont.)
       Fault introduction distribution: Requirements 10%, Design 40%, Coding 50%

       Fault detection distribution by process maturity level:
       Level   Requirements  Design  Coding  Functional Test  System Test  Field Use
         5         20%        40%     20%         12%             5%          3%
         4         10%        30%     30%         20%             5%         <5%
         3          0%         2%     20%         38%            32%          8%
         2          0%         0%     30%         50%            17%          3%
         1          0%         0%      2%         15%            50%         33%

       Relative fault cost:   1        1       1           6             12          20
  • 7. Sample Defect Data
       - Defect data should be collected by:
         - detection activity
         - when detected
         - introduction phase
         - type
         - mode
       - A defect introduction and removal matrix can be generated and used for defect prevention, helping answer: "What are the high-leverage opportunities for defect prevention and cost containment?"
  • 8. Defect Flow Tracking
       - A defect introduction and removal matrix can be generated and used as a basis for defect analysis and prevention.

       Percentage of defects (columns: phase injected; rows: phase detected)

       Phase detected        Requirements  Preliminary design  Detailed design  Code/unit test  Total
       Requirements              37%              -                   -               -            8%
       Preliminary design        22%             38%                  -               -           16%
       Detailed design           15%             18%                 34%              -           17%
       Code/unit test             7%             24%                 28%             43%          25%
       Integration testing        7%              9%                 14%             29%          14%
       System testing            11%             12%                 24%             29%          19%
       Total                    100%            100%                100%            100%         100%
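A matrix like the one above can be derived mechanically from raw defect records. The sketch below is illustrative only: the phase names and records are hypothetical, not the project data behind the table.

```python
from collections import Counter

# Hypothetical defect records: (phase_injected, phase_detected)
defects = [
    ("Requirements", "Requirements"),
    ("Requirements", "Code/unit test"),
    ("Requirements", "System testing"),
    ("Preliminary design", "Detailed design"),
    ("Code/unit test", "System testing"),
]

cells = Counter(defects)                              # (injected, detected) counts
injected_totals = Counter(inj for inj, _ in defects)  # column totals

# Normalize each phase-injected column to percentages, as in the table
matrix = {(inj, det): 100.0 * n / injected_totals[inj]
          for (inj, det), n in cells.items()}
```

Each column then sums to 100%, so a large off-diagonal cell directly flags defects escaping far past their injection phase.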
  • 9. Causal Analysis
       - Data on defects is collected and categorized
       - Trace each defect to its underlying cause
       - Isolate the vital few causes
         - Pareto principle: 80% of defects are traceable to 20% of all possible causes
       - Move to correct the problems that caused the defects
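Isolating the "vital few" is a cumulative-count computation over categorized defects. A minimal Python sketch with made-up root-cause labels (the cause names are assumptions, not project data):

```python
from collections import Counter

# Hypothetical defect-to-root-cause assignments from causal analysis meetings
causes = (["unclear requirement"] * 9
          + ["missed interface change"] * 7
          + ["typo in manual step"] * 2
          + ["tool misconfiguration"]
          + ["training gap"])

counts = Counter(causes).most_common()   # causes in descending frequency
total = sum(n for _, n in counts)

# "Vital few": smallest prefix of causes accounting for >= 80% of defects
vital_few, covered = [], 0
for cause, n in counts:
    vital_few.append(cause)
    covered += n
    if covered / total >= 0.80:
        break
```

Here two of five causes cover 16 of 20 defects (80%), so corrective actions would target those two first.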
  • 10. Causal Analysis Form Fields
        Post-inspection example:
        - Moderator, date
        - Subject, subject type
        - Item number
        - Assigned to
        - Defect category (interface, requirements, design, code, other)
        - Item description
        - Probable cause
        - Suggestions for eliminating probable cause
        - Action taken
        - Number of hours to take corrective action
  • 11. Causal Analysis Example
        [Pareto column chart: defect frequency (0-1000) by defect category - Correctness, Clarity, Completeness, Consistency, Compliance, Maintainability, Functionality, Interface, Performance, Testability, Reusability, Traceability]
  • 12. Typical Analysis Steps
        1. Sort data by defect origin. Count the number in each group. Arrange the totals in descending order of total hours.
        2. Calculate the average fix times for each of the totals in step 1.
        3. For the top two or three totals in step 1, count the defects sorted by defect type and multiply by the appropriate average fix times. Limit the number of types to the largest totals plus a single total for all others.
        4. Add up the defects in each module. Get totals for the five most frequently changed modules plus a single total for all others.
        5. Review the defect reports for the defects included in the largest totals from steps 3 and 4. Summarize the defect-report suggestions for how the defects might have been prevented or found earlier.
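Steps 1 and 2 amount to a group-by over the defect log. A small Python sketch with a hypothetical log (the origins, types, and hours are illustrative):

```python
from collections import defaultdict

# Hypothetical defect log entries: (origin, defect_type, fix_hours)
defects = [
    ("Design", "Interface", 6.0),
    ("Design", "Interface", 4.0),
    ("Requirements", "Functionality", 8.0),
    ("Code", "Logic", 2.0),
    ("Code", "Logic", 3.0),
    ("Code", "Computation", 1.0),
]

# Step 1: total fix hours per origin, ranked in descending order
hours = defaultdict(float)
count = defaultdict(int)
for origin, _, h in defects:
    hours[origin] += h
    count[origin] += 1
ranked = sorted(hours.items(), key=lambda kv: kv[1], reverse=True)

# Step 2: average fix time per origin
avg_fix = {o: hours[o] / count[o] for o in hours}
```

Step 3 then repeats the same grouping by defect type within the top-ranked origins, and step 4 by module.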
  • 13. Causal Analysis Exercise #1
        The following defect data is from a completed project; another project with the same generic component types is being planned, with no reuse. Use causal analysis to identify the highest risks and make suggestions for the new project.

        Component  Type                Rework hours
        C          hardware interface    25
        B          communication          3
        B          communication          6
        B          hardware interface    15
        B          hardware interface    18
        A          communication          4
        A          logic                 12
        B          logic                  5
        A          logic                 12
        A          logic                 14
        B          user interface        19
        C          logic                 20
        A          user interface        23
        C          user interface        42
  • 14. Exercise #1 Answer
        Determine the defect types and components that contribute the most rework.
        - By TYPE:
          - user interface: 84 hours
          - logic: 63 hours
          - hardware interface: 58 hours
          - communication: 13 hours
          -> Concentrate on the user interface (resolve risk early, allocate resources, user prototyping, inspections, etc.)
        - By COMPONENT:
          - C: 87 hours
          - B: 66 hours
          - A: 65 hours
          -> Concentrate on component C
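The totals can be reproduced mechanically from the exercise data (note the hardware-interface rows sum to 25 + 15 + 18 = 58 hours). A minimal Python sketch:

```python
from collections import defaultdict

# Defect data from Exercise #1: (component, defect_type, rework_hours)
data = [
    ("C", "hardware interface", 25), ("B", "communication", 3),
    ("B", "communication", 6),       ("B", "hardware interface", 15),
    ("B", "hardware interface", 18), ("A", "communication", 4),
    ("A", "logic", 12),              ("B", "logic", 5),
    ("A", "logic", 12),              ("A", "logic", 14),
    ("B", "user interface", 19),     ("C", "logic", 20),
    ("A", "user interface", 23),     ("C", "user interface", 42),
]

by_type, by_component = defaultdict(int), defaultdict(int)
for comp, dtype, hours in data:
    by_type[dtype] += hours
    by_component[comp] += hours

# Rank to find where rework concentrates
worst_type = max(by_type, key=by_type.get)            # user interface, 84 h
worst_comp = max(by_component, key=by_component.get)  # component C, 87 h
```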
  • 15. Causal Analysis Exercise #2
        Analyze the following defect data. Produce three Pareto column charts (or tables in descending order) showing: 1) the distribution of defect origins; 2) an effort-weighted distribution of defect origins showing the normalized hours to fix defects; 3) an effort-weighted distribution of defect types for the top two defect origins from chart #1. Make summary suggestions for the development process.

        Weighting factors (normalized cost to fix defect types if not found until testing):
        Specification 14 (e.g., it takes 14 times as much effort to fix a specification defect in the test phase compared to in the specification phase)
        Design 6.2
        Code 2.5
        Documentation 1
        Other 1
        Operator 1

        Defect #  Origin               Type
        1         Documentation        Standards
        2         Code                 Logic
        3         Documentation        Process Comm.
        4         Design               S/W Interface
        5         Code                 Computation
        6         Code                 Logic
        7         Specification        User Interface
        8         Design               Process Comm.
        9         Specification        Functionality
        10        Code                 Logic
        11        Design               User Interface
        12        Code                 Logic
        13        Design               H/W Interface
        14        Other                Process Comm.
        15        Code                 Computation
        16        Environment Support  Standards
        17        Other                Process Comm.
        18        Specification        Functionality
        19        Code                 Computation
        20        Code                 Logic
  • 16. Exercise #2 Answers

        Chart 1: defect origins
        Origin                # of defects
        Code                  8
        Design                4
        Specification         3
        Documentation         2
        Other                 2
        Environment Support   1

        Chart 2: effort-weighted defect origins
        Origin                # of defects  Weight  Total weight
        Specification         3             14      42
        Design                4             6.2     24.8
        Code                  8             2.5     20
        Documentation         2             1       2
        Other                 2             1       2
        Environment Support   1             1       1

        Chart 3: effort-weighted defect types for the top two origins
        Code defect types:
        Type           # of defects  Weight  Total weight
        Logic          5             2.5     12.5
        Computation    3             2.5     7.5
        Design defect types:
        Type           # of defects  Weight  Total weight
        S/W Interface  1             6.2     6.2
        Process Comm.  1             6.2     6.2
        User Interface 1             6.2     6.2
        H/W Interface  1             6.2     6.2
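The answer tables follow directly from counting and weighting. A Python sketch using the exercise's own origin counts and weighting factors:

```python
from collections import Counter

# Defect origins from Exercise #2 (20 defects)
origins = (["Code"] * 8 + ["Design"] * 4 + ["Specification"] * 3
           + ["Documentation"] * 2 + ["Other"] * 2
           + ["Environment Support"] * 1)

# Normalized cost-to-fix weights per origin phase (from the exercise)
weights = {"Specification": 14, "Design": 6.2, "Code": 2.5,
           "Documentation": 1, "Other": 1, "Environment Support": 1}

counts = Counter(origins)

# Effort-weighted distribution, descending: count * weight per origin
weighted = sorted(((o, n * weights[o]) for o, n in counts.items()),
                  key=lambda kv: kv[1], reverse=True)
```

The weighting reverses the raw Pareto order: Code has the most defects, but Specification dominates once fix effort is factored in, which is why the summary suggestions target upstream (specification and design) practices.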
  • 17. Level 4 Relationship to Level 5 KPAs
        - Data analysis from Level 4 activities enables focusing the performance of Defect Prevention (DP), Technology Change Management (TCM), and Process Change Management (PCM)
  • 18. Defect Prevention
        The purpose of Defect Prevention is to identify the cause of defects and prevent them from recurring. Defect Prevention involves analyzing defects that were encountered in the past and taking specific actions to prevent the occurrence of those types of defects in the future. The defects may have been identified on other projects as well as in earlier stages or tasks of the current project. Defect prevention activities are also one mechanism for spreading lessons learned between projects.
        Trends are analyzed to track the types of defects that have been encountered and to identify defects that are likely to recur. Based on an understanding of the project's defined software process and how it is implemented (as described in the Integrated Software Management and Software Product Engineering key process areas), the root causes of the defects and the implications of the defects for future activities are determined. Both the project and the organization take specific actions to prevent recurrence of the defects.
  • 19. Defect Prevention (DP) ETVX Diagram
        ENTRY:
        1. Policy for organization to perform DP activities (C1)
        2. Policy for projects to perform DP activities (C2)
        3. Organization-level team exists to coordinate DP activities (Ab1)
        4. Project-level team exists to coordinate DP activities (Ab2)
        5. Adequate resources/funding (Ab3)
        6. Training for members of the S/W engineering group and related groups (Ab4)
        7. Procedures for Ac1, Ac3, Ac6, & Ac7
        TASK:
        1. Develop project's DP plan (Ac1)
        2. Team has kick-off meeting to prepare for DP activities (Ac2)
        3. Conduct causal analysis meetings (Ac3)
        4. Conduct coordination meetings to review the implementation of action proposals from the causal analysis meetings (Ac4)
        5. Document and track DP data (Ac5)
        6. Revise the organization's standard process resulting from DP actions (Ac6)
        7. Revise the project's defined process resulting from DP actions (Ac7)
        8. Provide feedback to developers on the status and results of DP actions (Ac8)
        EXIT:
        1. DP activities are planned (G1)
        2. Common causes of defects are sought out and identified (G2)
        3. Common causes of defects are prioritized and systematically eliminated (G3)
        VERIFICATION:
        1. Reviews with senior management (V1)
        2. Reviews with project manager (V2)
        3. Reviews/audits by SQA (V3)
        4. Measurement of status of DP activities (M1)
  • 20. Defect Prevention Policies
        - Organization defect prevention policy should state:
          - Long-term plans and commitments are established for funding, staffing, and other resources for defect prevention.
          - The resources needed are allocated for the defect prevention activities.
          - Defect prevention activities are implemented across the organization to improve the software processes and products.
          - The results of the defect prevention activities are reviewed to ensure the effectiveness of those activities.
          - Management and technical actions identified as a result of the defect prevention activities are addressed.
        - Project defect prevention policy should state:
          - Defect prevention activities are included in each project's software development plan.
          - The resources needed are allocated for the defect prevention activities.
          - Project management and technical actions identified as a result of the defect prevention activities are addressed.
  • 21. DP Tools and Training
        - Tools: statistical analysis tools, database systems, other
        - Examples of DP training: defect prevention methods; conduct of task kick-off meetings; conduct of causal analysis meetings; and statistical methods (e.g., cause/effect diagrams and Pareto analysis)
  • 22. DP Project Activities
        - Project plan for defect prevention:
          1. Identifies the defect prevention activities (e.g., task kick-off and causal analysis meetings) that will be held.
          2. Specifies the schedule of defect prevention activities.
          3. Covers the assigned responsibilities and resources required, including staff and tools.
          4. Undergoes peer review.
        - Kick-off meetings are held to familiarize the members of the team with the details of the implementation of the process, as well as any recent changes to the process.
        - Causal analysis meetings are held.
  • 23. Causal Analysis Procedures
        A causal analysis meeting procedure typically specifies:
        1. Each team that performs a software task conducts causal analysis meetings.
           - A causal analysis meeting is conducted shortly after the task is completed.
           - Meetings are conducted during the software task if and when the number of defects uncovered warrants the additional meetings.
           - Periodic causal analysis meetings are conducted after software products are released to the customer, as appropriate.
           - For software tasks of long duration, periodic in-process defect prevention meetings are conducted, as appropriate. An example of a long-duration task is a level-of-effort, customer support task.
        2. The meetings are led by a person trained in conducting causal analysis meetings.
        3. Defects are identified and analyzed to determine their root causes. An example of a method to determine root causes is cause/effect diagrams.
  • 24. Causal Analysis Procedures (cont.)
        4. The defects are assigned to categories of root causes. Examples of defect root cause categories include: inadequate training, breakdown of communications, not accounting for all details of the problem, and making mistakes in manual procedures (e.g., typing).
        5. Proposed actions to prevent the future occurrence of identified defects and similar defects are developed and documented. Examples of proposed actions include modifications to: the process, training, tools, methods, communications, and software work products.
        6. Common causes of defects are identified and documented. Examples of common causes include: frequent errors made in invoking a certain system function, and frequent errors made in a related group of software units.
        7. The results of the meeting are recorded for use by the organization and other projects.
  • 25. DP Team Activities
        Each of the teams assigned to coordinate defect prevention activities meets on a periodic basis to review and coordinate implementation of action proposals from the causal analysis meetings. The teams involved may be at the organization or project level. Team activities include:
        1. Review the output from the causal analysis meetings and select action proposals that will be addressed.
        2. Review action proposals that have been assigned to them by other teams coordinating defect prevention activities in the organization and select action proposals that will be addressed.
        3. Review actions taken by the other teams in the organization to assess whether these actions can be applied to their activities and processes.
        4. Perform a preliminary analysis of the action proposals and set their priorities. Priority is usually nonrigorous and is based on an understanding of: the causes of defects, the implications of not addressing the defects, the cost to implement process improvements to prevent the defects, and the expected impact on software quality. An example of a technique used to set priorities for the action proposals is Pareto analysis.
  • 26. DP Team Activities (cont.)
        5. Reassign action proposals to teams at another level in the organization, as appropriate.
        6. Document their rationale for decisions and provide the decision and the rationale to the submitters of the action proposals.
        7. Assign responsibility for implementing the action items resulting from the action proposals. Implementation of the action items includes making immediate changes to the activities that are within the purview of the team and arranging for other changes. Members of the team usually implement the action items, but, in some cases, the team can arrange for someone else to implement an action item.
        8. Review results of defect prevention experiments and take actions to incorporate the results of successful experiments into the rest of the project or organization, as appropriate. Examples of defect prevention experiments include: using a temporarily modified process, and using a new tool.
        9. Track the status of the action proposals and action items.
  • 27. DP Team Activities (cont.)
        10. Document software process improvement proposals for the organization's standard software process and the projects' defined software processes, as appropriate. The submitters of the action proposal are designated as the submitters of the software process improvement proposals.
        11. Review and verify completed action items before they are closed.
        12. Ensure that significant efforts and successes in preventing defects are recognized.
  • 28. DP Documentation and Tracking Activities
        Activity 5: Defect prevention data are documented and tracked across the teams coordinating defect prevention activities.
        1. Action proposals identified in causal analysis meetings are documented. Examples of data in the description of an action proposal include: originator of the action proposal, description of the defect, description of the defect cause, defect cause category, stage when the defect was injected, stage when the defect was identified, description of the action proposal, and action proposal category.
        2. Action items resulting from action proposals are documented. Examples of data in the description of an action item include: the person responsible for implementing it, a description of the areas affected by it, the individuals who are to be kept informed of its status, the next date its status will be reviewed, the rationale for key decisions, a description of implementation actions, the time and cost for identifying the defect and correcting it, and the estimated cost of not fixing the defect.
  • 29. DP Feedback
        Feedback is needed on the status and results of the organization's and project's defect prevention activities on a periodic basis. The feedback provides:
        1. A summary of the major defect categories.
        2. The frequency distribution of defects in the major defect categories.
        3. Significant innovations and actions taken to address the major defect categories.
        4. A summary status of the action proposals and action items.
        Examples of means to provide this feedback include: electronic bulletin boards, newsletters, and information flow meetings.
  • 30. DP Measurements
        Examples:
        - the cumulative costs of defect prevention activities (e.g., holding causal analysis meetings and implementing action items)
        - the time and cost for identifying the defects and correcting them, compared to the estimated cost of not correcting the defects
        - profiles measuring the number of action items proposed, open, and completed
        - the number of defects injected in each stage, cumulatively, and over releases of similar products, and the total number of defects
  • 31. DP Management Reviews
        DP reviews cover:
        1. A summary of the major defect categories and the frequency distribution of defects in these categories.
        2. A summary of the major action categories and the frequency distribution of actions in these categories.
        3. Significant actions taken to address the major defect categories.
        4. A summary status of the proposed, open, and completed action items.
        5. A summary of the effectiveness of and savings attributable to the defect prevention activities.
        6. The actual cost of completed defect prevention activities and the projected cost of planned defect prevention activities.
  • 32. References (Defect Prevention)
        - Inderpal Bhandari, Michael Halliday, et al., "A Case Study of Software Process Improvement During Development," IEEE Transactions on Software Engineering, Vol. 19, No. 12, December 1993, pp. 1157-1170.
        - R. Chillarege and I. Bhandari, "Orthogonal Defect Classification -- A Concept for In-Process Measurements," IEEE Transactions on Software Engineering, Vol. 18, No. 11, November 1992, pp. 943-955.
        - Julia L. Gale, Jesus R. Tirso, and C. Art Burchfield, "Implementing the Defect Prevention Process in the MVS Interactive Programming Organization," IBM Systems Journal, Vol. 29, No. 1, 1990, pp. 33-43.
        - C.L. Jones, "A Process-Integrated Approach to Defect Prevention," IBM Systems Journal, Vol. 24, No. 2, 1985, pp. 150-167.
        - Juichirou Kajihara, Goro Amamiya, and Tetsuo Saya, "Learning from Bugs," IEEE Software, Vol. 10, No. 5, September 1993, pp. 46-54.
        - R.G. Mays, C.L. Jones, G.J. Holloway, and D.P. Studinski, "Experiences with Defect Prevention," IBM Systems Journal, Vol. 29, No. 1, 1990, pp. 4-32.
        - Norman Bridge and Corinne Miller, "Orthogonal Defect Classification Using Defect Data to Improve Software Development," Proceedings of the 7th International Conference on Software Quality, Montgomery, Alabama, 6-8 October 1997, pp. 197-213.