Testing Experience - The Magazine for Professional Testers
Issue 14, June 2011 - Improving the Test Process
www.testingexperience.com | ISSN 1866-5705 | printed in Germany








Predicting Defects in System Testing Phase Using a Model: A Six Sigma Approach

by Muhammad Dhiauddin Mohamed Suffian




This article is a revised version of a paper presented at the 4th International Symposium on Information Technology (ITSIM'10).

Defect prediction is an important aspect of the software development life cycle. The rationale for knowing the predicted number of functional defects early in the lifecycle, particularly at the start of the testing phase, is, apart from simply finding as many defects as possible during testing, to determine when to stop testing and to ensure that all defects have been found before the software is delivered to the intended end user. It also ensures that wider test coverage is put in place to discover the predicted defects. This research is aimed at achieving zero known post-release defects in the software delivered to the end user. To achieve this target, the research effort focuses on establishing a test defect prediction model using the Design for Six Sigma methodology in a controlled environment, where all factors contributing to defects in the software are collected prior to commencing the testing phase. It identifies the requirements for the prediction model and how the model can benefit from them. It also outlines the possible predictors associated with defect discovery in the testing phase. The proposed test defect prediction model is demonstrated via multiple regression analysis incorporating testing metrics and development-related metrics as the predictors.

Why use defect prediction at the start of the testing phase

1.   Better resource planning for test execution across projects: As one test resource can work on more than one testing project, predicting defects in the testing phase allows the appropriate number of resources to be assigned across multiple projects and supports resource planning activities to ensure optimized resources and higher productivity.

2.   Wider test coverage in discovering defects: To ensure wider and better test coverage, defect prediction can improve the way defects are found, since there is now a target number of defects to be found, either by adding more types of testing or by adding more scenarios related to user experience, which results in better root cause analysis.

3.   Improved test execution time to meet the project deadline: Putting defect prediction in place reduces schedule slippage caused by testing activities. By having a predicted number of defects to be discovered, test execution can be planned accordingly to meet the project deadline.

4.   Reliability of the software to be delivered: More defects found within the testing phase means more defects are prevented from escaping to the field. Defect prediction gives a direction on how many defects should be contained within the phase and thus contributes to zero known post-release defects in the software. In the long run, it portrays the stability of the whole team effort in producing software.

Research Objectives

The main question to address is "How to predict the total number of defects to be found at the start of the system testing phase using a model?" This is supported by the following sub-questions:

•   What are the key contributors to the test defect prediction model?

•   What are the factors that contribute to finding defects in the system testing phase?

•   How can we measure the relationship between the defect factors and the total number of defects in the system testing phase?

•   What type of data should be gathered and how can it be obtained?

•   How can the prediction model help to improve the testing process and software quality?

From this list of questions, the research effort has been able to demonstrate the following key items:

•   Establish a defect prediction model for the software testing phase

•   Demonstrate the approach to building a defect prediction model using the Design for Six Sigma (DfSS) methodology

•   Identify the significant factors that contribute to a reliable defect prediction model

•   Determine the importance of a defect prediction model for improving the testing process
Building the Model

In Design for Six Sigma (DfSS), the research is organized according to the DMADV phases: Define (D), Measure (M), Analyze (A), Design (D) and Verify (V). DfSS seeks to sequence proper tools and techniques to design in more value during new product development, while creating and using higher quality data for key development decisions. Achievement of DfSS is observed when the products or services developed meet customer needs rather than competing alternatives. In general, the DfSS-DMADV phases are described as follows:

•   Define – identify the project goals and customer (internal and external) requirements

•   Measure – determine customer needs and specifications; benchmark competitors and industry

•   Analyze – study and evaluate the process options to meet customer needs

•   Design – detail the process to meet customer needs

•   Verify – confirm and prove the design performance and ability to meet customer needs

A. Define Phase

The Define Phase primarily involves establishing the project definition and acknowledging the fundamental needs of the research. A typical software production process which applies the V-Model software development life cycle is used as the basis for this research. Throughout the process, the testing team is involved in all review sessions for each phase, starting from planning until the end of the system integration testing phase. The test lab is involved in reviewing the planning document, the requirement analysis document, the design document, the test planning document and the test cases (see Figure 1). The software production process is governed by project management, quality management, configuration and change management, integral and support functions as well as process improvement initiatives, in compliance with CMMI®. From the figure, the area of study is the functional or system test phase, where defects are going to be predicted. Therefore, only potential factors/predictors in the phases prior to the system testing phase are considered and investigated.




Figure 1: Typical Software Production Process




Two schematic diagrams are produced: a high-level schematic diagram (Figure 2) and a detailed schematic diagram (Figure 3). The high-level schematic diagram deals with establishing the Big Y or business target, the little Ys, the vital Xs and the goal statement against the business scorecard. In this research, the Big Y is to produce software with zero known post-release defects, while the little Ys, the elements that contribute to achieving the Big Y, are defect containment in the test phase, customer satisfaction, quality of the process imposed to produce the software, and project management. Two aspects relate to these little Ys: the potential number of defects before the test phase, which is the research interest, and the number of defects after completing the test phase.




Figure 2: Schematic Diagram. Big Y: Zero-Known Post-Release Defects. Little Ys: Defect Containment in Test Phase, Customer Satisfaction, Quality of Process, Project Management. Vital Xs: Potential # of defects before test and # of defects after test (under defect containment); Level of satisfaction (under customer satisfaction); People capability and Process effectiveness (under quality of process); Timeline allocation and Rescue allocation (under project management).


For the detailed schematic diagram (or detailed Y-X tree), possible factors that contribute to the test defect prediction are defined from the Vital X, which is the potential number of defects before test, and these are summarized in a Y-to-X tree diagram as shown in the figure below. The highlighted factors are selected for further analysis.



Figure 3: Detailed Y-X Tree Diagram. Test Defect Prediction is broken down into factor categories and sub-factors; the highlighted factors in the original figure are those selected for further consideration:
•   Software Complexity: Requirement Pages, Design Pages, Programming Language, Code Size
•   Knowledge: Developer Knowledge, Tester Knowledge
•   Test Process: Test Case Design Coverage, Targeted Total Test Cases, Test Automation, Test Case Execution Productivity, Total Effort in Test Design Phase, Total Effort in Phases Prior to System Test
•   Errors: Requirement Error, Design Error, CUT Error, Test Plan Error, Test Cases Error
•   Fault: Requirement Fault, Design Fault, CUT Fault, Integration Fault, Test Case Fault
•   Historical Defect: Defect Severity, Defect Type/Category, Defect Validity, Total PR (Defects) Raised
•   Project: Project Domain, Project Thread, Component, Application




B. Measure Phase

A Measurement System Analysis (MSA) is conducted to validate that the process of discovering defects is repeatable and reproducible, thus eliminating human error. Ten test cases with known results of PASS and FAIL are identified. Three test engineers are then selected to execute the test cases at random, and this is repeated three times for every engineer. The outcome is presented in Figure 4 below:

TCD ID   TC Result    Tester 1               Tester 2               Tester 3
         (expected)   1      2      3        1      2      3        1      2      3
TC1      PASS         PASS   PASS   PASS     PASS   PASS   PASS     PASS   PASS   PASS
TC2      FAIL         FAIL   FAIL   FAIL     FAIL   FAIL   FAIL     PASS   PASS   PASS
TC3      FAIL         FAIL   FAIL   FAIL     FAIL   FAIL   FAIL     PASS   PASS   PASS
TC4      PASS         FAIL   FAIL   FAIL     FAIL   FAIL   FAIL     PASS   PASS   PASS
TC5      PASS         PASS   PASS   PASS     PASS   PASS   PASS     PASS   PASS   PASS
TC6      PASS         PASS   PASS   PASS     PASS   PASS   PASS     PASS   PASS   PASS
TC7      PASS         PASS   PASS   PASS     PASS   PASS   PASS     PASS   PASS   PASS
TC8      FAIL         FAIL   FAIL   FAIL     FAIL   FAIL   FAIL     FAIL   FAIL   FAIL
TC9      PASS         PASS   PASS   PASS     PASS   PASS   PASS     PASS   PASS   PASS
TC10     PASS         PASS   PASS   PASS     PASS   PASS   PASS     PASS   PASS   PASS

     Figure 4: Data for Analysis
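For illustration only, the agreement check behind this MSA can be sketched in a few lines of Python. The values simply transcribe Figure 4; the article's actual analysis is an attribute agreement study run in MINITAB and reported as a Kappa value, so the plain percent-agreement calculation below is only a stand-in that shows how the data is organized and compared against the known standard.

```python
# Illustrative sketch only: tabulating the Figure 4 MSA data and computing each
# tester's agreement with the known expected results. The article's actual MSA
# is an attribute agreement analysis (Kappa) performed in MINITAB; simple
# percent agreement is used here just to show how the data is checked.
expected = ["PASS", "FAIL", "FAIL", "PASS", "PASS",
            "PASS", "PASS", "FAIL", "PASS", "PASS"]   # TC1..TC10

# Each tester executed TC1..TC10 three times; in Figure 4 every tester's three
# runs are identical, so one run per tester is listed and repeated three times.
runs = {
    "Tester 1": ["PASS", "FAIL", "FAIL", "FAIL", "PASS",
                 "PASS", "PASS", "FAIL", "PASS", "PASS"] * 3,
    "Tester 2": ["PASS", "FAIL", "FAIL", "FAIL", "PASS",
                 "PASS", "PASS", "FAIL", "PASS", "PASS"] * 3,
    "Tester 3": ["PASS", "PASS", "PASS", "PASS", "PASS",
                 "PASS", "PASS", "FAIL", "PASS", "PASS"] * 3,
}

standard = expected * 3
for tester, results in runs.items():
    agree = sum(r == s for r, s in zip(results, standard)) / len(standard)
    print(f"{tester}: agreement with standard = {agree:.0%}")
# A formal MSA would convert such agreement data into a Kappa statistic and
# require it to exceed 0.7 (70%), as the article does.
```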



The data is then used to run the MSA, where the result is compared against the Kappa value. Since the overall result of all appraisers against the standard is above 70% (as required for the Kappa value), the MSA is considered a PASS, as presented in Figure 5 below:

Overall Result for MSA is PASS since Kappa value is greater than 0.7 or 70%

Figure 5: Overall Result of MSA

As the MSA is passed, the operational definition is prepared, consisting of the type of data to be gathered, the measurements to be used, responsibilities, and the mechanism for obtaining the data, so that it can be used in the next DfSS phase.




C. Analyze Phase

The data gathered from the Measure Phase is used to perform a first round of regression analysis using the data shown in Figure 6, which was collected from thirteen projects. The predictors used include requirement error, design error, Code and Unit Test (CUT) error, Kilo Lines of Code (KLOC) size, targeted total test cases to be executed, test plan error, test cases error, automation percentage, test effort in days, and test execution productivity per staff day. The regression is done against the functional defects as the target. MINITAB software is used to perform the regression analysis via manual stepwise regression, i.e. predictors are added and removed during the regression until a strong statistical result is obtained. The output includes the P-value of each predictor against the defects, the R-squared (R-Sq.) value and the adjusted R-squared (R-Sq. (adj.)) value. These figures demonstrate the goodness of the equation and how well it can be used for predicting the defects. The result of the regression analysis is presented in Figure 7.


Project      Req.    Design   CUT     KLOC   Total Test   Test Plan   Test Case   Automation   Test     Test Execution   Functional
Name         Error   Error    Error          Cases        Error       Error       %            Effort   Productivity     Defects
Project A    5       22       12      28.8   224          0           34          0            6.38     45.8000          19
Project B    0       0        1       6.8    17           0           6           0            9.36     17.0000          1
Project C    9       10       14      5.4    24           4           6           0            29.16    5.8333           4
Project D    7       12       2       1.1    25           4           9           0            13.17    7.0000           0
Project E    11      29       3       1.2    28           4           12          0            14.26    3.4000           3
Project F    0       2        7       6.8    66           1           7           0            32.64    31.0000          16
Project G    3       25       11      4      149          5           0           0            7.15     74.5000          3
Project H    4       9        2       0.2    24           4           0           0            18.78    7.6667           0
Project I    7       0        1       1.8    16           1           3           0            9.29     2.6818           1
Project J    1       7        2       2.1    20           1           4           0            6.73     1.9450           0
Project K    17      0        3       1.4    13           1           4           0            8.44     6.5000           1
Project L    3       0        0       1.3    20           1           7           0            14.18    9.7500           1
Project M    2       3        16      2.5    7            1           6           0            8.44     1.7500           0
Figure 6: Data for Regression Analysis
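For illustration only, the multiple regression that the article performs in MINITAB can be sketched in Python with pandas and statsmodels. The column names below are invented for the example; the values are transcribed from the design error, total test cases, test plan error, test effort and functional defects columns of Figure 6. The same manual stepwise procedure, adding and removing predictors until the P-values, R-Sq. and R-Sq. (adj.) are satisfactory, applies equally to the refined data set in Figure 8.

```python
# Illustrative sketch only: one step of the manual stepwise regression the
# article runs in MINITAB, using a subset of the Figure 6 metrics.
# Column names are hypothetical; values are transcribed from Figure 6.
import pandas as pd
import statsmodels.api as sm

data = pd.DataFrame({
    "design_error":       [22, 0, 10, 12, 29, 2, 25, 9, 0, 7, 0, 0, 3],
    "total_test_cases":   [224, 17, 24, 25, 28, 66, 149, 24, 16, 20, 13, 20, 7],
    "test_plan_error":    [0, 0, 4, 4, 4, 1, 5, 4, 1, 1, 1, 1, 1],
    "test_effort":        [6.38, 9.36, 29.16, 13.17, 14.26, 32.64, 7.15,
                           18.78, 9.29, 6.73, 8.44, 14.18, 8.44],
    "functional_defects": [19, 1, 4, 0, 3, 16, 3, 0, 1, 0, 1, 1, 0],
})

# Fit the current candidate set of predictors against functional defects,
# then inspect p-values and R-Sq / R-Sq(adj) to decide what to add or drop next.
X = sm.add_constant(data[["design_error", "total_test_cases",
                          "test_plan_error", "test_effort"]])
y = data["functional_defects"]
results = sm.OLS(y, X).fit()

print(results.params)                        # fitted coefficients
print(results.pvalues)                       # keep predictors with p < 0.05
print(results.rsquared, results.rsquared_adj)
```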




                       Figure 7: Regression Analysis Result




Project      Req.    Design   CUT     KLOC   Req.   Design   Total Test   Test Cases   Total    Test Design   Functional   All
Name         Error   Error    Error          Page   Page     Cases        Error        Effort   Effort        Defects      Defects
Project A    5       22       12      28.8   81     121      224          34           16.79    15.20         19           19
Project B    0       0        1       6.8    171    14       17           6            45.69    40.91         1            1
Project C    9       10       14      5.4    23     42       24           6            13.44    13.44         4            4
Project D    7       12       2       1.1    23     42       25           9            4.90     9.90          0            0
Project E    11      29       3       1.2    23     54       28           12           4.72     4.59          3            3
Project F    0       2        7       6.8    20     70       88           7            32.69    16.00         16           27
Project G    3       25       11      4      38     131      149          0            64.00    53.30         3            3
Project H    4       9        2       0.2    26     26       24           0            5.63     5.63          0            0
Project I    17      0        3       1.4    15     28       13           4            9.13     7.88          1            1
Project J    61      34       24      36     57     156      306          16           89.42    76.16         25           28
Project K    32      16       19      12.3   162    384      142          0            7.00     7.00          12           12
Project L    0       2        3       3.8    35     33       40           3            8.86     8.86          6            6
Project M    15      18       10      26.1   88     211      151          22           30.99    28.61         39           57
Project N    0       4        0       24.2   102    11       157          0            41.13    28.13         20           33
Figure 8: New Set of Data for Regression




D. Design Phase

Based on the result, it has been demonstrated that for this round of regression the possible predictors are design error, targeted total test cases to be executed, test plan error, and test effort in days, for which the P-value of each predictor is less than 0.05. As an overall model equation, this model portrays the strong characteristics of a good model via high R-Sq. and R-Sq. (adjusted) values: 96.7% and 95.0% respectively. From the regression, this equation is selected for study in the next phase:

    Defect = -3.04 + 0.220 Design Error + 0.0624 Targeted Total Test Cases - 2.30 Test Plan Error + 0.477 Test Efforts

Further analysis is conducted during the Design Phase due to the need to generate a prediction model that is both practical and makes sense from the viewpoint of software practitioners using logical predictors, to filter the metrics so that they contain only valid data, and to reduce the model so that it has only coefficients with a logical correlation to the defects. As a result, a new set of data, shown in Figure 8, is used to refine the model.

Using the new set of data, the new regression results are presented in Figure 9.
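Before turning to the refined results, an illustrative check of the first-round equation above (this calculation is not part of the original article): plugging Project A's values from Figure 6 into it gives -3.04 + 0.220(22) + 0.0624(224) - 2.30(0) + 0.477(6.38) ≈ 18.8 predicted defects, close to the 19 functional defects actually recorded for that project.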




                            Figure 9: New Regression Result


From these latest results, it can be demonstrated that the significant predictors for predicting defects are requirement error, CUT error, KLOC, requirement page, design page, targeted total test cases to be executed, and test effort in days from the phases prior to system test. The P-value for each factor is less than 0.05, while the R-Sq. and R-Sq. (adjusted) values are 98.9% and 97.6% respectively, which results in a stronger prediction equation. The selected equation is as below:

    Functional Defects (Y) = 4.00 - 0.204 Requirement Error - 0.631 CUT Error + 1.90 KLOC - 0.140 Requirement Page + 0.125 Design Page - 0.169 Total Test Cases + 0.221 Test Effort

E. Verify Phase

The model selected in the Design Phase is now verified against new projects that have yet to enter the System Test phase. The actual defects found after System Test has been completed are compared against the predicted defects to ensure that the actual defects fall within the 95% prediction interval (PI) of the model, as presented in the last column of Figure 10.

No.   Predicted Functional Defects   Actual Functional Defects   95% CI (min, max)   95% PI (min, max)
1.    182                            187                         (155, 209)          (154, 209)
2.    6                              1                           (0, 2)              (0, 14)
3.    1                              1                           (0, 3)              (0, 6)

Figure 10: Verification Result for the Prediction Model

Based on the verification result, it is clearly shown that the model is fit for use and can be implemented to predict test defects for the software product. This is justified by the actual number of defects, which falls within the 95% prediction interval (PI). As the test defect prediction model equation is finalized, the next consideration is the control plan for its implementation in the process, i.e. what to do when the actual number of defects found is lower or higher than the prediction. Figure 11 below summarizes the control plan:

•   Actual defects < predicted defects: perform thorough testing during ad-hoc test; perform additional test strategies that relate to the discovery of more functional defects.
•   Actual defects > predicted defects: re-visit the errors captured during the requirement, design and CUT phases; re-visit the errors captured during test case design; re-visit the model and the factors used to define the model.

Figure 11: Action Plan for Test Defect Prediction Model
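For illustration only, the finalized equation and the Figure 11 control plan translate directly into a small script. The project metric values and function name below are hypothetical; the 95% prediction intervals reported in Figure 10 come from the regression fit itself (for example via statsmodels' get_prediction(), as in the earlier sketch) rather than from the point equation alone.

```python
# Illustrative sketch only: applying the finalized prediction equation to a
# hypothetical new project, then following the Figure 11 control plan once
# the actual defect count from system testing is known.

def predict_functional_defects(req_error, cut_error, kloc, req_pages,
                               design_pages, total_test_cases, test_effort):
    """Point prediction using the equation selected in the Design Phase."""
    return (4.00 - 0.204 * req_error - 0.631 * cut_error + 1.90 * kloc
            - 0.140 * req_pages + 0.125 * design_pages
            - 0.169 * total_test_cases + 0.221 * test_effort)

# Hypothetical metrics collected before the system testing phase starts.
predicted = predict_functional_defects(req_error=6, cut_error=4, kloc=10.0,
                                       req_pages=40, design_pages=80,
                                       total_test_cases=120, test_effort=25.0)
print(f"Predicted functional defects: {predicted:.1f}")

actual = 15  # hypothetical count found once system testing completes
if actual < predicted:
    print("Actual < predicted: perform thorough ad-hoc testing and add test "
          "strategies aimed at discovering more functional defects.")
elif actual > predicted:
    print("Actual > predicted: re-visit errors captured in the requirement, "
          "design, CUT and test case design phases, and if needed the model itself.")
```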

Conclusion

From the findings of this research, it is proven that Six Sigma provides a structured and systematic way of building a defect prediction model for the testing phase. The Design for Six Sigma (DfSS) methodology provides opportunities to clearly determine what needs to be achieved from the research, the issues to be addressed, the data to be collected, what needs to be measured, and how the model is generated, constructed and validated. Moreover, from the model equation it is possible to discover strong factors that contribute to the number of defects in the testing phase, while acknowledging the need to consider other important factors that have a significant impact on testing defects. This research has also demonstrated the importance of a defect prediction model in improving the testing process, since it contributes to zero known post-release defects of the software. Having test defect prediction helps in better test strategy and wider test coverage, while at the same time ensuring the stability of the overall software development effort for releasing a software product.

> biography

Muhammad Dhiauddin Mohamed Suffian is pursuing his PhD (Computer Science) in Software Testing at Universiti Teknologi Malaysia and working as a lecturer at the leading open university in Malaysia. He was the Solution Test Manager at one of the public-listed IT companies in Malaysia and, prior to that, Senior Engineer and Test Team Lead for the test department in one of the leading R&D agencies in Malaysia. He has almost 7 years of experience in the software/system development and software testing / quality assurance fields. With working experience in IT, automotive, banking and research & development companies, he obtained his technical and management skills from various project profiles. A graduate of the M.Sc. in Real Time Software Engineering from the Centre for Advanced Software Engineering (CASE), Universiti Teknologi Malaysia, he holds various professional certifications, namely Certified Six Sigma Green Belt, Certified Tester Foundation Level (CTFL) and Certified Tester Advanced Level – Test Manager (CTAL-TM). He also has vast knowledge of CMMi, test processes and methodologies as well as the Software Development Life Cycle (SDLC). He was involved in and has managed various testing strategies for different projects, including functional, performance, security, usability and compatibility testing, both at system test and system integration test level. His interests fall within the software engineering and software testing areas, particularly performance testing and test management.





More Related Content

What's hot

A survey of software testing
A survey of software testingA survey of software testing
A survey of software testing
Tao He
 
SDT STRW Test Assessment White Paper
SDT STRW Test Assessment White PaperSDT STRW Test Assessment White Paper
SDT STRW Test Assessment White Paper
JamesWright
 
01 software test engineering (manual testing)
01 software test engineering (manual testing)01 software test engineering (manual testing)
01 software test engineering (manual testing)
Siddireddy Balu
 
Software testing
Software testingSoftware testing
Software testing
Sengu Msc
 
Introduction to software testing
Introduction to software testingIntroduction to software testing
Introduction to software testing
Venkat Alagarsamy
 
Mi0033 software engineering
Mi0033  software engineeringMi0033  software engineering
Mi0033 software engineering
smumbahelp
 
St & internationalization
St & internationalizationSt & internationalization
St & internationalization
Sachin MK
 

What's hot (17)

A survey of software testing
A survey of software testingA survey of software testing
A survey of software testing
 
EFFECTIVE TEST CASE DESING: A REVIEW
EFFECTIVE TEST CASE DESING: A REVIEWEFFECTIVE TEST CASE DESING: A REVIEW
EFFECTIVE TEST CASE DESING: A REVIEW
 
Unit II Software Testing and Quality Assurance
Unit II Software Testing and Quality AssuranceUnit II Software Testing and Quality Assurance
Unit II Software Testing and Quality Assurance
 
Essential Test Management and Planning
Essential Test Management and PlanningEssential Test Management and Planning
Essential Test Management and Planning
 
12 sdd lesson testing and evaluating
12 sdd lesson testing and evaluating12 sdd lesson testing and evaluating
12 sdd lesson testing and evaluating
 
SDT STRW Test Assessment White Paper
SDT STRW Test Assessment White PaperSDT STRW Test Assessment White Paper
SDT STRW Test Assessment White Paper
 
01 software test engineering (manual testing)
01 software test engineering (manual testing)01 software test engineering (manual testing)
01 software test engineering (manual testing)
 
Software testing
Software testing   Software testing
Software testing
 
Ctfl 001 q&amp;a-demo-exam-area
Ctfl 001 q&amp;a-demo-exam-areaCtfl 001 q&amp;a-demo-exam-area
Ctfl 001 q&amp;a-demo-exam-area
 
Essential Test Management and Planning
Essential Test Management and PlanningEssential Test Management and Planning
Essential Test Management and Planning
 
Software testing course_in_mumbai
Software testing course_in_mumbaiSoftware testing course_in_mumbai
Software testing course_in_mumbai
 
Software testing
Software testingSoftware testing
Software testing
 
Software testing course_in_mumbai
Software testing course_in_mumbaiSoftware testing course_in_mumbai
Software testing course_in_mumbai
 
Introduction to software testing
Introduction to software testingIntroduction to software testing
Introduction to software testing
 
Fundamentals of Testing 2
Fundamentals of Testing 2Fundamentals of Testing 2
Fundamentals of Testing 2
 
Mi0033 software engineering
Mi0033  software engineeringMi0033  software engineering
Mi0033 software engineering
 
St & internationalization
St & internationalizationSt & internationalization
St & internationalization
 

Viewers also liked

Clipping Vinicola Garibaldi - Maio/Agosto 2008
Clipping Vinicola Garibaldi - Maio/Agosto 2008Clipping Vinicola Garibaldi - Maio/Agosto 2008
Clipping Vinicola Garibaldi - Maio/Agosto 2008
Agência DUE
 
Terrorist assemblages
Terrorist assemblagesTerrorist assemblages
Terrorist assemblages
Sarah Rainey
 
Clipping Famastil 2010/02 - Publicações On Line
Clipping Famastil 2010/02 - Publicações On LineClipping Famastil 2010/02 - Publicações On Line
Clipping Famastil 2010/02 - Publicações On Line
Agência DUE
 
KHN Social Media Presentatie Ede 18 april 2011
KHN Social Media Presentatie Ede 18 april 2011KHN Social Media Presentatie Ede 18 april 2011
KHN Social Media Presentatie Ede 18 april 2011
Al Sauerfield
 
Politics of human trafficking
Politics of human traffickingPolitics of human trafficking
Politics of human trafficking
Sarah Rainey
 
Clipping Hotel Alpestre - Novembro 2009
Clipping Hotel Alpestre - Novembro 2009Clipping Hotel Alpestre - Novembro 2009
Clipping Hotel Alpestre - Novembro 2009
Agência DUE
 
Presentatie 1, tour culinair 2011
Presentatie 1, tour culinair 2011Presentatie 1, tour culinair 2011
Presentatie 1, tour culinair 2011
Al Sauerfield
 

Viewers also liked (20)

Leadership training brisbane_trust
Leadership training brisbane_trustLeadership training brisbane_trust
Leadership training brisbane_trust
 
The waves
The wavesThe waves
The waves
 
6 Ingredients for a Good eCommerce Blog | Keyideas Infotech
6 Ingredients for a Good eCommerce Blog | Keyideas Infotech6 Ingredients for a Good eCommerce Blog | Keyideas Infotech
6 Ingredients for a Good eCommerce Blog | Keyideas Infotech
 
10 Hottest Web-Design Trends | Keyideas Infotech
10 Hottest Web-Design Trends | Keyideas Infotech10 Hottest Web-Design Trends | Keyideas Infotech
10 Hottest Web-Design Trends | Keyideas Infotech
 
Clipping Vinicola Garibaldi - Maio/Agosto 2008
Clipping Vinicola Garibaldi - Maio/Agosto 2008Clipping Vinicola Garibaldi - Maio/Agosto 2008
Clipping Vinicola Garibaldi - Maio/Agosto 2008
 
De Boer Business Space Solutions
De Boer Business Space SolutionsDe Boer Business Space Solutions
De Boer Business Space Solutions
 
IOTO 3: Foursquare, 8 november 2011
IOTO 3: Foursquare, 8 november 2011IOTO 3: Foursquare, 8 november 2011
IOTO 3: Foursquare, 8 november 2011
 
KHN Kantoor presentatie 2.0
KHN Kantoor presentatie 2.0KHN Kantoor presentatie 2.0
KHN Kantoor presentatie 2.0
 
Terrorist assemblages
Terrorist assemblagesTerrorist assemblages
Terrorist assemblages
 
KHN 2015 jan 2015
KHN 2015 jan 2015  KHN 2015 jan 2015
KHN 2015 jan 2015
 
Clipping Famastil 2010/02 - Publicações On Line
Clipping Famastil 2010/02 - Publicações On LineClipping Famastil 2010/02 - Publicações On Line
Clipping Famastil 2010/02 - Publicações On Line
 
Human rights
Human rightsHuman rights
Human rights
 
What to Look for When hiring a Web Development Company
What to Look for When hiring a Web Development CompanyWhat to Look for When hiring a Web Development Company
What to Look for When hiring a Web Development Company
 
Heteren 17 mei kansen voor horecaondernemers
Heteren 17 mei kansen voor horecaondernemersHeteren 17 mei kansen voor horecaondernemers
Heteren 17 mei kansen voor horecaondernemers
 
KHN Social Media Presentatie Ede 18 april 2011
KHN Social Media Presentatie Ede 18 april 2011KHN Social Media Presentatie Ede 18 april 2011
KHN Social Media Presentatie Ede 18 april 2011
 
Presentatie Patrick Leenheers Vodafone - online tuesday 'Mobile' 11 mei 2010
Presentatie Patrick Leenheers Vodafone - online tuesday 'Mobile' 11 mei 2010Presentatie Patrick Leenheers Vodafone - online tuesday 'Mobile' 11 mei 2010
Presentatie Patrick Leenheers Vodafone - online tuesday 'Mobile' 11 mei 2010
 
Restaurant R7
Restaurant R7Restaurant R7
Restaurant R7
 
Politics of human trafficking
Politics of human traffickingPolitics of human trafficking
Politics of human trafficking
 
Clipping Hotel Alpestre - Novembro 2009
Clipping Hotel Alpestre - Novembro 2009Clipping Hotel Alpestre - Novembro 2009
Clipping Hotel Alpestre - Novembro 2009
 
Presentatie 1, tour culinair 2011
Presentatie 1, tour culinair 2011Presentatie 1, tour culinair 2011
Presentatie 1, tour culinair 2011
 

Similar to Testing Experience Magazine Vol.14 June 2011

International Journal of Soft Computing and Engineering (IJS
International Journal of Soft Computing and Engineering (IJSInternational Journal of Soft Computing and Engineering (IJS
International Journal of Soft Computing and Engineering (IJS
hildredzr1di
 
Towards formulating dynamic model for predicting defects in system testing us...
Towards formulating dynamic model for predicting defects in system testing us...Towards formulating dynamic model for predicting defects in system testing us...
Towards formulating dynamic model for predicting defects in system testing us...
Journal Papers
 

Similar to Testing Experience Magazine Vol.14 June 2011 (20)

Test case-point-analysis (whitepaper)
Test case-point-analysis (whitepaper)Test case-point-analysis (whitepaper)
Test case-point-analysis (whitepaper)
 
Week_02.pptx
Week_02.pptxWeek_02.pptx
Week_02.pptx
 
International Journal of Soft Computing and Engineering (IJS
International Journal of Soft Computing and Engineering (IJSInternational Journal of Soft Computing and Engineering (IJS
International Journal of Soft Computing and Engineering (IJS
 
Process Models IN software Engineering
Process Models IN software EngineeringProcess Models IN software Engineering
Process Models IN software Engineering
 
Towards formulating dynamic model for predicting defects in system testing us...
Towards formulating dynamic model for predicting defects in system testing us...Towards formulating dynamic model for predicting defects in system testing us...
Towards formulating dynamic model for predicting defects in system testing us...
 
Testing throughout the software life cycle (software development models)
Testing throughout the software life cycle (software development models)Testing throughout the software life cycle (software development models)
Testing throughout the software life cycle (software development models)
 
Software testing ppt
Software testing pptSoftware testing ppt
Software testing ppt
 
IT Quality Testing and the Defect Management Process
IT Quality Testing and the Defect Management ProcessIT Quality Testing and the Defect Management Process
IT Quality Testing and the Defect Management Process
 
Software testing & Quality Assurance
Software testing & Quality Assurance Software testing & Quality Assurance
Software testing & Quality Assurance
 
Software Testing 1198102207476437 4
Software Testing 1198102207476437 4Software Testing 1198102207476437 4
Software Testing 1198102207476437 4
 
Software Testing
Software TestingSoftware Testing
Software Testing
 
Module-4 PART-2&3.ppt
Module-4 PART-2&3.pptModule-4 PART-2&3.ppt
Module-4 PART-2&3.ppt
 
Estimating test effort part 1 of 2
Estimating test effort part 1 of 2Estimating test effort part 1 of 2
Estimating test effort part 1 of 2
 
FROM THE ART OF SOFTWARE TESTING TO TEST-AS-A-SERVICE IN CLOUD COMPUTING
FROM THE ART OF SOFTWARE TESTING TO TEST-AS-A-SERVICE IN CLOUD COMPUTINGFROM THE ART OF SOFTWARE TESTING TO TEST-AS-A-SERVICE IN CLOUD COMPUTING
FROM THE ART OF SOFTWARE TESTING TO TEST-AS-A-SERVICE IN CLOUD COMPUTING
 
From the Art of Software Testing to Test-as-a-Service in Cloud Computing
From the Art of Software Testing to Test-as-a-Service in Cloud ComputingFrom the Art of Software Testing to Test-as-a-Service in Cloud Computing
From the Art of Software Testing to Test-as-a-Service in Cloud Computing
 
A Complexity Based Regression Test Selection Strategy
A Complexity Based Regression Test Selection StrategyA Complexity Based Regression Test Selection Strategy
A Complexity Based Regression Test Selection Strategy
 
Test planning.ppt
Test planning.pptTest planning.ppt
Test planning.ppt
 
Sta unit 2(abimanyu)
Sta unit 2(abimanyu)Sta unit 2(abimanyu)
Sta unit 2(abimanyu)
 
programming testing.pdf
programming testing.pdfprogramming testing.pdf
programming testing.pdf
 
programming testing.pdf
programming testing.pdfprogramming testing.pdf
programming testing.pdf
 

More from MIMOS Berhad/Open University Malaysia/Universiti Teknologi Malaysia

More from MIMOS Berhad/Open University Malaysia/Universiti Teknologi Malaysia (10)

An Alternative of Secured Online Shopping System via Point-Based Contactless ...
An Alternative of Secured Online Shopping System via Point-Based Contactless ...An Alternative of Secured Online Shopping System via Point-Based Contactless ...
An Alternative of Secured Online Shopping System via Point-Based Contactless ...
 
A Proposal of Postgraduate Programme for Software Testing Specialization
A Proposal of Postgraduate Programme for Software Testing SpecializationA Proposal of Postgraduate Programme for Software Testing Specialization
A Proposal of Postgraduate Programme for Software Testing Specialization
 
Performance Testing Strategy for Cloud-Based System using Open Source Testing...
Performance Testing Strategy for Cloud-Based System using Open Source Testing...Performance Testing Strategy for Cloud-Based System using Open Source Testing...
Performance Testing Strategy for Cloud-Based System using Open Source Testing...
 
A Method for Predicting Defects in System Testing for V-Model
A Method for Predicting Defects in System Testing for V-ModelA Method for Predicting Defects in System Testing for V-Model
A Method for Predicting Defects in System Testing for V-Model
 
A Regression Analysis Approach for Building a Prediction Model for System Tes...
A Regression Analysis Approach for Building a Prediction Model for System Tes...A Regression Analysis Approach for Building a Prediction Model for System Tes...
A Regression Analysis Approach for Building a Prediction Model for System Tes...
 
Performance Testing: Analyzing Differences of Response Time between Performan...
Performance Testing: Analyzing Differences of Response Time between Performan...Performance Testing: Analyzing Differences of Response Time between Performan...
Performance Testing: Analyzing Differences of Response Time between Performan...
 
Adopting Six Sigma Approach in Predicting Functional Defects for System Testing
Adopting Six Sigma Approach in Predicting Functional Defects for System TestingAdopting Six Sigma Approach in Predicting Functional Defects for System Testing
Adopting Six Sigma Approach in Predicting Functional Defects for System Testing
 
Establishing A Defect Prediction Model Using A Combination of Product Metrics...
Establishing A Defect Prediction Model Using A Combination of Product Metrics...Establishing A Defect Prediction Model Using A Combination of Product Metrics...
Establishing A Defect Prediction Model Using A Combination of Product Metrics...
 
Testing Experience Magazine Vol.12 Dec 2010
Testing Experience Magazine Vol.12 Dec 2010Testing Experience Magazine Vol.12 Dec 2010
Testing Experience Magazine Vol.12 Dec 2010
 
Breaking the Software - A Topic on Software Engineering & Testing
Breaking the Software -  A Topic on Software Engineering & TestingBreaking the Software -  A Topic on Software Engineering & Testing
Breaking the Software - A Topic on Software Engineering & Testing
 

Testing Experience Magazine Vol.14 June 2011

  • 1. 14 June 2011 The Magazine for Professional Testers printed in Germany print version 8,00 € free digital version www.testingexperience.com Improving the Test Process ISSN 1866-5705 © iStockphoto.com/ jgroup
  • 2. © iStockphoto.com/Dimitris66 Predicting Defects in System Testing Phase Using a Model: A Six Sigma Approach by Muhammad Dhiauddin Mohamed Suffian ing a predicted number of defects to be discovered, test This article is a revised version of a paper presented at the 4th Inter- execution can be planned accordingly to meet the project national Symposium on Information Technology (ITSIM‘10) deadline. 4. Model also deals with issue on reliability of software to be de- Defect prediction is an important aspect of the software develop- livered: More defects found within the testing phase means ment life cycle. The rationale in knowing the predicted number more defects are prevented from escaping to the field. Having of functional defects earlier on in the lifecycle, particularly at the defect prediction provides a direction on how many defects start of the testing phase, is, apart from just find as many defects should be contained within the phase and later, contributes as possible during the testing phase, to determine when to stop to the zero known post-release defects of the software. In the testing and ensure all defects have been found in testing before long run, it shall portray the stability of the whole team effort in producing software. the software is delivered to the intended end user. It also ensures that wider test coverage is put in place to discover the predicted Research Objectives defects. This research is aimed at achieving zero known post-re- lease defects of the software delivered to the end user. To achieve The main question to address is “How to predict the total number the target, the research effort focuses on establishing a test de- of defects to be found at the start of the system testing phase fect prediction model using the Design for Six Sigma methodol- using a model?” This is followed by the list of sub-questions to ogy in a controlled environment, where all factors contributing support the main research questions as below: to defects in the software are collected prior to commencing the • What are the key contributors to the test defect prediction testing phase. It identifies the requirements for the prediction model? model and how the model can benefit from them. It also outlines the possible predictors associated with defect discovery in the • What are the factors that contribute to finding defects in the system testing phase? testing phase. The proposed test defect prediction model is dem- onstrated via multiple regression analysis incorporating testing • How can we measure the relationship between the defect metrics and development-related metrics as the predictors. factors and the total number of defects in the system testing phase? Why use defect prediction at the start of the testing • What type of data should be gathered and how to get them? phase • How can the prediction model help in improving the testing 1. Better resource planning for test execution across projects: As process and improve software quality? one test resource can work in more than one testing project, the prediction of defects in the testing phase allows for plan- From the list of questions, the research effort has been able to ning the appropriate number of resources to be assigned demonstrate the following key items: across multiple projects and support resource planning ac- tivities to ensure the optimized resources and higher pro- • Establish a defect prediction model for software testing ductivity. phase 2. 
Issue on wider test coverage in discovering defects: To ensure • Demonstrate the approach in building a defect prediction wider and better test coverage, prediction of defects could model using the Design for Six Sigma (DfSS) Methodology improve the way of finding defects since there is now a tar- • Identify the significant factors that contribute to a reliable get number of defects to be found, either by adding more defect prediction model types of testing or adding more scenarios related to user ex- perience which results in better root cause analysis. • Determine the importance of a defect prediction model for improving the testing process 3. Issue on improving test execution time to meet project dead- line: Putting defect prediction in place will reduce schedule slippage problems resulting from testing activities. By hav- 52 The Magazine for Professional Testers www.testingexperience.com
  • 3. Building the Model A. Define Phase The Define Phase primarily involves establishing the project In Design for Six Sigma (DfSS), the research is organized accord- definition and acknowledging the fundamental needs of the re- ing to DMADV phases: Define (D), Measure (M), Analyze (A), De- search. A typical software production process which applies the V- sign (D) and Verify (V). DfSS seeks to sequence proper tools and Model software development life cycle is used as the basis for this techniques to design in more value during new product devel- research. Throughout the process, the testing team is involved in opment, while creating and using higher quality data for key de- all review sessions for each phase, starting from planning until velopment decisions. Achievement of DfSS is observed when the the end of the system integration testing phase. The test lab is products or services developed meet customer needs rather than involved in reviewing the planning document, the requirement competing alternatives. In general, DfSS-DMADV phases are de- analysis document, the design document, the test planning doc- scribed as below: ument and the test cases (see Figure 1). The software production process is governed by project management, quality manage- • Define - identify the project goals and customer (internal and external) requirements ment, configuration and change management, integral and sup- port as well as process improvement initiatives, in compliance to • Measure - determine customer needs and specifications; CMMI®. From the figure, the area of study is the functional or sys- benchmark competitors and industry tem test phase, where defects are going to be predicted. There- • Analyze – study and evaluate the process options to meet fore, only potential factors/predictors in the phases prior to the customer needs system testing phase are considered and investigated. • Design – detailing the process to meet customer needs • Verify – confirm and prove the design performance and abil- ity to meet customer needs Figure 1: Typical Software Production Process Two (2) schematic diagrams are produced, which are a high level are defect containment in the test phase, customer satisfaction, schematic diagram (Figure 2) and a detailed schematic diagram quality of the process being imposed to produce the software and (Figure 3). The high level schematic diagram deals with estab- project management. There are two aspects involved related to lishing the Big Y or business target, little Ys, vital Xs and the goal these little Ys: potential number of defects before the test phase statement against the business scorecard. In this research, Big which is the research interest, and the number of defects after Y is to produce software with zero-known post-release defects, completing the test phase. while for little Ys, elements that contribute to achieving the Big Y www.testingexperience.com The Magazine for Professional Testers 53
Big Y: zero known post-release defects. Little Ys: defect containment in the test phase, customer satisfaction, project management, and quality of process. Vital Xs include the potential number of defects before test, the number of defects after test, level of satisfaction, process effectiveness, people capability, timeline allocation and resource allocation.

Figure 2: Schematic Diagram

For the detailed schematic diagram (or detailed Y-X tree), the possible factors that contribute to test defect prediction are derived from the vital X of interest, the potential number of defects before test, and are summarized in a Y-to-X tree diagram as shown in the figure below. The highlighted factors ("factors to consider") are selected for further analysis.

The detailed Y-X tree groups the candidate factors under software errors and faults, historical defect data, knowledge, test process and project complexity. The candidates include requirement, design, CUT, integration, test plan and test case errors/faults; defect severity, defect type/category, test defect validity and total PRs (defects) raised; developer knowledge, tester knowledge and project domain; requirement pages, design pages, project thread, component, programming language and application code size; and test case design coverage, targeted total test cases, automation, test case execution productivity, total effort in the test phase and total effort in the phases prior to system test.

Figure 3: Detailed Y-X Tree Diagram
B. Measure Phase

A Measurement System Analysis (MSA) is conducted to validate that the process of discovering defects is repeatable and reproducible, thus eliminating human error. Ten test cases with known results of PASS and FAIL are identified. Three test engineers are then selected to execute the test cases in random order, and this is repeated three times for every engineer. The outcome is presented in Figure 4 below. (A short illustrative script for re-checking this agreement appears after the Analyze Phase description below.)

TCD ID | Known Result | Tester 1 (runs 1-3)  | Tester 2 (runs 1-3)  | Tester 3 (runs 1-3)
TC1    | PASS         | PASS PASS PASS       | PASS PASS PASS       | PASS PASS PASS
TC2    | FAIL         | FAIL FAIL FAIL       | FAIL FAIL FAIL       | PASS PASS PASS
TC3    | FAIL         | FAIL FAIL FAIL       | FAIL FAIL FAIL       | PASS PASS PASS
TC4    | PASS         | FAIL FAIL FAIL       | FAIL FAIL FAIL       | PASS PASS PASS
TC5    | PASS         | PASS PASS PASS       | PASS PASS PASS       | PASS PASS PASS
TC6    | PASS         | PASS PASS PASS       | PASS PASS PASS       | PASS PASS PASS
TC7    | PASS         | PASS PASS PASS       | PASS PASS PASS       | PASS PASS PASS
TC8    | FAIL         | FAIL FAIL FAIL       | FAIL FAIL FAIL       | FAIL FAIL FAIL
TC9    | PASS         | PASS PASS PASS       | PASS PASS PASS       | PASS PASS PASS
TC10   | PASS         | PASS PASS PASS       | PASS PASS PASS       | PASS PASS PASS

Figure 4: Data for Analysis

The data is then used to run the MSA, where the result is compared against the Kappa value. Since the overall result of all appraisers against the standard is above 70% (i.e. the Kappa value exceeds the required 0.7), the MSA is considered a PASS, as presented in Figure 5 below.

Overall result for the MSA is PASS, since the Kappa value is greater than 0.7 (70%).

Figure 5: Overall Result of MSA

As the MSA is passed, the operational definition is prepared, consisting of the type of data to be gathered, the measurements to be used, responsibilities, and the mechanism for obtaining the data so that it can be used in the next DfSS phase.

C. Analyze Phase

The data gathered from the Measure Phase is used to perform a first round of regression analysis using the data shown in Figure 6, which was collected from thirteen projects. The predictors used are requirement error, design error, Code and Unit Test (CUT) error, Kilo Lines of Code (KLOC) size, targeted total test cases to be executed, test plan error, test case error, automation percentage, test effort in days, and test execution productivity per staff day. The regression is done against the functional defects as the target. MINITAB software is used to perform the regression analysis via manual stepwise regression, i.e. predictors are added and removed during the regression until a strong statistical result is obtained. The result reports the P-value of each predictor against the defects, the R-squared (R-Sq.) value and the adjusted R-squared (R-Sq. (adj.)) value. These figures demonstrate the goodness of the equation and how well it can be used for predicting defects. The result of the regression analysis is presented in Figure 7; a minimal re-creation of this stepwise procedure in code is sketched after the data tables below.
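As a side note to the Measure Phase above, the appraiser-versus-standard agreement summarized in Figures 4 and 5 can be re-checked with a few lines of code. The sketch below is only illustrative: it computes one tester's percentage agreement with the known verdicts and a simple two-rater Cohen's kappa, which is related to, but not the same as, the attribute agreement statistics MINITAB reports; the function and variable names are illustrative, not part of the original study.

from collections import Counter

# Known (standard) verdicts for the ten test cases (TC1..TC10), from Figure 4.
STANDARD = ["PASS", "FAIL", "FAIL", "PASS", "PASS",
            "PASS", "PASS", "FAIL", "PASS", "PASS"]

# Tester 1's verdicts from Figure 4; all three of this tester's runs were identical.
TESTER_1 = ["PASS", "FAIL", "FAIL", "FAIL", "PASS",
            "PASS", "PASS", "FAIL", "PASS", "PASS"]

def cohen_kappa(rater, standard):
    """Two-rater Cohen's kappa for categorical (PASS/FAIL) verdicts."""
    n = len(standard)
    observed = sum(a == b for a, b in zip(rater, standard)) / n
    p_rater, p_std = Counter(rater), Counter(standard)
    expected = sum((p_rater[c] / n) * (p_std[c] / n)
                   for c in set(p_rater) | set(p_std))
    return (observed - expected) / (1 - expected)

agreement = sum(a == b for a, b in zip(TESTER_1, STANDARD)) / len(STANDARD)
print(f"Agreement vs standard: {agreement:.0%}")
print(f"Cohen's kappa vs standard: {cohen_kappa(TESTER_1, STANDARD):.2f}")

The same check can be repeated per run and per tester to mirror the repeatability and reproducibility argument made in the Measure Phase.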
Project   | Req. Error | Design Error | CUT Error | KLOC | Total Test Cases | Test Plan Error | Test Case Error | Automation % | Test Effort | Test Exec. Productivity | Functional Defects
Project A | 5  | 22 | 12 | 28.8 | 224 | 0 | 34 | 0 | 6.38  | 45.8000 | 19
Project B | 0  | 0  | 1  | 6.8  | 17  | 0 | 6  | 0 | 9.36  | 17.0000 | 1
Project C | 9  | 10 | 14 | 5.4  | 24  | 4 | 6  | 0 | 29.16 | 5.8333  | 4
Project D | 7  | 12 | 2  | 1.1  | 25  | 4 | 9  | 0 | 13.17 | 7.0000  | 0
Project E | 11 | 29 | 3  | 1.2  | 28  | 4 | 12 | 0 | 14.26 | 3.4000  | 3
Project F | 0  | 2  | 7  | 6.8  | 66  | 1 | 7  | 0 | 32.64 | 31.0000 | 16
Project G | 3  | 25 | 11 | 4    | 149 | 5 | 0  | 0 | 7.15  | 74.5000 | 3
Project H | 4  | 9  | 2  | 0.2  | 24  | 4 | 0  | 0 | 18.78 | 7.6667  | 0
Project I | 7  | 0  | 1  | 1.8  | 16  | 1 | 3  | 0 | 9.29  | 2.6818  | 1
Project J | 1  | 7  | 2  | 2.1  | 20  | 1 | 4  | 0 | 6.73  | 1.9450  | 0
Project K | 17 | 0  | 3  | 1.4  | 13  | 1 | 4  | 0 | 8.44  | 6.5000  | 1
Project L | 3  | 0  | 0  | 1.3  | 20  | 1 | 7  | 0 | 14.18 | 9.7500  | 1
Project M | 2  | 3  | 16 | 2.5  | 7   | 1 | 6  | 0 | 8.44  | 1.7500  | 0

Figure 6: Data for Regression Analysis

Figure 7: Regression Analysis Result

Project   | Req. Error | Design Error | CUT Error | KLOC | Req. Pages | Design Pages | Total Test Cases | Test Case Error | Total Effort | Test Design Effort | Functional Defects | All Defects
Project A | 5  | 22 | 12 | 28.8 | 81  | 121 | 224 | 34 | 16.79 | 15.20 | 19 | 19
Project B | 0  | 0  | 1  | 6.8  | 171 | 14  | 17  | 6  | 45.69 | 40.91 | 1  | 1
Project C | 9  | 10 | 14 | 5.4  | 23  | 42  | 24  | 6  | 13.44 | 13.44 | 4  | 4
Project D | 7  | 12 | 2  | 1.1  | 23  | 42  | 25  | 9  | 4.90  | 9.90  | 0  | 0
Project E | 11 | 29 | 3  | 1.2  | 23  | 54  | 28  | 12 | 4.72  | 4.59  | 3  | 3
Project F | 0  | 2  | 7  | 6.8  | 20  | 70  | 88  | 7  | 32.69 | 16.00 | 16 | 27
Project G | 3  | 25 | 11 | 4    | 38  | 131 | 149 | 0  | 64.00 | 53.30 | 3  | 3
Project H | 4  | 9  | 2  | 0.2  | 26  | 26  | 24  | 0  | 5.63  | 5.63  | 0  | 0
Project I | 17 | 0  | 3  | 1.4  | 15  | 28  | 13  | 4  | 9.13  | 7.88  | 1  | 1
Project J | 61 | 34 | 24 | 36   | 57  | 156 | 306 | 16 | 89.42 | 76.16 | 25 | 28
Project K | 32 | 16 | 19 | 12.3 | 162 | 384 | 142 | 0  | 7.00  | 7.00  | 12 | 12
Project L | 0  | 2  | 3  | 3.8  | 35  | 33  | 40  | 3  | 8.86  | 8.86  | 6  | 6
Project M | 15 | 18 | 10 | 26.1 | 88  | 211 | 151 | 22 | 30.99 | 28.61 | 39 | 57
Project N | 0  | 4  | 0  | 24.2 | 102 | 11  | 157 | 0  | 41.13 | 28.13 | 20 | 33

Figure 8: New Set of Data for Regression
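The original analysis was performed in MINITAB. For readers who want to experiment with the same idea outside MINITAB, the sketch below shows one possible shape of such a procedure in Python with pandas and statsmodels, implemented as a simple backward elimination over a subset of the Figure 6 predictors. The column names, the 0.05 threshold and the predictor subset are illustrative choices of this sketch, and the coefficients it produces will not necessarily match the MINITAB output summarized in Figure 7.

import pandas as pd
import statsmodels.api as sm

# Subset of the Figure 6 data: four candidate predictors and the response.
# Only a slice of the full predictor set is included here for brevity.
data = pd.DataFrame({
    "design_error":       [22, 0, 10, 12, 29, 2, 25, 9, 0, 7, 0, 0, 3],
    "total_test_cases":   [224, 17, 24, 25, 28, 66, 149, 24, 16, 20, 13, 20, 7],
    "test_plan_error":    [0, 0, 4, 4, 4, 1, 5, 4, 1, 1, 1, 1, 1],
    "test_effort":        [6.38, 9.36, 29.16, 13.17, 14.26, 32.64, 7.15,
                           18.78, 9.29, 6.73, 8.44, 14.18, 8.44],
    "functional_defects": [19, 1, 4, 0, 3, 16, 3, 0, 1, 0, 1, 1, 0],
})

def backward_stepwise(df, response, alpha=0.05):
    """Drop the least significant predictor until every p-value is below alpha."""
    predictors = [c for c in df.columns if c != response]
    while True:
        X = sm.add_constant(df[predictors])
        model = sm.OLS(df[response], X).fit()
        pvalues = model.pvalues.drop("const")
        worst = pvalues.idxmax()
        if pvalues[worst] <= alpha or len(predictors) == 1:
            return model, predictors
        predictors.remove(worst)   # remove the weakest predictor and refit

model, kept = backward_stepwise(data, "functional_defects")
print("Predictors kept:", kept)
print(f"R-Sq = {model.rsquared:.3f}, R-Sq(adj) = {model.rsquared_adj:.3f}")
print(model.params)

In practice the same loop could be run over the full predictor set from Figure 6, or over the refined data set of Figure 8 used in the Design Phase.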
Based on the result, it is demonstrated that for this round of regression the possible predictors are design error, targeted total test cases to be executed, test plan error, and test effort in days, for which the P-value of each predictor is less than 0.05. As an overall model equation, this model shows the characteristics of a good model through high R-Sq. and R-Sq. (adj.) values of 96.7% and 95.0% respectively. From the regression, this equation is selected for study in the next phase:

Defects = -3.04 + 0.220 Design Error + 0.0624 Targeted Total Test Cases - 2.30 Test Plan Error + 0.477 Test Effort

D. Design Phase

Further analysis is conducted during the Design Phase because of the need to generate a prediction model that is practical and makes sense from the viewpoint of software practitioners by using logical predictors, to filter the metrics so that they contain only valid data, and to reduce the model so that it has only coefficients with a logical correlation to defects. As a result, a new set of data is used to refine the model, as shown in Figure 8. Using the new set of data, the new regression results are presented in Figure 9.

Figure 9: New Regression Result

From these latest results, the significant predictors for predicting defects are requirement error, CUT error, KLOC, requirement pages, design pages, targeted total test cases to be executed, and test effort in days from the phases prior to system test. The P-value for each factor is less than 0.05, while the R-Sq. and R-Sq. (adj.) values are 98.9% and 97.6% respectively, which results in a stronger prediction equation. The selected equation is as below:

Functional Defects (Y) = 4.00 - 0.204 Requirement Error - 0.631 CUT Error + 1.90 KLOC - 0.140 Requirement Page + 0.125 Design Page - 0.169 Total Test Cases + 0.221 Test Effort

E. Verify Phase

The model selected in the Design Phase is now verified against new projects that have yet to enter the system test phase. The actual defects found after system test has been completed are compared against the predicted defects, to confirm that the actual defects fall within the 95% prediction interval (PI) of the model, as presented in the last column of Figure 10.

No. | Predicted Functional Defects | Actual Functional Defects | 95% CI (min, max) | 95% PI (min, max)
1   | 182 | 187 | (155, 209) | (154, 209)
2   | 6   | 1   | (0, 2)     | (0, 14)
3   | 1   | 1   | (0, 3)     | (0, 6)

Figure 10: Verification Result for the Prediction Model

Based on the verification result, the model is fit for use and can be implemented to predict test defects for the software product. This is justified by the actual number of defects falling within the 95% prediction interval of the prediction in each case. As the test defect prediction model equation is finalized, the next consideration is the control plan for its implementation in the process, i.e. what to do when the actual number of defects found is lower or greater than the prediction. Figure 11 below summarizes the control plan:

Actual Defects < Predicted Defects:
• Perform thorough testing during ad-hoc test
• Perform additional test strategies that relate to the discovery of more functional defects

Actual Defects > Predicted Defects:
• Re-visit the errors captured during the requirement, design and CUT phases
• Re-visit the errors captured during test case design
• Re-visit the model and the factors used to define the model

Figure 11: Action Plan for Test Defect Prediction Model
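Since the finalized model is an ordinary linear equation, it is easy to embed in a script or spreadsheet used during test planning. The sketch below simply transcribes the published equation and the Figure 11 action plan into two small helpers; the function names and the sample input values are illustrative, and the 95% prediction interval used in the Verify Phase still has to come from the regression tool, since it depends on the fitted model's variance.

def predicted_functional_defects(requirement_error, cut_error, kloc,
                                 requirement_pages, design_pages,
                                 total_test_cases, test_effort_days):
    """Transcription of the finalized model equation from the Design Phase."""
    return (4.00
            - 0.204 * requirement_error
            - 0.631 * cut_error
            + 1.90 * kloc
            - 0.140 * requirement_pages
            + 0.125 * design_pages
            - 0.169 * total_test_cases
            + 0.221 * test_effort_days)

def control_plan_action(actual_defects, predicted_defects):
    """Follow-up actions from Figure 11 once actual results are known."""
    if actual_defects < predicted_defects:
        return ["Perform thorough testing during ad-hoc test",
                "Add test strategies aimed at discovering more functional defects"]
    if actual_defects > predicted_defects:
        return ["Re-visit the errors captured during requirement, design and CUT phases",
                "Re-visit the errors captured during test case design",
                "Re-visit the model and the factors used to define it"]
    return ["Actual defects match the prediction; no corrective action required"]

# Hypothetical project figures, for illustration only.
prediction = predicted_functional_defects(
    requirement_error=5, cut_error=3, kloc=10.0,
    requirement_pages=40, design_pages=60,
    total_test_cases=120, test_effort_days=30)
print(f"Predicted functional defects: {prediction:.1f}")
print(control_plan_action(actual_defects=12, predicted_defects=prediction))

A team could call predicted_functional_defects at the start of system test to set the defect target, and control_plan_action once the actual defect count is known at the end of the phase.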
Conclusion

From the findings of this research, it is shown that Six Sigma provides a structured and systematic way of building a defect prediction model for the testing phase. The Design for Six Sigma (DfSS) methodology provides the opportunity to clearly determine what needs to be achieved by the research, the issues to be addressed, the data to be collected, what needs to be measured, and how the model is generated, constructed and validated. Moreover, from the model equation it is possible to identify strong factors that contribute to the number of defects in the testing phase, while acknowledging the need to consider other important factors that have a significant impact on testing defects. This research has also demonstrated the importance of a defect prediction model in improving the testing process, since it contributes to zero known post-release defects of the software. Having test defect prediction helps in forming a better test strategy and wider test coverage, while at the same time ensuring the stability of the overall software development effort for releasing a software product.

References

1. Graham, D., van Veenendaal, E., Evans, I. and Black, R. (2006). Foundations of Software Testing: ISTQB Certification. Thomson Learning, United Kingdom.
2. Fenton, N.E. and Neil, M. (1999). A Critique of Software Defect Prediction Models. IEEE Transactions on Software Engineering, Vol. 25, No. 5.
3. Clark, B. and Zubrow, D. (2001). How Good is the Software: A Review of Defect Prediction Techniques. Software Engineering Symposium, Carnegie Mellon University.
4. Nayak, V. and Naidya, D. (2003). Defect Estimation Strategies. Patni Computer Systems Limited, Mumbai.
5. Thangarajan, M. and Biswas, B. (2002). Software Reliability Prediction Model. Tata Elxsi Whitepaper.

Biography

Muhammad Dhiauddin Mohamed Suffian is pursuing his PhD in Computer Science (Software Testing) at Universiti Teknologi Malaysia and works as a lecturer at the leading open university in Malaysia. He was the Solution Test Manager at one of the public-listed IT companies in Malaysia and, prior to that, Senior Engineer and Test Team Lead for the test department in one of the leading R&D agencies in Malaysia. He has almost 7 years of experience in the software/system development and software testing / quality assurance fields. With working experience in IT, automotive, banking and research & development companies, he obtained his technical and management skills from various project profiles. A graduate of the M.Sc. in Real-Time Software Engineering from the Centre for Advanced Software Engineering (CASE), Universiti Teknologi Malaysia, he holds various professional certifications, namely Certified Six Sigma Green Belt, Certified Tester Foundation Level (CTFL) and Certified Tester Advanced Level – Test Manager (CTAL-TM). He also has extensive knowledge of CMMI, test processes and methodologies, as well as the Software Development Life Cycle (SDLC). He has been involved in and has managed various testing strategies for different projects, including functional, performance, security, usability and compatibility testing, at both system test and system integration test level. His interests fall within the software engineering and software testing areas, particularly performance testing and test management.