SOFTWARE QUALITY ASSURANCE
TEST MANAGEMENT




      Seminar: Oana FEIDI
      Quality Manager – Continental Automotive
PROJECT TEAM
            Quality Manager
            Test Manager
            SW Project Manager
            Project Manager
TEST MANAGEMENT - DEFINITIONS
   An important part of software quality is the process of testing and
    validating the software.

    Test management is the practice of organizing and controlling
    the process and artifacts required for the testing effort.

    The general goal of test management is to allow teams to plan,
    develop, execute, and assess all testing activities within the
    overall software development effort.
      This includes coordinating efforts of all those involved in the
       testing effort, tracking dependencies and relationships among
       test assets and, most importantly, defining, measuring, and
       tracking quality goals.
TEST MANAGEMENT - PHASES
   Test artifacts and resource organization
   Test planning is the overall set of tasks that address the questions
    of why, what, where, and when to test.


    Test authoring is a process of capturing the specific steps required
    to complete a given test.


   Test execution consists of running the tests by assembling
    sequences of test scripts into a suite of tests (a minimal example is
    sketched below).
    Test reporting is how the various results of the testing effort are
    analyzed and communicated. This is used to determine the current
    status of project testing, as well as the overall level of quality of the
    application or system.

    [Chart: "Automatisation level" – actual vs. planned automation level of test
    cases in % and number of specified test cases, per calendar week (CW13–CW51)
    across delivery steps LS 7, LS 7.1, LS 8 and LS 9.]
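    As an illustration of the authoring and execution phases, here is a minimal
    sketch using Python and pytest (the checked function and file name are
    hypothetical, not taken from the slides): each authored test captures the
    concrete steps and the expected result, and the test runner collects the
    scripts into one executable suite.

    # test_speed_limit.py - hypothetical authored test case (illustration only).
    # pytest collects every test_* function from the project into a single suite.

    def clamp_speed(requested_kmh, max_kmh=250):
        """Hypothetical unit under test: limit a requested speed value."""
        if requested_kmh < 0:
            return 0
        return min(requested_kmh, max_kmh)

    def test_speed_is_clamped_to_maximum():
        # Step 1: request a speed above the configured maximum.
        # Step 2: verify the output is limited to the maximum.
        assert clamp_speed(300) == 250

    def test_negative_speed_is_normalized_to_zero():
        # A negative request must be normalized to zero.
        assert clamp_speed(-10) == 0

    Running pytest in the project directory then executes the assembled suite,
    and its pass/fail output is the raw material for test reporting.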
TEST MANAGEMENT – PHASES
(EXAMPLES)
TEST MANAGEMENT - CHALLENGES

     Questions to answer:
      ◦ Why should I test?
      ◦ What should I test?
      ◦ Where do I test?
      ◦ When do I test?
      ◦ How do I conduct the tests?

     Common challenges:
      ◦ Not enough time to test
      ◦ Not enough resources to test
      ◦ Testing teams are not always in one place
      ◦ Difficulties with requirements
      ◦ Keeping in sync with development
      ◦ Reporting the right information

       http://www.ibm.com/developerworks/rational/library/06/1107_davis/
TEST MANAGEMENT – PRIORITIES
DEFINITIONS
   Example Priority Definitions
      P1 – Failure on this test is likely to result in a loss or corruption of data. This test
      must be run as soon as practicable and must also be run on the final build.
      P2 – Failure on this test is likely to result in unacceptable loss of functionality.
      This test must be run as soon as practicable. The test should also be run for the
      final time once development in this area of functionality has stabilized.
      P3 – Failure on this test is likely to result in loss of functionality but there may
      well be workarounds available. This test should be run only once development in
      this area of functionality has stabilized.
      P4 – Failure on this test is likely to result in loss of functionality that is not
      critical to a user. This test should be run once and probably need not be run again.
      P5 – Failure on this test is likely to indicate a trivial problem with the
      functionality. If time permits it would be nice to run these tests, but they need not
      be completed if the timescales don't allow (i.e., if such a test were carried out and
      failed, it would not stop the product from shipping).
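    One way to make such priority definitions operational (a sketch, not something
    taken from the slides) is to tag each test with its priority and let the test
    runner filter on the tags; with pytest this can be done through custom markers
    registered in pytest.ini.

    # Hypothetical example: encoding P1-P5 priorities as pytest markers so that
    # the highest-priority tests can be selected first (e.g. for the final build).
    import pytest

    @pytest.mark.p1
    def test_persisted_settings_survive_restart():
        # P1: a failure here would mean loss or corruption of data.
        assert True  # placeholder for the real check

    @pytest.mark.p3
    def test_optional_display_theme_can_be_changed():
        # P3: loss of functionality, but a workaround exists.
        assert True  # placeholder for the real check

    A command such as pytest -m "p1 or p2" would then run only the tests that must
    be executed as soon as practicable and again on the final build.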
TEST MANAGEMENT –
CLASSIFICATIONS EXAMPLES
(AUTOMOTIVE)




    [Figures: example classification ratings – Continental rating and Renault rating.]
TEST MANAGEMENT – SPECIFIC RULES
Test Technique type   Example
Systematic            Boundary value (~85%)
Lessons Learned       Checklist (~5%)
Intuitive             Error guessing (~5%)
Supporting            Stress test, robustness test (~5%)
Special               Critical timing analysis (only if applicable)
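    To make the dominant systematic technique concrete, the sketch below applies
    boundary value analysis to a hypothetical valid range (the range and the
    function name are assumptions, not taken from the slides): the tests exercise
    the values just outside, exactly on, and just inside each boundary.

    # Boundary value analysis (illustrative sketch).
    # Assumed specification: a coolant temperature reading is valid
    # in the closed range [-40, 150] degrees Celsius.
    import pytest

    def is_valid_temperature(celsius):
        """Hypothetical unit under test."""
        return -40 <= celsius <= 150

    # The test values cluster around the two boundaries.
    @pytest.mark.parametrize("value, expected", [
        (-41, False), (-40, True), (-39, True),   # lower boundary
        (149, True), (150, True), (151, False),   # upper boundary
    ])
    def test_temperature_boundaries(value, expected):
        assert is_valid_temperature(value) == expected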
ROLE OF TEST MANAGER
   What the test manager is responsible for:
    ◦ Defining and implementing the role testing plays within the
      organization.
    ◦ Defining the scope of testing within the context of each
      release/delivery.
    ◦ Deploying and managing the appropriate testing framework to meet
      the testing mandate.
    ◦ Implementing and evolving appropriate measurements and metrics.
        To be applied against the product under test.

        To be applied against the testing team.

    ◦ Planning, deploying and managing the testing effort for any given
      engagement/release.
    ◦ Managing and growing testing assets required for meeting the
      testing mandate:
        Team members

        Testing tools

        Testing processes

    ◦ Retaining skilled testing personnel.
TEST MANAGEMENT
RECOMMENDATIONS
     Start test management activities early
     Test iteratively
     Reuse test artifacts
     Utilize requirements-based testing
       Validating that something does what it is supposed to do
       Trying to find out what can cause something to break
     Define and enforce a flexible testing process
     Coordinate and integrate with the rest of development
     Communicate status
     Focus on goals and results
TEST MANAGEMENT - TESTING
METRICS

    Number of faults detected per functionality ordered by severity before
     delivery
    Number of test cases per functionality
    Number of test steps per test case
    Number of test cases per requirement
    Number of faults detected by test cases before delivery
    Effort for execution of test cases
    Requirement coverage by test cases
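    As a rough sketch of how a few of these metrics could be derived from exported
    test-management data (the record layout below is an assumption made for
    illustration, not a real tool format):

    # Illustrative sketch: deriving two of the listed metrics from test-case records.
    test_cases = [
        {"id": "TC-001", "requirement": "REQ-10", "steps": 4},
        {"id": "TC-002", "requirement": "REQ-10", "steps": 6},
        {"id": "TC-003", "requirement": "REQ-11", "steps": 3},
    ]
    all_requirements = {"REQ-10", "REQ-11", "REQ-12"}

    # Number of test cases per requirement.
    cases_per_req = {}
    for tc in test_cases:
        cases_per_req[tc["requirement"]] = cases_per_req.get(tc["requirement"], 0) + 1

    # Requirement coverage by test cases: share of requirements with at least one TC.
    covered = {tc["requirement"] for tc in test_cases}
    coverage = len(covered & all_requirements) / len(all_requirements)

    print(cases_per_req)                 # {'REQ-10': 2, 'REQ-11': 1}
    print(f"coverage: {coverage:.0%}")   # coverage: 67%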