SOFTWARE QUALITY ASSURANCE – TEST MANAGEMENT
Seminar: Oana FEIDI, Quality Manager – Continental Automotive
Project team
- Project Manager
- Test Manager
- SW Project Manager
- Quality Manager
Test management – definitions
An important part of software quality is the process of testing and validating the software. Test management is the practice of organizing and controlling the process and artifacts required for the testing effort. The general goal of test management is to allow teams to plan, develop, execute, and assess all testing activities within the overall software development effort. This includes coordinating the efforts of everyone involved in testing, tracking dependencies and relationships among test assets and, most importantly, defining, measuring, and tracking quality goals.
Test management – phases
Test artifacts and resource organization:
- Test planning is the overall set of tasks that address the questions of why, what, where, and when to test.
- Test authoring is the process of capturing the specific steps required to complete a given test.
- Test execution consists of running the tests by assembling sequences of test scripts into a suite of tests.
- Test reporting is how the various results of the testing effort are analyzed and communicated. It is used to determine the current status of project testing, as well as the overall level of quality of the application or system.
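The execution and reporting phases above can be sketched in a few lines: test scripts are assembled into a suite, run in sequence, and the results summarized for reporting. This is a minimal illustration only; the test names and report format are invented, not taken from the slides or any specific tool.

```python
# Illustrative test scripts (names are hypothetical).
def test_login_accepts_valid_user():
    assert "user" == "user"

def test_login_rejects_empty_password():
    assert len("") == 0

def run_suite(tests):
    """Test execution: run each script in the suite and collect results."""
    results = {}
    for test in tests:
        try:
            test()
            results[test.__name__] = "PASS"
        except AssertionError:
            results[test.__name__] = "FAIL"
    return results

def report(results):
    """Test reporting: summarize suite status for project tracking."""
    passed = sum(1 for r in results.values() if r == "PASS")
    return f"{passed}/{len(results)} tests passed"

suite = [test_login_accepts_valid_user, test_login_rejects_empty_password]
print(report(run_suite(suite)))  # prints "2/2 tests passed"
```

In practice a test framework (e.g. a unit-test runner) plays the role of `run_suite` and `report`; the point here is only the phase boundary between executing scripts and analyzing their results.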
Test management – phases (examples)
Test management – challenges
- Why should I test? What should I test? Where do I test? When do I test? How do I conduct the tests?
- Not enough time to test
- Not enough resources to test
- Testing teams are not always in one place
- Difficulties with requirements
- Keeping in sync with development
- Reporting the right information
http://www.ibm.com/developerworks/rational/library/06/1107_davis/
Test management – priority definitions
Example priority definitions:
- P1 – Failure on this test is likely to result in loss or corruption of data. This test must be run as soon as practicable and must also be run on the final build.
- P2 – Failure on this test is likely to result in an unacceptable loss of functionality. This test must be run as soon as practicable. The test should also be run a final time once development in this area of functionality has stabilized.
- P3 – Failure on this test is likely to result in loss of functionality, but there may well be workarounds available. This test should be run only once development in this area of functionality has stabilized.
- P4 – Failure on this test is likely to result in loss of functionality that is not critical to a user. This test should be run once and probably need not be run again.
- P5 – Failure on this test is likely to indicate a trivial problem with the functionality. If time permits, it would be nice to run these tests, but they need not be completed if the timescales don't allow (i.e., if this test was carried out and failed, it would not stop the product shipping).
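The P1–P5 rules above are essentially a scheduling policy, and one way to make them concrete is to encode them as data. The sketch below is a simplified interpretation of those definitions (it does not capture every nuance, e.g. P2's final re-run after stabilization), and the field names are assumptions for illustration.

```python
# Simplified encoding of the example P1-P5 priority rules as data.
# Field names and the exact rule encoding are illustrative assumptions.
PRIORITY_RULES = {
    "P1": {"run_asap": True,  "after_stable": False, "optional": False},
    "P2": {"run_asap": True,  "after_stable": True,  "optional": False},
    "P3": {"run_asap": False, "after_stable": True,  "optional": False},
    "P4": {"run_asap": False, "after_stable": False, "optional": False},
    "P5": {"run_asap": False, "after_stable": False, "optional": True},
}

def must_run_now(priority, area_stable):
    """Decide whether a test of the given priority is due in this cycle."""
    rule = PRIORITY_RULES[priority]
    if rule["optional"]:
        return False              # P5: only if time permits
    if rule["run_asap"]:
        return True               # P1/P2: as soon as practicable
    if rule["after_stable"]:
        return area_stable        # P3: wait until the area has stabilized
    return True                   # P4: run once

print(must_run_now("P3", area_stable=False))  # prints "False"
```

Keeping the policy in a table like this makes it easy to review with the team and adjust per project, rather than burying the scheduling logic in conditionals.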
Test management – classifications examples (automotive) Renault rating Continental rating
Test management – specific rules
Test technique type | Example
Systematic          | Boundary value (~85%)
Lessons learned     | Checklist (~5%)
Intuitive           | Error guessing (~5%)
Supporting          | Stress test, robustness test (~5%)
Special             | Critical timing analysis (only if applicable)
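Boundary value analysis, the systematic technique that dominates the table above, picks test inputs just below, on, and just above each edge of a valid range. A minimal sketch (the speed range used here is an invented example, not from the slides):

```python
def boundary_values(lo, hi, step=1):
    """Classic boundary-value test inputs for a closed range [lo, hi]:
    just outside, on, and just inside each boundary."""
    return [lo - step, lo, lo + step, hi - step, hi, hi + step]

# e.g. a speed input specified as valid between 0 and 250 km/h:
print(boundary_values(0, 250))  # prints "[-1, 0, 1, 249, 250, 251]"
```

The systematic character of the technique is exactly this: the test inputs are derived mechanically from the specification, which is why it can carry the bulk (~85%) of the test cases.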
Role of the test manager
What the test manager is responsible for:
- Defining and implementing the role testing plays within the organization.
- Defining the scope of testing within the context of each release/delivery.
- Deploying and managing the appropriate testing framework to meet the testing mandate.
- Implementing and evolving appropriate measurements and metrics:
  - to be applied against the product under test;
  - to be applied against the testing team.
- Planning, deploying and managing the testing effort for any given engagement/release.
- Managing and growing the testing assets required for meeting the testing mandate: team members, testing tools, testing processes.
- Retaining skilled testing personnel.
Test management recommendations
- Start test management activities early
- Test iteratively
- Reuse test artifacts
- Utilize requirements-based testing:
  - validating that something does what it is supposed to do;
  - trying to find out what can cause something to break
- Define and enforce a flexible testing process
- Coordinate and integrate with the rest of development
- Communicate status
- Focus on goals and results
Test management – testing metrics
- Number of faults detected per functionality, ordered by severity, before delivery
- Number of test cases per functionality
- Number of test steps per test case
- Number of test cases per requirement
- Number of faults detected by test cases before delivery
- Effort for execution of test cases
- Requirement coverage by test cases
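Two of the metrics above, test cases per requirement and requirement coverage, can be computed directly from test-case records. The sketch below uses a made-up record layout; real test-management tools have their own schemas, so the field names here are assumptions.

```python
# Hypothetical test-case records; the field names are illustrative,
# not a real tool's schema.
test_cases = [
    {"id": "TC-1", "requirement": "REQ-1", "steps": 4},
    {"id": "TC-2", "requirement": "REQ-1", "steps": 6},
    {"id": "TC-3", "requirement": "REQ-2", "steps": 3},
]
all_requirements = {"REQ-1", "REQ-2", "REQ-3"}

def cases_per_requirement(cases):
    """Metric: number of test cases per requirement."""
    counts = {}
    for tc in cases:
        counts[tc["requirement"]] = counts.get(tc["requirement"], 0) + 1
    return counts

def requirement_coverage(cases, requirements):
    """Metric: fraction of requirements covered by at least one test case."""
    covered = {tc["requirement"] for tc in cases}
    return len(covered & requirements) / len(requirements)

print(cases_per_requirement(test_cases))                   # {'REQ-1': 2, 'REQ-2': 1}
print(requirement_coverage(test_cases, all_requirements))  # 2 of 3 requirements, ~0.67
```

A coverage below 1.0 (here, REQ-3 has no test case) is exactly the kind of gap these metrics are meant to surface before delivery.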
Test management - testing metrics
