Component Testing: Part I



  1. Component Testing: Part I Ednaldo Dilorenzo de Souza Filho [email_address]
  2. Summary <ul><li>Introduction </li></ul><ul><li>Fundamentals of Testing </li></ul><ul><li>Software Testability </li></ul><ul><li>Test Case Projects </li></ul><ul><li>Environment Testing, Architecture, and Specialized Applications </li></ul><ul><li>Software Test Strategies </li></ul><ul><li>Component Testing </li></ul>
  3. Introduction <ul><li>Testing is an important process in support of quality assurance; </li></ul><ul><li>As software becomes more pervasive and is used more often to perform critical tasks, it will be required to be of higher quality; </li></ul><ul><li>Because testing requires the execution of the software, it is often called dynamic analysis [Harrold]; </li></ul><ul><li>Testing consists of comparing the outputs of executions with the expected results to identify the test cases on which the software failed; </li></ul><ul><li>Testing cannot show the absence of faults; it can show only their presence. </li></ul>
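The comparison of actual outputs with expected results described above can be sketched in a few lines of Python. The function under test (`add`) and the test cases are hypothetical, for illustration only:

```python
def add(a, b):
    """Hypothetical function under test."""
    return a + b

def run_test_cases(func, cases):
    """Compare each actual output with its expected result;
    return the list of failing cases (args, expected, actual)."""
    failures = []
    for args, expected in cases:
        actual = func(*args)
        if actual != expected:
            failures.append((args, expected, actual))
    return failures

# Each test case pairs an input with its expected output.
failures = run_test_cases(add, [((1, 2), 3), ((0, 0), 0), ((-1, 1), 0)])
print(failures)  # [] -- no case failed, which does not prove the absence of faults
```

An empty failure list illustrates the last bullet above: passing tests show only that no fault was revealed by these cases, not that none exist.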
  4. Fundamentals of Testing <ul><li>Testing goals: </li></ul><ul><ul><li>Find faults in a software program; </li></ul></ul><ul><ul><li>A good test case has a high probability of finding errors; </li></ul></ul><ul><ul><li>A good test process should find errors; </li></ul></ul><ul><li>Testing fundamentals: </li></ul><ul><ul><li>Testing should be connected to client requirements; </li></ul></ul><ul><ul><li>Testing should be planned before test execution; </li></ul></ul><ul><ul><li>Isolation of suspected components; </li></ul></ul><ul><ul><li>Testing should start with individual components; </li></ul></ul><ul><ul><li>Complete testing is impossible; </li></ul></ul><ul><ul><li>Testing shouldn’t be executed by the developers themselves; </li></ul></ul>
  5. Software Testability <ul><li>“Software testability means how easily the software can be tested” [PRESS]; </li></ul><ul><li>There are several characteristics of testable software: </li></ul><ul><ul><li>Operability; </li></ul></ul><ul><ul><li>Observability; </li></ul></ul><ul><ul><li>Controllability; </li></ul></ul><ul><ul><li>Decomposability; </li></ul></ul><ul><ul><li>Simplicity; </li></ul></ul><ul><ul><li>Stability; </li></ul></ul><ul><ul><li>Understandability; </li></ul></ul>
  6. Software Testability <ul><li>How do you design for software testability, and how do you measure the degree to which you have achieved it? </li></ul><ul><li>Software testability analysis investigates where software faults can hide and how likely testing is to reveal them [Voas]; </li></ul><ul><li>Testability provides some guidance for testing; </li></ul><ul><li>Testability can give you confidence in correctness with fewer tests; </li></ul>
  7. Software Testability – Design for Testability <ul><li>Testability’s goal is to assess software accurately enough to demonstrate whether or not it has high quality; </li></ul><ul><li>There are two ways to reduce the number of required tests: </li></ul><ul><ul><li>Select tests that have a greater ability to reveal faults; </li></ul></ul><ul><ul><li>Design software that has a greater ability to fail when faults do exist (design for testability); </li></ul></ul><ul><li>There are several criteria for program design: </li></ul><ul><ul><li>More of the code must be exercised for each input; </li></ul></ul><ul><ul><li>Programs must contain constructs that are likely to cause the state of the program to become incorrect if the constructs are themselves incorrect; </li></ul></ul><ul><ul><li>Programs must be able to propagate incorrect states into software failures; </li></ul></ul>
  8. Software Testability – Design for Testability <ul><li>Information Loss; </li></ul><ul><li>Implicit Information Loss; </li></ul><ul><li>Explicit Information Loss; </li></ul><ul><li>Design Heuristics; </li></ul><ul><li>Specification Decomposition; </li></ul><ul><li>Minimization of variable reuse; </li></ul><ul><li>Analysis; </li></ul>
  9. Software Testability – Sensitivity Analysis <ul><li>“Sensitivity analysis quantifies behavioral information about the likelihood that faults are hiding” [Voas]; </li></ul><ul><li>It repeatedly executes the original program and mutations of its source code and data states, using two assumptions: </li></ul><ul><ul><li>Single-fault assumption; </li></ul></ul><ul><ul><li>Simple-fault assumption; </li></ul></ul><ul><li>The specific purpose of sensitivity analysis is to provide information that suggests how small the program’s smallest faults are likely to be; </li></ul><ul><li>The strength of sensitivity analysis is that its prediction is based on observed effects from actual injected faults; </li></ul><ul><li>The weakness is that the faults injected and observed are only a small set from what might be an infinite class of faults; </li></ul>
  10. Software Testability – Sensitivity Analysis <ul><li>Execution Analysis; </li></ul><ul><li>Infection Analysis; </li></ul><ul><li>Propagation Analysis; </li></ul><ul><li>Implementation; </li></ul>
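A minimal sketch of the idea behind sensitivity analysis, under the single-fault assumption: execute the original program and a mutated variant on the same inputs and measure how often the injected fault propagates to a visible output difference. The toy function and the hand-written mutant below are illustrative assumptions, not part of the slides:

```python
def original(x):
    """Toy program under analysis."""
    return 2 * x + 1

def mutant(x):
    """Single injected fault: '+' replaced by '-'."""
    return 2 * x - 1

def propagation_rate(inputs):
    """Fraction of inputs on which the injected fault
    propagates to a different (observable) output."""
    inputs = list(inputs)
    differing = sum(1 for x in inputs if original(x) != mutant(x))
    return differing / len(inputs)

print(propagation_rate(range(10)))  # 1.0 -- this fault is revealed by every input
```

A fault with a low propagation rate would be exactly the kind of small, hard-to-see fault that sensitivity analysis tries to quantify.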
  11. Test Case Projects <ul><li>The main goal of test case design is to have a high probability of finding faults in the software; </li></ul><ul><li>Hitachi Software has attained such high software quality that only 0.02 percent of all bugs in its software programs emerge at the user’s site [Yamaura]; </li></ul>
  12. Test Case Projects <ul><li>Documenting Test Cases – Benefits </li></ul><ul><ul><li>Designing test cases gives you a chance to analyze the specification from a different angle; </li></ul></ul><ul><ul><li>You can repeat the same test cases; </li></ul></ul><ul><ul><li>Somebody else can execute the test cases for you; </li></ul></ul><ul><ul><li>You can easily validate the quality of the test cases; </li></ul></ul><ul><ul><li>You can estimate the quality of the target software early on; </li></ul></ul><ul><li>Schedule, Cost, and Personnel </li></ul><ul><ul><li>An average 12-month project might spend 2 months in testing by the quality assurance team; </li></ul></ul><ul><ul><li>Test-case density; </li></ul></ul><ul><ul><li>A project with 100,000 LOC needs approximately 10,000 test cases – 1,000 per programmer; </li></ul></ul>
  13. Test Case Projects <ul><li>Steps for Debugging and Testing: </li></ul><ul><ul><li>Design a set of test cases; </li></ul></ul><ul><ul><li>Check the test cases; </li></ul></ul><ul><ul><li>Do code inspection based on the test cases; </li></ul></ul><ul><ul><li>Do machine debugging based on the test cases; </li></ul></ul><ul><ul><li>Collect quality-related data during debugging; </li></ul></ul><ul><ul><li>Analyze the data during debugging; </li></ul></ul>
  14. Test Case Projects <ul><li>Main ways of testing software: </li></ul><ul><ul><li>White-box Testing; </li></ul></ul><ul><ul><ul><li>Basis Path Testing; </li></ul></ul></ul><ul><ul><ul><ul><li>Flow Graph Notation; </li></ul></ul></ul></ul><ul><ul><ul><ul><li>Cyclomatic Complexity; </li></ul></ul></ul></ul><ul><ul><ul><ul><li>Test Case Derivation; </li></ul></ul></ul></ul><ul><ul><ul><ul><li>Graph Matrices; </li></ul></ul></ul></ul><ul><ul><ul><li>Control Structure Testing; </li></ul></ul></ul><ul><ul><ul><ul><li>Condition Testing; </li></ul></ul></ul></ul><ul><ul><ul><ul><li>Data Flow Testing; </li></ul></ul></ul></ul><ul><ul><ul><ul><li>Loop Testing; </li></ul></ul></ul></ul><ul><ul><li>Black-box Testing; </li></ul></ul><ul><ul><ul><li>Graph-Based Testing Methods; </li></ul></ul></ul><ul><ul><ul><li>Equivalence Partitioning; </li></ul></ul></ul><ul><ul><ul><li>Comparison Testing; </li></ul></ul></ul><ul><ul><ul><li>Orthogonal Array Testing; </li></ul></ul></ul>
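Of the black-box methods listed above, equivalence partitioning is easy to sketch: inputs are divided into classes expected to behave alike, and one representative per class (plus boundary values) is tested. The validator `classify_age` and its partitions are hypothetical examples:

```python
def classify_age(age):
    """Hypothetical input validator under test."""
    if age < 0:
        return "invalid"
    if age < 18:
        return "minor"
    return "adult"

# One representative per equivalence class, plus boundary values
# (the class boundaries 0 and 18 are the likeliest fault sites).
partitions = {
    "invalid": [-1],
    "minor":   [0, 17],
    "adult":   [18, 65],
}

for expected, samples in partitions.items():
    for age in samples:
        assert classify_age(age) == expected, (age, expected)
print("all equivalence classes pass")
```

Five test cases here stand in for the infinite input domain, which is the point of the technique: a small set of representatives with a high probability of revealing class-wide faults.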
  15. Environment Testing, Architecture and Specialized Applications <ul><li>GUI Testing; </li></ul><ul><li>Client/Server Architecture Testing; </li></ul><ul><li>Documentation and Help Facilities Testing; </li></ul><ul><li>Real-Time Systems Testing; </li></ul>
  16. Software Test Strategies <ul><li>Integration of test case design methods into a well-planned series of steps, resulting in successful software construction; </li></ul><ul><li>A strategy should be flexible enough to accommodate customized test methods, yet rigid enough to support planning and management; </li></ul><ul><li>A software test strategy should include both low-level and high-level tests; </li></ul>
  17. Software Test Strategies <ul><li>A Strategic Approach for Testing Software </li></ul><ul><ul><li>Verification and Validation </li></ul></ul><ul><ul><ul><li>Verification refers to the activities that assure the software correctly implements a specific function; </li></ul></ul></ul><ul><ul><li>Software Test Organization </li></ul></ul><ul><ul><ul><li>Software developers should execute only unit tests; </li></ul></ul></ul><ul><ul><ul><li>An independent test group (ITG) is responsible for executing destructive tests during software integration; </li></ul></ul></ul><ul><ul><ul><li>Software developers should be available to correct the system’s faults; </li></ul></ul></ul><ul><ul><li>A Software Testing Strategy </li></ul></ul><ul><ul><ul><li>Unit Test; </li></ul></ul></ul><ul><ul><ul><li>Integration Test; </li></ul></ul></ul><ul><ul><ul><li>Validation Test; </li></ul></ul></ul><ul><ul><ul><li>System Test; </li></ul></ul></ul>
  18. Software Test Strategies <ul><li>A Strategic Approach for Testing Software </li></ul><ul><ul><li>Testing Completion Criteria </li></ul></ul><ul><ul><ul><li>When is the test phase finished? </li></ul></ul></ul><ul><ul><ul><li>Based on statistical criteria [PRESS]: </li></ul></ul></ul>f(t) = (1/p) ln(λ0 · p · t + 1), where f(t) is the expected number of failures after testing time t, λ0 is the initial software failure intensity, and p is the exponential rate at which failure intensity decreases.
  19. Software Test Strategies <ul><li>Strategic Aspects </li></ul><ul><ul><li>Specify product requirements in a quantifiable way long before testing; </li></ul></ul><ul><ul><li>State testing objectives explicitly; </li></ul></ul><ul><ul><li>Understand the users of the software and develop a profile for each user category; </li></ul></ul><ul><ul><li>Develop a testing plan that emphasizes a rapid test cycle; </li></ul></ul><ul><ul><li>Build robust software that is designed to test itself; </li></ul></ul><ul><ul><li>Use effective formal technical reviews as a filter prior to testing; </li></ul></ul><ul><ul><li>Conduct formal technical reviews to evaluate the test strategy and the test cases; </li></ul></ul><ul><ul><li>Develop a continuous-improvement approach for the test process; </li></ul></ul>
  20. Software Test Strategies <ul><li>Unit Test </li></ul><ul><ul><li>Used for testing software components separately; </li></ul></ul><ul><ul><li>Uses white-box testing to exercise code paths; </li></ul></ul><ul><ul><li>Test cases should be planned together with the expected results for comparison; </li></ul></ul>
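A minimal unit test sketch using Python's standard unittest module. The component under test (`Stack`) is hypothetical; each test case states its expected result in advance, as the slide recommends:

```python
import unittest

class Stack:
    """Component under test: a simple LIFO stack."""
    def __init__(self):
        self._items = []
    def push(self, item):
        self._items.append(item)
    def pop(self):
        if not self._items:
            raise IndexError("pop from empty stack")
        return self._items.pop()

class StackTest(unittest.TestCase):
    def test_pop_returns_last_pushed_item(self):
        s = Stack()
        s.push(1)
        s.push(2)
        self.assertEqual(s.pop(), 2)  # expected result planned before execution

    def test_pop_on_empty_stack_raises(self):
        self.assertRaises(IndexError, Stack().pop)

suite = unittest.TestLoader().loadTestsFromTestCase(StackTest)
result = unittest.TextTestRunner(verbosity=0).run(suite)
print("all unit tests passed:", result.wasSuccessful())
```

Note that the component is exercised in complete isolation, with no other modules involved, which is what distinguishes unit testing from the integration steps that follow.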
  21. Software Test Strategies <ul><li>Integration Test </li></ul><ul><ul><li>If the separate components work, why should I run integration tests? </li></ul></ul><ul><ul><ul><li>Data can be lost across an interface; </li></ul></ul></ul><ul><ul><ul><li>One module can have an inadvertent, unexpected effect on another; </li></ul></ul></ul><ul><ul><li>Top-Down Integration; </li></ul></ul><ul><ul><li>Bottom-up Integration; </li></ul></ul><ul><ul><li>Regression Test; </li></ul></ul><ul><ul><li>Smoke Test; </li></ul></ul>
  22. Software Test Strategies <ul><li>Top-Down Integration </li></ul><ul><ul><li>Depth-first integration; </li></ul></ul><ul><ul><li>Breadth-first integration; </li></ul></ul><ul><ul><li>The main control module is used first for testing; </li></ul></ul><ul><ul><li>Subordinate modules are then added incrementally; </li></ul></ul><ul><ul><li>Tests are executed as each component is added; </li></ul></ul><ul><ul><li>Regression tests can be executed to assure that new errors have not been introduced; </li></ul></ul>(diagram: module hierarchy M1–M8)
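The top-down steps above can be sketched as follows: the main control module is tested first with its subordinates replaced by stubs, and the stubs are then replaced by real modules one at a time. All module names are hypothetical placeholders for the M1–M8 hierarchy in the diagram:

```python
def stub_m2(data):
    """Stub standing in for the real subordinate module M2:
    returns a canned answer so M1 can be tested first."""
    return "stub-m2"

def real_m2(data):
    """The real subordinate module, integrated later."""
    return f"m2({data})"

def m1(subordinate, data):
    """Main control module; delegates to whichever subordinate is wired in."""
    return f"m1->{subordinate(data)}"

# Step 1: exercise the main control module against the stub.
print(m1(stub_m2, "x"))  # m1->stub-m2
# Step 2: replace the stub with the real module and re-run the same test.
print(m1(real_m2, "x"))  # m1->m2(x)
```

Re-running the same test after each replacement is exactly the regression step mentioned in the last bullet.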
  23. Software Test Strategies <ul><li>Bottom-up Integration </li></ul><ul><ul><li>Atomic (lowest-level) modules are integrated first; </li></ul></ul><ul><ul><li>Higher-level components are added once the low-level components have been tested; </li></ul></ul>(diagram: clusters D1–D3 combined under control modules Ma–Mc)
  24. Software Test Strategies <ul><li>Regression Test </li></ul><ul><ul><li>Each component added to the software may cause errors in components added before it; </li></ul></ul><ul><ul><li>A regression test strategy re-tests previously tested components to assure that new components have not affected them; </li></ul></ul><ul><ul><li>Tools can be used to execute regression tests; </li></ul></ul><ul><ul><li>The regression suite should be separated into modules; </li></ul></ul><ul><li>Analyzing Regression Test Selection Techniques [Rothermel] </li></ul><ul><ul><li>Regression Test Selection for Fault Detection; </li></ul></ul><ul><ul><li>Framework for Analyzing Regression Test Selection Techniques </li></ul></ul><ul><ul><ul><li>Inclusiveness; </li></ul></ul></ul><ul><ul><ul><li>Precision; </li></ul></ul></ul><ul><ul><ul><li>Efficiency; </li></ul></ul></ul><ul><ul><ul><li>Generality; </li></ul></ul></ul><ul><ul><ul><li>Tradeoffs; </li></ul></ul></ul>
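One simple regression-test-selection scheme, sketched under illustrative assumptions (the test names, module names, and coverage map are hypothetical): record which modules each test covers, then rerun only the tests that touch a changed module. This selection is inclusive but not necessarily precise, in the terms of the [Rothermel] framework above:

```python
# Map from each regression test to the set of modules it covers.
coverage = {
    "test_login":  {"auth", "session"},
    "test_report": {"report"},
    "test_logout": {"auth"},
}

def select_tests(changed_modules):
    """Select every test whose coverage intersects the changed modules."""
    return sorted(t for t, mods in coverage.items() if mods & changed_modules)

print(select_tests({"auth"}))    # ['test_login', 'test_logout']
print(select_tests({"report"}))  # ['test_report']
```

Selecting by coverage trades precision for efficiency: a selected test may still pass (wasted effort), but no test whose covered code changed is omitted.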
  25. Software Test Strategies <ul><li>Analyzing Regression Test Selection Techniques </li></ul><ul><ul><li>An Analysis of Regression Test Selection Techniques </li></ul></ul>
  28. Software Test Strategies <ul><li>Smoke Test </li></ul><ul><ul><li>Used for software products developed on a rapid timeline; </li></ul></ul><ul><ul><li>Builds composed of components are tested; </li></ul></ul><ul><ul><li>Builds are integrated and tested daily: </li></ul></ul><ul><ul><ul><li>Integration risk is minimized; </li></ul></ul></ul><ul><ul><ul><li>Final product quality is improved; </li></ul></ul></ul><ul><ul><ul><li>Error diagnosis is simplified; </li></ul></ul></ul><ul><ul><ul><li>Progress is easier to evaluate; </li></ul></ul></ul>
  29. Software Test Strategies <ul><li>Validation Tests </li></ul><ul><ul><li>Started just after integration tests; </li></ul></ul><ul><ul><li>Used to verify results against the requirements of the system; </li></ul></ul><ul><ul><li>An important element of validation testing is configuration review; </li></ul></ul><ul><ul><li>The customer should validate the requirements by using the system; </li></ul></ul><ul><ul><li>Alpha tests are executed at the developer’s site, in the developer’s presence; </li></ul></ul><ul><ul><li>Beta tests are executed at the user’s site, without the developer’s presence; </li></ul></ul><ul><li>Validation, Verification, and Testing: Diversity Rules [Kitchenham] </li></ul><ul><ul><li>Practical Problems with Operational Testing </li></ul></ul><ul><ul><ul><li>It assumes that the most frequently occurring operations will have the highest manifestation of faults; </li></ul></ul></ul><ul><ul><ul><li>Transition situations are often the most error-prone; </li></ul></ul></ul>
  30. Software Test Strategies <ul><li>Validation, Verification, and Testing: Diversity Rules [Kitchenham] </li></ul><ul><ul><li>Testing Critical Functions </li></ul></ul><ul><ul><ul><li>Critical functions have extremely severe consequences if they fail; </li></ul></ul></ul><ul><ul><ul><li>This leads to a problem with identifying the reliability of the system as a whole; </li></ul></ul></ul><ul><ul><ul><li>Product reliability can be defined in terms of failures in a given period of time; </li></ul></ul></ul><ul><ul><ul><li>For critical functions, you are likely to need a measure of reliability related to failure on demand; </li></ul></ul></ul><ul><ul><li>The Need for Non-Execution-Based Testing </li></ul></ul><ul><ul><ul><li>You can ignore nonoperational testing only if: </li></ul></ul></ul><ul><ul><ul><ul><li>All faults found by execution testing have exactly the same cost to debug and correct, </li></ul></ul></ul></ul><ul><ul><ul><ul><li>All faults can be detected, and </li></ul></ul></ul></ul><ul><ul><ul><ul><li>The relative size of faults is constant across different operations for all time. </li></ul></ul></ul></ul>
  31. Software Test Strategies <ul><li>Validation, Verification, and Testing: Diversity Rules [Kitchenham] </li></ul><ul><ul><li>Diminishing Returns </li></ul></ul><ul><ul><ul><li>Operational testing becomes less and less useful for fault detection once faults have been removed from the most commonly used functions; </li></ul></ul></ul><ul><ul><ul><li>A systems engineer should mix several techniques, including: </li></ul></ul></ul><ul><ul><ul><ul><li>Additional testing of boundary and transition conditions and of critical functions, </li></ul></ul></ul></ul><ul><ul><ul><ul><li>Design inspections and reviews to identify specification and design faults early in the development process, </li></ul></ul></ul></ul><ul><ul><ul><ul><li>Proofs for functions whose correct outcome cannot otherwise be verified, and </li></ul></ul></ul></ul><ul><ul><ul><ul><li>Operational testing for those initial tests aimed at identifying the largest faults and assessing reliability. </li></ul></ul></ul></ul>
  32. Software Test Strategies <ul><li>System Test </li></ul><ul><ul><li>Should exercise the system as a whole to find errors; </li></ul></ul><ul><ul><li>Recovery Test: </li></ul></ul><ul><ul><ul><li>Force the system to fail; </li></ul></ul></ul><ul><ul><ul><li>Verify that the system recovers; </li></ul></ul></ul><ul><ul><li>Security Test: </li></ul></ul><ul><ul><ul><li>Verify whether attackers can compromise the system; </li></ul></ul></ul><ul><ul><ul><li>Testers should try to obtain protected data from the system; </li></ul></ul></ul><ul><li>Stress Test </li></ul><ul><ul><li>Execute the system in a way that demands resources in abnormal quantity; </li></ul></ul><ul><ul><li>Sensitivity Test </li></ul></ul><ul><ul><ul><li>Try to find classes of input data that make the system fail; </li></ul></ul></ul><ul><li>Performance Test </li></ul><ul><ul><li>For systems that must meet performance requirements; </li></ul></ul><ul><ul><li>Usually executed together with stress tests; </li></ul></ul>
  33. Software Test Strategies <ul><li>Debugging </li></ul><ul><ul><li>The process that starts after a test finds an error and attempts to remove its cause; </li></ul></ul><ul><ul><li>It relates a failure to its cause; </li></ul></ul><ul><ul><li>Two possible results: </li></ul></ul><ul><ul><ul><li>The cause is found and corrected; </li></ul></ul></ul><ul><ul><ul><li>The cause is not found; </li></ul></ul></ul><ul><ul><li>Why is debugging so difficult? </li></ul></ul><ul><ul><ul><li>The symptom and its cause may be geographically remote; </li></ul></ul></ul><ul><ul><ul><li>The symptom may disappear when another error is corrected; </li></ul></ul></ul><ul><ul><ul><li>The symptom may be caused by something that is not an error; </li></ul></ul></ul><ul><ul><ul><li>The symptom may be caused by human error; </li></ul></ul></ul><ul><ul><ul><li>The symptom may be a timing problem; </li></ul></ul></ul><ul><ul><ul><li>The symptom may be difficult to reproduce; </li></ul></ul></ul><ul><ul><ul><li>The symptom may be due to causes distributed across several components; </li></ul></ul></ul><ul><ul><ul><li>The symptom may appear in one component but be caused by another; </li></ul></ul></ul>
  34. Component Testing <ul><li>Testing Component-Based Software: A Cautionary Tale [Weyuker] </li></ul><ul><ul><li>The Ariane 5 Lesson </li></ul></ul><ul><ul><ul><li>In June 1996, during the maiden voyage of the Ariane 5 launch vehicle, the launcher veered off course and exploded less than one minute after take-off; </li></ul></ul></ul><ul><ul><ul><li>The explosion resulted from insufficiently tested software reused from the Ariane 4 launcher; </li></ul></ul></ul><ul><ul><li>Testing Newly Developed Software </li></ul></ul><ul><ul><ul><li>Unit Testing; </li></ul></ul></ul><ul><ul><ul><li>Integration Testing; </li></ul></ul></ul><ul><ul><ul><li>System Testing; </li></ul></ul></ul>
  35. Component Testing <ul><li>Testing Component-Based Software: A Cautionary Tale [Weyuker] </li></ul><ul><ul><li>Problems with systems built from reusable components: </li></ul></ul><ul><ul><ul><li>Performance problems; </li></ul></ul></ul><ul><ul><ul><li>Fitting the selected components together; </li></ul></ul></ul><ul><ul><ul><li>Absence of low-level understanding; </li></ul></ul></ul><ul><ul><li>When a component is developed for a particular project with no expectation of reuse, testing proceeds as usual; </li></ul></ul><ul><ul><li>When priorities change and the component is reused, significant additional testing should be done; </li></ul></ul><ul><ul><li>To modify or not to modify the source code? </li></ul></ul><ul><ul><li>Facilitating Reusable Component Testing </li></ul></ul><ul><ul><ul><li>Assign a person or a team to maintain each component; </li></ul></ul></ul><ul><ul><ul><li>Add a software specification; </li></ul></ul></ul><ul><ul><ul><li>Add a test suite; </li></ul></ul></ul><ul><ul><ul><li>Add pointers between the specification and the parts of the test suite; </li></ul></ul></ul>
  36. Component Testing <ul><li>Third-Party Testing and the Quality of Software Components [Councill] </li></ul><ul><ul><li>Ten contributors devoted 70 person-hours to developing a definition of component; </li></ul></ul><ul><ul><li>The Ariane project; </li></ul></ul><ul><ul><li>The certification business introduced a maturity model focused on large organizations; </li></ul></ul><ul><ul><li>99 percent of all US businesses are small businesses; </li></ul></ul><ul><ul><li>Producers provide the development and testing procedures; purchasers or their agents conduct second-party testing; and independent testing organizations perform third-party testing; </li></ul></ul>
  37. References <ul><li>[PRESS] Pressman, Roger S., Engenharia de Software , 5th ed., Rio de Janeiro, McGraw-Hill, 2002. </li></ul><ul><li>[Councill] William T. Councill, Third-Party Testing and the Quality of Software Components , IEEE Software, Quality Time, July 1999. </li></ul><ul><li>[Adams] Tom Adams, Functionally Effective Functional Testing Primer , IEEE Software, Bookshelf, September 1995. </li></ul><ul><li>[Rothermel] G. Rothermel and M. J. Harrold, Analyzing Regression Test Selection Techniques , IEEE Transactions on Software Engineering, Vol. 22, No. 8, August 1996. </li></ul><ul><li>[Stocks] P. Stocks and D. Carrington, A Framework for Specification-Based Testing , IEEE Transactions on Software Engineering, Vol. 22, No. 11, November 1996. </li></ul><ul><li>[Harrold] M. J. Harrold, Testing: A Roadmap , College of Computing, Georgia Institute of Technology, 2000. </li></ul>
  38. References <ul><li>[Frankl] P. G. Frankl and R. G. Hamlet, Evaluating Testing Methods by Delivered Reliability , IEEE Transactions on Software Engineering, Vol. 24, No. 8, August 1998. </li></ul><ul><li>[Kitchenham] B. Kitchenham and S. Linkman, Validation, Verification, and Testing: Diversity Rules , IEEE Software, August 1998. </li></ul><ul><li>[Voas] J. M. Voas and K. W. Miller, Software Testability: The New Verification , IEEE Software, May 1995. </li></ul><ul><li>[Yamaura] T. Yamaura, How to Design Practical Test Cases , IEEE Software, December 1998. </li></ul><ul><li>[Weyuker] E. J. Weyuker, Testing Component-Based Software: A Cautionary Tale , IEEE Software, October 1998. </li></ul>