Matada Prashanth


1. Testing: A Roadmap
   Mary Jean Harrold, College of Computing, Georgia Institute of Technology
   Presented by Prashanth L and Anmol N M
2. Introduction
   - Definition
   - Purposes for which testing is performed
   - Key concepts
   - Advantages
   - Disadvantages
3. Introduction
   - Definition
     - Software testing is any activity aimed at evaluating an attribute or capability of a program or system and determining that it meets its required results.
     - Testing constitutes more than 50% of the cost of software development.
4. Introduction
   - Purposes
     - To improve quality
     - For verification and validation (V&V)
       - Functionality (exterior quality): correctness, reliability, usability, integrity
       - Engineering (interior quality): efficiency, testability, documentation, structure
       - Adaptability (future quality): flexibility, reusability, maintainability
     - For reliability estimation
5. Introduction
   - Key concepts
   - Taxonomy
     - Correctness testing
     - Performance testing
     - Reliability testing
     - Security testing
   - Testing automation
   - When to stop testing?
6. Introduction
   - Correctness testing
     - Checks the minimum requirements of the software.
     - Can be either black-box or white-box.
     - Black-box: treats the software as a black box; only the inputs and outputs are visible. Used for basic functionality testing.
     - White-box: the structure and flow of the software under test are visible.
   - Performance testing: uncovers design problems that cause system performance to degrade.
   - Reliability testing: the robustness of a software component is the degree to which it can function correctly in the presence of exceptional inputs or stressful environmental conditions.
   - Security testing: includes identifying and removing software flaws that may potentially lead to security violations, and validating the effectiveness of security measures. Simulated security attacks can be performed to find vulnerabilities.
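The black-box/white-box distinction on this slide can be made concrete with a toy function (our own illustration, not from the paper): black-box tests are derived purely from the specification, while white-box tests are chosen to exercise each branch of the implementation.

```python
# A tiny function under test: clamp a value into the range [lo, hi].
def clamp(x, lo, hi):
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x

# Black-box tests: written only from the input/output specification,
# with no knowledge of the implementation.
assert clamp(5, 0, 10) == 5
assert clamp(-3, 0, 10) == 0
assert clamp(42, 0, 10) == 10

# White-box tests: chosen by inspecting the code so that each branch
# (x < lo, x > hi, and the fall-through) is exercised.
assert clamp(-1, 0, 10) == 0   # covers the x < lo branch
assert clamp(11, 0, 10) == 10  # covers the x > hi branch
assert clamp(3, 0, 10) == 3    # covers the fall-through case
```

Here the two test sets happen to overlap; in practice white-box testing adds cases (e.g., boundary branches) that a purely specification-driven tester might miss.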
7. Introduction
   - Testing automation
     - Automation is a good way to cut down time and cost.
     - Software testing tools and techniques usually suffer from a lack of generic applicability and scalability.
   - When to stop testing?
     - Testing is a trade-off between budget, time, and quality.
     - It is driven by profit models.
     - Pessimistic criterion: stop when time, budget, or test cases are exhausted.
     - Optimistic criterion: stop when reliability meets the requirement, or the benefit from continued testing cannot justify the testing cost.
   - Advantages of automation
     - Easy generation of test cases and instrumentation of software.
     - Process automation and execution in the expected environment.
8. Roadmap
   - Fundamental research
     - Testing component-based systems
     - Testing based on precode artifacts
     - Testing evolving software
     - Demonstrating effectiveness of testing techniques
     - Using testing artifacts
     - Other testing techniques
     - Methods and tools
     - Empirical studies
     - Testing resources
9. Testing Component-Based Systems
   - Issues
     - Component provider (developer of the components): views components independently of any context.
     - Component user (application developer): views components relative to the application.
   - Limiting factor
     - Availability of source code
   - Roadmap suggested
     - Identify the types of testing information needed.
     - Develop techniques for representing and computing the types of testing information the user needs.
     - Develop techniques to use the information provided with the component for testing the application.
10. Testing Based on Precode Artifacts
   - Issues
     - Design
     - Requirements
     - Architectural specifications
   - Issue under the spotlight
     - Architecture
   - Roadmap suggested
     - Use formal notations for software architecture.
     - Develop techniques to be used with architectural specifications for test-case development.
     - Develop techniques to evaluate software architectures for testability.
11. Testing Evolving Software
   - Regression testing: validate modified software to ensure no new errors are introduced. One of the most expensive parts of maintenance!
   - Some useful techniques
     - Select a subset of the test suite from previous testing.
     - Techniques to help manage growth in the size of the test suite.
     - Assess regression testability.
   - Testing techniques needed
     - Not only for software but also for architecture and requirements.
     - Manage the test suites themselves.
     - Identify parts of the modified software for which new test cases are required.
     - Identify test cases that are no longer needed.
     - Prioritize test cases.
     - Assess the test suites themselves.
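A common way to "select a subset of the test suite from previous testing" is coverage-based regression test selection: rerun only the tests whose recorded coverage touches a modified part of the program. This sketch is our own illustration of the idea, not the paper's specific algorithm; the test names and line numbers are hypothetical.

```python
# Coverage recorded during the previous test run: test name -> lines executed.
coverage = {
    "test_login":    {10, 11, 12},
    "test_logout":   {20, 21},
    "test_checkout": {12, 30, 31},
}

# Lines changed between the old and new versions (from a diff).
modified_lines = {12, 31}

# Select every test whose coverage intersects the modified lines;
# the others provably executed no changed code and can be skipped.
selected = [t for t, lines in coverage.items() if lines & modified_lines]
print(selected)  # only test_login and test_checkout need rerunning
```

Real selection techniques work on control-flow entities rather than raw line numbers, but the intersection test is the core of the approach.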
12. Demonstrating Effectiveness of Testing Techniques
   - How?
     - Increase developers' confidence.
     - Characterize software behavior.
     - Identify classes of faults for a given test criterion.
     - Provide a visual interface.
     - Determine interactions among test-selection criteria and ways to combine them.
   - Research has been done in
     - Evaluation criteria to determine the adequacy of test suites, and test cases that inspire confidence.
     - Test cases based on either the software's intended behavior or purely on code.
     - Test cases based on data flow in a program.
     - Using existing testing techniques to test visual programming languages.
     - Testing complex Boolean expressions.
     - Mutation analysis and approximation (e.g., avoiding testing of pointer variables in data-flow analysis).
13. Establishing an Effective Process for Testing
   - Need to develop a process for planning and implementing testing.
   - Some currently used or proposed techniques
     - Develop a test plan during the requirements-gathering phase and implement it after the software implementation phase. Useful?
     - What does Microsoft do?
       - Frequently synchronize what people are doing.
       - Periodically stabilize the product in increments as the project proceeds.
       - Build and test a version every night! (Only Microsoft can do this...)
     - Perpetual testing: build a foundation for treating analysis and testing as ongoing activities for improved quality.
     - Selective regression testing, where we test one version and gather testing artifacts such as input/output pairs and coverage information.
     - An explicit process for regression testing that integrates many key testing techniques into the development and maintenance of evolving software.
14. Establishing an Effective Process for Testing (cont'd)
   - Some open questions
     - Does Microsoft's nightly rebuild minimize testing later?
     - Does testing show all the software qualities?
     - Can results obtained from testing be generalized?
   - Some useful suggestions
     - Integrate various quality techniques and tools.
     - Combine static analysis with testing.
15. Using Testing Artifacts
   - Artifacts include
     - Execution traces of the software's runs with test cases.
     - Results of execution, such as pass/fail of the software for given test cases.
   - Useful: store the results and use them for retesting modified software.
   - Much research has been done here; some proposals:
     - Use dynamic program slices derived from execution traces, along with the pass/fail results of those traces, to identify potentially faulty code; apply heuristics over the subsets of the test suite that passed and failed.
     - Identify program invariants and use them to drive test-case generation.
     - Use coverage information to predict the magnitude of regression testing.
     - Use coverage information to select test cases from the test suite for use in regression testing.
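One concrete realization of "use traces plus pass/fail results to identify faulty code" is spectrum-based fault localization: rank each statement by how disproportionately it appears in failing runs. The suspiciousness ratio below is our formulation of that heuristic (the paper only outlines the idea), and the statement names and counts are hypothetical.

```python
# Per-statement execution counts: statement -> (covered by failing tests,
# covered by passing tests), collected from stored execution traces.
spectra = {
    "s1": (0, 3),   # only passing tests reach s1
    "s2": (2, 1),   # mostly failing tests reach s2
    "s3": (1, 2),
}
total_fail, total_pass = 2, 3

def suspiciousness(fail_cov, pass_cov):
    # Fraction of failing vs. passing tests that executed the statement;
    # statements executed mainly by failing tests score close to 1.
    f = fail_cov / total_fail
    p = pass_cov / total_pass
    return f / (f + p) if (f + p) else 0.0

# Examine statements in descending order of suspiciousness.
ranked = sorted(spectra, key=lambda s: suspiciousness(*spectra[s]), reverse=True)
print(ranked)  # s2 ranks as most suspicious
```

The ranking only suggests where to look first; the developer still confirms the fault by inspection.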
16. Using Testing Artifacts (cont'd)
   - Proposed research (cont'd)
     - Use artifacts for test-suite reduction and prioritization.
     - Perform concept analysis on coverage information and compute relationships among executed entities; this helps uncover properties of test suites.
     - Path spectra identify paths where control diverges in program execution, which is helpful in debugging, testing, and maintenance, but expensive!
     - Use branch spectra, a less expensive form of profiling.
     - Use visual tools to analyze test information.
   - Additional research needed
     - Use of testing artifacts for software engineering tasks.
     - Identify the types of information software engineers and managers need (data mining).
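Test-suite reduction from coverage artifacts is commonly cast as set cover: keep choosing the test that covers the most not-yet-covered requirements. This greedy sketch is a standard heuristic, not the paper's specific algorithm; the test and requirement names are hypothetical.

```python
# Coverage artifact from a previous run: test -> requirements it satisfies.
coverage = {
    "t1": {"r1", "r2"},
    "t2": {"r2", "r3", "r4"},
    "t3": {"r1"},
    "t4": {"r4", "r5"},
}

# Greedy set cover: repeatedly pick the test covering the most
# still-uncovered requirements until everything is covered.
uncovered = set().union(*coverage.values())
reduced = []
while uncovered:
    best = max(coverage, key=lambda t: len(coverage[t] & uncovered))
    reduced.append(best)
    uncovered -= coverage[best]

print(sorted(reduced))  # a smaller suite with the same requirement coverage
```

The same loop doubles as a prioritization: the order in which tests are picked is a coverage-first execution order, and the slide's caveat applies, since reduction can discard tests that detect faults despite adding no new coverage.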
17. Other Testing Techniques
   - Some other techniques helpful in reaching the end goal of quality software
     - Need for scalable automatic test-data generation.
     - Static analysis is required but expensive; need for a scalable analysis technique that can compute the required information.
     - Data-flow analysis is expensive; hence the need for efficient instrumentation and recording techniques.
18. Methods and Tools
   - Goal
     - Develop efficient methods and tools that practitioners can use to test their software.
   - Complaint
     - Software engineering technology requires on average 18 years to be transferred into practice! We need to reduce this time for technology transfer.
   - Reasons
     - Techniques are demonstrated on contrived or toy systems, and are not scalable.
   - What we need
     - Develop tools and methods in an industrial setting to demonstrate the usefulness of the proposed techniques.
     - Proposed techniques should be scalable.
     - Develop robust prototypes, identify the contexts in which they can function, and use them to perform experiments that demonstrate the techniques.
19. Methods and Tools (cont'd)
   - What we need (cont'd)
     - Tools that consider computation trade-offs, such as precision vs. efficiency.
     - Automatic development of methods and tools, along the lines of compilers.
     - Tools that are attractive to practitioners.
     - Finally, tools that require minimal involvement from software engineers.
20. Empirical Studies
   - What does it mean?
     - Studies that help demonstrate the scalability and usefulness of techniques in practice; in other words, feedback for future research.
   - Difficulties in doing empirical studies
     - Difficulty in acquiring sufficiently robust implementations of these techniques.
     - Difficulty in obtaining sufficient experimental subjects (software and test suites).
   - Solutions
     - Collect sets of experimental subjects and make them available to researchers.
     - Create sanitized information that reveals no proprietary details yet is still useful for experimentation.
21. Testing Resources
   - Workshops, conferences, journals, and useful links
     - Workshop on Strategic Directions in Software Quality, 1996 (ACM)
     - National Science Foundation & Computing Research Association
     - Workshop on the Role of Software Architectures in Testing and Analysis (INRC)
     - International Conference on Software Engineering Workshop on Testing Distributed Component-Based Systems
     - Middle Tennessee State's STORM software testing resource
     - Reliable Software Technology's Software Assurance Hotlist
     - Research Institute's Software Quality Hotlist
     - Newsgroup:
22. Conclusions
   - Relevance to embedded systems
     - Emphasizes the basic stages; sets up the next paper.
     - Talks about evolving systems; embedded systems evolve by the second.
     - Talks about component-based testing (COTS).
     - Emphasizes the need for testing based on precode artifacts, i.e., software architecture.
     - Examines current techniques to demonstrate the scalability and usefulness of techniques in practice (empirical studies).
   - Weaknesses: none noted.