Software Testing – Day One _____________________________________ By Govardhan Reddy M
Agenda <ul><li>Software Testing </li><ul><li>Why?
Purpose?
Terminologies? </li></ul><li>Notes </li></ul><ul><li>Test Design </li><ul><li>Targets
Techniques </li><ul><li>Black Box
White Box
Gray Box </li></ul></ul></ul>
Software Testing <ul><li>Process of executing an application with the intent of finding bugs.
Process of demonstrating that errors are not present.
Process of establishing confidence that a program does what it is supposed to do. </li></ul>
Software Testing - Why... <ul><li>Here are some important defects that better testing would have found. </li><ul><li>In October 1999 the $125 million NASA Mars Climate Orbiter – an interplanetary weather satellite – was lost in space due to a data conversion error. Investigators discovered that the software on the spacecraft performed certain calculations in English units (yards) when it should have used metric units (meters). (Cont...) </li></ul></ul>
Software Testing - Why... <ul><ul><li>In June 1996 the first flight of the European Space Agency's Ariane 5 rocket failed shortly after launch, resulting in an uninsured loss of $500,000,000. The disaster was traced to the lack of exception handling for a floating-point error when a 64-bit floating-point number was converted to a 16-bit signed integer. </li></ul></ul>
Software Testing - Purpose... <ul><li>Verification
Validation
Error Finding </li></ul>
Software Testing - Purpose... <ul><li>Verification </li><ul><li>Did we build the product right?
Typically involves reviews and meetings to evaluate: </li><ul><li>Documents, Plans, Code, Requirements & Specifications. </li></ul><li>Can be done with: </li><ul><li>Checklists, issue lists, walkthroughs and inspection meetings. </li></ul></ul></ul>
Software Testing - Purpose... <ul><li>Validation </li><ul><li>Did we build the right product?
Typically involves actual testing.
Tasks are designed to ensure that </li><ul><li>Internal design is valid.
Meets user expectations / business requirements. </li></ul><li>Validation takes place after verification. </li></ul></ul>
Software Testing - Purpose... <ul><li>Error Finding </li><ul><li>Failure points of the application. </li><ul><li>Failure: Variance between expected and actual result. </li></ul><li>Defects introduced in the: </li><ul><li>SRS: Software Requirement Specification.
PRD: Product Requirement Document.
FDD: Functional/Feature Design Document. </li></ul></ul></ul>
Software Testing - Terminologies... <ul><li>Error: </li><ul><li>Discrepancy between expected and actual result.
Flaws that exist due to improper coding.
Undesirable output which negates the requirement.
Occurs at the syntax level (syntax error).
Error could be in: </li><ul><li>Environment
Application
Resource associated with testbed setup. </li></ul></ul></ul>
Software Testing - Terminologies... <ul><li>Bug: </li><ul><li>Error found before the application goes into production. </li></ul><li>Defect: </li><ul><li>Error found after the application goes into production. </li></ul></ul>
Software Testing - Terminologies... <ul><li>Note: </li><ul><li>Error can be at any stage: </li><ul><li>Development
QA
After Release </li></ul><li>Error can be a Bug or Defect. </li></ul></ul>
Software Testing - Terminologies... <ul><li>Test Strategy: </li><ul><li>An elaborate and systematic plan of action.
High-level, system-wide expression of the major activities that collectively achieve the overall desired result.
Identifies Risks, Constraints and Exposures present.
Introduces significant added value to the plan. </li></ul></ul>
Software Testing - Terminologies... <ul><li>Test Strategy: Cont... </li><ul><li>Statements are expressed in high-level terms of </li><ul><li>Physical components and activities.
Resources (People and Machines).
Types and Levels of Testing.
Schedules. </li></ul><li>The plan will be specific to the system being developed. </li></ul></ul>
Software Testing - Terminologies... <ul><li>Test Plan: </li><ul><li>Collection of test cases.
Test plans are normally written from the requirements documents and from the specification. </li><ul><li>PRD: High Level Design Document (HLD)
FDD: Functional Design Document (LLD)
SRS: HLD and/or LLD </li></ul></ul></ul>
Software Testing - Terminologies... <ul><li>Test Case: </li><ul><li>Detailed procedure that fully tests a feature or an aspect of a feature.
Trigger event that starts the test.
One or more data values which are generated as a result of the input(s) being processed (output).
Tools and data required to allow the test to execute (the environment). </li></ul></ul>
Software Testing - Terminologies... <ul><li>Test Case: Cont... </li><ul><li>The manner in which it is depicted varies between organizations.
Most test cases are in the form of a table.
eg. </li></ul><li>Test Data: </li><ul><li>Data included in the test case. </li></ul></ul>Example table columns: Test Name | Design Step | Description | Expected Result | Actual Result | Pass/Fail Criteria | Estimated Time | Platform | Remarks
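One row of such a test case table can be sketched as a small structure keyed by the column headings above; the values here are illustrative, not taken from the original deck.

```python
# A single test case recorded against the table columns listed above.
# All field values are hypothetical examples.
test_case = {
    "Test Name": "TC_Login_01",
    "Design Step": "Enter valid credentials and submit",
    "Description": "Verify login with a valid user",
    "Expected Result": "User lands on the home page",
    "Actual Result": "",  # filled in during test execution
    "Pass/Fail Criteria": "Actual Result matches Expected Result",
    "Estimated Time": "2 min",
    "Platform": "Web",
    "Remarks": "",
}

# Test data is the data embedded in the test case (here: the credentials step).
print(test_case["Test Name"], "-", test_case["Description"])
```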
Software Testing - Terminologies... <ul><li>Test Script: </li><ul><li>Set of instructions that will be performed on the SUT (System Under Test) or DUT (Device Under Test) to test that the system/device functions as expected.
Manual Scripts = Test Cases
Automated Scripts: </li><ul><li>Written in a programming language used to test part of the functionality of the system software. </li></ul></ul></ul>
Software Testing - Terminologies... <ul><ul><li>Automated Scripts: Cont... </li><ul><li>Programs can be written using </li><ul><li>Special automated functional GUI test tools </li><ul><li>HP QTP, Borland Silk Test, Rational Robot </li></ul><li>Well known programming languages </li><ul><li>C++, C#, TCL, Perl, Expect, Java, PHP, Powershell, Python or Ruby. </li></ul></ul></ul></ul><li>Build: </li><ul><li>Compiled version of code. </li></ul></ul>
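As a minimal sketch of an automated script in a well-known programming language, the following uses Python's built-in unittest framework; the `add` function is a hypothetical stand-in for real system functionality, not something from the deck.

```python
import unittest


def add(a, b):
    """Function under test (a stand-in for real system functionality)."""
    return a + b


class AddTests(unittest.TestCase):
    """An automated script: each method checks one expected behavior."""

    def test_positive_numbers(self):
        self.assertEqual(add(2, 3), 5)

    def test_negative_numbers(self):
        self.assertEqual(add(-2, -3), -5)


if __name__ == "__main__":
    unittest.main()  # runs all test methods and reports pass/fail
```

Run directly, the script reports each test as a pass or failure, which is the essential difference from a manual test case: the expected-vs-actual comparison is done by the program.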
Test Design - Targets... <ul><li>Understanding the sources of: </li><ul><li>Test cases
Test coverage
How to develop and document test cases
How to build and maintain test data </li></ul></ul>
Test Design Techniques – Black Box... <ul><li>Black Box Testing: </li><ul><li>No usage of internal structure knowledge.
Focus on testing functional requirements.
Tester would only know the legal inputs and what the expected outputs should be, but not how the program actually arrives at those outputs.
Testing against the specification; no other knowledge is required. </li></ul></ul>
Test Design Techniques – Black Box... <ul><li>Synonyms: </li><ul><li>Behavioral
Functional
Opaque (not clearly understood)
Closed Box </li></ul></ul>Note: Behavioral is not Black Box, as usage of internal knowledge is not strictly forbidden.
Test Design Techniques – Black Box... <ul><li>Black Box Testing – Techniques: </li></ul>According to Beizer: <ul><li>Testing the system software in its intended (planned) use by performing a standardized task.
The general order in which to design tests is: </li><ul><li>Clean tests against requirements.
Additional </li><ul><li>Structural tests for branch coverage as needed.
Tests for data flow coverage as needed. </li></ul></ul></ul>
Test Design Techniques – Black Box... According to Beizer: Cont... <ul><ul><li>Domain tests not covered by the above.
Special techniques as appropriate </li><ul><li>Syntax, loop, state etc. </li></ul><li>Any dirty tests not covered above. </li></ul></ul>
Test Design Techniques – Black Box... <ul><li>Graph Based: </li><ul><li>Test cases are designed based on the nature of the relationships, i.e. the links and objects within the program. </li><ul><li>Transaction Flow: Transaction validated or not?
Finite State Modeling: Particular state?
Data Flow Modeling: Object-to-object flow?
Timing Modeling: Execution times? </li></ul></ul></ul>
Test Design Techniques – Black Box... <ul><li>Equivalence Class Partitioning: </li><ul><li>Input domain is divided into classes of data from which test cases are derived.
eg. </li></ul></ul>Range 1 - 20, partitioned into the classes 1 - 9 and 10 - 20. Note: 1. Test for digit 1; if it passes, then there is no need to test digits 2 - 9. 2. Test for digit 10; if it passes, then there is no need to test digits 11 - 20.
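The partitioning above can be sketched in Python: one representative value stands in for every member of its class, so two tests cover the whole 1 - 20 range. The `classify` function is a hypothetical function under test, not from the deck.

```python
def classify(n):
    """Hypothetical function under test: label a number in the range 1-20."""
    if not 1 <= n <= 20:
        raise ValueError("out of range")
    return "single-digit" if n <= 9 else "double-digit"


# Equivalence classes for the input domain 1-20:
# one representative per class stands in for the whole class.
partitions = {
    "single-digit": 1,    # represents 1-9
    "double-digit": 10,   # represents 10-20
}

for expected_label, representative in partitions.items():
    # If the representative passes, the rest of its class is assumed covered.
    assert classify(representative) == expected_label
```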
Test Design Techniques – Black Box... <ul><li>Boundary Value Analysis: </li><ul><li>Range/Values/Boundaries.
eg. </li><ul><li>On Boundary: min, max
In Boundary: min + 1, max - 1
Out Boundary: min - 1, max + 1 </li></ul></ul></ul>For the range 1 - 20: On boundary = 1, 20; In boundary = 2, 19; Out boundary = 0, 21.
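The on/in/out boundary rule above can be written as a small helper that generates the test values for any closed range; applied to 1 - 20 it reproduces the example values.

```python
def boundary_values(min_v, max_v):
    """Boundary value analysis test values for the closed range [min_v, max_v]."""
    return {
        "on":  (min_v, max_v),          # on the boundary
        "in":  (min_v + 1, max_v - 1),  # just inside the boundary
        "out": (min_v - 1, max_v + 1),  # just outside the boundary
    }


bv = boundary_values(1, 20)
assert bv["on"] == (1, 20)
assert bv["in"] == (2, 19)
assert bv["out"] == (0, 21)
```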
Test Design Techniques – Black Box... <ul><li>Error Guessing: </li><ul><li>eg. mm/dd/yyyy – 02/29/2009 </li><ul><li>February 29th in the year 2009 is not possible as it is not a leap year. </li></ul></ul></ul>
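The leap-year guess above can be checked mechanically with Python's standard datetime parser, which rejects impossible calendar dates; `is_valid_date` is a small helper written for this sketch.

```python
from datetime import datetime


def is_valid_date(mm_dd_yyyy):
    """Return True if an mm/dd/yyyy string is a real calendar date."""
    try:
        datetime.strptime(mm_dd_yyyy, "%m/%d/%Y")
        return True
    except ValueError:
        return False


assert not is_valid_date("02/29/2009")  # 2009 is not a leap year
assert is_valid_date("02/29/2008")      # 2008 is a leap year
```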
Test Design Techniques – Black Box... <ul><li>Advantages: </li><ul><li>More effective for major projects.
No internal knowledge is required.
Tester and programmer are independent.
Tests are done from the user's point of view.
Test cases can be designed as soon as specifications are ready.
Will help to expose ambiguities in the specification. </li></ul></ul>
Test Design Techniques – Black Box... <ul><li>Disadvantages: </li><ul><li>Only a small number of possible inputs can be tested.
Tests are hard to design without clear and concise specifications.
May leave many program paths untested.
May involve unnecessary repetition of test cases if the tester is not informed of the test cases the programmer has already tried.
Most testing research has been directed towards glass box testing. </li></ul></ul>
Test Design Techniques – White Box... <ul><li>White Box Testing: </li><ul><li>Usage of internal structure knowledge.
Used to detect errors by means of execution-oriented test cases.
Consideration: </li><ul><li>Allocation of resources to perform </li><ul><li>Class Analysis
Method Analysis
And to develop and review the same. </li></ul></ul></ul></ul>
Test Design Techniques – White Box... <ul><li>Synonyms: </li><ul><li>Structural
Glass Box
Clear Box
Open Box </li></ul></ul>
Test Design Techniques – White Box... <ul>White Box Testing – Techniques <li>Code Coverage Analysis: (by McCabe) exercises a basis set of paths that executes every statement at least once. </li><ul><li>Basis Path Testing: </li><ul><li>Flow Graph: Control-flow notation representation.
Cyclomatic Complexity: Gives a quantitative measure of logical complexity. </li></ul></ul></ul>
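For a single connected flow graph, McCabe's cyclomatic complexity is V(G) = E - N + 2, where E is the number of edges and N the number of nodes; it equals the number of independent paths through the code. A minimal sketch, with the if/else graph counts as an illustrative assumption:

```python
def cyclomatic_complexity(edges, nodes):
    """McCabe's metric for one connected flow graph: V(G) = E - N + 2."""
    return edges - nodes + 2


# Flow graph of a single if/else (start, decision, then-branch, else-branch,
# end): 5 nodes and 5 edges, giving V(G) = 2, i.e. two independent paths,
# so a basis-path test set needs two test cases.
assert cyclomatic_complexity(5, 5) == 2

# A straight-line program (start -> statement -> end): 3 nodes, 2 edges,
# one path, one test case.
assert cyclomatic_complexity(2, 3) == 1
```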
Test Design Techniques – White Box... <ul><ul><li>Control Structure Testing: </li><ul><li>Conditional Testing </li><ul><li>Relational Expressions
Simple Condition
Compound Condition
Boolean Expression </li></ul><li>Data Flow Testing:
Loop Testing: </li><ul><li>Simple
Concatenated
Nested
Unstructured </li></ul></ul></ul></ul>
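Simple-loop testing is usually done by exercising the loop at zero, one, two, a typical number, and the maximum number of iterations. A sketch against a hypothetical loop under test:

```python
def sum_first(values, n):
    """Hypothetical loop under test: sum the first n elements of values."""
    total = 0
    for i in range(n):
        total += values[i]
    return total


data = [1, 2, 3, 4, 5]

# Simple-loop test cases: 0, 1, 2, typical, and maximum passes through the loop.
assert sum_first(data, 0) == 0    # loop skipped entirely
assert sum_first(data, 1) == 1    # exactly one pass
assert sum_first(data, 2) == 3    # exactly two passes
assert sum_first(data, 3) == 6    # a typical number of passes
assert sum_first(data, 5) == 15   # the maximum number of passes
```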
Test Design Techniques – White Box... <ul><li>Design by Contract: DBC specifies requirements such as: </li><ul><li>Conditions that </li><ul><li>The client must meet before a method is invoked.
The method must meet after it is executed. </li></ul><li>Assertions that a method must satisfy at specific points of execution. </li></ul></ul>
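The pre- and postconditions above can be sketched with plain assertions in Python; `sqrt_floor` is a hypothetical example method, not from the deck.

```python
def sqrt_floor(x):
    """Integer square root, with its contract checked by assertions."""
    # Precondition: the client must meet this before the method is invoked.
    assert x >= 0, "precondition violated: x must be non-negative"

    r = int(x ** 0.5)

    # Postcondition: the method must meet this after it has executed.
    assert r * r <= x < (r + 1) * (r + 1), "postcondition violated"
    return r


assert sqrt_floor(16) == 4
assert sqrt_floor(17) == 4
```

Calling `sqrt_floor(-1)` trips the precondition assertion, pinpointing the fault at the caller rather than deep inside the computation.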
Test Design Techniques – White Box... <ul><li>Profiling: </li><ul><li>Provides a framework for analyzing Java code performance for speed and heap (collection of objects) memory use.
Leads to improved CPU time performance. </li></ul></ul>
Test Design Techniques – White Box... <ul><li>Error Handling: </li><ul><li>Proper error recovery, notifications and logging are checked against references to validate program design.
Exception and error handling are checked thoroughly by applying error-causing test vectors. </li></ul></ul>
Test Design Techniques – White Box... <ul><li>Transactions: </li><ul><li>To ensure ACID properties. </li><ul><li>Atomicity, Consistency, Isolation, Durability </li></ul><li>Transactions are checked thoroughly for partial/complete commits besides rollbacks. </li></ul></ul>
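A rollback check like the one described can be sketched with Python's standard sqlite3 module: a transfer that fails mid-transaction must leave no partial commit behind. The accounts table and values are illustrative assumptions.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES ('a', 100), ('b', 0)")
conn.commit()

# Atomicity check: a transfer that crashes halfway must roll back entirely.
try:
    with conn:  # the with-block is one transaction; an exception rolls it back
        conn.execute("UPDATE accounts SET balance = balance - 100 "
                     "WHERE name = 'a'")
        raise RuntimeError("simulated crash mid-transfer")
except RuntimeError:
    pass

balance_a = conn.execute(
    "SELECT balance FROM accounts WHERE name = 'a'").fetchone()[0]
assert balance_a == 100  # the debit was rolled back, no partial commit
```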
Test Design Techniques – White Box... <ul><li>Advantages: </li><ul><li>Forces the test developer to reason carefully about the implementation.
Approximates the partitioning done by equivalence class testing.
Reveals errors in hidden code.
Beneficial side effects: </li><ul><li>Helps in removing extra lines of code which can bring in hidden defects. </li></ul></ul></ul>
Test Design Techniques – White Box... <ul><li>Disadvantages: </li><ul><li>Expensive, because a skilled tester is required.
Cases omitted (left undone) in the code could be missed.
It is nearly impossible to look into every bit of code to find hidden errors, which may create problems resulting in failure of the application. </li></ul></ul>
Test Design Techniques – Gray Box... <ul><li>Gray Box Testing: </li><ul><li>Includes both </li><ul><li>Black Box (50%) and
White Box (50%) </li></ul></ul><li>Synonyms: </li><ul><li>Translucent. </li></ul></ul>
Software Testing - Notes <ul><li>Don't terminate an issue just because you are unable to reproduce it, </li></ul>It could kill your company. <ul><li>When you are unable to reproduce an issue, think out of the box,
Think out of the black box.
Hate the product while testing and love the product after its release
- Emotional Testing. </li></ul>
Questions?
Thank You!
