Transcript

  • 1. Software Testing – Day One – By Govardhan Reddy M
  • 2. Agenda
    • Software Testing
    • Notes
    • Test Design
  • 8. Software Testing
    • Process of executing an application with the intent of finding bugs.
    • 9. Process of demonstrating that errors are not present.
    • 10. Process of establishing confidence that a program does what it is supposed to do.
  • 11. Software Testing - Why...
    • Here are some important defects that better testing would have found.
      • In October 1999 the $125 million NASA Mars Climate Orbiter – an interplanetary weather satellite – was lost in space due to a data conversion error. Investigators discovered that ground software produced thruster data in English units (pound-seconds) when the spacecraft expected metric units (newton-seconds).
                    • (Cont...)
  • 12. Software Testing - Why...
      • In June 1996 the first flight of the European Space Agency's Ariane 5 rocket failed shortly after launch, resulting in an uninsured loss of $500,000,000. The disaster was traced to missing exception handling for an overflow when a 64-bit floating-point value was converted to a 16-bit signed integer.
  • 13. Software Testing - Purpose...
  • 16. Software Testing - Purpose...
    • Verification
      • Did we build the product right?
      • 17. Typically involves reviews and meetings to evaluate:
        • Documents, Plans, Code, Requirements & Specifications.
      • Can be done with:
        • Checklists, Issues lists, Walkthroughs and Inspection meetings.
  • 18. Software Testing - Purpose...
    • Validation
      • Did we build the right product?
      • 19. Typically involves actual testing.
      • 20. Tasks are designed to ensure that
        • Internal design is valid.
        • 21. Meets user expectations / business requirements.
      • Validation takes place after verification.
  • 22. Software Testing - Purpose...
    • Error Finding
      • Failure points of the application.
        • Failure: Variance between expected and actual result.
      • Defects introduced in the:
        • SRS: Software Requirement Specification.
        • 23. PRD: Product Requirement Document.
        • 24. FDD: Functional/Feature Design Document.
  • 25. Software Testing - Terminologies...
    • Error:
      • Discrepancy between expected and actual result.
      • 26. Flaws that exist due to improper coding.
      • 27. Undesirable output which negates the requirement.
      • 28. Occurs at the syntax level (syntax error).
      • 29. Error could be in:
        • Environment
        • 30. Application
        • 31. Resource associated with testbed setup.
  • 32. Software Testing - Terminologies...
    • Bug:
      • Error found before the application goes into production.
    • Defect:
      • Error found after the application goes into production.
  • 33. Software Testing - Terminologies...
    • Note:
      • An error can occur at any stage.
      • An error can be a Bug or a Defect.
  • 36. Software Testing - Terminologies...
    • Test Strategy:
      • An elaborate and systematic plan of action.
      • 37. High level system wide expression of major activities that collectively achieve the overall desired result.
      • 38. Identifies Risks, Constraints and Exposures present.
      • 39. Introduces significant added value to the plan.
  • 40. Software Testing - Terminologies...
    • Test Strategy: Cont...
      • Statements are expressed in high level terms of
        • Physical components and activities.
        • 41. Resources (People and Machines).
        • 42. Types and Levels of Testing.
        • 43. Schedules.
      • Plan will be specific to the system being developed.
  • 44. Software Testing - Terminologies...
    • Test Plan:
      • Collection of test cases.
      • 45. Test plans are normally written from the requirements documents and from the specification.
        • PRD: High Level Design Document (HLD)
        • 46. FDD: Functional Design Document (LLD)
        • 47. SRS: HLD and/or LLD
  • 48. Software Testing - Terminologies...
    • Test Case:
      • Detailed procedure that fully tests a feature or an aspect of a feature.
      • 49. Trigger event that starts the test.
      • 50. One or more data values which are generated as a result of the input(s) being processed (output).
      • 51. Tools and data required to allow the test to execute (the environment).
  • 52. Software Testing - Terminologies...
    • Test Case: Cont...
      • The manner in which it is depicted varies between organizations.
      • 53. Most test cases are in the form of a table.
      • 54. eg.
    • Test Data:
      • Data included in the test case.
    Columns: Test Name | Design Step | Description | Expected Result | Actual Result | Pass/Fail Criteria | Estimated Time | Platform | Remarks
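As an illustration, one row of such a test case table can be sketched as a simple Python record (the field names follow the columns above; all values are hypothetical):

```python
# A single test case as a record; field names mirror the table columns.
# All values are hypothetical, for illustration only.
test_case = {
    "test_name": "Login_Valid_Credentials",
    "design_step": 1,
    "description": "Enter a valid username and password, click Login",
    "expected_result": "User lands on the dashboard page",
    "actual_result": "User lands on the dashboard page",
    "estimated_time_min": 2,
    "platform": "Windows 10 / Chrome",
    "remarks": "",
}

# Pass/Fail criterion: the actual result matches the expected result.
test_case["pass_fail"] = (
    "Pass" if test_case["actual_result"] == test_case["expected_result"] else "Fail"
)
print(test_case["pass_fail"])  # Pass
```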
  • 55. Software Testing - Terminologies...
    • Test Script:
      • Set of instructions that will be performed on the SUT (System Under Test) or DUT (Device Under Test) to test that the system/device functions as expected.
      • 56. Manual Scripts = Test Cases
      • 57. Automated Scripts:
        • Written in a programming language used to test part of the functionality of the system software.
  • 58. Software Testing - Terminologies...
      • Automated Scripts: Cont...
        • Programs can be written using
          • Special automated functional GUI test tools
            • HP QTP, Borland Silk Test, Rational Robot
          • Well known programming languages
            • C++, C#, TCL, Perl, Expect, Java, PHP, Powershell, Python or Ruby.
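As a minimal sketch of the idea, a manual test case can be turned into an automated script in plain Python, where each step becomes an executable check (the `add` function stands in for a hypothetical unit under test):

```python
# A minimal automated test script: each manual step becomes an executable check.
def add(a, b):          # hypothetical unit under test
    return a + b

assert add(2, 3) == 5       # step 1: positive numbers
assert add(-2, -3) == -5    # step 2: negative numbers
print("all checks passed")
```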
    • Build:
      • Compiled version of code.
  • 59. Test Design - Targets...
    • Understanding the sources of:
      • Test cases
      • 60. Test coverage
      • 61. How to develop and document test cases
      • 62. How to build and maintain test data
  • 63. Test Design Techniques – Black Box...
    • Black Box Testing:
      • No usage of internal structure knowledge.
      • 64. Focus on testing functional requirements.
      • 65. The tester knows only the legal inputs and what the expected outputs will be, but not how the program actually arrives at those outputs.
      • 66. Testing against the specification and no other knowledge is required.
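A minimal black-box sketch in Python: the test knows only the specified inputs and expected outputs of a hypothetical `absolute` function, never how it is implemented:

```python
# Black-box view: only specified input/output behaviour is tested.
def absolute(x):                 # hypothetical unit under test;
    return x if x >= 0 else -x   # implementation is opaque to the tester

# Specification-derived cases: (input, expected output)
spec_cases = [(5, 5), (-5, 5), (0, 0)]

for given, expected in spec_cases:
    assert absolute(given) == expected
```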
  • 67. Test Design Techniques – Black Box...
    • Synonyms:
      • Behavioral
      • 68. Functional
      • 69. Opaque (not clearly understood)
      • 70. Closed Box
    Note: Behavioral is not strictly Black Box, as usage of internal knowledge is not strictly forbidden.
  • 71. Test Design Techniques – Black Box...
    • Black Box Testing – Techniques:
    According to Beizer:
    • Testing the system software against its intended (planned) use by performing a standardized task.
    • 72. General order to design tests are
      • Clean tests against requirements.
      • 73. Additional
        • Structural tests for branch coverage as needed.
        • 74. Tests for data flow coverage as needed.
  • 75. Test Design Techniques – Black Box... According to Beizer: Cont...
      • Domain tests not covered by the above.
      • 76. Special techniques as appropriate
        • Syntax, loop, state etc.,
      • Any dirty tests not covered above.
  • 77. Test Design Techniques – Black Box...
    • Graph Based:
      • Test cases are designed based on the nature of the relationships, i.e., the links among objects in the program.
        • Transaction Flow: Transaction validated or not?
        • 78. Finite State Modeling: Particular state?
        • 79. Data Flow Modeling: Data flows from object to object?
        • 80. Timing Modeling: Execution times?
  • 81. Test Design Techniques – Black Box...
    • Equivalence Class Partitioning:
      • Input domain is divided into classes of data from which test cases are derived.
      • 82. eg.
    Input domain: 1 - 20, partitioned into classes 1 - 9 and 10 - 20.
    Note:
    1. Test for 1; if it passes, there is no need to test 2 - 9.
    2. Test for 10; if it passes, there is no need to test 11 - 20.
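The partitioning above can be sketched in Python; one representative value stands in for its whole class (the `is_valid` function and the class names are hypothetical):

```python
# Equivalence class partitioning for the input domain 1..20,
# split into the two classes from the slide: 1-9 and 10-20.
classes = {
    "single_digit": range(1, 10),    # 1-9
    "double_digit": range(10, 21),   # 10-20
}

def is_valid(n):                     # hypothetical unit under test
    return 1 <= n <= 20

# Test one representative per class instead of all 20 values.
representatives = {name: next(iter(values)) for name, values in classes.items()}
for name, rep in representatives.items():
    assert is_valid(rep), f"class {name} failed at representative {rep}"
```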
  • 83. Test Design Techniques – Black Box...
    • Boundary Value Analysis:
      • Range/Values/Boundaries.
      • 84. eg.
        • On Boundary: min, max
        • 85. In Boundary: min + 1, max - 1
        • 86. Out Boundary: min – 1, max + 1
    Range 1 - 20: On boundary = 1, 20; In boundary = 2, 19; Out boundary = 0, 21.
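A small Python sketch of deriving those boundary values for the range 1..20 (the `is_valid` function under test is hypothetical):

```python
# Boundary value analysis: on-boundary (min, max), in-boundary (min+1, max-1),
# out-boundary (min-1, max+1).
def boundary_values(minimum, maximum):
    return {
        "on":  [minimum, maximum],           # 1, 20 -> expected valid
        "in":  [minimum + 1, maximum - 1],   # 2, 19 -> expected valid
        "out": [minimum - 1, maximum + 1],   # 0, 21 -> expected invalid
    }

def is_valid(n):                             # hypothetical unit under test
    return 1 <= n <= 20

values = boundary_values(1, 20)
assert all(is_valid(v) for v in values["on"] + values["in"])
assert not any(is_valid(v) for v in values["out"])
```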
  • 87. Test Design Techniques – Black Box...
    • Error Guessing:
      • eg. mm/dd/yyyy – 02/29/2009
        • February 29th in the year 2009 is not possible, as 2009 is not a leap year.
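The guessed date error can be checked with Python's standard `datetime` module (the `is_valid_date` wrapper is hypothetical):

```python
# Error guessing: the tester suspects date handling around 29 February.
# 2009 is not a leap year, so 02/29/2009 must be rejected.
from datetime import date

def is_valid_date(mm, dd, yyyy):     # hypothetical validator under test
    try:
        date(yyyy, mm, dd)
        return True
    except ValueError:
        return False

assert not is_valid_date(2, 29, 2009)   # guessed error case: rejected
assert is_valid_date(2, 29, 2008)       # 2008 is a leap year: accepted
```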
  • 88. Test Design Techniques – Black Box...
    • Advantages:
      • More effective for major projects.
      • 89. No internal knowledge is required.
      • 90. Tester and programmer are independent.
      • 91. Tests are done from the user's point of view.
      • 92. Test cases will be designed as soon as specifications are ready.
      • 93. Will help to expose unclearness in the specification.
  • 94. Test Design Techniques – Black Box...
    • Disadvantages:
      • Only a small number of possible inputs can be tested.
      • 95. Tests are hard to design without clear and concise specifications.
      • 96. May leave many program paths untested.
      • 97. There may be unnecessary repetition of test cases if the tester is not informed of the test cases the programmer has already tried.
      • 98. Most testing research has been directed towards glass box testing.
  • 99. Test Design Techniques – White Box...
    • White Box Testing:
      • Usage of internal structure knowledge.
      • 100. Used to detect errors by means of execution-oriented test case.
      • 101. Consideration:
        • Allocation of resources to perform
          • Class Analysis
          • 102. Method Analysis
          • 103. And to develop and review the same.
  • 104. Test Design Techniques – White Box...
  • 108. Test Design Techniques – White Box...
      White Box Testing – Techniques
    • Code Coverage Analysis: (by McCabe) exercises a basis set that will execute every statement at least once.
      • Basis Path Testing:
        • Flow Graph: Control flow notation representation.
        • 109. Cyclomatic Complexity: Gives quantitative measure of logical complexity.
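Cyclomatic complexity can be computed directly from a flow graph as V(G) = E - N + 2, where E is the number of edges and N the number of nodes; the graph below is a hypothetical if/else:

```python
# Flow graph of a hypothetical if/else:
# entry -> decision -> (then | else) -> exit.
edges = [
    ("entry", "decision"),
    ("decision", "then"),
    ("decision", "else"),
    ("then", "exit"),
    ("else", "exit"),
]
nodes = {n for edge in edges for n in edge}

# V(G) = E - N + 2: the number of linearly independent paths.
v_g = len(edges) - len(nodes) + 2
print(v_g)  # 5 edges - 5 nodes + 2 = 2 independent paths
```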
  • 110. Test Design Techniques – White Box...
      • Control Structure Testing:
  • 118. Test Design Techniques – White Box...
    • Design by Contract: DBC specifies requirements such as:
      • Conditions that
        • The client must meet before a method is invoked.
        • 119. The method must meet after it is executed.
      • Assertions that a method must satisfy at specific points of execution.
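A minimal Design-by-Contract sketch in Python using assertions: a precondition the client must meet before the call, and a postcondition the method must satisfy afterwards (the `sqrt_int` method is hypothetical):

```python
# Design by Contract via assertions.
def sqrt_int(n):
    # Precondition: the client must pass a non-negative number.
    assert n >= 0, "precondition violated: n must be non-negative"
    root = int(n ** 0.5)
    # Postcondition: root is the integer square root of n.
    assert root * root <= n < (root + 1) ** 2, "postcondition violated"
    return root

assert sqrt_int(16) == 4
```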
  • 120. Test Design Techniques – White Box...
    • Profiling:
      • Provides a framework for analyzing Java code performance for speed and heap (collection of objects) memory use.
      • 121. Leads to improved CPU-time performance.
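The slide refers to Java profilers; the same idea can be sketched with Python's standard `cProfile` and `pstats` modules, which report where CPU time is spent so hot spots can be optimised (the `slow_sum` workload is hypothetical):

```python
# Profiling sketch with Python's standard library.
import cProfile
import io
import pstats

def slow_sum(n):                 # hypothetical workload to profile
    total = 0
    for i in range(n):
        total += i
    return total

profiler = cProfile.Profile()
profiler.enable()
slow_sum(100_000)
profiler.disable()

# Render the top entries of the profile, sorted by cumulative time.
out = io.StringIO()
pstats.Stats(profiler, stream=out).sort_stats("cumulative").print_stats(3)
report = out.getvalue()
```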
  • 122. Test Design Techniques – White Box...
    • Error Handling:
      • Proper error recovery notifications and logging are checked against references to validate program design.
      • 123. Exception and error handling are checked thoroughly by applying error-causing test vectors.
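A minimal Python sketch of applying an error-causing test vector and checking the recovery path (the `divide` function is hypothetical):

```python
# Error-handling test: feed an error-causing input and check that the
# code recovers instead of crashing.
def divide(a, b):                  # hypothetical unit under test
    try:
        return a / b
    except ZeroDivisionError:
        # Proper recovery: in real code, log/notify; here, return a sentinel.
        return None

assert divide(10, 2) == 5
assert divide(10, 0) is None   # error vector handled, no crash
```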
  • 124. Test Design Techniques – White Box...
    • Transactions:
      • To ensure ACID properties.
        • Atomicity, Consistency, Isolation, Durability
      • Transactions are checked thoroughly for partial/complete commits besides rollbacks.
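Atomicity and rollback can be sketched with Python's built-in `sqlite3` module: a simulated failure mid-transfer must leave no partial commit (the accounts schema is hypothetical):

```python
# Atomicity sketch: a failed transfer is rolled back, no partial commit.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [("alice", 100), ("bob", 0)])
conn.commit()

try:
    with conn:  # transaction scope: commit on success, rollback on exception
        conn.execute("UPDATE accounts SET balance = balance - 50 WHERE name = 'alice'")
        # Simulated failure before the matching credit to 'bob' ever runs:
        raise RuntimeError("failure mid-transfer")
except RuntimeError:
    pass  # in real code: log and notify

balances = dict(conn.execute("SELECT name, balance FROM accounts"))
assert balances == {"alice": 100, "bob": 0}  # debit rolled back
```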
  • 125. Test Design Techniques – White Box...
    • Advantages:
      • Forces test developer to reason carefully about implementation.
      • 126. Approximates the partitioning done by equivalence class testing.
      • 127. Reveals errors in hidden code.
      • 128. Beneficial side effects:
        • Helps in removing extra lines of code which can bring in hidden defects.
  • 129. Test Design Techniques – White Box...
    • Disadvantages:
      • Expensive because a skilled tester is required.
      • 130. Cases omitted (left undone) in the code could be missed.
      • 131. It is nearly impossible to look into every bit of code to find hidden errors, which may create problems resulting in failure of the application.
  • 132. Test Design Techniques – Gray Box...
    • Gray Box Testing:
      • Includes both
        • Black Box (50%) and
        • 133. White Box (50%)
    • Synonyms:
      • Translucent.
  • 134. Software Testing - Notes
    • Don't terminate an issue just because you are unable to reproduce it;
    it could kill your company.
    • When you are unable to reproduce an issue, think out of the box,
    • 135. Think out of the black box.
    • 136. Hate the product while testing and love the product after its release
    • 137. – Emotional Testing.
  • 138. Questions?
  • 139. Thank You!