Software Testing
Agenda
• Software Development Life Cycle (SDLC)
• Definition of testing
• Principles of testing
• Testing techniques & types
• Software Testing Life Cycle (STLC)
• Defect Life Cycle (DLC)
SDLC
• A coherent set of activities for specifying, designing, implementing and testing software systems
• A structured set of activities required to develop a software system
SDLC Models
• Waterfall model
• Spiral model
• V model
• V & V model
• Agile methodology
SDLC - Waterfall Model
The standard waterfall model, also called the life cycle model for systems development, is an approach that goes through the following steps:
• Document the system concept
• Identify system requirements and analyze them
• Break the system into pieces (architectural design)
• Design each piece (detailed design)
• Code the system components and test them individually (coding, debugging and unit testing)
• Integrate the pieces and test the system (system testing)
• Deploy the system and operate it
Note: this model is useful when all the requirements are clear before system development begins.
SDLC - Waterfall Model (phases)
Requirements → Design → Code & Unit Test → Test & Integration → Operation & Maintenance
V & V Model
Each specification level on the left of the V has a corresponding test level on the right; tests are designed on the way down and run on the way up:
• Business Requirements ↔ Acceptance Testing
• Project Specification ↔ Integration Testing in the Large
• System Specification ↔ System Testing
• Design Specification ↔ Integration Testing in the Small
• Code ↔ Component Testing
Agile Methodology
“We are uncovering better ways of developing software by doing it and helping others do it. Through this work we have come to value:
– Individuals and interactions over processes and tools
– Working software over comprehensive documentation
– Customer collaboration over contract negotiation
– Responding to change over following a plan
That is, while there is value in the items on the right, we value the items on the left more.”
Principles of Agile Software
• Partner with Customers
• Work Toward a Shared Vision
• Deliver Incremental Value
• Working software is the primary measure of progress
• Requirements evolve
• Embrace change
• Sustainable development
• Invest in Quality and technical excellence
• Empower Team Members
• Interact with Business on a daily basis
• Establish Clear Accountability
• Learn from all experiences
• Foster open communications
Agile - Roles in an Agile Project
• The Customer (or Customer Proxy), who is responsible for defining the requirements and priorities and accepts delivery of the completed User Stories.
• The Project Manager, who is responsible for delivering the completed system to the Customer.
• The Business Analyst (who often acts as a Customer Proxy), responsible for ensuring that the requirements are fully formed into proper User Stories with accompanying Acceptance Criteria.
• The Designer, responsible for ensuring a coherent technical design with appropriate levels of quality, performance, etc. that satisfies the Customer's requirements.
• The Developer, responsible for delivering software code that fulfils User Stories by meeting their Acceptance Criteria.
• The Tester, responsible for ensuring that the Acceptance Criteria tests are run and that they pass, and for the overall quality of the system.
Agile - Additional Roles
• The Agile Coach is an experienced Agile practitioner who is responsible for helping the team adopt Agile practices. The Agile Coach may also take on the Iteration Manager role.
• The Agile Enablers work within the team in specific roles. Because Agile favors learning through interactions between individuals, teams benefit greatly if they are seeded with experienced Agile Enablers among the Developers, Business Analysts and Testers.
• The Build Master, usually a Developer who has specific experience and skills in setting up and maintaining a Continuous Integration environment.
• The Iteration Manager, who takes on the inward-facing parts of the Project Manager role, specifically ensuring that the team is productive and that the iterative process is running smoothly.
• The Technical Lead, usually a senior Developer working within the Developer team, who is responsible for ensuring that the technical Common Vision is maintained. The Technical Lead works closely with the Designer both to communicate design goals to the team and to provide feedback to the Designer about implementation issues.
Agile Process Flow
• Requirements & Planning: User Stories, Estimation, Release Planning, Iteration Zero
• Development Iteration (2-4 weeks): Iteration Planning, Iteration Kick-off, Daily Activities (Daily Stand-ups, Iteration Tracking, Coding, Simple Design, Refactoring, Automated Unit Testing, Automated Functional Testing), Continuous Integration (Automated Builds, Automated Testing, Automated Deployment), Showcase (Iteration Demo, Iteration Retrospective)
• Deployment: Continuous CAT (UAT), Production Deployment
Testing Defined
• Testing is the process of exercising software to verify that it satisfies specified requirements and to detect faults.
• The purpose of testing is to show that a program performs its intended functions correctly.
• Testing is the process of executing a program with the intent of finding errors.
• Software testing validates the behavior of a program, with a finite set of test cases, against the specified expected behavior.
Principles of Testing
• What do software faults cost?
• What is a bug?
• Error - Fault - Failure
• Reliability versus faults
• Why do faults occur in software?
• Why is testing necessary?
• Why not just test everything? (complete testing)
• The most important principle
What do software faults cost? (Cost of Quality)
• Cost of quality is the term used to quantify the total cost of the failure, appraisal and prevention costs associated with the production of software.
• The cost of quality varies from one organization to the next.
• The goal is to optimize the production process to the extent that rework is eliminated and inspection is built into the production process.
• Applying the concepts of continuous testing to the systems development process can reduce the cost of quality.
What is a bug?
Any deviation from requirements.
• Error: a human action that produces an incorrect result.
• Fault: a manifestation of an error in software; also known as a defect or bug. If executed, a fault may cause a failure.
• Failure: a deviation of the software from its expected delivery or service.
• Software faults become software failures only when the exact computation conditions are met and the faulty portion of the code is executed on the CPU.
A failure is an event; a fault is a state of the software, caused by an error.
Error - Fault - Failure
A person makes an error ... that creates a fault in the software ... that can cause a failure in operation.
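To make the chain concrete, here is a minimal Python sketch (the function and the age rule are hypothetical, not from the slides): a human error while coding leaves a fault in the software, and the fault only becomes a failure when an input that exercises it is executed.

```python
def is_adult(age: int) -> bool:
    """Spec: a person aged 18 or over is an adult."""
    return age > 18  # fault: the programmer's error was typing '>' instead of '>='

for age in (30, 10, 18):
    expected = age >= 18
    actual = is_adult(age)
    verdict = "PASS" if actual == expected else "FAIL  <- failure observed"
    print(f"age={age:2d} expected={expected!s:5} actual={actual!s:5} {verdict}")

# Only age=18 exercises the faulty comparison in a way that deviates from the
# spec, so only that run turns the latent fault into a visible failure.
```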
Reliability versus faults
Reliability: the probability that software will not cause the failure of the system for a specified time under specified conditions.
– Can a system be fault-free? (zero faults, right first time)
– Can a software system be reliable but still have faults?
– Is a "fault-free" software application always reliable?
Why do faults occur in software?
• Software is written by human beings
– who know something, but not everything
– who have skills, but aren't perfect
– who do make mistakes (errors)
• Under increasing pressure to deliver to strict deadlines
– there is no time to check, and assumptions may be wrong
– systems may be incomplete
• Incomplete and misunderstood system requirements, errors in design and poor test coverage
• A good test case is one that has a high probability of finding an as-yet-undiscovered error
So why is testing necessary?
• Because software is likely to have faults
• To learn about the reliability of the software
• To fill the time between delivery of the software and the release date
• To prove that the software has no faults
• Because testing is included in the project plan
• Because failures can be very expensive
• To avoid being sued by customers
• To stay in business
Why not just "test everything"? (complete testing)
Consider a system with 20 screens, an average of 4 menus with 3 options per menu, an average of 10 fields per screen, 2 types of input per field (e.g. a date as "Jan 3" or "3/1", a number as integer or decimal) and around 100 possible values per field.
Total for 'exhaustive' testing: 20 x 4 x 3 x 10 x 2 x 100 = 480,000 tests.
• At 1 second per test: 8,000 minutes ≈ 133 hours ≈ 17.7 working days (not counting finger trouble, faults or retests)
• At 10 seconds per test: about 34 weeks; at 1 minute: about 4 years; at 10 minutes: about 40 years
So exhaustive testing is not possible.
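The slide's arithmetic can be reproduced in a few lines of Python. The screen, menu and field counts are the slide's own illustrative figures; the 7.5-hour working day is an assumption used to match the quoted 17.7 days.

```python
from math import prod

# screens, menus, options per menu, fields per screen, input types, values per field
factors = [20, 4, 3, 10, 2, 100]
tests = prod(factors)
print(f"{tests:,} test cases")          # 480,000

for secs_per_test in (1, 10, 60, 600):
    hours = tests * secs_per_test / 3600
    working_days = hours / 7.5          # assumed 7.5-hour working day
    print(f"{secs_per_test:>4} s/test -> {hours:8.0f} h = {working_days:7.1f} working days")
```

Even under these optimistic assumptions the effort is impractical, which is why tests have to be selected and prioritised.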
The most important principle of testing
Prioritise tests so that, whenever you stop testing, you have done the best testing in the time available.
Testing technique
A testing technique is:
• a procedure for selecting or designing tests
• based on a structural or functional model of the software
• successful at finding faults
• 'best' practice
• a way of deriving good test cases
• a way of objectively measuring a test effort
Static testing: inspections, walkthroughs and reviews; the inspection process; benefits of inspection; static analysis.
Dynamic testing: black-box testing and white-box testing.
Using techniques makes testing much more effective.
Types of Testing - Unit Testing
Unit testing verifies the smallest piece of a program (a module) to determine whether its actual structure is correct and whether the function the code defines operates correctly and reliably (without crashing or hanging).
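As an illustration, here is a minimal unit test written with Python's standard unittest module; the word_count function is a hypothetical module under test, not part of the original material.

```python
import unittest

def word_count(text: str) -> int:
    """Hypothetical unit under test: count whitespace-separated words."""
    return len(text.split())

class WordCountTest(unittest.TestCase):
    def test_normal_sentence(self):
        self.assertEqual(word_count("unit tests exercise one module"), 5)

    def test_empty_string(self):
        self.assertEqual(word_count(""), 0)

    def test_extra_whitespace(self):
        self.assertEqual(word_count("  spaced   out  "), 2)

if __name__ == "__main__":
    unittest.main()
```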
Testing types - Smoke testing
• Smoke testing (sometimes called a sanity test) is non-exhaustive software testing that ascertains that the most crucial functions of a program work, without bothering with the finer details. The term comes to software testing from a similarly basic type of hardware testing, in which the device passed the test if it didn't catch fire the first time it was turned on. A daily build and smoke test is among the industry best practices advocated by the IEEE (Institute of Electrical and Electronics Engineers).
Who: usually done by the QA team.
When: before actual testing.
Types of Testing - System & Integration Testing
System test: as soon as an integrated set of modules has been combined to form your application, system testing can be performed. System testing verifies the system-level reliability and functionality of the product by testing your application in the integrated system.
Note: the QA team performs system testing; it is done by an independent test group.
Integration test: integration testing is used to test the reliability and functionality of groups of units (modules) that have been combined into larger segments. The most efficient method of integration is to slowly and progressively combine the separate modules into small segments rather than merging all the units into one large component.
Note: test engineers perform integration testing in the development phase.
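A small sketch of the idea behind integration testing, assuming two hypothetical Python units (a tax helper and an order-total function): each could be unit-tested in isolation, and the integration test checks that they behave correctly when combined.

```python
import unittest

def tax(amount, rate=0.2):
    """Hypothetical tax module: flat 20% rate."""
    return round(amount * rate, 2)

def order_total(prices):
    """Hypothetical order module that depends on the tax module."""
    subtotal = sum(prices)
    return round(subtotal + tax(subtotal), 2)

class OrderTotalIntegrationTest(unittest.TestCase):
    def test_total_uses_real_tax_module(self):
        # Integration: order_total is exercised together with the real tax()
        # implementation rather than a stub used during unit testing.
        self.assertEqual(order_total([10.0, 5.0]), 18.0)

if __name__ == "__main__":
    unittest.main()
```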
Testing types - Regression test
• Regression testing is the re-running of all tests after a fix, change or enhancement has been made to the code and a new build of the AUT has been delivered to QA. Regression testing verifies that previously identified problems have been fixed and that changes to one part of your application have not introduced new problems elsewhere. Changing a line of code might cause a ripple effect that produces an unexpected result in another part of your application. If you do not re-run all of your test cases after your application has been changed, you cannot be certain about the quality of the entire system.
Who: usually done by the test engineers.
When: after changes have been incorporated into existing functionality.
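The ripple effect described above can be sketched in a few lines of Python; the test names and pass/fail outcomes are invented purely to illustrate why the whole suite is re-run after a change.

```python
def run_suite(suite):
    """Run every test function; True means pass."""
    return {name: fn() for name, fn in suite.items()}

# Baseline behaviour before the change: everything passes.
suite = {
    "login":    lambda: True,
    "checkout": lambda: True,
    "search":   lambda: True,
}
baseline = run_suite(suite)

# A "fix" to checkout accidentally breaks search (the ripple effect).
suite["checkout"] = lambda: True
suite["search"]   = lambda: False

current = run_suite(suite)
regressions = [name for name in baseline if baseline[name] and not current[name]]
print("Regressions introduced:", regressions)   # ['search']
```

Only re-running the full suite, not just the tests around the fix, reveals that "search" has regressed.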
Testing types - Ad hoc testing
• Ad hoc testing is testing without a formal test plan, or outside of a test plan. On some projects this type of testing is carried out as an adjunct to formal testing. If carried out by a skilled tester, it can often find problems that are not caught in regular testing. Sometimes, if testing occurs very late in the development cycle, this will be the only kind of testing that can be performed. Ad hoc testing is sometimes referred to as exploratory testing.
Who: usually done by a skilled tester.
When: after normal testing.
Testing types - Compatibility test
• Compatibility testing checks how one product works with another, for example whether the two can efficiently share the same data files or reside simultaneously in the same computer's memory.
Who: usually done by the test engineer.
When: in the testing phase.
Testing types - Usability testing
• Usability is a quality attribute that assesses how easy user interfaces are to use. The term 'usability testing' refers to testing done to improve ease of use during the design process.
Who: usually done by testers from the user's perspective.
When: before the actual release.
Testing types - Security testing
• Security testing asks how easy it would be for an unauthorized user to gain access to the program. It covers the testing of database and network software in order to keep company data and resources secure from mistaken/accidental users, hackers and other malevolent attackers.
Who: usually done by a network expert.
When: before the actual release.
Types of Testing - Acceptance Testing (UAT)
Acceptance test: the objective of acceptance testing is to verify whether the application is fit for deployment. Acceptance testing may include verifying whether the application is reliable, meets the business requirements, performs well and has a consistent look and feel.
Note: acceptance testing is generally done by a QA person at the client location.
Types of Testing - Performance Testing
Performance testing: testing with the intent of determining how quickly a product handles a variety of events. Automated test tools geared specifically to test and fine-tune performance are used most often for this type of testing. Load, stress and volume testing come under this type.
Who: a performance test automation expert.
When: conducted so as to meet the performance criteria stated by the client.
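A minimal sketch of a performance check in Python, assuming a hypothetical criterion of 50 ms per event; the handler, the event count and the threshold are illustrative and not taken from the slides.

```python
import time

def handle_event(payload):
    """Stand-in for the real event-handling work under test."""
    return {k: str(v) for k, v in payload.items()}

N = 10_000
start = time.perf_counter()
for i in range(N):
    handle_event({"id": i, "qty": i % 7})
elapsed = time.perf_counter() - start

avg_ms = elapsed / N * 1000
print(f"{N} events in {elapsed:.3f}s, average {avg_ms:.4f} ms/event")
assert avg_ms < 50, "performance criterion (50 ms/event) not met"
```

Real performance, load and volume testing would typically use dedicated tooling and production-like environments rather than a simple timing loop.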
STLC (Software Testing Life Cycle)
• Plan Test - create the Test Plan
• Design Test - design the test cases
• Automate Tests - create the test automation architecture and the automated test scripts
• Execute Test - produce the Test Log
• Evaluate Test - produce the Test Report
Software Testing Life Cycle (STLC) flow
Plan Test → Design Test → (Automation? Yes: Automate Tests) → Execute Test → Evaluate Test; if the exit criteria are not met, testing loops back to execution.
Workflow detail: Plan Test
Activity: the Test Lead creates the Test Plan (QATP), using the DSRS, change requests and the project approach as inputs.
Entry criteria:
• Project kick-off meeting held
• DSRS approved
Exit criteria:
• Reviewed, approved and baselined QATP
Workflow detail: Design Test
Activity: the Test Engineer designs the test cases (QATC), using the architecture document, DSTD, QATP, project engineering guidelines, change requests and work-load analysis as inputs.
Entry criteria:
• Baselined QATP, DSRS and DSTD
Exit criteria:
• All requirements are mapped to test cases to meet the test objective
• Peer review comments are tracked to closure
• Updated traceability matrix
Workflow detail: Automate Test
Activity: the Test Lead creates the test automation architecture (using the guidelines, QATC and QATP); the Test Engineer creates the automated test scripts for the test suite against the build.
Create Test Automation Architecture
• Entry criteria: all test cases that are to be automated have been identified; validated test scripts (if they already exist)
• Exit criteria: the test suite covers all the requirements; the structure of the test suite is completed and approved
Create Automated Test Scripts
• Entry criteria: signed-off/baselined test automation architecture; frozen test suite; build ready
• Exit criteria: planned test scripts developed; stability of the scripts used for automation checked
Workflow detail: Execute Test
Activity: the Test Engineer executes the test suite (QATC) against the build and records the results in the Test Log.
Entry criteria:
• Unit testing completed by the engineering team
• Baselined QATP and QATC
• Software build ready for testing
• Test data available
Exit criteria:
• Execution of test cases and test scripts is complete
• Test results captured
Workflow detail: Evaluate Test
Activity: the Test Lead evaluates the test run (QATC, test suite, test log, QATP) and produces the Test Report.
Entry criteria:
• System testing completed
• Results logged in the test log
• Test log reviewed by peers/lead
Exit criteria:
• Test report generated and distributed to stakeholders
Roles and Responsibilities for the Software Test Process
• Create Test Plan - Artifact: QATP; Owner: Test Lead; Reviewers: Test Analyst, PM, Test Team; Approvers: PM, Client
• Design Test Cases - Artifact: QATC; Owner: Test Engineer; Reviewers: Test Lead, Test Team; Approvers: Test Lead, PM, Client
• Create Test Automation Architecture - Artifacts: Test Automation Architecture, Automated Test Suite; Owner: Test Lead; Reviewer: Test Analyst; Approver: PM
Roles and Responsibilities for the Software Test Process (continued)
• Create Automated Test Scripts - Artifact: Test Scripts; Owner: Test Engineer; Reviewers: Test Lead, Test Team; Approvers: Test Lead, PM, Client
• Execute Test - Artifact: Test Log; Owner: Test Engineer; Reviewers: Test Lead, Test Team; Approvers: Test Lead, PM, Client
• Evaluate Test - Artifact: Test Report; Owner: Test Lead; Reviewer: Test Analyst; Approvers: PM, Client
Defect Life Cycle (DLC)
• The tester first identifies the defect.
• The tester characterizes the defect based on its severity and priority.
• The tester reports the bug to the development team.
• Once the bug is fixed by the developer, the tester re-tests the same test case for validation.
• Related test cases are then tested for any adverse effects.
Defect Life Cycle (DLC) states
New → Open → In Progress → Fixed → Verified Fixed → Closed
Other resolutions: Deferred, Cannot Reproduce, Documented, Is Duplicate, As Designed
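The states above can be captured as a small transition table; the set of allowed transitions below is one plausible reading of the state diagram, not an authoritative workflow definition.

```python
# Hypothetical transition table for the defect life cycle sketched above.
ALLOWED = {
    "New":            {"Open", "Is Duplicate", "As Designed", "Cannot Reproduce"},
    "Open":           {"In Progress", "Deferred", "Documented"},
    "In Progress":    {"Fixed"},
    "Fixed":          {"Verified Fixed", "Open"},        # reopen if the fix fails verification
    "Verified Fixed": {"Closed"},
}

def transition(state, new_state):
    """Move a defect to a new state, rejecting transitions not in the table."""
    if new_state not in ALLOWED.get(state, set()):
        raise ValueError(f"illegal defect transition: {state} -> {new_state}")
    return new_state

state = "New"
for nxt in ("Open", "In Progress", "Fixed", "Verified Fixed", "Closed"):
    state = transition(state, nxt)
    print("defect is now", state)
```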
DLC - Reasons for a defect
Defects may occur because of the following reasons:
– Time pressure
– Code complexity
– Change in technologies
– Change in requirements
– Programming errors
– Miscommunication
– Wrong interpretation
– Unavailability of software resources
– Lack of human resources
Thank You