Practical Agile QA


A practical guide to implementing QA practices on Agile projects. It answers the questions most teams ask when adopting QA in an Agile environment. If you would like, I would be happy to give this talk for your local Agile user group or at your company. Contact me by email.

  • Teams either skip QA, or continue to do QA as they did on waterfall projects. An impedance mismatch results: we end up being not so Agile, and QA becomes a sore point in the whole process.
  • Too many changes, not enough time. By the end of the presentation, hopefully we will all be able to answer these questions.
  • “… planned and systematic production processes that provide confidence in a product's suitability for its intended purpose. It is a set of activities intended to ensure that products (goods and/or services) satisfy customer requirements in a systematic, reliable fashion.”
  • ACCEPTANCE TESTING. Testing to verify a product meets customer-specified requirements; a customer usually does this type of testing on a product developed externally. FUNCTIONAL TESTING. Validating that an application or Web site conforms to its specifications and correctly performs all its required functions. INTEGRATION TESTING. Testing in which modules are combined and tested as a group. LOAD TESTING. A generic term covering performance testing and stress testing, used to verify that the system meets its scalability requirements. REGRESSION TESTING. Similar in scope to a functional test, a regression test allows a consistent, repeatable validation of each new release of a product or Web site. Such testing ensures that reported defects have been corrected in each new release and that no new quality problems were introduced in the maintenance process. Though regression testing can be performed manually, an automated test suite is often used to reduce the time and resources needed. SMOKE TESTING. A quick-and-dirty test that the major functions of a piece of software work, without bothering with finer details. UNIT TESTING. Tests the behavior of components of a product to ensure their correct behavior prior to system integration. SYSTEM TESTING. Testing conducted on a complete, integrated system to evaluate its compliance with its specified requirements.
  • Who is using these tests? Who is automating which tests?
  • Who is using these tests? Who is automating which tests? Issues with commercial test tools: proprietary scripting, closed platforms, vendor lock-in.
  • What is the issue with this approach? Technical debt: the developers’ focus changes.
  • This typically involves collecting information about which parts of a program are actually executed when running the test suite, in order to identify which branches of conditional statements have been taken.
  • “Continuous Integration is a software development practice where members of a team integrate their work frequently, usually each person integrates at least daily - leading to multiple integrations per day. Each integration is verified by an automated build (including test) to detect integration errors as quickly as possible” - Martin Fowler
  • Aviation control systems are safety-critical, so you would test everything and more. Social networking software, not as much; in fact, you would focus only on quadrants 1 and 2. Financial transactions need to be tested more than other features. Security features should be tested more thoroughly than all other features, and administrative features probably more than user-specific features.
  • When a team says they are done with a story, what is the expectation? Is it ready for integration with other parts of the system? Is it ready for production?
  • QA was outnumbered by the developers; we needed to add more QA people. Best is 1:1. 1:2 can also work, depending on the skill of the tester and the complexity of the domain.

    1. Agile QA: A practical guide to implementing an Agile QA process on Agile projects. Syed Rayhan, Co-founder, Code71, Inc. Contact: Blog: Company: Product:
    2. My Background. Career: Co-founder, Code71, Inc.; 14+ years of total experience; co-author of “Enterprise Java with UML”; iterative incremental development. Expertise: technology planning and architecture; on-shore/off-shore software development using Agile/Scrum; the cultural aspects of self-organizing teams; Scrum for projects delivered remotely. Interests: Agile engineering practices. Copyright 2010, Code71, Inc.
    3. Agenda: Section 1 Introduction; Section 2 Holistic View of QA; Section 3 Individual Practices; Section 4 A Case Study; Section 5 Recap; Section 6 Q&A
    4. What to Expect. Context: teams and organizations are adopting Agile/Scrum; teams struggle with making the transition from waterfall to Agile/Scrum. Focus: build a common base of understanding; develop a set of guidelines covering process, roles, and team composition; address typical questions asked. Key takeaways: how to perform QA on an Agile/Scrum project; Agile QA best practices.
    5. Agenda: Section 1 Introduction; Section 2 Holistic View of QA; Section 3 Individual Practices; Section 4 A Case Study; Section 5 Recap; Section 6 Q&A
    6. The challenges? Is QA part of the development team? Can we fit QA in the same iteration as development? Who does QA? Does QA cost more in Agile, as the product seems to change from sprint to sprint? How can we scale Agile QA? Do we need a “test plan”? Who defines test cases? Are story acceptance tests enough? When do we know testing is done? Do we need to track bugs?
    7. What is QA (Quality Assurance)? To ensure software is working right. How? Test, test, test. We will primarily focus on the single-team model for our discussion.
    8. Types of Testing? White box: Unit Testing, Regression Testing. Black box: Integration Testing, Acceptance Testing, Load Testing, Functional Testing, System Testing, Smoke Testing.
    9. Who Performs What? (What | Who | When | Automation)
       Unit Testing | Developer | Coding | Always
       Integration Testing | Developer | Coding | Always
       System Testing | Tester | Test | Possible
       Regression Testing | Developer/Tester | Build/Test | Possible
       Acceptance Testing | Client/Users | Deployment/Delivery | Possible
       Smoke Testing | Tester/Support Engineer | Deployment | Possible
       Load Testing | Performance Engineer | Deployment | Always
    10. Right tools for right tests? (Test | Tool)
        Unit Testing | NUnit, JUnit, Mock, DBUnit
        Integration Testing | Unit test tools, HttpUnit, SoapUI, RESTClient
        System Testing | Selenium, Fit, WET, Watir, WatiN
        Regression Testing | Unit test tools, system test tools
        Acceptance Testing | FIT, FitNesse
        Smoke Testing | Regression test tools
        Load Testing | JMeter, Httperf
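To make the unit-testing row concrete, here is a minimal sketch of the arrange/act/assert pattern those xUnit-family tools share, written with Python's standard-library unittest and mock modules as a stand-in for NUnit/JUnit/Mock. The Cart class and its tax-service collaborator are hypothetical examples, not code from the deck.

```python
import unittest
from unittest import mock


class Cart:
    """Hypothetical class under test (illustration only)."""

    def __init__(self, tax_service):
        self.tax_service = tax_service  # collaborator, mocked in the unit test
        self.items = []

    def add(self, price):
        self.items.append(price)

    def total(self):
        subtotal = sum(self.items)
        return subtotal + self.tax_service.tax_for(subtotal)


class CartTest(unittest.TestCase):
    def test_total_includes_tax(self):
        # Mock the collaborator so this stays a *unit* test:
        # no real tax service is involved.
        tax = mock.Mock()
        tax.tax_for.return_value = 1.50
        cart = Cart(tax)
        cart.add(10.00)
        cart.add(5.00)
        self.assertEqual(cart.total(), 16.50)
        tax.tax_for.assert_called_once_with(15.00)
```

Run with `python -m unittest` against the module; the same shape carries over to NUnit or JUnit almost line for line.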
    11. What is missing? Assumptions: 1. Right spec, 2. Right design, 3. Right amount of tests, 4. Right tests. Reality? Assumptions are often far from the truth. Measures: Spec → Spec review; Design → Design review; Code → Code review; Test → Test coverage.
    12. Team Composition? Out-of-Cycle (Separate Team): testing trails development, with Dev working iteration n while Test covers iteration n-1 and Prod follows later. In-Cycle (Integrated Team): Dev and Test run inside the same iteration, with Prod following. [Diagram: Dev/Test/Prod tracks across iterations for each model.] Developer to tester ratio?
    13. Quality Funnel: Backlog Review (QG#1) → Design Review (QG#2) → Code Review (QG#3) → Unit & Int. Test, CI (QG#4) → System Test (QG#5); bugs found at each quality gate feed back into the funnel.
    14. Agenda: Section 1 Introduction; Section 2 Holistic View of QA; Section 3 Individual Practices; Section 4 A Case Study; Section 5 Recap; Section 6 Q&A
    15. Test Coverage. Definition: “A measure of the proportion of a program exercised by a test suite, usually expressed as a percentage.” Types of coverage: function coverage, path coverage, statement coverage. Test coverage metrics can tell you what code is not tested.
    16. Continuous Integration as the Glue. [Diagram: Source Code → Build → Automated Regression Test → Test Coverage → Report/Monitor.] Continuous Integration is Continuous QA.
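The build → test → coverage → report loop on this slide can be sketched as a fail-fast stage runner. The stage names and the coverage gate below are illustrative assumptions, not the API of any particular CI product.

```python
def run_pipeline(stages):
    """Run (name, step) pairs in order; stop at the first failing step."""
    report = []
    for name, step in stages:
        ok = step()
        report.append((name, "ok" if ok else "FAILED"))
        if not ok:
            break  # fail fast so integration errors surface quickly
    return report


if __name__ == "__main__":
    coverage = 0.93  # pretend the test run measured this
    report = run_pipeline([
        ("build", lambda: True),             # e.g. invoke the compiler
        ("regression tests", lambda: True),  # e.g. run the automated suite
        ("coverage gate", lambda: coverage >= 0.90),
    ])
    for name, status in report:
        print(f"{name}: {status}")
```

Because every stage runs on every integration, a broken build or failed regression test is reported within minutes of the commit, which is what makes CI "continuous QA".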
    17. Test Case Prioritization: plot each test case by risk of having bugs against frequency of use. 1 = hot (high risk, high frequency); 2 = warm (high risk, low frequency); 3 = normal (low risk, high frequency); 4 = cold (low risk, low frequency).
    18. “Inspect and Adapt” through a QA Lens. Discover: log bugs found by testers. Triage & fix: prioritize bugs over stories. Prevent: five “whys” of root cause analysis.
    19. Definition of “Done”: Is QA part of your definition of “Done”?
    20. Tracking Quality. Quality metrics: Defect Rate (bug count per iteration); Defect Density (bug count per module, bug count per function point). What else should we track?
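The two metrics on this slide are straightforward to compute from bug-tracker data; the sketch below shows one way, with the function names and example module sizes being my own illustrative choices rather than anything from the deck.

```python
def defect_rate(bugs_per_iteration):
    """Defect rate: average bug count per iteration."""
    return sum(bugs_per_iteration) / len(bugs_per_iteration)


def defect_density(bugs_by_module, size_by_module):
    """Defect density: bugs per module, normalized by module size
    (e.g. function points), so big and small modules compare fairly."""
    return {module: bugs_by_module[module] / size_by_module[module]
            for module in bugs_by_module}
```

Tracked per sprint, a rising defect rate or a module with an outlying density is an early "inspect and adapt" signal for the retrospective.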
    21. Technical Debt & QA. “Aged” open bugs can contribute to increasing “technical debt.” “Recurring” bugs may indicate hidden “technical debt.”
    22. Agenda: Section 1 Introduction; Section 2 Holistic View of QA; Section 3 Individual Practices; Section 4 A Case Study; Section 5 Recap; Section 6 Q&A
    23. A Case Study. Project: a large enterprise system that includes technologies like ASP.Net, BizTalk, Workflow, Scanning, SQL Server, Data Warehouse, and Mainframe. Team: 2 product owners, 1 scrum master, 1 architect, 5 developers, 1 QA tester. Sprint: 2 weeks. QA process: Day 1-2: refine scope, test data & test cases; Day 3-4: identify UI elements, test data & QA schedule; Day 5-8: write test scripts, test & fix; Day 9: final acceptance test; Day 10: demo & acceptance test.
    24. Agenda: Section 1 Introduction; Section 2 Holistic View of QA; Section 3 Individual Practices; Section 4 A Case Study; Section 5 Recap; Section 6 Q&A
    25. Recap: “In-cycle QA” with an “integrated team” is critical to the success of a project. System testing is not the only “quality gate”; the gates include all types of testing and reviews. Test automation is critical to “in-cycle” QA. Target at least 90% test coverage. “Continuous Integration” is “Continuous QA.” Prioritize test cases based on risk and frequency of use.
    26. Recap contd.: All known bugs should be fixed first. Right-sized stories with well-thought-out acceptance tests improve quality. Include all “quality gates” in the definition of “Done.” Analyze each bug to understand at which quality gate it should have been caught, and improve (inspect and adapt). QA is not a designated person’s responsibility; it is the team’s responsibility (self-organizing team).
    27. Q&A. “QA is making sure the right software works right.” “QA is not an act, but a habit.” Please contact for on-site training or a Webinar. Contact: Blog: Company: Product: