Test Automation in Agile
Agenda
• Capgemini’s World Quality Report on Automation
• Evolution of business models and IT ecosystem
• QA and Testing in the agile world
• Agile and test automation
• Challenges in Agile automation
• In conclusion …
Capgemini’s World Quality Report
Connect with Capgemini: https://www.capgemini.com/testing-services
Evolution of Business Models
“By 2018, one-third of the top 20 market share leaders in most industries will be significantly disrupted by competitors that use the 3rd Platform to create new services and business models.”
• 1st Platform (Mainframe/Terminal): mainframes, with everything centralized on dumb terminals; voice as stand-alone telephony
• 2nd Platform (LAN/Internet, Client/Server): distributed computing across PCs, tablets & smartphones; telephony as part of computer hardware
• 3rd Platform (SMAC era): voice used to access and exchange real-time video, data, text and transactions; a first step toward cognitive computing
Source: IDC
Evolution of IT ecosystem
• Development Process: Waterfall → Agile → CI/CD → Dev(T)Ops
• Application Architecture: Monolithic → N-Tier → Microservices
• Deployment & Packaging: Physical Servers → Virtual Servers → Containers
• Application Infrastructure: Datacenter → Hosted → Cloud → Infrastructure as Code
Testing in the Agile process
Testing is embedded within the Scrum process:
• Test preparation begins with the completion of user stories
• Daily test runs are conducted alongside the development team, and errors are communicated to the scrum team and stakeholders
• Testing is also performed on partially completed scenarios
• Testing is dynamic; changes can occur at any time
• Additional testing (performance, UAT, etc.) is coordinated with non-agile development teams as needed
There is no separately allocated schedule for testing: all elements have to be tested within the scrum team.
Testing in the Agile development lifecycle
Each sprint repeats the cycle Requirements → Design & Coding → Test → Acceptance Test. Within every sprint, reviews & static analysis, dynamic analysis, unit testing and functional testing are performed. Running continuously across all sprints are: performance testing, automation testing, test data management, test environment management, testing and quality metrics, and configuration management.
Agile Testing Objectives
• Test as early and as often as possible: continuously plan the testing effort with every sprint; adapt and evolve with each delivery and test cycle
• Test enough and prioritise: test the most important features first
• Pair testing with development: constant interaction between development and testing; jointly develop test cases and leverage the unit test cases
Leading guiding agile principle: “Everyone tests, the customer accepts.”
Testing Types and Phases
• Validation of user stories: acceptance tests
• Validation of the tasks implementing the user stories: unit tests, integration tests, functional tests, exploratory testing, performance tests, regression of previous sprint(s)
Agile Testing Approach
Testing runs continuously across successive sprints (Sprint-1, Sprint-2, Sprint-3, …).

Key Agile Testing Metrics
• Defect Acceptance Ratio (DAR)
• Defect Removal Efficiency (DRE)
• % Deferred Defects
• Reopened defects
• Test Case Execution (TCE): ratio of test cases executed vs. test cases planned
• Test Automation Coverage
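As an illustration, here is a minimal Java sketch of how two of these ratios are commonly computed. The formulas are conventional definitions, not taken from the slide, and teams may define them differently:

```java
// Sketch of two of the metrics above, using common conventions:
//   DRE = defects removed before release / (removed before + found after release)
//   TCE = test cases executed / test cases planned
public final class AgileTestMetrics {

    /** Defect Removal Efficiency as a percentage. */
    static double defectRemovalEfficiency(int removedBeforeRelease, int foundAfterRelease) {
        int total = removedBeforeRelease + foundAfterRelease;
        return total == 0 ? 100.0 : 100.0 * removedBeforeRelease / total;
    }

    /** Test Case Execution ratio as a percentage. */
    static double testCaseExecution(int executed, int planned) {
        return planned == 0 ? 0.0 : 100.0 * executed / planned;
    }

    public static void main(String[] args) {
        // Example: 45 defects removed in-sprint, 5 escaped; 180 of 200 planned cases run.
        System.out.printf("DRE: %.1f%%%n", defectRemovalEfficiency(45, 5));  // 90.0%
        System.out.printf("TCE: %.1f%%%n", testCaseExecution(180, 200));     // 90.0%
    }
}
```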
Agile Development and Testing Activities
TMap NEXT® - How TMap is Applied to the Agile Test Lifecycle
A project contains one or more iterations; per project a compact project test plan is made. An iteration contains one or more user stories; per iteration a compact iteration test plan is made. The preparation (P), specification (S), execution (E) and completion (C) phases apply to each individual user story, so every story within an iteration (e.g. Story 7 and Story 13 in Iteration 1, Story 4 and Story 11 in Iteration n) passes through its own P → S → E → C cycle. The control and infrastructure phases are continuous phases covering the whole project.
• Planning: develop test strategy; risk analysis; test estimation; prepare test plan; define organization
• Preparation: review requirements; assign techniques (see Appendix); identify scenarios
• Specification: create test scripts; define test data
• Execution: test, (re)test; check, assess
• Completion: preserve testware; evaluate process
• Control (continuous): manage the test project; report metrics; control budget & timelines
Testing in the Agile process
A tester in an agile environment:
• Is a skilled communicator
• Is flexible
• Has domain knowledge
• Is creative but practical
• Is solution-oriented
• Is customer-focused
• Is a team player
• Supports the product owner in the prioritization of the product backlog and acceptance criteria
• Supports the business analyst
• Evaluates unit tests
• Pairs with a developer to help program with minimal errors
• Participates in daily scrum meetings
The most important characteristic is that the tester must be proactive and open-minded and, as part of the team, feel responsible for the delivery of a quality result.
Agile and Automation
• Automation has moved beyond graphical-user-interface testing in record-and-playback mode
• Faster time to market, increased productivity, real-time data processing
• Learn to program: programming is the first step toward automation; be well versed in scripting languages like Python, Ruby, etc.
• Learn the automation tools on the market
• Social, Mobile, Analytics, Cloud (SMAC): digital everywhere!
• DevOps automation: the way forward!
Agile and Automation
Indicative automation levels across the lifecycle (Requirements → Architecture & Design → Development → Testing → Release):

Shift left
• Test-driven development: 80–100%
• Unit tests: 80–100%
• Behavior-driven development: 50–70%
• Service virtualization: 40–60%

Shift right
• Continuous integration: 80–100%
• Continuous delivery: 10–60%
• UAT automation: 40–60%
• Continuous monitoring: 40–60%
• Environment virtualization: 40–60%

Testing phase
• Non-functional tests: 80–100%
• Performance tests: 100%
• Regression tests: 60–80%
• Test management / defect management: 70–100%
• Sprint acceptance tests: 40–50%
• Exploratory tests: 30–40%
• Functional tests: 40–50%
Test Automation Approach in Agile
Components (C1–C4) and features (F1) mature across Sprint 1 to Sprint 4: a component first appears in one sprint and is marked stable in a later one. The tester automates in sprint N+1 whatever became stable in sprint N.
Test Driven Development (TDD)
The mantra of Test-Driven Development (TDD) is red, green, refactor:
Write a test that fails → Make the code work → Eliminate the redundancy by refactoring → (repeat)
Typical tools: JUnit, NUnit, HttpUnit
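As a concrete illustration of one red-green-refactor cycle, here is a minimal JUnit 4 sketch; the `PriceCalculator` class and its discount rule are invented for this example, not taken from the slides:

```java
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Red: these tests are written first and fail until PriceCalculator
// exists and implements the discount rule.
public class PriceCalculatorTest {

    @Test
    public void largeOrdersGetTenPercentDiscount() {
        assertEquals(108.0, new PriceCalculator().totalWithDiscount(120.0), 0.001);
    }

    @Test
    public void smallOrdersPayFullPrice() {
        assertEquals(50.0, new PriceCalculator().totalWithDiscount(50.0), 0.001);
    }
}

// Green: the simplest code that makes both tests pass.
// Refactor: once green, magic numbers are pulled into named constants
// while re-running the tests to confirm nothing broke.
class PriceCalculator {
    private static final double DISCOUNT_THRESHOLD = 100.0;
    private static final double DISCOUNT_RATE = 0.10;

    double totalWithDiscount(double amount) {
        return amount >= DISCOUNT_THRESHOLD ? amount * (1 - DISCOUNT_RATE) : amount;
    }
}
```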
Acceptance Test Driven Development (ATDD)
The same red-green-refactor cycle, driven from the acceptance level:
Write an acceptance test that fails → Make the code work → Eliminate the redundancy by refactoring → (repeat)
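Here is a sketch of the same discount rule driven from the story level. In practice teams often express such tests in a BDD tool (Cucumber and JBehave appear later in this deck), but plain JUnit keeps the example self-contained; all class names are hypothetical:

```java
import java.util.ArrayList;
import java.util.List;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Acceptance test for the story "As a shopper, I get a 10% discount on
// orders of 100 or more." It is written first and fails until the
// checkout logic below is implemented.
public class DiscountStoryAcceptanceTest {

    @Test
    public void shopperSeesDiscountAppliedAtCheckout() {
        ShoppingCart cart = new ShoppingCart();
        cart.add("Keyboard", 60.0);
        cart.add("Mouse", 40.0);

        Receipt receipt = cart.checkout();

        assertEquals(100.0, receipt.subtotal, 0.001);
        assertEquals(90.0, receipt.total, 0.001); // 10% off at the threshold
    }
}

class Receipt {
    final double subtotal;
    final double total;
    Receipt(double subtotal, double total) { this.subtotal = subtotal; this.total = total; }
}

class ShoppingCart {
    private final List<Double> prices = new ArrayList<>();

    void add(String name, double price) { prices.add(price); }

    Receipt checkout() {
        double subtotal = prices.stream().mapToDouble(Double::doubleValue).sum();
        double total = subtotal >= 100.0 ? subtotal * 0.9 : subtotal;
        return new Receipt(subtotal, total);
    }
}
```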
Continuous Integration Automation Framework
Running on a physical or virtual environment, the framework chains SCM, environment pre-check, deployment, environment sanity / smoke test, unit testing, and the test automation / regression suite, ending in a build that is ready for deployment.
• SCM tools: MS TFS, SVN, IBM RTC, ClearCase, HP ALM
• Build tools: MS TFS, Maven, Jenkins, CruiseControl, Build Forge
• Unit testing tools: MSTest, JUnit
• Test automation tools: MS Coded UI, Café Next, QTP, Selenium, BDD + Cucumber
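As one concrete piece of such a pipeline, here is a minimal JUnit sketch of the environment sanity / smoke test stage; the health-check URL is a placeholder, and a real pipeline would inject it per environment:

```java
import java.net.HttpURLConnection;
import java.net.URL;
import org.junit.Test;
import static org.junit.Assert.assertEquals;

// Post-deployment smoke test: fails the pipeline fast if the freshly
// deployed application does not even answer its health endpoint.
public class EnvironmentSmokeTest {

    private static final String HEALTH_URL = "http://localhost:8080/health"; // placeholder

    @Test
    public void deployedApplicationAnswersHealthCheck() throws Exception {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(HEALTH_URL).openConnection();
        conn.setConnectTimeout(5000);  // fail fast if the app is down
        conn.setReadTimeout(5000);
        conn.setRequestMethod("GET");

        assertEquals(200, conn.getResponseCode());
    }
}
```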
Agile Tools Snapshot: a ready-to-use tool kit for automation
• Release and sprint planning: VersionOne, JIRA Agile, HP Agile Manager, Microsoft TFS
• TDD and BDD tools: Cucumber, JBehave, JUnit, xUnit, JMeter, csUnit, CAST
• Sprint-level services automation: CA LISA, HP Service Test
• Functional automation: QTP, Café (Capgemini Automation Framework), IMDA, Selenium
• Exploratory and test design: HP Sprinter, BluePrint (MBT)
• Metrics: LIVE (Lifecycle Integration with Virtualized Engine), HP Quality Center, VersionOne, JIRA, TFS
• Build & integration: Jenkins, CruiseControl, GNU Make
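To make the functional-automation row concrete, here is a minimal Selenium WebDriver check in Java; the URL, locator and assertion are placeholders invented for illustration:

```java
import org.junit.Test;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import static org.junit.Assert.assertTrue;

// Minimal functional UI check with Selenium WebDriver.
public class LoginPageFunctionalTest {

    @Test
    public void loginPageShowsSignInButton() {
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("https://example.com/login");   // placeholder URL
            boolean visible = driver
                    .findElement(By.id("sign-in"))      // placeholder locator
                    .isDisplayed();
            assertTrue("Sign-in button should be visible", visible);
        } finally {
            driver.quit();                              // always release the browser
        }
    }
}
```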
Future of Automation – A sneak peek
• Technical connect: testing as a purely manual profession will diminish; testers will be expected to have scripting knowledge and the ability to automate.
• Business connect: automation testers will be expected to have domain and business know-how.
• DevOps: end-to-end continuous integration automation will be a necessity, not an innovation.
• SMAC adoption: multi-channel integration and smart-device compatibility are a necessity.

Editor's Notes

• #14 As a general rule of thumb, we advise customers that automated test creation lag by no more than one sprint. In other words, the exit criteria for a second sprint include the agreed-upon automation for the functionality of the first sprint, and so on. This rule recognizes that there is rarely enough time within a sprint to automate the tests of that sprint, but it also lessens the temptation to let test automation slide to later and later sprints. Furthermore, this “sprint+1” law of automated testing helps to ensure (a) that time for test automation is allocated to each sprint, and (b) that if the team is not able to complete its target automation, the gap in remaining work is treated just as a code gap would be. That is, either the sprint is extended to complete the work, or the outstanding tasks are moved to the backlog for allocation to future sprints.
• #15 The TDD cycle in detail:
Add a test. In test-driven development, each new feature begins with writing a test. This test must inevitably fail because it is written before the feature has been implemented. (If it does not fail, then either the proposed “new” feature already exists or the test is defective.) To write a test, the developer must clearly understand the feature's specification and requirements, which can be accomplished through use cases and user stories that cover the requirements and exception conditions. This could also imply a variant or modification of an existing test. This is a differentiating feature of test-driven development versus writing unit tests after the code is written: it makes the developer focus on the requirements before writing the code, a subtle but important difference.
Run all tests and see if the new one fails. This validates that the test harness is working correctly and that the new test does not mistakenly pass without requiring any new code. This step also tests the test itself, in the negative: it rules out the possibility that the new test will always pass and therefore be worthless. The new test should also fail for the expected reason, which increases confidence (although it does not entirely guarantee) that it is testing the right thing and will pass only in intended cases.
Write some code. The next step is to write some code that will cause the test to pass. The new code written at this stage will not be perfect and may, for example, pass the test in an inelegant way. That is acceptable because later steps will improve and hone it. It is important that the code written is only designed to pass the test; no further (and therefore untested) functionality should be predicted and “allowed for” at any stage.
Run the automated tests and see them succeed. If all test cases now pass, the programmer can be confident that the code meets all the tested requirements. This is a good point from which to begin the final step of the cycle.
Refactor the code. Now the code can be cleaned up as necessary. By re-running the test cases, the developer can be confident that refactoring is not damaging any existing functionality. The concept of removing duplication is an important aspect of any software design; here it also applies to removing duplication between the test code and the production code, for example magic numbers or strings that were repeated in both in order to make the test pass in step 3.
Repeat. Starting with another new test, the cycle is repeated to push the functionality forward. The steps should always be small, with as few as 1 to 10 edits between each test run. If new code does not rapidly satisfy a new test, or other tests fail unexpectedly, the programmer should undo or revert in preference to excessive debugging; continuous integration helps by providing revertible checkpoints. When using external libraries, it is important not to make increments so small as to be effectively merely testing the library itself,[3] unless there is some reason to believe that the library is buggy or not sufficiently feature-complete to serve all the needs of the program being written.
• #16 Acceptance test-driven development follows the same add-a-test → watch it fail → write code → watch it pass → refactor → repeat cycle described in the note for slide #15, except that each new feature begins with writing an acceptance test rather than a unit test.