Integration testing
in large-scale agile projects
The relation between time-to-market
and the level of integration
Derk-Jan de Grood
[SC]2 – 25 May 2016
Aim of this session
Our End Goal
Deliver our product (or new features)
with a short time-to-market
CI/CD Assumptions
• Teams collaborate
• Integration is continuous
• Tests are automated
• Deployment is a hands-off process
• No test-automation backlog
• Clear acceptance criteria
• Feedback loop to improve testing
• Frequent product launches
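These assumptions imply that integration checks run automatically on every build. As an illustration only (the component and item names below are invented), a minimal automated integration check that a build server could run on every commit:

```python
# Hypothetical example: two components whose integration is verified
# automatically on every build, as the CI/CD assumptions require.

def price_service(item: str) -> float:
    """Stub component: returns the price of an item."""
    prices = {"widget": 9.99, "gadget": 24.50}
    return prices[item]

def checkout(items: list[str]) -> float:
    """Component under test: integrates with price_service."""
    return round(sum(price_service(i) for i in items), 2)

def test_checkout_integrates_with_price_service():
    # The kind of fast, automated check a build server runs per commit.
    assert checkout(["widget", "gadget"]) == 34.49

if __name__ == "__main__":
    test_checkout_integrates_with_price_service()
    print("integration smoke test passed")
```

The point is not the code itself but the cadence: because the check is cheap and automated, it can run continuously rather than only before a release.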
…Integration problems
No matter how we are organized, we have…
• Feature driven: when you work with more teams on the same system
• Component driven: when teams work on adjacent systems
• Legacy systems: when code adaptations have unknown impact
What kind of integrations
do we have?
What is your system boundary?
Component → System → Service → Organization
Definition of (un)done
The definition of done gives quick insight into the level at which
integration takes place. The definition of undone (see the LeSS
framework) identifies integration tests that are not done within each
sprint. Both are indicators of the system boundaries that are taken
into account.
Ensuring Integration
[Chart: integration levels (Component, System, Service, Organization)
plotted against integration frequency, from continuously (in the
sprint) to occasionally (e.g. prior to a release)]
General trend: widening the system boundary (e.g. from units to
systems) results in less frequent integration, because the integration
becomes harder to test. This has impact on the time-to-market, and this
insight might lead to targeted improvements.
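The trend can be made concrete with a toy model. The cadences below are invented for illustration, not taken from the talk; the point is that the wider the boundary, the longer a defect can stay hidden before the next integration run:

```python
# Toy model: assumed integration cadence (in days) per system boundary.
# All numbers are illustrative only.
CADENCE_DAYS = {
    "component": 0.1,      # roughly every commit
    "system": 1.0,         # nightly build
    "service": 14.0,       # once per sprint
    "organization": 90.0,  # once per release
}

def max_feedback_delay(boundary: str) -> float:
    """Worst-case time (days) before an integration defect at this
    boundary can surface: one full cadence interval."""
    return CADENCE_DAYS[boundary]

# Less frequent integration at wide boundaries means slower feedback,
# which directly stretches the time-to-market.
for b in CADENCE_DAYS:
    print(f"{b}: up to {max_feedback_delay(b)} days to detect a break")
```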
Ensuring Integration (rough sketch)
[Same chart; annotation: reduce the number of releases]
Ensuring Integration (rough sketch)
[Same chart; annotation: issues after the sprint]
Ensuring Integration (rough sketch)
[Same chart; annotation: organizational readiness]
Another Case Study
Missing:
Architecture
• What are the business processes?
• What are the components?
• What are the interfaces?
Acceptance criteria
• What is the Minimal Viable Product?
• What integrations are needed to make it work?
Requirements traceability
• When are we complete?
• How do test results add up to acceptance?
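The traceability question ("how do test results add up to acceptance?") can be sketched as a simple mapping from requirements to the integration tests that cover them. All requirement and test names below are invented:

```python
# Hypothetical traceability sketch: map each requirement to the
# integration tests that cover it, and report what is still open.
coverage = {
    "REQ-1 order can be placed": ["IT-checkout", "IT-payment"],
    "REQ-2 invoice is sent":     ["IT-invoicing"],
    "REQ-3 stock is updated":    [],  # no test yet -> cannot be accepted
}

# Integration tests that have passed so far.
passed = {"IT-checkout", "IT-payment"}

def acceptance_status(req: str) -> str:
    """A requirement is accepted only when every covering test passed."""
    tests = coverage[req]
    if not tests:
        return "untested"
    return "accepted" if all(t in passed for t in tests) else "open"

for req in coverage:
    print(req, "->", acceptance_status(req))
```

Even this tiny sketch answers "when are we complete?": when no requirement is left untested or open.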
What should a car minimally do?
Planned Integration Tests
[Burn-down chart: integrations needed to make it work, plotted against
the release date]
In one of my projects I used tested integrations as a measure of
progress. Only if an integration is done (and tested) do you know that
the solution will work as a whole. The burn-down chart was used to
inform management about the progress of the project and to plan the
next integrations in the scrum-of-scrums.
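The progress measure described above can be sketched as follows (integration names and dates are invented): count the planned integrations that are not yet tested on a given date, which yields the burn-down value reported to management.

```python
from datetime import date

# Hypothetical planned integrations for a release.
planned = {"UI->API", "API->DB", "API->Payment", "API->Invoicing"}

# Dates on which each integration was tested successfully.
tested_on = {
    date(2016, 4, 1):  {"UI->API"},
    date(2016, 4, 15): {"API->DB"},
    date(2016, 5, 1):  {"API->Payment"},
}

def remaining(as_of: date) -> int:
    """Burn-down value: integrations still untested on a given date."""
    done = set().union(*(s for d, s in tested_on.items() if d <= as_of))
    return len(planned - done)

print(remaining(date(2016, 4, 20)))  # 2 integrations still open
```

Plotting `remaining()` over time against the release date gives exactly the chart described on this slide.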
Exchange Experiences
On what level do you integrate (SUT), and with what speed?
Is there a need to speed up?
For example:
• CI/CD
• MBT (model-based testing)
• UT (unit testing)
• Automated system test
• Automated e2e test
• Interface testing
• Manual regression testing
• Integration sprints
• Other
WRAP-UP
Summary
There is a common goal: deliver our product (or new features) with a
short time-to-market.
CI/CD helps to do so, but requires a lot from the organization, among
other things frequent integration.
There are levels on which you can integrate, e.g. Component, System,
Service and Organization.
Large-scale integration gives more reliable and commercially feasible
products, but the integration is harder. Making this clear to
management helps to manage expectations or to target your next
improvements.
Derk-Jan
Valori
Coltbaan 4a
3439 NG NIEUWEGEIN
The Netherlands
• derkjandegrood@valori.nl
• +31(0)651807878
• www.valori.nl
• @DerkJanDeGrood
• http://djdegrood.wordpress.com


Editor's Notes

  • #5 Teams collaborate to deliver a working product each sprint. Integration is continuous. Tests are automated and run from the build server. Deployment is a hands-off process. TDD ensures there is no backlog in test automation. Acceptance criteria are clear. A feedback loop to create better tests is in place. The product goes live regularly.
  • #10 How do we ensure integration? We do: CI/CD, MBT, UT, automated system tests, automated e2e tests, interface testing, manual regression testing, integration sprints, other.