Quality Loopback
Accelerating Code to Release
Omar Bashir
https://www.linkedin.com/in/obprofile/ @OmarBashir_40
DevOps
Lead Time
(Diagram: the delivery pipeline across Dev, QA, and OPS.)
● Build
● Build-time tests
● Dev e2e tests
● Functional & non-functional tests
● Deployment to non-prod environments
● Operate & monitor
Pipeline: Code → Commit → Deploy in Dev → Test in Dev → Deploy in QA → Test in QA → Deploy in UAT → Test in UAT → Prod
Lead time spans the pipeline from code commit to production.
Lead Time
Lead Time = ∑Deployment Time + ∑Testing Time
● Reducing deployment time
– Automating deployments
– Reducing deployment errors
● Reducing testing time
– Faster tests
– Reliable tests
– Higher overall code and functional coverage
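A minimal sketch of the lead-time formula above, in Python; the stage names and durations are illustrative, not figures from the talk.

    from dataclasses import dataclass

    @dataclass
    class Stage:
        name: str
        deploy_minutes: float  # time to deploy the build into this environment
        test_minutes: float    # time to run the tests in this environment

    def lead_time(stages: list[Stage]) -> float:
        # Lead Time = sum of deployment time + sum of testing time across all stages
        return sum(s.deploy_minutes + s.test_minutes for s in stages)

    pipeline = [
        Stage("Dev", deploy_minutes=10, test_minutes=30),
        Stage("QA", deploy_minutes=20, test_minutes=120),
        Stage("UAT", deploy_minutes=20, test_minutes=240),
    ]

    print(lead_time(pipeline))  # 440 minutes

Automating deployments attacks the first term; faster, more reliable tests attack the second.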
Where to test and how much to test?
Testing Pyramid Pattern
(Diagram: pyramid with Unit/Component Tests at the base, Service Tests in the middle, and Manual Tests at the top. Test execution time and test maintenance cost increase towards the top. The upper layers are business centric: are we building the right system? The lower layers are developer centric: are we building the system right?)
Ice Cream Cone Anti-Pattern
(Diagram: inverted pyramid with Manual Tests at the top, Service Tests in the middle, and Component & Unit Tests at the narrow bottom. Test execution time and test maintenance cost again increase towards the top, where most of the tests sit.)
Ice Cream Cone Anti-Pattern
● Siloed QA focused on or tasked with testing.
● Legacy untestable code.
● Systems largely composed of third party
services.
End-to-End Testing Challenges
● Resource requirements
– Multiple production-like environments required.
– Never enough environments.
– Developers queuing for environments.
– Release trains.
● Scheduling builds and releases.
● Organisational overhead.
End-to-End Testing Challenges
● Testing speed and throughput
– Test independence leads to larger, slower tests with
multiple common steps.
● Smart automation tools may compress tests.
– Test failures may leave environments in an unstable state.
Shifting Left to Accelerate
● More testing at build and pre-build stages.
– Developers to test within IDEs prior to commits and merges (see the sketch below).
● Increasing coverage of the tests on the left.
– Tests on the right to focus on integration.
Pipeline: Code → Commit → Deploy in Dev → Test in Dev → Deploy in QA → Test in QA → Deploy in UAT → Test in UAT → Prod
Increasing tests progressively to the left.
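As a sketch of what testing at build and pre-build stages can look like: a fast, dependency-free unit test a developer can run in the IDE before committing (pytest is assumed as the runner; the pricing function and its discount rule are invented for illustration).

    # pricing.py: a small pure function is cheap to test before every commit
    def discounted_price(price: float, loyalty_years: int) -> float:
        # 5% discount per loyalty year, capped at 25% (illustrative rule)
        discount = min(0.05 * loyalty_years, 0.25)
        return round(price * (1 - discount), 2)

    # test_pricing.py: runs in milliseconds, no deployment or environment needed
    def test_discount_is_capped_at_25_percent():
        assert discounted_price(100.0, loyalty_years=10) == 75.0

    def test_no_loyalty_means_no_discount():
        assert discounted_price(100.0, loyalty_years=0) == 100.0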
Confidence in Testing
● Reliability
– Tests produce the same results for given inputs and a specified state of the system and the environment (see the sketch below).
● Coverage.
– High code coverage.
– Complete functional coverage.
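One common way to achieve that reliability is to make every input and piece of state explicit; in the sketch below (function and rule invented for illustration), a fixed timestamp is injected instead of reading the wall clock, so the test produces the same result on every run.

    from datetime import datetime, timezone

    def is_market_open(now: datetime) -> bool:
        # Open on weekdays between 08:00 and 16:30 UTC (illustrative rule)
        if now.weekday() >= 5:  # Saturday or Sunday
            return False
        minutes = now.hour * 60 + now.minute
        return 8 * 60 <= minutes < 16 * 60 + 30

    def test_market_open_midweek_morning():
        fixed_now = datetime(2024, 1, 10, 9, 0, tzinfo=timezone.utc)  # a Wednesday
        assert is_market_open(fixed_now)

    def test_market_closed_on_weekend():
        fixed_now = datetime(2024, 1, 13, 9, 0, tzinfo=timezone.utc)  # a Saturday
        assert not is_market_open(fixed_now)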
High Code Coverage Controversy
● Advantages:
– Tests cover the majority of the code.
– Code is testable, and hence likely follows SOLID principles.
● But high coverage does not guarantee fewer functional defects:
The code may be built right, but is it the right code?
Quality Loopback
● Domain-inclusive left shift.
– Aligning unit and component tests to the acceptance criteria (see the sketch below).
● Detecting functional and technical issues early in the development process.
● Reduces the need for a large number of end-to-end tests.
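A minimal sketch of the domain-inclusive left shift, assuming pytest: the acceptance criterion is written into a unit test as given/when/then comments, so a functional defect surfaces at build time rather than in an end-to-end run (the account transfer example is invented for illustration).

    # Acceptance criterion (illustrative):
    #   Given an account with a balance of 100,
    #   when a transfer of 150 is requested,
    #   then the transfer is rejected and the balance is unchanged.
    import pytest

    class InsufficientFunds(Exception):
        pass

    class Account:
        def __init__(self, balance: float):
            self.balance = balance

        def transfer_out(self, amount: float) -> None:
            if amount > self.balance:
                raise InsufficientFunds()
            self.balance -= amount

    def test_transfer_exceeding_balance_is_rejected():
        account = Account(balance=100)              # Given
        with pytest.raises(InsufficientFunds):      # When / Then: rejected
            account.transfer_out(150)
        assert account.balance == 100               # Then: balance unchanged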
Quality Loopback Tools
● Acceptance criteria to build tests.
● Code reviews with testing evidence.
● Test coverage.
Looping Back In Legacy Code
● Bugs and end-to-end test failures to be reproduced in unit and component tests (see the sketch below).
● Incremental refactoring of legacy code for
testability.
● Consolidating end-to-end tests.
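A sketch of the first point, with an invented example: an order total that was wrong in an end-to-end run (the shipping fee was being dropped) is reproduced as a component-level regression test, which then stays in the fast suite after the fix.

    # Legacy component (illustrative). The original version ignored the shipping fee;
    # the defect only surfaced as a wrong total in an end-to-end test of the full stack.
    def order_total(items: list[float], shipping_fee: float) -> float:
        return round(sum(items) + shipping_fee, 2)  # fixed: shipping fee now included

    # Regression test reproducing the end-to-end failure at component level.
    def test_shipping_fee_is_included_in_total():
        assert order_total([19.99, 5.01], shipping_fee=4.50) == 29.50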
Challenges and Opportunities
● Mapping acceptance criteria to units, components,
microservices, etc. can be challenging.
● Stronger QA and development collaboration.
● Testable design and implementation leading to loosely coupled applications.
● Reduced lead time.