Inverting The Testing Pyramid



As more and more companies move to the cloud, they want their latest, greatest software features to be available to their users as soon as they are built. However, several issues block them from moving ahead.

One key issue is the massive amount of time it takes to certify that a new feature is indeed working as expected and to assure that the rest of the features continue to work. Despite this long waiting cycle, we still cannot guarantee that our software will be free of issues. In fact, our assumptions about the user's needs or behavior may themselves be wrong; the long testing cycle only validates that the software behaves as we assumed it should.

How can we break out of this rut and get thin slices of our features in front of our users to validate our assumptions early?

Most software organizations today suffer from what I call the "Inverted Testing Pyramid" problem. They spend most of their time and effort manually checking software. Some invest in automation, but mostly in building slow, complex, fragile end-to-end GUI tests. Very little effort is spent on building a solid foundation of unit and acceptance tests.

This over-investment in end-to-end tests is a slippery slope: once you start down this path, you end up investing ever more time and effort in testing, with diminishing returns.

In this session Naresh Jain explains the key misconceptions that have led to the widespread adoption of the inverted testing pyramid, the main drawbacks of this approach, and how to turn your organization around to get the right testing pyramid.


Inverting The Testing Pyramid

  1. Inverting The Testing Pyramid. Naresh Jain, @nashjain. Copyright © 2013, AgileFAQs. All Rights Reserved.
  2. Back in the Stone Age: Plan, Design, Distribute, Work in Isolation, Integrate. (Chart axes: Happiness/Excitement vs. Time/Money/Opportunity Cost.)
  3. Last-Minute Integration Surprises
  4. BAD things were visible too late...
  5. Birth of CI
  6. CI helped us learn that integrating early and integrating often improves collaboration, feedback, quality, and delivery time, and reduces wastage.
  7. Key Questions. Business facing: are we building the right product? Technology/implementation facing: are we building the product right?
  8. Brian Marick's Test Categorization: tests either support programming (written before/while coding) or critique the product (post coding), and are either business facing or technology/implementation facing.
  9. It helps to think of tests this way. Business facing: acceptance testing and low-fi prototypes (drive development), exploratory testing and UI/usability testing (critique the product). Technology/implementation facing: unit testing and system tests (drive development), performance testing (critiques the product).
  10. Test-Driven Development. The TDD rhythm is Test, Code, Refactor: add a test, run the test (it fails), make a little change, run the test (it passes), refactor, run the test again (it still passes).
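The rhythm on this slide can be illustrated with a minimal sketch using Python's built-in unittest (the ShoppingCart class is a hypothetical example, not from the talk):

```python
import unittest

# A hypothetical ShoppingCart written test-first: each failing test came
# first (red), then just enough code to make it pass (green), then refactor.

class ShoppingCart:
    def __init__(self):
        self._prices = []

    def add(self, price):
        self._prices.append(price)

    def total(self):
        return sum(self._prices)

class ShoppingCartTest(unittest.TestCase):
    def test_new_cart_has_zero_total(self):
        self.assertEqual(ShoppingCart().total(), 0)

    def test_total_sums_item_prices(self):
        cart = ShoppingCart()
        cart.add(10)
        cart.add(5)
        self.assertEqual(cart.total(), 15)
```

Run with `python -m unittest`. The point of the rhythm is that each test existed, and failed, before the code that satisfies it was written.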
  11. The Lean Startup community tried something quite disruptive...
  12. Continuous Deployment
  13. But...
  14. The Software Testing Ice-Cream Cone: mostly manual checking on top, then end-to-end GUI tests (10-20%), integration tests (5-10%), and unit tests (1-5%).
  15. Commercial Break!
  16. (image slide)
  17. Tech Talks!
  18. Inverting the Testing Pyramid. Typical testing strategies lead to an inverted testing pyramid: manual checking and end-to-end GUI tests (80-90%), integration tests (5-15%), unit tests (1-5%).
  19. Inverting the Testing Pyramid. This is the need of the hour: GUI tests (1%), end-to-end flow tests (4%, including performance and security tests), workflow tests one layer below the GUI (6%), integration tests (9%), biz-logic acceptance tests (10%), unit tests (70%).
  20. Number of End-to-End Tests. For a call chain through three classes (T1, T2, T3) in front of a DB, with 8, 9, and 11 method calls per layer: 70 complex, fragile end-to-end tests vs. 61 unit tests plus 8 end-to-end tests.
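The combinatorics behind this slide can be sketched as follows. The layer sizes reuse the slide's figures, but the additive-vs-multiplicative framing is an illustration, not the talk's exact calculation:

```python
# A hypothetical three-layer system; each layer has some number of
# independent paths (branches) that need covering.
paths_per_layer = [8, 9, 11]

# Unit tests exercise each layer in isolation, so the cost is additive.
unit_tests = sum(paths_per_layer)       # 8 + 9 + 11 = 28

# End-to-end tests drive paths through the full stack, so exhaustive
# path coverage is multiplicative.
e2e_tests = 1
for n in paths_per_layer:
    e2e_tests *= n                      # 8 * 9 * 11 = 792

print(unit_tests, e2e_tests)            # prints: 28 792
```

This is why the pyramid pushes most coverage into cheap, isolated unit tests and keeps only a handful of end-to-end tests for the critical flows.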
  21. Best ROI for Testing. Impact vs. effort by layer: unit tests (90% impact, 10% effort), modules (70% impact, 30% effort), API (65% impact, 35% effort), end-to-end (10% impact, 90% effort).
  22. Types of Tests in the Testing Pyramid. Small agile projects with a single team: GUI tests, acceptance tests, unit tests. Large-scale agile projects with global teams: the full pyramid (GUI tests 1%, end-to-end flow tests 4% with performance and security tests, workflow tests one layer below the GUI 6%, integration tests 9%, biz-logic acceptance tests 10%, unit tests 70%).
  23. Types of Tests for a Large Agile Project (type, tools, who, when, level, objective):
      - Unit Tests. Tools: xUnit, Mockito/Moq, Jasmine, HSqlDB, Jumble. Who: developers. When: during the iteration, while coding. Level: class. Objective: each class functions correctly in isolation; UI components (including JS) can also be unit tested in isolation.
      - Biz Logic Acceptance Tests. Tools: Cucumber (some may need Lisa), JMeter/LoadRunner. Who: TechBAs + developers + testers (QA). When: acceptance criteria are defined before the iteration starts; acceptance tests are written during the iteration, before coding. Level: work-stream-specific business functionality (at component level). Objective: from a business-logic point of view the functionality is implemented correctly and works in isolation from other work-streams/modules; also a good point to do some basic performance testing.
      - Integration Tests. Tools: xUnit + Lisa. Who: developers + testers (QA) + TechBAs. When: contracts are defined before the iteration; tests are implemented during the iteration. Level: between dependencies (work-streams, upstream/downstream systems, 3rd parties). Objective: two components can communicate with each other correctly; best suited for negative-path testing; should not test any functional aspects of the application.
      - Workflow Tests. Tools: Cucumber + Lisa. Who: SMEs + TechBAs + UX + developers + architects + testers (QA + UAT). When: when a theme (functional group of iterations) is picked. Level: across a few work-streams (Issuance/Acceptance). Objective: inside a business operation, a particular step (business workflow) works correctly from the business point of view; upstream/downstream dependencies are stubbed.
      - End-to-End Flow Tests. Tools: Cucumber + JMeter/LoadRunner. Who: SMEs + TechBAs + UX + developers + architects + testers (QA + UAT). When: defined and authored when an internal milestone is decided; enabled as and when the complete workflow is implemented. Level: end-to-end trade flow through all the steps, one layer below the UI. Objective: ensure a specific type of business operation can go through all the steps successfully; very important to do detailed performance testing at this stage.
      - GUI Tests. Tools: Selenium/Sahi/Watir/White/SWTBot. Who: TechBAs + UX + testers (QA). When: it usually takes 3-4 iterations to complete one whole workflow including the UI; after that the UI tests are authored and executed. Level: end-to-end trade flow through all the steps, with GUI. Objective: ensure all the basic types of business operations can go through successfully via a UI.
  24. "Testing" Touch Points during the Project Lifecycle:
      - Product Discovery: collaborate to create the story map; help SMEs define acceptance criteria at each level in the story map.
      - Release Planning: define goals and readiness criteria for each release.
      - Theme Planning: ensure themes are coherent; create theme-level epics; author end-to-end flow tests.
      - Work-Stream Iteration Planning: author workflow tests; collaborate on identifying dependencies and documenting them; create stream-level epics; create integration tests.
      - Iteration Planning: validate acceptance criteria at story level; estimate tasks.
      - Iteration: collaborate with Dev and Biz to write automated acceptance tests; exploratory testing; buddy testing of previous-iteration tasks; UI automation; performance and security tests.
      - Iteration Review: show demo (parts); confirm with stakeholders that the business requirements are met; participate in iteration retrospectives.
      - Internal Milestone: participate in internal release retrospectives.
      - 1st Release / Release: sanity testing in ER cycles and in the live environment; confirm with the operations team that user goals are met; participate in release retrospectives.
      The Continuous Integration and build pipeline process executes all tests as part of the regression pack and reports various quality parameters to all stakeholders, alongside ongoing governance and test data management.
  25. Inverting the Pyramid: how to...?
  26. Technique 1: Acceptance Test / Behavior Driven Development
  27. Acceptance Test Driven Development. During the iteration, a user story's acceptance criteria are turned into automated acceptance tests, backed by automated unit tests, automated UI tests, and performance tests, and complemented by exploratory testing against the acceptance criteria.
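A sketch of what an automated acceptance test might look like, written Given/When/Then style in plain Python rather than Cucumber (the Account domain object and its rules are hypothetical, not from the talk):

```python
# Acceptance criterion (agreed before the iteration starts):
#   Given an account with a balance of 100,
#   When the user withdraws 30,
#   Then the balance is 70.

class InsufficientFunds(Exception):
    pass

class Account:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise InsufficientFunds(amount)
        self.balance -= amount

def test_withdrawal_reduces_balance():
    account = Account(balance=100)   # Given
    account.withdraw(30)             # When
    assert account.balance == 70     # Then

test_withdrawal_reduces_balance()
```

The value of ATDD is that the criterion is executable: business people, developers, and testers agree on the Given/When/Then before coding, and the same statement later verifies the implementation.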
  28. Technique 2: Acceptance Criteria Driven Discovery
  29. Acceptance Criteria Driven Product Discovery. Chartering (elevator pitch, business goals, pragmatic personas, persona goals, day-in-the-life scenarios and narratives of each persona) feeds story mapping (activity map, task map), UI sketching and interaction design, and planning (grouping by themes, prioritization, reiterating), ending in user story authoring (user stories with acceptance criteria). Participants: product managers/owners, SMEs, operations/support reps, tech BAs, user experience designers, app developers, architects, testers (QA + UAT), and project managers.
  30. Story mapping at SEP. Photo courtesy of Software Engineering Professionals, Indianapolis, IN. Src: Jeff Patton.
  31. Organize and visualize your product's release roadmap. Src: Jeff Patton.
  32. Technique 3: Continuous Integration Pipeline
  33. Typical Continuous Integration Process (diagram: builds, tests, Checkstyle, and reports).
  34. Continuous Integration Build Pipeline. Developers work in a dev box loop (update, local build ~5 min, check in to version control). The team CI server then checks out the code and runs a Smoke Build (~10 min, updating task statuses in project/task management) followed by a Functional Build (~30 min). A Cross-Stream Build (~60 min) publishes artifacts (jars) to dependency management; downstream, the Theme Build (~60 min, SIT env), Product Build (~60 min, QA env), UAT Build (~30 min, UAT env), Staging Build (~30 min, Staging env), and Live Build (~30 min, Prod env) each retrieve published artifacts and promote the build onward. Each build broadcasts build stats and other reports to all relevant stakeholders at each stage of the build pipeline.
  35. The builds in the pipeline (objective, duration, where, trigger/frequency):
      - Local Dev Build (~5 min, developer's workstation; before checking in code, at least every 30 mins): ensure the developer has checked in all files and the code compiles; compiles, unit tests, and static/dynamic code analysis on incrementally changed code.
      - Smoke Build (~10 min, team CI server; every check-in, at least every 30 mins): ensure each story's business logic is implemented correctly, all unit tests pass, and basic code quality stats are reported to the devs; only compiles and tests changed code (incremental); stubs/mocks out other modules or sub-systems.
      - Functional Build (~30 min, team CI server; on successful Smoke build, at least every 2 hours): ensure each story satisfies the doneness criteria at the story level; clean environment, database set up, team-specific biz acceptance tests run; stubs/mocks out other sub-systems.
      - Cross-Stream Build (~60 min, dev environment; on successful Functional build, at least twice each day): ensure each story integrates with other streams and business logic continues to work as expected; integrates promoted code with other team modules, runs cross-team workflow and integration tests in an incremental env; promotes jars to dependent streams.
      - Theme Build (~60 min, SIT environment; on successful Cross-Stream build, at least once each day): ensure each complete trade workflow works as expected, interacting with as many sub-systems and interfaces as possible; clean environment, dependent systems set up, theme-level acceptance tests executed with as few stubs/mocks as possible.
      - Product Build (~60 min, QA environment; on successful Theme build, at least once each week): ensure each implemented end-to-end feature works as expected, including performance- and security-related SLAs; migrates the existing system to the latest version, sets up dependent systems, executes product-level acceptance and UI tests with almost no stubs/mocks; also tests the migration scripts.
      - UAT Build (~30 min, UAT environment; each iteration): ensure the product as a whole can work with other systems in the enterprise and meets user goals; migrate the existing system to the latest version and let users test the system thoroughly.
      - Staging Build (~30 min, staging environment; each Enterprise Release cycle): ensure basic sanity before the release and that Operations can execute an end-to-end workflow; migrate the existing system to the latest version with the other systems in the Enterprise Release cycle; Go/No-Go decision.
      - Live Build (~30 min, production environment; on successful Staging build, manually triggered for now): deploy the configured and packaged product, with migration scripts, to the live environment.
      DevOps roles (currently missing) and a test data strategy are very important to get this working, and an overall governance model is required to make sure build failures are addressed immediately.
  36. Example Build Pipeline View (screenshot).
  37. Automate 100%
  38. Testing vs. Checking. Testing is explorative, probing, and learning oriented. Checking is confirmative (verification and validation of what we already know); the outcome of a check is simply a pass or fail result that requires no human interpretation. Hence checking should be the first target for automation. James Bach points out that checking does require some element of testing: creating a check requires an act of test design, and acting upon the result of a check requires test-result interpretation and learning. But it is important to distinguish between the two, because when people refer to testing they often really mean checking. Why is this distinction important? Michael Bolton explains it really well: "A development strategy that emphasizes checking at the expense of testing is one in which we'll emphasize confirmation of existing knowledge over discovery of new knowledge. That might be okay. A few checks might constitute sufficient testing for some purpose; no testing and no checking at all might even be sufficient for some purposes. But the more we emphasize checking over testing, and the less testing we do generally, the more we leave ourselves vulnerable to the Black Swan."
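A check, in this sense, is any verification whose outcome is a bare pass/fail needing no human judgement, which is exactly what makes it automatable. A minimal sketch (the slugify function under check is a hypothetical example):

```python
def slugify(title):
    """Hypothetical function under check: turn a title into a URL slug."""
    return title.strip().lower().replace(" ", "-")

# A check: it confirms what we already know and yields pass/fail only.
def check_slugify():
    return slugify(" Inverting The Testing Pyramid ") == \
        "inverting-the-testing-pyramid"

# Testing, by contrast, would be a human probing slugify() with odd
# inputs (unicode, empty strings, punctuation) and judging the results.
```

Checks like this belong in the automated pyramid; the exploratory judgement calls stay with humans.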
  39. Questions? Thank you. Naresh