Dr. Ronen Bar-Nahor - Optimizing Agile Testing in Complex Environments
Transcript

  • 1. Optimizing (Agile) Testing in a Complex Environment, Dr. Ronen Bar-Nahor
  • 2. The challenges in a complex environment
    • Legacy systems
    • Spaghetti code
    • Dependencies
    • Technical debt (quality, architecture, etc.)
    • Minimal automation at the system level (GUI-driven)
    • Low level of unit testing
    • "Integration hell"
    • QA plays defense:
      – Hunting defects
      – Gates, sign-offs
      – Documentation-driven
      – Keeps automation to itself
  • 3. Some of the most common challenges for QA in Agile
    • The manifesto
      – Working software over documentation
      – Responding to change over following a plan
      – Interactions over processes and tools
    • Shared team ownership of quality
    • How to be "part of the team" while still
      – maintaining testing as a center of expertise
      – keeping loyalty to the business
    • Loss of the "big picture"
    • Testing on non-stable software
    (diagram: the tester pulled between the Agile team, the QA group, and the business)
  • 4. What is one of the problems in waterfall testing? (http://www.slideshare.net/nashjain/role-of-qa-and-testing-in-agile-presentation)
  • 5. In waterfall reality: integration hell. So, Agile is all about early feedback. (http://www.slideshare.net/nashjain/role-of-qa-and-testing-in-agile-presentation)
  • 6. How early do we test? (diagram: when a Potentially Shippable Product, PSP, emerges at each maturity level)
    – Waterfall (Level 0): one PSP, only at the release
    – Scrum Level 1: sprints 1-4, PSP at the end
    – Scrum Level 2: sprints 1-4, a PSP after every sprint
    – Scrum Level 3: sprints 1-4, a PSP after every sprint, R&D and QA working together
    © Copyright of AgileSparks LTD
  • 7. But isn't it more expensive to take many small chunks? It might be less locally efficient, but it's cheaper overall!
    • The hidden cost of "holding" for too long is higher (1:10)
      – Dev context switches
      – Defect reproduction
      – Shaking the system
      – Management
    • Testing early exposes project risks and increases business flexibility
  • 8. So what is the real challenge? (chart: effort vs. time)
  • 9. So we want to get to this (burndown/burnup chart over time: DONE (burnup), actual remaining effort, and planned remaining effort)
  • 10. And to avoid the "brown cloud": the stuff we defer each iteration from "Done" to "Done Done" (defects, refactoring, non-functional testing, user documentation, UAT...). Shippable product?
  • 11. So how can we avoid the effort spike over time, and achieve a PSP in every sprint (sprints 1-4)?
  • 12. Start with the user stories: BIG features...
  • 13. ...that take very long to get to testing. Longer iterations?
  • 14. ...broken down to smaller chunks that can quickly flow to test.
  • 15. And to make the story testable:
    • INVEST
    • Break stories effectively
      – "Mona Lisa" (working system)
      – "Elephant Carpaccio" (working feature)
    • Ready-ready stories!!!
    • Do not forget the "big picture": run integration tests as early as possible (at the epic level)
  • 16. Make the architecture testable
    • QA is involved in this phase!
    • Every feature goes through some high-level design/architecture
    • Ensure the architecture allows and supports:
      – Automation
      – Good reporting for problem analysis
      – Isolation, componentization, and low dependencies
      – A mocking framework
    • Discuss the strategy for testing the architecture and the system
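The slide above asks the architecture to support isolation and a mocking framework. A minimal sketch of what that buys, using Python's standard `unittest.mock`; the `InvoiceCalculator` component and its injected rate provider are invented for illustration:

```python
from unittest.mock import Mock

# A component written against an injected collaborator (dependency
# injection), so a slow or legacy rate system can be mocked in tests.
class InvoiceCalculator:
    def __init__(self, rate_provider):
        self.rate_provider = rate_provider  # injected, not hard-wired

    def total(self, customer_id, hours):
        rate = self.rate_provider.hourly_rate(customer_id)
        return rate * hours

# In a test, the real rate system is replaced by a mock:
rates = Mock()
rates.hourly_rate.return_value = 50

calc = InvoiceCalculator(rates)
print(calc.total("acme", 3))  # 150

# The mock also records how it was used:
rates.hourly_rate.assert_called_once_with("acme")
```

Because the dependency is injected rather than instantiated inside the class, the test runs in isolation with no environment at all, which is exactly what the "isolation, componentization, low dependencies" bullet is after.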
  • 17. Challenge the automation strategy
    • Manual GUI: 5-10%
    • ATDD (Acceptance Test Driven Development): 20%
    • Unit and component (integration): 70%
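A sketch of the kind of test that belongs in the 70% base of this pyramid: fast, in-process, no GUI and no environment. The `shipping_cost` function is a hypothetical stand-in for production code:

```python
# Hypothetical function under test.
def shipping_cost(weight_kg, express=False):
    base = 5.0 + 1.2 * weight_kg
    return base * 2 if express else base

# Base-of-the-pyramid unit tests: milliseconds to run, trivial to
# automate, so they can make up the bulk of the coverage.
def test_standard_shipping():
    assert shipping_cost(10) == 17.0

def test_express_doubles_cost():
    assert shipping_cost(10, express=True) == 34.0

test_standard_shipping()
test_express_doubles_cost()
print("unit tests passed")
```

Tests like these are what make the 70% figure affordable; the same checks driven through a GUI would be orders of magnitude slower and more brittle.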
  • 18. A practical, incremental approach to automation (diagram, four stages: a legacy system moves from malfunctioning code, to low-quality code covered by automated tests, to high-quality code covered by automated tests, as each round of new features and refactored code gains new test coverage, starting from sanity and risky areas)
    • Apply automation as new features are added (minimize debt)
    • Automate sanity and risky areas with an independent team
    • Manual regression testing only after risk analysis
  • 19. Integrate the system continuously: a staged approach
    • Developer: get latest, local build + code analysis, code + build + unit testing, check-in
    • Product build: build and package, unit testing, deploy and test, integration/merge, code-quality checks
    • Cross-product: end-to-end flows, acceptance/system tests, profiling, log analysis; early drops; failure reports
    • User acceptance test: pick up and deploy until stable, test
  • 20. Meanwhile, make it manual: continuous mini-hardening by an independent integration team (diagram: short iterations interleaved with mini-hardening periods, ending in a full hardening)
  • 21. ATDD: tests represent expectations and drive out ambiguity
    • Turn your user stories into acceptance tests (ready-ready stories)
    • Preferably those tests will eventually become automated within the sprint, so there is no need for a feature freeze
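Turning a user story into an executable acceptance test, as the slide suggests, can be sketched in plain code; the story, the `Cart` class, and the discount code below are all invented for illustration:

```python
# Hypothetical story: "As a shopper, I can apply a discount code
# at checkout." The Cart class stands in for the system under test.
class Cart:
    DISCOUNTS = {"SAVE10": 0.10}

    def __init__(self):
        self.items = []
        self.discount = 0.0

    def add(self, price):
        self.items.append(price)

    def apply_code(self, code):
        self.discount = self.DISCOUNTS.get(code, 0.0)

    def total(self):
        return sum(self.items) * (1 - self.discount)

def test_discount_code_reduces_total():
    # Given a cart holding 100.0 worth of items
    cart = Cart()
    cart.add(60.0)
    cart.add(40.0)
    # When the shopper applies SAVE10
    cart.apply_code("SAVE10")
    # Then the total is reduced by 10%
    assert cart.total() == 90.0

test_discount_code_reduces_total()
print("acceptance test passed")
```

The Given/When/Then comments are the acceptance criteria written down before coding; once they pass, the story is demonstrably done, which is what removes the ambiguity the slide mentions.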
  • 22. ATDD also makes progress visible. Are we there yet? Track progress based on the testing progress; no need for a traceability matrix!!! (chart: stories implemented vs. tested)
  • 23. What practices do we use during an iteration?
    • The whole-team approach
      – Practice collective test ownership
      – Common goals
      – One team, partnership!
  • 24. Dev-QA collaboration
    • Plan together
      – Identify the steel thread: the key slice through the story (usually the happy path)
      – Continue to the other slices
    • Share test plans (unit/functional)
    • Automation...
    • Visualize defect status
  • 25. Stop and fix; stop starting, start finishing
    • When do we need to stop and fix defects?
      – Does it block somebody or break the system (CI)?
      – Can it go to production?
      – Did we pass a threshold/one-pager?
      – Does it violate the release criteria?
    • Done is done
      – Associate defects to the story
      – 70/100-100/80
    • All remaining tests are in the backlog!
      – System flows at the epic level
    • Measure cycle time!!!
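Measuring cycle time, as the last bullet urges, can start as simply as subtracting each story's start date from its done date; the story IDs and dates below are made up for illustration:

```python
from datetime import date

# Invented tracking data: when work on each story started and
# when it reached "done done".
stories = [
    {"id": "S-1", "started": date(2012, 3, 1), "done": date(2012, 3, 6)},
    {"id": "S-2", "started": date(2012, 3, 2), "done": date(2012, 3, 12)},
    {"id": "S-3", "started": date(2012, 3, 5), "done": date(2012, 3, 9)},
]

# Cycle time per story, in days, and the running average.
cycle_times = [(s["done"] - s["started"]).days for s in stories]
average = sum(cycle_times) / len(cycle_times)

print(cycle_times)                      # [5, 10, 4]
print(f"average: {average:.1f} days")   # average: 6.3 days
```

A rising average is an early warning that WIP is piling up, which connects directly to the Kanban slides that follow.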
  • 26. The key is to control the Dev-QA gap at the release level (timeline diagram: per-feature requirements/design/code/test cycles running from coding freeze through "feature done" to DONE, followed by hardening)
  • 27. Kanban: managing the end-to-end flow
    • Visualize the workflow
    • Explicit policies
    • Limit WIP
    • Measure and optimize flow
    (board sketch: the Dev "Done" column is empty while Test is almost full; the WIP limit means no new Dev work can start, and a lot of WIP bubbles up downstream)
    • Classic "solution": add more testers
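One way to picture the "Limit WIP" policy in code: a board column that refuses to pull a new card once its limit is reached, forcing the team to finish before starting. The column names and limits are illustrative:

```python
# A Kanban column with an explicit WIP limit.
class Column:
    def __init__(self, name, wip_limit):
        self.name = name
        self.wip_limit = wip_limit
        self.cards = []

    def pull(self, card):
        # Explicit policy: no new work may enter a full column.
        if len(self.cards) >= self.wip_limit:
            raise RuntimeError(
                f"WIP limit {self.wip_limit} reached in {self.name}: "
                "finish something before starting new work")
        self.cards.append(card)

dev = Column("Dev", wip_limit=2)
dev.pull("story-1")
dev.pull("story-2")

try:
    dev.pull("story-3")   # a third story violates the limit
except RuntimeError as e:
    print(e)
```

The refusal is the point: a blocked pull is the moment a developer asks "how can I help current stories?" instead of starting new ones, which is exactly what the next slide lists.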
  • 28. What limited WIP does (board sketch: blocked by the WIP limit from starting new Dev work, developers ask "How can I help current stories?")
    • Fix open defects on our stories
    • Help us with a blocker
    • Help us automate tests for this story
  • 29. How do we visualize the work status in more depth? (cumulative flow diagram: TODO and Done bands over time; the vertical gap between them is the work in process (WIP), the horizontal gap is the average cycle time, and the Done line is the burnup)
  • 30. Elaborating the WIP (cumulative flow diagram with TODO, Dev, Test, and Done bands: a wide Test band means heavy load on QA; a wide Dev band means heavy load on Dev)
  • 31. If we use a WIP limit... (cumulative flow diagram with TODO, Dev, Test, and Done bands: the WIP band stays narrow and the average cycle time shortens)
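The relationship the last three diagrams rely on is Little's Law: average cycle time equals average WIP divided by throughput, so capping WIP at a steady throughput directly shortens cycle time. A quick numeric sketch with invented numbers:

```python
# Little's Law ties the three quantities on a cumulative flow diagram:
#   average cycle time = average WIP / throughput
# The figures below are illustrative, not from the presentation.
avg_wip = 12.0      # stories in process, on average
throughput = 3.0    # stories finished per week

cycle_time = avg_wip / throughput
print(f"average cycle time: {cycle_time} weeks")      # 4.0 weeks

# Halving WIP at the same throughput halves cycle time:
print(f"with a WIP limit of 6: {6.0 / throughput} weeks")  # 2.0 weeks
```

This is why the slide's WIP-limited diagram shows a narrower band and a shorter horizontal gap: the law holds for any stable flow, regardless of team size.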
  • 32. The next level: tracking the upstream as well
  • 33. Good luck!!! ronen@agilesparks.com, 052-5522749
  • 34. Summary: so how can we avoid the effort spike over time, and achieve a PSP in every sprint (sprints 1-4)?
