Automated testers agile evangelist
Published in: Technology, Business
  • Remit was to prompt discourse and debate. Please try to save your biggest points and objections for the end of the session, when I will recap all my points.
  • All of these companies utilised some form of automation whilst I was on the project – to varying degrees of success.
  • SLIDE 3 This is the presentation where I talk myself out of a job. Do we need automation specialists? I'm going to contend that we don't. I believe that the value of having automated testers on your team, based on the problems I'm about to discuss, lies somewhere between low on the cost/benefit chart and detrimental.
  • SLIDE 4 - the world before agile. Ironically, I will argue for that role later on - part-time auto testers, or testers with the requisite skills who also test manually.
  • Ask any developer what the V model is. The V model demonstrates the relationship between the requirements, design and coding stages and the testing stages.
  • SLIDE 6 - Agile world. I’ve been harangued into meetings where testers are mandatory.
  • “Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.” “Deliver working software frequently.” Great - so finally we have an appreciation of the importance of good test automation. So why am I contending that automation specialists can be, in agile terms, 'an impediment'?
  • I’m not blaming auto testers per se – they work hard and have skilled themselves appropriately. Some of the issues tend to be symptoms of employing specialist auto testers – the testers themselves may be blameless.
  • SLIDE 9 Supply and demand and specialism. I hold my hands up - if I'm employed as an automation tester (a bit like a good performance tester), because I have a specialism I expect to get paid more than an average manual tester. Not necessarily a compelling reason for not employing automated testers. And this is justified if you are adding value. In many cases we are working on applications that may make millions. The difference between low and high quality can cost, as can undetected defects.
  • So we have predominantly Java, .NET and PHP devs - we can find them OK. But finding a Capybara tester?
  • SLIDE 12 We have gone from a few expensive proprietary tools to a proliferation of open source solutions of varying complexity and with varying support. For many technologies there is no clear favourite. When a new flavour of the month comes along, do you stick or twist? Ask what people are familiar with… I use SpecFlow, Coypu and Behat, Selenium WebDriver and occasionally Sahi. There are hundreds on GitHub vying for attention. Automation testers may come with baggage: favoured web testing frameworks and tools which may not sit smoothly with the existing skillset and technology, or may simply not be the best choice. EasyB, Tellurium, Concordion, Twist, Watir/WatiN etc.
  • SLIDE 13 - Silos, miscommunication and lack of communication. All this flies in the face of one of agile’s main benefits, which is close collaboration.
  • SLIDE 14 Symptoms of silos - Reporting features as defects
  • SLIDE 15 - Symptoms of silos - falling behindThis is already a symptom of iterative approaches that testers have to tackle even without automation.
  • SLIDE 17 - Symptoms of silos - dependency
  • SLIDE 18 - Symptoms of silos - fail to find defects early. 10x exponential cost, as noted by some clever guy – Barry Boehm.
  • SLIDE 19 - testers are not devs. This is how we’ve traditionally viewed each other. I would be interested to see how an automated tester is viewed.
  • SLIDE 20 - testers are not devs
  • SLIDE 21 - Interface level tests
  • SLIDE 22 - Interface level tests. There is a tendency to test at the interface level - such tests are more fragile and likely to fail. You should never marry a person based on looks alone, although there is some merit there. Test managers or testers may see some merit in interface tests, but they will be high maintenance and unreliable - just like your relationship. The tests below the surface will last longer and, just as in a relationship, will be deeper. Even the best automated testers cannot possibly know the service level as well as the person who created it.
  • The point of concentrating on the service level is that, theoretically, the presentation layer could change and the tests would not break, as the logic is fundamentally unchanged.
  • SLIDE 24 - who fixes failing tests? Leading on from robust and fragile tests…We have already mentioned dependencies (particularly on knowledge) as a product of silos….
  • SLIDE 25 – Refactoring. The devs will not have the safety net of the tests for refactoring - will the devs run the tests?
  • SLIDE 26 – Objectivity. Again - are you a dev or a tester? Where do they get the most satisfaction: finding issues or making tests work? Are we as analytical as we should be when working so closely?
  • SLIDE 27 - Solution - devs implement tests. Gojko, p.142.
  • SLIDE 28 - Solution – confused person image. So where does that leave the automated tester? Can an automated tester help a dev write the step definitions? I guess they can pair - but so can two devs. But a tester can help write the test? Yes, but you don't need to be an automated tester to do that.
  • SLIDE 29 Solution - what tester do you need? I was disparaging, but now I’m supportive of the part-time tester role.
  • SLIDE 30 Solution - what tester do you need? Have clarity about what is being tested and how (configurations etc) and, importantly, what is not being tested. How can you explore if you don't know what's in your backpack or where you have already been?
    1. AUTOMATED TESTERS: Essential or a luxury?
    2. (title image)
    3. Discussion
       • Waterfall and test automation
       • Agile and test automation – continuous delivery
       • Why automation testers may be detrimental to your agile project
       • What type of tester should we have on an agile project?
       • Q&A
    4. Waterfall and automation
       • QARun, WinRunner, QTP etc.
       • Feature rich but significant cost (licenses, support etc).
       • Cost associated with investing in the test automation team.
       • Biggest cost was subsequent maintenance.
       • Flaky tests not strongly coupled with functionality.
       • Frequently the team was made up of part-time auto testers.
    5. Are non-agile approaches compatible with test automation? (“Yer name's not down, yer not coming in!”)
       • Testers tried to get early involvement in projects (we promoted the V model) but we were frequently stopped at the door.
       • So when do we start test automation?
       • Testing seen as a service.
       • Frequently squeezed.
    6. Welcome to the wonderful world of Agile
       In an agile world testers are positively encouraged to be integral to the project.
       Lisa Crispin: “Where testing is treated as a separate “phase,” programmers may not be used to working closely with testers. Learning to treat testing as an integral part of the development process, as important as coding, helps programmers appreciate the feedback testers contribute.”
    7. Spotlight is back on test automation… CONTINUOUS DELIVERY
       “We know from experience that without excellent automated acceptance test coverage, one of three things happens: either a lot of time is spent trying to find and fix bugs at the end of the process when you thought you were done, or you spend a great deal of time and money on manual acceptance and regression testing, or you end up releasing poor quality software.” Jez Humble & David Farley
    8. Why Automated Testers may not be the answer to your project needs…
       • Cost of resource
       • Favoured test tools
       • Increased chance of miscommunication and silos
         • Reporting features as defects
         • Falling behind
         • Dependency on auto tester
         • Fail to find defects early
       • Automated testers are not developers
       • Interface level testing
       • Failing tests
       • Refactoring
       • Objectivity and critical approach
    9. Cost of resource = Automation should cost no more than it saves.
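That equation can be made concrete with break-even arithmetic. The numbers below are assumed purely for illustration; they are not from the slides:

```python
# Illustration (assumed figures): automation pays off only past a break-even run count.

def break_even_runs(build_cost, maintenance_per_run, manual_cost_per_run):
    """Number of test runs before automation costs less than repeated manual testing."""
    saving_per_run = manual_cost_per_run - maintenance_per_run
    if saving_per_run <= 0:
        raise ValueError("automation never pays back: upkeep exceeds manual cost")
    return build_cost / saving_per_run

# e.g. a suite costing 2000 to build, 20 per run to maintain,
# replacing a 120-per-run manual regression pass: 2000 / 100 = 20 runs.
runs = break_even_runs(2000, 20, 120)
```

A regression pack run on every build reaches that point quickly; a suite run once per release may never repay its maintenance cost.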
    10. Jobserve search… developers
        • .NET: 2598
        • Java: 1929
        • PHP: 663
    11. Jobserve search… testers
        • Capybara: 1
        • Cucumber: 51
        • Selenium: 164
        • Test: 3197
    12. Favoured test tools…
    13. Silos and miscommunication
        • Disrupts collaboration – creates disconnects
        • Encourages handover culture of gateways and separate phases
        • Both devs and testers may see test automation as a separate function
        • Tests are not driving development
        • Manual testers may not have a full knowledge of automation coverage.
    14. Symptom of silo – features become defects
        • Detecting spurious defects when actually the automated test has found a new feature.
        • Old problem of working with an automated regression pack.
    15. Symptom of silos – falling behind
        • Automation testers can find themselves lagging significantly behind the development or even project effort.
        • Are you even testing the right build?
        • Sometimes the automated tester will find real defects as s/he is writing the tests. But the dev team have moved on to another story/sprint.
        • All too frequently the testing is not planned in the estimations, or even when it is, it invariably falls behind and rolls over to the next sprint.
        • Other tasks disrupt the automation effort – like helping with the manual test effort.
    16. Symptom of silos – falling behind
        “Automated acceptance tests have to be created, owned and maintained by the delivery team - this should be the default position of all projects. To maintain the focus of the developers on the behaviour of the application, it is important that the acceptance tests are not owned by a separate test team. Otherwise you will find that the test team rapidly become snowed under with tests to repair as well as work to implement new tests for the new requirements that the developers were implementing.”
        Humble and Farley
    17. Symptom of silo - dependency
        • Dependency on the tester - to run tests, write tests, fix tests etc – then no one else understands them.
        • Manual testers may not have the skills, may not know the tests well enough, or are simply reluctant to get involved.
        • Developers don't see it as their job to write acceptance tests.
    18. Symptom of silo - fail to find defects early
        • We should detect bugs early.
        • Without coding against the tests we are losing the dynamic process of capturing defects early.
        • We lose the benefits of BDD/ATDD and early defect detection. Defects not captured early in the process cause a lot of pain… and money!
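The cost of that late detection, following the roughly tenfold-per-phase growth the speaker attributes to Barry Boehm, can be sketched numerically. Phase names and the base cost are assumed for illustration:

```python
# Illustrative arithmetic (assumed figures): Boehm-style 10x cost growth per phase.

PHASES = ["requirements", "design", "coding", "system test", "production"]

def fix_cost(base_cost, phase):
    """Cost to fix a defect grows ~10x for every phase it goes undetected."""
    return base_cost * 10 ** PHASES.index(phase)

# A defect that costs 10 to fix at requirements time
# costs 10 * 10**4 = 100000 if it reaches production.
```

The exact multiplier is debated, but the shape of the curve is the point: every handover a defect survives makes it an order of magnitude more expensive.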
    19. Testers are not developers
    20. Testers are not developers…?
        • Additional overhead – the auto tester has to have not just automation skills but development infrastructure skills to keep their build current.
        • e.g. developers update their local builds (to use .Net 5, switch to a Mongo DB, make changes to Maven/POM, a new MVC version etc) and the auto tester is not in the loop.
        • Stay up to date with branching strategies.
        • Otherwise the automated tester starts finding spurious issues.
        • Is it a communication problem? What team do they fall under - do they attend developer meetings and retrospectives, join dev emails as well as tester groups?
    21. Interface level testing – Mike Cohn's Automation pyramid
    22. Interface level testing
        • Presentation layer: high maintenance relationship
        • Service layer: deeper, more reliable
    23. Shallow presentation layer… Replace the presentation layer and the service layer tests stay the same…
    24. Fixing failing tests…
        • What happens when tests fail - who reacts to the tests and makes sure they are working?
        • We've already established that the automated tester or team may be busy trying to keep up with new stories, so how much time will they need to dedicate to failing tests or builds?
        • Tests are set up to fail and alert developers and the team. But frequently they backfire on the automated testers, who are held responsible for fixing them. More time may be spent fixing tests than detecting defects.
    25. Refactoring
        Without a shared knowledge of the tests we may lose the benefit of confident refactoring. Ideally you want the tests to be run in a suite of build tasks.
    26. Objectivity and critical approach…
        • We all want the sprint to burn down its story points, right?
        • We lose the critical approach. Not exclusive to auto testers but also embedded testers.
        • Are automated testers more focused on writing good tests and making those tests pass, or on actually testing?
        • Some automated tests are written to work around the system, i.e. make it pass, rather than really test the functionality.
    27. Solutions…?
        First and foremost, developers should ‘implement’ and ‘pass’ the acceptance tests.
        Can automation be successful outside of BDD/ATDD?
        “When automation is done along with implementation, developers have to design the system to make it testable. When automation is delegated to testers or consultants, developers don't take care to implement the system in a way that makes it easy to validate. This leads to more costly and more difficult automation. It also causes tests to slip into the next iteration, interrupting the flow when problems come back.”
        Gojko Adzic
    28. So where does that leave the automated test specialist?
        • Can an automated tester help a developer write the step definitions? Yes, they can pair on the test - but so can two developers.
        • The tester can help write the test? Yes - but you don't need to be an automated tester to do that.
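To make "step definitions" concrete for readers who haven't used SpecFlow, Cucumber or Behat, here is a deliberately minimal hand-rolled version of the same idea. The scenario and step names are hypothetical; real BDD frameworks add pattern matching, parameters and reporting on top of this shape:

```python
# Minimal sketch of the step-definition pattern that SpecFlow/Cucumber/Behat provide.
# This is the artefact a tester and a developer can pair on: plain-language Gherkin
# steps on one side, executable glue code on the other.

STEPS = {}

def step(text):
    """Register a function as the implementation of a Gherkin step."""
    def register(fn):
        STEPS[text] = fn
        return fn
    return register

context = {}  # shared state between steps, like a framework's scenario context

@step("Given a basket with 2 items")
def given_basket():
    context["basket"] = ["tea", "milk"]

@step("When I add an item")
def when_add():
    context["basket"].append("bread")

@step("Then the basket has 3 items")
def then_count():
    assert len(context["basket"]) == 3

def run_scenario(lines):
    """Execute each Gherkin line via its registered step definition."""
    for line in lines:
        STEPS[line]()

run_scenario([
    "Given a basket with 2 items",
    "When I add an item",
    "Then the basket has 3 items",
])
```

Writing the `Given/When/Then` lines needs testing skill, not automation skill; wiring them to code is development work. That split is the speaker's point: the glue does not require a dedicated automation specialist.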
    29. Agile tester…
        Lisa Crispin: “I'm often asked: If programmers are doing these tests, why do software teams need specialized testers? The answer lies in the particular skills that testers bring to the party.”
        Reprise for the part-time automated testers of yore?
    30. Agile tester… wish list
        • Understanding of test automation – clarity of what is being tested and, importantly, what is not being tested
        • Ability to extend acceptance criteria and write good feature files – Gherkin skills
        • Strong character - insist on a definition of done
        • Technical ability to extend automated scenarios, write auto regression packs / deployment smoke tests, fix tests, pair
        • Exploratory test skills
        • Energetic, confident, and energetic again
    31. Thanks for listening… Over to you…
    32. Q&A
        How many automation testers are on your team? How does it work for your project?
        • Cost of resource
        • Favoured test tools
        • Increased chance of miscommunication and silos
          • Reporting features as defects
          • Falling behind
          • Dependency on auto tester
          • Fail to find defects early
        • Automated testers are not developers
        • Interface level testing
        • Failing tests
        • Refactoring
        • Objectivity and critical approach