7 Deadly Sins of Agile Software Test Automation

  1. 7 Deadly Sins of Agile Software Test Automation Craig Smith
  2. Welcome… Image: © Gracie Films / 20th Century Fox Television
  3. Adrian Smith @adrianlsmith
  4. Geeks Hate Repetition
  5. It Started With… Image: © New Line Cinema © United Plankton Pictures
  6. Discussion: What is stopping you from being great at automation?
  7. (image)
  8. Building Quality In
  9. Automated Testing
  10. Envy: Flawed comparison of manual testing & automation
  11. How Management Sees Testing
  12. How Management Would Like To See Testing
  13. Repeat After Me… Agile ≠ Automated Testing
  14. Software is a Series of Loops
  15. Discussion: How Much Automated Testing Is Enough?  End-to-end automated tests cover 10 conditions per test; manual tests cover 1 condition per test  Automated tests cover all high-priority business flows; manual tests mostly cover lower-priority flows  Metrics need context! (chart: % of regression test cases automated, automated vs manual)
  16. Manual vs Automated… A Flawed Comparison
  17. Testing is more than a series of merely repeatable actions
  18. Look Left… Look Right…
  19. Look Up!
  20. CSI: Software Image: © Jerry Bruckheimer Television
  21. Ideal Automation Targets  Regression testing – assessing current state  Automation of test support activities  Data generation / sub-setting  Load generation  Non-functional testing (performance, security, ...)  Deterministic problems  Big data problems
  22. Common “Envy” Symptoms  Relying on automation as the basis for all testing activities  All tests are built by developers  Absence of code reviews  Absence of exploratory testing  Absence of user testing
  23. Approach: Desired Role
   Automation: I work with developers to automate tests that provide business value and identify system risks
   Collaboration: I work with analysts and SMEs to ensure that testable acceptance criteria are created for all stories
   Strategy & Planning: I am involved in the project at all stages to ensure that testing provides the greatest value and quality objectives are achieved
   Tools & Techniques: I use both manual and automated techniques, using the preferred testing tools based on the situation
   Architecture: I have an understanding of the system architecture and can create tests that verify individual components and the system as a whole
   Development: I take an interest in development practices and monitor code quality metrics
   Estimating: I am involved in developing estimates for projects at a story level, for tools and infrastructure, and also for deployment/release activities
   Reporting: I provide metrics that give insight into project health and system quality
   Agile: I understand Agile project delivery and the differences between testing in the different phases of a project
   Qualifications & Training: I have recognised qualifications in testing and continually update and maintain my skills
   Recruitment & Development: I am attracted to the organisation because testing is a cool career path that offers heaps of opportunities and a way of continually developing my skills
   Community & Teams: I am part of a community of testers who are embedded within teams but share common values
  24. Approach: Report on Confidence
  25. “Envy” Lessons Learned  Avoid comparison between manual and automated testing - both are needed  Distinguish between the automation and the process that is being automated  Use automation to provide a baseline  Use automation in conjunction with manual techniques
  26. Gluttony: Over-indulging on commercial test tools… Image: © Universal Pictures
  27. Discussion: What Automated Tools Do You Use & Why?
  28. Some vendors believe they have the “secret sauce” for automation
  29. Promise of Automation
  30. Commercial Tools Restrict Usage
  31. Justifying The Expense I have spent a lot of money on this tool, we better be using it for everything! Image: © Warner Bros Television Distribution / Screen Gems
  32. Incompatible Technology  Underlying commercial tool technology is often not compatible with the development tool chain  Special file formats or databases  Lack of version control for tests and/or tests cannot be versioned with the software  Not easily combined with Continuous Integration  Not easily adapted or extended by developers
  33. Common “Gluttony” Symptoms  A commercial tool forms the basis of the testing strategy  Only certain teams or individuals can access the tool or run tests  Developers have not been consulted in the selection of testing tools  “We always use <insert tool-name> for testing!”
  34. Approach: Tools Guide
  35. Approach: Craftsmanship Image: © Wild Dancer Productions / Touchstone Television
  36. “Gluttony” Lessons Learned  Favour open source tools wherever possible  Use tools that can easily be supported by the development team and play nicely with the existing development tool chain  Ensure any commercial tools can be executed in command-line mode so they can be automated  Educate!
  37. Lust: User interface forms the basis for all testing…
  38. Traditionally Test via the UI
  39. Investment Profile (test pyramid diagram: manual/exploratory tests at the top, giving confidence; collaboratively built tests around system behaviour in the middle; developer-built tests at the base, optimised for speed and fast feedback, exercising components and systems)
  40. Understand The Architecture
  41. Test Design
  42. FIRST Test Properties: Fast, Isolated, Repeatable, Self-Verifying, Timely
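The FIRST properties above can be sketched in a few lines of code (plain Python; the `Cart` class and its test are hypothetical, invented purely for illustration):

```python
# A tiny test illustrating the FIRST properties.
# `Cart` is a hypothetical example class, not code from the talk.

class Cart:
    """Minimal shopping cart, just to give the test something to verify."""

    def __init__(self):
        self.items = []

    def add(self, price, qty=1):
        self.items.append((price, qty))

    def total(self):
        return sum(price * qty for price, qty in self.items)


def test_cart_total():
    # Fast: pure in-memory logic, no I/O, database or network.
    # Isolated: the test builds its own Cart and shares no state.
    cart = Cart()
    cart.add(price=10, qty=2)
    cart.add(price=5)
    # Repeatable: the same result on every run, on any machine.
    # Self-verifying: the assert passes or fails without human inspection.
    assert cart.total() == 25


# Timely: written alongside (or before) the code it verifies.
test_cart_total()
```

A suite in which every test keeps these properties stays fast and reliable enough to run on every commit.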
  43. Common “Lust” Symptoms  Testers cannot draw the application or system architecture  Large proportion of tests are run through the UI  Testers have limited technical skills  No collaboration with developers  Intent of tests is unclear
  44. Discussion: What technologies do you need to understand to automate? Image: © Paramount Pictures
  45. Approach: Test Automation Course – Agile Testing, Command Line Interface, Continuous Integration, Version Control, Build Tools, Capture/Replay (Selenium), HTML, Test Maintenance & Data, Specification by Example / ATDD / BDD, Concordion / Cucumber, SQL, Web Services
  46. Approach: Use UI Tools for the Right Purpose  Use tools like Selenium sparingly; they are good for exploratory testing  Only use tools like HTMLUnit, or scrape screens, when no other options are available
  47. “Lust” Lessons Learned  Limit the investment in automated tests that are executed through the user interface  Collaborate with developers  Focus investment in automation at the lowest possible level with clear test intent  Ensure automation gives fast feedback
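The "lowest possible level" advice can be made concrete with a sketch: instead of scripting a browser to check a business rule, call the layer below the UI directly (the `apply_discount` function is a hypothetical service-layer example, not from the talk):

```python
# Testing below the UI: exercise the business rule directly rather than
# driving a browser through the screens that happen to expose it.
# `apply_discount` is a hypothetical service-layer function.

def apply_discount(order_total, code):
    """Return the payable amount after applying a discount code."""
    discounts = {"SAVE10": 0.10, "SAVE25": 0.25}
    rate = discounts.get(code, 0.0)
    return round(order_total * (1 - rate), 2)


# A UI test of the same rule would start a browser, log in, fill a form
# and scrape the result; these checks express the same intent in
# milliseconds and break only when the rule itself changes.
assert apply_discount(100.0, "SAVE10") == 90.0
assert apply_discount(100.0, "BOGUS") == 100.0  # unknown codes change nothing
```

The test intent (which codes discount by how much) is explicit, which is exactly what gets lost when the same rule is verified through recorded UI clicks.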
  48. Pride: Too proud to collaborate when creating tests…
  49. Poor Collaboration = Poor Tests
  50. Poor Collaboration or Dedicated Roles = Duplication Image: © Universal Pictures
  51. No Definition of Quality
  52. Discussion: What is something you regard as quality?
  53. Good Collaboration (diagram: on a high-performing agile project, the analyst/customer, developer, tester and project manager collaborate on elaboration/specification, acceptance criteria and automation)
  54. Skills (diagram: a less-technical to more-technical spectrum across analyst, tester and developer: analysts cover requirements and customer collaboration; testers cover manual and exploratory testing; developers cover code, design, unit tests and automated functional/specialist tests. Developers need more testing involvement, testers need more technical involvement, and analysts need more testing involvement)
  55. ATDD & TDD (diagram: nested red/green/refactor cycles, with an outer ATDD loop wrapping inner TDD loops)
  56. Specification by Example
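Tools like Concordion and Cucumber bind worded specifications to code; the idea can be sketched without a framework by letting the concrete values in a Given/When/Then test double as the specification (the `Account` class is hypothetical, for illustration only):

```python
# Specification by example, sketched without a BDD framework: the test is
# written as Given/When/Then and the example values serve as the spec.
# `Account` is a hypothetical domain class.

class Account:
    def __init__(self, balance):
        self.balance = balance

    def withdraw(self, amount):
        if amount > self.balance:
            raise ValueError("insufficient funds")
        self.balance -= amount


# Given an account with a balance of 100
account = Account(balance=100)
# When the customer withdraws 30
account.withdraw(30)
# Then the remaining balance is 70
assert account.balance == 70
```

In Cucumber or Concordion the Given/When/Then lines live in a plain-language specification the whole team can read, and glue code binds each line to steps like these.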
  57. Common “Pride” Symptoms  Automated tests are being built in isolation from the team  Intent of tests is unclear or not matched to quality  Poor automation design (abstraction, encapsulation, ...)  Maintainability or compatibility issues
  58. Approach: Defining Quality  Quality Advocates – What does quality mean to the different roles in the team?  Quality Taxonomy – What are the quality attributes?  Quality Prioritisation – How do we know which quality attributes to include?  Quality Tradeoff Risks – What are the risks with the quality attributes we are trading off?  Quality Measurement – How do we test and measure quality?  Success Sliders – How does quality relate to the sliders?  Quality Definition – What does quality mean?  Next Steps – How do we apply quality to our work?
  59. Approach: Disband Testing Teams
  60. Approach: Simple Strategy
  61. “Pride” Lessons Learned  Collaborate to create good tests and avoid duplication  Limit the investment in UI-based automated tests  Collaborate with developers to ensure good technical practices (encapsulation, abstraction, reuse, ...)  Test code = Production code
  62. Sloth: Too lazy to properly maintain automated tests…
  63. Many Causes of Automated Test Failure (timeline diagram: new feature, system interface change, OS patch, reference data changes over time)
  64. Importance of Maintenance (chart: cost/effort over time for manual test execution, maintained automation and unmaintained automation, showing the potential value of a maintained automated test suite versus an unmaintained one)
  65. Discussion: How often are you red? How do you know? Who cares?
  66. Continuous Integration (pipeline: watch code, build/compile, run tests/analysis, publish results; on failure, stop the line!)
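The pipeline on this slide can be sketched as a loop of stages that halts at the first failure; the stage functions below are hypothetical stand-ins for real build, test and publish steps:

```python
# A minimal "stop the line" pipeline: stages run in order and the first
# failure halts the build. Each stage is a hypothetical stand-in.

def compile_build():
    return True  # stand-in for a real compile/build step

def run_tests():
    return True  # stand-in for the automated test suite

def publish_results():
    return True  # stand-in for publishing reports to a build radiator

def run_pipeline(stages):
    """Run (name, stage) pairs in order; stop the line on first failure."""
    for name, stage in stages:
        if not stage():
            print(f"STOP THE LINE: stage '{name}' failed")
            return False  # nothing downstream runs until this is fixed
    return True

ok = run_pipeline([
    ("build", compile_build),
    ("test", run_tests),
    ("publish", publish_results),
])
assert ok  # a green run means the system is in a known good state
```

The key design point is that a red stage blocks everything after it, which is what makes the build status worth watching.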
  67. Continuous Delivery
  68. #notesting
  69. Common “Sloth” Symptoms  Test suite has not been recently run; state is unknown  Continuous Integration history shows consistent failures following development changes / releases  Test suite requires manual intervention  Duplication within automation code  Small changes trigger a cascade of failures
  70. Approach: Visible Status / Stop The Line
  71. “Sloth” Lessons Learned  Ensure automated tests are executed using a Continuous Integration environment  Ensure tests are always runnable, even if the system is not being actively developed  Make test results visible to create transparency of system health  Ensure collaboration between developers and testers
  72. Rage: Frustration with slow, brittle or unreliable tests… Image: © 20th Century Fox
  73. Fast Feedback
  74. Discussion: What is a good… Time benchmark for a build? Amount of coverage?
  75. No Waiting…
  76. 100%*
  77. Common “Rage” Symptoms  Slow automated tests (large datasets, unnecessary integrations, inadequate environments, too many UI/manual tests)  Brittle tests (time-bound data, external/production integrations, reliance on UI/sequence)  Unreliable tests (false positives/confidence, failures being ignored, workarounds investigated)
  78. Approach: BVC (big visible charts of project, development, testing and user health per iteration: risks & issues raised, test coverage as tests vs defects, maintainability as lines of test code vs lines of code, performance, and overall business value vs number of features)
  79. Approach: Cloud / Central / Open Source ALM
  80. “Rage” Lessons Learned  Treat automated tests with the same importance as production code  Review, refactor, improve ...  Apply a “stop the line” approach to test failure  Eliminate (quarantine) unreliable tests  Ensure collaboration with developers
  81. Greed: Trying to cut costs through automation… Image: © 20th Century Fox
  82. Lure of saving labour
  83. Automation is not cheap!
  84. Common “Greed” Symptoms  Investment in commercial tools using a business case based on reducing headcount  Using a predicted ROI as a way of reducing the budget for testing  Consolidating automated testing within a special group Image: © Gracie Films / 20th Century Fox Television
  85. Approach: Assist & Assess (diagram: current test analysts are assessed as capable + current skills + desire, capable + desire, not capable + desire, or not capable + no desire, then up-skilled into a new Software Test Engineer role alongside new graduates, recruits, consultants & partners, BAs, SMEs and others; assessment = technical test + interview)
  86. Approach: Outsource Teams, Not Testing
  87. Approach: Avoid Test Automation Projects
  88. “Greed” Lessons Learned  Ensure the reasons for automation are clear and are NOT based purely on saving money/headcount  Ensure the business case for automation includes costs for ongoing maintenance
  89. Quality is Key
  90. Approach: Quality Assessment
  91. Discussion: Test Execution (maturity scale from -1 to 4)  Tests are NOT executed on a regular basis  State of current defects is unknown  Functional testing occurs regularly  All testers can execute the functional tests  Production verification testing is used to ensure success  A majority of functional tests have been automated  Exploratory testing forms part of test execution  Customers verify implemented features prior to deployment  A majority of non-functional tests (performance, reliability, ...) are completed prior to deployment, many are automated  Testers pair with developers to automate tests  Customers verify implemented features as they are completed  Functional and non-functional testing occurs continuously within development iterations  Developers and testers are performing test-first practices
  92. Wrap Up
  93. Envy Flawed comparison of manual testing & automation Gluttony Over indulging on commercial test tools Lust User interface forms the basis for all testing Pride Too proud to collaborate when creating tests Sloth Too lazy to maintain automated tests Rage Frustration with slow, brittle or unreliable tests Greed Trying to cut costs through automation 7 Deadly Sins
  94. Why Use Automation?
  95. Image: © Fuzzy Door Productions / 20th Century Fox Television Don’t Lose Sight of the Goal
  96. How Geeks Can Work Together
  97. Questions? Craig Smith @smithcdau