7 Deadly Sins of Agile Software Test Automation
Automated software testing is a key enabler for teams wanting to build high quality software that can be progressively enhanced and continuously released. To ensure development practices are sustainable, automated testing must be treated as a first-class citizen and not all approaches are created equal. Some approaches can accumulate technical debt, cause duplication of effort and even team dysfunctions.

The seven deadly sins of automated software testing are a set of common anti-patterns that have been found to erode the value of automated testing resulting in long term maintenance issues and ultimately affecting the ability of development teams to respond to change and continuously deliver.

Taking the classic seven sins (Gluttony, Sloth, Lust, Envy, Rage, Pride, Greed) as they might be applied to test automation, we will discuss how to identify each automated sin and, more importantly, provide guidance on recommended solutions and how to avoid them in the first place.

Transcript

  • 1. 7 DEADLY SINS OF AUTOMATED TESTING. Dr Adrian Smith, September 2012. Engineering Innovation. Thursday, 20 September 12
  • 2. Adrian Smith • Background in Engineering • Software development using Agile and Lean • Technical and Organisational Coach • Founded a startup product development and consulting business • Diverse Experience: Aerospace Engineering (commercial and military engineering analysis and manufacturing experience on major programs including A380 and F35); Agile Software Development (software development, architecture and design, management for engineering CAE, scientific and digital media); Systems Integration (integration of logistics, financial, engi…)
  • 3. Geeks hate repetition
  • 4. Airbus A380 Wing
  • 5. (image)
  • 6. Envy Flawed comparison of manual testing and automation
  • 7. How management see testing
  • 8. How management would like to see testing
  • 9. Manual vs Automation • A flawed comparison • Assumes that automation can replace manual testing effort • Automation generally doesn’t find new defects • Testing is not merely a sequence of repeatable actions • Testing requires thought and learning
  • 10. Ideal automation targets • Regression testing - assessing current state • Automation of test support activities • Data generation or sub-setting • Load generation • Non-functional testing • Deterministic problems • Big data problems
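Slide 10 names test-support automation, such as data generation, as an ideal target. As a small illustration of that idea, here is a Python sketch of deterministic test-data generation; the customer fields and names are invented for this example:

```python
# A sketch of automating a test-support activity: deterministic
# test-data generation. Seeding the generator keeps every run
# repeatable, which matters for regression testing.

import random

def generate_customers(count, seed=42):
    """Build a reproducible list of fake customer records."""
    rng = random.Random(seed)  # fixed seed => identical data every run
    first_names = ["Ada", "Grace", "Alan", "Edsger"]
    return [
        {"id": i, "name": rng.choice(first_names), "credit": rng.randint(0, 500)}
        for i in range(count)
    ]

# The same seed always yields the same dataset:
assert generate_customers(3) == generate_customers(3)
```

Because the data is a deterministic function of the seed, a failing test can always be re-run against exactly the same inputs.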
  • 11. Common symptoms • Relying on automation as the basis for all testing activities • All tests are built by developers • Absence of code reviews • Absence of exploratory testing • Absence of user testing
  • 12. Suggested approach • Avoid comparison between manual and automated testing - both are needed • Distinguish between the automation and the process that is being automated • Use automation to provide a baseline • Use automation in conjunction with manual techniques
  • 13. GLUTTONY Over indulging on commercial test tools
  • 14. Promise of automation • Software vendors have sold automation as the capture-replay of manual testing processes • Miracle tools that solve all testing problems
  • 15. License barrier • Commercial licenses restrict usage • Not everyone can run the tests • Typically, organisations create special groups or privileged individuals
  • 16. Incompatible technology • Underlying technology of commercial tools is often not compatible with the development toolchain • Special file formats or databases • Lack of version control for tests • Tests cannot be versioned within the software • Continuous integration problems • Can’t be adapted or extended by the developers
  • 17. Justifying the expense • Financial commitments distort judgement • Difficult to make objective decisions • Tendency to use the tool for every testing problem • People define their role by the tools they use
  • 18. Common symptoms • Commercial tools form the basis of the testing strategy • Only certain teams or individuals can access a tool or run tests • Developers have not been consulted in the selection of testing tools • “We always use <insert tool-name> for testing!”
  • 19. Suggested approach • Use Open Source software tools wherever possible • Use tools that can easily be supported by the development team and play nicely with the existing development toolchain • Ensure any commercial tools can be executed in a command-line mode
  • 20. Lust User interface forms the basis for all testing
  • 21. Testing through the GUI • Non-technical testers often approach testing through the user interface • Ignores the underlying system and application architecture • Resulting tests are slow and brittle • Difficult to set up test context - resulting in sequence-dependent scripts
  • 22. Investment profile (diagram: test pyramid. Axes: Confidence versus Speed/Feedback, against Investment/Importance. Unit/Component tests at the base: developer built, optimised for fast feedback. Acceptance and Integration tests in the middle: collaboratively built around system behaviour, exercising components and systems. Manual/Exploratory testing at the Interface on top.)
  • 23. Architecture • Understanding application and system architecture improves test design • Creates opportunities to verify functionality at the right level
  • 24. Test design (diagram: Test Intent clearly identifies what the test is trying to verify; Test Implementation implements the test, including usage of Test Data, against the System Under Test)
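The separation on slide 24 can be sketched in a few lines of Python. The order/loyalty domain here is hypothetical, invented purely for illustration: the test body expresses only the intent, while a helper hides the implementation and test-data details.

```python
# Sketch of separating test intent from test implementation.
# The Order/loyalty-discount domain is hypothetical.

class Order:
    """System under test: an order with a loyalty discount rule."""
    def __init__(self, total, loyalty_years):
        self.total = total
        self.loyalty_years = loyalty_years

    def discounted_total(self):
        # 5% off for customers with 3+ years of loyalty
        if self.loyalty_years >= 3:
            return round(self.total * 0.95, 2)
        return self.total

def order_for_loyal_customer(total=100.0):
    """Test implementation detail: how a 'loyal customer' is built."""
    return Order(total=total, loyalty_years=5)

def test_loyal_customers_receive_discount():
    # The body reads as pure intent; data setup lives in the helper.
    order = order_for_loyal_customer(total=100.0)
    assert order.discounted_total() == 95.0
```

If the definition of "loyal customer" changes, only the helper changes; the test's intent stays readable and stable.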
  • 25. F.I.R.S.T. class tests: Fast, Independent, Reliable, Small, Transparent
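A minimal sketch of what F.I.R.S.T. looks like in practice, using an invented in-memory repository: fast (no I/O), independent (each test builds its own fixture), reliable (no external state), small (one behaviour each), and transparent (the name states the intent).

```python
# Hypothetical system under test: an in-memory user repository.

class UserRepository:
    def __init__(self):
        self._users = {}

    def add(self, user_id, name):
        self._users[user_id] = name

    def find(self, user_id):
        return self._users.get(user_id)

def test_find_returns_added_user():
    repo = UserRepository()   # fresh fixture: fast, no shared state
    repo.add(1, "Ada")
    assert repo.find(1) == "Ada"

def test_find_returns_none_for_unknown_user():
    repo = UserRepository()   # independent of the test above
    assert repo.find(99) is None
```

Because neither test depends on the other's data or on execution order, they can run in any order, in parallel, or alone.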
  • 26. Common symptoms • Testers cannot draw the application or system architecture • Large proportion of tests are being run through the user interface • Testers have limited technical skills • No collaboration with developers • Intent of tests is unclear
  • 27. Suggested approach • Limit the investment in automated tests that are executed through the user interface • Collaborate with developers • Focus investment in automation at the lowest possible level with clear test intent • Ensure automation gives fast feedback
  • 28. Pride Too proud to collaborate when creating tests
  • 29. Poor collaboration • Organisations often create specialisations of roles and skills • Layers of management and control then develop • Collaboration becomes difficult • Poor collaboration = poor tests
  • 30. Automating too much • Delegating test automation to a special group inhibits collaboration • Poor collaboration can result in duplicate test cases / coverage • Duplication wastes effort and creates maintenance issues • Setting performance goals based around test-cases automated leads to problems
  • 31. No definition of quality • Automated testing effort should match the desired system quality • Risk that too much, too little or not the right things will be tested • Defining quality creates a shared understanding and can only be achieved through collaboration
  • 32. Good collaboration • Cross-functional teams build better software • Collaboration improves definition and verification (diagram: Analyst, Tester and Developer collaborating on Acceptance Criteria, Specification and Elaboration, and Automation)
  • 33. Specification by Example • Recognises the value of collaboration in testing • More general than ATDD and/or BDD • Based around building a suite of Living Documentation that can be executed
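The "Living Documentation that can be executed" idea can be sketched as an executable table of examples. The shipping-fee rule below is hypothetical; real teams typically use a tool such as Cucumber or FitNesse, but a plain parameterised test carries the same idea: examples agreed in collaboration become documentation that runs.

```python
# Specification by Example, sketched as an executable example table.
# The shipping-fee business rule is invented for illustration.

def shipping_fee(order_total):
    """Orders of 50.0 or more ship free; otherwise a flat 5.0 fee."""
    return 0.0 if order_total >= 50.0 else 5.0

# Examples as the whole team might record them during elaboration:
# (description, order total, expected fee)
EXAMPLES = [
    ("small order pays flat fee",   10.0,  5.0),
    ("order just below threshold",  49.99, 5.0),
    ("threshold order ships free",  50.0,  0.0),
    ("large order ships free",     120.0,  0.0),
]

def test_shipping_fee_examples():
    for description, total, expected in EXAMPLES:
        assert shipping_fee(total) == expected, description
```

The table doubles as documentation of the rule, and a failing description pinpoints exactly which agreed example was broken.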
  • 34. Common symptoms • Automated tests are being built in isolation from the team • Intent of tests is unclear or not matched to quality • Poor automation design (abstraction, encapsulation, ...) • Maintainability or compatibility issues
  • 35. Suggested approach • Collaborate to create good tests and avoid duplication • Limit the investment in UI based automated tests • Collaborate with developers to ensure good technical practices (encapsulation, abstraction, reuse, ... ) • Test code = Production code
  • 36. SLOTH Too lazy to properly maintain automated tests
  • 37. Automated Test Failures • Many potential causes of failure • Unless maintained, value is slowly eroded (diagram: over time, system changes, reference data changes, interface changes, new features and OS patches all break tests)
  • 38. Importance of maintenance (chart: cost/effort of manual test execution versus automation over time; a maintained automated test suite retains its potential value, while an unmaintained suite loses it)
  • 39. Continuous integration
  • 40. Common symptoms • Test suite has not been recently run - state is unknown • Continuous Integration history shows consistent failures following development changes / release • Test suite requires manual intervention • Duplication within automation code • Small changes trigger a cascade of failures
  • 41. Suggested approach • Ensure automated tests are executed using a Continuous Integration environment • Ensure tests are always running - even if the system is not being actively developed • Make test results visible - create transparency of system health • Ensure collaboration between developers and testers
  • 42. Rage Frustration with slow, brittle or unreliable automated tests
  • 43. Slow automation • Large datasets • Unnecessary integrations • Inadequate hardware/environments • Too many tests • Reliance on GUI based tests • Manual intervention • ... many others
  • 44. Fast Feedback
  • 45. Brittle Tests • Contain time-bound data • Have external dependencies • Rely on UI layout/style • Rely on sequence of execution • Based on production data or environments
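The first brittleness cause on slide 45, time-bound data, has a well-known fix that can be sketched in Python: make "now" an explicit, injectable input instead of a hidden call to the system clock. The voucher domain below is hypothetical.

```python
# Fixing a time-bound test by injecting the clock.
# The voucher-expiry rule is invented for illustration.

from datetime import date

def voucher_is_valid(expiry, today=None):
    """A voucher is valid up to and including its expiry date."""
    today = today or date.today()  # production path uses real time
    return today <= expiry

def test_voucher_valid_on_expiry_day():
    # "Today" is pinned, so this test gives the same answer forever,
    # instead of silently breaking once the date passes.
    assert voucher_is_valid(date(2012, 9, 20), today=date(2012, 9, 20))

def test_voucher_invalid_after_expiry():
    assert not voucher_is_valid(date(2012, 9, 20), today=date(2012, 9, 21))
```

The same injection pattern removes other external dependencies from the slide's list: file systems, networks and databases can all be swapped for controlled test doubles.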
  • 46. Frustration
  • 47. Unreliable Tests • False positives • Wastes time investigating • Failures start being ignored • Creates uncertainty of system health • Workarounds and alternate tests are created
  • 48. Suggested approach • Treat automated tests with the same importance as production code • Review, refactor, improve ... • Apply a “Stop the line” approach to test failure • Eliminate (quarantine) unreliable tests • Ensure collaboration with developers • Up-skill / pair testers
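The quarantine step on slide 48 can be sketched with a tagging decorator. The `quarantine` marker below is a hypothetical, hand-rolled mechanism written for illustration; with pytest the same idea is typically done with a custom `@pytest.mark` and deselection via `-m "not quarantine"`.

```python
# A minimal sketch of quarantining unreliable tests: tag them with a
# reason so the main suite can skip them, while keeping them visible
# for repair rather than silently deleting them.

QUARANTINED = []  # names of tests pulled out of the main run

def quarantine(reason):
    """Mark a test as unreliable, recording why it was pulled."""
    def decorator(test_fn):
        test_fn.quarantined = reason
        QUARANTINED.append(test_fn.__name__)
        return test_fn
    return decorator

@quarantine(reason="fails intermittently on slow CI agents")
def test_flaky_search_suggestions():
    ...  # body omitted: this hypothetical test is under repair

def test_stable_login():
    assert "admin".upper() == "ADMIN"

def is_quarantined(test_fn):
    """A runner would use this to skip quarantined tests."""
    return hasattr(test_fn, "quarantined")
```

Keeping the quarantine list explicit supports the "Stop the line" approach: a red build always means a real failure, while quarantined tests remain a visible debt to pay down.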
  • 49. Avarice (Greed) Trying to cut costs through automation
  • 50. Lure of cheap testing • Testing tool vendors often try to calculate ROI based on saving labour • Analysis is unreliable and undervalues the importance of testing
  • 51. Automation is not cheap • Adopting test automation tools and techniques requires significant investment • Investment in new ways of working • Investment in skills • Investment in collaboration • Ongoing investment in maintenance
  • 52. Common symptoms • Investment in commercial tools using a business-case based on reducing headcount • Using a predicted ROI as a way of reducing budget for Testing • Consolidating automated testing within a special group
  • 53. Suggested approach • Ensure the reasons for automation are clear and are NOT based purely on saving money/headcount • Ensure business case for automation includes costs for ongoing maintenance
  • 54. 7 Deadly Sins • Envy - Flawed comparison of manual testing and automation • Gluttony - Over indulging on commercial test tools • Lust - User interface forms the basis for all testing • Pride - Too proud to collaborate when creating tests • Sloth - Too lazy to maintain automated tests • Rage - Frustration with slow, brittle or unreliable tests • Greed - Trying to cut costs through automation
  • 55. Why automate testing?
  • 56. How geeks really work
  • 57. Thank you. Dr Adrian Smith, September 2012. Engineering Innovation.
