
Are Your Continuous Tests Too Fragile for Agile?


With a fragile test suite, the Continuous Testing that's vital to Agile just isn't feasible. If you truly want to automate the execution of a broad test suite—embracing unit, component, integration, functional, performance, and security testing—during continuous integration, you need to ensure that your test suite is up to the task. This presentation provides tips for getting your tests there.



  1. Are Your Continuous Tests Too Fragile for Agile? • Arthur Hicken, Parasoft Evangelist • 2015-05-01
  2. Arthur Hicken Bio • Arthur "Code Curmudgeon" Hicken is heavily involved in the development and testing communities. He is passionate about businesses realizing the full potential of testing solutions. As an expert in his field, Arthur is a sought-after speaker on testing strategies, efficiencies, and results. He has been at Parasoft for over 20 years, driving the strategic conversation around SDLC practices that enhance productivity and quality. In addition to holding the title of Evangelist, Arthur represents Parasoft in multiple industry organizations, including ICSQ, OWASP, NIST/SAMATE, CWE, and more.
  3. Agenda • Challenges of Continuous Agile • Testing defined • Testing is hard (no one wants to do it) • Best practices • A modest proposal: spring cleaning
  4. The role of QA in Agile • QA is part of development, not an afterthought • Review developer testing efforts • Additional testing of requirements • Iterative during development • Fast discovery and reporting of defects • Investigate root cause of defects • Influence development policy
  5. Challenges of Agile • Need to be fast • Need to be flexible • Testing can be overlooked if not baked in • Architectural compromises driven by the schedule
  6. Challenges of Continuous • Tests must produce a binary go/no-go decision • Reuse unit tests and functional tests from dev to QA • High level of automation • Requires a disciplined, mature process • Testing must automatically answer: Is it stable? Will it do what it's supposed to do?
  7. Importance of Testing • How do you know when you're done? How do you know if a fix for a minor bug broke a major function of the system? How can the system evolve into something more than is currently envisioned? Testing, both unit and functional, needs to be an integrated part of the development process.
  8. Testing delivers • A maintained suite of unit tests: • Represents the most practical design possible • Provides the best form of documentation for classes • Determines when a class is "done" • Gives a developer confidence in the code • Is a basis for refactoring quickly
  9. Why Testing is Often Ignored • Perceived as "burdensome": too much effort; too much time; "We're on a schedule – no time to test"; why is there always time to do it over, but never to do it right? • Developer ego: "I don't need to test my code – I don't make mistakes"; testing means I have to admit that I might have made a mistake; "I don't like to test – it's beneath me"; "Testing isn't fun – I just want to code!"; "That's for the QA dept."
  10. Definitions • Unit • Functional • Regression
  11. Unit Testing • Unit tests validate that a method behaves a certain way given specific parameters • Regression testing: validate that the behavior of existing code hasn't changed; scheduled and run automatically • Test-driven development: validate the behavior of new code to be written; created before developing, turned into regression tests afterward
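To make the definition concrete, here is a minimal JUnit 5 sketch of a unit test that validates one method for specific parameters. The PriceCalculator class and its applyDiscount method are hypothetical, included inline only so the example is self-contained.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import org.junit.jupiter.api.Test;

// Hypothetical class under test, kept inline so the example is self-contained.
class PriceCalculator {
    double applyDiscount(double price, int percent) {
        return price * (100 - percent) / 100.0;
    }
}

class PriceCalculatorTest {
    @Test
    void applyDiscountReducesPriceByTheGivenPercentage() {
        // Given specific parameters, validate that the method behaves as expected.
        assertEquals(90.0, new PriceCalculator().applyDiscount(100.0, 10), 0.001);
    }
}
```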
  12. Functional Testing • Incarnations of functional tests: unit tests (JUnit, CppUnit, NUnit, etc.), integration tests, scripted tests, manual tests • Ways to create functional tests: manually written, machine-generated from application tracing, machine-generated from capturing traffic, machine-generated from recording clicks
  13. Regression Testing • Contributors to the regression test suite: functional unit tests that pass, machine-generated unit tests capturing current behavior, unit tests written by developers and testers, tests written to validate bug fixes • Regression test failures: unwanted change in behavior – fix the code; intentional change in specification – update the test controls
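As an illustration of a test written to validate a bug fix, this hedged JUnit 5 sketch pins a hypothetical leap-year defect so it cannot silently reappear; the isLeapYear helper and the scenario are invented for the example.

```java
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.Test;

class LeapYearRegressionTest {

    // Hypothetical helper standing in for the production code that was fixed.
    static boolean isLeapYear(int year) {
        return (year % 4 == 0 && year % 100 != 0) || year % 400 == 0;
    }

    @Test
    void centuryLeapYearsAreHandledCorrectly() {
        // This test originally failed before the "% 400" rule was added; it stays
        // in the regression suite so the defect cannot silently come back.
        assertTrue(isLeapYear(2000));
    }
}
```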
  14. What's the Difference? • Unit tests: the code is doing things right; written from the programmer's perspective; confirm the code gives the proper output for specific input • Functional tests: the code is doing the right thing; written from the user's perspective; confirm the system does what the user expects it to do • Regression tests: it still works
  15. Why Do Unit Testing? • Software bugs are introduced during the coding phase, but often aren't found until functional testing, or worse, are found by customers. The longer a bug stays in the system, the more expensive it is to fix. Unit testing lets you find out early in the SDLC whether your system is behaving the way you expect, and it enables easy regression testing, so you can quickly tell whether you have introduced unexpected behavior into the system.
  16. The Cost of Defects
  17. Best Practices
  18. What's Important • Unit tests require a LOT of care • Unit testing is a continuous operation • Test suite maintenance is a commitment • Control data (assertions) lives in the code
  19. Test incorporation • Take existing unit tests and schedule them into the build process • Run and fix tests as part of the build process • Ideal goal: 0 failures and 100% code coverage • Realistic goal: 10% test failures and 80% code coverage • If the unit test suite is noisy, don't qualify the build based on assertion failures
  20. Unit Test Creation • Maintainability • Proper assertions • Nightly runs • Tests should be: automatic (no human input or review) and repeatable (no strange dependencies)
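One way to keep a test automatic and repeatable is to remove hidden dependencies such as the system clock. This minimal sketch, built around a hypothetical InvoiceService, injects a fixed java.time.Clock so the test produces the same result on every run and every machine.

```java
import static org.junit.jupiter.api.Assertions.assertTrue;

import java.time.Clock;
import java.time.Instant;
import java.time.LocalDate;
import java.time.ZoneOffset;

import org.junit.jupiter.api.Test;

// Hypothetical class under test: the clock is injected instead of read globally.
class InvoiceService {
    private final Clock clock;

    InvoiceService(Clock clock) {
        this.clock = clock;
    }

    boolean isOverdue(LocalDate dueDate) {
        return LocalDate.now(clock).isAfter(dueDate);
    }
}

class InvoiceServiceTest {
    @Test
    void invoiceIsOverdueOnceTheDueDateHasPassed() {
        // A frozen clock makes the test repeatable: same result on every run and machine.
        Clock fixed = Clock.fixed(Instant.parse("2015-05-01T00:00:00Z"), ZoneOffset.UTC);
        InvoiceService service = new InvoiceService(fixed);

        assertTrue(service.isOverdue(LocalDate.of(2015, 4, 1)));
    }
}
```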
  21. Legacy Test Suites • Leverage your existing suite • Clean up incrementally • Set policy based on current results • Tighten policy when in compliance
  22. Maintaining and Creating UT • Maintaining is more of a problem than creating • Don't create a test if you can't or don't plan to maintain it • Creating tests as you go produces a better test suite • Tests not run regularly become irrelevant
  23. Write tests with the code • Create the test when you're writing the code • Have a test for every task you complete • Increases understanding • Improves assertions
  24. Connect Unit Tests • Associate tests with Code and Requirements • Naming Conventions should have meaning • Use code comment tags @task @pr @fr • Use check-in comments @task @pr @fr • Improve change-based testing
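The sketch below illustrates one way a test can be tied back to a task and a requirement through its name and comment tags; the @task/@fr tag syntax follows the slide, while LoginPolicy, TASK-512, and FR-27 are hypothetical identifiers invented for the example.

```java
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.Test;

// Hypothetical production rule: lock the account after three failed login attempts.
class LoginPolicy {
    boolean isLocked(int failedAttempts) {
        return failedAttempts >= 3;
    }
}

class LoginPolicyTest {

    /**
     * Verifies the lockout rule for the account-security requirement.
     *
     * @task TASK-512
     * @fr   FR-27 (lock the account after repeated failed logins)
     */
    @Test
    void accountLocksAfterThreeFailedLogins() {
        assertTrue(new LoginPolicy().isLocked(3));
    }
}
```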
  25. Specific Generation • Don't be lured by 100% coverage • Auto-generated assertions are not meaningful unless validated • Review the tests as you work on the code
  26. Code Review Test • Code for unit tests should be reviewed • Consider doing test-design review • Ensures you're testing what you think you are • Helps reduce noise • Helps improve understanding • Review the code and the test code together
  27. Testing Exception Handling • Frequently skipped • Prevents crashes • Prevents death spiral • Rule of thirds: 1/3 basic functionality, 1/3 advanced features / infrequent settings, 1/3 error handling
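A minimal JUnit 5 sketch of the error-handling third: rather than exercising only the happy path, the test verifies that invalid input raises the expected exception instead of crashing later. AccountParser is a hypothetical class used only for illustration.

```java
import static org.junit.jupiter.api.Assertions.assertThrows;

import org.junit.jupiter.api.Test;

// Hypothetical class under test with an explicit error path.
class AccountParser {
    int parseAccountNumber(String raw) {
        if (raw == null || raw.isEmpty()) {
            throw new IllegalArgumentException("account number must not be empty");
        }
        return Integer.parseInt(raw);
    }
}

class AccountParserTest {
    @Test
    void emptyInputIsRejectedWithAClearError() {
        // Exercise the error-handling path explicitly instead of only the happy path.
        assertThrows(IllegalArgumentException.class,
                () -> new AccountParser().parseAccountNumber(""));
    }
}
```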
  28. All About Assertions • Verifies that the code behaves as intended • True/false • Null or not null • Specific values • Complex conditions
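The sketch below shows the four assertion styles from the slide in JUnit 5 form; the Order class and its fields are hypothetical, kept inline only to make the example self-contained.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;
import static org.junit.jupiter.api.Assertions.assertFalse;
import static org.junit.jupiter.api.Assertions.assertNotNull;
import static org.junit.jupiter.api.Assertions.assertTrue;

import java.util.List;

import org.junit.jupiter.api.Test;

// Hypothetical value object used only to illustrate the assertion styles.
class Order {
    final String id = "A-1";
    final List<String> items = List.of("book");
    final double total = 12.50;

    boolean isPaid() {
        return false;
    }
}

class OrderAssertionsTest {
    @Test
    void oneAssertionOfEachKind() {
        Order order = new Order();

        assertFalse(order.isPaid());                             // true/false
        assertNotNull(order.id);                                 // null or not null
        assertEquals(12.50, order.total, 0.001);                 // specific value
        assertTrue(order.items.size() == 1 && order.total > 0);  // complex condition
    }
}
```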
  29. Why Assert? • Excuses: "My code already has console output"; "I can check it visually/manually" • Binary output (pass/fail) • Part of the overall test suite • No human intervention
  30. Assertion Failures • Introduced coding error – fix the new code • Functionality changed – update the assertion • Improper assertion – comment it out
  31. Assertion Policy • Assign failures as tasks, to the author of the test or of the changed code • Try to reach the developer with the best chance of understanding the change • The developer assesses the failure: bug (fix it), changed functionality (update the assertion), bad assertion (improve or remove it – don't delete; instead, comment it out)
  32. What to assert • Ease of creating assertions leads to poor choices • Anything CAN be checked – but should it be? • Assertions should be connected to what you're trying to test • Assertions should be logically invariant • Check values by range, not equivalence • "What do I expect here logically?"
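As an example of asserting by logical range rather than exact equivalence, this sketch checks that a measured duration stays within a generous budget instead of matching an exact number; the timing helper and the 500 ms budget are hypothetical.

```java
import static org.junit.jupiter.api.Assertions.assertTrue;

import org.junit.jupiter.api.Test;

class ResponseTimeTest {

    // Hypothetical stand-in for the operation whose speed matters.
    static long doWorkAndMeasureMillis() {
        long start = System.nanoTime();
        Math.sqrt(42.0); // placeholder for the real work under test
        return (System.nanoTime() - start) / 1_000_000;
    }

    @Test
    void workCompletesWithinItsBudget() {
        long elapsed = doWorkAndMeasureMillis();

        // Logically invariant expectation: "fast enough", not "exactly 12 ms".
        assertTrue(elapsed >= 0 && elapsed < 500,
                "expected the work to finish within 500 ms but it took " + elapsed + " ms");
    }
}
```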
  33. How to recognize bad assertions • How often do you WANT to manually verify an assertion? • Daily or near-daily checking indicates a poor assertion • Troublesome possibilities: date, user or machine name, OS, filesystem names and paths
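The sketch below contrasts a fragile, environment-dependent assertion (today's date and a local path) with a stable one built on a fixed input the code fully controls; ReportWriter is a hypothetical class invented for the example.

```java
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.time.LocalDate;

import org.junit.jupiter.api.Test;

// Hypothetical class under test.
class ReportWriter {
    String fileNameFor(LocalDate date) {
        return "report-" + date + ".csv";
    }
}

class ReportWriterTest {
    @Test
    void fileNameFollowsTheAgreedPattern() {
        // Fragile (avoid): asserting on today's date or an absolute local path means
        // someone has to re-verify the expectation on every run and every machine:
        //   assertEquals("C:\\Users\\me\\report-" + LocalDate.now() + ".csv", ...);

        // Stable: a fixed input and an expected value the code fully controls.
        assertEquals("report-2015-05-01.csv",
                new ReportWriter().fileNameFor(LocalDate.of(2015, 5, 1)));
    }
}
```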
  34. Missing Assertions • An error was found after the code changed • The test suite never caught the error • Update the unit test with a new assertion • Write new unit tests as necessary
  35. Cleaning It Up
  36. The Challenge: Management's Perspective • Need tests to reduce risks • Testing takes time from features/fixes • Not enough time • Not enough people • Tests and reports lack context
  37. The Challenge: Development's Perspective • Hundreds of failing or broken tests • Test requirements unclear • Existing tests are dubious • Deadline pressure
  38. Spring Cleaning Solution • Throw out the useless, old junk: identify and eliminate "noisy" tests • Organize the remaining chores: prioritize unit test tasks • Keep it clean: establish and follow a routine test maintenance plan
  39. Throw out the Junk: Identifying Noisy Tests • Tests that are fragile: pass/fail/incomplete results frequently change; frequent changes are required to keep them up to date • Tests that are outdated: more than "n" releases have occurred despite this test failure; the test applies to legacy code that is frozen; the test has not been executed for "x" period of time • Tests that are unclear: it takes more than 5 minutes to understand why the test exists; the failure message is incorrect/unhelpful/irrelevant; there is no associated requirement or defect • Be merciless!
  40. Organize Chores: Prioritize Tasks with Policy • All tasks are created equal • Some tasks are more equal than others • Policies establish prioritization and assignment
  41. Keep it Clean: Establish a Routine Process • Management and Development need to co-define a Policy • Clear testing goals that tie tests to business needs • Clear delineation of responsibility: test creation and deletion, test failure resolution, naming conventions, peer review • Clear and frequent communication: understand the technical and business impact, enhance traceability to project requirements, ask questions • Focus on safety/quality, not coverage • Keep the number of tests minimal • Run all tests on a regularly scheduled, frequent basis • Clean as you go
  42. Spring Cleaning Recap • Find and delete "noisy" tests • Use policies to prioritize, assign, and plan • Co-design and follow a process to keep it clean
  43. Web: http://www.parasoft.com/jsp/resources • Blog: http://alm.parasoft.com • Social: Facebook: https://www.facebook.com/parasoftcorporation; Twitter: @Parasoft @MustRead4Dev @CodeCurmudgeon; LinkedIn: http://www.linkedin.com/company/parasoft; Google+: +Parasoft +ArthurHickenCodeCurmudgeon; Google+ Community: Development Testing Because it's Better
