Better Practices for Crafting Automated Tests



Over the past year or so, I have been tasked with cleaning up the automation stack in our feature area. There were days when I just wanted to throw my hands in the air, but instead I beat my head against my desk a few more times; knocking myself senseless seemed to be the only way to make sense of what was going on inside some of the legacy automated test cases in our code base. After removing around 5,000 lines of redundant code, cutting runtime by more than 20 minutes, and improving the maintainability of the tests in our feature area, I came up with a list of things to avoid when coding automated tests. Although not an exhaustive list of best practices, here are some ideas that can help testers develop automated tests that are more robust and efficient, and that can improve the effectiveness of your automation test suites.


  1. Better Practices for Crafting Automated Tests. Bj Rollison, Principal Test Architect, Microsoft, Inc. @TestingMentor
  2. Introduction • Ideas for people who automate tests! • Not a discussion of general automation usefulness or its pros and cons • Not about processes • And not about automation strategies • Assumptions about the audience • You are experienced at writing automated tests using a programming language • You want to improve your automation's reliability and maintainability • You want to make your automation more robust
  3. Overview • Basic tenets • 5 practices for developing automated tests: 1. Get Organized 2. Less is more 3. Logging 4. Common mistakes 5. Expand & explore • Questions
  4. Basic tenets • Automate tests that provide information that enables your team to make decisions • Critical functionality – regressions, monitoring • Non-functional – performance, stress, etc. • Behaviors – customer scenarios • Be a control freak! • Don't assume anything • Assume the automated test will fail 30 seconds after you leave for the night • Your automated test must be able to predict and respond to anomalies in the system • Code early! (esp. unit tests and API tests)
  5. Get Organized • Templates provide consistency • Objective explains the purpose of the test • Test steps help keep focus & check assumptions • Report results • Pass / Fail • Blocked
  6. Get Organized • Source control! • Use source control…period! • Automation code is project code! • Commit early & often • IDE • Static analysis • Clean code! • Improves readability • Improves robustness – prevents common errors • Don't forget peer or code reviews! • Coding guidelines
  7. Less is more! • Test scripts focus too much on coding • Many execution steps repeated in other tests • Difficult to read • Expensive to maintain • [Slide example: Launch app, Get handle, Open a file, Replace all text, Save the file, Close app – 22 program statements reduced to 6]
  8. Less is more! • Move the work out of the test and into reusable methods • Use custom methods to wrap complex tasks • Helper methods throw exceptions • If a method fails, the test is blocked
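The deck shows no code, so here is a minimal Python sketch of the idea on slide 8: repeated execution steps move into a reusable helper that raises an exception on failure, so the calling test is reported as blocked rather than producing a misleading pass or fail. All names here (`BlockedError`, `replace_all_text`, the dict-based document) are hypothetical illustrations, not the speaker's actual framework.

```python
# Hypothetical sketch of slide 8: wrap complex, repeated steps in reusable
# helpers. Helpers raise on failure so the calling test is reported as
# blocked instead of silently continuing.

class BlockedError(Exception):
    """Raised when a setup/helper step fails; the test is blocked, not failed."""

def replace_all_text(document: dict, find: str, replace: str) -> None:
    """One readable call instead of many inline program statements."""
    if "text" not in document:
        raise BlockedError("document has no text to operate on")
    document["text"] = document["text"].replace(find, replace)

def test_find_replace() -> str:
    doc = {"text": "hello world, hello tester"}
    replace_all_text(doc, "hello", "goodbye")
    assert doc["text"] == "goodbye world, goodbye tester"
    return "Pass"
```

With helpers like this, the 22-statement script from slide 7 shrinks to a handful of intention-revealing calls.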
  9. Oracles • Oracles report pass or fail results • Separate oracles from test code • More == better! • Multiple checks • Catch each unique exception • Re-throw to preserve the stack
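A sketch of slide 9's two ideas, again with hypothetical names: an oracle that lives outside the test body and runs multiple checks (returning every failure instead of stopping at the first), and a wrapper showing that in Python a bare `raise` re-throws the caught exception with its original stack trace intact.

```python
# Hypothetical sketch of slide 9: the oracle is separate from the test steps
# and performs multiple checks. An empty list means the test passed.

def saved_document_oracle(file_created: bool, actual: str, expected: str) -> list:
    failures = []
    if not file_created:
        failures.append("output file was never created")
    if actual != expected:
        failures.append(f"content mismatch: {actual!r} != {expected!r}")
    return failures

def guarded(step):
    """Catch each unique exception type; a bare `raise` re-throws the
    original exception so the stack trace is preserved for the log."""
    try:
        return step()
    except FileNotFoundError:
        # a real harness would log the specific cause here, then re-throw
        raise
```

Returning a list of failure messages (rather than asserting inside the oracle) is what makes "more checks == better" cheap: every mismatch reaches the log in one run.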
  10. Logging & Commenting • Logging • Log more than you think you need • If you can't determine the cause of a failure from your log files, you aren't logging (or capturing) enough information • Logging test steps (and other details) should be approximately 50% of test code • Commenting • No one can read your mind, especially if you aren't around • Comments help you and others understand and debug your code • Explain workarounds in your code
  11. Logging • Logger.TestStep – application task or manipulation • Logger.Comment – additional notes • If any method throws an exception, the test reports as blocked • Don't forget cleanup!
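The deck names a `Logger.TestStep` / `Logger.Comment` API but never shows its implementation, so the shape below is an assumption: a minimal in-memory sketch that records the two kinds of entries slide 11 distinguishes.

```python
# Assumed shape of the Logger from slides 11-12; the real internal API
# is not shown in the deck.

class Logger:
    entries: list = []

    @classmethod
    def TestStep(cls, message: str) -> None:
        """Record an application task or manipulation."""
        cls.entries.append(("STEP", message))

    @classmethod
    def Comment(cls, message: str) -> None:
        """Record additional notes, e.g. an explanation of a workaround."""
        cls.entries.append(("COMMENT", message))

Logger.TestStep("Open the Find and Replace dialog")
Logger.Comment("Using the keyboard shortcut because the menu item is flaky")
```

Keeping steps and comments as distinct entry types lets the log viewer show the narrative of what the test did separately from the author's annotations.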
  12. Code Comments • Explain what your custom methods do! • Use either code comments or the Logger.Comment() method • Follow coding style guidelines!
  13. Log files • Log files should capture why a test is blocked or fails • Don't let blocked (or failing) tests stop your automation from running
  14. Common pitfalls • Hard-coded variables • What happens if this test is run on a non-EN-US language version? • What about resolution changes? • No exception handling • What happens when the Find and Replace dialog doesn't appear? • What just happened here? • How do we know if this test passed, failed, or is blocked? • Is this a test or a macro?
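Slide 14's last question ("passed, failed, or blocked?") can be answered structurally: a small harness maps exception types to outcomes so a result is never ambiguous. This is a hypothetical sketch, assuming (as slide 8 suggests) that helper code raises a distinct exception when preconditions fail and oracles raise `AssertionError` on a mismatch.

```python
# Hypothetical harness distinguishing the three outcomes from slide 5:
# a test that cannot run its steps is Blocked, an oracle mismatch is Fail,
# and everything else is Pass.

class Blocked(Exception):
    """Raised by helper/setup code when preconditions cannot be met."""

def run_test(test) -> str:
    try:
        test()
    except Blocked:
        return "Blocked"
    except AssertionError:
        return "Fail"
    return "Pass"

def dialog_opens(): assert True                      # oracle check succeeds
def wrong_title(): assert False, "title mismatch"    # oracle check fails
def no_dialog(): raise Blocked("Find and Replace dialog did not appear")
```

Without this separation, a missing dialog and a wrong result both surface as the same generic failure, which is exactly the "macro, not a test" smell the slide warns about.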
  15. Exploratory automation • Introducing smart variation into an automated test • Managers generally don't like randomness in automation, so call it something else: "equivalent variation" • Benefits • Increased test and data coverage • Drawbacks • Unreliability (false positives) • More work
  16. Exploratory Automation • Find/Replace example • Find word – randomly select any word from the document • Replace word – select a word from a list of words not in the document, or a random string
  17. Data Variation • Generate a new random seed value for each iteration • Replace static variables with real variables • The oracle doesn't change!
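The data-variation idea from slides 16–17, sketched in Python with hypothetical names: each iteration seeds its own random generator (and the seed would be logged, so any failure reproduces), the input data varies, and the oracle stays fixed.

```python
import random

# Hypothetical sketch of slides 16-17: vary the data, keep the oracle fixed.

def find_replace_oracle(original: str, find: str, replace: str, result: str) -> bool:
    """The oracle never changes, no matter which word the test picked."""
    return result == original.replace(find, replace)

def run_variation(seed: int) -> tuple:
    rng = random.Random(seed)          # log the seed so a failure reproduces
    words = ["alpha", "beta", "gamma", "delta"]
    document = " ".join(words)
    find = rng.choice(words)           # randomly select a word in the document
    replace = "omega"                  # a word known NOT to be in the document
    result = document.replace(find, replace)   # system under test (simulated)
    assert find_replace_oracle(document, find, replace, result), f"seed={seed}"
    return find, result

# Same seed -> same variation, so any reported failure is reproducible.
assert run_variation(42) == run_variation(42)
```

This is the "equivalent variation" compromise from slide 15: the inputs differ on every run, but because the seed is recorded, a manager's objection to irreproducible randomness goes away.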
  18. Summary • Organize • Templates, coding guidelines, static analysis tools • Less is more • Optimize test frameworks to facilitate writing tests, not code • Logging and comments • More is better • Common mistakes • Hard-coded variables, magic numbers, US-centric assumptions, etc. • Expand & explore • Exploratory automation • Design tests to expand and explore variations in data, state traversals, etc.
  19. Thanks for listening!