Agile Acceptance testing with Fitnesse


How to get the most out of your team by making developers and testers jointly responsible for running tests.


  1. 1. Bridging the Gap Creating better quality software using Agile Acceptance Testing and Fitnesse Clare McLennan clare.mclennan@gmail.com http://crazy-it-adventures.blogspot.com http://twitter.com/claremclennan
  2. 2. Contents • The Challenge • What is Testability? • Fitnesse • 0 to 90% in 12 months • Our System Test Toolbox • Success Factors
  3. 3. The Challenge
  4. 4. The Challenge • In the beginning, time to market was everything • As our customer base grew it became important that the system always worked • Big culture change • 99.99% uptime target • Releases weekly • System was hard to test
  5. 5. What is testability?
  6. 6. Testability • Any working system is testable • Aim for easy to test • How much setup and knowledge is needed to test one aspect of the system? • See the podcast, Test Driven Development is Design
  7. 7. Symptoms of Low Testability • Frustrating and slow to test anything • Testers needing continuous help from developers • Developers may believe testers are stupid • Developers avoid system testing and stop after unit testing succeeds • Poor understanding of the System develops • Can’t easily introduce new people to the project
  8. 8. Automated Testing [Diagram: an automated test system wrapped around the system under test, built from system knowledge, predictability, debugging tools, status methods, an installer and other tools]
  9. 9. [Diagram: the same elements, with automated test methods added]
  10. 10. How We Improved Testability • Created an installer so everyone runs the system in the same way • Created a means to query the system for when processing is finished • Added business time to the system so we can instantly test functionality that spans long periods of time. Example: conversion tracking works over a period of 30 days.
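The "query the system for when processing is finished" idea above can be sketched as a small polling helper that tests call instead of relying on fixed sleeps. This is a hypothetical illustration, not the deck's actual code; the class and method names are invented:

```java
import java.util.function.BooleanSupplier;

// Hypothetical helper: poll the system until it reports that background
// processing is done, instead of sleeping for a guessed duration.
public class ProcessingWaiter {

    // Returns true if the system went idle before the timeout, false otherwise.
    public static boolean waitUntilIdle(BooleanSupplier isIdle,
                                        long timeoutMillis,
                                        long pollMillis) throws InterruptedException {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            if (isIdle.getAsBoolean()) {
                return true;  // system has finished processing
            }
            Thread.sleep(pollMillis);  // back off before polling again
        }
        return false;  // timed out while still busy
    }
}
```

A status query like this is what lets every test (and every tester) ask the system the same question in the same way, rather than guessing when it is safe to assert.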
  11. 11. Fitnesse
  12. 12. Fitnesse in a Nutshell • A means of capturing requirements as tests • Tests turn green when they pass, red when they fail • Requirements stay up to date • Customers or testers write the tests • Programmers write fixture code to make the tests run
  13. 13. Running a Suite of Tests
  14. 14. The Technical Side • Fitnesse is a wiki • Recommended: store the tests with the code • Use SLIM (it has replaced FIT) • Works with Java, C++, Ruby, Python and more • Tests fail when testers first write them • Testers can reuse fixtures to create more tests
  15. 15. Sample Fixture [screenshot of a Fitnesse test table]
  16. 16. Sample Fixture [screenshot]
  17. 17. Sample Fixture [screenshot]
  18. 18. Sample Fixture Code [screenshot of the fixture code]
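The sample-fixture slides above were screenshots that do not survive in text form. As a hedged stand-in, a minimal SLIM decision-table fixture in Java looks roughly like this; the `Division` table and class are a standard teaching example, not the deck's actual fixtures:

```java
// Wiki table that drives this fixture (SLIM decision table):
//   !|Division|
//   |numerator|denominator|quotient?|
//   |10       |2          |5.0      |
//
// SLIM calls the setter for each input column, then compares the value
// of the "quotient?" column against the result of quotient().
public class Division {
    private double numerator;
    private double denominator;

    public void setNumerator(double numerator) { this.numerator = numerator; }

    public void setDenominator(double denominator) { this.denominator = denominator; }

    public double quotient() { return numerator / denominator; }
}
```

The point the slides make still holds: the wiki table is readable by customers and testers, while the fixture is a thin adapter the programmers maintain.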
  19. 19. Script Fixture
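The script-fixture slide was also a screenshot. In SLIM, a script table drives a sequence of actions, with each wiki row mapping onto a camelCase method of the fixture. A minimal sketch (the `LoginScript` scenario and names are hypothetical):

```java
// Wiki script table that drives this fixture:
//   !|script|login script|
//   |enter user name|bob|
//   |enter password|secret|
//   |check|login succeeds|true|
//
// SLIM joins the words of each row into a camelCase method name:
// "enter user name" -> enterUserName(...), "login succeeds" -> loginSucceeds().
public class LoginScript {
    private String userName;
    private String password;

    public void enterUserName(String userName) { this.userName = userName; }

    public void enterPassword(String password) { this.password = password; }

    public boolean loginSucceeds() {
        // Stand-in for a call into the real system under test.
        return "bob".equals(userName) && "secret".equals(password);
    }
}
```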
  20. 20. Query Fixture
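Query fixtures are the third SLIM table type shown in the deck: the fixture returns a set of rows and FitNesse matches them against the expected table, regardless of row order. A minimal sketch (the `OpenOrders` data is invented for illustration):

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

// Hypothetical SLIM query fixture. query() returns a list of rows,
// where each row is a list of [fieldName, value] pairs; FitNesse
// matches these against the expected wiki table:
//   !|query:open orders|
//   |id  |status|
//   |1001|open  |
//   |1002|open  |
public class OpenOrders {
    public List<Object> query() {
        List<Object> rows = new ArrayList<>();
        rows.add(Arrays.asList(
                Arrays.asList("id", "1001"),
                Arrays.asList("status", "open")));
        rows.add(Arrays.asList(
                Arrays.asList("id", "1002"),
                Arrays.asList("status", "open")));
        return rows;
    }
}
```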
  21. 21. Test Organisation • Tests organised into a hierarchy of suites • SetUp runs before, and TearDown after, each test • Test history of successes/failures • Tests can have explanatory text • Fixture toolbox documentation
  22. 22. How To Write Good Tests • Use user language, not programmer mumbo-jumbo • Make each test specific • Write cases not scripts - you should only specify things relevant for this example • Generally, if you can’t do it manually you won’t be able to automate it. • See http://www.concordion.org/Technique.html
  23. 23. Evolution of Our Tests (1) Test uses magic numbers from database – can't see what this test is about
  24. 24. Evolution of Our Tests (2) All this setup...
  25. 25. Evolution of Our Tests (3) ...replaced by this This is a system current implementation detail – not a requirement
  26. 26. Evolution of Our Tests (4) Hang on, these just set up data and don't test anything! We still aren't finished.. Ah better..
  27. 27. 0 to 90% in 12 months
  28. 28. Creating Quality Processes: Before Stage 1 • First Fitnesse tests were written to prove it was possible • First testers joined the project But... • Writing automated tests didn't catch on
  29. 29. Creating Quality Processes Stage 1 • A QA group was formed to – Recruit and train testers – Write and program the Fitnesse automated tests – Test new functionality But... • The QA group struggled to keep up with the development effort
  30. 30. Creating Quality Processes Stage 2 • QA group continued to write tests • QA group responsible for running tests • New fixture requests were handed over to the development team at the start of each sprint But... • QA group still needed the system programmers' knowledge to keep tests working • Hard to specify all required fixtures upfront • Programmers hated writing fixtures
  31. 31. Creating Quality Processes Stage 3 • Testers joined the development teams • Testers' responsibility to write tests • Dev team's responsibility to get tests running • Dev team given a test box to run tests on • Weekly QA meeting for testers to share changes and ideas
  32. 32. Creating Quality Processes Stage 3 Highly successful!! After initial teething problems in the first sprint (3 weeks), everyone was positive about the change
  33. 33. What Happened During Stage 3? • DBA sped up tests • We reduced the number of GUI functionality tests required because of good unit test coverage • Many manual testing issues were resolved • Finally testing and development occurred at the same pace • Programmers embraced writing tests as part of their job to maintain quality
  34. 34. Change in the Testers' Role • More about ensuring good specifications to prevent bugs • More testing time spent on exploratory testing • Better relationships with programmers • Less dull work • More influence on how the system is written to make testing easier
  35. 35. Our System Test Toolbox
  36. 36. Our System Test Toolbox • Ask User fixture • Business time • Staged deployment • Separate Functionality, Gui Functionality, Gui Layout, Load, Full System and Sanity tests • Close communication between testers and programmers to find optimal test strategies
  37. 37. Ask User
  38. 38. Ask User • Mix and match human and automated processes • Allows tests to be written and run before all automation is ironed out – Example: Gui testing will eventually be automated with Selenium • User created objects can be referred to in Fitnesse tests • Simple idea but really practical!
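The Ask User idea above can be sketched as a fixture that pauses an automated run and asks a human to confirm a step that is not yet automated, such as a Gui check. This is a hypothetical illustration of the concept, not the deck's actual fixture:

```java
import java.util.Scanner;

// Hypothetical "ask user" fixture: mixes manual and automated steps by
// pausing the test and asking a human to verify something, e.g. a GUI
// page, until that check is automated (with Selenium, say).
public class AskUser {
    private final Scanner in;

    // The input source is injected so a test harness can script answers.
    public AskUser(Scanner in) { this.in = in; }

    public boolean confirm(String question) {
        System.out.println(question + " (y/n)");
        return in.nextLine().trim().equalsIgnoreCase("y");
    }
}
```

The practical payoff the slide describes is that whole end-to-end tests can be written and run immediately, with the human-in-the-loop steps shrinking as automation catches up.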
  39. 39. Business Time • Changes the current time in the system • Allows testing of scenarios that take a long time • Only active in testing mode • Low risk: even if it is left in by mistake, the system still works correctly in the production environment
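A business-time mechanism like the one described is commonly implemented as an injectable clock: production code asks the clock for the time, and tests can jump it forward, e.g. 30 days, to exercise conversion tracking instantly. A minimal sketch, with all names hypothetical:

```java
// Hypothetical "business time" clock. In production the offset is zero,
// so even if the mechanism is left in, behaviour is unchanged (the low
// risk the slide mentions). In testing mode, tests advance the offset.
public class BusinessClock {
    private static long offsetMillis = 0;  // always zero in production

    // All system code reads time through this method, never directly.
    public static long now() {
        return System.currentTimeMillis() + offsetMillis;
    }

    // Only ever called in testing mode: jump business time forward.
    public static void advanceDays(int days) {
        offsetMillis += days * 24L * 60 * 60 * 1000;
    }

    public static void reset() { offsetMillis = 0; }
}
```

In modern Java the same idea is usually expressed by injecting a `java.time.Clock`, but the principle is identical: one seam for time, controlled by the test.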
  40. 40. Staged Deployment • Think beta testing • Test thoroughly, then do a partial release to – only some customers, or – a small proportion of the daily impressions, or – run old and new systems side by side • Gives a more accurate test
  41. 41. Types of Testing Unit – Programmers' tests. Pinpoint bugs quickly. Functionality – Pure, automatic testing of the whole system. All processes triggered. Gui Functionality – Tests the functionality of the Gui, one page at a time. All other objects required are generated as for functionality testing.
  42. 42. Types of Testing Gui Layout – Pure gui test of layout and updating. Load – Uses a real database, or an extra-large database. Full System – Main functionality, with processes on timers, as in production. Sanity Check – Final check performed before each release.
  43. 43. Success Factors
  44. 44. Outcomes • Automated testing of the system takes one hour, plus some quick manual tests • Programmers' attitude has changed from expecting outsiders to validate the system to sharing responsibility for this task • The system is easier to test, deploy, monitor and manage • A self-running QA process which is continually improved by the development teams
  45. 45. Success Factors • Realistic time frame (12 months) • Treated building the automated testing system as a project of its own • Influence over the design of the system to be tested • Started with high-ROI functionality that was hard to test manually • Get tests working, then perfect them • Gently challenged the company culture
  46. 46. Future Improvements? • Automate Gui functionality tests with Selenium • Start sprints with a Specification Workshop • Improve load tests – run them on a cloud • Move system knowledge from Fitnesse back into the system • Make tests more user-oriented
  47. 47. To learn more read Bridging the Communications Gap by Gojko Adzic
  48. 48. References • Fitnesse http://www.fitnesse.org • Bridging the Communications Gap by Gojko Adzic • Test Driven Development is Design – The Last Word on TDD, Hanselminutes podcast with Scott Bellware and Scott Hanselman • Hints and Tips [for writing acceptance tests] by David Peterson http://www.concordion.org/Technique.html
