2. What is your experience with requirements?
Obvious things are not that obvious
Imperative requirements
Fulfilling specifications does not guarantee success
3. Meeting the needs with ATDD
[Cycle: Requirement → Acceptance Test → Implementation → Feedback]
Drive implementation of a requirement through a set of
automated, executable acceptance tests
4. ATDD in a nutshell
Real-world examples to build a shared understanding of the
domain
Select a set of these examples to be a specification and an
acceptance test suite
Automate the verification of acceptance tests
Focus the software development effort on the acceptance
tests
Use the set of acceptance tests to facilitate discussion about
future change requests.
5. The ATDD Cycle
[Cycle diagram: Specification Workshop → Feature Example → parallel activities (coding, testing, architecture, documentation, other activities) → Done. Participants: Developers, Testers, Product Owner, Architect, Technical Writers, customer]
Implementation of tests usually occurs before the code
A feature is not done until its tests pass
Activities happen in parallel
The team clarifies and implements the feature together
Tests are executable at the end of the ATDD workshop
6. TDD and ATDD
[Diagram: a Story is specified by Acceptance Tests; each piece is built through repeated TDD cycles; the automated tests feed the Definition of Done]
ATDD is a TEAM practice
Acceptance tests are not a replacement for unit-level testing
7. Benefits of ATDD
Comprehensible examples over complex formulas
Close Collaboration
Definition of Done
Trust and Commitment
Testing on system level
8. Specification by Examples
Use realistic examples to demonstrate differences in
possibilities instead of abstract requirements
Write specifications down as tables
“Spine story” contains the essence of a feature
Edge cases, negative paths
Normal use
Abnormal but reasonable
Abnormal and unreasonable
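The example categories above can be sketched as a small specification table. This is a hedged sketch: `ticket_price` is a hypothetical feature invented here purely to show how concrete rows (normal use, edge cases, abnormal input) read as a specification.

```python
# Hypothetical feature: a ticket-price rule, used to show how concrete
# examples read as a specification table.
def ticket_price(age):
    if age < 0:
        # abnormal and unreasonable input: reject it
        raise ValueError("age cannot be negative")
    if age < 18 or age >= 65:
        return 5   # reduced fare
    return 10      # normal fare

# Each row is one example; together the rows are the specification.
EXAMPLES = [
    (30, 10),  # normal use
    (17, 5),   # edge case: last minor age
    (18, 10),  # edge case: first adult age
    (65, 5),   # edge case: first senior age
]

for age, expected in EXAMPLES:
    assert ticket_price(age) == expected
```

The table form makes gaps visible: a domain expert scanning the rows can spot a missing boundary far more easily than in abstract prose.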
9. Example, Tests, and Spec
[Diagram: Examples elaborate Requirements; Examples can become Tests; Tests verify Requirements]
11. Good Acceptance Tests
Focused on one thing
Not a script
Self-explanatory
SMART
Specific - explicitly defined and definite
Measurable - observable and quantifiable
Achievable - realistic scenario
Relevant - related to a particular story
Time-bound - observed almost instantly
12. Specification Workshop
Ask the domain experts
Developers and testers should suggest examples of edge cases
or important issues for discussion
Ubiquitous language
Organise feedback to ensure shared understanding
Use facilitator to stay focused if needed
13. Distilling the specifications
Write tests collaboratively
Examples in a form close to what your automation tool can
understand
Keep tests in a form that is human-readable
Specification over scripting: describe WHAT, not HOW
Acceptance tests aim to prevent defects, not to discover them
Automation is not strictly necessary
Acceptance tests only work when we can discuss them
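"Describe WHAT, not HOW" can be illustrated with a small sketch. `ShopSystem` and its methods are hypothetical, standing in for a high-level "keyword" layer that hides the UI mechanics a scripted test would spell out.

```python
# Scripted style (HOW) would read step by step, tied to the UI:
#   open_browser(); fill("item", "book"); fill("qty", "2"); click("order")
# Specification style (WHAT) reads like the requirement itself:
class ShopSystem:
    def __init__(self):
        self.orders = []

    def place_order(self, item, quantity):
        # In a real system this keyword would drive the application;
        # here it only records the order so the sketch is runnable.
        self.orders.append((item, quantity))

    def order_count(self):
        return len(self.orders)

def test_placing_an_order_is_recorded():
    shop = ShopSystem()
    shop.place_order("book", 2)      # WHAT the user does
    assert shop.order_count() == 1   # the observable outcome

test_placing_an_order_is_recorded()
```

Because the test names the business action rather than the clicks, it stays readable to the whole team and survives UI changes.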
14. Some considerations
User Interface: easy? fragile? performance issues?
Boundary of stubs: sufficiently close? simulators?
Business logic: not from the developer's perspective
15. Acceptance Test smells
Long tests
Parameters of calculation tests that always have the
same value
Similar test with minor differences
Tests that reflect the way code was written
Tests fail intermittently even though you didn’t change
any code
Interdependent tests, e.g. one test acting as setup for others
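The interdependence smell has a simple cure: give every test its own fresh fixture. This is a toy sketch with an invented `Account` class, not a prescription for any particular framework.

```python
# Smell: one test silently relies on state another test left behind.
# Fix sketch: each test builds its own fresh fixture.
class Account:
    def __init__(self):
        self.balance = 0

    def deposit(self, amount):
        self.balance += amount

def fresh_account():
    # One fresh object per test removes any hidden ordering dependency.
    return Account()

def test_deposit_increases_balance():
    account = fresh_account()
    account.deposit(100)
    assert account.balance == 100

def test_new_account_starts_empty():
    account = fresh_account()   # passes regardless of test order
    assert account.balance == 0

# Run in either order; both pass because neither relies on the other.
test_new_account_starts_empty()
test_deposit_increases_balance()
```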
16. Change
Use existing acceptance tests to discuss future
changes
When a test fails, seek advice from the customer to determine
whether it specifies obsolete functionality
Automate periodic execution of regression
tests with CI
Keep tests in the same version control as code
17. References
Bridging the Communication Gap
Gojko Adzic
Practical TDD and ATDD for Java Developers
Lasse Koskela
Agile Testing
Lisa Crispin and Janet Gregory
18. Robot Framework
http://www.robotframework.org
Keyword-driven test automation framework
Test libraries are implemented in either Python or Java
Test cases are written in tabular format and saved in HTML or
TSV files
Syntax similar to natural language
Users can create new keywords from existing ones and
contribute to the project
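A custom keyword can be sketched as a plain Python class: Robot Framework maps public methods of a library class to keywords (e.g. `add_numbers` becomes the keyword `Add Numbers`). The class and keyword names below are invented for illustration.

```python
# Minimal sketch of a Robot Framework test library in Python.
class CalculatorLibrary:
    def add_numbers(self, a, b):
        # Robot passes arguments as strings by default, so convert.
        return int(a) + int(b)

    def result_should_be(self, actual, expected):
        # Keywords report failure by raising an exception.
        if int(actual) != int(expected):
            raise AssertionError(f"{actual} != {expected}")

# A matching test case in Robot's tabular plain-text format might read:
#   *** Test Cases ***
#   Addition
#       ${sum} =    Add Numbers    2    3
#       Result Should Be    ${sum}    5
```

Because keywords are just methods, the same natural-language syntax the slides describe maps directly onto ordinary code the team already knows how to write and test.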