The document discusses using contract tests to decouple the development and delivery of services that depend on external systems. It proposes:
1) Writing tests that exercise services based on the agreed interactions, or "contract", rather than testing against real external systems, which may be slow, unstable or inaccessible.
2) Using stateful "fakes" that mimic external systems' behaviour, so contract tests run quickly without depending on external services.
3) Sharing contract tests between services so they can validate compatibility before releases and catch issues early.
2. A technique to help teams go faster by decoupling their development and delivery activities from the external systems they depend on
3. A simple delivery pipeline
[Diagram: stages Developers → CI → QA → Staging → Production. Activities per stage: Coding; Build, Test, Package & Version Artefact; Manual Exploratory Testing; Sanity Check; Real Users. A snapshot in time: different versions (v10, v16, v13, v21) sit in different stages.]
4. Two services collaborating
[Diagram: a Consumer and a Provider move through Developers → CI → QA → Staging → Production; the Release Candidates are integration-tested together in the CI environment; versions v21, v18, v14, v13 occupy the later stages.]
5. Integration Tests: A definition
• Collect modules together into a subsystem that provides some larger piece of behaviour
• Check for any incorrect assumptions the modules make about the peers they collaborate with
• Verify interactions between layers of integration code and the external components with which they are integrating
Paraphrasing from Testing Strategies in a Microservice Architecture (martinfowler.com) http://bit.ly/1GKK07d
[Diagram: an application split into UI Layer or API Layer, Domain Layer and Integration Layer; tests exercise and assert against the Integration Layer and the External Component.]
6. Two services collaborating
[Diagram repeated from slide 4: Consumer and Provider release candidates (v21, v18, v14, v13) integration-tested in the CI environment.]
Does this scale?
7. Multiple collaborating services
[Diagram: several services (v21, v18, v14, v13) wait ⌛ for the shared CI integration stage.]
Slow build and test phase; queuing => development delays
Downstream integration testing: a defect in one component fails the release candidate; monolithic deployments are slow and risky; a slow integration phase increases feature lead time.
Separate CI builds
[Diagram: each service has its own CI build, so every pipeline stage (Developers, CI, QA, Staging, Production) holds a different combination of service versions (v22, v11, v92, v58, v14, v90, v40, v13, v12, v18, v91, v46, v21, …), labelled v42, v41, v44, v47 as release candidates. No two environments integrate the same set of versions.]
11. Which version are we integrating with? Does Provider meet Consumer expectations?
[Diagram: Consumer (v26) and Provider; the Consumer holds a Contract describing the agreed interaction, and a Contract Test (v26) that verifies the Provider against it ✓:]

GET /cities HTTP/1.1

[
  { name: "London", country: "UK" },
  { name: "Paris", country: "France" }
]
12. Example

[Test]
public void CitiesEndpointShouldProvideCityNames()
{
    var response = httpClient.Get("/cities");
    var cities = response.Body.As<JsonArray>();
    Assert.That(cities, Has.JsonObjectWith(o => o.name == "London"));
    Assert.That(cities, Has.JsonObjectWith(o => o.name == "Paris"));
}

What will cause this test to fail?
• Write in the language of the interaction (vs the language of the consumer)
• Configure the Provider URL at runtime
• Develop against a shared instance of the real system
• Maintain one contract test suite per external system
13. Flexible contracts
• Be conservative in what you send, be liberal in what you accept (Postel's Law)
• Use the Tolerant Reader pattern
• Evolve APIs with backwards compatibility (no breaking changes)

[
  { name: "London", country: "UK" },
  { name: "Paris", country: "France" }
]
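The Tolerant Reader pattern above can be sketched as follows (a minimal Python illustration, not from the original deck; the `read_cities` name is invented): the consumer extracts only the fields it depends on and ignores everything else, so the Provider can add fields without breaking it.

```python
import json

def read_cities(payload):
    """Tolerant Reader: pull out only the fields this consumer relies on.

    Unknown fields in each city object are silently ignored, so the
    Provider can evolve the response (add fields, reorder keys) without
    breaking this consumer.
    """
    cities = []
    for item in json.loads(payload):
        # Fail only if the fields we actually rely on are missing.
        cities.append({"name": item["name"], "country": item["country"]})
    return cities
```

A newer Provider response that adds, say, a population field still parses: `read_cities('[{"name": "London", "country": "UK", "population": 8900000}]')` yields only the name and country this consumer asked for.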
15. Testing against another environment
If we don't integration test, how does the Provider validate their code?
• Target a mirror of Production if possible
• Gain access to the environment
• Establish a stable "golden dataset" iteratively (per story)
• If Prod, create entities (e.g. users) that are safe to modify
• If Prod, whitelist permitted requests and safe entities
• Consider contention: parallel test runs touch the same entities
• Understand the availability and stability of the environment
17. Sharing contract tests with Providers
• Consumer publishes contract tests as a versioned build artefact ("LATEST")
• After deploy, Consumer labels the Production version of the contract tests ("LIVE")
• Provider can fetch and run the Production contract tests at any point
• Contract tests should be trivial to run and easy to understand when they fail
[Diagram: Content Site and Librarian Portal teams publish "LATEST" and "LIVE" contract tests to a shared Artefact Repository, available across CI, QA, Staging and Production.]
What if a Provider has multiple Consumers?
18. Collecting all your consumers' contracts
[Diagram: Consumers A, B and C each hold their own contract for the Provider's GET /cities interaction, e.g.:

[
  { name: "London", country: "UK" },
  { name: "Paris", country: "France" }
]

The union of the Consumer A, B and C contracts forms the Consumer Driven Contract ✓.]
• The CDC defines the Provider's obligations to its Consumers
• The Provider can use the CDC to understand the impact of changes
20. Sensing danger: watch where you're walking
• Be a good citizen: check before deploying to each shared environment
• Will this consumer work in Production?
• Will this provider break Production?
• Hard gate: abort deployment if a contract test fails
Is this feedback early enough?
21. Sensing danger: read the warning signs
Are we diverging?
• If providers do not run contract tests, consumers should run them periodically
• Run the latest contracts against the latest builds
• Failing contract tests do not necessarily block the pipeline; they provoke conversation
• Early warning system: we're diverging
[Diagram: Content Site, Librarian Portal, User Service and Report Service run each other's latest contracts in CI.]
24. Testing Consumer integration code
[Diagram: the Quotes Site and Customer Portal consume a Policies Service and Core Domain Services. Pain points annotated on the providers:]
• Backed by mainframe; data setup painful
• Slow shared environment
• Impossible to generate some states
• Different tech stack
• Hard to generate some errors
• Relied on other shared systems
• Slow to start; sometimes unavailable
25. What if this isn't easy?
Testing Consumer integration code: we verify the contract at various stages in the pipeline.
• Is the Provider deployed to a shared environment?
• Is the Provider fast to interact with?
• Is the Provider usually available?
• Is it easy to put the Provider into a desired state?
• Is it easy for us to host the Provider?
• Does the Provider talk to other services?
• Is it easy to make the Provider return errors?
26. Avoiding the Provider
[Diagram: in the integration test, the Consumer's client code talks to a stub instead of the real Provider.]
• Tests are only valuable if stubs imitate Provider behaviour
• We observe divergent behaviour
• How do we know when Provider behaviour changes?
• Need to keep stub behaviour up to date
27. Faking it
A Fake is an out-of-process, stateful test double.
1. Inject the golden dataset into the Fake
2. Create a feedback loop using the contract test
3. Provider changes (the contract test against the real Provider fails ✗)
4. We fix the contract test
5. Fix the Fake (the contract test against the Fake passes again ✓)
6. Run the integration tests against the Fake, using the integration test data ✓
28. Some characteristics of Fakes
• Implement just the functionality required to pass the contract test
• Transient: no persistent state
• Same network interaction as the real Provider
• Fast to start
• Easy to deploy
• Can be put into a desired state easily
• Can simulate failures
• Unlike HTTP stub servers, Fakes are stateful and exhibit dynamic behaviour
• Simple embedded HTTP servers, such as: Jersey / Scalatra + Jetty; Nancy + OWIN HttpListener; Express.js + Node
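A minimal sketch of such a Fake (Python standard library only, as an alternative to the embedded servers listed above; the FakeCitiesProvider name and the /test/state injection endpoint are illustrative, not from the deck). It keeps its state in memory, speaks the same HTTP + JSON protocol as the Provider, starts fast, and can be put into a desired state by the test:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class FakeCitiesProvider:
    """A transient, stateful fake of the cities Provider."""

    def __init__(self, port=0):  # port 0 = pick any free port
        state = {"cities": []}   # in-memory only: nothing persists
        self.state = state

        class Handler(BaseHTTPRequestHandler):
            def do_GET(self):
                if self.path == "/cities":
                    # Same network interaction as the real Provider.
                    body = json.dumps(state["cities"]).encode()
                    self.send_response(200)
                    self.send_header("Content-Type", "application/json")
                    self.end_headers()
                    self.wfile.write(body)
                else:
                    self.send_response(404)
                    self.end_headers()

            def do_PUT(self):
                # Test-only endpoint: inject the golden dataset here.
                if self.path == "/test/state":
                    length = int(self.headers["Content-Length"])
                    state["cities"] = json.loads(self.rfile.read(length))
                    self.send_response(204)
                    self.end_headers()
                else:
                    self.send_response(404)
                    self.end_headers()

            def log_message(self, *args):
                pass  # keep test output quiet

        self.server = HTTPServer(("127.0.0.1", port), Handler)
        self.port = self.server.server_address[1]

    def start(self):
        threading.Thread(target=self.server.serve_forever, daemon=True).start()

    def stop(self):
        self.server.shutdown()
```

Because the Fake listens on a real socket, the same contract tests that run against the real Provider can run against it unchanged, just pointed at a different URL.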
29. Leveraging Fakes
• Use as a custom sandboxed environment for exploratory testing by BAs, QAs and XDs
• Use in automated tests to speed them up
• Target the Fake from the Consumer simply by configuring a URL
• Version the Fake with the consumer and contract test
• Always contract-test the Fake before using it in automated tests
• Use in automated tests to ensure the Provider's initial state is explicit and maintainable
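"Target the Fake simply by configuring a URL" can look like this (a hypothetical sketch; the CITIES_PROVIDER_URL variable name and the default URL are invented for illustration):

```python
import os

def provider_base_url(env=None):
    """Resolve the Provider's base URL from configuration.

    Pointing the Consumer at a Fake is then just a matter of exporting
    CITIES_PROVIDER_URL=http://localhost:8081 - no code change needed.
    """
    env = os.environ if env is None else env
    return env.get("CITIES_PROVIDER_URL", "https://cities.example.com")

def cities_endpoint(env=None):
    # All integration code builds its URLs from the configured base.
    return provider_base_url(env).rstrip("/") + "/cities"
```

The same binary can therefore run against the real Provider in Staging and against a local Fake on a developer workstation, selected purely by environment.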
31. Sensing danger throughout the delivery pipeline
[Diagram: contract tests run at every stage, from Developers and CI through QA, Staging and Production.]
So everything's watertight, right?
32. Supporting Practices: minimising surprises
• Short lead time
• Smoke testing
• Deploy small batches
• Canary releases
• Design for failure
33. No Free Lunch
• Establishing and stabilising golden data
• Learning curve
• Avoiding data contention
• Gaining access to environments
• Living with unstable environments
• Infrastructure for sharing contracts
• Packaging contracts for Providers to execute
• Faking failure modes and strictly enforcing request structures
• Dense message structures (e.g. SAML) that require a lot of integration logic
• Pass-through systems that modify outbound or inbound messages
• Faking systems that share state or communicate with each other (e.g. OAuth)
34. In summary
Contract tests may help you release independently, with confidence, if either of the following is true:
• The Service Provider is willing to run your contract tests in their pipeline
• You need to use a realistic fake in your tests because the external system is unavailable, slow or hard to configure
39. Why not use integration tests to check the contract?
• Integration tests written against real systems may be coupled to fixed data in those systems, making tests less readable and harder to maintain
• Integration test failures are reported in terms of our domain, not in the language of the contract: harder for the Provider to diagnose errors
• To remain decoupled, other teams would run our integration tests on demand in their pipelines, but a single integration test might make calls to multiple systems owned by different teams, increasing coupling between pipelines
• Test-driving realistic fakes with integration tests is a case of "the tests talking to themselves"
• Distributing some of our Production code might constitute a security or intellectual property concern
41. Story lifecycle: outside-in (ATDD)
[Diagram: a failing acceptance test ✗ wraps the UI Layer or API Layer, Domain Layer and Integration Layer.]
• Start the Fake before all tests
• Inject state into the Fake
• Exercise the application via the UI / API
• Assert on observed behaviour
• Reset Fake state before each test
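The start/inject/reset lifecycle above maps onto a standard xUnit fixture (sketched here in Python's unittest; the in-process FakeProvider is a simplification of the deck's out-of-process Fake, and all names are illustrative):

```python
import unittest

class FakeProvider:
    """In-process stand-in for the out-of-process Fake, kept in-process
    only so this lifecycle sketch is self-contained and runnable."""

    def __init__(self):
        self.cities = []

    def inject(self, cities):   # put the Fake into a desired state
        self.cities = list(cities)

    def reset(self):            # wipe state between tests
        self.cities = []

class CityAcceptanceTest(unittest.TestCase):
    @classmethod
    def setUpClass(cls):
        cls.fake = FakeProvider()   # start the Fake once, before all tests

    def setUp(self):
        self.fake.reset()           # reset Fake state before each test
        self.fake.inject([{"name": "London", "country": "UK"}])

    def test_city_listing_reflects_injected_state(self):
        # Exercise: a real suite would drive the application via its UI/API;
        # we read the Fake directly to keep the sketch minimal.
        observed = self.fake.cities
        # Assert on observed behaviour
        self.assertEqual("London", observed[0]["name"])
```

Resetting in setUp rather than tearDown means each test states its own preconditions explicitly and cannot depend on leftovers from a previous test.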
42. Story lifecycle: unit tests
• Drive UI / API code with unit tests
• Drive domain code with unit tests
• Stop when you reach the integration layer
[Diagram: unit tests pass ✓ against the UI Layer or API Layer and Domain Layer; the acceptance test still fails ✗.]
43. Story lifecycle: integration tests
• Start the Fake before all tests
• Inject state into the Fake
• Exercise the integration layer
• Assert on the response
• Reset Fake state before each test
• Begin implementing integration code to discover what assumptions we make about the Provider
[Diagram: a failing integration test ✗ drives the Integration Layer; the acceptance test still fails ✗.]
44. Story lifecycle: contract tests
• Write the contract test and run it against the real Provider (requires access from the developer workstation)
• Assert on responses from the Provider in order to check our assumptions
[Diagram: the contract test passes ✓ against the real Provider; the acceptance and integration tests still fail ✗.]
45. Story lifecycle: implementing the Fake
• Start the Fake before all contract tests
• Inject the golden dataset into the Fake so its state matches the real Provider
• Point the contract tests at the Fake to drive out its implementation
• Don't reset Fake state between tests
[Diagram: the contract test passes ✓ against the Fake; the acceptance and integration tests still fail ✗.]
46. Story lifecycle: all tests green
• Run the integration tests against the Fake ✓
• Run the acceptance tests against the Fake ✓
• Start the Fake and inject sample state
• Point the application at the Fake and manually test
• Point the application at the real Provider and manually test
55. Tools
https://github.com/realestate-com-au/pact
• Ruby-based (JVM and .NET versions available)
• Write integration and contract tests together
• Define HTTP request and response
• Simple HTTP + JSON support
• No need to package for Providers
• Complementary tool: Pact Broker
• Does not support stateful transaction testing
• Does not produce a separate fake to use in acceptance or manual tests
Suddenly, we’re required to check components work together
Jon Postel's “robustness principle” from early TCP spec
Public APIs – collect contracts from representative Consumers
Alarm system adjacent to delivery pipeline, not part of it
Requires early access to each other's services
External systems changed at different rates
Radically different combinations of versions in later environments
Later environments locked / torn down, data setup in mainframe system => hard to contract test those environments
Work with data/environment owners
Dense message structures => high coupling
Hard to reason about assumptions you’re making: hard to write and maintain contract tests
Less development overhead to integration test, even if slower to run?
Also known as Functional or E2E test
Start in or out of test process – debugging, tech options, state injection
Alternatively: hand-roll using your favourite unit testing framework, HTTP client and protocol parser (e.g. JSON or XML)