Approval Tests in Action
A LEGO Exercise and an Experience Report
David Kane
Solution Architect and Coach
General Dynamics Information Technology
Twitter: @ADavidKane
LinkedIn: david-kane-agile
George Paci
Senior DevOps Engineer
MAXIMUS Federal
Email: gpaci@tiac.net
Agile DC
October 15th, 2018
2 | https://www.agiledc.org
Exercise: Round 1
• Build a person using LEGO
pieces with the following
criteria
 Use at least 5 pieces
 Use no more than 10 pieces
 Have a distinct head
 Have two distinct arms
 Have two distinct legs
 Arms should look similar to
each other
 Legs should look similar to
each other
 Legs, Arms and Head should
look different
• Each person builds their own
figure
3 | https://www.agiledc.org
Exercise: Round 2
• Using other LEGO bricks,
write as many tests as you
can to validate that you have
satisfied the criteria laid out
in Round 1
 Use at least 5 pieces
 Use no more than 10 pieces
 Have a distinct head
 Have two distinct arms
 Have two distinct legs
 Arms should look similar to
each other
 Legs should look similar to
each other
 Legs, Arms and Head
should look different
4 | https://www.agiledc.org
Exercise: Round 3
• Make a copy of your solution
from Round 1
• Visually inspect your copy to confirm that it satisfies the Round 1 criteria
• Compare your Round 3 copy with your Round 1 solution to verify that they are equivalent
5 | https://www.agiledc.org
Discussion
6 | https://www.agiledc.org
Approval Tests
Created By
Llewelyn Falco
7 | https://www.agiledc.org
• Write test code that interacts with
your system
• Capture the results as a String
 Inspect the string to verify it is
correct
 Save it as a reference
• Subsequent tests compare new
results against the reference string
 Match: Passes
 Difference: Fails
• Diff Tools highlight changes when
a test fails
• Inspect failures
 Change the application code?
 Use the new result file as the
approval reference
Approvals Extend Unit Test Frameworks
Available for many languages
including:
Java, .Net, PHP, Python,
JavaScript, C++
Characterization Tests
Test by Example
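The approve/compare loop described above can be sketched in a few lines of Python. This is a hypothetical minimal helper, not the actual ApprovalTests implementation (the real libraries add reporters, diff-tool launching, and test-naming conventions), assuming results are captured as plain text files:

```python
from pathlib import Path

def verify(test_name: str, received: str) -> None:
    """Minimal approval check: compare the received text against the
    stored .approved file; on a mismatch, save a .received file for
    diffing and fail the test."""
    approved_file = Path(f"{test_name}.approved.txt")
    received_file = Path(f"{test_name}.received.txt")

    approved = approved_file.read_text() if approved_file.exists() else None
    if received == approved:
        # Match: the test passes; clean up any stale .received file.
        received_file.unlink(missing_ok=True)
        return

    # Difference (or first run): write the new result so a diff tool
    # can show the change, then fail the test.
    received_file.write_text(received)
    raise AssertionError(
        f"Result differs from {approved_file}; inspect {received_file} "
        "and either fix the code or approve the new output."
    )
```

To approve a new result, you copy the `.received` file over the `.approved` file, which is exactly the "use the new result file as the approval reference" step above.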
8 | https://www.agiledc.org
Example – Starting
Approved
Received
9 | https://www.agiledc.org
Example – Accept at Approved
Approved
Received
10 | https://www.agiledc.org
Example – Test Passes, No Display Needed
11 | https://www.agiledc.org
Example – Approval Test Detects a Bug
Differences Detected
12 | https://www.agiledc.org
Example – This One is a New Feature
Differences Detected
13 | https://www.agiledc.org
Approval Test Code Example
Very Little Test Code
Expected Outcome in Approval File
14 | https://www.agiledc.org
Typical Unit Test Code
More Verbose
Harder to Read
15 | https://www.agiledc.org
Approval Test Benefits
• Tests that are easier to understand
• Write tests for code that is not easily testable
16 | https://www.agiledc.org
Approval Test Applicability
• New Projects
 Use approvals to externalize your expected test behavior
 You can reduce your test code footprint and improve the readability of your expected results
• Legacy Projects
 Look for interfaces you can access programmatically with no or minimal system changes
 Identify how you can most easily express your system's output as a string
 Use these answers to bootstrap your automated testing with approvals
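For the "express your system's output as a string" step, one common approach (assuming the output is JSON-serializable; the payload shape below is made up) is deterministic serialization, so the approved text stays byte-identical across runs:

```python
import json

def as_approval_string(payload) -> str:
    # Sorting keys and fixing the indent makes the rendering
    # deterministic; unordered data coming back from an external
    # system could otherwise cause spurious approval failures.
    return json.dumps(payload, indent=2, sort_keys=True)
```

The same idea applies to tabular output such as spreadsheet contents: dump rows in a fixed order to text before approving.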
17 | https://www.agiledc.org
We Recently Used Approval Tests to Bootstrap Automated Testing
• Legacy code base had limited automated testing
• Introducing strong unit tests required refactoring the code
• Did not want to refactor without automated tests
• Approval Tests provided a path out of this conundrum
18 | https://www.agiledc.org
Approval Tests Results
• We had many existing interfaces in our system that could be used without application modifications and that fit into the approval framework
 JSON
 Excel
• We don’t need pages of test code filled with “AssertEquals” statements
• Our tests have become much more sensitive to application code changes
• We have gone from 6% test coverage to 46% test coverage in a few months; our key areas of focus have coverage over 75%
 We have sufficient coverage that we can now refactor safely
 We are now able to start writing code and tests with better isolation
19 | https://www.agiledc.org
Approval Tests Weaknesses
• Our Approval Tests lack isolation
 Because many tests execute some of the same code, a single problem in shared code can cause many tests to fail
 Many existing interfaces had call stacks that extended into our database
o Relatively slow
o Meant we had to deal with database management challenges
 This is not a problem of the approval framework per se, but of the fact that we are using broad-based interfaces against which to execute the tests
• Not suitable for Test-Driven Development (TDD)
20 | https://www.agiledc.org
Questions?
More Information
http://approvaltests.com/
https://github.com/approvals
22 | https://www.agiledc.org
Session Feedback
• How was today’s session?
• Green: Great session, I am glad I
came
• Yellow: Meh
• Red: I wish I had spent my time doing something else
• Do you have any suggestions,
comments, or concerns for the
speakers or the organizers?


Editor's Notes

  • #3 5 minutes
  • #4 5 minutes. Questions for discussion: Were you able to test all of the properties? – Typically no. If the original was designed with testing in mind, it might be easier (e.g., if you standardized bricks, you could probably come up with a test to validate the piece limits). If you look at the tests in isolation, do they give you an understanding of what the original model looks like? – Typically no. Many of the tests, while clever, can be obtuse. Were the tests easy to make? – Some more so than others.
  • #5 5 minutes. Questions for discussion: Are you able to validate all of the criteria? – Typically yes. Sometimes folks have difficulty matching the pieces exactly. This can happen in practice (e.g., code that generates a timestamp might look a little different from run to run). Do these “tests” provide more clarity about what the target is supposed to be? – Yes; they look the same. Were these tests easier to make? – Unless someone had to hunt for weird pieces, yes, this test was easier.
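The timestamp caveat in this note is usually handled by "scrubbing" the captured string before comparing it to the approved version. A minimal sketch (the ISO-8601 timestamp format here is an assumption; real output may need a different pattern):

```python
import re

# Replace ISO-8601-style timestamps (e.g. "2018-10-15T09:30:00") with a
# fixed token so the captured text is stable from run to run.
TIMESTAMP = re.compile(r"\d{4}-\d{2}-\d{2}T\d{2}:\d{2}:\d{2}")

def scrub(text: str) -> str:
    return TIMESTAMP.sub("<timestamp>", text)
```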
  • #6 Do these kinds of challenges echo what you see in testing software? This LEGO exercise is a metaphor for testing software. The second round represents how we typically approach unit testing. The third round represents an Approval Tests-style approach to testing software.