EuroS|tar webinar 30 Jan 2017
info@dorothygraham.co.uk
© Dorothy Graham 2017
www.DorothyGraham.co.uk
www.TestAutomationPatterns.org
Are your tests well-travelled? Thoughts about coverage
Prepared and presented by
Dorothy Graham
email: info@dorothygraham.co.uk
www.DorothyGraham.co.uk
EuroStar Webinar
© Dorothy Graham 2017
Contents
Analogy with travelling
What is coverage?
Should testing be thorough?
What coverage is not (often mistaken for)
The four caveats of coverage
The question you should ask
Twitter: @DorothyGraham
Where on earth have you been?
• have you seen a lot of the world?
– are you “well-travelled”?
• what does it mean if you say “yes”?
– what does it mean if you say “85%”?
– or “100%”?
Scratch map 1 (unscratched)
scratch off places I have been – cities?
can you spot the 74 cities I’ve visited?
hint: look on the East side of the country
Where is Grand Rapids?
Cities aren’t that impressive on the map – how about States?
Only 4 US States I haven’t been to – plus Hawaii
country
coverage
Some travel metrics
• Have you been to
– every street in the
place where you live?
– every city/town?
– every state/province?
– every country in the
world?
– every continent?
• where haven’t you
been?
Contents
Analogy with travelling
What is coverage?
Should testing be thorough?
What coverage is not (often mistaken for)
The four caveats of coverage
The question you should ask
Twitter: @DorothyGraham
Travelling
• have you seen a lot of the world?
– are you “well-travelled”?
• what does it mean if you say “yes”?
– what does it mean if you say “85%”?
– or “100%”?
Test coverage
• have your tests seen a lot of the system?
– do you have “good coverage”?
• what does it mean if they say “yes”?
– what does it mean if they say “85%”?
– or “100%”?
Some travel metrics
• Have you been to
– every street in the
place where you live?
– every city/town?
– every state/province?
– every country in the
world?
– every continent?
• where haven’t you
been?
• Have your tests been to
– every statement / decision
/ branch?
– every data combination?
– every error message?
– every menu option?
– every program / function?
– every user story option?
• where haven’t your tests
been?
What is coverage?
[diagram: “the tests” drawn as a region overlapping part of “the system”]
this part of the system has been covered by these tests
the rest has not been covered by these tests
What is coverage?
[diagram: a larger set of tests overlapping more of the system]
these tests give more coverage than the previous set of tests
Tests giving 100% coverage
[diagram: even more tests, covering all of the system]
Great – we’ve tested all of the system! – or have we?
Tested everything?
[diagram: the same tests, shown against several overlapping views of the system]
100%? – of what?
modules, statements, branches, states, data, menu options, functions, business rules, user stories
Statement vs decision coverage

  Statement numbers:
    1  read(a)
    2  IF a > 6 THEN
    3      b = a * 2
    4  ENDIF
    5  print b

  Test Case   Input   Expected Output
  A           7       14

  Test Case   Path Taken      Decision Outcome   Decision Coverage   Statement Coverage
  A           1, 2, 3, 4, 5   True               50%                 100%

Great – 100% tested, right?
Statement vs decision coverage

  Statement numbers:
    1  read(a)
    2  IF a > 6 THEN
    3      b = a * 2
    4  ENDIF
    5  print b

  Test Case   Input   Expected Output
  A           7       14
  B           3       3

  Test Case   Path Taken      Decision Outcome   Decision Coverage   Statement Coverage
  A           1, 2, 3, 4, 5   True               50%                 100%
  B           1, 2, 4, 5      False              50%                 80%
  Both                                           100%                100%
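As a sketch, the pseudocode above can be translated into Python (an assumed rendering, with instrumentation inserted by hand) to show how Test A alone gives 100% statement coverage but only 50% decision coverage, and how adding Test B brings both to 100%:

```python
# Assumed Python rendering of the pseudocode, instrumented by hand to
# record which statements and decision outcomes each test case executes.
def program(a, trace):
    b = None                          # stands in for b being unset before the IF
    trace["stmts"].update({1, 2})     # 1: read(a), 2: IF a > 6
    if a > 6:
        trace["decisions"].add(True)
        trace["stmts"].add(3)         # 3: b = a * 2
        b = a * 2
    else:
        trace["decisions"].add(False)
    trace["stmts"].update({4, 5})     # 4: ENDIF, 5: print b
    return b

trace = {"stmts": set(), "decisions": set()}
program(7, trace)                         # Test A only
print(len(trace["stmts"]) / 5 * 100)      # 100.0 -> statement coverage
print(len(trace["decisions"]) / 2 * 100)  # 50.0  -> decision coverage

program(3, trace)                         # add Test B
print(len(trace["decisions"]) / 2 * 100)  # 100.0 -> both measures now 100%
```

Real coverage tools instrument the code automatically, but the arithmetic is the same: covered items divided by total items, per kind of item.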
ISTQB definitions
• coverage
– the degree, expressed as a percentage, to which
a specified coverage item has been exercised by
a test suite
• coverage item
– an entity or property used as a basis for test
coverage, e.g. equivalence partitions or code
statements
ISTQB Glossary v 1.3
What is coverage?
• coverage is a relationship
– between a set of tests
– and some countable part of the system
• 100% coverage is not 100% tested
– 100% of some countable things – among several
types of countable things
• an objective measurement of some aspect of
thoroughness
– if thoroughness is what you want!
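The ISTQB definition above can be sketched as a small Python function (illustrative only; the function name and the example numbers are my own, not from the slides):

```python
# Minimal sketch of the ISTQB definition: coverage is the percentage of
# specified coverage items exercised by a test suite.
def coverage_percent(exercised, coverage_items):
    """Percentage of coverage_items hit by at least one test."""
    hit = set(exercised) & set(coverage_items)
    return 100.0 * len(hit) / len(coverage_items)

# statements 1-5; the suite exercised statements 1, 2, 4 and 5
print(coverage_percent([1, 2, 4, 5], [1, 2, 3, 4, 5]))  # 80.0
```

The same formula applies whatever the countable items are: statements, branches, menu options, or user stories – only the denominator changes.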
Contents
Analogy with travelling
What is coverage?
Should testing be thorough?
What coverage is not (often mistaken for)
The four caveats of coverage
The question you should ask
Twitter: @DorothyGraham
Thoroughness of testing
• should testing be thorough? why?
• is testing like butter or like strawberry jam?
Testing is like butter?
• spread it evenly over
the bread
• same thickness
throughout
• no part not covered
• every part of the
software is tested to the
same extent
Testing is like strawberry jam?
• thicker in some
places than others
• big lumps
• some parts not
covered
• should some parts of
the software be
tested more than
others?
Breadth or depth? the coverage illusion
[diagram: wide shallow testing vs deep lumpy testing, with major and minor bugs scattered throughout]
Breadth / width is coverage
Depth / lumpy testing is selective
“I’ve covered / tested everything – haven’t missed anything!”
An illusion, a trap
What is better testing?
What’s the goal for testing?
• Width
– every part has been
tested once
– may be required by
regulatory bodies
– wide view, no area
untouched
– may miss something
• Selected depth
– not all parts of the
system are equally
important or equally risky
– focus on where testing
brings greatest value
– deep view, concentrate
on critical parts
– may miss something
Contents
Analogy with travelling
What is coverage?
Should testing be thorough?
What coverage is not (often mistaken for)
The four caveats of coverage
The question you should ask
Twitter: @DorothyGraham
Coverage is NOT
[diagram: the tests run, measured against the tests themselves – not against the system]
“we’ve run all of the tests” [that we have thought of]
this is test completion! don’t call it “coverage”!
Contents
Analogy with travelling
What is coverage?
Should testing be thorough?
What coverage is not (often mistaken for)
The four caveats of coverage
The question you should ask
Twitter: @DorothyGraham
1 A single measure is only one level of coverage out of many
• what is the “right” level of coverage?
– e.g. city, state, country?
– statement, menu options, user stories?
– 100%? 80%? See what you have tested / missed?
[diagram: several overlapping views of the system, each a different coverage level]
2 Only need one test to cover
• to cover Australia or Brazil
– only visit one city per country
• to cover a “coverage element”
(statement, decision outcome,
menu option)
– only one test per element
• might be hundreds of ways that
element is used
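A hypothetical one-statement function (mine, not from the slides) makes this caveat concrete: a single test “ticks the box” for the statement, yet other inputs that use the same statement can still fail.

```python
# One statement, so a single test gives 100% statement coverage.
def scale(total, count):
    return total / count

print(scale(10, 2))  # 5.0 -- this single test already "covers" the statement
# ...but scale(10, 0) raises ZeroDivisionError:
# the element was covered, not exhaustively tested.
```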
3 Not related to quality of the software

  Statement numbers:
    1  read(a)
    2  IF a > 6 THEN
    3      b = a * 2
    4  ENDIF
    5  print b

  Test Case   Input   Expected Output
  A           7       14
  B           3       3

  Test Case   Path Taken      Decision Outcome   Decision Coverage   Statement Coverage
  A           1, 2, 3, 4, 5   True               50%                 100%
  B           1, 2, 4, 5      False              50%                 80%

what gets printed?
3 Not related to quality of the software
• we still have decision coverage, even though
the test fails (a bug)
• statement coverage alone wouldn’t have found the bug
• client story
– poor quality software from 3rd party
– required decision coverage (80%?)
– 3rd party got a tool, demonstrated coverage
– the software was still rubbish
• tests didn’t pass, lots of bugs
• but the tests exercised the required decisions
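A faithful Python rendering of the earlier pseudocode (an assumed translation) shows why it is the tests’ outcomes, not their coverage, that tell you about quality:

```python
# Assumed faithful translation of the pseudocode: b is only assigned
# on the True branch, so the False path trips over the latent bug.
def buggy_program(a):
    if a > 6:
        b = a * 2
    return b  # for a <= 6, b was never assigned

print(buggy_program(7))  # 14 -- Test A passes
# buggy_program(3) raises UnboundLocalError: Test B exercises the False
# decision outcome and exposes the bug -- coverage achieved, quality not.
```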
4 Coverage of what exists, not what should exist
[diagram: two overlapping circles – “the system as built” and “the system as needed”]
white-box coverage measures only the system as built
what is needed but not built is not tested, even with 100% coverage
4 Coverage of what exists, not what should exist
• map didn’t show some cities I have visited
– are they important?
• coverage does not show missing functions or
features
– are they important?
• what about requirements coverage?
– good idea, but still only test the requirements you
listed
Coverage traps
• the four coverage caveats
– only one level / aspect of thoroughness
– only needs one test to “tick the box”
– not related to how good the tests or software are
– only what is there, not what’s missing
• 100% coverage is NOT 100% tested!
• “coverage” feels comfy, re-assuring
– insurance – you’re covered
– you think you haven’t missed anything
Contents
Analogy with travelling
What is coverage?
Should testing be thorough?
What coverage is not (often mistaken for)
The four caveats of coverage
The question you should ask
Twitter: @DorothyGraham
Have you heard:
“We need to increase our coverage”
“Make sure you cover 100%!”
“What coverage are we getting?”
“If we automate, we will get better coverage”
“We need as much coverage as possible”
Next time you hear:
• “we need coverage”, ask: “of what”?
– exactly what countable things need to be
covered by tests?
– why is it important to test them [all]?
– how “deeply” should we cover things?
• would testing be more effective if lumpy, not smooth?
• what can we safely not test (this time)?
– we always miss something
• better to miss it on purpose than fool yourself into
thinking you haven’t missed anything
Summary
• coverage is a relationship
– between tests and what-is-tested
• coverage is not:
– test completion (my tests tested what they tested)
– 100% tested – only in one dimension
• beware the coverage traps
• when you hear “coverage”
– ask “of what”