Deprogramming the cargo cult of testing
Mike Talks
Datacom
Copyright © Datacom New Zealand Limited 2013 Wednesday, 9 September 15
Who are we?
A long time ago …
• RealMe developed as a "classic" V-model
• 18 months between inception and delivery
• 6 weeks of system testing alone
Being a “classic” V-model
(V-model diagram: System Test → UAT)
As RealMe matured …
DIA needed to be more responsive to their customers' needs
Rethinking the model
Going agile …
Where does this
leave testing?
Take what
you’ve always
done …
The ultimate temptation
This is the Cargo Cult trap
• Project context
changes
• Use same
procedures
• Expect same
result
In a nutshell
Waterfall
• Deliver EVERYthing
• 12 months from inception
• 6 weeks for testing
• 7 testers
Agile
• Deliver SOMEthing
• 2 weeks from inception
• 2 weeks for testing
• 2 testers
As a test
professional …
How do you
begin the
process of
change?
Initiate change
By requiring a change in contract with your customer?
Initiate change
By mandating a new set of test processes?
Initiate change
Implementing change secretly, hoping no-one notices …
How the deprogramming
begins
Looking at the model we had
Requirements → Test Strategy → Test Plan → Test Scripts → Test Reporting
How can this model go off the
rails?
Derail #1: Lead time
Derail #2: Requirement
change
Derail #3: Reviewing &
Collaboration
Time for introspection
Test Strategy
• How do we go about delivering testing?
Test Plan
• What are we going to test?
• Who is going to test?
• How long is it going to take?
Test Script
• What testing are we going to do?
• What requirement/feature are we testing?
• What testing have we done?
• How do new testers learn/test on the system?
Test Reporting
• What testing is complete?
• What problems have there been? [Defects]
• What testing have we done?
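As a rough illustration of the deliverable-to-question mapping above (the names come straight from the slides, the code structure is my own sketch), it can be written as a simple lookup and then inverted, which makes the talk's point visible: some questions are answered by more than one document, so the questions are the stable thing, not the deliverables.

```python
# Deliverable -> the questions it answers (names taken from the slides).
deliverables = {
    "Test Strategy": ["How do we go about delivering testing?"],
    "Test Plan": ["What are we going to test?",
                  "Who is going to test?",
                  "How long is it going to take?"],
    "Test Script": ["What testing are we going to do?",
                    "What requirement/feature are we testing?",
                    "What testing have we done?",
                    "How do new testers learn/test on the system?"],
    "Test Reporting": ["What testing is complete?",
                       "What problems have there been? [Defects]",
                       "What testing have we done?"],
}

# Invert the mapping: question -> the documents that answer it.
answered_by = {}
for doc, questions in deliverables.items():
    for q in questions:
        answered_by.setdefault(q, []).append(doc)

# Questions covered by more than one deliverable.
shared = [q for q, docs in answered_by.items() if len(docs) > 1]
print(shared)  # "What testing have we done?" appears under both Script and Reporting
```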
That’s a lot of questions …
Looking at it another way
The deliverables don't have as much value … as the questions they answer.
To Make Agile Testing Work
…
Four
fundamental
aims
Aim #1
Deliver
real
change
Aim #2
Processes
lightweight but
address needs
Aim #3
Review
processes
each sprint
Aim #4
Encourage
transparency
between
Datacom & DIA
Use Exploratory Testing?
What testing are we
going to do? What testing have
we done?
Common worries …
There’s always that
one person asking …
How will you
know what to test
without your
scripts?
From Jon & James Bach …
•  “We wanted to be
accountable for our
work”
•  “We wanted to give
status reports that
reflected what we
actually did”
Session Based Test
Management
•  The heart of what
we did.
•  But needed some
customisation
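The talk doesn't show Datacom's customised session format, but the core of session-based test management can be sketched as a small record: a charter (the mission), a tester, a time box, plus notes and bugs captured as you go. A minimal illustration in Python, with all field names assumed rather than taken from the actual template:

```python
from dataclasses import dataclass, field
from datetime import timedelta

@dataclass
class TestSession:
    """One time-boxed exploratory testing session (SBTM-style).

    Field names are illustrative, not Datacom's actual session sheet.
    """
    charter: str                                  # mission: what area/risk to explore
    tester: str
    duration: timedelta = timedelta(minutes=90)   # the time box
    notes: list = field(default_factory=list)     # "what testing have we done?"
    bugs: list = field(default_factory=list)      # "what problems have there been?"

    def summary(self) -> str:
        return (f"{self.charter} [{self.tester}]: "
                f"{len(self.notes)} notes, {len(self.bugs)} bugs")

# Example: recording one session against a (hypothetical) login feature.
session = TestSession(charter="Explore RealMe login error handling", tester="MT")
session.notes.append("Tried invalid username formats; error text consistent")
session.bugs.append("DEF-101: lockout message shows raw error code")
print(session.summary())
```

The point of the record is accountability: the charter says what you set out to do, the notes and bugs say what actually happened, which is exactly the status-reporting concern the Bachs describe.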
Session Based Test
Management
How We Test On RealMe
Warning!
Tailored to fit
Testing Cycle 1: Planning
Testing Tasks
What are we going
to test?
Testing Cycle 2:
Expanding Testing Tasks
(pt1)
Boundaries?
Browsers?
Related
areas?
What testing are we
going to do?
Testing Cycle 2:
Expanding Testing Tasks
(pt2)
Testing Cycle 3: Doing
The Testing (pt1) -
Assignment
What requirement/
feature are we
testing?
Testing Cycle 3: Doing
The Testing (pt2) -
Recording
What testing have
we done?
Testing Cycle 3: Doing
The Testing (pt3) –
Problems!
What problems have
there been?
[Defects]
Testing Cycle 3: Doing
The Testing (pt4) – Peer
review
What testing have
we done?
Testing Cycle 4: Updating
the board
What testing is
complete?
What testing have
we done?
Testing Cycle 5: And right
at the end …
So what’s left?
Who is going to test?
How long is it going
to take?
But what about …
How do new testers
learn/ test on the
system?
Deliverables?
Mind maps = Plan
Recordings =
Proof of testing
Deliverables?
Sprint board =
Progress Report /
Defect Tracking
And how’s it
working out?
• More engaged test team
• Happy customer
• Making testing visible to whole team
We’re
replicating the
success
The temptation is still
there
The trap is still there …
We need to remember…
Individuals and
interactions over
processes and
tools
But we’re getting there …
Summary #1
Relationships matter!
Summary #2
Focus on needs, not
deliverables
Summary #3
Can what you do be made
more visible?
Mike Lowery
Agile Coach
Nomad 8
Thanks for listening…
Mike Talks
Datacom
Question – Manual
regression
Use of "character card" personas
Question – Automated
checking with Selenity
Datacom software
• Uses Selenium drivers
• Part of build checks
• Delivers less buggy code to test env
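Selenity's internals aren't shown in the talk, but the idea of a Selenium-driven build check can be sketched: run a handful of URL/title smoke checks and fail the build before broken code reaches the test environment. This is an assumed illustration, not Selenity's actual API; a stub driver stands in for a real Selenium WebDriver so the check logic is visible without a browser (a real run would swap in `selenium.webdriver`, whose `get`, `title`, and `quit` the stub mimics). The URLs and titles are hypothetical.

```python
class StubDriver:
    """Minimal stand-in for a Selenium WebDriver (get/title/quit only)."""
    def __init__(self, pages):
        self.pages = pages          # url -> page title the "browser" would see
        self.title = ""
    def get(self, url):
        self.title = self.pages.get(url, "404 Not Found")
    def quit(self):
        pass

def smoke_check(driver, checks):
    """Visit each (url, expected-title-fragment) pair; return a list of failures."""
    failures = []
    try:
        for url, expected in checks:
            driver.get(url)
            if expected not in driver.title:
                failures.append(f"{url}: expected '{expected}', got '{driver.title}'")
    finally:
        driver.quit()                # always release the browser
    return failures

# Build step: an empty failure list means the build can proceed.
driver = StubDriver({"https://example.test/login": "Login - RealMe"})
failures = smoke_check(driver, [("https://example.test/login", "Login")])
print("PASS" if not failures else failures)
```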