MJ
Half-day Tutorials
5/5/2014 8:30:00 AM
Exploratory Testing Explained
Presented by:
Paul Holland
Testing Thoughts
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
Paul Holland
Testing Thoughts
An independent software test consultant and teacher, Paul Holland has more than sixteen
years of hands-on testing and test management experience, primarily at Alcatel-Lucent
where he led a transformation of the testing approach for two product divisions, making
them more efficient and effective. As a test manager and tester, Paul focused on
exploratory testing, test automation, and improving testing techniques. For the past five
years, he has been consulting and delivering training within Alcatel-Lucent and externally to
companies such as Intel, Intuit, Progressive Insurance, HP, RIM, and General Dynamics.
Paul teaches the Rapid Software Testing course for Satisfice. For more information
visit testingthoughts.com.
Copyright © 1995-2014, Satisfice, Inc.
Paul Holland, Doran Jones, Inc.
pholland@doranjones.com
www.doranjones.com
My Background
Managing Director, Testing Practice at Doran Jones
Independent S/W Testing consultant 4/2012 - 3/2014
16+ years testing telecommunications equipment and
reworking test methodologies at Alcatel-Lucent
10+ years as a test manager
Presenter at STAREast, STARWest, Let’s Test,
EuroSTAR and CAST
Facilitator at 50+ peer conferences and workshops
Teacher of S/W testing for the past 6 years
Teacher of Rapid Software Testing
Military Helicopter pilot – Canadian Sea Kings
Let’s Do ET! Ready…?
The Roadmap
Want to learn how to test?
OKAY LET’S GO!!!
(To do ET well, begin by doing it poorly.)
Testing is…
acquiring the competence, motivation, and credibility to…
create the conditions necessary to…
evaluate a product by learning about it through experimentation, which includes to some degree: questioning, study, modeling, observation and inference…
…so that you help your clients to make informed decisions about risk.
This includes operating a product to check specific facts about it. Call this “Checking”, not Testing:
Observe: interact with the product in specific ways to collect specific observations.
Evaluate: apply algorithmic decision rules to those observations.
Report: report any failed checks.
(Diagram: a map of testing activities, including: Tacit Test Procedures, Consistency Oracles, Prospective Testing, Learning and Teaching, Commitment Management (inc. estimation), Recruiting Helpers, Managing Testing Logistics, Test Tooling and Artifact Development, Test Framing, Bug Advocacy & Triage, Project Post Mortem, Creating Archival Documentation, Guiding Helpers, Discovery of Curios, Issues & Risks, Building the Test Team, Designing Checks and Tests, Playing with the Product, Studying Results, Galumphing, Configuring Product & Tools, Schedule Management, Study Customer Feedback, Relationship Building, Making Failure Productive, Sympathetic Testing, Maintaining Personal Health and Motivation, Team Leadership, Quasi-Functional Testing, Playing Programmer, Testing w/Simulated Conditions, Testing a Simulation, Creating the Test Lab, Studying Specs, Managing Records, Playing Business Analyst, Opposition Research, Testability Advocacy, Cultivate Credibility.)
Testing vs. Checking
TESTING (think “what testers do”): the process of evaluating a product by learning about it through experimentation, which includes to some degree: questioning, study, modeling, observation and inference.
CHECKING (think “fact checking”): the process of making evaluations by applying algorithmic decision rules to specific observations of a product.
Exploratory Testing Is…
an approach to testing…
that emphasizes the personal freedom and
responsibility of each tester to continually
optimize the value of his work…
by treating learning, test design, test
execution and result evaluation as mutually
supportive activities that run in parallel
throughout the project.
(applicable to any test technique)
(optimize how?)
Exploration is Not Just Action
(Diagrams: value-seeking “arrows” and task-performing “cycles”; you can put them together.)
QUIZ: Which of these are exploratory testing?
Testing without a specification
Testing with a specification
Testing without instructions
Testing with instructions
Testing a product for the first time
Re-testing after a change
Informal testing
Formal testing
Playing… Just playing…
Fully automated fact checking
The Testing Formality Continuum:
Mixing Scripting and Exploration
When I say “exploratory testing” and don’t qualify it, I mean anything on the informal side of this continuum.
INFORMAL: not done in any specific way, nor to verify specific facts.
FORMAL: done in a specific way, or to verify specific facts.
(Continuum labels: Machine Checking, Human Checking, Vague/Generic Test Scripts, “Human Transceiver”, Matrix/Outline of Test Conditions, Product Coverage Outline, Specific Test Data, Analytical Exploratory, Survey Exploratory, Play.)
Common Questions in ET
How do we manage ET?
How do we record ET?
How do we measure ET?
How do we estimate ET?
How do we learn ET?
How do we defend ET?
“How do we do X with ET that we
could never do with scripted testing
either but nobody noticed?”
A Heuristic Test Strategy Model
(Diagram: Project Environment, Tests, Product Elements, Quality Criteria, Perceived Quality.)
A Heuristic Test Strategy Model
(Diagram repeated: Project Environment, Tests, Product Elements, Quality Criteria, Perceived Quality.)
This is what people think you do
(Diagram: the described product and the actual product.)
“Compare the product to its specification”
This is more like what you really do
(Diagram: the imagined, described, and actual product, compared pairwise.)
“Compare the idea of the product to a description of it”
“Compare the actual product to a description of it”
“Compare the idea of the product to the actual product”
This is what you find…
(Venn diagram: Imagined, Described, Actual.)
− The designer INTENDS the product to be Firefox compatible, but never says so, and it actually is not.
− The designer INTENDS the product to be Firefox compatible, SAYS SO IN THE SPEC, but it actually is not.
− The designer assumes the product is not Firefox compatible, and it actually is not, but the ONLINE HELP SAYS IT IS.
− The designer INTENDS the product to be Firefox compatible, SAYS SO, and IT IS.
− The designer assumes the product is not Firefox compatible, but it ACTUALLY IS, and the ONLINE HELP SAYS IT IS.
− The designer INTENDS the product to be Firefox compatible, MAKES IT FIREFOX COMPATIBLE, but forgets to say so in the spec.
− The designer assumes the product is not Firefox compatible, and no one claims that it is, but it ACTUALLY IS.
Oracles
An oracle is a way to recognize a problem that appears during testing.
“...it works” really means “...it appeared at least once to meet some requirement to some degree.”
Consistency (“this agrees with that”)
an important theme in oracles
Familiarity: The system is not consistent with the pattern of any familiar problem.
Explainability: The system is consistent with our ability to explain it.
World: The system is consistent with objects and states, in the world, that it represents.
History: The present version of the system is consistent with past versions of it.
Image: The system is consistent with an image that the organization wants to project.
Comparable Products: The system is consistent with comparable systems.
Claims: The system is consistent with what important people say it’s supposed to be.
Users’ Expectations: The system is consistent with what users want.
Product: Each element of the system is consistent with comparable elements in the same
system.
Purpose: The system is consistent with its purposes, both explicit and implicit.
Statutes & Standards: The system is consistent with applicable laws and standards.
Familiar Problems
If a product is consistent with problems we’ve seen before,
we suspect that there might be a problem.
Explainability
If a product is inconsistent with our ability to explain it
(or someone else’s), we suspect that there might be a problem.
World
If a product is inconsistent with the way the world works,
we suspect that there might be a problem.
History
If a product is inconsistent with previous versions of itself,
we suspect that there might be a problem.
Okay,
so how the #&@
do I print now?
Image
If a product is inconsistent with an image that
the company wants to project, we suspect a problem.
Comparable Products
WordPad Word
When a product seems inconsistent with a product that is
in some way comparable, we suspect that there might be a problem.
Claims
When a product is inconsistent with claims that important
people make about it, we suspect a problem.
User Expectations
When a product is inconsistent with expectations that a
reasonable user might have, we suspect a problem.
Purpose
When a product is inconsistent with its designers’ explicit
or implicit purposes, we suspect a problem.
Product
When a product is inconsistent internally—as when it
contradicts itself—we suspect a problem.
Statutes and Standards
When a product is inconsistent with laws or widely
accepted standards, we suspect a problem.
Oracles From the Inside Out
(Diagram: oracles arranged along two axes, Tacit vs. Explicit and Tester vs. Other People. Labels: Your Feelings & Mental Models, Shared Artifacts (specs, tools, etc.), Stakeholders’ Feelings & Mental Models, Observable Consistencies; Experience, Reference, Conference, Inference.)
ET is a Structured Process
Exploratory testing, as I teach it, is a structured process
conducted by a skilled tester, or by lesser skilled
testers or users working under supervision.
The structure of ET comes from many sources:
− Test design heuristics
− Chartering
− Time boxing
− Perceived product risks
− The nature of specific tests
− The structure of the product being tested
− The process of learning the product
− Development activities
− Constraints and resources afforded by the project
− The skills, talents, and interests of the tester
− The overall mission of testing
In other words,
it’s not “random”,
but systematic.
IP Address
Heuristics bring useful structure
to problem-solving skill.
adjective:
“serving to discover.”
noun:
“a fallible method for solving a problem or
making a decision.”
“The engineering method is the use of heuristics
to cause the best change in a poorly understood situation
within the available resources.”
-- Billy Vaughan Koen, Discussion of The Method
Test Design and Execution
Guide testers with personal supervision and concise
documentation of test ideas. Meanwhile, train them
so that they can guide themselves and be accountable
for increasingly challenging work.
Achieve excellent test design by exploring different test designs while actually testing.
(Diagram labels: Test Ideas, Product, Product or spec.)
If You are Frustrated
1. Look over your recent tests and find a pattern there.
2. With your next few tests, violate the old pattern.
3. Prefer MFAT (multiple factors at a time).
4. Broaden and vary your observations.
Boundary Testing
If You are Confused
1. Simplify your tests.
2. Conserve states.
3. Frequently repeat your actions.
4. Frequently return to a known state.
5. Prefer OFAT heuristic (one factor at a time).
6. Make precise observations.
Dice Game
Exploratory Branching:
Distractions are good!
New test ideas occur continually during an ET session.
(Diagram: new test ideas branching off during the session.)
“Long Leash” Heuristic
Let yourself be distracted…
’cause you never know what you’ll find…
but periodically take stock of your status against your mission.
ET is a Structured Process
In excellent exploratory testing, one structure tends to
dominate all the others:
Exploratory testers construct a compelling story of
their testing. It is this story that
gives ET a backbone.
Pen Test
To test is to construct three stories
(plus one more)
Level 1: A story about the status of the PRODUCT
…about how it failed, and how it might fail...
…in ways that matter to your various clients.
Level 2: A story about HOW YOU TESTED it
…how you configured, operated and observed it…
…about what you haven’t tested, yet…
…and won’t test, at all…
Level 3: A story about the VALUE of the testing
…what the risks and costs of testing are…
…how testable (or not) the product is…
…things that make testing harder or slower…
…what you need and what you recommend…
(Level 3+: A story about the VALUE of the stories.)
…do you know what happened? can you report? does the report serve its purpose?
Why should I be pleased
with your work?
The Roadmap
How often do you account for your progress?
If you have any autonomy at all, you can risk
investing some time in
− learning
− thinking
− refining approaches
− better tests
Allow some disposable time
Self-management is good!
Allow some disposable time
If it turns out that you’ve made a bad
investment…oh well
☺ If it turns out that you’ve made a
good investment, you might have
− learned something about the product
− invented a more powerful test
− found a bug
− done a better job
− avoided going down a dead end for too long
− surprised and impressed your manager
“Plunge in and Quit” Heuristic
Whenever you are called upon to test something very complex or frightening, plunge in!
After a little while, if you are very confused or find yourself stuck, quit!
Plunge in and quit means you can start something without committing to finish it successfully, and therefore you don’t need a plan. Several cycles of this are a great way to discover a plan.
This benefits from disposable time: that part of your work not scrutinized in detail.
The First Law of Documentation
“That should be documented.”
“That should be documented
if and when and how it serves our purposes.”
Who will read it? Will they understand it?
Is there a better way to communicate that information?
What does documentation cost you?
Common Problems with
Test Documentation
Distracts, and even prevents, testers from doing the best testing
they know how to do.
Authors don’t understand testing.
Authors don’t own format.
Templates help hide poor thinking.
Full of fluff.
Fluff discourages readers and increases cost of maintenance.
No one knows documentation requirements.
Too much formatting increases the cost of maintenance.
Information for different audiences is mixed together.
Catering to rare audiences instead of probable user.
Disrupts the social life of information.
Long term and short term goals conflict.
Most people don’t read.
What Does Rapid Testing Look Like?
Concise Documentation Minimizes Waste
(Diagram labels: Coverage Model, Risk Model, Test Strategy, Testing Heuristics, Risk Catalog, Status Dashboard, Schedule, Issues, Bugs; grouped into General Reference and Project-Specific.)
Consider Automatic Logging
Exploratory testing works better when the
product produces an automatic log of
everything that was covered in each test.
You can also use external logging tools
such as Spector (www.spectorsoft.com).
Automatic logging means that you get
something like a retrospective script of
what was tested.
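As a rough illustration (mine, not from the deck), a product-side logging hook might look like the sketch below; the log_action helper, the JSON record format, and the session.log file name are all assumptions.

```python
# Minimal sketch (assumption, not from the slides): the product writes a
# timestamped record of every user-visible action, so a session can be
# reconstructed afterwards as a retrospective "script" of what was covered.
import json
import logging
from datetime import datetime, timezone

logging.basicConfig(filename="session.log", level=logging.INFO,
                    format="%(message)s")

def log_action(action, **details):
    """Hypothetical helper: append one coverage record per product action."""
    record = {
        "time": datetime.now(timezone.utc).isoformat(),
        "action": action,
        "details": details,
    }
    logging.info(json.dumps(record))

# Example usage during a test session:
log_action("open_file", path="mindmap_2300_nodes.xmind")
log_action("insert_picture", format="PNG", size_kb=48)
```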
Introducing the Test Session
1) Charter
2) Uninterrupted Time Box
3) Reviewable Result
4) Debriefing
Charter:
A clear mission for the session
A charter summarizes the goal or activity of the session.
Before a session, it is a statement of how the session will be
focused. Afterward, it should be changed, if necessary, to
reasonably characterize the actual focus of the session.
General charters are the norm, early in the project:
“Analyze the Insert Picture function”
More specific charters tend to emerge later on:
“Test clip art insertion. Focus on stress and flow techniques, and make
sure to insert into a variety of documents. We’re concerned about
resource leaks or anything else that might degrade performance over
time.”
Charter Patterns:
Evolving test strategy
Intake Sessions (Goal: negotiate mission)
“Interview the project manager about testing Xmind.”
Survey Sessions (Goal: learn product)
“Familiarize yourself with Xmind.”
Setup Sessions (Goal: create testing infrastructure)
“Develop a library of mindmaps for testing Xmind.”
Analysis Sessions (Goal: get ideas for deep coverage)
“Identify the primary functions of Xmind.”
“Construct a test coverage outline.”
“Brainstorm test ideas.”
“Prepare a state model for state-based testing.”
“Perform a component risk-analysis to guide further testing.”
“Discover all the error messages in Xmind.”
Charter Patterns:
Evolving test strategy
Deep Coverage Sessions (Goal: find the right bugs)
“Perform scenario testing based on the scenario playbook.”
“Perform a tour that achieves double-transition state coverage.”
“Perform steeplechase boundary testing on the major data items.”
“Test each error message in Xmind.”
“Perform a function tour using the 2300 node mindmap.”
Closure Sessions (Goal: get ready to release)
“Verify the latest fixes.”
“Re-test tutorial with the latest build.”
“Review help files and readme.”
“Go over deferred bugs with Customer Support people.”
“Perform clean-machine install test.”
Time Box:
Focused test effort of fixed duration
A normal session is 90 minutes long.
Base timing can be adjusted based on your context
A real session may be somewhat longer or shorter.
A normal session is uninterrupted.
A real session may be somewhat interrupted.
Real sessions are “normed” for the purposes of
reporting metrics. This is so that our clients don’t
get confused by the numbers.
Example
Session 1…
Work for 20 minutes
(fifteen minute interruption)
Work for 75 minutes
(ten minute interruption)
Work for 40 minutes
(end the real session)
Session 2…
Work for 35 minutes
(twenty minute interruption)
Work for 85 minutes
(end the real session)
This results in 2 session reports; 5 hours of real time.
It would be reported as 3 normal sessions.
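Restated as a small sketch (my own illustration; the 90-minute normal session is the figure given above):

```python
# Sketch (assumption): norm real, interrupted sessions against a 90-minute
# "normal" session for reporting, as in the example above.
NORMAL_SESSION_MINUTES = 90

# Work periods only; interruptions are excluded from the normed total.
session_1_work = [20, 75, 40]   # minutes, with a 15- and a 10-minute interruption
session_2_work = [35, 85]       # minutes, with a 20-minute interruption

total_work = sum(session_1_work) + sum(session_2_work)        # 255 minutes
normal_sessions = round(total_work / NORMAL_SESSION_MINUTES)  # ~2.8 -> 3

print(f"{total_work} minutes of testing reported as {normal_sessions} normal sessions")
```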
If a session is so interrupted that your
efficiency is destroyed, abort the session
until you get control of your time.
Debriefing:
Measurement begins with observation
The manager reviews the session report to assure that he understands it and that it follows the protocol.
The tester answers any questions.
Session metrics are checked.
Charter may be adjusted.
Session may be extended.
New sessions may be chartered.
Coaching happens in both directions.
Reviewable Result:
A scannable session sheet
Charter
− #AREAS
Start Time
Tester Name(s)
Breakdown
− #DURATION
− #TEST DESIGN AND EXECUTION
− #BUG INVESTIGATION AND REPORTING
− #SESSION SETUP
− #CHARTER/OPPORTUNITY
Data Files
Test Notes
Bugs
− #BUG
Issues
− #ISSUE
CHARTER
-----------------------------------------------
Analyze MapMaker’s View menu functionality and
report on areas of potential risk.
#AREAS
OS | Windows 2000
Menu | View
Strategy | Function Testing
Strategy | Functional Analysis
START
-----------------------------------------------
5/30/00 03:20 pm
TESTER
-----------------------------------------------
Jonathan Bach
TASK BREAKDOWN
-----------------------------------------------
#DURATION
short
#TEST DESIGN AND EXECUTION
65
#BUG INVESTIGATION AND REPORTING
25
#SESSION SETUP
20
Reviewable Result:
Various tools are now available.
Rapid Reporter
SessionWeb
SBTExecute
Atlassian: Bonfire
Atlassian: Jira
The Breakdown Metrics
Testing is like looking for worms
Test Design and Execution
Bug Investigation and Reporting
Session Setup
Challenges of SBTM
Architecting the system of charters (test planning)
Making time for debriefings
Getting the metrics right
Creating good test notes
Keeping the technique from dominating the testing
Managing chronic interruptions
For example session sheets and metrics, see
http://www.satisfice.com/sbtm
Whiteboard
Used for planning and tracking of test
execution
Suitable for use in waterfall or agile (as long as
you have control over your own team’s
process)
Use colours to track:
− Features, or
− Main Areas, or
− Test styles (performance, robustness, system)
Whiteboard
Divide the board into four areas:
− Work to be done
− Work in Progress
− Cancelled or Work not being done
− Completed work
Red stickies indicate issues (not just bugs)
Create a sticky note for each half day of work (or
mark # of half days expected on the sticky note)
Prioritize stickies daily (or at least twice/wk)
Finish “on time” with low-priority work incomplete
Whiteboard Example
(End of week 1, out of 7 weeks.)
Whiteboard Example
(End of week 6, out of 7 weeks.)
Reporting
An Excel Spreadsheet with:
− List of Charters
− Area
− Estimated Effort
− Expended Effort
− Remaining Effort
− Tester(s)
− Start Date
− Completed Date
− Issues
− Comments
Does NOT include pass/fail percentage or number of
test cases
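As a minimal sketch (my own illustration, not part of the deck), such a tracking sheet could be kept as a plain CSV with exactly those columns; the file name is an assumption and the sample row is taken from the sample report below.

```python
# Sketch (assumption): the charter-tracking sheet as a plain CSV,
# using the columns listed above and deliberately no pass/fail counts.
import csv

COLUMNS = ["Charter", "Area", "Estimated Effort", "Expended Effort",
           "Remaining Effort", "Tester(s)", "Start Date", "Completed Date",
           "Issues", "Comments"]

rows = [
    ["INP vs. SHINE", "ARQ", 6, 6, 0, "ncowan", "12/01/2011", "12/04/2011", "", ""],
]

with open("charter_tracking.csv", "w", newline="") as f:
    writer = csv.writer(f)
    writer.writerow(COLUMNS)
    writer.writerows(rows)
```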
Sample Report

Charter | Area | Estimated Effort | Expended Effort | Remaining Effort | Tester | Date Started | Date Completed | Issues Found | Comments
Investigation for high QLN spikes on EVLT | H/W Performance | 0 | 20 | 0 | acode | 12/10/2011 | 01/14/2012 | 1617032 | Lots of investigation. Problem was on 2-3 out of 48 ports which just happened to be 2 of the 6 ports I tested.
ARQ Verification under different RA Modes | ARQ | 2 | 2 | 0 | ncowan | 12/14/2011 | 12/15/2011 | |
POTS interference | ARQ | 2 | 0 | 0 | --- | 01/08/2012 | 01/08/2012 | | Decided not to test as the H/W team already tested this functionality and time was tight.
Expected throughput testing | ARQ | 5 | 5 | 0 | acode | 01/10/2012 | 01/14/2012 | |
INP vs. SHINE | ARQ | 6 | 6 | 0 | ncowan | 12/01/2011 | 12/04/2011 | |
INP vs. REIN | ARQ | 6 | 7 | 5 | jbright | 01/06/2012 | 01/10/2012 | | To translate the files properly, had to install Python solution from Antwerp. Some overhead to begin testing (installation, config test) but was fairly quick to execute afterwards.
INP vs. REIN + SHINE | ARQ | 12 | 12 | | | | |
Traffic delay and jitter from RTX | ARQ | 2 | 2 | 0 | ncowan | 12/05/2011 | 12/05/2011 | |
Attainable Throughput | ARQ | 1 | 4 | 0 | jbright | 01/05/2012 | 01/08/2012 | | Took longer because was not behaving as expected and I had to make sure I was testing correctly. My expectations were wrong based on virtual noise not being exact.
Weekly Report
A PowerPoint slide indicating the
important issues (not a count but a list)
− “Show stopping” bugs
− New bugs found since last report
− Important issues with testing (blocking bugs,
equipment issues, people issues, etc.)
− Risks (updates and newly discovered)
− Tester concerns (if different from above)
− The slide on the next page indicating progress
Sample Report
(Chart: “Awesome Product” Test Progress as of 02/01/2012. X-axis: Feature (ARQ, SRA, Vectoring, Regression, H/W Performance); Y-axis: Effort (person half-days). Bars compare Original Planned Effort, Expended Effort, and Total Expected Effort.)
Direction of lines indicates effort trend since last report
Solid centre bar = finished
Green: No concerns
Yellow: Some concerns
Red: Major concerns
Appendix A
Reporting the TBS Breakdown
A guess is okay, but follow the protocol
Test, Bug, and Setup are orthogonal categories.
Estimate the percentage of charter work that fell into each
category.
Nearest 5% or 10% is good enough.
If activities are done simultaneously, report the highest
precedence activity.
Precedence goes in order: T, B, then S.
All we really want is to track interruptions to testing.
Don’t include Opportunity Testing in the estimate.
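A small sketch of this protocol (my own illustration; the work intervals are invented), applying the T, B, S precedence and rounding to the nearest 5%:

```python
# Sketch (my illustration of the protocol above, not tooling from the deck):
# classify simultaneous work by precedence T > B > S, then round the
# resulting percentages to the nearest 5%.
PRECEDENCE = ["T", "B", "S"]   # Test, Bug, Setup

# Each tuple: (minutes, set of activities going on at the same time)
work = [
    (50, {"T"}),
    (20, {"T", "B"}),   # investigating a bug while testing -> reported as T
    (15, {"B"}),
    (5,  {"S"}),
]

totals = {c: 0 for c in PRECEDENCE}
for minutes, activities in work:
    category = next(c for c in PRECEDENCE if c in activities)
    totals[category] += minutes

charter_minutes = sum(totals.values())
breakdown = {c: 5 * round(100 * totals[c] / charter_minutes / 5) for c in PRECEDENCE}
print(breakdown)   # e.g. {'T': 80, 'B': 15, 'S': 5}
```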
Activity Hierarchy
All test work fits here, somewhere
(Diagram: all work divides into non-session and session work; session work divides into opportunity and on-charter work; on-charter work divides into test, bug, and setup; non-session work is inferred.)
(Example breakdown: Non-Session 61%, Test 28%, Setup 6%, Bug 4%, Opportunity 1%.)
Work Breakdown:
Diagnosing the productivity
Do these proportions make sense?
How do they change over time?
Is the reporting protocol being
followed?
(Chart: work breakdown over time, 5/26 through 8/18.)
Coverage:
Specifying coverage areas
These are text labels listed in the Charter
section of the session sheet. (e.g. “insert
picture”)
Coverage areas can include anything
− areas of the product
− test configuration
− test strategies
− system configuration parameters
Use the debriefings to check the validity of
the specified coverage areas.
Coverage:
Are we testing the right stuff?
Is it a lop-sided set of coverage areas?
Is it distorted reporting?
Is this a risk-based test strategy?
(Chart: Distribution of On Charter Testing Across Areas.)
Using the Data
to Estimate a Test Cycle
1. How many perfect sessions (100% on-charter testing) does
it take to do a cycle? (let’s say 40)
2. How many sessions can the team (of 4 testers) do per day?
(let’s say 3 per day, per tester = 12)
3. How productive are the sessions? (let’s say 66% is on-
charter test design and execution)
4. Estimate: 40 / (12 * .66) = 5 days
5. We base the estimate on the data we’ve collected. When
any conditions or assumptions behind this estimate
change, we will update the estimate.
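The same estimate, restated as a tiny calculation using the numbers from the example above:

```python
# Sketch of the estimate above (numbers from the example, formula restated):
perfect_sessions_needed = 40          # sessions of 100% on-charter testing
testers = 4
sessions_per_tester_per_day = 3
productivity = 0.66                   # share spent on on-charter test design/execution

sessions_per_day = testers * sessions_per_tester_per_day          # 12
days = perfect_sessions_needed / (sessions_per_day * productivity)
print(f"Estimated cycle length: {days:.1f} days")                 # ~5.1 days
```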
More Related Content

What's hot

Penyakit yang menyertai persalinan 3
Penyakit yang menyertai persalinan 3Penyakit yang menyertai persalinan 3
Penyakit yang menyertai persalinan 3089633666
 
Assessment of fetal wellbeing in pregnancy and labour
Assessment of fetal wellbeing in pregnancy and labour  Assessment of fetal wellbeing in pregnancy and labour
Assessment of fetal wellbeing in pregnancy and labour 716
 
5. parametritis & pelviksitis
5. parametritis & pelviksitis5. parametritis & pelviksitis
5. parametritis & pelviksitisPradasary
 
10 preeklampsia eklampsia
10 preeklampsia eklampsia10 preeklampsia eklampsia
10 preeklampsia eklampsiaJoni Iswanto
 
Abnormal progress of labor for 4th year med.students
Abnormal progress of labor for 4th year med.studentsAbnormal progress of labor for 4th year med.students
Abnormal progress of labor for 4th year med.studentsDr. Aisha M Elbareg
 
Manajemen Kegawat Daruratan Obstetri dan Ginekologi
Manajemen Kegawat Daruratan Obstetri dan GinekologiManajemen Kegawat Daruratan Obstetri dan Ginekologi
Manajemen Kegawat Daruratan Obstetri dan GinekologiDokter Tekno
 
Asfiksia bayi baru lahir
Asfiksia bayi baru lahirAsfiksia bayi baru lahir
Asfiksia bayi baru lahirDeGirl's ZeViey
 
Sop konjungtivitis
Sop konjungtivitisSop konjungtivitis
Sop konjungtivitisAndy Neon
 
Daftar tilik untuk keterampilan lapangan
Daftar tilik untuk keterampilan lapanganDaftar tilik untuk keterampilan lapangan
Daftar tilik untuk keterampilan lapanganyeni82
 
4. asuhan sayang ibu
4. asuhan sayang ibu4. asuhan sayang ibu
4. asuhan sayang ibueka f
 
Kehamilan ektopik terganggu power point mahdiah(1)
Kehamilan ektopik terganggu power point mahdiah(1)Kehamilan ektopik terganggu power point mahdiah(1)
Kehamilan ektopik terganggu power point mahdiah(1)Muh Al Imran Abidin
 
Ppt Masalah pada Neonatus -- Bisulan (Furunkel)
Ppt Masalah pada Neonatus -- Bisulan (Furunkel)Ppt Masalah pada Neonatus -- Bisulan (Furunkel)
Ppt Masalah pada Neonatus -- Bisulan (Furunkel)Aftina Eka R
 

What's hot (20)

Penyakit yang menyertai persalinan 3
Penyakit yang menyertai persalinan 3Penyakit yang menyertai persalinan 3
Penyakit yang menyertai persalinan 3
 
Assessment of fetal wellbeing in pregnancy and labour
Assessment of fetal wellbeing in pregnancy and labour  Assessment of fetal wellbeing in pregnancy and labour
Assessment of fetal wellbeing in pregnancy and labour
 
5. parametritis & pelviksitis
5. parametritis & pelviksitis5. parametritis & pelviksitis
5. parametritis & pelviksitis
 
10 preeklampsia eklampsia
10 preeklampsia eklampsia10 preeklampsia eklampsia
10 preeklampsia eklampsia
 
Makalah furunkel
Makalah furunkelMakalah furunkel
Makalah furunkel
 
Lp pterygium
Lp pterygiumLp pterygium
Lp pterygium
 
Abnormal progress of labor for 4th year med.students
Abnormal progress of labor for 4th year med.studentsAbnormal progress of labor for 4th year med.students
Abnormal progress of labor for 4th year med.students
 
Manajemen Kegawat Daruratan Obstetri dan Ginekologi
Manajemen Kegawat Daruratan Obstetri dan GinekologiManajemen Kegawat Daruratan Obstetri dan Ginekologi
Manajemen Kegawat Daruratan Obstetri dan Ginekologi
 
Preterm labor
Preterm labor  Preterm labor
Preterm labor
 
Preeklampsia
PreeklampsiaPreeklampsia
Preeklampsia
 
Asfiksia bayi baru lahir
Asfiksia bayi baru lahirAsfiksia bayi baru lahir
Asfiksia bayi baru lahir
 
Sop konjungtivitis
Sop konjungtivitisSop konjungtivitis
Sop konjungtivitis
 
Daftar tilik untuk keterampilan lapangan
Daftar tilik untuk keterampilan lapanganDaftar tilik untuk keterampilan lapangan
Daftar tilik untuk keterampilan lapangan
 
Referat vicki
Referat vickiReferat vicki
Referat vicki
 
VAKUM & FORCEP
VAKUM & FORCEPVAKUM & FORCEP
VAKUM & FORCEP
 
4. asuhan sayang ibu
4. asuhan sayang ibu4. asuhan sayang ibu
4. asuhan sayang ibu
 
Kehamilan ektopik terganggu power point mahdiah(1)
Kehamilan ektopik terganggu power point mahdiah(1)Kehamilan ektopik terganggu power point mahdiah(1)
Kehamilan ektopik terganggu power point mahdiah(1)
 
Ppt Masalah pada Neonatus -- Bisulan (Furunkel)
Ppt Masalah pada Neonatus -- Bisulan (Furunkel)Ppt Masalah pada Neonatus -- Bisulan (Furunkel)
Ppt Masalah pada Neonatus -- Bisulan (Furunkel)
 
Bisulan
BisulanBisulan
Bisulan
 
Presus vbac
Presus vbacPresus vbac
Presus vbac
 

Similar to Exploratory Testing Half-Day Tutorial

Exploratory Testing Explained
Exploratory Testing ExplainedExploratory Testing Explained
Exploratory Testing ExplainedTechWell
 
Exploratory Testing Explained
Exploratory Testing ExplainedExploratory Testing Explained
Exploratory Testing ExplainedTechWell
 
A Rapid Introduction to Rapid Software Testing
A Rapid Introduction to Rapid Software TestingA Rapid Introduction to Rapid Software Testing
A Rapid Introduction to Rapid Software TestingTechWell
 
A Rapid Introduction to Rapid Software Testing
A Rapid Introduction to Rapid Software TestingA Rapid Introduction to Rapid Software Testing
A Rapid Introduction to Rapid Software TestingTechWell
 
Rapid software testing
Rapid software testingRapid software testing
Rapid software testingSachin MK
 
A Rapid Introduction to Rapid Software Testing
A Rapid Introduction to Rapid Software TestingA Rapid Introduction to Rapid Software Testing
A Rapid Introduction to Rapid Software TestingTechWell
 
A Holistic View of Complex Systems and Organizational Change
A Holistic View of Complex Systems and Organizational ChangeA Holistic View of Complex Systems and Organizational Change
A Holistic View of Complex Systems and Organizational ChangeTechWell
 
Top Tips to a Successful eDiscovery Software Demo
Top Tips to a Successful eDiscovery Software DemoTop Tips to a Successful eDiscovery Software Demo
Top Tips to a Successful eDiscovery Software DemoMark Walker
 
Enterprise Devops Presentation @ Magentys Seminar London May 15 2014
Enterprise Devops Presentation @ Magentys Seminar London May 15 2014Enterprise Devops Presentation @ Magentys Seminar London May 15 2014
Enterprise Devops Presentation @ Magentys Seminar London May 15 2014Jwooldridge
 
March APLN: Agile development- Measure & Analyze by Garry Rowland
March APLN: Agile development- Measure & Analyze by Garry RowlandMarch APLN: Agile development- Measure & Analyze by Garry Rowland
March APLN: Agile development- Measure & Analyze by Garry RowlandConscires Agile Practices
 
5 Ways to make load testing work for you
5 Ways to make load testing work for you5 Ways to make load testing work for you
5 Ways to make load testing work for youIsrael Rogoza
 
It takes a village to build a quality product
It takes a village to build a quality productIt takes a village to build a quality product
It takes a village to build a quality productAnne-Marie Charrett
 
Team Leadership: Telling Your Testing Stories
Team Leadership: Telling Your Testing StoriesTeam Leadership: Telling Your Testing Stories
Team Leadership: Telling Your Testing StoriesTechWell
 
A Dozen Keys to Agile Testing Maturity
A Dozen Keys to Agile Testing MaturityA Dozen Keys to Agile Testing Maturity
A Dozen Keys to Agile Testing MaturityTechWell
 
5 Essential Tips for Load Testing Beginners
5 Essential Tips for Load Testing Beginners5 Essential Tips for Load Testing Beginners
5 Essential Tips for Load Testing BeginnersNeotys
 

Similar to Exploratory Testing Half-Day Tutorial (20)

Exploratory Testing Explained
Exploratory Testing ExplainedExploratory Testing Explained
Exploratory Testing Explained
 
Exploratory Testing Explained
Exploratory Testing ExplainedExploratory Testing Explained
Exploratory Testing Explained
 
[Paul Holland] Trends in Software Testing
[Paul Holland] Trends in Software Testing[Paul Holland] Trends in Software Testing
[Paul Holland] Trends in Software Testing
 
A Rapid Introduction to Rapid Software Testing
A Rapid Introduction to Rapid Software TestingA Rapid Introduction to Rapid Software Testing
A Rapid Introduction to Rapid Software Testing
 
A Rapid Introduction to Rapid Software Testing
A Rapid Introduction to Rapid Software TestingA Rapid Introduction to Rapid Software Testing
A Rapid Introduction to Rapid Software Testing
 
Rapid software testing
Rapid software testingRapid software testing
Rapid software testing
 
A Rapid Introduction to Rapid Software Testing
A Rapid Introduction to Rapid Software TestingA Rapid Introduction to Rapid Software Testing
A Rapid Introduction to Rapid Software Testing
 
Fundamentals of testing
Fundamentals of testingFundamentals of testing
Fundamentals of testing
 
A Holistic View of Complex Systems and Organizational Change
A Holistic View of Complex Systems and Organizational ChangeA Holistic View of Complex Systems and Organizational Change
A Holistic View of Complex Systems and Organizational Change
 
Top Tips for eDiscovery Software Demo iControl ESI
Top Tips for eDiscovery Software Demo iControl ESITop Tips for eDiscovery Software Demo iControl ESI
Top Tips for eDiscovery Software Demo iControl ESI
 
Top Tips to a Successful eDiscovery Software Demo
Top Tips to a Successful eDiscovery Software DemoTop Tips to a Successful eDiscovery Software Demo
Top Tips to a Successful eDiscovery Software Demo
 
Enterprise Devops Presentation @ Magentys Seminar London May 15 2014
Enterprise Devops Presentation @ Magentys Seminar London May 15 2014Enterprise Devops Presentation @ Magentys Seminar London May 15 2014
Enterprise Devops Presentation @ Magentys Seminar London May 15 2014
 
Develop your inner tester
Develop your inner tester Develop your inner tester
Develop your inner tester
 
March APLN: Agile development- Measure & Analyze by Garry Rowland
March APLN: Agile development- Measure & Analyze by Garry RowlandMarch APLN: Agile development- Measure & Analyze by Garry Rowland
March APLN: Agile development- Measure & Analyze by Garry Rowland
 
5 Ways to make load testing work for you
5 Ways to make load testing work for you5 Ways to make load testing work for you
5 Ways to make load testing work for you
 
It takes a village to build a quality product
It takes a village to build a quality productIt takes a village to build a quality product
It takes a village to build a quality product
 
Team Leadership: Telling Your Testing Stories
Team Leadership: Telling Your Testing StoriesTeam Leadership: Telling Your Testing Stories
Team Leadership: Telling Your Testing Stories
 
A Dozen Keys to Agile Testing Maturity
A Dozen Keys to Agile Testing MaturityA Dozen Keys to Agile Testing Maturity
A Dozen Keys to Agile Testing Maturity
 
Life of a Tester v1
Life of a Tester v1Life of a Tester v1
Life of a Tester v1
 
5 Essential Tips for Load Testing Beginners
5 Essential Tips for Load Testing Beginners5 Essential Tips for Load Testing Beginners
5 Essential Tips for Load Testing Beginners
 

More from TechWell

Failing and Recovering
Failing and RecoveringFailing and Recovering
Failing and RecoveringTechWell
 
Instill a DevOps Testing Culture in Your Team and Organization
Instill a DevOps Testing Culture in Your Team and Organization Instill a DevOps Testing Culture in Your Team and Organization
Instill a DevOps Testing Culture in Your Team and Organization TechWell
 
Test Design for Fully Automated Build Architecture
Test Design for Fully Automated Build ArchitectureTest Design for Fully Automated Build Architecture
Test Design for Fully Automated Build ArchitectureTechWell
 
System-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good StartSystem-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good StartTechWell
 
Build Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test StrategyBuild Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test StrategyTechWell
 
Testing Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for SuccessTesting Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for SuccessTechWell
 
Implement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlowImplement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlowTechWell
 
Develop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your SanityDevelop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your SanityTechWell
 
Eliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps StrategyEliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps StrategyTechWell
 
Transform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOpsTransform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOpsTechWell
 
The Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—LeadershipThe Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—LeadershipTechWell
 
Resolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile TeamsResolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile TeamsTechWell
 
Pin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile GamePin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile GameTechWell
 
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile TeamsAgile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile TeamsTechWell
 
A Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps ImplementationA Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps ImplementationTechWell
 
Databases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery ProcessDatabases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery ProcessTechWell
 
Mobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to AutomateMobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to AutomateTechWell
 
Cultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for SuccessCultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for SuccessTechWell
 
Turn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile TransformationTurn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile TransformationTechWell
 

More from TechWell (20)

Failing and Recovering
Failing and RecoveringFailing and Recovering
Failing and Recovering
 
Instill a DevOps Testing Culture in Your Team and Organization
Instill a DevOps Testing Culture in Your Team and Organization Instill a DevOps Testing Culture in Your Team and Organization
Instill a DevOps Testing Culture in Your Team and Organization
 
Test Design for Fully Automated Build Architecture
Test Design for Fully Automated Build ArchitectureTest Design for Fully Automated Build Architecture
Test Design for Fully Automated Build Architecture
 
System-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good StartSystem-Level Test Automation: Ensuring a Good Start
System-Level Test Automation: Ensuring a Good Start
 
Build Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test StrategyBuild Your Mobile App Quality and Test Strategy
Build Your Mobile App Quality and Test Strategy
 
Testing Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for SuccessTesting Transformation: The Art and Science for Success
Testing Transformation: The Art and Science for Success
 
Implement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlowImplement BDD with Cucumber and SpecFlow
Implement BDD with Cucumber and SpecFlow
 
Develop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your SanityDevelop WebDriver Automated Tests—and Keep Your Sanity
Develop WebDriver Automated Tests—and Keep Your Sanity
 
Ma 15
Ma 15Ma 15
Ma 15
 
Eliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps StrategyEliminate Cloud Waste with a Holistic DevOps Strategy
Eliminate Cloud Waste with a Holistic DevOps Strategy
 
Transform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOpsTransform Test Organizations for the New World of DevOps
Transform Test Organizations for the New World of DevOps
 
The Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—LeadershipThe Fourth Constraint in Project Delivery—Leadership
The Fourth Constraint in Project Delivery—Leadership
 
Resolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile TeamsResolve the Contradiction of Specialists within Agile Teams
Resolve the Contradiction of Specialists within Agile Teams
 
Pin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile GamePin the Tail on the Metric: A Field-Tested Agile Game
Pin the Tail on the Metric: A Field-Tested Agile Game
 
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile TeamsAgile Performance Holarchy (APH)—A Model for Scaling Agile Teams
Agile Performance Holarchy (APH)—A Model for Scaling Agile Teams
 
A Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps ImplementationA Business-First Approach to DevOps Implementation
A Business-First Approach to DevOps Implementation
 
Databases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery ProcessDatabases in a Continuous Integration/Delivery Process
Databases in a Continuous Integration/Delivery Process
 
Mobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to AutomateMobile Testing: What—and What Not—to Automate
Mobile Testing: What—and What Not—to Automate
 
Cultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for SuccessCultural Intelligence: A Key Skill for Success
Cultural Intelligence: A Key Skill for Success
 
Turn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile TransformationTurn the Lights On: A Power Utility Company's Agile Transformation
Turn the Lights On: A Power Utility Company's Agile Transformation
 

Recently uploaded

Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Commit University
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfAlex Barbosa Coqueiro
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piececharlottematthew16
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsMemoori
 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Mark Simos
 
My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024The Digital Insurer
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsRizwan Syed
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Mattias Andersson
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024Scott Keck-Warren
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsSergiu Bodiu
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr LapshynFwdays
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLScyllaDB
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxNavinnSomaal
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupFlorian Wilhelm
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brandgvaughan
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Enterprise Knowledge
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Scott Keck-Warren
 

Recently uploaded (20)

Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!Nell’iperspazio con Rocket: il Framework Web di Rust!
Nell’iperspazio con Rocket: il Framework Web di Rust!
 
Unraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdfUnraveling Multimodality with Large Language Models.pdf
Unraveling Multimodality with Large Language Models.pdf
 
Story boards and shot lists for my a level piece
Story boards and shot lists for my a level pieceStory boards and shot lists for my a level piece
Story boards and shot lists for my a level piece
 
DMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special EditionDMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special Edition
 
AI as an Interface for Commercial Buildings
AI as an Interface for Commercial BuildingsAI as an Interface for Commercial Buildings
AI as an Interface for Commercial Buildings
 
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
Tampa BSides - Chef's Tour of Microsoft Security Adoption Framework (SAF)
 
My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024My INSURER PTE LTD - Insurtech Innovation Award 2024
My INSURER PTE LTD - Insurtech Innovation Award 2024
 
Scanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL CertsScanning the Internet for External Cloud Exposures via SSL Certs
Scanning the Internet for External Cloud Exposures via SSL Certs
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?
 
SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024SQL Database Design For Developers at php[tek] 2024
SQL Database Design For Developers at php[tek] 2024
 
DevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platformsDevEX - reference for building teams, processes, and platforms
DevEX - reference for building teams, processes, and platforms
 
Pigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food ManufacturingPigging Solutions in Pet Food Manufacturing
Pigging Solutions in Pet Food Manufacturing
 
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
"Federated learning: out of reach no matter how close",Oleksandr Lapshyn
 
Developer Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQLDeveloper Data Modeling Mistakes: From Postgres to NoSQL
Developer Data Modeling Mistakes: From Postgres to NoSQL
 
SAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptxSAP Build Work Zone - Overview L2-L3.pptx
SAP Build Work Zone - Overview L2-L3.pptx
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project Setup
 
WordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your BrandWordPress Websites for Engineers: Elevate Your Brand
WordPress Websites for Engineers: Elevate Your Brand
 
Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024Designing IA for AI - Information Architecture Conference 2024
Designing IA for AI - Information Architecture Conference 2024
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024Advanced Test Driven-Development @ php[tek] 2024
Advanced Test Driven-Development @ php[tek] 2024
 

Exploratory Testing Half-Day Tutorial

  • 1. MJ Half-day Tutorials 5/5/2014 8:30:00 AM Exploratory Testing Explained Presented by: Paul Holland Testing Thoughts Brought to you by: 340 Corporate Way, Suite 300, Orange Park, FL 32073 888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
  • 2. Paul Holland Testing Thoughts An independent software test consultant and teacher, Paul Holland has more than sixteen years of hands-on testing and test management experience, primarily at Alcatel-Lucent where he led a transformation of the testing approach for two product divisions, making them more efficient and effective. As a test manager and tester, Paul focused on exploratory testing, test automation, and improving testing techniques. For the past five years, he has been consulting and delivering training within Alcatel-Lucent and externally to companies such as Intel, Intuit, Progressive Insurance, HP, RIM, and General Dynamics. Paul teaches the Rapid Software Testing course for Satisfice. For more information visit testingthoughts.com.
  • 3. Copyright © 1996-2013, Satisfice, Inc. dddd 1 1 1 Copyright © 1995-2014, Satisfice, Inc. Paul Holland, Doran Jones, Inc. pholland@doranjones.com www.doranjones.com My Background Managing Director, Testing Practice at Doran Jones Independent S/W Testing consultant 4/2012 - 3/2014 16+ years testing telecommunications equipment and reworking test methodologies at Alcatel-Lucent 10+ years as a test manager Presenter at STAREast, STARWest, Let’s Test, EuroSTAR and CAST Facilitator at 50+ peer conferences and workshops Teacher of S/W testing for the past 6 years Teacher of Rapid Software Testing Military Helicopter pilot – Canadian Sea Kings
  • 4. Copyright © 1996-2013, Satisfice, Inc. dddd 2 2 Let’s Do ET! Ready…? 3 The Roadmap Want to learn how to test? OKAY LET’S GO!!! 4 (To do ET well, begin by doing it poorly.)
  • 5. Copyright © 1996-2013, Satisfice, Inc. dddd 3 3 Acquiring the competence, motivation, and credibility to… Testing is… create the conditions necessary to… …so that you help your clients to make informed decisions about risk. evaluate a product by learning about it through experimentation, which includes to some degree: questioning, study, modeling, observation and inference, including… operating a product to check specific facts about it… Call this “Checking” not Testing Observe Evaluate Report Interact with the product in specific ways to collect specific observations. Apply algorithmic decision rules to those observations. Report any failed checks. means operating a product to check specific facts about it…
  • 6. Copyright © 1996-2013, Satisfice, Inc. dddd 4 4 Tacit Test Procedures Consistency Oracles Prospective Testing Learning and Teaching Commitment Management (inc. estimation) Recruiting Helpers Managing Testing Logistics Test Tooling and Artifact Development Test Framing Bug Advocacy & Triage Project Post Mortem Creating Archival Documentation Guiding Helpers Discovery of Curios, Issues & Risks Building the Test Team Designing Checks and Tests Playing with the Product Studying Results Galumphing Configuring Product & Tools Schedule Management Study Customer Feedback Relationship Building Making Failure Productive Sympathetic Testing Maintaining Personal Health and Motivation Team Leadership Quasi-Functional Testing Playing Programmer Testing w/Simulated Conditions Testing a Simulation Creating the Test Lab Studying Specs Managing Records Playing Business Analyst Opposition Research Testability Advocacy Cultivate Credibility Testing vs. Checking TESTING (think “what testers do”): the process of evaluating a product by learning about it through experimentation, which includes to some degree: questioning, study, modeling, observation and inference. CHECKING (think “fact checking”): the process of making evaluations by applying algorithmic decision rules to specific observations of a product.
  • 7. Copyright © 1996-2013, Satisfice, Inc. dddd 5 5 10 Exploratory Testing Is… an approach to testing… that emphasizes the personal freedom and responsibility of each tester to continually optimize the value of his work… by treating learning, test design, test execution and result evaluation as mutually supportive activities that run in parallel throughout the project. (applicable to any test technique) (optimize how?)
  • 8. Copyright © 1996-2013, Satisfice, Inc. dddd 6 6 Exploration is Not Just Action arrows and cycles You can put them together! arrows and cycles (value seeking) (task performing)
  • 9. Copyright © 1996-2013, Satisfice, Inc. dddd 7 7 You can put them together! arrows and cycles
  • 10. Copyright © 1996-2013, Satisfice, Inc. dddd 8 8 QUIZ Testing without a specification Testing with a specification Testing without instructions Testing with instructions Testing a product for the first time Re-testing after a change Informal testing Formal testing Playing… Just playing… Fully automated fact checking 15 Which of these are exploratory testing? 16 The Testing Formality Continuum Mixing Scripting and Exploration When I say “exploratory testing” and don’t qualify it, I mean anything on the informal side of this continuum. INFORMAL FORMAL Not done in any specific way, nor to verify specific facts. Done in a specific way, or to verify specific facts. Machine Checking Human Checking Vague/Generic Test Scripts “Human Transceiver” Matrix/Outline of Test Conditions Product Coverage Outline Play Specific Test Data Survey Exploratory Analytical Exploratory
  • 11. Copyright © 1996-2013, Satisfice, Inc. dddd 9 9 Common Questions in ET How do we manage ET? How do we record ET? How do we measure ET? How do we estimate ET? How do we learn ET? How do we defend ET? 17 “How do we do X with ET that we could never do with scripted testing either but nobody noticed?” 18 Tests Project Environment Product Elements Quality Criteria Perceived Quality A Heuristic Test Strategy Model
  • 12. Copyright © 1996-2013, Satisfice, Inc. dddd 10 10 19 Project Environment Tests Product Elements Quality Criteria Perceived Quality A Heuristic Test Strategy Model This is what people think you do described actual “Compare the product to its specification” 20
  • 13. Copyright © 1996-2013, Satisfice, Inc. dddd 11 11 This is more like what you really do imagined actual described “Compare the idea of the product to a description of it” “Compare the actual product to a description of it” “Compare the idea of the product to the actual product” 21 This is what you find… The designer INTENDS the product to be Firefox compatible, but never says so, and it actually is not. The designer INTENDS the product to be Firefox compatible, SAYS SO IN THE SPEC, but it actually is not. The designer assumes the product is not Firefox compatible, and it actually is not, but the ONLINE HELP SAYS IT IS. The designer INTENDS the product to be Firefox compatible, SAYS SO, and IT IS. The designer assumes the product is not Firefox compatible, but it ACTUALLY IS, and the ONLINE HELP SAYS IT IS. The designer INTENDS the product to be Firefox compatible, MAKES IT FIREFOX COMPATIBLE, but forgets to say so in the spec. The designer assumes the product is not Firefox compatible, and no one claims that it is, but it ACTUALLY IS. Imagined Described Actual
  • 14. Copyright © 1996-2013, Satisfice, Inc. dddd 12 12 23 Oracles An oracle is a way to recognize a problem that appears during testing. “...it appeared at least once to meet some requirement to some degree.” “...it works” 24 Consistency (“this agrees with that”) an important theme in oracles Familiarity: The system is not consistent with the pattern of any familiar problem. Explainability: The system is consistent with our ability to explain it. World: The system is consistent with objects and states, in the world, that it represents. History: The present version of the system is consistent with past versions of it. Image: The system is consistent with an image that the organization wants to project. Comparable Products: The system is consistent with comparable systems. Claims: The system is consistent with what important people say it’s supposed to be. Users’ Expectations: The system is consistent with what users want. Product: Each element of the system is consistent with comparable elements in the same system. Purpose: The system is consistent with its purposes, both explicit and implicit. Statutes & Standards: The system is consistent with applicable laws and standards.
  • 15. Copyright © 1996-2013, Satisfice, Inc. dddd 13 13 Familiar Problems If a product is consistent with problems we’ve seen before, we suspect that there might be a problem. Explainability If a product is inconsistent with our ability to explain it (or someone else’s), we suspect that there might be a problem.
  • 16. Copyright © 1996-2013, Satisfice, Inc. dddd 14 14 World If a product is inconsistent with the way the world works, we suspect that there might be a problem. History If a product is inconsistent with previous versions of itself, we suspect that there might be a problem. Okay, so how the #&@ do I print now?
  • 17. Copyright © 1996-2013, Satisfice, Inc. dddd 15 15 Image If a product is inconsistent with an image that the company wants to project, we suspect a problem. Comparable Products WordPad Word When a product seems inconsistent with a product that is in some way comparable, we suspect that there might be a problem.
  • 18. Copyright © 1996-2013, Satisfice, Inc. dddd 16 16 Claims When a product is inconsistent with claims that important people make about it, we suspect a problem. User Expectations When a product is inconsistent with expectations that a reasonable user might have, we suspect a problem.
  • 19. Copyright © 1996-2013, Satisfice, Inc. dddd 17 17 Purpose When a product is inconsistent with its designers’ explicit or implicit purposes, we suspect a problem. Product When a product is inconsistent internally—as when it contradicts itself—we suspect a problem.
  • 20. Copyright © 1996-2013, Satisfice, Inc. dddd 18 18 Statutes and Standards When a product is inconsistent with laws or widely accepted standards, we suspect a problem. Tacit Explicit Other People Tester Your Feelings & Mental Models Shared Artifacts (specs, tools, etc.) Stakeholders’ Feelings & Mental Models Inference Observable Consistencies Reference Conference Experience Oracles From the Inside Out
  • 21. Copyright © 1996-2013, Satisfice, Inc. dddd 19 19 37 ET is a Structured Process Exploratory testing, as I teach it, is a structured process conducted by a skilled tester, or by lesser skilled testers or users working under supervision. The structure of ET comes from many sources: − Test design heuristics − Chartering − Time boxing − Perceived product risks − The nature of specific tests − The structure of the product being tested − The process of learning the product − Development activities − Constraints and resources afforded by the project − The skills, talents, and interests of the tester − The overall mission of testing In other words, it’s not “random”, but systematic. IP Address 38 Heuristics bring useful structure to problem-solving skill. adjective: “serving to discover.” noun: “a fallible method for solving a problem or making a decision.” “The engineering method is the use of heuristics to cause the best change in a poorly understood situation within the available resources.” -- Billy Vaughan Koen, Discussion of The Method
  • 22. Copyright © 1996-2013, Satisfice, Inc. dddd 20 20 39 Test Design and Execution Guide testers with personal supervision and concise documentation of test ideas. Meanwhile, train them so that they can guide themselves and be accountable for increasingly challenging work. Test Ideas Achieve excellent test design by exploring different test designs while actually testing Product Product or spec 40 If You are Frustrated 1. Look over your recent tests and find a pattern there. 2. With your next few tests, violate the old pattern. 3. Prefer MFAT (multiple factors at a time). 4. Broaden and vary your observations. Boundary Testing
  • 23. Copyright © 1996-2013, Satisfice, Inc. dddd 21 21 41 If You are Confused 1. Simplify your tests. 2. Conserve states. 3. Frequently repeat your actions. 4. Frequently return to a known state. 5. Prefer OFAT heuristic (one factor at a time). 6. Make precise observations. Dice Game 42 Exploratory Branching: Distractions are good! New test idea New test idea New test idea New test ideas occur continually during an ET session.
  • 24. Copyright © 1996-2013, Satisfice, Inc. dddd 22 22 43 “Long Leash” Heuristic but periodically take stock of your status against your mission Let yourself be distracted… ‘cause you never know what you’ll find… 44 ET is a Structured Process In excellent exploratory testing, one structure tends to dominate all the others: Exploratory testers construct a compelling story of their testing. It is this story that gives ET a backbone. Pen Test
  • 25. Copyright © 1996-2013, Satisfice, Inc. dddd 23 23 To test is to construct three stories (plus one more) Level 1: A story about the status of the PRODUCT …about how it failed, and how it might fail... …in ways that matter to your various clients. Level 2: A story about HOW YOU TESTED it …how you configured, operated and observed it… …about what you haven’t tested, yet… …and won’t test, at all… Level 3: A story about the VALUE of the testing …what the risks and costs of testing are… …how testable (or not) the product is… …things that make testing harder or slower… …what you need and what you recommend… (Level 3+: A story about the VALUE of the stories.) …do you know what happened? can you report? does report serve its purpose? Why should I be pleased with your work? The Roadmap 46 How often do you account for your progress? If you have any autonomy at all, you can risk investing some time in − learning − thinking − refining approaches − better tests Allow some disposable time Self-management is good!
  • 26. Copyright © 1996-2013, Satisfice, Inc. dddd 24 24 47 Allow some disposable time If it turns out that you’ve made a bad investment…oh well ☺ If it turns out that you’ve made a good investment, you might have − learned something about the product − invented a more powerful test − found a bug − done a better job − avoided going down a dead end for too long − surprised and impressed your manager 48 “Plunge in and Quit” Heuristic This benefits from disposable time– that part of your work not scrutinized in detail. Plunge in and quit means you can start something without committing to finish it successfully, and therefore you don’t need a plan. Several cycles of this is a great way to discover a plan. Whenever you are called upon to test something very complex or frightening, plunge in! After a little while, if you are very confused or find yourself stuck, quit!
  • 27. Copyright © 1996-2013, Satisfice, Inc. dddd 25 25 49 The First Law of Documentation “That should be documented.” “That should be documented if and when and how it serves our purposes.” Who will read it? Will they understand it? Is there a better way to communicate that information? What does documentation cost you? 50 Common Problems with Test Documentation Distracts, and even prevents, testers from doing the best testing they know how to do. Authors don’t understand testing. Authors don’t own format. Templates help hide poor thinking. Full of fluff. Fluff discourages readers and increases cost of maintenance. No one knows documentation requirements. Too much formatting increases the cost of maintenance. Information for different audiences is mixed together. Catering to rare audiences instead of probable user. Disrupts the social life of information. Long term and short term goals conflict. Most people don’t read.
  • 28. Copyright © 1996-2013, Satisfice, Inc. dddd 26 26 51 What Does Rapid Testing Look Like? Concise Documentation Minimizes Waste Risk ModelCoverage Model Test Strategy Reference Risk CatalogTesting Heuristics General Project- Specific Status Dashboard Schedule BugsIssues 52 Consider Automatic Logging Exploratory testing works better when the product produces an automatic log of everything that was covered in each test. You can also use external logging tools such as Spector (www.spectorsoft.com). Automatic logging means that you get something like a retrospective script of what was tested.
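As a rough illustration of the idea on this slide (this sketch is not part of the tutorial material, and the file name and helper function are assumptions), a tester-side action log can be as simple as a timestamped append-only file that later reads back like a retrospective script:

```python
# Minimal sketch of tester-side automatic logging (hypothetical helper,
# not part of the tutorial or of any specific logging tool).
import datetime
import json

LOG_PATH = "session_log.jsonl"  # assumed output file name

def log_action(action, **details):
    """Append one timestamped test action so the session can be
    reconstructed later as a retrospective script."""
    entry = {
        "time": datetime.datetime.now().isoformat(timespec="seconds"),
        "action": action,
        "details": details,
    }
    with open(LOG_PATH, "a", encoding="utf-8") as f:
        f.write(json.dumps(entry) + "\n")

# Example usage during an exploratory session:
# log_action("insert_picture", file="clipart_042.png", document="large_table.doc")
# log_action("observe", note="memory usage climbed steadily after 50 inserts")
```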
  • 29. Copyright © 1996-2013, Satisfice, Inc. dddd 27 27 53 Introducing the Test Session 1) Charter 2) Uninterrupted Time Box 3) Reviewable Result 4) Debriefing vs.
  • 30. Copyright © 1996-2013, Satisfice, Inc. dddd 28 28 Charter: A clear mission for the session A charter summarizes the goal or activity of the session. Before a session, it is a statement of how the session will be focused. Afterward, it should be changed, if necessary, to reasonably characterize the actual focus of the session. General charters are the norm, early in the project: “Analyze the Insert Picture function” More specific charters tend to emerge later on: “Test clip art insertion. Focus on stress and flow techniques, and make sure to insert into a variety of documents. We’re concerned about resource leaks or anything else that might degrade performance over time.” Charter Patterns: Evolving test strategy Intake Sessions (Goal: negotiate mission) “Interview the project manager about testing Xmind.” − Survey Sessions (Goal: learn product) “Familiarize yourself with Xmind.” Setup Sessions (Goal: create testing infrastructure) “Develop a library of mindmaps for testing Xmind.” Analysis Sessions (Goal: get ideas for deep coverage) “Identify the primary functions of Xmind.” “Construct a test coverage outline.” “Brainstorm test ideas.” “Prepare a state model for state-based testing.” “Perform a component risk-analysis to guide further testing.” “Discover all the error messages in Xmind.”
  • 31. Copyright © 1996-2013, Satisfice, Inc. dddd 29 29 Charter Patterns: Evolving test strategy Deep Coverage Sessions (Goal: find the right bugs) “Perform scenario testing based on the scenario playbook.” “Perform a tour that achieves double-transition state coverage.” “Perform steeplechase boundary testing on the major data items.” “Test each error message in Xmind.” “Perform a function tour using the 2300 node mindmap.” Closure Sessions (Goal: get ready to release) “Verify the latest fixes.” “Re-test tutorial with the latest build.” “Review help files and readme.” “Go over deferred bugs with Customer Support people.” “Perform clean-machine install test.” Time Box: Focused test effort of fixed duration A normal session is 90 minutes long. Base timing can be adjusted based on your context A real session may be somewhat longer or shorter. A normal session is uninterrupted. A real session may be somewhat interrupted. Real sessions are “normed” for the purposes of reporting metrics. This is so that our clients don’t get confused by the numbers.
  • 32. Copyright © 1996-2013, Satisfice, Inc. dddd 30 30 Example Session 1… Work for 20 minutes (fifteen minute interruption) Work for 75 minutes (ten minute interruption) Work for 40 minutes (end the real session) Session 2… Work for 35 minutes (twenty minute interruption) Work for 85 minutes (end the real session) This results in 2 session reports; 5 hours of real time. It would be reported as 3 normal sessions. If a session is so interrupted that your efficiency is destroyed, abort the session until you get control of your time.
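The arithmetic behind "reported as 3 normal sessions" can be made explicit. A minimal sketch, assuming a 90-minute normal session and counting only the uninterrupted working time from the example above (the function name is mine):

```python
# Norming real sessions for reporting, assuming a 90-minute normal session.
NORMAL_SESSION_MIN = 90

def normed_sessions(work_periods_min):
    """work_periods_min: list of uninterrupted working periods, in minutes."""
    return sum(work_periods_min) / NORMAL_SESSION_MIN

session_1 = [20, 75, 40]   # the 15- and 10-minute interruptions are not counted
session_2 = [35, 85]       # the 20-minute interruption is not counted

total = normed_sessions(session_1 + session_2)
print(round(total, 2))     # 2.83 -> reported as roughly 3 normal sessions
```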
  • 33. Copyright © 1996-2013, Satisfice, Inc. dddd 31 31 Debriefing: Measurement begins with observation The manager reviews session report to assure that he understands it and that it follows the protocol. The tester answers any questions. Session metrics are checked. Charter may be adjusted. Session may be extended. New sessions may be chartered. Coaching happens in both directions. Reviewable Result: A scannable session sheet Charter − #AREAS Start Time Tester Name(s) Breakdown − #DURATION − #TEST DESIGN AND EXECUTION − #BUG INVESTIGATION AND REPORTING − #SESSION SETUP − #CHARTER/OPPORTUNITY Data Files Test Notes Bugs − #BUG Issues − #ISSUE CHARTER ----------------------------------------------- Analyze MapMaker’s View menu functionality and report on areas of potential risk. #AREAS OS | Windows 2000 Menu | View Strategy | Function Testing Strategy | Functional Analysis START ----------------------------------------------- 5/30/00 03:20 pm TESTER ----------------------------------------------- Jonathan Bach TASK BREAKDOWN ----------------------------------------------- #DURATION short #TEST DESIGN AND EXECUTION 65 #BUG INVESTIGATION AND REPORTING 25 #SESSION SETUP 20
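Because the sheet uses fixed "#" tags, it can be scanned mechanically, which is what makes the result reviewable at scale. A minimal parsing sketch, assuming session sheets are plain text files using the tags shown above (illustrative only; the actual SBTM scan tools differ):

```python
# Sketch: read the TASK BREAKDOWN percentages from one tagged session sheet.
import re

TAGS = {
    "#TEST DESIGN AND EXECUTION": "test",
    "#BUG INVESTIGATION AND REPORTING": "bug",
    "#SESSION SETUP": "setup",
}

def read_breakdown(sheet_text):
    """Return the T/B/S numbers recorded in one session sheet."""
    breakdown = {}
    for tag, key in TAGS.items():
        match = re.search(re.escape(tag) + r"\s+(\d+)", sheet_text)
        if match:
            breakdown[key] = int(match.group(1))
    return breakdown

# Example, using the sheet shown above:
sheet = """#TEST DESIGN AND EXECUTION 65
#BUG INVESTIGATION AND REPORTING 25
#SESSION SETUP 20"""
print(read_breakdown(sheet))   # {'test': 65, 'bug': 25, 'setup': 20}
```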
  • 34. Copyright © 1996-2013, Satisfice, Inc. dddd 32 32 Reviewable Result: Various tools are now available. Rapid Reporter SessionWeb SBTExecute Atlassian: Bonfire Atlassian: Jira The Breakdown Metrics Testing is like looking for worms Test Design and Execution Bug Investigation and Reporting Session Setup
  • 35. Copyright © 1996-2013, Satisfice, Inc. dddd 33 33 Challenges of SBTM Architecting the system of charters (test planning) Making time for debriefings Getting the metrics right Creating good test notes Keeping the technique from dominating the testing Managing chronic interruptions For example session sheets and metrics, see http://www.satisfice.com/sbtm Whiteboard Used for planning and tracking of test execution Suitable for use in waterfall or agile (as long as you have control over your own team’s process) Use colours to track: − Features, or − Main Areas, or − Test styles (performance, robustness, system)
  • 36. Copyright © 1996-2013, Satisfice, Inc. dddd 34 34 Whiteboard Divide the board into four areas: − Work to be done − Work in Progress − Cancelled or Work not being done − Completed work Red stickies indicate issues (not just bugs) Create a sticky note for each half day of work (or mark # of half days expected on the sticky note) Prioritize stickies daily (or at least twice/wk) Finish “on-time” with low priority work incomplete Whiteboard Example End of week 1 Out of 7 weeks
  • 37. Copyright © 1996-2013, Satisfice, Inc. dddd 35 35 Whiteboard Example End of week 6 Out of 7 weeks Reporting An Excel Spreadsheet with: − List of Charters − Area − Estimated Effort − Expended Effort − Remaining Effort − Tester(s) − Start Date − Completed Date − Issues − Comments Does NOT include pass/fail percentage or number of test cases
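The reporting spreadsheet on this slide is just tabular charter data, so it is easy to generate from session records. A minimal sketch writing it as a CSV: the column names come from the slide, while the file name and data shapes are assumptions; note that, as the slide says, there is deliberately no pass/fail percentage or test-case count:

```python
# Sketch: write the charter-tracking report as a CSV (columns from the slide).
import csv

COLUMNS = ["Charter", "Area", "Estimated Effort", "Expended Effort",
           "Remaining Effort", "Tester", "Date Started", "Date Completed",
           "Issues", "Comments"]

def write_report(rows, path="charter_report.csv"):
    with open(path, "w", newline="", encoding="utf-8") as f:
        writer = csv.DictWriter(f, fieldnames=COLUMNS)
        writer.writeheader()
        writer.writerows(rows)

write_report([{
    "Charter": "INP vs. SHINE", "Area": "ARQ",
    "Estimated Effort": 6, "Expended Effort": 6, "Remaining Effort": 0,
    "Tester": "ncowan", "Date Started": "12/01/2011",
    "Date Completed": "12/04/2011", "Issues": "", "Comments": "",
}])
```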
  • 38. Copyright © 1996-2013, Satisfice, Inc. dddd 36 36 Sample Report
Charter | Area | Estimated Effort | Expended Effort | Remaining Effort | Tester | Date Started | Date Completed | Issues Found | Comments
Investigation for high QLN spikes on EVLT | H/W Performance | 0 | 20 | 0 | acode | 12/10/2011 | 01/14/2012 | 1617032 | Lots of investigation. Problem was on 2-3 out of 48 ports which just happened to be 2 of the 6 ports I tested.
ARQ Verification under different RA Modes | ARQ | 2 | 2 | 0 | ncowan | 12/14/2011 | 12/15/2011 | |
POTS interference | ARQ | 2 | 0 | 0 | --- | 01/08/2012 | 01/08/2012 | | Decided not to test as the H/W team already tested this functionality and time was tight.
Expected throughput testing | ARQ | 5 | 5 | 0 | acode | 01/10/2012 | 01/14/2012 | |
INP vs. SHINE | ARQ | 6 | 6 | 0 | ncowan | 12/01/2011 | 12/04/2011 | |
INP vs. REIN | ARQ | 6 | 7 | 5 | jbright | 01/06/2012 | 01/10/2012 | | To translate the files properly, had to install Python solution from Antwerp. Some overhead to begin testing (installation, config test) but was fairly quick to execute afterwards.
INP vs. REIN + SHINE | ARQ | 12 | | 12 | | | | |
Traffic delay and jitter from RTX | ARQ | 2 | 2 | 0 | ncowan | 12/05/2011 | 12/05/2011 | |
Attainable Throughput | ARQ | 1 | 4 | 0 | jbright | 01/05/2012 | 01/08/2012 | | Took longer because was not behaving as expected and I had to make sure I was testing correctly. My expectations were wrong based on virtual noise not being exact.
Weekly Report: A PowerPoint slide indicating the important issues (not a count but a list) − “Show stopping” bugs − New bugs found since last report − Important issues with testing (blocking bugs, equipment issues, people issues, etc.) − Risks (updates and newly discovered) − Tester concerns (if different from above) − The slide on the next page indicating progress
  • 39. Copyright © 1996-2013, Satisfice, Inc. dddd 37 37 Sample Report: chart titled “Awesome Product” Test Progress as of 02/01/2012, plotting effort (person half-days) per feature (ARQ, SRA, Vectoring, Regression, H/W Performance) as Original Planned Effort, Expended Effort, and Total Expected Effort. Direction of lines indicates effort trend since last report. Solid centre bar = finished. Green: No concerns. Yellow: Some concerns. Red: Major concerns. 74 Copyright © 1995-2014, Satisfice, Inc.
  • 40. Copyright © 1996-2013, Satisfice, Inc. dddd 38 38 75 Copyright © 1995-2014, Satisfice, Inc. Appendix A Reporting the TBS Breakdown A guess is okay, but follow the protocol Test, Bug, and Setup are orthogonal categories. Estimate the percentage of charter work that fell into each category. Nearest 5% or 10% is good enough. If activities are done simultaneously, report the highest precedence activity. Precedence goes in order: T, B, then S. All we really want is to track interruptions to testing. Don’t include Opportunity Testing in the estimate.
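The reporting protocol on this slide (Test takes precedence over Bug, which takes precedence over Setup, and estimates rounded to the nearest 5% are good enough) can be expressed directly. A minimal sketch, under the assumption that each chunk of session work is labelled with the activities it involved; the data shapes and function name are mine:

```python
# Sketch of the TBS reporting protocol: when activities overlap, count the
# chunk under the highest-precedence one (Test > Bug > Setup), then round
# the percentages to the nearest 5%.
PRECEDENCE = ["test", "bug", "setup"]

def tbs_breakdown(chunks):
    """chunks: list of (minutes, set_of_activities) for one charter."""
    minutes = {"test": 0, "bug": 0, "setup": 0}
    for duration, activities in chunks:
        for activity in PRECEDENCE:          # highest precedence wins
            if activity in activities:
                minutes[activity] += duration
                break
    total = sum(minutes.values()) or 1
    return {k: 5 * round(100 * v / total / 5) for k, v in minutes.items()}

# Example: bug investigation that also exercised new tests counts as Test.
print(tbs_breakdown([(60, {"test"}), (20, {"test", "bug"}), (10, {"setup"})]))
# -> {'test': 90, 'bug': 0, 'setup': 10}
```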
  • 41. Copyright © 1996-2013, Satisfice, Inc. dddd 39 39 Activity Hierarchy: All test work fits here, somewhere (labels: all work, non-session, session, opportunity, on charter, test, bug, setup, inferred). Breakdown: Non-Session 61%, Test 28%, Bug 4%, Opportunity 1%, Setup 6%. Work Breakdown: Diagnosing the productivity. Do these proportions make sense? How do they change over time? Is the reporting protocol being followed? (Chart: work breakdown over time, weekly from 5/26 to 8/18.)
  • 42. Copyright © 1996-2013, Satisfice, Inc. dddd 40 40 Coverage: Specifying coverage areas These are text labels listed in the Charter section of the session sheet (e.g. “insert picture”). Coverage areas can include anything − areas of the product − test configuration − test strategies − system configuration parameters. Use the debriefings to check the validity of the specified coverage areas. Coverage: Are we testing the right stuff? Is it a lop-sided set of coverage areas? Is it distorted reporting? Is this a risk-based test strategy? (Chart: Distribution of On Charter Testing Across Areas.)
  • 43. Copyright © 1996-2013, Satisfice, Inc. dddd 41 41 Using the Data to Estimate a Test Cycle 1. How many perfect sessions (100% on-charter testing) does it take to do a cycle? (let’s say 40) 2. How many sessions can the team (of 4 testers) do per day? (let’s say 3 per day, per tester = 12) 3. How productive are the sessions? (let’s say 66% is on-charter test design and execution) 4. Estimate: 40 / (12 * .66) = 5 days 5. We base the estimate on the data we’ve collected. When any conditions or assumptions behind this estimate change, we will update the estimate.
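The estimate on this final slide is plain arithmetic over the collected session data. A minimal sketch using the slide's own numbers (the function and variable names are mine):

```python
# Sketch of the test-cycle estimate from the slide above
# (names are mine; the numbers are the slide's worked example).
def cycle_days(perfect_sessions_needed, sessions_per_day, on_charter_ratio):
    """Days needed for a cycle, given measured session productivity."""
    return perfect_sessions_needed / (sessions_per_day * on_charter_ratio)

testers = 4
sessions_per_tester_per_day = 3
estimate = cycle_days(40, testers * sessions_per_tester_per_day, 0.66)
print(round(estimate, 1))   # ~5.1, i.e. about 5 days
```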