Exploratory Testing Explained
Jon Bach
QE Director, Live Site Quality
jobach@ebay.com
STAR Canada 2013
Do you see structure here?
How about here?

… or here?

http://www.itechnews.net/2008/03/29/steve-jobs-mosaic-portrait/
… or here?

http://www.japanquakemap.com/
… or here?

Preamble
Ever use the term "playing around" to describe your testing?
Ever cringe after saying it, wishing there was a better way of
describing what you did than to give the impression it was all
accidental and random?
If so, this workshop may help you understand and explain
exploratory testing as a thoughtful, purposeful approach whose
results stand up under scrutiny.

Promises
• Participate in exercises that focus on bug isolation and investigation, risks and vulnerabilities.
• Learn frameworks and heuristics of exploration to use in tight situations.
• Discover ways to report your exploration so it stands up to scrutiny.

There is structure and purpose if you know how to identify it and tell a story about it.
Why this talk?
1) Exploratory testers want respect: When testers explore during testing, they find great bugs. However, since they often don't know how to describe their thinking, their work is often dismissed as "playing around".
2) The documentation dilemma: Project managers may insist that all testing be documented, so how do you balance time spent documenting with time spent testing?
3) Your work might be scrutinized: You may have to give a report someday about something you did that was exploratory – like attending this conference.
Exercise (eBay Search)
Most bizarre thing for sale on eBay?
Most expensive thing on eBay?
What's trending?
How can you find completed items?
What's the most common item sold?
How many categories of items for sale?
How many actual items?
Exploratory Testing
• Sabourin: “continuous test design as testing continues;
continuous testing as design continues; continuous test
planning as testing continues”
• Hendrickson: a style of testing in which you explore the
software while simultaneously designing and executing
tests, using feedback from the last test to inform the next
(Test-Driven Testing?)
• Bolton: Operating and observing the product with the
freedom and mandate to investigate it in an open-ended
search for information about the program.
• Kaner: Simultaneous learning, design and execution,
with an emphasis on learning.
“The” ET Definition
A style of software testing…
that emphasizes the personal freedom…
and responsibility of the individual tester…
to continually optimize the quality of his/her work…
by treating test-related learning…
test design…
test execution…
and test result interpretation…
as mutually supportive activities…
that run in parallel…
throughout the project.
-- Cem Kaner, 2006
Analogies
Psychologist
Driving a car
"20 Questions"
Sports
Bounty Hunter
Going to a testing conference
Job Interview
Jam session
Newspaper reporter
Missions that inspire ET
• Change test case variables
• Execute a checklist
• Regress a list of bugs
• Confirm a rumor
• Design a test case
• Write some automation
Testers light the way.
This is our role.
We see things for what they are.
We make informed decisions about quality possible, because we think critically about software.
Key Idea
Testing is…
an infinite process
of comparing the invisible
to the ambiguous
in order to avoid the unthinkable
happening to the anonymous.
What is testing?
"Try it and see if it works."
Learn anything reasonable that matters about whether it can work and how it might not work.
What is testing?
"Try it and see if it works." Unpacked, that short phrase covers coverage, oracles, and procedures:
Get it set up
Choose where to look
Read specs
Run it
Run it again, maybe
See what's there
See what's not there
See if the product matches
Find problems… especially the bad ones
“I want you to test this…”
What is testing?
If you don't have an understanding and an agreement on what the mission of your testing is, then doing it "rapidly" would be pointless.
"everything that matters"
The "tester freedom" scale
pure scripted → vague scripts → fragmentary test cases (scenarios) → charters → roles → freestyle exploratory

To know where a test falls on this scale, the tester must ask themselves: "to what extent am I in control of the test, and from where did the idea originate?"
Exercise
This app asks you for the next item in a sequence of numbers.
What is the next number?
Operating rule?
Exploration is discovery…
(image: an old map dotted with question marks and the label "Parts Unknown")
…that starts with an idea…
and ends with a perception…
(image: an old map labeled "Earth stops here")
…depending on the mission
Before exploring vs. after exploring
…mission, mission, mission

If you don’t know your mission,
you’re not testing.
That’s ok, just call it *touring*.
Lewis & Clark, 1802
Mission: Find a water passage across North America…
The charter from Jefferson
“The object of your mission is to explore the
Missouri river, & such principal stream of it, as,
by its course & communication with the water of
the Pacific ocean may offer the most direct &
practicable water communication across this
continent, for the purposes of commerce.”
http://www.monticello.org/jefferson/lewisandclark/instructions.html
Chartering

Making your own decisions about what you will work on and
how you will work. Understanding your client’s needs, the
problems you must solve, and assuring that your work is on
target.
Sponsors and stakeholders
• Test Manager
• Product Manager
• CEO
• Customer
• Developer
• Marketing
• Tech Writer
• Customer Support
• Other testers
Charter-based method #1: Session-Based Exploration
Think in time-boxed missions to explore, resulting in a test report with Notes, Bugs, and Issues.

Structure: the "Session"
1) Time Box
2) Reviewable Result
3) Debriefing
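As a rough illustration of those three elements, here is a small sketch of how a session's time box, charter, and reviewable result might be captured as structured data. The class and field names are illustrative assumptions, not a format prescribed in the slides.

```python
# Hypothetical sketch of a session-based test report: a time-boxed
# charter plus the reviewable result (notes, bugs, issues).
from dataclasses import dataclass, field
from typing import List

@dataclass
class SessionReport:
    charter: str                 # the mission for this session
    tester: str
    duration_minutes: int = 90   # the time box, e.g. 60-120 minutes
    notes: List[str] = field(default_factory=list)
    bugs: List[str] = field(default_factory=list)
    issues: List[str] = field(default_factory=list)  # questions, obstacles, concerns

session = SessionReport(
    charter="Explore printing with unusual page ranges; look for crashes and data loss",
    tester="jbach",
)
session.notes.append("Tried print preview with a zero-page selection; dialog allowed it")
session.bugs.append("Crash when printing page range '5-2' (end before start)")
session.issues.append("No oracle for correct zoom behavior; need spec or PM decision")
```

The debriefing then works from this reviewable result rather than from memory.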
“I want you to test this…”

My testing demo…
Some sample session charters

• Installation: When installed, does Triangle! put any files in the wrong places? Does it leave any files for the uninstall? Check the registry keys; use InCtrl to see what changes are made. Installation is new, so we want to be sure it's clean.

• Boundary testing: We got word from customer support that there are run-time errors when using integers over 32000, but no one can repro it. Best recon is on Win XP Pro with Office 2003 running in the background. Sam K. in CSS says you can use his machine, and he also has customer specs.

• Ship drill: Start Triangle! right out of the box. For example, is the readme ready to go? We're waiting for word from Legal on the License Agreement, but that shouldn't hold you up. Also make sure you hit Vista and see what issues arise there.

• Claims testing: Triangle! is meant for first graders, but we plan to ship a version to General Dynamics in a few months. Try some usability profiles or personas to see what functions become more or less risky. Also, discover the algorithm by which Triangle! reports its results. Is it way off from what a user would expect? Does it cause the user to lower their confidence?
Charter-creation method #2: Open-Book Testing
The act of creating open-ended questions such that testers…
• …are immersed in the product right away, building a model or mind map.
• …learn how they are provoked into critical thinking by being exposed to many types of questions (test ideas).
• …quickly find bugs and raise issues in answering the questions they are given.
Questions
test ideas, test cases, test scenarios, test plans, test scripts, test designs, test strategies, test heuristics
These comprise the exam that the software will either pass or fail.
A few non-obvious (?) sources for charters
• Bug database
• Testers (paired testing)
• Programmers (different domain expertise)
• Similar (or competing) products
• Customer Support
• Claims made by marketing
• Emails / Meetings / RSS feeds
Resources
Questions and answers can originate from the same sources:
• Documentation / Specifications
• Web forums
• Previous products
• Team members
• Competing products
• PSS data / KB articles
• Your expertise
• Heuristics
• Help files
• Manuals
IM OPEN
• Interrogate: The test manager or tester develops a list of questions to answer.
• Manipulate: The testers execute actions to answer the question.
• Observe: Testers take notes on what they find.
• Plan: Testers determine any follow-up questions (tests) that occur to them, in preparation to debrief their results.
• Evaluate: Testers and test manager meet to compare answers (test results).
• Negotiate: After the debrief, testers and test managers talk about the appropriate next steps in mission or coverage.
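One way to picture a single pass through the IM OPEN cycle is the sketch below. The data shapes and function names are assumptions for illustration, not part of the method itself.

```python
# Illustrative sketch of one round of IM OPEN in open-book testing.
from dataclasses import dataclass, field
from typing import List

@dataclass
class Question:
    text: str                                                # Interrogate
    observations: List[str] = field(default_factory=list)    # Observe
    follow_ups: List[str] = field(default_factory=list)      # Plan

def manipulate_and_observe(q: Question) -> None:
    """Manipulate the product to answer the question; note what is seen."""
    q.observations.append(f"Noted behavior while exploring: {q.text!r}")

def debrief(questions: List[Question]) -> List[str]:
    """Evaluate answers with the test manager; Negotiate the next charters."""
    next_charters = []
    for q in questions:
        next_charters.extend(q.follow_ups)
    return next_charters

questions = [Question("What happens when the integer field exceeds 32000?")]
for q in questions:
    manipulate_and_observe(q)
    q.follow_ups.append("Re-test the boundary on Win XP Pro with Office 2003 running")
print(debrief(questions))
```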
A Heuristic Test Strategy Model
Project Environment + Product Elements + Quality Criteria → Tests → Perceived Quality
Coverage
Product coverage is the proportion of the product that has been tested.

Coverage dimensions: Structure, Function, Data, Platform, Operations, Time

Quality criteria: Capability, Reliability, Usability, Security, Scalability, Performance, Installability, Compatibility, Supportability, Testability, Maintainability, Portability, Localizability
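As one way of putting these dimensions to work, here is a hypothetical sketch (the data and field names are made up for illustration) that tags session notes with the coverage areas they touch and reports what has not been covered yet.

```python
# Hypothetical sketch: tag session notes with the coverage dimensions above,
# then summarize which dimensions have not been touched.
from collections import Counter

COVERAGE_DIMENSIONS = {"structure", "function", "data", "platform", "operations", "time"}

session_notes = [
    {"note": "Printed a 500-page document to a network spooler", "covers": {"data", "platform"}},
    {"note": "Printed hourly and month-end reports back to back", "covers": {"time", "operations"}},
]

touched = Counter()
for entry in session_notes:
    touched.update(entry["covers"])

untouched = COVERAGE_DIMENSIONS - set(touched)
print("Touched:", dict(touched))
print("Not yet covered:", sorted(untouched))
```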
Structural Coverage
Test what it's made of.
Print testing example:
– Files associated with printing
– Code modules that implement printing
– Code statements inside the modules
– Code branches inside the modules
Functional Coverage
Test what it does.
Print testing example:
– Print, page setup and print preview
– Print range, print copies, zoom
– Print all, current page, or specific range
Data Coverage
Test what it does it to.
Print testing example:
– Types of documents
– Items in documents, size and structure of documents
– Data about how to print (e.g. zoom factor, no. of copies)
Platform Coverage
Test what it depends upon.
Print testing example:
– Printers, spoolers, network behavior
– Computers
– Operating systems
– Printer drivers
Operations Coverage
Test how it's used.
Print testing example:
– Use defaults
– Use realistic environments
– Use realistic scenarios
– Use complex flows
Time Coverage
Test how it's affected by time.
Print testing example:
– Try different network or port speeds
– Print one document right after another, or after long intervals
– Try time-related constraints: spooling, buffering, or timeouts
– Try printing hourly, daily, month-end, and year-end reports
– Try printing from two workstations at the same time
– Try printing again, later.
Exercise

Does it work?
What is the hidden feature?
What story does the data tell?
How did you *find* that?
Some Exploration Skills and Tactics
"MR.Q COMP GRABC R&R?"
Modeling, Chartering, Generating/Elaborating, Recording, Resourcing, Observing, Refocusing, Reporting, Questioning, Manipulating, Alternating, Pairing, Branching/Backtracking, Conjecturing

Exploratory testing is a mindset using this skillset.
Skills of Exploration
• Put the tester's mind at the center of testing.
• Learn to deal with complexity and ambiguity.
• Learn to tell a compelling testing story.
• Develop testing skills through practice, not just talk.
• Use heuristics to guide and structure your process.
• Be a service to the project community, not an obstacle.
• Consider cost vs. value in all your testing activity.
• Diversify your team and your tactics.
• Dynamically manage the focus of your work.
• Your context should drive your choices, both of which evolve over time.
Testing ourselves
Chartering is an opportunity for testers and managers to
cultivate and improve testing skill:

How did you arrive at that answer?
What did you see along the way?
Was there anything confusing about the questions?
Any riffs off of questions?
What test ideas did others have with the same question?
What managers might ask
How did you spend your time?
What did you find?
Did you need some help / tools?
Do you think there’s more to do here?
Was this charter reasonable?
Agenda: “PROOF”
Past
Results
Obstacles
Outlook
Feelings
The real message
What's being asked → what they may be thinking:
What was your mission? → Remind me what I told you to do…
How did it go? → What do I worry about next?
How far did you get? → Are we closer to shipping?
Need anything? → Can I speed this along?
When will you be done? → Will I get my bonus?
What to document
(Historical explorer vs. tester)

Observations (to the degree you think they are relevant to stakeholders)
Explorer: drawings of flora / fauna; descriptions of indigenous people; landmarks
Tester: feature model; text from log files; text from dialogs

Conjectures (inferences based on experiences: after I test, I think I know something)
Explorer: what is this thing?; where should we go today?; how do we get there?; new orders from HQ?; are those people hostile?
Tester: test ideas; questions; product and project issues; concerns; risks

Project information (independent of observer)
Explorer: mission; supplies and staff; latitude / longitude; death and disease; supply status
Tester: charter; test actions; config info; build details; tools used
Testing *is* journalism
It involves consulting sources, references, oracles -- and
taking notes about those details.
It requires communication to an audience who wants
information and who will either scrutinize or trust your
report.
It involves a story formed by following up on rumors,
tips, leads, conjectures, and questions – in pursuit of
the truth.
When I was 10, Dad said…
Every story is this simple:
Somebody wants something…
Something stands in their way…
This is what they do about it…
Story Elements (Testing)
Characters (Somebody): Testers, Customers, Stakeholders
Purpose (Wants something): "How stable are these new features?" / "I want to print all of my recipes." / "Try to repro this bug."
Conflict (Something's in the way): Limited budget and time / "How does this thing work?" / "We have yet to run <these> tests."
Actions (What was done about it): Risks exposed, Techniques used, Features covered
Key Idea
Agility is about the freedom
to create, learn, and adapt,
as we get fast feedback.
[ Responding to change
over following a plan ]
Key Idea
Exploratory testing is about
the freedom
to discover, learn, and adapt,
while delivering fast feedback.
A report of my exploration
Activities to report
• Bug Investigation
• Test Design and Execution
• Session Setup (and Reporting)
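A minimal sketch, with assumed minute counts, of how time spent on those three activities might be summarized in a session report:

```python
# Minimal sketch (numbers are assumed) of reporting how session time was
# spent across the three reportable activities.
minutes = {
    "test design and execution": 55,
    "bug investigation": 25,
    "session setup (and reporting)": 10,
}

total = sum(minutes.values())
for activity, spent in minutes.items():
    print(f"{activity}: {spent} min ({spent / total:.0%})")
```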
Conclusion
There is structure and purpose in exploration… know how to identify it.
• Management Method: Session-Based tests
• Chartering Method: Open-Book Testing
• Idea Method: Heuristic Test Strategy Model
• Technique inventory: stress, flow, risk, claims, etc.

Editor's Notes

  • #10 STAR demo, commentating, Bingo, Myths, Skills and Techniques
  • #25 Invalid value: "$NaN" -- "the barcode value for a 02 year old Male, that lives in Tanzania, is 26 inches heigh and weighs 999 pounds is: $NaN"
  • #55 inputting all 17000 TLAs and getting the result: FWD, ROL, ROR, ROR – notice that BAK is missing and ROR is repeated… why? But, RPT and CLS are invisible commands – they do not show up in the list – so could there be other commands, still? The array problem 0, 1, 2, 3, 4