-
1.
Experimenting in Context for Exploratory Testing
Maaret Pyhäjärvi
Email: <maaret@iki.fi> | Twitter: maaretp
-
2.
Replacing a test-case-driven style with a learning-tester-driven style in two organizations
-
3.
What Testing Gives Us
Unit testing (testing as artifact creation): SPEC, FEEDBACK, REGRESSION, GRANULARITY.
Exploratory testing (testing as performance): GUIDANCE, UNDERSTANDING, MODELS, SERENDIPITY.
-
4.
Givens.
Some things you can’t change (yet)
-
5.
Experiments.
Try changing something!
-
6.
Finding the appropriate stretch
Case 1.
-
7.
Data-intensive application
-
8.
Givens. Things I could not change.
• Waterfall process.
• Contractual distance between acceptance testers and the subcontractor.
• Reporting based on test-case metrics.
• I manage, I don’t test.
• Business end users as testers.
-
9.
Experiments. Things I changed.
• Acceptance testers’ degree of freedom.
• Test cases: from step-by-step scripts to test data with a process outline for notes.
• Making “change requests” acceptable.
• Reporting ~20% of testing to the 3rd party.
• Unofficial tip-sharing sessions with the subcontractor.
-
10.
Finding the appropriate stretch
Case 2.
-
11.
Function-intensive application
-
12.
Givens. Things I could not change.
• Roadmapping creating a disconnect from current priorities.
• Tendency toward remote work.
• Developers doing the majority of testing.
• Requirements/specifications in UI-spec format.
-
13.
Experiments. Things I changed.
• No test cases or wasteful documentation.
• Tester with developer tools.
• Removing “acceptance testing” by moving testing to the team.
• Continuous delivery (without test automation).
• Holding space for testing to happen.
• True teamwork with mob programming.
-
14.
Framework of Management
[Diagram: a session-based management loop around “a day’s work”. A vision (“sandbox”) is broken into charters: a charter backlog of future testing (out of budget / next in importance), the current charter with its details, and session sheets of the past testing. Sessions produce bug reports, notes tagged #, ?, x, and a 20:20:60 task breakdown. Tester and test manager meet in a debriefing covering past results, obstacles, outlook, and feelings, which feeds coaching, playbooks, a metrics summary, the idea of exploration, the perception of quality and coverage, and the quality report.]
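To make the framework concrete, a minimal sketch of what one session sheet in this style might record. This is an illustration assuming common session-based test management conventions (a charter, a timeboxed session, a task breakdown, tagged notes, and a debrief on past results, obstacles, outlook, and feelings); the charter text, the tag meanings, and the 60/20/20 reading of the task split are assumptions, not taken from the deck.

CHARTER: Explore importing malformed data files to discover parsing failures.
SESSION: 90 minutes, tester NN
TASK BREAKDOWN: 60% testing, 20% bug investigation and reporting, 20% setup
NOTES:
  # covered: empty file, quoted separators, very large file
  ? question: what is the intended maximum file size?
  x bug: import silently drops rows after the first encoding error
DEBRIEF: past results, obstacles, outlook, feelings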
-
15.
Thank you.
@maaretp (please connect with me through Twitter or LinkedIn)
All testing may be exploratory, but some of it is focused on creating artifacts. Saying “scripting is just an approach” is belittling when scripting can be the main approach through which we use our powers when testing.
“there’s a process of knowing” – learning
Does not give us regression; serendipity (safety against things happening randomly) / unwanted serendipity events.
This is what it is and what it could be. There’s a direction to it, not just a statement of what it is.
Coaching is not just feedback; it’s pointing them in the right direction.
Safety.
EXPERIENCE (the verb) rather than facts; emotions over facts. REACTIONS.
HISTORY: lessons learned, checklists. Modeling.
UNDERSTANDING – where you start: knowing the thing (code & environment), knowing the user, knowing the problems, knowing the developers (how to help them, and what they do so that you can test efficiently), knowing the hackers (weird use cases beyond the common ones: “have you tried reading it upside down”), knowing all stakeholders, knowing the business priorities.
Uncovering things I cannot know, giving the application a chance to reveal information to me.
This allows you to know things.
Example area: 53 test cases (P1 – visible) and 184 (P2 – not visible)
Minimal-energy principle in reviews: “thanks for the feedback, we decide if we change or not”.
Disobeying: “add a test case before writing a bug report”.
Our written test cases were bad – just like anyone else’s, except for one.