Teams of testers often get minimal feedback on the results of their exploratory testing. Since the introduction of Session Based Test Management, standard practice has been to save exploratory test sessions in the form of flat text files. These files require additional tools to search and parse, which limits access to statistical data. I wanted to close this feedback loop with my testing department, so I sought a solution. After investigating a few options, I decided SessionWeb, an open source tool by Mattias Gustavsson, was a good starting place. SessionWeb stores session data in a relational database, so we were able to easily add customizations and create additional reports that mattered to our test leads and management. The instantaneous, on-demand nature of the reports from our test sessions has greatly increased adoption of session based test management practices in our department. The goal of this talk is to help others realize the vast amount of potential data that exists within their ET session notes, and how to use that data to provide deeper and more meaningful insights into their teams’ testing strategies and results.
Readers will learn how using a database to store test session notes and statistics can increase reporting capabilities and help generate meaningful statistics. This type of quick feedback can help testers increase adoption and acceptance of SBTM practices by all levels of their organizations.
Following a test team’s journey as it attempts to improve SBTM consistency across multiple agile teams.
A reflection on an experience, not theory.
Often it is easy to continue doing something the same way instead of rethinking how it could be done
Why do we need improved reporting?
- Knowing what we’ve done
- Managers NEED to know
Why use “exploratory methods”?
- freedom for talented testers to explore the areas of importance, as they see them
- better coverage; repetitive testing can be done via automation
Who Am I?
- ”Young” tester seeking to push the limits of myself and my company, and to grow and promote the industry of software testing
I like to learn by doing
Someone has to try new things…we were lucky enough to have a chance to
What is SBTM?
Creative test coverage with metrics
Who recognizes this template?
Why does it exist? What is it for?
-Using text files is not wrong, but we can do better.
There are tools to overcome some of these issues, but why create new tools when easy solutions exist?
Nothing met our immediate needs better than SessionWeb
Piloted the tool with 3 Agile teams (approx. two testers on each team)
Issues WE wanted to solve:
-tracking sessions/test notes
-categorizing sessions
-reports
-feedback!
Issues we didn’t need to solve:
-Time spent recording test steps
-Capturing data/screenshots/etc
Demo time!
What can we infer from this?
- Lots of sessions waiting to be debriefed. Are people not doing debriefs? If not…why not?
- 0 Not Executed means no pre-planning of sessions (at least currently)
- 18 In-Progress. Do we have 18 testers currently testing? If not…why are these sessions in progress?
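Because the sessions live in a relational database rather than flat text files, a status breakdown like this is a single aggregate query instead of a script that parses files. A minimal sketch using SQLite and a hypothetical `sessions` table with a `status` column (SessionWeb's actual schema may differ, and the row counts here are illustrative stand-ins):

```python
import sqlite3

# In-memory stand-in for the session database (hypothetical schema;
# SessionWeb's real tables may differ).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sessions (id INTEGER PRIMARY KEY, status TEXT)")
conn.executemany(
    "INSERT INTO sessions (status) VALUES (?)",
    [("Debrief Pending",)] * 5
    + [("In Progress",)] * 18
    + [("Completed",)] * 40,
)

# One GROUP BY produces the on-demand status report.
for status, count in conn.execute(
    "SELECT status, COUNT(*) FROM sessions GROUP BY status ORDER BY COUNT(*) DESC"
):
    print(f"{status}: {count}")
```

Any chart in the dashboard reduces to a query of this shape, which is what makes the reports instantaneous and consistent across teams.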
What can we infer?
We know testers prefer Chrome
Is this a good representation of what our clients use? (The answer is no…)
Why is IE9 more popular than IE10 or IE11?
What can we infer?
- Clearly all of our testers use Windows 7 machines.
- Someone has access to OSX/iOS, but it’s probably not easy enough for people to use frequently
What can we infer?
There has been significantly more testing on Feature F1500 and F1461 than the other features shown.
F1500 appears to have a large number of found defects associated with it – this makes sense
F1461 doesn’t have nearly as many defects found – perhaps the test time spent here is less valuable
F1593 has lots of defects found and not much test time associated with it. Perhaps more test time should be spent here.
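The time-versus-defects comparison above is the same kind of query. A minimal sketch, again assuming hypothetical `sessions` and `defects` tables keyed by a feature id (not SessionWeb's actual schema; the sample figures are illustrative, not the real pilot data):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE sessions (id INTEGER PRIMARY KEY, feature TEXT, minutes INTEGER);
CREATE TABLE defects  (id INTEGER PRIMARY KEY, feature TEXT);
""")
# Illustrative sample data: F1461 gets lots of time but few defects,
# F1593 gets little time but many defects.
conn.executemany("INSERT INTO sessions (feature, minutes) VALUES (?, ?)",
                 [("F1500", 900), ("F1461", 840), ("F1593", 120)])
conn.executemany("INSERT INTO defects (feature) VALUES (?)",
                 [("F1500",)] * 12 + [("F1461",)] * 2 + [("F1593",)] * 9)

# Test time vs. defects found per feature: outliers in either direction
# suggest where to shift effort.
query = """
SELECT s.feature,
       SUM(s.minutes) AS test_minutes,
       (SELECT COUNT(*) FROM defects d WHERE d.feature = s.feature) AS defects
FROM sessions s
GROUP BY s.feature
ORDER BY test_minutes DESC
"""
for feature, minutes, defects in conn.execute(query):
    print(f"{feature}: {minutes} min, {defects} defects")
```

With flat text files this cross-referencing would need custom parsing; in the database it is one join-style query away, on demand.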
Piloted with one small area of product development
Realized we had to keep improving our process as we found areas for improvement
Lots of change/flux was confusing to testers/managers
I still say Success!
Pilot stats: 6 months in, 650+ Sessions, 100+ Sessions/month, approx. 1300 hours of in-session time
New department after reorganization:
-dev managers specifically asked for SessionWeb to be kept alive
-gave multiple presentations about good SBTM practices to managers and their teams
More insight into the test results
Didn’t require individuals to roll up the information
Data was available “on demand” to whoever needed it (testers, leads, managers, directors)
Reports are clear and consistent
No ramp-up time to understand the data – presented in charts consistently across groups
Raw data available to do additional manipulations/analysis
Helped everyone (testers, devs, managers) understand exploratory/session based testing, and why we capture what we do
Consistent process means anyone can test for any team and not have to ask questions about the how/what/where
Helped maintain good charters/test notes
Increased debriefs and the quality of them due to visibility