Let’s take a look at how the process of quality assurance has evolved at Cognifide. I would like to take you on a journey through the transformation of the quality assurance process in our company, from the dinosaurs to the electrically driven car sent into space. A short history from the scripted approach to exploratory testing, from Testers to Quality Assurance Engineers, from the manual to the automated approach, from Quality Assurance to Quality Assistance, from Continuous Integration to Continuous Delivery, and many other elements of our software quality path. Have we found an ideal and bulletproof Quality Assurance process? Has the evolution finished? If not, what’s next?
Adam Makarowicz – Principal Quality Assurance Engineer at Cognifide. 8 years of experience in software testing. A highly motivated person who always tries to find the most effective solution in any situation. Works closely with clients to overcome their difficulties and help them reach their business goals. Swiftly changes hats between QA Lead, Technical Lead and Business Analyst to learn, share and accommodate project and company needs.
The quality, or there and back again
1. The Quality, or There and
Back Again
Adam Makarowicz
March 13, 2018
2. Adam Makarowicz
Principal Quality Assurance Engineer
8 years of experience in software testing
QA Lead, Technical Lead, Business Analyst, Product Owner
About me
10. 10
Exploratory testing
Exploratory approach
QA designs and executes tests simultaneously
relies on empirical knowledge of the tested application
supports creativity
emphasizes personal freedom and responsibility
continuously optimizes the quality of tested work
[Diagram: the exploratory cycle – Learn → Test Design → Test Execution → Analysis]
11. 11
Standard checklist
A pre-scripted standard checklist:
a list of standard aspects of the application that must be checked
avoids situations where core features are left untested
each project defines its own standard checklist based on its platform requirements
13. 13
Session based approach, charters
Plan, execute and report the results of the exploratory session using the session-based approach.
charter – goal or agenda for the test session
follow the mission of the session
take notes during the session
report your findings
use test session templates
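The session elements listed above can be sketched in code. This is a minimal illustration with assumed names, not Cognifide's actual tooling: a charter (the session's mission), a timebox, notes taken during the session, and a report for the debrief.

```python
# Hedged sketch of a session-based test charter: a timeboxed mission plus
# notes and findings collected during exploration, reportable at debrief.
# All class and field names here are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import timedelta

@dataclass
class SessionCharter:
    mission: str                              # goal/agenda for the test session
    timebox: timedelta                        # planned session length
    notes: list = field(default_factory=list)
    findings: list = field(default_factory=list)

    def log(self, note: str) -> None:
        """Take a note during the session."""
        self.notes.append(note)

    def raise_finding(self, finding: str) -> None:
        """Record a defect or improvement found during the session."""
        self.findings.append(finding)

    def report(self) -> str:
        """Render the session notes as test evidence for the client."""
        lines = [f"Charter: {self.mission}", f"Timebox: {self.timebox}", "Notes:"]
        lines += [f"  - {n}" for n in self.notes]
        lines += ["Findings:"] + [f"  - {x}" for x in self.findings]
        return "\n".join(lines)

session = SessionCharter("Explore checkout form validation", timedelta(hours=1))
session.log("Tried empty postcode - inline error shown")
session.raise_finding("No error when postcode contains letters only")
print(session.report())
```

Attaching a report like this to the feature in JIRA is what replaced pre-scripted test cases as test evidence.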
14. 14
Cross testing
No duplication
Based on session notes
Great for learning how to test
21. 21
Automation
End-to-end tests
Author feature tests
Visual comparison
HTML comparison
JS errors
Design responsiveness
3rd party integrations
API / micro services tests
Unit tests
Code quality validation
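One of the layers listed above, HTML comparison, can be sketched with nothing but the standard library: normalise two rendered pages and diff them so that unintended markup changes fail the build. The function names and normalisation rules are illustrative assumptions, not the actual implementation.

```python
# Hedged sketch of an "HTML comparison" automation check: collapse
# whitespace, split on tag boundaries, and report only the changed lines.
import difflib
import re

def normalise(html: str) -> list:
    # Collapse whitespace and split between adjacent tags for a stable diff.
    html = re.sub(r"\s+", " ", html.strip())
    return re.split(r"(?<=>)\s*(?=<)", html)

def html_diff(expected: str, actual: str) -> list:
    # Keep only added/removed lines; drop the unified-diff file headers.
    return [line for line in difflib.unified_diff(
                normalise(expected), normalise(actual), lineterm="")
            if line.startswith(("+", "-"))
            and not line.startswith(("+++", "---"))]

diff = html_diff("<div><p>Hello</p></div>", "<div><p>Hello!</p></div>")
print(diff)
```

In a real pipeline the `expected` markup would come from an approved baseline and `actual` from the freshly built page; an empty diff means the check passes.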
22. 22
Definition of Done
Definition of Done
Task developed
– tests executed (acceptance and exploratory)
– regression tests done
– non-functional tests done
Maintainability achieved
– automated tests implemented
– documentation done
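The Definition of Done above is essentially a checklist that must be fully satisfied before a story can close. A minimal sketch, assuming item names that mirror the list (not the actual project configuration):

```python
# Hedged sketch: a Definition of Done as named checks that must all pass
# before a story is closed. Item names are assumptions based on the slide.
DEFINITION_OF_DONE = [
    "acceptance and exploratory tests executed",
    "regression tests done",
    "non-functional tests done",
    "automated tests implemented",
    "documentation done",
]

def can_close(story: dict):
    """Return (done?, unmet DoD items) for a story's checklist state."""
    missing = [item for item in DEFINITION_OF_DONE if not story.get(item)]
    return (not missing, missing)

story = {item: True for item in DEFINITION_OF_DONE}
story["documentation done"] = False
ok, missing = can_close(story)
print(ok, missing)
```

The value of the DoD was exactly this mechanical quality: nothing on the list can be quietly skipped.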
39. 39
Continuous delivery + Progressive UAT
[Diagram: Feature 1, Feature 2, … Feature N flow through the pipeline independently – each feature gets its own UAT, sign-off and deployment, so Feature 1 can be signed off and done while Feature 2 is still in UAT.]
47. 47
Bug hunting
How?
all QA practice members in one room
we test and explore any application
we take a 2-hour timebox to find defects and improvements
we log all things in one place
rules & scope are defined by the project
When?
in the middle of a project timeline
before the UAT phase
Since 2012 at Cognifide
Many roles to get a wider spectrum
Recently I’ve become a Product Owner
Worked on the QA process across the company, changing the way we sell, verify and deliver quality.
What is behind the title?
One step forward, two steps back
Experiments
Successes and failures
As you will see, we experimented a lot with different ways of working.
It did not always end in success. We would change one or multiple things and then validate whether it was a good change. Sometimes we dropped an idea after a short period of time. Sometimes it takes a few years to see the results.
Our journey is still in progress; we are still changing.
I would like to talk you through our path.
What did the first process look like?
I need to warn you, you can expect something really complicated, so stay focused.
Are you ready for that?
Around 2004, the beginning of the company, using JIRA
No quality activity present.
The simplest possible workflow.
Testers existed
Standalone bugs
No clear status
That doesn’t mean we had no testers back then.
Testers were in the company from the beginning, but not visible in the process.
Task status was checked via JIRA filters – very painful.
Bugs were raised as standalone tasks.
No one knew the status of a feature; a lot of reading through comments.
Code, test, close.
First change in the process:
highlighted testing in JIRA.
After Resolved – a problem with clients.
Changed Resolved – it now means implemented and tested.
Added Implemented.
Additionally, sub-bugs and a restriction on unclosed subtasks.
JIRA boards
A clear view of the sprint
Clients started to understand that Resolved means implemented and tested.
Not only was Testing in Progress added,
we also added sub-bugs; now we were able to match bugs with features, so we knew what state a feature was in.
Then
we added JIRA dashboards, and the team was able to get the status from a single screen.
Only when all sub-tasks are closed are you allowed to move a task to the QA queue.
We decided to keep the simple workflow for sub-tasks. We didn’t need the same extended process for them.
At the beginning there were filters.
Then boards appeared.
The status of the whole task on the board.
You can move a task only once its sub-tasks are closed.
First just a code review – not in all projects.
Then review by the Technical Lead – a bottleneck.
Then by a minimum of 2 developers, with at least one senior.
Crucible, then Stash – better at presenting the status in JIRA, creating a branch from a task, etc.
Apart from JIRA, changes behind the scenes
No test cases upfront
QA responsible for designing and executing tests
Based on the story description
Not only was the JIRA workflow changed; we introduced many more changes behind the scenes.
Testers were involved after implementation.
We were sitting together, but not designing the solution.
No test cases upfront – we were never that kind of company.
Based on the description of the story,
full responsibility was on QA to figure out what to test.
Each project did it a little differently
To standardise
Differences between projects
So as not to forget about the fundamentals
Because each QA did exploratory testing in a slightly different way, we had a problem that not all projects included the same quality verification steps.
So we gathered best practices from projects,
we created a standard checklist,
and it was added to each feature description to check whether we had verified everything.
Based on that we standardised our exploratory testing,
while we still had the opportunity to verify more.
There is still a case study of ours on the QMetry page.
As test evidence for the client
After manual tests
For regression
Problems with maintainability
Some of the clients requested evidence from us of what was tested.
After exploratory testing, we were creating test cases for future regression.
A huge problem with updating test cases – after a while they were useless. Before executing we almost always needed to change them a little. We were working in agile, not waterfall, so we were making improvements after each sprint of development.
http://www.qmetry.com/cognifide-managing-both-agile-and-waterfall/
Sometimes exploratory testing took too much time.
A problem of when to finish.
What is the state?
We didn’t always plan what was in scope.
We forced ourselves to do so using the session-based approach and charters.
It was also a change in test result evidence – we dropped test cases.
Testing notes were enough. We were able to show our clients what was tested.
Charters were divided into a few sessions.
We had a clear timebox.
We plan what we are going to explore during each session.
We are able to stop, think and decide whether we need more.
Test session notes were connected to the feature in JIRA.
To improve the quality level in our projects
we introduced cross testing.
No duplication
Two testers must test the same story
A story tested once by one tester was retested by another tester.
Not duplicating the tests already executed, but extending them.
We already had session notes, so it was easy for the next QA to verify what else he/she needed to test.
Once we got more junior testers, we tried to use cross testing to teach them how to test our applications.
A lot of waiting for the cross check
Not delivered
We had started to become more mature,
with experienced testers in the company.
So in projects with a few testers of that kind, cross testing often found nothing.
But it extended the time to deliver a feature; it was a usual situation that in the QA queue we had many features waiting only for a cross test.
Then the cross test found nothing, but the sprint was already finished.
So we introduced test session debriefs. Another tester debriefs the session notes with the first one.
A short, informal meeting, which was there to decide whether any more tests were needed. If not, we simply closed the feature.
Another way of speeding up the testing process:
integration between QAs – company-wide pair testing.
To speed up the learning process we introduced pair testing. Not only for that – it was also a great opportunity to integrate with each other.
It was also the first mentoring action, which later transformed into a whole programme: each week we have a mentoring meeting to share domain knowledge.
Some time ago we even organised company-wide pair testing: we shuffled testers across projects to do pair testing. That was great fun and also an opportunity to integrate with testers we do not work with daily.
Pair testing was great for juniors: we found out how more experienced testers work and test, and what kinds of bugs they find.
The process was more mature, but not in the budget
Agree the level of quality with the client
Secure the budget
Engage before the project
The QA process and practice became more mature. We were verifying quality on many levels, we already had some standards, and we were clear about what we wanted to verify as a minimum.
So we needed to secure a budget for that in our projects.
We started taking part in the engagement even before the project. We started to be more proactive.
During the discovery phase we talk to the client to highlight our vision of quality and agree the level of quality – agreeing what quality actions take place during the project, and when.
A lot was checked
A lot of time to deliver anything
Dev–QA ping pong
We want that level of quality
Because we were testing more and more cases, we ended up in a situation where each feature contained some defects found during testing. This again meant that delivering a feature took more time; we had a lot of discussions about whether a bug needed to be fixed or not. We were simply too good as testers.
The solution for that was a developer checklist,
as a contract with developers,
with demo content as a trick to force tests.
At the beginning of the project we agreed a contract with developers on what kind of activities they need to do before submitting a feature for testing. The most important point of this list was demo content. It was a trick that forced our devs to at least verify a basic scenario.
Other items on the list were to confirm that documentation is in place, code review has passed, there was a design session with the TL, it works under the specific conditions, etc.
Automation was already in place
The first thing to descope
First when we had time,
then mandatory,
one dedicated person or the whole team?
then only for key features
Balance
Big projects, clients
Until that time we were automating regression tests as separate tasks. Of course, that was the first thing to descope when we had limited time.
Once we started getting bigger and bigger projects we could not work like that anymore.
We added automation to our process inside the feature. First we had automation after the testing but before closing the task, so you were not able to close the feature story without proper automation.
Then we went one step further: automation needed to be delivered to the tester along with the implementation of the feature itself. We were working closely with developers on features as early as possible to do automation in parallel.
We also tested the setup where only one dedicated QA was automating regression, but this led us to a point where no one other than that tester was interested in the tests.
A lot of activities need to be done before closing
Standardise the DoD
Delivered and maintainable
We had a lot of activities to finish before closing a story. We introduced a Definition of Done, agreed at the beginning of the project. This gave us a checklist so as not to forget about anything.
No longer Open
Open vs New?
Misunderstood by clients
Similar:
Tested -> Resolved
KANBAN
Connection with the process, readability, agile
Another change in the JIRA workflow.
Open was misunderstood; everyone understood it in a different way. Open vs New – why is New not Open, so is it Closed? A lot of discussions had to be held with clients to explain that.
At that time we also sometimes started to use Kanban, so differentiating New from To Do allowed us to maintain the approved stories which are valid and confirmed by the right person.
A checklist to know if a story is ready for development
Standardise the actions required from the client
How do we know what needs to be ready before we move something into the To Do column? Go through the DoR (Definition of Ready) list. This also standardised the actions we required from the client: is the integration data provided, are there any dependencies, did we do a design session, are the acceptance criteria written, etc.
To be on the same page
To do it constantly
We also started using the three amigos principle to verify whether a story is ready for development. Three perspectives meet together and decide whether we are able to start.
To use the same language
No split into technical stories and business stories
Upskill the BAs = workshops
Standardise a template for a story
Still a challenge of when to use it
Describing with BDD
It is still not clear when and where to use BDD.
Before, we had acceptance criteria in stories, but those that were understandable by devs and QA were not understandable by BAs. We introduced BDD in a few projects to test the new approach; it was not easy at the start. Then we standardised the way we write it, provided a template, had many workshops with BAs, and upskilled the team in how to write good BDD.
We still have a challenge of when to use it; it does not always fit the purpose. We had an example of a story whose scope was to implement a design change – described in BDD rather than just “upload an image”.
BDD from the story into a feature file
Dictionary
Hard to balance
The idea: BAs write feature files
Not always BDD as automation
The benefit we also wanted to get from BDD was to easily transfer acceptance criteria into automated tests. This requires a lot of attention from BAs to use the same sentences over and over again, and as you can imagine, it was not always done. We tried to encourage them to work on feature files instead of JIRA descriptions, but it was too much. Our initial focus with BDD was to simplify requirement production. So we were still using BDD in stories for requirements, but we stopped requiring that those be mapped 1-to-1 into automated feature files.
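The idea of turning BDD sentences into automated tests can be illustrated with a tiny hand-rolled matcher. This is an assumption-laden sketch, not what was used in the projects (real teams would typically reach for a framework such as Cucumber or a similar BDD tool): a regex pattern is registered per sentence, and a scenario's lines are dispatched to the matching step functions.

```python
# Hedged sketch: mapping Gherkin-style acceptance-criteria sentences onto
# step functions. All names and patterns here are illustrative assumptions.
import re

STEPS = {}

def step(pattern):
    """Register a step implementation for a Gherkin-style sentence."""
    def register(fn):
        STEPS[re.compile(pattern)] = fn
        return fn
    return register

@step(r"the user enters (\d+) items")
def enter_items(ctx, count):
    ctx["items"] = int(count)

@step(r"the basket total shows (\d+) items")
def check_total(ctx, count):
    assert ctx["items"] == int(count)

def run_scenario(lines):
    """Dispatch each scenario line to its registered step function."""
    ctx = {}
    for line in lines:
        sentence = re.sub(r"^(Given|When|Then|And)\s+", "", line)
        for pattern, fn in STEPS.items():
            m = pattern.fullmatch(sentence)
            if m:
                fn(ctx, *m.groups())
                break
        else:
            raise ValueError(f"No step matches: {line}")
    return ctx

ctx = run_scenario(["When the user enters 3 items",
                    "Then the basket total shows 3 items"])
```

The sketch also shows why the 1-to-1 mapping was hard to sustain: a BA rephrasing a sentence even slightly breaks the match, which is exactly the discipline that proved too much to require.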
To secure the budget
To know exactly what is in scope
What are our standards
As a whole company
Not only in the QA practice did we find standards useful. As a whole company we worked out a document containing our company-wide standards. We had good alignment between projects using the same standards.
We also knew which standards our clients selected and expected from us.
Static analysis as part of the build, as before
A standard across the company
A clear KPI for what level to support – TIOBE
Security rules in Sonar
JaCoCo for test coverage
Static analysis had been introduced before, but we would like to compare projects with each other and have a clear KPI.
We introduced Sonar and, of course, standardised an expected minimum KPI.
We also added security rules into the static analysis.
We were working on quality together with the other practices. It was our initiative, but the other practices started to be interested in quality.
Later came the security rules; cooperation between the practices from the beginning.
Joint delivery.
An initiative from QA to introduce checks done by developers.
A mature company now
QA audits, then reviews,
then mentoring
Reporting: none,
then mandatory,
then only when the client expects it
Monitoring of NFRs, regression – Smashing
JIRA as the status
Governance as the company way
Reports: at first there were none, then they were mandatory and we compared ourselves with each other.
Later, reports were no longer needed.
Then Smashing was added.
How control changed versus maturity:
people became experienced,
there are more of them,
there is no need to hold their hands,
they come when they have problems.
Leads workshops
From control to enablement
With support on demand
The way we test evolved.
Again a lot of checks; the QA practice was more experienced.
Extending the AC by QA
We were not afraid of extending the acceptance criteria.
Instead of test cases there were acceptance criteria.
Tables with dev and QA status.
The biggest improvement
We were no longer testers
from business development to BAU on production
Developers responsible for tests
Before: flip it over the wall
Implemented and tested
QA validation
To define what is in the developers’ responsibility
Testing notes
Part of the story
Not on local machine!
Do we need more tests?
Green/amber/red tasks
Initial checks
Constant monitoring
Planned full tests
To get bussines feedback
QA was the master of the merge
Separate features delivered
Kanban
Do not wait until the end of the sprint
Less regression on UAT
Hardening?
To help BAs with their story production
A process which supports the way we produce stories
Still not enough
Some parts of the process still to be improved
PO part of the team
How to present which story is ready to deliver
Continuous deployments?
Demo to PO
Approve / decline
Release to Production
QA responsible for maintaining the version of the application
JIRA + Bamboo
Version in feature,
Where deployed
We do it in the project however it is needed.
Jenkins vs bamboo
Duplicated servers
Sync them, always the same version
Hot fixing
No downtime for authors/users
To get some fresh exploratory ideas
Not limited to QA – Financial and HR
Different devices
Did we find the balance?
Each project is different
Only common things standardised
It is not the end of our path.