While test automation has great outcomes once implemented, the implementation process itself can bring many challenges. This case study covers key learnings from 99X Technology's Team Adra's automation journey
How Automated Testing was introduced and evolved in Adra
How it was pitched, along with the challenges and learning outcomes
Adra provides software solutions for automating account reconciliation processes
Adra’s customer base ranges from MNCs to smaller businesses, with more than 3,000 companies
These purpose-built tools help Adra’s customers trust their numbers, free up time, and close their books quickly and effectively
Well, the ENTIRE LIFE CYCLE – test cases and test scenarios through Continuous Integration, for a handful of features in the application
As the first step, develop a smoke test suite then evolve!
Tech stack!
Technology stack: Selenium WebDriver, C#, NUnit & Jenkins
Clear segregation between page object classes and tests
Why Page Objects?
Improved Maintainability by insulating test code from changes in the user interface
Improved Readability
Increased Reusability
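As a sketch of that segregation (class, locator and member names here are illustrative, not Adra's actual code), the page object owns the locators and actions, so the NUnit test reads as pure intent:

```csharp
using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

// Page object: owns the locators and actions for the login page.
// If the UI changes, only this class needs updating.
public class LoginPage
{
    private readonly IWebDriver _driver;
    private readonly By _username = By.Id("username");
    private readonly By _password = By.Id("password");
    private readonly By _submit   = By.Id("login-button");

    public LoginPage(IWebDriver driver) => _driver = driver;

    public DashboardPage LogInAs(string user, string pass)
    {
        _driver.FindElement(_username).SendKeys(user);
        _driver.FindElement(_password).SendKeys(pass);
        _driver.FindElement(_submit).Click();
        return new DashboardPage(_driver);
    }
}

public class DashboardPage
{
    private readonly IWebDriver _driver;
    public DashboardPage(IWebDriver driver) => _driver = driver;
    public bool IsLoaded => _driver.FindElements(By.Id("dashboard")).Count > 0;
}

// Test class: no locators, only readable intent.
[TestFixture]
public class LoginTests
{
    private IWebDriver _driver;

    [SetUp]
    public void SetUp() => _driver = new ChromeDriver();

    [Test]
    public void ValidUserReachesDashboard()
    {
        var dashboard = new LoginPage(_driver).LogInAs("demo", "secret");
        Assert.That(dashboard.IsLoaded, Is.True);
    }

    [TearDown]
    public void TearDown() => _driver.Quit();
}
```

A UI change to the login form now touches only `LoginPage`; every test that logs in stays untouched.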
The next five slides describe the points to consider when implementing or continuing a test automation effort in customer projects
If it’s hard, keep apart
Continuous Integration
Measure Test Automation
Fitting to the process
Rest and retire
Not everything can be automated
Start with an MVP, because many teams begin automation and then get dragged away adding bells & whistles to their frameworks
Quick feedback is important
Customers don’t pay for test cases or scripts, they pay for information on quality
Test scripts should be triggered after anyone checks in code, but
you could also configure the code to build at regular intervals
You could even configure nightly builds, or runs every 12 hours or so!
The important thing is that you have a schedule and you stick to it
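Since the stack uses Jenkins, one way to sketch both triggers at once (stage names, build commands and the test filter are illustrative) is a declarative Jenkinsfile that polls for commits and also runs on a fixed nightly schedule:

```groovy
pipeline {
    agent any
    triggers {
        pollSCM('H/5 * * * *')   // check for new check-ins roughly every 5 minutes
        cron('H 0 * * *')        // and run nightly regardless, keeping the schedule
    }
    stages {
        stage('Build') {
            steps { bat 'dotnet build' }
        }
        stage('Smoke Tests') {
            steps { bat 'dotnet test --filter TestCategory=Smoke' }
        }
    }
}
```

The `H` in the cron expressions lets Jenkins spread the load instead of firing every job at midnight sharp.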
Mean Time to Diagnosis. How long does it take you to find out why your tests are failing?
Because when a test fails, you need to know why as quickly as possible!
Total execution time of the regression suite and smoke tests
Keep track of identified defects and the average scripting time per test case (this helps you improve your estimations)
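These metrics are simple averages over data the team is already collecting; a minimal sketch (record and method names are illustrative):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// One record per investigated test failure.
record FailureRecord(DateTime FailedAt, DateTime DiagnosedAt);

static class AutomationMetrics
{
    // Mean Time to Diagnosis: average gap between a failure
    // occurring and knowing why it failed.
    public static TimeSpan MeanTimeToDiagnosis(IEnumerable<FailureRecord> failures) =>
        TimeSpan.FromMinutes(
            failures.Average(f => (f.DiagnosedAt - f.FailedAt).TotalMinutes));

    // Average scripting time per automated test case, in hours --
    // feeds back into future estimations.
    public static double AverageScriptingHours(IEnumerable<double> hoursPerScript) =>
        hoursPerScript.Average();
}
```

Tracking these over sprints shows whether diagnosis time is trending down and whether scripting estimates are converging on reality.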
Automated test creation is part of the story (feature) and comes as a task
Passing those tests is one of the done criteria
Include them in the regression suite,
which in turn becomes a release criterion during a release
Don’t run your entire large regression suite on all the browsers, because there is a high chance the tests will hang for reasons other than the feature malfunctioning, and you will end up spending a lot of time investigating and fixing them
Instead, perhaps run just your smoke tests on every platform and browser
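One way to sketch this in NUnit (browser names, category and the URL are illustrative) is a parameterized fixture, so only the small smoke suite fans out across browsers while the full regression suite stays on one:

```csharp
using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;
using OpenQA.Selenium.Edge;
using OpenQA.Selenium.Firefox;

// The same smoke tests run once per browser; the large regression
// suite is kept in separate fixtures targeting a single browser.
[TestFixture("chrome")]
[TestFixture("firefox")]
[TestFixture("edge")]
[Category("Smoke")]
public class CrossBrowserSmokeTests
{
    private readonly string _browser;
    private IWebDriver _driver;

    public CrossBrowserSmokeTests(string browser) => _browser = browser;

    [SetUp]
    public void SetUp() =>
        _driver = _browser switch
        {
            "firefox" => new FirefoxDriver(),
            "edge"    => new EdgeDriver(),
            _         => new ChromeDriver(),
        };

    [Test]
    public void HomePageLoads()
    {
        _driver.Navigate().GoToUrl("https://example.test/");  // placeholder URL
        Assert.That(_driver.Title, Is.Not.Empty);
    }

    [TearDown]
    public void TearDown() => _driver.Quit();
}
```

Each `[TestFixture("…")]` attribute instantiates the whole fixture once per browser, so adding a browser is a one-line change.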
Retire older test scripts when the feature matures
Test automation is not a silver bullet. Automation is not testing – automation supports testing
Go for the low-hanging fruit first to avoid being stalled at any point