The Journey of Test Automation

Test Automation Journey - The Past, The Present and The Future

Published in: Technology

The Past

Software testing is a necessary process, and everyone seems to agree on that. Where people differ is on the timing and depth of the initiative. The industry has long moved past the stage when testing was considered a post-development maintenance activity. We gradually interleaved development and testing, and the two processes have tended to merge, bringing both new benefits and new challenges. The obvious benefit is fast feedback: we want to catch an error as quickly as possible. The not-so-obvious challenge is maintenance and overhead, and this will come back to haunt us later.

Since those early days, the industry has adopted Continuous Integration. Whenever developers are ready to push some code, they first have to pass it through the unit tests; only when everything is green can the code actually be committed. Per-commit jobs are also gaining popularity, delegating the responsibility of code validation to the build server itself. The idea is to integrate the Code – Build – Test – Deploy process into a single pipelined workflow, where every stage provides immediate feedback to the preceding stage.

On the developers' front, automated unit tests have established themselves as an invaluable checkpoint. But what about managers and business users? Everyone wants things to be well tested, yet unit tests and their reports are not very meaningful to them. Unit tests are also not reliable for integration testing: components that have passed their unit tests may still fail to integrate. Here comes the need for functional testing. Organizations have dedicated departments of manual testers who validate the product at the whole-build level. Detailed reports are generated, module health is monitored, and bug tracking is integrated into the process.

The Present

The bulk of functional testing has always been manual.
A big reason was the lack of tools that could automate applications and detect and report failures in doing so. But the last decade saw exponential growth in the automation sector. The authority of pioneering products was challenged by emerging tools, a majority of them open source, bringing accessibility even to small companies. So if we can automate the functional testing process, why not integrate it into the CI server? Why not unit test and functional test every code batch?

Here comes the hurdle which we knew about but overlooked in the initial stages: maintainability. Automation scripts are traditionally code. They are code that tests and validates other code; and being code, they need to be designed, written, maintained and reviewed. But they have one big advantage: they can execute just like any other code. That is why JUnit and similar frameworks are so popular. The problem is that they do not scale well to functional testing. For one thing, functional tests are very broad, covering many workflows, scenarios, test cases and unique steps. A codebase that large is difficult to maintain and soon outgrows the product itself. For another, being code, the scripts do not truly lie in the tester's domain; they need seasoned developers. You can have programmers write the test scripts, but then a load mismatch remains.
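The commit-gating pipeline described under The Past can be sketched in a few lines. This is a toy model, not any real CI server's API: stages run in order and the pipeline stops at the first failure, which is exactly how a red unit test blocks the commit and the deploy.

```python
def run_pipeline(stages):
    """Run pipeline stages in order; stop at the first failure (fast feedback).

    `stages` is an ordered list of (name, callable) pairs; each callable
    returns True on success. Returns the names of the stages that ran and
    whether the whole pipeline passed.
    """
    ran = []
    for name, stage in stages:
        ran.append(name)
        if not stage():
            return ran, False  # immediate feedback; later stages never run
    return ran, True

# Toy stages standing in for real build/test/deploy jobs on a CI server.
demo = [
    ("build", lambda: True),
    ("unit-test", lambda: False),  # a red unit test...
    ("deploy", lambda: True),      # ...means deploy is never reached
]
ran, ok = run_pipeline(demo)
print(ran, ok)  # deploy does not appear in `ran`
```

A real CI server (Jenkins, for example) models these stages as configured jobs rather than a hand-rolled script, but the gating logic is the same.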
Welcome the era of Keyword-Driven Automated Functional Testing. Encapsulating the programming process in test scripts brings many benefits. First and foremost, we no longer need seasoned programmers for the testing task; anyone with basic programming literacy can do it. The test scripts are more maintainable and language-agnostic. Now we can actually have both unit tests and functional tests triggered on each code check-in. While unit tests validate code conformity, functional tests validate the product's behavioral consistency, and both run unattended. Developers get feedback meaningful to them from their unit tests; business users get module health status meaningful to them from their functional tests, all in near real time.

The Future

There is no limit to innovation. Imagine the development of a web application. Let's say the application needs to be supported on all major browsers: IE, Firefox, Chrome, Safari and Opera. Across all platforms: Windows, Linux and Mac desktops and laptops. Let us push the limits and ask for support on all major mobile devices: Android, iPhone, BlackBerry, Windows; and then talk about orientations and resolutions. Now let us ask the test engineers to automate the whole thing: run all regression flows on all platforms and browsers on every check-in. After all, we want strict quality control. Difficult to accept, but this simply cannot be done! Not without an automation framework that can do the heavy lifting. None of the tools or frameworks on the market can cope with the requirements we will face in the coming years. What the industry needs is a test automation framework which:

1. Can integrate with the existing CI solutions.
2. Performs functional tests as seamlessly as unit tests: run a test every night, on every check-in, or on every run-test request.
3. Is keyword driven rather than language based. We want one simple ClickOnButton, not a bunch of finds, hooks, performs and rechecks.
4. Provides high maintainability of scripts. Modifying a validation should not require programmers.
5. Can gracefully orchestrate multiple tools to perform activities which otherwise do not come under the ambit of any single tool. If Selenium cannot click on an image, get it done with Sikuli.
6. Can launch multiple configurations of the same test in parallel. We want to run the same test on all five major browsers.
7. Gives a uniform interface over heterogeneous tools. Tools may vary in language support and even platform support, but we need uniformity; we need to test our application on both Linux and Windows.
8. Generates business-oriented reports. We want reports in terms of business flows, not individual test cases. We want health checks of modules, basic metrics, and a way to visualize defects over the last five builds.
9. Drills down to step-level failures, so that both testers and developers get workable information about a failure.
10. Integrates with major defect management tools like Bugzilla, Jira etc., and can automatically log confirmed bugs.
11. Integrates with major Application Lifecycle Management software like HP ALM.
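The keyword-driven style of item 3 can be sketched as a tiny interpreter. Everything here is hypothetical (the keyword names, the `run_test` helper, the dictionary standing in for the application): the point is only that a tester writes rows of (keyword, arguments), while the keyword implementations, which could be backed by Selenium, Sikuli, or any other tool, live elsewhere and are maintained separately.

```python
KEYWORDS = {}

def keyword(name):
    """Register a function as the implementation of a keyword."""
    def register(fn):
        KEYWORDS[name] = fn
        return fn
    return register

@keyword("OpenPage")
def open_page(app, url):
    # A real implementation would drive a browser; we just record state.
    app["page"] = url

@keyword("ClickOnButton")
def click_on_button(app, button_id):
    # Hides the find/hook/perform/recheck dance behind one simple keyword.
    app.setdefault("clicks", []).append(button_id)

def run_test(script):
    """Execute a keyword script: a list of (keyword, *args) rows."""
    app = {}  # stand-in for application/browser state
    for name, *args in script:
        KEYWORDS[name](app, *args)
    return app

# The tester's script: no code, just keywords and data.
state = run_test([
    ("OpenPage", "https://example.com/login"),
    ("ClickOnButton", "submit"),
])
```

Because the script is data rather than code, swapping the tool behind ClickOnButton changes no test scripts, which is where the maintainability of items 4 and 5 comes from.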
Now that is a whopping wish list, and no product on the market claims to do it all. But this wish is soon going to be our requirement. The success and failure of products will depend on timely validation across all target environments. Even if we somehow manage the sheer volume of test scripts and all the plumbing, we are stuck at the test infrastructure. We need to test on five browsers, each on its latest version and three backward versions. We need to test those browsers on Windows, Linux and Mac. That alone makes sixty parallel executions on sixty machines, and adding mobile devices pushes the count toward a hundred. How are you going to manage that infrastructure? What happens when you need a bunch of mobile devices? Are you going to purchase it all? Rent it all? Of course not.

The future is cloud computing. We will rent browsers on the cloud. We will rent devices on the cloud. We will rent the execution systems on the cloud. Volumes of test data will be stored on the cloud, analyzed on cloud systems and reported from the cloud. All we need on premises is one CI tool and one test automation tool; the cloud will provide all the rest. We will rent resources on a pay-per-use basis: pay for the tests we run, with no need to inventory hardware, browsers, devices and all.

And Our Steps to It

That future is not so distant. You can get machines from IaaS providers like Amazon or Rackspace. Sauce Labs can rent you browsers; Xamarin can run your tests on rented mobile devices. These solutions are already available. But are they true solutions to your pains? No; you need an end-to-end automation solution, and these isolated initiatives do not help much. A service is required which can bring them all together. A lot of tools have targeted these challenges, and most have succeeded too. But until now these have mostly been isolated, unorganized attempts.
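The size of that configuration matrix, and the fan-out a cloud grid would perform, can be checked with a short sketch. The matrix follows the numbers in the text (five browsers, four versions each, three desktop operating systems); the `run_regression` stub is hypothetical and merely stands in for dispatching one run to a rented executor.

```python
import itertools
from concurrent.futures import ThreadPoolExecutor

# Configuration matrix from the text: five browsers, each at its latest
# version plus three backward versions, across three desktop OSes.
browsers = ["IE", "Firefox", "Chrome", "Safari", "Opera"]
versions = ["latest", "latest-1", "latest-2", "latest-3"]
platforms = ["Windows", "Linux", "Mac"]

matrix = list(itertools.product(browsers, versions, platforms))
print(len(matrix))  # 5 * 4 * 3 = 60 desktop configurations alone

def run_regression(config):
    """Stub for dispatching one regression run to a (cloud) executor."""
    browser, version, platform = config
    return f"{browser} {version} on {platform}: PASS"

# Fan the same suite out in parallel, as a cloud grid of rented
# machines would; locally we just simulate it with a thread pool.
with ThreadPoolExecutor(max_workers=8) as pool:
    results = list(pool.map(run_regression, matrix))
```

Sixty concurrent desktop configurations, before a single mobile device is counted, is why the text argues the infrastructure must be rented rather than owned.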
Notable efforts come from QTP, which, along with its Open Test Architecture, is the big brother of the current generation of test automation solutions, and from OpKey, the next generation of test automation framework. OpKey is determined to solve the problems which future automation engineers and managers are about to face. It is designed for scalability and maintainability. The future is multi-platform deployment, so the challenge lies in simultaneous multi-platform quality control, and OpKey is the only player which can tackle it. Being scalable means anyone, even a small-scale developer, can start with a basic OpKey setup and then scale up as the application grows. A time will come when a single use case spans web, database, web-service and mobile interactions. At that moment you will realize that only OpKey can orchestrate that many tools, platforms and resources.

OpKey is not just another framework. OpKey is the smartest test automation framework. It adapts to your needs. It scales to your needs. It works for web, desktop and even mobile. It can simultaneously run all your tests on all the target platforms. What is more, you do not have to manage the entire hardware infrastructure yourself: OpKey is introducing a Test-as-a-Service paradigm. With that in place, all your inventory woes will be gone. You will trigger the tests and you will get the results. That is all. OpKey is the beginning of no-frills testing. In the next few articles I will demonstrate how it meets all that it claims.
