Adapting Automation
to the Available Workforce
Colm Harrington
VCE, Ireland
Is This Book Right For Me?
INTRODUCTORY
Introductory content is for software testing professionals who are
relatively new to the subject matter of the ebook. Introductory
ebooks typically feature an overview of an aspect of testing or
guidance on understanding the fundamentals of the subject at hand.
INTERMEDIATE
Intermediate ebooks are for software testers who are already
familiar with the subject matter of the eBook but have limited hands-
on experience in this area. Intermediate level usually focuses on
working examples of a concept or relaying experiences of applying
the approach in a particular context.
ADVANCED
Advanced ebook content is for software testers who are, or intend
to become, experts on the subject matter. These ebooks look at
more advanced features of an aspect of software testing and help
the reader to develop a more complete understanding of the subject.
This eBook
is intended for
ALL LEVELS
Abstract
The challenge of producing software today is the same as it has always been:
Deliver high quality software within an acceptable timeframe.
The ways we, as software engineers, try to tackle this challenge have changed
somewhat now that we are in the 'agile age'.
Being agile means many things and one of those is that we need to automate our
testing to ensure we don't get caught up in the mire of ever increasing regression
test cycles.
The requirement to deliver 'completed' software within the confines of a sprint
means that more demands are being placed on the QAs of today to automate their
work. This has resulted in a significant challenge, as the current workforce has not
yet caught up with demand in terms of SDETs (Software Development Engineers in
Test). The approach described in this paper can work whether or not there is an SDET
available to the team. If there is, they can fulfill the role of the developer and
free the developers to produce the actual application. For brevity,
when I refer to developers in this paper this applies equally to both developers
and developers in test.
The approach described in this paper is an attempt to tackle the problem of
automating software with limited resources by effectively splitting the job into two
parts, each tackled by the people with the best existing skills in that area:
1. Create a testing framework; this is put in place by the developers.
2. Create a suite of tests; these are created by the QA staff, who leverage the
framework.
Biography: Colm Harrington
Colm is an automation architect with VCE, based in
Cork, Ireland, with responsibilities for both front-end
(via Selenium) and API testing. Colm has worked in the
software industry, for various companies including
Microsoft and Sage Ireland, for over 10 years. His real
passion is simplifying the test process of complex
applications and integrating the test process
seamlessly into the SDLC.
Adapting Automation to the Available
Workforce
A white paper for developers AND testers.
Author: Colm Harrington, October 2014
Table of Contents
Executive Summary
Introduction
The problem
The solution
    Define the roles of the key participants
        The role of developers (Framework)
        The role of testers (Test cases)
        The role of Business Analysts (User stories)
    Crafting an Automation API
        Simplicity
        Consistency
        Intuitiveness
    Styling an automation API
        Fluent APIs
        Data Modeling
        Page Objects
    Lists of ten (or so)
        10 things you should be doing in your automation framework/API
        6 things you should not be doing in your automation framework/API
    Call to action
    Future Work
References
Executive Summary
The challenge of producing software today is the same as it has always been: Deliver high quality
software within an acceptable timeframe.
The ways we, as software engineers, try to tackle this challenge have changed somewhat now that we are
in the 'agile age'.
Being agile means many things and one of those is that we need to automate our testing to ensure we
don't get caught up in the mire of ever increasing regression test cycles.
The requirement to deliver 'completed' software within the confines of a sprint means that more demands
are being placed on the QAs of today to automate their work. This has resulted in a significant challenge,
as the current workforce has not yet caught up with demand in terms of SDETs (Software Development
Engineers in Test). The approach described in this paper can work whether or not there is an SDET
available to the team. If there is, they can fulfill the role of the developer and free the developers to
produce the actual application. For brevity, when I refer to developers in this paper this applies equally
to both developers and developers in test.
The approach described in this paper is an attempt to tackle the problem of automating software with
limited resources by effectively splitting the job into two parts, each tackled by the people with the best
existing skills in that area:
1. Create a testing framework; this is put in place by the developers.
2. Create a suite of tests; these are created by the QA staff, who leverage the framework.
This approach allows each team member to focus on what they do best and still get the job of automating
the tests done. It has the primary benefit of softening the learning curve for QAs, who often come from a
manual testing background, when writing automated tests. It does, of course, also allow developers to
gain an insight into the work of testers, and the craft of testing, and can be an invaluable aid in cross-
skilling within the team.
Introduction
Everyone knows the old proverb about a problem shared being a problem halved and it is never more
appropriate than when dealing with software automation. For years, automation was seen as the preserve
of a 'guy' in a company who existed in something of a rogue state. They were normally under the remit of
the QA manager but their work often overlapped with the development teams in terms of their day to day
deliverables. The demand for automation has increased as software has become ever more complex and
this has led to a massive increase in people with automation skills and, more recently, WebDriver skills in
particular.
The role of the SDET (Software Development Engineer in Test) has arisen in recent years and, to some
extent, the workforce has yet to catch up. The SDET is a strange beast, part tester and part developer and
often seen as neither by the dedicated team members. Nevertheless, the role is in great demand as the
complexity of the world's software rises exponentially.
The increase in popularity of '100%' automation solutions as part of agile delivery has led to a massive
increase in the time and resources required to actually deliver on this. In addition to this, automation is
now very often included in the 'definition of done'[1] within the sprint confines.
The role of the SDET is essentially a combination of developer and tester and these can be hard to recruit.
As in sports, where it is more difficult to find players that attack and defend equally well, it can be a
challenge to recruit SDETs as it is not yet a common career path to take. In smaller companies where the
available headcount may not stretch to additional team members, it is a good solution to leverage the
existing skill sets and effectively split the job of the SDET among the developers and testers allowing
them to focus on the areas where they are already strong while also gaining an insight into the workings
of other team members.
This workload often fell to QA resources who came from a manual background and were expected to
upskill, as developers' time was taken up with delivering 'functionality'. This commonly led to poorly
written automation code (the DRY[2] principle was, and is, commonly ignored) which was difficult to
understand and maintain. Often, simple changes to the underlying application required major changes to
the tests to get the 'green light'. This in turn led to high maintenance costs and often resulted in the
automation becoming more costly (in terms of time and money) than it was worth.
It's time to change all that.
The problem
Modern agile software teams are typically made up of three types of team member: Developers, Testers
and a Business Analyst/Product Owner.
All these resources (otherwise known as people!) have a part to play in assuring the quality of the
software delivered by the team. Unfortunately, only the testers have been traditionally responsible for
quality assurance. Indeed, this has happened to such an extent that the terms Quality Assurance and
testing have become synonymous in the software industry.
The difficulty facing the manual tester trying to upskill is that the learning curve is quite steep.
Newcomers have to tackle learning a programming language, an automation API (WebDriver, for
example) and, potentially, programming best practices such as design patterns (more on these later).
The difficulty for the developer is largely one of time constraints: their time is largely consumed with
functional delivery, and it requires 'buy-in' from the business that some time will need to be diverted to
automation. The challenge can also include the age-old issue of developers not knowing what to test.
The difficulty facing the Business Analysts is that they need to be able to dig a little deeper into the
technical side to be able to contribute to the process. The introduction of a BDD[3] framework such as
Cucumber [4] or JBehave [5] is a major plus here as it insulates the business people from the heavy
technical details while still allowing everyone on the team to speak the same "language".
Software automation is often thought of as exercising the UI via a browser but there is more to
automation than this somewhat narrow viewpoint. In fact, one of the main reasons for the failure of
software automation projects is that there are too many tests at too high a level. This can be countered by
intelligently pushing tests down the pyramid [6].
The solution
The solution to the above stated problem can be summed up in a single sentence:
Lower the entry point for QA to write automated tests by developing an interface to make
automation more accessible.
This can be done largely by allowing testers to remain in the language of the problem domain and
avoiding, as much as possible, the language of the technology domain.
Define the roles of the key participants
In the agile world, quality is the responsibility of everyone on the team; it is no longer acceptable to throw
it ‘over the wall’ at the test team. The key to solving the problem of effective software automation is to
share the work load to the benefit of everyone. It's important to make the distinction here between sharing
the work and offloading the work.
It often happens that, at a developers' meeting, a decision is made that the QA department should take
on the task of reducing the automation backlog, as the developers are overworked anyway and, besides, it
is testing work.
Meanwhile, down the hall at the QA meeting, the QA team has decided that the backlog should be taken
on by the developers, as it is programming work and, besides, they are already overworked.
The net result is that the backlog continues to grow and fingers get pointed at end of year review meetings
instead of actually solving the problem.
One of the surest ways of determining that something won't get done is to decide that someone else
should do it.
To combat this, it is essential to break down the work into chunks that can be managed by both parties. The
key point here is that the chunks of work are done by those with the skill sets to complete them in the most
effective and efficient manner. The next section will deal with the various roles within a (typical) agile
team and what their role is in terms of quality assurance as opposed to testing.
The role of developers (Framework)
The main role of the developer, in this context, is to expose a clean, predictable API to QA. This has the
twin benefits of making life easier for QA and leading to more readable, maintainable tests.
If desired, BDD can be layered on top of this API but it is not necessary to get a large return on
investment (ROI) from this approach.
The real benefits of this approach to automation are that the skills are already present in the team
members doing the work. Avoiding memory leaks by applying a singleton pattern to driver management
in a Selenium framework is straightforward for a competent developer but may present a steep learning
curve to a QA without a background in programming.
Indeed, applying best practices to the development of a framework/API is the best way to ensure that the
team avoids the overhead of poorly developed automation suites that become more costly in terms of time
needed for maintenance than they save in the first place.
It is now accepted practice to implement the Page object model [7] when writing automation tests against
the WebDriver API. This allows us to abstract out low level detail such as element locators from the tests.
Again, this has the dual benefits of keeping the test code DRY as well as keeping the tests themselves
clean.
Another area that is very beneficial in terms of time saving is to implement custom assertions. Assertions
in tests are the part where we compare the actual result to an expected result:
assertTrue("Processing Failed",
    driver.findElement(By.id("element1")).isDisplayed()
        && driver.findElement(By.id("element42")).isEnabled());
This is a lot harder to read than something like:
assertOrderProcessingFailed(ResponseCode.OK_200, ResultImage.GreenTick)
Of course, the custom assertions also help to keep your test code DRY as the locator logic is abstracted
away from the tests and the tester can easily write new tests without having to worry about low level
details such as locators. This is a very powerful concept, when implemented correctly and I will deal with
this in more detail in the section on exposing APIs.
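To illustrate, here is a minimal sketch of a custom assertion in plain Java. The page class, its accessors, and the failure condition are hypothetical stand-ins; in a real Selenium framework the boolean methods would wrap element lookups and isDisplayed()/isEnabled() checks.

```java
// Hypothetical page object; in a real framework these boolean methods
// would hide the element locators and WebDriver calls.
class OrderResultPage {
    private final boolean errorBannerShown;
    private final boolean retryButtonEnabled;

    OrderResultPage(boolean errorBannerShown, boolean retryButtonEnabled) {
        this.errorBannerShown = errorBannerShown;
        this.retryButtonEnabled = retryButtonEnabled;
    }

    boolean isErrorBannerShown() { return errorBannerShown; }
    boolean isRetryButtonEnabled() { return retryButtonEnabled; }
}

class OrderAssertions {
    // The test reads as a single domain-level statement; the low-level
    // checks live here, in one place, keeping the test code DRY.
    static void assertOrderProcessingFailed(OrderResultPage page) {
        if (!page.isErrorBannerShown() || !page.isRetryButtonEnabled()) {
            throw new AssertionError("Order processing did not fail as expected");
        }
    }
}
```

In a test, the call site becomes a single readable line: `OrderAssertions.assertOrderProcessingFailed(resultPage);`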
In my experience, the single most common issue I find when reviewing tests written against the
WebDriver API (in Java) is synchronisation. To put it simply, we are not waiting for the right events
before continuing. In real life, when we fill in a form on a page, we often have to wait a second or two
while the next part of the form loads or while a check is carried out to see if the requested username
already exists. This is intuitive for humans but presents a challenge for machines. This has become
even more relevant in the Web 2.0 age of AJAX calls to the server being commonplace on a site. For
example, if we are filling out an online payment form and enter our credit card number and then tab to the
next field for date, very often an asynchronous call will be made to the server in the background to check
if it is a valid card number.
If the card is valid, the 'Pay Now' button is enabled; if not, it remains greyed out. Often the UI will display a
green tick or some other visual feedback to let the user know that the card number has been
verified. As software automators we need to catch that event (the appearance of the green tick next to the
card number field) before attempting to submit the form. The trouble is it may take a second (or two, or
three) to do a round trip to the server and determine the card's validity, and our test will race ahead and
fail before the AJAX call returns and enables the 'Pay Now' button. The WebDriver API has built-in
capabilities to handle these scenarios but they can prove challenging for newcomers to programming. It is
important that these waits are built into any test APIs so the tester can focus on writing tests and does not
have to worry about handling asynchronous events, which can be time consuming to track down and debug.
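The idea behind WebDriver's built-in waits can be sketched without Selenium as a simple polling loop. The Waits class below is a hypothetical helper, not part of any real framework; in practice the condition would be something like "the green tick element is displayed".

```java
import java.util.function.BooleanSupplier;

class Waits {
    // Poll the condition until it is true or the timeout elapses.
    // Buried inside the automation API, so testers never write this.
    static void waitUntil(BooleanSupplier condition, long timeoutMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            if (condition.getAsBoolean()) {
                return; // condition met, the test can continue
            }
            try {
                Thread.sleep(50); // poll interval
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                throw new AssertionError("Interrupted while waiting", e);
            }
        }
        throw new AssertionError("Timed out waiting for condition");
    }
}
```

An API method might then call `Waits.waitUntil(() -> greenTickVisible(), 5000);` before submitting the form, where `greenTickVisible()` is a hypothetical page-object check.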
Another key aspect of this approach is to handle the management of the driver at a higher level than the
tests. This ensures that memory leaks are kept to a minimum as a common mistake when writing
automation is to create driver objects and leave them in memory even when they are no longer required.
This is particularly costly when writing WebDriver tests as each driver object represents a new instance of
a browser window and poor management can quickly lead to ‘out of memory’ errors.
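One possible shape for such driver management is sketched below. DriverManager is a hypothetical class, and a plain Object stands in for a real WebDriver so the idea can be shown without a browser; in a Selenium framework the factory would create e.g. a ChromeDriver and quitDriver() would call driver.quit().

```java
import java.util.function.Supplier;

class DriverManager {
    private static Object driver;
    // Stand-in factory; a real framework would supply new ChromeDriver() etc.
    private static final Supplier<Object> factory = Object::new;

    // Lazily create a single shared instance so tests never hold
    // their own copies (and never leak browser windows).
    static synchronized Object getDriver() {
        if (driver == null) {
            driver = factory.get();
        }
        return driver;
    }

    // Called once at suite teardown, not once per test.
    static synchronized void quitDriver() {
        driver = null; // real code would call driver.quit() first
    }
}
```

Tests call `DriverManager.getDriver()` and never construct or dispose of a driver themselves.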
The last topic I will cover here is the ongoing debate as to whether assertions should live in the tests or in
the page objects. In general, it is better if the page object simply contains data about the page and does not
contain any business logic; however, sometimes it is acceptable to embed the assertions if the scope is limited.
The catch is that this harms reuse of those methods.
The role of testers (Test cases)
The role of the QA within this framework is to add tests by leveraging the API provided by the
development team. Ideally, and especially when the QA is not experienced, the 'sunny day' tests
provided by the developers can provide a launch pad for the creation of a suite of tests. There are
many possible error conditions for a given piece of functionality, such as ordering a 'widget' online:
1. A blank orderId
2. No orderId field present
3. Invalid characters in the orderId field.
This is where the expertise of the tester comes into play and it would be neither practical nor desirable for
the developers to think up all these scenarios and implement them. Very often, this work will already
have been done by the QA team for their manual tests and this process ensures any duplication of this
work will be minimised.
Combining the roles of QA and developer ensures that two goals are met:
1. The software is tested to the limits deemed acceptable by the QA team.
2. The automation code produced is maintainable to a standard deemed acceptable to the
development team.
Part of the deal for the QA, now that the developers are providing them with an API for writing tests, is that
they must be smart about which tests they write. Running tests through a GUI is relatively expensive in
terms of time required to run, and any redundancy should be kept to a minimum. Consideration should be
given to running tests at the API level where appropriate, as the feedback loop is orders of
magnitude quicker when there is no GUI to render. Sometimes, the best frontend test is no test at all.
The main benefit of this approach, in terms of product quality, is that the tester's time is freed up for
exploratory testing [8]. Exploratory testing is best described as test case execution and test case
documentation occurring at the same time. It is this type of testing that is far more likely to find defects in
the software than running a regression suite ad nauseam.
The role of Business Analysts (User stories)
The role of the Business Analyst in the agile software team is an extremely important one, as they are the
ones who ensure that the customer gets the software they want (build the right software ...). In my
experience the best way of doing this is to clearly list out the Acceptance Criteria (AC) before a line of
code is written or a test case documented. AC’s are criteria that must be satisfied in order for a piece of
software to be declared ‘shippable’. These AC are what keep the developers on track and give the QA’s a
baseline to work from. This AC documentation, however, suffers from all the usual drawbacks of
documentation, not least of which is that it can become out of date with the software. A solution to this
problem is to create living documentation as opposed to static documentation. This can be done by translating the
AC into executable tests via Cucumber or JBehave. This is commonly known as BDD. Porting ACs to the
Gherkin format (a form of structured text used by both Cucumber and JBehave) is a (largely) simple
process and is a very good skill for all software team members to have.
Crafting an Automation API
One of the main benefits of exposing an API for testers to leverage when writing their tests is that it frees
them up to target other areas such as exploratory testing.
This API should serve two basic functions:
1. Abstracting away the low level detail to ensure that the test code is not vulnerable to page
structure changes.
2. Providing a Domain Specific Language (DSL) to make it easier for testers to write tests while
thinking in the domain of the problem and not that of the technology being used.
There are many components to a well-crafted API and a good API takes time to design and implement. It
is important to invest some time up front to design an API correctly so that it helps the testers and results
in less change and maintenance for the developers.
There is a substantial amount of research being done on the design of APIs (in particular RESTful APIs)
but not all of it is relevant to automation. The following section amounts to a condensation of the
information I have learned along the way from various sources and from my own experience writing
automation frameworks and APIs.
When designing an automation framework I focus on three main principles: simplicity, consistency and
intuitiveness. Performance is also a factor but I will not cover it in detail here.
Simplicity
It can often occur, in complex web applications, that processing a transaction may require many
parameters. According to Joshua Bloch, author of a seminal text on the Java language [9], any more than
four parameters is considered excessive. Personally I have found that to be a bit restrictive in an
automation context but any more than six or seven and it becomes very difficult to remember what they
are all for. It is particularly difficult to remember the parameter usage if they are of the same type. A
method signature where the last five parameters are all Booleans is simply asking for trouble. Also,
because the types are the same the compiler cannot help the developer here. Remember, a misused
method will lead to failed tests (false negatives) and time wasted debugging test code. I often use builder
objects or helper classes to combat 'parameter creep'.
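As a sketch of what such a builder looks like in this context (the Order class, its fields, and the default values are all hypothetical, chosen only for illustration):

```java
class Order {
    final String orderId;
    final String currency;
    final boolean refund;
    final boolean applyTax;

    private Order(Builder b) {
        this.orderId = b.orderId;
        this.currency = b.currency;
        this.refund = b.refund;
        this.applyTax = b.applyTax;
    }

    static class Builder {
        private String orderId;
        private String currency = "EUR"; // sensible default
        private boolean refund;          // defaults to a normal order
        private boolean applyTax = true;

        Builder orderId(String id)   { this.orderId = id; return this; }
        Builder currency(String c)   { this.currency = c; return this; }
        Builder refund(boolean r)    { this.refund = r; return this; }
        Builder applyTax(boolean t)  { this.applyTax = t; return this; }
        Order build()                { return new Order(this); }
    }
}
```

Each value is named at the call site, e.g. `new Order.Builder().orderId("123").refund(true).build()`, rather than being one of several positional Booleans the compiler cannot distinguish.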
Don't ask your consumers (testers in our case) to extend classes or implement interfaces. If you are doing
this it is a sign that you need to rethink your design. Remember, the key is to allow the tester to think in
the problem domain. If you are forcing them to implement interfaces etc. then you are pushing them into
the technology domain and this leads to inefficiencies at best and downright confusion at worst.
Handle change requests carefully. It is inevitable that there will be changes required. Often these come in
the form of requests from your testers that they want to add a new parameter to some method to handle
some, as yet unthought-of, use case. These requests can be handled in two ways, depending on the skill
set and experience of the tester involved.
• Ask them to submit the code change for review.
  As the programming skills of your testers improve (one or two heads will rise above the
  pack in this regard), this is a good option as it frees up developer time, but be careful that
  all changes are in keeping with the overall goals of the API. Beware the copy/paste bloat
  of existing methods. This is a responsibility that can be earned and possibly used in quarterly
  objectives etc.
• Ask them to specify their request clearly and commit the change yourself.
  This system works well with inexperienced programmers and allows more control over
  any changes to the framework/API. It is important to separate the requirement from the
  design suggestion and make a decision as to whether this is the best way to do it. The
  tester may request a new method that takes an extra parameter, unaware that a
  'varargs'[10] variation already exists, etc.
Another handy, but slightly more advanced, tip is to use interfaces in your method signatures instead of
the concrete classes. This gives the added flexibility that types that implement the same interface can be
passed in at runtime with no modification needed in the code. For example, you may have three types of
transaction in your system; ‘VisaTransaction’, ‘MastercardTransaction’ and ‘AmexTransaction’. If all
these types implement the transaction interface, then any method that requires a transaction object should
include the interface in the signature so any of the three types can be passed.
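A minimal sketch of that arrangement follows; the amount() method and the PaymentApi class are hypothetical additions for illustration, while the interface and three card types come from the example above:

```java
interface Transaction {
    double amount();
}

class VisaTransaction implements Transaction {
    private final double amount;
    VisaTransaction(double amount) { this.amount = amount; }
    public double amount() { return amount; }
}

class MastercardTransaction implements Transaction {
    private final double amount;
    MastercardTransaction(double amount) { this.amount = amount; }
    public double amount() { return amount; }
}

class AmexTransaction implements Transaction {
    private final double amount;
    AmexTransaction(double amount) { this.amount = amount; }
    public double amount() { return amount; }
}

class PaymentApi {
    // Programmed to the interface: nothing here changes when a
    // fourth card type is added to the system.
    static double processTransaction(Transaction t) {
        return t.amount();
    }
}
```

Any of the three concrete types can be passed at runtime with no modification to the API method.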
Consistency
The key to consistency in an API is to do things the same way everywhere.
If you pass the driver as a parameter in some of your methods, always pass it as the first parameter or the
last parameter. It doesn't matter which, as long as you are consistent about it.
If you use enumerated constants to represent order types, use them everywhere order types are required.
Avoid using strings in some methods and enumerated types in others. This is frustrating for testers who
are trying to figure out your API.
Follow the platform conventions of your parent language. If you are using C# you should put a capital I
before all interface names. If you are using Java you should not do this. Also, avoid any in-house or
old-school notation systems; they are confusing and difficult to interpret.
Intuitiveness
Simple mistakes such as typos and grammatical errors are not a big problem and can be picked up in a
review process. Be careful if English is not your first language as subtle grammatical errors can lead to
unintuitive and inconsistently named methods.
Choose practical names for your methods and ensure they do only what is hinted at by the name [11]. If the
name of the method is 'processOrder()' then it should only process orders and not, for example, click on
the home button at the end.
Where feasible, implement two-element enumerated types instead of Booleans. This lends itself to much
more readable code, e.g.
processOrder(..., Transaction.REFUND)
is a lot easier to read than:
processOrder(..., false) // true is for credit
You should list permissible values as enumerated types instead of strings. If you accept VISA,
MasterCard and American Express, create an enumerated type with three values instead of passing around
string values. This eliminates issues with case and spacing, as well as providing protection via the
compiler. Use
purchaseItem(2.50, PaymentType.MASTERCARD)
instead of
purchaseItem(2.50, "Master Card")
Overload your methods, but don't go crazy. If there are multiple ways to process an order, then provide an
overloaded implementation for each scenario. You can have a method to process a 'vanilla' order as well
as one that takes a value indicating whether tax should be applied or a discount included, etc. Make sure all
parameters are named appropriately to avoid confusion between the different implementations.
Java varargs are your friend. Varargs in Java provide the ability to accept zero or more arguments as
parameters to a method. This can be very useful when filling out a form that has many optional fields,
which can otherwise result in an exponential rise in the number of methods needed to cover all the scenarios.
public AppPage sendRequest(String client, String account, String... optionalFields) {
    login();
    processOrder(client, account, optionalFields);
    return new AppPage();
}
Calling the ‘sendRequest’ method would look something like this:
aPage.sendRequest(MERCHANT_ID, ACCOUNT_ID, "CURRENCY:EUR", "FIELD1:aaa", "FIELD2:bbb");
Avoid naming systems that are too general as this will not read well to anyone using the API. Avoid
names like doSomething() or other unclear terminology. If you find yourself struggling to name a method
it is usually a sign that you need to redesign things.
Use names from the problem domain. This will allow the tester to think in terms of this domain, which is
what you want. Leverage the business documentation for the correct terminology: if the documentation
says that you can add a connection, then name your method:
addConnection(...)
Avoid similar sounding names like
createConnection(...)
It can be very confusing for the user when there are slightly different naming conventions being used by
different teams.
In general, don't ask your users to make assumptions. Sometimes they'll get it wrong and will blame your
API, and rightly so.
The quality of the API you write will have a direct bearing on the output that the testers can produce and
in turn, the amount of time they can redirect to other areas such as exploratory testing. This redirection of
effort can have a major bearing on the quality of the software that is released.
It's important to note that an automation API (like any API) is like a contract and any changes come with
an associated cost. This cost often comes in the form of time taken to update any tests that were created
using the original API. It is often a good idea to create some client code (tests in our case) along with the
API to verify that the interface provided is actually fit for purpose.
Ferenc Mihaly gave an extremely interesting talk in November 2011 [12] about designing APIs, and one
key piece of advice which I believe to be extremely helpful is to think from the
perspective of the caller. This means that when writing the API you should always keep in mind the code
that will need to be written by the client to consume your API. For example, say you have an API to
process an order on your system; you could call it something like this:
process(int orderId)
From the perspective of the API developer it is clear what this call does: we pass in an order id and we
process the order. However, when we look at the code from the perspective of the caller, we see something
different:
process(123)
As you can see, when reading the client code it is not clear what is being processed. Of course, we
can always look up the API documentation or drill down if we are in an IDE, but why force our users to
take that extra step?
Instead we can call the method:
processOrder(int orderId)
The calling code is a lot clearer at first glance:
processOrder(123)
Of course, this is a trivial example but hopefully you can see the value of thinking about the code from
the caller perspective when designing your API.
Styling an automation API
Fluent APIs
There are a number of approaches you can take when designing your API. The best, in my experience, is
to implement a fluent API [13]. A fluent API can mean many things, but for me the most important part is
method chaining. This means, in essence, that each method returns the object it was called on, so that
calls to the class's methods can be strung together. This allows for the writing of tests that are very
readable and clear to understand, without the need to clutter the code with comments. An example of
method chaining is shown below:
customersPage.searchBySurname("McTest")
             .addPaymentMethod("Credit Card", "4242424242424242")
             .editPaymentMethod(1, "4242424242424242", "02/2023")
             .deletePaymentMethod();
This chaining of methods is not the only component of a fluent API but it is a very important part.
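Method chaining of this kind can be implemented by simply returning the page object from each action method. The sketch below is self-contained and purely illustrative: the class and method names are stand-ins, and the action log takes the place of real browser interactions.

```java
import java.util.ArrayList;
import java.util.List;

// Illustrative page object demonstrating method chaining.
// A real implementation would drive the browser instead of logging actions.
public class CustomersPage {
    private final List<String> actions = new ArrayList<>();

    public CustomersPage searchBySurname(String surname) {
        actions.add("search:" + surname);
        return this; // returning 'this' is what enables chaining
    }

    public CustomersPage addPaymentMethod(String type, String cardNumber) {
        actions.add("add:" + type);
        return this;
    }

    public CustomersPage deletePaymentMethod() {
        actions.add("delete");
        return this;
    }

    public List<String> actions() {
        return actions;
    }
}
```

Because every action returns the page object, a test reads as a single sentence: `new CustomersPage().searchBySurname("McTest").addPaymentMethod("Credit Card", "4242424242424242").deletePaymentMethod();`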
Data Modeling
You should always leverage data objects for entities in your domain, such as user, transaction, etc. These
can be implemented as plain old Java objects (POJOs). You can, of course, leverage inheritance for related
data types. For example, there may be multiple types of user in the system, such as anonymous user,
power user, admin, etc.
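As a sketch of this kind of data modelling, the illustrative POJO below uses inheritance for a related user type and adds a builder with sensible defaults so tests only state what matters. All names and defaults here are assumptions for illustration, not taken from any real system.

```java
// Illustrative domain data object (POJO) with inheritance and a builder.
public class User {
    private final String username;
    private final String role;

    protected User(String username, String role) {
        this.username = username;
        this.role = role;
    }

    public String getUsername() { return username; }
    public String getRole() { return role; }

    // A related user type implemented via inheritance.
    public static class AdminUser extends User {
        public AdminUser(String username) {
            super(username, "admin");
        }
    }

    // Builder with defaults: unimportant details stay out of the tests.
    public static class Builder {
        private String username = "default-user";
        private String role = "standard";

        public Builder username(String username) { this.username = username; return this; }
        public Builder role(String role) { this.role = role; return this; }
        public User build() { return new User(username, role); }
    }
}
```

A test that only cares about the username can then write `new User.Builder().username("power1").build()` and let everything else default.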
Page Objects
Page objects have become the default pattern for writing automated tests. They are probably badly named,
as the concept of distinct pages often no longer applies to today's web applications, such as Single Page
Applications (SPAs). They are still relevant, however, and should be thought of in terms of a more abstract
area of a web application than the traditional pages of pre-AJAX applications.
Page objects should abstract the HTML away from the test and also hide the asynchronous nature of most
modern sites by building in waits (the AngularJS framework handles this very nicely by using JavaScript
promises).
There are a number of different approaches that you can take when designing your page objects, such as
whether the page object will expose a functional or structural interface. This means we need to decide
whether we expose methods such as login() or enterUsername(). There are positives and negatives to both
approaches, and a business-specific decision will need to be taken on which path to take. Of course, you
could expose methods of both types, but this can lead to a very cluttered API. The key thing to remember
is to keep methods private where possible.
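To make the functional/structural distinction concrete, here is a minimal, self-contained sketch in which the public login() method is functional while the structural steps stay private, keeping the API surface small. The in-memory map stands in for real WebDriver calls; all names are illustrative.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative page object: a functional interface (login) built from
// private structural steps (enterUsername, enterPassword, submit).
public class LoginPage {
    // Stand-in for the real page; a real page object would use WebDriver here.
    private final Map<String, String> form = new HashMap<>();
    private boolean submitted = false;

    // Functional: exposes a business action and hides the page structure.
    public LoginPage login(String username, String password) {
        enterUsername(username);
        enterPassword(password);
        submit();
        return this;
    }

    // Structural steps kept private so callers cannot depend on them.
    private void enterUsername(String username) { form.put("username", username); }
    private void enterPassword(String password) { form.put("password", password); }
    private void submit() { submitted = true; }

    public boolean isSubmitted() { return submitted; }
    public String enteredUsername() { return form.get("username"); }
}
```

A test then reads in business terms, `new LoginPage().login("tester", "secret")`, and never mentions individual fields.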
Lists of ten (or so)
10 things you should be doing in your automation framework/API:
1. Alan Richardson [14] says to stop thinking in terms of a framework and think in terms of an API. Use a
fluent interface and prefer a humane [15] interface to a minimal [16] one. Test frameworks are one of the
few use cases where the investment in creating a fluent API can be 'easily' justified [17].
2. Use builder objects to minimise hardwired data in the tests. "If it's not important for it to be in the test,
it's important for it to not be in the test." [18] Use the builder pattern to hide unimportant details in test
data. Using data models will improve the readability and maintainability of your tests.
3. Write your own custom matchers for clarity and flexibility [19].
4. Use the singleton pattern for driver management. Use something like getDriver(), a static factory
method that returns a singleton on a per-thread basis. Watch out for threading issues, though, and avoid
this if you're unsure [20]. Don't fear design patterns, but don't overuse them either. The Selenium Capsules
framework has some great ideas around applying patterns (notably the Decorator pattern), but this is a
somewhat advanced topic [21].
5. Write Database helper methods to verify DB state where appropriate.
6. Leverage Maven and Jenkins integration capabilities. Externalise properties and pass them as
parameters to your Jenkins builds.
7. Use soft and hard asserts in the right places. This is a very powerful concept and is very much
underused in testing today [22].
8. Decide on whether to use Spring to handle your page dependencies. This leads to much cleaner code
and removes hard dependencies. However, be aware that it can make the framework harder to understand
for newcomers.
9. Leverage Jenkins to automate your report generation, as well as WebDriver to take screenshots on
failure [23]. Build logging into page object methods to avoid clogging up tests with 'noise'. Delegation is
the key here, not to people but to existing libraries; don't reinvent a proprietary wheel.
10. Reduce 'flakiness' by keeping tests minimal. Keep the test scenarios as simple as possible and don't try
to do too much in a single test. A neat trick is to capture cookies to simulate logins, as this avoids slow
login calls for every test.
Sometimes it's OK to break the rules, e.g. writing two methods instead of one 'generic' method. Remember,
you don't always have to follow best practice, i.e. DRY code. Sometimes, keeping it simpler is better. The
goal should be to help upskill your testers, where appropriate, and reduce the workload on developers.
You should resist the temptation to show off your skills with HashMaps that are three levels deep or by
applying every design pattern under the sun. Keep in mind that it is not production code, but don't make it
a second-class citizen either.
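The per-thread driver singleton from point 4 can be sketched as follows. The Driver class here is a stand-in for WebDriver so that the example stays self-contained; in a real framework getDriver() would create and return a browser driver instance.

```java
// Illustrative per-thread driver management via a static factory.
public class DriverManager {
    // Stand-in for org.openqa.selenium.WebDriver, for illustration only.
    public static class Driver { }

    // ThreadLocal gives each test thread its own driver instance,
    // avoiding cross-thread interference in parallel runs.
    private static final ThreadLocal<Driver> DRIVER =
            ThreadLocal.withInitial(Driver::new);

    public static Driver getDriver() {
        return DRIVER.get();
    }

    // Always remove the ThreadLocal when done; leaking it is the classic
    // pitfall described in [20].
    public static void quitDriver() {
        DRIVER.remove();
    }
}
```

Tests simply call `DriverManager.getDriver()` wherever a driver is needed; repeated calls on the same thread return the same instance until quitDriver() is called.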
6 things you should not be doing in your automation framework/API:
1. Sleeping and implicit waits. Just don't do it. Period. It's slow and performance dependent. By the way,
wrapping Thread.sleep() in a convenience method called waitAWhile(int) doesn't count as not sleeping in
your tests.
2. Injecting JavaScript unnecessarily. It's a powerful technique when used correctly, but it can lead to
browser compatibility problems that are difficult to diagnose and to misleading failures on browser
upgrades, e.g.
document.getElementById("here").scrollIntoView() [24]
This can behave slightly differently in the various browsers. Remember that Chrome forces updates, and
these cannot be opted out of (unless you set up a firewall rule).
3. Using XPath for your locators. XPath locators should be avoided where possible: they are slow, hard to
read and inconsistent across browsers. CSS selectors [25] are faster at runtime, and IDs are your friend.
4. Running all your tests through the UI. It's slow, brittle and unnecessary for many data-driven scenarios.
A little time spent selecting the right layer to run your tests at can pay off big time in the long run. If your
business requires the use of Equivalence Partitioning and Boundary Value Analysis, these should be
applied at a lower level, such as the API, as they tend to generate large numbers of tests. Keep visibility of
where testing is done all the way down the stack. Make informed decisions on where to place the tests.
Push tests down the stack unless you have a reason not to. Keep the feedback loop short, while managing
risk. Don't bypass business logic in any layer.
5. Over-generalising your test methods. A lot of time can be spent on making your test methods reusable,
and this is generally a good thing, but taking some time to determine a use case before doing so can save a
lot of time creating 'reuseless' [26] code. This is an all too common anti-pattern, where methods are pushed
into superclasses for reuse even though no one ever reuses them. They end up cluttering the code and the
API, and lead to maintenance problems. Remember, declaring something as reusable does not make it so;
somebody has to actually want to reuse it.
6. Avoid "development in parallel". This means that it does not make sense to develop a framework so
complicated in nature that it rivals the software it is designed to test. It's important to strike a balance
between maintaining the framework/API and getting the job done. Ensure any frameworks/APIs are
reviewed by architects/lead developers to catch such problems early on and avoid costly backtracking.
7. Avoid off by one errors, with thanks to Martin Fowler [27].
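As an alternative to the fixed sleeps discouraged in point 1 above, a condition-based wait polls until a condition holds or a timeout expires, in the spirit of WebDriver's explicit waits. The sketch below is a minimal illustration, not the Selenium API itself.

```java
import java.util.function.BooleanSupplier;

// Minimal condition-based wait: poll until the condition holds or a
// timeout expires, rather than sleeping for a fixed interval.
public class Wait {
    public static void until(BooleanSupplier condition, long timeoutMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (!condition.getAsBoolean()) {
            if (System.currentTimeMillis() > deadline) {
                throw new IllegalStateException(
                        "Condition not met within " + timeoutMillis + " ms");
            }
            try {
                Thread.sleep(50); // short poll interval, not a blind wait
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                throw new IllegalStateException("Interrupted while waiting", e);
            }
        }
    }
}
```

A page object can then write `Wait.until(() -> pageIsLoaded(), 5000)` and return as soon as the page is ready, instead of always paying the worst-case sleep.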
Call to action
Please don't just read this paper and leave it at that. Thoughts are meaningless without action. Start
stripping those locators from your tests and improve your existing automation approach. Your future self
will thank you for it.
Don't forget the three rules of good software automation:
o Abstraction
o Abstraction
o More abstraction
Remember, teamwork is the key here. Help each other to learn new 'stuff'. Developers can teach testers
how to leverage Maven and Jenkins. Testers can teach developers how to build up a suite of tests that can
be leveraged for unit testing. One of the most effective techniques in testing is to push tests down the
pyramid so that feedback is provided earlier. If something is covered by a unit test, developers will find a
defect in it even before they commit their code. If it is covered by a GUI test, it may not be found until a
nightly build is run. That's not bad, but we should always be aiming to do better, shouldn't we?
Future Work
Following the process outlined here will help in implementing an effective software automation approach
within your company, but there are always new things on the horizon that can be leveraged to improve the
existing situation. Some current areas of interest for me include:
 Leveraging Java 8
o The Lambdas [28] feature may allow for a significant reduction in the amount of code
required to implement synchronisation in the WebDriver API, which currently requires a
large number of anonymous inner classes. If this boilerplate can be replaced with a simple
passing of behaviour, it could prove very beneficial.
 Leveraging the Memento pattern [29] to remember the state of a web application. This could allow
tests to drop in and out at various points across complicated testing scenarios.
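To illustrate the Java 8 point above, the sketch below expresses the same wait condition first as a pre-Java-8 anonymous inner class and then as a lambda. The Condition interface merely mimics the single-method shape of WebDriver's ExpectedCondition; everything here is illustrative.

```java
// Comparing an anonymous inner class with the equivalent Java 8 lambda.
public class LambdaComparison {
    // Single-method interface, similar in shape to ExpectedCondition.
    interface Condition {
        boolean isMet();
    }

    static boolean evaluate(Condition c) {
        return c.isMet();
    }

    // Pre-Java-8 style: several lines of ceremony per condition.
    public static boolean viaAnonymousClass(final int value) {
        return evaluate(new Condition() {
            @Override
            public boolean isMet() {
                return value > 0;
            }
        });
    }

    // Java 8 style: the same behaviour passed as a one-line lambda.
    public static boolean viaLambda(int value) {
        return evaluate(() -> value > 0);
    }
}
```

Multiplied across the many synchronisation points in a typical suite, this reduction in boilerplate is where the benefit would come from.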
References
[All Links provided here were valid at the time of writing, last validated Oct-12-2014]
1. https://www.scrum.org/Resources/Scrum-Glossary/Definition-of-Done
2. http://programmer.97things.oreilly.com/wiki/index.php/Don%27t_Repeat_Yourself
3. http://dannorth.net/introducing-bdd/
4. http://cukes.info/
5. http://jbehave.org/
6. http://www.mountaingoatsoftware.com/blog/the-forgotten-layer-of-the-test-automation-pyramid
7. http://martinfowler.com/bliki/PageObject.html
8. http://www.satisfice.com/articles/what_is_et.shtml
9. Joshua Bloch, Effective Java, ISBN 0321356683
10. http://docs.oracle.com/javase/1.5.0/docs/guide/language/varargs.html
11. http://martinfowler.com/bliki/TwoHardThings.html
12. http://theamiableapi.com/2011/11/21/api-design-webinar-slides-and-recording/
13. http://www.martinfowler.com/bliki/FluentInterface.html
14. http://seleniumsimplified.com/2014/09/selenium-webdriver-page-object-abstractions-and-beyond/
15. http://www.martinfowler.com/bliki/HumaneInterface.html
16. http://www.martinfowler.com/bliki/MinimalInterface.html
17. http://genehughson.wordpress.com/2012/02/20/fluent-interfaces/
18. Gerard Meszaros, xUnit Test Patterns: Refactoring Test Code, ISBN 0131495054
19. Steve Freeman and Nat Pryce, Growing Object-Oriented Software, Guided by Tests, ISBN 0321503627
20. https://plumbr.eu/blog/how-to-shoot-yourself-in-foot-with-threadlocals
21. http://seleniumcapsules.blogspot.ie/
22. http://beust.com/weblog/2012/07/29/reinventing-assertions/
23. https://github.com/allure-framework/allure-core/wiki
24. https://developer.mozilla.org/en-US/docs/Web/API/Element.scrollIntoView
25. http://code.tutsplus.com/tutorials/the-30-css-selectors-you-must-memorize--net-16048
26. Scott Ambler, "Reuse Patterns and Antipatterns", 2000 Software Development Magazine,
http://www.drdobbs.com/reuse-patterns-and-antipatterns/184414576?cid=Ambysoft
27. http://martinfowler.com/bliki/TwoHardThings.html
28. http://www.oracle.com/webfolder/technetwork/tutorials/obe/java/Lambda-QuickStart/index.html
29. http://en.wikipedia.org/wiki/Memento_pattern
www.eurostarconferences.com
Join us online at the links below.

More Related Content

What's hot

Seven Keys to Navigating Your Agile Testing Transition
Seven Keys to Navigating Your Agile Testing TransitionSeven Keys to Navigating Your Agile Testing Transition
Seven Keys to Navigating Your Agile Testing TransitionTechWell
 
Back To Basics Hyper Free Principles For Software Developers
Back To Basics Hyper Free Principles For Software DevelopersBack To Basics Hyper Free Principles For Software Developers
Back To Basics Hyper Free Principles For Software DevelopersAdrian Treacy
 
! Testing for agile teams
! Testing for agile teams! Testing for agile teams
! Testing for agile teamsDennis Popov
 
Karate API Testing-Complete Guidance by Testrig
Karate API Testing-Complete Guidance by TestrigKarate API Testing-Complete Guidance by Testrig
Karate API Testing-Complete Guidance by TestrigPritiFGaikwad
 
Teaching Kids Programming
Teaching Kids ProgrammingTeaching Kids Programming
Teaching Kids ProgrammingLynn Langit
 
Evolution of Software Engineering in NCTR Projects
Evolution of Software Engineering in NCTR  Projects   Evolution of Software Engineering in NCTR  Projects
Evolution of Software Engineering in NCTR Projects Mohammed Abbas
 
Flavours of agile software engineering
Flavours of agile software engineeringFlavours of agile software engineering
Flavours of agile software engineeringZeeshan Masood S
 
Sourav_Kumar_SKUM279_Manoj_HYD_My Journey as a Software Testing Professional...
Sourav_Kumar_SKUM279_Manoj_HYD_My  Journey as a Software Testing Professional...Sourav_Kumar_SKUM279_Manoj_HYD_My  Journey as a Software Testing Professional...
Sourav_Kumar_SKUM279_Manoj_HYD_My Journey as a Software Testing Professional...sourav kumar
 
Agile Executive Briefing - Situational Assessment + 50k Ft View
Agile Executive Briefing - Situational Assessment + 50k Ft ViewAgile Executive Briefing - Situational Assessment + 50k Ft View
Agile Executive Briefing - Situational Assessment + 50k Ft ViewMichael Sahota
 
The Role of Quality Assurance in the World of Agile Development and Scrum
The Role of Quality Assurance in the World of Agile Development and ScrumThe Role of Quality Assurance in the World of Agile Development and Scrum
The Role of Quality Assurance in the World of Agile Development and ScrumRussell Pannone
 
Current Trends in Agile - opening keynote for Agile Israel 2014
Current Trends in Agile - opening keynote for Agile Israel 2014Current Trends in Agile - opening keynote for Agile Israel 2014
Current Trends in Agile - opening keynote for Agile Israel 2014Yuval Yeret
 
Scrum And The Enterprise
Scrum And The EnterpriseScrum And The Enterprise
Scrum And The EnterpriseJames Peckham
 
Rapid software testing
Rapid software testingRapid software testing
Rapid software testingSachin MK
 
Darshan Desai - Virtual Test Labs,The Next Frontier - EuroSTAR 2010
Darshan Desai - Virtual Test Labs,The Next Frontier - EuroSTAR 2010Darshan Desai - Virtual Test Labs,The Next Frontier - EuroSTAR 2010
Darshan Desai - Virtual Test Labs,The Next Frontier - EuroSTAR 2010TEST Huddle
 
Michael Bolton - Two Futures of Software Testing
Michael Bolton - Two Futures of Software TestingMichael Bolton - Two Futures of Software Testing
Michael Bolton - Two Futures of Software TestingTEST Huddle
 
Agile Testing 20021015
Agile Testing 20021015Agile Testing 20021015
Agile Testing 20021015Raghu Karnati
 
Inhouse vs-off-the-shelf-may
Inhouse vs-off-the-shelf-mayInhouse vs-off-the-shelf-may
Inhouse vs-off-the-shelf-mayAyodeji Adesina
 

What's hot (20)

Seven Keys to Navigating Your Agile Testing Transition
Seven Keys to Navigating Your Agile Testing TransitionSeven Keys to Navigating Your Agile Testing Transition
Seven Keys to Navigating Your Agile Testing Transition
 
Back To Basics Hyper Free Principles For Software Developers
Back To Basics Hyper Free Principles For Software DevelopersBack To Basics Hyper Free Principles For Software Developers
Back To Basics Hyper Free Principles For Software Developers
 
! Testing for agile teams
! Testing for agile teams! Testing for agile teams
! Testing for agile teams
 
Karate API Testing-Complete Guidance by Testrig
Karate API Testing-Complete Guidance by TestrigKarate API Testing-Complete Guidance by Testrig
Karate API Testing-Complete Guidance by Testrig
 
Teaching Kids Programming
Teaching Kids ProgrammingTeaching Kids Programming
Teaching Kids Programming
 
Evolution of Software Engineering in NCTR Projects
Evolution of Software Engineering in NCTR  Projects   Evolution of Software Engineering in NCTR  Projects
Evolution of Software Engineering in NCTR Projects
 
Flavours of agile software engineering
Flavours of agile software engineeringFlavours of agile software engineering
Flavours of agile software engineering
 
Sourav_Kumar_SKUM279_Manoj_HYD_My Journey as a Software Testing Professional...
Sourav_Kumar_SKUM279_Manoj_HYD_My  Journey as a Software Testing Professional...Sourav_Kumar_SKUM279_Manoj_HYD_My  Journey as a Software Testing Professional...
Sourav_Kumar_SKUM279_Manoj_HYD_My Journey as a Software Testing Professional...
 
Agile Executive Briefing - Situational Assessment + 50k Ft View
Agile Executive Briefing - Situational Assessment + 50k Ft ViewAgile Executive Briefing - Situational Assessment + 50k Ft View
Agile Executive Briefing - Situational Assessment + 50k Ft View
 
The Role of Quality Assurance in the World of Agile Development and Scrum
The Role of Quality Assurance in the World of Agile Development and ScrumThe Role of Quality Assurance in the World of Agile Development and Scrum
The Role of Quality Assurance in the World of Agile Development and Scrum
 
Current Trends in Agile - opening keynote for Agile Israel 2014
Current Trends in Agile - opening keynote for Agile Israel 2014Current Trends in Agile - opening keynote for Agile Israel 2014
Current Trends in Agile - opening keynote for Agile Israel 2014
 
Scrum And The Enterprise
Scrum And The EnterpriseScrum And The Enterprise
Scrum And The Enterprise
 
Multi team release framework
Multi team release frameworkMulti team release framework
Multi team release framework
 
Rapid software testing
Rapid software testingRapid software testing
Rapid software testing
 
Journey of atdd
Journey of atddJourney of atdd
Journey of atdd
 
Darshan Desai - Virtual Test Labs,The Next Frontier - EuroSTAR 2010
Darshan Desai - Virtual Test Labs,The Next Frontier - EuroSTAR 2010Darshan Desai - Virtual Test Labs,The Next Frontier - EuroSTAR 2010
Darshan Desai - Virtual Test Labs,The Next Frontier - EuroSTAR 2010
 
Michael Bolton - Two Futures of Software Testing
Michael Bolton - Two Futures of Software TestingMichael Bolton - Two Futures of Software Testing
Michael Bolton - Two Futures of Software Testing
 
Agile Testing 20021015
Agile Testing 20021015Agile Testing 20021015
Agile Testing 20021015
 
Inhouse vs-off-the-shelf-may
Inhouse vs-off-the-shelf-mayInhouse vs-off-the-shelf-may
Inhouse vs-off-the-shelf-may
 
Model-Based Testing for Cypress
Model-Based Testing for CypressModel-Based Testing for Cypress
Model-Based Testing for Cypress
 

Viewers also liked

Олексій Тихонов: презентація «Енергоефективні будинки в місті»
Олексій Тихонов: презентація «Енергоефективні будинки в місті»Олексій Тихонов: презентація «Енергоефективні будинки в місті»
Олексій Тихонов: презентація «Енергоефективні будинки в місті»Kyiv Smart City
 
정품 비아그라 판매 ↕↕ sy337˚net ↕↕ 프로코밀판매c59
정품 비아그라 판매 ↕↕ sy337˚net ↕↕ 프로코밀판매c59 정품 비아그라 판매 ↕↕ sy337˚net ↕↕ 프로코밀판매c59
정품 비아그라 판매 ↕↕ sy337˚net ↕↕ 프로코밀판매c59 황 약사
 
CEN TC350 - sustainability assessment of buildings and other sustainability a...
CEN TC350 - sustainability assessment of buildings and other sustainability a...CEN TC350 - sustainability assessment of buildings and other sustainability a...
CEN TC350 - sustainability assessment of buildings and other sustainability a...Chris Hamans
 
Módulo 2: ¿Las TIc y yo?
Módulo 2: ¿Las TIc y yo?Módulo 2: ¿Las TIc y yo?
Módulo 2: ¿Las TIc y yo?Ingrid Heredia
 
Elective care conference: rules recap & effective management of diagnostic wa...
Elective care conference: rules recap & effective management of diagnostic wa...Elective care conference: rules recap & effective management of diagnostic wa...
Elective care conference: rules recap & effective management of diagnostic wa...NHS Improvement
 
Elective care conference: MDT workload tracker
Elective care conference: MDT workload trackerElective care conference: MDT workload tracker
Elective care conference: MDT workload trackerNHS Improvement
 
Elective Care Conference: using the IST capacity and demand tool
Elective Care Conference: using the IST capacity and demand toolElective Care Conference: using the IST capacity and demand tool
Elective Care Conference: using the IST capacity and demand toolNHS Improvement
 
Technical Debt
Technical DebtTechnical Debt
Technical DebtRob Myers
 
BBFC Certification Research
BBFC Certification ResearchBBFC Certification Research
BBFC Certification Researchtbgsmedia1517
 

Viewers also liked (14)

Sikap pemikiran
Sikap pemikiranSikap pemikiran
Sikap pemikiran
 
Олексій Тихонов: презентація «Енергоефективні будинки в місті»
Олексій Тихонов: презентація «Енергоефективні будинки в місті»Олексій Тихонов: презентація «Енергоефективні будинки в місті»
Олексій Тихонов: презентація «Енергоефективні будинки в місті»
 
정품 비아그라 판매 ↕↕ sy337˚net ↕↕ 프로코밀판매c59
정품 비아그라 판매 ↕↕ sy337˚net ↕↕ 프로코밀판매c59 정품 비아그라 판매 ↕↕ sy337˚net ↕↕ 프로코밀판매c59
정품 비아그라 판매 ↕↕ sy337˚net ↕↕ 프로코밀판매c59
 
CEN TC350 - sustainability assessment of buildings and other sustainability a...
CEN TC350 - sustainability assessment of buildings and other sustainability a...CEN TC350 - sustainability assessment of buildings and other sustainability a...
CEN TC350 - sustainability assessment of buildings and other sustainability a...
 
Módulo 2: ¿Las TIc y yo?
Módulo 2: ¿Las TIc y yo?Módulo 2: ¿Las TIc y yo?
Módulo 2: ¿Las TIc y yo?
 
Umen componentes modulos_2
Umen componentes modulos_2Umen componentes modulos_2
Umen componentes modulos_2
 
Tone B. Torgersen, The National Institute for Public Health, Norway
Tone B. Torgersen, The National Institute for Public Health, NorwayTone B. Torgersen, The National Institute for Public Health, Norway
Tone B. Torgersen, The National Institute for Public Health, Norway
 
Agile Risk Management
Agile Risk ManagementAgile Risk Management
Agile Risk Management
 
Elective care conference: rules recap & effective management of diagnostic wa...
Elective care conference: rules recap & effective management of diagnostic wa...Elective care conference: rules recap & effective management of diagnostic wa...
Elective care conference: rules recap & effective management of diagnostic wa...
 
Elective care conference: MDT workload tracker
Elective care conference: MDT workload trackerElective care conference: MDT workload tracker
Elective care conference: MDT workload tracker
 
Elective Care Conference: using the IST capacity and demand tool
Elective Care Conference: using the IST capacity and demand toolElective Care Conference: using the IST capacity and demand tool
Elective Care Conference: using the IST capacity and demand tool
 
Technical Debt
Technical DebtTechnical Debt
Technical Debt
 
BBFC Certification Research
BBFC Certification ResearchBBFC Certification Research
BBFC Certification Research
 
Job Description
Job DescriptionJob Description
Job Description
 

Similar to Adapting-Automation-to-the-available-workforce

BEGINNERS GUIDE TO SOFTWARE TESTING BY C.PADMINI
BEGINNERS GUIDE TO SOFTWARE TESTING BY C.PADMINIBEGINNERS GUIDE TO SOFTWARE TESTING BY C.PADMINI
BEGINNERS GUIDE TO SOFTWARE TESTING BY C.PADMINIsuhasreddy1
 
The productivity of testing in software development life cycle
The productivity of testing in software development life cycleThe productivity of testing in software development life cycle
The productivity of testing in software development life cycleNora Alriyes
 
Test-Driven Developments are Inefficient; Behavior-Driven Developments are a ...
Test-Driven Developments are Inefficient; Behavior-Driven Developments are a ...Test-Driven Developments are Inefficient; Behavior-Driven Developments are a ...
Test-Driven Developments are Inefficient; Behavior-Driven Developments are a ...Abdelkrim Boujraf
 
Beginners guide to software testing
Beginners guide to software testingBeginners guide to software testing
Beginners guide to software testingKevalkumar Shah
 
Markus Clermont - Surviving in an Agile Environment - Google - SoftTest Ireland
Markus Clermont - Surviving in an Agile Environment - Google - SoftTest IrelandMarkus Clermont - Surviving in an Agile Environment - Google - SoftTest Ireland
Markus Clermont - Surviving in an Agile Environment - Google - SoftTest IrelandDavid O'Dowd
 
MTLM Visual Studio 2010 ALM workshop
MTLM Visual Studio 2010 ALM workshopMTLM Visual Studio 2010 ALM workshop
MTLM Visual Studio 2010 ALM workshopClemens Reijnen
 
Scaling Software Delivery.pdf
Scaling Software Delivery.pdfScaling Software Delivery.pdf
Scaling Software Delivery.pdfTiffany Jachja
 
Bpm10gperformancetuning 476208
Bpm10gperformancetuning 476208Bpm10gperformancetuning 476208
Bpm10gperformancetuning 476208Vibhor Rastogi
 
DevOps - Continuous Integration, Continuous Delivery - let's talk
DevOps - Continuous Integration, Continuous Delivery - let's talkDevOps - Continuous Integration, Continuous Delivery - let's talk
DevOps - Continuous Integration, Continuous Delivery - let's talkD Z
 
Testing Experience - Evolution of Test Automation Frameworks
Testing Experience - Evolution of Test Automation FrameworksTesting Experience - Evolution of Test Automation Frameworks
Testing Experience - Evolution of Test Automation FrameworksŁukasz Morawski
 
The Agile Readiness Assessment Tool Essay
The Agile Readiness Assessment Tool EssayThe Agile Readiness Assessment Tool Essay
The Agile Readiness Assessment Tool EssayHeidi Owens
 
Scrum an extension pattern language for hyperproductive software development
Scrum an extension pattern language  for hyperproductive software developmentScrum an extension pattern language  for hyperproductive software development
Scrum an extension pattern language for hyperproductive software developmentShiraz316
 
Design pattern tutorial
Design pattern tutorialDesign pattern tutorial
Design pattern tutorialPiyush Mittal
 
Estimating test effort part 1 of 2
Estimating test effort part 1 of 2Estimating test effort part 1 of 2
Estimating test effort part 1 of 2Ian McDonald
 
Tackling software testing challenges in the agile era
Tackling software testing challenges in the agile eraTackling software testing challenges in the agile era
Tackling software testing challenges in the agile eraQASymphony
 
Manoj Kolhe - Testing in Agile Environment
Manoj Kolhe - Testing in Agile EnvironmentManoj Kolhe - Testing in Agile Environment
Manoj Kolhe - Testing in Agile EnvironmentManoj Kolhe
 
Selection And Implementation Of An Enterprise Maturity...
Selection And Implementation Of An Enterprise Maturity...Selection And Implementation Of An Enterprise Maturity...
Selection And Implementation Of An Enterprise Maturity...Jenny Calhoon
 

Similar to Adapting-Automation-to-the-available-workforce (20)

Testing by padamini c
Testing by padamini cTesting by padamini c
Testing by padamini c
 
BEGINNERS GUIDE TO SOFTWARE TESTING BY C.PADMINI
BEGINNERS GUIDE TO SOFTWARE TESTING BY C.PADMINIBEGINNERS GUIDE TO SOFTWARE TESTING BY C.PADMINI
BEGINNERS GUIDE TO SOFTWARE TESTING BY C.PADMINI
 
The productivity of testing in software development life cycle
The productivity of testing in software development life cycleThe productivity of testing in software development life cycle
The productivity of testing in software development life cycle
 
Test-Driven Developments are Inefficient; Behavior-Driven Developments are a ...
Test-Driven Developments are Inefficient; Behavior-Driven Developments are a ...Test-Driven Developments are Inefficient; Behavior-Driven Developments are a ...
Test-Driven Developments are Inefficient; Behavior-Driven Developments are a ...
 
167312
167312167312
167312
 
Beginners guide to software testing
Beginners guide to software testingBeginners guide to software testing
Beginners guide to software testing
 
Markus Clermont - Surviving in an Agile Environment - Google - SoftTest Ireland
Markus Clermont - Surviving in an Agile Environment - Google - SoftTest IrelandMarkus Clermont - Surviving in an Agile Environment - Google - SoftTest Ireland
Markus Clermont - Surviving in an Agile Environment - Google - SoftTest Ireland
 
Test Driven Development (TDD)
Test Driven Development (TDD)Test Driven Development (TDD)
Test Driven Development (TDD)
 
MTLM Visual Studio 2010 ALM workshop
MTLM Visual Studio 2010 ALM workshopMTLM Visual Studio 2010 ALM workshop
MTLM Visual Studio 2010 ALM workshop
 
Scaling Software Delivery.pdf
Scaling Software Delivery.pdfScaling Software Delivery.pdf
Scaling Software Delivery.pdf
 
Bpm10gperformancetuning 476208
Bpm10gperformancetuning 476208Bpm10gperformancetuning 476208
Bpm10gperformancetuning 476208
 
DevOps - Continuous Integration, Continuous Delivery - let's talk
DevOps - Continuous Integration, Continuous Delivery - let's talkDevOps - Continuous Integration, Continuous Delivery - let's talk
DevOps - Continuous Integration, Continuous Delivery - let's talk
 
Testing Experience - Evolution of Test Automation Frameworks
Testing Experience - Evolution of Test Automation FrameworksTesting Experience - Evolution of Test Automation Frameworks
Testing Experience - Evolution of Test Automation Frameworks
 
The Agile Readiness Assessment Tool Essay
The Agile Readiness Assessment Tool EssayThe Agile Readiness Assessment Tool Essay
The Agile Readiness Assessment Tool Essay
 
Scrum an extension pattern language for hyperproductive software development
Scrum an extension pattern language  for hyperproductive software developmentScrum an extension pattern language  for hyperproductive software development
Scrum an extension pattern language for hyperproductive software development
 
Design pattern tutorial
Design pattern tutorialDesign pattern tutorial
Design pattern tutorial
 
Estimating test effort part 1 of 2
Estimating test effort part 1 of 2Estimating test effort part 1 of 2
Estimating test effort part 1 of 2
 
Tackling software testing challenges in the agile era
Tackling software testing challenges in the agile eraTackling software testing challenges in the agile era
Tackling software testing challenges in the agile era
 
Manoj Kolhe - Testing in Agile Environment
Manoj Kolhe - Testing in Agile EnvironmentManoj Kolhe - Testing in Agile Environment
Manoj Kolhe - Testing in Agile Environment
 
Selection And Implementation Of An Enterprise Maturity...
Selection And Implementation Of An Enterprise Maturity...Selection And Implementation Of An Enterprise Maturity...
Selection And Implementation Of An Enterprise Maturity...
 

Adapting-Automation-to-the-available-workforce

  • 1. Adapting Automation to the Available Workforce Colm Harrington VCE, Ireland
  • 2. IsThisBookRightForMe? INTRODUCTORY Introductory content is for software testing professionals who are relatively new to the subject matter of the ebook. Introductory ebooks typically feature an overview of an aspect of testing or guidance on understanding the fundamentals of the subject at hand. INTERMEDIATE Intermediate ebooks are for software testers who are already familiar with the subject matter of the eBook but have limited hands- on experience in this area. Intermediate level usually focuses on working examples of a concept or relaying experiences of applying the approach in a particular context. ADVANCED Advanced ebook content is for software testers who are, or intend to become, experts on the subject matter. These ebooks look at more advanced features of an aspect of software testing and help the reader to develop a more complete understanding of the subject. This eBook is intended for ALL LEVELS
  • 3. Abstract The challenge of producing software today is the same as it has always been: deliver high quality software within an acceptable timeframe. The ways we, as software engineers, try to tackle this challenge have changed somewhat now that we are in the 'agile age'. Being agile means many things, and one of those is that we need to automate our testing to ensure we don't get caught up in the mire of ever-increasing regression test cycles. The requirement to deliver 'completed' software within the confines of a sprint means that more demands are being placed on the QAs of today to automate their work. This has resulted in a significant challenge, as the current workforce has not yet caught up with demand in terms of SDETs (Software Development Engineers in Test). The approach described in this paper can work whether there is an SDET available to the team or not. If there is, they can fulfill the role of the developer and free the developers to produce the actual application. For the purposes of brevity, when I refer to developers in this paper this can equally apply to both developers and developers in test. The approach described in this paper is an attempt to tackle the problem of automating software with limited resources by effectively splitting the job into two parts, each tackled by the people with the best existing skills in that area: 1. Create a testing framework; this is put in place by the developers. 2. Create a suite of tests; these are created by the QA staff and leverage the framework.
  • 4. Biography Colm Harrington Colm is an automation architect with VCE, based in Cork, Ireland, with responsibilities for both front-end (via Selenium) and API testing. Colm has worked in the software industry, for various companies including Microsoft and Sage Ireland, for over 10 years. His real passion is simplifying the test process of complex applications and integrating the test process seamlessly into the SDLC.
  • 5. Adapting Automation to the available workforce. A white paper for developers AND testers. Author: Colm Harrington, October 2014
  • 6. Table of Contents
Executive Summary
Introduction
The problem
The solution
    Define the roles of the key participants
        The role of developers (Framework)
        The role of testers (Test cases)
        The role of Business Analysts (User stories)
    Crafting an Automation API
        Simplicity
        Consistency
        Intuitiveness
    Styling an automation API
        Fluent APIs
        Data Modeling
        Page Objects
    Lists of ten (or so)
        10 things you should be doing in your automation framework/API
        6 things you should not be doing in your automation framework/API
    Call to action
    Future Work
    References
  • 7. Executive Summary The challenge of producing software today is the same as it has always been: deliver high quality software within an acceptable timeframe. The ways we, as software engineers, try to tackle this challenge have changed somewhat now that we are in the 'agile age'. Being agile means many things, and one of those is that we need to automate our testing to ensure we don't get caught up in the mire of ever-increasing regression test cycles. The requirement to deliver 'completed' software within the confines of a sprint means that more demands are being placed on the QAs of today to automate their work. This has resulted in a significant challenge, as the current workforce has not yet caught up with demand in terms of SDETs (Software Development Engineers in Test). The approach described in this paper can work whether there is an SDET available to the team or not. If there is, they can fulfill the role of the developer and free the developers to produce the actual application. For the purposes of brevity, when I refer to developers in this paper this can equally apply to both developers and developers in test. The approach described in this paper is an attempt to tackle the problem of automating software with limited resources by effectively splitting the job into two parts, each tackled by the people with the best existing skills in that area: 1. Create a testing framework; this is put in place by the developers. 2. Create a suite of tests; these are created by the QA staff and leverage the framework. This approach allows each team member to focus on what they do best and still get the job of automating the tests done. It has the primary benefit of softening the learning curve for QAs, who often come from a manual testing background, when writing automated tests. It does, of course, also allow developers to gain an insight into the work of testers, and the craft of testing, and can be an invaluable aid in cross-skilling within the team.
  • 8. Introduction Everyone knows the old proverb about a problem shared being a problem halved, and it is never more appropriate than when dealing with software automation. For years, automation was seen as the preserve of a 'guy' in a company who existed in something of a rogue state. They were normally under the remit of the QA manager, but their work often overlapped with the development teams in terms of their day-to-day deliverables. The demand for automation has increased as software has become ever more complex, and this has led to a massive increase in demand for people with automation skills and, more recently, WebDriver skills in particular. The role of the SDET (Software Development Engineer in Test) has arisen in recent years and, to some extent, the workforce has yet to catch up. The SDET is a strange beast, part tester and part developer, and often seen as neither by the dedicated team members. Nevertheless, the role is in great demand as the complexity of the world's software rises exponentially. The increase in popularity of '100%' automation solutions as part of agile delivery has led to a massive increase in the time and resources required to actually deliver on this. In addition, automation is now very often included in the 'definition of done' [1] within the sprint confines. The role of the SDET is essentially a combination of developer and tester, and these can be hard to recruit. As in sports, where it is more difficult to find players that attack and defend equally well, it can be a challenge to recruit SDETs, as it is not yet a common career path to take. In smaller companies, where the available headcount may not stretch to additional team members, a good solution is to leverage the existing skill sets and effectively split the job of the SDET among the developers and testers, allowing them to focus on the areas where they are already strong while also gaining an insight into the workings of other team members.
This workload often fell to QA resources who were coming from a manual background and were expected to up-skill as developers' time was taken up with delivering 'functionality'. This commonly led to poorly written automation code (the DRY [2] principle was, and is, commonly ignored) which was difficult to understand and maintain. Often, simple changes to the underlying application required major changes to the tests to get the 'green light'. This in turn led to high maintenance costs and often resulted in the automation becoming more costly (in terms of time and money) than it was worth. It's time to change all that.
  • 9. The problem Modern agile software teams are typically made up of three types of team member: developers, testers and a Business Analyst/Product Owner. All these resources (otherwise known as people!) have a part to play in assuring the quality of the software delivered by the team. Unfortunately, only the testers have traditionally been responsible for quality assurance. Indeed, this has happened to such an extent that the terms quality assurance and testing have become synonymous in the software industry. The difficulty facing the manual tester trying to up-skill is that the learning curve is quite steep. Newcomers have to tackle learning a programming language, an automation API (WebDriver, for example) and, potentially, programming best practices such as design patterns (more on these later). The difficulty for the developer is largely one of time constraints: their time is largely consumed with functional delivery, and it requires 'buy-in' from the business that some time will need to be diverted to automation. The challenge can also include the age-old issue of developers not knowing what to test. The difficulty facing the Business Analysts is that they need to be able to dig a little deeper into the technical side to be able to contribute to the process. The introduction of a BDD [3] framework such as Cucumber [4] or JBehave [5] is a major plus here, as it insulates the business people from the heavy technical details while still allowing everyone on the team to speak the same "language". Software automation is often thought of as exercising the UI via a browser, but there is more to automation than this somewhat narrow viewpoint. In fact, one of the main reasons for the failure of software automation projects is that there are too many tests at too high a level. This can be countered by intelligently pushing tests down the pyramid [6].
  • 10. The solution The solution to the problem stated above could be summed up in a single sentence: lower the entry point for QA to write automated tests by developing an interface to make automation more accessible. This can be done largely by allowing testers to remain in the language of the problem domain and avoiding, as much as possible, the language of the technology domain. Define the roles of the key participants In the agile world, quality is the responsibility of everyone on the team; it is no longer acceptable to throw it ‘over the wall’ at the test team. The key to solving the problem of effective software automation is to share the workload to the benefit of everyone. It's important to make the distinction here between sharing the work and offloading the work. It often happens that at a developers' meeting a decision is made that the QA department should take on the task of reducing the automation backlog, as the developers are overworked anyway and, besides, it is testing work. Meanwhile, down the hall at the QA meeting, the QA team has decided that the backlog should be taken on by the developers, as it is programming work and, besides, we are already overworked. The net result is that the backlog continues to grow and fingers get pointed at end-of-year review meetings instead of the problem actually being solved. One of the surest ways of ensuring that something won't get done is to decide that someone else should do it. To combat this it is essential to break the work down into chunks that can be managed by both parties. The key point here is that each chunk of work is done by those with the skill set to complete it in the most effective and efficient manner. The next section will deal with the various roles within a (typical) agile team and what their role is in terms of quality assurance as opposed to testing. The role of developers (Framework) The main role of the developer, in this context, is to expose a clean, predictable API to QA.
This has the twin benefits of making life easier for QA and leading to more readable, maintainable tests. If desired, BDD can be layered on top of this API, but it is not necessary in order to get a large return on investment (ROI) from this approach. The real benefit of this approach to automation is that the skills are already present in the team members doing the work. Avoiding memory leaks by applying a singleton pattern to driver management
  • 11. in a Selenium framework is straightforward for a competent developer but may present a steep learning curve for a QA without a background in programming. Indeed, applying best practices to the development of a framework/API is the best way to ensure that the team avoids the overhead of poorly developed automation suites that become more costly in maintenance time than they save in the first place. It is now accepted practice to implement the Page Object model [7] when writing automation tests against the WebDriver API. This allows us to abstract low-level detail such as element locators out of the tests. Again, this has the dual benefits of keeping the test code DRY as well as keeping the tests themselves clean. Another area that is very beneficial in terms of time saving is to implement custom assertions. Assertions are the part of a test where we compare the actual result to an expected result: assertTrue("Processing Failed", driver.findElement(By.id("element1")).isDisplayed() && driver.findElement(By.id("element42")).isEnabled()); This is a lot harder to read than something like: assertOrderProcessingFailed(ResponseCode.200, ResultImage.GreenTick) Of course, the custom assertions also help to keep your test code DRY, as the locator logic is abstracted away from the tests and the tester can easily write new tests without having to worry about low-level details such as locators. This is a very powerful concept when implemented correctly, and I will deal with it in more detail in the section on exposing APIs. In my experience, the single most common issue I find when reviewing tests written against the WebDriver API (in Java) is synchronisation. To put it simply, we are not waiting for the right events before continuing. In real life, when we fill in a form on a page we often have to wait a second or two while the next part of the form loads or while a check is carried out to see if the requested username already exists.
This is intuitive for humans but presents a challenge for machines. It has become even more relevant in the Web 2.0 age, with AJAX calls to the server being commonplace on a site. For example, if we are filling out an online payment form and enter our credit card number and then tab to the next field for the date, very often an asynchronous call will be made to the server in the background to check whether it is a valid card number. If the card is valid, the ‘Pay Now’ button is enabled; if not, it remains greyed out. Often the UI will display a green tick or some other visual feedback to let the user know that the card number has been verified. As software automators we need to catch that event (the appearance of the green tick next to the card number field) before attempting to submit the form. The trouble is, it may take a second (or two or three) to do a round trip to the server and determine the card's validity, and our test will press on and fail before the AJAX call returns and enables the ‘Pay Now’ button. The WebDriver API has built-in capabilities to handle these scenarios, but they can prove challenging for newcomers to programming. It is important that these waits are built into any test APIs so the tester can focus on writing tests and does not have to worry about handling asynchronous events, which can be time-consuming to track down and debug.
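The custom assertion idea described earlier (assertOrderProcessingFailed) can be sketched in a self-contained way. The method name and its parameters here are hypothetical, and the page-inspection calls that a real framework would make through WebDriver are stubbed out as plain booleans:

```java
// Sketch of a domain-level custom assertion. In a real framework the two
// booleans would come from a page object inspecting the browser state.
public class OrderAssertions {

    /** Builds the domain-level failure message shown to the tester. */
    static String failureMessage(boolean errorBannerShown, boolean payNowEnabled) {
        return "Expected order processing to fail "
                + "(error banner shown, Pay Now disabled) but got: "
                + "bannerShown=" + errorBannerShown
                + ", payNowEnabled=" + payNowEnabled;
    }

    /** Domain-level assertion: the order must have been rejected by the UI. */
    public static void assertOrderProcessingFailed(boolean errorBannerShown,
                                                   boolean payNowEnabled) {
        if (!errorBannerShown || payNowEnabled) {
            throw new AssertionError(failureMessage(errorBannerShown, payNowEnabled));
        }
    }

    public static void main(String[] args) {
        assertOrderProcessingFailed(true, false); // passes: banner shown, button disabled
        System.out.println("domain assertion passed");
    }
}
```

The tester calls one intention-revealing method; the low-level locator logic and the wording of the failure message live in one place.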
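The advice above about building the waits into the API can be illustrated with a minimal, library-agnostic polling helper. In a real Selenium framework this job is done by WebDriver's own wait support; this stand-alone sketch just shows the underlying pattern of polling a condition (such as the green tick appearing) until a timeout:

```java
import java.util.function.BooleanSupplier;

// A minimal sketch of an explicit wait: poll a condition until it holds
// or the timeout elapses, rather than asserting immediately.
public class Waits {

    /** Polls the condition until it is true or the timeout elapses. */
    public static boolean waitFor(BooleanSupplier condition,
                                  long timeoutMillis, long pollMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            if (condition.getAsBoolean()) {
                return true;
            }
            try {
                Thread.sleep(pollMillis);
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return condition.getAsBoolean(); // one last check at the deadline
    }

    public static void main(String[] args) {
        long start = System.currentTimeMillis();
        // Stand-in for the green tick: a condition that becomes true after ~100 ms.
        boolean ok = waitFor(() -> System.currentTimeMillis() - start > 100, 2000, 20);
        System.out.println("green tick appeared: " + ok);
    }
}
```

Burying a helper like this inside the framework's page objects is what lets testers write tests without ever thinking about AJAX timing.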
  • 12. Another key aspect of this approach is to handle the management of the driver at a higher level than the tests. This ensures that memory leaks are kept to a minimum, as a common mistake when writing automation is to create driver objects and leave them in memory even when they are no longer required. This is particularly costly when writing WebDriver tests, as each driver object represents a new instance of a browser window and poor management can quickly lead to ‘out of memory’ errors. The last topic I will cover here is the ongoing debate as to whether assertions should be in the tests or in the page objects. In general it is better if the page object simply contains data about the page and does not contain any business logic; however, it is sometimes OK to embed the assertions if the scope is limited. The catch is that this harms reuse of those methods. The role of testers (Test cases) The role of the QA within this framework is to add tests by leveraging the API provided by the development team. Ideally, and especially when the QA is not experienced, the sunny-day tests provided by the developers can provide a launch pad for the creation of a suite of tests. There are many possible error conditions for a given piece of functionality, such as ordering a ‘widget’ online: 1. A blank orderId 2. No orderId field present 3. Invalid characters in the orderId field. This is where the expertise of the tester comes into play; it would be neither practical nor desirable for the developers to think up all these scenarios and implement them. Very often, this work will already have been done by the QA team for their manual tests, and this process ensures any duplication of that work is minimised. A combination of the roles of QA and developer ensures that the twin goals of: 1. Testing the software to the limits deemed acceptable by the QA team 2.
The automation code produced is maintainable to a standard deemed acceptable to the development team are both met. Part of the deal for the QA, now that the developers are providing them with an API for writing tests, is that they must be smart about what tests they write. Running tests through a GUI is relatively expensive in terms of the time required to run them, and any redundancy should be kept to a minimum. Consideration should be given to running tests at the API level where appropriate, as the feedback loop is orders of magnitude quicker when there is no GUI to render. Sometimes, the best frontend test is no test at all. The main benefit of this approach, in terms of product quality, is that the tester's time is freed up to do exploratory testing [8]. Exploratory testing is best described as test case execution and test case
  • 13. documentation occurring at the same time. It is this type of testing that is far more likely to find defects in the software than running a regression suite ad nauseam. The role of Business Analysts (User stories) The role of the Business Analyst in the agile software team is an extremely important one, as they are the ones who ensure that the customer gets the software they want (build the right software ...). In my experience the best way of doing this is to clearly list out the Acceptance Criteria (AC) before a line of code is written or a test case documented. ACs are criteria that must be satisfied in order for a piece of software to be declared ‘shippable’. These ACs are what keep the developers on track and give the QAs a baseline to work from. This AC documentation, however, suffers from all the usual drawbacks of documentation, not least of which is that it can become out of date with the software. A solution to this problem is to create living documentation as opposed to static documentation. This can be done by translating the ACs into executable tests via Cucumber or JBehave. This is commonly known as BDD. Porting ACs to the Gherkin format (a form of structured text used by both Cucumber and JBehave) is a (largely) simple process and is a very good skill for all software team members to have.
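As a rough illustration of what porting an AC to Gherkin might look like (the feature and wording below are invented for this sketch, not taken from the paper):

```gherkin
Feature: Order processing
  Scenario: Rejecting an order with a blank order id
    Given a customer has items in their basket
    When they submit an order with a blank order id
    Then the order is rejected
    And an "Order id is required" message is displayed
```

Each Given/When/Then line is then bound to a step definition in the framework, so the business-readable text stays executable and cannot silently drift out of date.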
  • 14. Crafting an Automation API One of the main benefits of exposing an API for testers to leverage when writing their tests is that it frees them up to target other areas such as exploratory testing. This API should serve two basic functions: 1. Abstracting away the low level detail to ensure that the test code is not vulnerable to page structure changes. 2. Providing a Domain Specific Language (DSL) to make it easier for testers to write tests while thinking in the domain of the problem and not that of the technology being used. There are many components to a well-crafted API and a good API takes time to design and implement. It is important to invest some time up front to design an API correctly so that it helps the testers and results in less change and maintenance for the developers. There is a substantial amount of research being done on the design of APIs (in particular RESTful APIs) but not all of it is relevant to automation. The following section amounts to a condensation of the information I have learned along the way from various sources and from my own experience writing automation frameworks and APIs. When designing an automation framework I focus on three main principles: simplicity, consistency and intuitiveness. Performance is also a factor but I will not cover it in detail here. Simplicity It can often occur, in complex web applications, that processing a transaction may require many parameters. According to Joshua Bloch, author of a seminal text on the Java language [9], any more than four parameters is considered excessive. Personally I have found that to be a bit restrictive in an automation context but any more than six or seven and it becomes very difficult to remember what they are all for. It is particularly difficult to remember the parameter usage if they are of the same type. A method signature where the last five parameters are all Booleans is simply asking for trouble. 
Also, because the types are the same, the compiler cannot help the developer here. Remember, a misused method will lead to failed tests (false negatives) and time wasted debugging test code. I often use builder objects or helper classes to combat 'parameter creep'. Don't ask your consumers (testers in our case) to extend classes or implement interfaces. If you are doing this it is a sign that you need to rethink your design. Remember, the key is to allow the tester to think in the problem domain. If you are forcing them to implement interfaces etc. then you are pushing them into the technology domain, and this leads to inefficiencies at best and downright confusion at worst. Handle change requests carefully. It is inevitable that there will be changes required. Often these come in the form of requests from your testers who want to add a new parameter to some method to handle
  • 15. some, as yet unthought-of, use case. These requests can be handled in two ways, depending on the skill set and experience of the tester involved. • Ask them to submit the code change for review. o As the programming skills of your testers improve (one or two heads will rise above the pack in this regard), this is a good option as it frees up developer time, but be careful that all changes are in keeping with the overall goals of the API. Beware the copy/paste bloat of existing methods. This is a privilege that can be earned and possibly used in quarterly objectives etc. • Ask them to specify their request clearly and commit the change yourself. o This system works well with inexperienced programmers and allows more control over any changes to the framework/API. It is important to separate the requirement from the design suggestion and make a decision as to whether this is the best way to do it. The tester may request a new method that takes an extra parameter, unaware that a ‘varargs’ [10] variation already exists, etc. Another handy, but slightly more advanced, tip is to use interfaces in your method signatures instead of the concrete classes. This gives the added flexibility that types implementing the same interface can be passed in at runtime with no modification needed in the code. For example, you may have three types of transaction in your system: ‘VisaTransaction’, ‘MastercardTransaction’ and ‘AmexTransaction’. If all these types implement the Transaction interface, then any method that requires a transaction object should use the interface in its signature so any of the three types can be passed. Consistency The key to consistency in an API is to do things the same way everywhere you do them. If you pass the driver as a parameter in some of your methods, always pass it as the first parameter or the last parameter. It doesn't matter which, as long as you are consistent about it.
If you use enumerated constants to represent order types, use them everywhere order types are required. Avoid using strings in some methods and enumerated types in others; this is frustrating for testers who are trying to figure out your API. Follow the platform conventions of your parent language. If you are using C# you should put a capital I before all interface names; if you are using Java you should not. Also, avoid any in-house or old-school notation systems; they are confusing and difficult to interpret.
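The builder-object approach mentioned under Simplicity for combating 'parameter creep' might look something like the following sketch. The OrderRequest type, its fields and its defaults are all hypothetical, invented for illustration:

```java
// Hypothetical builder: instead of a call with many positional arguments
// (several of the same type), the tester names only the fields that matter.
public class OrderRequest {
    final String orderId;
    final String currency;
    final boolean refund;

    private OrderRequest(Builder b) {
        this.orderId = b.orderId;
        this.currency = b.currency;
        this.refund = b.refund;
    }

    public static class Builder {
        private String orderId;
        private String currency = "EUR"; // sensible default hides the detail
        private boolean refund = false;  // most tests exercise a normal sale

        public Builder orderId(String id) { this.orderId = id; return this; }
        public Builder currency(String c) { this.currency = c; return this; }
        public Builder refund(boolean r)  { this.refund = r;  return this; }
        public OrderRequest build()       { return new OrderRequest(this); }
    }

    public static void main(String[] args) {
        OrderRequest req = new OrderRequest.Builder()
                .orderId("ORD-123")
                .refund(true)
                .build();
        System.out.println(req.orderId + " " + req.currency + " " + req.refund);
    }
}
```

The named setters make each argument self-describing at the call site, which is exactly the help the compiler cannot give when five Booleans sit side by side in a signature.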
  • 16. Intuitiveness Simple mistakes such as typos and grammatical errors are not a big problem and can be picked up in a review process. Be careful if English is not your first language, as subtle grammatical errors can lead to unintuitive and inconsistently named methods. Choose practical names for your methods and ensure they only do what is hinted at in the title [11]. If the name of the method is 'processOrder()' then it should only process orders and not, for example, click on the home button at the end. Where feasible, implement two-element enumerated types instead of Booleans. This lends itself to much more readable code, e.g. processOrder(..., Transaction.REFUND) is a lot easier to read than: processOrder(..., false) // true is for credit You should list permissible values as enumerated types instead of strings. If you accept VISA, MasterCard and American Express, create an enumerated type with three values instead of passing around string values. This eliminates issues with case and spacing, as well as providing protection via the compiler. Use purchaseItem(2.50, PaymentType.MASTERCARD) instead of purchaseItem(2.50, "Master Card") Overload your methods, but don't go crazy. If there are multiple ways to process an order, then provide an overloaded implementation for each scenario. You can have a method to process a 'vanilla' order as well as one that takes a value if tax should be applied or a discount included etc. Make sure all parameters are named appropriately to avoid confusion between the different implementations. Java varargs are your friend. Varargs in Java provide the ability to accept zero or more arguments as parameters to a method. This can be very useful when you need to fill out a form that has many optional fields, which could otherwise result in an exponential rise in methods to cover all the scenarios. public AppPage sendRequest(String client, String account, String...
optionalFields) {
    login();
    processOrder(client, account, optionalFields);
    return new AppPage();
}
  • 17. Calling the ‘sendRequest’ method would look something like this: aPage.sendRequest(MERCHANT_ID, ACCOUNT_ID, "CURRENCY:EUR", "FIELD1:aaa", "FIELD2:bbb"); Avoid naming systems that are too general, as these will not read well to anyone using the API. Avoid names like doSomething() or other unclear terminology. If you find yourself struggling to name a method it is usually a sign that you need to redesign things. Use names from the problem domain. This will allow the tester to think in terms of that domain, which is what you want. Leverage the business documentation for the correct terminology: if the documentation says that you can add a connection, then name your method addConnection(...) and avoid similar-sounding names like createConnection(...). It can be very confusing for the user when slightly different naming conventions are used by different teams. In general, don't ask your users to make assumptions. Sometimes they'll get it wrong and will blame your API, and rightly so. The quality of the API you write will have a direct bearing on the output that the testers can produce and, in turn, the amount of time they can redirect to other areas such as exploratory testing. This redirection of effort can have a major bearing on the quality of the software that is released. It's important to note that an automation API (like any API) is a contract, and any changes come with an associated cost. This cost often comes in the form of time taken to update any tests that were created using the original API. It is often a good idea to create some client code (tests in our case) along with the API to verify that the interface provided is actually fit for purpose. Ferenc Mihaly gave an extremely interesting talk in November 2011 [12] about designing APIs, and one key piece of advice which I believe to be extremely helpful when designing an API is to think from the perspective of the caller.
This means that when writing the API you should always keep in mind the code that will need to be written by the client to consume your API. For example, say you have an API to process an order on your system; you could call it something like this: process(int orderId) From the perspective of the API developer it is clear what this call does: we pass in an order id and we process the order. However, when we look at the code from the perspective of the caller we see something different: process(123)
  • 18. As you can see, when reading the client code it is not clear what is being processed. Of course we can always look up the API documentation or drill down if we are in an IDE, but why force our users to take that extra step? Instead we can call the method: processOrder(int orderId) The calling code is a lot clearer at first glance: processOrder(123) Of course, this is a trivial example, but hopefully you can see the value of thinking about the code from the caller's perspective when designing your API.
  • 19. Styling an automation API Fluent APIs There are a number of approaches you can take when designing your API. The best, in my experience, is to implement a fluent API [13]. A fluent API can mean many things, but for me the most important part is method chaining. This means, in essence, that each method returns an object of its parent class so that calls can be chained together. This allows for the writing of tests that are very readable and clear to understand without the need to clutter the code with comments. An example of method chaining is shown below: customersPage.searchBySurname("McTest").addPaymentMethod("Credit Card", "4242424242424242") .editPaymentMethod(1, "4242424242424242", "02/2023").deletePaymentMethod(); This chaining of methods is not the only component of a fluent API, but it is a very important part. Data Modeling You should always leverage data objects for entities in your domain such as user, transaction etc. These can be implemented as simple Java objects (POJOs). You can, of course, leverage inheritance for related data types. For example, there may be multiple types of user in the system, such as anonymous user, power user, admin etc. Page Objects Page objects have become the default pattern for writing automated tests. They are probably badly named, as the concept of distinct pages often no longer applies to today’s web applications, such as Single Page Applications (SPAs). Page objects should abstract the HTML away from the test and also hide the asynchronous nature of most modern sites by building in waits (the AngularJS framework handles this very nicely by using JavaScript promises). They are still relevant, however, and should be thought of in terms of a more abstract area of a web application than the traditional pages of pre-AJAX applications.
There are a number of different approaches that you can take when designing your page objects, such as whether the page object will expose a functional or structural interface. This means we need to decide whether we expose methods such as login() or enterUsername(). There are positives and negatives to both approaches, and a business-specific decision will need to be taken on which path to take. Of course, you could expose methods of both types, but this can lead to a very cluttered API. The key thing to remember is to keep methods private where possible.
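As a hedged sketch of the functional approach (the LoginPage and its fields below are hypothetical, with no real browser behind them), the structural steps stay private and the test only ever sees the business-level method:

```java
// Hypothetical page object exposing a functional interface.
// The structural methods (enterUsername, enterPassword, submit)
// are kept private, as recommended above.
class LoginPage {
    private String enteredUsername;
    private String enteredPassword;
    private boolean submitted;

    private void enterUsername(String username) { this.enteredUsername = username; }
    private void enterPassword(String password) { this.enteredPassword = password; }
    private void submit() { this.submitted = true; }

    // The only public method: one call per business action.
    boolean login(String username, String password) {
        enterUsername(username);
        enterPassword(password);
        submit();
        return submitted && enteredUsername != null && enteredPassword != null;
    }
}
```

A test calling loginPage.login("alice", "secret") stays readable, and the page object is free to change its internal steps without breaking any tests.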
  • 20. Lists of ten (or so)
10 things you should be doing in your automation framework/API:
1. Alan Richardson [14] says stop thinking about a framework and think in terms of an API. Use a fluent interface and prefer humane [15] to minimal [16]. Test frameworks are one of the few use cases where the investment in creating a fluent API can be 'easily' justified [17].
2. Use builder objects to minimise hardwired data in the tests. "If it's not important for it to be in the test, it's important for it to not be in the test." [18] Use the builder pattern to hide unimportant details in test data. Using data models will improve readability and maintainability in your tests.
3. Write your own custom matchers for clarity and flexibility [19].
4. Use the singleton pattern for driver management. Use something like getDriver(), a static factory method that returns a singleton on a per-thread basis. Watch out for threading issues, though, and avoid this if you're unsure [20]. Don't fear design patterns, but don't overuse them either. The Selenium Capsules framework has some great ideas around applying patterns (notably the Decorator pattern), but this is a somewhat advanced topic [21].
5. Write database helper methods to verify DB state where appropriate.
6. Leverage Maven and Jenkins integration capabilities. Externalise properties and pass them as parameters to your Jenkins builds.
7. Use soft and hard asserts in the right places. This is a very powerful concept and is very much underused in testing today [22].
8. Decide on whether to use Spring to handle your page dependencies. This leads to much cleaner code and removes hard dependencies. However, be aware that it can make the framework harder to understand for newcomers.
9. Leverage Jenkins to automate your report generation, as well as WebDriver to screenshot on failure [23]. Build logging into page object methods to avoid clogging tests up with 'noise'.
Delegation is the key, not to people but to existing libraries; don't reinvent a proprietary wheel.
10. Reduce 'flakiness' by minimising tests. Keep the test scenarios as simple as possible and don't try to do too much in a single test. A neat trick is to capture cookies to simulate logins, as this avoids slow login calls for every test. Sometimes it's OK to break the rules, e.g. writing two methods instead of one 'generic' method. Remember, you don't always have to follow best practice, i.e. DRY code. Sometimes, keeping it simpler is better. The goal should be to help upskill your testers, where appropriate, and reduce the workload on developers. You should resist the temptation to show off your skills with HashMaps that are three levels deep or
  • 21. applying every design pattern under the sun. Keep in mind that it is not production code, but don't make it a second-class citizen either.
6 things you should not be doing in your automation framework/API:
1. Sleeping and implicit waits. Just don't do it. Period. It's slow and performance dependent. By the way, wrapping Thread.sleep() in a convenience method called waitAWhile(int) doesn't count as not sleeping in your tests.
2. Injecting JavaScript unnecessarily. It's a powerful technique when used correctly, but it can lead to browser compatibility problems that can prove difficult to diagnose and can lead to misleading failures on browser upgrades; e.g. document.getElementById("here").scrollIntoView() [24] can behave slightly differently on the various browsers. Remember, Chrome forces updates and this cannot be opted out of (unless you set up a firewall rule).
3. Using XPath for your locators. XPath locators should be avoided where possible: they are slow, hard to read and inconsistent across browsers. CSS selectors [25] are faster at runtime, and IDs are your friend.
4. Running all your tests through the UI. It's slow, brittle and not necessary for lots of data-driven scenarios. A little time spent selecting the right layer to run your tests at can pay off big time in the long run. If your business requires the use of Equivalence Partitioning and Boundary Value Analysis, these should be applied at a lower level such as the API, as they tend to generate large numbers of tests. Keep visibility of where testing is done all the way down the stack. Make informed decisions about where to place the tests. Push tests down the stack unless you have a reason not to. Keep the feedback loop short, while managing risk. Don't bypass business logic in any layer.
5. Over-generalising your test methods.
A lot of time can be spent on making your test methods reusable, and this is generally a good thing, but taking some time to determine a use case before doing so can save a lot of time creating 'reuseless' [26] code. This is an all too common anti-pattern, where methods are pushed into super classes for re-use even though no one ever reuses them. They end up cluttering the code and the API and lead to maintenance problems. Remember, declaring something as reusable does not make it so; somebody has to actually want to reuse it.
6. Avoid 'development in parallel'. This means that it does not make sense to develop a framework so complicated in nature as to rival the software it is designed to test. It's important to strike a balance between maintaining the framework/API and getting the job done. Ensure any frameworks/APIs are reviewed by architects/lead developers to catch such problems early on and avoid costly backtracking.
7. Avoid off-by-one errors, with thanks to Martin Fowler [27].
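As a hedged illustration of point 2 of the "things you should be doing" list (builder objects that hide unimportant test data), here is a minimal sketch. The User type, its fields and its defaults are all hypothetical; the idea is that a test only spells out the one detail it actually cares about.

```java
// Hypothetical test-data builder: defaults hide details that
// don't matter to the test at hand.
class User {
    final String name;
    final String role;
    final String country;

    private User(String name, String role, String country) {
        this.name = name;
        this.role = role;
        this.country = country;
    }

    static Builder aUser() { return new Builder(); }

    static class Builder {
        // Sensible defaults: unimportant details stay out of the test.
        private String name = "Test User";
        private String role = "standard";
        private String country = "IE";

        Builder withName(String name) { this.name = name; return this; }
        Builder withRole(String role) { this.role = role; return this; }

        User build() { return new User(name, role, country); }
    }
}
```

A test that only cares about permissions can then write User admin = User.aUser().withRole("admin").build(); and the name and country, which are irrelevant to that test, never appear in it.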
  • 22. Call to action
Please don't just read this paper and leave it at that. Thoughts are meaningless without action. Start stripping those locators from your tests and improve your existing automation approach. Your future self will thank you for it. Don't forget the three rules of good software automation:
o Abstraction
o Abstraction
o More abstraction
Remember, team-work is the key here. Help each other to learn new 'stuff'. Developers can teach testers how to leverage Maven and Jenkins. Testers can teach developers how to build up a suite of tests that can be leveraged for unit testing. One of the most effective techniques in testing is to push tests down the pyramid so feedback is provided earlier. If something is covered in a unit test, developers will find a defect in it even before they commit their code. If it is covered in a GUI test, it may not be found until a nightly build is run. That's not bad, but we should always be aiming to do better, shouldn't we?
Future Work
Following the process outlined here will help in implementing an effective software automation approach within your company, but there are always new things on the horizon that can be leveraged to improve the existing situation. Some current areas of interest for me include:
 Leveraging Java 8
o The lambdas [28] feature may allow for a significant reduction in the amount of code required to implement synchronisation in the WebDriver API, which currently requires a large number of anonymous inner classes. If these can be replaced with a simple passing of behaviour, it could prove very beneficial.
 Leveraging the Memento pattern [29] to remember the state of a web application. This could allow for dropping in and out of various points in test cases across complicated testing scenarios.
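A hedged sketch of the Java 8 idea: the Waiter helper below is hypothetical (it is not part of WebDriver), but it shows how a condition can be passed as a lambda instead of an anonymous inner class.

```java
import java.util.function.BooleanSupplier;

// Hypothetical polling helper. The caller passes behaviour (the condition)
// directly as a lambda; before Java 8 this would have required an
// anonymous inner class implementing a condition interface.
class Waiter {
    static boolean waitUntil(BooleanSupplier condition, long timeoutMillis) {
        long deadline = System.currentTimeMillis() + timeoutMillis;
        while (System.currentTimeMillis() < deadline) {
            if (condition.getAsBoolean()) {
                return true;
            }
            try {
                Thread.sleep(10); // short polling interval, not a fixed test sleep
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
                return false;
            }
        }
        return condition.getAsBoolean(); // one final check at the deadline
    }
}
```

A call such as Waiter.waitUntil(() -> page.isLoaded(), 5000) then replaces several lines of anonymous-inner-class boilerplate with a single expression.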
  • 23. References
[All links provided here were valid at the time of writing; last validated Oct-12-2014]
1. https://www.scrum.org/Resources/Scrum-Glossary/Definition-of-Done
2. http://programmer.97things.oreilly.com/wiki/index.php/Don%27t_Repeat_Yourself
3. http://dannorth.net/introducing-bdd/
4. http://cukes.info/
5. http://jbehave.org/
6. http://www.mountaingoatsoftware.com/blog/the-forgotten-layer-of-the-test-automation-pyramid
7. http://martinfowler.com/bliki/PageObject.html
8. http://www.satisfice.com/articles/what_is_et.shtml
9. Joshua Bloch, Effective Java, ISBN 0321356683
10. http://docs.oracle.com/javase/1.5.0/docs/guide/language/varargs.html
11. http://martinfowler.com/bliki/TwoHardThings.html
12. http://theamiableapi.com/2011/11/21/api-design-webinar-slides-and-recording/
13. http://www.martinfowler.com/bliki/FluentInterface.html
14. http://seleniumsimplified.com/2014/09/selenium-webdriver-page-object-abstractions-and-beyond/
15. http://www.martinfowler.com/bliki/HumaneInterface.html
16. http://www.martinfowler.com/bliki/MinimalInterface.html
17. http://genehughson.wordpress.com/2012/02/20/fluent-interfaces/
18. Gerard Meszaros, xUnit Test Patterns: Refactoring Test Code, ISBN 0131495054
19. Steve Freeman and Nat Pryce, Growing Object-Oriented Software, Guided by Tests, ISBN 0321503627
20. https://plumbr.eu/blog/how-to-shoot-yourself-in-foot-with-threadlocals
21. http://seleniumcapsules.blogspot.ie/
22. http://beust.com/weblog/2012/07/29/reinventing-assertions/
23. https://github.com/allure-framework/allure-core/wiki
24. https://developer.mozilla.org/en-US/docs/Web/API/Element.scrollIntoView
  • 24.
25. http://code.tutsplus.com/tutorials/the-30-css-selectors-you-must-memorize--net-16048
26. Scott Ambler, "Reuse Patterns and Antipatterns", Software Development Magazine, 2000, http://www.drdobbs.com/reuse-patterns-and-antipatterns/184414576?cid=Ambysoft
27. http://martinfowler.com/bliki/TwoHardThings.html
28. http://www.oracle.com/webfolder/technetwork/tutorials/obe/java/Lambda-QuickStart/index.html
29. http://en.wikipedia.org/wiki/Memento_pattern