Implementation of Agile Methodology in
Mobile Automation testing
(Software Quality Management)
Table of Contents:
Chapter 1: Introduction
Chapter 2: Project Overview
Chapter 3: Agile Methodologies
Chapter 4: Mobile Applications
Chapter 5: Mobile Automation Tools Assessment
Chapter 6: Mobile Automation Testing Framework
Chapter 7: Implementation of the Mobile Automation Testing Framework Using Appium
Chapter 8: Project Details and Implementation
Chapter 9: Results
Chapter 10: Conclusion
Chapter 11: Recommendations and Future Work
Abstract
Nowadays, the IT industry is shifting from desktop-based applications to mobile-based
applications. Mobile applications are more complex than ever before because of the
variety of platforms and devices, so a culture of quality in all parts of the organization is
essential for success. There are many STLC (software testing life cycle) models, but each
has its relative pros and cons. In the current scenario, the Agile methodology is widely
used in industry and yields good product quality. It is inspired by an empirical
inspect-and-adapt feedback loop to cope with complexity and risk. That is why the Scrum
method was chosen here for mobile automation testing.
This project automates the testing of a mobile application. Currently, almost every
application runs on a mobile platform in some form, i.e. as a native, web or hybrid
application. The quality of a mobile application is more critical than that of a desktop
application, because once a user has a bad experience with an application, he will not
return to it.
It is therefore important to test the functionality of these mobile applications, but with
each new release, where many new features are added to the existing system, test
management is a challenge.
A mobile application system has many different types of features. Testing such a large
system requires a lot of time, and manually testing the various scenarios of a mobile
application in a short interval of time is not feasible.
This dissertation deals with implementing the Scrum methodology in automation
framework development using Appium, so that mobile automation can be performed in
each release cycle. A key feature is the automation of scenarios that a user might not
have thought of during the execution phase of testing.
Broad Academic Area of Work:
Software Quality Management
Key Words: Appium, Agile methodology, Mobile Automation testing, Framework
development, Scrum, Software testing life cycle.
ACKNOWLEDGEMENT
This project would not have been possible without the support of many people. I
would like to acknowledge and extend my heartfelt gratitude to the following
persons who have made the completion of this project possible.
My Team lead Mr. Satinder Singh for his full support and guidance throughout
this project.
My friend Mr. Shiv Hari Singh for encouraging and supporting me to pursue this
project.
Dr. Rizwan Parveen from BITS Pilani for giving the feedback at various stages
of the Project which acted as a motivation for me to work on the project.
My testing team members who have helped me during various phases of the
project.
Last but not the least, I would like to express my love and gratitude to my
beloved family for their understanding & motivation through the duration of this
project.
LIST OF ABBREVIATIONS USED
TC: Test Case
TS: Test Script
POT: Proof of Testing
ENV: Environment
SDLC: Software Development Life Cycle
JDK: Java Development kit
SDK: Software Development kit
STLC: Software Testing Life Cycle
PBI: Product Backlog Item
UAT: User Acceptance Testing
XP: Extreme Programming
DSDM: Dynamic Systems Development Method
LD: Lean Development
FDD: Feature Driven Development
POM: Page Object Model
ADB: Android Debug Bridge
JRE: Java Runtime Environment
AVD: Android Virtual Device
Chapter 1 Introduction
1.1 Organizational Introduction:
XXXXX is an Indian multinational IT services company, headquartered in Noida,
Uttar Pradesh, India. It is a subsidiary of XXX Enterprise. Originally a research
and development division of XXX, it emerged as an independent company in 1991
when XXX ventured into the software services business. XXX Technologies offers
services including IT consulting, enterprise transformation, remote infrastructure
management, engineering and R&D, and business process outsourcing (BPO).
1.2 About the Product
It is a transportation and logistics mobile application used by drivers for loading/unloading
shipments and for status updates.
<<Not including a detailed product details due to company policies >>
1.3 Architecture
Figure 1: Technical architecture of our product
<<Removed the product details due to company policies >>
Chapter 2 Project Overview
2.1 Current mode of operation:
• Currently, in most of its automation projects, our organization uses the waterfall model for
the STLC.
• In the waterfall model, once an application is in the testing stage, it is very difficult to go back
and change something that was not well thought out in the concept stage.
• Changes in any requirement are very difficult to manage.
• Automation framework development starts only after manual testing is completed, which
consumes a lot of time.
• Companies use many existing models for developing automation frameworks, based on the
client's requirements and the size of the project. Some models are preferred over others
due to their properties and how well they match the client's needs.
| Features | Waterfall | V-shaped | Incremental | Spiral | RAD |
|---|---|---|---|---|---|
| Requirement specifications | Beginning | Beginning | Beginning | Beginning | Time-boxed release |
| Cost | Low | Expensive | Low | Expensive | Low |
| Simplicity | Simple | Intermediate | Intermediate | Intermediate | Very simple |
| Risk involvement | High | Low | Easily manageable | Low | Very low |
| Expertise | High | Medium | High | High | Medium |
| Flexibility to change | Difficult | Difficult | Easy | Easy | Easy |
| User involvement | Only at beginning | At the beginning | Intermediate | High | Only at the beginning |
| Flexibility | Rigid | Little flexible | Less flexible | Flexible | High |
| Maintenance | Least | Least | Promotes maintainability | Typical | Easily maintained |
| Duration | Long | According to project size | Very long | Long | Short |
2.2 Problem with existing solution:
• It is costly and time-consuming.
• If the user wants any change, the tester has to modify the test script from the start, which is difficult.
• In this model, user involvement occurs only at the beginning.
• It is not suitable for complex or moderately sized projects.
• This model is not flexible.
• Most importantly, it is a time-consuming process.
• Automation framework development faces the same issues as project development in the
traditional approach.
2.3 Suggested solution
Considering the different problems faced during automation framework development in
the existing approach to automation testing, the project includes the following aspects to
provide the user with the necessary model, overcome these problems, and deliver
additional advantages.
2.4 Objective
• To describe the agile methodology for mobile automation testing framework
development and turn it into a generic framework that can be reused in other projects.
• To cover scenarios which are not feasible to test manually (e.g. black-box testing
techniques).
2.5 Expected Features
The solution should bring the Scrum methodology into mobile automation testing using
Appium. The developed framework will contain some generic features which may be used in other
projects. Scrum has some basic features [1], as listed in the appendix.
2.6 Scope of the work
The scope of this project is to deal with the upcoming challenges during mobile automation
testing framework development and to come out with a unique development strategy, on
the basis of the tools available in the market, which will be a fairly generic solution for all
types of mobile application testing.
Chapter 3: Agile Methodologies
3.1 What is Agile?
Agile methodology is an alternative to traditional project management, typically used in
software development and testing. It helps teams respond to unpredictability through
incremental, iterative work cadences, known as sprints. Agile methodologies are an
alternative to waterfall, or traditional sequential development.
3.2 Why Agile?
Agile methodology provides opportunities to assess the direction of a project throughout the
development lifecycle. This is achieved through regular cadences of work, known as sprints or
iterations, at the end of which teams must present a potentially shippable product increment. By
focusing on the repetition of abbreviated work cycles as well as the functional product they yield,
agile methodology is described as "iterative" and "incremental."
The difference between the agile and traditional approaches is shown below in Table 2:
| Process Factor | Traditional | Agile |
|---|---|---|
| Measure of success | Conformance to plan | Response to change, working software |
| Management culture | Command and control | Collaborative |
| Requirement & design | Big upfront | Continuous; emergent |
| Planning & scheduling | Detailed; estimate time and resources; fix scope | Two levels – time fixed, estimate scope |
| Coding & implementation | Code all features in parallel, and test later | Code, test and deliver serially |
| Test & quality assurance | Big upfront plan; test late, after coding | Continuous; concurrent; test frequently & early |
3.3 Types of Agile Methodology
There are different types of agile methodologies [7], which are mentioned below in figure 3:
• Extreme Programming (XP)
• Dynamic Systems Development Method (DSDM)
• Crystal Methods
• Lean Development (LD)
• Kanban
• Scrum
Figure 3: Types of Agile Methodology
3.3.1 Extreme Programming (XP)
• One of the most popular and controversial agile methodologies.
• Aims at delivering high-quality software quickly and continuously.
• It promotes high customer involvement, rapid feedback loops, continuous testing,
continuous planning, and close teamwork to deliver working software at very frequent
intervals, typically every 1–3 weeks.
• XP is guided by values, principles and practices:
 XP values: Feedback, simplicity, communication, respect and courage.
 Principles driving Extreme Programming: Humanity, mutual benefit, economics,
improvement, diversity, self-similarity, opportunity, reflection, redundancy, flow,
baby steps, failure, quality, and accepted responsibility.
 13 practices that guide XP engineering: Whole team, energized work, sit
together, informative workspace, slack, stories, pair programming, weekly cycle,
quarterly cycle, continuous integration, ten-minute build, incremental design and
test-first programming.
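Test-first programming, the last practice listed above, can be sketched in plain Java: the assertion (the "test") is written before the method body exists, and just enough code is then added to make it pass. The `slugify` helper below is purely illustrative and not part of XP or this project.

```java
public class TestFirstDemo {
    // Written *after* the test in main() below: just enough logic to make it pass.
    static String slugify(String title) {
        return title.trim().toLowerCase().replaceAll("\\s+", "-");
    }

    public static void main(String[] args) {
        // The "test" is written first and drives the implementation above.
        String result = slugify("  Mobile Automation Testing ");
        if (!result.equals("mobile-automation-testing"))
            throw new AssertionError("expected slug, got: " + result);
        System.out.println("PASS");
    }
}
```

In a real XP cycle the same idea is expressed with a unit-testing framework such as JUnit or TestNG rather than a hand-rolled assertion.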
3.3.2 FDD (Feature Driven Development)
FDD describes five processes:
 Develop an overall model
 Build a feature List
 Plan by feature
 Design by feature
 Build by feature
3.3.3 Lean Development (LD)
 Closely associated with Kanban
 Its purpose is to visualize the workflow and optimize it
 It reduces the cycle time of delivering fully completed features.
 Eliminating Waste
 Amplifying Learning
 Deciding as Late as Possible
 Delivering as Fast as Possible
 Empowering the Team
 Building Integrity In
 It has 3 artefacts:
 Kanban Board
 Work-in-Progress Limit
 Lead Time
3.3.4 DSDM (Dynamic Systems Development Method)
 A straightforward framework based on best practices for implementing a project
structure.
 Simple
 Extendible
 But not claiming to be the solution to all kinds of projects.
 Requirements are prioritized using the MoSCoW rules:
M: Must have requirements
S: Should have, if at all possible
C: Could have, but not critical
W: Won't have this time, but potentially later
3.3.5 ASD (Adaptive Software Development)
 Focused on the rapid creation and evolution of software systems.
 ASD replaces the traditional waterfall cycle with a repeating series of speculate,
collaborate, and learn cycles.
 ASD has 3 steps, described briefly here:
• Speculate: Initiation and planning
• Collaborate: Concurrent feature development
• Learn: Quality review
3.3.6 Crystal
• Crystal promotes early and frequent delivery of working software
• High user involvement
• Adaptability
• Removal of bureaucracy and distractions
3.3.7 Scrum
• Scrum is the most popular way of introducing agility, due to its simplicity and
flexibility.
• It is based on iterative development.
• Scrum is a management and control process that cuts through complexity to
focus on building software that meets business needs.
3.4 Why the Scrum Methodology?
Scrum is another agile framework and has several key features, which are shown in
Figure 4 below and explained in detail [3]. Due to these features, Scrum was chosen for
this project.
3.4.1 Sprint:
The Scrum framework divides the product testing effort into iterations known as "Sprints", which
are time-boxed to a fixed length of 1–4 weeks.
• Every iteration attempts to build a potentially shippable (properly tested) product increment.
• The duration of the Sprint is decided by the team based on their requirements and
capabilities.
• The Sprint duration, once finalized, should not be modified.
3.4.2 Product Increment: At the end of every Sprint, the test team delivers a
potentially shippable product that is tested and works well. Since Sprints are short in duration, only
the most important features are developed first. This also gives customers a product with the basic
features that they can test and provide feedback on.
3.4.3 Product Backlog:
The set of all requirements, broken down into small work items and prioritized into a list, is called
the product backlog.
• The product backlog emerges over a period of time.
• Refinement of the product backlog is done during the product backlog refinement meeting.
• Basically, the product backlog holds the requirements for the project, and these requirements
may be refined during each Sprint.
3.4.4 Sprint Backlog: The Sprint Backlog contains all the known user stories (or
requirements) in the current Sprint.
• The requirements with the highest priority are listed first.
• The team pulls the stories into the Sprint and works on them collectively.
• The Sprint Backlog is basically a list of all the requirements that need to be completed during
the Sprint, ordered by priority.
3.4.5 User Stories:
• A user story captures the who, what and why of a requirement from the user's perspective.
• Detailed requirements in agile software development are captured in the form of user
stories, from the point of view of the user rather than the organization or project.
• E.g. As a customer <role>, I want to <action> so that I can <reason or goal>.
• User stories are short and concise statements.
• They are recorded on sticky notes, index cards, etc. so that they can be stuck on walls or
tables and rearranged or used during discussion.
3.4.6 Definition of Done:
The definition of done is a checklist of all exit criteria that must be completed by the team before
an item can be called done. A definition of done exists at the user story level, the Sprint level and
the release level.
3.4.7 Time Boxing:
Time boxing is the concept of a fixed time duration in which the team is expected to complete the
committed features of work. Every ceremony in Scrum is time-boxed as per the recommendations
given in the Scrum Guide.
3.4.8 Daily Stand-up Meeting:
In the Scrum methodology, teams hold a daily planning meeting called the "Daily Scrum Meeting",
"Scrum Meeting" or "Stand-up Meeting".
• In this meeting, each team member gives an update on 3 questions to the rest of the team,
and not specifically to the management.
• These questions are: What did I accomplish yesterday? What will I do today? And what is
stopping me from proceeding? This increases the visibility of the tasks to everyone in the
team.
• This meeting can also be used to raise any potential impediments that block the team from
accomplishing the Sprint goal.
• These meetings are not expected to last more than 15 minutes and are held at the same
time and place every day.
• A task board may be installed near the team's physical location where everyone can see the
tasks moving from one block to the other.
3.5 The 12 Agile Principles:
The Agile manifesto has 12 principles, which are listed in Table 3 below:
1. Our highest priority is to satisfy the customer through early and continuous delivery of
valuable software.
2. Welcome changing requirements, even late in development. Agile processes harness change for
the customer's competitive advantage.
3. Deliver working software frequently, from a couple of weeks to a couple of months, with a
preference to the shorter timescale.
4. Business people and developers must work together daily throughout the project.
5. Build projects around motivated individuals. Give them the environment and support they
need, and trust them to get the job done.
6. The most efficient and effective method of conveying information to and within a development
team is face-to-face conversation.
7. Working software is the primary measure of progress.
8. Agile processes promote sustainable development. The sponsors, developers, and users should
be able to maintain a constant pace indefinitely.
9. Continuous attention to technical excellence and good design enhances agility.
10. Simplicity--the art of maximizing the amount of work not done--is essential.
11. The best architectures, requirements, and designs emerge from self-organizing teams.
12. At regular intervals, the team reflects on how to become more effective, then tunes and
adjusts its behaviour accordingly.
Chapter 4: Mobile Applications
4.1 Types of Mobile applications:
4.1.1 Native application:
Native applications are platform specific and are developed in a platform-specific language.
Native apps live on the device and are accessed through icons on the device home
screen. Native apps are installed through an application store (such as Google Play or
Apple’s App Store).They can take full advantage of all the device features — they can
use the camera, the GPS, the accelerometer, the compass, the list of contacts, and so
on.
4.1.2 Web Application:
Web apps are not real applications; they are really websites that, in many ways, look
and feel like native applications, but are not implemented as such. They are run by a
browser and typically written in HTML5. Users first access them as they would access
any web page: they navigate to a special URL and then have the option of “installing”
them on their home screen by creating a bookmark to that page.
4.1.3 Hybrid application:
These application are combination of both native and web applications. They developed
in HTML5 and wrap into the native format. These application are also downloaded from
play store and installed on the devices. Ex: Flip kart.
4.2 Difference between Mobile and Desktop Application
Testing:
A few obvious aspects that set mobile app testing apart from desktop testing:
• On a desktop, the application is tested on a central processing unit. On a mobile device,
the application is tested on handsets such as Samsung, Nokia, Apple and HTC.
• Mobile device screens are smaller than desktop screens.
• Mobile devices have less memory than desktops.
• Mobiles use network connections like 2G, 3G, 4G or Wi-Fi, whereas desktops use
broadband or dial-up connections.
4.3 Types of Mobile App Testing:
To address all the above technical aspects, the following types of testing are performed
on Mobile applications.
• Usability testing– To make sure that the mobile app is easy to use and provides a
satisfactory user experience to the customers
• Compatibility testing– Testing of the application in different mobiles devices,
browsers, screen sizes and OS versions according to the requirements.
• Interface testing– Testing of menu options, buttons, bookmarks, history, settings,
and navigation flow of the application.
• Services testing– Testing the services of the application online and offline.
• Low-level resource testing– Testing of memory usage, auto-deletion of temporary
files, and local database growth issues is known as low-level resource testing.
• Performance testing– Testing the performance of the application by changing the
connection from 2G, 3G to WIFI, sharing the documents, battery consumption, etc.
• Operational testing– Testing of backups and recovery plan if battery goes down, or
data loss while upgrading the application from store.
• Installation tests– Validation of the application by installing /uninstalling it on the
devices.
• Security Testing– Testing an application to validate if the information system
protects data or not.
4.4 Technical features of Mobile applications:
Each of these types of apps has their advantages and disadvantages, these app are
developed as per the requirement and below are the features given:
Device features. Although web apps can take advantage of some features, native apps
(and the native components of the hybrid apps) have access to the full paraphernalia of
device-specific features, including GPS, camera, gestures, and notifications.
Offline functioning. A native app is best if your app must work when there is no
connectivity. In-browser caching is available in HTML5, but it’s still more limited than
what you can get when you go native.
Discoverability. Web apps win the prize on discoverability. Content is a lot more
discoverable on the web than in an app: When people have a question or an information
need, they go to a search engine, type in their query, and choose a page from the
search results. They do not go to the app store, search for an app, download it, and then
try to find their answer within the app. Although there are app aficionados who may fish
for apps in app stores, most users don’t like installing and maintaining apps (and also
wasting space on their device), and will install an app only if they expect to use it often.
Speed. Native apps win the speed competition. In 2012 Mark Zuckerberg declared that
Facebook’s biggest mistake had been betting on the mobile web and not going native.
Up to that point, the Facebook app had been a hybrid app with an HTML core; in 2012 it
was replaced with a truly native app. Responsiveness is key to usability.
Installation. Installing a native or hybrid app is a hassle for users: They need to be
really motivated to justify the interaction cost. “Installing” a web app involves creating a
bookmark on the home screen; this process, while arguably simpler than downloading a
new app from an app store, is less familiar to users, as people don’t use bookmarks that
much on mobile.
Maintenance. Maintaining a native app can be complicated not only for users, but also
for developers (especially if they have to deal with multiple versions of the same
information on different platforms): Changes have to be packaged in a new version and
placed in the app store. On the other hand, maintaining a web app or a hybrid app is as
simple as maintaining a web page, and it can be done as often or as frequently as
needed.
Platform independence. While different browsers may support different versions of
HTML5, if platform independence is important, you definitely have a better chance of
achieving it with web apps and hybrid apps than with native apps. As discussed before,
at least parts of the code can be reused when creating hybrid or web apps.
Content restrictions, approval process, and fees. Dealing with a third party that
imposes rules on your content and design can be taxing both in terms of time and
money. Native and hybrid apps must pass approval processes and content restrictions
imposed by app stores, whereas the web is free for all. Not surprisingly, the first web
apps came from publications such as Playboy, who wanted to escape Apple’s prudish
content censure. And buying a subscription within an iOS app means that 30% of that
subscription cost goes to Apple, a big dent in the publishers’ budget.
Development cost. It’s arguably cheaper to develop hybrid and web apps, as these
require skills that build up on previous experience with the web. NN/g clients often find
that going fully native is a lot more expensive, as it requires more specialized talent.
But, on the other hand, HTML5 is fairly new, and good knowledge of it, as well as a good
understanding of developing for the mobile web and hybrid apps are also fairly advanced
skills.
User Interface. Last but not least, if one of your priorities is providing a user
experience that is consistent with the operating system and with the majority of the
other apps available on that platform, then native apps are the way to go. That doesn’t
mean that you cannot provide a good mobile user experience with a web app or a hybrid
app — it just means that the graphics and the visuals will not be exactly the same as
those with which users may be already accustomed, and that it will be harder to take
advantage of the mobile strengths and mitigate the mobile limitations.
4.5 Mobile Application Testing Strategy
The Test strategy should make sure that all the quality and performance guidelines are
met. A few pointers in this area:
1. Selection of the devices: Analyze the market and choose the devices that are
widely used. (This decision mostly relies on the clients. The client or the app
builders consider the popularity factor of a certain devices as well as the
marketing needs for the application to decide what handsets to use for testing.)
2. Emulators: These are extremely useful in the initial stages of development, as
they allow quick and efficient checking of the app. An emulator is a system that
runs software from one environment in another environment without changing
the software itself. It duplicates the features and behaviour of the real system.
3. Types of Mobile Emulators
 Device Emulator- provided by device manufacturers
 Browser Emulator- simulates mobile browser environments.
 Operating systems Emulator- Apple provides emulators for iPhones, Microsoft
for Windows phones and Google Android phones
A few free and easy-to-use mobile device emulators are iPhone Tester, Mobile Phone
Emulator, Responsivepx, and Android 4.2.2.
Chapter 5: Mobile Automation tools Assessment
5.1 Tools selection:
There are various automation tools available in the market for mobile automation testing.
Each one has its own technique for automating the application.
We can categorize these tools into two groups: image-based automation tools and
object-based automation tools. Image-based tools use screen coordinates to identify
objects, which makes the automation device specific; the developed automation code can
be run only on the same device/emulator. On the other hand, object-based automation
tools use an object's properties to identify it, and a script developed using these tools can
be run on any device/emulator. Based on the features of the available tools, I created a
comparison matrix, shown below:
| Tool | Paid/Open source | Native apps | Web | Hybrid apps | Android | iOS | Windows | BlackBerry | Library/Tool |
|---|---|---|---|---|---|---|---|---|---|
| Robotium | Open source | Y | - | Y | Y | - | - | - | Library |
| Sikuli | Open source | Image based | Image based | Image based | Y | Y | Y | Y | Both |
| Selenium WebDriver | Open source | - | Y | - | Y | Y | - | - | Library |
| NativeDriver | Open source | Y | - | - | Y | Y | - | - | Library |
| Appium | Open source | Y | Y | Y | Y | Y | - | - | Tool |
| MonkeyTalk | Open source | Y | Y | Y | Y | Y | - | - | Tool |
| SeeTest | Paid | Y | Y | Y | Y | Y | Y | Y | Tool |
| M-eux (Jamo Solutions) | Paid | Y | - | Y | Y | Y | Y | Y | Tool |
| EggPlant | Paid | Image based | Image based | Image based | Y | Y | Y | Y | Tool |
| mAutomate | Paid | Y | Y | Y | Y | Y | - | - | Web based |
| Ranorex | Paid | Y | Y | Y | Y | Y | - | - | Tool |
As discussed above, image-based automation tools are specific to a device or emulator,
so using them would not be an effective approach. On the other side, there are many
object-based automation tools, which are listed in the table above. These are again
categorized into two groups: paid and open source. In this project I am considering
open-source tools. Appium is the best-suited tool because of its capability to automate all
types of mobile applications on most platforms, in comparison with the others.
5.2 Appium Introduction
Appium is part of Selenium 3.0. It is a set of different software tools, each with a
different approach to supporting test automation. The entire suite of tools results in a
rich set of testing functions specifically geared to the needs of testing hybrid and
native applications. These operations are highly flexible, allowing many options for
locating UI elements and comparing expected test results against actual application
behaviour.
 Appium is the cross-platform solution for native and hybrid mobile automation.
 It supports iOS, Android and FirefoxOS.
 We don't have to recompile our app or modify it in any way, due to the use of
standard automation APIs on all platforms.
 We can use any testing framework.
 Language support: Java, C#, Perl, Python and Ruby.
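As a sketch of how an Appium session is configured: the client sends the server a set of "desired capabilities", which are plain key-value pairs (JSON on the wire). The device name and apk path below are placeholders; in a real script the map would be wrapped in a `DesiredCapabilities` object and passed to an `AndroidDriver`, which is avoided here so the sketch stays runnable without the Appium client library.

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CapsDemo {
    public static void main(String[] args) {
        // Desired capabilities are plain key-value pairs sent to the Appium server.
        // Device name and apk path are placeholders, not values from this project.
        Map<String, Object> caps = new LinkedHashMap<>();
        caps.put("platformName", "Android");
        caps.put("deviceName", "emulator-5554");        // e.g. from `adb devices`
        caps.put("app", "/path/to/app-under-test.apk"); // apk to install and launch
        caps.put("automationName", "Appium");
        // In a real test these would be passed as:
        //   new AndroidDriver<>(new URL("http://127.0.0.1:4723/wd/hub"), capabilities);
        System.out.println(caps.get("platformName"));
    }
}
```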
5.3 Appium Architecture
5.4 Technical Specification:
| Feature | Appium |
|---|---|
| Language support | Java, C#, Ruby, Python, Perl |
| Windows (non-browser) application support | No |
| App support | Web, native and hybrid |
| Environment support | Windows, Mac OS |
| Platform support | Android, iOS and FirefoxOS |
| Framework | Appium + Selenium 2.0 jar + Eclipse + TestNG + Android SDK |
| Object recognition / storage | UI Automator Viewer |
| Software cost | Zero |
| Coding experience of engineer | Should be very good |
| Script creation time | High for the first time (relatively less after the standards are established) |
| Hardware resource (CPU + RAM) consumption | High |
| Product support | Open source; no dedicated support |
5.5 Technical Requirements
| S. No. | Technical requirement | Remarks |
|---|---|---|
| 1 | Appium server | Standalone server which sends the test code to the connected mobile device/emulator |
| 2 | Selenium WebDriver (Java) | Latest version of the open-source automation tool |
| 3 | Java | Programming language selected for scripting |
| 4 | Eclipse IDE | Script development environment |
| 5 | Object identification tool | UI Automator Viewer, which comes with the Android SDK |
| 6 | TestNG | An open-source automation testing framework |
| 7 | .NET Framework 4.5 | Required by the Appium tool |
| 8 | AutoIt | Tool to handle Windows popups |
5.6 Benefits:
• Support for both platforms iOS and android.
• Appium is an open source tool that doesn’t cost anything.
• Scalable with standard desktop systems without cost for user licenses
• Support for continuous integration.
• Appium WebDriver supports many popular programming languages like Java,
C#, PHP, Perl, Ruby, Python, etc.
• Supports many test environments.
• Native Xpath support, if your html is complicated / full of nesting / does not have
id attributes, then this could be very important.
• Doesn't require access to your source code or library: you test the same build that
you will actually ship.
• Support for various frameworks.
5.7 Challenges
• Demands higher technical competencies.
• Being open source, Appium has no official technical support.
• No built-in object repository concept.
• Doesn't support image comparison.
• Handling popup/dialog/menu windows is sometimes tricky.
• Barcode scanning can't be automated.
Chapter 6: Mobile Automation Testing Framework
There are mainly two types of framework popular in mobile automation testing.
6.1 Page Object Model(POM):
Page Object Model is a design pattern for creating an object repository for
UI elements. Under this model, for each page (or screen) in the application there
should be a corresponding page class. This page class finds the
WebElements of that page and also contains page methods which
perform operations on those WebElements.
The names of these methods should reflect the tasks they perform.
The main advantage of the Page Object Model is that if the UI changes for any
page, we don't need to change any tests; we only need to change
the code within the page objects (in one place).
In the Page Object Model, all the functionalities/reusable components
of a page that we want to automate are written in a separate class. For example,
we might model four pages: the Home page, Login page, Create Account page and
Forgot Password page.
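A minimal sketch of the pattern in plain Java (the `Driver` interface and the locator strings below are illustrative stand-ins for the Appium/Selenium API, not real library types):

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical driver abstraction standing in for the Appium/Selenium WebDriver API.
interface Driver {
    void type(String locator, String text);
    void tap(String locator);
}

// Each screen of the app gets one page class that owns its locators
// and exposes business-level methods instead of raw element access.
class LoginPage {
    // If the UI changes, only these locators change; the tests stay intact.
    private static final String USERNAME  = "id=username";
    private static final String PASSWORD  = "id=password";
    private static final String LOGIN_BTN = "id=login";

    private final Driver driver;

    LoginPage(Driver driver) { this.driver = driver; }

    void loginAs(String user, String password) {
        driver.type(USERNAME, user);
        driver.type(PASSWORD, password);
        driver.tap(LOGIN_BTN);
    }
}

public class PomDemo {
    public static void main(String[] args) {
        // Fake driver that records actions, standing in for a real device session.
        Map<String, String> actions = new HashMap<>();
        Driver fake = new Driver() {
            public void type(String locator, String text) { actions.put(locator, text); }
            public void tap(String locator) { actions.put(locator, "<tapped>"); }
        };
        new LoginPage(fake).loginAs("driver1", "secret");
        System.out.println(actions.get("id=username"));
        System.out.println(actions.get("id=login"));
    }
}
```

The test code only ever calls `loginAs(...)`, so a changed locator is fixed in exactly one place.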
6.2 Page Factory Model:
Page Factory is an inbuilt Page Object Model concept for Appium/Selenium, but it
is much more optimized.
Here as well we follow the separation of the page object repository
and the test methods. Additionally, with the help of the PageFactory class, we use
the @FindBy annotation to locate WebElements and the initElements method to
initialize them.
@FindBy can accept tagName, partialLinkText, name, linkText, id, css,
className and xpath as attributes.
I will explain more about these framework during implementation part.
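To illustrate the mechanism, the sketch below re-creates in miniature what initElements does under the hood: reflectively scan the page object's fields, read each @FindBy annotation, and inject the located element. The @FindBy and MiniPageFactory types here are simplified stand-ins, not the real Selenium/Appium implementation (which injects WebElement proxies rather than plain values).

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Field;
import java.util.Map;

// Simplified stand-in for Selenium's @FindBy; only an id locator here.
@Retention(RetentionPolicy.RUNTIME)
@interface FindBy {
    String id();
}

// Miniature version of PageFactory.initElements: for every annotated
// field on the page object, look the element up and inject it.
class MiniPageFactory {
    static void initElements(Map<String, String> elementsById, Object page) {
        try {
            for (Field f : page.getClass().getDeclaredFields()) {
                FindBy locator = f.getAnnotation(FindBy.class);
                if (locator != null) {
                    f.setAccessible(true);
                    f.set(page, elementsById.get(locator.id()));
                }
            }
        } catch (IllegalAccessException e) {
            throw new RuntimeException(e);
        }
    }
}

// Page object: fields carry their locators and are populated by
// initElements, instead of explicit findElement calls in every method.
class LoginScreen {
    @FindBy(id = "txtUserName") String userField;
    @FindBy(id = "txtPassword") String passwordField;
}
```

In the real framework the injected type is a WebElement (looked up lazily through the driver), and @FindBy supports all the locator attributes listed above; the annotation-plus-injection structure, however, is the same.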
Chapter 7: Implementation of the Mobile automation
testing framework using Appium
7.1 Environment setup
Step 1
Download and install the JDK from
http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html
Validate the installed Java version, then set the environment variables for Java:
JAVA_HOME = C:\Program Files\Java\jdk1.8.0_101
JRE_HOME = C:\Program Files\Java\jre1.8.0_101
Edit Path, appending: C:\Program Files\Java\jdk1.8.0_101\bin;
Step 2
Download the Android SDK from https://developer.android.com/studio/index.html
and install it.
Edit Path, appending: C:\Users\user\AppData\Local\Android\sdk\platform-tools;C:\Users\user\AppData\Local\Android\sdk\tools
Step 3: Validate that the AVD (Android Virtual Device) manager has been installed with the Android SDK.
Navigate to C:\Users\user\AppData\Local\Android\sdk.
There will be an icon named AVD Manager; click on it.
Validate that the UI Automator Viewer has also been installed:
run the command uiautomatorviewer in a command prompt.
This tool helps us identify the object properties of the native application.
Step 4:
Download and install the .NET Framework 4.5 from https://www.microsoft.com/en-in/download/details.aspx?id=30653
This is essential because the Appium server for Windows is built on .NET.
Step 5: Download and install Node.js from
https://nodejs.org/en/download/
Step 6:
Download and install Appium from http://appium.io/downloads.html
Step 7: Download Java IDE (Eclipse) from
http://www.eclipse.org/downloads/download.php?file=/technology/epp/downloads/relea
se/mars/R/eclipse-jee-mars-R-win32-x86_64.zip
Step 8:
Download the Selenium and Appium JAR files:
1. java-client-4.1.2 from
https://mvnrepository.com/artifact/io.appium/java-client/4.1.2
2. gson-2.2.4 from
http://www.java2s.com/Code/Jar/g/Downloadgson224sourcesjar.htm
3. Selenium JAR files from http://www.seleniumhq.org/download/
I will use all of these JAR files in the mobile automation framework development.
Step 9:
Download and install the TestNG plugin in Eclipse from the Eclipse Marketplace.
Chapter 8: Project Details and Implementation
8.1 Introduction
The steps below describe the implementation of mobile automation testing using a traditional and an agile approach, and a comparison between them.
8.2 Hardware Requirements
Any Windows machine that matches the following criteria:
Processor: Pentium 1 GHz or higher
RAM: 1 GB for x86 / 2GB for x64
System Type: 32 bit Operating System/64 Bit Operating System
8.3 Software Requirements
.NET Framework 4.
Windows XP with SP3 or higher.
8.4 Performing END to END testing Cycle of mobile application using
Automation.
Step 1: Connect the device with system and validate the same:
After connecting the device to the system, open a command window and type the command "adb devices"; it will list all connected devices.
Note: this image is a mirror of the actual connected device, generated using the Mobizen software.
Step 2: Start the Appium Server.
Step 3: Setup the Appium server:
Step 4: Open the eclipse and create the Project:
Sample Code:

package com.framework.test;

import java.io.File;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.List;
import java.util.concurrent.TimeUnit;

import org.openqa.selenium.By;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.testng.annotations.Test;

import io.appium.java_client.android.AndroidDriver;

public class Login {
    AndroidDriver driver;

    @Test
    public void testapp() throws MalformedURLException, InterruptedException {
        // Capabilities tell the Appium server which device, platform and app to target.
        DesiredCapabilities capability = new DesiredCapabilities();
        capability.setCapability("deviceName", "MOTO G");
        capability.setCapability("platformName", "Android");
        capability.setCapability("platformVersion", "6.0");
        File apk = new File("C:\\Users\\khoiwalk\\workspace\\MatixMobile\\apk\\corpAndroidMatrixMobile-qa-debug.apk");
        capability.setCapability("app", apk.getAbsolutePath());

        // Connect to the running Appium server.
        driver = new AndroidDriver(new URL("http://10.69.147.82:4725/wd/hub"), capability);
        driver.manage().timeouts().implicitlyWait(5, TimeUnit.SECONDS);

        // Log in.
        Thread.sleep(15000);
        driver.findElement(By.id("corp.android.MatrixMobile.QA:id/txtUserName")).sendKeys("nkhanna");
        Thread.sleep(10000);
        driver.findElement(By.id("corp.android.MatrixMobile.QA:id/txtPassword")).sendKeys("Qwerty$612");
        driver.findElement(By.id("corp.android.MatrixMobile.QA:id/btnLogIn")).click();
        Thread.sleep(20000);

        // Fetch a trip and open its details.
        driver.findElement(By.id("corp.android.MatrixMobile.QA:id/btnGetTrip")).click();
        driver.findElement(By.className("android.widget.EditText")).sendKeys("97551416");
        driver.findElement(By.id("android:id/button1")).click();
        Thread.sleep(5000);
        driver.findElement(By.id("corp.android.MatrixMobile.QA:id/btnTripDetails")).click();
        Thread.sleep(10000);
        List<WebElement> items = driver.findElements(By.className("android.widget.TwoLineListItem"));
        items.get(2).click();
    }
}
Step 5: Start the Appium server.
Step 6: Run the test script using TestNG.
Step 7: Monitor the execution.
Step 8: Validate the Test result.
All of the above steps stay the same when creating and executing tests for each sprint.
Three sprints of Project A were created, and user stories were divided based on their complexity (Complex, Medium, Simple). These sprints are shown in the table below.

Sprint                 | User Stories | Story Points | No. of Test Cases
Project # A (Sprint 1) | 4            | 7            | 16
Project # A (Sprint 2) | 7            | 8            | 19
Project # A (Sprint 3) | 10           | 18           | 23
Chapter 9: Results
9.1 Result
As the title of the dissertation says, we evaluate mobile application automation testing under an agile methodology, estimating project quality by monitoring the defect trend for functional automation testing.
Here, functions are tested by entering a search item and examining the search results in the mobile application; other functionality is rarely considered.
Mobile testing was conducted in phases, as we implemented an iterative model in which mobile testing went on side by side. Analysis of a set of data revealed the following issues with the earlier model:
 No formal test planning; testing always started late in the development process, with inadequate resources.
 Review and test effectiveness and efficiency were not known until UAT started.
 Communication barriers among stakeholders.
 Existing defect reporting was neither structured nor effective.
 The draft of the test method was given a positive response by the interviewees but should still be slightly changed. The most important areas to introduce today are:
o Define responsibility for testing
o Define the testing lifecycle
o Improvement in the construction process
o A defect reporting system
o Sub-process monitoring for review effectiveness and efficiency
o Test dashboards
Due to the above issues, the new methodology was implemented and measured using a set of metrics for both the earlier model and the new model. The key metrics for the testing project measurement template are listed in Table 5, and the analysis covers pre and post implementation of the evaluation:
9.2 Key Metrics for Testing Project Measurement:

S. No. | Metric Name | Formula | Project Goal | Upper Limit | Lower Limit
1 | Schedule Variance (%) | (Actual Calendar Days - Planned Calendar Days + Start Variance) / Planned Calendar Days * 100 | 5% | 10% | -10%
2 | Effort Variance (%) | (Actual Effort - Planned Effort) / Planned Effort * 100 | 0% | 5% | -5%
3 | Test Script Coverage (%) | No. of Test Scripts Originally Written / (No. of Test Scripts Originally Written + No. of Test Scripts Added During Execution) | 95% | 100% | 85%
4 | Productivity in Test Script Preparation | Actual No. of Test Scripts / Actual Effort Spent on Test Script Preparation | 5 | 6 | 3
5 | Productivity in Test Case Execution | Actual No. of Test Cases (Planned + Ad hoc) / Actual Effort Spent on Testing | 6 | 8 | 4
6 | Productivity in Defect Detection (Defects/PD) | Actual No. of Defects in Testing / Actual Effort Spent on Testing | 4 | 5 | 3
7 | Review Effectiveness | No. of Errors Detected in Review / (No. of Errors Detected in Review + Total No. of Defects Detected in Testing) * 100 | 70% | 90% | 60%
8 | Testing Effectiveness | Internal Defect Removal Effectiveness = No. of Defects Detected in Internal Testing / (No. of Defects Detected in Internal Testing + No. of UAT Defects) * 100 | 95% | 100% | 90%
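These formulas are simple ratios, so they can be written out directly. The sketch below encodes four of them and can be checked against the sprint figures reported later in this chapter; the class and method names are my own, introduced only for illustration.

```java
// Key testing metrics from Table 5, written out as plain formulas.
class TestMetrics {

    // Schedule Variance (%) = (Actual - Planned + Start Variance) / Planned * 100
    static double scheduleVariance(double plannedDays, double actualDays, double startVariance) {
        return (actualDays - plannedDays + startVariance) / plannedDays * 100.0;
    }

    // Effort Variance (%) = (Actual Effort - Planned Effort) / Planned Effort * 100
    static double effortVariance(double plannedEffort, double actualEffort) {
        return (actualEffort - plannedEffort) / plannedEffort * 100.0;
    }

    // Test Coverage (%) = Originally Written / (Originally Written + Added During Execution) * 100
    static double testCaseCoverage(int originallyWritten, int addedDuringExecution) {
        return 100.0 * originallyWritten / (originallyWritten + addedDuringExecution);
    }

    // Internal Defect Removal Effectiveness (%) = Internal / (Internal + UAT) * 100
    static double internalDefectRemovalEffectiveness(int internalDefects, int uatDefects) {
        return 100.0 * internalDefects / (internalDefects + uatDefects);
    }
}
```

For example, scheduleVariance(16, 15, 0) gives -6.25 and testCaseCoverage(15, 1) gives 93.75, matching the Sprint 1 rows of the post-implementation tables.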
9.2.1 Metrics Collection: Schedule Variance Evaluation
Pre Implementation
Schedule Variance = (Actual Calendar Days - Planned Calendar Days + Start Variance) / (Planned Calendar Days) * 100 (Attached Table 6)

Project#   | Planned Calendar Days | Actual Calendar Days | Start Variance | Schedule Variance (%) | Project Goal (%) | Upper Limit (%) | Lower Limit (%)
Project# A | 47                    | 50                   | 1              | 8.51%                 | 5.00%            | 10.00%          | -10.00%

Root Cause: 1. Estimation is not effective and there is a deficiency in the planning components.
Suggested Improvements & Preventive Action: 1. Estimation review. 2. Introducing a burn-down chart on a daily basis.
Post Implementation
Schedule Variance = (Actual Calendar Days - Planned Calendar Days + Start Variance) / (Planned Calendar Days) * 100 (Attached Table 7)

Project#              | Planned Calendar Days | Actual Calendar Days | Start Variance | Schedule Variance (%) | Project Goal (%) | Upper Limit (%) | Lower Limit (%)
Project# A (Sprint 1) | 16                    | 15                   | 0              | -6.25%                | 5.00%            | 10.00%          | -10.00%
Project# A (Sprint 2) | 13                    | 13                   | 0              | 0.00%                 | 5.00%            | 10.00%          | -10.00%
Project# A (Sprint 3) | 18                    | 19                   | 0              | 5.56%                 | 5.00%            | 10.00%          | -10.00%
9.2.2 Metrics Collection: Effort Variance Evaluation
Pre Implementation
Effort Variance = (Actual Effort - Planned Effort) / (Planned Effort) * 100 (Attached Table 8)

Project#   | Planned Efforts (P Days) | Actual Efforts (P Days) | Effort Variance (%) | Project Goal (%) | Upper Limit (%) | Lower Limit (%)
Project# A | 195                      | 208                     | 6.67%               | 0.00%            | 5.00%           | -5.00%

Root Cause: 1. Estimation is not effective and there is a deficiency in the planning components. 2. Rework during execution of test scripts.
Suggested Improvements & Preventive Action: 1. Estimation review. 2. Introducing a burn-down chart on a daily basis.
Post Implementation
Effort Variance = (Actual Effort - Planned Effort) / (Planned Effort) * 100 (Attached Table 9)

Project#              | Planned Efforts (P Days) | Actual Efforts (P Days) | Effort Variance (%) | Project Goal (%) | Upper Limit (%) | Lower Limit (%)
Project# A (Sprint 1) | 65                       | 64                      | -1.54%              | 0.00%            | 5.00%           | -5.00%
Project# A (Sprint 2) | 55                       | 55                      | 0.00%               | 0.00%            | 5.00%           | -5.00%
Project# A (Sprint 3) | 75                       | 73                      | -2.67%              | 0.00%            | 5.00%           | -5.00%
9.2.3 Metrics Collection: Test Case Coverage Evaluation
Pre Implementation
Test Case Coverage = (No. of Test Cases Originally Written / (No. of Test Cases Originally Written + No. of Test Cases Added During Execution)) (Attached Table 10)

Project#   | # of Test Cases Originally Written | # of Test Cases Added During Execution | Total # of Test Cases | Test Coverage % | Project Goal % | Upper Limit % | Lower Limit %
Project# A | 55                                 | 7                                      | 62                    | 88.71%          | 95.00%         | 100.00%       | 85.00%

Root Cause: 1. Inadequate/ambiguous requirements. 2. Team productivity is low due to new technologies; the resources are new to the project and the domain is complex.
Suggested Improvements & Preventive Action: 1. Non-functional requirements should be captured during the requirements study. 2. Training/knowledge programs.
Post Implementation
Test Case Coverage = (No. of Test Cases Originally Written / (No. of Test Cases Originally Written + No. of Test Cases Added During Execution)) (Attached Table 11)

Project#              | # of Test Cases Originally Written | # of Test Cases Added During Execution | Total # of Test Cases | Test Coverage % | Project Goal % | Upper Limit % | Lower Limit %
Project# A (Sprint 1) | 15                                 | 1                                      | 16                    | 93.75%          | 95.00%         | 100.00%       | 85.00%
Project# A (Sprint 2) | 18                                 | 1                                      | 19                    | 94.74%          | 95.00%         | 100.00%       | 85.00%
Project# A (Sprint 3) | 22                                 | 1                                      | 23                    | 95.65%          | 95.00%         | 100.00%       | 85.00%
9.2.4 Metrics Collection: Productivity in Test Script Preparation Evaluation
Pre Implementation
Productivity in Test Script Preparation = Actual No. of Test Scripts / Actual Effort Spent on Test Script Preparation (Attached Table 12)

Project#   | Test Script Size | Preparation Effort (Person Hours) | Productivity (TS per Hour) | Project Goal (TS per Hour) | Upper Limit (TS per Hour) | Lower Limit (TS per Hour)
Project# A | 62               | 17                                | 5.41                       | 4                          | 6                         | 3

Root Cause: 1. Inadequate/ambiguous requirements on security features.
Suggested Improvements & Preventive Action: 1. Non-functional requirements should be captured during the requirements study.
Post Implementation
Productivity in Test Case Preparation = Actual No. of Test Cases / Actual Effort Spent on Test Case Preparation (Attached Table 13)

Project#              | Test Case Size | Preparation Effort (Person Hours) | Productivity (TC per Hour) | Project Goal (TC per Hour) | Upper Limit (TC per Hour) | Lower Limit (TC per Hour)
Project# A (Sprint 1) | 17             | 4                                 | 4.25                       | 5                          | 6                         | 3
Project# A (Sprint 2) | 20             | 5                                 | 4                          | 5                          | 6                         | 3
Project# A (Sprint 3) | 25             | 6                                 | 4.17                       | 5                          | 6                         | 3
Productivity in Test Case Execution Evaluation
Pre Implementation
Productivity in Test Case Execution = Actual No. of Test Cases (Planned + Ad hoc) / Actual Effort Spent on Testing (Attached Table 14)

Project#   | Number of Test Scripts | Execution Effort (Person Hours) | Productivity (TC per Hour) | Project Goal (TC per Hour) | Upper Limit (TC per Hour) | Lower Limit (TC per Hour)
Project# A | 62                     | 10                              | 6.2                        | 9                          | 10                        | 7

Root Cause: 1. Inadequate test environment set-up due to lack of test planning. 2. Team productivity is low due to functionality evolving during the construction and testing phases. 3. Team productivity is low due to new technologies; the resources are new to the project and the domain is complex.
Suggested Improvements & Preventive Action: 1. Implementation of the Scrum testing methodology, including test requirements and test preparation. 2. FSD to be frozen and signed off by the customer in the sprint before the design phase. 3. Training/knowledge upgrade programs.
Post Implementation
Productivity in Test Case Execution = Actual No. of Test Cases (Planned + Ad hoc) / Actual Effort Spent on Testing (Attached Table 15; for sprint-wise numbers of test cases, see Table 4)

Project#              | Number of Test Cases | Execution Effort (Person Hours) | Productivity (TC per Hour) | Project Goal (TC per Hour) | Upper Limit (TC per Hour) | Lower Limit (TC per Hour)
Project# A (Sprint 1) | 17                   | 2                               | 8.5                        | 6                          | 8                         | 4
Project# A (Sprint 2) | 20                   | 3.0                             | 6.6                        | 6                          | 8                         | 4
Project# A (Sprint 3) | 25                   | 4                               | 6.25                       | 6                          | 8                         | 4
9.2.5 Metrics Collection: Productivity in Defect Detection (Defects/PD) Evaluation
Pre Implementation
Productivity in Defect Detection = Actual No. of Defects in Testing / Actual Effort Spent on Testing (Attached Table 16)

Project#   | # of Total Defects in Testing | Testing Effort (Person Days) | Defect Detection Productivity (Defects per Person Day) | Project Goal | Upper Limit | Lower Limit
Project# A | 211                           | 73                           | 2.89                                                   | 4            | 5           | 3

Root Cause: 1. Longer time to set up the environment. 2. Test data issues faced during execution. 3. Run-time script maintenance.
Suggested Improvements & Preventive Action: 1. Use proper guidelines for the framework, which will help ease maintenance. 2. Review the automation code on a regular basis to avoid run-time maintenance.
Post Implementation
Productivity in Defect Detection = Actual No. of Defects (Review + Testing) / Actual Effort Spent on (Review + Testing) (Attached Table 17)

Project#              | # of Total Defects in Testing | Testing Effort (Person Days) | Defect Detection Productivity (Defects per Person Day) | Project Goal | Upper Limit | Lower Limit
Project# A (Sprint 1) | 72                            | 19                           | 3.79                                                   | 4            | 5           | 3
Project# A (Sprint 2) | 55                            | 15                           | 3.67                                                   | 4            | 5           | 3
Project# A (Sprint 3) | 84                            | 20                           | 4.2                                                    | 4            | 5           | 3
9.2.6 Metrics Collection: Testing Effectiveness Evaluation
Pre Implementation
Internal Defect Removal Effectiveness = (No. of Defects Detected in Internal Testing / (No. of Defects Detected in Internal Testing + No. of UAT Defects)) * 100 (Attached Table 20)

Project#   | Regression Testing | UAT | Internal Defect Removal Effectiveness | Project Goal (%) | Upper Limit (%) | Lower Limit (%)
Project# A | 221                | 29  | 84.4%                                 | 95.00%           | 100.00%         | 90.00%

Root Cause: Lack of process.
Suggested Improvements & Preventive Action: Introduced daily stand-up meetings in the sprint.
Post Implementation
Internal Defect Removal Effectiveness = (No. of Defects Detected in Internal Testing / (No. of Defects Detected in Internal Testing + No. of UAT Defects)) * 100 (Attached Table 21)

Project#              | Regression Testing | UAT | Internal Defect Removal Effectiveness | Project Goal (%) | Upper Limit (%) | Lower Limit (%)
Project# A (Sprint 1) | 18                 | 1   | 94.73%                                | 95.00%           | 100.00%         | 90.00%
Project# A (Sprint 2) | 25                 | 0   | 100.00%                               | 95.00%           | 100.00%         | 90.00%
Project# A (Sprint 3) | 35                 | 2   | 94.95%                                | 95.00%           | 100.00%         | 90.00%
Conclusion:
After implementation and analysis of the above metrics, the conclusion is summarized in the matrix below:

Keys                                          | Project# A Pre Implementation | Project# A Post Implementation
Schedule Variance (%)                         | 8.51%                         | -0.23%
Effort Variance (%)                           | 6.67%                         | -1.40%
Test Case Coverage (%)                        | 88.71%                        | 94.71%
Productivity in Test Case Preparation         | 5.41                          | 4.14
Productivity in Test Case Execution           | 6.2                           | 7.11
Productivity in Defect Detection (Defects/PD) | 2.89                          | 3.89
Testing Effectiveness                         | 94.40%                        | 96.56%
[Chart: Project# A pre-implementation vs. post-implementation values for the metrics above]
The table above shows the following comparisons:
1. Schedule variance decreased after implementing the agile approach in Project A.
2. Effort variance decreased after implementing the agile approach in Project A.
3. Test case coverage increased after implementing the agile approach in Project A.
4. Productivity in test case preparation increased after implementing the agile approach in Project A.
5. Productivity in test case execution increased after implementing the agile approach in Project A.
6. Defect detection increased after implementing the agile approach in Project A.
7. Testing effectiveness increased after implementing the agile approach in Project A.
Chapter 10 Conclusion
The comparison above between pre- and post-implementation mobile automation testing results, that is, between the pre-implementation Project A and the post-implementation sprints of Project A, covers the parameters listed in the rows of the table.
After implementation of the sprint-based project, it is clear how the sprints improved the testing results and reduced defect leakage.
Still, many regression issues were found in the testing phase. Some of these issues, stated generically, are:
1. The Android version changes continuously, so we face configuration issues.
2. Framework maintenance is required regularly.
3. There were time-out errors for some operations during mobile app testing.
4. Some performance issues were also found during testing.
Apart from these, there are other issues that we expect to surface once the approach has been adopted by the whole team; we are using it to make the system better.
In this dissertation, an effort is made to project quality by monitoring the defect trend for the mobile application and to minimize the human effort involved in mobile application testing.
It has been observed that the Scrum software testing methodology supports quickly adapting a project to changing business realities, so the Scrum methodology was used in this software project, and it proved successful for a long-term project.
Regression testing, which is done in every release, is time consuming, so it was chosen first; implementing Scrum there reduced a significant amount of the team's time and is more effective.
Chapter 11 Recommendations and Future works
This methodology can be used from different perspectives for different operating system simulators. Users just need to apply the same methodology to any upcoming OS with any new technology. The methodology is intelligent enough to evaluate quality and added features in a new application.
Mobile application usage is increasing across the industry, and a similar trend has been observed for telecommunication service providers. Since many operating systems work together at the back end, the correct functioning of mobile apps is important not only for the system itself but also for collaborations with external partners.
Some points for future work are as follows:
 Testing support for emulators/simulators.
 The automation framework can be modified for future projects.
 The mobile automation framework can be extended to web and hybrid mobile applications.
 Compatibility testing for mobile applications can be achieved using this framework.
 Network/localization testing can also be achieved using a cloud-based mobile solution.
Appendix I
The Scrum methodology has different features from different perspectives; a few of them are mentioned below:
Features of Scrum from the Client Perspective
Ultimately, testers have to deliver a product that satisfies the customer. Too many projects get caught up in the overhead of administering the project, delays in shipping because of poorly written requirements, and subpar products that only meet minimal requirements (if that). These are the features of Scrum that will help create "wow" moments for our clients:
• Delight our clients by building exactly what they want
• The team is able to quickly deliver the most important features first
• Support our business partners by delivering value in short cycles
• Scrum is priority driven; what the team is working on adapts to meet the current business
need
• Change is embraced in order to better meet true business needs
Features of Scrum from the Organization Perspective
Satisfied customers who demand our products are the lifeblood of our business. A repeatable
process that delivers products which fuel this demand is like having a goose that lays the golden egg.
In this sense, these features of Scrum will fit nicely into our organization.
• Builds continuous innovation into our organization
• Creates order out of chaos
• Scrum is a cultural shift
Features of Scrum from the Management Perspective
As important as the organization and customer perspectives are, a new process doesn’t make sense
if it creates a management nightmare. Fortunately, Scrum features some important characteristics
that shift the management paradigm in a positive manner.
• Control shifts from management to the team; the team is now responsible for the working
product
• Helps everyone understand how much work the team can do in a given timeframe
• Creates an environment where the team can manage themselves
Features of Scrum from the Product Perspective
In the Agile methodology, products just move along the Testing pipeline faster. The unique nature of
the sprint in Scrum’s framework ensures that a version of the product is always ready to ship.
Scrum’s other features as referenced to the product are:
• Due to collaboration, our product gets better
• Due to the real-time inspect-and-adapt loop, the team is able to deliver exactly what is
needed
• The product becomes more valuable because it does exactly what the client wants it to do
• Scrum provides early feedback
• Scrum supports predictability of our Testing process
• There is always a shippable product; in the worst case we revert a single sprint's worth of testing
• Each sprint yields a new, stable, shippable product
Features of Scrum from the Team Perspective
And finally, there are features of Scrum that our team will find appealing. With all of the self-
managing aspects of Scrum, our team will discover newfound autonomy in the execution of their
projects with these features:
• Increase team productivity
• Increase job satisfaction
• Scrum plays to people's strengths, focusing on intrinsic motivation and allowing them to do what they love to do
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...MyIntelliSource, Inc.
 
Intelligent Home Wi-Fi Solutions | ThinkPalm
Intelligent Home Wi-Fi Solutions | ThinkPalmIntelligent Home Wi-Fi Solutions | ThinkPalm
Intelligent Home Wi-Fi Solutions | ThinkPalmSujith Sukumaran
 
Automate your Kamailio Test Calls - Kamailio World 2024
Automate your Kamailio Test Calls - Kamailio World 2024Automate your Kamailio Test Calls - Kamailio World 2024
Automate your Kamailio Test Calls - Kamailio World 2024Andreas Granig
 
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideBuilding Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideChristina Lin
 
What is Fashion PLM and Why Do You Need It
What is Fashion PLM and Why Do You Need ItWhat is Fashion PLM and Why Do You Need It
What is Fashion PLM and Why Do You Need ItWave PLM
 
Project Based Learning (A.I).pptx detail explanation
Project Based Learning (A.I).pptx detail explanationProject Based Learning (A.I).pptx detail explanation
Project Based Learning (A.I).pptx detail explanationkaushalgiri8080
 
why an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdfwhy an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdfjoe51371421
 
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed DataAlluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed DataAlluxio, Inc.
 
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdfThe Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdfkalichargn70th171
 
Professional Resume Template for Software Developers
Professional Resume Template for Software DevelopersProfessional Resume Template for Software Developers
Professional Resume Template for Software DevelopersVinodh Ram
 
XpertSolvers: Your Partner in Building Innovative Software Solutions
XpertSolvers: Your Partner in Building Innovative Software SolutionsXpertSolvers: Your Partner in Building Innovative Software Solutions
XpertSolvers: Your Partner in Building Innovative Software SolutionsMehedi Hasan Shohan
 
Engage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The UglyEngage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The UglyFrank van der Linden
 
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...kellynguyen01
 
EY_Graph Database Powered Sustainability
EY_Graph Database Powered SustainabilityEY_Graph Database Powered Sustainability
EY_Graph Database Powered SustainabilityNeo4j
 
Call Girls in Naraina Delhi 💯Call Us 🔝8264348440🔝
Call Girls in Naraina Delhi 💯Call Us 🔝8264348440🔝Call Girls in Naraina Delhi 💯Call Us 🔝8264348440🔝
Call Girls in Naraina Delhi 💯Call Us 🔝8264348440🔝soniya singh
 
Asset Management Software - Infographic
Asset Management Software - InfographicAsset Management Software - Infographic
Asset Management Software - InfographicHr365.us smith
 
The Evolution of Karaoke From Analog to App.pdf
The Evolution of Karaoke From Analog to App.pdfThe Evolution of Karaoke From Analog to App.pdf
The Evolution of Karaoke From Analog to App.pdfPower Karaoke
 
chapter--4-software-project-planning.ppt
chapter--4-software-project-planning.pptchapter--4-software-project-planning.ppt
chapter--4-software-project-planning.pptkotipi9215
 

Recently uploaded (20)

Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...
Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...
Russian Call Girls in Karol Bagh Aasnvi ➡️ 8264348440 💋📞 Independent Escort S...
 
Der Spagat zwischen BIAS und FAIRNESS (2024)
Der Spagat zwischen BIAS und FAIRNESS (2024)Der Spagat zwischen BIAS und FAIRNESS (2024)
Der Spagat zwischen BIAS und FAIRNESS (2024)
 
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
Steps To Getting Up And Running Quickly With MyTimeClock Employee Scheduling ...
 
Intelligent Home Wi-Fi Solutions | ThinkPalm
Intelligent Home Wi-Fi Solutions | ThinkPalmIntelligent Home Wi-Fi Solutions | ThinkPalm
Intelligent Home Wi-Fi Solutions | ThinkPalm
 
Automate your Kamailio Test Calls - Kamailio World 2024
Automate your Kamailio Test Calls - Kamailio World 2024Automate your Kamailio Test Calls - Kamailio World 2024
Automate your Kamailio Test Calls - Kamailio World 2024
 
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop SlideBuilding Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
Building Real-Time Data Pipelines: Stream & Batch Processing workshop Slide
 
What is Fashion PLM and Why Do You Need It
What is Fashion PLM and Why Do You Need ItWhat is Fashion PLM and Why Do You Need It
What is Fashion PLM and Why Do You Need It
 
Project Based Learning (A.I).pptx detail explanation
Project Based Learning (A.I).pptx detail explanationProject Based Learning (A.I).pptx detail explanation
Project Based Learning (A.I).pptx detail explanation
 
why an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdfwhy an Opensea Clone Script might be your perfect match.pdf
why an Opensea Clone Script might be your perfect match.pdf
 
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed DataAlluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
Alluxio Monthly Webinar | Cloud-Native Model Training on Distributed Data
 
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdfThe Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
The Essentials of Digital Experience Monitoring_ A Comprehensive Guide.pdf
 
Professional Resume Template for Software Developers
Professional Resume Template for Software DevelopersProfessional Resume Template for Software Developers
Professional Resume Template for Software Developers
 
XpertSolvers: Your Partner in Building Innovative Software Solutions
XpertSolvers: Your Partner in Building Innovative Software SolutionsXpertSolvers: Your Partner in Building Innovative Software Solutions
XpertSolvers: Your Partner in Building Innovative Software Solutions
 
Engage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The UglyEngage Usergroup 2024 - The Good The Bad_The Ugly
Engage Usergroup 2024 - The Good The Bad_The Ugly
 
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
Short Story: Unveiling the Reasoning Abilities of Large Language Models by Ke...
 
EY_Graph Database Powered Sustainability
EY_Graph Database Powered SustainabilityEY_Graph Database Powered Sustainability
EY_Graph Database Powered Sustainability
 
Call Girls in Naraina Delhi 💯Call Us 🔝8264348440🔝
Call Girls in Naraina Delhi 💯Call Us 🔝8264348440🔝Call Girls in Naraina Delhi 💯Call Us 🔝8264348440🔝
Call Girls in Naraina Delhi 💯Call Us 🔝8264348440🔝
 
Asset Management Software - Infographic
Asset Management Software - InfographicAsset Management Software - Infographic
Asset Management Software - Infographic
 
The Evolution of Karaoke From Analog to App.pdf
The Evolution of Karaoke From Analog to App.pdfThe Evolution of Karaoke From Analog to App.pdf
The Evolution of Karaoke From Analog to App.pdf
 
chapter--4-software-project-planning.ppt
chapter--4-software-project-planning.pptchapter--4-software-project-planning.ppt
chapter--4-software-project-planning.ppt
 

Implementation of agile methodology in mobile automation testing

  • 1. Implementation of Agile Methodology in Mobile Automation Testing (Software Quality Management)
  • 2. Table of Contents:
Chapter 1: Introduction
Chapter 2: Project Overview
Chapter 3: Agile Methodologies
Chapter 4: Mobile Applications
Chapter 5: Mobile Automation Tools Assessment
Chapter 6: Mobile Automation Testing Framework
Chapter 7: Implementation of the Mobile Automation Testing Framework Using Appium
Chapter 8: Project Details and Implementation
Chapter 9: Results
Chapter 10: Conclusion
Chapter 11: Recommendations and Future Work
  • 3. Abstract
Nowadays, the IT industry is shifting from desktop-based applications to mobile-based applications. Mobile applications are more complex than ever because of the variety of platforms and devices, so a culture of quality across all parts of the organization is essential for success. There are many software testing life cycle (STLC) models, each with its own pros and cons. In the current scenario, the Agile methodology is widely used in industry and yields good product quality; it relies on an empirical inspect-and-adapt feedback loop to cope with complexity and risk. That is why the Scrum method was chosen here for mobile automation testing.
In this project we automate the testing of a mobile application. Currently, almost every application runs on a mobile platform in some form: native, web, or hybrid. The quality of a mobile application is more critical than that of a desktop application, because once a user has a bad experience with an application, he will not return to it.
It is therefore important to test the functionality of these mobile applications, but with each new release, in which many new features are added to the existing system, test management becomes a challenge.
A mobile application system has many different features, and testing such a large system requires a lot of time. Testing the various scenarios of a mobile application manually in a short interval of time is not feasible.
This dissertation deals with implementing the Scrum methodology in automation framework development using Appium, so that mobile automation can be performed in each release cycle. The best feature is the automation of all possible scenarios, including those a user might not have thought of during the execution phase of testing.
Broad academic area of work: Software Quality Management
Keywords: Appium, Agile methodology, Mobile automation testing, Framework development, Scrum, Software testing life cycle.
  • 4. ACKNOWLEDGEMENT
This project would not have been possible without the support of many people. I would like to acknowledge and extend my heartfelt gratitude to the following people, who have made the completion of this project possible:
My team lead, Mr. Satinder Singh, for his full support and guidance throughout this project.
My friend Mr. Shiv Hari Singh, for encouraging and supporting me to pursue this project.
Dr. Rizwan Parveen from BITS Pilani, for giving feedback at various stages of the project, which acted as motivation for me to work on it.
My testing team members, who helped me during various phases of the project.
Last but not least, I would like to express my love and gratitude to my beloved family for their understanding and motivation throughout the duration of this project.
  • 5. LIST OF ABBREVIATIONS USED
TC: Test Case
TS: Test Script
POT: Proof of Testing
ENV: Environment
SDLC: Software Development Life Cycle
JDK: Java Development Kit
SDK: Software Development Kit
STLC: Software Testing Life Cycle
PBI: Product Backlog Item
UAT: User Acceptance Testing
XP: Extreme Programming
DSDM: Dynamic Systems Development Method
LD: Lean Development
POM: Page Object Model
ADB: Android Debug Bridge
JRE: Java Runtime Environment
AVD: Android Virtual Device
  • 6. Chapter 1: Introduction
1.1 Organizational Introduction:
XXXXX is an Indian multinational IT services company headquartered in Noida, Uttar Pradesh, India. It is a subsidiary of XXX Enterprise. Originally a research and development division of XXX, it emerged as an independent company in 1991 when XXX ventured into the software services business. XXX Technologies offers services including IT consulting, enterprise transformation, remote infrastructure management, engineering and R&D, and business process outsourcing (BPO).
1.2 About the Product
It is a transportation and logistics mobile application used by drivers for loading/unloading shipments and status updates. <<Not including detailed product information due to company policies>>
1.3 Architecture
Figure 1: Technical architecture of our product <<Removed the product details due to company policies>>
  • 7. Chapter 2: Project Overview
2.1 Current mode of operation:
• Currently, in most automation projects, our organization uses the waterfall model for the STLC.
• In the waterfall model, once an application is in the testing stage, it is very difficult to go back and change something that was not well thought out in the concept stage.
• Changes in any requirement are very difficult to manage.
• Automation framework development starts only after manual testing is completed, which consumes a lot of time.
• Companies use many existing models for developing automation frameworks, based on the client's requirements and the size of the project. Some models are preferred over others because of how well their properties match the client's needs.

Feature | Waterfall | V-Shaped | Incremental | Spiral | RAD
Requirement specifications | Beginning | Beginning | Beginning | Beginning | Time-boxed release
Cost | Low | Expensive | Low | Expensive | Low
Simplicity | Simple | Intermediate | Intermediate | Intermediate | Very simple
Risk involvement | High | Low | Easily manageable | Low | Very low
Expertise required | High | Medium | High | High | Medium
Flexibility to change | Difficult | Difficult | Easy | Easy | Easy
User involvement | Only at beginning | At the beginning | Intermediate | High | Only at the beginning
Flexibility | Rigid | Little flexible | Less flexible | Flexible | High
Maintenance | Least | Least | Promotes maintainability | Typical | Easily maintained
Duration | Long | According to project size | Very long | Long | Short
  • 8. 2.2 Problems with the existing solution:
• It is costly and time-consuming.
• If the user wants any change, the tester has to modify the test script from the start, which is difficult.
• In this model, user involvement occurs only at the beginning.
• It is not suitable for difficult or moderately complex projects.
• The model is not flexible.
• Most importantly, it is a time-consuming process.
• Automation framework development faces the same issues as project development in the traditional approach.
2.3 Suggested solution
Considering the different problems faced during automation framework development in the existing approach to automation testing, the project includes the following aspects to provide the user with the necessary model, overcome the problems, and offer additional advantages.
2.4 Objective
• To describe the agile methodology for mobile automation testing framework development and make it generic enough to be used in other projects.
• To cover scenarios that are not feasible to test manually (e.g., black-box testing techniques).
2.5 Expected Features
The solution should bring the Scrum methodology into mobile automation testing using Appium. The developed framework will contain some generic features that may be used in other projects. Scrum has some basic features [1], as per the appendix.
2.6 Scope of the work
The scope of this project is to deal with the challenges that arise during mobile automation testing framework development and to come up with a development strategy, based on the tools available in the market, that is a fairly generic solution for all types of mobile application testing.
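The reusable framework proposed above is commonly structured with the Page Object Model (POM, see the abbreviations list): each screen's locators and actions live in one class, so a UI change touches one place instead of every test script. Below is a minimal, hypothetical sketch in Python; the page name, locators, and the stub driver are illustrative assumptions, not the project's actual code (a real framework would plug in an Appium driver session instead of the stub).

```python
# Minimal Page Object Model (POM) sketch for an automation framework.
# LoginPage, its locators, and StubDriver are hypothetical illustrations.

class StubDriver:
    """Stand-in for an Appium/Selenium driver; records UI actions."""
    def __init__(self):
        self.actions = []

    def find_and_type(self, locator, text):
        self.actions.append(("type", locator, text))

    def find_and_tap(self, locator):
        self.actions.append(("tap", locator))


class LoginPage:
    """Page object: locators and user actions live in one place,
    so a UI change touches only this class, not every test."""
    USERNAME = ("id", "username_field")   # assumed locator
    PASSWORD = ("id", "password_field")   # assumed locator
    LOGIN_BTN = ("id", "login_button")    # assumed locator

    def __init__(self, driver):
        self.driver = driver

    def login(self, user, password):
        self.driver.find_and_type(self.USERNAME, user)
        self.driver.find_and_type(self.PASSWORD, password)
        self.driver.find_and_tap(self.LOGIN_BTN)


driver = StubDriver()
LoginPage(driver).login("tester", "secret")
print(len(driver.actions))  # three UI actions were recorded
```

In the real framework the stub would be replaced by an Appium WebDriver session, while the test scripts and page objects stay unchanged; this separation is what lets the framework be reused across projects and release cycles.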
  • 9. Chapter 3: Agile Methodologies
3.1 What is Agile?
Agile methodology is an alternative to traditional project management, typically used in software development and testing. It helps teams respond to unpredictability through incremental, iterative work cadences known as sprints. Agile methodologies are an alternative to waterfall, or traditional sequential development.
3.2 Why Agile?
Agile methodology provides opportunities to assess the direction of a project throughout its lifecycle. This is achieved through regular cadences of work, known as sprints or iterations, at the end of which teams must present a potentially shippable product increment. By focusing on the repetition of abbreviated work cycles as well as the functional product they yield, agile methodology is described as "iterative" and "incremental." The differences between agile and traditional approaches are listed below in Table 2:

Process Factor | Traditional | Agile
Measure of success | Conformance to plan | Response to change; working software
Management culture | Command and control | Collaborative
Requirements & design | Big upfront | Continuous; emergent
IT planning & scheduling | Detailed; estimate time and resources; fix scope | Two levels: time fixed, scope estimated
Coding & implementation | Code all features in parallel, test later | Code, test, and deliver serially
Test & quality assurance | Big upfront plan; test late, after coding | Continuous; concurrent; test frequently and early

3.3 Types of Agile Methodology
There are different types of agile methodologies [7], shown below in Figure 3:
• Extreme Programming (XP)
• Dynamic Systems Development Method (DSDM)
• Crystal Methods
• Lean Development (LD)
• Kanban
• Scrum
  • 10. Figure 3: Types of Agile Methodology
3.3.1 Extreme Programming (XP)
• One of the most popular and controversial agile methodologies.
• Aims at delivering high-quality software quickly and continuously.
• It promotes high customer involvement, rapid feedback loops, continuous testing, continuous planning, and close teamwork to deliver working software at very frequent intervals, typically every 1-3 weeks.
• XP is guided by values, principles, and practices:
  - XP values: feedback, simplicity, communication, respect, and courage.
  - Principles driving Extreme Programming: humanity, mutual benefit, economics, improvement, diversity, self-similarity, opportunity, reflection, redundancy, flow, baby steps, failure, quality, and accepted responsibility.
  - The 13 practices that guide XP engineering: whole team, energized work, sit together, informative workspace, slack, stories, pair programming, weekly cycle, quarterly cycle, continuous integration, ten-minute build, incremental design, and test-first programming.
3.3.2 FDD (Feature Driven Development)
FDD describes five processes:
• Develop an overall model
• Build a feature list
• Plan by feature
• Design by feature
• Build by feature
  • 11. 3.3.3 Lean Development (LD)
• Closely related to Kanban.
• The purpose is to visualize the workflow and optimize it, reducing the cycle time of delivering fully completed features.
• Its principles: eliminating waste, amplifying learning, deciding as late as possible, delivering as fast as possible, empowering the team, and building integrity in.
• Kanban has three artifacts:
  - Kanban board
  - Work-in-progress limit
  - Lead time
3.3.4 DSDM (Dynamic Systems Development Method)
• A straightforward framework, based on best principles, for implementing a project structure.
• Simple and extendible, but not claiming to be the solution for all kinds of projects.
• Requirements are prioritized using the MoSCoW rules:
M: Must have
S: Should have, if at all possible
C: Could have, but not critical
W: Won't have this time, but potentially later
3.3.5 ASD (Adaptive Software Development)
• Focused on the rapid creation and evolution of software systems.
• ASD replaces the traditional waterfall cycle with a repeating series of speculate, collaborate, and learn cycles.
• The three steps, briefly:
  - Speculate: initiation and planning
  - Collaborate: concurrent feature development
  - Learn: quality review
3.3.6 Crystal
• Crystal promotes early, frequent delivery of working software
• High user involvement
• Adaptability
• Removal of bureaucracy and distractions
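The MoSCoW prioritization used by DSDM can be sketched mechanically: rank each requirement by its MoSCoW letter and sort, so Must-have items are scheduled first and Won't-have items are deferred. The requirement names below are invented purely for illustration.

```python
# MoSCoW prioritization sketch: order a requirement list so that
# Must-have items come first and Won't-have items last.
# The sample requirements are invented for illustration.

MOSCOW_RANK = {"M": 0, "S": 1, "C": 2, "W": 3}

requirements = [
    ("export report", "C"),
    ("login", "M"),
    ("dark mode", "W"),
    ("push notifications", "S"),
]

prioritized = sorted(requirements, key=lambda r: MOSCOW_RANK[r[1]])
print([name for name, _ in prioritized])
# Must-have 'login' is scheduled first; Won't-have 'dark mode' is deferred.
```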
  • 12. 3.3.7 Scrum
• Scrum is the most popular way of introducing agility, due to its simplicity and flexibility.
• It is based on iterative development.
• Scrum is a management and control process that cuts through complexity to focus on building software that meets business needs.
3.4 Why the Scrum Methodology?
Scrum is an Agile framework with several key features, which are shown in Figure 4 below and explained in detail [3]. Because of these features, I am using Scrum for my project.
3.4.1 Sprint: The Scrum framework divides product development into iterations known as "sprints," which are time-boxed to a fixed length of 1-4 weeks.
• Every iteration attempts to build a potentially shippable (properly tested) product increment.
• The sprint duration is decided by the team based on their requirements and capabilities.
• Once finalized, the sprint duration should not be modified.
3.4.2 Product Increment: At the end of every sprint, the test team delivers a potentially shippable product that is tested and works well. Since sprints are short in duration, only
  • 13. important features are developed first. This also gives customers a product that has the basic features, which they can test and provide feedback on.
3.4.3 Product Backlog: The set of all requirements, broken down into small work items and prioritized into a list, is called the product backlog.
• The product backlog emerges over a period of time.
• Refinement of the product backlog is done during the product backlog refinement meeting.
• Basically, the product backlog holds the requirements for the project, and these requirements may be refined during each sprint.
3.4.4 Sprint Backlog: The sprint backlog contains all the known user stories (or requirements) in the current sprint.
• The requirement with the top priority is listed first.
• The team pulls the stories into the sprint and works on them collectively.
• The sprint backlog is basically a list of all the requirements that need to be completed during the sprint, ordered by priority.
3.4.5 User Stories:
• A user story captures the who, what, and why of a requirement from the user's perspective.
• Detailed requirements in agile software development are captured in the form of user stories, from the point of view of the user rather than the organization or project.
• Example: As a customer <role>, I want to <action> so that I can <reason or goal>.
• User stories are short and concise statements.
• They are recorded on sticky notes, index cards, etc., so that they can be stuck on walls or tables and rearranged or used during discussion.
3.4.6 Definition of Done: The definition of done is a checklist of all exit criteria that must be completed by the team to call work done. A definition of done exists at the user story level, the sprint level, and the release level.
3.4.7 Time-boxing: Time-boxing is the concept of a fixed time duration in which the team is expected to complete the committed work. Every ceremony in Scrum is time-boxed as per the recommendations given in the Scrum Guide.
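The backlog and user-story concepts above can be sketched as simple data structures: a product backlog of user stories ordered by priority, from which the team pulls the top items into a sprint backlog. The story texts, priority numbers, and sprint capacity below are invented examples.

```python
# Sketch: a product backlog of user stories ordered by priority,
# from which a sprint backlog is pulled. Story texts, priorities,
# and capacity are invented examples.

def user_story(role, action, reason):
    """Format the who/what/why user-story template described above."""
    return f"As a {role}, I want to {action} so that I can {reason}."

# (priority, story) pairs; a lower number means a higher priority.
product_backlog = [
    (2, user_story("driver", "update shipment status", "report progress")),
    (1, user_story("driver", "log in", "access my assigned shipments")),
    (3, user_story("manager", "view reports", "track deliveries")),
]

product_backlog.sort()                             # highest priority first
sprint_capacity = 2                                # assumed team capacity
sprint_backlog = product_backlog[:sprint_capacity] # team pulls top items
print(sprint_backlog[0][1])
```

In a real Scrum tool the priorities would emerge from backlog refinement rather than fixed numbers, but the ordering-and-pull mechanic is the same.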
  • 14. 3.4.8 Daily Stand-up Meeting: In the Scrum methodology, teams hold a daily planning meeting called the "Daily Scrum," "Scrum meeting," or "stand-up meeting."
• In this meeting, each team member gives an update on three questions to the rest of the team, not specifically to management.
• These questions are: What did I accomplish yesterday? What will I do today? What is stopping me from proceeding? This increases the visibility of the tasks to everyone in the team.
• The meeting can also be used to raise any potential impediments that block the team from accomplishing the sprint goal.
• These meetings are not expected to last more than 15 minutes and are held at the same time and place every day.
• A task board may be installed near the team's physical location, where everyone can see the tasks moving from one column to the next.
3.5 The 12 Agile principles: The Agile Manifesto comprises 12 principles, listed in Table 3 below:
1. Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
2. Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.
3. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference for the shorter timescale.
4. Business people and developers must work together daily throughout the project.
5. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
6. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
7. Working software is the primary measure of progress.
8. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
9. Continuous attention to technical excellence and good design enhances agility.
10.
Simplicity--the art of maximizing the amount of work not done--is essential. 11. The best architectures, requirements, and designs emerge from self-organizing teams. 12. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behaviour accordingly.
  • 15. Chapter 4: Mobile Applications
4.1 Types of Mobile Applications:
4.1.1 Native applications: Native applications are platform-specific and are developed in a platform-specific language. Native apps live on the device and are accessed through icons on the device home screen. They are installed through an application store (such as Google Play or Apple's App Store). They can take full advantage of all the device features: the camera, the GPS, the accelerometer, the compass, the list of contacts, and so on.
4.1.2 Web applications: Web apps are not real applications; they are really websites that, in many ways, look and feel like native applications but are not implemented as such. They are run by a browser and typically written in HTML5. Users first access them as they would access any web page: they navigate to a special URL and then have the option of "installing" them on their home screen by creating a bookmark to that page.
4.1.3 Hybrid applications: These applications are a combination of native and web applications. They are developed in HTML5 and wrapped in a native shell. These applications are also downloaded from an app store and installed on devices. Example: Flipkart.
4.2 Differences Between Mobile and Desktop Application Testing: A few obvious aspects set mobile app testing apart from desktop testing:
• On a desktop, the application is tested on a single central processing unit; on mobile, the application is tested on handsets from vendors such as Samsung, Nokia, Apple, and HTC.
• Mobile device screens are smaller than desktop screens.
• Mobile devices have less memory than desktops.
• Mobile devices use network connections like 2G, 3G, 4G, or Wi-Fi, whereas desktops use broadband or dial-up connections.
4.3 Types of Mobile App Testing: To address all the above technical aspects, the following types of testing are performed on mobile applications.
• Usability testing: to make sure that the mobile app is easy to use and provides a satisfactory user experience to customers.
• Compatibility testing: testing of the application on different mobile devices, browsers, screen sizes, and OS versions, according to the requirements.
• Interface testing: testing of menu options, buttons, bookmarks, history, settings, and the navigation flow of the application.
• Services testing: testing the services of the application online and offline.
• Low-level resource testing: testing of memory usage, auto-deletion of temporary files, and local database growth issues.
• Performance testing: testing the performance of the application while changing the connection from 2G or 3G to Wi-Fi, sharing documents, measuring battery consumption, etc.
• Operational testing: testing of backups and the recovery plan if the battery goes down, or of data loss while upgrading the application from the store.
  • 16. • Installation testing: validation of the application by installing/uninstalling it on devices.
• Security testing: testing an application to validate whether the information system protects data or not.
4.4 Technical Features of Mobile Applications: Each of these types of apps has its advantages and disadvantages; apps are developed as per the requirements, and their features are given below:
Device features. Although web apps can take advantage of some features, native apps (and the native components of hybrid apps) have access to the full range of device-specific features, including GPS, camera, gestures, and notifications.
Offline functioning. A native app is best if your app must work when there is no connectivity. In-browser caching is available in HTML5, but it is still more limited than what you can get when you go native.
Discoverability. Web apps win the prize on discoverability. Content is a lot more discoverable on the web than in an app: when people have a question or an information need, they go to a search engine, type in their query, and choose a page from the search results. They do not go to the app store, search for an app, download it, and then try to find their answer within the app. Although there are app aficionados who may fish for apps in app stores, most users don't like installing and maintaining apps (and wasting space on their device), and will install an app only if they expect to use it often.
Speed. Native apps win the speed competition. In 2012 Mark Zuckerberg declared that Facebook's biggest mistake had been betting on the mobile web and not going native. Up to that point, the Facebook app had been a hybrid app with an HTML core; in 2012 it was replaced with a truly native app. Responsiveness is key to usability.
Installation. Installing a native or hybrid app is a hassle for users: they need to be really motivated to justify the interaction cost.
“Installing” a web app involves creating a bookmark on the home screen; this process, while arguably simpler than downloading a new app from an app store, is less familiar to users, as people don't use bookmarks that much on mobile.

Maintenance. Maintaining a native app can be complicated not only for users, but also for developers (especially if they have to deal with multiple versions of the same information on different platforms): changes have to be packaged in a new version and placed in the app store. On the other hand, maintaining a web app or a hybrid app is as simple as maintaining a web page, and it can be done as often as needed.

Platform independence. While different browsers may support different versions of HTML5, if platform independence is important, you definitely have a better chance of achieving it with web apps and hybrid apps than with native apps. As discussed before, at least parts of the code can be reused when creating hybrid or web apps.

Content restrictions, approval process, and fees. Dealing with a third party that imposes rules on your content and design can be taxing both in terms of time and money. Native and hybrid apps must pass approval processes and content restrictions imposed by app stores, whereas the web is free for all. Not surprisingly, the first web apps came from publications such as Playboy, who wanted to escape Apple's prudish content censorship. And buying a subscription within an iOS app means that 30% of that subscription cost goes to Apple, a big dent in the publishers' budget.

Development cost. It's arguably cheaper to develop hybrid and web apps, as these require skills that build up on previous experience with the web. NN/g clients often find that going fully native is a lot more expensive, as it requires more specialized talent.
But, on the other hand, HTML5 is fairly new, and good knowledge of it, as well as a good understanding of developing for the mobile web and hybrid apps are also fairly advanced skills.
User Interface. Last but not least, if one of your priorities is providing a user experience that is consistent with the operating system and with the majority of the other apps available on that platform, then native apps are the way to go. That doesn't mean that you cannot provide a good mobile user experience with a web app or a hybrid app; it just means that the graphics and the visuals will not be exactly the same as those to which users may already be accustomed, and that it will be harder to take advantage of the mobile strengths and mitigate the mobile limitations.

4.5 Mobile Application Testing Strategy
The test strategy should make sure that all the quality and performance guidelines are met. A few pointers in this area:
1. Selection of the devices: Analyze the market and choose the devices that are widely used. (This decision mostly relies on the clients. The client or the app builders consider the popularity of certain devices as well as the marketing needs of the application to decide which handsets to use for testing.)
2. Emulators: These are extremely useful in the initial stages of testing, as they allow quick and efficient checking of the app. An emulator is a system that runs software from one environment in another environment without changing the software itself. It duplicates the features of, and works like, the real system.
3. Types of mobile emulators:
 Device emulator – provided by device manufacturers.
 Browser emulator – simulates mobile browser environments.
 Operating system emulator – Apple provides emulators for iPhones, Microsoft for Windows phones, and Google for Android phones.
A few free and easy-to-use mobile device emulators are iPhone Tester, Mobile Phone Emulator, Responsivepx, and Android 4.2.2.
Chapter 5: Mobile Automation Tools Assessment
5.1 Tool selection:
There are various automation tools available in the market for mobile automation testing, and each one has its own technique for automating the application. We can categorize these tools into two parts: image-based automation tools and object-based automation tools. Image-based tools use screen co-ordinates to identify objects, which makes the automation device-specific; the developed automation code can run only on the same device/emulator. On the other hand, object-based automation tools use an object's properties to identify it, and scripts developed with these tools can run on any device/emulator. Based on the features of the available tools, I created a comparison matrix, shown below:

Tool | Paid/Open Source | Native Apps | Web | Hybrid Apps | Android | iOS | Windows | BlackBerry | Library/Tool
Robotium | Open Source | Y | - | Y | Y | - | - | - | Library
Sikuli | Open Source | Image Based | Image Based | Image Based | Y | Y | Y | Y | Both
Selenium WebDriver | Open Source | - | Y | - | Y | Y | - | - | Library
NativeDriver | Open Source | Y | - | - | Y | Y | - | - | Library
Appium | Open Source | Y | Y | Y | Y | Y | - | - | Tool
MonkeyTalk | Open Source | Y | Y | Y | Y | Y | - | - | Tool
SeeTest | Paid | Y | Y | Y | Y | Y | Y | Y | Tool
M-eux (Jamo Solutions) | Paid | Y | - | Y | Y | Y | Y | Y | Tool
EggPlant | Paid | Image Based | Image Based | Image Based | Y | Y | Y | Y | Tool
mAutomate | Paid | Y | Y | Y | Y | Y | - | - | Web Based
Ranorex | Paid | Y | Y | Y | Y | Y | - | - | Tool

As discussed above, image-based automation tools are specific to a device or emulator, so using them would not be an effective method. On the other side, there are many object-based tools, listed in the table above, which again fall into two categories: paid and open source. In this project I am considering open source tools. Appium is the best-suited tool because of its capability to automate all types of mobile applications on most platforms, in comparison with the others.
5.2 Appium Introduction
Appium implements the Selenium WebDriver protocol for mobile platforms. It is a set of different software tools, each with a different approach to supporting test automation. The entire suite of tools results in a rich set of testing functions specifically geared to the needs of testing hybrid and native applications. These operations are highly flexible, allowing many options for locating UI elements and comparing expected test results against actual application behaviour.
 Appium is the cross-platform solution for native and hybrid mobile automation.
 It supports iOS, Android and FirefoxOS.
 We don't have to recompile our app or modify it in any way, due to the use of standard automation APIs on all platforms.
 We can use any testing framework.
 Language support: Java, C#, Perl, Python and Ruby.

5.3 Appium Architecture

5.4 Technical Specification:

Feature | Appium
Language Support | Java, C#, Ruby, Python, Perl
Windows (non-browser) application support | No
App Support | Web, Native and Hybrid
Environment Support | Windows, Mac OS
Platform Support | Android, iOS and FirefoxOS
Framework | Appium + Selenium 2.0 jar + Eclipse + TestNG + Android SDK
Object Recognition / Storage | UI Automator Viewer
Software Cost | Zero
Coding Experience of Engineer | Should be very good
Script Creation Time | High for the first time (relatively less after the standards are established)
Hardware resource (CPU + RAM) consumption | High
Product Support | Open source; no dedicated support

5.5 Technical Requirements

S. No. | Technical Requirement | Remarks
1 | Appium Server | Standalone server which sends the testing code to the connected mobile device/emulator
2 | Selenium WebDriver (Java) | Latest version of the open source automation tool
3 | Java | Programming language selected for scripting
4 | Eclipse IDE | Script development environment
5 | Object identification tool | UI Automator Viewer, which comes with the Android SDK
6 | TestNG | An open source automation testing framework
7 | .NET Framework 4.5 | Supports the Appium tool
8 | AutoIt | Tool to handle Windows popups

5.6 Benefits:
• Support for both platforms, iOS and Android.
• Appium is an open source tool that doesn't cost anything.
• Scalable with standard desktop systems, without cost for user licenses.
• Support for continuous integration.
• Appium WebDriver supports many popular programming languages like Java, C#, PHP, Perl, Ruby, Python, etc.
• Supports many test environments.
• Native XPath support; if your HTML is complicated, full of nesting, or does not have id attributes, then this can be very important.
• Doesn't require access to your source code or library: you test the build you will actually ship.
• Support for various frameworks.
5.7 Challenges
• Demands higher technical competencies.
• Being open source, Appium has no official technical support.
• No built-in Object Repository concept.
• Doesn't support image comparison.
• Handling popup/dialog/menu windows is sometimes tricky.
• Barcode scanning can't be done using automation.
Chapter 6: Mobile Automation Testing Framework
There are mainly two types of frameworks popular in mobile automation testing.

6.1 Page Object Model (POM):
Page Object Model is a design pattern used to create an Object Repository for web UI elements. Under this model, for each web page in the application there should be a corresponding Page class. This Page class finds the WebElements of that page and also contains Page methods which perform operations on those WebElements. These methods should be named after the task they perform. The main advantage of Page Object Model is that if the UI of any page changes, no tests need to change; we only change the code within the page objects (in one place). In the Page Object Model, all the functionalities / reusable components of a page that we want to automate are written in a separate class: say, a Home page, Login page, Create Account page, Forgot Password page, etc.

6.2 Page Factory Model:
Page Factory is an inbuilt, more optimized Page Object Model concept for Appium. Here as well we follow the separation of the page object repository and the test methods. Additionally, with the help of the PageFactory class we use the @FindBy annotation to find WebElements, and the initElements method to initialize them. @FindBy can accept tagName, partialLinkText, name, linkText, id, css, className and xpath as attributes. I will explain more about these frameworks in the implementation part.
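The separation that the Page Object Model enforces can be sketched without any Selenium or Appium dependency. In the sketch below, LoginPage mirrors the login screen automated later in this project, and FakeDriver is a stand-in of my own for AndroidDriver so the example compiles on its own; in a real suite, the page class would hold WebElements (via @FindBy/PageFactory) instead of locator strings.

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for AndroidDriver: records what was typed and clicked.
class FakeDriver {
    Map<String, String> typed = new HashMap<>();
    String clicked;
    void sendKeys(String locator, String text) { typed.put(locator, text); }
    void click(String locator) { clicked = locator; }
}

// Page class: the ONLY place that knows the login screen's locators.
class LoginPage {
    static final String USER  = "corp.android.MatrixMobile.QA:id/txtUserName";
    static final String PASS  = "corp.android.MatrixMobile.QA:id/txtPassword";
    static final String LOGIN = "corp.android.MatrixMobile.QA:id/btnLogIn";

    private final FakeDriver driver;
    LoginPage(FakeDriver driver) { this.driver = driver; }

    // Page method named after the task it performs.
    void login(String user, String pass) {
        driver.sendKeys(USER, user);
        driver.sendKeys(PASS, pass);
        driver.click(LOGIN);
    }
}

public class Main {
    public static void main(String[] args) {
        FakeDriver driver = new FakeDriver();
        // The test never touches a locator; if the UI changes,
        // only LoginPage is edited.
        new LoginPage(driver).login("nkhanna", "secret");
        System.out.println(driver.clicked);
    }
}
```

If a locator changes, only the constant in LoginPage is edited; every test that calls login() keeps working unchanged, which is exactly the maintenance benefit described above.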
Chapter 7: Implementation of the Mobile Automation Testing Framework using Appium
7.1 Environment setup
Step 1: Download and install the JDK from http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html
Validate the installed Java.
Set up the environment variables for Java as:
JAVA_HOME = C:\Program Files\Java\jdk1.8.0_101
JRE_HOME = C:\Program Files\Java\jre1.8.0_101
Edit Path = ;C:\Program Files\Java\jdk1.8.0_101\bin;
Step 2: Download the Android SDK from https://developer.android.com/studio/index.html and install the same:
Edit Path = ;C:\Users\user\AppData\Local\Android\sdk\platform-tools;C:\Users\user\AppData\Local\Android\sdk\tools
Step 3: Validate that the AVD (Android Virtual Device) manager has been installed with the Android SDK.
Navigate to C:\Users\user\AppData\Local\Android\sdk. There will be an icon named AVD Manager; click on it.
Validate that the UI Automator Viewer has also been installed: use the command uiautomatorviewer in a command prompt. This tool will help us identify the object properties of the native application.
Step 4: Download and install the .NET Framework 4.5 from https://www.microsoft.com/en-in/download/details.aspx?id=30653
This is essential because the Appium server for Windows is built on .NET.
Step 5: Download and install Node.js from https://nodejs.org/en/download/
Step 6: Download and install Appium from http://appium.io/downloads.html
Step 7: Download the Java IDE (Eclipse) from http://www.eclipse.org/downloads/download.php?file=/technology/epp/downloads/release/mars/R/eclipse-jee-mars-R-win32-x86_64.zip
Step 8: Download the Selenium and Appium jar files:
1. java-client-4.1.2 from https://mvnrepository.com/artifact/io.appium/java-client/4.1.2
2. gson-2.2.4 from http://www.java2s.com/Code/Jar/g/Downloadgson224sourcesjar.htm
3. Selenium jar files from http://www.seleniumhq.org/download/
I will use all of these jar files in the mobile automation framework development.
Step 9: Download and install the TestNG plugin in Eclipse from the Eclipse Marketplace.
Chapter 8: Project Details and Implementation
8.1 Introduction
The steps below define the implementation of mobile automation testing using the traditional and the agile approach, and the comparison between them.
8.2 Hardware Requirements
Any Windows machine that matches the following criteria:
Processor: Pentium 1 GHz or higher
RAM: 1 GB for x86 / 2 GB for x64
System type: 32-bit or 64-bit operating system
8.3 Software Requirements
.NET Framework 4.5
Windows XP with SP3 or higher
8.4 Performing an end-to-end testing cycle of a mobile application using automation
Step 1: Connect the device to the system and validate the same. After connecting the device to the system, open a command window and type the command "adb devices"; it will show all the connected devices.
  • 29. Note: this image is a mirror image of the actual device connected, generated using Mobizen software.
Step 2: Start the Appium server.
Step 3: Set up the Appium server:
Step 4: Open Eclipse and create the project. Sample code:

package com.framework.test;

import java.io.File;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.List;
import java.util.concurrent.TimeUnit;

import org.openqa.selenium.By;
import org.openqa.selenium.WebElement;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.testng.annotations.Test;

import io.appium.java_client.android.AndroidDriver;

public class Login {
    AndroidDriver driver;

    @Test
    public void testapp() throws MalformedURLException, InterruptedException {
        // Capabilities describing the device under test and the app build.
        DesiredCapabilities capability = new DesiredCapabilities();
        capability.setCapability("deviceName", "MOTO G");
        capability.setCapability("platformName", "Android");
        capability.setCapability("platformVersion", "6.0");
        File apk = new File("C:\\Users\\khoiwalk\\workspace\\MatixMobile\\apk\\corpAndroid\\MatrixMobile-qa-debug.apk");
        capability.setCapability("app", apk.getAbsolutePath());

        // Connect to the running Appium server.
        driver = new AndroidDriver(new URL("http://10.69.147.82:4725/wd/hub"), capability);
        driver.manage().timeouts().implicitlyWait(5, TimeUnit.SECONDS);

        // Log in.
        Thread.sleep(15000);
        driver.findElement(By.id("corp.android.MatrixMobile.QA:id/txtUserName")).sendKeys("nkhanna");
        Thread.sleep(10000);
        driver.findElement(By.id("corp.android.MatrixMobile.QA:id/txtPassword")).sendKeys("Qwerty$612");
        driver.findElement(By.id("corp.android.MatrixMobile.QA:id/btnLogIn")).click();
        Thread.sleep(20000);

        // Fetch a trip and open its details.
        driver.findElement(By.id("corp.android.MatrixMobile.QA:id/btnGetTrip")).click();
        driver.findElement(By.className("android.widget.EditText")).sendKeys("97551416");
        driver.findElement(By.id("android:id/button1")).click();
        Thread.sleep(5000);
        driver.findElement(By.id("corp.android.MatrixMobile.QA:id/btnTripDetails")).click();
        Thread.sleep(10000);
        List<WebElement> items = driver.findElements(By.className("android.widget.TwoLineListItem"));
        items.get(2).click();
    }
}
  • 33. Step 5: Start the Appium server. Step 6: Run the test script using TestNG.
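Step 6's TestNG run is normally driven by a suite file. A minimal testng.xml for the Login class above might look like the following (the suite and test names are my own illustration; the class name matches the sample code):

```xml
<!DOCTYPE suite SYSTEM "http://testng.org/testng-1.0.dtd">
<suite name="MobileAutomationSuite">
  <test name="LoginFlow">
    <classes>
      <class name="com.framework.test.Login"/>
    </classes>
  </test>
</suite>
```

Right-clicking this file in Eclipse and choosing Run As > TestNG Suite executes the @Test methods and produces the TestNG report validated in Step 8.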
  • 34. Step 7: Monitor the execution. Step 8: Validate the Test result.
All of the above steps are the same while creating and executing each sprint. Here, 3 sprints of Project A were created, and the division of user stories was done based on their complexity (complex, medium, simple). These sprints are shown in the table below.

Sprint | User Stories | Story Points | No. of Test Cases
Project # A (Sprint 1) | 4 | 7 | 16
Project # A (Sprint 2) | 7 | 8 | 19
Project # A (Sprint 3) | 10 | 18 | 23
Chapter 9: Results
9.1 Result
As the title of the dissertation says, the aim is to evaluate mobile application automation testing under the agile methodology, estimating project quality by monitoring the defect trend for functional automation testing. Here, functions are tested by entering a search item and examining the search results in the mobile application; other functionality is rarely considered. Mobile testing was conducted in phases, as we implemented an iterative model in which mobile testing went on side by side. Analysis was done on a set of data, and the following issues were observed with the earlier model:
 No formal test planning; testing always started late in the development process, with inadequate resources.
 Review and test effectiveness and efficiency were not known until UAT started.
 Communication barriers among stakeholders.
 Existing defect reporting was not structured and effective.
 The draft of the test method was given a positive response by the interviewees but should still be slightly changed. The most important areas to introduce today are:
o Define responsibility for testing
o Define the testing lifecycle
o Improvement in the construction process
o A defect reporting system
o Sub-process monitoring for review effectiveness and efficiency
o Test dashboards
Due to the above issues, the new methodology was implemented using a set of metrics applied to both the earlier and the new model. The key metrics for the testing project measurement template are listed in Table 5, and the analysis covers pre- and post-implementation of the evaluation:
9.2 Key Metrics for Testing Project Measurement:

S. No. | Metric Name | Formula | Project Goal | Upper Limit | Lower Limit
1 | Schedule Variance (%) | (Actual Calendar Days - Planned Calendar Days + Start Variance) / (Planned Calendar Days) * 100 | 5% | 10% | -10%
2 | Effort Variance (%) | (Actual Effort - Planned Effort) / (Planned Effort) * 100 | 0% | 5% | -5%
3 | Test Script Coverage (%) | (No. of Test Scripts Originally Written) / (No. of Test Scripts Originally Written + No. of Test Scripts Added During Execution) | 95% | 100% | 85%
4 | Productivity in Test Script Preparation | Actual No. of Test Scripts / Actual effort spent on test script preparation | 5 | 6 | 3
5 | Productivity in Test Case Execution | Actual No. of Test Cases (planned + ad hoc) / Actual effort spent on testing | 6 | 8 | 4
6 | Productivity in Defect Detection (Defects/PD) | Actual No. of Defects in Testing / Actual effort spent on testing | 4 | 5 | 3
7 | Review Effectiveness | (No. of errors detected in Review) / (No. of errors detected in Review + Total no. of defects detected in Testing) * 100 | 70% | 90% | 60%
8 | Testing Effectiveness | Internal Defect Removal Effectiveness = (No. of defects detected in Internal Testing) / (No. of defects detected in Internal Testing + No. of UAT defects) * 100 | 95% | 100% | 90%

9.2.1 Metrics Collection: Schedule Variance Evaluation
Pre Implementation
Schedule Variance = (Actual Calendar Days - Planned Calendar Days + Start Variance) / (Planned Calendar Days) * 100 (Attached Table 6)

Project# | Planned Calendar Days | Actual Calendar Days | Start Variance | Schedule Variance (%) | Project Goal (%) | Upper Limit (%) | Lower Limit (%) | Root Cause | Suggested Improvements & Preventive Action
Project# A | 47 | 50 | 1 | 8.51% | 5.00% | 10.00% | -10.00% | 1. Estimation is not effective and there is a deficiency in the planning components | 1. Estimation review 2. Introducing a burn-down chart on a daily basis

Post Implementation
Schedule Variance = (Actual Calendar Days - Planned Calendar Days + Start Variance) / (Planned Calendar Days) * 100 (Attached Table 7)
Project# | Planned Calendar Days | Actual Calendar Days | Start Variance | Schedule Variance (%) | Project Goal (%) | Upper Limit (%) | Lower Limit (%)
Project# A (Sprint 1) | 16 | 15 | 0 | -6.25% | 5.00% | 10.00% | -10.00%
Project# A (Sprint 2) | 13 | 13 | 0 | 0.00% | 5.00% | 10.00% | -10.00%
Project# A (Sprint 3) | 18 | 19 | 0 | 5.56% | 5.00% | 10.00% | -10.00%

9.2.2 Metrics Collection: Effort Variance Evaluation
Pre Implementation
Effort Variance = (Actual Effort - Planned Effort) / (Planned Effort) * 100 (Attached Table 8)

Project# | Planned Efforts (P Days) | Actual Efforts (P Days) | Effort Variance (%) | Project Goal (%) | Upper Limit (%) | Lower Limit (%) | Root Cause | Suggested Improvements & Preventive Action
Project# A | 195 | 208 | 6.67% | 0.00% | 5.00% | -5.00% | 1. Estimation is not effective and there is a deficiency in the planning components 2. Rework during execution of test scripts | 1. Estimation review 2. Introducing a burn-down chart on a daily basis

Post Implementation
Effort Variance = (Actual Effort - Planned Effort) / (Planned Effort) * 100 (Attached Table 9)
Project# | Planned Efforts (P Days) | Actual Efforts (P Days) | Effort Variance (%) | Project Goal (%) | Upper Limit (%) | Lower Limit (%)
Project# A (Sprint 1) | 65 | 64 | -1.54% | 0.00% | 5.00% | -5.00%
Project# A (Sprint 2) | 55 | 55 | 0.00% | 0.00% | 5.00% | -5.00%
Project# A (Sprint 3) | 75 | 73 | -2.67% | 0.00% | 5.00% | -5.00%

9.2.3 Metrics Collection: Test Case Coverage Evaluation
Pre Implementation
Test Case Coverage = (No. of Test Cases Originally Written) / (No. of Test Cases Originally Written + No. of Test Cases Added During Execution) (Attached Table 10)

Project# | # of Test Cases Originally Written | # of Test Cases Added During Execution | Total # of Test Cases | Test Coverage % | Project Goal % | Upper Limit % | Lower Limit % | Root Cause | Suggested Improvements & Preventive Action
Project# A | 55 | 7 | 62 | 88.71% | 95.00% | 100.00% | 85.00% | 1. Inadequate / ambiguous requirements 2. Productivity of the team is low due to new technologies; the resources are new to the project and the domain is complex | 1. Non-functional requirements should be captured during the requirements study 2. Training/knowledge program
Post Implementation
Test Case Coverage = (No. of Test Cases Originally Written) / (No. of Test Cases Originally Written + No. of Test Cases Added During Execution) (Attached Table 11)

Project# | # of Test Cases Originally Written | # of Test Cases Added During Execution | Total # of Test Cases | Test Coverage % | Project Goal % | Upper Limit % | Lower Limit %
Project# A (Sprint 1) | 15 | 1 | 16 | 93.75% | 95.00% | 100.00% | 85.00%
Project# A (Sprint 2) | 18 | 1 | 19 | 94.74% | 95.00% | 100.00% | 85.00%
Project# A (Sprint 3) | 22 | 1 | 23 | 95.65% | 95.00% | 100.00% | 85.00%

9.2.4 Metrics Collection: Productivity in Test Script Preparation Evaluation
Pre Implementation
Productivity in Test Script Preparation = Actual No. of Test Scripts / Actual effort spent on test script preparation (Attached Table 12)

Project# | Test Script Size | Test Script Preparation Efforts in Person Hours | Productivity in Test Scripts per Hour | Project Goal (TS per Hour) | Upper Limit (TS per Hour) | Lower Limit (TS per Hour) | Root Cause | Suggested Improvements & Preventive Action
Project# A | 62 | 17 | 5.41 | 4 | 6 | 3 | 1. Inadequate/ambiguous requirements on security features | 1. Non-functional requirements should be captured during the requirements study

Post Implementation
Productivity in Test Case Preparation = Actual No. of Test Cases / Actual effort spent on test case preparation (Attached Table 13)
Project# | Test Case Size | Test Case Preparation Efforts in Person Hours | Productivity in Test Cases per Hour | Project Goal (TC per Hour) | Upper Limit (TC per Hour) | Lower Limit (TC per Hour)
Project# A (Sprint 1) | 17 | 4 | 4.25 | 5 | 6 | 3
Project# A (Sprint 2) | 20 | 5 | 4 | 5 | 6 | 3
Project# A (Sprint 3) | 25 | 6 | 4.17 | 5 | 6 | 3

Productivity in Test Case Execution Evaluation
Pre Implementation
Productivity in Test Case Execution = Actual No. of Test Cases (planned + ad hoc) / Actual effort spent on testing (Attached Table 14)

Project# | Number of Test Scripts | Execution Efforts in Person Hours | Productivity in Test Cases per Hour | Project Goal (TC per Hour) | Upper Limit (TC per Hour) | Lower Limit (TC per Hour) | Root Cause | Suggested Improvements & Preventive Action
Project# A | 62 | 10 | 6.2 | 9 | 10 | 7 | 1. Inadequate test environment set-up for testing due to lack of test planning 2. Productivity of the team is low due to evolving functionality during the construction and testing phases 3. Productivity of the team is low due to new technologies; the resources are new to the project and the domain is complex | 1. Implementation of the scrum testing methodology, including test requirements and test preparation 2. FSD to be frozen and signed off by the customer in the sprint before the design phase 3. Training/knowledge upgradation program

Post Implementation
Productivity in Test Case Execution = Actual No. of Test Cases (planned + ad hoc) / Actual effort spent on testing (Attached Table 15) (for sprint-wise no. of test cases, please refer to Table 4)

Project# | Number of Test Cases | Execution Efforts in Person Hours | Productivity in Test Cases per Hour | Project Goal (TC per Hour) | Upper Limit (TC per Hour) | Lower Limit (TC per Hour)
Project# A (Sprint 1) | 17 | 2 | 8.5 | 6 | 8 | 4
Project# A (Sprint 2) | 20 | 3.0 | 6.6 | 6 | 8 | 4
Project# A (Sprint 3) | 25 | 4 | 6.25 | 6 | 8 | 4

9.2.5 Metrics Collection: Productivity in Defect Detection (Defects/PD) Evaluation
Pre Implementation
Productivity in Defect Detection = Actual No. of Defects in Testing / Actual effort spent in Testing (Attached Table 16)

Project# | # of Total Defects in Testing | Effort in Person Days in Testing | Defect Detection Productivity (Defects per Person Day) | Project Goal | Upper Limit | Lower Limit | Root Cause | Suggested Improvements & Preventive Action
Project# A | 211 | 73 | 2.89 | 4 | 5 | 3 | 1. Longer time to set up the environment 2. Faced test data issues during execution 3. Run-time script maintenance | 1. Use proper guidelines for the framework, which will help in ease of maintenance 2. Review the automation code on a regular basis to avoid run-time maintenance
Post Implementation
Productivity in Defect Detection = Actual No. of Defects (Review + Testing) / Actual effort spent on (Review + Testing) (Attached Table 17)

Project# | # of Total Defects in Testing | Effort in Person Days in Testing | Defect Detection Productivity (Defects per Person Day) | Project Goal | Upper Limit | Lower Limit
Project# A (Sprint 1) | 72 | 19 | 3.79 | 4 | 5 | 3
Project# A (Sprint 2) | 55 | 15 | 3.67 | 4 | 5 | 3
Project# A (Sprint 3) | 84 | 20 | 4.2 | 4 | 5 | 3

9.2.6 Metrics Collection: Testing Effectiveness Evaluation
Pre Implementation
Internal Defect Removal Effectiveness = (No. of defects detected in Internal Testing) / (No. of defects detected in Internal Testing + No. of UAT defects) * 100 (Attached Table 20)

Project# | Regression Testing | UAT | Internal Defect Removal Effectiveness | Project Goal (%) | Upper Limit (%) | Lower Limit (%) | Root Cause | Suggested Improvements & Preventive Action
Project# A | 221 | 29 | 88.40% | 95.00% | 100.00% | 90.00% | Lack of process | Introduced daily stand-up meetings in the sprint

Post Implementation
Internal Defect Removal Effectiveness = (No. of defects detected in Internal Testing) / (No. of defects detected in Internal Testing + No. of UAT defects) * 100 (Attached Table 21)

Project# | Regression Testing | UAT | Internal Defect Removal Effectiveness | Project Goal (%) | Upper Limit (%) | Lower Limit (%)
Project# A (Sprint 1) | 18 | 1 | 94.74% | 95.00% | 100.00% | 90.00%
Project# A (Sprint 2) | 25 | 0 | 100.00% | 95.00% | 100.00% | 90.00%
Project# A (Sprint 3) | 35 | 2 | 94.59% | 95.00% | 100.00% | 90.00%

Conclusion: After implementation and analysis of the above metrics, the conclusion is summarized in the matrix below:

Keys | Project# A Pre Implementation | Project# A Post Implementation
Schedule Variance (%) | 8.51% | -0.23%
Effort Variance (%) | 6.67% | -1.40%
Test Case Coverage (%) | 88.71% | 94.71%
Productivity in Test Case Preparation | 5.41 | 4.14
Productivity in Test Case Execution | 6.2 | 7.11
Productivity in Defect Detection (Defects/PD) | 2.89 | 3.89
Testing Effectiveness | 94.40% | 96.56%

[Chart: comparison of pre- vs post-implementation metric values for Project A]
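The variance, coverage, and effectiveness figures in Section 9.2 are simple ratios. The self-contained sketch below (method and class names are my own, not part of the project code) re-derives a few of the pre-implementation table values from the formulas above:

```java
public class Main {
    // Schedule Variance (%) = (Actual - Planned + Start Variance) / Planned * 100
    static double scheduleVariance(int actualDays, int plannedDays, int startVariance) {
        return 100.0 * (actualDays - plannedDays + startVariance) / plannedDays;
    }

    // Effort Variance (%) = (Actual Effort - Planned Effort) / Planned Effort * 100
    static double effortVariance(double actualEffort, double plannedEffort) {
        return 100.0 * (actualEffort - plannedEffort) / plannedEffort;
    }

    // Coverage (%) = originally written / (originally written + added during execution) * 100
    static double testCaseCoverage(int written, int addedDuringExecution) {
        return 100.0 * written / (written + addedDuringExecution);
    }

    // Effectiveness (%) = internal defects / (internal defects + UAT defects) * 100
    static double defectRemovalEffectiveness(int internalDefects, int uatDefects) {
        return 100.0 * internalDefects / (internalDefects + uatDefects);
    }

    public static void main(String[] args) {
        // Project A, pre-implementation: 50 actual vs 47 planned days, start variance 1
        System.out.printf("Schedule variance: %.2f%%%n", scheduleVariance(50, 47, 1));
        // Project A, pre-implementation: 208 actual vs 195 planned person days
        System.out.printf("Effort variance: %.2f%%%n", effortVariance(208, 195));
        // Project A, pre-implementation: 55 test cases written, 7 added during execution
        System.out.printf("Test case coverage: %.2f%%%n", testCaseCoverage(55, 7));
        // Sprint 2, post-implementation: 25 internal defects, 0 UAT defects
        System.out.printf("Defect removal effectiveness: %.2f%%%n",
                          defectRemovalEffectiveness(25, 0));
    }
}
```

Running this prints 8.51%, 6.67%, 88.71%, and 100.00%, matching the corresponding cells in the tables above.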
The above table shows the following comparisons:
1. Schedule variance decreased after implementing the agile approach in Project A.
2. Effort variance decreased after implementing the agile approach in Project A.
3. Test case coverage increased after implementing the agile approach in Project A.
4. Productivity in test case preparation increased after implementing the agile approach in Project A.
5. Productivity in test case execution increased after implementing the agile approach in Project A.
6. Defect detection increased after implementing the agile approach in Project A.
7. Testing effectiveness increased after implementing the agile approach in Project A.
Chapter 10 Conclusion
The above comparison of pre- and post-implementation mobile automation testing results, between the pre-implementation Project A and the sprint-based post-implementation Project A, covers the different parameters mentioned in the rows of the table. After implementation of the sprint projects, it clearly shows how the sprints improved the testing results in an effective manner and reduced defect leakage. Still, many regression issues were found in the testing phase. The following are some of these issues, stated in a generic way:
1. The Android version changes continuously, so we face configuration issues.
2. Framework maintenance is required regularly.
3. There were timeout errors for some operations during mobile app testing.
4. Some performance issues were also found during testing.
Apart from these, there are other issues which we are expecting; once the approach has been adopted by the whole team, we will be able to find more issues, and we are using it to make the system better. In this dissertation, an effort is made to assess project quality by monitoring the defect trend for the mobile application and to minimize the human effort involved in mobile application testing. It has been observed that the scrum software testing methodology supports quickly adapting a project to changing business realities, so the scrum methodology was used in the software project, and it proved successful for a long-term project. Regression testing, which is done in every release, is time consuming; it was chosen first, and then scrum was implemented, which reduced a significant amount of the team's time and is more effective.
Chapter 11 Recommendations and Future Works
This methodology can be used from different perspectives for different operating system simulators. A user just needs to apply the same methodology to any upcoming OS with any new technology. The methodology will be intelligent enough to evaluate quality and the added features in a new application. Mobile application usage keeps increasing in the industry, and a similar trend has been observed for telecommunication service providers. Since at the back end there are many operating systems that work together, the correct functioning of mobile apps is important not only for better functioning of the system but also for collaborations with external partners. Some points for future work are as follows:
 Testing support for emulators/simulators.
 The automation framework can be modified for future projects.
 The mobile automation framework can be extended to web and hybrid mobile applications.
 Compatibility testing for mobile applications can be achieved using this framework.
 Network/localization testing can also be achieved using a cloud-based mobile solution.
Appendix I
The scrum methodology has different features from different perspectives; a few of them are mentioned below:

Features of Scrum from the Client Perspective
Ultimately, testers have to deliver the product to satisfy the customer. Too many projects get caught up in the overhead of administering the project, delays in shipping because of poorly written requirements, and subpar products that only meet minimal requirements (if that). These are the features of Scrum that will help you create "wow" moments for our clients:
• Delight our clients by building exactly what they want
• The team is able to quickly deliver the most important features first
• Support our business partners by delivering value in short cycles
• Scrum is priority driven; what the team is working on adapts to meet the current business need
• Change is embraced, in order to better meet true business needs

Features of Scrum from the Organization Perspective
Satisfied customers who demand our products are the lifeblood of our business. A repeatable process that delivers products which fuel this demand is like having a goose that lays golden eggs. In this sense, these features of Scrum will fit nicely into our organization:
• Builds continuous innovation into our organization
• Creates order out of chaos
• Scrum is a cultural shift

Features of Scrum from the Management Perspective
As important as the organization and customer perspectives are, a new process doesn't make sense if it creates a management nightmare. Fortunately, Scrum features some important characteristics that shift the management paradigm in a positive manner.
• Control shifts from management to the team; the team is now responsible for the working product
• Helps everyone understand how much work the team can do in a given timeframe
• Creates an environment where the team can manage themselves

Features of Scrum from the Product Perspective
In the Agile methodology, products simply move along the testing pipeline faster. The unique nature of the sprint in Scrum's framework ensures that a version of the product is always ready to ship. Scrum's other features with reference to the product are:
• Due to collaboration, our product gets better
• Due to the real-time inspect-and-adapt loop, the team is able to deliver exactly what is needed
• The product becomes more valuable because it does exactly what the client wants it to do
• Scrum provides early feedback
• Scrum supports predictability of our testing process
• There is always a shippable product; in the worst case we revert a single sprint's worth of work
• Each sprint you have a new, stable, shippable product
Features of Scrum from the Team Perspective
And finally, there are features of Scrum that our team will find appealing. With all of the self-managing aspects of Scrum, our team will discover newfound autonomy in the execution of their projects with these features:
• Increased team productivity
• Increased job satisfaction
• Scrum plays to people's strengths, focusing on intrinsic motivation and allowing them to do what they love to do