This document discusses the implementation of an agile methodology for mobile automation testing framework development using Appium. It begins with an introduction and an overview of the project, including the challenges of the current waterfall model. It then discusses agile methodologies and why Scrum was chosen, covers mobile application types and mobile automation tools, and details the development of a mobile automation testing framework using Appium, including the implementation of Scrum practices. It concludes with a discussion of results and recommendations.
Implementation of agile methodology in mobile automation testing
1. Implementation of Agile Methodology in
Mobile Automation Testing
(Software Quality Management)
2. Table of Contents:
Chapter 1: Introduction
Chapter 2: Project Overview
Chapter 3: Agile Methodologies
Chapter 4: Mobile Applications
Chapter 5: Mobile Automation Tools Assessment
Chapter 6: Mobile Automation Testing Framework
Chapter 7: Implementation of the Mobile Automation Testing Framework Using Appium
Chapter 8: Project Details and Implementation
Chapter 9: Results
Chapter 10: Conclusion
Chapter 11: Recommendations and Future Work
3. Abstract
Nowadays, the IT industry is moving from desktop-based applications to mobile-based
applications. Mobile applications are more complex than ever before because of the
variety of platforms and devices, so a culture of quality in all parts of the organization is
essential for success. There are many software testing life cycle (STLC) models, each
with its own relative pros and cons. In the current scenario, the Agile methodology is
widely used in industry and delivers good product quality. It is inspired by an empirical
inspect-and-adapt feedback loop to cope with complexity and risk. That is why the
Scrum method was chosen here for mobile automation testing.
Here we are automating the testing of a mobile application. Currently, almost every
application runs on the mobile platform in some form, i.e. as a native, web, or hybrid
application. The quality of a mobile application is more critical than that of a desktop
application, because once users have a bad experience with an application, they will not
return to it again.
So it is important to test the functionality of these mobile applications, but with each
new release, where many new features are added to the existing system, test
management is a challenge.
A mobile application system has many different types of features. Testing such a large
system requires a lot of time, and testing the various scenarios of a mobile application
manually within a short interval of time is not feasible.
This dissertation deals with implementing the Scrum methodology in automation
framework development using Appium, so that mobile automation is performed in each
release cycle. The best feature would be to automate all possible scenarios, including
those the user might not have thought of during the execution phase of testing.
Broad Academic Area of Work:
Software Quality Management
Keywords: Appium, Agile methodology, Mobile automation testing, Framework
development, Scrum, Software testing life cycle.
4. ACKNOWLEDGEMENT
This project would not have been possible without the support of many people. I
would like to acknowledge and extend my heartfelt gratitude to the following
persons who have made the completion of this project possible.
My Team lead Mr. Satinder Singh for his full support and guidance throughout
this project.
My friend Mr. Shiv Hari Singh for encouraging and supporting me to pursue this
project.
Dr. Rizwan Parveen from BITS Pilani for giving the feedback at various stages
of the Project which acted as a motivation for me to work on the project.
My testing team members who have helped me during various phases of the
project.
Last but not least, I would like to express my love and gratitude to my
beloved family for their understanding and motivation throughout the duration of this
project.
5. LIST OF ABBREVIATIONS USED
TC: Test Case
TS: Test Script
POT: Proof of Testing
ENV: Environment
SDLC: Software Development Life Cycle
JDK: Java Development Kit
SDK: Software Development Kit
STLC: Software Testing Life Cycle
PBI: Product Backlog Item
UAT: User Acceptance Testing
XP: Extreme Programming
DSDM: Dynamic Systems Development Method
LD: Lean Development
POM: Page Object Model
ADB: Android Debug Bridge
JRE: Java Runtime Environment
AVD: Android Virtual Device
6. Chapter 1 Introduction
1.1 Organizational Introduction:
XXXXX is an Indian multinational IT services company, headquartered in Noida,
Uttar Pradesh, India. It is a subsidiary of XXX Enterprise. Originally a research
and development division of XXX, it emerged as an independent company in 1991
when XXX ventured into the software services business. XXX Technologies offers
services including IT consulting, enterprise transformation, remote infrastructure
management, engineering and R&D, and business process outsourcing (BPO).
1.2 About the Product
It is a transportation and logistics mobile application used by drivers for loading/unloading
shipments and status updates.
<<Detailed product information not included due to company policies >>
1.3 Architecture
Figure 1: Technical architecture of our product
<<Removed the product details due to company policies >>
7. Chapter 2 Project Overview
2.1 Current mode of operation:
Currently, in most of its automation projects our organization uses the waterfall model for
the STLC.
In the waterfall model, once an application is in the testing stage it is very difficult to go back
and change something that was not well thought out in the concept stage.
Changes in any requirement are very difficult to manage.
Only once manual testing is completed does automation framework development start,
which consumes a lot of time.
Companies use many existing models for developing automation frameworks, based on
the client's requirements and the size of the project. Some models are preferred over others
because of their properties and how well they match the client's needs.
Feature | Waterfall | V-Shaped | Incremental | Spiral | RAD
Requirement specifications | Beginning | Beginning | Beginning | Beginning | Time-boxed release
Cost | Low | Expensive | Low | Expensive | Low
Simplicity | Simple | Intermediate | Intermediate | Intermediate | Very simple
Risk involvement | High | Low | Easily manageable | Low | Very low
Expertise | High | Medium | High | High | Medium
Flexibility to change | Difficult | Difficult | Easy | Easy | Easy
User involvement | Only at beginning | At the beginning | Intermediate | High | Only at the beginning
Flexibility | Rigid | Little flexible | Less flexible | Flexible | High
Maintenance | Least | Least | Promotes maintainability | Typical | Easily maintained
Duration | Long | According to project size | Very long | Long | Short
8. 2.2 Problems with the existing solution:
• It is costly and time-consuming.
• If the user wants any change, the tester has to modify the test scripts from scratch, which is difficult.
• In this model, user involvement is only at the beginning.
• It is not suitable for complex or moderately complex projects.
• The model is not flexible.
• Most importantly, it is a time-consuming process.
• Automation framework development faces the same issues as project development in the
traditional approach.
2.3 Suggested solution
Considering the different problems faced during automation framework development
in the existing approach to automation testing, the project includes the following
aspects to provide the user with the necessary model, overcome these problems, and
offer additional advantages.
2.4 Objective
• To describe the agile methodology for mobile automation testing framework
development and generalize it for use in other projects.
• To cover scenarios which are not feasible to test manually (e.g. black-box testing
techniques).
2.5 Expected Features
The solution should bring the Scrum methodology into mobile automation testing using
Appium. The developed framework will contain some generic features which may be used in other
projects. Scrum has some basic features [1], as listed in the appendix.
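One generic feature that almost every Appium framework adopts is the Page Object Model (POM, listed in the abbreviations): each screen of the app is wrapped in a class that hides its locators, so test scripts only call business-level methods. The sketch below is illustrative only; it uses a hypothetical `Driver` interface and a hypothetical `LoginPage` with made-up locator IDs, standing in for the real Appium `AndroidDriver`, so the pattern can be shown without a running Appium server.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical stand-in for the Appium driver interface (assumption,
// not the real Appium API), so the pattern runs without a device.
interface Driver {
    void type(String locator, String text);
    void tap(String locator);
}

// Page object for a hypothetical login screen: locators live here,
// so test scripts never touch them directly.
class LoginPage {
    private static final String USER_FIELD = "id:username";
    private static final String PASS_FIELD = "id:password";
    private static final String LOGIN_BTN  = "id:loginButton";

    private final Driver driver;

    LoginPage(Driver driver) { this.driver = driver; }

    void login(String user, String pass) {
        driver.type(USER_FIELD, user);
        driver.type(PASS_FIELD, pass);
        driver.tap(LOGIN_BTN);
    }
}

public class PomSketch {
    // Drives the page object against an in-memory fake driver and
    // returns what the "screen" recorded, so the flow can be inspected.
    static Map<String, String> runDemo() {
        Map<String, String> screen = new HashMap<>();
        Driver fake = new Driver() {
            public void type(String loc, String text) { screen.put(loc, text); }
            public void tap(String loc) { screen.put(loc, "tapped"); }
        };
        new LoginPage(fake).login("driver01", "secret");
        return screen;
    }

    public static void main(String[] args) {
        System.out.println(runDemo());
    }
}
```

Because locators are confined to the page object, a UI change in one screen touches one class, which is what makes such a framework reusable across projects and sprints.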
2.6 Scope of the work
The scope of this project is to deal with the challenges that arise during mobile automation testing framework development and to come up with a distinctive development strategy, based on the tools available in the market, that offers a fairly generic solution for all types of mobile application testing.
Chapter 3: Agile Methodologies
3.1 What is Agile?
Agile methodology is an alternative to traditional project management, typically applied in software development and testing. It helps teams respond to unpredictability through incremental, iterative work cadences known as sprints. Agile methodologies are an alternative to waterfall, or traditional sequential, development.
3.2 Why Agile?
Agile methodology provides opportunities to assess the direction of a project throughout its lifecycle. This is achieved through regular cadences of work, known as sprints or iterations, at the end of which teams must present a potentially shippable product increment. By focusing on the repetition of abbreviated work cycles, as well as the functional product they yield, agile methodology is described as "iterative" and "incremental."
The differences between the agile and traditional approaches are shown below in Table 2:
Process factor: Traditional / Agile
• Measure of success: Conformance to plan / Response to change, working software
• Management culture: Command and control / Collaborative
• Requirements and design: Big upfront / Continuous, emergent, just-in-time
• Planning and scheduling: Detailed; estimate time and resources; fix scope / Two levels; time fixed, scope estimated
• Coding and implementation: Code all features in parallel, test later / Code, test and deliver serially
• Test and quality assurance: Big upfront plan, test late after coding / Continuous, concurrent; test early and frequently
3.3 Types of Agile Methodology
There are several agile methodologies [7], shown below in Figure 3:
• Extreme Programming (XP)
• Dynamic Systems Development Method (DSDM)
• Crystal Methods
• Lean Development (LD)
• Kanban
• Scrum
Figure 3: Types of Agile Methodology
3.3.1 Extreme Programming (XP)
• The most popular, and most controversial, of the agile methodologies.
• Aims to deliver high-quality software quickly and continuously.
• It promotes high customer involvement, rapid feedback loops, continuous testing, continuous planning, and close teamwork to deliver working software at very frequent intervals, typically every 1-3 weeks.
• XP is guided by values, principles and practices.
XP values: feedback, simplicity, communication, respect and courage.
The principles driving Extreme Programming are: humanity, mutual benefit, economics, improvement, diversity, self-similarity, opportunity, reflection, redundancy, flow, baby steps, failure, quality, and accepted responsibility.
The 13 practices that guide XP engineering are: whole team, energized work, sit together, informative workspace, slack, stories, pair programming, weekly cycle, quarterly cycle, continuous integration, ten-minute build, incremental design, and test-first programming.
3.3.2 FDD (Feature Driven Development)
FDD consists of five processes:
• Develop an overall model
• Build a feature list
• Plan by feature
• Design by feature
• Build by feature
3.3.3 Lean Development (LD)
• Closely associated with Kanban.
• Its purpose is to visualize the workflow and optimize it.
• It reduces the cycle time for delivering fully completed features.
Lean principles:
• Eliminating waste
• Amplifying learning
• Deciding as late as possible
• Delivering as fast as possible
• Empowering the team
• Building integrity in
Kanban has 3 artefacts:
• The Kanban board
• The work-in-progress (WIP) limit
• The lead time
3.3.4 DSDM (Dynamic Systems Development Method)
A straightforward framework based on best-practice principles for putting a project structure in place:
• Simple
• Extendible
• But not claiming to be the solution for all kinds of projects.
Requirements are prioritized using the MoSCoW rules:
M: Must have
S: Should have if at all possible
C: Could have, but not critical
W: Won't have this time, but potentially later
3.3.5 ASD (Adaptive Software Development)
Focused on the rapid creation and evolution of software systems, ASD replaces the traditional waterfall cycle with a repeating series of speculate, collaborate and learn cycles:
• Speculate: initiation and planning
• Collaborate: concurrent feature development
• Learn: quality review
3.3.6 Crystal
Crystal promotes:
• Early and frequent delivery of working software
• High user involvement
• Adaptability
• Removal of bureaucracy and distractions
3.3.7 Scrum
• Scrum is the most popular way of introducing agility, due to its simplicity and flexibility.
• It is based on iterative development.
• Scrum is a management and control process that cuts through complexity to focus on building software that meets business needs.
3.4 Why the Scrum Methodology
Scrum is an Agile framework with several key features, which are shown in Figure 4 below and explained in detail [3]. These features are why I use Scrum for this project.
3.4.1 Sprint:
The Scrum framework divides product development and testing into iterations known as "sprints", which are time-boxed to a fixed length of 1-4 weeks.
• Every iteration attempts to build a potentially shippable (properly tested) product increment.
• The sprint duration is decided by the team based on their requirements and capabilities.
• Once finalized, the sprint duration should not be modified.
3.4.2 Product Increment: At the end of every sprint, the test team delivers a potentially shippable product that is tested and works well. Since sprints are short, only the most important features are developed first. This also gives customers a product with the basic features, which they can test and provide feedback on.
3.4.3 Product Backlog:
The set of all requirements, broken down into small work items and prioritized into a list, is called the product backlog.
• The product backlog emerges over a period of time.
• Refinement of the product backlog is done during the product backlog refinement meeting.
• Basically, the product backlog holds the requirements for the project, and these requirements may be refined during each sprint.
3.4.4 Sprint Backlog: The sprint backlog contains all the known user stories (or requirements) in the current sprint.
• The requirements with the highest priority are listed first.
• The team pulls the stories into the sprint and works on them collectively.
• The sprint backlog is basically a list of all the requirements that need to be completed during the sprint, ordered by priority.
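The sprint backlog described above is, at its core, a priority-ordered list of user stories. A minimal Java sketch of that data structure follows; the story titles, priority numbers and point values are illustrative only, not taken from the project:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

// Sketch: a sprint backlog modeled as a priority-ordered list of user stories.
public class SprintBacklog {
    static class Story {
        final String title;
        final int priority;   // 1 = highest priority
        final int points;     // story points (relative size estimate)
        Story(String title, int priority, int points) {
            this.title = title;
            this.priority = priority;
            this.points = points;
        }
    }

    // Returns a copy of the backlog sorted so the top-priority story comes first.
    static List<Story> prioritize(List<Story> backlog) {
        List<Story> sorted = new ArrayList<>(backlog);
        sorted.sort(Comparator.comparingInt(s -> s.priority));
        return sorted;
    }

    public static void main(String[] args) {
        List<Story> backlog = new ArrayList<>();
        backlog.add(new Story("Search by keyword", 2, 5));
        backlog.add(new Story("Login", 1, 3));
        backlog.add(new Story("Forgot password", 3, 2));
        // The team pulls stories from the top of the prioritized list.
        System.out.println(prioritize(backlog).get(0).title);
    }
}
```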
3.4.5 User Stories:
• A user story captures the who, what and why of a requirement from the user's perspective.
• Detailed requirements in agile development are captured in the form of user stories, written from the point of view of the user rather than the organization or project.
• Example: As a customer <role>, I want to <action> so that I can <reason or goal>.
• User stories are short, concise statements.
• They are recorded on sticky notes, index cards, etc., so that they can be stuck on walls or tables and rearranged during discussion.
3.4.6 Definition of Done:
The definition of done is a checklist of all exit criteria that must be satisfied before the team can call an item done. A definition of done exists at the user story level, the sprint level and the release level.
3.4.7 Time boxing:
Time boxing is the concept of a fixed time duration in which the team is expected to complete the committed work. Every ceremony in Scrum is time-boxed as per the recommendations given in the Scrum Guide.
3.4.8 Daily Stand-up Meeting:
In the Scrum methodology, teams hold a daily planning meeting called the "Daily Scrum", "Scrum meeting" or "stand-up meeting".
• In this meeting, each team member gives an update on 3 questions to the rest of the team, not specifically to management.
• These questions are: What did I accomplish yesterday? What will I do today? What is stopping me from proceeding? This increases the visibility of the tasks to everyone in the team.
• The meeting can also be used to raise any potential impediments that block the team from accomplishing the sprint goal.
• These meetings are not expected to last more than 15 minutes and are held at the same time and place every day.
• A task board may be installed near the team's physical location, where everyone can see the tasks moving from one column to the next.
3.5 The 12 Agile Principles
The Agile Manifesto has 12 principles, which are listed in Table 3 below:
1. Our highest priority is to satisfy the customer through early and continuous delivery of valuable software.
2. Welcome changing requirements, even late in development. Agile processes harness change for the customer's competitive advantage.
3. Deliver working software frequently, from a couple of weeks to a couple of months, with a preference to the shorter timescale.
4. Business people and developers must work together daily throughout the project.
5. Build projects around motivated individuals. Give them the environment and support they need, and trust them to get the job done.
6. The most efficient and effective method of conveying information to and within a development team is face-to-face conversation.
7. Working software is the primary measure of progress.
8. Agile processes promote sustainable development. The sponsors, developers, and users should be able to maintain a constant pace indefinitely.
9. Continuous attention to technical excellence and good design enhances agility.
10. Simplicity--the art of maximizing the amount of work not done--is essential.
11. The best architectures, requirements, and designs emerge from self-organizing teams.
12. At regular intervals, the team reflects on how to become more effective, then tunes and adjusts its behaviour accordingly.
Chapter 4: Mobile Applications
4.1 Types of Mobile Applications:
4.1.1 Native application:
Native applications are platform specific, and they are developed in a platform-specific language. Native apps live on the device and are accessed through icons on the device home screen. They are installed through an application store (such as Google Play or Apple's App Store). They can take full advantage of all the device features: the camera, the GPS, the accelerometer, the compass, the list of contacts, and so on.
4.1.2 Web Application:
Web apps are not real applications; they are really websites that, in many ways, look
and feel like native applications, but are not implemented as such. They are run by a
browser and typically written in HTML5. Users first access them as they would access
any web page: they navigate to a special URL and then have the option of “installing”
them on their home screen by creating a bookmark to that page.
4.1.3 Hybrid application:
These applications are a combination of native and web applications. They are developed in HTML5 and wrapped in a native shell. These applications are also downloaded from an app store and installed on the device. Example: Flipkart.
4.2 Difference between Mobile and Desktop Application Testing:
A few obvious aspects set mobile app testing apart from desktop testing:
• On a desktop, the application is tested on a single central machine. On mobile, the application is tested on handsets from vendors such as Samsung, Nokia, Apple and HTC.
• Mobile device screens are smaller than desktop screens.
• Mobile devices have less memory than desktops.
• Mobiles use network connections such as 2G, 3G, 4G or Wi-Fi, whereas desktops use broadband or dial-up connections.
4.3 Types of Mobile App Testing:
To address all the above technical aspects, the following types of testing are performed on mobile applications.
• Usability testing – to make sure that the mobile app is easy to use and provides a satisfactory user experience to the customers.
• Compatibility testing – testing the application on different mobile devices, browsers, screen sizes and OS versions according to the requirements.
• Interface testing – testing the menu options, buttons, bookmarks, history, settings, and navigation flow of the application.
• Services testing – testing the services of the application both online and offline.
• Low-level resource testing – testing memory usage, auto-deletion of temporary files, and local database growth issues.
• Performance testing – testing the performance of the application while changing the connection from 2G/3G to Wi-Fi, sharing documents, measuring battery consumption, etc.
• Operational testing – testing backups and the recovery plan if the battery goes down, or for data loss while upgrading the application from the store.
• Installation tests – validating the application by installing/uninstalling it on the devices.
• Security testing – testing whether the application protects the user's data.
4.4 Technical features of Mobile applications:
Each type of app has its advantages and disadvantages; apps are developed as per the requirements. The main features are given below:
Device features. Although web apps can take advantage of some features, native apps
(and the native components of the hybrid apps) have access to the full paraphernalia of
device-specific features, including GPS, camera, gestures, and notifications.
Offline functioning. A native app is best if your app must work when there is no
connectivity. In-browser caching is available in HTML5, but it’s still more limited than
what you can get when you go native.
Discoverability. Web apps win the prize on discoverability. Content is a lot more
discoverable on the web than in an app: When people have a question or an information
need, they go to a search engine, type in their query, and choose a page from the
search results. They do not go to the app store, search for an app, download it, and then
try to find their answer within the app. Although there are app aficionados who may fish
for apps in app stores, most users don’t like installing and maintaining apps (and also
wasting space on their device), and will install an app only if they expect to use it often.
Speed. Native apps win the speed competition. In 2012 Mark Zuckerberg declared that
Facebook’s biggest mistake had been betting on the mobile web and not going native.
Up to that point, the Facebook app had been a hybrid app with an HTML core; in 2012 it
was replaced with a truly native app. Responsiveness is key to usability.
Installation. Installing a native or hybrid app is a hassle for users: They need to be
really motivated to justify the interaction cost. “Installing” a web app involves creating a
bookmark on the home screen; this process, while arguably simpler than downloading a
new app from an app store, is less familiar to users, as people don’t use bookmarks that
much on mobile.
Maintenance. Maintaining a native app can be complicated not only for users, but also
for developers (especially if they have to deal with multiple versions of the same
information on different platforms): Changes have to be packaged in a new version and
placed in the app store. On the other hand, maintaining a web app or a hybrid app is as
simple as maintaining a web page, and it can be done as often or as frequently as
needed.
Platform independence. While different browsers may support different versions of
HTML5, if platform independence is important, you definitely have a better chance of
achieving it with web apps and hybrid apps than with native apps. As discussed before,
at least parts of the code can be reused when creating hybrid or web apps.
Content restrictions, approval process, and fees. Dealing with a third party that
imposes rules on your content and design can be taxing both in terms of time and
money. Native and hybrid apps must pass approval processes and content restrictions
imposed by app stores, whereas the web is free for all. Not surprisingly, the first web
apps came from publications such as Playboy, who wanted to escape Apple’s prudish
content censure. And buying a subscription within an iOS app means that 30% of that
subscription cost goes to Apple, a big dent in the publishers’ budget.
Development cost. It’s arguably cheaper to develop hybrid and web apps, as these
require skills that build up on previous experience with the web. NN/g clients often find
that going fully native is a lot more expensive, as it requires more specialized talent.
But, on the other hand, HTML5 is fairly new, and good knowledge of it, as well as a good
understanding of developing for the mobile web and hybrid apps are also fairly advanced
skills.
User Interface. Last but not least, if one of your priorities is providing a user
experience that is consistent with the operating system and with the majority of the
other apps available on that platform, then native apps are the way to go. That doesn’t
mean that you cannot provide a good mobile user experience with a web app or a hybrid
app — it just means that the graphics and the visuals will not be exactly the same as
those with which users may be already accustomed, and that it will be harder to take
advantage of the mobile strengths and mitigate the mobile limitations.
4.5 Mobile Application Testing Strategy
The Test strategy should make sure that all the quality and performance guidelines are
met. A few pointers in this area:
1. Selection of the devices: Analyze the market and choose the devices that are widely used. (This decision mostly rests with the clients. The client or the app builders consider the popularity of certain devices, as well as the marketing needs of the application, when deciding which handsets to use for testing.)
2. Emulators: These are extremely useful in the initial stages of testing, as they allow quick and efficient checking of the app. An emulator is a system that runs software from one environment in another environment without changing the software itself. It duplicates the features of, and works like, the real system.
3. Types of mobile emulators:
Device emulator – provided by device manufacturers.
Browser emulator – simulates mobile browser environments.
Operating system emulator – Apple provides emulators for iPhones, Microsoft for Windows phones, and Google for Android phones.
A few free and easy-to-use mobile device emulators are iPhone Tester, Mobile Phone Emulator, Responsivepx and the Android 4.2.2 emulator.
Chapter 5: Mobile Automation Tools Assessment
5.1 Tool selection:
There are various automation tools available in the market for mobile automation testing, each with its own technique for automating an application. These tools fall into two categories: image-based and object-based. Image-based tools use screen coordinates to identify objects, which makes the automation device-specific; the developed automation code can run only on the same device/emulator. Object-based tools, on the other hand, use an object's properties to identify it, so scripts developed with these tools can run on any device/emulator. Based on the features of the available tools, I created the comparison matrix shown below:
Tool comparison (Y = supported, – = not supported):
• Robotium – Open source; Native: Y, Web: –, Hybrid: Y; Android: Y, iOS: –, Windows: –, BlackBerry: –; Library
• Sikuli – Open source; Native/Web/Hybrid: image based; Android: Y, iOS: Y, Windows: Y, BlackBerry: Y; Both
• Selenium WebDriver – Open source; Native: –, Web: Y, Hybrid: –; Android: Y, iOS: Y, Windows: –, BlackBerry: –; Library
• NativeDriver – Open source; Native: Y, Web: –, Hybrid: –; Android: Y, iOS: Y, Windows: –, BlackBerry: –; Library
• Appium – Open source; Native: Y, Web: Y, Hybrid: Y; Android: Y, iOS: Y, Windows: –, BlackBerry: –; Tool
• MonkeyTalk – Open source; Native: Y, Web: Y, Hybrid: Y; Android: Y, iOS: Y, Windows: –, BlackBerry: –; Tool
• SeeTest – Paid; Native: Y, Web: Y, Hybrid: Y; Android: Y, iOS: Y, Windows: Y, BlackBerry: Y; Tool
• M-eux (Jamo Solutions) – Paid; Native: Y, Web: –, Hybrid: Y; Android: Y, iOS: Y, Windows: Y, BlackBerry: Y; Tool
• EggPlant – Paid; Native/Web/Hybrid: image based; Android: Y, iOS: Y, Windows: Y, BlackBerry: Y; Tool
• mAutomate – Paid; Native: Y, Web: Y, Hybrid: Y; Android: Y, iOS: Y, Windows: –, BlackBerry: –; Web based
• Ranorex – Paid; Native: Y, Web: Y, Hybrid: Y; Android: Y, iOS: Y, Windows: –, BlackBerry: –; Tool
As discussed above, image-based automation tools are tied to a specific device or emulator, so using them would not be an effective approach. On the other hand, there are many object-based automation tools, as listed in the table above. These again fall into two categories: paid and open source. In this project, I consider only open-source tools. Appium is the best-suited tool because, compared with the others, it can automate all types of mobile applications on most platforms.
5.2 Appium Introduction
Appium is part of the Selenium 3.0 family of tools. It is a set of software tools, each with a different approach to supporting test automation. The entire suite results in a rich set of testing functions specifically geared to the needs of testing hybrid and native applications. These operations are highly flexible, allowing many options for locating UI elements and comparing expected test results against actual application behaviour.
• Appium is the cross-platform solution for native and hybrid mobile automation.
• It supports iOS, Android and FirefoxOS.
• We don't have to recompile our app or modify it in any way, thanks to the use of standard automation APIs on all platforms.
• We can use any testing framework.
• Language support: Java, C#, Perl, Python and Ruby.
5.3 Appium Architecture
5.4 Technical Specification:
Feature: Appium
• Language support: Java, C#, Ruby, Python, Perl
• Windows (non-browser) application support: No
• App support: Web, native and hybrid
• Environment support: Windows, Mac OS
• Platform support: Android, iOS and FirefoxOS
• Framework: Appium + Selenium 2.0 JAR + Eclipse + TestNG + Android SDK
• Object recognition/storage: UI Automator Viewer
• Software cost: Zero
• Coding experience of engineer: Should be very good
• Script creation time: High for the first time (relatively less after the standards are established)
• Hardware resource (CPU + RAM) consumption: High
• Product support: Open source; no dedicated support
5.5 Technical Requirements
1. Appium server – standalone server which sends the test commands to the connected mobile device/emulator.
2. Selenium WebDriver (Java) – latest version of the open-source automation library.
3. Java – programming language selected for scripting.
4. Eclipse IDE – script development environment.
5. Object identification tool – UI Automator Viewer, which comes with the Android SDK.
6. TestNG – an open-source test automation framework.
7. .NET Framework 4.5 – required by the Appium tool.
8. AutoIt – tool to handle Windows popups.
5.6 Benefits:
• Support for both platforms, iOS and Android.
• Appium is an open-source tool that doesn't cost anything.
• Scalable across standard desktop systems without per-user license costs.
• Support for continuous integration.
• The Appium WebDriver supports many popular programming languages such as Java, C#, PHP, Perl, Ruby, Python, etc.
• Supports many test environments.
• Native XPath support; if your HTML is complicated, deeply nested, or lacks id attributes, this can be very important.
• Doesn't require access to your source code or library: you test the build you will actually ship.
• Support for various frameworks.
5.7 Challenges
• Demands higher technical competency.
• Being open source, Appium has no official technical support.
• No built-in object repository concept.
• Doesn't support image comparison.
• Handling popup/dialog/menu windows is sometimes tricky.
• Barcode scanning can't be automated.
Chapter 6: Mobile Automation Testing Framework
There are mainly two types of framework popular in mobile automation testing.
6.1 Page Object Model (POM):
The Page Object Model is a design pattern for creating an object repository for UI elements. Under this model, each page (or screen) in the application has a corresponding page class. This page class finds the WebElements of that page and also contains page methods which perform operations on those WebElements.
The names of these methods should reflect the tasks they perform.
The main advantage of the Page Object Model is that if the UI changes for any page, we don't have to change any tests; we only need to change the code within the page objects (in one place).
In the Page Object Model, all the functionality/reusable components of each page we want to automate are written in a separate class; for example, we might have classes for a Home page, a Login page, a Create Account page and a Forgot Password page.
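A minimal sketch of the pattern follows. To keep it self-contained and runnable, a tiny hand-rolled Driver interface stands in for the real Appium AndroidDriver, and the locator strings and page names are hypothetical; only the structure (locators and actions owned by the page class, tests calling intention-revealing methods) is the point:

```java
import java.util.HashMap;
import java.util.Map;

// Stand-in for the real driver: just enough surface to show the pattern.
interface Driver {
    void type(String locator, String text);
    void tap(String locator);
}

// Records actions so the sketch can be demonstrated without a device.
class FakeDriver implements Driver {
    final Map<String, String> typed = new HashMap<>();
    String lastTapped;
    public void type(String locator, String text) { typed.put(locator, text); }
    public void tap(String locator) { lastTapped = locator; }
}

// The page class owns the locators and exposes task-named methods.
// If the UI changes, only these locators change; tests stay untouched.
class LoginPage {
    private static final String USER_FIELD = "id=username";
    private static final String PASS_FIELD = "id=password";
    private static final String LOGIN_BTN  = "id=login";
    private final Driver driver;
    LoginPage(Driver driver) { this.driver = driver; }
    void loginAs(String user, String pass) {
        driver.type(USER_FIELD, user);
        driver.type(PASS_FIELD, pass);
        driver.tap(LOGIN_BTN);
    }
}

public class PomSketch {
    public static void main(String[] args) {
        FakeDriver d = new FakeDriver();
        new LoginPage(d).loginAs("alice", "secret");
        System.out.println(d.lastTapped);
    }
}
```

In a real Appium framework, the Driver stand-in would be an AndroidDriver and the locator strings would be By/MobileBy locators, but the division of responsibility is identical.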
6.2 Page Factory Model:
Page Factory is an inbuilt page object model facility for Appium, but it is much more optimized.
Here as well we follow the concept of separating the page object repository from the test methods. Additionally, with the help of the PageFactory class, we use the @FindBy annotation to locate WebElements, and the initElements method to initialize them.
@FindBy can accept tagName, partialLinkText, name, linkText, id, css, className and xpath as attributes.
I will explain more about these frameworks in the implementation chapters.
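To show how @FindBy and initElements cooperate, here is a self-contained miniature re-implementation of the idea. The @FindBy annotation and MiniPageFactory below are hand-rolled stand-ins, not the real Selenium/Appium classes (the real initElements injects lazy element proxies rather than locator strings), but the reflection mechanism is the same in spirit:

```java
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.reflect.Field;

// Stand-in annotation mirroring the real @FindBy(id = "...") usage.
@Retention(RetentionPolicy.RUNTIME)
@interface FindBy { String id(); }

// Miniature PageFactory: scans the page object's fields and injects a
// locator for each @FindBy annotation, the way initElements wires proxies.
class MiniPageFactory {
    static void initElements(Object page) {
        try {
            for (Field f : page.getClass().getDeclaredFields()) {
                FindBy fb = f.getAnnotation(FindBy.class);
                if (fb != null) {
                    f.setAccessible(true);
                    f.set(page, "id=" + fb.id());
                }
            }
        } catch (IllegalAccessException e) {
            throw new RuntimeException(e);
        }
    }
}

// Page object: fields are declared with locators; no lookup code needed.
class FactoryLoginPage {
    @FindBy(id = "username") String userField;
    @FindBy(id = "login")    String loginButton;
}

public class PageFactorySketch {
    public static void main(String[] args) {
        FactoryLoginPage page = new FactoryLoginPage();
        MiniPageFactory.initElements(page);
        System.out.println(page.loginButton);
    }
}
```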
Chapter 7: Implementation of the Mobile Automation Testing Framework Using Appium
7.1 Environment setup
Step 1
Download and install the JDK from
http://www.oracle.com/technetwork/java/javase/downloads/jdk8-downloads-2133151.html
Validate the installed Java.
Set up the environment variables for Java as:
JAVA_HOME = C:\Program Files\Java\jdk1.8.0_101
JRE_HOME = C:\Program Files\Java\jre1.8.0_101
Edit Path: append ;C:\Program Files\Java\jdk1.8.0_101\bin;
Step 2
Download the Android SDK from https://developer.android.com/studio/index.html and install it.
Edit Path: append ;C:\Users\user\AppData\Local\Android\sdk\platform-tools;C:\Users\user\AppData\Local\Android\sdk\tools
Step 3: Validate that the AVD (Android Virtual Device) manager has been installed with the Android SDK.
Navigate to C:\Users\user\AppData\Local\Android\sdk.
There will be an icon named AVD Manager; click on it.
Validate that the UI Automator Viewer has also been installed.
Run the command uiautomatorviewer in a command prompt.
This tool will help us identify the object properties of the native application.
Step 4:
Download and install the .NET Framework 4.5 from https://www.microsoft.com/en-in/download/details.aspx?id=30653
This is essential because the Appium server for Windows is built on .NET.
Step 5: Download and install Node.js from
https://nodejs.org/en/download/
Step 6:
Download and install Appium from http://appium.io/downloads.html
Step 7: Download the Java IDE (Eclipse) from
http://www.eclipse.org/downloads/download.php?file=/technology/epp/downloads/release/mars/R/eclipse-jee-mars-R-win32-x86_64.zip
Step 8:
Download the Selenium and Appium JAR files:
1. java-client-4.1.2 from:
https://mvnrepository.com/artifact/io.appium/java-client/4.1.2
Chapter 8: Project Details and Implementation
8.1 Introduction
The steps below describe the implementation of mobile automation testing using the traditional approach and the agile approach, and a comparison between them.
8.2 Hardware Requirements
Any Windows machine that matches the following criteria:
Processor: Pentium 1 GHz or higher
RAM: 1 GB for x86 / 2 GB for x64
System type: 32-bit or 64-bit operating system
8.3 Software Requirements
.NET Framework 4.5.
Windows XP with SP3 or higher.
8.4 Performing an End-to-End Testing Cycle of a Mobile Application Using Automation
Step 1: Connect the device to the system and validate the connection:
After connecting the device, open a command window and run the command "adb devices"; it will list all the connected devices.
Note: this image is a mirror image of the actual connected device, generated using the Mobizen software.
Step 2: Start the Appium server.
Step 3: Set up the Appium server.
Step 4: Open Eclipse and create the project:
Sample code (the original listing is truncated after the File declaration; the final lines below are the standard continuation that connects to a local Appium server, and the restored path backslashes follow the segmentation implied by the original listing):
package com.framework.test;
import java.io.File;
import java.net.MalformedURLException;
import java.net.URL;
import java.util.concurrent.TimeUnit;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.testng.annotations.Test;
import io.appium.java_client.android.AndroidDriver;
public class Login {
    AndroidDriver driver;
    @Test
    public void testapp() throws MalformedURLException, InterruptedException {
        DesiredCapabilities capability = new DesiredCapabilities();
        capability.setCapability("deviceName", "MOTO G");
        capability.setCapability("platformName", "Android");
        capability.setCapability("platformVersion", "6.0");
        // Path to the application under test
        File app = new File("C:\\Users\\khoiwalk\\workspace\\MatixMobile\\apk\\corpAndroid\\MatrixMobile-qa-debug.apk");
        capability.setCapability("app", app.getAbsolutePath());
        // Connect to the local Appium server and wait for elements implicitly
        driver = new AndroidDriver(new URL("http://127.0.0.1:4723/wd/hub"), capability);
        driver.manage().timeouts().implicitlyWait(30, TimeUnit.SECONDS);
    }
}
Step 5: Start the Appium server.
Step 6: Run the test script using TestNG.
Step 7: Monitor the execution.
Step 8: Validate the test result.
All of the above steps are the same when creating and executing tests for each sprint.
Three sprints were created for project A, and the user stories were divided based on their complexity (complex, medium, simple). These sprints are shown in the table below.
Sprint / User Stories / Story Points / No. of Test Cases
• Project A (Sprint 1): 4 user stories, 7 story points, 16 test cases
• Project A (Sprint 2): 7 user stories, 8 story points, 19 test cases
• Project A (Sprint 3): 10 user stories, 18 story points, 23 test cases
Chapter 9: Results
9.1 Result
As the title of the dissertation says, the aim is to evaluate mobile application automation testing under an agile methodology, estimating project quality by monitoring the defect trend for functional automation testing.
Here, functionality is tested by searching for an item and examining the search results in the mobile application; other functionality is rarely considered.
Mobile testing was conducted in phases, since an iterative model was implemented in which testing ran alongside development. Analysis of a set of data revealed the following issues with the earlier model:
• No formal test planning; testing always started late in the development process, with inadequate resources.
• Review and test effectiveness and efficiency were not known until UAT started.
• Communication barriers among stakeholders.
• The existing defect reporting was neither structured nor effective.
The draft of the test method was given a positive response by the interviewees, but still needed slight changes. The most important areas to introduce today are:
o Define responsibility for testing
o Define the testing lifecycle
o Improvement in the construction process
o A defect reporting system
o Sub-process monitoring for review effectiveness and efficiency
o Test dashboards
Due to the above issues, the new methodology was implemented and measured with a set of metrics for both the earlier model and the new model. The key metrics of the test project measurement template are listed in Table 5, and the analysis compares the pre- and post-implementation evaluations:
9.2 Key Metrics for Testing Project Measurement:
Table 5 lists each metric with its formula, project goal, and upper/lower limits:
1. Schedule Variance (%) = (Actual calendar days - Planned calendar days + Start variance) / Planned calendar days * 100. Goal: 5%; upper limit: 10%; lower limit: -10%.
2. Effort Variance (%) = (Actual effort - Planned effort) / Planned effort * 100. Goal: 0%; upper limit: 5%; lower limit: -5%.
3. Test Script Coverage (%) = No. of test scripts originally written / (No. of test scripts originally written + No. of test scripts added during execution) * 100. Goal: 95%; upper limit: 100%; lower limit: 85%.
4. Productivity in Test Script Preparation = Actual no. of test scripts / Actual effort spent on test script preparation. Goal: 5; upper limit: 6; lower limit: 3.
5. Productivity in Test Case Execution = Actual no. of test cases (planned + ad hoc) / Actual effort spent on testing. Goal: 6; upper limit: 8; lower limit: 4.
6. Productivity in Defect Detection (defects/PD) = Actual no. of defects found in testing / Actual effort spent on testing. Goal: 4; upper limit: 5; lower limit: 3.
7. Review Effectiveness = No. of errors detected in review / (No. of errors detected in review + Total no. of defects detected in testing) * 100. Goal: 70%; upper limit: 90%; lower limit: 60%.
8. Testing Effectiveness (Internal Defect Removal Effectiveness) = No. of defects detected in internal testing / (No. of defects detected in internal testing + No. of UAT defects) * 100. Goal: 95%; upper limit: 100%; lower limit: 90%.
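The variance and coverage formulas in Table 5 are simple enough to sketch directly in Java. The figures used in the demonstration below are the ones reported for project A in this chapter (47 planned vs. 50 actual calendar days with a start variance of 1 pre-implementation; 16 planned vs. 15 actual days in sprint 1):

```java
// Sketch of the schedule variance, effort variance and test script
// coverage metrics defined in Table 5.
public class TestMetrics {

    // Schedule variance (%) = (actual - planned + startVariance) / planned * 100
    static double scheduleVariance(int plannedDays, int actualDays, int startVariance) {
        return (actualDays - plannedDays + startVariance) * 100.0 / plannedDays;
    }

    // Effort variance (%) = (actual - planned) / planned * 100
    static double effortVariance(int plannedEffort, int actualEffort) {
        return (actualEffort - plannedEffort) * 100.0 / plannedEffort;
    }

    // Test script coverage (%) = original / (original + added during execution) * 100
    static double scriptCoverage(int originalScripts, int addedDuringExecution) {
        return originalScripts * 100.0 / (originalScripts + addedDuringExecution);
    }

    public static void main(String[] args) {
        // Pre-implementation project A: 47 planned days, 50 actual, start variance 1
        System.out.printf("Schedule variance (pre): %.2f%%%n", scheduleVariance(47, 50, 1));
        // Post-implementation sprint 1: 16 planned days, 15 actual, no start variance
        System.out.printf("Schedule variance (sprint 1): %.2f%%%n", scheduleVariance(16, 15, 0));
    }
}
```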
9.2.1 Metrics Collection: Schedule Variance Evaluation
Pre-implementation
Schedule variance = (Actual calendar days - Planned calendar days + Start variance) / Planned calendar days * 100 (see Table 6)
• Project A: planned 47 calendar days, actual 50, start variance 1; schedule variance 8.51% (goal 5.00%, upper limit 10.00%, lower limit -10.00%).
• Root cause: estimation was not effective, and there was a deficiency in the planning components.
• Suggested improvements and preventive actions: 1. estimation review; 2. introducing a burn-down chart on a daily basis.
Post-Implementation
Schedule Variance = (Actual Calendar Days - Planned Calendar Days + Start Variance) / Planned Calendar Days * 100 (Table 7)
| Project# | Planned Calendar Days | Actual Calendar Days | Start Variance | Schedule Variance (%) | Project Goal (%) | Upper Limit (%) | Lower Limit (%) |
|---|---|---|---|---|---|---|---|
| Project# A (Sprint 1) | 16 | 15 | 0 | -6.25% | 5.00% | 10.00% | -10.00% |
| Project# A (Sprint 2) | 13 | 13 | 0 | 0.00% | 5.00% | 10.00% | -10.00% |
| Project# A (Sprint 3) | 18 | 19 | 0 | 5.56% | 5.00% | 10.00% | -10.00% |
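The sprint-wise variances above follow directly from the formula; a minimal sketch (the function name is my own, the formula is the one stated above) reproduces the figures in Tables 6 and 7:

```python
def schedule_variance(planned_days, actual_days, start_variance=0):
    """Schedule Variance (%) = (Actual - Planned + Start Variance) / Planned * 100."""
    return (actual_days - planned_days + start_variance) / planned_days * 100

# Sprint-wise figures from Table 7 (post-implementation)
for sprint, planned, actual in [("Sprint 1", 16, 15),
                                ("Sprint 2", 13, 13),
                                ("Sprint 3", 18, 19)]:
    print(f"{sprint}: {schedule_variance(planned, actual):.2f}%")
```

The same call with the pre-implementation inputs (47 planned days, 50 actual, start variance 1) yields the 8.51% shown in Table 6.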
9.2.2 Metrics Collection: Effort Variance Evaluation
Pre-Implementation
Effort Variance = (Actual Effort - Planned Effort) / Planned Effort * 100 (Table 8)
| Project# | Planned Efforts (P Days) | Actual Efforts (P Days) | Effort Variance (%) | Project Goal (%) | Upper Limit (%) | Lower Limit (%) | Root Cause | Suggested Improvements & Preventive Action |
|---|---|---|---|---|---|---|---|---|
| Project# A | 195 | 208 | 6.67% | 0.00% | 5.00% | -5.00% | 1. Estimation is not effective and there is a deficiency in the planning components. 2. Rework during execution of test scripts | 1. Estimation review. 2. Introducing a daily burn-down chart |
Post-Implementation
Effort Variance = (Actual Effort - Planned Effort) / Planned Effort * 100 (Table 9)
| Project# | Planned Efforts (P Days) | Actual Efforts (P Days) | Effort Variance (%) | Project Goal (%) | Upper Limit (%) | Lower Limit (%) |
|---|---|---|---|---|---|---|
| Project# A (Sprint 1) | 65 | 64 | -1.54% | 0.00% | 5.00% | -5.00% |
| Project# A (Sprint 2) | 55 | 55 | 0.00% | 0.00% | 5.00% | -5.00% |
| Project# A (Sprint 3) | 75 | 73 | -2.67% | 0.00% | 5.00% | -5.00% |
9.2.3 Metrics Collection: Test Case Coverage Evaluation
Pre-Implementation
Test Case Coverage = No. of Test Cases Originally Written / (No. of Test Cases Originally Written + No. of Test Cases Added During Execution) * 100 (Table 10)
| Project# | # of Test Cases Originally Written | # of Test Cases Added During Execution | Total # of Test Cases | Test Coverage % | Project Goal % | Upper Limit % | Lower Limit % | Root Cause | Suggested Improvements & Preventive Action |
|---|---|---|---|---|---|---|---|---|---|
| Project# A | 55 | 7 | 62 | 88.71% | 95.00% | 100.00% | 85.00% | 1. Inadequate/ambiguous requirements. 2. Productivity of the team is low due to new technologies; the resources are new to the project, and the domain is complex | 1. Non-functional requirements should be captured during the requirements study. 2. Training/knowledge program |
Post-Implementation
Test Case Coverage = No. of Test Cases Originally Written / (No. of Test Cases Originally Written + No. of Test Cases Added During Execution) * 100 (Table 11)
| Project# | # of Test Cases Originally Written | # of Test Cases Added During Execution | Total # of Test Cases | Test Coverage % | Project Goal % | Upper Limit % | Lower Limit % |
|---|---|---|---|---|---|---|---|
| Project# A (Sprint 1) | 15 | 1 | 16 | 93.75% | 95.00% | 100.00% | 85.00% |
| Project# A (Sprint 2) | 18 | 1 | 19 | 94.74% | 95.00% | 100.00% | 85.00% |
| Project# A (Sprint 3) | 22 | 1 | 23 | 95.65% | 95.00% | 100.00% | 85.00% |
9.2.4 Metrics Collection: Productivity in Test Script Preparation Evaluation
Pre-Implementation
Productivity in Test Script Preparation = Actual No. of Test Scripts / Actual Effort Spent on Test Script Preparation (Table 12)
| Project# | Test Script Size | Test Script Preparation Efforts in Person-Hours | Productivity in Test Scripts per Hour | Project Goal (TS per Hour) | Upper Limit (TS per Hour) | Lower Limit (TS per Hour) | Root Cause | Suggested Improvements & Preventive Action |
|---|---|---|---|---|---|---|---|---|
| Project# A | 62 | 17 | 5.41 | 4 | 6 | 3 | 1. Inadequate/ambiguous requirements on security features | 1. Non-functional requirements should be captured during the requirements study |
Post-Implementation
Productivity in Test Case Preparation = Actual No. of Test Cases / Actual Effort Spent on Test Case Preparation (Table 13)
| Project# | Test Case Size | Test Case Preparation Efforts in Person-Hours | Productivity in Test Cases per Hour | Project Goal (TC per Hour) | Upper Limit (TC per Hour) | Lower Limit (TC per Hour) |
|---|---|---|---|---|---|---|
| Project# A (Sprint 1) | 17 | 4 | 4.25 | 5 | 6 | 3 |
| Project# A (Sprint 2) | 20 | 5 | 4 | 5 | 6 | 3 |
| Project# A (Sprint 3) | 25 | 6 | 4.17 | 5 | 6 | 3 |
Productivity in Test Case Execution Evaluation
Pre-Implementation
Productivity in Test Case Execution = Actual No. of Test Cases (Planned + Ad Hoc) / Actual Effort Spent on Testing (Table 14)
| Project# | Number of Test Scripts | Execution Efforts in Person-Hours | Productivity in Test Cases per Hour | Project Goal (TC per Hour) | Upper Limit (TC per Hour) | Lower Limit (TC per Hour) | Root Cause | Suggested Improvements & Preventive Action |
|---|---|---|---|---|---|---|---|---|
| Project# A | 62 | 10 | 6.2 | 9 | 10 | 7 | 1. Inadequate test environment set-up for testing due to lack of test planning. 2. Productivity of the team is low due to evolving functionality during the construction and testing phases. 3. Productivity of the team is low due to new technologies; the resources are new to the project, and the domain is complex | 1. Implementation of the Scrum testing methodology, including test requirements and test preparation. 2. FSD to be frozen and signed off by the customer in the sprint before the design phase. 3. Training/knowledge upgradation program |
Post-Implementation
Productivity in Test Case Execution = Actual No. of Test Cases (Planned + Ad Hoc) / Actual Effort Spent on Testing (Table 15; for sprint-wise numbers of test cases, see Table 4)
| Project# | Number of Test Cases | Execution Efforts in Person-Hours | Productivity in Test Cases per Hour | Project Goal (TC per Hour) | Upper Limit (TC per Hour) | Lower Limit (TC per Hour) |
|---|---|---|---|---|---|---|
| Project# A (Sprint 1) | 17 | 2 | 8.5 | 6 | 8 | 4 |
| Project# A (Sprint 2) | 20 | 3.0 | 6.6 | 6 | 8 | 4 |
| Project# A (Sprint 3) | 25 | 4 | 6.25 | 6 | 8 | 4 |
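Execution productivity is the same kind of ratio; a short sketch (illustrative naming, formula as stated above) computes test cases executed per person-hour from the Table 15 inputs:

```python
def execution_productivity(test_cases, effort_hours):
    """Test cases executed (planned + ad hoc) per person-hour of testing effort."""
    return test_cases / effort_hours

# Sprint-wise inputs from Table 15 (post-implementation)
for sprint, cases, hours in [("Sprint 1", 17, 2),
                             ("Sprint 2", 20, 3.0),
                             ("Sprint 3", 25, 4)]:
    print(f"{sprint}: {execution_productivity(cases, hours):.2f} TC/hour")
```

Values here use standard rounding to two decimals, so they may differ from the table figures in the last decimal place.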
9.2.5 Metrics Collection: Productivity in Defect Detection (Defects/PD) Evaluation
Pre-Implementation
Productivity in Defect Detection = Actual No. of Defects in Testing / Actual Effort Spent on Testing (Table 16)
| Project# | # of Total Defects in Testing | Effort in Person-Days in Testing | Defect Detection Productivity (Defects per Person-Day) | Project Goal (Defects per Person-Day) | Upper Limit (Defects per Person-Day) | Lower Limit (Defects per Person-Day) | Root Cause | Suggested Improvements & Preventive Action |
|---|---|---|---|---|---|---|---|---|
| Project# A | 211 | 73 | 2.89 | 4 | 5 | 3 | 1. Longer time to set up the environment. 2. Test data issues faced during execution. 3. Run-time script maintenance | 1. Use proper guidelines for the framework, which will ease maintenance. 2. Review the automation code on a regular basis to avoid run-time maintenance |
Post-Implementation
Productivity in Defect Detection = Actual No. of Defects (Review + Testing) / Actual Effort Spent on Review and Testing (Table 17)
| Project# | # of Total Defects in Testing | Effort in Person-Days in Testing | Defect Detection Productivity (Defects per Person-Day) | Project Goal (Defects per Person-Day) | Upper Limit (Defects per Person-Day) | Lower Limit (Defects per Person-Day) |
|---|---|---|---|---|---|---|
| Project# A (Sprint 1) | 72 | 19 | 3.79 | 4 | 5 | 3 |
| Project# A (Sprint 2) | 55 | 15 | 3.67 | 4 | 5 | 3 |
| Project# A (Sprint 3) | 84 | 20 | 4.2 | 4 | 5 | 3 |
9.2.6 Metrics Collection: Testing Effectiveness Evaluation
Pre-Implementation
Internal Defect Removal Effectiveness = No. of Defects Detected in Internal Testing / (No. of Defects Detected in Internal Testing + No. of UAT Defects) * 100 (Table 20)
| Project# | Regression Testing | UAT | Internal Defect Removal Effectiveness | Project Goal (%) | Upper Limit (%) | Lower Limit (%) | Root Cause | Suggested Improvements & Preventive Action |
|---|---|---|---|---|---|---|---|---|
| Project# A | 221 | 29 | 84.4% | 95.00% | 100.00% | 90.00% | Lack of process | Institute a daily stand-up meeting in the sprint |
Post-Implementation
Internal Defect Removal Effectiveness = No. of Defects Detected in Internal Testing / (No. of Defects Detected in Internal Testing + No. of UAT Defects) * 100 (Table 21)
| Project# | Regression Testing | UAT | Internal Defect Removal Effectiveness | Project Goal (%) | Upper Limit (%) | Lower Limit (%) |
|---|---|---|---|---|---|---|
| Project# A (Sprint 1) | 18 | 1 | 94.73% | 95.00% | 100.00% | 90.00% |
| Project# A (Sprint 2) | 25 | 0 | 100.00% | 95.00% | 100.00% | 90.00% |
| Project# A (Sprint 3) | 35 | 2 | 94.95% | 95.00% | 100.00% | 90.00% |
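A minimal sketch of the effectiveness formula (function name is my own; the formula is the one stated above) computes the sprint-wise values from internal and UAT defect counts:

```python
def internal_defect_removal_effectiveness(internal_defects, uat_defects):
    """IDRE (%) = internal defects / (internal defects + UAT defects) * 100."""
    return internal_defects / (internal_defects + uat_defects) * 100

# Sprint-wise inputs from Table 21 (post-implementation)
for sprint, internal, uat in [("Sprint 1", 18, 1),
                              ("Sprint 2", 25, 0),
                              ("Sprint 3", 35, 2)]:
    print(f"{sprint}: {internal_defect_removal_effectiveness(internal, uat):.2f}%")
```

Note that standard rounding may differ from the table figures in the last decimal place, so the helper is best read as a consistency check rather than an exact reproduction.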
Conclusion:
After implementation and analysis of the above metrics, the comparison is summarized in the matrix below:

| Keys | Project# A Pre-Implementation | Project# A Post-Implementation |
|---|---|---|
| Schedule Variance (%) | 8.51% | -0.23% |
| Effort Variance (%) | 6.67% | -1.40% |
| Test Case Coverage (%) | 88.71% | 94.71% |
| Productivity in Test Case Preparation | 5.41 | 4.14 |
| Productivity in Test Case Execution | 6.2 | 7.11 |
| Productivity in Defect Detection (Defects/PD) | 2.89 | 3.89 |
| Testing Effectiveness | 94.40% | 96.56% |
[Chart: pre- vs. post-implementation metric comparison for Project# A]
The above table shows the following comparisons:
1. Schedule variance decreased after implementing the agile approach in Project A.
2. Effort variance decreased after implementing the agile approach in Project A.
3. Test case coverage increased after implementing the agile approach in Project A.
4. Productivity in test case preparation increased after implementing the agile approach in Project A.
5. Productivity in test case execution increased after implementing the agile approach in Project A.
6. Defect detection increased after implementing the agile approach in Project A.
7. Testing effectiveness increased after implementing the agile approach in Project A.
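The size of each improvement can be quantified with a small helper (illustrative naming; the pre/post values are those of the conclusion matrix):

```python
def relative_change(pre, post):
    """Percentage change from the pre- to the post-implementation value."""
    return (post - pre) / pre * 100

# Defect detection productivity from the conclusion matrix: 2.89 -> 3.89 defects/PD
print(f"{relative_change(2.89, 3.89):.1f}%")  # 34.6% improvement
```

Applied across the matrix, this makes the per-metric gains (and the drop in test case preparation productivity) directly comparable on one scale.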
Chapter 10 Conclusion
The above comparison of pre- and post-implementation mobile automation testing results contrasts the pre-implementation Project A with the sprint-based, post-implementation Project A across the parameters listed in the rows of the table.
After the implementation of the sprint-based projects, it is clear how the sprints improved the testing results in an effective manner and reduced defect leakage.
Still, several regression issues were found in the testing phase. The following are some of these issues, described generically:
1. The Android version changes continuously, causing configuration issues.
2. The framework requires regular maintenance.
3. There were time-out errors for some operations during mobile app testing.
4. Some performance issues were also found during testing.
Apart from these, other issues are expected; once the approach has been adopted by the whole team, more issues can be found and used to make the system better.
In this dissertation, an effort is made to project quality by monitoring the defect trend for the mobile application and to minimize the human effort involved in mobile application testing. It has been observed that the Scrum software testing methodology supports quickly adapting a project to changing business realities; hence the Scrum methodology was used in this software project and proved successful for a long-term project.
Regression testing, which is performed in every release, is time-consuming; it was therefore addressed first, and implementing Scrum around it has saved the team a significant amount of time and is more effective.
Chapter 11 Recommendations and Future Work
This methodology can be applied from different perspectives to different operating-system simulators. A user only needs to follow the same methodology for any upcoming OS or new technology, and the methodology is intended to be capable of evaluating quality and the added features of a new application.
Mobile applications are growing across the industry, and a similar trend has been observed for telecommunication service providers. Since many operating systems work together at the back end, the correct functioning of mobile apps is important not only for the better functioning of the system but also for collaborations with external partners.
Some points for future work are as follows:
• Testing support for emulators/simulators.
• The automation framework can be modified for future projects.
• The mobile automation framework can be extended to web and hybrid mobile applications.
• Compatibility testing for mobile applications can be achieved using this framework.
• Network/localization testing can also be achieved using a cloud-based mobile solution.
Appendix I
The Scrum methodology offers different features from different perspectives; a few of them are mentioned below:
Features of Scrum from the Client Perspective
Ultimately, testers have to deliver a product that satisfies the customer. Too many projects get caught up in the overhead of administering the project, delays in shipping because of poorly written requirements, and subpar products that only meet minimal requirements (if that). These are the features of Scrum that will help create "wow" moments for our clients:
• Delight our clients by building exactly what they want
• The team is able to quickly deliver the most important features first
• Support our business partners by delivering value in short cycles
• Scrum is priority driven; what the team is working on adapts to meet the current business
need
• Change is embraced in order to better meet true business needs
Features of Scrum from the Organization Perspective
Satisfied customers who demand our products are the lifeblood of our business. A repeatable
process that delivers products which fuel this demand is like having a goose that lays the golden egg.
In this sense, these features of Scrum will fit nicely into our organization.
• Builds continuous innovation into our organization
• Creates order out of chaos
• Scrum is a cultural shift
Features of Scrum from the Management Perspective
As important as the organization and customer perspectives are, a new process doesn’t make sense
if it creates a management nightmare. Fortunately, Scrum features some important characteristics
that shift the management paradigm in a positive manner.
• Control shifts from management to the team; the team is now responsible for the working
product
• Helps everyone understand how much work the team can do in a given timeframe
• Creates an environment where the team can manage themselves
Features of Scrum from the Product Perspective
In the Agile methodology, products move along the testing pipeline faster. The unique nature of the sprint in Scrum's framework ensures that a version of the product is always ready to ship. Scrum's other features with reference to the product are:
• Due to collaboration, our product gets better
• Due to the real-time inspect-and-adapt loop, the team is able to deliver exactly what is
needed
• The product becomes more valuable because it does exactly what the client wants it to do
• Scrum provides early feedback
• Scrum supports predictability of our Testing process
• There is always a shippable product; in the worst case we revert a single sprint's worth of testing
• Each sprint you have a new stable shippable product
Features of Scrum from the Team Perspective
And finally, there are features of Scrum that the team will find appealing. With all of the self-managing aspects of Scrum, the team will discover newfound autonomy in the execution of their projects, with these features:
• Increased team productivity
• Increased job satisfaction
• Scrum plays to people's strengths, focusing on intrinsic motivation and allowing them to do what they love to do