With the rapid rise of mobile devices including smartphones and tablets, many organizations are rolling out mobile apps to extend the reach of their traditional web applications. Although the methodology for mobile application testing is fundamentally the same as that of traditional web and desktop application testing, mobile app testing presents some unique challenges and issues, including coverage of a myriad of mobile devices, usability testing, integration of mobile testing with web interface testing, mobile app performance, and security issues. Jimmy Xu describes these issues and current best practices for solving them. Jimmy introduces the latest technologies and tools in mobile application test automation, mobile application usability, performance, and security testing. In a non-technical and easy-to-understand approach that does not require previous mobile app testing or development experience, Jimmy uses a real-world mobile app project to illustrate the challenges—and solutions—of mobile app testing.
Jimmy Xu
CGI
Jimmy Xu has been in the IT industry for more than sixteen years with various companies
including JDA Software/i2, IBM/DWL, and CGI. Jimmy has many years of experience designing,
developing, and testing enterprise applications, including web and mobile apps, for government,
manufacturing, financial, healthcare, and telecom clients. He has deep technical expertise in
Java, Linux, Android, iOS, and other enterprise application platforms and a good mastery of
various enterprise software development and testing methodologies. Jimmy holds CISSP,
CSSLP, and CSTE certifications; has published an eBook on software security; and has been a
conference presenter on software development, testing, security, and performance.
4/18/2013
Mobile Application Testing: Challenges & Best Practices
Better Software 2013, Las Vegas, USA
Jimmy Xu, CISSP, CSSLP, CSTE
June 5, 2013
Agenda
• Mobile Testing Overview
• Mobile Test Automation
• Mobile User Experience Test
• Mobile Performance Test
• Mobile Security Test
• Questions and Answers
Mobile Testing Overview
Mobile Testing Challenges - Multiplicity
Mobile application testing will usually need to cover a multiplicity of mobile devices with different capabilities, features, and limitations.
Mobile Testing Challenges - Usability
Usability is a greater quality issue for mobile applications than for web or desktop applications, as mobile users demand better user experiences.
Mobile Testing Challenges - Integration
When mobile applications are developed as add-ons to desktop web applications, requirements, use cases, and test cases will usually need to be synched up.
Mobile Testing Challenges - Performance
Mobile applications add additional user load to enterprise application infrastructure:
• Communication with enterprise servers can be limited in bandwidth and unreliable
• Mobile device hardware resources are usually not as powerful as those of desktop/laptop computers
Mobile Testing Challenges - Security
Mobile applications expose enterprise data to a larger potential attack surface.
Different Types of Mobile Applications
• Native apps: apps developed to run natively on mobile devices.
• Web apps:
  • Optimized web pages correctly scale content for the device screen and are optimized for mobile browsers.
  • Compatible web pages are compatible with mobile browsers but take no extra steps to optimize the mobile viewing experience.
• Hybrid apps: combine native UI elements with access to web content within a web content–viewing area.
Mobile Testing Perspectives
Comprehensive scope of mobile testing:
• Native apps
• Mobile web apps
• Hybrid apps
• Smart phones, tablets, Internet TVs
• Android, Windows 8, iOS, Blackberry OS
• Performance
• Security
• Usability
• Accessibility
• Compliance
• Regression
Best practice: combine testing tools, a mobile testing team, and automation to:
Increase
• Quality of testing
• Test coverage
• Release confidence
Reduce
• Time to market
• Testing resources
• Defect resolution time
• Overall testing costs
Mobile Testing Strategy
• Functional tests on primary devices
• User experience tests only on primary devices or on all devices
• Automate regression test cases
• Execute regression test cases on other devices
• Execute security and performance tests on simulators / real devices
Achieving Excellence in Mobile Testing
Mobile testing services should drive a 20%–40% reduction in costs while improving speed to market, productivity, and quality.
• Lifecycle Quality Management: frameworks that embody a Lifecycle Quality Management approach, leading to defect prevention, lower costs, and improved quality.
• Automation Innovation: an innovative automated test service framework with trusted tools vendors, resulting in faster time to market, improved efficiency, increased productivity, and lower costs.
• Best Shore Global Delivery Model: flexible delivery using a "best shore" global delivery network provides risk-balanced delivery, optimized offsite leverage, and agility in meeting demands.
• Domain Focused Solutions: deep domain expertise enabling business risk mitigation and improved effectiveness.
Mobile Test Automation
Mobile Test Automation Options
• Manual test on simulators
• Manual test on real devices
• Cloud-based test
• Cloud-based automated test
Manual Test on Simulators
• Simulators do not behave exactly the same as real devices
• Not possible to test SMS, email, and phone call services on real networks
• Test logs and video recording of test sessions possible
• Online peer sharing of test sessions possible
• Inexpensive investment in infrastructure
• Manual execution, and expensive to run the same test multiple times
• Suitable for unit tests by developers
Manual Test on Real Devices
• Test your apps on exactly the same real devices & same live networks as for production use
• No test logs and video recording of test sessions available
• Very difficult for online peer sharing of test sessions
• Test infrastructure can be expensive if testing on multiple devices + carrier networks is needed
• Manual execution, and expensive to run the same test multiple times
• Suitable for SIT & UAT when the number of combinations of devices / networks is small
CGI RESTRICTED AND CONFIDENTIAL
Cloud-Based Test
• Remote access to multiple real devices & live networks
• Screenshots, test logs, and video recording of test sessions available for defect analysis
• Online peer sharing of test sessions to promote offshore/onshore collaboration and agile testing
• Monitoring of real-time device performance & user experience
• Test infrastructure can be expensive depending on the number of devices + carrier networks needed
• Manual execution, and expensive to run the same test multiple times
• Suitable for SIT & UAT with a large number of device / network combinations but a small number of test cases
Cloud-Based Automated Test
• Remote access to multiple real devices & live networks
• Screenshots, test logs, and video recording of test sessions available for defect analysis
• Online peer sharing of test sessions to promote offshore/onshore collaboration and agile testing
• Monitoring of real-time device performance & user experience
• Script once, test automatically on multiple devices
• Test infrastructure can be expensive depending on the number of devices + carrier networks needed
• Suitable for SIT & UAT with a large number of device / network combinations and a large number of test cases
Separation of Actions, Data, Interfaces
• Select a scripting tool
• Select an execution platform
• Separate actions, data, and interfaces

Actions:
1) Enter value
2) Click button

Interface (QTP/UFT script examples):

SystemUtil.Run "notepad","","",""
Window("Notepad").WinEditor("Edit").Type "yes"
Window("Notepad").WinEditor("Edit").Type micCtrlDwn + "s" + micCtrlUp
Window("Notepad").Dialog("Save As").WinEdit("File name:").Set "test"
Window("Notepad").Dialog("Save As").WinButton("Save").Click

SystemUtil.Run "http://google.com/","","",""
Browser("Google").Page("Google").WebEdit("q").Set "mba"
Browser("Google").Page("Google").WebEdit("q").Submit
Browser("Google").Page("mba - Google Search").Sync
Browser("Google").Close
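The same separation can be sketched in any scripting language. The following Python sketch (all names and locator values are hypothetical, and the driver is a stand-in for a real tool such as an Appium session) keeps the interface map, test data, and generic actions in separate layers, so swapping the platform or the data does not touch the action code:

```python
# Hypothetical sketch of action/data/interface separation for test automation.
# The interface layer maps logical field names to platform-specific locators,
# the data layer holds test values, and actions know nothing about either.

# Interface layer: logical names -> platform-specific locators (assumed values).
INTERFACE = {
    "search_box": {"android": "id/search_input", "ios": "searchField"},
}

# Data layer: test values kept apart from the scripts.
TEST_DATA = {"search_term": "mba"}

class FakeDriver:
    """Stand-in for a real device driver; records what it was asked to do."""
    def __init__(self, platform):
        self.platform = platform
        self.log = []
    def type_into(self, locator, value):
        self.log.append(("type", locator, value))
    def click(self, locator):
        self.log.append(("click", locator))

# Action layer: generic steps reusable across platforms and data sets.
def enter_value(driver, field, data_key):
    locator = INTERFACE[field][driver.platform]
    driver.type_into(locator, TEST_DATA[data_key])

driver = FakeDriver("android")
enter_value(driver, "search_box", "search_term")
print(driver.log)  # [('type', 'id/search_input', 'mba')]
```

Because the action layer resolves locators at run time, the same `enter_value` step works unchanged when `FakeDriver("ios")` is used instead.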
Script Once, Test Many
• Search for texts and images on the screen, regardless of their locations, sizes, and colors
• Record and play
• Keyword-based scripting
• Port scripts to any device
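The "script once, test many" idea can be sketched as a single keyword script driven across a device matrix. This Python sketch is illustrative only: the device names, keywords, and `run_script` helper are invented stand-ins for a real cloud test runner.

```python
# Hypothetical "script once, test many" sketch: one keyword-based script is
# executed against several device profiles without being rewritten per device.
DEVICES = ["iPhone 5", "Galaxy S4", "Nexus 7"]

# A single keyword-based script, written once.
SCRIPT = [("launch", "App XYZ"), ("tap", "Login"), ("verify_text", "Welcome")]

def run_script(device, script):
    """Pretend-execute each keyword step on a device; return a result record."""
    executed = [(device, keyword, target) for keyword, target in script]
    return {"device": device, "steps": len(executed), "passed": True}

results = [run_script(d, SCRIPT) for d in DEVICES]
print(sum(r["steps"] for r in results))  # 9 steps across 3 devices
```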
Mobile Test Management Automation
• Use ALM for mobile test management automation
• Manage mobile test statements, test cases, and test scripts
• Manage mobile test execution
• Manage the mobile test defect lifecycle
• Integration with the mobile test environment (simulators, real devices, or cloud)
• Integration with the mobile development IDE/SDK
• Real-time traceability, KPIs, metrics, reporting
Mobile User Experience Testing
Mobile User Experience Guidelines
General guidelines:
• Mobile Web Best Practices (MWBP)
• Accessibility (WCAG)
Device-specific user interface guidelines:
• iOS user interface guidelines
• Android user interface guidelines
• Blackberry user interface guidelines
App-specific user interface guidelines:
• Domain specific
• Organization specific
Mobile Device Characteristics
• Display screen size
• Keyboard entry limitation
• Pointing device limitation
• Network bandwidth limitation
• Battery life limitation
• Device memory limitation
• Display color limitation
• Color contrast limitation
• File format rendering limitation
• Browser variations
• Differences in support for HTML, HTML5, CSS, JavaScript, Flash, Java
• People interact with one app at a time
• Single window with no visible components
• Display orientation
• Gestures for user-device interaction
• Voice recognition
• Location services
• Text message, email, and phone call services
“One Web” Principle
One Web: same services to all users / devices.
• Some services have a primarily mobile appeal
• Some services have a primarily desktop appeal
• Some services have a complementary desktop and mobile appeal
• Some services and information are more suitable for and targeted at particular user contexts
Testable Statements – iOS Orientation Change Example

iOS orientation change guideline:
• Think twice before preventing your app from running in all orientations.
• If your app only runs in one orientation:
  • Launch your app in your supported orientation.
  • Avoid displaying a UI element that tells people to rotate the device.
  • Support both variants of an orientation.

Testable statements:
• The iOS version of App XYZ must support both landscape and portrait orientation.
• The iOS version of App XYZ must support both variants of the landscape orientation by rotating content 180 degrees.
• App XYZ is not required to rotate its content when the iOS device is held with the Home button on the top.
Test Cases - iOS Orientation Change Example
Test Case
• Download and install App XYZ from the App Store onto a real iPhone 5 device
• Start App XYZ from the iPhone 5 home screen
• Navigate to different screens of App XYZ
• While on any screen, do the following:
  • Hold the iPhone 5 in landscape with the home button on the right, and expect App XYZ's content to be displayed in a top-down, left-right order
  • Then hold the device in landscape with the home button on the left, and expect App XYZ's content to be rotated 180 degrees
  • Then hold the device in portrait with the home button on the bottom, and expect App XYZ's content to be rotated 90 degrees
  • Then hold the device in portrait with the home button on the top, and expect App XYZ's display of content not to change
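A test case like this lends itself to a data-driven encoding: the expected behavior for each device posture is kept as data, and one piece of checking code covers all steps. The following Python sketch is hypothetical (the rotation values are relative to the landscape-right baseline, and the "actual" reading would come from a real device in practice):

```python
# Hypothetical data-driven version of the orientation test case: the expected
# content rotation for each (orientation, home-button position) is test data.
EXPECTATIONS = {
    # (orientation, home_button_position) -> expected rotation in degrees
    ("landscape", "right"): 0,    # baseline: top-down, left-right order
    ("landscape", "left"): 180,   # content rotated 180 degrees
    ("portrait", "bottom"): 90,   # content rotated into portrait
    ("portrait", "top"): None,    # no rotation required
}

def expected_rotation(orientation, home_button):
    """Look up the expected rotation for a device posture."""
    return EXPECTATIONS[(orientation, home_button)]

# One loop checks every posture; on a real device, `actual` would be read
# from the app under test instead of from the expectation table.
for (orientation, home), expected in EXPECTATIONS.items():
    actual = expected_rotation(orientation, home)
    assert actual == expected, (orientation, home)
```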
Mobile Performance Test
Performance Engineering Domains
• Load: models the expected production usage of an application by simulating multiple users accessing the application's services concurrently. It is the most fundamental performance test for understanding response times and error rates.
• Stress: tests the system's stability when the load is raised beyond normal usage patterns. This test determines at what load an application fails, and how it fails. Useful for determining headroom for capacity planning.
• Reliability: also known as endurance testing, this determines the ability of an application to perform its required functions under stated conditions for an extended period of time.
• Volume: most often used to measure an application's throughput with respect to batch or message processing, where user response times are not relevant.
• Scalability: by comparing to baselines, determines how linearly the application's infrastructure can scale to support increased workload.
Testing Additional Load from Mobile Apps
• Estimating additional load from mobile apps
• Simulating requests from mobile apps with load test scripts
• Selecting a primary load testing tool
• Custom coding to support HTML5, JavaScript / Ajax, REST, SOAP
• Measuring server response time and throughput
• Monitoring usage of server system resources
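Measuring server response time and throughput ultimately reduces to summarizing per-request samples. This minimal Python sketch (the sample values are invented for illustration) computes the average and 90th-percentile latency, the error rate, and throughput over the test window from `(latency_ms, ok)` records a load tool would collect:

```python
# Minimal sketch of load test result reporting: given per-request samples of
# (latency in ms, success flag), report latency percentiles, error rate, and
# throughput over the measurement window.
def summarize(samples, window_seconds):
    latencies = sorted(ms for ms, ok in samples)
    errors = sum(1 for _, ok in samples if not ok)
    p90_index = int(0.9 * (len(latencies) - 1))  # nearest-rank 90th percentile
    return {
        "avg_ms": sum(latencies) / len(latencies),
        "p90_ms": latencies[p90_index],
        "error_rate": errors / len(samples),
        "throughput_rps": len(samples) / window_seconds,
    }

# Illustrative samples: 10 requests over a 2-second window, one failure.
samples = [(120, True), (95, True), (110, True), (130, True), (90, True),
           (105, True), (500, False), (115, True), (100, True), (125, True)]
report = summarize(samples, window_seconds=2)
print(report["throughput_rps"])  # 5.0 requests/second
```

Note how the single failed request at 500 ms pulls the average up to 149 ms while the 90th percentile stays at 130 ms, which is why percentiles are the more robust response-time measure.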
Testing, Tuning & Profiling Mobile Apps
• Measuring mobile user experience
• Identifying bottlenecks in your mobile apps
• Performing root cause analysis
• Improving mobile user experience
• Minimizing mobile app footprint
• Profiling function calls
Monitoring Device System Resources
Measuring the footprint of mobile apps on mobile devices; collecting & reporting system usage metrics:
• CPU busy & idle time and background processes
• Memory & disk space usage
• Network bandwidth, throughput, and data usage
• Battery life & power consumption rate
• Connection interruptions, performance jitter, and degradations of mobile networks
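Collecting and reporting these metrics can be sketched as a small summarizer over timestamped samples. The sample values and field names below are invented for illustration; on a real device the readings would come from platform tooling (for example, Android's `adb shell dumpsys`):

```python
# Hedged sketch of device resource reporting: given timestamped samples of
# CPU %, memory (MB), and battery %, compute average usage and the battery
# drain rate per hour over the sampling window.
def summarize_usage(samples):
    n = len(samples)
    avg_cpu = sum(s["cpu_pct"] for s in samples) / n
    avg_mem = sum(s["mem_mb"] for s in samples) / n
    elapsed_h = (samples[-1]["t"] - samples[0]["t"]) / 3600.0
    drain = samples[0]["battery_pct"] - samples[-1]["battery_pct"]
    return {"avg_cpu_pct": avg_cpu, "avg_mem_mb": avg_mem,
            "battery_drain_pct_per_hour": drain / elapsed_h}

# Illustrative samples taken every 15 minutes (t is in seconds).
samples = [
    {"t": 0,    "cpu_pct": 20, "mem_mb": 150, "battery_pct": 100},
    {"t": 900,  "cpu_pct": 35, "mem_mb": 180, "battery_pct": 97},
    {"t": 1800, "cpu_pct": 25, "mem_mb": 170, "battery_pct": 94},
]
report = summarize_usage(samples)
print(report["battery_drain_pct_per_hour"])  # 12.0 (% per hour)
```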
Mobile Security Test
Secure SDLC
• Governance: security policies, guidelines, standards, procedures, and metrics created and enforced by organizations.
• Requirements: should be defined according to governance rules for authentication, authorization, non-repudiation, data confidentiality, integrity, accountability, session management, transport security, privacy, etc.
• Planning & Design: should take into consideration network, server, middleware, database, and programming platform vulnerabilities, leveraging techniques such as threat modeling and risk analysis.
• Development: static code analysis should be performed to ensure secure coding guidelines are followed and coding vulnerabilities are minimized.
• Testing: security testing should be performed during SIT to simulate application abuses and ensure any vulnerabilities uncovered are properly addressed as "security bugs."
• Deployment: the application should be tuned and hardened at all layers of the platform stack to minimize infrastructure software misconfiguration vulnerabilities.
• Operate/Maintain: vulnerability scanning should be regularly performed during the maintenance phase on both the application and the infrastructure, to ensure no new security risks have been introduced and that the level of security is still intact.
Device-Based Attacks
• Misplaced or lost smartphones / tablets
• Mobile devices not password protected
• Unencrypted credentials, insecure storage, or cached data
• Misconfigured certificate and proxy settings
• Mobile devices may have unauthorized modifications
• Malware apps downloaded from app stores or via jailbreaking
• Security software often not installed to scan for Trojans, spyware, malware, and spam
• Invoking classes, services, and activities from insecure sources
• Out-of-date operating system versions
• Out-of-date software utilities
• Shared mobile app IDs and data
Server & Network Based Attacks
• Data transmissions via Wi-Fi hotspots not always encrypted
• Bluetooth communications in "open" or "discovery" mode
• NFC offers no protection against eavesdropping
• Internet connections usually not protected by firewalls
• Man-in-the-middle attacks
• Weak authentication schemes
Abuse Cases, Misuse Cases, Attack Scripts
• Misuse cases refer to use cases in which the actor initiates unintentional but potentially harmful actions.
• Abuse cases refer to use cases in which the actor initiates intentionally harmful actions.
• Attack scripts refer to scripts developed to automate penetration techniques such as dictionary attacks and "fuzzing."
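The fuzzing idea can be illustrated with a few lines of Python. This is a toy sketch, not a real fuzzer: the target is a stand-in parser (real attack scripts would drive the actual app or its API), and the mutation strategy is a single random byte flip:

```python
# Tiny illustration of a fuzzing attack script: mutate a valid input with
# random byte flips and feed each variant to the target, counting failures.
import random

def mutate(data, rng):
    """Flip one byte of the input at a random position."""
    buf = bytearray(data)
    pos = rng.randrange(len(buf))
    buf[pos] = rng.randrange(256)
    return bytes(buf)

def target_parser(payload):
    """Stand-in for the system under test; rejects non-ASCII input."""
    payload.decode("ascii")  # raises UnicodeDecodeError on fuzzed bytes

rng = random.Random(42)  # fixed seed keeps the run reproducible
seed_input = b'{"user": "alice"}'
crashes = 0
for _ in range(100):
    try:
        target_parser(mutate(seed_input, rng))
    except UnicodeDecodeError:
        crashes += 1  # a real harness would save the failing input here
print(crashes)  # roughly half the mutants fail to decode
```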
Mobile Security Testing Environment & Tools
Static source code analysis tools
Profiling tools:
• Mobile software development kit (SDK)
• Debugging tools
• Simulators
• Tracing tools
• System data dumping tools
• Decompiling tools
Penetration test tools:
• Exploring tools
• SQL querying tools
• Sniffing tools
• Fuzzing tools
Proxy tools:
• Tools that intercept and manipulate the traffic between mobile devices and servers
• Automation tools
Vulnerability Severity
• Identified vulnerabilities will be flagged with a CVSS severity level:
  • High severity: CVSS base score of 7.0–10.0
  • Medium severity: CVSS base score of 4.0–6.9
  • Low severity: CVSS base score of 0.0–3.9
• Assignment of the CVSS score is based on:
  • The primary impact on the confidentiality, integrity, and availability of the protected system/resources
  • The derivative impact on loss of life and/or property
  • The percentage of the impacted area within the total environment
  • How easy it is to exploit the vulnerability
  • How easy it is to remediate the vulnerability
  • How confident the testing team is about the existence of the vulnerability

Questions?
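The severity buckets above (High 7.0–10.0, Medium 4.0–6.9, Low 0.0–3.9) map directly to a small classification function, sketched here in Python:

```python
# Map a CVSS base score to the High/Medium/Low severity buckets used above.
def cvss_severity(base_score):
    if not 0.0 <= base_score <= 10.0:
        raise ValueError("CVSS base score must be between 0.0 and 10.0")
    if base_score >= 7.0:
        return "High"
    if base_score >= 4.0:
        return "Medium"
    return "Low"

print(cvss_severity(7.5), cvss_severity(5.0), cvss_severity(2.1))
# High Medium Low
```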
Our commitment to you
We approach every engagement with one objective in mind: to help clients succeed.