Top 5 Mobile Testing Mistakes Mobile Testers Need to Avoid
Fred Beringer – SOASTA
July 2013 – SBSQE Meetup
Top 5 Mobile Testing Mistakes Mobile Testers Need to Avoid

Mobile test automation requires stability, precision, and repeatability to deliver value and instill confidence. Let's face it: mobile testing comes with many challenges, such as fast-paced development cycles with multiple releases per week, multiple app technologies and development platforms to support, tons of devices and form factors, and additional pressure from enterprises and consumers who are less patient with low-quality apps. With these new challenges comes a new set of mistakes testers can make!

Fred has worked with dozens of mobile test teams to help them avoid common traps when building test automation for mobile apps.

Traps such as:

1. Picking the wrong naming strategy for UI elements when building cross-platform apps.
2. Relying on time delays to deal with back-end call response times.
3. Running regression tests at irregular intervals.
4. Testing only on simulators rather than on real devices.
5. Ignoring full end-to-end mobile performance testing.
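The first trap, a fragile naming strategy for UI elements, can be illustrated with a small sketch. This is not SOASTA's API; the `LOCATORS` table, platform names, and locator strategies below are my own illustration of the idea that one logical element name should map to a stable, platform-specific identifier rather than a brittle positional path:

```python
# Hypothetical cross-platform locator registry: one logical name per UI
# element, mapped to a stable accessibility identifier on each platform.
# A positional XPath (commented out below) breaks whenever layout changes.
LOCATORS = {
    "login_button": {
        "ios": ("accessibility id", "loginButton"),
        "android": ("accessibility id", "loginButton"),
        # "android": ("xpath", "//LinearLayout[2]/Button[1]"),  # fragile!
    },
}

def locator(name, platform):
    """Resolve a logical element name to a (strategy, value) pair."""
    try:
        return LOCATORS[name][platform]
    except KeyError:
        raise KeyError(f"no locator registered for {name!r} on {platform!r}")

print(locator("login_button", "ios"))  # ('accessibility id', 'loginButton')
```

With a registry like this, tests reference `login_button` everywhere, and a UI rename touches exactly one table entry per platform.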

  • Real handsets vs. simulators
    A friend working for a leading global carrier once told me, “We will start testing on simulators the minute we start selling simulators in stores.” In a nutshell, this captures the major difference between using simulators and real handsets for testing: ultimately, the application runs on a real handset, not on a simulator. The closer you get to the real platform, the better quality you will get.
    Simulators are very powerful tools for developing mobile applications; they have unique features that provide a debugging environment. By definition, however, they lack the real target environment. Pilots train on simulators to practice situations they cannot safely reproduce in real aircraft, and because it is obviously cheaper. But can you imagine a pilot who has never flown a real aircraft taking you across the Atlantic?

    Mobile Simulators
    History
    In the early days of simulators, the approach was to develop a PC-based application that mimicked the handset by mapping its attributes and behavior. For a mobile browser, for example, such attributes included support for each HTML element, image type, and so on.
    Over time, this approach ran into two problems:

    Devices grew more complicated, so the number of attributes to map reached into the thousands. Mapping them all for every new handset on the market turned out to be an impossible mission.
    Attribute mapping captured the “positive” side of the handset but did not emulate its negative sides. Every software component, whether PC-based or handset-based, has flaws; that is the nature of software. As devices grew more complicated, simulators drifted away from their “home handsets”, both by failing to reproduce the handsets' flaws and by accumulating new flaws that appear only on the simulator itself.
    For this reason, you can find such simulators only for relatively simple handsets. Others are simply so “far” from the real handset that they have become irrelevant as testing or even development tools.
    Newer simulators take a different approach: they create a virtual execution environment under a desktop OS (such as Windows or Mac OS) that simulates the handset's hardware and runs, as far as possible, the same binary code as the handset.
    The first approach is represented by simulators from Nokia and Openwave; the second by the simulators for Android, Windows Mobile, iPhone, and BlackBerry.

    The benefits of using a mobile simulator
    To start with, it's cheap. In most cases you will not have to pay anything to use a mobile simulator; you simply download it and start using it.
    In some cases, mobile simulators let you simulate hard-to-reproduce scenarios (no network, location changes, etc.).
    Mobile simulators can also give you access to detailed information; they are often integrated with an IDE (integrated development environment), which allows you to debug your application step by step on the simulator.

    Drawbacks of using a mobile simulator
    Bottom line: you are not testing on your actual platform. Even if all goes well, you cannot be sure the application will actually work on the handset.
    A simple web search will turn up hundreds of cases where simulators exhibit problems that do not exist on real handsets. These are usually connected to the environment, the sensors, location, and especially the network.
    The network environment is completely different. On a simulator you use the PC's connection (through any personal firewall), over a LAN connected to your corporate firewall and then to the internet. On a real handset, the network goes through the radio interface and from there to the internet. In many cases the low-level network stack itself is completely different. Whether that is good or bad, it is different, meaning your application will behave differently.
    Unlike mobile simulators, real handsets have limited resources. The handset is a compact platform with different CPU power and memory (and this extends to the graphics chipset, input devices, etc.). What runs fine on a powerful quad-core Intel-based CPU with 4 GB of memory might not work properly, if at all, on a single-core, limited CPU with 500 MB of memory.
    When testing on simulators, a tester might stumble on the question “It doesn't work; what now?” One can envision several possible scenarios:

    It doesn't work on the simulator, so it needs to be tested on a real handset.
    It probably doesn't work at all.
    There is also a “hidden” scenario: “It does work on the simulator; now what?” Is that sufficient, or should it also be tested on a real handset?

    Real handsets
    Background
    When considering testing on real handsets, there are several alternatives. By definition, handsets are physical resources that need to be managed. Unlike simulators, where an additional “handset” is just more installed software or a configuration change, real handsets are something physical to own.
    Services such as Perfecto Mobile let you virtualize those resources; however, they are still, ultimately, real physical resources.
    Given this, teams testing on real handsets have always had to decide how to manage those resources. In many cases they have become scarce resources, and testers have had to define a priority list based on platform, brand, screen size, OS, and so on.
    Outside of “side projects” or “garage projects”, testing on real handsets is not optional; it is always a given task. The only question is to what extent.

    Benefits
    Testing on real handsets always gives you the actual results. There is no need to worry about false negatives and, more importantly, no need to worry about false positives. The latter are crucial: these are scenarios that pass testing on simulators but fail on real handsets.
    Real-handset testing naturally takes the network, CPU, memory, screen size, and so on into account. The nice thing about the environment is that there is little to do; for good and for bad, the real handset is the real handset.
    It is much easier to expose performance defects on real handsets, as well as defects caused by the handset itself or its environment. None of these are normally detected using simulators.

    Drawbacks
    As stated above, unless handled by a service, handsets are physical resources that must be managed, in hardware as well as cellular plans, networking, configuration, and so on.
    Real handsets are harder to connect to an IDE. Although this matters less in later phases (it is most important during early development), it is still a missing capability.

    Conclusion
    The goal of testing is to make sure everything works as expected. With real handsets the situation is simple: if it works, it works; if it doesn't, it needs to be fixed.
    Simulators are very good for the early days of application development; their integration with the development IDE is very powerful. On the other hand, before releasing the application (or web site) to the market, you probably do want to test it on real handsets (it will be very hard to blame the simulator afterwards).
    The “gray” area between those two extremes varies between organizations and needs. In most cases, the rule of thumb is that development should use simulators (plus a few reference handsets) and testing should be done on real handsets.
    If you are a “garage” developer willing to take big risks with your application, simulators are a good and very cost-effective solution. However, if you are developing an application for your organization, for internal or external use, you should use real handsets for your testing: you don't want those flaws to be found first in production.
    Use the rule of thumb below to do the math yourself, based on your needs:
    Cost of a defect found in R&D = 10% of the cost of the same defect found in QA = 1% of its cost when found in production.
    This rule holds best for applications with a clear market value; where the value of the application is less clear, so is the equation.
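The cost ratios above make for a quick back-of-the-envelope calculation. The dollar figure here is an arbitrary illustration, not data from the talk: if fixing a defect found in production costs $10,000, the same ratios put the QA and R&D costs at $1,000 and $100 respectively:

```python
# Illustrative numbers only, applying the rule-of-thumb ratios above.
production_cost = 10_000            # assumed cost of a defect found in production
qa_cost = production_cost * 0.10    # 10x cheaper when caught in QA
rnd_cost = qa_cost * 0.10           # another 10x cheaper when caught in R&D

assert rnd_cost == production_cost * 0.01  # R&D = 1% of production
print(rnd_cost, qa_cost, production_cost)  # 100.0 1000.0 10000
```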
  • Transcript of "Top 5 Mobile Testing Mistakes Mobile Testers Need to Avoid"

    1. Mobile Testing Mistakes Top 5 – Fred Beringer, SOASTA – July 2013, SBSQE Meetup – Mobile Testers Need to Avoid
    2. • Java and smartcard @ Sun Microsystems (Nice) • eCommerce platform, DB2 Replication, BI @ IBM (San Jose, CA) • Software Testing director @ Experian (Monte Carlo) • Growing the hottest software testing startup @ SOASTA (Nice) • Helping Mobile Testers @ SOASTA (Mountain View, CA)
    3. © 2012 SOASTA. All rights reserved. May 22, 2012. o First End-to-End Mobile App Test Automation Platform • First Cloud-Based Load Testing Solution for Performance Engineers • Largest Global Test Platform (17 Countries, 100 Cities, 500K Cloud Servers) • First Continuous Test Automation Solution for Mobile Developers/Testers • First “real-time” Real User Monitoring (RUM) Solution for web and mobile apps o Over 400 Global Corporate Customers • 20,000 Mobile Developers and Testers use SOASTA Cloud Services • Over 2,500 Mobile and Web Apps have been Tested with SOASTA o Award-Winning & Patented Technology • Gartner Magic Quadrant Leader & IDC Cloud Testing Leader • Wall Street Journal Top 50 Hottest Companies three years running o Global Offices • San Francisco, New York, London, Mumbai, Shanghai & Tokyo
    4. “Strong Product and Sales Execution” o Customer proven o Ease of automation o Native device testing o Real-time analytics o Partnerships. 2013 Gartner Magic Quadrant for Integrated Software Quality: “market-leading”, “disruptive”, “visionary”. “The company has continued to deliver innovation in a consistent fashion.” Source: “Magic Quadrant for Integrated Software Quality Suites”, Gartner, July 2013
    5. WE’RE HIRING!!! DevOps Engineers, Performance and Mobile Test Engineers, VP Professional Services. fberinger@soasta.com twitter.com/fredberinger http://www.fredberinger.com
    6. My goal today: STORIES, SOMETHING FOR YOU TO TAKE BACK HOME, Discussions
    7. What mistakes did you make?
    8. Picking the right locator for automation
    9. Time delays for mobile automation? Think again! Brute force (time delays) vs. maintainable tests (intelligent waits)
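The brute-force vs. intelligent-wait contrast on this slide can be sketched without any particular test framework. The helper below is my own illustration, not SOASTA's API: instead of a fixed `time.sleep(10)`, it re-checks a condition at short intervals and returns as soon as the condition holds, so a test stays fast when the back end is quick and tolerant when it is slow:

```python
import time

def wait_until(condition, timeout=10.0, interval=0.25):
    """Poll `condition` until it returns a truthy value or `timeout`
    seconds elapse. Returns the value, or raises TimeoutError."""
    deadline = time.monotonic() + timeout
    while True:
        result = condition()
        if result:
            return result
        if time.monotonic() >= deadline:
            raise TimeoutError("condition not met within %.1fs" % timeout)
        time.sleep(interval)

# Brute force: always burns 10 seconds, even if the element is ready in 1:
#     time.sleep(10); element = find_element(...)
# Intelligent wait: returns the moment the condition is satisfied.
responses = iter([None, None, "OK"])      # stand-in for repeated UI lookups
print(wait_until(lambda: next(responses), timeout=2.0, interval=0.01))  # OK
```

The same shape underlies the explicit-wait facilities in most UI automation tools; the win is that the timeout becomes a ceiling, not a fixed cost on every step.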
    10. Run regression tests at irregular intervals
    11. In Mobile Development, CI is KING
    12. Testing on simulators vs. real devices: which one is right?
    13. Ignoring full end-to-end mobile performance testing
    14. Please GET ACTIVE on CloudLink, our community is YOU! Web: CloudLink http://cloudlink.soasta.com Thanks. RESOURCES: www.SOASTA.com Contact SOASTA: www.soasta.com/products/touchtest info@soasta.com 866.344.8766 Follow us: twitter.com/cloudtest facebook.com/cloudtest