Mobile App Assurance: Yesterday, Today, and Tomorrow.

Keynote, Software QS-Tag. Nuremberg, November 2012.
Overview of mobile app testing: history, challenges, and approaches.


  1. A foundation of quality. Robert V. Binder, Director of Innovation, Olenick and Associates. rbinder@olenick.com. imbus Software QS-Tag, November 8, 2012.
  2. The Big Picture
  3. Overview
      • The mobile deluge
      • Mobile app testing retrospective
      • Mobile app assurance challenges
      • State of the art, 2012
      • Crafting a mobile app assurance strategy
      • Q&A
  4. THE MOBILE DELUGE
  5. SECURELY. RELIABLY. SEAMLESSLY.
  6. MOBILE APP TESTING RETROSPECTIVE
  7. Mobile Test Technology, 2002. (Diagram: a typical mobile app, end to end: mobile user device with a serial-port interface to a desktop parametric tester, base station, mobile switching center, public switched network, Internet backbone, first-tier server, and business server, plus a desktop load-test client.) © 2004 mVerify Corporation
  8. Critical Capabilities
      • Handheld: functionality; response time; MUD (mobile user device) resource utilization; airlink variation; QoS edge combinations; in-cluster handoffs; multiple base station protocol; roaming; location services server interaction; server exception; configuration
      • Mobile infrastructure: op/admin/maintenance; background load ("breathing"); packet load; weather, solar, etc.
      • Application server: functionality; response time; server resource utilization; billing/provisioning/security; background contention; dispatch/allocation; background IP load; client transaction saturation
      • End-to-end: response time; capacity; reliability; availability; geographic coverage
  9. Hand Held Testing. (Diagram: the end-to-end capability map with only the handheld items exercised: functionality, response time, MUD resource utilization.) Limitations: no load, no mobility, no interaction. © 2004 mVerify Corporation
  10. Parametric Testing. (Diagram: drives the transport, network, data link, and physical layers of the airlink against the mobile user device.) Limitations: single connection, no back end, no server, no app function; mobility and load coverage questionable.
  11. Server Testing. (Diagram: exercises application server functionality, response time, resource utilization, and related back-end capabilities.) Limitations: no mobility, no network, emulated handheld only.
  12. End-to-End: Manual/Live Network. (Diagram: manual testing of the complete path over a live network, covering end-to-end response time.) Limitations: inconsistent, can't scale, time consuming, no load; mobility only via drive testing.
  13. AMATE Project: Advanced Mobile Application Test Environment
      • Goal: achieve realistic end-to-end mobile testing
      • Model-based, mobile-centric: signal variation, user behavior, and traffic all related to mobility
      • NIST/ATP funded R&D, 2002-2004
  14. Load Model
      • Vary aggregate input rate: arc, flat, Internet fractal, negative ramp, positive ramp, random, spikes, square wave, waves
      • (Figure: actual "waves" loading.) © 2005 mVerify Corporation
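To make the load-model idea concrete, here is a minimal sketch, not mVerify's implementation, that generates a per-second aggregate input rate for a few of the shapes named on the slide. The shape names, rates, and durations are illustrative assumptions.

```python
import math
import random

def load_profile(shape, duration_s, base_rate, peak_rate, seed=0):
    """Return a list of per-second aggregate input rates (requests/s)."""
    rng = random.Random(seed)
    rates = []
    for t in range(duration_s):
        x = t / max(duration_s - 1, 1)  # normalized time, 0..1
        if shape == "flat":
            r = base_rate
        elif shape == "positive_ramp":
            r = base_rate + (peak_rate - base_rate) * x
        elif shape == "negative_ramp":
            r = peak_rate - (peak_rate - base_rate) * x
        elif shape == "square_wave":
            r = peak_rate if (t // 60) % 2 else base_rate  # alternate every 60 s
        elif shape == "waves":
            r = base_rate + (peak_rate - base_rate) * 0.5 * (1 + math.sin(2 * math.pi * 4 * x))
        elif shape == "spikes":
            r = peak_rate if rng.random() < 0.02 else base_rate  # rare bursts
        elif shape == "random":
            r = rng.uniform(base_rate, peak_rate)
        else:
            raise ValueError(f"unknown shape: {shape}")
        rates.append(r)
    return rates

if __name__ == "__main__":
    profile = load_profile("waves", duration_s=600, base_rate=50, peak_rate=500)
    print(f"peak {max(profile):.0f} req/s, mean {sum(profile)/len(profile):.0f} req/s")
```

A load generator would then replay such a profile by dispatching that many virtual-user requests in each one-second slice.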
  15. Mobility Model
      • Map generates a real-time itinerary for N virtual users
      • Location-specific signal strength
      • Location-specific end-user behavior
      • Controls the airlink emulator
      • (Figure: signal propagation map with virtual users in 1-bar, 2-bar, and 3-bar signal zones.)
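A minimal sketch of the mobility-model idea, assuming a hypothetical grid-shaped signal propagation map: virtual users random-walk the map and emit itinerary events carrying their location and signal strength, the kind of stream that would drive an airlink emulator and select location-specific behavior. This is an illustration, not the AMATE implementation.

```python
import random

# Hypothetical 2-D signal propagation map: each grid cell holds signal strength in "bars".
SIGNAL_MAP = [
    [1, 1, 2, 2, 3],
    [1, 2, 2, 3, 3],
    [2, 2, 3, 3, 3],
]

def itinerary(n_users, steps, seed=0):
    """Yield (time, user, cell, bars) events as each virtual user random-walks the map."""
    rng = random.Random(seed)
    rows, cols = len(SIGNAL_MAP), len(SIGNAL_MAP[0])
    pos = {u: (rng.randrange(rows), rng.randrange(cols)) for u in range(n_users)}
    for t in range(steps):
        for u, (r, c) in pos.items():
            dr, dc = rng.choice([(-1, 0), (1, 0), (0, -1), (0, 1), (0, 0)])
            r = min(max(r + dr, 0), rows - 1)
            c = min(max(c + dc, 0), cols - 1)
            pos[u] = (r, c)
            yield t, u, (r, c), SIGNAL_MAP[r][c]

if __name__ == "__main__":
    for t, user, cell, bars in itinerary(n_users=2, steps=3):
        # In AMATE-style testing these events would feed the airlink emulator;
        # here we just print them.
        print(f"t={t}s user={user} cell={cell} signal={bars} bars")
```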
  16. The Mobile Testing Nightmare
      • Intense, high-stakes race to market
      • Configurations (platforms × devices × airlinks) increase exponentially
      • More testing necessary for competitive quality, reliability, performance
      • Ad hoc manual testing is slow, costly, ineffective
      © 2006 mVerify Corporation. A Million Users in a Box®
  17. MTS: Any App, Any Platform. (Diagram: an MTS console host with console, model editor, test repository, and test-run reports drives MTS test agents and remote agents with plug-ins on each host under test: clients, servers, and a channel emulator. A host under test may be a cell phone or PDA, a desktop or server, an embedded processor, a comm interface, network equipment, an access point, or a base station.) © 2007 mVerify Corporation. A Million Users in a Box®
  18. MOBILE APP ASSURANCE CHALLENGES
  19. Functionality and Robustness
      • Using Launchpad 2.6 with a mobile app, when the app is deactivated and then reactivated, all the elements of the home view list are duplicated.
      • When the phone rings, the incoming call screen pops up with options accept, ignore, and ignore-with-text. Selecting ignore-with-text doesn't disconnect, pops up two more times, and doesn't send the text.
  20. Same App, Many Platforms, Locales …
      • Android
      • BlackBerry
      • iPhone, iPad
      • Mobile Web (HTML5)
      • Windows Mobile
      • Dozens of languages
  21. Configuration Coverage
      • How many ways can your app be deployed?
        • Form factors
        • Screen sizes
        • Platforms and versions
        • Accessories
        • Wireless stacks
        • Bandwidth
        • Carrier policies and technology
        • Logo/store certification
        • Locales/localization
        • Server-side performance
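To see why configuration coverage explodes, and how pairwise (2-way) selection tames it, here is a small sketch with hypothetical dimensions and values: a greedy search picks configurations until every pair of values across dimensions appears in at least one chosen configuration.

```python
from itertools import combinations, product

# Hypothetical configuration dimensions; a real project would substitute its own.
DIMENSIONS = {
    "platform": ["Android 4.x", "iOS 6", "BlackBerry 7", "Windows Phone 7.5"],
    "screen": ["small", "medium", "large"],
    "locale": ["en-US", "de-DE", "ja-JP"],
    "bandwidth": ["EDGE", "3G", "WiFi"],
}

def pairwise_configs(dimensions):
    """Greedily pick configurations until every cross-dimension value pair is covered."""
    names = list(dimensions)
    uncovered = {
        ((a, va), (b, vb))
        for a, b in combinations(names, 2)
        for va in dimensions[a]
        for vb in dimensions[b]
    }
    all_configs = [dict(zip(names, vals)) for vals in product(*dimensions.values())]
    chosen = []
    while uncovered:
        def gain(cfg):
            return sum(1 for (a, va), (b, vb) in uncovered
                       if cfg[a] == va and cfg[b] == vb)
        best = max(all_configs, key=gain)
        if gain(best) == 0:
            break
        chosen.append(best)
        uncovered = {((a, va), (b, vb)) for (a, va), (b, vb) in uncovered
                     if not (best[a] == va and best[b] == vb)}
    return chosen

if __name__ == "__main__":
    total = 1
    for values in DIMENSIONS.values():
        total *= len(values)
    picks = pairwise_configs(DIMENSIONS)
    print(f"{total} exhaustive configurations vs {len(picks)} pairwise-covering configurations")
```

Even this toy space has over a hundred exhaustive combinations, while a pairwise-covering set needs only a dozen or so.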
  22. Usability
  23. Security
  24. Scalability: Amazon Cloud Outage
      • Amazon's Elastic Compute Cloud crashes (power failure)
      • The Elastic Load Balancing (ELB) system frantically tries to assign workloads to available servers
      • As Amazon's cloud rebooted, "a large number of ELBs came up in a state which triggered a bug we hadn't seen before"
      • ELB reaction: try to allocate more, larger servers
      • A backlog in the "control plane" results
      • Demand from customers in unaffected availability zones continues
      • System swamped and crashes again
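The outage narrative above is a textbook positive-feedback loop: work that cannot be served is retried and re-queued, so each failed step amplifies the next one's demand. The toy simulation below, with entirely hypothetical numbers, only illustrates that feedback shape; it is not a model of Amazon's actual systems.

```python
def retry_storm(capacity, base_demand, failure_backlog, retry_factor=2.0, steps=10):
    """Toy positive-feedback loop: failed work returns as amplified retry demand."""
    backlog = failure_backlog
    for t in range(steps):
        offered = base_demand + backlog        # new demand plus retried work
        served = min(offered, capacity)
        failed = offered - served
        backlog = failed * retry_factor        # failures come back, amplified
        print(f"step {t}: offered={offered:.0f} served={served:.0f} backlog={backlog:.0f}")
        if backlog > 100 * capacity:
            print("control plane swamped: dragon-king territory")
            break

if __name__ == "__main__":
    retry_storm(capacity=1000, base_demand=800, failure_backlog=500)
```

Note that the system is only slightly over capacity at first; the latent feedback, not the initial fault, produces the extreme outcome.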
  25. Latent Positive Feedback = Dragon King
  26. Dragon Kings. Sornette calls these exceptional events dragon kings "to stress that we deal with a completely different kind of animal, beyond the normal, whose extraordinary characteristics [have] profound significance."
  27. Here Be Dragons
      • Latent positive feedback: external disruptors; malicious attack target
      • Partial degradation: provoked panic
      • Low testability: multi-stack; field-infeasible; uncontrollable inputs
  28. STATE OF THE ART, 2012
  29. Handheld Testability
      • Many UI test tools: UI event capture/replay; image capture/bitmap compare
      • APIs for widgets/controls: invasive; brittle
      • Several test suite composers: drag and drop; natural language/keyword
      • No multi-endpoint capability
      • Most are platform-specific
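As an illustration of the widget-API style of UI automation mentioned above, here is a minimal sketch assuming a running Appium server and the v1-era Appium Python client; the app path and accessibility IDs are hypothetical. Locating elements by identifier is less brittle than coordinate or bitmap capture/replay, but it is still invasive: the app must expose stable identifiers.

```python
# Minimal widget-level UI check; assumes a running Appium server and the
# v1-era Appium Python client. App path and accessibility ids are hypothetical.
from appium import webdriver

caps = {
    "platformName": "Android",
    "deviceName": "Android Emulator",
    "app": "/path/to/app-under-test.apk",   # hypothetical
}

driver = webdriver.Remote("http://127.0.0.1:4723/wd/hub", caps)
try:
    # Locate by accessibility id (widget API) rather than screen coordinates or bitmaps.
    driver.find_element_by_accessibility_id("login_button").click()
    assert driver.find_element_by_accessibility_id("home_list").is_displayed()
finally:
    driver.quit()
```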
  30. Crowd-Sourced Testing
      • Crowd testing providers: uTest, MobTest, Mob4Hire, others
      • "60787 people (mobsters) have 34142 different mobile handsets on 448 carriers in 156 countries"
  31. Test Environments as a Service. (Figure: a remote device array and mobile client emulation drive the SUT; scalable client emulation runs against a live target.)
  32. Testing Technology for Mobile Apps
      What's not new:
      • UI capture/replay
      • Image capture/compare
      • Virtual users
      • Hand-crafted test suites
      • Massively manual testing
      • Remote device array
      • Fragmented test harness
      • No model-based testing
      What's new:
      • Testable UI APIs
      • Crowdsourced testing
      • Cloud-based test in production
      • More stable infrastructure
  33. CRAFTING A MOBILE APP ASSURANCE STRATEGY
  34. Brooks's Scope. (Diagram: a program becomes a system of programs at 3X the effort (interfaces, end-to-end), or a program product at 3X (testing, user support, maintenance); a programming systems product costs roughly 10X the original program.)
  35. Brooks's Scope, Today's Environment. One billion smartphones, 2.5 billion endpoints. (Diagram: an app becomes a tiered mobile app at 3X the effort (interfaces, end-to-end), or a published app at 3X (testing, user support, maintenance); a mobile system product costs roughly 10X.)
  36. State of the Art
      • Handheld testability: a little better
      • Mobile testing nightmare: remote device arrays; crowdsourced testing
      • End-to-end coverage: infrastructure much more robust; capacity much improved; no end-to-end test harness
      • No support for Internet of Things
      • Systems are bigger, more complex, and more critical: hic sunt dracones
  37. Mobile App Strategy
      Hiring Manager: "To what should you pay special attention when testing a mobile application, in comparison with, say, a standard web application?"
      "The best answer from 3 candidates with over 10 years of claimed mobile application testing experience listed on their CV, and all ISEB qualified, was, and I quote:"
      "It's more easy on the mobile app."
  38. Mobile App Strategy
      • Design for testability: minimize variations; you'll need tooling for each platform
      • Cost of failure is very high: an App Store rejection adds months; no second chance with users; the space is moving very fast
  39. Mobile App Strategy
      • Cover features and events: use and abuse cases; event plan, pairwise; manual, automated, RDA
      • Cover configurations: remote device array; crowdsource if necessary
      • Capacity test: cloud test services
      • Assess dragon-king risk: model-based, multi-dimensional
      (Figure: an event plan matrix of use cases UC01, UC02, … UC99 versus events such as foreground/background, background load, power sleep cycle, battery drain, incoming call, incoming text, camera usage, bar code scan, accelerometer, reboot, GPS impairments, WiFi impairments, and cellular impairments, with check marks showing which events are exercised in which use case.)
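One way to read "event plan, pairwise" is to bundle interrupting events into test runs so that every pair of events co-occurs in at least one run against some use case. The sketch below does this greedily with hypothetical event and use-case names; it illustrates the idea, not the speaker's actual tooling.

```python
from itertools import combinations

# Hypothetical event and use-case lists, echoing the slide's matrix.
EVENTS = [
    "foreground/background", "background load", "power sleep cycle",
    "battery drain", "incoming call", "incoming text", "camera usage",
    "bar code scan", "accelerometer", "reboot", "GPS impairment",
    "WiFi impairment", "cellular impairment",
]
USE_CASES = ["UC01", "UC02", "UC03", "UC99"]

def pairwise_event_plan(events, use_cases, events_per_run=4):
    """Bundle events into runs so every event pair co-occurs at least once,
    then assign runs to use cases round-robin."""
    uncovered = set(combinations(sorted(events), 2))
    runs = []
    while uncovered:
        run = list(next(iter(uncovered)))   # seed the run with a still-uncovered pair
        while len(run) < events_per_run:
            candidates = [e for e in events if e not in run]
            if not candidates:
                break
            # Prefer the event with the most uncovered pairs against the run so far.
            run.append(max(candidates,
                           key=lambda e: sum(tuple(sorted((e, r))) in uncovered
                                             for r in run)))
        uncovered -= {p for p in uncovered if p[0] in run and p[1] in run}
        runs.append(run)
    return [(use_cases[i % len(use_cases)], run) for i, run in enumerate(runs)]

if __name__ == "__main__":
    for uc, run in pairwise_event_plan(EVENTS, USE_CASES):
        print(f"{uc}: {', '.join(run)}")
```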
  40. Content and Image Sources
      • Big Picture: http://apod.nasa.gov/apod/ap071021.html
      • How do people want to use their devices: http://network.cisco.com/how-do-people-want-to-use-their-devices.html
      • Internet of Things: http://blogs.cisco.com/news/the-internet-of-things-infographic/
      • Functionality … Configuration Coverage, Event Plan: http://www.udemy.com/how-to-test-mobile-apps
      • Usability: http://www.pagetrafficbuzz.com/google-survey-reveals-75-mobilefriendly-sites/14736/
      • NativeDriver: Matt DeVore, Tomohiro Kaizu, Dezheng Xu, Daigo Hamura. Native Driver: Native App UI Automation with WebDriver API. 2011 Selenium Conference, San Francisco. http://www.nativedriver.googlecode.com/files/NativeDriver_introduction.pdf
      • Amazon Outage: Hidden bugs that made Amazon Web Service outage worse. CNET News, July 3, 2012. http://news.cnet.com/8301-1009_3-57465761-83/hidden-bugs-that-made-amazon-web-service-outage-worse/
      • Dragon Kings: Didier Sornette. Dragon-Kings, Black Swans and the Prediction of Crises. International Journal of Terraspace Science and Engineering, 2009. http://www.youtube.com/watch?v=FlTSbzOvKZI
      • Latent Positive Feedback = Dragon King: http://www.everythingselectric.com/forum/index.php?topic=244.0
      • Dragon King: http://wallpaper4me.com/wallpaper/Dragon-King-of-The-Land/
      • Crowdsourced Testing: http://mob4hire.com/
      • Hic Sunt Dracones: http://tomlytle.com/
      All other content Copyright © 2012, Robert V. Binder
  41. Why are mobile apps so popular?
      • Low cost
      • Any time, anywhere, anyone
      • Connectivity to everything
      • Complete control
      • Personal space
  42. Current Technology
      • Functionality and robustness. Plus: simple tools for all platforms. Minus: stack silos; brittle testware.
      • Deployment coverage. Plus: crowdsourced testing; remote device array. Minus: systematic coverage?; superficial.
      • Usability. Plus: crowdsourced testing.
      • Security. Plus: abuse cases. Minus: manual/technical.
      • Scalability. Plus: proxy and cloud load testing. Minus: happy paths can't find dragon kings.
      • Network of things. Minus: no framework for adapters.
      • Sophistication. Plus: platform IDEs. Minus: mostly manual; no model-based testing.
      • Attitudes. Minus: seen as "easier".
  43. Testing with AMATE
      • 1. Generate model (Model Builder)
      • 2. Generate test objects (Simulator)
      • 3. Run test objects (Test Agent and Test Driver against the AUT on the mobile device)
      • 4. Test Driver controls the AUT; the Digital Airlink Emulator (DALE) controls the airlink via a WiFi access point
      © 2005 mVerify Corporation
  44. Then and Now
      Then:
                        Handheld  M Infras  Server  E2E
      Scalability           1        2         4     2
      Mobility              1        2         0     0
      Connectivity          1        1         5     1
      Usability             1        3         4     1
      Security              1        3         4     1
      Controllability       1        2         5     1
      Observability         1        2         5     1
      Coverage              0        3         5     0
      Now:
                        Handheld  M Infras  Server  E2E  (unlabeled)
      Scalability           4        3         4     2       0
      Mobility              2        3         3     2       2
      Connectivity          4        3         5     2       1
      Usability             4        3         4     3       2
      Security              2        4         4     2       1
      Controllability       2        2         5     2       1
      Observability         2        2         5     2       1
      Coverage              4        3         5     3       3
  45. Beyond Manual Testing
      • Automated testing: device side; server side; test management; seat, open source, cloud
      • Automated performance/stress
      • Remote device sharing
      • Crowd-sourced testing services
      © 2011, Robert V. Binder. All Rights Reserved
