The Advanced Mobile Application Testing Environment: Project Report

Published on: A-MOST Workshop, ICSE 2005, St. Louis, May 16, 2005. Overview of challenges and results, with a demo of the AMATE project.

  1. mVerify™ ("A Million Users in a Box"®)
     The Advanced Mobile Application Testing Environment: Project Report
     Robert V. Binder, James E. Hanlon
     A-MOST Workshop, ICSE 2005, St. Louis, May 16, 2005
     www.mVerify.com
  2. Overview
     • Background and Motivation
     • Models
     • Demo
     • Lessons Learned
  3. Mobile App Challenges
     • Rubik's Cube complexity, many new failure modes:
       • Connectivity
       • Mobility
       • Scalability
       • Security
       • PLUS assure functionality, performance, integration
     • Adequate testing? End-to-end, realistic testing: the only hope for high reliability
  4. AMATE Project Background
     • Advanced Mobile Application Test Environment (AMATE)
     • Goal: achieve realistic end-to-end mobile testing
     • Approach: model-based, mobile-centric
       • Signal variation related to mobility
       • User behavior related to mobility
       • Traffic related to mobility
     • NIST/ATP funded R&D
  5. AMATE Technology Highlights
     • Tester's Assistant generates the initial model
       • Automatic inference (guess) of application semantics
     • Simulator generates tests from the model repository
       • Scheduled location-specific behavior
       • Scheduled location-specific airlink conditions
     • Digital AirLink Emulator varies the signal
       • Non-intrusive, software-defined radio
       • Consumes simulator-generated commands
       • Achieves controlled real-time airlink variation
     • Distributed control, observation, evaluation
       • Scalable (1:1000 fan-out, 2 levels, mobile device)
       • Tcl test object framework, Tk GUI, C++ drivers & controllers
       • Relational database
       • Web Services: XML, WSDL, SOAP, BEEP, HTTP
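     The slides name a Tcl test object framework but do not show its API, so the sketch below is purely hypothetical: a test object with setup, run, and verify steps, of the kind a test agent could schedule and fan out to drivers. None of these names are the actual AMATE interface.

       # Hypothetical test object sketch; all names are illustrative, not AMATE's API.
       namespace eval ::testObject::queryHostTime {
           # attach a driver to the device under test (stubbed for the sketch)
           proc setup {deviceId} {
               puts "setup: attach test driver to $deviceId"
           }
           # apply the scheduled input and capture the observed result
           proc run {} {
               puts "run: send Query input to the AUT"
               return "12:05"   ;# stand-in for the observed time display
           }
           # compare observed vs. expected result for automatic checking
           proc verify {observed expected} {
               return [expr {$observed eq $expected}]
           }
       }

       ::testObject::queryHostTime::setup handheld-01
       set observed [::testObject::queryHostTime::run]
       puts "pass: [::testObject::queryHostTime::verify $observed 12:05]"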
  6. Behavior Model: Extended Use Case
     • Tester's Assistant generated and user editable
     • The slide shows an extended use case table (flattened in this transcript) with one column per test-case variant (1-5):
       • Conditions (variable/object, value/state): test input conditions for automatic test input generation; logic combinations control test input data selection (Widget 1 Query, Widget 2 Set Time, Widget 3 DEL, Host Name Pick Valid / Enter Host Name)
       • Actions (variable/interface, value/result): required actions for automatic result checking (Host Name Display, Host Time Display, CE Time Display, Error Message)
       • Relative Frequency: 0.35, 0.20, 0.30, 0.10, 0.05; the usage profile controls the statistical distribution of generated test cases
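     A minimal sketch (not AMATE code) of how a usage profile could drive test-case selection: each extended use case variant is drawn with the relative frequency given in the table above. Variant names are illustrative.

       # Usage profile: {variant relativeFrequency} pairs from the slide's table.
       set profile {
           {variant-1 0.35}
           {variant-2 0.20}
           {variant-3 0.30}
           {variant-4 0.10}
           {variant-5 0.05}
       }

       # Draw one variant at random according to the profile weights.
       proc pickVariant {profile} {
           set r [expr {rand()}]
           set cum 0.0
           foreach entry $profile {
               set name   [lindex $entry 0]
               set weight [lindex $entry 1]
               set cum [expr {$cum + $weight}]
               if {$r <= $cum} { return $name }
           }
           # guard against rounding: fall back to the last variant
           return [lindex [lindex $profile end] 0]
       }

       # draw 10 test cases according to the profile
       for {set i 0} {$i < 10} {incr i} {
           puts [pickVariant $profile]
       }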
  7. Load Model
     • Vary the aggregate input rate:
       • Arc
       • Flat
       • Internet fractal
       • Negative ramp
       • Positive ramp
       • Random
       • Spikes
       • Square wave
       • Waves
     • [Chart: actual "Waves" loading]
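     As an illustration of the load model (not AMATE's own generator), the sketch below produces a "waves" style input-rate schedule: a baseline rate with a sinusoidal swing, sampled at fixed steps. The procedure name and the units (transactions per second) are assumptions for the example.

       # Sketch: "waves" aggregate load profile as (time, rate) pairs.
       proc wavesProfile {baseRate amplitude periodSec durationSec stepSec} {
           set pi 3.14159265358979
           set points {}
           for {set t 0} {$t < $durationSec} {set t [expr {$t + $stepSec}]} {
               set rate [expr {$baseRate + $amplitude * sin(2.0 * $pi * $t / $periodSec)}]
               if {$rate < 0} { set rate 0 }
               lappend points [list $t $rate]
           }
           return $points
       }

       # e.g. 50 tx/s baseline, +/-30 tx/s swing, 10-minute period, over one hour
       foreach point [wavesProfile 50 30 600 3600 60] {
           puts $point
       }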
  8. Mobility Model
     • Map generates a real-time itinerary for N virtual users
       • Location-specific signal strength
       • Location-specific end-user behavior
       • Controls the Airlink Emulator
     • [Diagram: signal propagation map with virtual users in 1-bar, 2-bar, and 3-bar signal zones]
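     A toy illustration of the location-to-signal mapping (the real AMATE map used modified ray-casting over quad trees, per the later lessons slide): a grid of bar values indexed by position. The grid, cell size, and procedure name are invented for the example.

       # Look up signal strength in "bars" for a position on a coarse grid map.
       proc signalBars {map x y cellSize} {
           set col [expr {int($x / $cellSize)}]
           set row [expr {int($y / $cellSize)}]
           return [lindex [lindex $map $row] $col]
       }

       # 3x3 grid of 100 m cells; strongest signal near the access point at (0,0)
       set map {
           {3 3 2}
           {3 2 1}
           {2 1 1}
       }
       puts "Bars at (150, 250): [signalBars $map 150 250 100]"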
  9. An AMATE Session
     1. Model Builder generates the model
     2. Simulator generates test objects
     3. Test Agent runs test objects through the Test Driver against the AUT on the mobile device
     4. Test Driver controls the AUT over WiFi; the Digital Airlink Emulator (DALE) controls the airlink at the access point
  10. Lessons: Test Automation
      • Goal: try to use/adapt standards, open source
      • Eclipse/Hyades interesting, but too Java-centric
      • FIPA agent model
        • Some concepts useful, but didn't need movement
      • Postgres + Win32 NRFPT
        • MySQL stable, fully featured
        • Relational-OO integration challenges
      • Extreme Programming x-Unit framework not scalable
      • Plumbing critical and expensive
        • CORBA too heavyweight, not web-friendly
        • Own message system abandoned: limited, buggy, expensive
        • BEEP + SOAP + transport (TCP, USB, named pipes, ...)
  11. Lessons: Model-Based Testing
      • Deterministic end-to-end model not feasible
        • Exogenous simulation strategy effective
      • RF propagation
        • Modified ray-casting, quad trees, DXF import
        • UMTS interference spec
      • XML limited as meta-data representation
      • TTCN complete and well-structured, but standalone
      • UML Test Profile still a muddle
      • Usability critical, hard to get right
        • Incremental composition, minimize dependencies
        • "Tester's Assistant"
  12. Lessons: Mobile Technology
      • Software Defined Radio price/performance 100x
      • Wavelength matters
      • Proprietary islands; HW/SW adapter framework
        • Win Mobile, J2ME, Symbian, BREW, ...
        • WiFi, CDMA, GSM, WCDMA, WiMax, Ultra WB, Mesh, RFID, ...
      • Fundamental limitations of lab testing
        • Scalability bandwidth limited
        • Virtualization doesn't scale over 1000s
        • Share fielded configuration
  13. Thank You, Open Source
      • Embedded: Tcl 8.4, ImageMagick, MySQL, Linux (Red Hat), Universal Software Radio Peripheral, GNU Radio (DSP interface)
      • Tools: gSOAP, SWIG, Subversion, Boost/Jam, RoboDoc, Tcl Wiki, Bugzilla, cppUnit, gEDA, PCB
      • Free, but not cheap ...
  14. Q&A
  15. Lessons: Process
      • Systems engineering + iterative development
        • Six development increments
        • Architecture matters, patterns useful
        • Plan to throw several away, you will anyhow
        • Result: family of OO frameworks
      • Full-time sys/tool admin necessary
      • XP-style testing, but not "Test First"
      • Customer dialog
        • Real problems, avoid duplication
      • Build shared vision through communication
