
ICSE'17 Tech Briefing - Automated GUI Testing of Android Apps: From Research to Practice


This is the slide deck for the technical briefing given by Kevin Moran, Mario Linares-Vásquez, and Denys Poshyvanyk at ICSE'17. The presentation covers: 1) the basics of GUI testing and its importance to mobile testing; 2) the current state of the art and practice for testing mobile applications; 3) current challenges facing researchers and practitioners aiming to design and build automated testing approaches; and 4) a new direction for automated mobile testing research, issued as guidance and a challenge for the SE community.



  1. 1. AUTOMATED GUI TESTING OF ANDROID APPS: FROM RESEARCH TO PRACTICE Kevin Moran Mario Linares-Vásquez Denys Poshyvanyk
  2. 2. Mario Linares-Vásquez Assistant Professor Universidad de los Andes Bogotá, Colombia m.linaresv@uniandes.edu.co http://sistemas.uniandes.edu.co/~mlinaresv Kevin Moran Ph.D. candidate College of William and Mary Williamsburg, VA, USA kpmoran@cs.wm.edu http://www.cs.wm.edu/~kpmoran Denys Poshyvanyk Associate Professor College of William and Mary Williamsburg, VA, USA denys@cs.wm.edu http://www.cs.wm.edu/~denys
  3. 3. PART 1: CURRENT STATE OF RESEARCH & PRACTICE PART 2: MOBILE TESTING CHALLENGES PART 0: BACKGROUND AND CORE CONCEPTS PART 3: PRELIMINARY SOLUTIONS & RESEARCH VISION
  4. 4. PART 0: BACKGROUND & CORE CONCEPTS
  5. 5. The Importance of GUI Testing • Several different types of testing are important for ensuring software quality:
  6. 6. The Importance of GUI Testing Unit Testing Performance Testing Regression Testing Integration Testing Compatibility Testing • Several different types of testing are important for ensuring software quality:
  7. 7. The Importance of GUI Testing Unit Testing Performance Testing Regression Testing Integration Testing Compatibility Testing • Several different types of testing are important for ensuring software quality: • For mobile, GUI-based testing subsumes many other types of testing • GUI testing is typically expensive, and test scripts are difficult to maintain • There is a clear opportunity for automation to improve development workflows
  8. 8. GUI Testing: The Main Idea UI Events
  9. 9. GUI Testing: The Main Idea UI Events Oracle
  10. 10. GUI Testing: The Main Idea Output, layout, exceptions, presentation logic, quality attributes, … UI Events Oracle
  11. 11. Test result GUI Testing: Core Concepts Oracle
  12. 12. GUI Testing: Core Concepts Oracle Test result
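The events-plus-oracle idea above can be sketched in a few lines of Python. This is a toy illustration, not any real framework's API: the app state, the events, and the oracle below are all hypothetical stand-ins.

```python
# Hypothetical sketch: a GUI test case is an event sequence plus an oracle
# that judges the resulting GUI state.

def run_gui_test(app_state, events, oracle):
    """Apply each UI event to the app state, then let the oracle decide."""
    for event in events:
        app_state = event(app_state)
    return "PASS" if oracle(app_state) else "FAIL"

# Toy app: a login screen modeled as a dict.
def type_username(state):
    state = dict(state)
    state["username"] = "alice"
    return state

def tap_login(state):
    state = dict(state)
    state["screen"] = "home" if state.get("username") else "login"
    return state

# Oracle: after logging in, we expect to land on the home screen.
oracle = lambda state: state["screen"] == "home"

result = run_gui_test({"screen": "login"}, [type_username, tap_login], oracle)
```

In real GUI testing the oracle would inspect output, layout, exceptions, or quality attributes, as the slide lists; the structure (drive events, then check) is the same.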
  13. 13. GUI Testing: Example Detecting and Localizing Internationalization Presentation Failures in Web Applications. Abdulmajeed Alameer, Sonal Mahajan, William G.J. Halfond. In Proceedings of the 9th IEEE International Conference on Software Testing, Verification, and Validation (ICST). April 2016.
  14. 14. GUI Testing: Example Detecting and Localizing Internationalization Presentation Failures in Web Applications. Abdulmajeed Alameer, Sonal Mahajan, William G.J. Halfond. In Proceedings of the 9th IEEE International Conference on Software Testing, Verification, and Validation (ICST). April 2016.
  15. 15. Inputs: combinatorial explosion GUI Testing (Challenges)
  16. 16. Inputs: combinatorial explosion GUI Testing (Challenges)
  17. 17. Inputs: combinatorial explosion Internationalization GUI Testing (Challenges)
  18. 18. Inputs: combinatorial explosion Internationalization GUI Testing (Challenges)
  19. 19. Inputs: combinatorial explosion Internationalization GUI Testing (Challenges) Responsive design
  20. 20. Inputs: combinatorial explosion Internationalization GUI Testing (Challenges) Responsive design
  21. 21. Inputs: combinatorial explosion Internationalization GUI Testing (Challenges) Responsive design Unexpected usage scenarios
  22. 22. Inputs: combinatorial explosion Internationalization GUI Testing (Challenges) Responsive design Unexpected usage scenarios
  23. 23. MONKEY TESTING !!
  24. 24. MONKEY TESTING !! AUTOMATED TESTING !!
  25. 25. Automated GUI Testing Output, layout, exceptions, presentation logic, quality attributes, … UI Events Monkey
  26. 26. ANDROID GUI TESTING
  27. 27. Unique Challenges in Mobile Development
  28. 28. Thousands of apps are released and updated every day on the online store
  29. 29. 2.8M apps - Google Play | 65B downloads - Google Play | 25 releases (Android) since 2008
  30. 30. Large volume of crowdsourced requirements and ratings
  31. 31. Fragmentation at device and OS level
  32. 32. Pressure for continuous delivery
  33. 33. Manual testing is still preferred
  34. 34. Mobile-specific quality attributes, inputs, and scenarios
  35. 35. PART 1: STATE OF RESEARCH & PRACTICE FOR MOBILE TESTING
  36. 36. Overview of Tools & Services
  37. 37. Overview of Tools & Services • Automation Frameworks & APIs
  38. 38. Overview of Tools & Services • Automation Frameworks & APIs • Record & Replay Tools
  39. 39. Overview of Tools & Services • Automation Frameworks & APIs • Record & Replay Tools • Automated Input Generation Tools
  40. 40. Overview of Tools & Services • Automation Frameworks & APIs • Record & Replay Tools • Automated Input Generation Tools • Bug & Error Reporting
  41. 41. Overview of Tools & Services • Automation Frameworks & APIs • Record & Replay Tools • Automated Input Generation Tools • Bug & Error Reporting • Crowdsourced Testing
  42. 42. Overview of Tools & Services • Automation Frameworks & APIs • Record & Replay Tools • Automated Input Generation Tools • Bug & Error Reporting • Crowdsourced Testing • Cloud Testing Services
  43. 43. Overview of Tools & Services • Automation Frameworks & APIs • Record & Replay Tools • Automated Input Generation Tools • Bug & Error Reporting • Crowdsourced Testing • Cloud Testing Services • Device Streaming Tools
  44. 44. Overview of Tools & Services • Automation Frameworks & APIs • Record & Replay Tools • Automated Input Generation Tools • Bug & Error Reporting • Crowdsourced Testing • Cloud Testing Services • Device Streaming Tools }Traditional Android Testing Tools and Approaches
  45. 45. Overview of Tools & Services • Automation Frameworks & APIs • Record & Replay Tools • Automated Input Generation Tools • Bug & Error Reporting • Crowdsourced Testing • Cloud Testing Services • Device Streaming Tools } } Traditional Android Testing Tools and Approaches Bug Reporting, Crowdsourcing and Services
  46. 46. ANDROID TESTING TOOLS & APPROACHES
  47. 47. Automation Frameworks/APIs (AF/A) TESTS JUnit, Espresso, UI Automator, Robotium Monkey
  48. 48. Testing Automation Frameworks/APIs UI Automator
  49. 49. Testing Automation Frameworks/APIs https://github.com/googlesamples/android-testing
  50. 50. Testing Automation Frameworks/APIs https://github.com/googlesamples/android-testing
  51. 51. Testing Automation Frameworks/APIs https://github.com/googlesamples/android-testing
  52. 52. Testing Automation Frameworks/APIs https://github.com/googlesamples/android-testing
  53. 53. Testing Automation Frameworks/APIs https://github.com/googlesamples/android-testing
  54. 54. Testing Automation Frameworks/APIs https://github.com/googlesamples/android-testing
  55. 55. Tools: Hierarchy Viewer
  56. 56. Tools: Hierarchy Viewer
  57. 57. Tools: UIAutomator Viewer
  58. 58. Tools: UIAutomator Viewer
  59. 59. Pros and Cons Automation Frameworks ✓ Easy reproduction ✓ High level syntax ✓ Black box testing - Learning curve - User-defined oracles - Expensive maintenance
  60. 60. Record and Replay (R&R) AUT/SUT UI Events
  61. 61. Record and Replay (R&R) AUT/SUT UI Events Recorder Script
  62. 62. Record and Replay (R&R) AUT/SUT UI Events Recorder Script Scripts
  63. 63. Record and Replay (R&R) AUT/SUT UI Events Recorder Script Scripts UI Events Monkey AUT/SUT UI Events
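The record-and-replay loop in the diagram can be sketched as follows. This is illustrative Python only; `Recorder`, `replay`, and the event names are invented for this example and are not part of any real tool.

```python
# Record & Replay sketch: the recorder logs (action, target) pairs as the
# user drives the AUT; the replayer re-applies them to a fresh app model.

class Recorder:
    def __init__(self):
        self.script = []          # the recorded test script

    def record(self, action, target):
        self.script.append((action, target))

def replay(script, app):
    """Re-apply each recorded event to an app model (here: a plain dict)."""
    for action, target in script:
        app.setdefault(target, []).append(action)
    return app

rec = Recorder()
rec.record("tap", "btn_ok")
rec.record("type:hello", "edit_msg")

state = replay(rec.script, {})
```

Real recorders often capture raw screen coordinates rather than widget ids, which is exactly why the "Pros and Cons" slide lists scripts as coupled to locations: a layout change on a different device invalidates the recording.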
  64. 64. Tools: Barista http://checkdroid.com/barista/
  65. 65. Tools: Barista http://checkdroid.com/barista/
  66. 66. Tools: Barista http://checkdroid.com/barista/
  67. 67. Tools: ODBR www.android-dev-tools.com/odbr
  68. 68. Tools: ODBR www.android-dev-tools.com/odbr
  69. 69. Pros and Cons Automation Frameworks ✓ Easy reproduction ✓ High level syntax ✓ Black box testing - Learning curve - User-defined oracles - Expensive maintenance Record & Replay ✓ Easy reproduction - Expensive collection and maintenance - Coupled to locations
  70. 70. Automated Input Generation (AIG) Techniques
  71. 71. Automated Input Generation (AIG) Techniques • Differing Goals:
  72. 72. Automated Input Generation (AIG) Techniques • Differing Goals: • Code Coverage
  73. 73. Automated Input Generation (AIG) Techniques • Differing Goals: • Code Coverage • Crashes
  74. 74. Automated Input Generation (AIG) Techniques • Differing Goals: • Code Coverage • Crashes • Mimic Real Usages
  75. 75. Automated Input Generation (AIG) Techniques • Differing Goals: • Code Coverage • Crashes • Mimic Real Usages • Three Main Types:
  76. 76. Automated Input Generation (AIG) Techniques • Differing Goals: • Code Coverage • Crashes • Mimic Real Usages • Three Main Types: • Random-Based
  77. 77. Automated Input Generation (AIG) Techniques • Differing Goals: • Code Coverage • Crashes • Mimic Real Usages • Three Main Types: • Random-Based • Systematic
  78. 78. Automated Input Generation (AIG) Techniques • Differing Goals: • Code Coverage • Crashes • Mimic Real Usages • Three Main Types: • Random-Based • Systematic • Model-Based
  79. 79. Random/Fuzz Testing (R/FT) Monkey X or Y ? AUT/SUT
  80. 80. Random/Fuzz Testing (R/FT) Monkey X or Y ? AUT/SUT Event x Invalid
  81. 81. Random/Fuzz Testing (R/FT) Monkey X or Y ? AUT/SUT Event x Invalid Event Y Valid
  82. 82. Random/Fuzz Testing (R/FT) ./adb shell monkey -p com.evancharlton.mileage 10000
  83. 83. Random/Fuzz Testing (R/FT) ./adb shell monkey -p com.evancharlton.mileage 10000
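Conceptually, `monkey` is a seeded pseudo-random UI event stream. A rough Python analogue is below; the event types and screen dimensions are assumptions for illustration, not what the real tool emits.

```python
import random

# Sketch of what `adb shell monkey` does conceptually: emit a stream of
# pseudo-random UI events. Many will be invalid for the current screen,
# which is one of the listed drawbacks of random-based AIG.

EVENT_TYPES = ["tap", "swipe", "back", "rotate", "volume"]

def monkey(seed, count, width=1080, height=1920):
    rng = random.Random(seed)     # seeding makes the run reproducible
    events = []
    for _ in range(count):
        kind = rng.choice(EVENT_TYPES)
        events.append((kind, rng.randrange(width), rng.randrange(height)))
    return events

events = monkey(seed=42, count=10000)
```

The seed is what makes a crash found by a random run reproducible: replaying the same seed regenerates the identical event stream.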
  84. 84. Pros and Cons Automation Frameworks ✓ Easy reproduction ✓ High level syntax ✓ Black box testing - Learning curve - User-defined oracles - Expensive maintenance Record & Replay ✓ Easy reproduction - Expensive collection and maintenance - Coupled to locations AIG: Random Based ✓ Fast execution ✓ Good at finding crashes - Invalid events - Lack of expressiveness
  85. 85. Aside: GUI Ripping Ripper/Extractor ModelAUT/SUT Monkey
  86. 86. Aside: GUI Ripping Ripper/Extractor ModelAUT/SUT Monkey Snapshot 1
  87. 87. Aside: GUI Ripping Ripper/Extractor ModelAUT/SUT Monkey Snapshot 1 Snapshot 1
  88. 88. Aside: GUI Ripping Ripper/Extractor ModelAUT/SUT Monkey Snapshot 1 Snapshot 1 GUI State 1
  89. 89. Aside: GUI Ripping Ripper/Extractor ModelAUT/SUT Monkey Snapshot 1 Snapshot 1 GUI State 1 A or B ?
  90. 90. Aside: GUI Ripping Ripper/Extractor ModelAUT/SUT Monkey Snapshot 1 Snapshot 1 GUI State 1 Event 1 A or B ?
  91. 91. Aside: GUI Ripping Ripper/Extractor ModelAUT/SUT Monkey Snapshot 1 Snapshot 1 GUI State 1 Event 1 Snapshot 2 Snapshot 2 GUI State 2 A or B ?
  92. 92. GUI State extraction Ripper/Extractor Computer/Mobile device OS - Framework - API - Utilities GUI State Events Monkey
  93. 93. GUI State extraction
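A GUI ripper can be sketched as a loop that snapshots the current GUI state, fires an event not yet explored from that state, and records the transition in a model. The sketch below runs against a hypothetical two-screen app; all names are illustrative.

```python
# GUI-ripping sketch: grow a state graph (state -> {event: next_state})
# by repeatedly firing unexplored events against the AUT.

def rip(app, start, max_steps=100):
    model = {}
    state, steps = start, 0
    while steps < max_steps:
        model.setdefault(state, {})
        events = app.enabled_events(state)
        todo = [e for e in events if e not in model[state]]
        if not todo:
            break                 # nothing new to explore from this state
        event = todo[0]
        nxt = app.fire(state, event)
        model[state][event] = nxt
        state, steps = nxt, steps + 1
    return model

class ToyApp:
    """Stand-in for a device driven via the extraction layer above."""
    GRAPH = {"main": {"tap_settings": "settings"},
             "settings": {"back": "main"}}

    def enabled_events(self, s):
        return list(self.GRAPH[s])

    def fire(self, s, e):
        return self.GRAPH[s][e]

model = rip(ToyApp(), "main")
```

A real ripper must also decide when two snapshots denote the *same* GUI state, which is where the state-fingerprinting problem discussed later (hashing the hierarchy) comes in.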
  94. 94. Monkey A or B ? Systematic Exploration
  95. 95. Monkey A or B ? Breadth-First (BF)Depth-First (DF) Systematic Exploration
  96. 96. Monkey A or B ? Breadth-First (BF)Depth-First (DF) Random (Uniform) Random (A-priori distr.) Other options (online decision) Systematic Exploration
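The breadth-first versus depth-first choice amounts to which end of the frontier the next state is taken from. A minimal sketch over a toy GUI state graph (the graph itself is invented for illustration):

```python
from collections import deque

# Systematic exploration sketch: BFS vs. DFS over a GUI state graph.
# Real tools discover edges on the fly by driving a device; here the
# graph is given up front.

GUI = {"main": ["settings", "compose"],
       "settings": ["about"],
       "compose": [],
       "about": []}

def explore(start, breadth_first=True):
    frontier = deque([start])
    visited, order = {start}, []
    while frontier:
        # BFS pops the oldest state; DFS pops the newest.
        state = frontier.popleft() if breadth_first else frontier.pop()
        order.append(state)
        for nxt in GUI[state]:
            if nxt not in visited:
                visited.add(nxt)
                frontier.append(nxt)
    return order

bfs_order = explore("main", breadth_first=True)
dfs_order = explore("main", breadth_first=False)
```

The random variants on the slide (uniform, a-priori distribution, online decision) differ only in how the next event is chosen from the frontier, not in the overall loop.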
  97. 97. Tools: Google Robo Test https://firebase.google.com/docs/test-lab/robo-ux-test
  98. 98. Pros and Cons Automation Frameworks ✓ Easy reproduction ✓ High level syntax ✓ Black box testing - Learning curve - User-defined oracles - Expensive maintenance Record & Replay ✓ Easy reproduction - Expensive collection and maintenance - Coupled to locations AIG: Random Based ✓ Fast execution ✓ Good at finding crashes - Invalid events - Lack of expressiveness AIG: Systematic ✓ Achieves reasonable coverage - May miss crashes - Can be time consuming - Typically cannot exercise complex features
  99. 99. Model-Based Testing (MBT) UI Events Model Monkey AUT/SUT
  100. 100. Model-Based Testing (MBT) UI Events Model Monkey AUT/SUT - Manually generated - Automatically generated (source code) - Ripped at runtime (upfront) - Ripped at runtime (interactive)
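Once a model exists, whatever its origin, test cases can be generated as event paths through it. A bounded-depth sketch (the model is a hypothetical three-screen app):

```python
# Model-based test generation sketch: enumerate event paths up to a
# bounded length from a GUI model; each path is a candidate test case.
# Bounding the depth is one crude answer to the state-explosion problem.

MODEL = {"login": {"submit": "home"},
         "home": {"open_settings": "settings", "logout": "login"},
         "settings": {"back": "home"}}

def gen_tests(state, depth):
    if depth == 0:
        return [[]]
    paths = [[]]                  # include the empty prefix at each level
    for event, nxt in MODEL.get(state, {}).items():
        for tail in gen_tests(nxt, depth - 1):
            paths.append([event] + tail)
    return paths

tests = gen_tests("login", 2)
```

Even this toy model shows the con listed on the next slide: if the model is incomplete or stale, the generated sequences can be invalid on the real app.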
  101. 101. Pros and Cons Automation Frameworks ✓ Easy reproduction ✓ High level syntax ✓ Black box testing - Learning curve - User-defined oracles - Expensive maintenance Record & Replay ✓ Easy reproduction - Expensive collection and maintenance - Coupled to locations AIG: Random Based ✓ Fast execution ✓ Good at finding crashes - Invalid events - Lack of expressiveness AIG: Systematic ✓ Achieves reasonable coverage - May miss crashes - Can be time consuming - Typically cannot exercise complex features AIG: Model Based ✓ Event sequences ✓ Automatic exploration - Some invalid sequences - State explosion - Incomplete models
  102. 102. Other Types of AIG Approaches • Recently, new approaches have been introduced for AIG: • Search-Based Approaches1 • Symbolic/Concolic Execution2 • NLP for Text Input Generation - @ICSE17 1Ke Mao, Mark Harman, and Yue Jia. 2016. Sapienz: multi-objective automated testing for Android applications. In Proceedings of the 25th International Symposium on Software Testing and Analysis (ISSTA 2016) 2Nariman Mirzaei, Joshua Garcia, Hamid Bagheri, Alireza Sadeghi, and Sam Malek. 2016. Reducing combinatorics in GUI testing of android applications. In Proceedings of the 38th International Conference on Software Engineering (ICSE '16)
  103. 103. BUG REPORTING, CROWDSOURCING, & SERVICES
  104. 104. Bug Reporting Services AUT/SUTThird Party Library Web Service
  105. 105. Bug Reporting Services
  106. 106. Bug Reporting Services • Features of Bug Reporting Services:
  107. 107. Bug Reporting Services • Features of Bug Reporting Services: • Video Recording
  108. 108. Bug Reporting Services • Features of Bug Reporting Services: • Video Recording • App Analytics
  109. 109. Bug Reporting Services • Features of Bug Reporting Services: • Video Recording • App Analytics • Automated Crash Reporting
  110. 110. Bug Reporting Services • Features of Bug Reporting Services: • Video Recording • App Analytics • Automated Crash Reporting
  111. 111. Pros and Cons Bug Reporting Services ✓ Allows for more details about field failures ✓ App analytics can help with UI/UX design - Can be expensive - Requires integrating a library - Typically do not report GUI traces
  112. 112. Crowdsourced Testing AUT/SUTThird Party Service
  113. 113. Crowdsourced Testing
  114. 114. Crowdsourced Testing • Types of Crowdsourced Testing Services:
  115. 115. Crowdsourced Testing • Types of Crowdsourced Testing Services: • Expert Testing
  116. 116. Crowdsourced Testing • Types of Crowdsourced Testing Services: • Expert Testing • Functional Testing
  117. 117. Crowdsourced Testing • Types of Crowdsourced Testing Services: • Expert Testing • Functional Testing • UX Testing
  118. 118. Crowdsourced Testing • Types of Crowdsourced Testing Services: • Expert Testing • Functional Testing • UX Testing • Security Testing
  119. 119. Crowdsourced Testing • Types of Crowdsourced Testing Services: • Expert Testing • Functional Testing • UX Testing • Security Testing • Localization Testing
  120. 120. Usability Testing https://www.thoughtworks.com/insights/blog/recording-mobile-device-usability-testing-sessions-–-guerrilla-style
  121. 121. Usability Testing https://www.thoughtworks.com/insights/blog/recording-mobile-device-usability-testing-sessions-–-guerrilla-style
  122. 122. Usability Testing https://www.mrtappy.com
  123. 123. Crowdsourced Testing: Examples https://www.pay4bugs.com Expert Testers Functional Testing UX Testing Security Testing Localization Testing
  124. 124. http://testarmy.com/crowd-testing Expert Testers Functional Testing UX Testing Security Testing Localization Testing Crowdsourced Testing: Examples
  125. 125. https://www.applause.com/testing/ Expert Testers Functional Testing UX Testing Security Testing Localization Testing Crowdsourced Testing: Examples
  126. 126. Pros and Cons Crowdsourcing Services ✓ Low effort required from developers ✓ Expert testers might uncover unexpected bugs - Can be expensive - May not fit within Agile workflows - Quality of reports can vary Bug Reporting Services ✓ Allows for more details about field failures ✓ App analytics can help with UI/UX design - Can be expensive - Requires integrating a library - Typically do not report GUI traces
  127. 127. Cloud Testing & Device Streaming (Device farms) Third Party Service Third Party Service
  128. 128. Tools: Xamarin Test Cloud https://www.xamarin.com/test-cloud
  129. 129. Tools: AWS device farm https://aws.amazon.com/device-farm/
  130. 130. Tools: Google Cloud Test Lab https://developers.google.com/cloud-test-lab/
  131. 131. Tools: Vysor http://www.vysor.io
  132. 132. Tools: Vysor http://www.vysor.io
  133. 133. Tools: Vysor http://www.vysor.io
  134. 134. Tools: STF http://openstf.io and SEMERU extension
  135. 135. Tools: STF http://openstf.io and SEMERU extension
  136. 136. Pros and Cons Crowdsourcing Services ✓ Low effort required from developers ✓ Expert testers might uncover unexpected bugs - Can be expensive - May not fit within Agile workflows - Quality of reports can vary Bug Reporting Services ✓ Allows for more details about field failures ✓ App analytics can help with UI/UX design - Can be expensive - Requires integrating a library - Typically do not report GUI traces Device Streaming ✓ Allows remote users access to controlled devices ✓ Allows for collection of detailed user information - Can be difficult to configure - Relies on a strong network connection - Cannot simulate mobile-specific contexts like sensors
  137. 137. PART 2: CURRENT CHALLENGES IN MOBILE TESTING
  138. 138. Accidental Challenges in Mobile GUI-Testing
  139. 139. Accidental Challenges in Mobile GUI-Testing Accidental Challenges
  140. 140. Accidental Challenges in Mobile GUI-Testing Accidental Challenges Application Data and Cold Starts
  141. 141. Accidental Challenges in Mobile GUI-Testing Accidental Challenges Application Data and Cold Starts Bugs in Framework Utilities
  142. 142. Accidental Challenges in Mobile GUI-Testing Accidental Challenges Application Data and Cold Starts Bugs in Framework Utilities Bugs and Limitations in the SDK Tools
  143. 143. Accidental Challenges in Mobile GUI-Testing Accidental Challenges Application Data and Cold Starts Bugs in Framework Utilities Bugs and Limitations in the SDK Tools Challenges Scaling Concurrent Virtual Devices
  144. 144. Essential Challenges in Mobile GUI-Testing
  145. 145. Essential Challenges in Mobile GUI-Testing Essential Challenges
  146. 146. Essential Challenges in Mobile GUI-Testing Essential Challenges Test Oracles
  147. 147. Essential Challenges in Mobile GUI-Testing Essential Challenges Test Oracles Event Coordination
  148. 148. Essential Challenges in Mobile GUI-Testing Essential Challenges Test Oracles Event Coordination External Features
  149. 149. Essential Challenges in Mobile GUI-Testing Essential Challenges Test Oracles Event Coordination External Features External Dependencies
  150. 150. Accidental Challenges: Application Data and Cold Starts
  151. 151. Accidental Challenges: Application Data and Cold Starts adb shell rm <app-data-folder> adb shell pm clear <app-package-name>
  152. 152. Accidental Challenges: Bugs in Framework Utilities
  153. 153. Accidental Challenges: Bugs in Framework Utilities
  154. 154. Accidental Challenges: Bugs in Framework Utilities adb shell /system/bin/uiautomator dump /sdcard/ui_dump.xml
  155. 155. Accidental Challenges: Bugs in Framework Utilities adb shell /system/bin/uiautomator dump /sdcard/ui_dump.xml
  156. 156. Accidental Challenges: Bugs in Framework Utilities adb shell /system/bin/uiautomator dump /sdcard/ui_dump.xml Solution: Modify Framework Components for the AOSP
  157. 157. Accidental Challenges: Bugs and Limitations in the SDK Tools adb Server 1 adb Server 2
  158. 158. Accidental Challenges: Bugs and Limitations in the SDK Tools adb Server 1 adb Server 2 No more than 15 devices per adb server instance!
  159. 159. Accidental Challenges: Scaling Concurrent Virtual Devices Potential Solution: Use android-x86 virtual machines
  160. 160. Accidental Challenges: Scaling Concurrent Virtual Devices Potential Solution: Use android-x86 virtual machines
  161. 161. Essential Challenges: Test Oracles Potential Solution: Use a hash of the current screen hierarchy Drawbacks: May miss some fine-grained information about the GUI (e.g., colors, images)
  162. 162. Essential Challenges: Test Oracles Potential Solution: Use a hash of the current screen hierarchy adb shell /system/bin/uiautomator dump /sdcard/ui_dump.xml Drawbacks: May miss some fine-grained information about the GUI (e.g., colors, images)
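The hash-based oracle can be sketched as follows: fingerprint the XML dump of the screen hierarchy and compare fingerprints across runs. The XML snippets below are simplified stand-ins for real uiautomator dumps.

```python
import hashlib

# Sketch of the hash-based GUI oracle: equal fingerprints mean the screen
# hierarchies are structurally identical. As the slide notes, pixel-level
# details (colors, images) are invisible to this oracle, since they do
# not appear in the hierarchy dump.

def screen_fingerprint(ui_dump_xml: str) -> str:
    return hashlib.sha256(ui_dump_xml.encode("utf-8")).hexdigest()

before = '<node class="android.widget.Button" text="OK"/>'
after  = '<node class="android.widget.Button" text="Cancel"/>'

same = screen_fingerprint(before) == screen_fingerprint(before)
changed = screen_fingerprint(before) != screen_fingerprint(after)
```

In practice, volatile attributes (timestamps, scroll offsets) would need to be stripped before hashing, or every run would look like a failure.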
  163. 163. Essential Challenges: Event Coordination
  164. 164. Essential Challenges: Event Coordination adb shell dumpsys window -a mAppTransitionState APP_STATE_READY APP_STATE_IDLE APP_STATE_TIMEOUT APP_STATE_RUNNING
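The `dumpsys` states on the slide suggest a simple coordination strategy: poll the window manager until the app reports an idle state before firing the next event, instead of firing events blindly. A sketch, with a simulated device standing in for real `adb shell dumpsys window` output:

```python
import time

# Event-coordination sketch: wait for the app to settle before sending
# the next UI event. `query_state` is a hypothetical stand-in for code
# that parses the mAppTransitionState value out of dumpsys output.

def wait_until_ready(query_state, timeout_s=5.0, poll_s=0.01):
    deadline = time.monotonic() + timeout_s
    while time.monotonic() < deadline:
        if query_state() == "APP_STATE_IDLE":
            return True
        time.sleep(poll_s)
    return False          # timed out: the app never settled

# Simulated device: reports RUNNING twice, then IDLE.
states = iter(["APP_STATE_RUNNING", "APP_STATE_RUNNING", "APP_STATE_IDLE"])
ready = wait_until_ready(lambda: next(states, "APP_STATE_IDLE"))
```

The timeout matters: APP_STATE_TIMEOUT on the slide is precisely the case where waiting forever would hang the whole test run.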
  165. 165. Essential Challenges: External Features & Dependencies Potential Solution: Use real usage data from the field?
  166. 166. PART 3: FUTURE RESEARCH VISION
  167. 167. Fully automated mobile testing: Are we there yet? APP - Oracles - Models - Scripts - Strategies - Fuzzers - Rippers - MBT - Automation APIs
  168. 168. Key Challenges 1. Open problems without solutions (e.g., oracles generation) 2. Facets of mobile testing not investigated yet (e.g., mobile-specific fault models) 3. Specific “characteristics” in mobile testing that drastically impact the testing process (e.g., fragmentation)
  169. 169. Key Challenges: Fragmentation
  170. 170. Key Challenges: Fragmentation https://developer.android.com/about/dashboards/index.html 25 API versions 7 API versions 97% OS Marketshare
  171. 171. Key Challenges: Fragmentation http://iossupportmatrix.com/ [iOS Support Matrix chart (Summer 2016): every iOS device from the original iPhone (June 2007) to the iPhone SE and iPad Pro 9.7" (March 2016), with each device's chipset (A4 through A9) and RAM (128MB to 4GB), the range of iOS versions it supports (iPhone OS 1.0 through iOS 10), and Apple's internal code names for each iOS release]
  172. 172. Key Challenges: Lack of fault models
  173. 173. Key Challenges: Lack of fault models Fault model (a.k.a., fault profile): Catalog describing recurrent (or uncommon) bugs associated with software developed for a particular platform or domain Fault Model Potential locations/features in source code to be tested
  174. 174. Key Challenges: History awareness in test cases
  175. 175. Key Challenges: History awareness in test cases Current automated techniques do not support system/acceptance testing because they are not history aware. - Fuzzers are totally random - Rippers follow heuristics for GUI traversal - Test scripts are hard-coded
  176. 176. Key Challenges: Evolution/maintenance of test scripts
  177. 177. Key Challenges: Evolution/maintenance of test scripts Location-coupled scripts Need different versions for each device and are highly sensitive to changes in the GUI Device/Location-agnostic scripts Are resilient to device fragmentation but need to be updated when the GUI changes in terms of added/deleted components
  178. 178. Key Challenges: Oracles generation
  179. 179. Key Challenges: Oracles generation Manually Codified Oracles (MCO) Usually implemented as assertions or expected exceptions Exceptions-As-Oracle (EAO) Crashes and errors reported during the execution of a test case Gui-state-As-Oracle (GAO) Expected transitions between windows, and GUI screenshots
  180. 180. Key Challenges: Multi-goal testing
  181. 181. Key Challenges Challenge Solution Fragmentation Fault models History-awareness Multi-Goal testing Test scripts evolution Oracles generation Crowd/cloud based testing Research is needed MBT is a promise Open problem Open problem Open problem
  182. 182. The CEL Testing Principles A "fully" automated solution for testing mobile apps should continually generate test cases, underlying models, oracles, and actionable reports of detected errors, without human intervention.
  183. 183. The CEL Testing Principles Automated testing of mobile apps should help developers increase software quality within the following constraints: (i) restricted time/budget for testing, (ii) the need for diverse types of testing, and (iii) pressure from users for continuous delivery
  184. 184. The CEL Testing Principles Continuous: - Continuous multi-goal testing under different environmental conditions - Any change to the source code or environment should trigger, automatically, a testing iteration on the current version of the app - Test cases executed during the iteration should cover only the impact set of the changes that triggered the iteration
  185. 185. The CEL Testing Principles Evolutionary: - App source code and testing artifacts (i.e., the models, the test cases, and the oracles) should not evolve independently of one another - The testing artifacts should adapt automatically to changes in (i) the app, (ii) the usage patterns, and (iii) the available devices/OSes
  186. 186. The CEL Testing Principles Large-scale: - CEL testing should be supported on infrastructures for parallel execution of test cases on physical or virtual devices. - The large-scale engine should be accessible in the context of both cloud and on- premise hardware -This engine should enable execution of test cases that simulate real conditions in-the-wild.
  187. 187. CEL Testing: Proposed Architecture APIs evolution monitor Changes Monitoring Subsystem Developers User reviews APIs Users On-device usages monitor Source code changes monitor Markets monitor Source code Impact analyzer Execution traces+ logs analyzer API changes analyzer User reviews + ratings analyzer Models repository Models generator Test cases generator Large Execution Engine Containers Manager Reports repository Reports generator Test cases runner Testing Artifacts Generation Artifacts repository Multi-model generator Multi-model repository Test cases + oracles Domain, GUI, Usage, Contextual, Faults models Multi-model
  188. 188. Multi-Model GUI model Usage model Domain model Context model Fault model Windows, transitions, components Single use cases and combinations; (un)common usages Entities, data types, values, enumerations External events, sensors, connectivity (Un)Common programming errors
  189. 189. Multi-Model + GUI model Usage model Domain model Context model Fault model
  190. 190. Multi-Model + GUI model Usage model Domain model Context model Fault model Multi-model
  191. 191. Multi-Model + GUI model Usage model Domain model Context model Fault model Multi-model - Multi-goal testing - Oracles generation - On-demand test cases generation
  192. 192. SEMERU-Tools - MonkeyLab: New method for Modeling Actions - CrashScope: Practical Automated Testing - Fusion: Enhanced Bug Reporting - CrashDroid: Automated Crash Reproduction - GEMMA: Multi-Objective Energy Optimization - ODBR: On-Device Bug Reporting
  193. 193. THANK YOU !! QUESTIONS/DISCUSSION? kpmoran@cs.wm.edu m.linaresv@uniandes.edu.co denys@cs.wm.edu
  194. 194. Discussion Questions • Potential solutions to challenges we covered? • Other future research directions? • How does the rise of native web-programming frameworks for mobile apps impact testing?
