Mobile Testing in the Cloud


Helping QA organizations manage the challenges of a mobile-first world.

Join Rachel Obstler, Sr. Director of Product Management at Keynote Systems, as she covers how organizations are rapidly deploying mobile versions of their customer-facing and internal applications.

With the growing prevalence of agile approaches and an ever-increasing diversity of devices and OS versions, testers are being asked to accomplish more testing in less time.

Rachel shares how leading enterprises are improving the efficiency of their mobile testing with automation, and how they identify the right processes and tools for the job. Drawing on statistics from Keynote's recent mobile quality survey of more than 69,000 mobile app developers and QA organizations at top US enterprises, Rachel dives into the challenges identified in the survey and shows how to improve your testing process by optimizing your device testing strategy and automating your mobile tests.


Mobile Testing in the Cloud

  1. Mobile Testing in the Cloud: Helping QA organizations manage the challenges of a mobile-first world. Rachel Obstler, May 8, 2014
  2. Contents: The Mobile Imperative • The Challenge of Mobile Quality • Solutions • Optimizing Your Device Testing Strategy • Automation
  3. The Mobile Imperative: 102 billion apps downloaded in 2013 [1]; $77 billion in revenue anticipated to be generated through mobile apps by 2017 [1]; mobile is the #1 retailer technology priority [2]; 2 of 3 banks predict 100% of their customers will use mobile services by 2017 [3]. (Sources: 1. Gartner; 2. Forrester, State of Retailing Online, 2014; 3. Metaforic UK)
  4. The Mobile Imperative: only 16% of consumers surveyed are willing to give a problematic mobile app more than one attempt [1]; users expect a mobile app to be responsive within 3 seconds [3]; a mobile-friendly site makes 2 of 3 users more likely to buy a company's product or service [2]. (Sources: 1. Econsultancy; 2. Forbes; 3. ZDNet)
  5. Survey: Quality Expectations of Mobile Websites & Apps
     Question: "Now when thinking about quality expectations of mobile websites and applications, which of the following is true?" (1,314 respondents)
     • Quality expectations for mobile applications and websites are lower than desktop: 10%
     • Quality expectations for mobile applications and websites are the same as desktop: 49%
     • Quality expectations for mobile applications and websites are higher than desktop: 36%
     • Not sure: 5%
     Almost half of the respondents think mobile quality expectations are the same as desktop, and more than a third think they are higher.
  6. Challenge: Mobile = Devices
     • Device platforms, fragmentation, and growth: multiple OSs, form factors, and screen resolutions; frequent device refresh
     • New capabilities to test: camera, GPS, direction, orientation, voice, etc.
     • Network considerations: multiple carriers, variable throughput and latency, disconnected use, network switching
  7. Challenge: Mobile = Faster, More Iterative, Continuous Activity (lifecycle diagram: Design & Develop, Instrument, Test, Scan & Certify, Deploy, Manage, Obtain Insight, Integrate)
  8. Challenge: Mobile = Late to the Game. Employee skillsets: HP QC/IBM RQM users • HP UFT/QTP users • automation engineers/programmers • manual testers
  9. Challenge: Mobile = Spread Across the Organization. Organizations have multiple teams • a geographically diverse employee base • IT security requirements
  10. Survey: The Mobile Testing Organization is Largely Decentralized
      Question: "How is mobile website and application testing organized within your company?" Percentages by vertical: All Verticals (1,590) / High Tech, IT & Software (661) / Financial Services & Insurance (234) / Telecom (151) / Media (139) / Retail* (57).
      • Centralized testing group, mobile-specific: 19.7% / 20.3% / 20.9% / 19.9% / 22.3% / 17.5%
      • Centralized testing group, not mobile-specific: 12.6% / 12.0% / 12.8% / 9.9% / 18.7% / 10.5%
      • Individual QA groups within business units/divisions, mobile-specific: 35.8% / 38.1% / 36.3% / 44.4% / 36.0% / 36.8%
      • Individual QA groups within business units/divisions, not mobile-specific: 20.8% / 22.5% / 20.5% / 13.9% / 14.4% / 17.5%
      • Testing is predominantly outsourced: 3.5% / 2.6% / 3.4% / 2.6% / 3.6% / 8.8%
      • Currently not testing any mobile apps or websites: 7.5% / 4.5% / 6.0% / 9.3% / 5.0% / 8.8%
      Takeaway: more testing groups are distributed rather than centralized, and both individual and centralized groups are more often mobile-specific. (* Retail sample size <100.)
  11. Survey: Challenges of Testing Mobile Websites and Applications
      Question: "Regarding challenges encountered when testing mobile applications and websites, please rate the following challenges on a 1 to 10 scale, where 1 is 'Not at all challenging' and 10 is 'Extremely challenging'." (1,314 respondents; the ten percentages after each challenge correspond to ratings 1 through 10.)
      • Availability of mobile testing experts: 4.7% / 4.3% / 5.8% / 7.2% / 17.4% / 11.3% / 15.4% / 17.0% / 8.7% / 8.2%
      • Implementing the right testing method/process for mobile: 3.5% / 3.2% / 5.3% / 7.4% / 17.7% / 11.5% / 18.0% / 18.3% / 8.4% / 6.5%
      • Availability of proper testing tools: 2.7% / 3.2% / 5.3% / 5.4% / 16.3% / 11.6% / 16.4% / 16.6% / 11.3% / 11.3%
      • Access to mobile devices: 6.5% / 6.3% / 4.7% / 4.5% / 14.2% / 10.0% / 13.5% / 16.7% / 9.9% / 13.6%
      • Having enough time to test: 4.2% / 3.4% / 5.6% / 5.9% / 16.7% / 12.3% / 14.9% / 14.9% / 10.2% / 11.9%
  12. Survey: Important Functional Testing Features
      Q: Within functional testing, what are the main priorities for QA?
      A: Easy access to a variety of device models was rated the most important functional testing feature across all verticals.
  13. Optimize Your Device Testing Strategy: determine your priority devices • provide easy access to real devices • enable local and remote employees • provide a secure, enterprise-grade test environment
  14. Use Data to Determine Priority Devices (charts shown: Android Screen Size and Density, and Android OS Market Share, source: Google; the right data source differs for internal apps vs. customer apps)
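As an illustration of turning such data into a short priority-device list, here is a minimal Java sketch that ranks devices by usage share and keeps adding devices until a target coverage level is reached. The device names, share figures, and the 60% threshold are hypothetical placeholders; real inputs would come from your own analytics (for internal apps) or market-share reports like those cited on the slide (for customer apps).

```java
import java.util.*;

public class PriorityDevices {
    public static void main(String[] args) {
        // Hypothetical usage shares (fraction of sessions per device) pulled from analytics.
        Map<String, Double> usageShare = new HashMap<>();
        usageShare.put("Samsung Galaxy S4", 0.22);
        usageShare.put("iPhone 5s", 0.18);
        usageShare.put("iPhone 5", 0.12);
        usageShare.put("Nexus 5", 0.08);
        usageShare.put("Samsung Galaxy Note 3", 0.07);
        usageShare.put("Other", 0.33);

        double targetCoverage = 0.60;   // test the devices that cover 60% of observed traffic
        double covered = 0.0;
        List<String> priorityDevices = new ArrayList<>();

        // Rank devices by descending share and accumulate until the target is met.
        List<Map.Entry<String, Double>> ranked = new ArrayList<>(usageShare.entrySet());
        ranked.sort((a, b) -> Double.compare(b.getValue(), a.getValue()));
        for (Map.Entry<String, Double> device : ranked) {
            if (covered >= targetCoverage) break;
            if (device.getKey().equals("Other")) continue; // long tail: cover via shared-device spot checks
            priorityDevices.add(device.getKey());
            covered += device.getValue();
        }
        System.out.printf("Priority devices (%.0f%% coverage): %s%n", covered * 100, priorityDevices);
    }
}
```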
  15. Provide Easy Access to Devices Using a SaaS-based Real Device Solution: from your computer, connected to the internet, your key presses and mouse clicks are sent to a real device in the cloud, and the device's screen is sent back to your computer.
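Interactive use happens through a browser-based device viewer, but the same idea of pointing work at cloud-hosted real devices carries over to automated tests. The sketch below only illustrates that pattern using Selenium's RemoteWebDriver, not the vendor's own API; the hub URL, capability values, and page element IDs are placeholders.

```java
import java.net.URL;
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.remote.DesiredCapabilities;
import org.openqa.selenium.remote.RemoteWebDriver;

public class CloudDeviceSessionSketch {
    public static void main(String[] args) throws Exception {
        // Placeholder endpoint for a hosted real-device grid; substitute your provider's hub URL.
        URL hub = new URL("https://device-cloud.example.com/wd/hub");

        // Ask the grid for a specific real device (capability names vary by provider).
        DesiredCapabilities caps = new DesiredCapabilities();
        caps.setCapability("browserName", "chrome");
        caps.setCapability("platformName", "Android");
        caps.setCapability("deviceName", "Samsung Galaxy S4"); // placeholder device

        WebDriver driver = new RemoteWebDriver(hub, caps);
        try {
            // These commands run on the remote device; screenshots and results come back from the cloud.
            driver.get("https://m.example.com/login");
            driver.findElement(By.id("username")).sendKeys("demo-user");
            driver.findElement(By.id("password")).sendKeys("demo-pass");
            driver.findElement(By.id("loginButton")).click();
        } finally {
            driver.quit();
        }
    }
}
```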
  16. Test on Real Devices
      Mobile device characteristics that impact the quality of your application or website, but are not easily verifiable using an emulated phone or browser, include:
      • Different screen resolutions / screen sizes: unreadable text, blurred images, misalignment of screen elements, and items that fall off the screen.
      • Android customizations: on-screen and physical controls that function differently across devices; customized handling of inputs and events (e.g. Samsung Swype vs. the default Android keyboard).
      • Memory / CPU: low or insufficient memory or processing power; impacts of other services running on the device.
  17. Enable Local and Remote Employees by Using a Combination of Local and Remote Devices
      • Shared devices: 24x7 access to hundreds of smart devices; use for compatibility testing and "untrusted device" testing.
      • Private devices: 24x7 access to your own devices; enables a geographically diverse team; hosted inside the corporate firewall or externally.
      • Local devices: plug the smart devices you have on hand directly into your local computer; makes use of existing assets and is great for local teams.
  18. Deploy an Enterprise-grade Product
      • Access and project management: multiple levels of user permissions; control access to all assets, including devices, scripts, and test results; create different groups to manage access.
      • Security: password protection features (enforced change, salting, disabled save), SSL communication, LDAP integration.
  19. Automate!
      • Determine target test cases to automate
      • Support agile processes (continuous integration)
      • Find a cross-device testing tool
      • Make use of existing (desktop) automation tools and processes where possible
      • Provide multiple scripting options, because your team is not one-size-fits-all
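One way a cross-device testing tool supports these points is by letting a single script run against a whole device matrix. Below is a minimal JUnit 4 sketch of that structure; the device list and the placeholder check are illustrative only, and a real test body would drive the app on the named device (for example via a remote session like the one sketched earlier).

```java
import java.util.Arrays;
import java.util.Collection;
import org.junit.Test;
import org.junit.runner.RunWith;
import org.junit.runners.Parameterized;
import static org.junit.Assert.assertTrue;

// Runs the same smoke check once per target device; a CI job can execute the whole matrix on each build.
@RunWith(Parameterized.class)
public class CrossDeviceSmokeTest {

    @Parameterized.Parameters(name = "{0}")
    public static Collection<Object[]> devices() {
        // Placeholder device matrix; in practice this comes from your priority-device analysis.
        return Arrays.asList(new Object[][] {
                {"Samsung Galaxy S4"}, {"iPhone 5s"}, {"Nexus 5"}
        });
    }

    private final String deviceName;

    public CrossDeviceSmokeTest(String deviceName) {
        this.deviceName = deviceName;
    }

    @Test
    public void loginScreenLoads() {
        // Placeholder assertion; a real test would open a session on `deviceName`
        // and verify that the login screen renders and accepts input.
        assertTrue("expected a device name", deviceName != null && !deviceName.isEmpty());
    }
}
```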
  20. Test Automation is Not Yet Widely Used for Mobile Web
      Question: "What percentage of your mobile web regression tests is currently automated?" Percentages by vertical: All Verticals (1,176) / High Tech, IT & Software (520) / Financial Services & Insurance (183) / Telecom (106) / Media (96) / Retail* (45).
      • 0%: 28.8% / 27.1% / 26.8% / 17.9% / 28.1% / 35.6%
      • 1-25%: 28.7% / 31.3% / 25.1% / 33.0% / 29.2% / 24.4%
      • 26-50%: 17.3% / 19.0% / 19.7% / 15.1% / 14.6% / 17.8%
      • 51-75%: 9.8% / 11.0% / 8.2% / 15.1% / 10.4% / 2.2%
      • >75%: 4.9% / 3.1% / 5.5% / 5.7% / 4.2% / 11.1%
      • Don't know: 10.5% / 8.5% / 14.8% / 13.2% / 13.5% / 8.9%
      Takeaway: the majority of organizations have less than 25% of their mobile web regression tests automated. (* Retail vertical sample size <100.)
  21. Mobile Apps – Same Story
      Question: "What percentage of your mobile native applications regression tests is currently automated?" Percentages by vertical: All Verticals (1,172) / High Tech, IT & Software (519) / Financial Services & Insurance (183) / Telecom (105) / Media (95) / Retail* (45).
      • 0%: 30.4% / 28.3% / 27.9% / 20.0% / 32.6% / 46.7%
      • 1-25%: 27.2% / 29.7% / 26.2% / 34.3% / 22.1% / 17.8%
      • 26-50%: 15.4% / 18.9% / 15.8% / 11.4% / 8.4% / 11.1%
      • 51-75%: 9.2% / 8.7% / 10.4% / 13.3% / 9.5% / 6.7%
      • >75%: 4.6% / 4.2% / 3.8% / 5.7% / 6.3% / 8.9%
      • Don't know: 13.1% / 10.2% / 15.8% / 15.2% / 21.1% / 8.9%
      (* Retail vertical sample size <100.)
  22. Determine Your Targets for Automation (goals, test cases, frequency, methodology, and the number of devices determine the optimal test strategy)
      • Smoke testing (basic acceptance, build acceptance): very high value to automate → automated testing
      • Regression testing (functional testing across all existing areas of the product): high value to automate → automated testing
      • Compatibility testing (compatibility across devices): medium value to automate → manual or automated testing
      • New feature testing (deep testing of new areas of functionality): low value to automate → manual testing
      • Exploratory testing (exploration of functionality from the customer's viewpoint): not automated → manual testing
  23. Continuous Integration Feedback Loop: Develop → Build → Test → (back to Develop)
  24. Customer Case Study: Speed Time to Market (Director of QA at a major financial institution)
      Background:
      • Each build of the native mobile retail investment app required a Build Acceptance Test (BAT) with 300 test cases across two devices (one Android and one iOS).
      • Run manually, the tests took two QA engineers two weeks (30 test cases per engineer per day), so Engineering received no build feedback until two weeks had passed.
      • The QA team had QTP skills and managed the process using QC.
      Results:
      • Test cases were automated using DeviceAnywhere and QTP.
      • The 300 test cases, executed across the two devices in parallel, now take 24 hours to run, vastly reducing time to market.
      • Manual testers are freed up to test new or more complex features, and all results are saved in QC.
      Result: automation on real devices delivered a 10x reduction in BAT time.
  25. Object-Level Scripting is Important for Cross-Device Testing: it copes with device fragmentation, lowers up-front scripting cost, and reduces ongoing script maintenance.
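To make the object-level point concrete: a script that locates controls by stable identifiers keeps working across screen sizes and layouts, while one that taps hard-coded coordinates breaks whenever the layout shifts. A short Selenium-style sketch, with hypothetical element IDs:

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.WebElement;

public class ObjectLevelScripting {

    // Object-level: find each control by its identifier, so the step works at any resolution or layout.
    static void loginByObject(WebDriver driver) {
        WebElement username = driver.findElement(By.id("username")); // hypothetical ID
        username.sendKeys("demo-user");
        driver.findElement(By.id("password")).sendKeys("demo-pass");
        driver.findElement(By.id("loginButton")).click();
    }

    // Coordinate-level (what to avoid): hard-coded pixel positions only match one screen size,
    // so every new device or resolution means re-recording and re-maintaining the script, e.g.
    //   tap(540, 1230);  // "Login" button position on one particular handset
}
```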
  26. Customer Case Study: Improve Product Quality (VP of Engineering at a major bank)
      Background:
      • The existing mobile test process for the consumer mobile banking app relied on outsourcing to a manual test house, and the available budget afforded only enough person-hours to complete the 900 regression test cases on a single device.
      • Results and issues were managed in unwieldy Word documents.
      • As usage picked up, issues in the field highlighted the need to ensure quality across a variety of mobile platforms and devices.
      Results:
      • The same 900 test cases are now run across 20 devices, improving test coverage and quality.
      • All 20 devices are tested in half the time it previously took to test one device, improving time to market.
      • Test results are automatically available online with screenshots (no more Word documents, and an improved process).
  27. Make Use of Existing Tools Where Possible. Survey your organization: what desktop test tools are currently being used? ALM tools (HP ALM, IBM Rational) • automation tools such as HP QTP • open source tools like Selenium and Robotium • other tools
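For teams already invested in open source Android tooling, existing skills often transfer directly to mobile device testing. As a rough, hedged illustration only (the LoginActivity class, field order, and on-screen labels are hypothetical stand-ins for your own app), a Robotium instrumentation test might look like this:

```java
import android.test.ActivityInstrumentationTestCase2;
import com.robotium.solo.Solo;

// Instrumentation test driven by Robotium's Solo helper.
public class LoginRobotiumTest extends ActivityInstrumentationTestCase2<LoginActivity> {

    private Solo solo;

    public LoginRobotiumTest() {
        super(LoginActivity.class);
    }

    @Override
    protected void setUp() throws Exception {
        super.setUp();
        solo = new Solo(getInstrumentation(), getActivity());
    }

    public void testLoginShowsDashboard() {
        solo.enterText(0, "demo-user");   // first EditText on screen (hypothetical layout)
        solo.enterText(1, "demo-pass");   // second EditText
        solo.clickOnButton("Log in");
        assertTrue("Dashboard did not appear", solo.waitForText("Dashboard"));
    }

    @Override
    protected void tearDown() throws Exception {
        solo.finishOpenedActivities();
        super.tearDown();
    }
}
```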
  28. Survey: Important Functional Testing Features by Vertical
      Q: Were there any other vertical-specific features that also rated high?
      A: Integration with ALM (Application Lifecycle Management) tools was also highly rated by financial services companies. Integration with open source tools (such as Selenium, Robotium, etc.) was rated very high by retail companies, as they have traditionally been more e-commerce focused and often develop automation using open source web-based tools.
  29. Beginning and Advanced Scripting Options Enable All Employees
      • UI-based scripting (automation engineers): for complex logic and large test suite design and execution.
      • Integration with leading test tools (UFT/QTP users): allows employees with existing skillsets to easily extend to mobile.
      • Java API (programmers): enables QA engineers with programming skills to write tests in Java.
      • Script recording (new scripters): a point-and-click recorder enables anyone to immediately create reusable test scripts.
  30. Challenges: Mobile = Devices • Mobile = Faster, More Iterative, Continuous Activity • Mobile = Late to the Game • Mobile = Spread Across the Organization
  31. Solutions
      • Mobile = Devices → use data to determine priority devices; provide access to real devices in the cloud; automate using a cross-device testing tool.
      • Mobile = Faster, More Iterative, Continuous Activity → determine high-value test cases to automate; continuous integration; automate using a cross-device testing tool.
      • Mobile = Late to the Game → make use of existing test tools where possible; provide a variety of scripting options for differing skill sets.
      • Mobile = Spread Across the Organization → provide access to local and remote real devices; use an enterprise-grade product to manage multiple teams.
  32. Make every digital interaction count (MyKeynote®)
      • Test: Mobile App Testing, Mobile Web Testing, Load Testing
      • Analyze: Data Visualization, Competitive Intelligence, Expert Analysis
      • Monitor: Mobile App Monitoring, Website Monitoring, Mobile Web Monitoring
  33. Keynote: The Most Experienced Cloud-based Mobile, App, Web & Network Performance Company on the Planet
      • Founded in 1995; 500 employees; 4,000+ customers
      • 7,000+ measurement devices in ~300 locations (the most scaled on-demand infrastructure in the world)
      • 700,000,000 performance measurements daily
      • 730 carrier networks monitored in ~300 locations across 200 countries
      • 1,500+ devices in 15 global locations
      • Forbes: one of the 'Best 100 Companies in America'
      (Map locations shown: São Paulo, Moscow, Beijing, Paris, New York)
  34. Integrating the World's Largest Global Testing & Monitoring Network and the World's Largest Real Device Cloud
  35. Enterprise Solutions on a Global Scale
      • 3,000 customers in 130 countries; business operations in EMEA, APAC, and the Americas
      • Global infrastructure: web performance testing & monitoring agents in 100 locations; mobile testing and monitoring agents in 250 locations
      • 7,000 devices on the network; 270B measurements per year
  36. Thank you. Questions?
