
The Right Tool for the Right Project

Selecting the right automation framework is hard, and it can be a critical decision in implementing your continuous testing agenda. Today there are many possibilities, from open-source solutions (Selenium, Appium, etc.) to commercial tools developed by HPE (QTP, UFT, Mobile Center).

So what was our process for selecting the right automation framework? Our approach was to first define our needs, which gave us a list of 10 must-have requirements, and only then pick the right tool. We ended up with different frameworks for different projects, in a wide range of combinations of open source and our own tools (UFT, LeanFT, etc.).


The Right Tool for the Right Project

  1. 1. Welcome!
  2. 2. Agenda – 18:00-18:20 Gathering – 18:20-19:00 The right tool for the right project | Ori Bendet – 19:00-19:05 PIZZAS! – 19:00-19:45 Surviving the Storm - continuous testing in the world of SaaS & Cloud | Karim Fanadka – 19:45-20:30 Networking and Pokémon party outside the office (2 Pokéstops w/ lures)
  3. 3. Have any ideas for a future meetup? – Contact me: r2d2@hpe.com
  4. 4. Please give us a 5-star rating!
  5. 5. The Right Tool for the Right Project @bendet_ori
  6. 6. About Me – 6 years in HPE Software in various managerial QA roles. Today: Inbound PM for a new cloud testing offering
  7. 7. About Me – Michael's father, Naomi's husband
  8. 8. About Me – For small talk later…
  9. 9. State of Automation – The poll!
  10. 10. Now Let’s Get To Business!
  11. 11. Assumption #1: Vendors only use their own tools
  12. 12. Assumption #2: Evil corporates hate Open Source
  13. 13. Assumption #3: Everybody is doing automated testing
  14. 14. My Own Automation Journey
  15. 15. Project #1 – AUT: Analytic Platform | Automation: Internal Standalone Tool
  16. 16. Analytic Platform for IT Executives (v1.0) – AUT technology stack: GlassFish Server, Flex + GWT, SAP BODS for ETLs, SAP BOE for BI, MSSQL – Automation: 0.5 of 5 people doing automation, focusing mainly on APIs (Java Beans/EJBs), using an internal tool called FIST – ROI: LOL – 3.5 hours to install
  17. 17. (image-only slide)
  18. 18. (image-only slide)
  19. 19. (image-only slide)
  20. 20. An extendable Java class
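FIST is an internal HPE tool, so its API is not public; purely as an illustration of the "extendable Java class" idea on this slide, a base class that concrete API tests extend might look roughly like the sketch below (all class and method names are hypothetical).

```java
// Hypothetical sketch of an "extendable Java class" framework in the spirit of FIST
// (FIST itself is internal to HPE and not shown here). Concrete API tests extend the
// base class and implement only the business steps.
public abstract class ApiTestBase {

    /** Framework hook: set up a connection to the AUT's API layer before the test runs. */
    protected void setUp() {
        // e.g. look up EJBs / open an API session here
    }

    /** Each concrete test implements its steps here. */
    protected abstract void runTest() throws Exception;

    /** Framework hook: release resources after the test. */
    protected void tearDown() {
    }

    /** Template method the runner calls for every registered test. */
    public final boolean execute() {
        setUp();
        try {
            runTest();
            return true;
        } catch (Exception e) {
            System.err.println("Test failed: " + e.getMessage());
            return false;
        } finally {
            tearDown();
        }
    }
}

// Example extension (hypothetical): verify an ETL run finished via an exposed API method.
class EtlStatusTest extends ApiTestBase {
    @Override
    protected void runTest() throws Exception {
        // In the real project this called EJB methods that DEV exposed for automation.
    }
}
```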
  21. 21. (image-only slide)
  22. 22. Why?
  23. 23. Lessons Learned – Standalone tool developed internally – Cons: nobody knew about the tool, it wasn't cool, there was no buzz around it; as an external tool we were unable to get DEV to cooperate with automation (or even install the tool); manually triggered (not part of the CI process); almost every new test required changes by DEV (exposing new API methods); no direct access to source code – Pros: small investment; QTTV; easily extendable; stability
  24. 24. Analytic Platform for IT Executives (v2.0) – AUT technology stack: GlassFish Server, Flex + GWT, SAP BODS for ETLs, SAP BOE for BI, MSSQL – Automation: 4 of 10 people doing automation, with a dedicated developer to assist – Focus moved from APIs only to automated installation and E2E flows – From Java Beans (EJBs) to REST APIs – From the internal FIST tool to an internally built framework (REST client, Selenium, Flex Monkey) – Automatic deployment solution: an internally developed tool called Slick – ROI: 3 MD each sprint → 1 MM per release
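As a rough sketch only (the internal framework itself is not public), a v2.0-style test combines a REST API call with a Selenium UI check in one E2E flow; the host, endpoint and selector below are placeholders.

```java
// Minimal sketch (not the internal framework itself) of the v2.0 approach:
// exercise the REST API first, then verify the result surfaces in the UI with Selenium.
// Host, endpoint and selector are placeholders.
import java.net.HttpURLConnection;
import java.net.URL;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class EndToEndSketch {
    public static void main(String[] args) throws Exception {
        // 1. API layer: check server-side state over REST.
        URL url = new URL("http://aut-host:8080/api/v1/reports");
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("GET");
        if (conn.getResponseCode() != 200) {
            throw new AssertionError("REST API not healthy: " + conn.getResponseCode());
        }

        // 2. UI layer: confirm the same data is rendered in the web client.
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("http://aut-host:8080/");
            String heading = driver.findElement(By.cssSelector("h1")).getText();
            if (!heading.contains("Reports")) {
                throw new AssertionError("Unexpected page heading: " + heading);
            }
        } finally {
            driver.quit();
        }
    }
}
```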
  25. 25. Saved over 10K MH (man-hours)
  26. 26. Why? – High demand for automation coverage – Developers selected the automation framework – We wanted to work together with developers – We invested in automated installation as well
  27. 27. Lessons Learned – Internally built automation framework (inside the IDE) – Cons: required more coding skills; large effort to get things started; harder for less-technical testers to use; UI automation stability – Pros: harnesses developers into the automation; developers re-used testing assets for their own benefit; testers have access to the source code; part of the CI process
  28. 28. Project #2 – AUT: Performance Testing tool | Automation: Open Source
  29. 29. Performance Testing tool – AUT technology stack: Node.js, AngularJS, internal SQLite – Automation: 1 of 4 people doing automation – System tests leveraging developers' assets – Focus on API testing and sanity-level UI testing – Application modeling for less-technical testers – Protractor, Mocha, Jasmine – ROI: 1.5 MD / sprint
  30. 30. REST API: without the framework vs. with the framework (code comparison)
  31. 31. UI: without the framework vs. with the framework (code comparison)
  32. 32. Test (created by non-technical engineer)
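The project itself did this with Protractor, Mocha and Jasmine in JavaScript; to keep all the examples here in one language, the sketch below shows the same modeling idea in Java with Selenium: the selectors live in a model class, so the test a less-technical engineer writes reads as plain business steps (page name, URL and selectors are hypothetical).

```java
// Hypothetical model of the AUT's "new test" page - all locators are hidden here,
// so the test below is nothing but business steps.
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

class NewTestPage {
    private final WebDriver driver;

    NewTestPage(WebDriver driver) {
        this.driver = driver;
    }

    void open() {
        driver.get("http://aut-host:3000/#/tests/new"); // placeholder URL
    }

    void setName(String name) {
        driver.findElement(By.cssSelector("input[name='testName']")).sendKeys(name);
    }

    void save() {
        driver.findElement(By.cssSelector("button[type='submit']")).click();
    }

    boolean isListed(String name) {
        return driver.getPageSource().contains(name);
    }
}

/** The kind of test a non-technical engineer writes against the model. */
public class CreateTestFlow {
    public static void main(String[] args) {
        WebDriver driver = new ChromeDriver();
        try {
            NewTestPage page = new NewTestPage(driver);
            page.open();
            page.setName("Nightly load scenario");
            page.save();
            if (!page.isListed("Nightly load scenario")) {
                throw new AssertionError("Created test is not listed in the UI");
            }
        } finally {
            driver.quit();
        }
    }
}
```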
  33. 33. Why? – Best available choice for the technology stack – One automation engineer working on the framework, the others re-using the assets
  34. 34. Lessons Learned – Leveraging DEV assets and extending them into our own framework (+ modeling) – Cons: modeling takes time; the technical engineer becomes the bottleneck; UI automation stability – Pros: uses existing developers' assets; extends automation coverage using non-technical engineers; modeling eases test maintenance
  35. 35. Project #3 – AUT: Firefox plugin | Automation: Commercial Tool (LeanFT)
  36. 36. TruClient? TruClient is a tool for recording Web-based applications. It is used inside LoadRunner for performance testing at the browser level. (Screenshot legend: 1. TruClient Sidebar, 2. TruClient Toolbox, 3. Firefox browser, 4. Application Browser Window, 5. TruClient Sidebar Status Pane)
  37. 37. TruClient – AUT technology stack: Firefox plugin, pure Web, WPF Windows app – Automation: 2 of 6 people doing automation – The team did not have an automation suite, as they could not find a tool with automation abilities for the full flow across all three technologies – The AUT supports the three main browsers, so the automation tool must be able to identify and test objects in all of them – Selected LeanFT as the tool
  38. 38. The Automation Suite – LeanFT Application Model containing the full AUT; the App Model displays a modular view of all the objects implemented in the application
  39. 39. The Automation Suite – LeanFT test: everything is written in the IDE (Dev have them for sanity) – Test code is completely reusable among the three technologies – The test is authored once and can be run on all supported browsers – In the test setup the TruClient launcher is started and used, and during the test both web and desktop technologies are tested
  40. 40. Why? – Cross-technology support (Desktop & Web) – Script once – run on all browsers – Re-use and share testing assets with Dev
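The actual suite is built with LeanFT's SDK; as a rough illustration of the "script once – run on all browsers" shape only, here is a plain Selenium WebDriver sketch of a browser-parameterized test (browser list, URL and check are placeholders).

```java
// Plain Selenium WebDriver sketch of "script once - run on all browsers".
// The real project used LeanFT; this only shows the general shape of the idea.
import java.util.List;
import java.util.function.Supplier;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class CrossBrowserSketch {

    /** One test body, written once. */
    static void sanityFlow(WebDriver driver) {
        driver.get("http://aut-host:8080/"); // placeholder for the AUT's URL
        if (driver.findElements(By.cssSelector("body")).isEmpty()) {
            throw new AssertionError("Page did not load");
        }
    }

    public static void main(String[] args) {
        // The same script runs against every supported browser.
        List<Supplier<WebDriver>> browsers = List.of(ChromeDriver::new, FirefoxDriver::new);
        for (Supplier<WebDriver> newBrowser : browsers) {
            WebDriver driver = newBrowser.get();
            try {
                sanityFlow(driver);
            } finally {
                driver.quit();
            }
        }
    }
}
```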
  41. 41. Want to buy a new automation framework for 1 shekel??
  42. 42. The Automation Council
  43. 43. (image-only slide)
  44. 44. The Guidelines
  45. 45. Rule #1: Accessible in the developers' workspace
  46. 46. Rule #2: Cross-browser / cross-technology support
  47. 47. Rule #3: Ability to easily model the UI
  48. 48. Rule #4: Full support for REST API testing
  49. 49. Rule #5: DB layer
  50. 50. Rule #6: Ability to combine UI/API/DB in one flow (sketched after this list)
  51. 51. Rule #7: Messaging parser (JSON, XML, etc.)
  52. 52. Rule #8: Parameterization of tests
  53. 53. Rule #9: CI/CD compliant
  54. 54. Rule #10: Modularity to allow re-use by less-technical engineers
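To make rules #4-#8 concrete, here is a hedged sketch of one parameterized flow that calls the REST API, parses the JSON response, cross-checks the DB layer, and finishes in the UI; the endpoint, SQL, JDBC URL and selectors are placeholders, not taken from the talk.

```java
// Hedged sketch of rules #4-#8 in one place: a parameterized flow touching API, DB and UI.
import java.net.HttpURLConnection;
import java.net.URL;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;
import java.util.Scanner;

import com.fasterxml.jackson.databind.ObjectMapper;

import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.chrome.ChromeDriver;

public class CombinedFlowSketch {

    // Rule #8: parameterization - the same flow runs for every input.
    private static final String[] REPORT_IDS = {"daily", "weekly", "monthly"};

    public static void main(String[] args) throws Exception {
        for (String reportId : REPORT_IDS) {
            runFlow(reportId);
        }
    }

    private static void runFlow(String reportId) throws Exception {
        // Rule #4: REST API testing; Rule #7: message (JSON) parsing.
        URL url = new URL("http://aut-host:8080/api/reports/" + reportId);
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        String body = new Scanner(conn.getInputStream()).useDelimiter("\\A").next();
        String apiStatus = new ObjectMapper().readTree(body).get("status").asText();

        // Rule #5: DB layer - cross-check the API answer against the database.
        try (Connection db = DriverManager.getConnection("jdbc:sqlite:aut.db");
             Statement stmt = db.createStatement();
             ResultSet rs = stmt.executeQuery(
                     "SELECT status FROM reports WHERE id = '" + reportId + "'")) {
            if (!rs.next() || !apiStatus.equals(rs.getString("status"))) {
                throw new AssertionError("API and DB disagree for " + reportId);
            }
        }

        // Rule #6: the same flow ends in the UI.
        WebDriver driver = new ChromeDriver();
        try {
            driver.get("http://aut-host:8080/reports/" + reportId);
            String shown = driver.findElement(By.id("status")).getText();
            if (!shown.equalsIgnoreCase(apiStatus)) {
                throw new AssertionError("UI shows a different status for " + reportId);
            }
        } finally {
            driver.quit();
        }
    }
}
```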
  55. 55. (image-only slide)
  56. 56. Comparison matrix: the candidate frameworks (FIST, Selenium-based built framework, LeanFT, UFT) rated against the guidelines – IDE, cross mobile/web, modeling the UI, REST API testing, DB layer, combining UI/API/DB, parsers, parameterization, CI/CD, modularity for less-technical engineers, cross-platform support, lightweight (* = can be added by the user)
  57. 57. Summary Time
  58. 58. Assumption #1: Vendors only use their own tools
  59. 59. Assumption #1: Vendors only use their own tools → It's not about the tool
  60. 60. Assumption #2: Evil corporates hate Open Source
  61. 61. Assumption #2: Evil corporates hate Open Source → We love Open Source!!
  62. 62. Assumption #3: Everybody is doing automated testing
  63. 63. Assumption #3: Everybody is doing automated testing → Everybody is using automated testing
  64. 64. Take Home Message
  65. 65. It's never about the tool. It's about finding the right tool for the right project.
