Introduction to Back End Automation Testing - Nguyen Vu Hoang, Hoang Phi



Back-end testing is unfamiliar to many testers, especially when the back end adopts web service technologies and has gigabytes of data that need to be verified. This presentation outlines a number of testing activities that address these challenges.

Services/Domain Testing Introduction:
We have been providing automation testing services for a back-end system built on web services, web application technologies, and metadata processing. The domain we have worked in is Communication, Media, and Entertainment.

Complex business logic sits inside the data storage and processing layers that provide the services, and there are multiple platforms under test.
Fragmented test results make it difficult to make decisions.
Testing must align with the development life cycle.
Integrate automated testing into Continuous Integration.
Design an automation test framework that handles Shell, Web Services, Web Applications, and gigabytes of XML data on Windows and Linux.
Select a proper technology stack to centralize the test results from both manual and automation teams.
Jenkins, a continuous integration and continuous delivery application, is the starting point: it runs a job to build the source code from the development team. When the unit tests for the source code pass, automated system tests written in LISA are launched, with LISA acting as the flow controller for the automation test framework.

LISA's core functionality is to verify the middleware layer, SOAP/RESTful web services, and databases. In practice, LISA's capabilities are also extended to test other technologies under test: web applications by integrating with Selenium, shell scripts via JCraft, and big XML data files via XStream/JAXB.



  1. Introduction to Back End Automation Testing, 16/07/2016, Nguyen Vu HOANG and Hoang PHI, FPT Software.
  2. WHO we are. HOANG Nguyen, education and certification:  Master of Computer Sciences  ISTQB (Advanced Level Technical Test Analyst)  Scrum Master, Product Owner and ITIL (FL)  CEH, LPIC-1, Novell CLA, CCNA, MCSE, MCDBA, MCSA, MCSD.NET and MCAD.NET. PHI Hoang, education and certification:  Bachelor of Sciences  ISTQB (FL)  IBM Certified (Plan, Estimation, Design and Negotiation).
  3. Agenda: WHAT is back end testing; WHY Test Automation; Testing Problems and Solutions; WHAT we do; HOW we do it; Measurable Results; Demo.
  4. WHAT is back end testing. What is tested: systems with no UI, or with a UI used by back-office operators; Web Services, Databases, File Systems, Web Applications.  Service layer (80%): testing services WITHOUT a User Interface (UI). In scope: Enterprise Services at the API layer; RESTful Web Service architectural style; SOAP Web Services used for general enterprise integration; connect-based middleware stack (i.e. JMS, Java Message Service).  Backend application (20%): testing applications/systems with a UI used to monitor and control services, i.e. enterprise systems with a UI used by back-office operators.
  5. WHAT is back end testing (Cont.). [Architecture diagram: operators and client platforms connect through a web application (service monitor and control) to core business services, external services or apps, 3rd-party services, DBMS and XML stores, and end-user platforms.] Notes: (1) the backend web application can be tested via its user interface; (2) delivered services are NOT tied to a specific platform, so platforms can be anything that follows the service protocols; (3) some end-user platforms are not available at testing time.
  6. WHY Test Automation: saves time and money, increases test coverage, improves accuracy and quality.  Huge data verification is achievable only by automation (e.g. verifying thousands of XML attributes).  Better coverage, more executions, a higher number of test cycles.  Reliability: ensure correct testing at the protocol layer and eliminate human error.  Auditability: provide comprehensive results.  Regression testing: gain confidence that nothing is broken.  Reusability: the same tests run on different environments and configurations.  Reduced execution time.
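As an illustration of the "thousands of XML attributes" point, the sketch below (not LISA itself, just a minimal JDK-only example) diffs every attribute of two XML documents, the kind of verification that is impractical by hand:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import java.util.*;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.*;

// Illustrative sketch: collect every attribute of two XML documents by
// element path and report the paths whose values differ.
public class XmlAttributeDiff {

    // Collect "element-path/@attr" -> value for every element in the document.
    static Map<String, String> attributes(String xml) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance()
                .newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        Map<String, String> out = new TreeMap<>();
        walk(doc.getDocumentElement(), "/" + doc.getDocumentElement().getNodeName(), out);
        return out;
    }

    static void walk(Element e, String path, Map<String, String> out) {
        NamedNodeMap attrs = e.getAttributes();
        for (int i = 0; i < attrs.getLength(); i++) {
            Node a = attrs.item(i);
            out.put(path + "/@" + a.getNodeName(), a.getNodeValue());
        }
        NodeList children = e.getChildNodes();
        for (int i = 0; i < children.getLength(); i++) {
            if (children.item(i) instanceof Element) {
                Element c = (Element) children.item(i);
                walk(c, path + "/" + c.getNodeName(), out);
            }
        }
    }

    // Return the attribute paths that differ (or exist on one side only).
    public static List<String> diff(String expectedXml, String actualXml) throws Exception {
        Map<String, String> expected = attributes(expectedXml);
        Map<String, String> actual = attributes(actualXml);
        List<String> mismatches = new ArrayList<>();
        Set<String> keys = new TreeSet<>(expected.keySet());
        keys.addAll(actual.keySet());
        for (String k : keys) {
            if (!Objects.equals(expected.get(k), actual.get(k))) {
                mismatches.add(k);
            }
        }
        return mismatches;
    }

    public static void main(String[] args) throws Exception {
        String expected = "<asset id=\"1\" codec=\"h264\"/>";
        String actual   = "<asset id=\"1\" codec=\"mpeg2\"/>";
        System.out.println(diff(expected, actual)); // [/asset/@codec]
    }
}
```

A real run would read the expected and actual documents from files rather than inline strings; the path scheme here is simplistic (repeated sibling names collide) but shows the principle.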
  7. Testing Problems and High-Level Solutions. Problems: different technology platforms under test (operating systems, databases, web servers, and application protocols and development stacks); high data rates and responsiveness; fragmented test results, making it difficult to make decisions; testing must align with agile development methodologies. Solutions: (1) integrate automated testing into Continuous Integration; (2) select a proper technology stack to centralize the test results from both manual and automation teams; (3) design an automation test framework that handles Shell, Web Services, Web Applications, and gigabytes of XML data on Windows and Linux.
  8. WHAT we do (1): standardize CI/CD.  Improve the way we develop and test applications by standardizing the Continuous Integration (CI) and Continuous Deployment (CD) tool set.  Reduce test and deployment cycles through automation. The tool chain spans the Manage, Develop, Build, Test, and Deploy stages: issue tracking, test/requirements management, version control, collaboration (wikis, forums); code review and quality analysis; build automation, static code quality analysis, unit testing, artifact management; test automation and performance testing; cloud deployment and containerization.
  9. WHAT we do (1): do Continuous Deployment.  By saving time on deployment, testers can spend more time on feature testing.  Prevent installation mistakes when a lot of configuration is defined in the release note.  New and old applications/services are deployed side by side (e.g. Application 1.0 and Application 1.1 behind a virtual IP with a VIP cut-over, each with its own database and dependent components) to eliminate delays for the testing team in case testing is blocked on the new version.  The testing team starts its test cycle on the new version after smoke testing passes. Ant executes the test cases through the automation test engine.
  10. WHAT we do (2): centralize test results.  Full cycle from test design to test execution and reporting.  LISA feeds test results into SpiraTeam (test case management, requirement management, reporting database).  Integration with different tools: extensibility mechanisms support the Web GUI and bash shell.  Simple maintenance: keyword- and data-driven automation, conventions, and best practices.
  11. HOW we do it (3): LISA as the core.  Delivers end-to-end parallel application development solutions to help develop and test simultaneously [1].  Powerful at testing middleware built on Service-Oriented Architecture, and databases.  Allows a high level of customization; functionality can be extended through its SDK (software development kit) [2].  Enables in-process, bi-directional communication between the Java VM and the CLR (.NET).  Both the desktop and server versions provide an integrated environment to build everything in a code-less manner.
  12. HOW we do it (3): test Web Services and Databases.  Use Service Virtualization (service mocking, virtualized services) to decouple testing from the development schedule while development is in progress [2].  Use a consistent approach to design test cases for web services: business process analysis, transaction analysis (loss and duplication), performance analysis, path coverage analysis, error-handling analysis, profiling, and so on.  Use LISA's powerful built-in functionality to automate Web Service and database testing.  Extend LISA's capabilities to test JSON.
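The "consistent test case design" idea can be sketched as follows: every web service test has the same shape, i.e. send a request, then assert on the response with declarative checks. This hypothetical JDK-only example (the payload and field names are invented, and the request step is omitted) validates a canned XML response with XPath:

```java
import java.io.ByteArrayInputStream;
import java.nio.charset.StandardCharsets;
import javax.xml.parsers.DocumentBuilderFactory;
import javax.xml.xpath.XPath;
import javax.xml.xpath.XPathFactory;
import org.w3c.dom.Document;

// Illustrative sketch of a uniform response check: status first,
// then business fields, the same way in every test and environment.
public class ServiceResponseCheck {

    // Evaluate an XPath expression against an XML payload.
    static String xpath(String xml, String expr) throws Exception {
        Document doc = DocumentBuilderFactory.newInstance().newDocumentBuilder()
                .parse(new ByteArrayInputStream(xml.getBytes(StandardCharsets.UTF_8)));
        XPath xp = XPathFactory.newInstance().newXPath();
        return xp.evaluate(expr, doc);
    }

    // The same checks run for every environment and configuration.
    public static boolean verify(String responseXml) throws Exception {
        return "OK".equals(xpath(responseXml, "/response/status"))
            && !xpath(responseXml, "/response/order/@id").isEmpty();
    }

    public static void main(String[] args) throws Exception {
        String response =
            "<response><status>OK</status><order id=\"42\"/></response>";
        System.out.println(verify(response)); // true
    }
}
```

In a real suite the response would come from the service under test (or a virtualized service), and the XPath checks would typically be data-driven rather than hard-coded.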
  13. HOW we do it (3): considerations in scripting. The scripting workflow: identification, implementation, execution, reporting, and assessment.
  14. HOW we do it (3): test Web GUI and Shell.  Integrate Selenium into LISA so that it can test Web GUIs: a wrapper layer (Java-based) is built on top of core Selenium to provide strong GUI testing capability for LISA.  Integrate JCraft into LISA (over SSH) so that it can test shell scripts.  Use the Awk/Nawk/Gawk [3] language on the remote host instead of bash scripts for fast text processing with low memory usage; performance is 10+ times faster than a bash script.
  15. HOW we do it (3): test big XML files.  XStream: processes large files (2 GB to 3 GB each); traverses nodes in smaller chunks; costs less memory but takes time to read the XML file.  JAXB: processes delta (small) data; Java-to-XML binding makes it easy to incorporate XML data into Java processing functions quickly; costs more memory but is fast.  Results of combining XStream and JAXB: ~3 min for an XML file (~3 GB) with its XSD schema; ~6 min for 2 XML files (~3 GB each).
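The low-memory half of that trade-off is a streaming parse: pull events one at a time so a multi-gigabyte file never has to fit in memory. The slide's framework uses XStream/JAXB; plain StAX is shown in this sketch instead so it needs nothing outside the JDK:

```java
import java.io.Reader;
import java.io.StringReader;
import javax.xml.stream.XMLInputFactory;
import javax.xml.stream.XMLStreamConstants;
import javax.xml.stream.XMLStreamReader;

// Illustrative streaming traversal: count <item> elements without ever
// building a DOM tree, so memory use stays flat regardless of file size.
public class StreamingXmlCount {

    public static int countItems(Reader source) throws Exception {
        XMLStreamReader r = XMLInputFactory.newInstance().createXMLStreamReader(source);
        int count = 0;
        while (r.hasNext()) {
            if (r.next() == XMLStreamConstants.START_ELEMENT
                    && "item".equals(r.getLocalName())) {
                count++;   // each node is visited and discarded, one at a time
            }
        }
        return count;
    }

    public static void main(String[] args) throws Exception {
        String xml = "<catalog><item id=\"1\"/><item id=\"2\"/><item id=\"3\"/></catalog>";
        System.out.println(countItems(new StringReader(xml))); // 3
    }
}
```

In the combined approach the streaming pass would cut the big file into small chunks, and each chunk would then be bound to Java objects (JAXB-style) for fast in-memory verification.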
  16. Measurable Results.  Unified automation test approach for 60+ projects.  Automated tests that cannot be done manually.  100,000+ executed tests; 12,000+ automated tests; 92% of execution time saved. Per application: Application 1 (547 TCs): manual 4 MDs, automated 50 min (97% saved); Application 2 (590 TCs): manual 23.5 MDs, automated 90 min (99%); Application 3 (211 TCs): manual 5.1 MDs, automated 1 MD (80%); Application 4 (2,000 TCs): manual 1,755 hrs, automated 155 hrs (91%); Application 5 (221 TCs): manual 117 hrs, automated 7 hrs (94%); Application 6 (30 TCs): manual 30 hrs, automated 13 min (99%). Tool prices and similar tools: Spira, 3 users, $79.99/month (similar tool: Testlink, free); LISA, 5 users, ~$1,666/month (similar tool: SoapUI, free/licensed).
  17. Demo (duration: 1.5 minutes): 1) trigger the test from Jenkins; 2) call the LISA script; 3) call the shell script; 4) call the Selenium script; 5) update the result in SpiraTeam.
  18. References: [1] Develop & Test (formerly named CA LISA); [2] Documentation of DevTest Solutions 8.0; [3] Awk/Nawk/Gawk tutorial.
  19. © 2016 HCMC Software Testing Club. THANK YOU.