How a Leading Bank
Automated Extensive ESB and API Testing
To reduce the risks associated with their business-critical transactions, a leading New Zealand bank and
financial-services provider wanted to extend their existing GUI-based testing to more extensively
exercise the application logic within internal systems. This logic resides within their ESB: a message
broker component that features 80+ services and 750+ operations. The ESB represents a single point of
failure within the infrastructure—but the company was unable to test it directly or automatically. With
IntegrationQA's experts and Parasoft's API Testing solution, the company gained a service testing
methodology supported by automated, reusable tests that performed a broad scope of testing directly
from the service/API layer. The result: they saved over $2.1 million NZD in testing and development
costs over an 18-month period.
The Challenge: Creating Reusable, Easily Maintainable
ESB Service Tests that Automate 3 Key Test Patterns
To achieve their quality objectives, the company needed reusable test assets that would enable
exhaustive, direct testing of ESB services. These tests had to be able to robustly exercise any of the
available ESB services via three main test patterns:
• Confirmation - Smoke test
• Evaluation - Transformation, data variance, and error handling
• Replication - Business workflow scenarios
All three test patterns had to be automated in a way that was reusable and easily maintained across
different messaging protocols (e.g., message queues, FTP, SOAP, REST) and message payload formats
(e.g., XML, fixed length, JSON).
The Solution: IntegrationQA + Parasoft API Testing
To help the company establish the necessary testing methodology and processes, IntegrationQA
proposed a test design that could easily be applied across the numerous services that needed to be
tested. This involved establishing a framework that relied heavily on data constraints and
parameterisation of principal fields within the messaging client. Testing would be automated using
Parasoft's API Testing solution, Parasoft SOAtest.
Confirmation Test Patterns
Smoke Test
The confirmation (smoke) test established connectivity using Parasoft's MQ or SOAP
messaging client. It also performed data verification by confirming whether the response satisfied
expectations. Since the message payload was literal XML, XPaths were used to automatically check the
message's expected Response Code and determine whether the test passed or failed. A positive
response code indicated not only successful transmission through the ESB, but also that the
HOST/mainframe was able to successfully process the data sent in the request. The successful
implementation of this test was the starting point for the framework going forward.
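As a rough illustration of this check, consider the standalone sketch below. The <ResponseCode> element name and the success code "00" are hypothetical, and in the actual framework this was an XPath assertion configured in SOAtest rather than hand-written code.

```javascript
// Standalone sketch of the smoke-test verification step. Hypothetical:
// the <ResponseCode> element name and the success code "00". In practice
// this was an XPath assertion configured in SOAtest.
function checkSmokeTestResponse(responseXml) {
  // Stand-in for an XPath such as //ResponseCode/text().
  var match = /<ResponseCode>\s*(\w+)\s*<\/ResponseCode>/.exec(responseXml);
  if (!match) {
    return { passed: false, reason: "no ResponseCode element in response" };
  }
  // A positive code proves both ESB transmission and HOST processing.
  return { passed: match[1] === "00", code: match[1] };
}

// Example usage:
var result = checkSmokeTestResponse(
  "<Response><ResponseCode>00</ResponseCode></Response>");
// result.passed === true
```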
Regression Suite
The overall ESB regression suite comprised a high-level sub-suite of smoke tests for each service, plus a
separate low-level sub-suite for each service operation. This suite was applied as a regression test for
any deployment to the ESB and across any environment.
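To make the two-level structure concrete, a hypothetical layout (service names invented for illustration) might look like this:

```
ESB Regression Suite                 (run for any deployment, any environment)
├── Smoke tests (high level)         one confirmation test per service
│   ├── CustomerService - smoke
│   └── PaymentService - smoke
└── Operation tests (low level)      one sub-suite per service operation
    ├── PaymentService / CreatePayment
    └── PaymentService / QueryPayment
```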
Notes:
• The smoke test was intended to confirm successful deployment of a service to the ESB, so a
positive or negative response from the Provider was accepted as proof that the service was
deployed and operational. However, IntegrationQA went a step further and also configured the
test to confirm whether the Provider response was positive (as expected). This increased the
functional test team's confidence that the integrated systems were operational and ready.
• Although the test team's work was project-oriented, the implemented ESB regression suite was
ESB-oriented. The ESB regression suite was the core repository of service smoke tests.
Project-based test suites were first drawn from the in-scope ESB services in the repository; if a
project's ESB service was new, it was created at the project level and added to the repository
after the project went live.
[Figure: The XML payload for a smoke test, displayed as literal XML for the service request]
Evaluation Test Patterns
The following evaluation tests (Transformation, Data Variance & Error Handling) establish the
robustness of the test design.
Transformation Tests
Building on the smoke test, an automated transformation test was a powerful test of the input to (and
output from) the ESB. A literal XML message that covered all potential service data fields was selected
for the test. Parasoft SOAtest was used to parse and compare values of the same messages in different
formats.
The two request messages were retrieved from the ESB message store database. The ESB Request was
in XML and parsed via XPath per element. The HOST Request, a fixed-length format message based on
z/OS IBM copybook technology, was passed to a custom parsing script written in JavaScript. Here,
parsing was dictated by constraints defined in an external data pool. A final step compared the parsed
data between the two message formats—again, using custom JavaScript. The same steps were run for
the Response component of the transaction.
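The standalone JavaScript sketch below illustrates the comparison step. The field names, offsets, and both sample messages are hypothetical; in the real tests the constraints lived in an external spreadsheet data pool, and XML values were extracted with XPath rather than the regular expressions used here.

```javascript
// Standalone sketch of the transformation comparison. Hypothetical: the
// field names, offsets, and both sample messages; the real constraints
// were defined in an external spreadsheet data pool.
var constraints = [
  { field: "AccountNumber", start: 0,  length: 10 },
  { field: "Amount",        start: 10, length: 8  },
  { field: "Currency",      start: 18, length: 3  }
];

// Parse the fixed-length HOST message into name/value pairs, with each
// row of the constraints table dictating where a field begins and ends.
function parseFixedLength(message, rows) {
  var values = {};
  rows.forEach(function (row) {
    values[row.field] = message.substring(row.start, row.start + row.length).trim();
  });
  return values;
}

// Read the same elements from the ESB's XML message (a regex stands in
// for the per-element XPath extraction used in the real tests).
function parseXmlValues(xml, rows) {
  var values = {};
  rows.forEach(function (row) {
    var m = new RegExp("<" + row.field + ">(.*?)</" + row.field + ">").exec(xml);
    values[row.field] = m ? m[1].trim() : null;
  });
  return values;
}

// Compare field by field; any difference is a candidate transformation defect.
function compareValues(xmlValues, hostValues) {
  return Object.keys(xmlValues)
    .filter(function (f) { return xmlValues[f] !== hostValues[f]; })
    .map(function (f) {
      return f + ": ESB='" + xmlValues[f] + "' vs HOST='" + hostValues[f] + "'";
    });
}

var esbXml  = "<Req><AccountNumber>0012345678</AccountNumber>" +
              "<Amount>00010.50</Amount><Currency>NZD</Currency></Req>";
var hostMsg = "001234567800010.50NZD";
var diffs = compareValues(parseXmlValues(esbXml, constraints),
                          parseFixedLength(hostMsg, constraints));
// diffs.length === 0 when the ESB's transformation and mapping are intact
```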
Notes:
• Transformation tests were runtime tests that were ideal for uncovering small differences
between XML data and fixed-length data. For example, they could detect transformation defects
such as changed date formats, decimalised rate values, and dollar values.
• This evaluation tested the ESB's transformation logic and mapping control between messages. If
the ESB's transformation and mapping logic was modified, either intentionally or
unintentionally, then this test would uncover that change during regression test execution.
• JavaScript was leveraged here because SOAtest enabled quick implementation without
compilation.
• A third-party Eclipse add-on (LegStar) was leveraged to load IBM copybooks and generate an XSD
via an automatically-generated Java package: pipe the appropriate fixed-length message into the
appropriate class, and an XML instance of the fixed-length message is created. Combined with the
transformation steps above, this twist allows for more manageable test script set-up, since it
removes much of the manual work of setting the constraints for parsing the fixed-length message
via a spreadsheet data pool.
[Figure: Steps for sending, retrieving, parsing, and comparing the message data]
[Figure: The spreadsheet that constrained and drove the parsing for each element in the messages]
Data Variance Tests
An automated Data Variance test provided test coverage for all possible iterations of an operation
within a service. Test data from an external data pool was fed into a parameterised message structure
within SOAtest. This way, all possible message structures for a service’s operation or method would be
tested. The test pattern evaluated boundary rules and data type rules of fields within a request; it was
leveraged to trigger various response iterations from the Provider.
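A minimal sketch of the pattern follows. The column names, rows, and {placeholder} syntax are hypothetical stand-ins for the external data pool and SOAtest's parameterised message structure.

```javascript
// Sketch of the data-variance pattern. Hypothetical: the column names,
// rows, and {placeholder} syntax; they stand in for the external
// spreadsheet data pool and SOAtest's parameterised message structure.
var dataPool = [
  { CountryCode: "NZ", Amount: "100.00",   expectedCode: "00" }, // mandatory fields
  { CountryCode: "AU", Amount: "0.01",     expectedCode: "00" }, // lower boundary
  { CountryCode: "NZ", Amount: "99999.99", expectedCode: "00" }  // maximum length
];

// The parameterised request: each {placeholder} maps to a data pool column.
var template = "<Payment><CountryCode>{CountryCode}</CountryCode>" +
               "<Amount>{Amount}</Amount></Payment>";

function buildRequest(template, row) {
  return template.replace(/{(\w+)}/g, function (whole, name) { return row[name]; });
}

// One iteration per row: build the request, send it, verify the response.
dataPool.forEach(function (row) {
  var request = buildRequest(template, row);
  // sendToEsb() is a placeholder for the MQ/SOAP messaging client call:
  // var code = sendToEsb(request);
  // verify that code === row.expectedCode
});
```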
Notes:
• Data variance tests were structured to first focus on a test of all mandatory fields, then all
optional fields, and finally boundary analysis with maximum-length tests on each field in a
message. Once these tests were established, the focus turned to business-oriented scenarios,
such as listing all possible data entries for a field (if finite) or running through various iterations
of logical branches of calculations for the Provider to process.
• Testing data returned from the Provider was trickier, since it typically depended upon existing
data in the Provider's back-end database tables. In some cases, it was possible to create data and
then delete it once the test was complete. However, the company's business rules often did not
allow direct deletion of records. In those cases, an additional step was added before the message
client test was called: it queried for data that met certain criteria, and the message client then
retrieved that data via the service layer.
[Figure: A data variance test; the XML is represented as a form and the key element (CountryCode) is parameterised]
[Figure: A sample spreadsheet covering various data scenarios]
Error Handling Tests
Using the same Data Variance pattern, and altering only the external data pool, an Error Handling
test pattern was created that covered all business error scenarios. The returned error code was
verified against the corresponding expected result in the external data pool. The Provider always
returned specific error descriptions to the ESB, but the ESB would often return a generic error code.
The ESB development team tended to prepare generic error codes for services and only extended the
error handling logic where the business or Consumer required a more defined response from the ESB.
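Continuing the data-variance sketch above, only the data pool changes; the rows and error codes below are hypothetical.

```javascript
// Error-handling variant: the same parameterised request and messaging
// client as the data-variance sketch, with only the data pool replaced.
// Hypothetical: the rows and the Exxx error codes.
var errorPool = [
  { CountryCode: "ZZ", Amount: "100.00", expectedCode: "E101" }, // unknown country
  { CountryCode: "NZ", Amount: "-1.00",  expectedCode: "E102" }, // invalid amount
  { CountryCode: "",   Amount: "100.00", expectedCode: "E103" }  // missing mandatory field
];

errorPool.forEach(function (row) {
  // var code = sendToEsb(buildRequest(template, row));
  // verify that code matches row.expectedCode from the data pool
});
```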
Notes:
• Before an error test scenario began, a positive run (usually the smoke test) was executed to
establish a baseline and ensure that the ESB and Provider were operational.
• From a technical point of view, error scenarios that might only appear during a technological
failure were also tested, to ensure that the ESB could identify and report when invalid data or
badly-formed messages were received for a service.
[Figure: An error handling variance test in Parasoft SOAtest]
[Figure: A sample spreadsheet covering various error scenarios]
Replication Test Patterns
The final stage was replication of the business workflow at the service layer. This was achieved by
reusing the established test assets and linking the services together to execute a business workflow
as the end-user would experience it.
Arguably, this could be viewed as the most valuable test executed, since it involved calling all services
within the project scope and replicating the system's behaviour during business workflows. It was also
highly effective as a regression test because it was not hampered by a front-end user interface and
could be executed relatively quickly, even compared to an automated functional test.
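A simplified sketch of this chaining appears below; the service names, element names, and payloads are invented for illustration.

```javascript
// Simplified sketch of chaining service calls into a business workflow.
// Hypothetical: the service names, element names, and payloads. The key
// idea is that a value extracted from one response feeds the next request.
function extractValue(xml, element) {
  var m = new RegExp("<" + element + ">(.*?)</" + element + ">").exec(xml);
  return m ? m[1] : null;
}

// sendToEsb is passed in as a placeholder for the messaging client.
function runWorkflow(sendToEsb) {
  // Step 1: create a customer and capture the generated id.
  var createResp = sendToEsb("<CreateCustomer><Name>Test</Name></CreateCustomer>");
  var customerId = extractValue(createResp, "CustomerId");

  // Step 2: open an account for that customer, reusing the captured id.
  var openResp = sendToEsb("<OpenAccount><CustomerId>" + customerId +
                           "</CustomerId></OpenAccount>");
  var accountNo = extractValue(openResp, "AccountNumber");

  // Step 3: make a payment from the new account; the final response is
  // checked against the expected end result of the whole workflow.
  return sendToEsb("<Payment><AccountNumber>" + accountNo +
                   "</AccountNumber><Amount>10.00</Amount></Payment>");
}
```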
Notes:
• Scenario testing, whether automated or manual, can prove complicated and cumbersome. It
was essential that any scenario tests built on the existing framework, so that changes could be
made easily when required and would apply globally across all other test patterns.
• Key data was fed in at key points along the message steps, and the results from each message
were carried forward down the line in an attempt to achieve the expected end result.
[Figure: Replicating the business workflow at the service layer in Parasoft SOAtest]
Outcome: Risk Reduction, Cost Savings, and Additional
Time for Validating Business Requirements
This initial project was the proof of concept for the approach, and the results were extremely positive.
It was easy to demonstrate value to stakeholders, since the test flows clearly showed the test patterns
taking shape. The success of the proof of concept led to the opportunity to apply the framework to
another project. This second implementation was also considered a success. By following the same
design patterns for the test assets, test scripting was smooth and effortless, freeing up more time for
analysing the technical specs and business requirements.
With the solution that IntegrationQA designed and implemented using the Parasoft API Testing solution,
the company achieved:
• Maintainability - The framework decoupled the test logic from the actual test tools: tests were
largely data-driven and parameterised. This facilitated test maintenance and updating as the
application evolved.
• Reusability - The scripted tests were designed to be extracted from the project where they
originated and dropped into other projects or into a service test repository.
• Scalability - The framework supports everything from narrowly targeted functional tests to
larger performance testing.
• Cost Savings - The automation reduces the time and cost involved in regression testing the ESB
implementation and back-end logic.
• Reduced Risk - "White-box testing" the inner components of the IT infrastructure minimised risk
for traditional test phases going forward. This direct testing enabled the team to expose a high
number of critical defects during the testing phase.
Over 18 months, the company has saved over $2.1 million NZD in testing and development costs (based
on the company's internal defect cost valuations).
About IntegrationQA
IntegrationQA is an independent software design, development-QA, and test consultancy located in
Wellington, New Zealand and Sydney, Australia. Years of collective systems integration experience
led us to establish our own practice in 2009, so we could bring our accrued knowledge to clients
throughout the Australia-New Zealand region. Our goal is to improve a client's IT infrastructure through
better technical design, improved delivery practices, and better software testing. Already, we have been
able to provide significant cost reductions and improved efficiency to clients, who in turn have
dramatically altered their design, development, and test approaches.
Contacting IntegrationQA
NEW ZEALAND: Level 7, 12-22 Johnston Street, Wellington 6011. T: +64 4 473 8535
AUSTRALIA: Level 39, 2 Park Street, Sydney 2000. T: +61 2 803 53413
About Parasoft
Parasoft researches and develops software solutions that help organizations deliver defect-free
software efficiently. We reduce the time, effort, and cost of delivering secure, reliable, and compliant
software. Parasoft's enterprise and embedded development solutions are the industry's most
comprehensive—including static analysis, unit testing, requirements traceability, coverage analysis,
functional & load testing, dev/test environment management, and more. The majority of Fortune 500
companies rely on Parasoft in order to produce top-quality software consistently and efficiently as they
pursue agile, lean, DevOps, compliance, and safety-critical development initiatives.
Contacting Parasoft
USA Phone: (888) 305-0041 Email: info@parasoft.com
NORDICS Phone: +31-70-3922000 Email: info@parasoft.nl
GERMANY Phone: +49 731 880309-0 Email: info-de@parasoft.com
POLAND Phone: +48 12 290 91 01 Email: info-pl@parasoft.com
UK Phone: +44 (0)208 263 6005 Email: sales@parasoft-uk.com
FRANCE Phone: (33 1) 64 89 26 00 Email: sales@parasoft-fr.com
ITALY Phone: (+39) 06 96 03 86 74 Email: c.soulat@parasoft-fr.com
OTHER See http://www.parasoft.com/contacts
Author Information
This paper was written by:
• Chris Wellington (chris.wellington@integrationqa.com), Managing Director at IntegrationQA
• Andrew Saunders (andrew.saunders@integrationqa.co.nz), Senior Consultant at IntegrationQA
• Cynthia Dunlop (cynthia.dunlop@parasoft.com), Lead Technical Writer at Parasoft
© 2015 IntegrationQA Limited and Parasoft Corporation
All rights reserved. Parasoft and all Parasoft products and services listed within are trademarks or registered trademarks of Parasoft Corporation. All other products, services, and companies
are trademarks, registered trademarks, or servicemarks of their respective holders in the US and/or other countries.