Trust the Experts
Scaling up the UAT practice for a major US Bank
Those were challenging times! Business was under pressure to expand across the globe in strategic locations. With
businesses booming on the one hand and under competitive pressure on the other, clients were off-shoring most of their
manpower intensive operations. There was also a reverse flow with foreign companies investing in the region, adding to
growth and business opportunities. Consequently, all operations of the Bank, including commercial, international and private
banking, as well as value-added services, had to be quickly ramped up to meet the pressures of growth.
As the CIO of the Bank, Milfred Mokowich was under pressure not only to expand but also to rationalize all systems and operations. The board had approved all expansion plans, and it was now up to her to address the bottlenecks. It was a situation centered on UAT that she had encountered in previous assignments, and this one was no different, with time always at a premium. Operational staff had to be freed from testing assignments, costs had to be brought down, and efficiencies improved through standardization and appropriate benchmarking.
Staff Constraints: Existing operational staff could not be easily diverted for conducting UAT, leading to delayed deliveries
of software releases and upgrades.
Lack of Standardization: With limited reuse of functional knowledge and learning deficiencies within the organization, she had to contend with duplicated effort.
Loss of Efficiency: In the absence of automation, she was hard-pressed to drive costs down. She also had to contend with misdirected effort due to the lack of global benchmarks for UAT practices.
From her previous experience, Milfred knew that the only way forward was to partner with a testing specialist from the banking domain to set up a dedicated UAT practice. She wished to free business users of as much as 80% of their testing time; this was also expected to improve the effectiveness, scale and maturity of the UAT practice.
TGSL (CS) 9 - Scaling up UAT practice
Vendor Specifications: The Specialist Testing House was
required to meet the following specifications:
• Established test processes refined over millions of hours of
testing
• Minimum of three large global assignments of setting up and
successful running of UAT practice
• Specialized expertise in testing financial software with an
accent on banking to assure
- 100% functional coverage of documented scope through
requirements driven testing
- Minimization of time overruns through rigorous reporting
through a control panel during the execution phase
- Strong test management expertise to anticipate and mitigate
risks
• Rigorous test methodologies, incorporating best practices
from learning gained through real-time pure testing
engagements
• Risk categorization of test objects to optimize total cost of ownership and minimize time to launch
• Ability to provide both functional and non-functional testing services
• Leadership to drive the test planning and execution phases
from end-to-end, including testing of interfaces
• A combination of non-functional testing practices, business
knowledge and domain focus for test-automation,
performance-testing and security-testing
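The risk-categorization requirement above is commonly implemented as a likelihood × impact score per test object, with the highest-scoring objects tested first. A minimal illustrative sketch in Python (the 1–5 scales and the object names are assumptions for illustration, not Thinksoft's actual framework):

```python
# Illustrative risk-based test prioritization: score = likelihood * impact.
# Scales and sample objects are invented; a real RBT framework would
# calibrate them with business and IT stakeholders.

def risk_score(likelihood: int, impact: int) -> int:
    """Score a test object on 1-5 likelihood and 1-5 impact scales."""
    return likelihood * impact

def prioritize(test_objects):
    """Sort test objects so the riskiest are tested first."""
    return sorted(
        test_objects,
        key=lambda t: risk_score(t["likelihood"], t["impact"]),
        reverse=True,
    )

test_objects = [
    {"name": "Payments posting", "likelihood": 4, "impact": 5},
    {"name": "Statement layout", "likelihood": 2, "impact": 2},
    {"name": "Interest accrual", "likelihood": 3, "impact": 4},
]

for t in prioritize(test_objects):
    print(t["name"], risk_score(t["likelihood"], t["impact"]))
```

In practice the score bands would also map to testing depth, e.g. full regression for high-risk objects versus smoke coverage for low-risk ones.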
Award of Contract: In the absence of an objective assessment framework, a preliminary screening of potential vendors, backed by strong references from her professional circle, pointed to Thinksoft as the only vendor who could meet all the specifications, subjective as that judgement may have been.
Thinksoft Global Services Limited (TGSL) was accordingly shortlisted, put through a rigorous evaluation and finally awarded the contract. In short, TGSL was a testing specialist with depth and breadth of domain knowledge.
Thinksoft’s Solution Framework: The framework, developed by domain experts and industry practitioners specifically for UAT implementation, is illustrated in Fig 1 below:
Fig 1: The Solution Framework
• Input: Business Process; Requirements; KT Sessions; Functional Artifacts; Release-Specific Plan; Business User Interaction
• Enablers: Tools and Automation Expertise; Scalable Business Tester capacity as per business needs; Standardized Processes; Domain/Test Knowledge Repositories; Best Practices; Metrics & Reporting; Risk-Based Test Framework; Resource Management; Test Repository; Knowledge Transfer & Management; UAT Practice Governance Models
• Output: Free up business users' time in UAT; Implement and standardize UAT best practices; Achieve a cost-effective UAT practice; Design for Continuous Improvement
Project Highlights:
• The Pilot phase incorporated key applications such as:
- Flexcube Universal Banking Solution, a Core Banking
Application
- eConnect, an Online Banking portal
- S1, a Payments Platform
• Applications were grouped into four phases as shown in Fig 2
• The UAT practice team devised a plan to align with the overall
strategy, as shown in Fig 3, working closely with the release
management and the IT operations teams to understand the
available environments
• Applications for automation were identified, based on
business criticality, number of releases and technology used
• A reusable, easily maintainable automation test pack, using a
flexible and reusable automation-framework, was built that
could be adapted at all levels of testing
• Centralized and structured reporting
• Best practices in Integrated testing such as Risk Based
Testing (RBT), Static Testing and incorporation of manual test
repositories into the test process
• Adoption of an onsite/offshore model to bring in cost advantages and global best practices. The team consisted of a core group and a flexible group for ramping up as per business needs.
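A "reusable, easily maintainable" automation test pack of the kind described above typically separates the test cases (data) from the runner (logic), so extending coverage for a new release means adding rows rather than code. A minimal hypothetical sketch (the toy system under test and the field names are invented for illustration, not the actual framework):

```python
# Data-driven regression runner sketch: test cases are plain data rows,
# so the same runner can be reused across applications and releases.

def apply_txn(balance, txn):
    """Toy system under test: apply a deposit/withdrawal to a balance."""
    if txn["type"] == "deposit":
        return balance + txn["amount"]
    if txn["type"] == "withdrawal" and txn["amount"] <= balance:
        return balance - txn["amount"]
    raise ValueError("rejected transaction")

# The regression pack as data: (starting balance, transaction, expected
# result); expected None means the transaction should be rejected.
REGRESSION_PACK = [
    (100, {"type": "deposit", "amount": 50}, 150),
    (100, {"type": "withdrawal", "amount": 40}, 60),
    (100, {"type": "withdrawal", "amount": 200}, None),
]

def run_pack(pack):
    """Run every case; return (passed, failed) counts for reporting."""
    passed = failed = 0
    for start, txn, expected in pack:
        try:
            ok = apply_txn(start, txn) == expected
        except ValueError:
            ok = expected is None  # a rejection was the expected outcome
        if ok:
            passed += 1
        else:
            failed += 1
    return passed, failed

print(run_pack(REGRESSION_PACK))  # → (3, 0)
```

The same pattern underlies most keyword- and data-driven frameworks: centralized reporting falls out naturally because every case flows through one runner.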
Fig 2: Phases of Implementing the Solution
• UAT Practice Implementation Pilot: 4 applications
• Phase-1: 16 applications
• Phase-2: 16 applications (including SwiftNet)
• Phase-3: 27 applications (including IBSNET)
• BAU: business as usual
Legend: Advent: Treasury application; BOXI: Business Objects 11 (DWH, MIS application); DART: Data Application Reporting Tool (Account MIS application); FAS 91: Loan amortization system; Fraudfilter: Fraud management system; Reg Reporter: Regulatory reporting application; PSGL: PeopleSoft General Ledger; S1 (ACH): Automated Clearing House (Payments application)
Fig 3: UAT Practice collaboration model with IT and Business
• IT Delivery: Project Plan; Design; Development; Unit Test; Technical Support; Defect Fix; Project Closure
• UAT Practice: Requirements Review; Test Impact; Test Strategy; Scenarios; Test Conditions/Cases/Data; Test Environment Needs; Test Execution; Defect Management; Metrics; Automation; Enhancements & Bug Fixes; Test Summary
• Business: Requirement Finalization; Provide High-Level Business Knowledge; Sign off Scenarios; Participate in Defect Meetings; Sign off Results of Acceptance Tests; Dry Run or Pilot; Handover to Maintenance
Major Applications and Functions tested:
• ACH: Automated Clearing House
• Client Onboarding
• Deposit Reclassification
• eConnect: An online banking channel and Internet banking portal for the USA
• EIB: Enterprise Integration Bus - Interface between eConnect
and the Core Banking Solution
• Flexcube: Core Banking System and Universal Banking
System
• Lockbox: A cheque management system for incoming
cheques for specific corporate clients
• Mobile Application
• T Recs: Total Reconciliation Solutions
• TAG: Transaction Gateway
• TurboCAR: A credit rating system
Disclaimer: All the documentation and other material contained herein is the property of Thinksoft Global Services and all intellectual property
rights in and to the same are owned by Thinksoft Global Services. You shall not, unless previously authorized by Thinksoft Global Services in
writing, copy, reproduce, market, license, lease or in any other way, dispose of, or utilize for profit, or exercise any ownership rights over the same.
In no event, unless required by applicable law or agreed to in writing, shall Thinksoft Global Services, or any person be liable for any loss, expense
or damage, of any type or nature arising out of the use of, or inability to use any material contained herein. Any such material is provided “as is”,
without warranty of any type or nature, either express or implied. All names, logos are used for identification purposes only and are trademarks
or registered trademarks of their respective companies
For more details, visit www.thinksoftglobal.com
Benefits derived from the TGSL engagement:
Saving Business users’ effort:
• Total effort-savings of 329 man days since inception
Best practices integration:
• Simplified the risk based testing (RBT) framework to meet
business requirements
• Risk score applied for smarter testing
- Greater testing rigor for high priority areas
- Critical defects identified upfront
Static testing benefits:
• Applied to all projects with significant functional changes
• 34 static defects identified prior to development resulting in
significant cost saving
• Static gaps distribution
- 29 requirement gaps
- 19 documentation gaps
- 6 process gaps
Automated regression test-pack for key applications:
• Thinksoft’s readily usable automation framework reused
• Achieved 74% test-execution effort saving per cycle
Test repository:
• Saved 304 man hours of test planning effort
• Savings achieved through reuse of the repository where projects came up for a second release since the start of the UAT practice
Effectiveness of UAT (Defect Analysis):
• 25% of total defects identified resulted in code changes
• 5% of defects were documentation bugs
• 9% were configuration bugs, whose elimination led to smooth production deployment
• 48% of overall defects identified were categorized as Critical or High priority
Challenges Faced:
• Interface testing faced multiple challenges: coordination for timely and appropriate delivery of interfaces, setting up the right environment, involvement of multiple vendors, and a combination of technical and functional issues
• Stringent timelines and limited timeframe for testing
• Balancing the demands of the Business and IT
Value Addition
• Early removal of gaps between the requirements furnished by business users and the functional specifications prepared by the development team
• A dedicated UAT team diligently undertook structured and
rigorous UAT, improving the core tester-productivity
• Adoption of re-usable repositories, resulting in increased
productivity.
Client Speak:
• Resource consolidation and cross application training, resulting
in optimum utilization of resources
• Process standardization and implementation of best practices
• Improved coverage and toll gates
• Risk Priority Matrix facilitated prioritization of the functions and
scenarios to be tested, resulting in identification of critical defects
during the initial phase of testing
• Regression-test automation, resulting in reduced effort, time and
cost
Milfred Mokowich, CIO:
“UAT practice participation was important to supplementing UAT on behalf of the business. The UAT team is a huge asset to the project and worked very hard to accommodate our aggressive schedule. They greatly improved the efficiency of the UAT process and significantly reduced the workload on the core team. I appreciate the quality of reporting and the commitment to provide it in a timely manner. I was very pleased with how the UAT team was able to work with the mobile team and became part of the whole team. Instead of approaching the project in a linear fashion, they assessed changes easily and adapted well to the flow of the project.”
Fig 4: UAT Practice - The Balancing Act
The UAT Practice sits between Business and IT, balancing the concerns of both:
• Business questions: Does UAT understand my business process flow? Do they understand my pain points? Are we doing enough testing?
• IT questions: Are we doing the right amount of testing? How much do I need to pay for UAT? Will we be able to go live on time?
• Business-facing enablers: Structured business user interaction; Clarification tracking & static testing; Scenario workshops; Risk-based testing
• IT-facing enablers: Offshore leverage; Automation and repository; Leveraging release management for optimized testing
• Outcomes: Confidence in the adequacy and quality of testing; Reduced business UAT effort; Optimized testing cost; Timely rollout