NAM 2011 Q4A Change Release
Version: v 1.1
Date: September 21, 2011
Page 1 of 18
NAM 2011 Q4A Change Release
(Including first part of Automated Data Gathering)
User Acceptance Test Strategy
Document History
Version Changes Author Date
1.0 1st Draft David Crane 8/3/2011
1.1 2nd Draft – Added descriptions for acronyms, updated Show All History, changed the name of the release, moved ECIF testing to first in the list David Crane 8/9/2011
1.2 Final Draft – Added Closeout to the items being tested, added Reviewers and Approvers, updated LOBs Affected David Crane 8/10/2011
Document Reviewers/Approvers
Name Position Reviewer (only) / Reviewer and Approver
Steve Dayton UAT Manager / Delivery Manager X
Alan Parsowith AML Operations Director X
Virginia Hibbard AML Advisory Compliance X
Kaushik Sinha Technology – Project Manager X
Kavya Kalyana Production Support Lead, Americas HUB X
Rachael Blanchard PMO PM X
Christie Vita AML Operations X
David Roset AML Operations X
Maryjane Noberto PMO PM X
David Crane UAT Coordinator X
Kisha Merrell UAT Project Manager X
Manoj Jejware Technology X
Prerak Vora AML Operations PM X
Table of Contents
Table of Contents 3
1. Introduction 4
1.1. Purpose of this document 4
2. Scope 4
2.1. In Scope 4
2.2. Out of Scope 5
3. System Access 5
3.1. System Access 5
4. UAT Strategy 6
4.1. Assumptions 6
4.2. Entry Criteria 6
4.3. Specific Deliverables 6
4.4. UAT Dates 7
4.5. UAT Testing Strategy 7
4.6 Test Script 15
4.7 Exit Criteria 15
5 Progress, Issue Tracking & Resolution 16
6. Roles and Responsibilities 17
1. Introduction
1.1. Purpose of this document
This document describes the key activities for the NAM User Acceptance Test (UAT)
required for successful implementation of the 2011 Q4A Change Release.
This project delivers enhancements to the QuickScreens NAM software system by
introducing new functionality to create Manual cases, roll up Household focus LSB alerts,
and support Team Lead assignment of cases. It also introduces productivity enhancements
for Quick Clearing and for showing all transaction history, and incorporates functionality
for automated data gathering from the ECIF system.
2. Scope
The goal of this project is to produce a fully functional and acceptable software product that
satisfies the high-level requirements defined below.
2.1. In Scope
This project delivers enhancements to the QuickScreens NAM software module of the AML-
M: AML MANTAS Analytic system (CSI App Id 34916), by introducing the following new
functionality:
1. Automated Data Gathering from ECIF.
(Americas Data Gathering BRD #3.1.1.4; Global Technology WorkSlate #3c-2)
This functionality will allow AML users to view data related to an AML case from
ECIF as required by the AML review processes without having to log into ECIF
directly.
2. Ability to create Manual cases for CPB, IPB, IAML and LSB.
(Global Technology WorkSlate #94; CR3575 or CR11147)
The following manual case types can be created:
1) Know Your Customer (KYC)
2) Suspicious
3) Subpoena
4) Senior Public Figures (SPF)
5) Non Government Organization (NGO)
6) Embassy Credit Card
7) Cash Sales Logs (CSL)
8) Payable Upon Proper ID (PUPID)
3. Rollup Household focus alerts
For LSB business line, Household focus cases should be rolled up to Customer
focus cases.
4. Show All Transaction History - Americas Data Gathering
(Americas Data Gathering BRD #3.1.6; Global Technology WorkSlate #3c)
This functionality will provide a way to view all transactions associated with an alert’s
account’s customer, not just the transactions that triggered the alert.
This feature utilizes only MANTAS data and therefore only goes back as far as the
MANTAS data is available (the current month and previous six months).
5. Quick Clearing Functionality - Americas Data Gathering
(Americas Data Gathering BRD #3.1.8; Global Technology WorkSlate #3c-7)
This functionality will allow for the ‘quick clearing’ of cases, such as those where a
similar case was recently reviewed, by identifying previously closed cases when a new
case is generated.
6. Team Lead Case Assignment
(Global Technology WorkSlate #95; CR3580 or CR11152)
This functionality will allow Team Lead roles to assign cases to themselves without the
ability to disposition cases.
7. Closeout
(Global Technology WorkSlate #75; CR10984 or PID 20081110-100)
This functionality will create a link between the Closeout case and the underlying
case, to track the case number from which the Closeout case stems. This is an
item from 2010 QSQ4 being added to this Change Release.
2.2. Out of Scope
The following areas are out-of-scope of testing:
1. General systems efficiency testing.
2. Updates or creation of procedures for AML Monitoring Operations, Production Support,
Staffing, and Communications about changes within Citi.
3. System Access
3.1. System Access
Access to source systems (production access), Mantas, the QSCM UAT UI, and the QSCM
back-end will be arranged for the System Implementation Support Group team prior to the
start of UAT activities.
UAT testers should have access to all systems used for alert monitoring, as well as ECIF.
4. UAT Strategy
4.1. Assumptions
 All entry criteria for UAT have been met.
 It is assumed that all UAT activities and tests are carried out on the basis of successful
completion of Technology SIT testing, as well as successful implementation of the rules
defined in the FRD.
 UAT should focus on end-to-end business processes from an end-user perspective to
confirm usability. All UAT activities will be carried out on production data loaded into the
UAT environment.
4.2. Entry Criteria
The following will be the entry criteria that must be met prior to the commencement of UAT:
 Approval of UAT Strategy document (this document)
 Confirmed availability of UAT testing resources
 Confirmed availability of systems (Mantas, QS, ECIF)
 Confirmed performance of UAT systems at a level similar to Production
 Access to all necessary systems and applications for each respective UAT tester
 Completion of required data loads
 Coding logic changes deployed to Quick Screens
 By end of 1st week of UAT: New Alerts from prospective business groups will be
identified:
o CPB
o IPB
o IAML
o LSB
o CBNA
o GTS
4.3. Specific Deliverables
This section describes the specific deliverables from Technology & Compliance to help
facilitate the testing, and when those deliverables need to be provided; whilst some are
required before UAT starts, others are only needed when a particular test is being
executed.
Team Business Deliverables Due Date
Technology ICG/GCG All evidence entry criteria are met UAT start date
Technology ICG/GCG Status on all issues found (on daily basis) Ongoing
Technology ICG/GCG Any scenario rerun, as necessary Ongoing
Compliance ICG/GCG Approvals, as needed Ongoing
Compliance ICG/GCG Attendance on Project level calls Ongoing
Compliance ICG/GCG Escalation, when necessary Ongoing
Operations ICG/GCG Approvals, as needed Ongoing
Operations ICG/GCG Attendance on Project level calls Ongoing
4.4. UAT Dates
Q4A Release START DATE END DATE
UAT Commence 08/08/2011 08/12/2011
UAT Testing 08/15/2011 09/07/2011
UAT Closure 09/08/2011 09/09/2011
Move to Production 09/09/2011 09/12/2011
Data Testing Plan Start End Resource
Systems Setup Testing 08/10/2011 08/12/2011 IT
Ingested Data Testing 08/10/2011 08/12/2011 IT
UAT Test – Functionality (tests below) 08/15/2011 09/05/2011 UAT
ECIF Automated Data Gathering 08/15/2011 08/19/2011 UAT
Create Manual Cases 08/22/2011 08/23/2011 UAT
Rollup Household Focus Alerts 08/24/2011 08/24/2011 UAT
Show All Transactions 08/25/2011 08/30/2011 UAT
Quick Clearing Functionality 08/31/2011 09/01/2011 UAT
Team Lead Case Assignment 09/02/2011 09/03/2011 UAT
Closeout 09/04/2011 09/05/2011 UAT
4.5. UAT Testing Strategy
Listed below are details of the tests that are to be performed for NAM. The scripts will need to be
updated and modified to match the specific test plan. The Test Plan is available here:
The table below summarizes the tests that should be performed as part of the UAT phase.
Test 1 – Automated Data Gathering from ECIF
Type: Quickscreens UI Functionality
Purpose: Test the new feature which allows AML users to view data related to an AML case from ECIF without having to log into ECIF directly.
Test Applicability: NAM. Lines of Business Affected: GTS.

Test 2 – Create Manual Cases for CPB, IPB, IAML, and LSB
Type: Quickscreens UI Functionality
Purpose: Test that the QuickScreens environment will accommodate the new functionality to add the following case types: 1) Know Your Customer (KYC); 2) Suspicious; 3) Subpoena; 4) Senior Public Figures (SPF); 5) Non Government Organization (NGO); 6) Embassy Credit Card; 7) Cash Sales Logs (CSL); 8) Payable Upon Proper ID (PUPID).
Test Applicability: NAM. Lines of Business Affected: CPB, IPB, IAML, LSB.

Test 3 – Rollup Household focus alerts for LSB line
Type: MANTAS Data
Purpose: Test for successful rollup of alerts to Customer focus cases.
Test Applicability: NAM. Lines of Business Affected: LSB.

Test 4 – Show All Transaction History – Americas Data Gathering
Type: Quickscreens UI Functionality
Purpose: Test the new feature to view all transactions associated with an alert’s account’s customer, not just those that triggered the alert.
Test Applicability: NAM. Lines of Business Affected: All.

Test 5 – Quick Clearing Functionality – Americas Data Gathering
Type: Quickscreens UI Functionality
Purpose: Test the new feature to ‘quick clear’ cases and identify previously closed cases when a new case generates.
Test Applicability: NAM. Lines of Business Affected: All.

Test 6 – Team Lead Case Assignment
Type: Quickscreens UI Functionality
Purpose: Test the new feature allowing Team Lead roles to assign cases to themselves without the ability to disposition cases.
Test Applicability: NAM. Lines of Business Affected: All.

Test 7 – Closeout
Type: Quickscreens UI Functionality
Purpose: Test the new “Initiate Closeout” button, which initiates the closeout process. Also test the link between the Closeout case and the underlying case.
Test Applicability: NAM. Lines of Business Affected: CPB, IPB, IAML, LSB.

4.5.1. Automated Data Gathering from ECIF (Test 1)
Overview
Purpose To test that the QS front end performs in the expected manner, to ensure new
enhancements work as expected, and to confirm that existing functionality is not impacted
by the changes.
Strategy Create test scripts based on the existing QS application. The scripts should
encompass verification of all functionality (new fields/functionality,
amendments to user roles, permissions and workflow, etc.). Execute the test
scripts in the UAT environment and verify that the QS interface behaves as
expected.
Expected Outputs / Acceptance Criteria
 QS user interface behaves as expected, including requested enhancements.
 No severity High or Medium issues outstanding.
 Any outstanding severity Low issues have agreed workarounds in place.
Testing Requirements
Dependencies Technology SIT complete on SIT data.
Mantas UAT environment loaded with required data (Mantas data pre-
processing and data loading complete).
Completeness of data ingestion from source system to Mantas.
Supporting Documentation To enable creation of test scripts, availability of: BRD and MDD
Inputs / Data Requirements Sufficient data sets available to enable sample checking of customers,
accounts, and transaction data covering data items required for all
applicable scenario types.
Data feed from Source Systems.
Transaction details in Mantas and source systems.
Access Requirements Access to sample data extracts from Mantas back-end database.
Access to Mantas UAT environment.
Access to QSCM UAT environment.
Access to Source systems or to output files for the sample identified for
testing.
Access to Mantas and Source systems or to output files for the sample
identified for testing.
Known Issues / Potential
Feedback
Access to source systems
Planning Considerations
Who Project UAT team
Timing As soon as data is available in QS UAT environment.
Effort 4 Days
Location Americas Hub.
Linked Activities
4.5.2. Create Manual Cases for CPB, IPB, IAML, and LSB (Test 2)
Overview
Purpose To test that the QS front end performs in the expected manner, to ensure new
enhancements work as expected, and to confirm that existing functionality is not impacted
by the changes.
Strategy Create test scripts based on the existing QS application. The scripts should
encompass verification of all functionality (new fields/functionality,
amendments to user roles, permissions and workflow, etc.). Execute the test
scripts in the UAT environment and verify that the QS interface behaves as
expected.
Expected Outputs / Acceptance Criteria
 QS user interface behaves as expected, including requested enhancements.
 No severity High or Medium issues outstanding.
 Any outstanding severity Low issues have agreed workarounds in place.
Testing Requirements
Dependencies Technology SIT complete on SIT data.
Mantas UAT environment loaded with required data (Mantas data pre-
processing and data loading complete).
Completeness of data ingestion from source system to Mantas.
Supporting Documentation To enable creation of test scripts, availability of: BRD and MDD
Inputs / Data Requirements Sufficient data sets available to enable sample checking of customers,
accounts, and transaction data covering data items required for all
applicable scenario types.
Data feed from Source Systems.
Transaction details in Mantas and source systems.
Access Requirements Access to sample data extracts from Mantas back-end database.
Access to Mantas UAT environment.
Access to QSCM UAT environment.
Access to Source systems or to output files for the sample identified for
testing.
Access to Mantas and Source systems or to output files for the sample
identified for testing.
Known Issues / Potential
Feedback
Access to source systems
Planning Considerations
Who Project UAT team
Timing As soon as data is available in QS UAT environment.
Effort 2 Days
Location Americas Hub.
Linked Activities
4.5.3. Rollup Household focus alerts for LSB line (Test 3)
Overview
Purpose Test that Household focus alerts roll up properly into Customer focus cases.
Strategy Create test scripts based on previous roll-ups. The test script should be
organized to find cases in the database and match them to results using
Quick Screens.
Expected Outputs / Acceptance Criteria
 No severity High or Medium issues outstanding.
 Any outstanding severity Low issues have agreed workarounds in place.
Testing Requirements
Dependencies Technology SIT complete on SIT data.
Mantas UAT environment loaded with required data (Mantas data pre-
processing and data loading complete).
Completeness of data ingestion from source system to Mantas.
Supporting Documentation To enable creation of test scripts, availability of: BRD and MDD
Inputs / Data Requirements Sufficient data sets available to enable sample checking of customers,
accounts, and transaction data covering data items required for all
applicable scenario types.
Data feed from Source Systems.
Transaction details in Mantas and source systems.
Access Requirements Access to sample data extracts from Mantas back-end database.
Access to Mantas UAT environment.
Access to QSCM UAT environment.
Access to Source systems or to output files for the sample identified for
testing.
Access to Mantas and Source systems or to output files for the sample
identified for testing.
Known Issues / Potential
Feedback
Access to source systems
Planning Considerations
Who Project UAT team
Timing As soon as data is available in QS UAT environment.
Effort 1 Day
Location Americas Hub.
Linked Activities
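As an illustration of the outcome Test 3 checks, a tester might script the expected rollup result before comparing it against Quick Screens. The sketch below is hypothetical: the alert fields, focus codes, and the household-to-customer mapping are illustrative assumptions, not the actual MANTAS schema.

```python
# Illustrative expectation for Test 3: LSB Household-focus alerts roll up
# under a single Customer-focus case. Field names are hypothetical.
from collections import defaultdict

def rollup_household_alerts(alerts, household_to_customer):
    """Group LSB Household-focus alerts by the customer owning the household."""
    cases = defaultdict(list)
    for alert in alerts:
        if alert["business"] == "LSB" and alert["focus"] == "HOUSEHOLD":
            customer = household_to_customer[alert["focus_id"]]
            cases[customer].append(alert["alert_id"])
    return dict(cases)

alerts = [
    {"alert_id": "A1", "business": "LSB", "focus": "HOUSEHOLD", "focus_id": "H1"},
    {"alert_id": "A2", "business": "LSB", "focus": "HOUSEHOLD", "focus_id": "H1"},
    {"alert_id": "A3", "business": "CPB", "focus": "CUSTOMER", "focus_id": "C9"},
]
print(rollup_household_alerts(alerts, {"H1": "C1"}))  # {'C1': ['A1', 'A2']}
```

A test script could compute this expected grouping from the database sample and then confirm the same Customer focus case appears in Quick Screens.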
4.5.4. Show All Transaction History – Americas Data Gathering
(Test 4)
Overview
Purpose To test that the QS front end performs in the expected manner with regard to the
new screen, to ensure new enhancements work as expected, and to confirm that existing
functionality is not impacted by the changes.
Strategy Create test scripts based on the existing QS application. The scripts should
encompass verification of all functionality (including checking the new
hyperlink, populating the MANTAS Transaction Request search parameters
and submitting, and validating the data returned by the All MANTAS
Transaction Request search). Execute the test scripts in the UAT
environment and verify that the QS interface behaves as expected.
Expected Outputs / Acceptance Criteria
 QS user interface behaves as expected, including requested enhancements.
 No severity High or Medium issues outstanding.
 Any outstanding severity Low issues have agreed workarounds in place.
Testing Requirements
Dependencies Technology SIT complete on SIT data.
Mantas UAT environment loaded with required data (Mantas data pre-
processing and data loading complete).
Completeness of data ingestion from source system to Mantas.
Supporting Documentation To enable creation of test scripts, availability of: BRD and MDD
Inputs / Data Requirements Sufficient data sets available to enable sample checking of customers,
accounts, and transaction data covering data items required for all
applicable scenario types.
Data feed from Source Systems.
Transaction details in Mantas and source systems.
Access Requirements Access to sample data extracts from Mantas back-end database.
Access to Mantas UAT environment.
Access to QSCM UAT environment.
Access to Source systems or to output files for the sample identified for
testing.
Access to Mantas and Source systems or to output files for the sample
identified for testing.
Known Issues / Potential
Feedback
Access to source systems
Planning Considerations
Who Project UAT team
Timing As soon as data is available in QS UAT environment.
Effort 4 Days
Location Americas Hub.
Linked Activities
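Because Show All Transaction History only reaches back as far as MANTAS data exists (the current month plus the previous six months), a tester might sanity-check that every transaction date returned falls inside that window. This is an illustrative sketch under that stated window; the helper function below is an assumption, not part of the application.

```python
# Illustrative window check for Test 4: MANTAS data covers the current
# month plus the previous six months.
from datetime import date

def in_mantas_window(txn_date, today):
    """True if txn_date falls in the current month or the previous six months."""
    months_back = (today.year - txn_date.year) * 12 + (today.month - txn_date.month)
    return 0 <= months_back <= 6

print(in_mantas_window(date(2011, 3, 15), date(2011, 9, 21)))  # True (6 months back)
print(in_mantas_window(date(2011, 2, 28), date(2011, 9, 21)))  # False (7 months back)
```

A test script could assert this predicate over every row returned by the All MANTAS Transaction Request search.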
4.5.5. Quick Clearing Functionality – Americas Data Gathering
(Test 5)
Overview
Purpose Allows for the ‘quick clearing’ of cases, such as those where a similar case
was recently reviewed. When a new case is generated, any cases closed in
the last 45 days on the same Focus Name or Focus ID are identified and
flagged in the Case Details screen.
Strategy Create test scripts based on the existing QS application. The scripts should
encompass verification of all functionality (new fields/functionality,
amendments to user roles, permissions and workflow, etc.). Execute the test
scripts in the UAT environment and verify that the QS interface behaves as
expected.
Expected Outputs / Acceptance Criteria
 QS user interface behaves as expected, including requested enhancements.
 No severity High or Medium issues outstanding.
 Any outstanding severity Low issues have agreed workarounds in place.
Testing Requirements
Dependencies Technology SIT complete on SIT data.
Mantas UAT environment loaded with required data (Mantas data pre-
processing and data loading complete).
Completeness of data ingestion from source system to Mantas.
Supporting Documentation To enable creation of test scripts, availability of: BRD and MDD
Inputs / Data Requirements Sufficient data sets available to enable sample checking of customers,
accounts, and transaction data covering data items required for all
applicable scenario types.
Data feed from Source Systems.
Transaction details in Mantas and source systems.
Access Requirements Access to sample data extracts from Mantas back-end database.
Access to Mantas UAT environment.
Access to QSCM UAT environment.
Access to Source systems or to output files for the sample identified for
testing.
Access to Mantas and Source systems or to output files for the sample
identified for testing.
Known Issues / Potential
Feedback
Access to source systems
Planning Considerations
Who Project UAT team
Timing As soon as data is available in QS UAT environment.
Effort 2 Days
Location Americas Hub.
Linked Activities
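The 45-day lookback that Test 5 exercises can be expressed as a simple rule a tester might script to compute expected flags. The field names (focus_name, focus_id, closed_on) are illustrative assumptions, not the actual QSCM schema.

```python
# Illustrative expectation for Test 5: when a new case generates, flag any
# case closed in the last 45 days on the same Focus Name or Focus ID.
from datetime import date, timedelta

def recently_closed_matches(new_case, closed_cases, today):
    """Return the IDs of closed cases the new case should flag."""
    cutoff = today - timedelta(days=45)
    return [
        c["case_id"]
        for c in closed_cases
        if c["closed_on"] >= cutoff
        and (c["focus_name"] == new_case["focus_name"]
             or c["focus_id"] == new_case["focus_id"])
    ]

new = {"focus_name": "ACME LLC", "focus_id": "F100"}
closed = [
    {"case_id": "C1", "focus_name": "ACME LLC", "focus_id": "F100",
     "closed_on": date(2011, 9, 1)},
    {"case_id": "C2", "focus_name": "OTHER", "focus_id": "F999",
     "closed_on": date(2011, 9, 1)},
]
print(recently_closed_matches(new, closed, date(2011, 9, 21)))  # ['C1']
```

A test script could compute this expected list from the case database and confirm the same cases are flagged on the Case Details screen.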
4.5.6. Team Lead Case Assignment (Test 6)
Overview
Purpose This functionality allows Team Lead roles to assign cases to themselves
without the ability to disposition cases. Such cases must be in the Research
category and in Initiated status.
When a Team Lead clicks the Assign/Reassign button in the Case Details
screen, they will see themselves in the "Assign to" dropdown.
Strategy Create test scripts based on the existing QS application. The scripts should
encompass verification of all functionality (new fields/functionality,
amendments to user roles, permissions and workflow, etc.). Execute the test
scripts in the UAT environment and verify that the QS interface behaves as
expected.
Expected Outputs / Acceptance Criteria
 QS user interface behaves as expected, including requested enhancements.
 No severity High or Medium issues outstanding.
 Any outstanding severity Low issues have agreed workarounds in place.
Testing Requirements
Dependencies Technology SIT complete on SIT data.
Mantas UAT environment loaded with required data (Mantas data pre-
processing and data loading complete).
Completeness of data ingestion from source system to Mantas.
Supporting Documentation To enable creation of test scripts, availability of: BRD and MDD
Inputs / Data Requirements Sufficient data sets available to enable sample checking of customers,
accounts, and transaction data covering data items required for all
applicable scenario types.
Data feed from Source Systems.
Transaction details in Mantas and source systems.
Access Requirements Access to sample data extracts from Mantas back-end database.
Access to Mantas UAT environment.
Access to QSCM UAT environment.
Access to Source systems or to output files for the sample identified for
testing.
Access to Mantas and Source systems or to output files for the sample
identified for testing.
Known Issues / Potential
Feedback
Access to source systems
Planning Considerations
Who Project UAT team
Timing As soon as data is available in QS UAT environment.
Effort 2 Days
Location Americas Hub.
Linked Activities
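The assignment rule Test 6 verifies (self-assignment only for cases in the Research category with Initiated status, and no disposition rights for Team Leads) can be sketched as below. The role names and case fields are hypothetical assumptions for illustration.

```python
# Illustrative rule for Test 6: Team Leads may self-assign only Research /
# Initiated cases and never gain disposition rights. Names are hypothetical.
def can_self_assign(role, case):
    """A Team Lead may take a case only if it is Research and Initiated."""
    return (role == "TEAM_LEAD"
            and case["category"] == "Research"
            and case["status"] == "Initiated")

def can_disposition(role):
    # Team Leads are deliberately excluded from the dispositioning roles
    # (hypothetical role set for illustration).
    return role in {"ANALYST", "INVESTIGATOR"}

print(can_self_assign("TEAM_LEAD", {"category": "Research", "status": "Initiated"}))  # True
print(can_self_assign("TEAM_LEAD", {"category": "Research", "status": "Open"}))  # False
print(can_disposition("TEAM_LEAD"))  # False
```

Test scripts would exercise both sides of this rule: confirming the Team Lead appears in the "Assign to" dropdown for eligible cases, and confirming disposition actions remain unavailable.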
4.5.7. Closeout (Test 7)
Overview
Purpose This functionality will allow users to initiate the closeout process with one
button click.
Strategy Create test scripts based on the existing QS application. The scripts should
encompass verification of all functionality (new fields/functionality,
amendments to user roles, permissions and workflow, etc.). Execute the test
scripts in the UAT environment and verify that the QS interface behaves as
expected.
Expected Outputs / Acceptance Criteria
 QS user interface behaves as expected, including requested enhancements.
 No severity High or Medium issues outstanding.
 Any outstanding severity Low issues have agreed workarounds in place.
Testing Requirements
Dependencies Technology SIT complete on SIT data.
Mantas UAT environment loaded with required data (Mantas data pre-
processing and data loading complete).
Completeness of data ingestion from source system to Mantas.
Supporting Documentation To enable creation of test scripts, availability of: BRD and MDD
Inputs / Data Requirements Sufficient data sets available to enable sample checking of customers,
accounts, and transaction data covering data items required for all
applicable scenario types.
Data feed from Source Systems.
Transaction details in Mantas and source systems.
Access Requirements Access to sample data extracts from Mantas back-end database.
Access to Mantas UAT environment.
Access to QSCM UAT environment.
Access to Source systems or to output files for the sample identified for
testing.
Access to Mantas and Source systems or to output files for the sample
identified for testing.
Known Issues / Potential
Feedback
Access to source systems
Planning Considerations
Who Project UAT team
Timing As soon as data is available in QS UAT environment.
Effort 2 Days
Location Americas Hub.
Linked Activities
4.6 Test Script
Draft Script.xlsx
4.7 Exit Criteria
The following are the UAT exit criteria that need to be satisfied prior to obtaining UAT Sign-off:
 Completed test scripts with test results
 Resolution of all ‘Showstopper’ & ‘High’ severity defects
 List of defects that are outstanding after UAT, including the workaround, resolution and
resolution date
 Updated UAT RICA log
 Closure Document (template attached below)
UAT Test Closure
Memo Template.doc
5 Progress, Issue Tracking & Resolution
The Project Team should ensure that:
 A RICA log (Risks, Issues, Change requests, Actions) is maintained, providing a detailed
description, severity (High, Medium, Low, Showstopper), date found, target resolution date,
and the team/individual responsible for resolution.
Issues in the RICA log will be categorized and tracked as follows:
Category Description
ID Sequential reference number
UAT Phase <Comment: can category be based on Test number?>
One from:
 Data: existing data feeds + new sources -> EDW
 Data Feeds: EDW -> Mantas
 Data Transfer: Mantas -> QSCM
Description Issue description
Status Open, Closed, On Hold
Technology Component “Data Feeds”, “Mantas”, “Quickscreens”, “Thresholds”.
Severity To assist testing team in understanding impact:
Showstopper -
Critical issue that is stopping further progress; project timelines are at risk,
with very severe impact on other issues and continued testing. Should be
worked as the most urgent type of issue, and escalated if a swift resolution
is not possible.
High -
Severe impact on the ability to meet stated requirements; the application /
data feed could not go live without resolution of this issue. Other UAT
activities may depend on this issue and be delayed pending its resolution,
or may be affected but able to continue whilst it is resolved.
Medium -
High impact; the application / data feed could go live without resolution of
this issue, but resolution would be a high priority post-live and / or
significant re-work of processes / procedures would be required in the interim.
Low -
Moderate / Low impact; the application / data feed could go live without
resolution of this issue. Resolution would be a medium priority post-live,
and simple or minor re-work of processes / procedures would resolve it.
Priority To assist the Tech team in prioritizing issue resolution work:
Showstopper - to be addressed as urgently as possible
High - to be addressed as soon as possible
Medium - to be addressed to an agreed timescale
Low - to be addressed as a “nice to have”, time allowing
Owner Business owner (Operational Readiness Team) or Testing Team Stream
Leader
Action Plan Record of action to be taken to resolve issue
Open Date Date issue identified
Technology
Contact
Technology team member assigned responsibility for resolution of the
issue
Target Date Target date for resolution – to be discussed and agreed with Technology
contact
Revised Target
Date
If applicable
Resolution /
Decision /
Outcome
Outcome of the issue
Resolved By Who the decision / resolution / outcome was reached by
Date Closed Date issue resolved / closed
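To make the categorization concrete, a single RICA log entry tracked with the fields above might look like the following sketch. All values are hypothetical, supplied only to illustrate the field structure.

```python
# Hypothetical RICA log entry using the field definitions above.
rica_entry = {
    "ID": 1,
    "UAT Phase": "Data Transfer Mantas -> QSCM",
    "Description": "Closeout link not shown on the Case Details screen",
    "Status": "Open",                      # Open, Closed, On Hold
    "Technology Component": "Quickscreens",
    "Severity": "High",                    # Showstopper, High, Medium, Low
    "Priority": "High",
    "Owner": "Testing Team Stream Leader",
    "Action Plan": "Technology to investigate and redeploy the Closeout link",
    "Open Date": "08/16/2011",
    "Technology Contact": "TBD",
    "Target Date": "08/18/2011",
    "Revised Target Date": None,
    "Resolution / Decision / Outcome": None,
    "Resolved By": None,
    "Date Closed": None,
}

# Severity and status values allowed by the log definition.
VALID_SEVERITIES = {"Showstopper", "High", "Medium", "Low"}
VALID_STATUSES = {"Open", "Closed", "On Hold"}
assert rica_entry["Severity"] in VALID_SEVERITIES
assert rica_entry["Status"] in VALID_STATUSES
```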
Attached below is the template for the RICA Log.
RICA Log
Template.xls
6. Roles and Responsibilities
User Acceptance Test planning work, including UAT, will be carried out in Tampa, Florida for the
migration of AML Monitoring activities from Puerto Rico to the NAM Hub.
Listed below are the UAT Testing Team and key contacts:
Name Role Key Responsibilities
Kisha Merrell UAT Project Manager - Coordinate and manage UAT-level activities.
David Crane UAT Coordinator - Support UAT team/PM
Steve Dayton UAT Test Lead
- Overall UAT senior project coordination.
Alan Shen Apps. Dev. Group Manager
- Development, configuration and implementation of application (Mantas & QSCM) components.
- Development, configuration and implementation of data (data feeds to Mantas) components.
- Issue resolution – applications.
- Issue resolution – data.
- Scenarios.
Alan Parsowith Sr. Group Manager,
Americas Hub
Operations
Approver
Kavya Kalyana Production Support
Lead, Americas Hub
Reviewer
Othniel Alexandre Global Production
Support Manager
Approver
Dennis Carey UAT Test Team Writing test scripts, adding into QC
Abbreviation Expanded Form
AML Anti-Money Laundering
BAU Business As Usual
BO XI Business Objects version XI (11)
CSIS Citi Systems Information Security
DIS Data Information Specification
DW Data Warehouse
GIW Global Information Warehouse
FIU Financial Investigations Unit
GTS Global Transaction Services
ICG Institutional Clients Group
KL Kuala Lumpur (Asia Pacific business region)
NAM North American business region
TRP Targeted Review Program
UAT User Acceptance Testing
 
Basics of software testing webwing technologies
Basics of software testing webwing technologiesBasics of software testing webwing technologies
Basics of software testing webwing technologies
 
Eleven step of software testing process
Eleven step of software testing processEleven step of software testing process
Eleven step of software testing process
 
What is sanity testing
What is sanity testingWhat is sanity testing
What is sanity testing
 
7 steps to Software test automation success
7 steps to Software test automation success7 steps to Software test automation success
7 steps to Software test automation success
 
Software Testing Principles and  Techniques
Software Testing Principles and  Techniques Software Testing Principles and  Techniques
Software Testing Principles and  Techniques
 
Regression testing complete guide
Regression testing complete guideRegression testing complete guide
Regression testing complete guide
 
Manual testing
Manual testingManual testing
Manual testing
 
Non-Functional testing
Non-Functional testingNon-Functional testing
Non-Functional testing
 
Sanity testing and smoke testing
Sanity testing and smoke testingSanity testing and smoke testing
Sanity testing and smoke testing
 

Similar to NAM Q4a 2011 UAT Strategy Document v1 0

Comparative Analysis of IT Monitoring Tools
Comparative Analysis of IT Monitoring ToolsComparative Analysis of IT Monitoring Tools
Comparative Analysis of IT Monitoring Toolsapprize360
 
Coml Psg Automation Approach
Coml Psg Automation ApproachComl Psg Automation Approach
Coml Psg Automation Approachroopavani
 
20 Simple Questions from Exactpro for Your Enjoyment This Holiday Season
20 Simple Questions from Exactpro for Your Enjoyment This Holiday Season20 Simple Questions from Exactpro for Your Enjoyment This Holiday Season
20 Simple Questions from Exactpro for Your Enjoyment This Holiday SeasonIosif Itkin
 
Reducing Cycle Time for iDEN Releases – A Development and Test Perspective
Reducing Cycle Time for iDEN Releases – A Development and Test PerspectiveReducing Cycle Time for iDEN Releases – A Development and Test Perspective
Reducing Cycle Time for iDEN Releases – A Development and Test PerspectivePraveen Srivastava
 
Reducing Cycle Time for iDEN Releases – A Development and Test Perspective
Reducing Cycle Time for iDEN Releases – A Development and Test PerspectiveReducing Cycle Time for iDEN Releases – A Development and Test Perspective
Reducing Cycle Time for iDEN Releases – A Development and Test PerspectivePraveen Srivastava
 
IRJET- Performance Analysis of Store Inventory Management (SIM) an Enterp...
IRJET-  	  Performance Analysis of Store Inventory Management (SIM) an Enterp...IRJET-  	  Performance Analysis of Store Inventory Management (SIM) an Enterp...
IRJET- Performance Analysis of Store Inventory Management (SIM) an Enterp...IRJET Journal
 
Project Business Case and Capital Justification for Implementation of Applica...
Project Business Case and Capital Justification for Implementation of Applica...Project Business Case and Capital Justification for Implementation of Applica...
Project Business Case and Capital Justification for Implementation of Applica...Duane Bodle
 
An Oversight or a New Customer Phenomenon, Getting the Most of your Contact C...
An Oversight or a New Customer Phenomenon, Getting the Most of your Contact C...An Oversight or a New Customer Phenomenon, Getting the Most of your Contact C...
An Oversight or a New Customer Phenomenon, Getting the Most of your Contact C...Cisco Canada
 
Error isolation and management in agile
Error isolation and management in agileError isolation and management in agile
Error isolation and management in agileijccsa
 
Error Isolation and Management in Agile Multi-Tenant Cloud Based Applications
Error Isolation and Management in Agile Multi-Tenant Cloud Based Applications Error Isolation and Management in Agile Multi-Tenant Cloud Based Applications
Error Isolation and Management in Agile Multi-Tenant Cloud Based Applications neirew J
 
IRJET- A Survey for Block Chaining based Cyber Security System for Fiscal Dev...
IRJET- A Survey for Block Chaining based Cyber Security System for Fiscal Dev...IRJET- A Survey for Block Chaining based Cyber Security System for Fiscal Dev...
IRJET- A Survey for Block Chaining based Cyber Security System for Fiscal Dev...IRJET Journal
 
E Com Security solutions hand book on Firewall security management in PCI Com...
E Com Security solutions hand book on Firewall security management in PCI Com...E Com Security solutions hand book on Firewall security management in PCI Com...
E Com Security solutions hand book on Firewall security management in PCI Com...Dolly Juhu
 
IRJET- Development Operations for Continuous Delivery
IRJET- Development Operations for Continuous DeliveryIRJET- Development Operations for Continuous Delivery
IRJET- Development Operations for Continuous DeliveryIRJET Journal
 
Comparative analysis of it monitoring tools october2015 final
Comparative analysis of it monitoring tools october2015 finalComparative analysis of it monitoring tools october2015 final
Comparative analysis of it monitoring tools october2015 finalapprize360
 
Manoj_Netcool&Cognos_Consultant
Manoj_Netcool&Cognos_ConsultantManoj_Netcool&Cognos_Consultant
Manoj_Netcool&Cognos_Consultantmanoj yadav
 
360 cellutions casestudy
360 cellutions casestudy360 cellutions casestudy
360 cellutions casestudy360cell
 
Epco itsm transformation_roadmap_v5_draft_063008
Epco itsm transformation_roadmap_v5_draft_063008Epco itsm transformation_roadmap_v5_draft_063008
Epco itsm transformation_roadmap_v5_draft_063008Accenture
 

Similar to NAM Q4a 2011 UAT Strategy Document v1 0 (20)

Comparative Analysis of IT Monitoring Tools
Comparative Analysis of IT Monitoring ToolsComparative Analysis of IT Monitoring Tools
Comparative Analysis of IT Monitoring Tools
 
Coml Psg Automation Approach
Coml Psg Automation ApproachComl Psg Automation Approach
Coml Psg Automation Approach
 
20 Simple Questions from Exactpro for Your Enjoyment This Holiday Season
20 Simple Questions from Exactpro for Your Enjoyment This Holiday Season20 Simple Questions from Exactpro for Your Enjoyment This Holiday Season
20 Simple Questions from Exactpro for Your Enjoyment This Holiday Season
 
Tyco IS Oracle Apps Support Project
Tyco IS Oracle Apps Support ProjectTyco IS Oracle Apps Support Project
Tyco IS Oracle Apps Support Project
 
Reducing Cycle Time for iDEN Releases – A Development and Test Perspective
Reducing Cycle Time for iDEN Releases – A Development and Test PerspectiveReducing Cycle Time for iDEN Releases – A Development and Test Perspective
Reducing Cycle Time for iDEN Releases – A Development and Test Perspective
 
Reducing Cycle Time for iDEN Releases – A Development and Test Perspective
Reducing Cycle Time for iDEN Releases – A Development and Test PerspectiveReducing Cycle Time for iDEN Releases – A Development and Test Perspective
Reducing Cycle Time for iDEN Releases – A Development and Test Perspective
 
Resume
ResumeResume
Resume
 
IRJET- Performance Analysis of Store Inventory Management (SIM) an Enterp...
IRJET-  	  Performance Analysis of Store Inventory Management (SIM) an Enterp...IRJET-  	  Performance Analysis of Store Inventory Management (SIM) an Enterp...
IRJET- Performance Analysis of Store Inventory Management (SIM) an Enterp...
 
Project Business Case and Capital Justification for Implementation of Applica...
Project Business Case and Capital Justification for Implementation of Applica...Project Business Case and Capital Justification for Implementation of Applica...
Project Business Case and Capital Justification for Implementation of Applica...
 
Attachment_0.pdf
Attachment_0.pdfAttachment_0.pdf
Attachment_0.pdf
 
An Oversight or a New Customer Phenomenon, Getting the Most of your Contact C...
An Oversight or a New Customer Phenomenon, Getting the Most of your Contact C...An Oversight or a New Customer Phenomenon, Getting the Most of your Contact C...
An Oversight or a New Customer Phenomenon, Getting the Most of your Contact C...
 
Error isolation and management in agile
Error isolation and management in agileError isolation and management in agile
Error isolation and management in agile
 
Error Isolation and Management in Agile Multi-Tenant Cloud Based Applications
Error Isolation and Management in Agile Multi-Tenant Cloud Based Applications Error Isolation and Management in Agile Multi-Tenant Cloud Based Applications
Error Isolation and Management in Agile Multi-Tenant Cloud Based Applications
 
IRJET- A Survey for Block Chaining based Cyber Security System for Fiscal Dev...
IRJET- A Survey for Block Chaining based Cyber Security System for Fiscal Dev...IRJET- A Survey for Block Chaining based Cyber Security System for Fiscal Dev...
IRJET- A Survey for Block Chaining based Cyber Security System for Fiscal Dev...
 
E Com Security solutions hand book on Firewall security management in PCI Com...
E Com Security solutions hand book on Firewall security management in PCI Com...E Com Security solutions hand book on Firewall security management in PCI Com...
E Com Security solutions hand book on Firewall security management in PCI Com...
 
IRJET- Development Operations for Continuous Delivery
IRJET- Development Operations for Continuous DeliveryIRJET- Development Operations for Continuous Delivery
IRJET- Development Operations for Continuous Delivery
 
Comparative analysis of it monitoring tools october2015 final
Comparative analysis of it monitoring tools october2015 finalComparative analysis of it monitoring tools october2015 final
Comparative analysis of it monitoring tools october2015 final
 
Manoj_Netcool&Cognos_Consultant
Manoj_Netcool&Cognos_ConsultantManoj_Netcool&Cognos_Consultant
Manoj_Netcool&Cognos_Consultant
 
360 cellutions casestudy
360 cellutions casestudy360 cellutions casestudy
360 cellutions casestudy
 
Epco itsm transformation_roadmap_v5_draft_063008
Epco itsm transformation_roadmap_v5_draft_063008Epco itsm transformation_roadmap_v5_draft_063008
Epco itsm transformation_roadmap_v5_draft_063008
 

NAM Q4a 2011 UAT Strategy Document v1 0

This project delivers enhancements to the QuickScreens NAM software system by introducing new functionality to create manual cases, roll up Household focus LSB alerts, and allow team lead assignment of cases. It also introduces productivity enhancements for Quick Clearing and for showing all transaction history, and incorporates functionality for automated data gathering from the ECIF system.

2. Scope

The goal of this project is to produce a fully functional and acceptable software product that satisfies the high-level requirements defined below.

2.1. In Scope

This project delivers enhancements to the QuickScreens NAM software module of the AML-M: AML MANTAS Analytic system (CSI App Id 34916) by introducing the following new functionality:

1. Automated Data Gathering from ECIF (Americas Data Gathering BRD #3.1.1.4; Global Technology WorkSlate #3c-2)
This functionality will allow AML users to view data related to an AML case from ECIF, as required by the AML review processes, without having to log into ECIF directly.

2. Ability to create Manual cases for CPB, IPB, IAML, and LSB (Global Technology WorkSlate #94; CR3575 or CR11147)
The following manual case types can be created:
1) Know Your Customer (KYC)
2) Suspicious
3) Subpoena
4) Senior Public Figures (SPF)
5) Non Government Organization (NGO)
6) Embassy Credit Card
7) Cash Sales Logs (CSL)
8) Payable Upon Proper ID (PUPID)

3. Rollup Household focus alerts
For the LSB business line, Household focus cases should be rolled up to Customer focus cases.

4. Show All Transaction History – Americas Data Gathering (Americas Data Gathering BRD #3.1.6; Global Technology WorkSlate #3c)
This functionality will provide a way to view all transactions associated with an alert's account's customer, not just those transactions that triggered the alert. The feature uses only MANTAS data and therefore only goes back as far as MANTAS data is available (the current month and the previous six months).

5. Quick Clearing Functionality – Americas Data Gathering (Americas Data Gathering BRD #3.1.8; Global Technology WorkSlate #3c-7)
This functionality will allow for the 'quick clearing' of cases, such as those where a similar case was recently reviewed, by identifying previously closed cases when a new case generates.

6. Team Lead Case Assignment (Global Technology WorkSlate #95; CR3580 or CR11152)
This functionality will allow Team Lead roles to assign cases to themselves without the ability to disposition cases.

7. Closeout (Global Technology WorkSlate #75; CR10984 or PID 20081110-100)
This functionality will create a link between the Closeout case and the underlying case, to track the case number from which the Closeout case stems. This is an item from the 2010 QSQ4 release being added to this Change Release.

2.2. Out of Scope

The following areas are out of scope for testing:
1. General systems efficiency testing is out of scope of UAT testing.
2. Updates or creation of procedures for AML Monitoring Operations, Production Support, Staffing, and Communications about changes within Citi are out of scope of the UAT portion of the effort.

3. System Access

3.1. System Access

Access to source systems (production access), Mantas, the QSCM UAT UI, and the QSCM back end will be arranged for the System Implementation Support Group team prior to the start of UAT activities. UAT testers should have access to all systems used for alert monitoring, as well as to ECIF.
4. UAT Strategy

4.1. Assumptions
• All entry criteria for UAT have been met.
• All UAT activities and tests are carried out on the basis of successful completion of Technology SIT testing, as well as successful implementation of the rules defined in the FRD.
• UAT should focus on end-to-end business processes from an end-user perspective to confirm usability. All UAT activities will be carried out on production data loaded into the UAT environment.

4.2. Entry Criteria
The following entry criteria must be met prior to the commencement of UAT:
• Approval of the UAT Strategy document (this document)
• Confirmed availability of UAT testing resources
• Confirmed availability of systems (Mantas, QS, ECIF)
• Confirmed performance of UAT systems at a level similar to Production
• Access to all necessary systems and applications for each respective UAT tester
• Completion of required data loads
• Coding logic changes deployed to QuickScreens
• By the end of the first week of UAT, new alerts from the prospective business groups will be identified:
  o CPB
  o IPB
  o IAML
  o LSB
  o CBNA
  o GTS

4.3. Specific Deliverables
This section describes the specific deliverables from Technology and Compliance that help facilitate the testing, and when those deliverables need to be provided; while some are required before UAT starts, others are only needed when a particular test is being executed.

Team | Business | Deliverables | Due Date
Technology | ICG/GCG | All evidence that entry criteria are met | UAT start date
Technology | ICG/GCG | Status on all issues found (on a daily basis) | Ongoing
Technology | ICG/GCG | Any scenario rerun, as necessary | Ongoing
Compliance | ICG/GCG | Approvals, as needed | Ongoing
Compliance | ICG/GCG | Attendance on project-level calls | Ongoing
Compliance | ICG/GCG | Escalation, when necessary | Ongoing
Operations | ICG/GCG | Approvals, as needed | Ongoing
Operations | ICG/GCG | Attendance on project-level calls | Ongoing

4.4. UAT Dates

Q4A Release | START DATE | END DATE
UAT Commence | 08/08/2011 | 08/12/2011
UAT Testing | 08/15/2011 | 09/07/2011
UAT Closure | 09/08/2011 | 09/09/2011
Move to Production | 09/09/2011 | 09/12/2011

Data Testing Plan | Start | End | Resource
Systems Setup Testing | 08/10/2011 | 08/12/2011 | IT
Ingested Data Testing | 08/10/2011 | 08/12/2011 | IT
UAT Test – Functionality (tests below) | 08/15/2011 | 09/05/2011 | UAT
ECIF Automated Data Gathering | 08/15/2011 | 08/19/2011 | UAT
Create Manual Cases | 08/22/2011 | 08/23/2011 | UAT
Rollup Household Focus Alerts | 08/24/2011 | 08/24/2011 | UAT
Show All Transactions | 08/25/2011 | 08/30/2011 | UAT
Quick Clearing Functionality | 08/31/2011 | 09/01/2011 | UAT
Team Lead Case Assignment | 09/02/2011 | 09/03/2011 | UAT
Closeout | 09/04/2011 | 09/05/2011 | UAT

4.5. UAT Testing Strategy
Listed below are details of the tests to be performed for NAM. The scripts will need to be updated and modified to match the specific test plan. The Test Plan is available here:
The table below summarizes the tests that should be performed as part of the UAT phase.
4.5.1. Automated Data Gathering from ECIF (Test 1)

Overview

Test # | Title | Type | Purpose | Test Applicability | Line of Business Affected
1 | Automated Data Gathering from ECIF | Quickscreens UI Functionality | Test the new feature that allows AML users to view data related to an AML case from ECIF without having to log into ECIF directly | NAM | GTS
2 | Create Manual Cases for CPB, IPB, IAML, and LSB | Quickscreens UI Functionality | Test that the QuickScreens environment can accommodate the new functionality to add the following case types: 1) Know Your Customer (KYC), 2) Suspicious, 3) Subpoena, 4) Senior Public Figures (SPF), 5) Non Government Organization (NGO), 6) Embassy Credit Card, 7) Cash Sales Logs (CSL), 8) Payable Upon Proper ID (PUPID) | NAM | CPB, IPB, IAML, LSB
3 | Rollup Household focus alerts for LSB line | MANTAS Data | Check for successful rollup of alerts to Customer focus cases | NAM | LSB
4 | Show All Transaction History – Americas Data Gathering | Quickscreens UI Functionality | Test the new feature to view all transactions associated with an alert's account's customer, not just those that triggered the alert | NAM | All
5 | Quick Clearing Functionality – Americas Data Gathering | Quickscreens UI Functionality | Test the new feature to 'quick clear' cases and identify previously closed cases when a new case generates | NAM | All
6 | Team Lead Case Assignment | Quickscreens UI Functionality | Test the new feature to allow Team Lead roles to assign cases to themselves without the ability to disposition cases | NAM | All
7 | Closeout | Quickscreens UI Functionality | Test the new "Initiate Closeout" button, which initiates the closeout process; also test the link between the Closeout case and the underlying case | NAM | CPB, IPB, IAML, LSB
Purpose
To test that the QS front end performs in the expected manner, and to ensure the new enhancements work as expected while existing functionality is not impacted by the changes.

Strategy
Create test scripts based on the existing QS application. The scripts should encompass verification of all functionality (new fields/functionality, amendments to user roles, permissions, workflow, etc.). Execute the test scripts in the UAT environment and verify that the QS interface behaves as expected.

Expected Outputs / Acceptance Criteria
• The QS user interface behaves as expected, including the requested enhancements.
• No High or Medium severity issues outstanding.
• Any outstanding Low severity issues have agreed workarounds in place.

Testing Requirements
Dependencies: Technology SIT complete on SIT data. Mantas UAT environment loaded with the required data (Mantas data pre-processing and data loading complete). Completeness of data ingestion from the source systems to Mantas.
Supporting Documentation: To enable creation of test scripts, availability of the BRD and MDD.
Inputs / Data Requirements: Sufficient data sets available to enable sample checking of customer, account, and transaction data covering the data items required for all applicable scenario types. Data feed from the source systems. Transaction details in Mantas and the source systems.
Access Requirements: Access to sample data extracts from the Mantas back-end database; to the Mantas UAT environment; to the QSCM UAT environment; and to the source systems, or to output files for the sample identified for testing.
Known Issues / Potential Feedback: Access to source systems.

Planning Considerations
Who: Project UAT team
Timing: As soon as data is available in the QS UAT environment.
Effort: 4 days
Location: Americas Hub
Linked Activities:

4.5.2. Create Manual Cases for CPB, IPB, IAML, and LSB (Test 2)

Overview

Purpose
To test that the QS front end performs in the expected manner, and to ensure the new enhancements work as expected while existing functionality is not impacted by the changes.

Strategy
Create test scripts based on the existing QS application. The scripts should encompass verification of all functionality (new fields/functionality, amendments to user roles, permissions, workflow, etc.). Execute the test scripts in the UAT environment and verify that the QS interface behaves as expected.

Expected Outputs / Acceptance Criteria
• The QS user interface behaves as expected, including the requested enhancements.
• No High or Medium severity issues outstanding.
• Any outstanding Low severity issues have agreed workarounds in place.

Testing Requirements
Dependencies: Technology SIT complete on SIT data. Mantas UAT environment loaded with the required data (Mantas data pre-processing and data loading complete). Completeness of data ingestion from the source systems to Mantas.
Supporting Documentation: To enable creation of test scripts, availability of the BRD and MDD.
Inputs / Data Requirements: Sufficient data sets available to enable sample checking of customer, account, and transaction data covering the data items required for all applicable scenario types. Data feed from the source systems. Transaction details in Mantas and the source systems.
Access Requirements: Access to sample data extracts from the Mantas back-end database; to the Mantas UAT environment; to the QSCM UAT environment; and to the source systems, or to output files for the sample identified for testing.
Known Issues / Potential Feedback: Access to source systems.

Planning Considerations
Who: Project UAT team
Timing: As soon as data is available in the QS UAT environment.
Effort: 2 days
Location: Americas Hub
Linked Activities:

4.5.3. Rollup Household focus alerts for LSB line (Test 3)

Overview

Purpose
To test that Household focus alerts roll up properly to Customer focus cases.

Strategy
Create test scripts based on the previous roll-ups.
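To make the expected rollup results easier to build for the test scripts, the rule under test can be sketched in code. This is an illustrative model only, not the MANTAS or QuickScreens implementation; the field names (business, focus, customer_id, alert_id) are assumptions for the sketch.

```python
# Illustrative sketch of the LSB rollup rule: Household focus alerts
# are attached to the Customer focus case for the same customer.
# Field names are assumed for the sketch; the real schema may differ.
def rollup_household_alerts(alerts, customer_cases):
    """Attach LSB Household focus alerts to matching Customer focus cases."""
    by_customer = {c["customer_id"]: c for c in customer_cases}
    for alert in alerts:
        # Only LSB Household focus alerts are in scope for this rollup.
        if alert["business"] == "LSB" and alert["focus"] == "Household":
            target = by_customer.get(alert["customer_id"])
            if target is not None:
                target.setdefault("rolled_up", []).append(alert["alert_id"])
    return customer_cases
```

A tester can feed known Household alerts through a model like this and compare the expected rollup against what QuickScreens displays.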
The test scripts should be organized to find cases in the database and match them to results shown in QuickScreens.

Expected Outputs / Acceptance Criteria
• No High or Medium severity issues outstanding.
• Any outstanding Low severity issues have agreed workarounds in place.

Testing Requirements
Dependencies: Technology SIT complete on SIT data. Mantas UAT environment loaded with the required data (Mantas data pre-processing and data loading complete). Completeness of data ingestion from the source systems to Mantas.
Supporting Documentation: To enable creation of test scripts, availability of the BRD and MDD.
Inputs / Data Requirements: Sufficient data sets available to enable sample checking of customer, account, and transaction data covering the data items required for all applicable scenario types. Data feed from the source systems. Transaction details in Mantas and the source systems.
Access Requirements: Access to sample data extracts from the Mantas back-end database; to the Mantas UAT environment; to the QSCM UAT environment; and to the source systems, or to output files for the sample identified for testing.
Known Issues / Potential Feedback: Access to source systems.

Planning Considerations
Who: Project UAT team
Timing: As soon as data is available in the QS UAT environment.
Effort: 1 day
Location: Americas Hub
Linked Activities:

4.5.4. Show All Transaction History – Americas Data Gathering (Test 4)

Overview

Purpose
To test that the QS front end performs in the expected manner with regard to the new screen, and to ensure the new enhancements work as expected while existing functionality is not impacted by the changes.

Strategy
Create test scripts based on the existing QS application. The scripts should encompass verification of all functionality (including checking the new hyperlink, populating the MANTAS Transaction Request search parameters and submitting, validating the data response from the All MANTAS Transaction Request search, etc.). Execute the test scripts in the UAT environment and verify that the QS interface behaves as expected.
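For building expected-result sets, the retention rule behind this screen (all of the customer's transactions, limited to the current month plus the previous six months of MANTAS data) can be sketched as follows. This is an illustrative model, not the QuickScreens implementation; the field names (customer_id, date) are assumptions.

```python
from datetime import date

# MANTAS retains the current month plus six prior months of transactions.
RETENTION_MONTHS = 6

def months_back(as_of: date, months: int) -> date:
    """First day of the month `months` before `as_of`."""
    years, month_index = divmod(as_of.month - 1 - months, 12)
    return date(as_of.year + years, month_index + 1, 1)

def all_customer_transactions(transactions, customer_id, as_of):
    """All transactions for the customer within the MANTAS retention
    window, not just those that triggered the alert."""
    cutoff = months_back(as_of, RETENTION_MONTHS)
    return [t for t in transactions
            if t["customer_id"] == customer_id and t["date"] >= cutoff]
```

A tester can run a model like this over a known transaction sample and compare the result count against the All MANTAS Transaction Request search response.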
Expected Outputs / Acceptance Criteria
• The QS user interface behaves as expected, including the requested enhancements.
• No High or Medium severity issues outstanding.
• Any outstanding Low severity issues have agreed workarounds in place.

Testing Requirements
Dependencies: Technology SIT complete on SIT data. Mantas UAT environment loaded with the required data (Mantas data pre-processing and data loading complete). Completeness of data ingestion from the source systems to Mantas.
Supporting Documentation: To enable creation of test scripts, availability of the BRD and MDD.
Inputs / Data Requirements: Sufficient data sets available to enable sample checking of customer, account, and transaction data covering the data items required for all applicable scenario types. Data feed from the source systems. Transaction details in Mantas and the source systems.
Access Requirements: Access to sample data extracts from the Mantas back-end database; to the Mantas UAT environment; to the QSCM UAT environment; and to the source systems, or to output files for the sample identified for testing.
Known Issues / Potential Feedback: Access to source systems.

Planning Considerations
Who: Project UAT team
Timing: As soon as data is available in the QS UAT environment.
Effort: 4 days
Location: Americas Hub
Linked Activities:

4.5.5. Quick Clearing Functionality – Americas Data Gathering (Test 5)

Overview

Purpose
Allows for the 'quick clearing' of cases, such as those where a similar case was recently reviewed, by identifying previously closed cases when a new case generates. When a new case is generated, any cases closed in the last 45 days on the same Focus Name or Focus ID are identified; if found, these cases are flagged on the Case Details screen.

Strategy
Create test scripts based on the existing QS application. The scripts should encompass verification of all functionality (new fields/functionality, amendments to user roles, permissions, workflow, etc.). Execute the test scripts in the UAT environment and verify that the QS interface behaves as expected.
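The 45-day matching rule described above can be sketched in code so testers know exactly which closed cases should be flagged on the Case Details screen. This is an illustrative model only; the field names (focus_id, focus_name, closed_on) are assumptions, not the real QuickScreens schema.

```python
from datetime import date, timedelta

# Quick Clearing looks back 45 days from the new case's generation date.
LOOKBACK_DAYS = 45

def quick_clear_candidates(new_case, closed_cases, today):
    """Closed cases from the last 45 days that share the new case's
    Focus Name or Focus ID; these should be flagged on Case Details."""
    cutoff = today - timedelta(days=LOOKBACK_DAYS)
    return [c for c in closed_cases
            if c["closed_on"] >= cutoff
            and (c["focus_id"] == new_case["focus_id"]
                 or c["focus_name"] == new_case["focus_name"])]
```

Test data should include a case closed just inside the window, one just outside it, and one inside the window with a different focus, so all three branches of the rule are exercised.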
Expected Outputs / Acceptance Criteria
• The QS user interface behaves as expected, including the requested enhancements.
• No High or Medium severity issues outstanding.
• Any outstanding Low severity issues have agreed workarounds in place.

Testing Requirements
Dependencies: Technology SIT complete on SIT data. Mantas UAT environment loaded with the required data (Mantas data pre-processing and data loading complete). Completeness of data ingestion from the source systems to Mantas.
Supporting Documentation: To enable creation of test scripts, availability of the BRD and MDD.
Inputs / Data Requirements: Sufficient data sets available to enable sample checking of customer, account, and transaction data covering the data items required for all applicable scenario types. Data feed from the source systems. Transaction details in Mantas and the source systems.
Access Requirements: Access to sample data extracts from the Mantas back-end database; to the Mantas UAT environment; to the QSCM UAT environment; and to the source systems, or to output files for the sample identified for testing.
Known Issues / Potential Feedback: Access to source systems.

Planning Considerations
Who: Project UAT team
Timing: As soon as data is available in the QS UAT environment.
Effort: 2 days
Location: Americas Hub
Linked Activities:

4.5.6. Team Lead Case Assignment (Test 6)

Overview

Purpose
This functionality will allow Team Lead roles to assign cases to themselves without the ability to disposition cases. Such cases must be in the Research category and in Initiated status. When a Team Lead clicks the Assign/Reassign button on the Case Details screen, they will see themselves in the "Assign to" dropdown.

Strategy
Create test scripts based on the existing QS application. The scripts should encompass verification of all functionality (new fields/functionality, amendments to user roles, permissions, workflow, etc.). Execute the test scripts in the UAT environment and verify that the QS interface behaves as expected.
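The eligibility rule stated in the Purpose above (Research category, Initiated status, assign but never disposition) can be captured in a short sketch for deriving test cases. This is an illustrative model of the requirement, not the QuickScreens role implementation; field and status names are taken from the requirement text.

```python
# Illustrative model of the Team Lead assignment rule under test.
def team_lead_can_self_assign(case):
    """A Team Lead appears in the "Assign to" dropdown only for cases
    in the Research category with Initiated status."""
    return case["category"] == "Research" and case["status"] == "Initiated"

def team_lead_can_disposition(case):
    """Per the requirement, Team Leads may assign cases to themselves
    but never disposition them, regardless of the case's state."""
    return False
```

Negative tests matter as much as positive ones here: scripts should confirm the dropdown omits the Team Lead for non-Research or non-Initiated cases, and that no disposition action is ever offered.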
Expected Outputs / Acceptance Criteria
- QS user interface behaves as expected, including the requested enhancements.
- No High or Medium severity issues outstanding.
- Any outstanding Low severity issues have agreed workarounds in place.

Testing Requirements

Dependencies
- Technology SIT complete on SIT data.
- Mantas UAT environment loaded with required data (Mantas data pre-processing and data loading complete).
- Completeness of data ingestion from source system to Mantas.

Supporting Documentation
- To enable creation of test scripts: availability of the BRD and MDD.

Inputs / Data Requirements
- Sufficient data sets available to enable sample checking of customers, accounts, and transaction data, covering the data items required for all applicable scenario types.
- Data feed from Source Systems.
- Transaction details in Mantas and source systems.

Access Requirements
- Access to sample data extracts from the Mantas back-end database.
- Access to the Mantas UAT environment.
- Access to the QSCM UAT environment.
- Access to Mantas and Source systems, or to output files for the sample identified for testing.

Known Issues / Potential Feedback
- Access to source systems.

Planning Considerations
- Who: Project UAT team
- Timing: As soon as data is available in the QS UAT environment.
- Effort: 2 days
- Location: Americas Hub

Linked Activities

4.5.7. Closeout (Test 7)

Overview

Purpose
This functionality will allow users to initiate the closeout process with a single button click.

Strategy
Create test scripts based on the existing QS application. The scripts should cover verification of all functionality (new fields/functionality, amendments to user roles, permissions and workflow, etc.). Execute the test scripts in the UAT environment and verify that the QS interface behaves as expected.

Expected Outputs / Acceptance Criteria
- QS user interface behaves as expected, including the requested enhancements.
- No High or Medium severity issues outstanding.
- Any outstanding Low severity issues have agreed workarounds in place.

Testing Requirements

Dependencies
- Technology SIT complete on SIT data.
- Mantas UAT environment loaded with required data (Mantas data pre-processing and data loading complete).
- Completeness of data ingestion from source system to Mantas.
Supporting Documentation
- To enable creation of test scripts: availability of the BRD and MDD.

Inputs / Data Requirements
- Sufficient data sets available to enable sample checking of customers, accounts, and transaction data, covering the data items required for all applicable scenario types.
- Data feed from Source Systems.
- Transaction details in Mantas and source systems.

Access Requirements
- Access to sample data extracts from the Mantas back-end database.
- Access to the Mantas UAT environment.
- Access to the QSCM UAT environment.
- Access to Mantas and Source systems, or to output files for the sample identified for testing.

Known Issues / Potential Feedback
- Access to source systems.

Planning Considerations
- Who: Project UAT team
- Timing: As soon as data is available in the QS UAT environment.
- Effort: 2 days
- Location: Americas Hub

Linked Activities

4.6 Test Script
Draft Script.xlsx

4.7 Exit Criteria
The following UAT exit criteria must be satisfied prior to obtaining UAT sign-off:
- Completed test scripts with test results
- Resolution of all 'Showstopper' and 'High' severity defects
- A list of defects outstanding after UAT, including the workaround, resolution, and resolution date
- Updated UAT RICA log
- Closure Document (template attached below)

UAT Test Closure Memo Template.doc
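The defect-related exit criteria above amount to a simple gate: no open Showstopper or High severity defects at sign-off, while Medium and Low defects may remain open provided they are documented with workarounds. A minimal sketch, assuming hypothetical field names (the actual RICA log is an Excel workbook, not code):

```python
# Illustrative sketch of the defect-severity gate for UAT sign-off.
# Field names ("severity", "status") are hypothetical, mirroring the
# RICA log columns described in this document.

def defects_allow_signoff(defects: list[dict]) -> bool:
    """Sign-off requires resolution of every 'Showstopper' and 'High'
    severity defect; Medium/Low defects may stay open with agreed
    workarounds recorded in the RICA log."""
    blocking = [d for d in defects
                if d["status"] == "Open"
                and d["severity"] in ("Showstopper", "High")]
    return len(blocking) == 0

defects = [
    {"id": 1, "severity": "High", "status": "Closed"},
    {"id": 2, "severity": "Low",  "status": "Open"},
]
print(defects_allow_signoff(defects))  # True: only a Low defect remains open
```

In practice this check is performed manually against the RICA log during the sign-off review; the sketch only restates the rule.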
5 Progress, Issue Tracking & Resolution

The Project Team should ensure that:
- A RICA log (Risks, Issues, Change requests, Actions) is maintained, providing a detailed description, severity (High, Medium, Low, Showstopper), date found, target resolution date, and the team/individual responsible for resolution.

Issues in the RICA log will be categorized and tracked as follows:

- ID: Sequential reference number.
- UAT Phase: <Comment: can category be based on Test number?> One from: Data (existing data feeds + new sources -> EDW); Data Feeds (EDW -> Mantas); Data Transfer (Mantas -> QSCM).
- Description: Issue description.
- Status: Open, Closed, On Hold.
- Technology Component: "Data Feeds", "Mantas", "Quickscreens", "Thresholds".
- Severity (to assist the testing team in understanding impact):
  - Showstopper: Critical issue that is stopping further progress; project timelines are at risk, with very severe impact on other issues and continued testing. Should be worked as the most urgent type of issue, and escalated if a swift resolution is not possible.
  - High: Severe impact on the ability to meet stated requirements; the application / data feed could not go live without resolution of this issue. UAT activities that depend on this issue will be delayed pending its resolution; other UAT activities may be affected but can continue whilst the issue is resolved.
  - Medium: High impact; the application / data feed could go live without resolution of this issue, but resolution would be a high priority post-live and/or significant re-work to processes / procedures would be required in the interim.
  - Low: Moderate impact; the application / data feed could go live without resolution of this issue. Resolution would be a medium priority post-live, and simple or no re-work to processes / procedures would resolve the issue.
- Priority (to assist the Technology team in prioritizing issue resolution work):
  - Showstopper: to be addressed as urgently as possible.
  - High: to be addressed as soon as possible.
  - Medium: to be addressed within an agreed timescale.
  - Low: "nice to have"; to be addressed as time allows.
- Owner: Business (Operational Readiness Team) owner (Testing Team Stream Leader).
- Action Plan: Record of the action to be taken to resolve the issue.
- Open Date: Date the issue was identified.
- Technology Contact: Technology team member assigned responsibility for resolution of the issue.
- Target Date: Target date for resolution, to be discussed and agreed with the Technology contact.
- Revised Target Date: If applicable.
- Resolution / Decision / Outcome: Outcome of the issue.
- Resolved By: Who the decision, resolution, or outcome was reached by.
- Date Closed: Date the issue was resolved / closed.

Attached below is a template for the RICA log.

RICA Log Template.xls

6. Roles and Responsibilities

User Acceptance Test planning work, including UAT, will be carried out in Tampa, Florida for the migration of AML Monitoring activities from Puerto Rico to the NAM Hub. Listed below are the UAT testing team and key contacts:

- Kisha Merrell, UAT Project Manager: Coordinate and manage UAT-level activities.
- David Crane, UAT Coordinator: Support the UAT team/PM.
- Steve Dayton, UAT Test Lead: Overall UAT senior project coordination.
- Alan Shen, Apps. Dev. Group Manager: Development, configuration and implementation of application (Mantas & QSCM) components; development, configuration and implementation of data (data feeds to Mantas) components; issue resolution – applications; issue resolution – data; scenarios.
- Alan Parsowith, Sr. Group Manager, Americas Hub Operations: Approver.
- Kavya Kalyana, Production Support Lead, Americas Hub: Reviewer.
- Othniel Alexandre, Global Production Support Manager: Approver.
- Dennis Carey, UAT Test Team: Writing test scripts, adding them into QC.

Abbreviation – Expanded Form
AML – Anti-Money Laundering
BAU – Business As Usual
BO XI – Business Objects version XI (11)
CSIS – Citi Systems Information Security
DIS – Data Information Specification
DW – Data Warehouse
GIW – Global Information Warehouse
FIU – Financial Investigations Unit
GTS – Global Transaction Services
ICG – Institutional Clients Group
KL – Kuala Lumpur (Asia Pacific business region)
NAM – North American business region
TRP – Targeted Review Program
UAT – User Acceptance Testing
  • 18. NAM 2011 Q4A Change Release Version: v 1.1 Date: September 21, 2011 Page 18 of 18 - Development, configuration and implementation of data (data feeds to Mantas) components. - Issueresolution –applications. - Issueresolution –data. - Scenarios. Alan Parsowith Sr. Group Manager, Americas Hub Operations Approver Kavya Kalyana Production Support Lead, Americas Hub Reviewer Othniel Alexandre Global Production Support Manager Approver Dennis Carey UAT Test Team Writingtest scripts ,addinginto QC Abbreviation Expanded Form AML Anti –Money Laundering BAU Business As Usual BO XI Business Objects version XI (11) CSIS Citi Systems Information Security DIS Data Information Specification DW Data Warehouse GIW Global Information Warehouse FIU Financial Investigations Unit GTS Global Transaction Services ICG Institutional Clients Group KL Kuala Lumpur (Asia Pacific business region) NAM North American business region TRP Targeted Review Program UAT User Acceptance Testing