Large, complex projects—those with more than 100 people and lasting more than a year—require special considerations for developing, communicating, and managing the overall QA strategy and test plans. Shaun Bradshaw provides insights he gained from a $70 million financial software implementation project comprising multiple components, including a general ledger, business intelligence platform, data warehouse, and data integration hub. Tasked with managing the entire test effort as part of the third-party validation team, Shaun acted as QA architect to create the test strategy and plan for the project. He shares the challenges he and his team had to overcome to help deliver a smooth implementation and installation. Shaun discusses his experiences aligning the QA strategy with the culture of the organization and ensuring key test and QA roles were filled with the right people. Take back new ideas and approaches you can use to tame the testing beast in your large project.
1. W7
Test Management
5/1/2013 1:45:00 PM
Taming the Beast: Test/QA on Large-Scale Projects
Presented by:
Shaun Bradshaw
Zenergy Technologies, Inc.
Brought to you by:
340 Corporate Way, Suite 300, Orange Park, FL 32073
888-268-8770 ∙ 904-278-0524 ∙ sqeinfo@sqe.com ∙ www.sqe.com
2. Shaun Bradshaw
Cofounder of Zenergy Technologies, Inc., a QA and agile solutions consulting firm
(zenergytechnologies.com), Shaun Bradshaw has spent the past fifteen years advising, teaching, and
mentoring clients to improve their QA and test processes with effective testing and test management
techniques. He is the coauthor, editor, and trainer of a suite of methodologies covering testing, test
management, and test metrics. A popular speaker at many of the major QA industry conferences, Shaun
is known for his presentations on test metrics, the S-Curve, and the Zero Bug Bounce.
3. 4/16/2013
AGENDA
• What constitutes a large-scale project?
• QA Approach
• Data Strategy
• Environments / Infrastructure
• Team Structure
• Communications
• Integration Testing
• Performance Testing
• Key Takeaways
4.
LARGE-SCALE PROJECTS
• What constitutes a “large-scale project”?
  Number of resources?
  Duration?
  Number of integrated components?
  Cost?
  Other?
LARGE-SCALE PROJECTS
• What constitutes a “large-scale project”?
  Number of resources? 100+
  Duration? 2+ years
  Number of integrated components? 6 (major)
  Cost? $70 million
  Other? Multiple external vendors operating in a strict waterfall methodology
5.
TERMINOLOGY
• System Testing
  Performed at the component level
  Includes functional, data transformation/validation, and security tests
• Integration Testing
  Performed across multiple components of the solution
  Includes integration, data quality validation, and cross-component security tests
QA APPROACH
• Align with the overall development methodology
  Don’t create “culture shock” by following a completely different test methodology
  If the project is waterfall, utilize the V-model of QA or something similar
  If the project is agile, help establish iterations that QA can work with and ensure open collaboration across the project
• Ensure the QA approach is thoroughly communicated across the project team
7.
DETERMINE THE DATA STRATEGY
• Data Acquisition
  Create test data
    From component systems
    Based on test criteria
  Pull data from production
• Ensure proper data quality
  Data meets the needs of the testing
  Data is in “proper” form
  Data masking?
  Data volume?
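The data-masking question above is worth a concrete illustration. The sketch below shows one common way to pseudonymize production data for test use: deterministic hashing of sensitive fields, so masked values stay consistent across components and integration joins still line up. The field names (`account_holder`, `tax_id`) and record layout are invented for illustration, not taken from the project.

```python
import hashlib

# Hypothetical record layout -- field names are illustrative only.
PII_FIELDS = {"account_holder", "tax_id"}

def mask_value(value: str, salt: str = "test-env") -> str:
    """Deterministically pseudonymize a sensitive value.

    Hashing (rather than random replacement) keeps masked values
    consistent across components, so cross-system joins still match.
    """
    digest = hashlib.sha256((salt + value).encode()).hexdigest()
    return "MASKED-" + digest[:10]

def mask_record(record: dict) -> dict:
    """Return a copy of the record with only the PII fields masked."""
    return {
        k: mask_value(v) if k in PII_FIELDS else v
        for k, v in record.items()
    }

prod_row = {"account_holder": "Jane Doe", "tax_id": "123-45-6789",
            "balance": "1042.17"}
test_row = mask_record(prod_row)
# Non-sensitive fields survive untouched; sensitive ones are replaced.
```

The deterministic salt is the important design choice: if each component masks production pulls independently with the same salt, the masked keys still reconcile during integration testing.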
ENVIRONMENTS/INFRASTRUCTURE
• Determine, as early as possible, what environments are available and necessary
  Component-level system test environment
  Integration test environment
  Performance test environment (possibly production if this is the first instance of the application)
  UAT environment
8.
TEAM STRUCTURE
• Key Roles
  Test Architect – Creates the overall test strategy and communicates/socializes it to the project team
  Test Manager – Manages day-to-day test activities
  Test Leads – Component-based/technology-based
  Test Analysts – Develop and execute tests
  Test Project Manager – Ensures test activities are properly scheduled
  Integration Lead – Pulls together the integration test effort
TEAM STRUCTURE
• Key Roles (continued)
  Performance Lead/Engineers – Develop the performance test plan and scenarios, and execute them
  Test Data Modeler – Determines data needs and acquires test data
  Business Leads – Provide user-type input/review of test scenarios and test cases
  Technical Leads – Assist in creation of the QA environment and data needs
  Release Manager – Helps manage all configuration and component movement
9.
COMMUNICATIONS
• The BIGGEST challenge for QA
  Utilize a 360° communication strategy
    QA approach and reasoning to upper management (early and often)
    Resource and schedule requirements to peers
    Tactics, techniques, and goals to down-line resources
  Beware of assumptions made due to lack of communication
COMMUNICATIONS
• The BIGGEST challenge for a large-scale project
  Identify the right resources to include when discussing
    QA approach / test strategy
    Data requirements / strategy
    Resources and schedules
    Test case framework
    Test phases
10.
INTEGRATION TESTING
• The 2nd BIGGEST challenge for a large-scale project
  Logistical challenges
  Political challenges
• Purpose – Demonstrates that the IT processes and systems/components built to support business processes are correctly integrated with one another
INTEGRATION TESTING
• Integration Test Approach
  Map out critical data flows between components (IDEF is a good model)
  Determine “day in the life” scenarios, ensuring that all critical data flows are covered
  Develop integration test cases using previously created system test cases to minimize rework (use a modular test framework)
  Utilize a “bottom-up” approach where possible (start with C2C scenarios, then E2E scenarios)
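The "modular test framework" idea above can be sketched in a few lines: system-test steps become reusable callables, and an integration scenario is just an ordered composition of them, so C2C scenarios extend naturally into E2E with little rework. The component names and step bodies below are hypothetical stand-ins, not the project's actual systems.

```python
# A minimal sketch of a modular test framework. Each function is a
# reusable system-test step; an integration scenario is an ordered
# list of steps sharing one context dict.

def extract_gl_entries(ctx):
    """System-test step: pull journal entries from the (simulated) GL."""
    ctx["gl_entries"] = [{"acct": "1000", "amt": 250.0}]
    return ctx

def load_into_warehouse(ctx):
    """System-test step: load the extracted entries into the (simulated) DW."""
    ctx["dw_rows"] = list(ctx["gl_entries"])
    return ctx

def run_scenario(steps, ctx=None):
    """Execute steps in order, threading shared context between them."""
    ctx = ctx or {}
    for step in steps:
        ctx = step(ctx)
    return ctx

# A component-to-component (C2C) scenario reuses the same steps that a
# later end-to-end (E2E) scenario would simply extend with more steps.
c2c_scenario = [extract_gl_entries, load_into_warehouse]
result = run_scenario(c2c_scenario)
```

The payoff is the bottom-up approach the slide describes: once each step is proven at the system-test level, integration testing is mostly a matter of sequencing existing steps and validating the hand-off points.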
11.
INTEGRATION TESTING
[Figure: IDEF0-style integration model of the solution’s critical data flows (source: Compass Integration Model A1101711 1545.vsd, author: Alan Smith). The legend (“How to read the model”) numbers the ICOM arrows by type – inputs 01–09, outputs 10–19, controls 30–39, mechanisms 40–49 – with controls grouped to remove line clutter; red lines/text indicate high-risk flows, the green line denotes reports, and the dotted line denotes PCR 33. The model maps flows among components including the GL, DI (includes COPS/RDM), DW (ODS/ODM), BI, NXG, e2, PeopleSoft, Hyperion EssBase, and Business Objects.]
INTEGRATION TESTING
• Validation Approach
  Focus on positive scenarios during integration (use system testing for most of the negative test scenarios)
  Determine the best approach given time, resources, and risk
    Validation of Execution
    Validation of Results
12.
INTEGRATION TESTING
• Validation of Execution
  Assumes detailed validation of expected results was completed during system testing and that system test scripts are being re-used as part of integration
  Generally the most efficient method of validating results for integration scripts, but it increases the risk of missing potential defects
  Best used when a component has been thoroughly system tested – it is then acceptable to simply verify that the process executed
INTEGRATION TESTING
• Validation of Results
  Doesn’t assume the correctness of results from previous testing phases
  Necessary to fully validate the process outputs at a detailed level via external calculations (i.e., a “tool,” spreadsheet, SQL query, manual calculations, etc.)
  Although dependable and low risk, this method is time/resource intensive
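Validation of Results hinges on recomputing the expected output independently of the component under test, whether in a spreadsheet, a SQL query, or code. Here is a toy sketch of the idea with invented journal-line data: a stand-in for the component's balance report is checked against a separately written debit-minus-credit roll-up.

```python
# Validation of Results, illustrated. The data and the component's
# report function are invented stand-ins; the point is that the
# expected values come from an independent calculation, not from
# trusting the system under test.

journal_lines = [
    {"account": "4000", "debit": 0.0, "credit": 500.0},
    {"account": "1000", "debit": 500.0, "credit": 0.0},
]

def component_balance_report(lines):
    """Stand-in for the output produced by the component under test."""
    return {"1000": 500.0, "4000": -500.0}

def expected_balances(lines):
    """Independent external calculation: debit minus credit per account."""
    totals = {}
    for line in lines:
        acct = line["account"]
        totals[acct] = totals.get(acct, 0.0) + line["debit"] - line["credit"]
    return totals

actual = component_balance_report(journal_lines)
assert actual == expected_balances(journal_lines), "balances diverge"
```

The same comparison could just as easily be a SQL query run against the source and target tables; what matters is that the expected side is derived without reusing the component's own logic.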
13.
INTEGRATION TESTING
• Execution
  It’s all about COMMUNICATION
  Because an individual tester may not have the requisite knowledge to validate every step of an integration scenario, “hand-offs” are necessary
    Assign a resource to manage the communication of the hand-offs (we don’t want tests sitting)
    Hold daily triage and schedule meetings
    Be flexible ☺
PERFORMANCE TESTING
• Application Performance
  Verify each component is properly tuned as early as feasible
  Align with the system test phase
• Infrastructure Performance
  Utilize an integrated environment to identify issues in the infrastructure of the solution
  Align with the integration test phase
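At its simplest, the performance testing described above boils down to driving concurrent load and collecting per-call latencies. The stdlib-only sketch below shows the skeleton; the `call_component` body is a placeholder (a `sleep` standing in for a real request), and in practice a dedicated tool would replace it.

```python
import time
from concurrent.futures import ThreadPoolExecutor

def call_component():
    """Stand-in for one transaction against the component under test.

    The sleep simulates a network/service call; a real test would issue
    an actual request here.
    """
    start = time.perf_counter()
    time.sleep(0.01)
    return time.perf_counter() - start

def load_test(users: int, calls_per_user: int):
    """Fire concurrent calls and return the sorted per-call latencies."""
    with ThreadPoolExecutor(max_workers=users) as pool:
        futures = [pool.submit(call_component)
                   for _ in range(users * calls_per_user)]
        return sorted(f.result() for f in futures)

latencies = load_test(users=5, calls_per_user=4)
# A simple percentile read-off from the sorted latencies.
p95 = latencies[int(0.95 * (len(latencies) - 1))]
```

Running this against individual components early (application performance) and against the integrated environment later (infrastructure performance) mirrors the two alignments the slide calls out.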
14.
KEY TAKEAWAYS
• Develop an overall test strategy that aligns with the development methodology and culture of the organization
• Ensure key roles are properly filled
  QA Architect
  QA PM
  Business Owners
  Data Modeler
• Determine where/how you will get your data (the data strategy)
KEY TAKEAWAYS
• Don’t rush to integration testing
• Identify key business and technical resources to assist in developing the integration scenarios
• Utilize a modular testing framework so creation of integration tests requires little additional work
• Performance test key components as early as possible, and ensure the entire solution is performance tested
• Communicate, communicate, communicate