Approvals
Enterprise Architect: [Name]    Signature:    Date:
IT Project Mgr: [Name]    Signature:    Date:
Authorizing Business Sponsor: [Name]    Signature:    Date:
Authorizing IT Sponsor: [Name]    Signature:    Date:
History
[Purpose of this Section: Record changes to this document here, making an entry for each new version.]
Version Number | Release and/or Approval Date | Author(s) | Section(s), Page(s) and Topic Revised
1.0 | 12/07/2009 | Offshore Team | Initial Version
1.1 | 12/14/2009 | Offshore Team | Updated Version
1.2 | 04/21/2010 | Offshore Team | Updated Complete Version
Date: 04/21/2010
WellPoint Inc.
Clinical Informatics Solutions
Technical Design Document [Source-LZ]
[Technical Leader- ]
Portfolio No.:    Service Request No.:    CART No.:    AOP Tracking No.:
Notice of Confidentiality and Custodial Responsibilities
This WellPoint document contains confidential information that is WellPoint’s intellectual property. As a holder of this document, you may NOT
disclose its content or any information derived from it to any person or entity outside of WellPoint.
Contents
1. Introduction
1.1. Scope
1.2. Definitions, Acronyms and Abbreviations
2. Resources Affected
2.1. External Resources
3. Application Design
3.1. Architectural and Coding References
3.2. Platform and Version Information
4. Process Flow of CIS
4.1. Server Check Process
4.2. Load Log Script
4.3. Loading Process from Source to Landing Zone
4.3.1. Weekly Full Refresh Loads
4.3.2. Weekly Incremental Loads
4.3.3. History Loads
4.3.4. Deriving Member Key Fields
4.3.5. Audit Check Process
5. Components/Objects/Modules
i. Major Component Inventory
ii. Major Component Details
6. Data Stores
6.1. Data Store Inventory
6.2. Data Store Data Elements
6.3. Data Store Descriptions
7. Implementation Activities
7.1. Packaging/Release Activity
8. Technical Assumptions
9. Reference Documents
10. Project Team Signoffs / Approvals
1. Introduction
This high-level section requires no input from the author. It is simply information to assist authors in
understanding and completing each sub-section.
General Template Information
Purpose of Document
The Functional Design Document (FDD), the Architectural Design Document and
the Infrastructure Impact Assessment (IIA) document are the predecessors to this
document. The IT Technical Lead normally completes this document with
contribution from the Solution Architect, Data Architect, and Infrastructure Build
Engineering as needed.
This document must contain all the elements needed to code a fully functional
system. This means that this document should be complete and accurate
enough so that any developer inside or outside the immediate development team
can construct all the components in order to complete the system.
The IT Technical Lead is responsible for the creation of this document. The
document is owned by IT.
Help Completing Template: pmm@wellpoint.com
Frequently Asked Questions about Completing this Document
# | Question | Answer
1 | How do I attach another doc as an object in this doc? | (In this Word doc) click Insert → Object → Create From File tab → check "Display as Icon" → Browse for file → click "OK". The file should now be on the document, but may not be fully visible. If not fully visible: click on the object once → right-click → "Format Object" → click the Layout tab → select "Tight" → hit "OK".
2 | How do I provide a hyperlink in this doc to another doc? | (In this Word doc) Insert → Hyperlink → enter the hyperlink.
3 | How do I update the Table of Contents? | Go to the Table of Contents page → position the cursor to the left of the table (not over the table) → left-click the mouse button. The entire table should be highlighted → press the F9 key on the keyboard.
1.1. Scope
Information Guide on Scope
Scope
The current scope of the design is to extract data from four operational source systems (ECC, WMDS, Trimed and Care Planner) and load it into the Landing Zone (LZ). This document is based on the Business System Document and details the solution approach for loading all the tables into the landing zone for CIS. There are 15 tables from ECC, 18 tables from WMDS, 8 tables from Trimed and 1 flat file for CM. This document gives an overview of performing Incremental Refresh (IR), Full Refresh (FR) and a one-time load of historical data.
In the scheduled process, the incremental data (containing new or changed records) from the source databases (WMDS, ECC, Trimed and Care Planner) will be loaded into the Landing Zone by means of Informatica ETL mappings.
Every week, records will be pulled from the source tables based on the load criteria and loaded into their respective tables in the Landing Zone (LZ).
The Landing Zone will be a transient staging area; no history will be maintained.
Description of Section
The following steps will be executed as part of the project scope:
• Design and develop landing zone tables as per the source table layouts and LZ table creation guidelines.
• Extract clinical data from ECC (Oracle), WMDS (Oracle), Trimed (DB2) and Careplanner (flat file) from Jan 1, 2007 forward.
• Load the history data into the landing zone tables.
• Design processes to handle complex transformations.
• Design the ETL process to load incremental data and full refresh data into the landing zone staging area.
• Set up the process to load and maintain the load log table for LZ data loads.
• Set up the process for audit balancing.
• Set up the process for WLM job scheduling.
• Define job dependencies and restartability.
• Perform post-load cleanup activities.
Note:
1.2. Definitions, Acronyms and Abbreviations
Information Guide on Definitions, Acronyms, and Abbreviations
BI Business Intelligence
BI Staging Area Staging area for Current Clinical reporting platform on a regional server (AEDW)
CDC Change Data Capture
CM Case Management
CMS Codes Management System
COB Coordination of Benefits
COBRA Consolidated Omnibus Budget Reconciliation Act
CP Care Planner
CS90 Claims System – New York
CSA Conformed Staging Area
DM Disease Management
ECC Empire Care Connects
ECR Enterprise Client Reporting
EDL Enterprise Data Layer
EDL R2 Enterprise Data Layer Release 2
EDW Enterprise Data Warehouse
EDWard Enterprise Data Warehouse and Research Depot. Earlier known as EDL R2.
ERISA Employment Retirement Income Security Act
ETL Extract Transform & Load
FR Full Refresh
HMC Health Management Corporation
IM Information Management
INC Incremental Load
INFA Informatica
IQ Information Quality
IR Implementation Readiness
LZ Landing Zone Staging Area
MBU Marketing Business Unit
NAICS_CD North American Industry Classification System Code
PCP Primary Care Physician
POC Proof of Concept
RFC Request for Change
SDLC System Development Life Cycle
SIC_CD Standard Industry Classification Code
SLA Service Level Agreement
TD Teradata database
TROOP True Out of Pocket
UAT User Acceptance Testing
WEDW West-Enterprise Data Warehouse
WGS WellPoint Group Systems
WLM Work Load Manager
WMDS WellPoint Medical Decision Support System
WPD WellPoint - Product Database
2. Resources Affected
This high-level section requires no input from the author. It is simply information to assist authors in
understanding the sub-sections.
Information Guide on Resources Affected
Resources Affected
WMDS and ECC - Oracle database, Trimed – DB2 database, Careplanner – Flat file,
Landing Zone Area (LZ) - Teradata.
Description of Section
List all other external resources (applications/entities) affected by the changes and give a brief description of how they will be affected. Do not include business entities. External resources may be hardware, a person, a program, or another system.
Note:
2.1. External Resources
Information Guide on External Resources
External Resources
Operational System
• WMDS/ECC/Trimed/Careplanner Host Systems
Teradata
• ENTDWPROD - Host
• DBA Support
Unix
• TSM
• Disk Utilization
• System Admin Support
WLM
• WLM Scheduling Support
• Service Delivery Support
Informatica
• Underlying Oracle DB
• Application Admin Support
Description of Section
These are the external entities outside of the applications that can be affected (e.g., DBA, external vendor, tape management, regulatory agency). Describe the impact in detail.
Note:
3. Application Design
This high-level section requires no input from the author. It is simply information to assist authors in
understanding the sub-sections.
Information Guide on Application Design
Application Design
This section details the technical design for the clinical subject area. The design approach is explained in detail in the sub-sections below.
Description of Section
The aim of the current application design is to load CM data from the WMDS, ECC, Trimed and Careplanner source systems into the Landing Zone (LZ). This document is based on the Business System Design Document and is used to create the Technical Detail Design Specification Document. It defines the approach for loading the Landing Zone tables for historical, incremental and full loads.
Note:
3.1. Architectural and Coding References
Information Guide on Architectural and Coding References
Architectural and Coding References
1. Informatica Developer Guideline 4.4
C:\Membership\INFA Developer Handbook V4.4.doc
2. EDL R2 Informatica Standards and Naming Conventions
http://sharepoint.auth.wellpoint.com/sites/EDL/ETL%20Design%20Workgroup/Forms/AllItems.aspx
3. ETL Guidelines document and RA Decision Points:
The following SharePoint link gives information about the latest ETL Guidelines document and Reference Architecture Decision Points:
http://sharepoint.auth.wellpoint.com/sites/EDL/ETL%20Design%20Workgroup/Forms/AllItems.aspx
4. Landing Guidelines document – LZ_CDC_guidelines_V2
This document gives general guidelines on the process to be followed for creating the Landing Zone (LZ) tables and the high-level inputs needed for Change Data Capture (CDC) attribute mapping.
http://sharepoint.auth.wellpoint.com/sites/DIRECTSOURCING/Shared%20Documents/Forms/AllItems.aspx
5. WLM Policy Document
http://sharepoint.auth.wellpoint.com/sites/DIRECTSOURCING/Shared%20Documents/Forms/AllItems.aspx
Description of Section
Identify the architectural and coding standards to be used. If a reference is to a document that is not in an enterprise-wide library, attach a link to the document, or attach the document itself in the Reference section of this document.
Note:
Please include as much information as possible.
3.2. Platform and Version Information
The table below lists the software required for the Clinical subject area.
Sl. No | Software Required | Version
1 | Teradata | 06.02.02.80
2 | Informatica | 8.6.1
3 | Unix | AIX
4 | WLM | 3.0.1
The following diagram describes the platform information.
The tables below list the server details for the Clinical Subject Area in the different environments.
Informatica Servers/Environment:
Environment Server Name IP Ports
DEV/SIT vaathmr380.corp.anthem.com 30.135.22.46 6001,55010-55100
UAT/IR vaathmr381.corp.anthem.com 30.135.22.47 6001,55000 - 55100
PROD vaathmr357.corp.anthem.com 30.130.16.150 6001,55201- 55300
Teradata Servers/Environment:
Environment Server Name Server IP Ports
DEV DWDEV 30.135.31.232 1025
SIT DWTEST1 30.135.88.22 1025
UAT/IR DWTEST3 30.135.88.22 1025
PROD ENTDWPROD 30.128.223.28 1025
Informatica Repository:
Environment Repository Name
DEV/SIT EDL_DEV_86x
UAT/IR EDL_UAT_86x
PROD EDL_PROD_86x
Informatica Folders:
Environment Folder Name
DEV/SIT ENT_CLINICAL_PROGRAMS_DEV
ENT_CLINICAL_PROGRAMS_SHARED_DEV
UAT/IR ENT_CLINICAL_PROGRAMS_UAT
ENT_CLINICAL_PROGRAMS_SHARED_UAT
PROD ENT_CLINICAL_PROGRAMS
ENT_CLINICAL_PROGRAMS_SHARED
Teradata Database:
Environment Database Names
DEV
ETL_TEMP_CPARP
QADATA_CPARP
CPARP
CPARP_ALLPHI
CPARP_NOPHI
CPARP_NOHAPHI
ETL_VIEWS_CPARP
SIT
T36_ETL_DATA_ENT
T36_ETL_TEMP_ENT
T36_QADATA_ENT
T36_UTLTY_ENT
T36_ETL_VIEWS_ENT
T36_EDW
T36_EDW_[ALL | NO | NOHA]PHI
UAT/IR
T37_ETL_DATA_ENT
T37_ETL_TEMP_ENT
T37_QADATA_ENT
T37_UTLTY_ENT
T37_ETL_VIEWS_ENT
T37_EDW
T37_EDW_[ALL | NO | NOHA]PHI
PROD ETL_DATA_V20_ENT
ETL_TEMP_V20_ENT
QADATA_V20_ENT
UTLTY_V20_ENT
ETL_VIEWS_V20_ENT
EDW_V20
CIS_HIST
EDW_[ALL | NO | NOHA]PHI
EDW_SL_[ALL | NO | NOHA]PHI
Supporting Databases:
Functionality Database Name
High level hierarchy for CIS PRJ_CPARP
Proxy ID for data loads CPARP_ETL_ID
Views ETL_VIEWS_CPARP
Macros ETLMACRO_CPARP
Stored Procedures ETLPROC_CPARP
Work objects and Error Tables needed by Teradata Utilities UTLTY_CPARP
Hierarchy node for storing CIS DDLs for use by Developers CPARP_DEVDDL
UNIX Directories:
Type Path
Scripts /u01vaathmr380/app/cis/test/scripts
Source file /u97vaathmr380/pcenterdata/test/SrcFiles
Target file /u97vaathmr380/pcenterdata/test/TgtFiles
Cache File /u97vaathmr380/pcenterdata/test/Cache
Parameter Files /u97vaathmr380/pcenterdata/test/InfaParm
4. Process Flow of CIS
4.1 Server Check Process
For WMDS and ECC:
An Informatica process is created to check for server availability. The process loads data from the ECC_GLOBAL_NAME table to a file. If the server is not up and running, Informatica will not be able to connect to the database and the load process will fail. Once the Informatica session fails, a command task is executed to make the process wait for 1200 seconds. After the 1200-second wait, the Informatica session runs again to look for an entry in the ECC_GLOBAL_NAME table. This process continues for up to 7200 seconds before the entire workflow fails. A similar process is created for the WMDS source, which loads data from the WMDS_GLOBAL_NAME table.
For Trimed:
[Workflow diagram: Start → s_CIS_TRIMED_SERVER_UP → c_SLEEP_TIMER → s_CIS_TRIMED_SERVER_UP_1 → c_SLEEP_TIMER_1 → … → c_SLEEP_TIMER_5 → s_CIS_TRIMED_SERVER_UP_6]
This Informatica process is created to check the availability of the DB2 database for Trimed. The process selects data from the SYSIBM.SYSDUMMY1 table every 20 minutes, for up to 120 minutes, as per the business terms. A command task is executed after every session task to put the process to sleep for 1200 seconds.
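A minimal shell sketch of the retry logic described above, assuming placeholder connection details (the actual workflows implement this loop with Informatica session and command tasks; the db2 probe below stands in for the session that reads SYSIBM.SYSDUMMY1 and assumes an existing database connection):

#!/bin/sh
# Hypothetical sketch of the server availability check: probe the source,
# sleep 1200 seconds on failure, give up after 7200 seconds total.

SLEEP_SECS=1200
MAX_SECS=7200
elapsed=0

while [ $elapsed -lt $MAX_SECS ]; do
    # Placeholder probe; the real process is an Informatica session that
    # reads SYSIBM.SYSDUMMY1 (Trimed) or the *_GLOBAL_NAME tables.
    if db2 "SELECT 1 FROM SYSIBM.SYSDUMMY1" > /dev/null 2>&1; then
        echo "Server is up; continuing the workflow."
        exit 0
    fi
    echo "Server not reachable; sleeping ${SLEEP_SECS}s."
    sleep $SLEEP_SECS
    elapsed=`expr $elapsed + $SLEEP_SECS`
done

echo "Server still down after ${MAX_SECS}s; failing the workflow." >&2
exit 1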
4.2 Load Log Script
Insertion of the Load Log Key:
This workflow runs command tasks that invoke BTEQ scripts to insert the load log key into the LZ_LOAD_LOG table, update the parameter file values, update the present date parameter, and verify whether the parameter file was updated properly.
[Embedded files: bteq_LZ_load_log_ins_load.sh, table_Strucure.txt, Sample_Parameter.txt]
Contents of the file:
This script checks whether any load log key exists with a load end date-time of 8888-12-31. If it finds a load log key value with a load end date-time of 8888-12-31, it errors out with return code 100. If it does not find such a record, it creates a new load log key entry based on the information present in the parameter file.
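A minimal BTEQ sketch of this check, assuming placeholder logon details, database and column names (the real script is bteq_LZ_load_log_ins_load.sh):

#!/bin/sh
# Hypothetical sketch: fail with 100 if an open load log entry exists,
# otherwise insert a new entry from the parameter-file values.
bteq <<EOF
.LOGON tdpid/etl_user,password

/* An unfinished entry still carries the high date as its end time. */
SELECT LOAD_LOG_KEY
FROM   ETL_DATA.LZ_LOAD_LOG
WHERE  LOAD_END_DTTM = TIMESTAMP '8888-12-31 00:00:00';
.IF ACTIVITYCOUNT > 0 THEN .QUIT 100

/* Otherwise create the next key (column names are placeholders). */
INSERT INTO ETL_DATA.LZ_LOAD_LOG
SELECT COALESCE(MAX(LOAD_LOG_KEY), 0) + 1, 'CLINICAL', 'wkf_CIS_LZ_LOAD',
       CURRENT_TIMESTAMP(0), TIMESTAMP '8888-12-31 00:00:00', 'N'
FROM   ETL_DATA.LZ_LOAD_LOG;
.QUIT 0
EOF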
Fetching and Updating the Load Log Key Value:
After the load log key value is successfully inserted into the LZ_LOAD_LOG table, the BTEQ script fetches the current load log key value from the load log table and updates the mapping parameter file, replacing the previous load log key value in the FR and INC parameter files.
[Embedded files: bteq_LZ_load_log_key_parm_update.sh, Sample_Parameter_file_fr.txt, Sample_Parameter_file_inc.txt]
1. The script checks whether any other instance of the program is running. If so, it displays a message that an instance of the same program is already running and exits with exit code 91.
2. The script checks whether an old bteq_outfile is present. If it exists, it removes that output file.
3. The script also checks that valid parameters are passed. If they are not, it displays a message that an invalid parameter was passed and prompts for the full script name, the mapping parameter file and the logon parameter file.
4. The script checks whether the load log values have been fetched properly; if not, it displays a message that there was no valid load log entry in the table and exits with exit code 100. After a successful run, the script removes the temp files and captures any errors that are present.
5. The script fetches the current load log key value from the load log table and updates the mapping parameter file, replacing the previous load log key value in the parameter files.
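A minimal shell sketch of steps 4–5, assuming the key has already been exported to a temp file by the preceding BTEQ step and that the parameter is named $$MAPP_LOAD_LOG_KEY (a hypothetical name following the document's $$MAPP_ convention):

#!/bin/sh
# Hypothetical sketch of the parameter-file update in steps 4-5.
PARM_FILE=$1                    # e.g. Sample_Parameter_file_fr.txt
OUTFILE=/tmp/bteq_outfile       # key exported here by the preceding BTEQ step

if [ ! -s "$OUTFILE" ]; then
    echo "No valid load log entry found in the table."
    exit 100
fi
NEW_KEY=`cat $OUTFILE`

# Replace the previous key, e.g. $$MAPP_LOAD_LOG_KEY=123 -> new key.
sed 's/^\$\$MAPP_LOAD_LOG_KEY=.*/$$MAPP_LOAD_LOG_KEY='"$NEW_KEY"'/' \
    "$PARM_FILE" > "$PARM_FILE.tmp" && mv "$PARM_FILE.tmp" "$PARM_FILE"

rm -f "$OUTFILE"                # post-run cleanup of temp files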
Fetching and Updating the Present Run Date Value:
After the load log key value in the mapping parameter file is updated successfully, the BTEQ script fetches and updates the present run date in the mapping parameter file with the current date, replacing the previous present run date in the FR and INC parameter files.
[Embedded files: bteq_update_prsnt_date.sh, Sample_Parameter_file_fr.txt, Sample_Parameter_file_inc.txt]
Contents of the file:
1. The script checks whether any other instance of the program is running. If so, it displays a message that an instance of the same program is already running and exits with exit code 91.
2. The script checks whether an old bteq_outfile is present. If it exists, it removes that output file.
3. The script also checks that valid parameters are passed. If they are not, it displays a message that an invalid parameter was passed and prompts for the full script name, the mapping parameter file and the logon parameter file.
4. The script fetches and updates the present run date in the mapping parameter file with the current date, replacing the previous present run date in the FR and INC parameter files.
Validating the Load Log Key Value:
This script checks whether the inserted load log value in the load log table is the same as the value in the parameter file. If the two values are equal, it displays a message that the load log key value has been updated correctly in the parameter file, and the loads that follow proceed. If the values are not equal, it displays a message that the verification has failed and exits with exit code 9, and the loads that follow do not proceed.
It checks the parameters for both the FR and INC loads.
[Embedded file: bteq_load_log_verify.sh]
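A minimal shell sketch of this verification, assuming placeholder logon details and the same hypothetical $$MAPP_LOAD_LOG_KEY parameter name (the real script is bteq_load_log_verify.sh; parsing of BTEQ output is simplified here):

#!/bin/sh
# Hypothetical sketch: compare the key in the parameter file with the
# open key in LZ_LOAD_LOG; exit 9 on mismatch so later loads do not run.
PARM_FILE=$1

FILE_KEY=$(grep '^\$\$MAPP_LOAD_LOG_KEY=' "$PARM_FILE" | cut -d= -f2)
TABLE_KEY=$(bteq 2>/dev/null <<EOF | awk '$1 == "KEYVAL" {print $2}' | tail -1
.LOGON tdpid/etl_user,password
SELECT 'KEYVAL', MAX(LOAD_LOG_KEY) FROM ETL_DATA.LZ_LOAD_LOG
WHERE  PUBLISH_IND = 'N';
.QUIT 0
EOF
)

if [ "$FILE_KEY" = "$TABLE_KEY" ]; then
    echo "Load log key value updated correctly; loads will proceed."
    exit 0
else
    echo "Load log key verification failed."
    exit 9
fi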
4.3 Loading Process from Source to Landing Zone
Once the load log key is created in the LZ load log table, Informatica workflows are executed to load the data from the source to the Landing Zone. As per EDW standards, all mappings from source to Landing Zone set default values for nulls or blanks present in char, varchar, number and date fields.
The following are the types of load from source to LZ (a sketch of the generic delete step follows this list):
1) The first type is the weekly full refresh load, used for tables with low volume counts. The full refresh tables are truncated and reloaded. Every table has a command session before it which runs a generic delete script that deletes the information from that particular table.
2) The second type is the incremental load, which is based on the 'Last Run Date'. All records that have been updated or inserted in the source after the last run date are fetched and loaded into the Landing Zone tables. As mentioned above, every table has a command session before it which runs the same generic delete script.
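A minimal BTEQ sketch of that generic delete step, assuming placeholder logon details and database name:

#!/bin/sh
# Hypothetical sketch of the generic delete script run by the command
# session before each table load. The table name is passed as an argument.
TABLE=$1

bteq <<EOF
.LOGON tdpid/etl_user,password
DELETE FROM ETL_DATA.$TABLE ALL;
.IF ERRORCODE <> 0 THEN .QUIT 8
.QUIT 0
EOF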
4.3.1 Weekly Full Refresh Loads
An example of the full refresh mapping (m_CLINICAL_LZ_TRIMED_TMDTPATIENT) is shown pictorially above. At a high level, the mapping reads the data from the source database and loads it into the Landing Zone tables (LZ_TRIMED and LZ_CP) in Teradata. The load log and audit processes are followed in each and every mapping to get the source record count and insert it into AUDT_STTSTC for post-load verification.
[Embedded file: Sample_Parameter.txt]
Mapping Description:
ID | Transformation Name | Component Type | Description
1 | SQ_CME_CASE_MANAGEMENT_EPISODE | Source Definition | Source qualifier for the table CME_CASE_MANAGEMENT_EPISODE from ECC. This is a straight load of the source data set to the target table.
2 | exp_DEFAULT_CONVERSION | Expression | Reusable expression used to convert null, blank and N/A values from the source systems to default values in the landing zone tables, following EDW standards.
3 | exp_TO_TARGET_INP_AUDIT | Expression | Passes values to the target table and passes input values to the audit mapplet.
4 | mplt_LOAD_AUDIT_STTSTC | Mapplet | Inserts into AUDT_STTSTC the initial count loaded into the target table.
5 | LZ_ECC_CME_CM_EPISODE | Target Definition | Truncate and load. Populates the landing zone table with the most recent data.
6 | AUDT_STTSTC | Audit Table | A row with the source count is appended in each mapping, based on the table name.
Mapplet Description:
ID | Transformation Name | Component Type | Description
1 | agg_RECORD_COUNT | Aggregator | Produces a row with the source count, appended in each mapping based on the table name.
2 | exp_CALL_AUDT_BLNCG_RULE | Expression | Calls the lookup for the AUDT_BLNCG_RULE id based on table_nm and sor_cd; the AUDT_BLNCG_RULE id cannot be NULL.
3 | lkp_AUDT_BLNCG_RULE | Lookup | Lookup transformation used to get the AUDT_RULE_ID.
4 | mplt_OUT | Mapplet Output | Mapplet output from which the complete mapplet data is accumulated.
4.3.2 Weekly Incremental Loads
An example of the incremental process is shown pictorially above. At a high level, the mapping reads the data from the Oracle database for the last three days, based on the create date and update date fields, and loads it into the Landing Zone table LZ_ECC_COI_CON_ISSUE in Teradata. The load log and audit processes are followed in each and every mapping to get the source record count and write it to AUDT_STTSTC for post-load verification.
Mapping Description:
ID | Transformation Name | Component Type | Description
1 | SQ_COI_CON_ISSUE | Source Definition | Source qualifier for the table COI_CON_ISSUE from ECC. This is a straight load of the source data set to the target table.
2 | exp_DEFAULT_CONVERSION | Expression | Reusable expression used to convert null, blank and N/A values from the source systems to default values in the landing zone tables, following EDW standards.
3 | exp_TO_TARGET_INP_AUDIT | Expression | Passes values to the target table and passes input values to the audit mapplet.
4 | mplt_LOAD_AUDIT_STTSTC | Mapplet | Inserts into AUDT_STTSTC the initial count loaded into the target table.
5 | LZ_ECC_COI_CON_ISSUE | Target Definition | Truncate and load. Populates the landing zone table with the most recent data.
6 | AUDT_STTSTC | Audit Table | A row with the source count is appended in each mapping, based on the table name.
Mapplet Description:
ID | Transformation Name | Component Type | Description
1 | agg_RECORD_COUNT | Aggregator | Produces a row with the source count, appended in each mapping based on the table name.
2 | exp_CALL_AUDT_BLNCG_RULE | Expression | Calls the lookup for the AUDT_BLNCG_RULE id based on table_nm and sor_cd; the AUDT_BLNCG_RULE id cannot be NULL.
3 | lkp_AUDT_BLNCG_RULE | Lookup | Lookup transformation used to get the AUDT_RULE_ID.
4 | mplt_OUT | Mapplet Output | Mapplet output from which the complete mapplet data is accumulated.
The pseudo code for the source filter can be as follows:
WHERE
(
    (
        TRUNC(COI_CON_ISSUE.COI_CREATE_DATE) >=
            TRUNC(TO_DATE(TO_CHAR('$$MAPP_LZ_LAST_RUN_DATE'), 'MM-DD-YYYY HH24:MI:SS'))
        AND TRUNC(COI_CON_ISSUE.COI_CREATE_DATE) <
            TRUNC(TO_DATE(TO_CHAR('$$MAPP_LZ_LAST_PRSNT_DATE'), 'MM-DD-YYYY HH24:MI:SS'))
    )
    OR
    (
        TRUNC(COI_CON_ISSUE.COI_LAST_UPDATE_DATE) >=
            TRUNC(TO_DATE(TO_CHAR('$$MAPP_LZ_LAST_RUN_DATE'), 'MM-DD-YYYY HH24:MI:SS'))
        AND TRUNC(COI_CON_ISSUE.COI_LAST_UPDATE_DATE) <
            TRUNC(TO_DATE(TO_CHAR('$$MAPP_LZ_LAST_PRSNT_DATE'), 'MM-DD-YYYY HH24:MI:SS'))
    )
)
4.3.3 History Loads
The history load applies only to tables identified as incremental tables; the data for these tables is brought to the Landing Zone based on the category they fall into. For example:
select /*+ parallel(a,4) parallel(b,4) */
    col1,
    col2,
    ----------
    ----------
FROM
WHERE
    TAU_TREATMENT_AUTHORIZATION.TAU_MEM_UID = MEM_MEMBER.MEM_UID
    AND trunc(TAU_TREATMENT_AUTHORIZATION.TAU_CREATE_DATE) >=
        trunc(to_date(to_char('$$MAPP_HIST_START_DATE'), 'mm-dd-yyyy hh24:mi:ss'))
    AND trunc(TAU_TREATMENT_AUTHORIZATION.TAU_CREATE_DATE) <
        trunc(to_date(to_char('$$MAPP_HIST_END_DATE'), 'mm-dd-yyyy hh24:mi:ss'))
Below are the tables for which history data starting from Jan 01, 2007, matched on TAU_TREATMENT_AUTHORIZATION.TAU_UID, is brought down to the Landing Zone:
ECC TABLE_NAME | WMDS TABLE_NAME
MNO_MEM_NOTE | COI_CON_ISSUE
COI_CON_ISSUE | MEM_MEMBER
MEM_MEMBER | MNO_MEM_NOTE
MPG_MEM_PLAN_GROUP | TAU_TREATMENT_AUTHORIZATION
TAU_TREATMENT_AUTHORIZATION | MPG_MEM_PLAN_GROUP
(none) | PVD_PROVIDER
When the volume of data in a source table is very large, loading all the records in one pass takes too long. Such very large tables are therefore split into multiple stages and loaded into the Landing Zone, with each stage holding the data for a particular period based on the TAU_TREATMENT_AUTHORIZATION table create_date (say, 6 or 3 months). The Landing Zone table is loaded completely after all the stage loads finish and is then used as a single table for further processing. A sketch of staging by date window follows.
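A minimal shell sketch of staging a history load by date window, reusing the $$MAPP_HIST_START_DATE/$$MAPP_HIST_END_DATE parameters from the pseudo code above (the parameter file name, workflow name, folder and pmcmd connection values are placeholders):

#!/bin/sh
# Hypothetical sketch: run the history workflow once per date window by
# rewriting the history parameters in the mapping parameter file.
PARM_FILE=Sample_Parameter_file_hist.txt   # placeholder file name

run_stage() {
    START=$1; END=$2
    sed -e 's/^\$\$MAPP_HIST_START_DATE=.*/$$MAPP_HIST_START_DATE='"$START"'/' \
        -e 's/^\$\$MAPP_HIST_END_DATE=.*/$$MAPP_HIST_END_DATE='"$END"'/' \
        "$PARM_FILE" > "$PARM_FILE.tmp" && mv "$PARM_FILE.tmp" "$PARM_FILE"
    pmcmd startworkflow -sv INT_SVC -d DOMAIN -uv PM_USER -pv PM_PWD \
        -f ENT_CLINICAL_PROGRAMS -wait wkf_CIS_LZ_HIST || exit 1
}

# Six-month stages starting Jan 1, 2007; continue up to the current date.
run_stage "01-01-2007 00:00:00" "07-01-2007 00:00:00"
run_stage "07-01-2007 00:00:00" "01-01-2008 00:00:00"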
4.3.4 Deriving Member Key Fields
[Embedded files: Incremental.doc, History.doc]
WMDS.MEM_MEMBER table:
WMDS.MEM_MEMBER needs to be joined with the WMDS.TAU_TREATMENT_AUTHORIZATION table to get the matching MEM_ID for each TAU_UID. The information inside the 'Member id' needs to be split and decoded in an expression to get the 'Subscriber ID', 'Source Code', 'Member Sequence Number', 'Member Source Code' and 'Source Group Number'.
The pseudo code (LZ_WMDS_MEM_MEMBER) used in the source qualifier can be found in the Incremental.doc or History.doc documents, in the 'Deriving Member Key fields' section.
Expression to Derive fields for MBR_KEY:
Ports Expression
SUBSCRIBER_ID decode (TRUE,
instr(MEM_ID, 'DENWGS') != 0 OR
instr(MEM_ID, 'DENINT') != 0 OR
instr(MEM_ID, 'WGS2BCC') != 0 OR
instr(MEM_ID, 'WGS2CO') != 0 OR
instr(MEM_ID, 'WGS2GA') != 0 OR
instr(MEM_ID, 'WGS2MO') != 0 OR
instr(MEM_ID, 'WGS2NV') != 0 OR
instr(MEM_ID, 'WGS2SSP') != 0 OR
instr(MEM_ID, 'WGS2UNI') != 0 OR
instr(MEM_ID, 'WGS2WI') != 0 OR
instr(MEM_ID, 'WGSMPD') != 0 OR
instr(MEM_ID, 'WBMO') != 0 OR
instr(MEM_ID, 'CR') != 0 OR
instr(MEM_ID, 'AFEP') != 0 OR
instr(MEM_ID, 'HLADV') != 0 OR
instr(MEM_ID, 'HLHMO') != 0 OR
instr(MEM_ID, 'HLPPO') != 0 OR
instr(MEM_ID, 'NA') != 0 OR
instr(MEM_ID, 'WBGA') != 0 OR
instr(MEM_ID, 'UNIBOR') != 0 OR
instr(MEM_ID, 'UNISHBP') != 0 OR
instr(MEM_ID, 'DENSTAR') != 0 OR
instr(MEM_ID, 'STAR') != 0 OR
instr(MEM_ID, 'MTRKUNI') != 0 OR
instr(MEM_ID, 'WGS13') != 0,
SUBSTR((RTRIM(LTRIM(MEM_ID))),1,9),
instr(MEM_ID, 'WBWI') != 0,
SUBSTR((RTRIM(LTRIM(MEM_ID))),4,9),
instr(MEM_ID, 'D950') != 0,
SUBSTR((RTRIM(LTRIM(MEM_ID))),1,12),
'UNK')
MEMBER_SEQ_NBR decode (TRUE,
instr(MEM_ID, 'CR') != 0 OR
instr(MEM_ID, 'AFEP') != 0 OR
instr(MEM_ID, 'UNIBOR') != 0 OR
instr(MEM_ID, 'UNISHBP') != 0,
to_char(SUBSTR((RTRIM(LTRIM(MEM_ID))),10,2)),
instr(MEM_ID, 'WBGA') != 0 OR
instr(MEM_ID, 'DENSTAR') != 0 OR
instr(MEM_ID, 'STAR') != 0 OR
instr(MEM_ID, 'WGS13') != 0,
to_char((SUBSTR((RTRIM(LTRIM(MEM_ID))),11,2))),
instr(MEM_ID, 'NA') != 0,
to_char((SUBSTR((RTRIM(LTRIM(MEM_ID))),15,2))),
'UNK')
MBR_CD decode (TRUE,
instr(MEM_ID, 'DENWGS') != 0 OR
instr(MEM_ID, 'WGS2BCC') != 0 OR
instr(MEM_ID, 'WGS2CO') != 0 OR
instr(MEM_ID, 'WGS2GA') != 0 OR
instr(MEM_ID, 'WGS2MO') != 0 OR
instr(MEM_ID, 'WGS2NV') != 0 OR
instr(MEM_ID, 'WGS2SSP') != 0 OR
instr(MEM_ID, 'WGS2UNI') != 0 OR
instr(MEM_ID, 'WGS2WI') != 0 OR
instr(MEM_ID, 'WGSMPD') != 0 OR
instr(MEM_ID, 'WBMO') != 0 OR
instr(MEM_ID, 'HLADV') != 0 OR
instr(MEM_ID, 'HLHMO') != 0 OR
instr(MEM_ID, 'HLPPO') != 0 OR
instr(MEM_ID, 'MTRKUNI') != 0,
SUBSTR((RTRIM(LTRIM(MEM_ID))),11,2),
instr(MEM_ID, 'WBWI') != 0 OR
instr(MEM_ID, 'D950') != 0,
SUBSTR((RTRIM(LTRIM(MEM_ID))),13,2),
'UNK')
MBR_SOR_CD decode (TRUE,
rtrim(substr(MEM_ID,instr(MEM_ID, 'CR'))) ='CR','823',
instr(MEM_ID, 'AFEP') != 0,'FEP',
(
instr(MEM_ID, 'NA')!=0
AND
is_spaces(substr(MEM_ID,instr(MEM_ID,'NA')-1,1))
), '824',
instr(MEM_ID, 'DEN') != 0,'NA',
instr(MEM_ID, 'STAR') != 0,'815',
instr(MEM_ID, 'WGS13') != 0,'NA',
instr(MEM_ID, 'WGS') != 0,'808',
'NA')
SRC_GRP_NBR 'UNK'
ECC.MEM_MEMBER table:
ECC.MEM_MEMBER needs to be joined with ECC.TAU_TREATMENT_AUTHORIZATION table
to get the matching MEM_ID for each TAU_UID. The information inside the ‘Member id’ needs to
be split and decoded in an expression to get the ‘Subscriber ID’, ‘Source Code’, ‘Member
Sequence Number’, ’Member Source Code’ and ‘Source Group Number’.
The pseudo code (LZ_ECC_MEM_MEMBER) used in the source qualifier can be found in the Incremental.doc or History.doc documents, in the 'Deriving Member Key fields' section.
Expression to Derive fields for MBR_KEY:
Ports Expression
SUBSCRIBER_ID DECODE(TRUE,
INSTR(RTRIM(LTRIM(MEM_ID)),'-')!=0,
SUBSTR(RTRIM(LTRIM(MEM_ID)),1,INSTR(MEM_ID,'-')-1),
MEM_ID)
MEMBER_SEQ_NBR DECODE(TRUE,
INSTR(RTRIM(LTRIM(MEM_ID)),'-')!=0,
SUBSTR(RTRIM(LTRIM(MEM_ID)),(INSTR(MEM_ID,'-')+1),2),
'NA')
MBR_SOR_CD '809'
TRIMED.OPRTPRSNCVRD table:
The information inside 'OPRTPRSNCVRD.I_COVD_PRSN' needs to be split and decoded in an expression to get the 'Subscriber ID', 'Member Sequence Number', 'Member Source Code' and 'Source Group Number'.
The pseudo code (LZ_TRIMED_OPRTPRSNCVRD) used in the source qualifier can be found in the Incremental.doc or History.doc documents, in the 'Deriving Member Key fields' section.
Expression to Derive fields for MBR_KEY:
Ports Expression
SUBSCRIBER_ID decode (TRUE,
instr(C_OWNER_CAT, 'CORP') != 0 OR
instr(C_OWNER_CAT, 'HMORIC') != 0 OR
instr(C_OWNER_CAT, 'FEP') != 0,
SUBSTR((RTRIM(LTRIM(I_COVD_PRSN))),1,9),
'UNK')
MEMBER_SEQ_NBR decode (TRUE,
instr(C_OWNER_CAT,'CORP') != 0 OR
instr(C_OWNER_CAT,'HMORIC') != 0 OR
instr(C_OWNER_CAT,'FEP') != 0,
to_char(SUBSTR((RTRIM(LTRIM(I_COVD_PRSN))),10,2)),
'UNK')
MBR_SOR_CD decode (TRUE,
instr(C_OWNER_CAT, 'HMORIC') != 0,'868',
instr(C_OWNER_CAT, 'CORP') != 0,'869',
instr(C_OWNER_CAT, 'FEP') != 0,'888',
'NA')
SRC_GRP_NBR 'UNK'
CarePlanner.CASE_EVNT Table
The information inside 'CASE_EVNT.PAT_ID' needs to be split and decoded in an expression to get the 'Subscriber ID', 'Member Sequence Number', 'Member Source Code', 'Source Group Number' and 'Source Member Code'.
Expression to Derive fields for MBR_KEY:
Ports Expression
SUBSCRIBER_ID decode (TRUE,
instr(PAT_ID, 'ACES') != 0,
SUBSTR((RTRIM(LTRIM(PAT_ID))),1,10),
'UNK')
MEMBER_SEQ_NBR decode (TRUE,
instr(PAT_ID, 'ACES') != 0,
to_char((SUBSTR((RTRIM(LTRIM(PAT_ID))),12,1))),
'UNK')
MBR_SOR_CD '822'
SRC_GRP_NBR 'UNK'
SRC_MBR_CD 'UNK'
Default Values:
Mapping parameters are defined and utilized to assign the default values appropriately. A sample parameter definition and a sample parameter file are attached below.
[Embedded file: Sample_Parameter.txt]
The pseudo expression templates for replacing null values with defaults, based on data type, can be summarized as follows:
Data Type | Default Value(s) | Pseudo Expression
CHAR, VARCHAR | UNK, NA | DECODE(TRUE, ISNULL(IN_STRING_1), to_char($$MAPP_DEFAULT_STRING_UNK), IS_SPACES(IN_STRING_1), to_char($$MAPP_DEFAULT_STRING_UNK), IN_STRING_1 = '', to_char($$MAPP_DEFAULT_STRING_UNK), IN_STRING_1)
INTEGER, DECIMAL | 0 | DECODE(TRUE, ISNULL(IN_NUMBER_1), $$MAPP_DEFAULT_INTEGER, IN_NUMBER_1)
DATE | 8888-12-31 00:00:00 | DECODE(TRUE, ISNULL(IN_DATE_1), ROUND(to_date(to_char($$MAPP_DEFAULT_HIGH_DATE))), IN_DATE_1)
4.3.5 Audit Check Process
1) After the load process is completed for the Landing Zone tables, a BTEQ script is run to update the load statistics of the Landing Zone tables in the audit table AUDT_STTSTC.
[Embedded file: bteq_LZ_AUDT_STAT_LOAD.sh]
The row count is updated for the load log key having the latest 'Load start date time', based on the 'source code', 'subject area', 'workflow name' and 'publish indicator = N'. This is loaded into a GTT (Global Temporary Table), and from the GTT the data is loaded into AUDT_STTSTC.
When zero rows are processed from the source (i.e., when the source has no rows matching the fetch criteria), the source record count is not inserted into the audit table. The target record count (the LZ target table count) is inserted with a row count of zero.
2) The record variance of the source and target columns is aggregated and updated into the GTT table. This is then loaded back into the AUDT_BLNC_STTSTC table.
[Embedded file: bteq_LZ_AUDT_BLNC_STAT_LOAD.sh]
3) This BTEQ checks whether the differences in counts and sums are within the allowed variance quantity for all the tables in a particular subject area. If the variance is more than the accepted value in the rule table, the subsequent process is not executed.
[Embedded file: bteq_LZ_AUDT_BAL_RCD_CNT_VRNC.sh]
4) If the above process succeeds, a BTEQ script is executed to update the load end time in the load log table.
[Embedded file: bteq_LZ_load_log_end_dtm_upd_load.sh]
This updates the load end time based on the 'source code', 'subject area', 'workflow name' and publish_ind = 'N'.
5) The final BTEQ updates the publish indicator to Y in the load log table.
[Embedded file: bteq_LZ_load_log_pub_ind_upd_load.sh]
In case of any count variances in the audit, the audit log entries need to be deleted manually by production support personnel, and the jobs need to be restarted to fix the problem.
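A minimal BTEQ sketch of the variance check in step 3, assuming placeholder logon details and column names inferred from the description (the real script is bteq_LZ_AUDT_BAL_RCD_CNT_VRNC.sh):

#!/bin/sh
# Hypothetical sketch: fail when any table's source/target record count
# variance exceeds the allowed value in the balancing rule table.
bteq <<EOF
.LOGON tdpid/etl_user,password

SELECT s.TABLE_NM
FROM   ETL_DATA.AUDT_BLNC_STTSTC s
JOIN   ETL_DATA.AUDT_BLNCG_RULE  r
ON     r.AUDT_RULE_ID = s.AUDT_RULE_ID
WHERE  s.SUBJ_AREA_NM = 'CLINICAL'
AND    ABS(s.SRC_RCD_CNT - s.TGT_RCD_CNT) > r.ALWD_VRNC_QTY;

/* Any row returned means a table is out of tolerance: stop the stream. */
.IF ACTIVITYCOUNT > 0 THEN .QUIT 8
.QUIT 0
EOF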
The following link gives the Audit Balancing tables structure DDL:
http://sharepoint.auth.wellpoint.com/sites/esppm/edlr2/Executing%20Phase%20Documents/DDL/dev/IQ_cut1_00.ddl
A pictorial representation of the above steps:
5. Components/Objects/Modules
This section provides detailed information on each component and object in the Clinical subject area for the WMDS/ECC source systems.
i. Major Component Inventory
SL No. Component Layer
1 Trigger file UNIX scripting
2 Load log keys Teradata
3 Source to Landing Zone Informatica
4 Audit Balancing Teradata/Informatica
ii. Major Component Details
This section provides detailed information on the sources and the ETL process that loads them into the Landing Zone.
Source Data Extraction Process
An Informatica process is created to check for server availability. The process loads data from the TRIMED_GLOBAL_NAME table to a file. If the server is not up and running, Informatica will not be able to connect to the database and the load process will fail. Once the Informatica session fails, a command task is executed to make the process wait for 1200 seconds. After the 1200-second wait, the Informatica session runs again to look for an entry in the TRIMED_GLOBAL_NAME table. This process continues for up to 7200 seconds before the entire workflow fails.
Source:
WMDS Oracle Database
ECC Oracle Database
TRIMED DB2 Database
Careplanner Flat File
Source Table Details
Serial No. | Source | FR/INC | Source Table Name | Landing Zone Target Table Name
1 | WMDS | FR | CDA_CME_DIAGNOSIS | LZ_WMDS_CDA_CME_DIAGNOSIS
2 | WMDS | FR | CMA_CME_MNGR_ASGNMNT | LZ_WMDS_CMA_CME_MNGR_ASGNMNT
3 | WMDS | FR | CME_CM_EPISODE | LZ_WMDS_CME_CM_EPISODE
4 | WMDS | FR | CMG_CM_GOAL | LZ_WMDS_CMG_CM_GOAL
5 | WMDS | FR | CMI_CM_ISSUE | LZ_WMDS_CMI_CM_ISSUE
6 | WMDS | FR | CMV_CM_INTERVENTION | LZ_WMDS_CMV_CM_INTERVENTION
7 | WMDS | INC | COI_CON_ISSUE | LZ_WMDS_COI_CON_ISSUE
8 | WMDS | FR | DXC_DIAGNOSIS_CODE | LZ_WMDS_DXC_DIAGNOSIS_CODE
9 | WMDS | INC | MEM_MEMBER | LZ_WMDS_MEM_MEMBER
10 | WMDS | INC | MNO_MEM_NOTE | LZ_WMDS_MNO_MEM_NOTE
11 | WMDS | FR | ORG_ORGANIZATION | LZ_WMDS_ORG_ORGANIZATION
12 | WMDS | FR | SRP_SBR_RESPONSE | LZ_WMDS_SRP_SBR_RESPONSE
13 | WMDS | FR | STF_STAFF | LZ_WMDS_STF_STAFF
14 | WMDS | INC | TAU_TREATMENT_AUTHRZN | LZ_WMDS_TAU_TREATMENT_AUTHRZN
15 | WMDS | FR | MCD_MEM_CLINICAL_DATA | LZ_WMDS_MCD_MEM_CLINICAL_DATA
16 | WMDS | INC | MPG_MEM_PLAN_GROUP | LZ_WMDS_MPG_MEM_PLAN_GROUP
17 | WMDS | INC | PVD_PROVIDER | LZ_WMDS_PVD_PROVIDER
18 | ECC | FR | CME_CM_EPISODE | LZ_ECC_CME_CM_EPISODE
19 | ECC | FR | SRP_SBR_RESPONSE | LZ_ECC_SRP_SBR_RESPONSE
20 | ECC | INC | MNO_MEM_NOTE | LZ_ECC_MNO_MEM_NOTE
21 | ECC | FR | SBR_SURVEY_BUILDER | LZ_ECC_SBR_SURVEY_BUILDER
22 | ECC | FR | CDA_CME_DIAGNOSIS | LZ_ECC_CDA_CME_DIAGNOSIS
23 | ECC | FR | CMG_CM_GOAL | LZ_ECC_CMG_CM_GOAL
24 | ECC | FR | CMI_CM_ISSUE | LZ_ECC_CMI_CM_ISSUE
25 | ECC | FR | CMV_CASE_MGMT_INTRVNTN | LZ_ECC_CMV_CASE_MGMT_INTRVNTN
26 | ECC | FR | ORG_ORGANIZATION | LZ_ECC_ORG_ORGANIZATION
27 | ECC | INC | COI_CON_ISSUE | LZ_ECC_COI_CON_ISSUE
28 | ECC | INC | MEM_MEMBER | LZ_ECC_MEM_MEMBER
29 | ECC | FR | CMA_CME_MANAGER_ASGNMNT | LZ_ECC_CMA_CME_MANAGER_ASGNMNT
30 | ECC | FR | STF_STAFF | LZ_ECC_STF_STAFF
31 | ECC | INC | MPG_MEM_PLAN_GROUP | LZ_ECC_MPG_MEM_PLAN_GROUP
32 | ECC | INC | TAU_TREATMENT_AUTHRZN | LZ_ECC_TAU_TREATMENT_AUTHRZN
33 | TRIMED | FR | OPRTPRSNCVRD | LZ_TRIMED_OPRTPRSNCVRD
34 | TRIMED | FR | TMDTPATIENT | LZ_TRIMED_USR_USER
35 | TRIMED | FR | TMDTRPPTCMGT | LZ_TRIMED_TMDTRPPTCMGT
36 | CP | FR | CARESECURE_CASE_EVNT | LZ_CP_CARESECURE_CASE_EVNT
NOTE: TMDTRPTKLRLG is used only for the history load. For incremental loads, TMDTTMTICKLR is used instead, which contains one month of information.
ETL Process Flow
The existing data in the Landing Zone tables is deleted, and the data from the source tables is read by Informatica PowerCenter and loaded into the corresponding landing zone tables. Teradata external loader (MLoad) connections are used to load the data into the LZ tables.
If null values come from the source, they are populated with the corresponding default values as per the EDW standards.
The following link gives the detailed ETL specification for all the above LZ tables:
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fETL%5fSpecification%5fdoc&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d
6. Data Stores
This high-level section requires no input from the author. It is simply information to assist authors in
understanding the sub-sections.
Information Guide on Data Stores
Data Stores
CIS history extract and incremental/full refresh data are loaded into the Landing Zone tables. This section documents all the data sources and their sizes.
Description of Section
Identify the new, changed or deleted data schema(s) affected, highlighting changes from, or additions to, existing schema(s).
Describe new tables, files, structures and data elements, and changes to the existing data dictionary objects related to the changes.
Include data model diagrams as appropriate.
Note:
Please include as much information as possible.
6.1 Data Store Inventory
The grid below shows the CM inventory list, which contains the details of the history extract along with details for the LZ & CSA tables. The name of the document is 'Inventory List.xls'.
[Embedded file: Inventory list.xls]
The spreadsheet below gives statistics for the WMDS/ECC/Trimed/Careplanner Landing Zone tables.
[Embedded file: Release 3 Source Tables_with_counts.xls]
6.2 Data Store Data Elements
Create one table for each new or existing table/structure.
Data Element Table Definitions
Schema/Database/File Name | Name of the new or existing schema, database or file
Table/Segment Name | Name of the new or existing table or segment
Copybook/Object Name | Name of the new or existing copybook or object
Data Element | Data element name
Description | Description of the field
Data Type | The data type for the element (alpha, numeric, String, Date, Integer, etc.)
Length | The length of the data element, with units (characters, bytes, kb, etc.)
Values | If applicable, list the values for the data element, or reference a table or document with this information
Key/Search/Index Field | Is this a key, search, or index field?
Mod Type | Modification Type (A, C, or D) for addition, change, or deletion of the particular data element
The following SharePoint link gives information about the Landing Zone tables DDL.
Landing Zone tables DDL:
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fData%20Models%2fLanding%20Zone%20Models&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d
6.3 Data Store Descriptions
Information Guide on Data Store Descriptions
Data Store Descriptions
Description of each and every Physical Model attribute.
Description of Section
Provide narrative descriptions for field changes if they will assist the developer when coding. Refer to Data Elements/Data Stores based on the schema name.
Note:
Please include as much information as possible.
The following SharePoint link gives information about the Clinical Programs Physical Data Model and Metadata for UM and CM. This document describes each and every Physical Model attribute.
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fData%20Models%2fPhysical%20Model%20Documents&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d
7. Implementation Activities
This section holds the detailed information on batch processing and the recovery process.
General Information
Purpose of This Section
The CA UNICENTER scheduling tool, also referred to as the WORKLOAD MANAGER (WLM), is the preferred WellPoint, Inc., Central Region Data Warehouse (CRDW) and Service Delivery (SD) scheduling tool. The WLM scheduling tool is designed to process job execution across multiple technology platforms.
This section details the WLM jobset and jobs that will be set up for the WMDS/ECC/Trimed/Careplanner Source to Landing Zone load.
Instructions for this Section: Repeat the following section for each layer in the application (Presentation, Business, Messaging, Data, Applications, and Infrastructure). Be sure to articulate all environments such as Development, Test, QA and Production, or any other environment types.
NOTE
If you choose not to use the table below, you can provide your own format. Be sure to provide any Implementation Activities not contained explicitly in this section. Repeat the following table for each step in the program unit, package or component.
7.1 Packaging/Release Activity
Scheduling of the workflows is done using WLM, the Workload Manager tool. The WLM tool is designed to:
1. Manage end-to-end processing flows.
2. Manage execution of parallel job processing.
3. Manage the overall process sequence (i.e., jobs or processes that are dependent upon the completion of a preceding job or process).
4. Manage job failure notification.
For the WMDS/ECC/CP/TriMed Source to Landing Zone load, the process is split into logical units of work called jobsets in WLM. A jobset is a logical unit of work in WLM, and each jobset contains jobs through which dependencies are set. Each job is a unit of work that executes the Informatica/Teradata processes that load the target tables.
Below is the list of Informatica return codes captured by the WLM team; these are the outcomes of the Informatica pmcmd return code. A sketch of a job wrapper that acts on these codes follows the table.
PMCMD RETURN CODES
Code Description
0 For all commands, a return value of zero indicates that the command ran successfully. You can
issue the following commands in the wait or nowait mode: starttask, startworkflow, aborttask, and
abortworkflow. If you issue a command in the wait mode, a return value of zero indicates the
command ran successfully. If you issue a command in the nowait mode, a return value of zero
indicates that the request was successfully transmitted to the Integration Service, and it
acknowledged the request.
1 Integration Service is not available, or pmcmd cannot connect to the Integration Service. There is a
problem with the TCP/IP host name or port number or with the network.
2 Task name, workflow name, or folder name does not exist.
3 An error occurred starting or running the workflow or task.
4 Usage error. You passed the wrong options to pmcmd.
5 An internal pmcmd error occurred. Contact Informatica Technical Support.
7 You used an invalid user name or password.
8 You do not have the appropriate permissions or privileges to perform this task.
9 Connection to the Integration Service timed out while sending the request.
12 Integration Service cannot start recovery because the session or workflow is scheduled, waiting for
an event, waiting, initializing, aborting, stopping, disabled, or running.
13 User name environment variable is set to an empty value.
14 Password environment variable is set to an empty value.
15 User name environment variable is missing.
16 Password environment variable is missing.
17 Parameter file does not exist.
18 Integration Service found the parameter file, but it did not have the initial values for the session
parameters, such as $input or $output.
19 Integration Service cannot resume the session because the workflow is configured to run
continuously.
20 A repository error has occurred. Make sure that the Repository Service and the database are
running and the number of connections to the database is not exceeded.
21 Integration Service is shutting down and it is not accepting new requests.
22 Integration Service cannot find a unique instance of the workflow/session you specified. Enter the
command again with the folder name and workflow name.
23 There is no data available for the request.
24 Out of memory.
25 Command is cancelled.
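A minimal sketch of how a WLM job might invoke pmcmd and surface these return codes, assuming placeholder connection values and workflow names:

#!/bin/sh
# Hypothetical WLM job wrapper: start a workflow, wait for completion,
# and report pmcmd's return code so WLM can act on the table above.
pmcmd startworkflow -sv INT_SVC -d DOMAIN -uv PM_USER -pv PM_PWD \
    -f ENT_CLINICAL_PROGRAMS -wait wkf_CIS_LZ_LOAD
RC=$?

case $RC in
    0)  echo "Workflow completed successfully." ;;
    1)  echo "Integration Service unavailable or connection problem." ;;
    2)  echo "Task, workflow, or folder name does not exist." ;;
    3)  echo "Error starting or running the workflow." ;;
    *)  echo "pmcmd returned $RC; see the return code table above." ;;
esac
exit $RC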
8. Technical Assumptions
Each Assumption should be verified for accuracy. It is the responsibility of the document author(s) to
obtain this verification from the appropriate source.
Assumption Table Definitions
Assumption # | Reference number for each assumption
Identified By | The team member who identified the assumption
Identified Date | The date each assumption was identified
Verified By | The resource (team member or other) who verified that the assumption is accurate
Verified Date | The date each assumption was verified
Assumption | Detailed description of the assumption being made in this document. If possible, indicate which functional requirements this assumption affects
Comments | Include any comments about the assumption
Assumption # | Identified By | Identified Date | Verified By | Verified Date | Assumption | Comments
1 | CIS Team | | | | Initially, Landing Zone tables will hold a week's data; later this will be converted to daily. |
2 | CIS Team | | | | Preprocessed Pharmacy LZ tables are used for getting the member key field. |
3 | CIS Team | | | | All attributes will be defaulted for incoming null/blank values. |
4 | CIS Team | | | | |
5 | Claims Team | | | | |
9. Reference Documents
This section lists all related documents.
Below are the SharePoint links for all the related documents.
Mapping Docs
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fData%20Models%2fMapping%20Templates&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d
Approach Doc
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fETL%20Approach&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d
Business System Design Doc
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fBusiness%20System%20Design%20Document%20%28BSD%29&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d
10. Project Team Signoffs / Approvals
SOLUTION DELIVERY APPROVALS
IT Technical Lead: IT Functional Area Lead    Signature:    Date:
IT Project Mgr: IT Functional Area PM    Signature:    Date:
Solution Architect: IT Functional Area Solution Architect    Signature:    Date:
Andre' Davis DD-214
Andre' Davis DD-214Andre' Davis DD-214
Andre' Davis DD-214Andre Davis
 
Ashley Paro Resume
Ashley Paro ResumeAshley Paro Resume
Ashley Paro Resumeashparo
 

Viewers also liked (13)

Financial Clerk Ref
Financial Clerk RefFinancial Clerk Ref
Financial Clerk Ref
 
SFUDemoFest 2015 - Flipping the LMS
SFUDemoFest 2015 - Flipping the LMSSFUDemoFest 2015 - Flipping the LMS
SFUDemoFest 2015 - Flipping the LMS
 
Lec 3 b engineering education
Lec 3 b engineering educationLec 3 b engineering education
Lec 3 b engineering education
 
Adam Porter Resume
Adam Porter ResumeAdam Porter Resume
Adam Porter Resume
 
Profile : Himanshu kandwal
Profile : Himanshu kandwalProfile : Himanshu kandwal
Profile : Himanshu kandwal
 
2017 shareholder presentation final webcast
2017 shareholder presentation final webcast2017 shareholder presentation final webcast
2017 shareholder presentation final webcast
 
Shruti Curriculum Vitae
Shruti Curriculum VitaeShruti Curriculum Vitae
Shruti Curriculum Vitae
 
fat grafting
fat graftingfat grafting
fat grafting
 
Ocean pollution story
Ocean pollution storyOcean pollution story
Ocean pollution story
 
Andre' Davis DD-214
Andre' Davis DD-214Andre' Davis DD-214
Andre' Davis DD-214
 
Storm over gearpump
Storm over gearpumpStorm over gearpump
Storm over gearpump
 
Ashley Paro Resume
Ashley Paro ResumeAshley Paro Resume
Ashley Paro Resume
 
Mastopexia con Malla
Mastopexia con MallaMastopexia con Malla
Mastopexia con Malla
 

Similar to Release-3_TSD_Source_to_LZ_-_CIS_-_v1.2 2

PURPOSE of the project is Williams Specialty Company (WSC) reque.docx
PURPOSE of the project is Williams Specialty Company (WSC) reque.docxPURPOSE of the project is Williams Specialty Company (WSC) reque.docx
PURPOSE of the project is Williams Specialty Company (WSC) reque.docxamrit47
 
Prodev Solutions Intro
Prodev Solutions IntroProdev Solutions Intro
Prodev Solutions IntrolarryATprodev
 
Resume Manoj Kumar M
Resume Manoj Kumar MResume Manoj Kumar M
Resume Manoj Kumar MManoj Kumar
 
Resume Aden bahdon
Resume Aden bahdonResume Aden bahdon
Resume Aden bahdonAden Bahdon
 
DBT PU BI Lab Manual for ETL Exercise.pdf
DBT PU BI Lab Manual for ETL Exercise.pdfDBT PU BI Lab Manual for ETL Exercise.pdf
DBT PU BI Lab Manual for ETL Exercise.pdfJanakiramanS13
 
Obvient FocalPOINT Project Charter
Obvient FocalPOINT Project CharterObvient FocalPOINT Project Charter
Obvient FocalPOINT Project CharterBrian Kaiser, PE
 
Resume_Arun_Baby_03Jan17
Resume_Arun_Baby_03Jan17Resume_Arun_Baby_03Jan17
Resume_Arun_Baby_03Jan17Arun Baby
 
Prakash_Profile(279074)
Prakash_Profile(279074)Prakash_Profile(279074)
Prakash_Profile(279074)Prakash s
 
Implement Data Ware House
Implement Data Ware HouseImplement Data Ware House
Implement Data Ware Housebhuphender
 
Business RequirementsReference number Document Control
Business RequirementsReference number Document ControlBusiness RequirementsReference number Document Control
Business RequirementsReference number Document ControlTawnaDelatorrejs
 
Evolutionary db development
Evolutionary db development Evolutionary db development
Evolutionary db development Open Party
 

Similar to Release-3_TSD_Source_to_LZ_-_CIS_-_v1.2 2 (20)

Chaitanya_updated resume
Chaitanya_updated resumeChaitanya_updated resume
Chaitanya_updated resume
 
Chaitanya_updated resume
Chaitanya_updated resumeChaitanya_updated resume
Chaitanya_updated resume
 
Sap BPC concepts
Sap BPC conceptsSap BPC concepts
Sap BPC concepts
 
PURPOSE of the project is Williams Specialty Company (WSC) reque.docx
PURPOSE of the project is Williams Specialty Company (WSC) reque.docxPURPOSE of the project is Williams Specialty Company (WSC) reque.docx
PURPOSE of the project is Williams Specialty Company (WSC) reque.docx
 
Prodev Solutions Intro
Prodev Solutions IntroProdev Solutions Intro
Prodev Solutions Intro
 
Resume Manoj Kumar M
Resume Manoj Kumar MResume Manoj Kumar M
Resume Manoj Kumar M
 
Resume Aden bahdon
Resume Aden bahdonResume Aden bahdon
Resume Aden bahdon
 
Sunny_Resume
Sunny_ResumeSunny_Resume
Sunny_Resume
 
Sunny_Resume
Sunny_ResumeSunny_Resume
Sunny_Resume
 
Aksh 117 bpd_sd (1)
Aksh 117 bpd_sd (1)Aksh 117 bpd_sd (1)
Aksh 117 bpd_sd (1)
 
DBT PU BI Lab Manual for ETL Exercise.pdf
DBT PU BI Lab Manual for ETL Exercise.pdfDBT PU BI Lab Manual for ETL Exercise.pdf
DBT PU BI Lab Manual for ETL Exercise.pdf
 
Obvient FocalPOINT Project Charter
Obvient FocalPOINT Project CharterObvient FocalPOINT Project Charter
Obvient FocalPOINT Project Charter
 
Resume_Arun_Baby_03Jan17
Resume_Arun_Baby_03Jan17Resume_Arun_Baby_03Jan17
Resume_Arun_Baby_03Jan17
 
0.3 aim phases_and_documentations
0.3 aim phases_and_documentations0.3 aim phases_and_documentations
0.3 aim phases_and_documentations
 
Prakash_Profile(279074)
Prakash_Profile(279074)Prakash_Profile(279074)
Prakash_Profile(279074)
 
M tierney res
M tierney resM tierney res
M tierney res
 
Veerapradeep_Apps_profile
Veerapradeep_Apps_profileVeerapradeep_Apps_profile
Veerapradeep_Apps_profile
 
Implement Data Ware House
Implement Data Ware HouseImplement Data Ware House
Implement Data Ware House
 
Business RequirementsReference number Document Control
Business RequirementsReference number Document ControlBusiness RequirementsReference number Document Control
Business RequirementsReference number Document Control
 
Evolutionary db development
Evolutionary db development Evolutionary db development
Evolutionary db development
 

Release-3_TSD_Source_to_LZ_-_CIS_-_v1.2 2

1. Introduction

This high-level section requires no input from the author. It is simply information to assist authors in understanding and completing each sub-section.

General Template Information

Purpose of Document
The Functional Design Document (FDD), the Architectural Design Document and the Infrastructure Impact Assessment (IIA) document are the predecessors to this document. The IT Technical Lead normally completes this document, with contributions from the Solution Architect, Data Architect, and Infrastructure Build Engineering as needed. This document must contain all the elements needed to code a fully functional system: it should be complete and accurate enough that any developer inside or outside the immediate development team can construct all the components required to complete the system. The IT Technical Lead is responsible for the creation of this document. The document is owned by IT.

Help Completing Template
pmm@wellpoint.com

Frequently Asked Questions about Completing this Document

1. How do I attach another doc as an object in this doc?
(In this Word doc) click Insert > Object > Create From File tab > check "Display as Icon" > browse for the file > click "OK". The file should now be on the document, but may not be fully visible. If not fully visible: click on the object once > right-click > "Format Object" > click the Layout tab > select "Tight" > hit "OK".

2. How do I provide a hyperlink in this doc to another doc?
(In this Word doc) Insert > Hyperlink > enter the hyperlink.

3. How do I update the Table of Contents?
Go to the Table of Contents page > position the cursor to the left of the table (not over the table) > left-click the mouse button (the entire table should be highlighted) > press the F9 key.
1.1. Scope

The current scope of the design is to extract data from four operational source systems (ECC, WMDS, Trimed and Care Planner) and load it into the Landing Zone (LZ). This document is based on the Business System Document and details the solution approach for loading all the tables into the Landing Zone for CIS. There are 15 tables from ECC, 18 tables from WMDS, 8 tables from Trimed and 1 flat file for CM. This document gives an overview of performing Incremental Refresh (IR), Full Refresh (FR) and a one-time load of historical data.

In the scheduled process, the incremental data (containing new or changed records) from the source databases (WMDS, ECC, Trimed and Care Planner) will be loaded into the Landing Zone by means of Informatica ETL mappings. Every week, the records from the source are pulled based on the load criteria on the source tables and loaded into their respective tables in the Landing Zone (LZ). The Landing Zone is a transient staging area; no history will be maintained.

The following steps will be executed as part of the project scope:
• Design and develop Landing Zone tables as per the source table layouts and LZ table creation guidelines.
• Extract clinical data from ECC (Oracle), WMDS (Oracle), Trimed (DB2) and Care Planner (flat file) from Jan 1, 2007 forward.
• Load the history data into the Landing Zone tables.
• Design processes to handle complex transformations.
• Design the ETL process to load incremental data and full refresh data into the Landing Zone staging area.
• Set up the process to load and maintain the load log table for LZ data loads.
• Set up the process for audit balancing.
• Set up the process for WLM job scheduling.
• Define job dependencies and restartability.
• Post-load cleanup activities.
1.2. Definitions, Acronyms and Abbreviations

BI - Business Intelligence
BI Staging Area - Staging area for the current clinical reporting platform on a regional server (AEDW)
CDC - Change Data Capture
CM - Case Management
CMS - Codes Management System
COB - Coordination of Benefits
COBRA - Consolidated Omnibus Budget Reconciliation Act
CP - Care Planner
CS90 - Claims System - New York
CSA - Conformed Staging Area
DM - Disease Management
ECC - Empire Care Connects
ECR - Enterprise Client Reporting
EDL - Enterprise Data Layer
EDL R2 - Enterprise Data Layer Release 2
EDW - Enterprise Data Warehouse
EDWard - Enterprise Data Warehouse and Research Depot (earlier known as EDL R2)
ERISA - Employment Retirement Income Security Act
ETL - Extract, Transform & Load
FR - Full Refresh
HMC - Health Management Corporation
IM - Information Management
INC - Incremental Load
INFA - Informatica
IQ - Information Quality
IR - Implementation Readiness
LZ - Landing Zone Staging Area
MBU - Marketing Business Unit
NAICS_CD - North American Industry Classification System Code
PCP - Primary Care Physician
POC - Proof of Concept
RFC - Request for Change
SDLC - System Development Life Cycle
SIC_CD - Standard Industry Classification Code
SLA - Service Level Agreement
TD - Teradata database
TROOP - True Out of Pocket
UAT - User Acceptance Testing
WEDW - West-Enterprise Data Warehouse
WGS - WellPoint Group Systems
WLM - Work Load Manager
WMDS - WellPoint Medical Decision Support System
WPD - WellPoint - Product Database

2. Resources Affected

This high-level section requires no input from the author. It is simply information to assist authors in understanding the sub-sections.

Resources Affected: WMDS and ECC - Oracle database; Trimed - DB2 database; Care Planner - flat file; Landing Zone Area (LZ) - Teradata.

Description of Section: List all other external resources (applications/entities) affected by the changes and a brief description of how they will be affected. Do not include business entities. External resources are represented by hardware, a person, a program, or another system.

2.1. External Resources
External Resources:
• Operational System: WMDS / ECC / Trimed / Care Planner host systems
• Teradata: ENTDWPROD host; DBA support
• Unix: TSM; disk utilization; system admin support
• WLM: WLM scheduling support; Service Delivery support
• Informatica: underlying Oracle DB; application admin support

Description of Section: These are the external entities outside of the applications that can be affected (i.e. DBA, external vendor, tape management, regulatory agency, etc.). Describe the impact in detail.

3. Application Design

This high-level section requires no input from the author. It is simply information to assist authors in understanding the sub-sections.

This section details all the technical design information for the clinical subject area. The detailed design approach is explained in the sub-sections below.

The aim of the current application design is to load the CM data from the WMDS, ECC, Trimed and Care Planner source systems to the Landing Zone (LZ). This document is based on the Business System Design Document and is used to create the Technical Detail Design Specification Document. This approach document defines the way to load the Landing Zone tables for historical, incremental and full loads.

3.1. Architectural and Coding References
1. Informatica Developer guideline 4.4
   C:\Membership\INFA Developer Handbook V4.4.doc

2. EDL R2 Informatica Standards and Naming Conventions
   http://sharepoint.auth.wellpoint.com/sites/EDL/ETL%20Design%20Workgroup/Forms/AllItems.aspx

3. ETL Guidelines document and RA decision points. The following SharePoint link gives the information about the latest ETL Guidelines document and Reference Architecture Decision Points:
   http://sharepoint.auth.wellpoint.com/sites/EDL/ETL%20Design%20Workgroup/Forms/AllItems.aspx

4. Landing Guidelines document (LZ_CDC_guidelines_V2). This document gives general guidelines on the process to be followed for creating the Landing Zone (LZ) tables and the high-level inputs needed for Change Data Capture (CDC) attribute mapping.
   http://sharepoint.auth.wellpoint.com/sites/DIRECTSOURCING/Shared%20Documents/Forms/AllItems.aspx

5. WLM Policy Document
   http://sharepoint.auth.wellpoint.com/sites/DIRECTSOURCING/Shared%20Documents/Forms/AllItems.aspx

Description of Section: Identify the architectural and coding standards to be used. If a reference is to a document that is not in an enterprise-wide library, attach a link to the document, or attach the document itself in the Reference section of this document. Note: Please include as much information as possible.

3.2. Platform and Version Information

The table below lists the software for the clinical subject area.

Sl. No | Software Required | Version
1 | Teradata | 06.02.02.80
2 | Informatica | 8.6.1
3 | Unix | AIX
4 | WLM | 3.0.1

The following diagram describes the platform information.
The tables below list the server details for the clinical subject area in the different environments.

Informatica Servers/Environments:

Environment | Server Name | IP | Ports
DEV/SIT | vaathmr380.corp.anthem.com | 30.135.22.46 | 6001, 55010-55100
UAT/IR | vaathmr381.corp.anthem.com | 30.135.22.47 | 6001, 55000-55100
PROD | vaathmr357.corp.anthem.com | 30.130.16.150 | 6001, 55201-55300

Teradata Servers/Environments:

Environment | Server Name | Server IP | Ports
DEV | DWDEV | 30.135.31.232 | 1025
SIT | DWTEST1 | 30.135.88.22 | 1025
UAT/IR | DWTEST3 | 30.135.88.22 | 1025
PROD | ENTDWPROD | 30.128.223.28 | 1025

Informatica Repository:

Environment | Repository Name
DEV/SIT | EDL_DEV_86x
UAT/IR | EDL_UAT_86x
PROD | EDL_PROD_86x

Informatica Folders:
Environment | Folder Names
DEV/SIT | ENT_CLINICAL_PROGRAMS_DEV, ENT_CLINICAL_PROGRAMS_SHARED_DEV
UAT/IR | ENT_CLINICAL_PROGRAMS_UAT, ENT_CLINICAL_PROGRAMS_SHARED_UAT
PROD | ENT_CLINICAL_PROGRAMS, ENT_CLINICAL_PROGRAMS_SHARED

Teradata Databases:

DEV: ETL_TEMP_CPARP, QADATA_CPARP, CPARP, CPARP_ALLPHI, CPARP_NOPHI, CPARP_NOHAPHI, ETL_VIEWS_CPARP
SIT: T36_ETL_DATA_ENT, T36_ETL_TEMP_ENT, T36_QADATA_ENT, T36_UTLTY_ENT, T36_ETL_VIEWS_ENT, T36_EDW, T36_EDW_[ALL | NO | NOHA]PHI
UAT/IR: T37_ETL_DATA_ENT, T37_ETL_TEMP_ENT, T37_QADATA_ENT, T37_UTLTY_ENT, T37_ETL_VIEWS_ENT, T37_EDW, T37_EDW_[ALL | NO | NOHA]PHI
PROD: ETL_DATA_V20_ENT, ETL_TEMP_V20_ENT, QADATA_V20_ENT, UTLTY_V20_ENT, ETL_VIEWS_V20_ENT, EDW_V20,
CIS_HIST, EDW_[ALL | NO | NOHA]PHI, EDW_SL_[ALL | NO | NOHA]PHI

Supporting Databases:

• High-level hierarchy for CIS: PRJ_CPARP
• Proxy ID for data loads: CPARP_ETL_ID
• Views, macros and stored procedures: ETL_VIEWS_CPARP, ETLMACRO_CPARP, ETLPROC_CPARP
• Work objects and error tables needed by Teradata utilities: UTLTY_CPARP
• Hierarchy node for storing CIS DDLs for use by developers: CPARP_DEVDDL

UNIX Directories:

• Scripts: /u01vaathmr380/app/cis/test/scripts
• Source file: /u97vaathmr380/pcenterdata/test/SrcFiles
• Target file: /u97vaathmr380/pcenterdata/test/TgtFiles
• Cache file: /u97vaathmr380/pcenterdata/test/Cache
• Parameter files: /u97vaathmr380/pcenterdata/test/InfaParm
4. Process Flow of CIS

[Diagram: end-to-end CIS process flow covering the server check, load log, source-to-Landing Zone loads and audit check steps described in the sub-sections below.]
4.1 Server check process

For WMDS and ECC:
An Informatica process is created to check for server availability. The process loads data from the ECC_GLOBAL_NAME table to a file. If the server is not up and running, Informatica will not be able to connect to the database and the load process will fail. Once the Informatica session fails, a command task is executed to make the process wait for 1200 seconds. After the 1200-second wait, the Informatica session runs again to look for an entry in the ECC_GLOBAL_NAME table. This continues for up to 7200 seconds before the entire workflow fails. A similar process is created for the WMDS source, which loads data from the WMDS_GLOBAL_NAME table.

For Trimed:

[Workflow diagram: a Start task followed by alternating server-check sessions (s_CIS_TRIMED_SERVER_UP through s_CIS_TRIMED_SERVER_UP_6) and sleep command tasks (c_SLEEP_TIMER through c_SLEEP_TIMER_5).]

This Informatica process is created to check the availability of the DB2 database for Trimed. It selects data from the SYSIBM.SYSDUMMY1 table every 20 minutes, up to 120 minutes, as per the business terms. A command task is executed after every session task to put the process to sleep for 1200 seconds.
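The retry logic above is essentially a bounded poll-and-sleep loop. Below is a minimal sketch of that idea as a standalone shell script, assuming an Oracle logon string in $ORA_LOGON and using ECC_GLOBAL_NAME as the probe table; the real implementation is an Informatica workflow of sessions and command tasks, so this script is illustrative only.

    #!/bin/ksh
    # Illustrative server-check loop: probe every 1200 seconds, give up after 7200.
    MAX_WAIT=7200; SLEEP_TIME=1200; ELAPSED=0
    while [ $ELAPSED -lt $MAX_WAIT ]; do
        # Probe query: exit code 0 means the database answered.
        if sqlplus -s "$ORA_LOGON" <<'EOF'
    WHENEVER SQLERROR EXIT FAILURE
    WHENEVER OSERROR EXIT FAILURE
    SELECT COUNT(*) FROM ECC_GLOBAL_NAME;
    EXIT SUCCESS
    EOF
        then
            echo "Server is up; continuing with the load."
            exit 0
        fi
        echo "Server not reachable; sleeping ${SLEEP_TIME}s."
        sleep $SLEEP_TIME
        ELAPSED=`expr $ELAPSED + $SLEEP_TIME`
    done
    echo "Server still unavailable after ${MAX_WAIT}s; failing the workflow." >&2
    exit 1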
4.2 Load Log Script

Insertion of the Load Log Key:
This workflow runs command tasks that invoke a BTEQ script to insert the load log key into the LZ_LOAD_LOG table, a script to update the parameter file values, a script to update the present-date parameter, and a script to verify whether the parameter file was updated properly.

(Attachments: bteq_LZ_load_log_ins_load.sh, table_Strucure.txt, Sample_Parameter.txt)

Contents of the script:
The script checks whether there is any load log key with a load end date time of 8888-12-31. If it finds a load key value with a load end date time of 8888-12-31, it errors out with return code 100. If it does not find a record with the value 8888-12-31, it creates a new load log key entry based on the information in the parameter file (a hedged sketch of this check appears at the end of this section).

Fetching and Updating the Load Log Key Value:
After the load log key value is successfully inserted into the LZ_LOAD_LOG table, the BTEQ script fetches the current load log key value from the load log table and updates the mapping parameter file, replacing the previous load log key value in the FR and INC parameter files.

(Attachments: bteq_LZ_load_log_key_parm_update.sh, Sample_Parameter_file_fr.txt, Sample_Parameter_file_inc.txt)

1. The script checks whether another instance of the program is running. If so, it displays a message that an instance of the same program is already running and exits with exit code 91.
2. The script checks whether the old bteq_outfile is present. If it exists, the script removes that output file.
3. The script checks that valid parameters are passed. If not, it displays a message that an invalid parameter was passed and prompts for the full script name, the mapping parameter file and the logon parameter file.
4. The script checks whether the load log values were fetched properly; if not, it displays a message that there was no valid load log entry in the table and exits with exit code 100. After a successful run, the script removes the temp files and captures any errors that are present.
5. The script fetches the current load log key value from the load log table and updates the mapping parameter file, replacing the previous load log key value in the parameter files.

Fetching and Updating the Present Run Date Value:
After the load log key value is successfully updated in the mapping parameter file, the BTEQ script fetches and updates the present run date in the mapping parameter file with the current date, replacing the previous present run date in the FR and INC parameter files.

(Attachments: bteq_update_prsnt_date.sh, Sample_Parameter_file_fr.txt, Sample_Parameter_file_inc.txt)

Contents of the script:
1. The script checks whether another instance of the program is running. If so, it displays a message that an instance of the same program is already running and exits with exit code 91.
2. The script checks whether the old bteq_outfile is present. If it exists, the script removes that output file.
3. The script checks that valid parameters are passed. If not, it displays a message that an invalid parameter was passed and prompts for the full script name, the mapping parameter file and the logon parameter file.
4. The script fetches and updates the present run date in the mapping parameter file with the current date, replacing the previous present run date in the FR and INC parameter files.

Validating the Load Log Key Value:
This script checks whether the load log value inserted in the load log table is the same as the one in the parameter file. If the values are equal, it displays a message that the load log key value has been updated correctly in the parameter file, and the loads that follow will proceed. If the values are not equal, it displays a message that the verification has failed and exits with exit code 9; the loads that follow will not proceed. It checks the parameters for both the FR and INC loads.

(Attachment: bteq_load_log_verify.sh)
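As a concrete illustration of the load log key insertion described above, here is a minimal BTEQ-in-shell sketch. The table and column names (LZ_LOAD_LOG, LOAD_LOG_KEY, LOAD_END_DTM, PUBLISH_IND) and the literal source/subject/workflow values are assumptions taken from the surrounding description, not from the actual script.

    #!/bin/ksh
    # Illustrative only: open a new load log entry unless a prior run is still open.
    bteq <<EOF
    .RUN FILE = ${LOGON_PARAMETER_FILE};

    /* A load end date of 8888-12-31 marks an unfinished prior run: fail with 100. */
    SELECT COUNT(*) FROM LZ_LOAD_LOG
    WHERE  LOAD_END_DTM = TIMESTAMP '8888-12-31 00:00:00'
    HAVING COUNT(*) > 0;
    .IF ACTIVITYCOUNT > 0 THEN .QUIT 100;

    /* Otherwise create the next load log key for this run. */
    INSERT INTO LZ_LOAD_LOG
      (LOAD_LOG_KEY, SOR_CD, SUBJ_AREA_NM, WRKFLW_NM, LOAD_START_DTM, LOAD_END_DTM, PUBLISH_IND)
    SELECT COALESCE(MAX(LOAD_LOG_KEY), 0) + 1, 'ECC', 'CIS', 'wf_CIS_LZ_LOAD',
           CURRENT_TIMESTAMP(0), TIMESTAMP '8888-12-31 00:00:00', 'N'
    FROM LZ_LOAD_LOG;
    .LOGOFF;
    .QUIT 0;
    EOF
    exit $?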
4.3 Loading Process from Source to Landing Zone

Once the load log key is created in the LZ load log table, Informatica workflows are executed to load the data from the source to the Landing Zone. As per the EDW standards, all mappings from source to Landing Zone set default values for nulls or blanks present in char, varchar, number and date fields.

There are two types of load from source to LZ:
1) Weekly full refresh loads, for tables with low volume counts. The full refresh tables are truncate-and-reload. Every table has a command session before it that runs a generic delete script to delete the data from that particular table (see the sketch below).
2) Incremental loads, based on the 'Last Run Date'. All records that have been updated or inserted in the source after the last run date are fetched and loaded into the Landing Zone tables. As mentioned above, every table has a command session before it that runs the generic delete script for that table.
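Since every table's workflow begins with the same pre-load delete, a single parameterized script suffices. A minimal sketch, assuming the script receives the fully qualified table name as its first argument (the actual script and its logon handling may differ):

    #!/bin/ksh
    # Illustrative generic delete: empty the given Landing Zone table before reload.
    TBL_NM=$1
    bteq <<EOF
    .RUN FILE = ${LOGON_PARAMETER_FILE};
    DELETE FROM ${TBL_NM};
    .IF ERRORCODE <> 0 THEN .QUIT 8;
    .LOGOFF;
    .QUIT 0;
    EOF
    exit $?

For example, the command session for LZ_ECC_COI_CON_ISSUE would invoke it as generic_delete.sh ETL_DATA_V20_ENT.LZ_ECC_COI_CON_ISSUE (the script name and database qualifier here are hypothetical).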
4.3.1 Weekly Full Refresh Loads

[Mapping diagram: full refresh mapping m_CLINICAL_LZ_TRIMED_TMDTPATIENT.]

An example of a full refresh mapping (m_CLINICAL_LZ_TRIMED_TMDTPATIENT) is shown pictorially above. At a high level, the mapping reads the data from the source database and loads it into the Landing Zone tables (LZ_TRIMED and LZ_CP) in Teradata. The load log and audit processes are followed in every mapping to capture the source record count and insert a row into AUDT_STTSTC for the table, for post-load verification.

(Attachment: Sample_Parameter.txt)

Mapping Description:

ID | Transformation Name | Component Type | Description
1 | SQ_CME_CASE_MANAGEMENT_EPISODE | Source Definition | Source qualifier for the CME_CASE_MANAGEMENT_EPISODE table from ECC. This is a straight load of the source data set to the target table.
2 | exp_DEFAULT_CONVERSION | Expression | Reusable expression used to convert null, blank and N/A values from the source systems to default values in the Landing Zone tables, following EDW standards.
3 | exp_TO_TARGET_INP_AUDIT | Expression | Passes values to the target table and passes input values to the audit mapplet.
4 | mplt_LOAD_AUDIT_STTSTC | Mapplet | Inserts into AUDT_STTSTC the initial count loaded into the target table.
5 | LZ_ECC_CME_CM_EPISODE | Target Definition | Truncate and load. Populates the Landing Zone table with the most recent data.
6 | AUDT_STTSTC | Audit Table | A row with the source count is appended by each mapping, based on table name.

Mapplet Description:

ID | Transformation Name | Component Type | Description
1 | agg_RECORD_COUNT | Aggregator | Produces a row with the source count, appended by each mapping based on table name.
2 | exp_CALL_AUDT_BLNCG_RULE | Expression | Calls the lookup for the AUDT_BLNCG_RULE ID based on table_nm and sor_cd; the AUDT_BLNCG_RULE ID cannot be NULL.
3 | lkp_AUDT_BLNCG_RULE | Lookup | Lookup transformation used to get the AUDT_RULE_ID.
4 | mplt_OUT | Mapplet Output | The mapplet output, from which the complete mapplet data is accumulated.

4.3.2 Weekly Incremental Loads
[Mapping diagram: incremental load mapping for COI_CON_ISSUE.]

An example of the incremental process is shown pictorially above. At a high level, the mapping reads the data from the Oracle database for the last three days, based on the create date and update date fields, and loads it into the Landing Zone table LZ_ECC_COI_CON_ISSUE in Teradata. The load log and audit processes are followed in every mapping to capture the source record count and write to AUDT_STTSTC for the table, for post-load verification.

Mapping Description:

ID | Transformation Name | Component Type | Description
1 | SQ_COI_CON_ISSUE | Source Definition | Source qualifier for the COI_CON_ISSUE table from ECC. This is a straight load of the source data set to the target table.
2 | exp_DEFAULT_CONVERSION | Expression | Reusable expression used to convert null, blank and N/A values from the source systems to default values in the Landing Zone tables, following EDW standards.
3 | exp_TO_TARGET_INP_AUDIT | Expression | Passes values to the target table and passes input values to the audit mapplet.
4 | mplt_LOAD_AUDIT_STTSTC | Mapplet | Inserts into AUDT_STTSTC the initial count loaded into the target table.
5 | LZ_ECC_COI_CON_ISSUE | Target Definition | Truncate and load. Populates the Landing Zone table with the most recent data.
6 | AUDT_STTSTC | Audit Table | A row with the source count is appended by each mapping, based on table name.

Mapplet Description:
ID | Transformation Name | Component Type | Description
1 | agg_RECORD_COUNT | Aggregator | Produces a row with the source count, appended by each mapping based on table name.
2 | exp_CALL_AUDT_BLNCG_RULE | Expression | Calls the lookup for the AUDT_BLNCG_RULE ID based on table_nm and sor_cd; the AUDT_BLNCG_RULE ID cannot be NULL.
3 | lkp_AUDT_BLNCG_RULE | Lookup | Lookup transformation used to get the AUDT_RULE_ID.
4 | mplt_OUT | Mapplet Output | The mapplet output, from which the complete mapplet data is accumulated.

The pseudo-code for the source filter is as follows:

    WHERE
    (
        ( TRUNC(COI_CON_ISSUE.COI_CREATE_DATE) >= TRUNC(TO_DATE(TO_CHAR('$$MAPP_LZ_LAST_RUN_DATE'), 'MM-DD-YYYY HH24:MI:SS'))
          AND TRUNC(COI_CON_ISSUE.COI_CREATE_DATE) < TRUNC(TO_DATE(TO_CHAR('$$MAPP_LZ_LAST_PRSNT_DATE'), 'MM-DD-YYYY HH24:MI:SS')) )
        OR
        ( TRUNC(COI_CON_ISSUE.COI_LAST_UPDATE_DATE) >= TRUNC(TO_DATE(TO_CHAR('$$MAPP_LZ_LAST_RUN_DATE'), 'MM-DD-YYYY HH24:MI:SS'))
          AND TRUNC(COI_CON_ISSUE.COI_LAST_UPDATE_DATE) < TRUNC(TO_DATE(TO_CHAR('$$MAPP_LZ_LAST_PRSNT_DATE'), 'MM-DD-YYYY HH24:MI:SS')) )
    )

4.3.3 History Loads

The history load applies only to tables identified as incremental tables; the data of these tables is brought into the Landing Zone based on the category they fall into, e.g.:
    SELECT /*+ parallel(a,4) parallel(b,4) */
           col1,
           col2,
           ...
    FROM   ...
    WHERE  TAU_TREATMENT_AUTHORIZATION.TAU_MEM_UID = MEM_MEMBER.MEM_UID
    AND    TRUNC(TAU_TREATMENT_AUTHORIZATION.TAU_CREATE_DATE) >= TRUNC(TO_DATE(TO_CHAR('$$MAPP_HIST_START_DATE'), 'mm-dd-yyyy hh24:mi:ss'))
    AND    TRUNC(TAU_TREATMENT_AUTHORIZATION.TAU_CREATE_DATE) <  TRUNC(TO_DATE(TO_CHAR('$$MAPP_HIST_END_DATE'), 'mm-dd-yyyy hh24:mi:ss'))

Below are the tables for which history data starting from Jan 01, 2007, matched on TAU_TREATMENT_AUTHORIZATION.TAU_UID, is brought down to the Landing Zone:

ECC tables: MNO_MEM_NOTE, COI_CON_ISSUE, MEM_MEMBER, MPG_MEM_PLAN_GROUP, TAU_TREATMENT_AUTHORIZATION
WMDS tables: COI_CON_ISSUE, MEM_MEMBER, MNO_MEM_NOTE, TAU_TREATMENT_AUTHORIZATION, MPG_MEM_PLAN_GROUP, PVD_PROVIDER

If the source tables contain very large volumes of data, loading the records becomes an issue because the load time is high. The very large tables are therefore split into multiple stages and loaded into the Landing Zone, each stage holding the data for a particular period based on the TAU_TREATMENT_AUTHORIZATION table create_date (say 6 or 3 months). The Landing Zone table is loaded completely after all the stage loads finish, and it is then used as a single table for further processing.

4.3.4 Deriving Member Key fields

(Attachments: Incremental.doc, History.doc)

WMDS.MEM_MEMBER table:
WMDS.MEM_MEMBER needs to be joined with the WMDS.TAU_TREATMENT_AUTHORIZATION table to get the matching MEM_ID for each TAU_UID. The information inside the 'Member id'
needs to be split and decoded in an expression to get the 'Subscriber ID', 'Source Code', 'Member Sequence Number', 'Member Source Code' and 'Source Group Number'. The pseudo-code (LZ_WMDS_MEM_MEMBER) used in the source qualifier can be found in the 'Deriving Member Key fields' section of the Incremental.doc or History.doc documents.

Expression to derive fields for MBR_KEY:

SUBSCRIBER_ID:

    decode(TRUE,
      instr(MEM_ID,'DENWGS') != 0 OR instr(MEM_ID,'DENINT') != 0 OR
      instr(MEM_ID,'WGS2BCC') != 0 OR instr(MEM_ID,'WGS2CO') != 0 OR
      instr(MEM_ID,'WGS2GA') != 0 OR instr(MEM_ID,'WGS2MO') != 0 OR
      instr(MEM_ID,'WGS2NV') != 0 OR instr(MEM_ID,'WGS2SSP') != 0 OR
      instr(MEM_ID,'WGS2UNI') != 0 OR instr(MEM_ID,'WGS2WI') != 0 OR
      instr(MEM_ID,'WGSMPD') != 0 OR instr(MEM_ID,'WBMO') != 0 OR
      instr(MEM_ID,'CR') != 0 OR instr(MEM_ID,'AFEP') != 0 OR
      instr(MEM_ID,'HLADV') != 0 OR instr(MEM_ID,'HLHMO') != 0 OR
      instr(MEM_ID,'HLPPO') != 0 OR instr(MEM_ID,'NA') != 0 OR
      instr(MEM_ID,'WBGA') != 0 OR instr(MEM_ID,'UNIBOR') != 0 OR
      instr(MEM_ID,'UNISHBP') != 0 OR instr(MEM_ID,'DENSTAR') != 0 OR
      instr(MEM_ID,'STAR') != 0 OR instr(MEM_ID,'MTRKUNI') != 0 OR
      instr(MEM_ID,'WGS13') != 0,
        SUBSTR(RTRIM(LTRIM(MEM_ID)), 1, 9),
      instr(MEM_ID,'WBWI') != 0,
        SUBSTR(RTRIM(LTRIM(MEM_ID)), 4, 9),
      instr(MEM_ID,'D950') != 0,
        SUBSTR(RTRIM(LTRIM(MEM_ID)), 1, 12),
      'UNK')

MEMBER_SEQ_NBR:

    decode(TRUE,
      instr(MEM_ID,'CR') != 0 OR instr(MEM_ID,'AFEP') != 0 OR
      instr(MEM_ID,'UNIBOR') != 0 OR instr(MEM_ID,'UNISHBP') != 0,
        to_char(SUBSTR(RTRIM(LTRIM(MEM_ID)), 10, 2)),
      instr(MEM_ID,'WBGA') != 0 OR instr(MEM_ID,'DENSTAR') != 0 OR
      instr(MEM_ID,'STAR') != 0 OR instr(MEM_ID,'WGS13') != 0,
        to_char(SUBSTR(RTRIM(LTRIM(MEM_ID)), 11, 2)),
      instr(MEM_ID,'NA') != 0,
        to_char(SUBSTR(RTRIM(LTRIM(MEM_ID)), 15, 2)),
      'UNK')

MBR_CD:

    decode(TRUE,
      instr(MEM_ID,'DENWGS') != 0 OR instr(MEM_ID,'WGS2BCC') != 0 OR
      instr(MEM_ID,'WGS2CO') != 0 OR instr(MEM_ID,'WGS2GA') != 0 OR
      instr(MEM_ID,'WGS2MO') != 0 OR instr(MEM_ID,'WGS2NV') != 0 OR
      instr(MEM_ID,'WGS2SSP') != 0 OR instr(MEM_ID,'WGS2UNI') != 0 OR
      instr(MEM_ID,'WGS2WI') != 0 OR instr(MEM_ID,'WGSMPD') != 0 OR
      instr(MEM_ID,'WBMO') != 0 OR instr(MEM_ID,'HLADV') != 0 OR
      instr(MEM_ID,'HLHMO') != 0 OR instr(MEM_ID,'HLPPO') != 0 OR
      instr(MEM_ID,'MTRKUNI') != 0,
        SUBSTR(RTRIM(LTRIM(MEM_ID)), 11, 2),
      instr(MEM_ID,'WBWI') != 0 OR instr(MEM_ID,'D950') != 0,
        SUBSTR(RTRIM(LTRIM(MEM_ID)), 13, 2),
      'UNK')

MBR_SOR_CD:

    decode(TRUE,
      rtrim(substr(MEM_ID, instr(MEM_ID,'CR'))) = 'CR', '823',
      instr(MEM_ID,'AFEP') != 0, 'FEP',
      (instr(MEM_ID,'NA') != 0 AND is_spaces(substr(MEM_ID, instr(MEM_ID,'NA') - 1, 1))), '824',
      instr(MEM_ID,'DEN') != 0, 'NA',
      instr(MEM_ID,'STAR') != 0, '815',
      instr(MEM_ID,'WGS13') != 0, 'NA',
      instr(MEM_ID,'WGS') != 0, '808',
      'NA')

SRC_GRP_NBR:

    'UNK'

ECC.MEM_MEMBER table:
ECC.MEM_MEMBER needs to be joined with the ECC.TAU_TREATMENT_AUTHORIZATION table to get the matching MEM_ID for each TAU_UID. The information inside the 'Member id' needs to be split and decoded in an expression to get the 'Subscriber ID', 'Source Code', 'Member Sequence Number', 'Member Source Code' and 'Source Group Number'. The pseudo-code (LZ_ECC_MEM_MEMBER) used in the source qualifier can be found in the 'Deriving Member Key fields' section of the Incremental.doc or History.doc documents.

Expression to derive fields for MBR_KEY:

SUBSCRIBER_ID:

    DECODE(TRUE,
      INSTR(RTRIM(LTRIM(MEM_ID)), '-') != 0,
        SUBSTR(RTRIM(LTRIM(MEM_ID)), 1, INSTR(MEM_ID,'-') - 1),
      MEM_ID)

MEMBER_SEQ_NBR:

    DECODE(TRUE,
      INSTR(RTRIM(LTRIM(MEM_ID)), '-') != 0,
        SUBSTR(RTRIM(LTRIM(MEM_ID)), INSTR(MEM_ID,'-') + 1, 2),
      'NA')

MBR_SOR_CD:

    '809'

TRIMED.OPRTPRSNCVRD table:
The information inside OPRTPRSNCVRD.I_COVD_PRSN needs to be split and decoded in an expression to get the 'Subscriber ID', 'Member Sequence Number', 'Member Source Code' and 'Source Group Number'. The pseudo-code (LZ_TRIMED_OPRTPRSNCVRD) used in the source qualifier can be found in the 'Deriving Member Key fields' section of the Incremental.doc or History.doc documents.

Expression to derive fields for MBR_KEY:

SUBSCRIBER_ID:

    decode(TRUE,
      instr(C_OWNER_CAT,'CORP') != 0 OR instr(C_OWNER_CAT,'HMORIC') != 0 OR
      instr(C_OWNER_CAT,'FEP') != 0,
        SUBSTR(RTRIM(LTRIM(I_COVD_PRSN)), 1, 9),
      'UNK')

MEMBER_SEQ_NBR:

    decode(TRUE,
      instr(C_OWNER_CAT,'CORP') != 0 OR instr(C_OWNER_CAT,'HMORIC') != 0 OR
      instr(C_OWNER_CAT,'FEP') != 0,
        to_char(SUBSTR(RTRIM(LTRIM(I_COVD_PRSN)), 10, 2)),
      'UNK')

MBR_SOR_CD:

    decode(TRUE,
      instr(C_OWNER_CAT,'HMORIC') != 0, '868',
      instr(C_OWNER_CAT,'CORP') != 0, '869',
      instr(C_OWNER_CAT,'FEP') != 0, '888',
      'NA')

SRC_GRP_NBR:

    'UNK'

CarePlanner.CASE_EVNT table:
The information inside CASE_EVNT.PAT_ID needs to be split and decoded in an expression to get the 'Subscriber ID', 'Member Sequence Number', 'Member Source Code', 'Source Group Number' and 'Source Member Code'.

Expression to derive fields for MBR_KEY:
SUBSCRIBER_ID:

    decode(TRUE,
      instr(PAT_ID,'ACES') != 0,
        SUBSTR(RTRIM(LTRIM(PAT_ID)), 1, 10),
      'UNK')

MEMBER_SEQ_NBR:

    decode(TRUE,
      instr(PAT_ID,'ACES') != 0,
        to_char(SUBSTR(RTRIM(LTRIM(PAT_ID)), 12, 1)),
      'UNK')

MBR_SOR_CD:

    '822'

SRC_GRP_NBR:

    'UNK'

SRC_MBR_CD:

    'UNK'

Default Values:
Mapping parameters are defined and used to assign the default values appropriately. A sample parameter definition and a sample parameter file are attached (Sample_Parameter.txt).

The pseudo-expression templates for replacing null values with defaults, based on data type, can be summarized as follows:

CHAR, VARCHAR (defaults UNK, NA):

    DECODE(TRUE,
      ISNULL(IN_STRING_1), to_char($$MAPP_DEFAULT_STRING_UNK),
      IS_SPACES(IN_STRING_1), to_char($$MAPP_DEFAULT_STRING_UNK),
      IN_STRING_1 = '', to_char($$MAPP_DEFAULT_STRING_UNK),
      IN_STRING_1)

INTEGER, DECIMAL (default 0):

    DECODE(TRUE,
      ISNULL(IN_NUMBER_1), $$MAPP_DEFAULT_INTEGER,
      IN_NUMBER_1)

DATE (default 8888-12-31 00:00:00):

    DECODE(TRUE,
      ISNULL(IN_DATE_1), ROUND(to_date(to_char($$MAPP_DEFAULT_HIGH_DATE))),
      IN_DATE_1)

4.3.5 Audit Check Process

1) After the load process completes at the Landing Zone tables, a BTEQ script (bteq_LZ_AUDT_STAT_LOAD.sh) is run to update the load statistics of the Landing Zone tables into the audit table AUDT_STTSTC.
The row count is updated for the load log key having the latest 'load start date time', based on 'source code', 'subject area', 'workflow name' and 'publish indicator = N'. This is loaded into a GTT (global temporary table), and the data is then loaded into AUDT_STTSTC from the GTT. When zero rows are processed from the source (i.e. when the source has no rows matching the fetch criteria), the source record count is not inserted into the audit table; the target record count (the LZ target table count) is inserted with a row count of zero.

2) The record variance of the source and target counts is aggregated and updated into the GTT table, and then loaded back into the AUDT_BLNC_STTSTC table (bteq_LZ_AUDT_BLNC_STAT_LOAD.sh).

3) The next BTEQ (bteq_LZ_AUDT_BAL_RCD_CNT_VRNC.sh) checks whether the differences of counts and sums are within the allowed variance quantity for all the tables in a particular subject area. If the variance is more than the accepted value in the rule table, the subsequent process is not executed.

4) If the above process succeeds, a BTEQ script (bteq_LZ_load_log_end_dtm_upd_load.sh) is executed to update the load end time in the load log table, based on 'source code', 'subject area', 'workflow name' and publish_ind = 'N'.

5) The final BTEQ (bteq_LZ_load_log_pub_ind_upd_load.sh) updates the publish indicator to Y in the load log table.

In case of any count variances in the audit, the audit log entries need to be deleted manually by production support personnel, and the jobs need to be restarted to fix the problem.

The following link gives the audit balancing tables' structure DDL:
http://sharepoint.auth.wellpoint.com/sites/esppm/edlr2/Executing%20Phase%20Documents/DDL/dev/IQ_cut1_00.ddl

[Diagram: pictorial representation of the above steps.]
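To make the variance gate in step 3 concrete, here is a hedged sketch of the check. The balancing table and column names (AUDT_BLNC_STTSTC, AUDT_BLNCG_RULE, TBL_NM, VRNC_QTY, ALWD_VRNC_QTY) are assumed from the descriptions above, and the actual script's return codes may differ.

    #!/bin/ksh
    # Illustrative variance gate: fail if any table's variance exceeds its allowed value.
    bteq <<EOF
    .RUN FILE = ${LOGON_PARAMETER_FILE};
    SELECT s.TBL_NM
    FROM   AUDT_BLNC_STTSTC s
    INNER JOIN AUDT_BLNCG_RULE r
           ON  r.AUDT_RULE_ID = s.AUDT_RULE_ID
    WHERE  ABS(s.VRNC_QTY) > r.ALWD_VRNC_QTY;
    .IF ACTIVITYCOUNT > 0 THEN .QUIT 9;
    .LOGOFF;
    .QUIT 0;
    EOF
    exit $?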
5. Components/Objects/Modules

This section gives more detailed information on each component and object in the clinical subject area for the WMDS/ECC source systems.

i. Major Component Inventory

SL No. | Component | Layer
1 | Trigger file | UNIX scripting
2 | Load log keys | Teradata
3 | Source to Landing Zone | Informatica
4 | Audit Balancing | Teradata/Informatica

ii. Major Component Details

This section gives detailed information on the sources and the ETL process that loads them to the Landing Zone.

Source Data Extraction Process:
An Informatica process is created to check for server availability. The process loads data from the TRIMED_GLOBAL_NAME table to a file. If the server is not up and running, Informatica will not be able to connect to the database and the load process will fail. Once the Informatica session fails, a command task is executed to make the process wait for 1200 seconds. After the 1200-second wait, the Informatica session runs again to look for an entry in the TRIMED_GLOBAL_NAME table. This continues for up to 7200 seconds before the entire workflow fails.

Sources:
• WMDS: Oracle database
• ECC: Oracle database
• TRIMED: DB2 database
• Care Planner: flat file
Source Table Details:

Serial No. | Source | FR/INC | Source Table Name | Landing Zone Target Table Name
1 | WMDS | FR | CDA_CME_DIAGNOSIS | LZ_WMDS_CDA_CME_DIAGNOSIS
2 | WMDS | FR | CMA_CME_MNGR_ASGNMNT | LZ_WMDS_CMA_CME_MNGR_ASGNMNT
3 | WMDS | FR | CME_CM_EPISODE | LZ_WMDS_CME_CM_EPISODE
4 | WMDS | FR | CMG_CM_GOAL | LZ_WMDS_CMG_CM_GOAL
5 | WMDS | FR | CMI_CM_ISSUE | LZ_WMDS_CMI_CM_ISSUE
6 | WMDS | FR | CMV_CM_INTERVENTION | LZ_WMDS_CMV_CM_INTERVENTION
7 | WMDS | INC | COI_CON_ISSUE | LZ_WMDS_COI_CON_ISSUE
8 | WMDS | FR | DXC_DIAGNOSIS_CODE | LZ_WMDS_DXC_DIAGNOSIS_CODE
9 | WMDS | INC | MEM_MEMBER | LZ_WMDS_MEM_MEMBER
10 | WMDS | INC | MNO_MEM_NOTE | LZ_WMDS_MNO_MEM_NOTE
11 | WMDS | FR | ORG_ORGANIZATION | LZ_WMDS_ORG_ORGANIZATION
12 | WMDS | FR | SRP_SBR_RESPONSE | LZ_WMDS_SRP_SBR_RESPONSE
13 | WMDS | FR | STF_STAFF | LZ_WMDS_STF_STAFF
14 | WMDS | INC | TAU_TREATMENT_AUTHRZN | LZ_WMDS_TAU_TREATMENT_AUTHRZN
15 | WMDS | FR | MCD_MEM_CLINICAL_DATA | LZ_WMDS_MCD_MEM_CLINICAL_DATA
16 | WMDS | INC | MPG_MEM_PLAN_GROUP | LZ_WMDS_MPG_MEM_PLAN_GROUP
17 | WMDS | INC | PVD_PROVIDER | LZ_WMDS_PVD_PROVIDER
18 | ECC | FR | CME_CM_EPISODE | LZ_ECC_CME_CM_EPISODE
19 | ECC | FR | SRP_SBR_RESPONSE | LZ_ECC_SRP_SBR_RESPONSE
20 | ECC | INC | MNO_MEM_NOTE | LZ_ECC_MNO_MEM_NOTE
21 | ECC | FR | SBR_SURVEY_BUILDER | LZ_ECC_SBR_SURVEY_BUILDER
22 | ECC | FR | CDA_CME_DIAGNOSIS | LZ_ECC_CDA_CME_DIAGNOSIS
23 | ECC | FR | CMG_CM_GOAL | LZ_ECC_CMG_CM_GOAL
24 | ECC | FR | CMI_CM_ISSUE | LZ_ECC_CMI_CM_ISSUE
25 | ECC | FR | CMV_CASE_MGMT_INTRVNTN | LZ_ECC_CMV_CASE_MGMT_INTRVNTN
26 | ECC | FR | ORG_ORGANIZATION | LZ_ECC_ORG_ORGANIZATION
27 | ECC | INC | COI_CON_ISSUE | LZ_ECC_COI_CON_ISSUE
28 | ECC | INC | MEM_MEMBER | LZ_ECC_MEM_MEMBER
29 | ECC | FR | CMA_CME_MANAGER_ASGNMNT | LZ_ECC_CMA_CME_MANAGER_ASGNMNT
30 | ECC | FR | STF_STAFF | LZ_ECC_STF_STAFF
31 | ECC | INC | MPG_MEM_PLAN_GROUP | LZ_ECC_MPG_MEM_PLAN_GROUP
32 | ECC | INC | TAU_TREATMENT_AUTHRZN | LZ_ECC_TAU_TREATMENT_AUTHRZN
33 | TRIMED | FR | OPRTPRSNCVRD | LZ_TRIMED_OPRTPRSNCVRD
34 | TRIMED | FR | TMDTPATIENT | LZ_TRIMED_USR_USER
35 | TRIMED | FR | TMDTRPPTCMGT | LZ_TRIMED_TMDTRPPTCMGT
36 | CP | FR | CARESECURE_CASE_EVNT | LZ_CP_CARESECURE_CASE_EVNT

NOTE: TMDTRPTKLRLG is used only for the history load. For incremental loads, TMDTTMTICKLR, which contains one month of information, is used.

ETL Process Flow:
The existing data in the Landing Zone tables is deleted, and the data from the source tables is read by Informatica PowerCenter and loaded into the corresponding Landing Zone tables. Teradata external loader (MultiLoad) connections are used to load the data into the LZ tables (see the sketch below). If null values come from the source, they are populated with the corresponding default values as per the EDW standards.

The following link gives the detailed ETL specification for all the above LZ tables:
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fETL%5fSpecification%5fdoc&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d
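For illustration, the following is a minimal sketch of the kind of MultiLoad control script that an external loader connection drives. In practice Informatica generates the loader control file itself, and the layout fields shown here (COI_UID, COI_DESC), the log table and the input file name are all hypothetical.

    #!/bin/ksh
    # Illustrative MultiLoad of a delimited extract into a Landing Zone table.
    mload <<EOF
    .LOGTABLE UTLTY_CPARP.ML_LZ_ECC_COI_LOG;
    .RUN FILE = ${LOGON_PARAMETER_FILE};
    .BEGIN IMPORT MLOAD TABLES LZ_ECC_COI_CON_ISSUE;
    .LAYOUT COI_LAYOUT;
    .FIELD COI_UID  * VARCHAR(18);
    .FIELD COI_DESC * VARCHAR(254);
    .DML LABEL INS_COI;
    INSERT INTO LZ_ECC_COI_CON_ISSUE (COI_UID, COI_DESC)
    VALUES (:COI_UID, :COI_DESC);
    .IMPORT INFILE /u97vaathmr380/pcenterdata/test/SrcFiles/coi.dat
      FORMAT VARTEXT '|' LAYOUT COI_LAYOUT
      APPLY INS_COI;
    .END MLOAD;
    .LOGOFF;
    EOF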
6. Data Stores

This high-level section requires no input from the author. It is simply information to assist authors in understanding the sub-sections.

Data Stores: The CIS history extract and incremental/full refresh data are loaded into the Landing Zone tables. This section documents all the data sources and their sizes.

Description of Section: Identify the new, changed or deleted data schema(s) affected, highlighting changes from or additions to existing schema(s). Describe new tables, files, structures and data elements, and changes to the existing data dictionary objects related to the changes. Include data model diagrams as appropriate. Note: Please include as much information as possible.

6.1 Data Store Inventory

The grid below shows the CM inventory list, which contains the details of the history extract and its information, and also gives the details for the LZ and CSA tables. The name of the document is 'Inventory List.xls'.

(Attachment: Inventory list.xls)

The spreadsheet below gives statistics for the WMDS/ECC/Trimed/Careplanner Landing Zone tables.

(Attachment: Release 3 Source Tables_with_counts.xls)

6.2 Data Store Data Elements

Create one table for each new or existing table/structure.

Data Element Table Definitions:
• Schema/Database/File Name: Name of the new or existing schema, database or file
• Table/Segment Name: Name of the new or existing table or segment
• Copybook/Object Name: Name of the new or existing copybook or object
• Data Element: Data element name
• Description: Description of the field
• Data Type: The data type of the element (alpha, numeric, String, Date, Integer, etc.)
• Length: The length of the data element, with units (characters, bytes, KB, etc.)
• Values: If applicable, list the values for the data element, or reference a table or document with this information
• Key/Search/Index Field: Is this a key, search, or index field?
• Mod Type: Modification type (A, C, or D) for addition, change, or deletion of the particular data element

The following SharePoint link gives the information about the Landing Zone tables DDL.
Landing Zone tables DDL:
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fData%20Models%2fLanding%20Zone%20Models&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d

6.3 Data Store Descriptions

Data Store Descriptions: A description of each Physical Model attribute.

Description of Section: Provide narrative descriptions for field changes if they will assist the developer when coding. Refer to the data elements/data stores based on the schema name. Note: Please include as much information as possible.

The following SharePoint link gives the information about the Clinical Programs Physical Data Model and metadata for UM and CM. The linked document describes each Physical Model attribute.
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fData%20Models%2fPhysical%20Model%20Documents&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d

7. Implementation Activities

This section holds the detailed information on batch processing and the recovery process.

Purpose of This Section:
The CA UNICENTER scheduling tool, also referred to as the WORKLOAD MANAGER (WLM), is the preferred WellPoint, Inc., Central Region Data Warehouse (CRDW) and Service Delivery (SD) scheduling tool. The WLM scheduling tool is designed to process job execution across multiple technology platforms. This section details the WLM jobsets and jobs that will be set up for the WMDS/ECC/Trimed/Careplanner source to Landing Zone load.

Instructions for this Section: Repeat the following section for each layer in the application (Presentation, Business, Messaging, Data, Applications, and
  • 32. Infrastructure). Be sure to articulate all environments such as Development, Test, QA and Production or any other environment types. NOTE If you choose not to use the table below, you can provide your own format. Be sure to provide any Implementation Activities not contained explicitly in this section. Repeat the following table for each step in the program unit, package or component. 7.1 Packaging/Release Activity Scheduling of the workflows can be done using WLM: work load manager tool. The WLM tool is designed to, 1. Manage end-to-end processing flows. 2. Manage execution of parallel job processing. 3. Manage overall process sequence. (i.e., Jobs or processes those are dependent upon the completion of a preceding job or process.) 4. Manage job failure notification For WMDS/ECC/CP/TriMed Source to Landing Zone load the process will be split into logical unit of works called Job sets in WLM. Job set is logical unit of work in WLM and each Jobset will contain jobs through which dependencies are set. Each JOB is units of work that will execute Informatica/ Teradata processes to load the target tables. Below is the list of all the return codes from Informatica which are captured by WLM team which are the outcomes of Informatica pmcmd return code. PMCMD RETURN CODES Code Description 0 For all commands, a return value of zero indicates that the command ran successfully. You can issue the following commands in the wait or nowait mode: starttask, startworkflow, aborttask, and abortworkflow. If you issue a command in the wait mode, a return value of zero indicates the command ran successfully. If you issue a command in the nowait mode, a return value of zero indicates that the request was successfully transmitted to the Integration Service, and it acknowledged the request. 1 Integration Service is not available, or pmcmd cannot connect to the Integration Service. There is a problem with the TCP/IP host name or port number or with the network. 2 Task name, workflow name, or folder name does not exist. 3 An error occurred starting or running the workflow or task. 4 Usage error. You passed the wrong options to pmcmd. 5 An internal pmcmd error occurred. Contact Informatica Technical Support. 7 You used an invalid user name or password. b7ac26fb-bcaa-4926-8e9b-fab708d1823e-150217125228-conversion-gate01.doc 32
Listed below are all the Informatica pmcmd return codes that are captured by the WLM team as the outcome of each job execution.

PMCMD RETURN CODES

Code  Description
0     For all commands, a return value of zero indicates that the command ran successfully. You can issue the following commands in the wait or nowait mode: starttask, startworkflow, aborttask, and abortworkflow. If you issue a command in the wait mode, a return value of zero indicates the command ran successfully. If you issue a command in the nowait mode, a return value of zero indicates that the request was successfully transmitted to the Integration Service, and it acknowledged the request.
1     The Integration Service is not available, or pmcmd cannot connect to the Integration Service. There is a problem with the TCP/IP host name or port number, or with the network.
2     The task name, workflow name, or folder name does not exist.
3     An error occurred starting or running the workflow or task.
4     Usage error. You passed the wrong options to pmcmd.
5     An internal pmcmd error occurred. Contact Informatica Technical Support.
7     You used an invalid user name or password.
8     You do not have the appropriate permissions or privileges to perform this task.
9     The connection to the Integration Service timed out while sending the request.
12    The Integration Service cannot start recovery because the session or workflow is scheduled, waiting for an event, waiting, initializing, aborting, stopping, disabled, or running.
13    The user name environment variable is set to an empty value.
14    The password environment variable is set to an empty value.
15    The user name environment variable is missing.
16    The password environment variable is missing.
17    The parameter file does not exist.
18    The Integration Service found the parameter file, but it did not have the initial values for the session parameters, such as $input or $output.
19    The Integration Service cannot resume the session because the workflow is configured to run continuously.
20    A repository error has occurred. Make sure that the Repository Service and the database are running and that the number of connections to the database is not exceeded.
21    The Integration Service is shutting down and is not accepting new requests.
22    The Integration Service cannot find a unique instance of the workflow/session you specified. Enter the command again with the folder name and workflow name.
23    There is no data available for the request.
24    Out of memory.
25    Command is cancelled.
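As an illustration of how these return codes could drive the job-failure notification described above, the following sketch groups the documented codes into continue/retry/abort decisions. The code values and their meanings are taken from the table above; the grouping itself is an assumed example policy, not the documented WLM configuration.

    # Codes are from the PMCMD RETURN CODES table above; the grouping into
    # transient vs. hard failures is an illustrative assumption.
    SUCCESS = {0}
    TRANSIENT = {1, 9, 12, 20, 21}  # unreachable, timed out, busy, repository or shutdown issues
    HARD_FAILURE = {2, 3, 4, 5, 7, 8, 13, 14, 15, 16, 17,
                    18, 19, 22, 23, 24, 25}

    def classify(return_code):
        """Map a pmcmd return code to a scheduling decision."""
        if return_code in SUCCESS:
            return "continue"  # release dependent Jobs in the Jobset
        if return_code in TRANSIENT:
            return "retry"     # resubmit once the Integration Service recovers
        return "abort"         # hold dependent Jobs and raise a failure notification

    # Example: code 1 (pmcmd cannot reach the Integration Service) is transient.
    print(classify(1))  # prints "retry"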
8. Technical Assumptions

Each assumption should be verified for accuracy. It is the responsibility of the document author(s) to obtain this verification from the appropriate source.

Assumption Table Definitions

Assumption #     Reference number for each assumption.
Identified By    The team member who identified the assumption.
Identified Date  The date each assumption was identified.
Verified By      The resource (team member or other) who verified that the assumption is accurate.
Verified Date    The date each assumption was verified.
Assumption       Detailed description of the assumption being made in this document. If possible, indicate which functional requirements the assumption affects.
Comments         Any comments about the assumption.

Assumption #  Identified By  Identified Date  Verified By  Verified Date  Assumption                                                                                               Comments
1             CIS Team                                                    Initially, the Landing Zone tables will hold one week's data; the load will later be converted to daily.
2             CIS Team                                                    Preprocess Pharmacy LZ tables are used for deriving the member key field.
3             CIS Team                                                    All attributes will be defaulted for incoming null/blank values.
4             CIS Team
5             Claims Team

9. Reference Documents

This section lists all related documents. The SharePoint links to each related document are given below.

Mapping Docs:
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fData%20Models%2fMapping%20Templates&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d
Approach Doc:
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fETL%20Approach&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d

Business System Design Doc:
http://sharepoint.auth.wellpoint.com/sites/EntClinProgRep/Shared%20Documents/Forms/AllItems.aspx?RootFolder=%2fsites%2fEntClinProgRep%2fShared%20Documents%2fTechnical%20Design%20and%20Development%2fDesign%2fBusiness%20System%20Design%20Document%20%28BSD%29&FolderCTID=&View=%7bB9C5506F%2dDD1C%2d4F22%2dA54C%2d8EBB1D5F4FE5%7d
10. Project Team Signoffs / Approvals

SOLUTION DELIVERY APPROVALS

IT Technical Lead: IT Functional Area Lead
Signature:                                   Date:

IT Project Mgr: IT Functional Area PM
Signature:                                   Date:

Solution Architect: IT Functional Area Solution Architect
Signature:                                   Date: