Analyze Table through Informatica
Union Bank of California
The information contained in this document is confidential and proprietary to TATA Consultancy Services. This information may not be disclosed, duplicated or used for any other purposes. The information contained in this document may not be released in whole or in part outside TCS for any purpose without the express written permission of TATA Consultancy Services.
Tata Code of Conduct
We, in our dealings, are self-regulated by a Code of Conduct as enshrined in the Tata Code of Conduct. We request your support in helping us adhere to the Code in letter and spirit. We request that any violation or potential violation of the Code by any person be promptly brought to the notice of the Local Ethics Counselor or the Principal Ethics Counselor or the CEO of TCS. All communication received in this regard will be treated and kept as confidential.
We can use the DBMS_STATS package or the ANALYZE statement to gather
statistics about the physical storage characteristics of a table, index, or cluster.
These statistics are stored in the data dictionary and can be used by the
optimizer to choose the most efficient execution plan for SQL statements
accessing analyzed objects.
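Once a table has been analyzed, the gathered statistics can be inspected in the data dictionary through views such as USER_TABLES. A minimal query (the table name EMP is illustrative):

```sql
-- Inspect the optimizer statistics recorded for a table (EMP is a placeholder)
SELECT num_rows, blocks, avg_row_len, last_analyzed
FROM   user_tables
WHERE  table_name = 'EMP';
```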
Oracle recommends using the more versatile DBMS_STATS package for
gathering optimizer statistics, but you must use the ANALYZE statement to
collect statistics unrelated to the optimizer, such as empty blocks, average
space, and so forth.
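For illustration, both approaches against a hypothetical EMP table. DBMS_STATS gathers the optimizer statistics; columns such as EMPTY_BLOCKS and AVG_SPACE in USER_TABLES are only populated by ANALYZE:

```sql
-- Optimizer statistics via DBMS_STATS (recommended; EMP is a placeholder)
BEGIN
  DBMS_STATS.GATHER_TABLE_STATS(ownname => USER, tabname => 'EMP');
END;
/

-- Non-optimizer storage statistics (empty blocks, average free space)
-- are gathered only by the ANALYZE statement
ANALYZE TABLE emp COMPUTE STATISTICS;
```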
The DBMS_STATS package allows both the gathering of statistics, including
utilizing parallel execution, and the external manipulation of statistics. Statistics
can be stored in tables outside of the data dictionary, where they can be
manipulated without affecting the optimizer. Statistics can be copied between
databases or backup copies can be made.
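The external-statistics workflow described above can be sketched as follows, assuming a user-owned statistics table; the names MY_STATTAB and EMP are illustrative:

```sql
BEGIN
  -- Create a user table to hold statistics outside the data dictionary
  DBMS_STATS.CREATE_STAT_TABLE(ownname => USER, stattab => 'MY_STATTAB');
  -- Copy the current statistics for EMP into it (a backup)
  DBMS_STATS.EXPORT_TABLE_STATS(ownname => USER, tabname => 'EMP', stattab => 'MY_STATTAB');
  -- Later, restore them without regathering
  DBMS_STATS.IMPORT_TABLE_STATS(ownname => USER, tabname => 'EMP', stattab => 'MY_STATTAB');
END;
/
```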
The purpose of this document is to explain how to analyze an Oracle table while processing data using the ETL tool Informatica. Suppose there is a table in the Oracle database which is very large (several GB). As per the business requirement, we have to use a Lookup transformation on this table, perhaps once or a couple of times. This can cause a serious performance problem: the mapping may take a couple of hours to complete, or it may hang forever. One solution to overcome this situation is to analyze the table before executing the specific mapping, which speeds up the execution. In that case, the ANALYZE statement needs to be included in the Pre SQL command at the session level. The biggest challenge is that the developer needs admin rights to execute such a command on the Oracle database, which is usually not granted. The trick here is to create a stored procedure that analyzes the particular table; execute rights on this stored procedure can be obtained very easily.
Create or compile the stored procedure below in the schema in which the table exists.
CREATE OR REPLACE PROCEDURE SCHEMA_NAME.SP_NAME (table_name IN VARCHAR2, cascade_opt IN VARCHAR2)
AS
  ot_cascade BOOLEAN;
BEGIN
  -- Basic input validation (the assumed intent of the original length checks)
  IF LENGTH(table_name) > 32 THEN
    RAISE_APPLICATION_ERROR(-20001, 'Table name too long');
  END IF;
  IF LENGTH(cascade_opt) > 5 THEN
    RAISE_APPLICATION_ERROR(-20002, 'Invalid cascade option');
  END IF;
  ot_cascade := (UPPER(cascade_opt) = 'TRUE');
  DBMS_STATS.GATHER_TABLE_STATS(ownname => 'SCHEMA_NAME', tabname => table_name,
    cascade => ot_cascade, estimate_percent => 33, granularity => 'ALL', degree => 6);
END;
/
SYNONYM AND GRANT
Create a synonym for the stored procedure for the particular user. Get the list of roles for the particular schema from the DBA and grant EXECUTE and DEBUG privileges, for example:
CREATE OR REPLACE SYNONYM USER.SP_NAME FOR SCHEMA_NAME.SP_NAME;
GRANT EXECUTE, DEBUG ON SCHEMA_NAME.SP_NAME TO SCHEMA_ADMIN_ROLE;
CALL STORED PROCEDURE FROM INFORMATICA
Call the stored procedure created above through the Pre SQL command of Informatica (ETL), for example:
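A minimal sketch of the Pre SQL entry, assuming the synonym SP_NAME created above; the table name EMPLOYEE_FACT is illustrative:

```sql
-- Pre SQL at the Informatica session level (EMPLOYEE_FACT is a placeholder)
CALL SP_NAME('EMPLOYEE_FACT', 'TRUE');
```

With this in place, the table statistics are refreshed immediately before the session runs, so the Lookup against the large table uses an up-to-date execution plan.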