This document presents the Oracle ELT Engine, a solution that automates Extract, Load, and Transform (ELT) processing within Oracle databases. The Engine shields ETL tool developers from database complexity: it understands Oracle loading techniques and generates the SQL commands needed to load and manage large datasets efficiently, using partitioning, parallel processing, and index management. It provides logging and the flexibility to support scenarios such as initial loads, incremental loads, and compressed table loads. Customers saw substantial improvements over hand-coded ETL solutions, with one load dropping from over 100 minutes to just 1.6 minutes.
2. Large Dataset Performance Factors for ETL

When ETL jobs that load large datasets perform slowly, there are generally a few common problem areas:
- ETL pulls data out of the database and pushes it back in, both over the same network, so throughput is always limited by network bandwidth, even when run in parallel.
- ETL tools do not truly understand the advanced database techniques Oracle provides for high-speed loading; they rely on developers knowing how to use them and building complex database scripts.
- Loading jobs are inadequately optimized for each scenario (Initial, Incremental Insert-Only, Incremental with Updates).
- Loading techniques do not align with the nature of the tables being loaded: small vs. large size, partitioning, table compression.
- When manual scripts are used, the steps are not optimized; the most common problems are with the handling of partitions, constraints, and indexes.
- Hardware resources are not used adequately by the loading solution; DBAs are frequently blamed when it is usually an application-layer issue.
3. Databases are Designed for ELT Techniques

For decades, database vendors have been developing enhancements and tools to aid in loading, managing, and querying large datasets. ELT uses the power of the database hardware along with these advanced features to load data more quickly than traditional ETL tools:
- The network bottleneck is removed.
- Data movement and loading are fully parallelized.
- Database servers are typically more powerful than ETL servers.
- Advanced commands for partitioning, index and constraint management, and compression are all SQL based.

ELT combines Oracle-specific commands with traditional SQL for transformation:
- All work is done within the database, in the common language of SQL.
- Using SQL for transformation leverages the parallel abilities of modern database systems.
- Nearly identical SQL is already used in ETL loads to extract datasets into the tool; at its simplest, ELT adds an INSERT portion to those extract queries.

ELT techniques rely on the parallel processing power of the database to transform large datasets without involving network bottlenecks or ETL tool limitations.
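The "adds an INSERT portion to the extract queries" point can be sketched as follows (table and column names here are hypothetical, chosen only for illustration). An ETL tool would run the SELECT and pull every row across the network; ELT prepends a direct-path, parallel INSERT so the data never leaves the database:

```sql
-- ETL style: the tool runs this extract and pulls every row over the network
SELECT s.order_id, s.amount, c.region
  FROM stg_orders s
  JOIN dim_customer c ON c.customer_id = s.customer_id;

-- ELT style: the same query with an INSERT portion added.
-- APPEND requests a direct-path load; PARALLEL fans the work out in the database.
INSERT /*+ APPEND PARALLEL(16) */ INTO fact_orders (order_id, amount, region)
SELECT /*+ PARALLEL(16) */ s.order_id, s.amount, c.region
  FROM stg_orders s
  JOIN dim_customer c ON c.customer_id = s.customer_id;
COMMIT;
```

The functional logic (the joins and column mapping) is unchanged; only the destination of the result set moves from the ETL tool back into the database.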
4. Optimized ELT Patterns for Large Data Loads

Here are two optimized Oracle load patterns for the same partitioned table. Optimized loading steps vary greatly by scenario, requiring additional development and QA effort.

Full Load:
1. Truncate Target
2. Disable Constraints
3. Drop All Indexes
4. Insert Data
5. Gather Stats
6. Rebuild All Indexes
7. Re-enable Constraints
8. Rename Partitions

Incremental Load:
1. Truncate Target
2. Disable Constraints
3. Drop Global Indexes
4. Loop through incoming data:
   1. Truncate _Part table
   2. Insert data into _Part
   3. Create new partition if needed
   4. Rename Partition
   5. Exchange Partition
   6. Gather stats on Partition
   7. Rebuild Local Indexes
5. Rebuild Global Indexes
6. Re-enable Constraints
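A minimal sketch of the Full Load pattern above, with hypothetical table, index, and constraint names (a real engine would generate these statements from the Oracle catalog rather than hard-code them):

```sql
-- Steps 1-3: clear the target and take index/constraint maintenance off the load path
TRUNCATE TABLE edw.finplan_f;
ALTER TABLE edw.finplan_f DISABLE CONSTRAINT finplan_f_pk;
DROP INDEX edw.finplan_f_pk_ix;

-- Step 4: direct-path, parallel insert of the full dataset
INSERT /*+ APPEND PARALLEL(16) */ INTO edw.finplan_f
SELECT /*+ PARALLEL(16) */ * FROM ebi_elt.finplan_wrk;
COMMIT;

-- Step 5: refresh optimizer statistics on the freshly loaded table
BEGIN
  dbms_stats.gather_table_stats(ownname => 'EDW', tabname => 'FINPLAN_F', degree => 16);
END;
/

-- Steps 6-7: rebuild indexes and re-validate constraints after the data is in place
CREATE UNIQUE INDEX edw.finplan_f_pk_ix ON edw.finplan_f (finplan_sk) PARALLEL 16;
ALTER TABLE edw.finplan_f ENABLE CONSTRAINT finplan_f_pk;
```

Loading into an index-free, constraint-free segment and rebuilding afterward is typically far faster than maintaining indexes row by row during the insert.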
5. Automation – Key to Development Success

Remove the need for ETL tool developers to know how to use advanced Oracle features correctly:
- Tool developers are not database experts.
- Common code modules reduce development time substantially.
- Common libraries can be tested independently, reducing QA for each functional load.
- Enhancements, new Oracle version features, and bug fixes are shared across your code.

Intelligent automation (choosing which technique to use) increases that power further:
- Removes the developer from the technical decision-making process.
- Removes the need for the developer to build multiple load techniques.
- Ensures the optimal technique is used regardless of situation.
- Ensures the same set of techniques is used across all loads.

However, as more steps are done in the database, a new way to manage those commands becomes necessary; ETL tool logging cannot help.

Automation of loading steps into shared functions is key to efficient loading and timely delivery.
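One way to picture such a shared module (a hypothetical sketch, not the Engine's actual API): a single PL/SQL procedure that drops a table's indexes while saving their full DDL from the Oracle catalog, so that no individual load job carries that logic and the indexes can later be recreated exactly as they were. The `elt_saved_ddl` logging table is an assumed structure for this example:

```sql
-- Hypothetical shared module: drop a table's indexes, preserving their DDL.
-- Any load job can call this instead of hard-coding DROP INDEX statements.
CREATE OR REPLACE PROCEDURE drop_indexes_logged (p_owner VARCHAR2, p_table VARCHAR2) AS
BEGIN
  FOR ix IN (SELECT owner, index_name
               FROM all_indexes
              WHERE table_owner = p_owner
                AND table_name  = p_table) LOOP
    -- DBMS_METADATA.GET_DDL returns the complete CREATE INDEX statement,
    -- including tablespace and storage clauses, for a faithful rebuild later
    INSERT INTO elt_saved_ddl (owner, object_name, ddl_text)
    VALUES (ix.owner, ix.index_name,
            DBMS_METADATA.GET_DDL('INDEX', ix.index_name, ix.owner));

    EXECUTE IMMEDIATE 'DROP INDEX "' || ix.owner || '"."' || ix.index_name || '"';
  END LOOP;
  COMMIT;
END;
/
```

Testing and fixing this procedure once benefits every load that calls it, which is the point of the slide.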
6. Oracle ELT Solution

The Oracle ELT Automation Engine performs these tasks and more:
- Removes all of the Oracle database complexity from developers.
- Is designed to plug into your existing ETL tool, such as Informatica or DataStage.
- Is highly integrated with the Oracle catalog; it understands your table structure and how to handle it dynamically. Changing columns, constraints, partitioning, compression, or indexes is a zero-code-change operation.
- Automates nearly all of what the Oracle DAC does, and then some.
- Is fully data driven by additional metadata, the Oracle catalog, and the incoming dataset.
- Offers several distinct and optimized load patterns:
  - Initial insert load, any table
  - Regular table incremental
  - Interval-partitioned table load, using either insert-only or Partition Exchange based on the data
  - SCD Type II load
  - Compressed table loads
  - Database link optimizations
  - Fact/aggregate optimized loads
  - Aggregate full-refresh load
- Contains extensive logging, tracking, and debugging information not possible in an ETL tool.
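The catalog integration above can be illustrated by the kind of data dictionary query such an engine would issue at run time (a sketch; the deck does not show the Engine's actual queries, and the owner/table names are examples):

```sql
-- Discover a table's compression, partitioning, indexes, and key constraints
-- at run time, so load steps can be generated instead of hand-coded per table.
SELECT t.table_name,
       t.compression,                          -- ENABLED / DISABLED
       pt.partitioning_type,                   -- RANGE, INTERVAL, LIST, ...
       (SELECT COUNT(*)
          FROM all_indexes i
         WHERE i.table_owner = t.owner
           AND i.table_name  = t.table_name)   AS index_count,
       (SELECT COUNT(*)
          FROM all_constraints c
         WHERE c.owner = t.owner
           AND c.table_name = t.table_name
           AND c.constraint_type IN ('P','U')) AS key_constraint_count
  FROM all_tables t
  LEFT JOIN all_part_tables pt
    ON pt.owner = t.owner AND pt.table_name = t.table_name
 WHERE t.owner = 'EDW'
   AND t.table_name = 'FINPLAN_F';
```

Because the structure is read from the catalog on every run, adding a column or an index changes the generated commands without any change to the load code.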
7. This simple Procedure Call Auto-Generates these Commands:
EBI_ELT: TRUNCATE TABLE EBI_ELT.FINPLAN_WRK
INSERT /*+ PARALLEL (32) */ INTO FINPLAN_WRK ( PER_HIERARCHY_ID,
EBI_ELT: Begin dbms_stats.gather_table_stats (ownname => …
SELECT /*+ PARALLEL (32) */ COUNT(*) FROM FINPLAN_WRK
UPDATE /*+ PARALLEL (32) */ FINPLAN_WRK SET DW_DUPLICATE_FLG = 'Y'
TRUNCATE TABLE EDW.FINPLAN_F
ALTER TABLE EDW.FINPLAN_F DISABLE CONSTRAINT PLAN_F_PK
ALTER TABLE EDW.FINPLAN_F DISABLE CONSTRAINT PLAN_F_UK
DROP INDEX FINPLAN_F_PK
DROP INDEX FINPLAN_F_UK
DROP INDEX BMP_PLAN_F
DROP INDEX BMPIDX01
INSERT /*+ APPEND PARALLEL (32) */ INTO EDW.FINPLAN_F ( SK,
Begin dbms_stats.gather_table_stats (ownname => 'EDW', …
CREATE UNIQUE INDEX EDW.FINPLAN_F_PK
CREATE UNIQUE INDEX EDW.FINPLAN_F_UK
CREATE BITMAP INDEX EDW.BMP_PLAN_F
CREATE BITMAP INDEX EDW.BMPIDX01
ALTER TABLE EDW.FINPLAN_F ENABLE CONSTRAINT PLAN_F_PK
ALTER TABLE EDW.FINPLAN_F ENABLE CONSTRAINT PLAN_F_UK
ALTER TABLE EDW.FINPLAN_F RENAME PARTITION F_201013 TO FINPLAN_F_201012 (& 15 more periods…)
ALTER INDEX EDW.BMPIDX01 RENAME PARTITION F_201013 TO BMPIDX01_201012 (& 15 more periods…)
ALTER INDEX EDW.BMP_PLAN_F RENAME PARTITION F_201013 TO BMP_PLAN_F_201012 (& 15 more periods…)
Development & Results: Customer Example #1
ELT.BuildWorkTableAndPublish('FINPLAN_WRK', 'FINPLAN_F',
'INSERT INTO FINPLAN_WRK (
PER_HIERARCHY_ID, PLAN_TYP_ID,
ACCT_SCENARIO_DTL_SK, ORG_HIERARCHY_ID,
SECT_ID, FAC_ID,
BASE_CRNCY_CD, PLAN_AMT,
PLAN_QTD_AMT, PLAN_YTD_AMT,
SUBJ_AREA_CD)
SELECT Distinct
PHD.PER_HIERARCHY_ID AS PER_HIERARCHY_ID,
FPP.FINANCE_PLAN_TYP_ID AS PLAN_TYP_ID,
ASD.ACCT_SCENARIO_DTL_SK AS ACCT_SCENARIO_DTL_SK,
AUH.GL_ACCT_UNIT_ID AS ORG_HIERARCHY_ID,
SD.SECT_ID AS SECT_ID,
FPP.FAC_ID AS FAC_ID,
''USD'' AS BASE_CRNCY_CD,
FPP.PLAN_AMT,
sum(PLAN_AMT) over (partition by
FINANCE_PLAN_TYP_ID,FPP.FAC_ID,
FPP.DEPT_ID,FPP.SECT_ID,
ACCT_SCENARIO_DTL_SK, PHD.YR_NBR ,
PHD.QTR_NBR
ORDER BY PERIOD_ID,PROFIT_LOSS_LINE_CD) as PLAN_QTD_AMT,
SUM(PLAN_AMT) over (partition by
FINANCE_PLAN_TYP_ID,FPP.FAC_ID,
FPP.DEPT_ID,FPP.SECT_ID,
ACCT_SCENARIO_DTL_SK, PHD.YR_NBR
ORDER BY PERIOD_ID ,PROFIT_LOSS_LINE_CD) as PLAN_YTD_AMT,
''Finance'' SUBJ_AREA_CD
FROM FINANCE_PLAN_PERIOD FPP
JOIN PERIOD_HIERARCHY_D PHD
ON FPP.PERIOD_ID = PHD.PER_HIERARCHY_ID
JOIN ACCT_SCENARIO_DTL_SCD ASD
ON LTRIM(FPP.PROFIT_LOSS_LINE_CD,0)=UPPER( ASD.SEQ_NBR_TXT)
AND ACCT_SCENARIO_ID in (6,7)
AND ASD.CUR_VER_IND = 1
JOIN ACCOUNT_UNIT_HIERARCHY AUH
ON FPP.FAC_ID = AUH.FAC_ID
AND FPP.DEPT_ID = AUH.DEPT_ID
AND FPP.SECT_ID = AUH.SECT_ID
AND AUH.COMPANY_NBR = 1101
JOIN SECTION_D SD
ON AUH.SECT_NBR=SD.SECT_NBR
AND AUH.DEPT_NBR=SD.DEPT_NBR
WHERE NOT ( AUH.DEPT_NBR = 00339 AND ACCT_SCENARIO_ID = 7 )');
The procedure call contains only functional logic, identical to what was in the ETL extract.
The Engine generated 121 detailed logging commands for even this simplest scenario.
Setup and migration took only 20 minutes.
8. Real Customer Example #2 (1/4)

The screenshot shows the actual code and metadata; this is all that is required.

Development process:
1. Build the INSERT INTO … SELECT FROM SQL
2. Build the target and work tables in the database, with indexes, constraints, and SKs
3. Create a new Process record in the FW metadata as shown above
4. Run
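Step 3 amounts to a single row of metadata. The table and column names below are hypothetical (the deck shows this only as a screenshot), but the shape of the record is the point: the framework is driven entirely by data, not by generated code:

```sql
-- Hypothetical FW metadata record; column names are illustrative only.
-- The load SQL is stored as data and picked up by the Engine at run time.
INSERT INTO elt_process_metadata (process_name, work_table, target_table, load_sql)
VALUES ('FINPLAN_LOAD',
        'FINPLAN_WRK',
        'FINPLAN_F',
        'INSERT INTO FINPLAN_WRK (...) SELECT ... FROM FINANCE_PLAN_PERIOD ...');
COMMIT;
```

Registering a new load is then a data change, not a deployment.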
9. Real Customer Example #2 (2/4)

Monitor all executions of the Process, both incrementals and full loads, and review the summary results for each run in a single location:
10. Real Customer Example #2 (3/4)

Monitor the details of the FW and the commands it generates.
11. Real Customer Example #2 (4/4)

Monitor and track each load process in context with others. Results are tracked automatically.
12. ELT Engine Performance Benchmarks

Here are the results of the prior logic operating on 39M records in a 4-node Exadata environment split over a DBLink (Source → DBLink → Target):

1. Customer initial coded solution (DataStage & SQL MERGE): 100+ mins
2. Customer hand-reworked DataStage & ELT with partition management: 12 mins
3. Engine – Full Load (all local): 1.6 mins
4. Engine – ELT over DBLink *: 3.3 mins
5. Engine – 39M Updates (all local): 4.0 mins

* The DBLink is a substantial performance bottleneck; the customer is working to consolidate. Numbers shown for reference.

- The reworked customer result (#2) hit the network as its bottleneck; comparing #2 with #3 is the fairest comparison of the network impact.
- Moreover, the ELT Engine ran through a total of 6 distinct scenarios: 3 load-optimization scenarios, each run locally and over a DBLink.
- Note the performance of scenario #5, all updates: the ELT Engine processes very slow UPDATEs as INSERTs for highest performance.
- Zero code changes or code paths were required to run all 6 scenarios. The same PublishTable() command was used throughout; the Engine "figured it out" and chose the optimal plan and optimizations.
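The "UPDATEs as INSERTs" optimization behind scenario #5 can be sketched as follows (hypothetical names; the deck does not show the Engine's generated SQL for this case). Rather than touching 39M rows in place, the changed values are folded into a full, direct-path rewrite of the data into a fresh segment:

```sql
-- Slow form: an UPDATE touching millions of existing rows via the key
-- UPDATE edw.finplan_f f
--    SET f.plan_amt = (SELECT w.plan_amt FROM finplan_wrk w WHERE w.sk = f.sk)
--  WHERE EXISTS (SELECT 1 FROM finplan_wrk w WHERE w.sk = f.sk);

-- Fast form: rewrite the table as a parallel, direct-path insert,
-- taking the new value where one exists and the old value otherwise
INSERT /*+ APPEND PARALLEL(16) */ INTO edw.finplan_f_new (sk, plan_amt)
SELECT f.sk,
       NVL(w.plan_amt, f.plan_amt) AS plan_amt   -- updated rows win
  FROM edw.finplan_f f
  LEFT JOIN finplan_wrk w ON w.sk = f.sk;
COMMIT;
-- ...then swap finplan_f_new into place of finplan_f (rename or partition exchange)
```

Direct-path inserts avoid the per-row undo/redo and index maintenance cost that makes large UPDATEs so slow.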
13. Force Multiplier

By encapsulating advanced Oracle logic in intelligent functions, the developer did not need to know how to do, or write code for, any of the following:
- Index and constraint management commands
- Handling the partitioning scheme
- How to do a Partition Exchange
- When to issue stats commands (a common problem) and what options to use
- How to determine and log record counts
- How to recover from an error in the middle of the load
- How to handle a remote database
- What any of the Oracle data dictionary tables are
- Determining which scenario optimizations are needed, or coding for them with multiple paths
- How to generate a new SK
- How to set and update metadata columns
- How to deduplicate the incoming record set
- Dealing with (or coding) the complexity of the MERGE statement, reduced to a simple INSERT … SELECT
- What the date range of incoming data was, to set up an optimized partition loop
- How to load a compressed table properly
- What the DDL details (tablespace, storage, etc.) are for indexes that were dropped and had to be re-created
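As one concrete instance from the list above, a Partition Exchange (hypothetical names again) swaps a fully loaded work table into the target partition as a data dictionary operation, which is why the incremental load pattern on slide 4 loops over a _Part table instead of inserting into the target directly:

```sql
-- Load one period's rows into a standalone work table shaped like one partition
TRUNCATE TABLE edw.finplan_part;
INSERT /*+ APPEND PARALLEL(16) */ INTO edw.finplan_part
SELECT * FROM finplan_wrk WHERE period_id = 201012;
COMMIT;

-- Gather stats on the work table before the swap, so the target partition
-- carries accurate statistics the moment it goes live
BEGIN
  dbms_stats.gather_table_stats(ownname => 'EDW', tabname => 'FINPLAN_PART');
END;
/

-- Swap the loaded segment into the target partition: a metadata update, not a data copy
ALTER TABLE edw.finplan_f
  EXCHANGE PARTITION finplan_f_201012 WITH TABLE edw.finplan_part
  INCLUDING INDEXES WITHOUT VALIDATION;
```

Because the exchange only updates the dictionary, it completes in seconds regardless of how many rows the partition holds.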
14. Oracle ELT Engine

- Can be up and running, with adaptations to your system, in a few short weeks.
- Removes all of the Oracle database complexity from inexperienced developers:
  - Developers can focus 100% on functional business rules and transformation.
  - Significantly reduces development time; supports Agile approaches.
- Can be added to your solution incrementally:
  - Start with the publishing aspect that deals with the target tables and their complexities.
  - Introduce more ELT by using the Engine to perform an entire loading job, from source to target.
  - Migrate entire groups of loads into the Engine.
- Is PL/SQL based, making the skillset for support commonplace; bugs and enhancements can be done in a day or two instead of weeks.
- Will improve your logging, debugging, and Production Support capabilities; this was a key emphasis of its design.
- Improved performance allows you to meet SLAs or increase load frequency.

The Oracle ELT Automation Engine is a jumpstart to bringing automated ELT capability and its benefits to your ETL system.