INFORMATICA POWERCENTER / POWERMART DESIGNER
DESIGNER WORKSPACE
DESIGNER TOOLS
SOURCE ANALYZER
SOURCE ANALYZER – IMPORTING RELATIONAL SOURCE DEFINITIONS
SOURCE ANALYZER – FLAT FILE SOURCES
WAREHOUSE DESIGNER
WAREHOUSE DESIGNER – CREATE/EDIT TARGET DEFINITIONS
TRANSFORMATIONS
TRANSFORMATIONS – PORT DEFAULT VALUES
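Port default values control how the server handles NULL inputs and transformation errors. An illustrative setup on an Expression transformation (port names hypothetical):

```
-- Input port default replaces NULL input values;
-- an ERROR() default on an output port skips the row and logs it
Input port  PRICE       default value: 0
Output port O_DISCOUNT  default value: ERROR('invalid discount calculation')
```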
AGGREGATOR TRANSFORMATION
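An Aggregator computes aggregate expressions over rows sharing the same group-by ports. A sketch, with hypothetical port names:

```
-- output port TOTAL_REVENUE, in an Aggregator grouped by STORE_ID
SUM(QUANTITY * PRICE)

-- aggregate functions accept an optional filter condition
COUNT(ORDER_ID, STATUS = 'RETURNED')
```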
EXPRESSION TRANSFORMATION
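The Expression transformation derives new values row by row. Two illustrative output-port expressions (names assumed):

```
-- FULL_NAME: trim and concatenate name parts
LTRIM(RTRIM(FIRST_NAME)) || ' ' || LTRIM(RTRIM(LAST_NAME))

-- ORDER_DT: parse a string port into a date
TO_DATE(ORDER_DT_STR, 'YYYY-MM-DD')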
FILTER TRANSFORMATION
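A Filter passes only rows for which its single filter condition evaluates to TRUE; all other rows are dropped. For example (ports hypothetical):

```
-- keep current California rows only
STATE = 'CA' AND ISNULL(END_DATE)
```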
JOINER TRANSFORMATION
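A Joiner is configured with a master/detail pairing, a join condition, and a join type rather than a free-form expression. An illustrative configuration (source and port names assumed):

```
-- CUSTOMERS designated as master, ORDERS as detail
Join condition:  CUST_ID_MASTER = CUST_ID_DETAIL
Join type:       Normal   (alternatives: Master Outer, Detail Outer, Full Outer)
```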
LOOKUP TRANSFORMATION
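A connected Lookup sits in the pipeline, while an unconnected Lookup is invoked from an expression with the `:LKP` reference qualifier and returns its designated return port. A sketch of an unconnected call (the transformation name and ports are hypothetical):

```
-- called from an Expression output port
:LKP.LKP_GET_RATE(CURRENCY_CODE, ORDER_DT)
```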
RANK TRANSFORMATION
ROUTER TRANSFORMATION
COMPARING ROUTER & FILTER TRANSFORMATIONS
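Unlike a Filter, a Router evaluates the row once against several group filter conditions and can route it to multiple output groups; rows matching no group go to the default group. Illustrative group conditions (names assumed):

```
Group HIGH_VALUE:  ORDER_TOTAL >= 1000
Group LOW_VALUE:   ORDER_TOTAL <  1000
-- non-matching rows fall through to the DEFAULT group
```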
SEQUENCE GENERATOR TRANSFORMATION
SOURCE QUALIFIER TRANSFORMATION
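The Source Qualifier can replace its generated SELECT with a user-defined SQL override. A hypothetical override, assuming two relational sources joined in one qualifier:

```
-- column order must match the Source Qualifier's connected ports
SELECT C.CUST_ID, C.CUST_NAME, O.ORDER_TOTAL
FROM   CUSTOMERS C, ORDERS O
WHERE  C.CUST_ID = O.CUST_ID
```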
STORED PROCEDURE TRANSFORMATION
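An unconnected Stored Procedure transformation is invoked from an expression with the `:SP` reference qualifier; the `PROC_RESULT` keyword captures the procedure's return value. A sketch (procedure name hypothetical):

```
-- called from an Expression output port
:SP.GET_CREDIT_LIMIT(CUST_ID, PROC_RESULT)
```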
UPDATE STRATEGY TRANSFORMATION
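The Update Strategy flags each row for the target operation using the constants DD_INSERT (0), DD_UPDATE (1), DD_DELETE (2), and DD_REJECT (3). A common pattern, assuming a lookup port that is NULL when the row is new:

```
-- insert new rows, update existing ones
IIF(ISNULL(LKP_CUST_ID), DD_INSERT, DD_UPDATE)
```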
TRANSFORMATION LANGUAGE
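The transformation language's workhorse conditionals are IIF and DECODE. Two illustrative expressions (port names assumed):

```
IIF(SALES > 100000, 'HIGH', 'LOW')

DECODE(REGION_CODE,
       'N', 'North',
       'S', 'South',
       'Unknown')      -- the final argument is the default result
```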
MAPPING
MAPPING - VALIDATION
MAPPING WIZARD
MAPPING PARAMETERS
MAPPING VARIABLES
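Mapping parameters and variables are referenced with the `$$` prefix: a parameter holds one value for the whole session run, while a variable can change during the run via functions such as SETVARIABLE and SETMAXVARIABLE. A sketch of an incremental-load pattern (names hypothetical):

```
-- Source Qualifier / Filter condition for incremental extraction
LOAD_DT > $$LAST_LOAD_DT

-- Expression port advancing the variable to the highest date seen
SETMAXVARIABLE($$LAST_LOAD_DT, LOAD_DT)
```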
DEBUGGER
MAPPLET
SAMPLE MAPPLET IN A MAPPING
BUSINESS COMPONENTS
CUBES AND DIMENSIONS

Hyperautomation and AI/ML: A Strategy for Digital Transformation Success.pdf
 
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
Transcript: New from BookNet Canada for 2024: Loan Stars - Tech Forum 2024
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding Club
 
From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .From Family Reminiscence to Scholarly Archive .
From Family Reminiscence to Scholarly Archive .
 
Dev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio WebDev Dives: Streamline document processing with UiPath Studio Web
Dev Dives: Streamline document processing with UiPath Studio Web
 
Generative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information DevelopersGenerative AI for Technical Writer or Information Developers
Generative AI for Technical Writer or Information Developers
 
Advanced Computer Architecture – An Introduction
Advanced Computer Architecture – An IntroductionAdvanced Computer Architecture – An Introduction
Advanced Computer Architecture – An Introduction
 

Informatica Designer Module

Editor's Notes

  1. You can import relational source definitions from database tables, views, and synonyms. When you import a source definition, you import the above mentioned source metadata. To import a source definition, you must be able to connect to the source database from the client machine using a properly configured ODBC data source or gateway. You may also require read permission on the database object.
  2. To import a source definition: In the Source Analyzer, choose Sources-Import from Database. Select the ODBC data source used to connect to the source database. If you need to create or modify an ODBC data source, click the Browse button to open the ODBC Administrator. Create the appropriate data source and click OK. Select the new ODBC data source. Enter a database username and password to connect to the database. Note: The username must have the appropriate database permissions to view the object. You may need to specify the owner name for database objects you want to use as sources. Click Connect. If no table names appear or if the table you want to import does not appear, click All. Drill down through the list of sources to find the source you want to import. Select the relational object or objects you want to import. You can hold down the Shift key to select blocks of record sources within one folder, or hold down the Ctrl key to make non-consecutive selections within a folder. You can also select all tables within a folder by selecting the folder and clicking the Select All button. Use the Select None button to clear all highlighted selections.
  3. When you create a flat file source definition, you must define the properties of the file. The Source Analyzer provides a Flat File Wizard to prompt you for the above mentioned file properties. You can import fixed-width and delimited flat file source definitions that do not contain binary data. When importing the definition, the source file must be in a directory local to the client machine. In addition, the Informatica Server must be able to access all source files during the session.
  4. You can create the overall relationship, called a schema, as well as the target definitions, through wizards in the Designer. The Cubes and Dimensions Wizards follow common principles of data warehouse design to simplify the process of designing related targets.
  5. Some changes to target definitions can invalidate mappings. If the changes invalidate the mapping, you must open and edit the mapping. If the invalidated mapping is used in a session, you must validate the session. You can preview the data of relational target definitions in the Designer. This feature saves you time because you can browse the target data before you run a session or build a mapping. Edit target definitions to add comments or key relationships, or update them to reflect changes in the underlying targets. When you change target definitions, the Designer propagates the changes to any mapping using that target.
  6. Database location - You specify the database location when you import a relational source. You can specify a different location when you configure a session.
Column names - After importing a relational target definition, you can enter table and column business names, and manually define key relationships.
Datatypes - The Designer imports the native datatype for each column.
Key constraints - The constraints in the target definition can be critical, since they may prevent you from moving data into the target if the Informatica Server violates a constraint during a session.
Key Relationships - You can customize the Warehouse Designer to automatically create primary-foreign key relationships.
  7. You can specify default values with a constant data value, a constant expression, or built-in functions with constant parameters.
  8. The Informatica Server performs aggregate calculations as it reads, and stores the necessary group and row data in an aggregate cache.
Aggregate expression - Entered in an output port; can include non-aggregate expressions and conditional clauses.
Group by port - Indicates how to create groups; can be any input, input/output, output, or variable port.
Sorted Input option - Use to improve session performance. To use Sorted Input, you must pass data to the Aggregator transformation sorted by the group by ports, in ascending or descending order.
Aggregate cache - The Aggregator stores data in the aggregate cache until it completes its aggregate calculations. It stores group values in an index cache and row data in a data cache.
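The two-cache behavior this note describes can be sketched in Python. This is only an illustrative analogy; the dict-based caches, function name, and field names are assumptions, not Informatica internals:

```python
def aggregate(rows, group_by, agg_col):
    """Sum agg_col per group, caching group and row data until all input is read."""
    index_cache = {}   # group values (plays the role of the index cache)
    data_cache = {}    # accumulated row data per group (the data cache)
    for row in rows:
        key = tuple(row[c] for c in group_by)
        index_cache.setdefault(key, len(index_cache))          # register the group
        data_cache.setdefault(key, []).append(row[agg_col])    # stash row data
    # aggregated rows are emitted only after the last input row is read
    return {key: sum(values) for key, values in data_cache.items()}

sales = [
    {"region": "EAST", "amount": 100},
    {"region": "WEST", "amount": 50},
    {"region": "EAST", "amount": 25},
]
totals = aggregate(sales, ["region"], "amount")
```

Note that nothing can be emitted until the whole input is consumed, which is why sorted input (letting each group close as soon as its last row arrives) improves performance.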
  9. You can configure ports in an Aggregator transformation in the ways listed above. Use variable ports for local variables. Create connections to other transformations as you enter an expression. When using the transformation language to create aggregate expressions, you can use conditional clauses to filter records, providing more flexibility than the SQL language. After you create a session that includes an Aggregator transformation, you can enable the session option, Incremental Aggregation. When the Informatica Server performs incremental aggregation, it passes new source data through the mapping and uses historical cache data to perform new aggregation calculations incrementally.
  10. You can enter multiple expressions in a single Expression transformation. As long as you enter only one expression for each output port, you can create any number of output ports in the transformation. In this way, you can use one Expression transformation rather than creating separate transformations for each calculation that requires the same set of data.
  11. As an active transformation, the Filter transformation may change the number of rows passed through it. A filter condition returns TRUE or FALSE for each row that passes through the transformation, depending on whether a row meets the specified condition. Only rows that return TRUE pass through this transformation. Discarded rows do not appear in the session log or reject files. To maximize session performance, include the Filter transformation as close to the sources in the mapping as possible. Rather than passing rows you plan to discard through the mapping, you then filter out unwanted data early in the flow of data from sources to targets. You cannot concatenate ports from more than one transformation into the Filter transformation. The input ports for the filter must come from a single transformation. The Filter transformation does not allow setting output default values.
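The pass/drop behavior of a filter condition can be sketched in Python (an illustrative analogy, with hypothetical field names, not the actual engine):

```python
def filter_rows(rows, condition):
    # Only rows for which the condition evaluates TRUE pass through;
    # discarded rows are silently dropped (no session log or reject file entry).
    return [row for row in rows if condition(row)]

orders = [{"qty": 5}, {"qty": 0}, {"qty": 12}]
kept = filter_rows(orders, lambda r: r["qty"] > 0)
```

Placing this step as early as possible mirrors the advice above: rows dropped here never incur the cost of the downstream transformations.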
  12. Allows you to join sources that contain binary data. To join more than two sources in a mapping, add additional Joiner transformations. An input transformation is any transformation connected to the input ports of the current transformation. Specify one of the sources as the master source, and the other as the detail source. This is specified on the Properties tab in the transformation by clicking the M column. When you add the ports of a transformation to a Joiner transformation, the ports from the first source are automatically set as detail sources. Adding the ports from the second transformation automatically sets them as master sources. The master/detail relation determines how the join treats data from those sources based on the type of join. For example, you might want to join a flat file with in-house customer IDs and a relational database table that contains user-defined customer IDs. You could import the flat file into a temporary database table, then perform the join in the database. However, if you use the Joiner transformation, there is no need to import or create temporary tables.
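A minimal sketch of the master/detail idea, assuming a normal (inner) join and hypothetical field names (the flat-file/database example from the note):

```python
def joiner(master, detail, key):
    """Cache the master source, then stream detail rows and emit matches."""
    master_index = {}
    for row in master:                      # master rows are cached first
        master_index.setdefault(row[key], []).append(row)
    joined = []
    for d in detail:                        # detail rows stream through
        for m in master_index.get(d[key], []):
            joined.append({**m, **d})       # normal join: matches only
    return joined

flat = [{"cust_id": 1, "name": "Acme"}, {"cust_id": 2, "name": "Zia"}]
db = [{"cust_id": 1, "city": "Pune"}]
joined = joiner(master=db, detail=flat, key="cust_id")
```

This is why the smaller source is usually made the master: it is the side that gets cached.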
  13. There are some limitations on the data flows you can connect to the Joiner transformation.
  14. You can configure the Lookup transformation to be connected or unconnected, and cached or uncached.
  15. Connected and unconnected transformations receive input and send output in different ways. Sometimes you can improve session performance by caching the lookup table. If you cache the lookup table, you can choose to use a dynamic or static cache. By default, the lookup cache remains static and does not change during the session. With a dynamic cache, the Informatica Server inserts rows into the cache during the session. Informatica recommends that you cache the target table as the lookup. This enables you to look up values in the target and insert them if they do not exist.
  16. Lookup SQL Override - Overrides the default SQL statement used to query the lookup table.
Lookup Table Name - Specifies the name of the table from which the transformation looks up and caches values.
Lookup Caching Enabled - Indicates whether the Lookup transformation caches lookup values during the session.
Lookup Condition - Displays the lookup condition you set on the Condition tab.
Location Information - Specifies the database containing the lookup table.
Lookup Policy on Multiple Match - Determines what happens when the Lookup transformation finds multiple rows that match the lookup condition.
You can import a lookup table from the mapping source or target database, or from any database that both the Informatica Server and the Client machine can connect to. If your mapping includes heterogeneous joins, you can use any of the mapping sources or mapping targets as the lookup table. The lookup table can be a single table, or you can join multiple tables in the same database using a lookup query override. The Informatica Server queries the lookup table, or an in-memory cache of the table, for all rows entering the Lookup transformation. Connect to the database to import the lookup table definition. The Informatica Server can connect to a lookup table using a native database driver or an ODBC driver; however, the native database drivers give better session performance.
  17. The Informatica Server builds the cache when it processes the first lookup request. It queries the cache based on the lookup condition for each row that passes into the transformation. When the Informatica Server receives a new row (a row that is not in the cache), it inserts the row into the cache. You can configure the transformation to insert rows into the cache based on input ports or generated sequence IDs. The Informatica Server flags the row as new. When the Informatica Server receives an existing row (a row that is in the cache), it flags the row as existing. The Informatica Server does not insert the row into the cache. Use a Router or Filter transformation with the dynamic Lookup transformation to route new rows to the cached target table. You can route existing rows to another target table, or you can drop them. When you partition a source that uses a dynamic lookup cache, the Informatica Server creates one memory cache and one disk cache for each transformation.
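The new-vs-existing flagging of a dynamic cache can be sketched as follows. The 1/0 flag loosely mirrors the new-row flag described above, but the code is only an analogy with hypothetical names:

```python
def dynamic_lookup(rows, cache, key):
    """Flag each row new (1) or existing (0); insert new keys into the cache."""
    out = []
    for row in rows:
        k = row[key]
        if k in cache:
            out.append((row, 0))   # existing row: cache is left unchanged
        else:
            cache[k] = row         # new row: inserted into the cache
            out.append((row, 1))
    return out

cache = {}
flagged = dynamic_lookup([{"id": 1}, {"id": 2}, {"id": 1}], cache, "id")
```

A downstream Router/Filter step would then route the rows flagged 1 to the cached target table, as the note suggests.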
  18. If you call the unconnected Lookup from an update strategy or filter expression, you are generally checking for null values. In this case, the return port can be anything. If, however, you call the Lookup from an expression performing a calculation, the return value needs to be the value you want to include in the calculation.
  19. The Rank transformation differs from the transformation functions MAX and MIN, in that it allows you to select a group of top or bottom values, not just one value. For example, you can use Rank to select the top 10 salespersons in a given territory. Or, to generate a financial report, you might also use a Rank transformation to identify the three departments with the lowest expenses in salaries and overhead. While the SQL language provides many functions designed to handle groups of data, identifying top or bottom strata within a set of rows is not possible using standard SQL functions. The Rank transformation also allows you to create local variables and write non-aggregate expressions.
  20. During a session, the Informatica Server compares an input row with rows in the data cache. If the input row out-ranks a stored row, the Informatica Server replaces the stored row with the input row. If the Rank transformation is configured to rank across multiple groups, the Informatica Server ranks incrementally for each group it finds.
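The replace-if-out-ranked behavior can be sketched with a small heap per group. This is illustrative only; the group/rank field names are hypothetical:

```python
import heapq

def top_n(rows, group_by, rank_col, n):
    """Keep the top-n rank_col values per group, replacing out-ranked rows."""
    caches = {}
    for row in rows:
        heap = caches.setdefault(row[group_by], [])
        if len(heap) < n:
            heapq.heappush(heap, row[rank_col])
        elif row[rank_col] > heap[0]:
            # input row out-ranks the lowest cached row: replace it
            heapq.heapreplace(heap, row[rank_col])
    return {k: sorted(h, reverse=True) for k, h in caches.items()}

sales = [
    {"territory": "N", "amount": 10},
    {"territory": "N", "amount": 30},
    {"territory": "N", "amount": 20},
    {"territory": "S", "amount": 5},
]
best = top_n(sales, "territory", "amount", 2)
```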
  21. Variable ports cannot be input or output ports. They pass data within the transformation only. You can designate only one Rank port in a Rank transformation. The Rank port is an input/output port. You must link the Rank port to another transformation.
  22. The Router transformation is more efficient when you design a mapping and when you run a session. For example, to test data based on three conditions, you only need one Router transformation instead of three Filter transformations to perform this task. Likewise, when you use a Router transformation in a mapping, the Informatica Server processes the incoming data only once. When you use multiple Filter transformations in a mapping, the Informatica Server processes the incoming data for each transformation.
  23. You create a user-defined group to test a condition based on incoming data. A user-defined group consists of output ports and a group filter condition. The Designer allows you to create and edit user-defined groups on the Groups tab. Create one user-defined group for each condition that you want to specify.
  24. Zero (0) is the equivalent of FALSE, and any non-zero value is the equivalent of TRUE In some cases, you might want to test data based on one or more group filter conditions. For example, you have customers from nine different countries, and you want to perform different calculations on the data from only three countries. You might want to use a Router transformation in a mapping to filter this data to three different Expression transformations. There is no group filter condition associated with the default group. However, you can create an Expression transformation to perform a calculation based on the data from the other six countries.
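User-defined groups plus a default group can be sketched in Python. In this sketch, a row reaches every group whose condition it meets, and rows matching no group fall through to the default; group names and fields are hypothetical:

```python
def route(rows, groups):
    """groups: ordered {name: condition}; unmatched rows go to DEFAULT."""
    out = {name: [] for name in groups}
    out["DEFAULT"] = []
    for row in rows:
        matched = False
        for name, cond in groups.items():
            if cond(row):              # non-zero/True sends the row to this group
                out[name].append(row)
                matched = True
        if not matched:
            out["DEFAULT"].append(row)
    return out

customers = [{"country": "FR"}, {"country": "US"}, {"country": "JP"}]
routed = route(customers, {"FRANCE": lambda r: r["country"] == "FR",
                           "USA": lambda r: r["country"] == "US"})
```

The default group here corresponds to the "other six countries" case in the note: it has no filter condition of its own.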
  25. The Informatica Server generates a value each time a row enters a connected transformation, even if that value is not used. When NEXTVAL is connected to the input port of another transformation, the Informatica Server generates a sequence of numbers. When CURRVAL is connected to the input port of another transformation, the Informatica Server generates the NEXTVAL value plus the Increment By value.
  26. Start Value - The start value of the generated sequence that you want the Informatica Server to use if you use the Cycle option. If you select Cycle, the Informatica Server cycles back to this value when it reaches the End Value. The default value is 0 for both standard and reusable Sequence Generators.
Increment By - The difference between two consecutive values from the NEXTVAL port. The default value is 1 for both standard and reusable Sequence Generators.
End Value - The maximum value the Informatica Server generates. If the Informatica Server reaches this value during the session and the sequence is not configured to cycle, it fails the session.
Current Value - The current value of the sequence. Enter the value you want the Informatica Server to use as the first value in the sequence. If the Number of Cached Values is set to 0, the Informatica Server updates Current Value to reflect the last-generated value for the session plus one, and then uses the updated Current Value as the basis for the next session run. However, if you use the Reset option, the Informatica Server resets this value to its original value after each session. Note: If you edit this setting, you reset the sequence to the new setting. (If you reset Current Value to 10, and the increment is 1, the next time the session runs, the Informatica Server generates a first value of 10.)
Cycle - If selected, the Informatica Server automatically cycles through the sequence range. Otherwise, the Informatica Server stops the sequence at the configured End Value.
Number of Cached Values - The number of sequential values the Informatica Server caches at a time. Use this option when multiple sessions use the same reusable Sequence Generator at the same time, to ensure each session receives unique values. The Informatica Server updates the repository as it caches each value. When set to 0, the Informatica Server does not cache values. The default value for a standard Sequence Generator is 0; the default value for a reusable Sequence Generator is 1,000.
Reset - If selected, the Informatica Server generates values based on the original Current Value for each session using the Sequence Generator. Otherwise, the Informatica Server updates Current Value to reflect the last-generated value for the session plus one, and then uses the updated Current Value as the basis for the next session run. This option is disabled for reusable Sequence Generators.
Tracing Level - The level of detail about the transformation that the Informatica Server writes into the session log.
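The interaction of Start Value, Increment By, End Value, and Cycle can be sketched as a generator. This is an analogy of the documented behavior, not the server's implementation:

```python
def sequence_generator(current, increment=1, end=None, start=0, cycle=False):
    """Yield NEXTVAL values; cycle back to start, or fail, at End Value."""
    value = current
    while True:
        if end is not None and value > end:
            if cycle:
                value = start          # Cycle: wrap back to Start Value
            else:
                # no Cycle: reaching End Value fails the session
                raise RuntimeError("sequence exhausted: session would fail")
        yield value
        value += increment

gen = sequence_generator(current=1, start=1, increment=1, end=3, cycle=True)
vals = [next(gen) for _ in range(5)]   # 1, 2, 3, 1, 2
```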
  27. Connect NEXTVAL to multiple transformations to generate unique values for each row in each transformation. For example, you might connect NEXTVAL to two target tables in a mapping to generate unique primary key values. The Informatica Server creates a column of unique primary key values for each target table. If you want the same generated value to go to more than one target that receives data from a single preceding transformation, you can connect a Sequence Generator to that preceding transformation. This allows the Informatica Server to pass unique values to the transformation, then route rows from the transformation to targets.
  28. The Source Qualifier displays the transformation datatypes. The transformation datatypes in the Source Qualifier determine how the source database binds data when you import it. Do not alter the datatypes in the Source Qualifier. If the datatypes in the source definition and Source Qualifier do not match, the Designer marks the mapping invalid when you save it.
  29. In the mapping shown above, although there are many columns in the source definition, only three columns are connected to another transformation. In this case, the Informatica Server generates a default query that selects only those three columns:
SELECT CUSTOMERS.CUSTOMER_ID, CUSTOMERS.COMPANY, CUSTOMERS.FIRST_NAME FROM CUSTOMERS
When generating the default query, the Designer delimits table and field names containing the slash character (/) with double quotes.
  30. When the Informatica Server performs an outer join, it returns all rows from one source table and rows from the second source table that match the join condition Use an outer join when you want to join two tables and return all rows from one of the tables. For example, you might perform an outer join when you want to join a table of registered customers with a monthly purchases table to determine registered customer activity. Using an outer join, you can join the registered customer table with the monthly purchases table and return all rows in the registered customer table, including customers who did not make purchases in the last month. If you perform a normal join, the Informatica Server returns only registered customers who made purchases during the month, and only purchases made by registered customers.
  31. SQL Query - Defines a custom query that replaces the default query the Informatica Server uses to read data from sources represented in this Source Qualifier.
User-Defined Join - Specifies the condition used to join data from multiple sources represented in the same Source Qualifier transformation.
Source Filter - Specifies the filter condition the Informatica Server applies when querying records.
Number of Sorted Ports - Indicates the number of columns used when sorting records queried from relational sources.
Select Distinct - Specifies if you want to select only unique records.
Tracing Level - Sets the amount of detail included in the session log when you run a session containing this transformation.
  32. Limitations exist on passing data, depending on the database implementation. Stored procedures are stored and run within the database. Not all databases support stored procedures, and database implementations vary widely in their syntax. You might use stored procedures to: drop and recreate indexes; check the status of a target database before moving records into it; determine if enough space exists in a database; or perform a specialized calculation. Database developers and programmers use stored procedures for various tasks within databases, since stored procedures allow greater flexibility than SQL statements. Stored procedures also provide the error handling and logging necessary for mission-critical tasks. Developers create stored procedures in the database using the client tools provided with the database.
  33. The stored procedure issues a status code that indicates whether or not it completed successfully.
  34. You can run several Stored Procedure transformations in different modes in the same mapping. For example, a pre-load source stored procedure can check table integrity, a normal stored procedure can populate the table, and a post-load stored procedure can rebuild indexes in the database. However, you cannot run the same instance of a Stored Procedure transformation in both connected and unconnected mode in a mapping. You must create different instances of the transformation. If the mapping calls more than one source or target pre- or post-load stored procedure in a mapping, the Informatica Server executes the stored procedures in the execution order that you specify in the mapping.
  35. It determines how to handle changes to existing records. When you design your data warehouse, you need to decide what type of information to store in targets. As part of your target table design, you need to determine whether to maintain all the historic data or just the most recent changes. For example, you might have a target table, T_CUSTOMERS, that contains customer data. When a customer address changes, you may want to save the original address in the table, instead of updating that portion of the customer record. In this case, you would create a new record containing the updated address, and preserve the original record with the old customer address. This illustrates how you might store historical information in a target table. However, if you want the T_CUSTOMERS table to be a snapshot of current customer data, you would update the existing customer record and lose the original address.
  36. The Update Strategy transformation is frequently the first transformation in a mapping, before data reaches a target table. You can use the Update Strategy transformation to determine how to flag that record. Later, when you configure a session based on this transformation, you can determine what to do with records flagged for insert, delete, or update. The Informatica Server writes all data flagged for reject to the session reject file. By default, the Informatica Server forwards rejected rows to the next transformation. The Informatica Server flags the rows for reject and writes them to the session reject file. If you do not select Forward Rejected Rows, the Informatica Server drops rejected rows and writes them to the session log file. Frequently, the update strategy expression uses the IIF or DECODE function from the transformation language to test each record to see if it meets a particular condition. If it does, you can then assign each record a numeric code to flag it for a particular database operation. For example, the following IIF statement flags a record for reject if the entry date is after the apply date. Otherwise, it flags the record for update: IIF( ( ENTRY_DATE > APPLY_DATE), DD_REJECT, DD_UPDATE )
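The flagging expression above can be sketched in Python. The numeric codes 0–3 follow the documented DD_INSERT/DD_UPDATE/DD_DELETE/DD_REJECT constants, but the row layout is hypothetical:

```python
# Operation codes matching DD_INSERT, DD_UPDATE, DD_DELETE, DD_REJECT
DD_INSERT, DD_UPDATE, DD_DELETE, DD_REJECT = 0, 1, 2, 3

def flag_rows(rows):
    # Equivalent of IIF((ENTRY_DATE > APPLY_DATE), DD_REJECT, DD_UPDATE):
    # reject rows whose entry date falls after the apply date, update the rest.
    return [
        (row, DD_REJECT if row["entry_date"] > row["apply_date"] else DD_UPDATE)
        for row in rows
    ]

flagged = flag_rows([
    {"entry_date": "2024-02-01", "apply_date": "2024-01-01"},   # rejected
    {"entry_date": "2024-01-01", "apply_date": "2024-02-01"},   # updated
])
```

The session then acts on the numeric flag: it writes the rejected rows to the reject file and applies the update to the target.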
  37. Mapping parameters and variables - Create mapping parameters for use within a mapping or mapplet to reference values that remain constant throughout a session, such as a state sales tax rate. Create mapping variables in mapplets or mappings to write expressions referencing values that change from session to session. See "Mapping Parameters and Variables" in the Designer Guide for details.
Local and system variables - Use built-in variables to write expressions that reference values that vary, such as the system date.
Return values - You can also write expressions that include the return values from Lookup, Stored Procedure, and External Procedure transformations.
  38. You can pass a value from a port, literal string or number, variable, Lookup transformation, Stored Procedure transformation, External Procedure transformation, or the results of another expression. Separate each argument in a function with a comma. Except for literals, the transformation language is not case-sensitive. Except for literals, the Designer and Informatica Server ignore spaces. The colon (:), comma (,), and period (.) have special meaning and should be used only to specify syntax. The Informatica Server treats a dash (-) as a minus operator. If you pass a literal value to a function, enclose literal strings within single quotation marks. Do not use quotation marks for literal numbers. The Informatica Server treats any string value enclosed in single quotation marks as a character string. When you pass a mapping parameter or variable to a function within an expression, do not use quotation marks to designate mapping parameters or variables. Do not use quotation marks to designate ports. You can nest multiple functions within an expression (except aggregate functions, which allow only one nested aggregate function). The Informatica Server evaluates the expression starting with the innermost function.
  39. To debug a mapping, you configure and run the Debugger from within the Mapping Designer. When you run the Debugger, it pauses at breakpoints and allows you to view and edit transformation output data. When you copy a mapping, the Designer creates a copy of each component in the mapping, if the component does not already exist. If any of the mapping components already exist, the Designer prompts you to rename, replace, or reuse those components before you continue.
  40. The Designer marks a mapping invalid when it detects errors that will prevent the Informatica Server from executing the mapping. The Designer performs connection validation each time you connect ports in a mapping and each time you validate or save a mapping. At least one mapplet input port and output port must be connected to the mapping. If the mapplet includes a Source Qualifier that uses a SQL override, the Designer prompts you to connect all mapplet output ports to the mapping. You can validate an expression in a transformation while you are developing a mapping. If you did not correct the errors, the Designer writes the error messages in the Output window when you save or validate the mapping. When you validate or save a mapping, the Designer verifies that the definitions of the independent objects, such as sources or mapplets, match the instance in the mapping. If any of the objects change while you configure the mapping, the mapping might contain errors.
  41. Getting Started Wizard - Creates mappings to load static fact and dimension tables, as well as slowly growing dimension tables.
Slowly Changing Dimensions Wizard - Creates mappings to load slowly changing dimension tables, based on the amount of historical dimension data you want to keep and the method you choose to handle historical dimension data.
Simple Pass Through - Loads a static fact or dimension table by inserting all rows. Use this mapping when you want to drop all existing data from your table before loading new data.
Slowly Growing Target - Loads a slowly growing fact or dimension table by inserting new rows. Use this mapping to load new data when existing data does not require updates.
  42. For example, you might have a vendor dimension table that remains the same for a year. At the end of the year, you reload the table to reflect new vendor contracts and contact information. If this information changes dramatically and you do not want to keep historical information, you can drop the existing dimension table and use the Simple Pass Through mapping to reload the entire table. If the information changes only incrementally, you might prefer to update the existing table using the Type 1 Dimension mapping created by the Slowly Changing Dimensions Wizard.
  43. You cannot use COBOL or XML sources with the wizards.
Type 1 Dimension mapping - Loads a slowly changing dimension table by inserting new dimensions and overwriting existing dimensions. Use this mapping when you do not want a history of previous dimension data.
Type 2 Dimension/Version Data mapping - Loads a slowly changing dimension table by inserting new and changed dimensions, using a version number and an incremented primary key to track changes. Use this mapping when you want to keep a full history of dimension data and to track the progression of changes.
Type 2 Dimension/Flag Current mapping - Loads a slowly changing dimension table by inserting new and changed dimensions, using a flag to mark current dimension data and an incremented primary key to track changes. Use this mapping when you want to keep a full history of dimension data, tracking the progression of changes while flagging only the current dimension.
Type 2 Dimension/Effective Date Range mapping - Loads a slowly changing dimension table by inserting new and changed dimensions, using a date range to define current dimension data. Use this mapping when you want to keep a full history of dimension data, tracking changes with an exact effective date range.
Type 3 Dimension mapping - Loads a slowly changing dimension table by inserting new dimensions and updating values in existing dimensions. Use this mapping when you want to keep the current and previous dimension values in your dimension table.
  44. The Slowly Growing Target mapping filters source rows based on user-defined comparisons, and then inserts only those found to be new to the target. Use the Slowly Growing Target mapping to determine which source rows are new and to load them to an existing target table. In the Slowly Growing Target mapping, all rows are current. Use the Slowly Growing Target mapping to load a slowly growing fact or dimension table, one in which existing data does not require updates. For example, you have a site code dimension table that contains only a store name and a corresponding site code that you update only after your company opens a new store. Although listed stores might close, you want to keep the store code and name in the dimension for historical analysis. With the Slowly Growing Target mapping, you can load new source rows to the site code dimension table without deleting historical sites.
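The insert-only behavior described above can be sketched in a few lines. This is an illustrative Python sketch, assuming a hypothetical `site_code` key; the actual mapping performs the comparison with generated transformations rather than application code.

```python
# Hypothetical sketch of the Slowly Growing Target pattern:
# compare source rows to the target on a logical key and insert
# only rows not already present; existing rows are never updated.

def slowly_growing_load(target, source, key="site_code"):
    existing = {row[key] for row in target}
    new_rows = [row for row in source if row[key] not in existing]
    target.extend(new_rows)  # insert only new rows; keep history intact
    return new_rows

target = [{"site_code": "S01", "store": "Downtown"}]
source = [{"site_code": "S01", "store": "Downtown"},
          {"site_code": "S02", "store": "Airport"}]
added = slowly_growing_load(target, source)
# only S02 is inserted; the historical S01 row is preserved
```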
  45. For example, you want to use the same session to extract transaction records for each of your customers individually. Instead of creating a separate mapping for each customer account, you can create a mapping parameter to represent a single customer account. Then you can use the parameter in a source filter to extract only data for that customer account. Before running the session, you enter the value of the parameter in the parameter file. To reuse the same mapping to extract records for other customer accounts, you can enter a new value for the parameter in the parameter file and run the session. Or you can create a parameter file for each customer account and start the session with a different parameter file each time using pmcmd. By using a parameter file, you reduce the overhead of creating multiple mappings and sessions to extract transaction records for different customer accounts.
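For illustration, a parameter file is a plain-text file that assigns a value to each mapping parameter. The folder, session, and parameter names below are hypothetical, and the exact section-heading format and pmcmd options vary by product version, so treat this only as a sketch:

```
; hypothetical parameter file, e.g. customerA.par
[Sales_Folder.s_m_customer_txn]
$$CustomerAccount=001
```

To extract a different account, you would point the session at a second file that sets `$$CustomerAccount=002` (or pass a different file on the pmcmd command line) without editing the mapping itself.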
  46. Use mapping variables to perform automatic incremental reads of a source. For example, suppose the customer accounts in the mapping parameter example, above, are numbered from 001 to 065, incremented by one. Instead of creating a mapping parameter, you can create a mapping variable with an initial value of 001. In the mapping, use a variable function to increase the variable value by one. The first time the Informatica Server runs the session, it extracts the records for customer account 001. At the end of the session, it increments the variable by one and saves that value to the repository. The next time the Informatica Server runs the session, it automatically extracts the records for the next customer account, 002. It also increments the variable value so the next session extracts and looks up data for customer account 003.
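The persistence behavior described above can be simulated briefly. This is an illustrative Python sketch, not Informatica syntax: the repository dictionary and `run_session` function are hypothetical stand-ins for the repository-saved variable value and the session run, with the increment playing the role of a variable function such as SETVARIABLE.

```python
# Hypothetical simulation of a persistent mapping variable:
# the saved value drives each run's source filter, then is
# incremented and saved back for the next run.

repository = {"$$CustomerAccount": 1}  # initial value (account 001)

def run_session(repo, extract):
    account = repo["$$CustomerAccount"]
    extract(account)                         # source filter uses the variable
    repo["$$CustomerAccount"] = account + 1  # variable-function-style increment
    return account

seen = []
for _ in range(3):
    run_session(repository, seen.append)
# successive runs extract accounts 1, 2, 3 with no manual changes
```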
  47. If a session fails or you receive unexpected results in the target, you can run the Debugger against the session. You might also want to run the Debugger against a session if you want to see how the Informatica Server processes the configured session properties.
  48. You can create data or error breakpoints for transformations or for global conditions. You cannot create breakpoints for mapplet Input and Output transformations.
Create breakpoints. You create breakpoints in a mapping where you want the Informatica Server to evaluate data and error conditions.
Configure the Debugger. Use the Debugger Wizard to configure the Debugger for the mapping. You can choose to run the Debugger against an existing session, or you can create a debug session. When you run the Debugger against an existing session, the Informatica Server runs the session in debug mode. When you create a debug session, you configure a subset of session properties within the Debugger Wizard, such as source and target location. You can also choose to load or discard target data.
Run the Debugger. Run the Debugger from within the Mapping Designer. When you run the Debugger, the Designer connects to the Informatica Server. The Informatica Server initializes the Debugger and runs the session, reading the breakpoints and pausing the Debugger when a breakpoint evaluates to true.
Monitor the Debugger. While you run the Debugger, you can monitor the target data, transformation and mapplet output data, the debug log, and the session log. When you run the Debugger, the Designer displays the following windows: Debug log (messages from the Debugger), Session log (the session log), Target window (target data), and Instance window (transformation data).
Modify data and breakpoints. When the Debugger pauses, you can modify data and see the effect on transformations, mapplets, and targets as the data moves through the pipeline. You can also modify breakpoint information.
  49. The type of information that you monitor and the tasks that you perform can vary depending on the Debugger state. For example, you can monitor logs in all three Debugger states, but you can modify data only when the Debugger is in the paused state.
  50. After you save a mapplet, you can use it in a mapping to represent the transformations within the mapplet. When you use a mapplet in a mapping, you use an instance of the mapplet. Like a reusable transformation, any changes made to the mapplet are automatically inherited by all instances of it.
  51. Apply the following rules when designing mapplets:
Use only reusable Sequence Generator transformations.
Do not use pre- or post-session stored procedures in a mapplet.
Use exactly one of the following in a mapplet: a Source Qualifier transformation, an ERP Source Qualifier transformation, or an Input transformation.
Use at least one Output transformation in a mapplet.
  52. When you use an Input transformation in a mapplet, you must connect at least one port in the Input transformation to another transformation in the mapplet.
Sources within the mapplet. Mapplet input can originate from within the mapplet if you include one or more source definitions in the mapplet. When you use more than one source definition in a mapplet, you must connect the sources to a single Source Qualifier or ERP Source Qualifier transformation. When you use the mapplet in a mapping, the mapplet provides source data for the mapping.
Sources outside the mapplet. Mapplet input can originate from outside a mapplet if you include an Input transformation to define mapplet input ports. When you use the mapplet in a mapping, data passes through the mapplet as part of the mapping pipeline.
  53. Each mapplet must contain at least one Output transformation, and at least one port in the Output transformation must be connected within the mapplet.
  54. For example, you can create groups of source tables that you call Purchase Orders and Payment Vouchers. You can then organize the appropriate source definitions into logical groups and add descriptive names for them.