01_Intro_SAP BO DATA Integrator.docx
1. SAP BO DATA Integrator / Data Services
Data Services is integrated with SAP BI, SAP R/3, SAP applications, and non-SAP warehouses.
Purpose:- It does ETL via batch jobs and the online method, through bulk and delta load processing of both
structured and unstructured data, to generate a warehouse (SAP and non-SAP).
Data Services is the combination of Data Integrator and Data Quality. Previously these were separate tools:
Data Integrator was used to do the ETL part, and Data Quality to do the data profiling and data
cleansing. Now, with Data Services, both DI and DQ are combined into one interface, so it provides
the complete solution (data integration and quality) under one platform.
It even combines the separate job servers and repositories of DI and DQ into one.
Data Federator:- The output of the Data Federator is virtual data. Federator provides the data as
input to Data Services, and using Federator we can project data from multiple sources as a single
source.
Data Services Scenarios:-
Source -- DS -- Warehouse
SQL -- DS -- SQL
FlatFile -- DS -- SQL
FlatFile -- DS -- BI
R/3 -- DS -- BI
R/3 -- DS -- SQL
SQL -- DS -- BI
We can move the data from any source to any target DB using Data Services.
Data Services is a utility to do the ETL process. It is not a warehouse, so it does not stage any amount of
data in itself.
Data Services can create the ETL process and can create a warehouse (SAP/non-SAP).
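The "FlatFile -- DS -- SQL" scenario above can be sketched in plain Python. This is only an illustration of the extract-transform-load idea, not Data Services itself; the file content, table name, and columns are invented for the example.

```python
import csv
import io
import sqlite3

# Source: a flat file (here an in-memory CSV stands in for a file on disk)
flat_file = io.StringIO("id,name\n1, alice \n2, bob \n")

# Target: a SQL database (in-memory SQLite stands in for the target DB)
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT)")

for row in csv.DictReader(flat_file):
    # Transform step: trim whitespace and upper-case the name
    conn.execute(
        "INSERT INTO customers VALUES (?, ?)",
        (int(row["id"]), row["name"].strip().upper()),
    )

print(conn.execute("SELECT name FROM customers ORDER BY id").fetchall())
# [('ALICE',), ('BOB',)]
```

In Data Services the same pipeline would be drawn graphically as a file format object, a Query transform, and a target table, but the data movement it performs is the same.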
DS is used mainly for 3 sorts of projects:
1) Migration
2) Warehouse or DB building
3) Data Quality
2. Data Profiling:- Pre-processing of data before the ETL to check the health of the data. By profiling we
check whether the data is good or bad.
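A minimal sketch of the kind of checks profiling performs: per-column null counts, distinct counts, and min/max, which together indicate whether the data is healthy before the ETL runs. The sample records are invented for illustration.

```python
# Invented sample data with one deliberate quality problem (a NULL city)
records = [
    {"id": 1, "city": "Delhi"},
    {"id": 2, "city": None},
    {"id": 3, "city": "Pune"},
]

def profile(rows, column):
    """Basic column profile: null count, distinct count, min and max."""
    values = [r[column] for r in rows]
    non_null = [v for v in values if v is not None]
    return {
        "nulls": len(values) - len(non_null),
        "distinct": len(set(non_null)),
        "min": min(non_null),
        "max": max(non_null),
    }

print(profile(records, "city"))
# {'nulls': 1, 'distinct': 2, 'min': 'Delhi', 'max': 'Pune'}
```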
Advantages of Data Services over the SAP BI/BW ETL process
It is a GUI-based framework
It has multiple data sources in its built-in configuration
It has numerous built-in transformations (Integrator, Quality, Platform)
It does the data profiling activity
It easily adds external systems
It supports the Export Execution Command to load the data into the warehouse via a batch-mode process
It generates ABAP code automatically
It recognizes structured and unstructured data
It can generate a warehouse (SAP/non-SAP)
It supports huge data cleansing/consolidation/transformation
It can do real-time data loads, full data loads, and incremental data loads
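The difference between a full (bulk) load and an incremental (delta) load mentioned above can be sketched as follows. The timestamps and the last-run watermark are illustrative assumptions, not Data Services internals.

```python
# Invented source rows with a last-updated timestamp per record
source = [
    {"id": 1, "updated": "2024-01-01"},
    {"id": 2, "updated": "2024-02-15"},
    {"id": 3, "updated": "2024-03-10"},
]

def extract(rows, load_type, last_run=None):
    if load_type == "FULL":
        # Full/bulk load: take every row, regardless of age
        return rows
    # Delta load: only rows changed since the last successful run
    return [r for r in rows if r["updated"] > last_run]

full = extract(source, "FULL")
delta = extract(source, "DELTA", last_run="2024-02-01")
print(len(full), [r["id"] for r in delta])  # 3 [2, 3]
```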
Data Integrator / Data Services Architecture (components):
Designer
Repository Manager (Local, Central, and Profiler repositories)
Job Server
Access Server
Management Console
Metadata Integrator
Profiler
3. There is no concept of Process Chains/DTPs/InfoPackages if you use Data Services to load the data.
Data Integrator Components
Designer
It creates the ETL process
It has a wide set of transformations
It includes all the artifacts of the project (Work Flow, Data Flow, Data Store, Tables)
It is a gateway to do profiling
All the Designer objects are reusable
4. Management Console (URL-based tool / web-based tool)
It is used to activate the repositories
It allows us to activate user profiles for a specific environment
It allows us to create users and user groups, and to assign the users to the user groups with privileges
It allows us to auto-schedule or execute the jobs
We can execute the jobs from any geographic location, as this is a web-based tool
It allows us to connect the repositories to connections (Dev/Qual/Prod)
It allows us to customize the datastores
5. Access Server
It is used to run the real-time jobs
It gets the XML input (real-time data)
XML inputs can be loaded to the warehouse using the Access Server
It is responsible for the execution of online/real-time jobs
Repository Manager
It allows us to create the repositories (Local, Central, and Profiler)
Repositories are created using a standard database
The Data Services system tables are available here
6. Metadata Integrator
It generates Auto Documentation
It generates sample reports and semantic layers
It generates job-based statistics dashboards
Job Server
This is the server which is responsible for executing the jobs. Without assigning the local/central
repository we cannot execute the job.
Data Integrator Objects
Projects:-
A project is a folder where you store all the related jobs in one place. We can call it a folder to
organize jobs.
Jobs:-
Jobs are the executable part of Data Services. A job is present under a project.
Batch Job
Online jobs
Work Flows:-
A work flow acts as a folder to contain the related Data Flows. These Work Flows are re-usable.
Conditionals:-
A conditional contains Work Flows or Data Flows, and these are controlled by a script that decides
whether to trigger them or not.
Scripts:-
Scripts are sets of code used to define or initialize the global variables, control the flow of conditionals
or the flow of execution, print statements at runtime, and also assign specific
default values to the variables.
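The script-plus-conditional pattern above can be mimicked in Python: a script step initializes a global variable (Data Services variable names start with "$", e.g. a name like $G_LOAD_TYPE), and a conditional reads it to decide which work flow to trigger. All names here are illustrative, not from the source document.

```python
# "Script" step: initialize a global variable with a default value
global_vars = {"$G_LOAD_TYPE": "DELTA"}

def run_conditional(variables):
    """Mimics a conditional: pick a work flow based on a global variable."""
    if variables["$G_LOAD_TYPE"] == "FULL":
        return "WF_Full_Load"    # work flow on the IF branch
    return "WF_Delta_Load"       # work flow on the ELSE branch

print(run_conditional(global_vars))  # WF_Delta_Load
```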
Data Flow:-
The actual data processing happens here.
7. Source Data Store:-
It is the point used to import the data from the database/SAP into the Data Services local repository.
Target Data Store:-
It is the collection of dimension and fact tables used to create the data warehouse.
Transformations:-
These are the query transformations that are used to carry out the ETL process. They are broadly
categorized into 3 groups (Platform, Quality, and Integrator).
File Format :-
It contains various legacy-system file formats.
Variables:-
We can create local and global variables and use them in the project. Variable names start
with the "$" symbol.
Functions:-
We have numerous built-in functions (string, math, lookup, enrich, and so on).
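A sketch of what a lookup-style function does: enrich source rows with a value from a reference table, falling back to a default when no match is found. The reference table and column names are invented; this illustrates the idea of the built-in lookup functions, not their exact API.

```python
# Invented reference table: country code -> country name
country_ref = {"IN": "India", "DE": "Germany"}

def lookup(code, default="UNKNOWN"):
    """Return the reference value for a key, or a default on no match."""
    return country_ref.get(code, default)

rows = [{"code": "IN"}, {"code": "XX"}]
enriched = [{**r, "country": lookup(r["code"])} for r in rows]
print([r["country"] for r in enriched])  # ['India', 'UNKNOWN']
```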
Template Table:-
These are the temporary tables that are used to hold the intermediate data or the final data.
Data Store:-
These data stores act as a port from which you can define the connections to the source or the target
systems. You can create multiple configurations in one datastore to connect it to different
systems.
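The multiple-configurations idea can be sketched as a single logical datastore that maps each environment to a different physical connection. The hostnames and database names are placeholders, not real systems.

```python
# One logical datastore, several configurations (Dev/Qual/Prod)
datastore = {
    "name": "DS_SALES",
    "configurations": {
        "DEV":  {"host": "dev-db.example.com",  "database": "sales_dev"},
        "QUAL": {"host": "qual-db.example.com", "database": "sales_qa"},
        "PROD": {"host": "prod-db.example.com", "database": "sales"},
    },
}

def connection_for(ds, env):
    """Resolve the physical connection for the chosen environment."""
    cfg = ds["configurations"][env]
    return f'{cfg["host"]}/{cfg["database"]}'

print(connection_for(datastore, "QUAL"))  # qual-db.example.com/sales_qa
```

Because jobs reference only the logical name, promoting a job from Dev to Qual to Prod needs no change to the job itself, only a switch of the active configuration.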
ATL:-
ATL files are like BIAR files. The format is named after a company; unlike BIAR, ATL does not stand for anything.
The Project/Job/Work Flow/Data Flow/Tables can be exported to ATL so that they can be moved
between Dev and Qual, and from Qual to Prod.
Similarly, you can also import the Project/Job/Work Flow/Data Flow/Tables which were exported to ATL
back into Data Services.