ETL with Talend (Big Data)



ETL operations using Talend Open Studio for Big Data with HBase and Hive

Published in: Software, Technology

  1. ETL with Talend POOJA B. MISHRA
  2. What is ETL? Extract is the process of reading data from a database. Transform is the process of converting the extracted data from its previous form into the form it needs to be in so that it can be placed into another database; transformation occurs by applying rules, using lookup tables, or combining the data with other data. Load is the process of writing the data into the target database.
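The three stages can be pictured in a few lines of Python. This is a hypothetical, in-memory sketch (the source rows, the lookup table, and the target list are made-up illustrations, not Talend code); the lookup-table transform mirrors the definition above:

```python
# Extract: read rows from a source (here, an in-memory stand-in for a database).
source_rows = [
    {"id": 1, "name": "alice", "level_code": "G"},
    {"id": 2, "name": "bob", "level_code": "S"},
]

# Transform: convert each row using a lookup table, as described above.
level_lookup = {"G": "gold", "S": "silver"}

def transform(row):
    return {"id": row["id"],
            "name": row["name"].title(),
            "level": level_lookup[row["level_code"]]}

# Load: write the transformed rows into the target store.
target = [transform(row) for row in source_rows]
print(target)
```

In a real ETL tool each of these stages is a configurable component rather than hand-written code, but the data flow is the same.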
  3. Process flow
  4. Terms closely related to and managed by ETL processes: data migration, data management, data cleansing, data synchronization, and data consolidation.
  5. Different ETL tools: Informatica PowerCenter, Oracle ETL, Ab Initio, Pentaho Data Integration (Kettle, open-source ETL), SAS ETL Studio, Cognos DecisionStream, Business Objects Data Integrator (BODI), Microsoft SQL Server Integration Services (SSIS), and Talend.
  6. Prerequisites: Talend Open Studio for Data Integration, VirtualBox, and the Hortonworks Sandbox VM (sandbox/#install)
  7. How to set up: Step 1
  8. Step 2
  9. Step 3
  10. Step 4
  11. Talend interface: Repository tree, Workspace, Palette, and Component configuration panels
  12. Supported data input and output formats: SQL, MySQL, PostgreSQL, Sybase, Teradata, MSSQL, Netezza, Greenplum, Access, DB2, Hive, Pig, HBase, Sqoop, MongoDB, Riak, and many more.
  13. What kinds of datasets can be loaded? Talend Studio offers nearly comprehensive connectivity to packaged applications (ERP, CRM, etc.), databases, mainframes, files, Web services, and so on, to address the growing disparity of sources, as well as to data warehouses, data marts, and OLAP applications for analysis, reporting, dashboarding, scorecarding, and so on. It also provides built-in advanced components for ETL, including string manipulation, Slowly Changing Dimensions, automatic lookup handling, bulk-load support, etc.
  14. Tutorial overview. We will do the following tasks in this assignment: 1. load data from a DB on your local machine to HDFS; 2. write a Hive query to do analysis; 3. push the result of that Hive query to an HBase output component.
  15. Step 1: Use the row generator to simulate the rows in the DB, creating a table with 3 columns: ID, name, and level.
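The row generator's output can be simulated in plain Python (a hypothetical sketch; the sample names and levels are made up and are not Talend defaults):

```python
import random

def generate_rows(n, seed=42):
    """Simulate a row generator: n rows with id, name, and level columns."""
    rng = random.Random(seed)  # fixed seed so the sample is reproducible
    names = ["alice", "bob", "carol", "dave"]
    levels = ["bronze", "silver", "gold"]
    return [{"id": i, "name": rng.choice(names), "level": rng.choice(levels)}
            for i in range(1, n + 1)]

rows = generate_rows(5)
print(rows)
```

In Talend the equivalent is configured in the tRowGenerator schema editor rather than written by hand.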
  16. Step 2: Drag and drop the tHDFSOutput component onto the design surface and connect the main output of the row generator to it. To configure it, double-click the HDFS component in the design area and specify the NameNode address and the folder that will hold the file.
  17. Step 3: After loading the data to HDFS, create the external Hive table customers by logging into the Hive shell and executing the following command (assuming ';', Talend's default field separator, as the delimiter): CREATE EXTERNAL TABLE customers(id INT, name STRING, level STRING) ROW FORMAT DELIMITED FIELDS TERMINATED BY ';' STORED AS TEXTFILE LOCATION '/usr/talend';
  18. Step 4: Create one flow to read the data in Hive: pick the Hive version and the Thrift server IP/port, then write a Hive query as shown in the screen below.
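The actual query only appears in the screenshot, so as a local stand-in the same kind of analysis can be tried against sqlite3 (the GROUP BY aggregation here is an illustrative guess, not the query from the slide; the sample rows are made up):

```python
import sqlite3

# Build a small local table shaped like the Hive 'customers' table above.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customers (id INTEGER, name TEXT, level TEXT)")
conn.executemany("INSERT INTO customers VALUES (?, ?, ?)",
                 [(1, "alice", "gold"), (2, "bob", "silver"), (3, "carol", "gold")])

# A HiveQL-style aggregation; tHiveRow would send a similar query string
# to the Thrift server.
query = "SELECT level, COUNT(*) FROM customers GROUP BY level ORDER BY level"
results = conn.execute(query).fetchall()
print(results)  # → [('gold', 2), ('silver', 1)]
```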
  19. Step 5: Click the Edit schema button and add one column with type Object; we will parse the result and map it to our schema. In the Advanced tab, enable Parse query results, using the Object-type column we just created. Drag the tParseRecordSet component onto the surface, connect the main output of tHiveRow to it, click Edit schema to do the necessary mapping, and then match the values as shown below.
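Conceptually, the parse step maps the single Object column back onto named columns. A simplified sketch (assuming, for illustration, that each record arrives as one ';'-delimited string; real tParseRecordSet works on a JDBC result set object):

```python
def parse_record(record, schema):
    """Map one delimited record string onto the named columns of a schema."""
    values = record.split(";")
    return dict(zip(schema, values))

# Hypothetical schema and records matching the aggregation in Step 4.
schema = ["level", "cnt"]
records = ["gold;2", "silver;1"]
parsed = [parse_record(r, schema) for r in records]
print(parsed)
```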
  20. Results
  21. Step 6: Click to run this job; the console tells you whether it has connected to the Hive server successfully. Go to the Hive server and it will show that it has received one query and will execute it. You can see the results in the Talend run console.
  22. Step 7: Drag the tHBaseOutput component from the right palette and configure the ZooKeeper info.
  23. Step 8: Running the job produces the final output. You can log into the HBase shell and check that the data was inserted into HBase and that the table was also created by Talend!
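What the HBase output component does can be pictured with a tiny in-memory stand-in. HBase stores cells under a row key and a column family:qualifier pair; this dict-based sketch (with made-up keys and values) is illustrative only, not the HBase API:

```python
# Minimal in-memory stand-in for an HBase table:
# row key -> {"family:qualifier": value}.
table = {}

def put(row_key, family, qualifier, value):
    """Insert one cell, roughly like 'put' in the hbase shell."""
    table.setdefault(row_key, {})[f"{family}:{qualifier}"] = value

# Load the aggregated Hive results, one row per level.
put("gold", "cf", "cnt", "2")
put("silver", "cf", "cnt", "1")

# "Scan" the table, as you would in the hbase shell to verify the load.
for row_key in sorted(table):
    print(row_key, table[row_key])
```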
  24. Thank You!!