Using Apache Sqoop - Importing and Exporting Data

This is the Apache Sqoop session.

We provide training on Big Data & Hadoop, Hadoop Admin, MongoDB, Data Analytics with R, Python, etc.

Our Big Data & Hadoop course covers an introduction to Hadoop and Big Data, HDFS architecture, MapReduce, YARN, Pig Latin, Hive, HBase, Mahout, Zookeeper, Oozie, Flume, Spark, and NoSQL, with quizzes and assignments.

To watch the video or learn more about the course, please visit http://www.knowbigdata.com/page/big-data-and-hadoop-online-instructor-led-training

1. WELCOME - KNOWBIGDATA
• Interact - Ask Questions
• Lifetime access to content
• Class Recording
• Cluster Access
• 24x7 support
• Real Life Project
• Quizzes & Certification Test
• 10 x (3hr class)
• Socio-Pro Visibility
• Mock Interviews

2. COURSE CONTENT
I    Understanding Big Data, Hadoop Architecture
II   Environment Overview, MapReduce Basics
III  Adv MapReduce & Testing
IV   MapReduce in Java
V    Pig & Pig Latin
VI   Analytics using Hive
VII  NoSQL, HBase
VIII Zookeeper, HBase
IX   Oozie, Flume, Sqoop, Mahout
X    Intro Spark, Storm, Compare DBs
XI   YARN, Big Data Sets & Project Assignment

3. WHAT IS SQOOP?
• Open source tool
• Extracts data from a structured data store
• Into Hadoop for further processing
• Can also move data from a DB to HBase
• Can export data back to the data store
• Has an extension framework - connectors
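
As a quick orientation before the detailed slides, the two directions look like this (a minimal sketch; dbhost, mydb, and mytable are placeholders, not names from this deck):

# RDBMS -> HDFS
sqoop import --connect jdbc:mysql://dbhost/mydb --table mytable \
  --username root --password ''
# HDFS -> RDBMS
sqoop export --connect jdbc:mysql://dbhost/mydb --table mytable \
  --export-dir /user/hadoop/mytable --username root --password ''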

4. A QUICK HELLO!
ssh hadoop2.knowbigdata.com
> sqoop help
Available commands:
  codegen            Generate code to interact with database records
  create-hive-table  Import a table definition into Hive
  eval               Evaluate a SQL statement and display the results
  export             Export an HDFS directory to a database table
  help               List available commands
  import             Import a table from a database to HDFS
  import-all-tables  Import tables from a database to HDFS
  job                Work with saved jobs
  list-databases     List available databases on a server
  list-tables        List available tables in a database
  merge              Merge results of incremental imports
  metastore          Run a standalone Sqoop metastore
  version            Display version information
You can get more detailed help with: sqoop help import
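
Two of the commands listed above are handy for checking connectivity before any import; a sketch against the same MySQL host used in the later slides:

# List the databases visible to this user
sqoop list-databases --connect jdbc:mysql://hadoop1.knowbigdata.com/ \
  --username root --password ''
# List the tables in the sqoopex3 database
sqoop list-tables --connect jdbc:mysql://hadoop1.knowbigdata.com/sqoopex3 \
  --username root --password ''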

5. CONNECTORS
Sqoop Connector:
A modular component that uses the extension framework
to enable Sqoop imports and exports for a data store.
Available connectors:
• Bundled: MySQL, PostgreSQL, Oracle, SQL Server, DB2
• Generic JDBC Connector - any database that supports JDBC
• Third party too - Netezza, Teradata
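
The connector is selected by the JDBC URL passed to --connect. A sketch of what that looks like for a few stores (dbhost, user, and the somedb driver are placeholders; the flags themselves are standard Sqoop options):

# MySQL (used throughout this deck)
sqoop list-tables --connect jdbc:mysql://dbhost/sqoopex3 \
  --username user --password ''
# PostgreSQL
sqoop list-tables --connect jdbc:postgresql://dbhost:5432/sqoopex3 \
  --username user --password ''
# Generic JDBC: any database whose driver is on the classpath
sqoop list-tables --connect jdbc:somedb://dbhost/sqoopex3 \
  --driver com.somedb.jdbc.Driver --username user --password ''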

6. A SAMPLE IMPORT
Start mysql:
cd /usr ; /usr/bin/mysqld_safe &
Create a sample database in mysql and grant access from the cluster nodes:
mysql -u root -p
CREATE DATABASE sqoopex3;
GRANT ALL PRIVILEGES ON sqoopex3.* TO '%'@'localhost';
GRANT ALL PRIVILEGES ON sqoopex3.* TO ''@'localhost';
GRANT ALL PRIVILEGES ON sqoopex3.* TO '%'@'knowbigdata.com';
GRANT ALL PRIVILEGES ON sqoopex3.* TO ''@'knowbigdata.com';
GRANT ALL PRIVILEGES ON sqoopex3.* TO '%'@'%.knowbigdata.com';
GRANT ALL PRIVILEGES ON sqoopex3.* TO ''@'hadoop2.knowbigdata.com';
GRANT ALL PRIVILEGES ON sqoopex3.* TO '%'@'hadoop3.knowbigdata.com';
GRANT ALL PRIVILEGES ON sqoopex3.* TO ''@'hadoop3.knowbigdata.com';
GRANT ALL PRIVILEGES ON sqoopex3.* TO '%'@'hadoop4.knowbigdata.com';
GRANT ALL PRIVILEGES ON sqoopex3.* TO ''@'hadoop4.knowbigdata.com';
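
The grants are repeated per host because each Sqoop map task opens its own JDBC connection from whichever node it runs on. A more compact setup (a sketch; the sqoop user and password are invented for illustration, MySQL 5.x syntax) is a single wildcard-host grant:

# One grant covering connections from every cluster node
mysql -u root -p -e "GRANT ALL PRIVILEGES ON sqoopex3.* TO 'sqoop'@'%' IDENTIFIED BY 'sqooppass';"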

7. A SAMPLE IMPORT
Create a sample table in mysql:
CREATE TABLE widgets(
  id INT NOT NULL PRIMARY KEY AUTO_INCREMENT,
  widget_name VARCHAR(64) NOT NULL,
  price DECIMAL(10,2),
  design_date DATE,
  version INT,
  design_comment VARCHAR(100));
INSERT INTO widgets VALUES (NULL, 'sprocket', 0.25, '2010-02-10', 1, 'Connects two gizmos');
INSERT INTO widgets VALUES (NULL, 'gizmo', 4.00, '2009-11-30', 4, NULL);
INSERT INTO widgets VALUES (NULL, 'gadget', 99.99, '1983-08-13', 13, 'Our flagship product');
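
Before importing, the table can be sanity-checked through Sqoop's own JDBC connection using the eval command from slide 4; a minimal sketch with the same connection details as the imports below:

# Run an ad-hoc query through Sqoop's JDBC connection
sqoop eval --connect jdbc:mysql://hadoop1.knowbigdata.com/sqoopex3 \
  --query "SELECT * FROM widgets" --username root --password ''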

8. A SAMPLE IMPORT - HDFS
Import to an HDFS file:
sqoop import --connect jdbc:mysql://hadoop1.knowbigdata.com/sqoopex3 \
  --table widgets -m 1 --username root --password ''
Check the content of the imported file:
hadoop fs -cat widgets/part-m-00000
Also notice that widgets.java was created.
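
The -m 1 flag forces a single map task, hence the single part-m-00000 file. A few common variations on the same import (all standard Sqoop options; the column list, filter, and target directory are illustrative):

# Import selected columns and rows, into an explicit directory, with 4 parallel tasks
sqoop import --connect jdbc:mysql://hadoop1.knowbigdata.com/sqoopex3 \
  --table widgets --columns id,widget_name,price --where "price > 1.00" \
  --target-dir /user/hadoop/widgets_filtered -m 4 --split-by id \
  --username root --password ''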

9. A SAMPLE IMPORT - HIVE
Create just the table:
sqoop create-hive-table --connect jdbc:mysql://hadoop1.knowbigdata.com/sqoopex3 \
  --table widgets --fields-terminated-by ',' --username root --password ''
Import:
hive> LOAD DATA INPATH "widgets" INTO TABLE widgets;
In a single step:
sqoop import --connect jdbc:mysql://hadoop1.knowbigdata.com/sqoopex3 \
  --table widgets -m 1 --hive-import --username root --password ''
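
For a table that keeps growing, the plain HDFS import can be made incremental (standard Sqoop flags; the last value shown is simply the highest id inserted on slide 7):

# Append only rows whose id is greater than the last imported value
sqoop import --connect jdbc:mysql://hadoop1.knowbigdata.com/sqoopex3 \
  --table widgets -m 1 \
  --incremental append --check-column id --last-value 3 \
  --username root --password ''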

10. A SAMPLE IMPORT - HBASE
Import into HBase, creating the table as well:
sqoop import --connect jdbc:mysql://hadoop1.knowbigdata.com/sqoopex3 \
  --table widgets --hbase-table 'customer' --column-family cf1 \
  --username root --password '' --hbase-create-table \
  --columns id,widget_name --hbase-row-key id
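
The imported rows can be inspected from the HBase shell (a standard HBase command, not part of the deck); each selected MySQL column becomes a cell in family cf1:

# Scan the imported rows
echo "scan 'customer'" | hbase shell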

11. EXPORT - HIVE TO MYSQL
Create a data file:
cat > sales.log
1,15,120 Any St.,Los Angeles,CA,90210,2010-08-01
3,4,120 Any St.,Los Angeles,CA,90210,2010-08-01
2,5,400 Some Pl.,Cupertino,CA,95014,2010-07-30
2,7,88 Mile Rd.,Manhattan,NY,10005,2010-07-18
Create a Hive table:
CREATE TABLE sales(widget_id INT, qty INT,
  street STRING, city STRING, state STRING,
  zip INT, sale_date STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ',';
Load:
LOAD DATA LOCAL INPATH "sales.log" INTO TABLE sales;
SELECT * FROM sales;
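
The LOAD moves sales.log into Hive's warehouse directory, which is exactly the path the export on the next slide reads from. Assuming the warehouse location used on slide 12, the file can be confirmed with:

# The loaded file keeps its name inside the table's warehouse directory
hadoop fs -cat /apps/hive/warehouse/sales/sales.log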

12. EXPORT - HIVE TO MYSQL
Create the MySQL table:
CREATE TABLE sales(widget_id INT, qty INT,
  street VARCHAR(100), city VARCHAR(100),
  state VARCHAR(100), zip INT, sale_date VARCHAR(100));
Export:
sqoop export --connect jdbc:mysql://hadoop1.knowbigdata.com/sqoopex3 \
  -m 1 --table sales --export-dir /apps/hive/warehouse/sales \
  --input-fields-terminated-by ',' --username root --password ''
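
To confirm the export, query MySQL directly; for re-runs, Sqoop's --update-key option turns the generated INSERTs into UPDATEs (both are sketches; widget_id is not unique in this sample data, so the update form is shown only to illustrate the flag):

# Verify the exported rows in MySQL
mysql -u root -p sqoopex3 -e "SELECT * FROM sales;"
# Re-run the export as updates keyed on widget_id instead of inserts
sqoop export --connect jdbc:mysql://hadoop1.knowbigdata.com/sqoopex3 \
  -m 1 --table sales --export-dir /apps/hive/warehouse/sales \
  --input-fields-terminated-by ',' --update-key widget_id \
  --username root --password ''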
