VINEET ANAND
Mobile: 735838-7388
E-Mail: anand_vin72@hotmail.com
______________________________________________________________________________________
Professional Summary:
• Accomplished IT developer with 12 years of professional experience in analysis, design, and development of enterprise-grade applications, including 2 years of hands-on experience with Big Data technologies on the Hadoop platform (CDH4).
• Hands-on experience with the Hadoop core components (HDFS, MapReduce) and the Hadoop ecosystem (Sqoop, Flume, Hive, Pig, Oozie, HBase).
• Experience importing and exporting data with Sqoop between relational databases (DB2, SQL Server) and HDFS, in both directions (a brief Sqoop sketch follows this list).
• Efficient in analyzing data with HiveQL and Pig Latin, partitioning existing data sets with static and dynamic partitions, and tuning data layout for optimal query performance.
• Good understanding of building Oozie workflows with actions that include Hadoop, Hive, Pig, and Sqoop jobs.
• Knowledge of extracting an Avro schema with avro-tools and evolving a schema by editing its JSON definition (see the avro-tools sketch after this list).
• Proficient working knowledge of data ingestion, processing, analysis, and visualization, with hands-on project experience using Sqoop, Flume, Hive, and Pig.
• Good understanding of Spark SQL, PySpark, and the core Spark API.
• Excellent analytical, programming, and logical skills; capable of handling multiple projects at the same time.
• Prior experience in architecture design, database design, and performance management on mainframe technology.
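
For illustration, a minimal sketch of the Sqoop import/export pattern described above, assuming a DB2 source; the host, credentials, table, and path names are hypothetical placeholders, not actual project values.

    # Import a DB2 table into HDFS (names illustrative)
    sqoop import \
      --connect jdbc:db2://db2host:50000/SALESDB \
      --username etl_user -P \
      --table PROMOTIONS \
      --target-dir /data/raw/promotions \
      --num-mappers 4

    # Export processed results from HDFS back to the relational database
    sqoop export \
      --connect jdbc:db2://db2host:50000/SALESDB \
      --username etl_user -P \
      --table PROMOTION_SUMMARY \
      --export-dir /data/out/promotion_summary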
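
Also for illustration, a sketch of the avro-tools schema extraction mentioned above; the jar version and file names are assumptions.

    # Extract the embedded schema from an Avro data file (names illustrative)
    java -jar avro-tools-1.8.2.jar getschema part-00000.avro > promotions.avsc
    # Schema evolution then means editing the .avsc JSON, e.g. adding a
    # new field with a default value, before writing new data with it.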
Technical Expertise
O/S                : MVS/ESA, Windows, Linux, UNIX
RDBMS              : DB2, Oracle
Big Data           : Hadoop, HBase, Pig, Hive, Sqoop, Oozie, Flume, MapReduce, HDFS, Spark
Languages          : Core Java, Python
Other Tools        : Eclipse
Other Technologies : Mainframes
Methodologies      : Waterfall, Agile
Education / Certifications
Masters in Computer Applications, Gurukul K. University, India (1996)
BSc (Physics/Chemistry/Maths), Garhwal University, India (1993)
Professional Experience
Mphasis Mar 2016 – Present
Application : Marketing & Merchandising, Item Maintenance
Role : Analyst
Project : Store Transaction Processing
Client : ROYAL AHOLD, USA
Environment : PIG, HIVE, SQOOP, HDFS, MapReduce, UNIX.
Project Description:
Royal Ahold operates one of the largest supermarket chains. Retail applications such as Promotions, Coupon Generation, Retail Demand Forecasting (RDF), and Deal Management are business-critical processes for the analysis of item movement, and the business requires a variety of reports for this analysis. The existing system could not handle this because of the large volume of data, so processing and analysis were moved to Hadoop.
Roles and Responsibilities:
• Assessed business rules, collaborated with stakeholders, and performed source-to-target data mapping, design, and review.
• Tested raw data in the legacy system against the same data in Hadoop by executing performance scripts.
• Created Hive tables to store processed results in tabular format and wrote Hive queries to extract data per business requirements.
• Involved in requirement gathering, analysis, and design.
• Managed promotion data on a 5-node cluster, with data volumes close to 1,000 GB per week.
• Analyzed log files for processed/missing promotions, using Flume to load them into HDFS.
• Reformatted promotion files and performed basic validation on large sets of structured data with Pig (see the Pig sketch after this list).
• Imported store/item data from external structured datastores (SQL Server) into Hadoop with Sqoop; processed the data with Pig and Hive, loading the results into HBase/Hive tables for analysis and report generation.
• Processed data with MapReduce to create feed files for the Customer Data Warehouse and the Item Movement Data Warehouse.
• Stored the last 7 days of raw promotion/coupon/forecasting data in HDFS.
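
A minimal Pig Latin sketch of the promotion-file reformatting and validation step above, written to a script file and run from the UNIX shell; the field names, delimiters, and paths are illustrative assumptions.

    cat > clean_promotions.pig <<'EOF'
    -- Load raw promotion records (pipe-delimited layout assumed)
    raw = LOAD '/data/raw/promotions' USING PigStorage('|')
          AS (promo_id:chararray, store_id:int, item_id:int, discount:float);
    -- Basic validation: drop records with missing keys or negative discounts
    valid = FILTER raw BY promo_id IS NOT NULL AND discount >= 0.0f;
    -- Reformat to the downstream column order, tab-delimited
    out = FOREACH valid GENERATE store_id, item_id, promo_id, discount;
    STORE out INTO '/data/clean/promotions' USING PigStorage('\t');
    EOF
    pig -x mapreduce clean_promotions.pig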
Mphasis Sep 2015 – Feb 2016
Application : Marketing & Merchandising, Item Maintenance
Role : Developer
Project : Copient Coupon Billing & Item Billing.
Client : ROYAL AHOLD, USA
Environment : PIG, HIVE, SQOOP, HBASE, FLUME, HDFS, UNIX.
Project Description:
Vendors promote their items through promotions that are delivered as coupons. There are two types of coupons: (i) Copient coupons, for customers enrolled at the stores, and (ii) newspaper promotions. After a sale, the discount has to be reimbursed and kept in the system for a period of 6 months for audit. The volume of data is huge, and it took a long time to send the feed to AFS; Big Data processing was introduced to overcome this data processing issue.
• Imported promotional data from the existing SQL Server into Hive and HBase using Sqoop.
• Designed reports for the BI team, using Sqoop to extract data into HDFS and Hive.
• Involved in creating tables in Hive, loading data, and querying the data per business requirements (a brief HiveQL sketch follows this list).
• Processed and loaded large sets of structured and semi-structured data.
• Involved in requirement gathering, analysis, and design.
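
A minimal HiveQL sketch of the create/load/query cycle above, run through the Hive CLI; the table layout, delimiter, and paths are illustrative assumptions.

    hive -e "
      CREATE TABLE IF NOT EXISTS coupon_billing (
        coupon_id   STRING,
        store_id    INT,
        redeem_date STRING,
        amount      DOUBLE
      )
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
      STORED AS TEXTFILE;

      LOAD DATA INPATH '/data/clean/coupons' INTO TABLE coupon_billing;

      -- Per-store reimbursement totals for the BI feed
      SELECT store_id, SUM(amount) AS total_reimbursed
      FROM coupon_billing
      GROUP BY store_id;
    "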
Application : Marketing & Merchandising, Item Maintenance Jan 2015 – July 2015
Role : Developer
Project : Item Cost Calculation (POC)
Client : ROYAL AHOLD, USA
Environment: PIG, HIVE, SQOOP, HDFS, UNIX.
Project Description:
Daily TLOG files (store files) were cleansed and loaded into HDFS to determine profit and loss for all stores across different geographic locations on a daily basis. Item cost feeds come from different sources, and processing the data in the legacy system took longer than expected because of the high volume of data, delaying report generation. The Hadoop framework was introduced to load the store data into HDFS, process it, and generate the reports.
• Wrote Pig scripts for transforming the data: data cleansing, data validation, joining tables, and storing the final results in Hive tables.
• Involved in requirement gathering, analysis, and design.
• Created partitioned tables in Hive and loaded data into them (see the sketch after this list).
• Implemented Sqoop commands to import data from DB2/SQL Server into Hive, and wrote Hive queries to extract data per business requirements.
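
A sketch of the partitioned-table pattern above: a Hive table partitioned by load date, filled with a dynamic-partition insert. The table, column, and staging-table names are illustrative assumptions.

    hive -e "
      CREATE TABLE IF NOT EXISTS item_cost (
        item_id  INT,
        store_id INT,
        cost     DOUBLE
      )
      PARTITIONED BY (load_date STRING)
      ROW FORMAT DELIMITED FIELDS TERMINATED BY '|';

      -- Dynamic partitioning routes each row to its load_date partition
      SET hive.exec.dynamic.partition=true;
      SET hive.exec.dynamic.partition.mode=nonstrict;
      INSERT OVERWRITE TABLE item_cost PARTITION (load_date)
      SELECT item_id, store_id, cost, load_date
      FROM item_cost_staging;  -- hypothetical staging table
    "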