VENUGOPAL CHIPPA 
E-Mail: venuzone@gmail.com 
Contact No.: 9000134110 
Seeking a Hadoop Technical Lead position at a highly reputed organization where I can maximize my technical and management skills. 
PROFILE SUMMARY 
• 4 years of experience in the IT industry, spanning Big Data (Hadoop) and Data Warehousing technologies. 
• Hands-on experience with Big Data core components and the Hadoop ecosystem (HDFS, MapReduce, Hive, Sqoop, Flume, Oozie, HBase, Pig, Core Java). 
• Hands-on experience in configuring, developing, monitoring, debugging, and performance tuning Big Data applications on Hadoop. 
• Deep experience with distributed systems and large-scale non-relational data stores. 
• Experience in Informatica ETL processes. 
• Experience in Enterprise Data Warehouse development, support, and maintenance. 
• Worked closely with customers to provide solutions to a wide range of problems. 
• Quick learner and self-starter, comfortable picking up any new technology. 
ORGANISATIONAL EXPERIENCE 
• Working with Tech Mahindra (formerly Satyam Computer Services Ltd.) from Feb 2011 to date 
Education Details 
• B.Tech. from JNT University, 2010 
Technical Skills 
• Big Data/Hadoop ecosystem : Hadoop, HDFS, MapReduce, Hive, HiveQL, Pig, Sqoop, HBase, Oozie, Flume, Splunk 
• ETL Tools : Informatica 9.1/8.x 
• Reporting Tool : Business Objects 
• Databases : Oracle 11g/10g/9i/8i 
• Data Management Tools : SQL*Plus, SQL Developer, Enterprise Manager, Toad 
• DB Languages : SQL 
• NoSQL : HBase, Cassandra 
• Operating Systems : VMware, Linux, UNIX, Windows 98, CentOS 
Professional Experience: 
Hadoop (Big Data) 
Type: Development 
Customer: GE, Atlanta, USA 
Offshore: Mahindra Satyam Computer Services Limited, Hyderabad 
Tools/Platform: Big Data, Hadoop, HDFS, Hive, HiveQL, Pig, Sqoop, Flume, Oozie, Oracle 
Responsibilities: 
• Analyzed client systems and gathered requirements. 
• Designed technical solutions based on client needs and systems architecture. 
• Collected data from Teradata and pushed it into Hadoop using Sqoop (a hedged Sqoop/Hive sketch follows this list). 
• Pulled logs from the log server to an FTP server on an hourly basis; pushed this data into Hadoop and then deleted it from the FTP server. 
• Pre-processed data using MapReduce and stored it into Hive tables. 
• Owned the implementation of the Big Data solution. 
• Extracted data from various log files (Apache web logs, Adobe logs, hit-data logs) and loaded them into HDFS. 
• Developed and built Hive tables using HiveQL. 
• Developed HiveQL scripts for analysis. 
• Tuned Hive queries for best performance. 
• Wrote Apache Pig scripts for data analysis on top of HDFS files. 
• Extracted data from an Oracle RDBMS into Hadoop using Sqoop. 
• Used Hive on the RDBMS data for analysis and stored the results back to the database. 
• Used Oozie for scheduling. 
• Provided solutions to various problems in Hadoop. 
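For illustration, here is a minimal sketch of the kind of Sqoop import and Hive table definition described above. The JDBC URL, credentials, table name, paths, and column layout are all hypothetical, not taken from the actual project.

    # Hypothetical Sqoop import from Oracle into HDFS; host, user,
    # table, and target path are illustrative only.
    sqoop import \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username etl_user -P \
      --table WEB_SESSIONS \
      --target-dir /data/staging/web_sessions \
      --num-mappers 4

    # Expose the imported comma-delimited files to Hive as an external
    # table (column list is illustrative).
    hive -e "CREATE EXTERNAL TABLE IF NOT EXISTS web_sessions (
               session_id STRING, user_id STRING, hit_time STRING)
             ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
             LOCATION '/data/staging/web_sessions';"

An Oozie coordinator would typically run imports like this on the hourly cadence mentioned above.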
Hadoop: General Electric (GE) 
Type: Development 
Customer: General Electric, Cincinnati, OH, USA 
Offshore: Mahindra Satyam Computer Services Limited, Hyderabad 
Tools/Platform: Big Data, Hadoop, HDFS, Hive, HiveQL, Apache Pig, Sqoop, and Oozie 
Responsibilities: 
• Developed Pig Latin scripts to extract data from the web server output files and load it into HDFS (a hedged Pig sketch follows this list) 
• Developed Pig UDFs to pre-process the data for analysis 
• Developed Oozie workflows to automate loading data into HDFS and pre-processing it with Pig 
• Developed Hive queries for the analysts 
• Moved data using Flume 
• Provided solutions to various problems in Hadoop 
• Developed and built Hive tables using HiveQL 
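Below is a minimal Pig Latin sketch of the kind of web-log extraction and pre-processing described above; the input path, field layout, and output path are hypothetical.

    -- Load raw web server logs (field layout is illustrative).
    raw_logs = LOAD '/data/raw/weblogs' USING PigStorage(' ')
               AS (ip:chararray, ts:chararray, url:chararray, status:int);
    -- Keep only successful requests.
    ok_hits = FILTER raw_logs BY status == 200;
    -- Count hits per URL for the analysts.
    by_url = GROUP ok_hits BY url;
    counts = FOREACH by_url GENERATE group AS url, COUNT(ok_hits) AS hits;
    STORE counts INTO '/data/processed/url_hits' USING PigStorage('\t');

In practice a custom UDF would slot into the FOREACH step, and an Oozie workflow would chain the load and pre-processing actions.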
Title : Shop floor Systems 
Client : GE Energy 
Role : Informatica Developer 
Project Description: 
GE Energy is one of the 12 major businesses of General Electric Company, USA, and is the world's leading supplier of power 
generation technology, service, and energy management systems. GE Energy services customers globally through a network of local 
offices and service centers, with three regional headquarters: GE Energy Asia-Pacific in Hong Kong, GE Energy Europe in London, 
and GE Energy Americas in Schenectady, NY. 
GE Energy includes 12 manufacturing centers, nearly 22,200 employees, 130 sales and service engineering offices, and 58 
apparatus service centers worldwide. The application relates turbine data to the associated part counts; the ETL loads run in 
Informatica, with reporting in BO 6.5. 
Contribution: 
As an ETL Developer, I was responsible for: 
• Design of the Extraction, Transform & Load (ETL) environment 
• Requirements study and understanding the functionalities 
• Extensive work on transformations such as Filter, Expression, Router, Lookup, Update Strategy, and Sequence Generator (a SQL analogue is sketched after this list) 
• Analysis of specifications provided by clients 
• Creating various transformations such as Connected and Unconnected Lookups, Router, Aggregator, Joiner, Update Strategy, etc. 
• Creating and monitoring Informatica sessions 
• Preparing low-level mapping documents for the ETL build 
• Conducting knowledge-sharing sessions and helping the team understand the requirements clearly 
• Close interaction with the team and business users 
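In SQL terms, the Lookup plus Update Strategy pattern used in these mappings behaves roughly like a MERGE: update matching dimension rows, insert new ones. The sketch below is illustrative only; the table and column names are hypothetical, not taken from the project.

    -- Hypothetical SQL analogue of an Informatica Lookup + Update Strategy.
    MERGE INTO dim_part tgt
    USING stg_part src
       ON (tgt.part_id = src.part_id)
    WHEN MATCHED THEN
      UPDATE SET tgt.part_count = src.part_count,
                 tgt.load_dt    = SYSDATE
    WHEN NOT MATCHED THEN
      INSERT (part_id, part_count, load_dt)
      VALUES (src.part_id, src.part_count, SYSDATE);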
Environment: Informatica PowerCenter 8.1 and 9.1, Windows XP, Oracle 10g/11g, SQL, UNIX 
Title : MITTR01 
Client : GE Energy 
Role : Informatica Developer
Project Description: 
The primary objective of MITT is to bring together information from disparate legacy systems and put it into a format conducive 
to making business decisions. The data is usually fed from COPICS tables into an FTP server, from where it is picked up by 
scripts and loaded into the base tables and the rollup tables. The application also fetches data from other source systems such as 
TIMES and COSDOM. 
The data from the various source systems is extracted primarily on a daily and weekly basis and is in turn used in Finance and 
Engineering team data reports. Currently, the Functional and IM teams are notified by email if there is a load failure in the 
weekend/daily loads. 
Contribution: 
• Creating mappings using Informatica 
• Analyzing data issues in the data marts and recommending solutions 
• Analyzing ETL mappings and making changes to optimize loads 
• ETL development using Informatica 
• Working with relational databases (Oracle 9i/10g/11g) 
• Developing and supporting ETL applications 
• Delivering work on time and meeting customer challenges 
Environment: Informatica PowerCenter 8.1 and 9.1, Windows XP, Oracle 10g/11g, SQL, UNIX 
Personal Details: 
Name : venugopal.ch 
Email ID : venuzone@gmail.com 
Passport No : J1238807 (valid up to 20/10/2020) 
Location : Hyderabad 
Contact Numbers : +91 9000134110 
Date: 
Hyderabad (Venugopal)
