Sourabh Parikh
Address: P202, Treasure Park, Kashid Park, Pimple Gaurav, Pune (Maharashtra) – 411061
Mob: +91-8600888997 | Email: sourabhparikh90@gmail.com
TOTAL EXPERIENCE: Over 3 years of IT experience; currently working with Infosys Limited as a Hadoop Developer.
TECHNICAL SCOPE: Development, maintenance, and enhancement of Big Data applications and Java applications, and data analytics using Big Data tools (the Hadoop ecosystem).
TECHNICAL SKILLS SUMMARY
Technologies: Core Java, Hadoop, MapReduce, Pig, Hive, HBase, Sqoop, Flume
Database: MySQL
Tools: Eclipse, OBIEE, Tableau, SVN, Jenkins, Maven
Operating Systems: Windows, Linux, UNIX
Education: B.Tech. (ECE) from Rajasthan Technical University in 2012
PROFESSIONAL EXPERIENCE
Infosys Limited: July 2013 – Present (Hadoop Developer)
ICS – SUNCORP (Jan 2015 – present)
Description:
ICS (Insurance Claims System) is a claims management system that handles all the processing involved
in a claim, from the receipt of claim/loss information to the payout of the claim. ICS is a one-stop
shop for processing a claim: it simplifies the claims payout process by automatically handling all the
calculations involved, taking into account the applicable rules, regulations, and rates. The user needs
to enter only the policy information, and the rest of the processing is handled by ICS. The project
also included a secured password management system for users.
Responsibilities:
● Developing Hadoop/MapReduce programs in Java/Hive.
● Writing efficient Hive UDFs in Java for advanced analytics.
● Developing workflows using Java/Hive.
● Understanding the technical specifications.
● Writing unit test cases for each component.
● Generating various reports using Tableau.
● Preparing test data for unit testing.
● Unit testing of coded programs.
● Capturing unit test results, along with screenshots, while performing unit testing.
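The Hive UDF bullet above can be sketched roughly as follows. In Hive, such a class would extend `org.apache.hadoop.hive.ql.exec.UDF` (or implement `GenericUDF`) and be registered with `CREATE TEMPORARY FUNCTION`; the core logic is shown here as plain, dependency-free Java so it stands alone. The function name and normalization rule are illustrative assumptions, not taken from the project.

```java
// Illustrative sketch of the kind of per-row string-normalization logic
// a simple Hive UDF might wrap. In Hive this class would extend
// org.apache.hadoop.hive.ql.exec.UDF, and Hive would call evaluate()
// once per row; it is written without Hive dependencies here so it
// compiles on its own.
public class NormalizeClaimId {

    // Hive's convention for simple UDFs: a public evaluate() method.
    // Null in, null out, so NULL values pass through queries unchanged.
    public String evaluate(String raw) {
        if (raw == null) {
            return null;
        }
        // Strip whitespace and other non-alphanumerics, then upper-case,
        // so differently formatted claim IDs compare equal in joins.
        return raw.replaceAll("[^A-Za-z0-9]", "").toUpperCase();
    }
}
```

After packaging into a jar and running `ADD JAR`, a UDF like this would be registered with something like `CREATE TEMPORARY FUNCTION normalize_claim_id AS 'NormalizeClaimId';` and then used inline in Hive queries.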
Technologies & Databases: Hadoop, Hive, MySQL, Sqoop
Environment: CentOS, CDH5 (Cloudera Manager)
ORBITZ (Oct 2013 – Dec 2014)
Description:
Orbitz provides innovative solutions for hotels around the globe that increase revenue, reduce cost,
and improve performance. Orbitz.com generates nearly 1 million hotel searches every day. The project
involved migrating data from a SQL Server database to Hadoop and analyzing it to drive hotel
marketing, revenue management, and business strategy.
Responsibilities:
● Developed Big Data solutions that enabled the business and technology teams to make data-driven decisions on the best ways to acquire customers and provide them business solutions.
● Migrated the existing data from MySQL to Hadoop using Sqoop for processing.
● Created internal and external tables with properly defined static and dynamic partitions for
efficiency.
● Implemented custom Hive UDFs to achieve comprehensive data analysis.
● Used Pig to develop ad-hoc queries.
● Generated various Excel reports based on business-user requirements.
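The partitioned-table bullet above can be illustrated with a sketch of the kind of Hive DDL involved; the table name, columns, and HDFS path are hypothetical, not taken from the project.

```sql
-- Hypothetical external Hive table: the data files stay in place on
-- HDFS, and dropping the table removes only the metadata.
CREATE EXTERNAL TABLE hotel_searches (
  search_id   BIGINT,
  hotel_id    BIGINT,
  user_region STRING
)
PARTITIONED BY (search_date STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
LOCATION '/data/orbitz/hotel_searches';

-- Static partition: the partition value is fixed in the statement.
INSERT OVERWRITE TABLE hotel_searches PARTITION (search_date = '2014-06-01')
SELECT search_id, hotel_id, user_region
FROM staging_searches
WHERE search_date = '2014-06-01';

-- Dynamic partition: Hive derives the partition value from the data.
SET hive.exec.dynamic.partition.mode = nonstrict;
INSERT OVERWRITE TABLE hotel_searches PARTITION (search_date)
SELECT search_id, hotel_id, user_region, search_date
FROM staging_searches;
```

Partition pruning on `search_date` lets queries scan only the days they need, which is the efficiency the bullet refers to.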
Technologies & Databases: Hadoop (APIs) and its ecosystem, Hive, MapReduce, Pig, Sqoop
Environment: CentOS, CDH5
Personal Details:
Name in Full: Sourabh Parikh
DOB: 02 July 1989
