Bharath Kumar Rapolu
Contact No: +91-9885363653
E-Mail: bharathrapolu.kumar@gmail.com
Professional Summary:
 4.6+ years of overall IT experience in application development in PL/SQL and Big Data (Hadoop).
 1.5 years of dedicated experience with Hadoop and its components: HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase and Oozie.
 Extensive experience in setting up Hadoop clusters.
 Good working knowledge of MapReduce and Apache Pig.
 Wrote Pig scripts and Pig UDFs to reduce job execution time.
 Experience creating Hive external and managed tables and writing queries against them.
 Wrote Hive UDFs for specific functionality.
 Experience importing and exporting data between RDBMS and HDFS using Sqoop CLI commands (a command-line sketch follows this list).
 Scheduled MapReduce, Pig and Sqoop jobs in Apache Oozie.
 Experience writing procedures, functions, triggers, indexes and packages in RDBMS.
 Wrote SQL queries for DDL and DML operations.
 Experience importing and exporting data from text and Excel files in SQL Server.
 Knowledge of fact and dimension tables in RDBMS.
 Experience in performance tuning and query optimization in RDBMS.
 Experience in requirement analysis and table design.
 Knowledge of Pentaho Report Designer.
 Effective team player, able to work under time constraints.
 Good interpersonal communication and technical documentation skills.
 Knowledge of optimizer hints for performance tuning.
 Working knowledge of Flume and NoSQL databases.
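For illustration, a Sqoop import/export of the kind described above might look like the following; the connection string, credentials, table names and HDFS paths are hypothetical placeholders, not values from the projects below.

    # Import an RDBMS table into HDFS (all names are illustrative)
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/salesdb \
      --username etl_user -P \
      --table competitor_prices \
      --target-dir /data/raw/competitor_prices \
      --num-mappers 4

    # Export processed results back to the RDBMS
    sqoop export \
      --connect jdbc:mysql://dbhost:3306/salesdb \
      --username etl_user -P \
      --table price_summary \
      --export-dir /data/out/price_summary

Here -P prompts for the password interactively; a password file is the usual alternative for scheduled jobs.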
Professional Experience:
 Currently working as an IT Analyst at TCS (Tata Consultancy Services), Hyderabad, India, since Jul 2011.
Qualifications:
 Bachelor of Technology from SASTRA University, Thanjavur, Tamil Nadu, with a CGPA of 7.7/10.
Technical Skills:
Languages : Core Java, SQL, PL/SQL, MapReduce, Pig, Sqoop, Hive, HBase
Servers : IBM WebSphere Application Server 7.0, WebLogic and Tomcat
Frameworks : Hadoop and .NET
IDEs : Eclipse Europa, PL/SQL Developer
Version Control / Tracking Tools : Visual SourceSafe (VSS)
Databases : DB2 9.x, MySQL, SQL Server, Oracle (SQL: DDL, DML, DCL; and PL/SQL)
Operating Systems : Windows 7, Windows XP, 2000, 2003, Unix and Linux
Project Details:
PROJECT #3:
Project Name : BestBuy – Web Intelligence
Client : BestBuy, Minneapolis, Minnesota, USA
Environment : Hadoop, Apache Pig, Hive, Sqoop, Java, UNIX, MySQL
Duration : Nov 2014 to date
Role : Hadoop Developer
Description:
This project involved rehosting BestBuy's existing web-intelligence application on the Hadoop platform. Previously, BestBuy stored crawled web data about competing retailers in a MySQL database. Initially there were only four competitor retailers (Amazon.com, walmart.com, etc.), but as the number of competitors grew, the volume of data generated by web crawling increased massively, beyond what a single MySQL database could accommodate. For this reason BestBuy moved the application to Hadoop, whose cluster nodes can handle massive amounts of data and satisfy the scaling needs of the business.
Roles and Responsibilities:
 Moved crawl-data flat files generated for various retailers into HDFS for further processing.
 Wrote Apache Pig scripts to process the HDFS data (an illustrative sketch follows this list).
 Created Hive tables to store the processed results in tabular format.
 Developed Sqoop scripts to move data between Pig output and the MySQL database.
 Wrote script files for processing data and loading it into HDFS.
 Wrote HDFS CLI commands.
 Developed UNIX shell scripts to create reports from Hive data.
 Fully involved in the requirement-analysis phase.
 Created two separate users (hduser for HDFS operations and mapred for MapReduce operations only).
 Ensured NFS was configured for the NameNode.
 Set up passwordless SSH for Hadoop.
 Set up cron jobs to delete Hadoop logs, old local job files and cluster temp files.
 Set up Hive with MySQL as a remote metastore.
 Moved log/text files generated by various products into HDFS.
 Wrote MapReduce code that takes log files as input, parses them and structures the records in tabular format to enable effective querying of the log data.
 Created external Hive tables on top of the parsed data.
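As an illustration of the Pig-and-Hive pattern described above, the sketch below aggregates a hypothetical crawl file and exposes the result to Hive; the paths, field names and schema are assumptions for illustration, not the project's actual code.

    -- Pig sketch: aggregate raw crawl records (illustrative names)
    raw = LOAD '/data/raw/crawl' USING PigStorage('\t')
            AS (retailer:chararray, sku:chararray, price:double, crawled_at:chararray);
    valid = FILTER raw BY price IS NOT NULL AND price > 0.0;
    by_retailer = GROUP valid BY retailer;
    avg_price = FOREACH by_retailer GENERATE
            group AS retailer, AVG(valid.price) AS avg_price;
    STORE avg_price INTO '/data/out/avg_price' USING PigStorage('\t');

    -- Hive sketch: external table over the Pig output
    CREATE EXTERNAL TABLE IF NOT EXISTS avg_price (
      retailer  STRING,
      avg_price DOUBLE
    )
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/out/avg_price';

Because the table is external, dropping it in Hive leaves the underlying HDFS files in place.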
PROJECT #2: SERP (Society for Elimination of Rural Poverty)
Client : Govt. of Andhra Pradesh and Telangana
Project Title : SthreeNidhi
Environment : Windows XP Professional, Windows 7
Duration : Sep 2013 to Dec 2014
Role : PL/SQL Developer
Team Size : 12
Tools : SQL Server Management Studio
Description: SERP's mission is to enable disadvantaged communities to perceive possibilities for change and to bring about that change by exercising informed choices through collective action.
- Disadvantaged communities are empowered to overcome social, economic, cultural and psychological barriers through self-managed organizations.
SthreeNidhi Credit Cooperative Federation Ltd. is promoted by the Government and the Mandal Samakhyas to supplement credit flow from the banking sector, and is a flagship programme of the Government. SthreeNidhi provides timely and affordable credit to poor SHG members as part of SERP's overall strategy for poverty alleviation.
SHGs can access hassle-free credit from SthreeNidhi as and when required using their mobile phones, and therefore see no need to borrow from other sources at usurious rates of interest. SthreeNidhi can extend credit to SHGs even in far-flung areas of the state within 48 hours, meeting credit needs for exigencies such as health and education and for income-generation activities such as agriculture and dairy. As credit availability is linked to the grading of MSs and VOs, the community is keen to improve their functioning in order to access higher credit limits from SthreeNidhi.
Contribution:
 Performed DBA activities such as creating and maintaining tables.
 Exported and imported data from text, CSV and Excel files.
 Involved in designing tables for screens.
 Analyzed requirements and attended client meetings.
 Developed procedures, functions and views for report generation (a sketch follows this list).
 Developed user-defined functions reusable throughout the solution.
 Created and maintained indexes and views for performance tuning.
 Developed fact tables for analysis reports.
 Resolved defects at production time.
 Applied optimizer hints for performance tuning.
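As a minimal sketch of the report-generation work described above (the project's tooling lists SQL Server Management Studio, so T-SQL is assumed), with an invented loan-repayment table and invented column names:

    -- T-SQL sketch: a report view and a reusable scalar function (illustrative names)
    CREATE VIEW dbo.vw_LoanRepaymentSummary AS
    SELECT  shg_id,
            COUNT(*)    AS installments_paid,
            SUM(amount) AS total_repaid
    FROM    dbo.LoanRepayment
    GROUP BY shg_id;
    GO

    CREATE FUNCTION dbo.fn_OutstandingBalance (@loan_amount DECIMAL(12,2),
                                               @repaid      DECIMAL(12,2))
    RETURNS DECIMAL(12,2)
    AS
    BEGIN
        -- Remaining balance on a loan after repayments
        RETURN @loan_amount - @repaid;
    END;
    GO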
PROJECT #1: TCS iON Education Solution
Client : TCS Internal
Customers : Manav Rachna International University, SASTRA University and a few more engineering colleges and universities
Environment : Windows XP Professional, Oracle 11g
Duration : Aug 2011 to Jun 2013
Role : PL/SQL Developer
Team Size : 6
Tools : Pentaho Report Designer
Description: TCS iON, a cloud-based ERP solution, was conceptualized by TCS through close interactions with Small and Medium Businesses (SMBs) and the relevant stakeholders.
- The iON Education Solution offers a wide range of cloud-based solutions whose footprint covers the entire value chain of the education ecosystem: K-12 schools, affiliated colleges, vocational institutes, boards and universities.
- It covers student life-cycle management, attendance management, fees management and academic operations management.
Contribution:
 Developed pre-configured and on-demand reports.
 Used Pentaho Report Designer to design report layouts.
 Analyzed requirements and designed reports.
 Developed user-defined functions reusable throughout the solution (see the sketch after this list).
 Involved in customer conversations.
 Followed up with the QA team during the testing phase.
 Developed and maintained PL/SQL procedures and triggers.
 Created indexes and views for performance tuning.
 Documented good practices and logic for future reference.
 Developed fact tables for on-demand reports.
 Resolved defects at production time.
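For illustration, a reusable PL/SQL function of the kind mentioned above might look like this on Oracle 11g; the table and column names are hypothetical.

    -- PL/SQL sketch: attendance percentage for a student in a term (illustrative names)
    CREATE OR REPLACE FUNCTION attendance_pct (
        p_student_id IN NUMBER,
        p_term       IN VARCHAR2
    ) RETURN NUMBER IS
        v_present NUMBER;
        v_total   NUMBER;
    BEGIN
        SELECT COUNT(CASE WHEN status = 'P' THEN 1 END), COUNT(*)
          INTO v_present, v_total
          FROM attendance
         WHERE student_id = p_student_id
           AND term = p_term;

        IF v_total = 0 THEN
            RETURN NULL;  -- no sessions recorded for this term
        END IF;
        RETURN ROUND(100 * v_present / v_total, 2);
    END attendance_pct;
    /

Such a function can then be called directly from report queries, e.g. SELECT attendance_pct(1001, '2012-ODD') FROM dual.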