Bharath Kumar Rapolu
Contact No: +91-9885363653
E-Mail: bharathrapolu.kumar@gmail.com
Professional Summary:
4.6+ years of overall IT experience in application development in PL/SQL and
Big Data (Hadoop).
1.5 years of exclusive experience in Hadoop and its components: HDFS,
MapReduce, Apache Pig, Hive, Sqoop, HBase and Oozie.
Extensive experience in setting up Hadoop clusters.
Good working knowledge of MapReduce and Apache Pig.
Involved in writing Pig scripts and Pig UDFs to reduce job execution time.
Experience in creating Hive external and managed tables and writing queries
against them.
Involved in writing Hive UDFs for specific functionalities.
Experience in importing and exporting data to and from RDBMS using Sqoop
CLI commands.
Involved in scheduling MapReduce, Pig and Sqoop jobs in Apache Oozie.
Experience in writing procedures, functions, triggers, indexes and packages in
RDBMS.
Written SQL queries for DDL and DML operations.
Experience in importing and exporting data from text files and Excel sheets in
SQL Server.
Knowledge of fact and dimension tables in RDBMS.
Experience in performance tuning and query optimization in RDBMS.
Experience in requirement analysis and table design.
Knowledge of Pentaho Report Designer.
Effective team player, able to work under time constraints.
Good interpersonal communication and technical documentation skills.
Knowledge of using optimizer hints in performance tuning.
Knowledge of Flume and NoSQL databases.
Professional Experience:
Currently working as an IT Analyst at TCS (Tata Consultancy Services),
Hyderabad, India, since Jul 2011.
Qualifications:
Bachelor of Technology from SASTRA University, Thanjavur, Tamil Nadu,
with a CGPA of 7.7/10.
Technical Skills:
Languages : Core Java, SQL, PL/SQL, MapReduce, Pig, Sqoop, Hive, HBase
Servers : IBM WebSphere Application Server 7.0, WebLogic and Tomcat
Frameworks : Hadoop and .NET
Java IDEs : Eclipse Europa, PL/SQL Developer
Version Control / Tracking Tools : Visual SourceSafe (VSS)
Databases : DB2 9.x, MySQL, SQL Server, Oracle (SQL: DDL, DML, DCL; and PL/SQL)
Operating Systems : Windows 7, Windows XP, 2000, 2003, Unix and Linux
Project Details:
PROJECT #3:
Project Name : BestBuy – Web Intelligence
Client : BestBuy, Minneapolis, Minnesota, USA
Environment : Hadoop, Apache Pig, Hive, Sqoop, Java, UNIX, MySQL
Duration : Nov 2014 to Date
Role : Hadoop Developer
Description:
This project is about rehosting BestBuy's existing application onto the Hadoop
platform. Previously, BestBuy used a MySQL database to store its competitor
retailers' information (the crawled web data). Earlier, BestBuy had only four
competitor retailers, such as Amazon.com and Walmart.com. As the number of
competitor retailers grew, the data generated by web crawling increased
massively and could no longer be accommodated in a MySQL database. For this
reason, BestBuy moved the application to Hadoop, which can handle massive
amounts of data across its cluster nodes and satisfy the scaling needs of
BestBuy's business operations.
Roles and Responsibilities:
Moved all crawl-data flat files generated from various retailers to HDFS for
further processing.
Wrote Apache Pig scripts to process the HDFS data.
Created Hive tables to store the processed results in tabular format.
Developed Sqoop scripts to enable interaction between Pig and the MySQL
database.
Wrote script files for processing data and loading it to HDFS.
Wrote HDFS CLI commands.
Developed UNIX shell scripts for creating reports from Hive data.
Completely involved in the requirement analysis phase.
Created two different users (hduser for performing HDFS operations and
mapred user for performing MapReduce operations only).
Ensured NFS was configured for the NameNode.
Set up passwordless SSH for Hadoop.
Set up cron jobs to delete Hadoop logs, old local job files and cluster temp
files.
Set up Hive with MySQL as a remote metastore.
Moved all log/text files generated by various products into HDFS.
Wrote MapReduce code that takes log files as input, parses the logs and
structures them in tabular format to facilitate effective querying of the
log data.
Created an external Hive table on top of the parsed data.
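By way of illustration, an external Hive table over parsed log data of this kind might look like the following minimal sketch; the table name, columns and HDFS path are hypothetical, not taken from the project:

```sql
-- Hypothetical sketch: external Hive table over parsed crawl logs in HDFS.
-- Table name, columns and location are illustrative assumptions only.
CREATE EXTERNAL TABLE crawl_prices (
  retailer   STRING,
  sku        STRING,
  price      DECIMAL(10,2),
  crawled_at TIMESTAMP
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY '\t'
LOCATION '/data/crawl/parsed';
```

Because the table is EXTERNAL, dropping it removes only the metadata; the parsed files in HDFS remain available to other jobs.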
PROJECT #2: SERP (Society for Elimination of Rural Poverty)
Client : Govt. of Andhra Pradesh and Telangana
Project Title : SthreeNidhi
Environment : Windows XP Professional, Windows 7
Duration : Sep 2013 to Dec 2014
Role : PL/SQL Developer
Team Size : 12
Tools : SQL Server Management Studio
Description: SERP's mission is to enable disadvantaged communities to perceive
possibilities for change and to bring about the desired change by exercising
informed choices through collective action.
- The disadvantaged communities shall be empowered to overcome all social,
economic, cultural and psychological barriers through self-managed organizations.
SthreeNidhi Credit Cooperative Federation Ltd. is promoted by the Government
and the Mandal Samakhyas to supplement credit flow from the banking sector and
is a flagship programme of the Government. SthreeNidhi provides timely and
affordable credit to poor SHG members as part of SERP's overall strategy for
poverty alleviation.
SHGs can access hassle-free credit from SthreeNidhi as and when required using
their mobile phones, and therefore do not need to borrow from other sources at
usurious rates of interest. SthreeNidhi is in a position to extend credit to
SHGs even in far-flung areas of the state within 48 hours, to meet credit needs
for exigencies like health and education, and for income-generation needs like
agriculture, dairy and other activities. As credit availability is linked to
the grading of MSs and VOs, the community is keen to improve their functioning
in order to access higher credit limits from SthreeNidhi.
Contribution:
Performed DBA activities like creating and maintaining tables.
Exported and imported data from text, CSV and Excel files.
Involved in designing tables for screens.
Analyzed requirements and attended client meetings.
Developed procedures, functions and views for report generation.
Developed user-defined functions used throughout the solution.
Created and maintained indexes and views for performance tuning.
Developed fact tables for analysis reports.
Resolved defects at the time of production.
Applied optimizer hints for performance tuning.
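A report-generation procedure of the kind described above might look like this minimal T-SQL sketch; the procedure, table and column names are hypothetical, not from the project:

```sql
-- Hypothetical sketch of a report procedure in SQL Server (T-SQL).
-- Object names and columns are illustrative assumptions only.
CREATE PROCEDURE dbo.GetLoanDisbursementReport
    @FromDate DATE,
    @ToDate   DATE
AS
BEGIN
    SET NOCOUNT ON;
    -- Aggregate disbursements per self-help group in the given period.
    SELECT shg_id,
           SUM(amount) AS total_disbursed
    FROM   dbo.loan_disbursements
    WHERE  disbursed_on BETWEEN @FromDate AND @ToDate
    GROUP  BY shg_id;
END;
```

Wrapping the query in a procedure keeps the report logic in the database, so the reporting front end only passes the date range.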
PROJECT #1: TCS iON Education Solution
Client : TCS Internal
Customers : Manav Rachna International University, SASTRA University and a few
more engineering colleges and universities
Environment : Windows XP Professional, Oracle 11g
Duration : Aug 2011 to Jun 2013
Role : PL/SQL Developer
Team Size : 6
Tools : Pentaho Report Designer
Description: TCS iON, a cloud-based ERP solution, was conceptualized by TCS
through close interactions with Small and Medium Businesses (SMBs) and the
relevant stakeholders.
- iON Education Solution offers a wide range of cloud-based solutions with a
footprint that covers the entire value chain of the education ecosystem:
K-12 schools, affiliated colleges, vocational institutes, boards and
universities.
- This covers Student Life Cycle Management, Attendance Management, Fees
Management and Academic Operations Management.
Contribution:
Developed pre-configured and on-demand reports.
Used Pentaho Report Designer to design the format of the reports.
Analyzed requirements and designed reports.
Developed user-defined functions used throughout the solution.
Involved in customer conversations.
Followed up with the QA team during the testing phase.
Developed and maintained PL/SQL procedures and triggers.
Created indexes and views for performance tuning.
Documented good practices and logic for future reference.
Developed fact tables for on-demand reports.
Resolved defects at the time of production.
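The indexes and views created for performance tuning could be sketched as follows in Oracle SQL; the table, view and column names are hypothetical, for illustration only:

```sql
-- Hypothetical Oracle sketch: a supporting index plus a reporting view
-- of the kind used for performance tuning; names are illustrative only.
CREATE INDEX idx_attendance_student_dt
    ON attendance (student_id, attendance_date);

CREATE OR REPLACE VIEW v_monthly_attendance AS
SELECT student_id,
       TRUNC(attendance_date, 'MM') AS month,
       COUNT(*) AS days_present
FROM   attendance
WHERE  status = 'P'
GROUP  BY student_id, TRUNC(attendance_date, 'MM');
```

The composite index supports the view's filter and grouping columns, so monthly attendance reports avoid full-table scans.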