Y. HIMABINDU
Email: binduvemula1990@gmail.com Mobile: +91-8142872101
Career Objective:-
To take up a challenging career and position in an organization of repute in the software
industry, which will give me an opportunity to work with the latest technologies and grow
with the organization.
Professional Summary:-
3+ years of overall IT experience in application development in Java and Big Data
Hadoop.
1.2 years of exclusive experience in Hadoop and its components: HDFS, MapReduce,
Apache Pig, Hive, Sqoop, HBase, Oozie, and MongoDB.
Extensive experience in setting up Hadoop clusters.
Good working knowledge of MapReduce and Apache Pig.
Involved in writing Pig scripts to reduce job execution time.
Have executed projects using Java/J2EE technologies such as Core Java, Servlets, and JSP.
Well experienced in designing and developing both server-side and client-side
applications.
Expertise in working with Core Java applications.
Good knowledge of OOP concepts.
Knowledge of Oracle SQL and PL/SQL.
Highly motivated and detail-oriented, with the ability to work independently and as
part of a team, and with excellent technical and analytical skills.
Exceptional ability to learn new concepts.
Hard-working and enthusiastic.
Knowledge of Flume and NoSQL.
Knowledge of Spark and Scala.
Technical Skills:-
Hadoop: HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase.
Databases: Oracle SQL.
Operating Systems: Linux, Windows 7.
Programming Languages: Core Java.
Education:-
B.Tech in Computer Science and Engineering from MRRITS with an aggregate of 75%,
completed in 2011.
Professional Details:-
Currently working as a Software Engineer at PROKARMA SOFTTECH since 2013.
PROJECT DETAILS:-
PROJECT #3
Title : I-DASH
Client : Best Info
Environment : Hadoop, Apache Pig, Hive, Sqoop, MySQL
Analytical Tools : Machine Learning, Predictive Analysis
Role : Hadoop Developer
Duration : March 2015 to May 2016
Description:
The purpose of the project is to store terabytes of log information generated by the
e-commerce website and extract meaningful information from it. The solution is based on
the open-source Big Data software Hadoop. The data is stored in the Hadoop file system
and processed using MapReduce jobs, which in turn includes getting the raw HTML data
from the websites, processing the HTML to obtain product and pricing information,
extracting various reports from the product pricing information, and exporting the
information for further processing.
Machine Learning concepts are used for product recommendation, product
classification, and pattern matching of certain products. This project is mainly a
replatforming of the existing system, which ran on WebHarvest (a third-party JAR) with a
MySQL database, onto a new cloud technology, Hadoop, which can process large data sets
(terabytes and petabytes) to meet the client's requirements amid increasing competition
from other retailers.
Roles and Responsibilities:
1. Moved all crawl-data flat files generated from various retailers to HDFS for further
processing.
2. Wrote Apache Pig scripts to process the HDFS data.
3. Created Hive tables to store the processed results in a tabular format.
4. Developed Sqoop scripts to enable interaction between Pig and the MySQL database.
5. For the dashboard solution, developed the Controller, Service, and DAO layers of the
Hibernate framework.
6. Involved in resolving Hadoop-related JIRAs.
7. Developed UNIX shell scripts for creating reports from Hive data.
8. Fully involved in the requirement analysis phase.
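As an illustration of the kind of per-product aggregation the Pig scripts performed on the crawl data, here is a minimal plain-Java sketch. The record layout (`retailer,productId,price`), the field order, and the lowest-price aggregation are assumptions for illustration, not the actual production logic, which ran as Pig jobs on HDFS:

```java
import java.util.*;

// Illustrative sketch only: the real processing ran as Apache Pig jobs on HDFS.
// This shows the shape of a GROUP ... MIN aggregation over crawl records.
class CrawlPricing {

    // Each record is assumed to be "retailer,productId,price".
    public static Map<String, Double> minPricePerProduct(List<String> records) {
        Map<String, Double> minPrices = new HashMap<>();
        for (String record : records) {
            String[] fields = record.split(",");
            if (fields.length != 3) {
                continue; // skip malformed lines, as a Pig FILTER would
            }
            String productId = fields[1];
            double price = Double.parseDouble(fields[2]);
            // Keep the lowest price seen for each product.
            minPrices.merge(productId, price, Math::min);
        }
        return minPrices;
    }
}
```

The resulting per-product minimums correspond to what the Hive tables stored and the Sqoop scripts later exported to MySQL.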
PROJECT #2
Title : IPM - Intellectual Property Management
Team Size : 4
Client : IPM
Role : Team Member
Duration : March 2014 to Feb 2015
Technologies : Tomcat 7.0, Globalscape, Jersey Web Services
This is a SOA application aimed at serving EFT users of the DMS tool popularly known as
Globalscape for scheduled file transfers, and at invoking a service that performs
file-processing operations such as file-name validation, signature verification, decrypt,
unzip, and untar, and then moves the files to a retention location after the file-processing
and cryptographic operations are done. An effective logging implementation is also
included, which records each file's movement from the moment it entered to the moment
it exited.
Involved in developing the POC for the project.
Involved in the development of jobs for transferring data files between the
servers.
Developed EXEs for logging the SFTP file information and the file-share
information.
Developed an EXE used to invoke the service through GS.
Developed code for file processing and cryptographic operations.
Gave technical guidance to the team as and when required.
Involved in development, unit testing, UAT, pre-prod testing, and prod
deployment of the job.
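The file-name validation step mentioned above can be sketched as a small Java check. The naming convention used here (an upper-case source code, an eight-digit date, and a known extension) is a hypothetical placeholder; the actual rules were client-specific:

```java
import java.util.regex.Pattern;

// Illustrative sketch only: the real validation rules were client-specific.
// The assumed convention SOURCE_YYYYMMDD.ext is a placeholder showing the
// shape of the file-name validation step before decrypt/unzip/untar.
class FileNameValidator {

    private static final Pattern NAME_PATTERN =
            Pattern.compile("[A-Z]+_\\d{8}\\.(zip|tar|gz|pgp)");

    // Returns true when the incoming file name matches the expected
    // convention, so the service can route it on to further processing.
    public static boolean isValid(String fileName) {
        return fileName != null && NAME_PATTERN.matcher(fileName).matches();
    }
}
```

Files failing this check would be rejected before any cryptographic operation, keeping the retention location free of malformed inputs.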
PROJECT #1
Title : HBO-MIDAS
Team Size : 40
Client : Home Box Office, New York, USA
Role : Team Member
Duration : March 2013 to Feb 2014
Technologies : Java 1.5, Oracle 9i, WebLogic
Description:
The Subscriber Management and Revenue Tracking application (SMART) is a critical
application utilized by HBO's Cash and Revenue Operations department ("CRO"). It is used
by CRO for tracking subscribers and revenue/billing for the HBO services.
HBO is replacing SMART with a new Management of Invoices for Deals, Affiliates and
Subscribers (MIDAS) to be developed using a software language that meets HBO and
industry standards.
A project assumption is that MIDAS will be developed to utilize the same underlying
database currently used by SMART. It is also assumed that MIDAS will provide the same
functionality and reporting capabilities as SMART.
Roles and Responsibilities:
1. Involved in development and integration.
2. Involved in integration of the Deals and Billing modules.
3. Involved in defect tracking and bug fixing.
Declaration:
I hereby declare that the information furnished above is true to the best of my knowledge
and belief.
Place: Hyderabad.
Date: