Y.HIMABINDU
Email: binduvemula1990@gmail.com Mobile: +91-8142872101
Career Objective:-
To take up a challenging position in a software organization of repute, one that will give me an opportunity to work with the latest technologies and grow with the organization.
Professional Summary:-
 3+ years of overall IT experience in application development in Java and Big Data Hadoop.
 1.2 years of exclusive experience in Hadoop and its components, including HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase, Oozie, and MongoDB.
 Extensive experience in setting up Hadoop clusters.
 Good working knowledge of MapReduce and Apache Pig.
 Wrote Pig scripts to reduce job execution time.
 Executed projects using Java/J2EE technologies such as Core Java, Servlets, and JSP.
 Experienced in designing and developing both server-side and client-side applications.
 Expertise in Core Java applications.
 Good knowledge of OOP concepts.
 Knowledge of Oracle SQL and PL/SQL.
 Highly motivated and detail-oriented, able to work independently or as part of a team, with excellent technical and analytical skills.
 Exceptional ability to learn new concepts.
 Hard-working and enthusiastic.
 Knowledge of Flume and NoSQL databases.
 Knowledge of Spark and Scala.
Technical Skills:-
 Hadoop : HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase
 Databases : Oracle SQL
 Operating Systems : Linux, Windows 7
 Programming Language : Core Java
EDUCATION:
 B.Tech in Computer Science and Engineering from MRRITS, 2011, with an aggregate of 75%.
Professional Details:-
 Currently working as a Software Engineer at PROKARMA SOFTTECH, from 2013 to date.
PROJECT DETAILS:-
PROJECT #3
Title : I-DASH
Client : Best Info
Environment : Hadoop, Apache Pig, Hive, Sqoop, MySQL
Analytical Tools : Machine Learning, Predictive Analytics
Role : Hadoop Developer
Duration : March 2015 to May 2016
Description:
The purpose of the project is to store terabytes of log information generated by the e-commerce website and to extract meaningful information from it. The solution is based on the open-source Big Data software Hadoop: the data is stored in the Hadoop Distributed File System and processed using MapReduce jobs. This in turn includes fetching raw HTML data from the retailer websites, processing the HTML to obtain product and pricing information, extracting various reports from the product-pricing information, and exporting the results for further processing.
Machine Learning concepts are used for product recommendation, product classification, and pattern matching of certain products. The project is mainly a replatforming of the existing system, which runs on WebHarvest (a third-party JAR) with a MySQL database, onto Hadoop, which can process very large data sets (terabytes to petabytes) in order to meet the client's requirements amid increasing competition among its retailers.
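As a rough illustration of the processing stage described above, the following Pig Latin sketch loads crawl records and derives a simple pricing summary. The input path, delimiter, and field names are assumptions for illustration, not the project's actual schema.

    -- Hypothetical sketch: load raw crawl records and derive pricing information.
    raw_crawl = LOAD '/data/crawl/retailers' USING PigStorage('\t')
        AS (retailer:chararray, product_id:chararray, product_name:chararray,
            price:double, crawl_date:chararray);

    -- Keep only records with a usable price.
    priced = FILTER raw_crawl BY price IS NOT NULL AND price > 0.0;

    -- Lowest price per product across retailers, for pricing reports.
    by_product = GROUP priced BY product_id;
    min_price = FOREACH by_product GENERATE
        group AS product_id,
        MIN(priced.price) AS lowest_price;

    STORE min_price INTO '/data/reports/min_price' USING PigStorage('\t');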
Roles and Responsibilities:
1. Moved the crawl-data flat files generated from various retailers into HDFS for further processing.
2. Wrote Apache Pig scripts to process the HDFS data (as sketched above).
3. Created Hive tables to store the processed results in tabular format (see the DDL sketch after this list).
4. Developed Sqoop scripts to move data between the Pig output on HDFS and the MySQL database (see the export sketch after this list).
5. For the dashboard solution, developed the Controller, Service, and DAO layers on the Hibernate framework.
6. Resolved Hadoop-related JIRA issues.
7. Developed UNIX shell scripts to create reports from Hive data (see the report-script sketch after this list).
8. Fully involved in the requirement-analysis phase.
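A minimal HiveQL sketch of the kind of table used to expose the processed results in tabular format, assuming the Pig output layout above; the table and column names are illustrative only.

    -- Hypothetical external table over the Pig output directory.
    CREATE EXTERNAL TABLE IF NOT EXISTS product_min_price (
        product_id   STRING,
        lowest_price DOUBLE
    )
    ROW FORMAT DELIMITED
    FIELDS TERMINATED BY '\t'
    LOCATION '/data/reports/min_price';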
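A hedged sketch of a Sqoop export that would push those results into MySQL; the JDBC URL, credentials file, and table name are assumptions. Reading the password from a file keeps credentials out of the shell history.

    # Hypothetical Sqoop export from the HDFS results directory into MySQL.
    sqoop export \
      --connect jdbc:mysql://dbhost:3306/pricing \
      --username reports \
      --password-file /user/hadoop/.db_password \
      --table product_min_price \
      --export-dir /data/reports/min_price \
      --input-fields-terminated-by '\t'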
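And a small UNIX shell sketch of the kind of report script mentioned in item 7, assuming the Hive table above exists:

    #!/bin/bash
    # Hypothetical report script: run a Hive query and save the result as a dated CSV.
    set -euo pipefail
    OUT=/tmp/min_price_report_$(date +%F).csv
    hive -e "SELECT product_id, lowest_price FROM product_min_price" | tr '\t' ',' > "$OUT"
    echo "Report written to $OUT"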
PROJECT #2
Title : IPM - Intellectual Property Management
Team Size : 4
Client : IPM
Role : Team Member
Duration : March 2014 to Feb 2015
Technologies : Tomcat 7.0, Globalscape, Jersey Web Services
This is an SOA application that enables EFT users to use the DMS tool popularly known as Globalscape for scheduled file transfers and to invoke a service that performs file-processing operations such as file-name validation, signature verification, decryption, unzip, and untar, then moves the files to a retention location once the file-processing and cryptographic operations are complete. An effective logging implementation is also included, which records the file's movement from the moment it enters to the moment it exits.
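A minimal Java sketch of the validate-then-archive step described above; the file-naming convention, paths, and the omitted signature/decrypt/unarchive steps are assumptions, not the project's actual implementation.

    import java.nio.file.*;
    import java.util.regex.Pattern;

    public class FileProcessor {

        // Assumed naming convention for incoming feed files, e.g. FEED_20140301.dat.
        private static final Pattern NAME_PATTERN =
                Pattern.compile("[A-Z]+_\\d{8}\\.dat");

        public static void process(Path incoming, Path retentionDir) throws Exception {
            String name = incoming.getFileName().toString();
            if (!NAME_PATTERN.matcher(name).matches()) {
                throw new IllegalArgumentException("Invalid file name: " + name);
            }
            // Signature verification, decryption, unzip, and untar would run here,
            // before the processed file is archived (omitted in this sketch).
            Files.createDirectories(retentionDir);
            Files.move(incoming, retentionDir.resolve(name),
                    StandardCopyOption.REPLACE_EXISTING);
            System.out.println("Moved " + name + " to retention: " + retentionDir);
        }

        public static void main(String[] args) throws Exception {
            process(Paths.get(args[0]), Paths.get(args[1]));
        }
    }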
 Developed the POC for the project.
 Developed jobs for transferring data files between servers.
 Developed executables for logging SFTP file information and file-share information.
 Developed an executable used to invoke the service through Globalscape (GS).
 Developed code for the file-processing and cryptographic operations (sketched above).
 Provided technical guidance to the team as and when required.
 Involved in development, unit testing, UAT, pre-prod testing, and prod deployment of the job.
PROJECT #1
Title : HBO-MIDAS
Team Size : 40
Client : Home Box Office, New York, USA
Role : Team Member
Duration : March 2013 to Feb 2014
Technologies : Java 1.5, Oracle 9i, WebLogic
Description:
The Subscriber Management and Revenue Tracking application (SMART) is a critical application used by HBO's Cash and Revenue Operations department ("CRO") for tracking subscribers and revenue/billing for HBO services.
HBO is replacing SMART with a new application, Management of Invoices for Deals, Affiliates and Subscribers (MIDAS), to be developed using a software language that meets HBO and industry standards.
A project assumption is that MIDAS will be developed to utilize the same underlying
database currently used by SMART. It is also assumed that MIDAS will provide the same
functionality and reporting capabilities as SMART.
Roles and Responsibilities:
1. Involved in development and integration.
2. Involved in the integration of the Deals and Billing modules.
3. Involved in defect tracking and bug fixing.
Declaration:
I hereby declare that the information furnished above is true to the best of my knowledge and belief.
Place: Hyderabad.
Date:
HIMABINDU Y