R.HariKrishna
Krishnareddy.revuri@gmail.com 8892662994
PROFESSIONAL SUMMARY:
Over 4 years of professional experience in the IT industry, involved in software development and
implementation of web-based applications using MVC architecture. 2.5 years of experience in
Hadoop development using HDFS, MapReduce, Pig, Hive and HBase, including designing, developing and
deploying n-tier, enterprise-level distributed applications.
• Good experience with MapReduce, Pig and Hive, and experienced in storing data in NoSQL
databases such as MongoDB and HBase.
• Proficient in writing MapReduce jobs, Hive queries and Pig scripts.
• Good knowledge of the MapReduce framework and HDFS architecture.
• Good knowledge of installation and configuration of Hadoop clusters.
• Good knowledge of Hadoop shell commands.
• Extensively worked with Hive (HQL) queries.
• Strong interpersonal skills with excellent analytical and problem-solving abilities; adapt easily to
changing business needs.
• Ability to meet deadlines and handle pressure while coordinating multiple tasks in the work
environment.
SKILLS PROFILE:
Operating Systems : UNIX, Windows.
Hadoop : HDFS, MapReduce, Hive, Pig, Kafka, Spark.
NoSQL Databases : HBase, MongoDB.
ML Technologies : Mahout, NLP, Python, data science packages.
Languages : Core Java, Scala, SQL.
J2EE Technologies : JDBC, Servlets, JSP.
Query Languages : Oracle SQL, PL/SQL.
Other Tools : Eclipse, Maven, SBT.
EDUCATION: B.Tech (CSE) from Jawaharlal Nehru Technological University, Hyderabad, A.P., in 2012.
WORK EXPERIENCE
Working as a Hadoop Developer at CMS IT Services Pvt Ltd, Bangalore.
# Project1: Sensor Analytics
Client : AMWAY
Duration : July 2016 to till date
Role : Team Member
Team Size : 9
Environment: HDFS, Flume, Kafka, Spark, Cassandra, Zeppelin, Scala.
Description:
The client provides Wi-Fi hotspot devices in public spaces. The existing system collects the data
from these devices and processes it in daily batches to generate the required results. Because there was a
large number of user-login failures, the team needed to analyse why logins were dropping, and to do so in
real time rather than in daily batches, with a reporting mechanism to view the insights obtained from the
analysis as they arrive. In simple terms, it is a real-time monitoring system.
Roles and Responsibilities :
• Set up the Hadoop cluster and ecosystem (Hortonworks and Apache distributions).
• Worked on Flume for moving large amounts of data from many different sources to a centralized
data store (HDFS).
• Worked on Kafka, a distributed commit log, for storing the data.
• Worked on Spark Streaming with Scala to access live data.
• Maintained the NoSQL database using Cassandra.
• Monitored the results using Zeppelin, a web-based notebook that allows interactive data analysis.
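The core of the real-time login analysis above can be illustrated with a small plain-Java sketch. This is not the production Spark Streaming code; it only shows the idea of tracking a sliding window of login events and reporting the failure rate as events arrive. All class and method names here are hypothetical.

```java
import java.util.ArrayDeque;
import java.util.Deque;

// Simplified stand-in for the streaming job: keep a sliding window of
// login attempts and expose the current failure rate "in real time".
public class LoginFailureMonitor {
    private final Deque<Boolean> window = new ArrayDeque<>();
    private final int windowSize;
    private int failures = 0;

    public LoginFailureMonitor(int windowSize) {
        this.windowSize = windowSize;
    }

    // Record one login attempt; evict the oldest event once the window is full.
    public void record(boolean success) {
        window.addLast(success);
        if (!success) failures++;
        if (window.size() > windowSize && !window.removeFirst()) {
            failures--;
        }
    }

    // Fraction of failed logins within the current window.
    public double failureRate() {
        return window.isEmpty() ? 0.0 : (double) failures / window.size();
    }

    public static void main(String[] args) {
        LoginFailureMonitor monitor = new LoginFailureMonitor(4);
        for (boolean e : new boolean[]{true, false, false, true, false}) {
            monitor.record(e);
        }
        System.out.println("failure rate: " + monitor.failureRate());
    }
}
```

In the real pipeline, Spark Streaming's windowed operations over the Kafka event stream would play the role of this hand-rolled window.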
# Project2: PRANTO TOOL
Client : AMWAY
Duration : Sep 2014 to July 2015
Role : Team Member
Team Size : 8
Environment: Hadoop, MapReduce, HBase, Flume, Spark, Scala, R.
Description:
Predictive analytics is the art and science of using data to make better-informed decisions.
It helps you uncover hidden patterns and relationships in your data, predict with greater confidence
what may happen in the future, and provide valuable, actionable insights for your organization.
• Analysed all server logs for predictive analytics.
• Performed analytics using machine learning algorithms.
Roles and Responsibilities :
• Set up the Hadoop cluster and ecosystem (Hortonworks and Apache distributions).
• Worked on Flume for moving large amounts of data from many different sources to a centralized
data store (HDFS).
• Wrote MapReduce jobs in Java to parse the logs stored in HDFS.
• Worked on Spark Streaming with Scala to access live data.
• Unit-tested the components; maintained the code and components.
• Maintained the NoSQL database using HBase.
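The log-parsing MapReduce work above follows the classic map/reduce shape; a minimal plain-Java sketch of that shape (without the Hadoop framework itself) might look like the following. The log-line layout and all names are assumptions for illustration only.

```java
import java.util.HashMap;
import java.util.Map;

// Plain-Java sketch of a MapReduce-style log parse: the "map" step
// extracts the log level from each line, the "reduce" step sums the
// counts per level. Assumes lines like "2014-09-01 ERROR disk full".
public class LogLevelCount {
    // Map phase: emit the level token (second whitespace-separated field).
    static String mapToLevel(String line) {
        String[] fields = line.split("\\s+");
        return fields.length > 1 ? fields[1] : "UNKNOWN";
    }

    // Reduce phase: aggregate the emitted keys into per-level totals.
    static Map<String, Integer> reduce(String[] lines) {
        Map<String, Integer> counts = new HashMap<>();
        for (String line : lines) {
            counts.merge(mapToLevel(line), 1, Integer::sum);
        }
        return counts;
    }

    public static void main(String[] args) {
        String[] logs = {
            "2014-09-01 ERROR disk full",
            "2014-09-01 INFO job started",
            "2014-09-02 ERROR timeout"
        };
        System.out.println(reduce(logs)); // per-level counts
    }
}
```

In an actual Hadoop job, `mapToLevel` would live in a `Mapper` and the summation in a `Reducer`, with the framework handling the shuffle between them.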
# Project3: PRODUCTION CREDIT REPORTING
Environment: Hadoop HDFS, MapReduce, Hive, Pig, Sqoop.
Description:
PCR is a production-credits calculation that captures banking information from regional systems.
The project focuses on collecting data about banking transactions from different sources. The transactions
are then analysed with MapReduce programs to find different user patterns, and a visualization layer helps
users track their financial activity across many scenarios. The system captures data through CSV files and
direct database pulls from approximately 60 different front-office systems and manual sources throughout
the world.
Roles and Responsibilities :
• Set up Hadoop, Pig and Hive over multiple nodes.
• Wrote MapReduce jobs in Java to parse the logs stored in HDFS.
• Stored and retrieved data using HQL queries in Hive.
• Implemented Hive tables and HQL queries for the reports.
• Involved in database transfers using Sqoop.
• Archived files into HAR files.
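The transaction-pattern analysis described above boils down to grouping CSV transaction rows and aggregating per user. A small plain-Java sketch of that aggregation follows; the CSV column order (`user,amount`) and all names are assumptions for this example, not the production schema.

```java
import java.util.HashMap;
import java.util.Map;

// Sketch of the per-user aggregation the MapReduce job performed over
// banking transactions. Input rows are assumed to be "user,amount" CSV.
public class CreditAggregator {
    // Sum transaction amounts per user.
    static Map<String, Double> totalsPerUser(String[] csvRows) {
        Map<String, Double> totals = new HashMap<>();
        for (String row : csvRows) {
            String[] cols = row.split(",");
            totals.merge(cols[0], Double.parseDouble(cols[1]), Double::sum);
        }
        return totals;
    }

    public static void main(String[] args) {
        String[] rows = {"alice,100.0", "bob,40.0", "alice,60.0"};
        System.out.println(totalsPerUser(rows)); // per-user totals
    }
}
```

In the real project the same grouping was expressed as Hive tables and HQL `GROUP BY` queries over data landed in HDFS via Sqoop and CSV loads.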
PERSONAL PROFILE:
Father’s Name : R.Ganapathi Reddy
Date of Birth : 20th April 1990
Sex : Male
Marital Status : Single
Nationality : Indian
Languages Known : English, Telugu and Hindi
DECLARATION:
I hereby declare that all the above-mentioned information is true to the best of my
knowledge.
(R.HariKrishna)