M.V. Rama Kumar
Mobile: +91-9014514500
Email: matururamakumar@gmail.com
Professional Summary:
• 3 years of overall IT experience in application development in Java and Big Data Hadoop.
• 1.6 years of exclusive experience in Hadoop and its components: HDFS, MapReduce, Apache Pig, Hive, Sqoop, HBase and Oozie.
• Extensive experience in setting up Hadoop clusters.
• Capable of processing large sets of structured, semi-structured and unstructured data and supporting systems application architecture.
• Involved in writing Pig scripts to reduce job execution time.
• Experience in importing and exporting data between HDFS and relational database systems using Sqoop.
• Extended Hive and Pig core functionality by writing custom UDFs (see the sketch after this list).
• Experience in loading data into Hive tables.
• Experience in analyzing data using HiveQL, Pig Latin and custom MapReduce programs in Java.
• Knowledge of job workflow scheduling tools such as Oozie.
• Experience in capturing data from existing databases that provide SQL interfaces, using Sqoop.
• Good working knowledge of Sqoop, Apache Pig and Apache Oozie.
• Excellent communication, interpersonal and analytical skills, and a strong ability to perform as part of a team.
• Exceptional ability to learn new concepts.
• Hard-working and enthusiastic.
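The custom-UDF bullet above refers to the extension point Hive exposes for user code. Below is a minimal sketch of such a UDF, assuming a hypothetical MaskAccountUDF class; the resume does not show the UDFs actually written.

```java
// Minimal Hive UDF sketch (hypothetical example, not a UDF from the projects
// below): masks an account number, keeping only the last four characters.
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public class MaskAccountUDF extends UDF {
    public Text evaluate(Text input) {
        if (input == null) {
            return null;                          // pass Hive NULLs through
        }
        String s = input.toString();
        if (s.length() <= 4) {
            return input;                         // too short to mask
        }
        StringBuilder out = new StringBuilder();
        for (int i = 0; i < s.length() - 4; i++) {
            out.append('*');                      // replace leading characters
        }
        out.append(s.substring(s.length() - 4));  // keep the last four
        return new Text(out.toString());
    }
}
```

Once packaged into a JAR, a UDF of this shape is registered in a Hive session with ADD JAR mask-udf.jar; and CREATE TEMPORARY FUNCTION mask_account AS 'MaskAccountUDF'; after which it can be called like any built-in function in HiveQL.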
Qualifications:

Program of Study | Specialization | Institute | Board/University | Year of Passing | %
M.C.A | CS | Prakasam Eng College | JNTUK | 2012 | 70%
Technical Skills:
Languages : SQL, Pig Latin, HiveQL, Core Java, MapReduce
Big Data Technologies : HDFS, MapReduce, Pig, Sqoop, Hive, HBase, Oozie
Frameworks : Hadoop
Java IDEs : Eclipse
Operating Systems : Windows 7, Windows XP and Linux
Work Experience
• Currently working as a System Analyst with Global InfoVision Pvt Ltd, since September 2012.
Roles & Responsibilities
• Deploying and configuring the components of the system infrastructure on an ongoing basis.
• Understanding the tools required by the client and configuring them in the business environment, with reusable documentation.
• Giving knowledge transfer (KT) sessions to team members on newly configured tools.
• Configuring basic Hadoop Big Data setups on the Linux platform.
• Providing application support (administration, configuration and monitoring) for both off-the-shelf and home-grown applications.
• Supporting 100+ servers across the Dev, IDE, Staging QA and Prod environments.
• Working closely with the software development, integration, quality assurance and production teams from the design phase through deployment.
• Troubleshooting client-related issues.
• Designed, configured and managed backup and recovery for Hadoop data (a sketch of an HDFS backup copy follows this list).
• Optimized and tuned the Hadoop environment to meet performance requirements.
• Proficient in installing and configuring Hive, Pig, HBase and other Hadoop tools.
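As a concrete illustration of the backup-and-recovery bullet, here is a minimal sketch using the Hadoop FileSystem API. The NameNode URI and paths are placeholders, not details of the actual environment.

```java
// Minimal sketch of an HDFS-to-HDFS backup copy via the Hadoop FileSystem API.
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.FileUtil;
import org.apache.hadoop.fs.Path;

public class HdfsBackup {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode:8020");   // placeholder URI

        FileSystem fs = FileSystem.get(conf);
        Path source = new Path("/data/warehouse");          // data to back up
        Path backup = new Path("/backup/warehouse-" + System.currentTimeMillis());

        // Copy without deleting the source; returns false on failure.
        boolean ok = FileUtil.copy(fs, source, fs, backup, false, conf);
        System.out.println(ok ? "Backup completed: " + backup : "Backup failed");
        fs.close();
    }
}
```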
Project#1:
Project Name : Sales Analytics
Environment : Hadoop, HDFS, Oozie, MapReduce, HBase, Pig, Hive, Sqoop, Java
Duration : Jan 2015 to date
Role : Hadoop Developer
Project Description
• We analyze historical sales information and generate monthly reports, comparing sales between months and across product categories, analyzing growth and decline in sales, and producing weekly, monthly and yearly reports.
• We forecast future sales and check whether weekly and yearly targets were achieved.
• When selling products, enterprises usually reserve a certain amount of safety stock to prevent stock-outs caused by uncertain events.
• The amount of safety stock directly affects inventory costs and the number of sales.
• We determine the amount of future safety stock from previous sales reports (see the sketch after this list).
• A cluster of 7 nodes was set up using Cloudera Manager 4.0. Various data sources such as CSV files, XML files and RDBMS tables were used.
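The safety-stock bullet leaves the calculation unspecified; the following is a minimal sketch of one common textbook formula (safety stock = z × σ of daily demand × √lead time), not necessarily the model used in the project. All names and numbers are purely illustrative.

```java
// Illustrative safety-stock calculation; the project's actual model is not
// specified in the resume, so this is an assumption for demonstration only.
public class SafetyStock {
    // z: service-level factor (e.g. 1.65 for ~95% service level)
    // sigmaDaily: standard deviation of daily demand, from historical sales
    // leadTimeDays: supplier lead time in days
    static double safetyStock(double z, double sigmaDaily, double leadTimeDays) {
        return z * sigmaDaily * Math.sqrt(leadTimeDays);
    }

    public static void main(String[] args) {
        // e.g. sigma of 40 units/day, 9-day lead time, ~95% service level
        System.out.printf("Safety stock: %.0f units%n", safetyStock(1.65, 40, 9));
    }
}
```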
Roles & Responsibilities
• Involved in setting up the development environment.
• Developed MapReduce jobs in Java for data cleaning and pre-processing.
• Developed MapReduce programs to parse the raw data, populate staging tables and store the refined data in partitioned tables in the EDW.
• Assisted with data capacity planning and node forecasting.
• Created Hive queries that helped market analysts spot emerging trends by comparing fresh data with EDW reference tables and historical metrics.
• Wrote Hive and Pig queries over different data sets and joined them.
• Used Sqoop for data transfer between RDBMS and HDFS.
• Improved performance by introducing a Combiner and a custom Partitioner (a sketch follows this list).
• Hadoop administration and support: log monitoring and node monitoring for performance issues, using Cloudera Manager and system log files.
• Worked on setting up a cluster with a High Availability NameNode configuration (active and standby) through NFS.
• Supported code/design analysis, strategy development and project planning.
• Created scripts to periodically delete Hadoop log files.
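For the Combiner/Partitioner bullet, here is a minimal sketch of how both are wired into a MapReduce job. The key format and class names are illustrative assumptions, not taken from the project code.

```java
import java.io.IOException;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Partitioner;
import org.apache.hadoop.mapreduce.Reducer;

public class SalesJobConfig {

    // Sums per-key counts; doubles as the combiner.
    public static class SumReducer
            extends Reducer<Text, IntWritable, Text, IntWritable> {
        @Override
        protected void reduce(Text key, Iterable<IntWritable> values, Context ctx)
                throws IOException, InterruptedException {
            int sum = 0;
            for (IntWritable v : values) {
                sum += v.get();
            }
            ctx.write(key, new IntWritable(sum));
        }
    }

    // Routes keys of the assumed form "category|month" by category so all
    // months of one product category reach the same reducer.
    public static class CategoryPartitioner
            extends Partitioner<Text, IntWritable> {
        @Override
        public int getPartition(Text key, IntWritable value, int numPartitions) {
            String category = key.toString().split("\\|", 2)[0];
            return (category.hashCode() & Integer.MAX_VALUE) % numPartitions;
        }
    }

    public static void configure(Job job) {
        job.setReducerClass(SumReducer.class);
        job.setCombinerClass(SumReducer.class);   // local pre-aggregation
        job.setPartitionerClass(CategoryPartitioner.class);
    }
}
```

Reusing the reducer as the combiner is valid here because summing is commutative and associative; the combiner pre-aggregates map output locally and shrinks the shuffle, which is where the performance gain comes from.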
Project#2:
Project Name : Sentiment Analysis on Customer Evolution in the Banking Domain
Environment : Hadoop, HDFS, Apache Pig, Sqoop, Java, Linux, MySQL
Duration : May 2014 to Sep 2014
Role : Hadoop Developer
Project Description:
The purpose of the project is to store the bank's historical data, extract meaningful information from it and, based on that information, predict the customer's category. The solution is based on the open-source Big Data software Hadoop. The data is stored in the Hadoop file system and processed using MapReduce jobs for the product and pricing information.
Roles and Responsibilities:
• Involved in developing the Pig scripts (a sketch of running such scripts from Java follows this list).
• Developed Sqoop scripts to integrate Pig with the MySQL database.
• Fully involved in the requirement analysis phase.
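A minimal sketch of how Pig scripts like these can be driven from Java via PigServer, over data Sqoop has already landed in HDFS. The path, field schema and aggregation are assumptions, not the project's actual script.

```java
import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

public class CustomerPigRunner {
    public static void main(String[] args) throws Exception {
        PigServer pig = new PigServer(ExecType.MAPREDUCE);

        // Data previously imported from MySQL with a Sqoop import job
        // (placeholder path and schema).
        pig.registerQuery("raw = LOAD '/user/etl/customers' USING PigStorage(',')"
                + " AS (id:int, category:chararray, balance:double);");
        pig.registerQuery("grouped = GROUP raw BY category;");
        pig.registerQuery("stats = FOREACH grouped GENERATE group,"
                + " COUNT(raw) AS customers, AVG(raw.balance) AS avg_balance;");

        // Write the aggregated result back to HDFS.
        pig.store("stats", "/user/etl/customer_stats");
        pig.shutdown();
    }
}
```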
Project#3:
Project Name : Preparing a Web Interface for HBase
Environment : Hadoop, HDFS, Apache Pig, Sqoop, MapReduce, MySQL, HBase
Duration : Sep 2014 to Dec 2014
Role : Hadoop Developer
Project Description:
The purpose of the project is to analyze the effectiveness and validity of controls and to store terabytes of log information generated by the source providers as part of that analysis, extracting meaningful information from it. The solution is based on the open-source Big Data software Hadoop. The data is stored in the Hadoop file system and processed using MapReduce jobs, which in turn involves getting the raw data, processing it to obtain controls and redesign/change history information, extracting various reports from the controls history, and exporting the information for further processing.
Roles and Responsibilities:
• Fetched data from HDFS into HBase using Pig, Hive and MapReduce.
• Fetched data from an existing MySQL database into HBase using Sqoop and MapReduce.
• Prepared a web UI for the HBase database supporting CRUD operations such as put, get, scan, delete and update (the client calls involved are sketched after this list).
• Provided a UI where a user's query is parsed, executed internally, and the result displayed on the web interface.
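The CRUD operations behind such a web UI map directly onto the HBase Java client API. A minimal sketch follows, using the 0.94-era API that matches the project's timeframe; the table name and column schema are placeholders.

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.client.*;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseCrudDemo {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();
        HTable table = new HTable(conf, "controls_history");  // placeholder table

        byte[] cf = Bytes.toBytes("d");                       // column family
        byte[] row = Bytes.toBytes("row1");

        // put: create or update a cell
        Put put = new Put(row);
        put.add(cf, Bytes.toBytes("status"), Bytes.toBytes("VALID"));
        table.put(put);

        // get: read a single row
        Result result = table.get(new Get(row));
        System.out.println(Bytes.toString(result.getValue(cf, Bytes.toBytes("status"))));

        // scan: iterate over all rows
        ResultScanner scanner = table.getScanner(new Scan());
        for (Result r : scanner) {
            System.out.println(Bytes.toString(r.getRow()));
        }
        scanner.close();

        // delete: remove the row
        table.delete(new Delete(row));
        table.close();
    }
}
```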
Project#4:
Project Name : Severity Claims System (SCS)
Duration : Sep 2012 to Apr 2014
Role : Developer
Type of Project : Ticket Support and Maintenance
Project Description:
SCS (Severity Claims System) is a core insurance claim processing application that caters to the Excess Claims, Environmental and Toxic Tort LOBs (Lines of Business). It aims to identify potential severity claims, detect and eliminate duplicate claims, calculate AIG's potential exposure on claims, and optimize the claims-processing workflow.
Roles and Responsibilities:
• Performing root-cause analysis for user requests.
• Performance tuning and query optimization.
• Ensuring smooth knowledge transition to team members.
• Attending status meetings.
• Peer-reviewing deliverables and documenting defects.
• Proposing new suggestions for application performance improvement.
Operating System : Windows XP, Windows 7, OS/390
Languages : Core Java, HTML, JCL, COBOL, DB2
Special Software : Eclipse, PVCS, Query Tool
PERSONAL DETAILS:
Name : M.V. Rama Kumar
Address : 2-147/A, Chandole (Post), Pittalavani Palem (Md), Guntur (Dist), A.P. - 522311
D.O.B : 12-11-1987
Languages Known : English, Telugu, Hindi
Nationality : Indian
Interests : Playing cricket
