Srikanth K :srikanthkamatam1@gmail.com
Hadoop Developer : +91-7075436413
EXECUTIVE SUMMARY
 3+ years of IT experience in Big Data and Hadoop.
 Hands-on experience with HDFS, MapReduce, Apache Pig, Hive, Sqoop and HBase.
 A dedicated team player, committed to providing high-quality support, with excellent problem-solving skills.
 Good communication and interpersonal skills; self-motivated.
TECHNICAL SKILLS
 Hadoop : HDFS, MapReduce, Apache Pig, Hive, Sqoop and HBase
 Operating Systems : Windows Server 2003/2008, Unix, CentOS Linux
 Relational Databases : MySQL, Oracle
 Java : Core Java
EDUCATION
 M.C.A. from Jawaharlal Nehru Technological University (JNTU).
 B.Sc. (M.P.Cs) from Osmania University.
PROFESSIONAL EXPERIENCE
 Working for ADP India Pvt Ltd through Alwasi Software Pvt Ltd, Hyderabad, from March 2012 to date.
 Trained in Big Data / Hadoop and deployed to the client.
PROJECT DETAILS
PROJECT : LOWES Re-hosting of Web Intelligence
Environment : Hadoop, Apache Pig, Hive, Sqoop, Java, Linux, MySQL
Duration : Jan 2014 to date
Description:
The purpose of the project is to store the terabytes of log information generated by the ecommerce website and extract meaningful information from it. The solution is based on the open-source Big Data software Hadoop: the data is stored in the Hadoop file system (HDFS) and processed with MapReduce jobs. This in turn includes getting the raw HTML data from the websites, processing the HTML to obtain product and pricing information, extracting various reports from the product pricing information, and exporting the information for further processing.
This project is mainly a re-platforming of the existing system, which runs on Web Harvest (a third-party JAR) with a MySQL database, onto Hadoop, which can process large data sets (terabytes and petabytes of data) to meet the client's requirements amid increasing competition from other retailers.
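As a rough illustration of the MapReduce stage described above, here is a minimal sketch in core Java, assuming the crawl stage emits tab-separated productId/price records; the class name, field positions and paths are hypothetical, not the production code:

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Sketch: extract (productId, price) pairs from tab-separated crawl
// records in HDFS and keep the lowest observed price per product.
public class ProductPriceJob {

    public static class PriceMapper extends Mapper<LongWritable, Text, Text, Text> {
        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            // Assumed layout: productId <TAB> price <TAB> raw html ...
            String[] fields = value.toString().split("\t");
            if (fields.length >= 2) {
                ctx.write(new Text(fields[0]), new Text(fields[1]));
            }
        }
    }

    public static class MinPriceReducer extends Reducer<Text, Text, Text, Text> {
        @Override
        protected void reduce(Text key, Iterable<Text> values, Context ctx)
                throws IOException, InterruptedException {
            double min = Double.MAX_VALUE;
            for (Text v : values) {
                try {
                    min = Math.min(min, Double.parseDouble(v.toString()));
                } catch (NumberFormatException ignored) {
                    // Skip malformed price fields from bad crawl lines.
                }
            }
            if (min < Double.MAX_VALUE) {
                ctx.write(key, new Text(String.valueOf(min)));
            }
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "product-price");
        job.setJarByClass(ProductPriceJob.class);
        job.setMapperClass(PriceMapper.class);
        job.setReducerClass(MinPriceReducer.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(Text.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}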
Responsibilities:
 Participated in client calls to gather and analyse the requirements.
 Moved all crawl data flat files generated from various retailers to HDFS for further
processing.
 Wrote Apache Pig scripts to process the HDFS data (see the Pig sketch after this list).
 Created Hive tables to store the processed results in tabular format.
 Developed Sqoop scripts to move data between the Pig output on HDFS and the MySQL database.
 For the dashboard solution, developed the Controller, Service and DAO layers using the Spring Framework.
 Developed scripts to create reports from the Hive data.
 Completely involved in the requirement analysis phase.
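A minimal sketch of how such a Pig script can be driven from Java through the PigServer API; the relation names, paths and schema are hypothetical stand-ins for the retailer-specific crawl formats:

import java.io.IOException;
import org.apache.pig.ExecType;
import org.apache.pig.PigServer;

// Sketch: run a small Pig pipeline over crawl data in HDFS.
// Paths, schema and filter are illustrative, not the production script.
public class CrawlDataPigRunner {
    public static void main(String[] args) throws IOException {
        PigServer pig = new PigServer(ExecType.MAPREDUCE);

        // Load tab-separated crawl records from HDFS with an explicit schema.
        pig.registerQuery(
            "crawl = LOAD '/data/crawl/input' USING PigStorage('\\t') "
          + "AS (retailer:chararray, productId:chararray, price:double);");

        // Keep only records with a usable price.
        pig.registerQuery("priced = FILTER crawl BY price IS NOT NULL;");

        // Average price per product across retailers.
        pig.registerQuery("grouped = GROUP priced BY productId;");
        pig.registerQuery(
            "avgPrice = FOREACH grouped GENERATE group AS productId, "
          + "AVG(priced.price) AS avgPrice;");

        // Store the result back into HDFS for the downstream stages.
        pig.store("avgPrice", "/data/crawl/output/avg_price");
    }
}

Storing the result in HDFS leaves it in place for the Hive tables and Sqoop transfers described above.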
Project : Private Bank Repository DW
Environment : Hadoop, HDFS, MapReduce, Hive, HBase and MySQL
Duration : June 2012 to Dec 2013
Project Synopsis
A full-fledged dimensional data mart to cater to the CPB analytical reporting requirement, as the current GWM system is mainly focused on data enrichment, adjustment, defaulting and other data-oriented processes. Involved in the full development life cycle in a distributed environment for the Candidate Module. The Private Bank Repository system processes approximately 500,000 records every month.
Responsibilities
 Participated in client calls to gather and analyse the requirements.
 Set up a Hadoop cluster in pseudo-distributed mode on Linux.
 Worked with core Hadoop concepts: HDFS and MapReduce (JobTracker, TaskTracker).
 Implemented MapReduce phases in core Java, built JAR files and placed them in HDFS, and monitored jobs through the web UIs for the NameNode, JobTracker and TaskTracker.
 Extracted, transformed and loaded data from Hive into an RDBMS.
 Transformed data within the Hadoop cluster.
 Used Pentaho MapReduce to convert raw weblog data into parsed, delimited records.
 Built jobs to load data into Hive.
 Created tables in HBase and transformations to load data into HBase (see the HBase sketch after this list).
 Wrote input/output formats for CSV data.
 Imported and exported data using Sqoop job entries.
 Designed and developed jobs using Pentaho.
 Unit-tested Pentaho MapReduce transformations.
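A minimal sketch of the HBase steps above, using the 0.9x-era Java client; the table name, column family and row-key layout are illustrative assumptions, not the production schema:

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.HColumnDescriptor;
import org.apache.hadoop.hbase.HTableDescriptor;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.HBaseAdmin;
import org.apache.hadoop.hbase.client.HTable;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.util.Bytes;

// Sketch: create an HBase table and load one parsed record into it.
public class HBaseLoadSketch {
    public static void main(String[] args) throws Exception {
        Configuration conf = HBaseConfiguration.create();

        // Create the table with a single column family if it does not exist.
        HBaseAdmin admin = new HBaseAdmin(conf);
        TableName table = TableName.valueOf("pb_repository");
        if (!admin.tableExists(table)) {
            HTableDescriptor desc = new HTableDescriptor(table);
            desc.addFamily(new HColumnDescriptor("txn"));
            admin.createTable(desc);
        }
        admin.close();

        // Put one record, keyed here by account id + statement month.
        HTable htable = new HTable(conf, table);
        Put put = new Put(Bytes.toBytes("ACC1001-2013-06"));
        put.add(Bytes.toBytes("txn"), Bytes.toBytes("amount"), Bytes.toBytes("2500.00"));
        put.add(Bytes.toBytes("txn"), Bytes.toBytes("type"), Bytes.toBytes("adjustment"));
        htable.put(put);
        htable.close();
    }
}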
Project : Big Data initiative in one of the largest financial institutions in North America
Duration : March 2012 to June 2012 (project under training)
One of the largest financial institutions in North America had implemented a small-business banking e-statements project using existing software tools and applications. The overall process of generating e-statements and sending alerts to customers took 18 to 30 hours per cycle day, missing all SLAs and leading to customer dissatisfaction.
The purpose of the project was to cut the processing time for generating e-statements and alerts by at least 50%, and to cut costs by 50%.
Solution
 All sources of structured and unstructured data were ingested into the Hadoop platform:
o Unstructured data for small-business e-statements
o Structured data for financial transactions, cycle transactions, supplemental transactions, WCC customer data and GAI online banking data
 4 GB chunks of data are created at the source and sent directly to the card processing system for PDF generation.
 E-statement account numbers are generated using Pig scripts.
 All the data is combined using HiveQL to create the data needed by the customer notification engine (see the HiveQL sketch after this list).
 Tools used: Sqoop to import data from databases to Hadoop
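A minimal sketch of the HiveQL combine step, driven through the HiveServer2 JDBC driver; the table and column names are hypothetical stand-ins for the actual feeds:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Sketch: join transaction data with customer data in HiveQL to build
// the input for the notification engine. All names are illustrative.
public class NotificationDataBuilder {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        Connection conn = DriverManager.getConnection(
                "jdbc:hive2://localhost:10000/default", "hive", "");
        Statement stmt = conn.createStatement();

        // Combine financial transactions with customer data, keeping
        // accounts that have an e-statement in the current cycle.
        stmt.execute(
            "CREATE TABLE notification_input AS "
          + "SELECT c.customer_id, c.email, t.account_no, t.cycle_date "
          + "FROM financial_txn t "
          + "JOIN wcc_customer c ON t.account_no = c.account_no "
          + "WHERE t.estatement_flag = 'Y'");

        stmt.close();
        conn.close();
    }
}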
Environment : Hadoop, MapReduce, Hive.
Role : Hadoop Developer
Roles & Responsibilities:
 Created an hduser account for performing HDFS operations.
 Created a MapReduce user for performing MapReduce operations only.
 Wrote Apache Pig scripts to process the HDFS data.
 Set up passwordless SSH for Hadoop.
 Verified the Hadoop installation (TeraSort benchmark test).
 Set up Hive with MySQL as a remote metastore.
 Developed Sqoop scripts to move data between Hive and the MySQL database (see the Sqoop sketch after this list).
 Moved all log files generated by various network devices into HDFS.
 Created external Hive tables on top of the parsed data.
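A minimal sketch of such a Sqoop transfer invoked from Java through Sqoop 1's entry point; the connection string, table and export directory are illustrative assumptions:

import org.apache.hadoop.conf.Configuration;
import org.apache.sqoop.Sqoop;

// Sketch: export a Hive table's warehouse directory to MySQL using
// Sqoop 1's Java entry point. All names are illustrative.
public class HiveToMySqlExport {
    public static void main(String[] args) throws Exception {
        String[] sqoopArgs = {
            "export",
            "--connect", "jdbc:mysql://dbhost:3306/reports",
            "--username", "report_user",
            "--password", "secret",
            "--table", "device_log_summary",
            // Managed Hive tables live under the Hive warehouse directory.
            "--export-dir", "/user/hive/warehouse/device_log_summary",
            // Hive's default field delimiter (^A), written as an escape
            // sequence that Sqoop parses itself.
            "--input-fields-terminated-by", "\\001"
        };
        int ret = Sqoop.runTool(sqoopArgs, new Configuration());
        System.exit(ret);
    }
}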
(Srikanth)