Srikanth K
+91-7075436413
srikanthkamatam1@gmail.com
Skilled software engineer looking to enhance professional skills in a dynamic, fast-paced workplace while
contributing to challenging goals in a project-based environment. I am seeking an opportunity that challenges my
skill set so that I can contribute to the growth and development of the organization using high-end technologies.
• 3+ years of experience in Big Data analytics, the Hadoop paradigm and Core Java, along with designing,
developing and deploying large-scale distributed systems.
• Good experience with the Hadoop framework: HDFS, MapReduce, Pig, Hive and Sqoop.
• Implemented proofs of concept for various clients across various ISUs in Hadoop and its related
technologies.
• Extensive expertise and solid understanding of OOP concepts and the Java Collections Framework.
• Strong troubleshooting and problem-solving skills.
• Proficient in database programming with SQL.
• Well versed in MapReduce (MRv1).
• Excellent understanding of Hadoop architecture and its components, such as HDFS, JobTracker,
TaskTracker, NameNode, DataNode and the MapReduce programming paradigm.
• Extensive experience in analyzing data with big data tools such as Pig Latin and HiveQL.
• Extended Hive and Pig core functionality by writing custom UDFs.
• Hands-on experience installing, configuring and using ecosystem components such as Hadoop MapReduce,
HDFS, HBase, Oozie, Sqoop, Flume, Pig and Hive.
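The MapReduce paradigm referenced above can be illustrated with a local, Hadoop-Streaming-style pipeline (a sketch only; the actual jobs described here were written in Core Java):

```shell
# Word count, the canonical MapReduce example, simulated with shell pipes.
# Map: emit one record per word (one word per line).
# Shuffle: sort brings identical keys together.
# Reduce: uniq -c counts each group of identical keys.
printf 'to be or not to be\n' | tr ' ' '\n' | sort | uniq -c
```

The same structure appears in an MRv1 job: the mapper emits key-value pairs, the framework sorts and groups by key, and the reducer aggregates each group.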
EDUCATION QUALIFICATION
• MCA (Master of Computer Applications), Jawaharlal Nehru Technological University
PROFESSIONAL EXPERIENCE
Organization: ADP India Pvt. Ltd. through Alwasi Software Pvt. Ltd., Hyderabad
Duration: July 2012 – Present
Designation: Software Engineer
Project 1: Re-hosting of Web Intelligence
Client: LOWES
Duration: Dec 2013 – Present
Description:
The purpose of the project is to store terabytes of log information generated by the e-commerce website and extract
meaningful information from it. The solution is based on the open-source big data framework Hadoop: the data is stored
in the Hadoop file system and processed using MapReduce jobs. This in turn includes getting the raw HTML data from
the websites, processing the HTML to obtain product and pricing information, extracting various reports from the
product-pricing information and exporting the results for further processing.
The project re-platforms the existing system, which runs on WebHarvest (a third-party JAR) with a MySQL database,
to Hadoop, which can process large data sets (terabytes and petabytes of data) and so meet the client's requirements
amid increasing competition from its retailers.
Environment: Hadoop, Apache Pig, Hive, Sqoop, Java, Linux, MySQL
Roles & Responsibilities:
• Participated in client calls to gather and analyze requirements.
• Moved crawl-data flat files generated by various retailers into HDFS for further processing.
• Wrote Apache Pig scripts to process the HDFS data.
• Created Hive tables to store the processed results in tabular format.
• Developed Sqoop scripts to move data between Pig outputs and the MySQL database.
• For the dashboard solution, developed the Controller, Service and DAO layers of the Spring Framework application.
• Developed scripts for creating reports from Hive data.
• Fully involved in the requirement-analysis phase.
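The steps above can be sketched as a three-stage pipeline. All paths, table names and connection strings below are illustrative assumptions, not taken from the project; commands that need a running cluster are shown commented out.

```shell
# 1. Stage retailer crawl files into HDFS (requires a running cluster):
# hadoop fs -mkdir -p /data/crawl/raw
# hadoop fs -put /var/crawl/*.csv /data/crawl/raw/

# 2. A minimal Pig script extracting product/price records:
cat > process_crawl.pig <<'EOF'
raw   = LOAD '/data/crawl/raw' USING PigStorage(',')
        AS (product:chararray, price:double);
valid = FILTER raw BY price IS NOT NULL;
STORE valid INTO '/data/crawl/processed' USING PigStorage(',');
EOF
# pig -f process_crawl.pig

# 3. Export the processed results to MySQL with Sqoop:
# sqoop export --connect jdbc:mysql://dbhost/reports \
#   --table product_prices \
#   --export-dir /data/crawl/processed \
#   --input-fields-terminated-by ','
```

A Hive table defined over `/data/crawl/processed` would then serve the report-generation scripts mentioned above.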
Project 2: Repository DW
Client: Private Bank
Duration: July 2013 – Nov 2013
Description:
A full-fledged dimensional data mart to serve the CPB analytical-reporting requirement, as the current
GWM system is mainly focused on data enrichment, adjustment, defaulting and other data-oriented
processes. Involved in the full development life cycle in a distributed environment for the Candidate Module.
The Private Bank Repository system processes approximately 500,000 records every month.
Roles & Responsibilities:
• Participated in client calls to gather and analyze requirements.
• Set up a Hadoop cluster in pseudo-distributed mode using Linux commands.
• Worked with core Hadoop concepts: HDFS and MapReduce (JobTracker, TaskTracker).
• Implemented MapReduce phases in Core Java; created and put JAR files into HDFS and ran the web UIs for
the NameNode, JobTracker and TaskTracker.
• Extracted, transformed and loaded data from Hive into an RDBMS.
• Transformed data within the Hadoop cluster.
• Used Pentaho MapReduce to parse weblog data, converting raw weblogs into parsed, delimited records.
• Built jobs to load data into Hive.
• Created tables in HBase and built transformations to load data into HBase.
• Wrote input/output formats for CSV.
• Imported and exported data using Sqoop job entries.
• Designed and developed using Pentaho.
• Unit-tested Pentaho MapReduce transformations.
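The weblog-parsing step above, done at scale with Pentaho MapReduce, amounts to turning raw log lines into delimited records. A local sketch, assuming an Apache common-log layout (the sample lines and field positions are illustrative):

```shell
# Two sample lines in an assumed Apache common-log layout.
cat > access.log <<'EOF'
10.0.0.1 - - [12/Aug/2013:10:15:01 +0000] "GET /home HTTP/1.1" 200 512
10.0.0.2 - - [12/Aug/2013:10:15:03 +0000] "GET /cart HTTP/1.1" 404 87
EOF

# Convert raw lines into pipe-delimited (ip|path|status) records,
# the same shape the Pentaho transformation produced downstream.
awk '{ gsub(/"/, ""); print $1 "|" $7 "|" $9 }' access.log > parsed.txt
cat parsed.txt
```

This prints `10.0.0.1|/home|200` and `10.0.0.2|/cart|404`; at cluster scale the same per-line transformation runs inside the mapper.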
Project under training: Big data initiative at one of the largest financial institutions in North America
Client: Xavient Information Systems
Duration: July 2012 – June 2013
Description:
One of the largest financial institutions in North America had implemented a small-business-banking e-statements
project using existing software tools and applications. The overall process of generating e-statements and sending
alerts to customers took 18 to 30 hours per cycle day, missing all SLAs and leading to customer dissatisfaction.
The purpose of the project was to cut the processing time for generating e-statements and alerts by at least 50% and
also to cut costs by 50%.
Environment: Hadoop, MapReduce, Hive
Roles & Responsibilities:
• Created an hduser account for performing HDFS operations.
• Created a MapReduce user for performing MapReduce operations only.
• Wrote Apache Pig scripts to process the HDFS data.
• Set up passwordless SSH for Hadoop.
• Verified the Hadoop installation (TeraSort benchmark test).
• Set up Hive with MySQL as a remote metastore.
• Developed Sqoop scripts to move data between Hive and the MySQL database.
• Moved log files generated by various network devices into an HDFS location.
• Created external Hive tables on top of the parsed data.
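A hedged sketch of the setup steps above; usernames, paths and the table schema are illustrative assumptions, and commands needing root or a live cluster are commented out.

```shell
# Dedicated users: hduser for HDFS work, a separate user for MapReduce only
# (run as root):
# useradd hduser
# useradd mruser

# Passwordless SSH so Hadoop daemons can start without prompts:
# ssh-keygen -t rsa -N '' -f ~/.ssh/id_rsa
# cat ~/.ssh/id_rsa.pub >> ~/.ssh/authorized_keys

# Installation verification via the TeraSort benchmark:
# hadoop jar hadoop-*-examples.jar teragen 1000000 /bench/in
# hadoop jar hadoop-*-examples.jar terasort /bench/in /bench/out

# External Hive table over parsed network-device logs (assumed schema):
cat > device_logs.hql <<'EOF'
CREATE EXTERNAL TABLE IF NOT EXISTS device_logs (
  device   STRING,
  ts       STRING,
  severity STRING,
  message  STRING)
ROW FORMAT DELIMITED FIELDS TERMINATED BY '|'
LOCATION '/data/network/parsed';
EOF
# hive -f device_logs.hql
```

Because the table is EXTERNAL, dropping it removes only the metadata; the parsed log files in HDFS remain in place.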
(SRIKANTH K)