Srikanth K
+91-7075436413
srikanthkamatam1@gmail.com
Skilled software engineer seeking to enhance professional skills in a dynamic, fast-paced workplace while
contributing to challenging goals in a project-based environment. I am seeking an opportunity that challenges my
skill set so that I can contribute to the growth and development of the organization using high-end technologies.
• 3+ years of experience in Big Data analytics, the Hadoop paradigm and Core Java, along with designing,
developing and deploying large-scale distributed systems.
• Good experience with the Hadoop framework: HDFS, MapReduce, Pig, Hive and Sqoop.
• Implemented proofs of concept for various clients across several ISUs using Hadoop and its related
technologies.
• Extensive expertise and solid understanding of OOP concepts and the Java Collections Framework.
• Strong troubleshooting and problem-solving skills.
• Proficient in database programming with SQL.
• Well versed in MapReduce (MRv1).
• Excellent understanding of Hadoop architecture and components such as HDFS, JobTracker, TaskTracker,
NameNode, DataNode and the MapReduce programming paradigm.
• Extensive experience analyzing data with big data tools such as Pig Latin and HiveQL.
• Extended Hive and Pig core functionality by writing custom UDFs.
• Hands-on experience installing, configuring and using ecosystem components such as Hadoop MapReduce,
HDFS, HBase, Oozie, Sqoop, Flume, Pig and Hive.
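The custom-UDF work mentioned above can be sketched as follows. This is a minimal illustration, not code from any of the projects: a real Hive UDF of the MRv1 era would extend org.apache.hadoop.hive.ql.exec.UDF, ship in a JAR added to the Hive session, and be registered with CREATE TEMPORARY FUNCTION. The class name and normalization rule here are hypothetical; only the evaluate() naming convention is Hive's.

```java
// Sketch of a Hive-style UDF that normalizes product codes.
// In a real deployment this class would extend Hive's UDF base
// class; here it is a plain class so the logic stands alone.
public class NormalizeCodeUdf {
    // Hive invokes a method named evaluate() by convention.
    public String evaluate(String raw) {
        if (raw == null) {
            return null; // UDFs must tolerate NULL input rows
        }
        // Trim whitespace and upper-case, e.g. " ab-12 " -> "AB-12"
        return raw.trim().toUpperCase();
    }
}
```

Once packaged and registered, such a function would be callable from HiveQL like any built-in, e.g. `SELECT normalize_code(product_code) FROM products`.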
EDUCATIONAL QUALIFICATION
• MCA from Jawaharlal Nehru Technological University
PROFESSIONAL EXPERIENCE
Organization: ADP India Pvt. Ltd., through Alwasi Software Pvt. Ltd., Hyderabad
Duration: July 2012 – Present
Designation: Software Engineer
Project 1: Re-hosting of Web Intelligence
Client: LOWES
Duration: Dec 2013 – Present
Description:
The purpose of the project is to store terabytes of log information generated by the e-commerce website and extract
meaningful information from it. The solution is based on the open-source big data software Hadoop. The data is
stored in the Hadoop file system and processed using MapReduce jobs, which in turn includes getting the raw HTML
data from the websites, processing the HTML to obtain product and pricing information, extracting various reports
from the product-pricing information, and exporting the information for further processing.
This project re-platforms the existing system, which runs on WebHarvest (a third-party JAR) with a MySQL database,
onto Hadoop, a cloud technology able to process large data sets (i.e. terabytes and petabytes of data), in order to
meet the client's requirements amid increasing competition from other retailers.
Environment: Hadoop, Apache Pig, Hive, Sqoop, Java, Linux, MySQL
Roles & Responsibilities:
• Participated in client calls to gather and analyze requirements.
• Moved crawl-data flat files generated by various retailers to HDFS for further processing.
• Wrote Apache Pig scripts to process the HDFS data.
• Created Hive tables to store the processed results in tabular format.
• Developed Sqoop scripts to move data between the Pig output and the MySQL database.
• For the dashboard solution, developed the Controller, Service and DAO layers of the Spring
Framework application.
• Developed scripts for creating reports from Hive data.
• Fully involved in the requirement-analysis phase.
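The product- and pricing-extraction step described above can be illustrated with a small plain-Java sketch. The tab-separated record layout (retailer, productId, price) is an assumption for illustration; the actual crawl-file format from the project is not specified here, and at scale this parsing was done in Pig, not standalone Java.

```java
// Minimal sketch: parse one delimited crawl record into its
// retailer, product and price fields, as the Pig pipeline would
// do per row at scale. The three-field tab-separated layout is
// assumed for illustration only.
public class CrawlRecordParser {
    public static String[] parse(String line) {
        String[] fields = line.split("\t");
        if (fields.length != 3) {
            // Malformed rows are rejected rather than silently kept
            throw new IllegalArgumentException("bad record: " + line);
        }
        return fields; // {retailer, productId, price}
    }
}
```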
Project 2: Repository DW
Client: Private Bank
Duration: July 2013 – Nov 2013
Description:
A full-fledged dimensional data mart to serve the CPB analytical reporting requirement, as the current
GWM system is focused mainly on data enrichment, adjustment, defaulting and other data-oriented
processing. Involved in the full development life cycle in a distributed environment for the Candidate module.
The Private Bank repository system processes approximately 500,000 records every month.
Roles & Responsibilities:
• Participated in client calls to gather and analyze requirements.
• Set up a Hadoop cluster in pseudo-distributed mode using Linux commands.
• Worked with core Hadoop concepts: HDFS and MapReduce (JobTracker, TaskTracker).
• Worked on MapReduce phases in Core Java; created and put JAR files into HDFS and used the web UIs for the
NameNode, JobTracker and TaskTracker.
• Extracted, transformed and loaded data from Hive into an RDBMS.
• Transformed data within the Hadoop cluster.
• Used Pentaho MapReduce to parse weblog data, converting raw weblogs into parsed, delimited records.
• Built jobs to load data into Hive.
• Created tables in HBase and created transformations to load data into HBase.
• Wrote input/output formats for CSV.
• Imported and exported data using Sqoop job entries.
• Designed and developed with Pentaho.
• Unit-tested Pentaho MapReduce transformations.
Project under training: Big data initiative at one of the largest financial institutions in North America
Client: Xavient Information Systems
Duration: July 2012 – June 2013
Description:
One of the largest financial institutions in North America had implemented a small-business banking e-statements
project using its existing software tools and applications. The overall process of generating e-statements and
sending alerts to customers took 18 to 30 hours per cycle day, missing SLAs and leading to customer dissatisfaction.
The purpose of the project was to cut the processing time for generating e-statements and alerts by at least 50% and
also to cut costs by 50%.
Environment: Hadoop, MapReduce, Hive
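The MapReduce processing named in the environment above follows the standard map/shuffle/reduce contract: map each record to a (key, value) pair, then reduce by aggregating per key. The in-memory sketch below only mirrors that contract; it is not the actual job code, which would implement Hadoop's Mapper and Reducer interfaces and run across the cluster.

```java
import java.util.HashMap;
import java.util.Map;

// In-memory sketch of the MapReduce contract: the "map" step
// emits (record, 1) and the "reduce" step sums counts per key.
// A real Hadoop job distributes these phases across the cluster;
// this class only demonstrates the per-key aggregation logic.
public class MiniMapReduce {
    public static Map<String, Integer> countByKey(String[] records) {
        Map<String, Integer> counts = new HashMap<>();
        for (String rec : records) {
            // merge() plays the reducer's role: sum values per key
            counts.merge(rec, 1, Integer::sum);
        }
        return counts;
    }
}
```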
Roles & Responsibilities:
• Created an hduser account for performing HDFS operations.
• Created a MapReduce user for performing MapReduce operations only.
• Wrote Apache Pig scripts to process the HDFS data.
• Set up passwordless SSH for Hadoop.
• Verified the Hadoop installation (TeraSort benchmark test).
• Set up Hive with MySQL as a remote metastore.
• Developed Sqoop scripts to move data between Hive and the MySQL database.
• Moved log files generated by various network devices into an HDFS location.
• Created external Hive tables on top of the parsed data.
(SRIKANTH K)
