KARTHICK
Email: skarthicece@gmail.com
Mobile: +919095318544
Objective
To build a career with a leading hi-tech corporate environment of committed and
dedicated people, which will help me explore my abilities fully and realize my potential.
Professional Experience
 3.6 years of experience in software development, involved in developing
applications on Hadoop and Java on Windows and Linux.
 Hands-on experience in installing, configuring, and using Apache Hadoop ecosystem
components such as HDFS, Hadoop MapReduce, Apache Pig, Apache Hive, Apache
Sqoop, Apache Oozie, Apache Mahout, and Neo4j.
 Expertise in the Neo4j Cypher Query Language (CQL) for importing large data sets and
creating nodes and relationships.
 Strong experience in profile recommendation using a collaborative filtering
algorithm and building bi-directional relationships in Neo4j.
 Analysed and explored the log-likelihood machine learning algorithm and generated
profile recommendations using Apache Mahout.
 Involved in running Hadoop jobs for processing millions of records of text data.
 Analysed large data sets by running Pig scripts.
 Designed and implemented Pig UDFs for evaluating, filtering, loading, and storing
data.
 Developed multiple MapReduce jobs in Java for data cleaning and pre-processing.
 Involved in unit testing of MapReduce jobs using MRUnit.
 Involved in creating Hive tables, and loading and analysing data using Hive queries.
 Created Hive tables as internal or external tables per requirement, defined
with appropriate static and dynamic partitions for efficiency (see the HiveQL sketch after this list).
 Experience in importing and exporting data between HDFS and relational
database systems (RDBMS) using Sqoop.
 Basic knowledge of Spark.
 Worked on a large cluster of 200 nodes.
 Completed IBM certifications on Big Data.
 Completed a six-month course on Big Data and Hadoop.
 Developed applications in Java and Hadoop on Windows and Linux (Ubuntu).
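As a minimal illustration of the partitioned Hive tables mentioned above, the sketch below defines an external table and performs a dynamic-partition load. The table name, columns, staging table, and paths are hypothetical, not taken from the projects themselves.

    -- Hypothetical external table with a dynamic date partition
    CREATE EXTERNAL TABLE IF NOT EXISTS viewed_history (
      viewer_id STRING,
      viewed_id STRING,
      view_time TIMESTAMP
    )
    PARTITIONED BY (view_date STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
    LOCATION '/data/viewed_history';

    -- Dynamic-partition load from an assumed staging table
    SET hive.exec.dynamic.partition=true;
    SET hive.exec.dynamic.partition.mode=nonstrict;
    INSERT OVERWRITE TABLE viewed_history PARTITION (view_date)
    SELECT viewer_id, viewed_id, view_time, to_date(view_time) AS view_date
    FROM viewed_history_staging;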
Educational Profile
B.E (Electronics & Communication Engineering) 7.22 CGPA 2012
HSC 84% 2008
SSLC 84% 2006
Working Experience
Working as a Software Engineer (Research and Development) at Matrimony.com Ltd
from July 2014 to date.
Worked as a Hadoop Developer at SI System from June 2012 to July 2014.
Project Details
Project Title: Viewed Profile also Viewed
Description
This feature appears in every division of the Matrimony.com site. It shows users
recommended profiles based on who has viewed their own profile.
Role and Responsibility
Developer:
 Installed and configured the Neo4j graph database.
 Imported all the viewed-history files into the Neo4j graph database using the Cypher Query
Language (CQL).
 Learned the collaborative filtering algorithm and adapted it to the Neo4j data model.
 Explored bi-directional relationships between the viewer and the viewed list of IDs using CQL.
 Retrieved recommendation IDs from all users who viewed a particular ID using CQL
(see the sketch below).
Tools: Neo4j-2.2.1
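A minimal Cypher sketch of this flow, assuming a hypothetical viewed_history.csv with viewer_id and viewed_id columns; the label, file path, property names, and example ID are illustrative.

    // Import view events and create VIEWED relationships
    USING PERIODIC COMMIT 1000
    LOAD CSV WITH HEADERS FROM "file:///viewed_history.csv" AS row
    MERGE (a:Profile {id: row.viewer_id})
    MERGE (b:Profile {id: row.viewed_id})
    MERGE (a)-[:VIEWED]->(b);

    // "Viewed profile also viewed": recommend what viewers of a given
    // profile also viewed, ranked by co-view count
    MATCH (p:Profile {id: "12345"})<-[:VIEWED]-(v:Profile)-[:VIEWED]->(rec:Profile)
    WHERE rec <> p
    RETURN rec.id AS recommendation, count(v) AS score
    ORDER BY score DESC
    LIMIT 10;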
Project Title: Search Results
Description
Each division of the Matrimony.com websites offers options for categorizing search
results. Users can decline or ignore profiles as they wish; once they have done so,
searches run faster and they receive new recommended profiles.
Role and Responsibility
Developer:
 Installed and configured Apache Hadoop and Apache Pig.
 Imported all four category files into HDFS.
 Loaded the files into Apache Pig.
 Joined the category files according to the filtering requirements in Apache Pig.
 Filtered all declined and ignored IDs out of the source and acceptance files
(see the Pig sketch below).
Tools: Apache Pig-0.13.0, Apache Hadoop-2.4.0
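A minimal Pig Latin sketch of the filtering step, assuming hypothetical paths and field names; the anti-join via LEFT OUTER JOIN plus a null check is one standard way to remove excluded IDs.

    -- Load the source profiles and the exclusion lists (schemas illustrative)
    source   = LOAD '/data/source_profiles' USING PigStorage(',') AS (id:chararray, profile:chararray);
    declined = LOAD '/data/declined_ids'    USING PigStorage(',') AS (id:chararray);
    ignored  = LOAD '/data/ignored_ids'     USING PigStorage(',') AS (id:chararray);

    excluded = UNION declined, ignored;

    -- Anti-join: keep only source rows with no match in the exclusion list
    joined   = JOIN source BY id LEFT OUTER, excluded BY id;
    filtered = FILTER joined BY excluded::id IS NULL;
    result   = FOREACH filtered GENERATE source::id, source::profile;

    STORE result INTO '/data/filtered_profiles' USING PigStorage(',');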
Project Title: Profile Recommendation
Description
This feature recommends profiles to users while they are viewing any profile. It
gives recommendations by collecting user interest from the viewed history and
returning results that match their profile.
Role and Responsibility
Developer:
 Installed and configured Apache Mahout with Apache Hadoop.
 Developed a MapReduce program to build, from the viewed history, an input file
acceptable to Apache Mahout.
 Applied the log-likelihood machine learning algorithm in Mahout to generate
recommendations for each ID present in the viewed-history file (see the sketch below).
Tools: Apache Mahout-0.9, Apache Hadoop-1.2.1, MapReduce
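A minimal in-memory sketch of a log-likelihood recommender using Mahout 0.9's Taste API; the file name and user ID are hypothetical. The project itself ran the distributed MapReduce path, for which Mahout ships an equivalent RecommenderJob driver.

    // Hypothetical sketch: item-based recommendations with log-likelihood
    // similarity over viewer/viewed pairs.
    import java.io.File;
    import java.util.List;
    import org.apache.mahout.cf.taste.impl.model.file.FileDataModel;
    import org.apache.mahout.cf.taste.impl.recommender.GenericItemBasedRecommender;
    import org.apache.mahout.cf.taste.impl.similarity.LogLikelihoodSimilarity;
    import org.apache.mahout.cf.taste.model.DataModel;
    import org.apache.mahout.cf.taste.recommender.RecommendedItem;
    import org.apache.mahout.cf.taste.similarity.ItemSimilarity;

    public class ProfileRecommenderSketch {
        public static void main(String[] args) throws Exception {
            // viewed_history.csv rows: viewerId,viewedId (boolean preferences)
            DataModel model = new FileDataModel(new File("viewed_history.csv"));
            ItemSimilarity similarity = new LogLikelihoodSimilarity(model);
            GenericItemBasedRecommender recommender =
                new GenericItemBasedRecommender(model, similarity);
            // Top 10 recommendations for a hypothetical user ID
            List<RecommendedItem> recs = recommender.recommend(12345L, 10);
            for (RecommendedItem rec : recs) {
                System.out.println(rec.getItemID() + " : " + rec.getValue());
            }
        }
    }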
Project: Customer Insights Platform
Client: Kohl’s Corporation
Duration: May 2013 to June 2014
Description
Kohl’s and its associated retail companies have a very vast customer base across both
brick-and-mortar and online stores. Generating around 10,000 transactions per hour
across all entities, which amounts to 2.5 petabytes of data, Kohl’s is focused on
creating a central hub for customer-specific information using Big Data technologies.
This project aims to provide an analytics dashboard for business users and data
scientists for data analysis and generating new insights.
Role and Responsibility
Developer:
 Working in an agile methodology, understood the requirements of the user stories and
prepared high-level design documents for approval.
 On approval, derived low-level design documents for development.
 Developed draft versions of the scripts in Java MapReduce and Pig (data transformation)
and in HiveQL (where ad-hoc querying was involved).
 Fine-tuned the process based on the MapReduce jobs processed.
 Extracted data from various database sources such as Oracle, Teradata, DB2,
and Informix into HDFS using Apache Sqoop (see the sketch below).
 Set up a schedule for each job in Apache Oozie, as required, and loaded data from
multiple database sources into Hive tables.
 Used Apache Hive extensively to build tables specific to user
requirements.
Tools:
Hadoop (CDH4), HDFS, Sqoop, Hive, Map Reduce, Pig, Oozie
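A hedged sketch of the Sqoop extraction step, with a hypothetical Oracle connection string, credentials, table, and target path; only standard Sqoop import flags are used.

    # Hypothetical Oracle-to-HDFS import
    sqoop import \
      --connect jdbc:oracle:thin:@//dbhost:1521/ORCL \
      --username etl_user \
      --password-file /user/etl/.oracle_password \
      --table CUSTOMERS \
      --target-dir /data/raw/customers \
      --num-mappers 4 \
      --fields-terminated-by ','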
Hadoop POC
Project: Web Log Analyzer
Description
When menus of the given organization's application are clicked, the date, time, and IP
address from which the application's URL was accessed are logged to a log file. The
log file is moved to HDFS, where it is processed using MapReduce and Pig to find the
maximum number of customers who visited a particular URL. The result is stored in
HBase and the report is displayed to the user.
Role and Responsibility
Developer:
 Involved in requirements gathering and analysis.
 Understood the requirements and designed the architecture accordingly.
 Stored the data in HDFS.
 Processed the data using MapReduce and Pig (see the sketch below).
 Stored the processed results in HBase.
 Retrieved reports from HBase and displayed them to the user.
Tools:
Hadoop (CDH4), HDFS, Sqoop, Hbase, Map Reduce, Pig, Oozie
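A minimal Java MapReduce sketch of the counting step described above: it tallies visits per URL from log lines, after which the maximum can be read from the sorted counts. The log format (whitespace-separated date, time, IP, URL) is an assumption for illustration.

    // Hypothetical sketch: count visits per URL from web-server logs.
    import java.io.IOException;
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.Path;
    import org.apache.hadoop.io.IntWritable;
    import org.apache.hadoop.io.LongWritable;
    import org.apache.hadoop.io.Text;
    import org.apache.hadoop.mapreduce.Job;
    import org.apache.hadoop.mapreduce.Mapper;
    import org.apache.hadoop.mapreduce.Reducer;
    import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
    import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

    public class UrlHitCount {

        public static class UrlMapper
                extends Mapper<LongWritable, Text, Text, IntWritable> {
            private static final IntWritable ONE = new IntWritable(1);
            private final Text url = new Text();

            @Override
            protected void map(LongWritable key, Text value, Context context)
                    throws IOException, InterruptedException {
                // Assumed log layout: date time ip url
                String[] fields = value.toString().split("\\s+");
                if (fields.length >= 4) {
                    url.set(fields[3]);
                    context.write(url, ONE);
                }
            }
        }

        public static class SumReducer
                extends Reducer<Text, IntWritable, Text, IntWritable> {
            @Override
            protected void reduce(Text key, Iterable<IntWritable> values,
                    Context context) throws IOException, InterruptedException {
                int sum = 0;
                for (IntWritable v : values) {
                    sum += v.get();
                }
                context.write(key, new IntWritable(sum));
            }
        }

        public static void main(String[] args) throws Exception {
            Job job = Job.getInstance(new Configuration(), "url-hit-count");
            job.setJarByClass(UrlHitCount.class);
            job.setMapperClass(UrlMapper.class);
            job.setCombinerClass(SumReducer.class);
            job.setReducerClass(SumReducer.class);
            job.setOutputKeyClass(Text.class);
            job.setOutputValueClass(IntWritable.class);
            FileInputFormat.addInputPath(job, new Path(args[0]));
            FileOutputFormat.setOutputPath(job, new Path(args[1]));
            System.exit(job.waitForCompletion(true) ? 0 : 1);
        }
    }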
Software Proficiency
 Frameworks : Apache Hadoop, MapReduce, Spark
 Hadoop Ecosystem : Mahout, Pig, Hive, HBase, Sqoop, Oozie
 Programming Languages : Core Java
 Databases : Neo4j, MySQL
 Operating Systems : Linux (Ubuntu), Windows
 Packages : WinSCP, PuTTY, MS Office
IBM Certifications on Big Data
 Hadoop Fundamentals
 Introduction to MapReduce Programming
 Introduction to Pig
 Accessing Hadoop Data Using Hive
 Controlling Hadoop Jobs Using Oozie
Achievements
 Completed a course in Big Data and Hadoop.
 Completed a Diploma in Computer Application.
 Completed a Junior Diploma in Computer Science.
Key Soft Skills
 Flexible and good at teamwork.
 Listen to feedback and act on it.
 Hardworking, trustworthy, and patient.
Personal Profile
Father’s Name: Selvaraj K.
Sex: Male
Marital Status: Single
Date of Birth: 22.11.1990
Languages Known: English, Tamil
Declaration
I declare that the details given above are true to the best of my knowledge and
belief.
Karthick. S
