ATHOKPAM NABAKUMAR SINGH
E-mail: nabakumar47@gmail.com
Phone: (+91) 7795408480
Career Objective:
To use my skills in the best way possible to achieve the company's goals; to be an active member of a progressive software organization that gives me scope to enhance my skills and strengths in line with the latest technologies, and to be part of a team that works dynamically towards the growth of the organization.
Professional Summary and Technical skills:
• Software professional with 4 years and 2 months of experience in Hadoop administration and support (installation/support/maintenance) and big data analytics.
• Deep knowledge of Hive (installation, tuning, and running complex Hive queries).
• Deep knowledge of HBase (installation, tuning, and running complex HBase queries).
• Configured the Fair Scheduler to grant multiple users access to submit jobs to multiple pools.
• Deep knowledge of installing OpsCenter/Solr and configuring Solr nodes.
• Good knowledge of Flume.
• Created keyspaces and column families in Cassandra from both OpsCenter and the Cassandra CLI.
• Configured Hadoop clusters.
• Proficient in installing Pentaho and configuring it to work with the features of Hadoop as needed.
• Hands-on experience with Java, SQL, and Unix.
• Worked with Oracle 9i/10g databases.
• Proficient in understanding requirements.
• Worked with tools such as Toad, PuTTY, WinSCP, SQL Developer, Opsware (HP Server Automation Client v9.13), WebLogic, and NetApp ServiceNow.
• Exposure to shell scripting, Perl scripting, and process automation.
• Good familiarity with ETL processes; used Pentaho for data integration (ETL and job scheduling).
• Worked on Incident and Problem Management (ITIL).
• Working knowledge of the Eclipse and NetBeans IDEs.
• Good analytical skills, efficient in debugging.
• Excellent verbal and written communication.
• Self-motivated, energetic, and a quick learner.
Professional Experience:
Currently working at Trianz Holdings Pvt. Ltd., Bangalore, India as a Senior Software Engineer, with 4 years of experience.
Trianz Holdings Pvt. Ltd. – Jan 2012 – till date
Project Details:
Project Name – DW4 (Asup Next)
Client – NetApp
Duration – Oct 2012 – till date
Technology used – Hadoop, Java, Flume, Hive, HDFS, Pentaho, SQL, ETL, Unix
Description:
NetApp's My AutoSupport is a suite of proactive support tools, based on AutoSupport information from customer systems, that helps improve storage and operational efficiency. The ASUP Data Platform Analytics team receives a huge volume of AutoSupport files, processes the information on the Hadoop cluster, and provides the insights gained to different users in the company. An ETL layer then processes the data and stores it in DSS (an Oracle database), from which customers and users fetch data. These applications are scheduled through the Pentaho scheduler.
Role & responsibility:
 Installed the Hadoop cluster, configuring and tuning datanodes/namenodes as required.
 Installed Pentaho and configured it to work with the features of Hadoop as needed.
 Installed OpsCenter and Solr and configured Solr nodes as required.
 Installed Hive and ran complex Hive queries to generate reports.
 Monitored pre-scheduled jobs through the JobTracker, Pentaho logs, application logs, and system logs.
 Raised tickets in case of critical failures.
 Wrote SQL queries to fetch data from the database.
 Verified and analysed data in HDFS using HDFS shell commands.
 Verified and analysed data stored in HBase.
 Verified and analysed data through Hive and Solr using Hive and Solr queries.
 Provided daily status reports.
 Wrote shell scripts and SQL queries to avoid the manual effort of monitoring jobs.
 Prepared and provided business solutions to clients per business requirements.
 Provided resolutions based on errors through code analysis.
 Investigated jobs at the code level (Unix, shell script, Oracle 9i (SQL, PL/SQL), Java, HQL) to find and fix bugs.
 Wrote shell scripts and SQL queries to generate reports based on business requirements.
 Prepared jobs and transformations and scheduled them through the Pentaho scheduler.
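As a minimal sketch of the automated log monitoring described above (the log format, job names, and messages are hypothetical stand-ins, not taken from the actual project), a shell check might look like:

```shell
#!/bin/sh
# Hedged sketch: scan a job log for ERROR lines instead of reading it by hand.
# The log content below is a made-up stand-in for a Pentaho/application log.
set -u

LOG_FILE="$(mktemp)"
cat > "$LOG_FILE" <<'EOF'
2016-01-10 02:00:01 INFO  job dw4_load started
2016-01-10 02:14:32 ERROR step load_dss failed: ORA-00001
2016-01-10 02:14:33 INFO  job dw4_load finished with errors
EOF

# Count ERROR lines; in a real setup a nonzero count would raise a ticket/alert.
error_count=$(grep -c 'ERROR' "$LOG_FILE")

if [ "$error_count" -gt 0 ]; then
  echo "ALERT: $error_count error(s) found in job log"
else
  echo "OK: no errors in job log"
fi

rm -f "$LOG_FILE"
```

A check like this, run from cron against the real log paths, is what removes the manual monitoring effort the bullet points refer to.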
Project Name – ASUP DW3
Client – NetApp
Duration – Dec 2011 – Oct 2012
Technology used – Unix, SQL, PL/SQL, Shell Scripting, Perl Scripting, Opsware, ETL
Description:
NetApp's My AutoSupport is a suite of proactive support tools, based on AutoSupport information from customer systems, that helps improve storage and operational efficiency. The ASUP Data Platform Analytics team receives a large volume of AutoSupport files, processes the information through various application layers, and provides the insights gained to different users in the company. An ETL layer built with Perl scripting and SQL runs on this data from the ODS, processes it, and loads it into DSS (an Oracle database) used by customers and users. Some of these applications are scheduled through crontab on Unix and some run as daemons.
Role & responsibility:
 Scheduled jobs in crontab and monitored them.
 Raised tickets in case of critical failures.
 Provided support on a 24x7 basis depending on the priority of the ticket.
 Wrote SQL queries to fetch data from the database.
 Provided daily status reports.
 Wrote shell scripts and SQL queries to avoid the manual effort of monitoring jobs.
 Prepared and provided business solutions to clients per business requirements.
 Provided resolutions based on errors through code analysis.
 Investigated jobs at the code level (Unix, shell script, Oracle 9i (SQL, PL/SQL)) to find and fix bugs.
 Wrote shell scripts and SQL queries to generate reports based on business requirements.
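The crontab scheduling and monitoring described above could be sketched as follows. All paths, the schedule, and the lock location are hypothetical, not taken from the actual deployment; the lock pattern simply keeps overlapping cron runs of a long ETL batch from double-processing data.

```shell
#!/bin/sh
# Hypothetical cron wrapper: a directory lock (mkdir is atomic) makes
# overlapping runs of a scheduled ETL job skip instead of colliding.
set -u

run_with_lock() {
  lock_dir="$1"; shift
  if mkdir "$lock_dir" 2>/dev/null; then
    "$@"                      # run the real job (here a stand-in command)
    status=$?
    rmdir "$lock_dir"
    return "$status"
  else
    echo "previous run still active; skipping" >&2
    return 0
  fi
}

# A crontab entry would invoke a wrapper like this, e.g. (illustrative only):
#   0 2 * * * /opt/etl/run_nightly.sh >> /var/log/etl/nightly.log 2>&1
run_with_lock "/tmp/etl_demo_$$.lock" echo "ETL batch complete"
```

The `mkdir`-based lock is chosen because it is atomic on POSIX systems, so two cron invocations started at the same moment cannot both acquire it.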
Education Details:
Degree/Year                               Institute/College/Board                                       %/CGPA
B.E. (Computer Science and Engg), 2010    PGP College of Engineering and Technology (Anna University)   76
10+2, 2006                                Manipur Public School (CBSE)                                  71
10th, 2004                                CC Higher Secondary (BOSEM)                                   70.2
Achievement:
 Achieved an 'A' certificate in NCC, Manipur.
 Participated in various Maths Talent Search events and won prizes.
 Underwent in-plant training at BSNL, Namakkal.
 Completed a diploma course in MySQL, Java, and Advanced Java at NIIT.
Relevant Courses Completed:
Programming in Java, covering the following major subjects:
 Core Java concepts
 SQL, PL/SQL
Other Interests:
Solving puzzles, surfing the Internet, listening to music, gaming
Personal Details:
Name Athokpam Nabakumar Singh
Age 25
Nationality Indian
Gender Male
Marital Status Single
Current Address No 89, M. Narayana Reddy Building, 12th Cross, Sarakki Village,
JP Nagar 1st Phase, Bangalore - 560078
Permanent Address Lalambung Makhong, Takhellambam Leikai,
Imphal West - 795001, Manipur
Declaration:
I hereby declare that the above information and details provided by me are correct to the best of my knowledge.
Place: Bangalore, Karnataka
ATHOKPAM NABAKUMAR SINGH