Nikunj Ramani
46/1/2/2, Vidyanagar Gali No.1, Krishna Chowk
New Sangavi, Pune - 411027.
E-mail: nikunjramani9691@gmail.com
Mobile: +91 9049868875
 OBJECTIVE :-
Seeking a challenging position in the IT field where I can develop my skills and contribute to the growth of the organization.
 TECHNICAL SKILLS :-
Big Data Solution : Pig, Hive, Sqoop, Flume, Oozie and HBase installation; Cloudera 5.1 monitoring
Platform : Red Hat, CentOS, Ubuntu & Windows (Windows 7/8/10 & Windows Server 2008/2012 R2)
Cloud-based Big Data Solution : Amazon EMR, Amazon EC2 and Amazon S3
Virtualization Tools : VMware, VirtualBox and OpenVZ
Monitoring Tools : Nagios, Cacti, MailScanner, Logwatch and Webmin
Linux Servers : FTP, Apache, Sendmail, NFS, LDAP, Samba, DHCP and DNS
Languages : Bash, SQL and beginner-level Python
Database : MySQL and Oracle
Storage : NetApp
Networking : CCNA, N+ and A+
Certification : Red Hat Certified System Administrator (RHCSA) on Red Hat Enterprise Linux 6 [130-149-370]; Red Hat Certified Engineer (RHCE) on Red Hat Enterprise Linux 6 [130-149-370]
 WORK EXPERIENCE :-
 Total Experience : 3.4 years (Big Data Solutions, UNIX Administration)
 Big Data Solution Experience : 1 year
 Organization : 3DPLM Software Solutions Ltd., Pune, from July 2014 to date
 Designation : Software Engineer cum Admin
Project : 3DPLM
Client : Dassault Systèmes
Environment : Diffusion Online Support
Team Strength : 5
Role : Hadoop Admin
Technology : Hadoop (Pig, Hive, Sqoop, Flume, HBase and Oozie) and Cloudera 5.1 monitoring
Cloud-based Big Data Solution : Amazon EMR, Amazon EC2 and Amazon S3
Duration : Jul 2015 to Aug 2016
OS : Linux (Red Hat, CentOS, Ubuntu)
Database : MySQL, phpMyAdmin and Oracle
Virtualization Tools : VMware and VirtualBox
Languages : Bash
 Applications Summary :-
This program started in July 2015 to improve products by collecting application information from the various product lines and to set up an organization-level Hadoop capability. It involved getting trained in Hadoop, setting up the infrastructure for this capability, and working on use cases and projects with the group.
 Responsibilities :-
Hadoop Administration :-
 Set up the infrastructure for Dassault Systèmes by installing the Apache Hadoop stack and Cloudera 5.1 for monitoring, acting as the standard administrator.
 Commissioned and decommissioned nodes on the cluster (a command-level sketch follows this list).
 Created users and groups and assigned appropriate permissions.
 Managed services using the service manager and service control tools.
 Scheduled workflows with Oozie.
 Maintained the LDAP configuration on the nodes.
 Installed, uninstalled and updated services on the nodes.
 Set up networking, firewall rules and dependent packages for Hadoop.
 Monitored cluster health on a daily basis and fixed any issues.
 Installed the Apache Hadoop stack and created the required databases, users, Hadoop packages, UNIX directory structure and configuration files for the cluster.
 Validated the Hadoop infrastructure, monitored data growth and informed the concerned teams.
 Advised on and implemented optimal solutions.
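For illustration, a minimal Bash sketch of the routine tasks above (decommissioning a DataNode, a daily health check and submitting an Oozie workflow). The hostnames, exclude-file path and workflow location are assumed values rather than the actual 3DPLM environment; on a Cloudera-managed cluster these steps would usually be driven through Cloudera Manager instead.

```bash
#!/usr/bin/env bash
# Sketch of routine Hadoop admin tasks (illustrative hostnames and paths).

# 1. Decommission a DataNode: add it to the excludes file referenced by
#    dfs.hosts.exclude in hdfs-site.xml, then ask the NameNode to re-read it.
echo "worker03.example.com" >> /etc/hadoop/conf/dfs.exclude
hdfs dfsadmin -refreshNodes

# 2. Daily health check: live/dead node summary, capacity and block integrity.
hdfs dfsadmin -report | head -n 20
hdfs fsck / | tail -n 5

# 3. Submit a scheduled workflow to Oozie from its job.properties file.
oozie job -oozie http://master01.example.com:11000/oozie \
          -config /home/hadoop/workflows/daily-ingest/job.properties -run
```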
Project : 3DPLM
Client : Dassault Systèmes
Environment : Diffusion Production
Team Strength : 10
Role : Diffusion Admin
Duration : Jul 2015 to Aug 2016
OS : Red Hat, CentOS, Ubuntu & Windows (Windows 7/8/10 & Windows Server 2008/2012 R2)
Database : MySQL, phpMyAdmin and Oracle
Servers : FTP, Apache, Sendmail and NFS
Virtualization Tools : VMware and VirtualBox
Languages : Bash and SQL
Storage : NetApp
 Applications Summary :-
This program started in July 2015 to improve products by collecting application information from the various product lines and to set up an organization-level capability.
 Responsibilities:
 Worked on Dassault Systèmes proprietary tools for monitoring the Diffusion application and the resources of all Prod/Non-Prod servers.
 Worked on Dassault Systèmes IP-protection tools to take snapshots and grant user/group permissions on NetApp storage for different Diffusion tasks.
 Worked on shell scripting (Bash and Korn shell) and SQL (a small example follows this list).
 Worked on VMware vSphere/ESXi and VirtualBox for virtualization.
 Performed user and group management and other administration tasks.
 Worked with MySQL, phpMyAdmin and Oracle databases.
 Directly handled the production server for backups, inventory and other activities.
 Checked logs and performed monitoring and maintenance for all Production/Non-Production servers.
 Worked on the Dassault Systèmes ticketing tool to create and resolve tickets for Diffusion application issues raised on the Production/Non-Production environments by developers and clients.
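As an illustration of the shell-scripting and log-monitoring duties above, here is a small Bash sketch that collects disk usage warnings and recent log errors from a list of servers. The host names, log path and mail recipient are hypothetical placeholders and do not represent the proprietary Dassault Systèmes tooling.

```bash
#!/usr/bin/env bash
# Daily resource/log check across Prod and Non-Prod hosts (placeholder names).

HOSTS="prod-app01 prod-db01 nonprod-app01"
REPORT=/tmp/daily_check_$(date +%F).log

for host in $HOSTS; do
    {
        echo "===== $host ====="
        # Flag any filesystem above 85% usage.
        ssh "$host" "df -hP | awk 'NR>1 && \$5+0 > 85 {print \"DISK WARN:\", \$0}'"
        # Show the last ten error/failure lines from the system log.
        ssh "$host" "grep -iE 'error|fail' /var/log/messages | tail -n 10"
    } >> "$REPORT"
done

# Mail the consolidated report to the admin mailbox (placeholder address).
mail -s "Daily server check $(date +%F)" admin@example.com < "$REPORT"
```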
 Organization : Hostin Services Pvt. Ltd., Pune, from April 2013 to July 2014
Designation : Jr. System Admin
Project : Hostin Services
Client : Hungama Digital and Digital Hathi
Team Strength : 12
Duration : Apr 2013 to Jul 2014
Role : Jr. System Admin
OS : Linux, Ubuntu & Windows (Windows 7 & Windows Server 2008 R2)
Monitoring Tools : Nagios, Cacti, MailScanner and Webmin
Database : MySQL, phpMyAdmin and Oracle
Servers : FTP, Apache, LDAP, Sendmail, NFS, Samba, DHCP and DNS
Virtualization Tools : VMware, VirtualBox and OpenVZ
Languages : SQL
Backup : vzdump
Hosting Technologies : Parallels Plesk 11 and cPanel
Mail Servers : Sendmail and Postfix
 Applications Summary :-
This program started in August 2012 to provide back-end technology support to multimedia companies and fulfill their requirements in various aspects. It involved getting trained in Linux and setting up multimedia infrastructure to provide an environment for the industry and its users.
 Responsibilities:
 Performed user and group management and other administration tasks.
 Kept the servers up and running with minimal downtime.
 Worked on Nagios, Cacti, MailScanner, Parallels Virtual Automation, Logwatch and Webmin to monitor the resources of all Prod/Non-Prod servers.
 Worked on VMware vSphere/ESXi, VirtualBox and OpenVZ for virtualization.
 Created and managed VPS containers on OpenVZ and backed them up using vzdump (see the sketch after this list).
 Directly handled the production server for backups, inventory and other activities.
 Worked on shell scripting (Bash).
 Worked with the open-source MySQL and phpMyAdmin databases as well as Oracle.
 Configured and maintained mail servers (Sendmail, Postfix).
 Checked logs and performed monitoring and maintenance for all Production/Non-Production servers.
 Worked on a ticketing tool to create and resolve tickets for issues raised by clients on all Production/Non-Production servers.
 Worked on Parallels Plesk 11 and cPanel for hosting.
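A minimal sketch of the OpenVZ container and vzdump backup workflow referred to above, assuming the classic vzctl/vzdump command-line tools; the container ID, OS template, IP address and dump directory are example values only.

```bash
#!/usr/bin/env bash
# Example OpenVZ container lifecycle and backup (illustrative values).

CTID=101

# Create a container from a cached OS template and give it basic settings.
vzctl create $CTID --ostemplate centos-6-x86_64 --config basic
vzctl set $CTID --hostname web01.example.com --ipadd 192.168.10.101 --save
vzctl start $CTID

# Back up the running container with vzdump (suspend mode keeps downtime low).
vzdump --suspend --compress --dumpdir /backup/vz $CTID

# A dump can later be restored into a new CTID with vzrestore, e.g.:
# vzrestore /backup/vz/vzdump-openvz-101-*.tar.gz 102
```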
 EDUCATIONAL DETAILS :-
Completed a B.E. in Computer Engineering from NMU with 67.93% (Distinction) in 2012.
 PERSONAL DETAILS :-
 Name : Nikunj Suryakant Ramani
 Date of Birth : 9th June 1991
 Marital Status : Married
 Passport No. : J9164123, valid till July 2021
 Languages Known : English, Hindi, Gujarati and Marathi
 Mailing Address : “BALKRISHNA KRIPA”, Haweli Street,
At: Kalana, Tal: Dhoraji, Dist: Rajkot (Gujarat), Pin Code: 360410
All the information mentioned above is true to the best of my knowledge.
Yours faithfully,
(Nikunj Ramani)