MONIKA RAGHUVANSHI
Hadoop Administrator
“Transforming large, unruly data sets into competitive advantages”
+1 2407083996
moni.raghuvansh@gmail.com
CAREER OBJECTIVE
 To pursue a growth-oriented career with a progressive company that provides scope to apply my knowledge and skills, helping me contribute
my best to the organization.
 To demonstrate my expertise as a Hadoop Administrator, ensuring Hadoop cluster administration and technical support in the large-scale IT industry.
PROFILE SUMMARY
 Purveyor of competitive intelligence and holistic, timely analysis of Big Data, made possible by the successful installation, configuration, administration
and maintenance of Hadoop ecosystem components and architecture.
 7 years of total IT experience with strong exposure to Hadoop and Unix administration.
 Excellent understanding of Hadoop architecture and the MapReduce (MR1) & YARN frameworks.
 Hands-on experience in installing, configuring and using Hadoop ecosystem components such as MapReduce, HDFS, Hive, HBase, ZooKeeper, Oozie,
Sqoop, Flume, Impala, Spark, Storm and Hue.
 Practical knowledge of the functionalities of all Hadoop daemons, the interactions between them, resource utilization, and dynamic tuning to keep the cluster
available and efficient.
 Ensuring Hadoop cluster security by implementing SSL and setting up a Kerberos environment.
 Experience troubleshooting Hadoop performance issues such as job failures, slow-running queries and cluster issues.
 Proficient in project management activities, including planning, design, scope management, estimation, resource administration and project
completion as per quality parameters and best-practice guidelines.
 Able to assess business rules, collaborate with stakeholders and perform source-to-target data mapping, design and review.
 Expertise with the full project life cycle and project management: standards and configuration management, QA management, team management,
documentation, implementation, project deployment, etc.
 Hands-on experience in all aspects of the software development life cycle (SDLC) in Waterfall & Agile environments.
 Knowledge of ISO, Capability Maturity Model Integration (CMMI), and IEEE processes.
 Experienced in analysis, design of manual and automated testing for Client/Server and Web-based applications.
 Strong interpersonal skills with the experience of working in a multi-cultural environment.
 Excellent written and oral Communication skills.
TECHNICAL EXPERTISE
Hadoop Distributions :
 Apache Hadoop
 Cloudera
Excellent installation and administration skills for Hadoop components:
 HDFS
 MapReduce
 Hive
 HBase
 Flume
 Sqoop
 Zookeeper
Proficient in supporting Linux operating systems:
 CentOS
 Ubuntu
 RHEL
Solid understanding of open source monitoring tools:
 Nagios
 Ganglia
 Cloudera Manager
 Hue
Familiarity with networks:
 TCP/IP
 Firewall
 DNS
Exceptional in overseeing system administration operations:
 Performance tuning
 Storage capacity management
Skilled in programming/scripting languages:
 Shell Scripting
 Core Java
Well-versed in databases:
 Oracle
 MS SQL
 MySQL
PROFESSIONAL EXPERIENCE SUMMARY
PROJECTS
Oct 2015 – Till date
1) Client: Barclays Financial
Location: Delaware
Role: Hadoop Admin
Responsibilities:
 Involved in Hadoop cluster administration, including commissioning & decommissioning of datanodes, capacity planning,
performance tuning, cluster monitoring and troubleshooting.
 Involved in Kerberos and SSL setup to ensure cluster security.
 Working with data delivery teams to set up new Hadoop users, which includes setting up Linux users (AD), setting up Kerberos
principals and testing HDFS, Hive, Pig and MapReduce access for the new users.
 Installed and configured MapReduce, Hive and HDFS; assisted with performance tuning and monitoring.
 Worked on setting up high availability for the major production cluster and designed automatic failover control using ZooKeeper and
quorum journal nodes.
 Performance tuning and optimization of clusters to get the best throughput using tools like Hive, Impala, HBase and Spark.
 Assembled newly bought hardware into racks with switches; assigned IP addresses, configured firewalls, enabled/disabled ports, etc.
 Importing and exporting data into HDFS and Hive using Sqoop.
 Configured Oozie for workflow automation and coordination.
 Configured ZooKeeper to implement node coordination for clustering support.
 Modified cluster nodes and analyzed Hadoop log files.
 Working on issues and defects reported on the CDH platform, driving them to closure in coordination with the Cloudera Support Team
where needed.
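The user-onboarding flow described above (Linux/AD user, Kerberos principal, HDFS access) can be sketched as a small script. This is a hedged illustration only: the realm, paths and the dry-run wrapper are assumptions, not details from the engagement.

```shell
#!/bin/sh
# Hedged sketch of onboarding a new Hadoop user: Kerberos principal plus
# HDFS home directory. REALM and paths are illustrative assumptions.
REALM="EXAMPLE.COM"

# run: execute the command, or just print it when DRY_RUN=1, so the
# whole onboarding flow can be reviewed before touching the cluster.
run() {
    if [ "${DRY_RUN:-0}" = "1" ]; then echo "$*"; else "$@"; fi
}

onboard_user() {
    user="$1"
    run kadmin -q "addprinc -randkey ${user}@${REALM}"   # Kerberos principal
    run hdfs dfs -mkdir -p "/user/${user}"               # HDFS home directory
    run hdfs dfs -chown "${user}:${user}" "/user/${user}"
}

DRY_RUN=1 onboard_user alice   # print the commands instead of running them
```

After the dry-run output is reviewed, rerunning without DRY_RUN executes the same steps against the cluster; access is then typically verified with kinit and a test HDFS listing for the new user.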
Jan 2013 – May 2015
2) Client : GE Healthcare
Location : Mumbai, India
Role: Hadoop Consultant
Responsibilities:
 Hands on experience with Apache Hadoop and CDH distributions.
 Installed and configured multiple Apache and CDH Hadoop clusters.
 Involved in Hadoop cluster planning, finalizing the architecture, onboarding projects onto the cluster, and cluster monitoring and
maintenance.
 Assisted with data capacity planning and node forecasting.
 Translation of functional and technical requirements into detailed architecture and design.
 Installation, monitoring, managing, troubleshooting and applying patches in different environments such as the Development Cluster, Test
Cluster and Production environment.
 Monitoring and controlling local file system disk space usage, local log files, cleaning log files with automated scripts.
 As a Hadoop admin, monitoring cluster health status on a daily basis and tuning system-performance-related configuration parameters.
 Assuring cluster security through Kerberos and SSL/TLS setup on the development cluster.
 Undertaking design & development, testing, debugging and troubleshooting of the application; administering smooth implementation
of the application.
 Managing code deployment, change management and quality assurance; providing technical support to Application Development
Teams and setting/maintaining standards.
 Supported technical team members with automation, installation and configuration tasks.
 Experience in setup, configuration and management of Apache Sentry for role-based authorization and privilege validation for the Hive
and Impala services.
 Setting up machines with network control, static IPs, disabled firewalls and swap memory.
 Suggested latest upgrades and patches for operating systems and Hadoop.
 Automating manual and repetitive tasks using shell scripts.
 Aligning with the systems engineering team to propose and deploy new hardware and software environments required for Hadoop and
to expand existing environments.
 Implemented rack aware topology on the Hadoop cluster.
 Implemented the Fair Scheduler on the JobTracker to allocate a fair share of resources to small jobs.
 Implemented Kerberos Security Authentication protocol for existing cluster.
 Good experience in troubleshooting performance and security level issues in the cluster and its functionality.
 Regular Commissioning and Decommissioning of nodes depending upon the amount of data.
 Managing Operational Acceptance Testing of projects/applications developed on the Hadoop platform and their deployment to the production
environment.
 Live deployment of applications built on the Hadoop platform, support for the first run in production, and long-term
resolution of incidents, contributing towards building stable and reliable systems.
 Composing development plans and effort estimations for new projects/change requests, distributing work among team members,
managing assigned deliveries and providing end-to-end consultancy across the project life cycle.
 Executing continuous integration and code deployment automation using Git, Jenkins, Nexus and shell scripts.
 Formulated procedures for the planning and execution of system upgrades for all existing Hadoop clusters.
 Participated in regular production support services to Hadoop infrastructure components.
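The automated log cleanup mentioned above (controlling local disk usage and cleaning log files with scripts) can be sketched as a shell function; the rotated-log filename pattern and the retention window are illustrative assumptions, not values from a real cluster.

```shell
#!/bin/sh
# Sketch of automated cleanup of rotated Hadoop service logs; the
# naming pattern and retention period below are assumptions.
cleanup_logs() {
    dir="$1"; days="$2"
    # Delete rotated log files (e.g. hadoop-hdfs-datanode.log.3) older
    # than the retention window; active *.log files are left untouched.
    find "$dir" -name '*.log.[0-9]*' -type f -mtime +"$days" -delete
}

# Example (hypothetical log directory): keep one week of rotated logs.
# cleanup_logs /var/log/hadoop 7
```

In practice a wrapper like this would run from cron, with its output mailed or shipped to the monitoring system so each run can be audited.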
April 2011 – Jan 2013
3) Client: Ontario Ministry of Transportation
Location: Mumbai, India
Role: Unix Administrator
Responsibilities:
 Involved in support and monitoring of production Linux systems.
 Managed the client’s Unix environments by installing, supporting, monitoring and automating the latest Unix servers.
 SQL installation and DB backup.
 Expertise in Archive logs and Monitoring the jobs.
 Monitoring daily Linux jobs and the log management system.
 Expertise in troubleshooting and able to work with a team to fix large production issues.
 Expertise in creating and managing DB tables, indexes and views.
 User creation and management of user accounts and permissions at the Linux and DB levels.
 Expertise in security at the OS and DB levels.
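The daily-job monitoring mentioned above can be sketched as a freshness check: did a given job write its log within the last day? The log directory layout and job names here are hypothetical.

```shell
#!/bin/sh
# Sketch of a daily-job check; directory layout and job names are
# hypothetical illustrations of the monitoring described above.
job_ran_today() {
    logdir="$1"; job="$2"
    # Succeeds (exit 0) if at least one matching log is newer than one day.
    [ -n "$(find "$logdir" -name "${job}*.log" -type f -mtime -1 2>/dev/null)" ]
}
```

A cron wrapper can mail an alert whenever `job_ran_today` fails, which is the kind of job and log-management monitoring referred to in the bullets above.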
Nov 2008 – April 2011
4) Client: Nortel, Canada
Location: Mumbai, India
Role: Unix Administrator
Responsibilities:
 Involved in development and monitoring of the application.
 Production patch and configuration implementation on production Unix servers.
 Good Experience in Estimating work effort.
 Performed backup and regression performance testing to establish the benchmark.
 Expertise in developing SQL scripts and performance tuning.
 Expertise in analyzing data quality checks using shell scripts.
 Expertise in loading data into the database using the Linux OS.
 Developing MapReduce programs to format data.
 Expertise in handling large data warehouses for pulling reports.
 Expertise in preparing HLDs and LLDs, and in preparing unit test cases based on functionality.
 Expertise in Linux OS health checks and debugging issues.
 Expertise in installing rpm packages on Linux.
 Expertise in security at the OS permission level and DB table level.
 Expertise in the Nagios alert system.
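The Linux health checks mentioned above can be sketched as a small function reporting load, memory, and any filesystem above a usage threshold; the 90% default threshold is an assumption for illustration.

```shell
#!/bin/sh
# Sketch of a basic Linux health check of the kind referenced above.
# The 90% default disk-usage threshold is an illustrative assumption.
health_check() {
    threshold="${1:-90}"
    uptime 2>/dev/null || true     # load averages
    free -m 2>/dev/null || true    # memory summary (Linux)
    # Flag any mounted filesystem whose usage exceeds the threshold
    # (df -P column 5 is Capacity, e.g. "42%"; column 6 is the mount point).
    df -P | awk -v t="$threshold" \
        'NR > 1 { sub(/%/, "", $5); if ($5 + 0 > t) print "WARN:", $6, $5 "%" }'
}

# Example: warn on anything over 90% full.
# health_check 90
```

A check like this, run from cron and fed into Nagios, matches the alerting setup described in the last bullet.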
TRAINING
 Administrator Training for Apache Hadoop
 Agile Training
 Software Testing Foundation and Advanced Level
 Goal Setting and Time Management
ACHIEVEMENTS
 Rolta Star Award
 TCS Gems and Great Team Spirit Award
 Great Individual Performer award at GE Healthcare
EDUCATIONAL QUALIFICATION
 MBA (Information Systems) from ICFAI, India
 Bachelor of Technology (BTech) from MIT Ujjain, India
