Manoj Kumar Sahu
Phone: 08446783942
Manojsahu.a9@gmail.com
SUMMARY
• Over 4 years of IT experience in Hadoop/Big Data and Linux support environments.
• Knowledge of Cassandra administration, application maintenance, and support.
• Knowledge of Hadoop administration with the NoSQL databases Cassandra, HBase, and MongoDB.
• Expertise with Hadoop and Cassandra tools such as Cloudera Manager (CM), Hortonworks, and OpsCenter.
• Knowledge of file systems such as HDFS, the Cassandra File System (CFS), and Linux/Unix file systems.
• Knowledge of the Cassandra API, write operations, and CQL (see the sketch following this list).
• Expertise in Hadoop and Cassandra architecture.
• Knowledge of the Cassandra data model, keyspaces, interfaces, and the CAP theorem.
• Hands-on commissioning, decommissioning, bootstrapping, and balancing of nodes.
• Knowledge of network topology and the Thrift, Avro, and Gossip protocols.
• Handling the AT&T Mobility Data Lake Hadoop project with the respective teams.
• Hands-on experience installing and managing Hadoop clusters and Cassandra.
• Hands-on experience installing Hadoop ecosystem components: Hive, Pig, Sqoop, Flume, and HBase.
• Expertise with the Hadoop master and slave daemons of HDFS and MapReduce.
• Knowledge of rack awareness and Cassandra cluster data centers.
• Hands-on experience performing Hadoop DistCp transfers.
• Installation of the Hadoop ecosystem and Apache Cassandra (1.0.x, 1.2.x, and 2.0.x).
• Hands-on TWS (8.5.1) job monitoring and tracking using log files.
• Hands-on experience with the Nagios and Ganglia monitoring tools.
• Hands-on monitoring and management of the respective Hadoop daemons.
• Hands-on experience installing and configuring Hadoop with Cloudera Manager.
• Monitoring the cluster and deploying the daemons using Cloudera Manager and Hortonworks.
• Good knowledge of MR2 (YARN), HA (High Availability), and the Hadoop JVM.
• Good knowledge of cluster tuning and node benchmarking processes.
• Hands-on experience installing and configuring Red Hat Linux, CentOS, and Ubuntu.
• Hands-on with remote tools: SuperPuTTY and VMware Workstation 10.
• Knowledge of Linux commands and setting up single-node and multi-node servers.
• Creating partitions, formatting, mounting, and changing owner, group, and other permissions.
• Knowledge of LVM, RAID, and Kickstart installation for Linux.
• Creating local and online repositories through RPM and YUM packages.
• Good knowledge of Linux NFS, FTP, and LDAP.
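As referenced in the CQL bullet above, a minimal sketch of defining a keyspace and table and inserting one row through the Python cassandra-driver; the contact point, keyspace, table, and columns are illustrative assumptions, not details from any particular project.

```python
# Illustrative sketch only: contact point, keyspace, table, and columns are assumptions.
from uuid import uuid4
from cassandra.cluster import Cluster

cluster = Cluster(["127.0.0.1"])   # assumed local Cassandra node
session = cluster.connect()

# Keyspace with SimpleStrategy replication (single data center, RF=3).
session.execute("""
    CREATE KEYSPACE IF NOT EXISTS demo_ks
    WITH replication = {'class': 'SimpleStrategy', 'replication_factor': 3}
""")

# Table keyed by a UUID partition key.
session.execute("""
    CREATE TABLE IF NOT EXISTS demo_ks.users (
        user_id uuid PRIMARY KEY,
        name    text,
        email   text
    )
""")

# A single write operation via a parameterized CQL INSERT.
session.execute(
    "INSERT INTO demo_ks.users (user_id, name, email) VALUES (%s, %s, %s)",
    (uuid4(), "test user", "test@example.com"),
)

cluster.shutdown()
```

SimpleStrategy is used only for brevity; a multi-data-center cluster would normally use NetworkTopologyStrategy.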
Software Skillset:
• Hadoop: HDFS, MapReduce, Cloudera Manager, Hortonworks.
• Hadoop Ecosystem: Hive, Pig, Sqoop, Flume, HBase, ZooKeeper.
• Security: Kerberos.
• Cluster Management Tools: Cloudera Manager, OpsCenter, JMX, Nodetool.
• NoSQL Databases: Cassandra.
• Relational Databases: Microsoft SQL Server, MySQL.
• Languages: C, basic Java, C#, HTML, PHP.
• Operating Systems: Linux, Unix, CentOS 6.4, Windows.
PROFESSIONAL EXPERIENCE:
Amdocs | Pune, India Mar 2014 – Present
Hadoop/Ops Administrator
Environment: Hadoop, SQL Server, DB, TWS 9.2.
Responsibilities:
• As Hadoop & Cassandra Administrator, support production servers such as KM, STL, and BHM, as well as the Dev and QA environments.
• Monitor and handle applications on TWS 9.2 in production and respond to Nagios alerts.
• Deploying applications on the Hadoop cluster.
• Handling the Gerick and IF frameworks.
• Managing file systems and creating operational scripts.
• Source file monitoring, runbook design, and package reviews.
• Automated and deployed Cassandra environments using Chef recipes.
• Evaluated, benchmarked, and tuned the data model by running endurance tests using JMeter, the Cassandra stress tool, and OpsCenter.
• Used Cassandra nodetool to manage the Cassandra cluster.
• Worked on optimizing the Cassandra cluster by making changes to the Cassandra configuration file and the Linux OS configuration.
• Management reporting, communication, and development support (environment queries, new application queries, etc.).
• Involved in writing multiple scripts to monitor the Cassandra cluster and OpsCenter (a minimal sketch follows this list).
• DistCp for dual-loaded sources and monitoring of publishing feeds.
• Handling of on-request jobs/re-publishing feeds and ad hoc data movement requests for QA testing.
• Job failure report and overview report maintenance when new sources are added.
• Deployment execution, code migrations, and long-running job monitoring and reporting.
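A minimal sketch of the kind of monitoring script mentioned in the bullet above: it shells out to nodetool (assumed to be on the PATH) and reports any node that is not in the Up/Normal (UN) state; printing to the console stands in for whatever Nagios or reporting hook a real script would use.

```python
# Illustrative sketch only: assumes `nodetool` is on the PATH; alerting is a plain print.
import subprocess
import sys

def check_cassandra_nodes():
    """Run `nodetool status` and flag nodes that are not Up/Normal (UN)."""
    output = subprocess.run(
        ["nodetool", "status"], capture_output=True, text=True, check=True
    ).stdout

    bad_nodes = []
    for line in output.splitlines():
        fields = line.split()
        # Data rows begin with a two-letter state code: U/D (up/down) + N/L/J/M (normal/leaving/joining/moving).
        if len(fields) >= 2 and len(fields[0]) == 2 and fields[0][0] in "UD" and fields[0][1] in "NLJM":
            state, address = fields[0], fields[1]
            if state != "UN":
                bad_nodes.append((address, state))

    for address, state in bad_nodes:
        print(f"ALERT: node {address} is in state {state}")
    if bad_nodes:
        sys.exit(1)
    print("All Cassandra nodes are Up/Normal.")

if __name__ == "__main__":
    check_cassandra_nodes()
```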
Cloudwick Technology, India 28 Oct 2011 – Mar 2014
Hadoop/Linux Support Engineer
Environment: Hadoop, SQL Server, DB, TWS 9.2, Linux 6.4.
Responsibilities:
• Scheduled repair and cleanup processes in the production environment during off-peak times.
• Administered and monitored the cluster using cluster management tools.
• Scheduled moving production data to the development environment for testing purposes.
• Installed and configured the MongoDB server and mongo client packages.
• Exported JSON data into MongoDB (a minimal sketch follows this list).
• Installed and configured Hive and the Hive metastore, and created Hive tables to get results.
• Exported MongoDB data into HDFS.
• Loaded Hive data and transferred it into MySQL using Sqoop.
• Experience in storing the analyzed results back into the Cassandra cluster.
• Designed and configured a gateway node for the cluster.
• Performed stress and performance testing and benchmarking for the cluster.
• Installed Hadoop-LZO compression using RPM and the YUM repository.
• Added compression properties to the Hadoop configuration files.
• Created LZO files and inserted them into HDFS.
• Created indexes and ran MapReduce jobs.
• Installed HDFS and added properties to the configuration files.
• Added the MySQL JDBC connector for Sqoop.
• Imported compressed data into MySQL using Sqoop.
• Added properties to the Flume configuration and loaded streaming data into HDFS.
• Ran streaming data into the Hadoop cluster and monitored it with Ganglia.
• Monitored node health using Nagios.
• Hands-on experience installing and configuring Red Hat Linux, CentOS, and Ubuntu.
• Hands-on with remote tools: VNC Viewer, TeamViewer, and VMware Workstation 10.
• Configured and managed SSH keys (ssh-keygen) for the cluster.
• Basic knowledge of LVM, RAID, Linux installation, and LAMP.
• Created local and online repositories through RPM and YUM packages.
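A minimal sketch of the JSON-to-MongoDB load referenced above, using the pymongo driver; the host, file name, database, and collection names are illustrative assumptions.

```python
# Illustrative sketch only: host, file name, database, and collection are assumptions.
import json
from pymongo import MongoClient

def load_json_into_mongo(path="records.json", db_name="demo_db", coll_name="records"):
    """Read a JSON file containing an array of objects and insert it into MongoDB."""
    client = MongoClient("mongodb://localhost:27017/")
    collection = client[db_name][coll_name]

    with open(path) as f:
        documents = json.load(f)   # expects a JSON array of documents

    if documents:
        result = collection.insert_many(documents)
        print(f"Inserted {len(result.inserted_ids)} documents into {db_name}.{coll_name}")

    client.close()

if __name__ == "__main__":
    load_json_into_mongo()
```

A subsequent export into HDFS, as in the bullet above, would read the same collection back out through a MongoDB-Hadoop connector or a similar tool.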