Sidhant_Resume
Sidhant Mahajan
Big Data Hadoop Developer
Email: sidhant07@live.com
Mobile: +91 7769 056 120, +91 8591 455 561

Objective:
Seeking a challenging role where I can contribute my best to the success of the company, and where the company provides me an opportunity to explore my potential to the fullest.

Professional Experience:
- 3.3 years of IT experience at TCS.
- 1.5 years of experience as a Hadoop Developer and MongoDB Administrator.
- Knowledge and experience of Hadoop and its ecosystem (HDFS, YARN, MapReduce, Hive, Impala, Sqoop, Flume, Oozie), Hue, and MongoDB.
- Working as a MongoDB Administrator.
- Basics of MapReduce programming.
- Unix commands and shell scripting.
- MapR certified Hadoop Developer.
- AS400 (SQL), Core Java, AWD (Automated Work Distributor) tool, Jira, PuTTY, WinSCP.

Projects and Trainings:

Customer: Barclays (Bank, UK) (Dec 2015 - till date)
Role: MongoDB Administration
Technical Skills: Data restoration, replicated cluster setup, shard cluster setup, cluster upgrades, troubleshooting cluster issues.
Achievements:
- M102 MongoDB certified.
- Appreciated by the client for resolving cluster issues.
- Provided training sessions on MongoDB to associates.
- Understood the concepts of NoSQL databases.
- Experience with CRUD and aggregation operations in MongoDB.
- Inserted large volumes of documents using the mongoimport tool.
- Extracted data using the mongoexport tool.
- Worked with three file types: CSV, TSV, and JSON.
- Well versed in the components of the MongoDB package and their uses.
Customer: PNC (Bank, US) (Aug 2015 - Nov 2015)
Role: Hadoop Developer
Technical Skills: Hive, MapReduce, HDFS, YARN, Hue (Hadoop web UI), Oozie, MySQL, Sqoop.
Achievements:
- Ingested large volumes of data from RDBMS (MySQL) into HDFS using Sqoop.
- Monitored Oozie jobs.
- Understood various Hive optimization techniques.
- Well versed in Hive functions.

Customer: Aviva (Insurance, UK) (April 2015 - July 2015)
Role: Hadoop Developer
Technical Skills: Hive, MapReduce, Hue (Hadoop web UI), HDFS, YARN, Oozie.
Achievements:
- Processed large data volumes.
- Provided a near-real-time solution to generate a single view of the customer.
- Provided reusable and scalable Hive User Defined Function (UDF) code that generates a primary key for each row.
- Reduced the batch window for the ETL process.
- Data rows belonging to the same customer receive the same master ID.
- Saved over 6 hours compared to traditional Master Data Management tools.
TCS Horizontal DESS, Pune (Jan 2015 - Mar 2015)
Role: Hadoop Developer
Technical Skills: MapReduce, Hive, Pig, Impala, Sqoop, Flume
Achievements:
- Understood the differences between RDBMS (MySQL) and Hadoop, and solved various problems while migrating data from MySQL to Hadoop (HDFS).
- Learned and explored Hadoop cluster management.
- Learned to use different Hadoop ecosystem components such as Hive, Impala, Pig, and Flume.
- Implemented SCD type-1 and type-2 using Hive and Impala.
- Set up a MongoDB shard cluster.
- Removed special characters from the data set using a Hive UDF and MapReduce.
- Used a RowSequence UDF to create surrogate keys in Hive.
- Created an agent configuration file and ingested data into HDFS using Flume.

Customer: Friends Life (FL) (Sept 2013 - Dec 2014)
Role: Developer
Project Profile: AWD, a tool used to automate work distribution
Responsibilities:
- Reviewed the SRS (System Requirement Specification) received from the business, walked through and clarified our understanding with the business, and provided effort estimates.
- Distributed and allocated work among the team.
- Enhancements and code review.
- Analysed the number of incidents and MWI per month, which helped reduce the number of repeat cases.
- Wrote SQL scripts.
- Automated processes using Java.
- Handled issues for AWD users across the globe.
- Interacted with the client on ongoing issues and major incidents.
Technologies & Tools: Core Java, Assyst tool, SQL (AS400)
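The Flume ingestion above is driven by an agent configuration file. A minimal sketch of such a file is shown below (the agent name, directory paths, and component names are illustrative, not taken from the project):

```properties
# Hypothetical Flume agent: spooling-directory source -> memory channel -> HDFS sink
agent1.sources = src1
agent1.channels = ch1
agent1.sinks = sink1

# Source: picks up files dropped into a local spool directory
agent1.sources.src1.type = spooldir
agent1.sources.src1.spoolDir = /var/data/incoming
agent1.sources.src1.channels = ch1

# Channel: in-memory buffer between source and sink
agent1.channels.ch1.type = memory
agent1.channels.ch1.capacity = 10000

# Sink: writes events into a date-partitioned HDFS directory
agent1.sinks.sink1.type = hdfs
agent1.sinks.sink1.hdfs.path = /user/flume/events/%Y-%m-%d
agent1.sinks.sink1.hdfs.fileType = DataStream
agent1.sinks.sink1.hdfs.useLocalTimeStamp = true
agent1.sinks.sink1.channel = ch1
```

Such a file would be started with `flume-ng agent --name agent1 --conf-file agent1.conf`.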
POCs:
1) Removed special characters from data in HDFS using a Hive UDF.
2) Imported relational data from MySQL using Sqoop. The imported data was loaded into Hive tables, and SCD (Slowly Changing Dimensions, type-1 and type-2) operations were performed on the data to calculate the delta. The real-time query engine Impala was then used for querying.
3) Set up a MongoDB cluster in the project lab of TCS DESS, Pune.

Certifications:
- IBM-recognized certification from Big Data University in Hadoop fundamentals and Pig.
- MapR certified in Hadoop essentials.
- M102 MongoDBA certified from MongoDB University.

Academic Details:
Class/Degree    | Institution/College                      | Board/University            | Year of Passing | % of Marks
B.Tech (ECE)    | Amritsar College of Engg. and Technology | Punjab Technical University | 2008-2012       | 72%
10+2 (Non-Med.) | Manav Public School, Amritsar            | C.B.S.E.                    | 2008            | 65%
10th            | St. Francis School, Amritsar             | I.C.S.E.                    | 2006            | 78%

Interests and Hobbies:
Counter-Strike, internet surfing, casual follower of fighter jets, keeping up to date with the latest technology trends in the market.
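The special-character-removal POC above centres on a small piece of string-cleansing logic. A sketch of that core logic in plain Java follows (class and method names are hypothetical; the actual UDF would wrap this in a Hive `evaluate` method operating on `Text`):

```java
// Illustrative sketch of the cleansing logic a Hive UDF for removing
// special characters might wrap. Not the project's actual code.
public class SpecialCharCleaner {

    // Keep letters, digits, and whitespace; strip every other character.
    public static String clean(String input) {
        if (input == null) {
            return null; // Hive UDFs conventionally pass NULL through
        }
        return input.replaceAll("[^A-Za-z0-9\\s]", "");
    }

    public static void main(String[] args) {
        System.out.println(clean("Cust#Name: O'Brien!")); // prints "CustName OBrien"
    }
}
```

In a real deployment this method body would sit inside a class extending Hive's UDF base class, be packaged as a JAR, and be registered with `ADD JAR` and `CREATE TEMPORARY FUNCTION` before use in queries.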
Extra-Curricular Activities:
- Actively participate in events organized by TCS.
- Passed my graduation degree with distinction.
- Participated in table tennis tournaments.
- Participated in an intra-college gaming competition and stood second.
- Worked as a student coordinator in FUSION (science and cultural fair).
- Active participant in intra-college aptitude tests.
- Participated in the national-level aptitude test NITAT.
- Coordinator in the national-level tech fest "PRAYAAS-2011".

Personal Profile:
Date of Birth: 14-12-1989
Father's Name: Mr. Sushil Mahajan
Sex: Male
Linguistic Skills: English, Hindi, Punjabi
Address: 143-A Tilak Nagar, Near Shivala Mandir, Amritsar, Punjab (143001)

Declaration:
I assure you that the above information is true to the best of my knowledge.

Sidhant Mahajan
