ANKIT BEOHAR
Email: ankitbeohar90@gmail.com
Phone: +91 8802292518
~ SEEKING CHALLENGING ASSIGNMENTS TO LEVERAGE EXPERIENCE & EXPERTISE WITH AN ORGANIZATION OF HIGH REPUTE ~

PROFESSIONAL SNAPSHOT
• Solutions-focused, meticulous, and result-oriented B.E. (Electronics & Communications) professional offering 4.3 years of successful career in the IT arena, distinguished by commended performance and proven results
• Proven track record of excellence in data engineering and related areas: analysis, design, and development of databases and data warehouses (DWH), business intelligence, and business analytics for different projects
• Expertise in handling Hadoop activities such as development and support
• Extensive experience on the Oracle database platform, with expertise in SQL standards, PL/SQL, and the Hadoop stack
• Experience in advanced analytics
• Broad exposure to data warehousing, including Extract, Transform, and Load (ETL) processes
• Experience in big data analysis and development
• Excellent domain knowledge in Telecom and Oil & Gas
• Experienced working in Agile environments
• Excellent interpersonal, communication, and organizational skills with proven abilities in team management and planning

EMPLOYMENT CHRONICLE

DATA ENGINEER, IMPETUS (CLEAR-TRAIL), NOIDA (MAR '16 - PRESENT)

Project #1: OSINT
Role: Data Engineer/Big Data Developer
Technologies: MySQL, Unix, Hadoop, Hive, Sqoop, HBase, Kafka, Spark, R, Tableau, ROSOKA
Description: Clear-Trail is the product arm of Impetus. We are developing an Open Source Intelligence (OSINT) product. Web feeds such as website data are the input to the application; we then run NLP and deliver the analysis. We use ROSOKA for NLP and Hadoop for storing the data.
Key Deliverables:
• Involved in analysis of end-user requirements and business rules based on given documentation; worked closely with tech leads and business analysts to understand the current system
• Created a Cloudera cluster with the desired Hadoop stack (Hive, HBase, Spark, etc.)
• Involved in Hadoop administration work using Cloudera Manager and HUE
• Created custom MapReduce jobs to parse files (CSV, JSON) and load them into Hadoop (HBase)
• Created Sqoop scripts to load data into Hadoop, scheduled via crontab
• Created Spark Streaming jobs
• Implemented Spark SQL for some use cases
• Loaded real-time data into HBase with the help of Kafka and Spark
• Ran NLP using the ROSOKA APIs
• Knowledge of regression techniques such as logistic and linear regression

AREAS OF EXPERTISE
• Planning & Analysis
• Requirements Analysis
• Database Administration
• Database Design & Development
• Data Warehouse Management
• Data Analysis, BI, Data Engineering
• Hadoop Development
• Spark, Hive, MR, R, Python
• Oracle PL/SQL
• Tableau, SAP Webi
• Statistical Tool: R
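As an illustration of the record handling described above (parsing JSON feed records into HBase rows), here is a minimal sketch in plain Python. The field names and the row-key scheme are hypothetical, chosen only to show the pattern, not taken from the actual project:

```python
import json

def record_to_hbase_row(raw_json):
    """Parse one JSON feed record into an HBase-style row:
    a row key plus column-family-qualified cells."""
    rec = json.loads(raw_json)
    # Hypothetical row key: source + timestamp, a common HBase pattern
    # that keeps records from one source contiguous.
    row_key = f"{rec['source']}|{rec['timestamp']}"
    cells = {
        "meta:source": rec["source"],
        "meta:timestamp": str(rec["timestamp"]),
        "content:text": rec.get("text", ""),
    }
    return row_key, cells

row_key, cells = record_to_hbase_row(
    '{"source": "webfeed", "timestamp": 1458000000, "text": "sample"}'
)
# row_key == "webfeed|1458000000"
```

In a real MapReduce job, a function like this would run in the map phase and the resulting cells would be written out as HBase Puts.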
• Hands-on experience with Python MapReduce

DATA ENGINEER, LIVECAREER, NOIDA (OCT '15 - MAR '16)

Project #1: JobTap
Role: Data Engineer/Big Data Developer
Technologies: MySQL, Unix, Hadoop, Hive, Sqoop, HBase, Kafka, Spark, R, Tableau
Description: LiveCareer is a job portal, and its data relates to job-seeker profiles such as resumes and cover letters. The data volume is very high, around 50 GB a day. The goal of this project was to store this high-volume data in Hadoop and create a DWH for BI purposes. Relational data is stored in Hive, and semi-structured clickstream data is stored in HBase. Historical data resided in the Azure cloud and was also successfully loaded into Hadoop (Hive + HBase).
Key Deliverables:
• Involved in analysis of end-user requirements and business rules based on given documentation; worked closely with tech leads and business analysts to understand the current system
• Created a Hortonworks cluster with the desired Hadoop stack (Hive, HBase, Spark, etc.)
• Involved in Hadoop administration work using Ambari and HUE
• Created custom MapReduce jobs to parse files (CSV, JSON) and load them into Hadoop (HBase)
• Created Sqoop scripts to load data into Hadoop, scheduled via crontab
• POC for loading real-time data into HBase with the help of Kafka and Spark
• POC in R for advanced analytics; built BTYD and churn analyses
• Knowledge of regression techniques such as logistic and linear regression
• Hands-on experience with Python MapReduce
• Hands-on experience with Spark
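The Python MapReduce work mentioned above can be sketched with a minimal, self-contained map and reduce pair. The job itself (counting postings per category) and the tab-separated input format are hypothetical, chosen only to show the map/reduce pattern:

```python
from collections import defaultdict

def mapper(line):
    """Map phase: emit (category, 1) for each posting line of the
    assumed form 'category<TAB>title'."""
    category, _, _title = line.partition("\t")
    yield category, 1

def reducer(pairs):
    """Reduce phase: sum the counts per key, as the framework's
    shuffle-and-reduce step would."""
    totals = defaultdict(int)
    for key, value in pairs:
        totals[key] += value
    return dict(totals)

lines = ["engineering\tData Engineer", "sales\tAccount Exec", "engineering\tDBA"]
counts = reducer(pair for line in lines for pair in mapper(line))
# counts == {"engineering": 2, "sales": 1}
```

With Hadoop Streaming, the same mapper and reducer would read lines from stdin and write tab-separated key/value pairs to stdout instead of using Python generators.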
SOFTWARE ENGINEER, ONE97 (PAYTM) COMMUNICATIONS LTD., NOIDA (NOV '14 - OCT '15)

Project #1: ARPU
Role: Data Analyst/Database Developer/Big Data Developer
Technologies: MySQL, SAP, BIRT, Unix, Hadoop, Hive, R, Tableau
Description: ARPU (average revenue per user) is a telecom-domain project in which we maintain circle-wise data for all telecom (mobile) users in a single repository (data warehouse), organized by the operator's VAS/non-VAS services. We analyze the data and create dashboards and reports from it.
Key Deliverables:
• Involved in analysis of end-user requirements and business rules based on given documentation; worked closely with tech leads and business analysts to understand the current system
• Created custom ETL processes using Unix shell scripts and MySQL procedures and functions
• Stored source data in a MySQL database
• Stored source data in the columnar database Vectorwise
• Analyzed the data using methods such as linear and logistic regression
• Experience in big data development as well as analytics
• Experience across the Hadoop development lifecycle, from installation to analytics
• Experience in Hive and Sqoop
• Experience writing MapReduce jobs in Java
• Experience with Apache Spark (POC)
• Experience in HBase, ZooKeeper, Pig, and Pig Latin
• Created BIRT and SAP BO reports and dashboards based on key processing indicators
• Created and designed the SAP BO universe and mapped business requirements
• Scheduled daily reports and sent them by mail to the concerned users
• Handled heavy daily data volumes and optimized queries and ETL processes
• Archived daily data into the Hadoop cluster using Hive integrated with Unix shell scripts
• Analyzed historical data using Hive queries
• Handled DBA activity for MySQL and Vectorwise
• Created MySQL procedures to handle data activity
• Extensive experience in data visualization; designed and developed dashboards in Tableau

SYSTEM ENGINEER, TATA CONSULTANCY SERVICES LTD., GURGAON (DEC '11 - NOV '14)

Project #1: Navig-8
Role: Designer/Developer
Technologies: Tableau, Oracle
Description: This is a water-utility management system in which data comes from different sources and is then managed in a single repository. It involves transforming raw data into something meaningful and useful for business purposes. With the help of the business intelligence tool QlikView, we generate dashboards and reports and also integrate them with GIS maps.
Key Deliverables:
• Involved in analysis of end-user requirements and business rules based on given documentation; worked closely with tech leads and business analysts to understand the current system
• Analyzed the requirements and framed the business logic for the ETL process
• Designed the data warehouse using the star-schema methodology and converted data from various sources into Oracle tables
• Developed views, functions, procedures, and packages using PL/SQL and SQL to transform data from the source staging area to the target staging area
• Wrote SQL queries to perform data-validation and data-integrity testing
• Formulated SQL*Loader scripts to load legacy data into Oracle staging tables
• Created staging tables to store validated customer data prior to loading it into customer interface tables
• Played a key role in the maintenance and support of the deployed application
• Archived daily data into the Hadoop cluster using Hive integrated with Unix shell scripts
• Extensive experience in data visualization; designed and developed dashboards in Tableau

Project #2: Seismic Encoding Decoding Tool
Role: Designer/Developer
Technologies: Java, JSP, Dojo, Oracle 11g
Description: This tool is used to generate (encode) files on the basis of different seismic metadata values. The encoding rules are defined in the system. A unique number is given to each file, and users can decode the files and check the metadata values. Through the user-management tab, an admin user can define the version and the encoding rules of the metadata fields for file creation. A version can be saved as active, or kept as a draft version for future modification.
Key Deliverables:
• Supervised the tasks of designing, developing, and maintaining optimized databases
• Developed structural designs of various databases based upon logical data models
• Performed physical database design, such as join indexes, primary indexes, and secondary indexes
• Populated databases with new information and transferred existing data into them
• Extensively used joins and sub-queries for complex queries involving multiple tables from different databases
• Wrote functions and procedures to meet business requirements
• Wrote Oracle batch jobs to send mail and SMS through SMTP

Project #3: Nomination Checker
Role: Designer/Developer
Technologies: Java, JSP, Dojo, Oracle 11g
Description: Maersk Oil produces a large amount of gas as part of its business, which is transported by pipeline. Pipeline 'transmission' networks are organized in 'systems', with each system having a 'transmission system operator' (TSO). The rules are such that a company transporting natural gas has to be 'in balance' in each system, meaning that what is put into the system ('entry') equals what is taken out ('exit'). If a company is not in balance, it incurs penalties. The process of informing the TSO of entry and exit is known as 'nomination'; in other words, we have to nominate each entry and exit of gas to the individual TSO.
Key Deliverables:
• Supervised the tasks of designing, developing, and maintaining optimized databases
• Developed structural designs of various databases based upon logical data models
• Performed physical database design, such as join indexes, primary indexes, and secondary indexes
• Populated databases with new information and transferred existing data into them
• Extensively used joins and sub-queries for complex queries involving multiple tables from different databases
• Wrote functions and procedures to meet business requirements
• Wrote Oracle batch jobs to send mail and SMS through SMTP
• Wrote Oracle batch jobs to parse incoming mail

Project #4: Vehicle Tracking System (VTS)
Role: Designer/Developer
Technologies: Java, JSP, Dojo, PostgreSQL
Description: This product is a fully fledged vehicle tracking system with integrated GIS support for real-time viewing. The product supports tracking of all vehicles, running or stopped, whether they are over-speeding, going off route, or changing the specified route.
GIS support is provided to view real-time data on the map, visualizing the position of a vehicle at a particular moment, with alarms generated for every unwanted activity by the vehicle's driver.
Key Deliverables:
• Supervised the tasks of designing, developing, and maintaining optimized databases
• Developed structural designs of various databases based upon logical data models
• Performed physical database design, such as join indexes, primary indexes, and secondary indexes
• Populated databases with new information and transferred existing data into them
• Extensively used joins and sub-queries for complex queries involving multiple tables from different databases
• Wrote functions and procedures to meet business requirements
• Wrote a function to parse XML data in real time and integrate the data with GIS maps
• Wrote PostGIS functions for locating positions

Project #5: Task Management System
Role: Designer/Developer
Technologies: Java, JSP, Dojo, Oracle 11g
Description: This application is built upon the JIRA product, which is used for workflow management. In this application, a user creates a task (bug, maintenance) for a given project or domain. Each project has a specific workflow for resolving a task. The application is managed by different user groups such as Data Manager and Admin. When a task is created, all users in the workflow for that task receive a
mail and start working on it. Until the Data Manager verifies the solution of a task, the task will not be resolved.
Key Deliverables:
• Supervised the tasks of designing, developing, and maintaining optimized databases
• Developed structural designs of various databases based upon logical data models
• Performed physical database design, such as join indexes, primary indexes, and secondary indexes
• Populated databases with new information and transferred existing data into them
• Extensively used joins and sub-queries for complex queries involving multiple tables from different databases
• Wrote functions and procedures to meet business requirements
• Handled upgrading and customizing the JIRA database
• Created backup copies of data through Oracle batch jobs
• Handled maintenance and support of the deployed application

CREDENTIALS
Professional:
• B.E. (Electronics & Communications) from GRKIST Jabalpur, RGTU Bhopal University, in 2011 with 69.25%
Academic:
• XII from M.P. Board in 2007 with 80.22%
• X from M.P. Board in 2005 with 85.00%
Technical:
Operating Systems: Windows Server 2008, XP, Vista, Windows 7, Windows 8
Languages: SQL, PL/SQL
Databases: Oracle 10g/11g, PostgreSQL, MySQL
IDEs: Oracle SQL Developer, Oracle Data Modeler, PostgreSQL pgAdmin, SQLyog

PERSONAL VITAE
Date of Birth: 04th June, 1990
Languages Known: English & Hindi
Mailing Address: H.No. 259, GF, Sector 17A, Gurgaon, Haryana - 122001
Location Preference: Hyderabad / Bangalore / Delhi NCR / Muscat / Dubai / Abu Dhabi
Passport Status: Available
Visa Status: Available (UK CoS ST)
References: Will be pleased to furnish upon request

(ANKIT BEOHAR)
