VIJAY PAI J
No. 202, Vars Eildon Castle, 1st Main, 16th B Cross, Pai Layout, Bangalore–16
vjpaij@rediffmail.com
+91-9886036693
JOB OBJECTIVE
Seeking lead assignments in Big Data Hadoop Development / Data Analysis with a leading organization of repute, bringing a proven
ability to lead project teams to successfully deliver agreed-upon solutions of the highest quality, often in complex and challenging
customer environments.
AREAS OF EXPERTISE
Big Data Hadoop Analysis
Unix and Shell Scripting
Technical Support
IT Technical Analysis
Client Relationship Management
Project Execution
Testing
Software Development
PROFILE SUMMARY
• A competent professional with 7.5 years of experience in the retail sector, working as Principal
Software Engineer: 5.5 years in Mainframe Development, Support and Technical
Analysis, and over 2 years in Big Data Hadoop.
• Verifiable experience in handling Development & Support projects, including
design & execution of frameworks, schedule development, creation of work
breakdown structures, resource management, progress monitoring & delivery
• Experience in working with Hadoop components like HDFS, MapReduce, Hive, Pig,
HBase and Sqoop.
• Skilled in conducting accurate system analysis, implementing appropriate
data collection and proposing solutions
• Proven abilities in developing software applications involving requirement analysis,
functional specifications, scheduling, system study, designing, coding, unit testing,
quality reviews, debugging, documentation & troubleshooting
• Skilled in identifying client / business process needs & conceptualising solutions to
achieve corporate goals; excellent track record of spearheading Service
Improvement initiatives to minimize gaps in effectiveness of service delivery
• An innovative, loyal & result-oriented professional with strong
communication, analytical, interpersonal & problem-solving skills
• Experienced in working with the Agile methodology.
CORE COMPETENCIES
Hadoop Analyst:
• Technical expertise in Hive, Pig, MapReduce, Sqoop and HBase.
• Knowledge of Hadoop architecture, the Hadoop Distributed File System (HDFS) and the wider Hadoop ecosystem.
• Involved in Hadoop framework design, understanding business functionality and analysing business requirements.
• Loading files to HDFS and writing Hive queries to process the required data.
• Loading data into Hive tables and writing HiveQL queries to process it.
• Loading data from DWH systems into HBase using Sqoop.
• Writing Hive queries and Pig scripts.
• Experience in importing and exporting data using Sqoop from HDFS to relational databases and vice versa (a sketch follows this list).
• Proficient in managing software release operations, from requirement gathering through deployment of code to live environments.
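Illustrative sketch of the Sqoop transfers above; the connection string, credentials, tables and HDFS paths are hypothetical, not details from an actual project:

    #!/bin/sh
    # Illustrative sketch: database host, credentials, tables and paths are hypothetical.

    # Import a relational table into HDFS as tab-delimited text files.
    sqoop import \
      --connect jdbc:mysql://dbhost:3306/retail \
      --username etl_user -P \
      --table purchase_orders \
      --target-dir /data/raw/purchase_orders \
      --fields-terminated-by '\t' \
      --num-mappers 4

    # Export processed results from HDFS back into a relational table.
    sqoop export \
      --connect jdbc:mysql://dbhost:3306/retail \
      --username etl_user -P \
      --table po_summary \
      --export-dir /data/processed/po_summary \
      --input-fields-terminated-by '\t'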
Project Execution:
• Steering the successful rollout of projects with accountability of defining scope, setting timelines, analysing requirements,
prioritising tasks, identifying dependencies and evaluating risks & issues as per pre-set budgets
• Monitoring project progress & outstanding issues and ensuring the quality & timeliness of deliverables; extending post-
implementation support to technical team members by defining SLA norms
Technical Support:
• Providing post-deployment support until the successful release of each application; reporting support statistics on issues &
possible precautions, and driving project change orders, rollbacks & error logs for change analysis
• Serving as a single point of contact, 24x7, for multiple applications; managing resources in deployment & support areas
Client Servicing:
• Addressing client queries regarding IT applications by providing support for troubleshooting problems related to
performance tuning & application conflicts
• Maintaining healthy relations with internal & external stakeholders to provide support for various IT issues by keeping
close track of recent developments
ORGANISATIONAL EXPERIENCE
TESCO HSC, Bangalore as Principal Software Engineer
Key Projects Handled:
Title: Primary Transport and Fresh Volumetrics
Role: Development and Design
Skills: Unix, HDFS, Pig, Hive, Sqoop and MapReduce
Description: Primary Transport manages the scheduling of trucks/hauliers between suppliers & depots, whereas
Fresh Volumetrics manages the scheduling of trucks/hauliers between depots and stores. The main
purpose of Primary Transport and Fresh Volumetrics is to calculate the cases, pallets and trucks
needed for scheduling the products. The system mainly deals with scheduling purchase orders,
building shipments, capturing goods-in and generating voucher details.
This project moved all log data from individual servers to HDFS as the main log storage and
management system, and then performed analysis on these HDFS data-sets. Flume was used to move
the log data periodically into HDFS; once a data-set was inside HDFS, Pig and Hive were used to
perform various analyses.
Key Result Areas:
• Handling a team of 2 members.
• Attending daily status meetings with business users to discuss open issues
• Involved in transferring files from OLTP servers to the Hadoop file system.
• Involved in writing queries with HiveQL and Pig.
• Involved in connecting to databases using Sqoop.
• Importing and exporting data between Hive tables and HDFS.
• Processing and analysing data from Hive tables using HiveQL.
• Analysing transactions using Pig scripts and Hive to generate reports for end users (a simplified sketch of this flow follows)
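A simplified sketch of the log-analysis flow described above, assuming tab-delimited log records; all paths, fields and table names are hypothetical:

    #!/bin/sh
    # Illustrative sketch: paths, field layout and table names are hypothetical.

    # Land the collected server logs in HDFS, the main log store.
    hdfs dfs -mkdir -p /data/logs/transport
    hdfs dfs -put /var/log/transport/*.log /data/logs/transport/

    # Point an external Hive table at the logs and summarise them with HiveQL.
    hive -e "
    CREATE EXTERNAL TABLE IF NOT EXISTS transport_logs (
      log_ts   STRING,
      depot_id STRING,
      cases    INT,
      pallets  INT)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t'
    LOCATION '/data/logs/transport';

    SELECT depot_id, SUM(cases) AS total_cases, SUM(pallets) AS total_pallets
    FROM transport_logs
    GROUP BY depot_id;
    "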
Title: Logistics Management and Group Depot Ordering
Role: Development and Design
Skills: Unix, HDFS, Pig, Hive, Sqoop and MapReduce
Description: Logistics Management (LM) is a critical supply-chain system that manages the creation of purchase
orders for delivery of products from suppliers to depots in the UK & ROI, based on demand & stock in
depots/stores. It is interfaced with various systems, as it forms the starting point of the supply chain to send,
receive & share data for timely availability of products in store.
Beyond the UK & ROI, TESCO is present in other countries, where purchase orders were raised by legacy
systems. There was a need to improve purchase ordering for better stock availability & less
wastage in depots. Group Depot Ordering was developed to replicate the UK & ROI functionality with
the existing interfaces; it has now been implemented across 7 countries.
Key Result Areas:
• Handling a team of 7 members.
• Producing coherent technical proposals that meet customer requirements
• Attending daily status meetings with business users to discuss open issues
• Managing the development of documentation to meet client expectations
• Assisting the senior technical analyst in defining effective strategies for the systems and developing system requirements
• Writing script files for loading data to HDFS and processing data in HDFS.
• Writing Apache Pig scripts to validate and process the HDFS data (a sketch follows this list).
• Developing applications in Java on Hadoop's MapReduce programming model.
• Creating partitioned Hive tables to store the processed results in a tabular format
• Analysing the data specifications provided by Group countries.
• Preparing the deployment checklist.
• Fixing defects and managing defect tracking with ClearQuest.
• Supporting system testing, user acceptance testing and pre-production testing.
• Production deployment and stabilization support.
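A minimal sketch of the Pig validation and Hive partitioning steps above; the record layout, paths and table names are assumptions for illustration only:

    #!/bin/sh
    # Illustrative sketch: record layout, paths and table names are hypothetical.

    # Pig script: drop malformed rows, keep UK records for the UK partition.
    cat > /tmp/validate_orders.pig <<'EOF'
    raw   = LOAD '/data/raw/orders' USING PigStorage('\t')
            AS (order_id:chararray, depot_id:chararray, qty:int, country:chararray);
    valid = FILTER raw BY order_id IS NOT NULL AND qty > 0 AND country == 'UK';
    uk    = FOREACH valid GENERATE order_id, depot_id, qty;
    STORE uk INTO '/data/clean/orders_uk' USING PigStorage('\t');
    EOF
    pig -f /tmp/validate_orders.pig

    # Store the validated results in a Hive table partitioned by country.
    hive -e "
    CREATE TABLE IF NOT EXISTS orders_clean (
      order_id STRING, depot_id STRING, qty INT)
    PARTITIONED BY (country STRING)
    ROW FORMAT DELIMITED FIELDS TERMINATED BY '\t';

    LOAD DATA INPATH '/data/clean/orders_uk'
    INTO TABLE orders_clean PARTITION (country = 'UK');
    "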
Highlights:
• Bagged Value Award as “No One Tries Harder for Customers” for excellent dedication
• Received “Star of the Month” award with a cash prize of ₹2,000 for successful delivery of a project during Christmas
• After delivery of the project, product availability in store improved to 98% and wastage in depots reduced to 2-3%.
• Bagged Value Award - “Every Little Helps” for dedication & successful delivery of project within given timeline
IT SKILLS
Operating Platforms: z/OS, Unix and Windows
Primary Skills: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, COBOL, JCL, Focus, SQL and Objectstar
Project Acquired Skill: Java and Unix Shell Scripting.
Database: VSAM, Huron, DB2 and Teradata.
Software Configuration Tools: Endevor and Telon
Job Scheduling & Monitoring: CA-7
Other Tools / Utilities: Insync, Dump Master, Abend Aid, XCOM
Training Attended: BigData-Hadoop
Domain Knowledge: Retail Domain – Supply Chain Management.
Methodologies: Agile and Waterfall models.
EDUCATION
2007: B.E. from M.S.R.I.T., Bangalore (affiliated to VTU)
PERSONAL DETAILS
Date of Birth: 23rd October 1985