VIJAY PAI J
No. 202, Vars Eildon Castle, 1st Main, 16th B Cross, Pai Layout, Bangalore–16
vjpaij@rediffmail.com
+91-9886036693
JOB OBJECTIVE
Scaling new heights of success through hard work and dedication, leaving a mark of excellence at each step; a proven ability to
lead project teams to deliver agreed-upon solutions of the highest quality, often in complex and challenging customer
environments. Aiming for lead assignments in Big Data/Hadoop development or data analysis with a leading organization of
repute.
AREAS OF EXPERTISE
BigData Hadoop Analyst
Unix and Shell Scripting
Technical Support
IT Technical Analysis
Client Relationship Management
Project Execution
Testing
Software Development
PROFILE SUMMARY
• A competent professional with 7.5 years of experience in the retail sector, currently
working as Principal Software Engineer: 5.5 years in Mainframe development,
support and technical analysis, and over 2 years in Big Data/Hadoop.
• Verifiable experience in handling development & support projects, including
design & execution of frameworks, schedule development, creation of work
breakdown structures, resource management, progress monitoring & delivery
• Experience in working with Hadoop components such as HDFS, MapReduce, Hive, Pig,
HBase and Sqoop.
• Skilled in conducting accurate System Analysis with the implementation of
appropriate data collection and proposing solutions
• Proven abilities in developing software applications involving requirement analysis,
functional specifications, scheduling, system study, design, coding, unit testing,
quality reviews, debugging, documentation & troubleshooting
• Skilled in identifying client / business process needs & conceptualising solutions to
achieve corporate goals; an excellent track record of spearheading service
improvement initiatives to minimize gaps in the effectiveness of service delivery
• An innovative, loyal & result-oriented professional with strong
communication, analytical, interpersonal & problem-solving skills
• Experienced in working with the Agile methodology.
CORE COMPETENCIES
Hadoop Analyst:
• Technical expertise in Hive, Pig, MapReduce, Sqoop and HBase.
• Knowledge of Hadoop architecture, the Hadoop Distributed File System (HDFS) and the Hadoop ecosystem.
• Involved in Hadoop framework design, understanding business functionality and analysing business requirements.
• Loading files to HDFS and writing Hive queries to process the required data.
• Loading data to Hive tables and writing HiveQL queries to process it.
• Loading data from DWH systems to HBase using Sqoop.
• Writing Hive queries and Pig scripts.
• Experienced in importing and exporting data between HDFS and relational databases using Sqoop.
• Proficient in managing software release operations, from requirement gathering through deployment of code to live
environments.
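The Sqoop import/export work listed above can be sketched as shell commands. This is a minimal illustration only: the JDBC connection string, table names and HDFS paths below are hypothetical placeholders, not the actual project configuration.

```shell
#!/bin/sh
# Assemble a Sqoop import command line (RDBMS table -> HDFS directory).
build_sqoop_import() {
    db_url=$1; table=$2; target_dir=$3
    printf 'sqoop import --connect %s --table %s --target-dir %s --num-mappers 4' \
        "$db_url" "$table" "$target_dir"
}

# Assemble the reverse direction (HDFS directory -> RDBMS table).
build_sqoop_export() {
    db_url=$1; table=$2; export_dir=$3
    printf 'sqoop export --connect %s --table %s --export-dir %s' \
        "$db_url" "$table" "$export_dir"
}

# Hypothetical DWH source and targets:
import_cmd=$(build_sqoop_import 'jdbc:db2://dwh-host:50000/SALESDB' ORDERS /data/raw/orders)
export_cmd=$(build_sqoop_export 'jdbc:db2://dwh-host:50000/SALESDB' ORDER_SUMMARY /data/out/order_summary)

echo "$import_cmd"
echo "$export_cmd"
# On a cluster with Sqoop installed, these would be executed directly, e.g. eval "$import_cmd"
```

The commands are only assembled here, not executed, so the sketch stays runnable without a Hadoop cluster.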
Project Execution:
• Steering the successful rollout of projects with accountability of defining scope, setting timelines, analysing requirements,
prioritising tasks, identifying dependencies and evaluating risks & issues as per pre-set budgets
• Monitoring project progress & outstanding issues and ensuring the quality & timeliness of deliverables; extending post-
implementation support to technical team members by defining SLA norms
Technical Support:
• Providing post-deployment support until successful release of each application; producing support statistics to report issues &
possible precautions, and driving project change orders, rollbacks & error logs for change analysis
• Serving as a single point of contact, 24x7, for multiple applications; managing resources in the deployment & support areas
Client Servicing:
• Addressing queries of clients regarding IT applications by providing support for troubleshooting problems related to
performance tuning & application conflicts
• Maintaining healthy relations with internal & external stakeholders and providing support for various IT issues by keeping
close track of recent developments
ORGANISATIONAL EXPERIENCE
TESCO HSC, Bangalore as Principal Software Engineer
Key Projects Handled:
Title: Primary Transport and Fresh Volumetrics
Role: Development and Design
Skills: Unix, HDFS, Pig, Hive, Sqoop and MapReduce
Description: Primary Transport manages the scheduling of trucks/hauliers between suppliers & depots, whereas
Fresh Volumetrics manages the scheduling of trucks/hauliers between depots and stores. Their main
purpose is to calculate the cases, pallets and trucks needed for scheduling products. The system
mainly deals with scheduling purchase orders, building shipments, capturing goods-in and
generating voucher details.
This project moved all log data from individual servers into HDFS as the main log storage and
management system, then performed analysis on these HDFS data-sets. Flume was used to move
the log data into HDFS periodically; once a data-set was inside HDFS, Pig and Hive were used to
perform the various analyses.
Key Result Areas:
• Handling a team of 2 members.
• Attending daily status meetings with business users to discuss open issues.
• Transferring files from the OLTP server to the Hadoop file system.
• Writing queries in HiveQL and Pig.
• Establishing database connectivity using Sqoop.
• Importing and exporting data between Hive and HDFS.
• Processing and analysing data from Hive tables using HiveQL.
• Analysing transactions using Pig scripts and Hive to generate reports for end users.
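The log-analysis flow described above (Flume into HDFS, then Pig/Hive aggregation) can be mimicked locally with standard Unix tools. The pipeline below is an illustrative analogue of a Pig/Hive `GROUP BY ... COUNT` over log records, not one of the project's actual scripts; the two-field "log" format is invented for the example.

```shell
#!/bin/sh
# Count occurrences of the second field of each record, most frequent first --
# the shape of a typical GROUP BY / COUNT(*) aggregation in Pig or Hive.
count_by_second_field() {
    awk '{print $2}' | sort | uniq -c | sort -rn | awk '{print $1, $2}'
}

# Sample "log" lines (hypothetical format: method status):
printf 'GET 200\nGET 404\nPOST 200\n' | count_by_second_field
# -> 2 200
#    1 404
```

On the real cluster the same aggregation ran over HDFS data via Pig scripts or HiveQL rather than local pipes.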
Title: Logistics Management and Group Depot Ordering
Role: Development and Design
Skills: Unix, HDFS, Pig, Hive, Sqoop and MapReduce
Description: Logistics Management (LM) is a critical supply-chain system that manages the creation of purchase
orders for delivery of products from suppliers to depots in the UK & ROI, based on demand & stock in
depots / stores. It is interfaced with various systems, as it forms the starting point of the supply chain to
send, receive & share data for timely availability of products in stores.
Beyond the UK & ROI, TESCO is present in other countries, where purchase orders were raised by legacy
systems. There was a need to improve the purchase orders for better stock availability & less
wastage in depots. Group Depot Ordering was developed to replicate the UK & ROI functionality with
the existing interfaces; it has now been implemented across 7 countries.
Key Result Areas:
• Handling a team of 7 members.
• Producing coherent technical proposals that meet customer requirements.
• Attending daily status meetings with business users to discuss open issues.
• Managing development of documentation to meet client expectations.
• Assisting the senior technical analyst in defining effective strategies for the systems and developing system requirements.
• Writing script files for loading data to HDFS and processing data in HDFS.
• Writing Apache Pig scripts to validate and process HDFS data.
• Developing applications on Hadoop's MapReduce programming model in Java.
• Creating partitions of Hive tables to store processed results in a tabular format.
• Analysing the data specifications provided by Group countries.
• Preparing deployment checklists.
• Fixing defects and managing defect tracking with ClearQuest.
• Supporting system testing, user acceptance testing and pre-production testing.
• Production deployment and stabilization support.
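The Hive partitioning step above can be sketched as HiveQL driven from a shell script. The table, columns and partition keys below are hypothetical placeholders for illustration, not the project's real schema; the script only executes where the Hive CLI is present.

```shell
#!/bin/sh
# Hedged sketch: create a partitioned Hive table and write processed results
# into one partition. Schema and names are invented for the example.
hql='
CREATE TABLE IF NOT EXISTS order_summary (
  product_id STRING,
  cases INT
)
PARTITIONED BY (country STRING, order_date STRING)
STORED AS ORC;

INSERT OVERWRITE TABLE order_summary
PARTITION (country = "UK", order_date = "2016-05-10")
SELECT product_id, COUNT(*)
FROM orders
WHERE order_date = "2016-05-10"
GROUP BY product_id;
'

# Run only where the Hive CLI is actually available:
if command -v hive >/dev/null 2>&1; then
    hive -e "$hql"
else
    echo "hive not installed; HiveQL prepared but not run"
fi
```

Partitioning by country fits the Group Depot Ordering setting, where each Group country's data could be queried without scanning the others.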
Highlights:
• Bagged the Value Award “No One Tries Harder for Customers” for excellent dedication
• Received the “Star of the Month” award, with a cash prize of ₹2000, for successful delivery of a project during Christmas
• After delivery of the project, product availability in store improved to 98% and wastage in depots reduced to 2-3%
• Bagged the Value Award “Every Little Helps” for dedication & successful delivery of a project within the given timeline
IT SKILLS
Operating Platforms: z/OS, Unix and Windows
Primary Skills: Hadoop, MapReduce, HDFS, Hive, Pig, Sqoop, COBOL, JCL, Focus, SQL and Objectstar
Project Acquired Skill: Java and Unix Shell Scripting.
Database: VSAM, Huron, DB2 and Teradata.
Software Configuration Tools: Endeavor and Telon
Job Scheduling & Monitoring: CA-7
Other Tools / Utilities: Insync, Dump Master, Abend Aid, XCOM
Training Attended: BigData-Hadoop
Domain Knowledge: Retail Domain – Supply Chain Management.
Methodologies: Agile and Waterfall models.
EDUCATION
2007 B.E. from M.S.R.I.T., Bangalore affiliated to VTU
PERSONAL DETAILS
Date of Birth: 23rd October 1985
Languages Known: English, Hindi, Kannada and Konkani
LinkedIn: in.linkedin.com/in/vijaypaij/
