Abinash Bindhani
Hadoop Developer
E-mail: abinash61991@gmail.com
Contact no.: +91-9341134022
Career Objective
Seeking a challenging position with a strong emphasis on Hadoop and Java technologies, where I can apply my abilities and skills to become a valuable asset to the organization.
Professional Experience
• Working as a Senior Systems Engineer at Infosys from October 2014 to date.
• Over two years of experience in the Hadoop ecosystem, with hands-on project experience in the banking and financial sector.
• Over two years of experience in enhancement, development, support, and testing in Java in the banking and financial sector.
Technical Skill Set
• Hadoop Framework : MapReduce, Pig, Hive
• NoSQL : HBase
• Hadoop Tools : Sqoop and Flume
• Languages : Core Java, J2EE (Spring)
• Storage & Database : HDFS, MySQL
• IDE & Tools : Eclipse
• Web Server : Apache Tomcat
Methodologies
Agile & Scrum
Career Summary
• Moving data from Oracle (RDBMS) to HDFS and vice versa using Sqoop.
• Collecting large amounts of log data using Apache Flume and aggregating it in HDFS using Pig/Hive for further analysis.
• Very good understanding of partitioning and bucketing concepts in Hive; designed both managed and external tables in Hive to optimize performance.
• Solved performance issues in Hive and Pig scripts with an understanding of joins, grouping, and aggregation and how they translate to MapReduce jobs.
• Developed MapReduce jobs in Java for data cleaning and preprocessing.
• Developed UDFs in Java as needed for use in Pig and Hive queries (a short sketch follows this list).
• Good working knowledge of the NoSQL database HBase.
• Very good experience with both MapReduce 1 (JobTracker) and MapReduce 2 (YARN framework).
• Expertise in HDFS architecture and cluster concepts.
• Experienced developer specializing in design/architecture, coding/development, unit testing, and system testing.
• Hands-on experience in mainframe technology using COBOL, DB2, and JCL.
• Well experienced in each phase of the software development life cycle (SDLC), namely analysis, design, testing, documentation, and maintenance.
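The UDF work mentioned above would, in outline, look like the sketch below. This is illustrative only, not code from the project: it assumes the classic Hive UDF base class (org.apache.hadoop.hive.ql.exec.UDF), and the class name NormalizeText is hypothetical.

// Illustrative Java UDF for Hive (hypothetical example, not project code).
import org.apache.hadoop.hive.ql.exec.UDF;
import org.apache.hadoop.io.Text;

public final class NormalizeText extends UDF {
    // Trims and lower-cases a string column; returns null for null input.
    public Text evaluate(Text input) {
        if (input == null) {
            return null;
        }
        return new Text(input.toString().trim().toLowerCase());
    }
}

Once packaged into a JAR, such a UDF is registered in Hive with ADD JAR and CREATE TEMPORARY FUNCTION and can then be called inside HiveQL queries like any built-in function.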
Work Experience
Company: Infosys Ltd, Bangalore
Designation: Senior Systems Engineer
Client: Northwestern Mutual
Project: CSO data migration
Project Description: The purpose of the project is data migration from Oracle to the Hadoop platform, based on open-source Big Data and Hadoop technologies. The data is stored in the Hadoop Distributed File System and processed using MapReduce jobs. This in turn includes receiving Word, Excel, PPT, and PDF data from the client team, processing the data to obtain department-related information for the respective domains, extracting various reports from the processed information, and exporting the information for further processing. The project is mainly about moving the existing system, currently running on Oracle, to Hadoop, which can process large data sets (terabytes and petabytes of data) in order to meet the client's requirements as data volumes from other departments grow.
Responsibilities:
• Managing and reviewing Hadoop log files.
• Writing Pig Latin scripts and HiveQL.
• Loading data with tools such as Flume and Sqoop.
• Designing, building, installing, and configuring Hadoop.
• Managing and deploying HBase on a pseudo-distributed cluster.
• Writing MapReduce programs to replace existing COBOL code as part of the data migration from Mainframe to Big Data (see the sketch after this table).
• Gathering requirements from client personnel for new features.
• Fixing existing issues in the system and writing code for enhancements.
• Unit testing and integration testing of owned use cases.
• Performing various kinds of analysis, design, testing, and business validations.
• Coordinating installation of the software system.
Domain: Banking and Financial Sector
Duration: October 2014 to date
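A data-cleaning MapReduce program of the kind referenced in the responsibilities above would, in outline, look like the following. This is an illustrative sketch only, assuming plain-text, comma-delimited input records; the class name and expected field count are hypothetical and not taken from the actual migration code.

// Illustrative data-cleaning mapper (hypothetical example, not project code).
import java.io.IOException;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class CleaningMapper extends Mapper<LongWritable, Text, Text, NullWritable> {
    private static final int EXPECTED_FIELDS = 5; // hypothetical record width

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Trim whitespace and drop empty or malformed records before downstream processing.
        String line = value.toString().trim();
        if (!line.isEmpty() && line.split(",", -1).length == EXPECTED_FIELDS) {
            context.write(new Text(line), NullWritable.get());
        }
    }
}

A driver class would then wire such a mapper into a map-only job (zero reducers) whose cleaned output feeds the downstream Hive tables.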
Educational Qualifications
• Bachelor's Degree in Electronics & Communication Engineering from Modern Institute of Technology & Management in the year 2012 with 74%.
• Senior Secondary from Rairangpur College, Rairangpur, with 68%.
• Higher Secondary from Rairangpur Boys High School, Rairangpur, with 77%.
Personal Details
• Father's Name : Mr. Abhilash Bindhani
• Mother's Name : Mrs. Jharana Bindhani
• Date of Birth : 1st June 1991
• Marital Status : Single
• Nationality : Indian
• Languages Known : English, Hindi, Odia
Declaration
I hereby declare that the above particulars are true and correct to the best of my knowledge and belief.
Place: Bangalore
Date: 01/01/2017
Abinash Bindhani