Mansi Khare
Contact: +91-9145290877 / E-Mail: ermansikhare@gmail.com
LinkedIn: www.linkedin.com/in/mansi-khare-86208868
Big Data analyst and developer passionate about churning large volumes of data to unearth
meaningful information, with sound technical knowledge of various technologies from the Hadoop
ecosystem. Recognized as an effective contributor within the organization, and looking for
opportunities in a challenging environment to apply my technical and analytical capabilities and
develop my skills further.
Profile Summary
 2 years and 10 months of experience in the banking and finance domain.
 Expertise in SWIFT, European payments, UK domestic payments, SWIFT messaging standards, SEPA,
message repair, and the Hadoop ecosystem (HDFS, MapReduce, Pig, Sqoop and Hive) for scalability,
distributed computing and high-performance computing.
 Worked for 2.8 years on the GT Exchange and GT Frame payment gateways, covering CHAPS and
BACS payments.
 Completed 3 months of training and have sound knowledge of the Hadoop ecosystem (HDFS,
MapReduce, Pig, Sqoop and Hive) for scalability, distributed computing and high-performance
computing.
 Worked for 1.5 years on the Hadoop ecosystem, including the successful completion of a POC.
 Exposure to the HBase NoSQL database.
 Well versed in Core Java.
 Extensive experience across the software development life cycle (SDLC).
 Excellent verbal and written communication skills.
 Self-motivated and ready to learn new technologies.
Technical and Domain Exposure
 Technical Skills
 Sqoop
 Hadoop
 Pig
 Hive
 Oozie
 Linux
 Core Java
 Hibernate
 HTML
 Eclipse
 Domain Skills
 SWIFT messaging standards
 SEPA
 CHAPS
 Payment Gateway
 SWIFT Payments
Career Milestones
 Appreciated and recognized at the organization level as a Business Enabler.
 Appreciated and recognized at the organization level as a Campus Associate.
 2nd runner-up in the LBG Hackathon contest at the organizational level.
 Awarded the Dream Team award.
Organisational Experience
Since Nov ’13 with Cognizant, Pune, as a Software Developer
Project Name: UK Lloyds Banking Group
Role: Application development and maintenance
Tools used: UNIX, PL/SQL
This project is built around the payment gateway used by Lloyds Banking Group (LBG), where I
have been working on application development and maintenance. LBG performs its transactions
worldwide through SWIFT services and is connected to SWIFT through a gateway known as
GT Exchange. To carry out its business, the gateway must implement the SWIFT standards, in
particular the messaging standards followed across the world by banks using SWIFT services.
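As a hedged illustration only (not code from the project), the snippet below sketches what
working with the messaging standards can mean in practice: pulling tagged fields such as :20:
(sender's reference) and :32A: (value date, currency, amount) out of the text block of an MT103
payment message. The sample message and the simplified field handling are my own assumptions.

```java
import java.util.regex.Matcher;
import java.util.regex.Pattern;

// Hypothetical helper: extracts tagged fields from the text block (block 4)
// of a SWIFT MT message. The sample payment below is invented.
public class MtFieldSketch {

    // Returns the first line of the field with the given tag, or null if absent.
    static String field(String block4, String tag) {
        Matcher m = Pattern.compile(":" + tag + ":([^\n]*)").matcher(block4);
        return m.find() ? m.group(1) : null;
    }

    public static void main(String[] args) {
        String block4 =
                ":20:TXNREF123456\n" +        // field 20: sender's reference
                ":32A:160101GBP1250,00\n" +   // field 32A: value date, currency, amount
                ":59:/GB29NWBK60161331926819\nBENEFICIARY NAME\n";
        System.out.println("Reference:  " + field(block4, "20"));
        System.out.println("Value date: " + field(block4, "32A"));
    }
}
```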
Roles and Responsibilities:
 Troubleshooting: end-to-end investigation from SWIFT to the gateway application and vice
versa, requiring a complete understanding of the SWIFT messaging standards and of the
behaviour of different payments in different scenarios.
 Requirements gathering and analysis: working through the business requirements received
from the business for any new feature, enhancement or functionality change in the gateway.
 Functional specification design: driven by the business requirement, the gateway
configuration and the Java code, along with the scripts that operate on UNIX LPARs, were
designed or enhanced.
 Development support: supporting, as a developer, the project's various testing cycles:
ST, SIT, UAT and regression.
 Scripting and automation: automating manual work, including the configuration and
scheduling of automatic batch jobs, housekeeping scripts and report-generation scripts.
My experience spans the areas of SWIFT, European payments, UK domestic payments, SWIFT
messaging standards, SWIFT standard releases, SEPA, message repair, etc.
POC Completed: Hadoop Big Data
 Role: Hadoop Developer
 Tools used: Sqoop, Flume, HDFS, Pig, Hive, MapReduce, HBase
The POC was to validate the ability to store terabytes of log information generated by the
gateway's processing of daily transactions and to extract meaningful information from it. The
solution is based on the open-source Big Data software Hadoop. Raw data (structured,
semi-structured and unstructured) was collected from the banks' various jurisdictions and
branches using Sqoop and Flume. Using the ETL tool Pig, the semi-structured and unstructured
content was extracted, transformed into a structured format and loaded into Hive tables. Besides
loading Hive tables indirectly in this way (data first landed in HDFS), we also loaded tables
directly from the MySQL and Oracle databases. The Hive tables were then queried to obtain the
data needed for the reports required by the business, and the query results were loaded into new
Hive tables. Partitioning and clustering (bucketing) of the tables was also done to make data
retrieval more efficient. The reports generated by the module were then used by the client to
carry out their business.
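To make the Hive side of this concrete, here is a minimal sketch (my own illustration, not the
POC code) of creating a partitioned, bucketed Hive table and materializing a query result into a
new table over JDBC. The host, database, table and column names (txn_log, branch_id, and so on)
and the bucket count are all hypothetical.

```java
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

// Requires the hive-jdbc driver on the classpath; all names are placeholders.
public class HiveReportSketch {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:hive2://localhost:10000/default", "", "");
             Statement stmt = conn.createStatement()) {

            // Partition by transaction date and bucket by branch id: the
            // partitioning/clustering step described above.
            stmt.execute(
                "CREATE TABLE IF NOT EXISTS txn_log ("
                + " txn_id STRING, branch_id STRING, amount DOUBLE)"
                + " PARTITIONED BY (txn_date STRING)"
                + " CLUSTERED BY (branch_id) INTO 16 BUCKETS"
                + " STORED AS ORC");

            // Materialize a business query into a new Hive table, as the
            // POC did with its report queries.
            stmt.execute(
                "CREATE TABLE daily_branch_totals AS"
                + " SELECT txn_date, branch_id, SUM(amount) AS total"
                + " FROM txn_log GROUP BY txn_date, branch_id");
        }
    }
}
```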
Roles and Responsibilities:
 Involved in the design and development of technical specifications using Hadoop
technology.
 Requirements gathering and analysis of the business requirements received from the
business.
 Involved in moving all log files generated from various sources to HDFS, using Sqoop for
structured data and Flume for unstructured and semi-structured data.
 Wrote Apache Pig scripts to process the HDFS data fetched by Sqoop and Flume.
 Created Hive tables to store the processed results in tabular format.
 Performed partitioning and clustering (bucketing) on the tables as part of performance
enhancement.
 Loaded Hive tables directly from the MySQL and Oracle database servers (structured data)
instead of first loading the data into HDFS and then into the tables.
 Executed various queries on the Hive tables to extract the data required by the business
reports.
 Monitored Hadoop scripts which take their input from HDFS and load the data into Hive.
 Created external tables in Hive.
 Attended classroom trainings on ZooKeeper, HBase and Oozie during the course of the
project.
 Installed the Hadoop, Hive, MapReduce and Sqoop applications.
 Developed Sqoop scripts to enable interaction between Pig and MySQL.
 Involved in developing the Hive reports.
 Involved in developing the Pig scripts.
 Good knowledge of single-node and multi-node cluster configurations.
 HDFS support and maintenance.
 Part of MapReduce application development using Hadoop, MapReduce programming and HBase.
 As part of the team working on the MapReduce module of the POC, contributed to the core
Java code.
 Understood and applied the concepts of the partitioner and combiner in the MapReduce code
(see the sketch below).
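As an illustrative sketch only (assumptions: the job counts gateway log lines per message type,
and the second tab-separated column holds that type; none of this comes from the actual POC),
the following Java MapReduce job shows the two concepts from the last bullet: the reducer class
doubles as a combiner to pre-aggregate map output, and a custom partitioner controls which
reducer each key goes to.

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Partitioner;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

// Counts log lines per message type; the tab-separated layout is assumed.
public class MessageTypeCount {

    public static class TypeMapper
            extends Mapper<LongWritable, Text, Text, LongWritable> {
        private static final LongWritable ONE = new LongWritable(1);
        private final Text type = new Text();

        @Override
        protected void map(LongWritable key, Text value, Context ctx)
                throws IOException, InterruptedException {
            String[] fields = value.toString().split("\t");
            if (fields.length > 1 && !fields[1].isEmpty()) { // assumed layout
                type.set(fields[1]);
                ctx.write(type, ONE);
            }
        }
    }

    // Used both as combiner (pre-aggregation on the map side) and reducer.
    public static class SumReducer
            extends Reducer<Text, LongWritable, Text, LongWritable> {
        @Override
        protected void reduce(Text key, Iterable<LongWritable> values, Context ctx)
                throws IOException, InterruptedException {
            long sum = 0;
            for (LongWritable v : values) sum += v.get();
            ctx.write(key, new LongWritable(sum));
        }
    }

    // Custom partitioner: route keys by their first character so related
    // message types land on the same reducer.
    public static class TypePartitioner extends Partitioner<Text, LongWritable> {
        @Override
        public int getPartition(Text key, LongWritable value, int numPartitions) {
            return (key.toString().charAt(0) & Integer.MAX_VALUE) % numPartitions;
        }
    }

    public static void main(String[] args) throws Exception {
        Job job = Job.getInstance(new Configuration(), "message-type-count");
        job.setJarByClass(MessageTypeCount.class);
        job.setMapperClass(TypeMapper.class);
        job.setCombinerClass(SumReducer.class);   // combiner = reducer class
        job.setReducerClass(SumReducer.class);
        job.setPartitionerClass(TypePartitioner.class);
        job.setOutputKeyClass(Text.class);
        job.setOutputValueClass(LongWritable.class);
        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}
```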
Academic Details
 B.Tech from Gautam Buddha Technical University, Lucknow in 2013 with 79.64%
 Intermediate from St. Fidelis College, Lucknow in 2009 with 91.7%
 High School from St. Fidelis College, Lucknow in 2007 with 86.4%
College Level Trainings & Project
 2-month certification in C/C++ from Microsoft Academy.
 2-month summer internship at BSNL Lucknow.
 4th Year Project – Audio & Video Conferencing in Java.
Personal Details
Date of Birth: 1st June 1991
Address: Parkways Society, Wakad, Pune.
Languages Known: English and Hindi