Mansi Khare
Contact: +91-9145290877 / E-Mail: ermansikhare@gmail.com
LinkedIn: www.linkedin.com/in/mansi-khare-86208868
Passionate about churning large volumes of data to unearth meaningful information as a Big Data Analyst and Developer, with sound technical knowledge of various technologies from the Hadoop ecosystem. Recognized as an effective contributor within the organization, and looking for opportunities in a challenging environment to apply my technical and analytical capabilities and develop my skills further.
Profile Summary
2 years and 10 months of experience in the banking and finance domain.
Expertise in SWIFT, European payments, UK domestic payments, SWIFT messaging standards, SEPA, message repair, and the Hadoop ecosystem (HDFS, MapReduce, Pig, Sqoop and Hive) for scalable, distributed, high-performance computing.
Worked for 2.8 years on the payment gateways GT Exchange and GT Frame, handling CHAPS and BACS payments.
Completed 3 months of training and have sound knowledge of the Hadoop ecosystem (HDFS, MapReduce, Pig, Sqoop and Hive) for scalability, distributed computing and high-performance computing.
Worked 1.5 years on a Hadoop ecosystem project, including the successful completion of a POC.
Exposure to the HBase NoSQL database.
Well versed in Core Java.
Extensive experience in the Software Development Life Cycle (SDLC).
Excellent verbal and written communication skills.
Self-motivated and eager to learn new technologies.
Technical and Domain Exposure
Technical Skills
Sqoop
Hadoop
Pig
Hive
Oozie
Linux
Core Java
Hibernate
HTML
Eclipse
Domain Skills
SWIFT messaging standards
SEPA
CHAPS
Payment Gateway
SWIFT Payments
Career Milestones
Appreciated and recognized at organization level as a Business Enabler.
Appreciated and recognized at organization level as a Campus Associate.
2nd runner-up in the LBG Hackathon contest at the organizational level.
Awarded the Dream Team award.
Organisational Experience
Since Nov’13 with Cognizant Pune as Software Developer
Project Name: UK Lloyds Banking Group
Role: Application development and maintenance
Tools used: UNIX, PL/SQL
This project is built around the payment gateway used by Lloyds Banking Group (LBG), where I have worked on application development and maintenance. LBG performs its transactions through the SWIFT network, to which it is connected via a gateway known as GTExchange. To carry out its business, the gateway must understand the SWIFT standards and messaging standards followed worldwide by banks using SWIFT services.
Roles and Responsibilities:
Troubleshooting: end-to-end investigation from SWIFT to the gateway application and vice versa, requiring a complete understanding of SWIFT messaging standards and the behaviour of different payments in different scenarios.
Requirements gathering and analysis: analysing the business requirements received for any new feature, enhancement, or functionality change in the gateway.
Functional specification design: depending on the business requirement, the gateway configuration and the Java code, along with the scripts that operate on UNIX LPARs, were designed or enhanced.
Development support: supporting, as a developer, the project's various testing cycles (ST, SIT, UAT and regression).
Scripting and automation: automating manual work, including the configuration and scheduling of automatic batch jobs, housekeeping scripts, and report-generation scripts.
This experience spans SWIFT, European payments, UK domestic payments, SWIFT messaging standards, SWIFT standard releases, SEPA, message repair, and related areas.
POC completed for a Hadoop Big Data project
Role: Hadoop Developer
Tools used: Sqoop, Flume, HDFS, Pig, Hive, MapReduce, HBase
The POC validated the ability to store terabytes of log information generated by the gateway's processing of daily transactions and to extract meaningful information from it. The solution is based on the open-source Big Data software Hadoop. Raw data (structured, semi-structured and unstructured) from the banks' various jurisdictions and branches was ingested using Sqoop and Flume. Using the ETL tool Pig, the semi-structured and unstructured content was extracted, transformed into a structured format, and loaded into Hive tables. Besides this indirect loading of the Hive tables from the MySQL and Oracle databases, we also loaded some tables directly from the databases. The Hive tables were then queried to obtain the data needed for reports as per the business requirements, and the query results were loaded into new Hive tables. Partitioning and clustering (bucketing) of the tables was also done to make data retrieval more efficient. The reports generated by the module were then used by the client to carry out their business.
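The transform-then-aggregate flow described above can be sketched in a few lines of plain Python; this is purely illustrative (the log format, field names and figures are hypothetical, and the real POC used Pig and Hive rather than Python):

```python
# Illustrative sketch of the POC's ETL idea: turn semi-structured
# gateway log lines into structured records (the Pig-like transform),
# then aggregate them the way the Hive reporting queries would.
# The log format and all field names here are hypothetical.
from collections import defaultdict

raw_logs = [
    "2015-03-01 09:12:01|MT103|GBP|1500.00|ACCEPTED",
    "2015-03-01 09:12:05|MT103|EUR|220.50|REPAIRED",
    "2015-03-01 09:13:10|MT202|GBP|99000.00|ACCEPTED",
]

def parse(line):
    """Extract a structured record from one raw log line."""
    ts, msg_type, currency, amount, status = line.split("|")
    return {"ts": ts, "type": msg_type, "ccy": currency,
            "amount": float(amount), "status": status}

records = [parse(line) for line in raw_logs]   # transform step

totals = defaultdict(float)                    # report-style aggregation
for r in records:
    totals[(r["type"], r["ccy"])] += r["amount"]

for (msg_type, ccy), amount in sorted(totals.items()):
    print(msg_type, ccy, amount)
```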
Roles and Responsibilities:
Involved in design and development of technical specifications using Hadoop technology.
Requirements gathering and analysis of the business requirements received from the business.
Involved in moving all log files generated from various sources to HDFS, using Sqoop for structured data and Flume for unstructured and semi-structured data.
Wrote Apache Pig scripts to process the HDFS data fetched by Sqoop and Flume.
Created Hive tables to store the processed results in a tabular format.
Performed partitioning and clustering (bucketing) on the tables as part of performance tuning.
Loaded Hive tables directly from the MySQL and Oracle database servers (structured data), instead of first loading the data into HDFS and then into the tables.
Executed queries on the Hive tables to extract the data required by the business reports.
Monitored Hadoop scripts that take input from HDFS and load the data into Hive.
Created external tables in Hive.
During the project, attended classroom training on ZooKeeper, HBase and Oozie.
Installed the Hadoop, Hive, MapReduce and Sqoop applications.
Developed Sqoop scripts to enable interaction between Pig and MySQL.
Involved in developing the Hive reports.
Involved in developing the Pig scripts.
Good knowledge of single-node and multi-node cluster configurations.
HDFS support and maintenance.
Was part of MapReduce application development using Hadoop, MapReduce programming and HBase.
As part of the team working on the MapReduce module of the POC, contributed to the core Java code.
Understood and applied the concepts of the partitioner and combiner in the MapReduce code.
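The combiner idea mentioned above can be shown with a toy, framework-free simulation (pure Python standing in for Hadoop, with hypothetical input splits): the combiner pre-aggregates each mapper's output locally, so fewer key/value pairs are shuffled to the reducers.

```python
# Toy simulation of the MapReduce combiner concept: local aggregation
# of map output before the shuffle. No Hadoop required; the input
# splits below are hypothetical.
from collections import Counter
from itertools import chain

splits = [
    "payment accepted payment repaired",
    "payment accepted accepted",
]

def map_phase(text):
    # Emit (word, 1) for every word in one input split.
    return [(word, 1) for word in text.split()]

def combine(pairs):
    # The combiner: since every mapped value is 1, summing values
    # is the same as counting keys.
    return list(Counter(k for k, _ in pairs).items())

mapped = [map_phase(s) for s in splits]
combined = [combine(m) for m in mapped]

# Shuffle + reduce: merge the already-combined partial counts.
reduced = Counter()
for word, count in chain.from_iterable(combined):
    reduced[word] += count

print(dict(reduced))
# The combiner shrank the shuffle: 7 mapped pairs -> 5 combined pairs.
```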
Academic Details
B.Tech from Gautam Buddha Technical University, Lucknow in 2013 with 79.64%
Intermediate from St. Fidelis College, Lucknow in 2009 with 91.7%
High School from St. Fidelis College, Lucknow in 2007 with 86.4%
College Level Trainings & Project
2-month certification in C/C++ from Microsoft Academy.
2-month summer internship at BSNL, Lucknow.
4th Year Project – Audio & Video Conferencing in Java.
Personal Details
Date of Birth: 1st June 1991
Address: Parkways Society, Wakad Pune.
Languages Known: English and Hindi