SYED BABAR H RIZVI
Name: Syed Babar Hussain Rizvi
Email ID: babar.rizvi.pu@gmail.com
Contact No.: 919739283080
BigData & Machine Learning Practitioner
Executive Summary
A professional with 12+ years of experience in Software Development / Onsite Co-ordination / Team Leading,
with deep insight into the Healthcare / Telecom / Manufacturing industries.
Working with Parametric Technology Corporation as a Module Lead.
An expert in leading teams to successful project implementation, with strong leadership skills, quality assurance and
schedule adherence.
Worked on all phases of the data warehouse development lifecycle, from requirements gathering through ETL design and
implementation, to support for new and existing applications.
Excellent technical and analytical skills with a clear understanding of ETL design and project architecture based on
reporting requirements.
Demonstrated expertise utilizing ETL tools, from earlier versions of Informatica (7.x, 8.x) to the current version
(9.5.1), and RDBMS platforms like Oracle.
Worked in Agile and traditional SDLC models. Used the Jira board to track and report progress to management/clients.
Estimated user story points, helped with sprint planning and handled go/no-go calls.
Knowledge of Big Data technologies like Hadoop, Spark, Hive and Python.
Working knowledge of various tools like Control-M, Bitbucket, SVN and Splunk.
Areas of Exposure:
ETL: Informatica 7.x, 8.x, 9.x
SQL: SQL Developer, Toad
RDBMS: Oracle 10g, Oracle 9i
Programming Languages: SQL, PL-SQL, Python
Version Control: VSS, VCTL, GitHub
Professional Qualification:
Post-Graduate Certificate Program on BigData Analytics & Optimization (360 contact hours +
project hackathon & Viva) from INSOFE, accredited by Language Technologies Institute of
Carnegie Mellon University, USA.
BE (IT): Aggregate 62%, Year 2006
College: Government Engineering College, Jagdalpur
University: Pt. Ravishankar Shukla University
Data Science Projects
Image Classification: The objective of this project was to correctly classify any image provided at test time.
Approach: A training data set was provided, consisting of 10 folders containing different images. The total number of
images across these folders was around 3,000.
• Data was pre-processed and brought into the correct format.
• Data augmentation was used to further improve the image classification accuracy.
• The VGG16 architecture was then applied, which improved accuracy further through transfer learning.
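The data augmentation step described above can be sketched as follows; a minimal illustration using NumPy only (the actual project presumably used a framework utility such as Keras' ImageDataGenerator, which is an assumption here):

```python
import numpy as np

def augment(image):
    """Generate simple augmented variants of an H x W x C image:
    a horizontal flip and small vertical shifts. Each variant keeps
    the original label, enlarging the training set."""
    variants = [np.fliplr(image)]          # mirror left-right
    for shift in (-2, 2):                  # shift 2 pixels up/down
        variants.append(np.roll(image, shift, axis=0))
    return variants

rng = np.random.default_rng(0)
img = rng.random((32, 32, 3))              # stand-in for a real photo
augmented = augment(img)
print(len(augmented))                      # 3 variants per source image
```

Each augmented variant is a plausible re-rendering of the same scene, so the classifier sees more diverse examples without any new labeling effort.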
Text Classification: The objective of this project was to correctly identify the urgency (1, 2, 3 or 4) of support
tickets based on the text in the body of the email.
Approach: A training data set was provided, consisting of 47,000 records with 9 columns.
• Data was pre-processed and cleaned of all NA values to bring it into the correct format.
• The Tokenizer API was used to convert all the words into numeric format.
• GloVe embedding weights were prepared for the text data for use in the embedding layer.
• A 4-layer CNN model with an embedding layer was built to train and test on the data.
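The tokenization step above can be sketched from scratch; a minimal stand-in for the Keras Tokenizer API, assuming simple whitespace tokenization (the ticket texts below are made up for illustration):

```python
from collections import Counter

def fit_tokenizer(texts, num_words=None):
    """Build a word -> integer index map, most frequent words first.
    Index 0 is left unused (reserved for padding, as in Keras)."""
    counts = Counter(w for t in texts for w in t.lower().split())
    vocab = [w for w, _ in counts.most_common(num_words)]
    return {w: i + 1 for i, w in enumerate(vocab)}

def texts_to_sequences(texts, index):
    """Convert each text into a list of integer word indices,
    dropping out-of-vocabulary words."""
    return [[index[w] for w in t.lower().split() if w in index]
            for t in texts]

tickets = ["server is down", "password reset please", "server reset failed"]
word_index = fit_tokenizer(tickets)
seqs = texts_to_sequences(tickets, word_index)
print(seqs)
```

The resulting integer sequences are what an embedding layer consumes: each index selects one row of the embedding matrix (e.g. pre-trained GloVe weights).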
Fraud Detection: Predicting suspicious transactions through analysis of retail salesmen reports, plus salesman
segmentation.
Approach: A training data set was provided, containing approx. 48,000 sales transaction records. Records were
classified as Fraud-Yes, Fraud-No or Fraud-Indeterminate.
• Data was pre-processed, and ratio analysis was used to create new features, along with outlier detection
methods.
• Benford's Law was used to analyse prospective fraudulent transactions.
• SMOTE was applied to address the class imbalance problem.
• Various models were built to predict the correct classification of the records.
• Clustering was also used to classify each salesperson as High Risk, Medium Risk or Low Risk based on their sales
record data.
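The Benford's Law check in the bullets above can be sketched as a simple first-digit analysis; a minimal stdlib-only illustration (the sample amounts are invented, and the real project's feature engineering is assumed to be richer):

```python
import math
from collections import Counter

def benford_deviation(amounts):
    """Compare the observed leading-digit distribution of transaction
    amounts against Benford's Law, P(d) = log10(1 + 1/d), returning
    the per-digit absolute deviation. Large deviations can flag a
    record set for fraud review."""
    digits = [int(str(abs(a)).lstrip("0.")[0]) for a in amounts if a]
    n = len(digits)
    counts = Counter(digits)
    return {d: abs(counts.get(d, 0) / n - math.log10(1 + 1 / d))
            for d in range(1, 10)}

# Genuine amounts tend to follow Benford's Law; uniform leading
# digits (a classic fabrication pattern) deviate noticeably.
sales = [120.5, 1850, 230, 1020, 310, 145, 990, 1600, 2750, 180]
dev = benford_deviation(sales)
print(max(dev.values()))
```

In practice the deviation (or a chi-square statistic on the same counts) would be computed per salesperson and fed in as one feature alongside the ratio-analysis features.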
Tools Used:
Deep Learning (MLP, CNN, RNN and LSTM), Machine Learning (Regression, PCA, SVM, Clustering, Naive Bayes,
KNN, Ensembles and Decision Trees), Linear Programming, NLP, Image and Video Processing, Monte Carlo
Simulation, Genetic Algorithms, the Hadoop and Spark ecosystems, Statistics & Probability, Spark and Python.
PROFESSIONAL EXPERIENCE
Organization: Parametric Technology Corporation, Pune
Duration: Sept 2013 - Present
Recent Project Name: Amtrak
Sector: USA Govt.
Designation: ETL Developer and Information Analyst
Client: Vodafone Italy
Tools/Technology: Informatica PowerCenter 9.5, Oracle11g, Amazon Redshift, S3, EC2.
Accountabilities:
Preparing the POC and design path for EDW migration.
Prepared the end-to-end (E2E) design for the Oracle-to-Redshift movement of the warehouse.
Worked on high-level ERDs.
Database migration, production support, RCA.
Functional requirement analysis, quality improvement, regression testing.
Organization: Vodafone India Services Pvt Ltd, Pune
Duration: Mar 2011 - Aug 2013
Project Name : Vodafone Online
Sector: Telecom
Designation: Onsite Co-ordinator and Information Analyst
Client: Vodafone Italy
Tools/Technology: Informatica 7.1 and 8.5.1 as ETL tools, Oracle9i and Oracle10g as databases, plus PL/SQL
and shell scripts.
Details: At its core is the Vodafone-developed interChange system. The n-tier design and service-
oriented architecture deliver enhanced flexibility, scalability and reliability. To meet the
reporting and analysis needs of Post/Prepaid, the data warehouse, Management
Reporting System, and Surveillance and Utilization Review System components are
integrated into a cohesive billing solution built to use high-performance data marts. The
individual business process RADs discuss the detailed use of the interChange
application to support the specific DSS business processes. The complete requirements
analysis for the various DSS business processes is available within the individual
business processes. The Business Objects and Informatica project within the Vodafone
Italy Online practice of the business is the first step in VF-Italy's goal of migrating from its
current model to a new model in which Vodafone provides all its Business Intelligence
development and support resources and services in an onshore/offshore model (20-80%
mix).
Accountabilities:
Responsible for preparing technical specs, analyzing functional specs, and development and
maintenance of code.
Responsible for complexity determination and estimation.
Developing complex ETL mappings and their corresponding sessions, worklets and workflows.
End-to-end testing of data warehouse / data mart loads.
Distributing work among offshore team members and tracking development progress.
Discussing the technical approach to development with the onsite and offshore team leads.
Reviewing code developed offshore.
Organization: Tela Sourcing Pvt Ltd, Pune
Duration: May 2009 - Dec 2011
2) Project Name: HealthAspex
Skills/Tools: Informatica 8.5.1, UNIX, Oracle 10g, SQL Server 2005
Role: ETL Developer
The HealthAspex system is a business-integration solution focused exclusively on healthcare payers.
It helps process the claims filed by different providers for remuneration from the client
(payers), with both online and offline claims-processing functionality. The solution helps TPAs and PPOs
save time and money by converting paper claims into easy-to-manage electronic data.
Responsibilities as a Team Member:
Extraction, transformation and loading of data from flat file, Oracle and SQL Server 2005 sources to an
Oracle database.
Created Informatica mappings and mapplets using different transformations.
Used different transformations like source qualifier, filter, aggregator, expression, connected and
unconnected lookup, sequence generator, router and update strategy.
Analyzed certain existing mappings which were producing errors and modified them to produce
correct results.
Used Workflow Manager to create session tasks and other tasks.
Involved in unit testing.
Documentation of mappings as per standards.
Involved in code review and preparing the code review document.
Organization: Mphasis, an EDS Company, Pune
Duration: Jan 2007 - Feb 2009
1) Project Name: USGS (iCE 4.1)
Skills/Tools: Informatica 8.5.1, UNIX, Oracle 9i
Role: ETL Developer
This application was developed for the Medicaid system (the US government's health care program). It is an n-tier application
that manages automation for different Medicaid systems such as Claims, Financial, Provider, Recipient, Drug Rebate and
Third Party Liabilities. The architecture of this portal consists of two major parts: the core application and the state-wise
application. The core application is the underlying platform with all the standard features for Medicaid automation, and the
state-wise application is the extended, customized version of the core to suit the requirements of a specific state.
The Social Security Act is a Federal/State entitlement program that pays for medical assistance for certain individuals and
families with low incomes and resources. This project aimed at the development, testing and deployment of healthcare
insurance infrastructure for all the states in the USA. MMIS is an automated claims processing and information retrieval
system that helps state governments administer the Medicaid program. It contains different subsystems such as claims,
managed care, recipient, provider, third party liability, MAR and portal.
Worked on and mastered Claims and Financial, two of the most complex subsystems in USGS, and helped team members with
the technical and functional minutiae of Claims and their resolutions.
Responsibilities as a Team Member:
Extraction, transformation and loading of data using Informatica.
Used relational sources and flat files to populate the data mart.
Involved in the development of Informatica mappings.
Created Informatica mappings to load the data mart and monitored them.
Used Workflow Manager to create session tasks and other tasks.
Involved in unit testing.