
Abdul ETL Resume

Abdul Mohammed
E-mail: abdulmohammed.etldeveloper@gmail.com
Phone: 773-599-9678
SUMMARY
 8 years of experience in Decision Support Systems – Data Warehousing and ETL (Extract,
Transform, and Load) using Informatica 7.x/8.x/9.x.
 Main areas of expertise: analyzing, developing, and testing Data Warehousing projects.
 Experience developing mappings with the required Transformations using Informatica.
 Experience in Data Warehousing covering Data Extraction, Data Transformation, and Data
Loading.
 Prepared documentation for mappings according to the Business Requirements.
 Good knowledge of Data Warehousing concepts such as Star Schema, Dimensions, and Fact tables.
 Optimized Informatica Mappings and Sessions to improve performance.
 Extensively used Informatica client tools – Source Analyzer, Warehouse Designer, Mapping
Designer, Mapplet Designer, Informatica Repository Manager, and Informatica Workflow Manager.
 Extensive experience in ETL automation using UNIX scripting.
 Extensive experience writing ETL technical documentation and creating testing
documents.
 Development experience on Windows NT/2000/XP, UNIX, and Linux platforms.
 Excellent communication and interpersonal skills.
 Possess complete domain and development Life Cycle knowledge of Data Warehousing.
 Excellent problem-solving skills with a strong technical background. Quick
learner and excellent team player with the ability to meet deadlines and work under pressure.
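As a sketch of the kind of UNIX scripting used for the ETL automation claimed above: a minimal wrapper around Informatica's pmcmd utility. The service, domain, folder, and workflow names here are hypothetical, and DRY_RUN keeps the script runnable on hosts where pmcmd is not installed.

```shell
#!/bin/sh
# Minimal sketch of a UNIX wrapper that launches an Informatica workflow
# via pmcmd. All names below are illustrative, not from a real deployment.
INFA_SERVICE=IS_DEV          # hypothetical Integration Service name
INFA_DOMAIN=DOM_DEV          # hypothetical domain name
FOLDER=SALES_DM              # hypothetical repository folder
WORKFLOW=wf_load_sales_fact  # hypothetical workflow
DRY_RUN=1                    # set to 0 on a host where pmcmd is installed

# -uv/-pv tell pmcmd to read the user and password from environment
# variables, so credentials never appear on the command line.
CMD="pmcmd startworkflow -sv $INFA_SERVICE -d $INFA_DOMAIN -uv INFA_USER -pv INFA_PASS -f $FOLDER -wait $WORKFLOW"

if [ "$DRY_RUN" -eq 1 ]; then
    echo "would run: $CMD"
else
    $CMD || { echo "workflow $WORKFLOW failed (rc=$?)" >&2; exit 1; }
fi
```

The `-wait` flag makes pmcmd block until the workflow finishes, so the wrapper's exit code can drive a scheduler such as Autosys.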
EDUCATION AND CERTIFICATIONS
Master's in Information Systems
Bachelor's in Computers
Certification
ITIL V3 Foundation Certificate (2011)
PROFESSIONAL EXPERIENCE
Client Infosys Technologies
Role ETL/Sr. Informatica Lead
Duration Jan 2015 – Present
Description: This Data Warehouse was designed to generate reports for the Human Health
Department at Guidant Pharmaceuticals. The warehouse is used to generate reports and analyze
the sales of various products. Product data is categorized by product group and product
family, and the warehouse is also used to analyze product usage at different times of the year. It
reports on historical data stored in various sources such as Oracle databases and Flat Files. Data
from the different sources was brought in using Informatica ETL and sent for reporting using
Business Objects.
Because the data captured from chemists drives the business, it must be highly accurate, so users
need an application through which a PSR can feed in the data on the spot.
This application involves several systems: SAS and BI. All calls (chemist calls, doctor calls, etc.)
go to CRM, Attendance and Expense go to ECC, and information such as Targeting and Sales
Reports comes from BI.
Responsibilities:
 Worked on Power Center client tools such as Source Analyzer, Warehouse Designer, and Mapping
Designer.
 Documenting Business, Functional, and Non-functional requirements.
 Involved in current state project assessment and Gap Analysis.
 Propose the effort estimate for the changes as they come in. Manage and deliver changes as
discussed.
 Analyzed business process workflows and assisted in the development of ETL procedures for
moving data from source to target systems.
 Created Base objects, Staging tables, and landing tables; defined foreign key relationships, static
lookups, dynamic lookups, queries, packages, and query groups.
 Created Mappings to get the data loaded into the Staging tables during the Stage Process.
 Defined Trust and validation rules for the base tables.
 Coordinated with the Business team to explain Match & Merge and incorporated
their requirements.
 Developed queries used for over- and under-matching.
 Analyzed the data by running queries and provided statistics after initial and incremental
loads.
 Discussed the use of Roles, creation of users, and assignment of users to Roles.
 Defined Roles and privileges for each environment according to requirements.
 Defined the security such that the schema is secured, with access granted only for specific
downstream integration uses, using users created for those specific integrations.
 Created Batch Groups in Utilities Workbench and scheduled them externally using Power Center
and Autosys.
 Contributed to the enrichment of data quality rules.
 Combined regulatory and application requirements into business rules.
 Provided advanced business rule writing via IDQ Developer.
 Tuned mappings and participated in the performance improvement phase.
Environment: Informatica Power Center 9.5.2 (Source Analyzer, Data warehouse designer,
Mapping Designer, Mapplets, Transformations, Workflow Manager, Workflow Monitor), PL/SQL,
Oracle 11g, DB2, Windows XP and UNIX
Client Motorola, Chicago, IL
Role ETL/Sr. Informatica Lead
Duration Jan 2012 – Dec 2014
Description: British Telecom is a global provider of telecommunications services. The
company’s global services division provides network security and telecom services to many
customers worldwide, and BT Commercial handles the sales division. This project involved building
a Data Mart for the commercial sales division. The sources included Oracle databases, DB2, Flat
Files, and COBOL sources; the Data Marts were built on Oracle. The project involved working with
a team on the entire ETL process and the development of the data marts using Informatica Power
Center.
Responsibilities:
 Wrote and tested business applications, primarily using Informatica 9.x
and PL/SQL against Oracle and SQL databases.
 Analyzed, designed, created, and manipulated MS Access and Oracle (9i) tables.
 Researched manual and automated systems, developed recommendations, and prepared
documentation; the recommendations identified unnecessary steps.
 Worked with financial staff on the use of Oracle-based tables with the Oracle Discoverer product
and MS Access.
 Worked with data warehouse staff to incorporate best practices from Informatica.
 Worked with analysts, in work sessions, to translate business requirements into technical user
specifications, including data, process, and interface specifications.
 Developed new methods and procedures for data warehouse staff.
 Developed implementation plans and schedules for Informatica enhancements.
 Developed reports based on issues related to the data warehouse.
 Worked with finance staff to document their business rules, data needs and requirements in order
to create data warehouse systems.
 Assessed and documented existing business processes, identifying where improvements could
be made or new processes installed.
 Developed data marts using sources such as: agency’s HRM data, personnel files and payroll and
financial files.
 Supported and maintained metadata and application data residing in the EDW Development,
Staging Area, and ODS environments, following established standards.
 Conducted meetings with data mart users and developed prototypes of the data marts as
discussion items.
 Served as project leader on data warehouse projects, directing contractors and state
staff.
 Developed new enhancement training sessions for staff. These sessions instructed staff in the use
of new Informatica enhancements.
 Administered the Oracle Discoverer business intelligence product.
 Provided problem resolution for both the Informatica and Discoverer applications.
 Assisted in the development and training related to best practices for the Data Warehouse group.
 Developed frequent reports to management regarding my activities.
 Researched and developed methods related to information processing metrics.
 Used DVO to verify data formats in target tables/files containing large volumes of data.
 Moved data from flat files, including Excel files, into an Oracle environment.
 Developed documentation for both business and technical staff related to my activities.
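One common way to move flat-file data into an Oracle environment, as described above, is SQL*Loader. This is a minimal sketch only: the table, file, column, and credential names are illustrative, and DRY_RUN avoids invoking sqlldr on hosts where the Oracle client is not installed.

```shell
#!/bin/sh
# Sketch: generate a SQL*Loader control file for a hypothetical staging
# table and build the sqlldr command. Names are illustrative.
CTL_FILE=stg_customer.ctl
DRY_RUN=1   # set to 0 where the Oracle client tools are installed

# Control file describing a comma-delimited flat file load into a
# hypothetical stg_customer table.
cat > "$CTL_FILE" <<'EOF'
LOAD DATA
INFILE 'customers.csv'
APPEND INTO TABLE stg_customer
FIELDS TERMINATED BY ','
(cust_id, cust_name, cust_city)
EOF

# Credentials are taken from environment variables at run time.
CMD="sqlldr userid=\$ORA_USER/\$ORA_PASS control=$CTL_FILE log=stg_customer.log"

if [ "$DRY_RUN" -eq 1 ]; then
    echo "would run: $CMD"
else
    eval "$CMD"
fi
```

Excel sources would first be exported (or converted) to CSV before a load of this shape.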
Environment: Informatica Power Center 9.0 (Source Analyzer, Data warehouse designer,
Mapping Designer, Mapplets, Transformations, Workflow Manager, Workflow Monitor), Teradata,
Tableau 7.0, PL/SQL, Oracle 11g, DB2, Windows 7, Windows Server 2005, Erwin 6.5, OBIEE,
DVO.
Client State Farm Insurance, Bloomington, IL
Role ETL/Informatica Developer
Duration June 2010 – Oct 2011
Description: The project for the BT Internal Service Unit involved the company’s capability to support
a wide range of customers with networking products and services. The goal was to eliminate large
volumes of redundant data and integrate the data into well-designed fact and dimension tables. The
basic idea was to remove the isolated pockets of data while ensuring that the INS office still had
access to detail and summary data at both the enterprise and line-of-business levels.
Responsibilities:
 Gathered and analyzed requirements by meeting business stakeholders and other
technical team members prior to the design of ETL.
 Interpreted high-level ETL requirement specifications to develop LLDs for SCD Type-I, Type-II,
and Type-III mappings, and performed rigorous testing of various test cases.
 Performed performance tuning to increase throughput at both the mapping and session levels,
along with SQL query optimization.
 Worked at the architecture level, providing optimization for data validation solutions.
 Synchronized Informatica metadata with data modeling diagrams.
 Provided support and quality validation through test cases for all stages of Unit and Integration testing.
 Prepared QA and PROD migration documents after development and testing in the
Development environment, and executed Quality Assurance testing.
 Developed complex mappings as per the requirement using almost all transformations and
effectively used Debugger for identifying errors and resolving them.
 Have done Informatica performance tuning and query tuning.
 Deployed reusable transformation objects such as Mapplets to avoid duplication of metadata,
reducing the development time.
 Team lead responsible for assigning tasks to team members and for creating metrics reports,
weekly status reports, the project plan, and quality and migration documents.
 Used SharePoint to maintain documents and tracking.
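The SCD Type-II change detection behind the mappings above can be sketched as a simple two-file comparison. This is a simplified illustration under assumed inputs (hypothetical files, keys, and attributes), not the actual Informatica mappings: staged rows are compared against the current dimension rows to decide whether to insert a new row, expire-and-version an existing one, or do nothing.

```shell
#!/bin/sh
# Hypothetical SCD Type-II change-detection sketch over flat files.
# dim_customer.csv: key,current_attribute (current dimension rows)
# stg_customer.csv: key,incoming_attribute (staged source rows)
cat > dim_customer.csv <<'EOF'
101,Gold
102,Silver
EOF
cat > stg_customer.csv <<'EOF'
101,Platinum
102,Silver
103,Bronze
EOF

RESULT=$(awk -F, '
  NR==FNR { dim[$1] = $2; next }   # first file: key -> current attribute
  !($1 in dim)  { print $1 ": INSERT new current row"; next }
  dim[$1] != $2 { print $1 ": EXPIRE old row, INSERT new version"; next }
                { print $1 ": NO CHANGE" }
' dim_customer.csv stg_customer.csv)
echo "$RESULT"
```

In a real Type-II load the EXPIRE step would close out the old row's effective-date range and clear its current flag, while the INSERT step would open a new version; here only the decision per key is shown.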
Environment: Informatica Power Center 8.6, Oracle 10g, Linux/UNIX Sun Solaris 5.8, UNIX Shell
Scripting, Mainframe, SQL Developer
Client BT Global, Princeton, NJ
Role ETL/Informatica Developer
Duration Jan 2008 – May 2010
Description: The purpose of this project was to maintain a data warehouse that would enable
management to make corporate decisions. The project dealt with building a real-time view of
enterprise-wide data. A decision support system was built to compare and analyze product prices,
quantities, and client profiles. This Data Warehouse is used to deliver reports and information
to sales and marketing management.
Responsibilities:
 Profound knowledge of Data Warehouse concepts and Architecture design.
 Worked extensively on transformations like Source Qualifier, Aggregator, Expression, Lookup,
Filter, Router, and Update Strategy.
 Extracted data from various heterogeneous sources, cleansed it before applying business
logic, and loaded it into the central Target Data Warehouse.
 Proficient in using Informatica Client tools (Repository Manager, Source Analyzer, Warehouse
Designer, Transformation Developer, Mapplets Designer, Mapping Designer, Workflow Manager,
and Workflow Monitor).
 Created Workflows and Worklets for the designed mappings.
 Generated status reports using Workflow Manager.
Environment: Informatica Power Center 7.1.4, Oracle 10g, Linux/UNIX Sun Solaris 5.8, UNIX
Shell Scripting, Mainframe, SQL Developer