Arun Mathew Thomas Mobile: 949-680-0907
Richmond, VA arunmathewthomas.mannil@gmail.com
SUMMARY:
Over 9 years of experience in the IT industry, including over 8 years of comprehensive experience developing and
maintaining data warehouse applications in the healthcare domain.
In depth knowledge of data warehousing techniques, Star / Snowflake schema, ETL, Fact and Dimensions
tables, physical and logical data modeling.
Experience in designing and developing ETL processes for loading data from heterogeneous source systems
such as flat files, Oracle, SQL Server, VSAM files, and Teradata using Informatica PowerCenter 9.x/8.x.
Extensively worked on Informatica PowerCenter transformations such as Source Qualifier, Lookup, Filter,
Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter,
Sequence Generator, and SQL transformation.
Experience in performance tuning of mappings and sessions, implementing complex business rules, and
optimizing mappings.
Built Slowly Changing Dimension (SCD Type 1 and Type 2) mappings using partitioning, overrides, parameters,
variables, pushdown optimization, mapplets, and worklets.
Experience in detecting/resolving bottlenecks at various levels like source, target, mapping and sessions.
Experienced in Data Quality Management; worked with Informatica Developer and Trillium Software
System.
Experienced in data analysis, data profiling, and data mapping, with strong knowledge of data
standardization using Informatica Developer.
Responsible for major Data Quality activities including Address and Data cleansing, Address
standardization, Developing Match & Merge and House-holding rules.
Highly efficient in developing fuzzy matching logic for Match & Merge and House-holding.
Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and
System Development Life Cycle (SDLC).
More than 5 years of development experience on various data warehousing projects using Teradata 13.10
and 14.
Experienced in working with Teradata utilities such as BTEQ, stored procedures, FastExport, FastLoad,
MultiLoad, and TPT.
Experienced in writing complex SQL queries to leverage Informatica's Pushdown Optimization
feature.
6 years of experience in Data Warehousing and ETL using Informatica Power Center 8.6 and 9.1/9.6, MS
DTS.
5+ years of development experience with Oracle 9i using PL/SQL, SQL*Plus, and TOAD.
Highly skilled in Data Analysis, Requirement Gathering, Requirement Analysis, Gap Analysis, Data
Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.
Experienced in leading large data warehouse teams of 10-20 people to build enterprise data
warehouses.
Experience working in an onsite-offshore model.
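The SCD Type 2 pattern mentioned above can be sketched in set-based Teradata-style SQL. This is a simplified illustration only; the `dim_customer` and `stg_customer` tables and the single `address` change-detection column are hypothetical (in PowerCenter the same pattern is typically built with Lookup and Update Strategy transformations):

```sql
-- Step 1: close out the current version of any customer whose tracked
-- attributes changed (hypothetical tables and columns)
UPDATE dim_customer
SET    end_date   = CURRENT_DATE - 1,
       is_current = 'N'
WHERE  is_current = 'Y'
  AND  cust_id IN (
         SELECT s.cust_id
         FROM   stg_customer s
         JOIN   dim_customer d
           ON   d.cust_id    = s.cust_id
          AND   d.is_current = 'Y'
         WHERE  s.address <> d.address      -- change-detection column(s)
       );

-- Step 2: insert a new "current" row for changed and brand-new customers
INSERT INTO dim_customer (cust_id, address, start_date, end_date, is_current)
SELECT s.cust_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
  ON   d.cust_id = s.cust_id
 AND   d.is_current = 'Y'
WHERE  d.cust_id IS NULL            -- brand-new customer
   OR  s.address <> d.address;      -- changed customer (old row closed in step 1)
```

SCD Type 1 is the degenerate case: the UPDATE simply overwrites the attribute in place and no history row is inserted.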
TECHNICAL SKILLS:
Operating System: Windows/DOS, UNIX, Linux
Languages: PL/SQL, Shell Script
ETL: Informatica Power Center 8.x, 9.x, MS DTS
Databases: Oracle, Teradata 14, MS Access, SQL Server 2008
Reporting Tool: Elixir
Data Quality Management: Informatica Developer, Informatica Analyst, Trillium Software Systems.
Other Tools: Teradata SQL Assistant, Toad, MS Visio, Work Load Manager, MS Office
Professional Experience
1. Company – UST Global
Duration – May 2014 – Current
Role – Sr. Systems Analyst
Client: Anthem Inc.
Project Name: Enterprise Data Warehouse and Research Depot (EDWard)
Project Description:
Anthem, Inc. (formerly WellPoint, Inc.) is the largest for-profit managed health care company in the Blue
Cross and Blue Shield Association. It was formed when WellPoint Health Networks, Inc. merged into Anthem, Inc.,
with the surviving entity adopting the name WellPoint, Inc. and beginning to trade its common stock under the WLP
symbol on December 1, 2004. In December 2014, WellPoint, Inc. changed its corporate name to Anthem, Inc.
Anthem is one of the nation’s largest health benefits companies, with more than 37 million members in its
affiliated health plans and nearly 67 million individuals served through its subsidiaries.
Responsibilities:
Performed source data analysis and gap identification for the proposed data sources, along with data profiling
and data quality analysis of the source data.
Helped the business understand the quality of the source data and educated them on its accuracy and
completeness.
Comparing different data sources for completeness of data and suggesting the best source based on the
analysis.
Gathering and understanding requirements from business analysts and preparing technical design
documents.
Developed high-level and low-level ETL designs and Informatica mapping documents outlining the
transformations required to integrate data from various sources into the Enterprise Data Warehouse.
Developed new and modified existing complex Informatica mappings to extract data from various sources,
according to guidelines provided by the business users, to populate target systems.
Extensively worked on various transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router,
Filter, Union, Update Strategy, Sequence Generator, Rank, and Mapplets) as well as mapping and workflow
variables.
Created Informatica mappings using pushdown optimization to push the integration and semantic layers'
transformation and data integration logic into the Teradata database.
Customized the audit-balance process to balance transactions between different staging layers and
identify the execution status of each job.
Developed BTEQ scripts to implement highly complex business logic; packaged these scripts and
invoked them through command tasks in Informatica workflows.
Performance tuning - identified and fixed bottlenecks and tuned the complex Informatica mappings for
performance optimization.
Provide technical support to other teams on issues related to Informatica performance tuning.
Worked with the project DBA to finalize the data model and publish it after QA checks.
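A minimal sketch of the BTEQ packaging described above, as it might be invoked from an Informatica command task. The logon file, schemas, and table names are hypothetical; only the BTEQ dot-command structure is representative:

```sql
-- sample_load.btq: hypothetical BTEQ script run via bteq < sample_load.btq
.RUN FILE = /home/etl/.tdlogon;       -- credentials kept outside the script
.SET ERROROUT STDOUT;
.SET MAXERROR 1;                      -- abort on the first SQL error

INSERT INTO edw.claim_fact (claim_id, member_id, paid_amt, load_dt)
SELECT claim_id, member_id, paid_amt, CURRENT_DATE
FROM   stg.claim_stage
WHERE  load_dt = CURRENT_DATE;

.IF ERRORCODE <> 0 THEN .QUIT 8;      -- non-zero exit code signals failure
.QUIT 0;                              -- the workflow command task checks this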
Environment: Informatica PowerCenter 9.6.1, Informatica Developer 9.6.1, Teradata 14, UNIX, Workload Manager
(WLM), Teradata SQL Assistant, PuTTY
2. Company – Infosys Limited
Duration - Feb 2012 – May 2014
Role - Data warehousing Lead/ Data Quality Architect
Client details and projects:
Client: CVS Pharmacy
Project: Viper Replacement
Project Description:
This project was designed to retire the client's existing loss prevention application and move to an IDW
(Integrated Data Warehouse) for analysis and reporting for the Loss Prevention (LP) business group. The new IDW,
built on Teradata, serves as a one-stop shop for integrated data for all business users.
Role: ETL Developer
Responsibilities:
Analyzed the existing systems and the business requirements for tactical and strategic needs.
Performed gap and compatibility analysis of the sources and decided on the source of truth from
which data should be loaded into the enterprise data warehouse.
Developed the strategy to migrate data from the legacy system to the enterprise data warehouse, and
produced the high-level and detailed designs for the data migration activity.
Developed Informatica mapping document to outline the transformations required to integrate data from
various sources into Enterprise Data Warehouse.
Designed and implemented the audit-balance process to balance transactions between different
staging layers and identify the execution status of each job.
Created the multi-dimensional data model (star schema) for the reporting (semantic) layer.
Fine-tuning of Informatica mappings and SQLs to obtain optimal performance and throughput.
Interacted with the business users on regular basis to consolidate and analyze the requirements and
present them the design results.
Scheduling and tracking of tasks
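The audit-balance process described above amounts to reconciling row counts between staging layers for each batch. A simplified sketch; the `landing`/`integration` schemas and `sales_txn` table are hypothetical stand-ins:

```sql
-- Reconcile per-batch row counts between two staging layers
-- (hypothetical schemas and table; a real audit process would persist
--  these results to an audit table and drive job-status reporting)
SELECT l.batch_id,
       l.row_count AS landing_rows,
       COALESCE(i.row_count, 0) AS integration_rows,
       CASE WHEN l.row_count = COALESCE(i.row_count, 0)
            THEN 'BALANCED'
            ELSE 'OUT_OF_BALANCE' END AS status
FROM   (SELECT batch_id, COUNT(*) AS row_count
        FROM   landing.sales_txn GROUP BY batch_id) l
LEFT JOIN
       (SELECT batch_id, COUNT(*) AS row_count
        FROM   integration.sales_txn GROUP BY batch_id) i
ON     l.batch_id = i.batch_id;
```

Any `OUT_OF_BALANCE` batch indicates rows dropped or duplicated between layers and flags the corresponding job for investigation.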
Environment: Informatica PowerCenter 9.1, Informatica Developer 9.1, Teradata 14, UNIX, Teradata SQL
Assistant, PuTTY
Client: Ahold USA
Project Name: Customer Master
Project Description:
The Master Data Management (MDM) solution consolidates customer data from multiple sources and provides
a single source of customer, household, and other organization hierarchy information. This project relied heavily
on consolidating, cleansing, de-duplicating, and house-holding the customer data from different sources. The data
migration and cleansing were carried out with a combination of ETL and Trillium.
Role: ETL/Data Quality Architect
Responsibilities:
Analyzing, designing and developing ETL strategies and processes, writing ETL specifications and
Informatica development.
Developed strategy related to data cleansing, data quality. Designed and implemented the data profiling
procedures to come up with data cleansing and standardization rules.
Designed major data quality activities including address and data cleansing, address standardization, and
development of rules to cleanse data; Informatica Developer and Analyst tools were used for this purpose.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, and
SQL, Lookup (File and Database) to develop robust mappings in the Informatica Designer.
Implemented Pushdown Optimization (PDO) to resolve performance issues in complex
mappings whose numerous transformations were degrading session performance.
Involved in Performance tuning at source, target, mappings, sessions, and system levels.
Moved data from source systems into dimension and fact tables in different schemas using
SCD Type 1 and Type 2 logic.
Worked on the various enhancements activities, involved in process improvement.
Worked independently on critical project interface milestones by designing fully
parameterized code for reuse across the interfaces, and delivered on time despite
hurdles such as requirement changes, business rule changes, source data issues, and complex business
functionality.
Analyzed the source systems to detect the data patterns and designed ETL strategy to process the data.
Involved in the continuous enhancements and fixing of production problems.
Defined the schema, staging tables, and landing zone tables; configured base objects, foreign-key
relationships, and complex joins; and built efficient views.
Supported the development and production support group in identifying and resolving production issues.
Was instrumental in understanding the functional specifications and helped the rest of the team
understand them.
Involved in all phases of development: analysis, design, coding, unit testing, system testing,
and UAT.
Created and executed test cases, test scripts and test summary reports
Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the
development time.
Migrated the code into QA (Testing) and supported QA team and UAT (User).
Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter,
Router, Lookup, Update Strategy, and Sequence Generator.
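Pushdown Optimization, referenced above, works best when mapping logic can be expressed as set-based SQL that the Integration Service generates and runs inside the database. As a rough illustration (with hypothetical `src.orders` and `tgt.daily_revenue` tables), an Expression plus Aggregator chain collapses to something like:

```sql
-- Approximately what a pushed-down Expression + Aggregator becomes;
-- PDO generates comparable SQL automatically, this is a hand-written sketch
INSERT INTO tgt.daily_revenue (order_dt, region, total_amt)
SELECT CAST(order_ts AS DATE)      AS order_dt,   -- Expression logic
       COALESCE(region, 'UNKNOWN') AS region,     -- Expression logic
       SUM(qty * unit_price)       AS total_amt   -- Aggregator logic
FROM   src.orders
GROUP  BY CAST(order_ts AS DATE),
          COALESCE(region, 'UNKNOWN');
```

Because the rows never leave the database, this avoids moving the full detail set through the Integration Service, which is where the session-level gains come from.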
Environment: Informatica PowerCenter 9.1, Informatica Developer 9.1, Oracle, UNIX, Toad for Oracle, PuTTY
3. Company – Tata Consultancy Services
Duration – Sep 2010 – Feb 2012
Role – Data Warehousing Analyst
Client and project details:
Client: Woolworths
Project Name: Oxygen
Project Description:
This project was designed to integrate the data flowing from the satellite systems of the client to the Retail
Management System (RMS). The data from these legacy systems had to be extracted, cleansed, and transformed
before loading to RMS.
Role: ETL Lead
Responsibilities:
Gathering and understanding requirements from business analysts and preparing technical design
documents.
Developed new and modified existing complex mappings to extract data from various sources,
according to guidelines provided by the business users, to populate target systems.
Extensively worked on various transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router,
Filter, Union, Update Strategy, Sequence Generator, Rank and Mapplets), mapping and workflow
variables.
Worked on different tasks in workflows like sessions, event raise, event wait, decision, command,
worklets, assignment, control and timer.
Used workflow manager for creating, validating, testing and running the sequential, parallel, initial and
incremental load.
Developed unit test cases to ensure successful execution of data loading process.
Worked on unit testing of developed mappings and responsible for analyzing the root cause of the
issue/bug raised by the application owner or business user.
Used Debugger in Informatica Designer tool to test the data and fix errors in the mapping.
Developed PL/SQL procedures for creating/dropping indexes in pre and post sessions for better
performance.
Worked on performance tuning - identified and fixed bottlenecks and tuned the complex Informatica
mappings for performance optimization.
Set up and ran jobs using pmcmd commands and scheduled Informatica workflows to execute in a
timely fashion.
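The pre-/post-session index handling described above can be sketched as a PL/SQL procedure called from the session's pre- and post-SQL. The index and table names are hypothetical; the point is dropping the index before a bulk load and recreating it afterwards:

```sql
-- Hypothetical PL/SQL procedure invoked from pre-/post-session SQL:
--   pre-session:  BEGIN manage_index('DROP');   END;
--   post-session: BEGIN manage_index('CREATE'); END;
CREATE OR REPLACE PROCEDURE manage_index (p_action IN VARCHAR2) AS
BEGIN
  IF p_action = 'DROP' THEN
    EXECUTE IMMEDIATE 'DROP INDEX idx_sales_cust_id';
  ELSIF p_action = 'CREATE' THEN
    EXECUTE IMMEDIATE
      'CREATE INDEX idx_sales_cust_id ON sales (cust_id)';
  END IF;
EXCEPTION
  WHEN OTHERS THEN
    -- tolerate ORA-01418 (index does not exist) on a first run;
    -- re-raise any other error so the session fails visibly
    IF SQLCODE != -1418 THEN
      RAISE;
    END IF;
END manage_index;
/
```

Loading without the index avoids per-row index maintenance; rebuilding it once afterwards is usually cheaper for large batch volumes.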
Environment: Informatica PowerCenter 9.1, Oracle, UNIX, Toad for Oracle, PuTTY
4. Company – UST Global
Duration – Dec 2006 – Sep 2010
Role – Senior Software Engineer
Client: Anthem Inc.
Project Name: SSB Reporting (State Sponsored Business)
Project Description:
State Sponsored Business (SSB) is one of the most successful business units of Anthem, managing health care
products that include Medicaid and the Children's Health Insurance Program (CHIP) for low-income and uninsured
populations.
The SSB team was primarily responsible for building, enhancing, and maintaining all Medicaid applications for
Anthem Medicaid products. The primary support involved, but was not limited to: processing member, eligibility, and
provider network data for the central claim processing and adjudication system Diamond 950 (D950); daily
claims adjudication; check and remittance advice processing; EFT, ERA, and EOBs; provider network; Tax-1099;
actuarial reporting; capitation; and encounter submission and response processing. The team was also involved in
production deployments, resolving Lights On items (production issues) and Queue items (small system change
requests), and servicing ad hoc requests.
Responsibilities:
Developed Informatica components for the SSB Reporting project; the components were developed on an
ad hoc basis.
Extensively worked on various transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router,
Filter, Union, Update Strategy, Sequence Generator, Rank and Mapplets), mapping and workflow
variables.
Involved in scope and Impact analysis, requirement analysis, estimation, Design, development schedule
tracking, status reporting.
Developed the technical design document based on the business requirements and held discussions with
the business to clarify requirements.
Maintained and generated monthly reports from the Opus Elixir tool.
Served as the SSB Reporting team's point of contact (POC) for migration activities and controlled all
production release activities.
Worked on unit testing of developed mappings and responsible for analyzing the root cause of the
issue/bug raised by the application owner or business user.
Managed the SSB Reporting team of five; responsible for task assignment and
tracking.
Performance-tuned and optimized various complex SQL queries for reports.
Environment: Oracle, PL/SQL, Informatica 8.6, MS Access, SQL Server, DTS, UNIX and batch scripting, Opus
Elixir, WLM, Windows Server
EDUCATION, TRAINING & CERTIFICATIONS
Education: Bachelor of Technology (B.Tech) from Anna University, India
Trainings: Trained in Teradata, Oracle, Informatica and Oracle forms
CERTIFICATIONS
Informatica Certified Developer (8.6) – Certificate Number 219698
OCA Certified Professional (Developer Track)
DB2 Fundamentals – IBM
More Related Content

What's hot

Riyas_Oracle DBA_BA_ST_Resume_Latest
Riyas_Oracle DBA_BA_ST_Resume_LatestRiyas_Oracle DBA_BA_ST_Resume_Latest
Riyas_Oracle DBA_BA_ST_Resume_LatestRiyas Mohamed
 
Business objects data services in an sap landscape
Business objects data services in an sap landscapeBusiness objects data services in an sap landscape
Business objects data services in an sap landscapePradeep Ketoli
 
Swapna Tammishetty CV-Business & Systems Analyst-Data Analyst-Crystal Reports...
Swapna Tammishetty CV-Business & Systems Analyst-Data Analyst-Crystal Reports...Swapna Tammishetty CV-Business & Systems Analyst-Data Analyst-Crystal Reports...
Swapna Tammishetty CV-Business & Systems Analyst-Data Analyst-Crystal Reports...Swapna Tammishetty
 
Etl - Extract Transform Load
Etl - Extract Transform LoadEtl - Extract Transform Load
Etl - Extract Transform LoadABDUL KHALIQ
 
Levin_Michael_2016-04
Levin_Michael_2016-04Levin_Michael_2016-04
Levin_Michael_2016-04Michael Levin
 
Resume John Stires Il
Resume John Stires IlResume John Stires Il
Resume John Stires Ilpencarver
 
Rizvi_Shaik
Rizvi_ShaikRizvi_Shaik
Rizvi_Shaikrizvi_s
 
Resume_David_Colbourn September 2016
Resume_David_Colbourn September 2016Resume_David_Colbourn September 2016
Resume_David_Colbourn September 2016David Colbourn
 
Resume_kallesh_latest
Resume_kallesh_latestResume_kallesh_latest
Resume_kallesh_latestKallesha CB
 
Sakthi Shenbagam - Data warehousing Consultant
Sakthi Shenbagam - Data warehousing ConsultantSakthi Shenbagam - Data warehousing Consultant
Sakthi Shenbagam - Data warehousing ConsultantSakthi Shenbagam
 
Informatica
InformaticaInformatica
Informaticamukharji
 
Teradata Big Data London Seminar
Teradata Big Data London SeminarTeradata Big Data London Seminar
Teradata Big Data London SeminarHortonworks
 
Basha_ETL_Developer
Basha_ETL_DeveloperBasha_ETL_Developer
Basha_ETL_Developerbasha shaik
 

What's hot (20)

Ganesh CV
Ganesh CVGanesh CV
Ganesh CV
 
Etl techniques
Etl techniquesEtl techniques
Etl techniques
 
Riyas_Oracle DBA_BA_ST_Resume_Latest
Riyas_Oracle DBA_BA_ST_Resume_LatestRiyas_Oracle DBA_BA_ST_Resume_Latest
Riyas_Oracle DBA_BA_ST_Resume_Latest
 
Business objects data services in an sap landscape
Business objects data services in an sap landscapeBusiness objects data services in an sap landscape
Business objects data services in an sap landscape
 
Swapna Tammishetty CV-Business & Systems Analyst-Data Analyst-Crystal Reports...
Swapna Tammishetty CV-Business & Systems Analyst-Data Analyst-Crystal Reports...Swapna Tammishetty CV-Business & Systems Analyst-Data Analyst-Crystal Reports...
Swapna Tammishetty CV-Business & Systems Analyst-Data Analyst-Crystal Reports...
 
Etl - Extract Transform Load
Etl - Extract Transform LoadEtl - Extract Transform Load
Etl - Extract Transform Load
 
Levin_Michael_2016-04
Levin_Michael_2016-04Levin_Michael_2016-04
Levin_Michael_2016-04
 
Resume
ResumeResume
Resume
 
Resume John Stires Il
Resume John Stires IlResume John Stires Il
Resume John Stires Il
 
Richa_Profile
Richa_ProfileRicha_Profile
Richa_Profile
 
Philip_Calmerin_Updated_CV_2016
Philip_Calmerin_Updated_CV_2016Philip_Calmerin_Updated_CV_2016
Philip_Calmerin_Updated_CV_2016
 
Rizvi_Shaik
Rizvi_ShaikRizvi_Shaik
Rizvi_Shaik
 
Resume_David_Colbourn September 2016
Resume_David_Colbourn September 2016Resume_David_Colbourn September 2016
Resume_David_Colbourn September 2016
 
What is ETL?
What is ETL?What is ETL?
What is ETL?
 
Resume_kallesh_latest
Resume_kallesh_latestResume_kallesh_latest
Resume_kallesh_latest
 
Sakthi Shenbagam - Data warehousing Consultant
Sakthi Shenbagam - Data warehousing ConsultantSakthi Shenbagam - Data warehousing Consultant
Sakthi Shenbagam - Data warehousing Consultant
 
Informatica
InformaticaInformatica
Informatica
 
Teradata Big Data London Seminar
Teradata Big Data London SeminarTeradata Big Data London Seminar
Teradata Big Data London Seminar
 
ER/Studio Data Architect Datasheet
ER/Studio Data Architect DatasheetER/Studio Data Architect Datasheet
ER/Studio Data Architect Datasheet
 
Basha_ETL_Developer
Basha_ETL_DeveloperBasha_ETL_Developer
Basha_ETL_Developer
 

Viewers also liked

Joan Willox Senior CA - Resume
Joan Willox Senior CA - ResumeJoan Willox Senior CA - Resume
Joan Willox Senior CA - ResumeJoan Willox
 
Kevin Gallagher Resume
Kevin Gallagher ResumeKevin Gallagher Resume
Kevin Gallagher Resumekevingallagher
 
Himel_Sen_Resume
Himel_Sen_ResumeHimel_Sen_Resume
Himel_Sen_Resumehimel sen
 
Sudiksha_CV_Informatica_Developer
Sudiksha_CV_Informatica_DeveloperSudiksha_CV_Informatica_Developer
Sudiksha_CV_Informatica_DeveloperSudiksha Janmeda
 
Akram_Resume_ETL_Informatica
Akram_Resume_ETL_InformaticaAkram_Resume_ETL_Informatica
Akram_Resume_ETL_InformaticaAkram Bhuyan
 
RanjithHV 2.5yrExp in Dotnet
RanjithHV 2.5yrExp in DotnetRanjithHV 2.5yrExp in Dotnet
RanjithHV 2.5yrExp in Dotnetranjith hv
 
VISHVANATH PAWAR - RESUME- Front-End developer, UI Design & Developer
VISHVANATH PAWAR - RESUME- Front-End developer, UI Design & DeveloperVISHVANATH PAWAR - RESUME- Front-End developer, UI Design & Developer
VISHVANATH PAWAR - RESUME- Front-End developer, UI Design & DeveloperVishvanath Pawar
 
ASHOK KUMAR UI Developer Resume
ASHOK KUMAR UI Developer ResumeASHOK KUMAR UI Developer Resume
ASHOK KUMAR UI Developer ResumeSorakayala Ashok
 

Viewers also liked (16)

Joan Willox Senior CA - Resume
Joan Willox Senior CA - ResumeJoan Willox Senior CA - Resume
Joan Willox Senior CA - Resume
 
Kevin Gallagher Resume
Kevin Gallagher ResumeKevin Gallagher Resume
Kevin Gallagher Resume
 
Rajeev_Resume
Rajeev_ResumeRajeev_Resume
Rajeev_Resume
 
Resume 2015 SW
Resume 2015 SWResume 2015 SW
Resume 2015 SW
 
Sudhir Gajjela_Resume
Sudhir Gajjela_ResumeSudhir Gajjela_Resume
Sudhir Gajjela_Resume
 
Himel_Sen_Resume
Himel_Sen_ResumeHimel_Sen_Resume
Himel_Sen_Resume
 
Sudiksha_CV_Informatica_Developer
Sudiksha_CV_Informatica_DeveloperSudiksha_CV_Informatica_Developer
Sudiksha_CV_Informatica_Developer
 
Akram_Resume_ETL_Informatica
Akram_Resume_ETL_InformaticaAkram_Resume_ETL_Informatica
Akram_Resume_ETL_Informatica
 
Renu_Resume
Renu_ResumeRenu_Resume
Renu_Resume
 
Saheel_Babu _KT
Saheel_Babu _KTSaheel_Babu _KT
Saheel_Babu _KT
 
RanjithHV 2.5yrExp in Dotnet
RanjithHV 2.5yrExp in DotnetRanjithHV 2.5yrExp in Dotnet
RanjithHV 2.5yrExp in Dotnet
 
VISHVANATH PAWAR - RESUME- Front-End developer, UI Design & Developer
VISHVANATH PAWAR - RESUME- Front-End developer, UI Design & DeveloperVISHVANATH PAWAR - RESUME- Front-End developer, UI Design & Developer
VISHVANATH PAWAR - RESUME- Front-End developer, UI Design & Developer
 
Utter resume.docx (2)
Utter resume.docx (2)Utter resume.docx (2)
Utter resume.docx (2)
 
MyResume
MyResumeMyResume
MyResume
 
Resume
ResumeResume
Resume
 
ASHOK KUMAR UI Developer Resume
ASHOK KUMAR UI Developer ResumeASHOK KUMAR UI Developer Resume
ASHOK KUMAR UI Developer Resume
 

Similar to Arun Mathew Thomas_resume

Similar to Arun Mathew Thomas_resume (20)

Kumarswamy_ETL
Kumarswamy_ETLKumarswamy_ETL
Kumarswamy_ETL
 
Abdul ETL Resume
Abdul ETL ResumeAbdul ETL Resume
Abdul ETL Resume
 
Resume
ResumeResume
Resume
 
Arun Kondra
Arun KondraArun Kondra
Arun Kondra
 
Varadarajan CV
Varadarajan CVVaradarajan CV
Varadarajan CV
 
Resume
ResumeResume
Resume
 
ETL_Developer_Resume_Shipra_7_02_17
ETL_Developer_Resume_Shipra_7_02_17ETL_Developer_Resume_Shipra_7_02_17
ETL_Developer_Resume_Shipra_7_02_17
 
Gowthami_Resume
Gowthami_ResumeGowthami_Resume
Gowthami_Resume
 
8143320263_krishna12
8143320263_krishna128143320263_krishna12
8143320263_krishna12
 
Mani_Sagar_ETL
Mani_Sagar_ETLMani_Sagar_ETL
Mani_Sagar_ETL
 
Resume_Informatica&IDQ_4+years_of_exp
Resume_Informatica&IDQ_4+years_of_expResume_Informatica&IDQ_4+years_of_exp
Resume_Informatica&IDQ_4+years_of_exp
 
Ajith_kumar_4.3 Years_Informatica_ETL
Ajith_kumar_4.3 Years_Informatica_ETLAjith_kumar_4.3 Years_Informatica_ETL
Ajith_kumar_4.3 Years_Informatica_ETL
 
Siva - Resume
Siva - ResumeSiva - Resume
Siva - Resume
 
Venkatesh-Babu-Profile2
Venkatesh-Babu-Profile2Venkatesh-Babu-Profile2
Venkatesh-Babu-Profile2
 
Kiran Infromatica developer
Kiran Infromatica developerKiran Infromatica developer
Kiran Infromatica developer
 
Nitin Paliwal
Nitin PaliwalNitin Paliwal
Nitin Paliwal
 
Resume_RaghavMahajan_ETL_Developer
Resume_RaghavMahajan_ETL_DeveloperResume_RaghavMahajan_ETL_Developer
Resume_RaghavMahajan_ETL_Developer
 
Resume ratna rao updated
Resume ratna rao updatedResume ratna rao updated
Resume ratna rao updated
 
Resume_Ratna Rao updated
Resume_Ratna Rao updatedResume_Ratna Rao updated
Resume_Ratna Rao updated
 
Madhukar_Eunny_BIDW_Consultant
Madhukar_Eunny_BIDW_ConsultantMadhukar_Eunny_BIDW_Consultant
Madhukar_Eunny_BIDW_Consultant
 

Arun Mathew Thomas_resume

  • 1. Arun Mathew Thomas Mobile: 949-680-0907 Richmond, VA arunmathewthomas.mannil@gmail.com SUMMARY: Over 9 years of expertise in IT Industry and over 8 years of comprehensive experience in developing and maintaining Data Warehouse applications in Healthcare domains. In depth knowledge of data warehousing techniques, Star / Snowflake schema, ETL, Fact and Dimensions tables, physical and logical data modeling. Experience in designing, developing the ETL process for loading data from heterogeneous source systems like flat files, Oracle, SQL server, Teradata, VSAM files and Teradata using Informatica Power Center 9.x/8.x Extensively worked on Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter, Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter, Sequence Generator, SQL transformation Experience in Performance Tuning of the mappings and Sessions, implementing the complex business rules, optimizing the mappings. Build slowly Changing Dimensions (SCD type1 and type2) mapping, partitioning, override, parameters, variables, push down optimization, mapplets and worklets. Experience in detecting/resolving bottlenecks at various levels like source, target, mapping and sessions. Experienced in Data Quality Management. Worked in Informatica Developer and Trillium Software systems. Experienced in Data Analysis, Data Profiling and Data Mapping. Has very good knowledge on data standardization using Informatica developer. Responsible for major Data Quality activities including Address and Data cleansing, Address standardization, Developing Match & Merge and House-holding rules. Highly efficient in developing fuzzy matching logics for Match & Merge and House-holding. Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and System Development Life Cycle (SDLC). 
More than 5 years of development experience in various Data warehousing projects using Teradata 13.10 and 14 Experienced in working with Teradata utilities like BTEQ, Stored Procedures, FastExport, FastLoad, MultiLoad ,TPT Experienced in writing complex SQL queries to leverage the Push down Optimization feature of Informatica. 6 years of experience in Data Warehousing and ETL using Informatica Power Center 8.6 and 9.1/9.6, MS DTS. 5+ years of experience in development using Oracle 9i using PL/SQL, SQL *PLUS and TOAD. Highly skilled in Data Analysis, Requirement Gathering, Requirement Analysis, Gap Analysis, Data Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis. Has the experience of leading large data ware house teams comprising of 10-20 people for building Enterprise data warehouse. Experience working in onsite-offshore model.
  • 2. TECHNICAL SKILLS: Operating System: Windows/Dos, UNIX, Linux Languages: PL/SQL, Shell Script ETL: Informatica Power Center 8.x, 9.x, MS DTS Databases: Oracle, Teradata 14, MS Access, SQL Server 2008 Reporting Tool: Elixir Data Quality Management: Informatica Developer, Informatica Analyst, Trillium Software Systems. Other Tools: Teradata SQL Assistant, Toad, MS Visio, Work Load Manager, MS Office Professional Experience 1. Company – UST Global Duration – May2014 – Current Role – Sr. Systems Analyst Client: Anthem Inc. Project Name: Enterprise Data Warehouse and Research Depot (EDWard) Project Description: Anthem, Inc. (formerly known as WellPoint Inc) is the largest managed health care, for-profit company in the Blue Cross and Blue Shield Association. It was formed when WellPoint Health Networks, Inc. merged into Anthem, Inc., with the surviving Anthem adopting the name, WellPoint, Inc. and began trading its common stock under the WLP symbol on December 1, 2004. In December 2014, WellPoint, Inc. changed its corporate name to Anthem, Inc. Anthem is one of the nation’s largest health benefits companies, with more than 37 million members in its affiliated health plans and nearly 67 million individuals served through its subsidiaries. Responsibilities: Source data analysis and ‘GAP’ identification of the proposed data sources. Data profiling and data quality analysis of the source data. Help business understand the quality of the source data and the educating them on the accuracy and the completeness of data. Comparing different data sources for completeness of data and suggesting the best source based on the analysis. Gathering and understanding requirements from business analysts and preparing technical design documents. Develop high level, low level ETL designs and develop Informatica mapping documents outlining the transformations required to integrate data from various sources into Enterprise Data Warehouse. 
Developing new and modify existing complex Informatica mappings to extract data from various sources according to guidelines provided by the business users to populate data into target systems. Extensively worked on various transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Union, Update Strategy, Sequence Generator, Rank and Mapplets), mapping and workflow variables Created Informatica mappings using push down optimization to push all Integration, Semantic layer’s transformation, data integration logic into Teradata database. Customize the Audit balance process to balance the transactions between different staging layers and
  • 3. identify the execution status of each jobs. Development of BTEQ scripts to implement highly complex business logics. Packaging of these scripts and invoking them through command tasks in Informatica workflow. Performance tuning - identified and fixed bottlenecks and tuned the complex Informatica mappings for performance optimization. Provide technical support to other teams on issues related to Informatica performance tuning. Work with Project DBA to finalize the data model, publish the data model after QA check. Environment: Informatica powercenter9.6.1, Informatica Developer 9.6.1, Teradata 14, UNIX, Workload Manager (WLM), Teradata SQL Assistant, putty 2. Company – Infosys Limited Duration - Feb 2012 – May 2014 Role - Data warehousing Lead/ Data Quality Architect Client details and projects: Client: CVS Pharmacy Project: Viper Replacement Project Description: This project was designed to retire the existing loss prevention application of client and move towards IDW (Integrated Data Warehouse) for analyzing and reporting to the Loss Prevention (LP) business group. The new IDW built on Teradata will be a one stop shop for integrated data to all the Business users. Role: ETL Developer Responsibilities: Analyzed the existing systems and the business requirements for tactical and strategic needs. ‘GAP’ analysis and compatibility analysis of the sources and decide the between the source of truth from which data should be sourced to Enterprise data warehouse. Develop the strategy to migrate the data from the legacy system to the Enterprise data ware house. Also came up with the high level and the detailed design for the data migration activity. Developed Informatica mapping document to outline the transformations required to integrate data from various sources into Enterprise Data Warehouse. 
Designed and implemented the Audit Balance process to balance the transactions between different staging layers and identify the execution status of each job.
Created the multidimensional data model (star schema) for the reporting (semantic) layer.
Fine-tuned Informatica mappings and SQL to obtain optimal performance and throughput.
Interacted with the business users on a regular basis to consolidate and analyze the requirements and present the design results to them.
Scheduled and tracked tasks.
Environment: Informatica PowerCenter 9.1, Informatica Developer 9.1, Teradata 14, UNIX, Teradata SQL Assistant, PuTTY

Client: Ahold USA
Project Name: Customer Master
Project Description:
The Master Data Management (MDM) solution consolidates customer data from multiple sources and provides a single source of customers, households and other organization hierarchy information. This project relied heavily on consolidating, cleansing, de-duping and house-holding the customer data from different sources. The data migration and cleansing were carried out by a combination of ETL and Trillium.
Role: ETL/Data Quality Architect
Responsibilities:
Analyzed, designed and developed ETL strategies and processes; wrote ETL specifications and performed Informatica development.
Developed the strategy for data cleansing and data quality.
Designed and implemented data profiling procedures to derive data cleansing and standardization rules.
Designed major data quality activities including address and data cleansing, address standardization, and development of rules to cleanse data; the Informatica Developer and Analyst tools were used for this purpose.
Used various transformations such as Filter, Expression, Sequence Generator, Update Strategy, Joiner, SQL, and Lookup (file and database) to develop robust mappings in the Informatica Designer.
Implemented Pushdown Optimization (PDO) to resolve performance issues in complex mappings whose numerous transformations were degrading session performance.
Involved in performance tuning at the source, target, mapping, session, and system levels.
Moved data from source systems into different schemas, based on the dimension and fact tables, using slowly changing dimensions of type 1 and type 2.
Worked on various enhancement activities and was involved in process improvement.
Worked independently on a critical project milestone by designing completely parameterized code to be used across the interfaces, and delivered on time in spite of several hurdles such as requirement changes, business rule changes, source data issues and complex business functionality.
Analyzed the source systems to detect data patterns and designed the ETL strategy to process the data.
Involved in continuous enhancements and fixing of production problems.
Defined the schema, staging tables and landing zone tables; configured base objects, foreign-key relationships and complex joins; and built efficient views.
Supported the development and production support groups in identifying and resolving production issues.
Was instrumental in understanding the functional specifications and helped the rest of the team understand them.
Involved in all phases of development: analysis, design, coding, unit testing, system testing and UAT.
Created and executed test cases, test scripts and test summary reports.
Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing development time.
Migrated the code into QA (testing) and supported the QA and UAT (user) teams.
Created mappings using transformations such as Source Qualifier, Joiner, Aggregator, Expression, Filter, Router, Lookup, Update Strategy, and Sequence Generator.
Environment: Informatica PowerCenter 9.1, Informatica Developer 9.1, Oracle, UNIX, Toad for Oracle, PuTTY
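The slowly changing dimension (type 2) pattern used in this project is typically a two-step expire-and-insert; a minimal sketch follows, where the dim_customer/stg_customer tables and their columns (eff_dt, end_dt, curr_ind) are assumptions for illustration, not names from the project:

```shell
#!/bin/sh
# Stage the two-step SCD type 2 SQL: expire changed current rows, then insert new versions.
# All table and column names are illustrative placeholders.
SCD_SQL=/tmp/scd2_customer.sql

cat > "$SCD_SQL" <<'EOF'
-- Step 1: close out current rows whose tracked attribute changed
UPDATE dim_customer d
SET end_dt = CURRENT_DATE - 1, curr_ind = 'N'
WHERE d.curr_ind = 'Y'
  AND EXISTS (SELECT 1 FROM stg_customer s
              WHERE s.cust_id = d.cust_id AND s.addr <> d.addr);

-- Step 2: insert the new current version for new and changed customers
INSERT INTO dim_customer (cust_id, addr, eff_dt, end_dt, curr_ind)
SELECT s.cust_id, s.addr, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM stg_customer s
WHERE NOT EXISTS (SELECT 1 FROM dim_customer d
                  WHERE d.cust_id = s.cust_id AND d.curr_ind = 'Y');
EOF

# The SQL would run against the warehouse; here we only confirm it was staged.
test -s "$SCD_SQL" && echo "scd2 sql staged"
```

An SCD type 1 mapping, by contrast, would simply overwrite the attribute in place with an UPDATE, keeping no history rows.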
3. Company – Tata Consultancy Services
Duration – Sep 2010 – Feb 2012
Role – Data Warehousing Analyst
Following are the clients and projects I worked for:

Client: Woolworths
Project Name: Oxygen
Project Description: This project was designed to integrate the data flowing from the client's satellite systems into the Retail Management System (RMS). The data from these legacy systems had to be extracted, cleansed, and transformed before loading into RMS.
Role: ETL Lead
Responsibilities:
Gathered and understood requirements from business analysts and prepared technical design documents.
Developed new and modified existing complex mappings to extract data from various sources, according to guidelines provided by the business users, and populate it into target systems.
Extensively worked on various transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Union, Update Strategy, Sequence Generator, Rank and Mapplets) as well as mapping and workflow variables.
Worked on different workflow tasks such as sessions, event raise, event wait, decision, command, worklets, assignment, control and timer.
Used Workflow Manager for creating, validating, testing and running sequential, parallel, initial and incremental loads.
Developed unit test cases to ensure successful execution of the data loading process.
Worked on unit testing of developed mappings and was responsible for analyzing the root cause of issues/bugs raised by the application owner or business users.
Used the Debugger in the Informatica Designer tool to test the data and fix errors in mappings.
Developed PL/SQL procedures for dropping and re-creating indexes in pre- and post-session tasks for better load performance.
Worked on performance tuning: identified and fixed bottlenecks and tuned complex Informatica mappings for optimal performance.
Set up and ran jobs using pmcmd commands and scheduled Informatica workflows to execute in a timely fashion.
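Kicking off a workflow with pmcmd, as mentioned above, typically looks like the wrapper below; the integration service, domain, folder and workflow names are hypothetical, and credentials would normally come from a secure store rather than the command line:

```shell
#!/bin/sh
# Build the pmcmd call a scheduler would use to start a workflow and wait for completion.
# All names below are placeholders, not from the project.
INT_SERVICE=IS_DEV
DOMAIN=Domain_Dev
FOLDER=RMS_LOAD
WORKFLOW=wf_stg_load

CMD="pmcmd startworkflow -sv $INT_SERVICE -d $DOMAIN -f $FOLDER -wait $WORKFLOW"

# pmcmd is only present on an Informatica node, so log the call instead of running it.
echo "$CMD" > /tmp/pmcmd_call.txt
echo "scheduled: $WORKFLOW"
```

The `-wait` flag makes pmcmd block until the workflow finishes and return its status as the exit code, which is what lets a cron or scheduler job chain dependent loads.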
Environment: Informatica PowerCenter 9.1, Oracle, UNIX, Toad for Oracle, PuTTY

4. Company – UST Global
Duration – Dec 2006 – Sep 2010
Role – Senior Software Engineer

Client: Anthem Inc.
Project Name: SSB Reporting (SSB: State Sponsored Business)
Project Description: State Sponsored Business (SSB) is one of the most successful business units of Anthem, managing health care products that include Medicaid and the Children's Health Insurance Program (CHIP) for low-income and uninsured populations. The SSB team was primarily responsible for building, enhancing and maintaining all the Medicaid applications for Anthem Medicaid products. The primary support included, but was not limited to, processing member, eligibility and provider network data for the central claim processing and adjudication system Diamond950 (D950), daily claims adjudication, check and remittance advice processing, EFT, ERA, EOBs, provider network, Tax-1099, actuarial reporting, capitation, and encounter submission and response processing. The team was also involved in production deployments, resolving Lights On items (production issues) and Queue items (small system change requests), and servicing ad hoc requests.
Responsibilities:
Developed Informatica components for the SSB Reporting project; the components were developed on an ad hoc basis.
Extensively worked on various transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router, Filter, Union, Update Strategy, Sequence Generator, Rank and Mapplets) as well as mapping and workflow variables.
Involved in scope and impact analysis, requirement analysis, estimation, design, development schedule tracking and status reporting.
Developed the technical design document based on the business requirements; held discussions with the business to clarify requirements.
Maintained and generated monthly reports from the Opus Elixir tool.
Was the SSB Reporting team's point of contact for migration activities and controlled all production release activities.
Worked on unit testing of developed mappings and was responsible for analyzing the root cause of issues/bugs raised by the application owner or business users.
Managed the SSB Reporting team with a strength of 5; responsible for task assignment and tracking.
Performance-tuned and optimized various complex SQL queries for reports.
Environment: Oracle, PL/SQL, Informatica 8.6, MS Access, SQL Server, DTS, UNIX and batch scripting, Opus Elixir, WLM, Windows Server

EDUCATION, TRAINING & CERTIFICATIONS
Education: Bachelor of Technology (B.Tech) from Anna University, India
Trainings: Trained in Teradata, Oracle, Informatica and Oracle Forms
Certifications:
Informatica Certified Developer (8.6) – Certificate Number 219698
OCA Certified Professional (Developer Track)
DB2 Fundamentals – IBM