Kavitha. G
Phone: +91 9986587687
E-mail: kavithagonela25@gmail.com
PROFESSIONAL SUMMARY:
 3+ years of experience in data warehouse implementation as an ETL Developer using Informatica.
 Worked with different data sources such as flat files and relational databases.
 Strong experience with Repository Manager, Mapping Designer, Mapplet Designer,
Transformation Developer, Workflow Manager and Workflow Monitor.
 Responsible for extracting, transforming and loading data from multiple databases.
 Used most of the core transformations, including Source Qualifier, Expression, Aggregator,
connected and unconnected Lookups, Filter, Router, Sequence Generator, Sorter, Joiner
and Update Strategy.
 Worked on mapping reusability using Mapping Parameters and Mapping Variables.
 Optimized Informatica mappings and sessions to improve performance.
 Created and monitored workflows using Workflow Manager and Workflow Monitor;
studied session logs to modify mappings and obtain correct results from the
transformations.
 Unit tested mappings to check for expected results.
 Good knowledge of data warehousing concepts and dimensional modeling, including Star
Schema and Snowflake Schema.
 Good knowledge of Oracle.
 Basic knowledge of UNIX.
 Good communication and interpersonal skills; hardworking.
 Keen to learn new technologies; able to work both in a team and independently.
 Strong analytical, organizational and coordination skills.
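The star-schema modeling mentioned above boils down to fact tables joined to dimension tables and aggregated by dimension attributes. A minimal sketch using SQLite (table and column names are illustrative only, not taken from any project described here):

```python
import sqlite3

# Minimal star schema: one fact table keyed to one dimension table.
# All names here are hypothetical, for illustration only.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE dim_customer (customer_key INTEGER PRIMARY KEY, customer_name TEXT);
CREATE TABLE fact_sales (customer_key INTEGER REFERENCES dim_customer, amount REAL);
INSERT INTO dim_customer VALUES (1, 'Acme'), (2, 'Globex');
INSERT INTO fact_sales VALUES (1, 100.0), (1, 50.0), (2, 75.0);
""")

# A typical star-schema query: aggregate fact rows by a dimension attribute.
rows = cur.execute("""
SELECT d.customer_name, SUM(f.amount)
FROM fact_sales f
JOIN dim_customer d ON d.customer_key = f.customer_key
GROUP BY d.customer_name
ORDER BY d.customer_name
""").fetchall()
print(rows)  # [('Acme', 150.0), ('Globex', 75.0)]
```

A snowflake schema differs only in that the dimension tables are further normalized into sub-dimensions, adding more joins to the same kind of query.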
ORGANIZATIONAL EXPERIENCE:
 Currently working with Ciber, Bangalore, from Jun 2013 to date.
EDUCATION QUALIFICATION:
B.Tech (Computer Science) from JNTU.
TECHNICAL SKILLS:
 ETL Tools: Informatica PowerCenter 9.x/8.x
 RDBMS: Oracle and SQL Server
 Tools: TOAD, SQL Assistant
 Languages: SQL and UNIX shell scripting
 Operating Systems: Windows, UNIX
 Scheduler Tools: Automic and Informatica Scheduler
PROJECT #1: Nov ’14 – till date
Project Name : Medica Health Care
Client : USA
Role : ETL Developer
Environment : Informatica 9.1, Oracle 10g
Team Size : 5
Description:
Medica is a non-profit corporation that provides health insurance products for the
individual, family, group, senior and government markets, and also runs a health
management company, a charitable grant-making foundation and a research institute.
The business wants to know revenue by customer, location and insurance type. The main
objective of the FRD program is to build an analytics data mart that stores atomic
data. Data is maintained at the lowest possible granularity, i.e. customer, location
and insurance type. This helps the finance team analyze revenue details accurately,
so that it can design business strategies that improve the performance of the business.
Responsibilities:
 Worked on Informatica Source Analyzer, Warehouse Designer, Mapping Designer,
Mapplet Designer and Transformation Developer.
 Imported data from various sources, transformed it and loaded it into data warehouse
targets using Informatica.
 Worked with different sources such as Oracle and flat files.
 Extensively used transformations such as Router, Aggregator, Source Qualifier, Joiner,
Expression, Filter, Lookup, Sequence Generator and Update Strategy.
 Developed several mapplets that were reused in other mappings.
 Ran mappings through the Debugger to test the business logic.
 Designed and developed Informatica workflows/sessions to extract, transform
and load data into the targets.
 Developed reusable mappings using Mapping Parameters and Mapping Variables.
 Created sequential sessions in the Workflow Manager that load data when the
session starts.
 Analyzed the loaded data by monitoring session properties in the
Workflow Monitor.
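Mapping Parameters and Mapping Variables such as those mentioned above are typically supplied to a session through a parameter file. A minimal sketch of the PowerCenter parameter-file format (the folder, workflow, session, parameter and connection names are hypothetical):

```
[FinanceDM.WF:wf_load_revenue.ST:s_m_load_revenue]
$$LastExtractDate=2014-11-01
$DBConnection_Source=Oracle_Src_Conn
```

The session is pointed at this file (or it is passed with `pmcmd startworkflow -paramfile`), and a parameter like `$$LastExtractDate` can then drive an incremental filter in the Source Qualifier.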
PROJECT #2: Jun ’13 – Nov ’14
Project Name : Lamps Plus
Client : CA, USA
Role : ETL Developer
Environment : Informatica 8.6, Oracle 10g
Team Size : 4
Description:
Lamps Plus is a global retail company based in the USA. Lamps Plus Hometown Proud
stores can be found in more than forty countries around the world, and the company
has more than forty national brand partners. This project involves the daily
distribution of historical IGA sales data. The system collects information from all
the stores across the world and loads the sales information into a data mart.
Responsibilities:
 Analyzed and understood the requirements.
 Created ETL mappings using Informatica PowerCenter to move data from multiple
sources, such as flat files and Oracle, into common target areas: staging, the data
warehouse and the data marts.
 Maintained the staging area for analyzing data inconsistencies.
 Developed mappings using transformations such as Aggregator, Lookup, Expression,
Update Strategy, Joiner and Router to load data into staging tables and then
into the targets.
 Created Type-2 (slowly changing dimension) mappings.
 Used SQL tools such as TOAD to run queries and validate the data loaded into the
target tables.
 Rectified problems using the Informatica Debugger.
 Validated target data against the source.
 Performed ETL code deployment activities.
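The Type-2 loading and source-to-target validation mentioned in the responsibilities above follow a standard pattern: expire the current dimension row, insert the new version, then check the result with SQL. A minimal sketch in SQLite (table, column and value names are hypothetical, not from the actual project):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
# Hypothetical Type-2 dimension: each attribute change inserts a new row
# and closes out (expires) the previously current row.
cur.executescript("""
CREATE TABLE dim_store (store_id INTEGER, city TEXT,
                        eff_date TEXT, end_date TEXT, is_current INTEGER);
INSERT INTO dim_store VALUES (10, 'Austin', '2013-06-01', '9999-12-31', 1);
""")

def scd2_update(cur, store_id, new_city, change_date):
    """Expire the current row, then insert the new version (SCD Type 2)."""
    cur.execute("""UPDATE dim_store SET end_date = ?, is_current = 0
                   WHERE store_id = ? AND is_current = 1""",
                (change_date, store_id))
    cur.execute("INSERT INTO dim_store VALUES (?, ?, ?, '9999-12-31', 1)",
                (store_id, new_city, change_date))

scd2_update(cur, 10, "Dallas", "2014-01-15")

history = cur.execute(
    "SELECT city, is_current FROM dim_store ORDER BY eff_date").fetchall()
print(history)  # [('Austin', 0), ('Dallas', 1)]

# Basic validation query, as one would run in TOAD against the target:
# exactly one version per business key should be flagged current.
n_current = cur.execute(
    "SELECT COUNT(*) FROM dim_store WHERE store_id = 10 AND is_current = 1"
).fetchone()[0]
assert n_current == 1
```

In an Informatica mapping the same logic is usually expressed with a Lookup on the dimension plus an Update Strategy transformation routing rows to update (expire) or insert.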
