Sandhya Chamarthi Mobile: +917338250888
Email: sandhyachamarthi.dwh@gmail.com
SUMMARY:
• Around 4 years of IT experience, of which 1 year is in Informatica and 3 years in Pentaho Data
Integration (Kettle), with technical and functional experience in developing and maintaining data
warehouses/marts. Proficient in data warehousing, ETL processes, OLAP systems, and Business
Intelligence. Currently working with Pentaho (Kettle).
• Expertise in Extraction, Transformation and Loading (ETL) processes using Pentaho 5.0.0.1/5.4.0 and
Informatica 8.6.
• Extensively worked on Designer, Repository Manager, Workflow Manager and Workflow Monitor in the
Informatica ETL tool.
• Functional knowledge of the Manufacturing, Sales and Marketing, and Life & Pensions domains.
• Comprehensive knowledge of RDBMS concepts.
• Well versed in Dimensional Modeling (Star and Snowflake schemas) for building data warehouses.
• Strong experience in Pentaho DI and Informatica mapping specification documentation and in tuning
mappings to increase performance.
• Extensively used Mapplets, parameters & variables in various projects.
• Extensively used Transformations like Lookup, Router, Update Strategy, Filter, Sequence Generator,
Joiner, Aggregator, and Expression Transformation.
• Strong knowledge on different data sources like flat files and Oracle databases.
• Experienced in using the Debugger from within the Mapping Designer to troubleshoot mappings.
• Extensive expertise in implementing complex business rules by creating reusable transformations, jobs,
mappings and mapplets, and tuning them for optimum performance.
• Proficient in the development of ETL (Extract, Transform, Load) processes, with a good understanding
of source-to-target data mapping and the ability to define and capture metadata and business rules.
• Experience in developing Test Plans, Test Strategies and Test Cases for Data Warehousing projects
ensuring the data meets the business requirements.
• Excellent problem-solving, communication, analytical and interpersonal skills, with the ability to
perform independently and as part of a team.
EDUCATION:
• B.Tech in Electrical & Electronics Engineering from JNT University, Anantapur (2012).
TECHNICAL SKILLS:
ETL Tools Informatica 8.6/9.1, Pentaho 4.4.0/5.0.0.1/5.4.0
Databases Oracle 10g, MySQL and flat files
Operating Systems Windows 95/98/2000/NT/XP, Unix/Linux
Supporting Tools Toad, PuTTY, WinSCP, SQL Developer, Jira and Git
Languages SQL
PROFESSIONAL EXPERIENCE:
• Working as a Software Engineer at Wipro Technologies, Bangalore, from Sep 2015 to date.
• Worked as an Associate Consultant at Capgemini, Bangalore, from May 2012 to Aug 2015.
PROJECTS HANDLED:
Project 1:
Title SSP (Super Simplification Program)
Client Suncorp, Australia
Role ETL Developer
Environment Pentaho DI 5.4, Oracle 12c, Windows 2000, Unix, Git and Jenkins
Description:
Suncorp is one of Australia’s leading insurance, banking, life insurance and superannuation
brands. This project involves migrating data from the existing system (Sybase) to a new system (Oracle)
using the Pentaho ETL tool.
Responsibilities:
• Understood the requirements for designing the BI architecture and ETL architecture.
• Experienced in Pentaho administration and configuration in Windows and Linux environments.
• Experienced in extraction, transformation and loading from homogeneous and heterogeneous
sources.
• Understood the business data mapping requirements and prepared the technical design documents.
• Created fact, dimension and bridge tables as per the requirements and source systems.
• Involved in developing ETL to extract data from multiple sources such as files, Oracle databases and MS SQL
Server databases.
• Designed ETL jobs containing the business rules for the extract, transform and load process used to populate
the staging, fact and dimension tables.
• Responsible for migrating DDL and Git code from Development to Test, UAT and Production
(deployments).
• Prepared the Control-M sheet for job scheduling.
• Involved in daily stand-up calls for continuous development and integration under Agile methodologies.
• Involved in developing Prod-to-QA and QA-to-Dev data sync ETL jobs for UAT.
• Involved in developing audit ETL jobs for different modules.
• Involved in unit testing and test case document preparation.
Project 2:
Title Sales and Marketing DW
Client E-Mark Electronics, Malaysia
Role ETL Developer
Environment Pentaho DI 5.0, Flat Files, Oracle 11g, Windows 2000, Unix, TOAD and SQL.
Description:
E-Mark Electronics is one of Malaysia’s leading distributors of electronic equipment and a major
producer of specialty electronics products. This project involves the creation of a data warehouse for
the Sales Reporting System. Data from the operational source system is extracted, transformed and loaded into
the data warehouse using Pentaho. The reports are generated in Cognos to study the Sales
Analysis System.
Responsibilities:
• Involved in the creation of the Data Mart as per the source system and requirements.
• Design and testing of ETL components in Development were my primary responsibilities.
• Involved in developing a product that extracts all the client data and loads it into an AWS Redshift
database.
• Involved in daily Scrum calls for continuous development and integration under Agile methodologies.
• Designed the source-to-target mappings that contain the business rules for the extract, transform
and load process using the Pentaho ETL tool.
• Developed transformations and jobs and verified the result sets against the specifications.
• Developed the ETL process using Pentaho PDI to extract the data from the sources and populate it into the BI
Data Mart.
• Responsible for migrating code from Development to Test and to Production (deployments).
• Configured the Bamboo test plan for continuous development and integration.
• Worked on extracting online data with Pentaho Instaview and generating the reports.
Project 3:
Title American Health services
Client Presbyterian Health Care System, USA
Role ETL Developer
Environment Pentaho DI 4.4, Oracle 10g, Windows XP, Unix and TOAD.
Description:
This data warehouse is constructed to stage the Sales & Distribution and Inventory related data for
Presbyterian Pharmaceuticals (pharmaceutical and hospital related products). These applications provide
data to various marketing groups of a major pharmaceutical company. The project comprises various
applications across various platforms. The data is received from an external vendor, processed, and then
loaded into various databases and supplied to the respective business owners for further analysis and processing.
Responsibilities:
• Understood the existing environment to start up the ETL process from high-level documentation.
• Developed transformations and jobs by following the prescribed BRDs and verified the result sets
against the specifications.
• Developed the ETL process using Pentaho PDI to extract the data from HIS and Vista and populate it into
our BI Data Mart.
• Involved in code reviews, unit testing, peer reviews and documentation.
• Involved in the deployment of code in the QA and Production environments.
• Designing and developing the ETL Mapping Specification based on the Business requirements &
specifications and designing the ETL process.
• Designing & Developing the ETL mappings using Pentaho as per the requirements.
• Data extraction from source systems, cleansing & transforming the data in staging and finally loading
the data into warehouse schemas.
• Performed efficient testing of the ETL mappings and the transformed & cleansed data in Dev and SIT
environments to match the user requirements and the data quality in the warehouse.
• Analysing the job running time and improving the performance.
• Involved in reviews, performance testing and Unit Testing.
• Extensively used Expression, Filter, Joiner, Lookup, Aggregator and Update Strategy transformations.
Used the Debugger to test the mappings and fixed the bugs.
• Created transformations and jobs using different steps.
• Performed unit tests and validated results with Business Analysts and end users.
Project 4:
Title Enterprise Business Management
Client EMBARQ, USA
Role Informatica Developer
Environment Informatica PowerCenter 8.6, Oracle 10g, Windows XP, Unix and TOAD.
Description:
The client, EMBARQ USA, one of the leading telecom companies for over 20 years,
requires an analytical data warehouse. The development and design of a data warehouse are inextricably
linked to the business needs of an enterprise. The project will improve the current householding process to
correct certain processing deficiencies and implement new support functions for the client. The changes
would improve both the matching quality and the processing speed. The work involves understanding the
existing business model and customer requirements.
Responsibilities:
• Understood the business requirements and developed an ETL process to load data from source to
target.
• Extensively used Informatica PowerCenter for loading data from sources, including flat files, into
relational target tables.
• Mainly involved in ETL development to extract, transform and load the data into the staging area and data
mart.
• Extensively used different transformations such as Source Qualifier, Expression, Lookup, Filter, Update
Strategy, Router, Normalizer and Sequence Generator to develop the Data Mart from the staging area.
• Created sessions for the mappings and validated the sessions.
• Scheduled and monitored the sessions and workflows using Informatica Workflow Manager and Workflow
Monitor on a daily basis.
• Used the Target Load Plan to optimize the performance of the server.
• Identified bugs in existing mappings by analyzing the data flow.
• Responsible for test case writing and testing.
• Prepared minutes of meetings with the client.
