This document provides a detailed summary of Arun Mathew Thomas's work experience in IT and data warehousing. It outlines his over 9 years of experience in developing and maintaining data warehouse applications, with expertise in ETL processes, data modeling, performance tuning, and working with tools like Informatica, Teradata, and SQL. It also provides details on two specific roles he held, including developing mappings to load data from various sources into an enterprise data warehouse for Anthem Inc., and serving as an ETL/data quality architect for customer data projects.
This document summarizes Jay Zabinsky's experience as a consultant with over 15 years of experience in metadata management, data modeling, database design, data analysis, data integration, and data governance. He has extensive experience with software such as Oracle, SQL Server, IBM Information Analyzer, Erwin, and SSIS. His experience includes roles managing metadata repositories, data modeling, ETL development, data warehousing, and data integration projects.
This document is a curriculum vitae for Bodala Jagadeesh summarizing his skills and experience. He has 1.5 years of experience in designing data warehousing projects using Informatica and Oracle. He has created mappings to extract data from various sources, load data into databases, and create reusable objects. He also has experience with Tableau creating dashboards from various data sources according to client requirements. His technical skills include Informatica, Oracle, Postgres, SQL, and Tableau.
This document provides a comparison of SAP BW and Teradata, two leading tools for reporting and analysis. It begins with background information on each tool, describing SAP BW as a comprehensive business intelligence package that merges, transforms, and interprets business data to support decision making. Teradata is introduced as a fully scalable relational database management system designed for analytical queries. The document then compares the pros and cons of each tool based on factors like users, value proposition, usability, interfaces, and features. SAP BW is generally better for small organizations while Teradata can handle extremely large amounts of data and thousands of users through massively parallel processing.
Pratik Dey is an IT professional with over 4 years of experience in data warehousing and ETL development. He has strong skills in Informatica PowerCenter and experience loading data from various sources into Teradata. Currently he works as an ETL Data Specialist for Thomson Reuters implementing their Connect Data Warehouse. Previously he worked on projects in healthcare and banking to develop ETL processes and load data into data warehouses.
Gary Borden has over 25 years of experience in ERP systems such as SAP and Oracle, with a focus on master data management, business analytics, and CRM systems like Salesforce.com. He currently works as an SAP Material Master Coordinator and Salesforce.com Database Administrator at ACCO Brands. Previously he has held data analyst roles at Corbus LLC, CSC, and MeadWestvaco where he managed ERP data migration projects, developed reports, and performed mass data loads. He has a bachelor's degree in management science from Wright State University.
Deepak Sharma has over 9 years of experience as an ETL programmer focusing on data warehousing, data integration, and business intelligence. He has expertise in designing, developing, and implementing data warehouse/data integration solutions using tools like Informatica PowerCenter and Oracle. Some of his responsibilities include interacting with business stakeholders to understand requirements, designing mappings, developing ETL processes, testing solutions, and supporting production environments. He currently works as a senior ETL developer at Bank of America on their Corporate Investment Data Warehouse.
This document is a resume for Sujit Kumar Jha, an Oracle Certified Professional (OCP) and Tuning expert with 13 years of experience in PL/SQL development. He has extensive experience designing, developing and implementing solutions for clients in finance, telecom and insurance. Some of his key skills include Oracle SQL, PL/SQL, Unix shell scripting, data modeling, ETL processes, and working in Agile methodologies. He has worked as a lead developer on numerous projects involving building databases, ETL code, reports and batch processes to meet business requirements.
This document contains a resume for Chakravarthy Uppara. It summarizes his contact information, objective, 6+ years of experience in database development using SQL Server and SSIS. It details his roles and responsibilities in projects for Tesco and Accenture developing ETL processes and interfaces to integrate various systems. His technical skills include SQL Server, SSIS, SSAS and .NET. He holds a B-Tech in Information Technology and is currently employed as a Senior Software Engineer at Tesco in Bangalore, India.
Ganesh Kamble has over 2 years of experience in data integration and business intelligence development using tools like Informatica, Oracle, SQL Server, and Business Objects. He has extensive experience in requirements gathering, ETL development, report design, and data warehousing. His most recent projects involved migrating a system from BI 3.1 to 4.0, loading historical data using Informatica, and building a data warehouse for a new company branch using a separate ETL system.
This document provides an overview of current ETL techniques from a big data perspective. It discusses the evolution of ETL from traditional batch-based techniques to near real-time and real-time approaches. However, existing real-time ETL approaches are inadequate to address the volume, velocity, and variety characteristics of data streams. The document also surveys available ETL tools and techniques for handling data streams, and concludes that the ETL process needs to be redefined to better address issues in processing dynamic data streams.
Riyas Mohamed is an IT professional with over 7 years of experience in roles such as Oracle DBA, business analyst, and software tester. He has extensive experience with database administration, installation, backup/recovery, performance tuning, and reconciliation of asset data across multiple databases. Currently he works as an IT service delivery professional at Hewlett Packard Enterprise in Chennai, India.
Business Objects Data Services in an SAP Landscape, by Pradeep Ketoli
The document discusses SAP BusinessObjects Data Services and its role in an SAP landscape. It provides an overview of SAP's enterprise information management solutions including data integration, data quality management, master data management and enterprise data warehousing. It then discusses how Data Services can be used for data integration, data quality, loading SAP BW, extracting from BW, and supporting business processes like data migration and master data management.
Swapna Tammishetty CV - Business & Systems Analyst - Data Analyst - Crystal Reports..., by Swapna Tammishetty
Swapna Tammishetty has over 11 years of experience in data warehousing, business intelligence, and analytics. She has worked on projects involving technologies like Teradata, Crystal Reports, and Tableau. Some of her responsibilities have included requirements gathering, data modeling, ETL development, testing, and providing support. She has extensive experience in the healthcare domain working with clients such as Anthem and analyzing ACA/HIX data.
Data Warehouse:
A physical repository where relational data are specially organized to provide enterprise-wide, cleansed data in a standardized format.
Reconciled data: detailed, current data intended to be the single, authoritative source for all decision support.
Extraction:
The extract step retrieves data from the source system and makes it accessible for further processing. Its main objective is to pull all the required data from the source system using as few resources as possible.
Data Transformation:
Data transformation is the component of data reconciliation that converts data from the format of the source operational systems to the format of the enterprise data warehouse.
Data Loading:
During the load step, it is necessary to ensure that the load is performed correctly and with as few resources as possible. The target of the load process is often a database. To make loading efficient, it helps to disable any constraints and indexes before the load and re-enable them only after the load completes. Referential integrity must then be maintained by the ETL tool to ensure consistency.
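The extract, transform, and load steps above can be sketched end to end. The following is a minimal, self-contained Python illustration using the standard library and an in-memory SQLite database; the CSV source, table, and column names are hypothetical, and it demonstrates the point about building indexes only after the bulk load completes.

```python
import csv
import io
import sqlite3

# Hypothetical extract from an operational source system.
SOURCE_CSV = """order_id,amount,currency
1,10.50,usd
2,7.25,USD
3,3.00,eur
"""

def extract(text):
    """Extract: retrieve all required rows from the source in one pass."""
    return list(csv.DictReader(io.StringIO(text)))

def transform(rows):
    """Transform: convert source formats to the warehouse's standard format
    (typed columns, normalized currency codes)."""
    return [(int(r["order_id"]), float(r["amount"]), r["currency"].upper())
            for r in rows]

def load(rows, conn):
    """Load: bulk-insert into the target table, then build indexes,
    so inserts do not pay per-row index-maintenance cost."""
    conn.execute("CREATE TABLE fact_orders "
                 "(order_id INTEGER, amount REAL, currency TEXT)")
    conn.executemany("INSERT INTO fact_orders VALUES (?, ?, ?)", rows)
    conn.execute("CREATE INDEX idx_currency ON fact_orders (currency)")
    conn.commit()

conn = sqlite3.connect(":memory:")
load(transform(extract(SOURCE_CSV)), conn)
print(conn.execute("SELECT COUNT(*), SUM(amount) FROM fact_orders").fetchone())
# (3, 20.75)
```

In a production warehouse the same shape holds, but the load step would also re-enable any disabled constraints and validate referential integrity, since the ETL tool is responsible for consistency once constraints are off.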
This document is a resume for R. Michael Levin summarizing his objective, clearances, technical skills, and professional experience. Levin has over 20 years of experience in fields like enterprise architecture, data architecture, data modeling, database administration, and project management. He currently works as a Solution Architect and Technical Project Manager developing data warehouses and managing teams. His previous roles include positions as a Principal Solution Data Architect, Senior Data Architect, and Senior Database Architect at various companies.
This document provides a summary of Krishna's experience and skills as an OBIEE and Informatica consultant. It outlines 4+ years of experience administering security, upgrading RPDs, migrating reports, and implementing data-level security in OBIEE. It also details experience installing and configuring OBIEE, Informatica, and BI applications. Recent projects include developing reports and dashboards in OBIEE for Skechers and Agilent. Past experience includes report development in OBIEE and creating mappings in Informatica for an insurance data warehouse.
The document is a resume for John V. Stires highlighting his experience in business intelligence using SQL Server and Microsoft technologies, with over 30 years of experience in programming, analysis, and consulting roles working with various databases and platforms. He is looking for a new opportunity to design and build applications using business intelligence to solve business problems. The resume provides details of his technical skills and work history.
This document contains a resume summary for Richa Sharma highlighting her 8.5 years of experience in data warehousing, ETL development, and business intelligence. She has expertise in IBM Datastage and Cognos reporting tools. Her experience includes data modeling, ETL development, database administration, performance tuning, and project experience with clients in the retail and telecom industries.
Philip Francis P. Calmerin has over 7 years of experience as a Teradata DBA. He currently works as a Teradata DBA for IBM Solutions Delivery, Inc. where he performs daily monitoring of production databases, implements alerts and resource collections, performs queries analysis, and generates performance reports. Previously, he held Teradata DBA roles for Avows Technologies Sdn Bhd and Teradata Philippines where he was responsible for database performance monitoring, capacity planning, query optimization, backup and recovery, and leading a team of offshore DBAs. He also has experience as an Oracle/SQL Server DBA for Hewlett Packard Asia Pacific.
This document provides a summary of Rizvi Shaik's professional experience as an Informatica ETL Developer over 9+ years. It outlines his extensive skills in areas like data warehousing, ETL development, data modeling, testing and production support. Recent roles include working with Horizon Blue Cross Blue Shield of NJ on projects involving data integration, replication and synchronization between various data sources using Informatica PowerCenter and Cloud.
David Colbourn is an experienced information architect seeking a senior role. He has over 25 years of experience in areas such as software analysis, design, data modeling, project management, and big data integration. His core competencies include information architecture, data modeling, database design, project management, and relational and non-relational systems. He has worked in various industries including banking, healthcare, telecommunications, and government.
ETL (Extract, Transform, Load) is a process that allows companies to consolidate data from multiple sources into a single target data store, such as a data warehouse. It involves extracting data from heterogeneous sources, transforming it to fit operational needs, and loading it into the target data store. ETL tools automate this process, allowing companies to access and analyze consolidated data for critical business decisions. Popular ETL tools include IBM InfoSphere DataStage, Informatica, and Oracle Warehouse Builder.
Kallesha has over 4 years of experience as an Informatica/PLSQL developer. She has extensive experience developing mappings in Informatica to extract, transform and load data from various sources into data warehouses. She has worked on projects in various domains including storage, sales, banking, and finance. Kallesha is proficient in technologies like Informatica, Pentaho, Hive, HBase, Pig, Oracle, Teradata, and Shell scripting.
Sakthi Shenbagam is a senior Informatica lead developer with over 7 years of experience in data warehousing and ETL development. She has extensive experience designing and developing complex ETL mappings and workflows to load data into data warehouses from various sources like Oracle, Netezza, and flat files. Some of her responsibilities include requirement gathering, design, development, testing, performance tuning, and support of ETL processes and data warehousing projects for clients across various industries. She is proficient in Informatica PowerCenter, QlikView, Oracle, and other tools.
The document describes the software architecture of Informatica PowerCenter ETL product. It consists of 3 main components: 1) Client tools that enable development and monitoring. 2) A centralized repository that stores all metadata. 3) The server that executes mappings and loads data into targets. The architecture diagram shows the data flow from sources to targets via the server.
The document discusses a unified data architecture that enables any user to access and analyze any data type from data capture through analysis. It describes using a discovery platform to enable interactive data discovery on structured and unstructured data without extensive modeling. It also describes using an integrated data warehouse for cross-functional analysis, shared analytics, and lowest total cost of ownership. Finally, it provides examples of using the architecture for IPTV quality of service analysis, including predictive models using decision trees and naive Bayes.
Learn more about ER/Studio Data Architect and try it free at: http://embt.co/ERStudioDA
With round-trip database support, data architects using ER/Studio Data Architect can easily reverse-engineer, compare and merge, and visually document data assets residing in diverse locations, from data centers to mobile platforms. Enterprise data can be leveraged more effectively as a corporate asset, while compliance with business standards and mandatory regulations is supported -- essential factors in an organizational data governance program. Supported data sources range from cloud platforms to mobile devices. A variety of database platforms, including traditional RDBMS and big data technologies such as MongoDB and Hadoop Hive, can be imported and integrated into shared models and metadata definitions.
The document provides a summary of an ETL Developer's skills and experience. The developer has 3 years of experience using IBM InfoSphere Datastage for ETL projects involving data extraction, transformation, and loading. Responsibilities include developing and debugging ETL jobs, testing and tuning performance, implementing changes, and working with databases like Oracle. The developer has worked on risk data warehousing and order tracking projects, developing jobs to move data between systems and load enterprise data warehouses.
Joan Willox has over 20 years of experience in contracts administration and commercial management for construction projects in Australia, Asia, the Middle East, and the UK. She has worked on a wide range of infrastructure, mining, oil and gas projects with budgets ranging from $10 million to $10.5 billion. Her expertise includes project setup, procurement, risk management, and contract administration. She holds a Bachelor's degree in Quantity Surveying and is a member of the Royal Institute of Chartered Surveyors.
Kevin Gallagher has over 15 years of experience as a recruiter and staffing manager, specializing in recruiting engineers for technology companies. He has a proven track record of successfully filling positions in fields such as software, hardware, sales, and marketing. Gallagher possesses strong recruiting, negotiation, and relationship building skills along with expertise in compliance with immigration and security regulations.
Rajeev Bhatnagar has over 13 years of experience designing and developing distributed systems using Java technologies. He has expertise in integration frameworks like Apache Camel. He has worked with technologies including XML, web services, databases, and application servers. He holds an M.Tech in Electronics Engineering and has skills in languages like Java, databases like Oracle, and frameworks like Spring MVC.
Timothy Shearer has over 30 years of experience as a software engineer and release manager. He has worked at several companies including Certent, Alloptic, Symetricom, and Timeplex where he designed and coded embedded software and firmware. Most recently, he was a release manager and senior software engineer at Certent where he ensured compliance for weekly and scheduled releases. He has expertise in C#, VB.NET, Oracle SQL, and Microsoft technologies.
Sudhir Gajjela has over 3 years of experience as a Big Data Hadoop Administrator and Informatica Administrator. He has expertise in architecting, building, supporting and troubleshooting both Cloudera and Hortonworks Big Data clusters as well as various Informatica tools. He has also worked as an Informatica Developer. His skills include Hadoop services like HDFS, Hive, HBase, Zookeeper, Flume, Sqoop, Oozie and Storm as well as Informatica tools such as PowerCenter, PowerExchange, Data Quality, MDM, Cloud, BDE and BDM. He has delivered training sessions to colleagues on topics such as Big Data, Hadoop, and Informatica.
Himel Sen has over 5 years of experience in master data management, CRM, and data warehousing technologies. He has extensive technical skills in MDM tools like Informatica MDM and Siperian and databases like Oracle. He has worked on various MDM projects for pharmaceutical and other clients, managing all phases from analysis to testing.
Sudiksha Janmeda seeks a role that provides opportunities for learning and growth. She has 2 years of experience in data warehousing using Informatica Power Center. She has worked on projects extracting, transforming, and loading data from sources like Oracle and flat files into targets. Her responsibilities included creating mappings, transformations, handling slowly changing dimensions, and tuning mappings. She also has experience as an Informatica administrator providing technical support, monitoring services, code deployment, and issue resolution. Outside of work, she participates in outreach programs for children and the community.
Akramuzzaman Bhuyan is a senior analyst and ETL developer with over 6 years of experience in software development using Informatica. He has extensive experience designing and developing complex ETL mappings and workflows to load data into data warehouses from various sources such as databases, flat files, and XML. He is proficient in Informatica, Unix, SQL, and has worked on projects in telecom, utilities, and retail domains.
Renu Lalwani has over 3 years of experience as an ETL Developer using Informatica PowerCenter 9.0. She currently works for Cognizant in Pune, India developing data warehouses and ETL processes from various sources like Oracle and flat files. Some of her responsibilities include requirements gathering, data extraction, transformation and loading, designing mappings, creating reusable objects, and testing. Previously she worked for Intimetec Vision Soft on another data warehousing project involving Oracle databases. She has a Bachelor's degree in Computer Science and is proficient in technologies like SQL, Java, HTML and version control systems.
Saheel Babu .KT is a senior UX/UI designer and front end developer with 8 years of experience in web design, graphic design, and UI development. He has expertise in HTML5, CSS3, Adobe Creative Suite, and converting PSD to HTML. He holds various technical qualifications and certifications. His objective is to contribute to a growth-oriented organization by enhancing his skills both professionally and personally.
This document is a curriculum vitae for Ranjith H. V. that summarizes his career objective, technical skills, work experience, education, and personal details. It outlines his 2.7 years of experience in application development using technologies like C#.NET, ASP.NET, and SQL Server. He has worked as a junior software developer at two companies, developing real estate and records management projects. His responsibilities included coding, maintenance, and adding new features to existing applications.
This document provides a summary of Vishvanath Pawar's experience as a web and UI developer. It outlines over 7 years of experience designing and developing websites, applications, and graphics using tools like Photoshop, Illustrator, and technologies like HTML5, CSS3, JavaScript, and jQuery. Recent experience includes roles as a senior developer at QualitasIT where projects involved designing and developing responsive healthcare applications.
This document provides a summary of qualifications and work history for Courtney Utter. She has over 25 years of experience in insurance claims adjusting, underwriting, and administration. Her qualifications include a Bachelor's degree in Business/Communications and certifications as a Microsoft Certified Systems Engineer and Microsoft Certified Professional. Her work history details roles with increasing responsibility in claims adjusting, underwriting, and administration for workers' compensation and other insurance lines with companies such as Texas Mutual Insurance, Frost Insurance, and Farmers Insurance.
Vanessa Torres has experience as a front-end web developer and senior programmer with skills in HTML5/CSS3, JavaScript, jQuery, Bootstrap, AJAX, and UX basics. She has a B.S. in Business Administration with a focus on computer information systems and accounting. Her work experience includes roles in HR, executive assistance, education, banking, and health care. She is bilingual.
This document contains an individual's resume. It outlines their contact information, objective, qualifications, skills, experience and projects. The individual has over 6 years of experience in areas like PHP, MySQL, MVC patterns and frameworks like Zend and Magento. They have worked on e-commerce sites, mobile recharge portals and other projects using technologies like Linux, Apache and jQuery.
S. Ashok Kumar is a software engineer with over 3 years of experience developing websites using HTML, CSS, CSS3, Bootstrap, jQuery, Dreamweaver and other technologies. He has extensive experience converting designs to code, making sites responsive, ensuring cross-browser compatibility, and working in fast-paced teams. Some of the projects he has worked on include websites for banks, e-commerce sites, and other businesses. He is proficient with technologies like PHP, MySQL, Photoshop and MS Office.
This document contains a resume for Kumaraswamy N summarizing his experience and skills as an ETL developer. He has over 8 years of experience in ETL development using Informatica and developing data warehouses. Some of his key skills and experiences include designing and developing ETL mappings in Informatica, data modeling, working with various data sources, performance tuning, and implementing slowly changing dimensions. He has worked on multiple projects for clients in various industries extracting data from sources like Oracle, SQL Server, and flat files and loading it into data warehouses and data marts.
- The document contains the resume of Abdul Mohammed, an ETL developer with 8 years of experience using Informatica for data warehousing projects.
- He has expertise in requirements gathering, data extraction from various sources, transforming the data using Informatica tools, and loading the data into target databases.
- His most recent role was as an ETL/SR Informatica Lead from 2015-present where he worked on building a data warehouse for a pharmaceutical company using Informatica to extract data from Oracle and flat files.
The document provides a summary of an individual's experience working as a Software Engineer specializing in data warehousing and ETL processes using Informatica. It outlines 3 years of experience developing mappings to extract, transform and load data from various sources into staging and data warehouse databases. Specific projects are described involving building data marts for semiconductor manufacturing, sales and distribution, and promotional sales. Responsibilities included requirements gathering, mapping development, debugging, testing and monitoring ETL workflows.
This document contains a professional summary and resume for Arun Roy Kondra. It summarizes his experience of over 4 years working with data warehousing and ETL solutions using Informatica PowerCenter. It details his technical skills and 3 projects he has worked on extracting and transforming data from various sources like Oracle, GreenPlum and flat files into data warehouses.
Varadarajan Sourirajan is a data architect with over 16 years of experience seeking a new position. He has extensive experience in data modeling for both online transaction processing and data warehousing applications. Currently he is working on implementing a data warehouse for the treasury line of business at a large bank in the US, drawing on his experience delivering previous data warehouse projects and a proven track record of success.
Veeranji has over 3 years of experience developing ETL processes using Informatica PowerCenter. He has experience extracting data from sources like Oracle and flat files, transforming the data using various transformations, and loading the data into data warehouses. He has worked on projects in various industries including electronics distribution and semiconductor manufacturing. His responsibilities have included requirements gathering, data modeling, mapping development, testing, and performance tuning. He is proficient in SQL, Informatica PowerCenter, Oracle, and Linux/Windows environments.
Shipra Jaiswal has over 6 years of experience in data warehousing and business intelligence solutions using tools like Informatica and Teradata. She has worked on ETL projects in various domains including healthcare, banking, e-commerce, and aviation. Her responsibilities have included requirements gathering, data modeling, mapping design, development, testing, implementation, and support.
Gowthami S is a software developer and designer with over 2 years of experience in data warehousing using databases like Teradata and Oracle. She has extensive experience with ETL tools like Informatica and data loading utilities for Teradata. She has worked on full data warehouse development lifecycles including requirements, design, implementation and maintenance. Currently working as a software engineer at Tech Mahindra, her projects include developing ETL processes and Teradata SQL queries to load and transform data from various sources into a Cisco enterprise data warehouse supporting business intelligence reporting and analytics.
Krishna has over 2 years of experience designing and developing ETL processes in IBM InfoSphere Datastage. He has expertise extracting data from heterogeneous systems, transforming it according to business logic, and loading it into target databases. Some of his skills include designing parallel jobs, working with various Datastage stages, and using the Datastage Administrator, Designer, and Director tools. He has worked on data warehouse projects in the banking and insurance industries.
Mani Sagar is an ETL Sr Developer and Lead with over 8 years of experience in designing, developing, and maintaining large enterprise applications. He has expert knowledge of ETL technologies like Informatica and data management processes including data migration, profiling, quality, security, and warehousing. He has led teams of up to 8 developers and delivered projects on time for clients across various industries.
Rajarao Marisa has over 4 years of experience in data warehousing and ETL using Informatica. He has worked on multiple projects extracting data from sources like SAP, Salesforce, and flat files and loading it into databases including Teradata, Oracle, and SQL Server. Rajarao is proficient in Informatica PowerCenter and has experience with transformations, mappings, performance tuning, and testing. He currently works as a senior software engineer for Tech Mahindra.
The candidate has over 4 years of experience as an ETL developer using Informatica. They have extensive experience developing mappings using transformations like aggregator, lookup, filter and joiner. They have worked on multiple projects in the telecommunications domain for clients like British Telecom extracting data from various sources and loading to data warehouses. Their skills include Informatica, Oracle, SQL and they have experience in requirements analysis, mapping development, testing and support.
This document provides a summary of Sivakumar's professional experience and qualifications. He has over 6 years of experience in data warehousing projects across various domains. He is proficient in Informatica and has extensive experience in ETL development, testing, and support. Currently he works as a consultant for Genpact on a project involving data extraction and loading from various sources into GE's Transaction Life Cycle Management tool.
The document provides a professional summary for N Venkatesh Babu, who has over 14 years of experience in IT with a focus on database technologies, data warehousing, and ETL development. He has extensive experience designing, developing, and implementing data warehouse solutions using technologies such as Oracle, Informatica, Hadoop, and Tableau. He has led projects in various industries and is proficient in technologies such as SQL, Unix scripting, Python, and Java.
This document provides a summary of Kiran Annamaneni's experience working as an Informatica ETL developer over 9.5 years. It outlines his technical skills and experience with Informatica PowerCenter, data warehousing, ETL processes, Oracle, and SQL. Specific experiences are listed for projects at Accenture, BCBSM, Farmers Insurance, and Level 3 Communications developing and supporting BI/DW solutions.
The document contains Nitin Paliwal's resume. It includes his contact information, career objective, professional summary highlighting over 4.5 years of experience in data warehousing using Datastage ETL. It lists his technical skills and certifications in Datastage, Cognos, SQL. It details his professional experience as an ETL Datastage developer at Tech Mahindra working on projects for Chevron Corporation involving data integration, data quality management, and runtime support. It concludes with his educational qualifications and hobbies.
Raghav Mahajan has 3.8 years of experience in data warehousing and business intelligence using Informatica and Oracle. He has expertise in ETL development, data modeling, and working with various data sources. His skills include Informatica, Oracle SQL, Unix scripting, and Qlikview. He is currently an Associate ETL Developer at Cognizant Technology Solutions working on projects for clients such as AT&T.
Ratna Rao Yamani has over 9 years of experience in IT and 7 years of experience with data warehousing technologies like Informatica Power Center and Informatica MDM. They have extensive experience developing ETL code, working with databases like Oracle and DB2, and performing tasks like requirements gathering, design documentation, testing, and performance tuning for various projects involving data integration and data warehousing.
This document contains the professional summary and experience of Madhukar Eunny. He has over 12 years of experience working as a senior consultant on data warehousing projects. His roles have included ETL architect, developer, team lead, and production support. He has strong skills in ETL tools like Informatica and databases like Teradata, Oracle, and SQL Server. He currently works as a senior BI consultant for Medibank where he is responsible for requirements gathering, data modeling, ETL development, and providing business support.
Arun Mathew Thomas
Mobile: 949-680-0907
Richmond, VA | arunmathewthomas.mannil@gmail.com
SUMMARY:
Over 9 years of expertise in the IT industry, with over 8 years of comprehensive experience developing and
maintaining Data Warehouse applications in the Healthcare domain.
In depth knowledge of data warehousing techniques, Star / Snowflake schema, ETL, Fact and Dimensions
tables, physical and logical data modeling.
Experience in designing and developing ETL processes for loading data from heterogeneous source systems
such as flat files, Oracle, SQL Server, VSAM files, and Teradata using Informatica PowerCenter
9.x/8.x.
Extensively worked on Informatica Power Center Transformations such as Source Qualifier, Lookup, Filter,
Expression, Router, Normalizer, Joiner, Update Strategy, Rank, Aggregator, Stored Procedure, Sorter,
Sequence Generator, SQL transformation
Experience in performance tuning of mappings and sessions, implementing complex business
rules, and optimizing mappings.
Built Slowly Changing Dimension (SCD Type 1 and Type 2) mappings; worked with partitioning, overrides, parameters,
variables, pushdown optimization, mapplets and worklets.
Experience in detecting/resolving bottlenecks at various levels like source, target, mapping and sessions.
Experienced in Data Quality Management; worked with Informatica Developer and Trillium Software
systems.
Experienced in Data Analysis, Data Profiling and Data Mapping, with very good knowledge of data
standardization using Informatica Developer.
Responsible for major Data Quality activities including Address and Data cleansing, Address
standardization, Developing Match & Merge and House-holding rules.
Highly efficient in developing fuzzy matching logics for Match & Merge and House-holding.
Extensive knowledge in Business Intelligence and Data Warehousing Concepts with emphasis on ETL and
System Development Life Cycle (SDLC).
More than 5 years of development experience in various Data warehousing projects using Teradata 13.10
and 14
Experienced in working with Teradata utilities like BTEQ, Stored Procedures, FastExport, FastLoad,
MultiLoad, and TPT.
Experienced in writing complex SQL queries to leverage the Push down Optimization feature of
Informatica.
6 years of experience in Data Warehousing and ETL using Informatica Power Center 8.6 and 9.1/9.6, MS
DTS.
5+ years of experience in development using Oracle 9i using PL/SQL, SQL *PLUS and TOAD.
Highly skilled in Data Analysis, Requirement Gathering, Requirement Analysis, Gap Analysis, Data
Cleansing, Data Transformations, Data Relationships, Source Systems Analysis and Reporting Analysis.
Experienced in leading large data warehouse teams of 10-20 people building
enterprise data warehouses.
Experience working in onsite-offshore model.
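As a concrete illustration of the SCD Type 2 work listed above, the core pattern can be sketched in generic SQL. The table and column names (dim_customer, stg_customer, address) are hypothetical, and Teradata's correlated UPDATE syntax differs slightly from this ANSI form:

```sql
-- Step 1: expire the current version of any customer whose tracked attribute changed.
UPDATE dim_customer
SET    eff_end_date = CURRENT_DATE - 1,
       current_flag = 'N'
WHERE  current_flag = 'Y'
AND    EXISTS (SELECT 1
               FROM   stg_customer s
               WHERE  s.customer_id = dim_customer.customer_id
               AND    s.address    <> dim_customer.address);

-- Step 2: insert a new current version for changed and brand-new customers.
INSERT INTO dim_customer
       (customer_id, address, eff_start_date, eff_end_date, current_flag)
SELECT s.customer_id, s.address, CURRENT_DATE, DATE '9999-12-31', 'Y'
FROM   stg_customer s
LEFT JOIN dim_customer d
       ON d.customer_id = s.customer_id
      AND d.current_flag = 'Y'
WHERE  d.customer_id IS NULL      -- brand-new customer
   OR  s.address <> d.address;    -- changed attribute
```

In Informatica, this same logic is typically built with a Lookup on the dimension, an Expression to detect changes, and an Update Strategy routing rows to DD_UPDATE and DD_INSERT.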
TECHNICAL SKILLS:
Operating System: Windows/Dos, UNIX, Linux
Languages: PL/SQL, Shell Script
ETL: Informatica Power Center 8.x, 9.x, MS DTS
Databases: Oracle, Teradata 14, MS Access, SQL Server 2008
Reporting Tool: Elixir
Data Quality Management: Informatica Developer, Informatica Analyst, Trillium Software Systems.
Other Tools: Teradata SQL Assistant, Toad, MS Visio, Work Load Manager, MS Office
Professional Experience
1. Company – UST Global
Duration – May 2014 – Current
Role – Sr. Systems Analyst
Client: Anthem Inc.
Project Name: Enterprise Data Warehouse and Research Depot (EDWard)
Project Description:
Anthem, Inc. (formerly known as WellPoint, Inc.) is the largest managed health care, for-profit company in the Blue
Cross and Blue Shield Association. It was formed when WellPoint Health Networks, Inc. merged into Anthem, Inc.,
with the surviving company adopting the name WellPoint, Inc. and beginning to trade its common stock under the WLP
symbol on December 1, 2004. In December 2014, WellPoint, Inc. changed its corporate name to Anthem, Inc.
Anthem is one of the nation’s largest health benefits companies, with more than 37 million members in its
affiliated health plans and nearly 67 million individuals served through its subsidiaries.
Responsibilities:
Source data analysis and ‘GAP’ identification of the proposed data sources. Data profiling and data quality
analysis of the source data.
Help the business understand the quality of the source data and educate them on the accuracy and
completeness of the data.
Comparing different data sources for completeness of data and suggesting the best source based on the
analysis.
Gathering and understanding requirements from business analysts and preparing technical design
documents.
Developed high-level and low-level ETL designs and Informatica mapping documents outlining the
transformations required to integrate data from various sources into the Enterprise Data Warehouse.
Developing new and modifying existing complex Informatica mappings to extract data from various sources
according to guidelines provided by the business users to populate data into target systems.
Extensively worked on various transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router,
Filter, Union, Update Strategy, Sequence Generator, Rank and Mapplets), mapping and workflow
variables
Created Informatica mappings using pushdown optimization to push the Integration and Semantic layers'
transformation and data integration logic into the Teradata database.
Customized the Audit balance process to balance the transactions between different staging layers and
identify the execution status of each job.
Developed BTEQ scripts to implement highly complex business logic, packaging these scripts and
invoking them through Command tasks in Informatica workflows.
Performance tuning - identified and fixed bottlenecks and tuned the complex Informatica mappings for
performance optimization.
Provide technical support to other teams on issues related to Informatica performance tuning.
Work with Project DBA to finalize the data model, publish the data model after QA check.
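A BTEQ script of the kind described above, invoked from an Informatica Command task, typically wraps the SQL in logon and error-handling directives. This is a minimal sketch; the logon string and the database and table names are placeholders, not from any actual project:

```sql
.LOGON tdprod/etl_user,etl_password;

/* Aggregate the integration layer into the semantic layer (names are illustrative). */
INSERT INTO sem_db.claim_summary (member_id, claim_month, claim_amt)
SELECT member_id,
       EXTRACT(MONTH FROM claim_date),
       SUM(claim_amt)
FROM   int_db.claim
GROUP  BY 1, 2;

/* Exit with a non-zero return code on failure so the Command task,
   and hence the workflow link condition, registers the error. */
.IF ERRORCODE <> 0 THEN .QUIT 8;

.LOGOFF;
.QUIT 0;
```

The Command task would then run something like `bteq < load_claim_summary.btq`, with the workflow treating a non-zero return code as a task failure.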
Environment: Informatica PowerCenter 9.6.1, Informatica Developer 9.6.1, Teradata 14, UNIX, Workload Manager
(WLM), Teradata SQL Assistant, PuTTY
2. Company – Infosys Limited
Duration - Feb 2012 – May 2014
Role - Data warehousing Lead/ Data Quality Architect
Client details and projects:
Client: CVS Pharmacy
Project: Viper Replacement
Project Description:
This project was designed to retire the existing loss prevention application of client and move towards IDW
(Integrated Data Warehouse) for analyzing and reporting to the Loss Prevention (LP) business group. The new IDW
built on Teradata will be a one stop shop for integrated data to all the Business users.
Role: ETL Developer
Responsibilities:
Analyzed the existing systems and the business requirements for tactical and strategic needs.
‘GAP’ analysis and compatibility analysis of the sources, deciding on the source of truth from
which data should be sourced to the Enterprise data warehouse.
Developed the strategy to migrate the data from the legacy system to the Enterprise data warehouse, and
produced the high-level and detailed designs for the data migration activity.
Developed Informatica mapping document to outline the transformations required to integrate data from
various sources into Enterprise Data Warehouse.
Designed and implemented the Audit balance process to balance the transactions between different
staging layers and identify the execution status of each job.
Created the Multi-Dimensional data model (STAR Schema) for the reporting (semantic) layer.
Fine-tuning of Informatica mappings and SQLs to obtain optimal performance and throughput.
Interacted with the business users on regular basis to consolidate and analyze the requirements and
present them the design results.
Scheduling and tracking of tasks
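An audit balance process of the kind designed here usually compares row counts (or amount totals) between staging layers for each load. The database and table names below are placeholders:

```sql
-- Flag any load date whose staging and warehouse row counts disagree.
SELECT stg.load_date,
       stg.row_count AS staging_count,
       edw.row_count AS warehouse_count,
       CASE WHEN stg.row_count = edw.row_count
            THEN 'BALANCED' ELSE 'OUT OF BALANCE' END AS audit_status
FROM  (SELECT load_date, COUNT(*) AS row_count
       FROM   stg_db.transactions
       GROUP  BY load_date) stg
JOIN  (SELECT load_date, COUNT(*) AS row_count
       FROM   edw_db.transactions
       GROUP  BY load_date) edw
  ON   edw.load_date = stg.load_date;
```

Rows flagged OUT OF BALANCE pinpoint the layer and load date where a job failed or dropped records, which is what lets the process report the execution status of each job.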
Environment: Informatica PowerCenter 9.1, Informatica Developer 9.1, Teradata 14, UNIX, Teradata SQL
Assistant, PuTTY
Client: Ahold USA
Project Name: Customer Master
Project Description:
The Master Data Management (MDM) solution will consolidate customer data from multiple sources and provide
a single source of customer, household and other organization hierarchy information. This project relied heavily
on consolidating, cleansing, de-duping and House-holding the Customer data from different sources. The data
migration and cleansing was carried out by a combination of ETL and Trillium.
Role: ETL/Data Quality Architect
Responsibilities:
Analyzing, designing and developing ETL strategies and processes, writing ETL specifications and
Informatica development.
Developed strategy related to data cleansing, data quality. Designed and implemented the data profiling
procedures to come up with data cleansing and standardization rules.
Designed major Data Quality activities including Address and Data cleansing, Address standardization, and
development of rules to cleanse data. Informatica developer and Analyst tools were used for this purpose.
Used various transformations like Filter, Expression, Sequence Generator, Update Strategy, Joiner, and
SQL, Lookup (File and Database) to develop robust mappings in the Informatica Designer.
Implemented Pushdown Optimization (PDO) to resolve performance issues in complex
mappings where numerous transformations were degrading session performance.
Involved in Performance tuning at source, target, mappings, sessions, and system levels.
Moved data from source systems into dimension and fact tables across different schemas using
Slowly Changing Dimensions Type 1 and Type 2.
Worked on the various enhancements activities, involved in process improvement.
Worked independently on the critical milestone of the project interfaces by designing a completely
parameterized code to be used across the interfaces and delivered them on time in spite of several
hurdles like requirement changes, business rules changes, source data issues and complex business
functionality.
Analyzed the source systems to detect the data patterns and designed ETL strategy to process the data.
Involved in the continuous enhancements and fixing of production problems.
Defining the schema, staging tables, and landing zone tables, configuring base objects, foreign-key
relationships, complex joins, and building efficient views.
Supported the development and production support group in identifying and resolving production issues.
Was instrumental in understanding the functional specifications and helped the rest of the team
understand them.
Involved in all phases of development: Analysis, Design, Coding, Unit Testing, System Testing
and UAT.
Created and executed test cases, test scripts and test summary reports
Used reusable transformation objects such as mapplets to avoid duplication of metadata, reducing the
development time.
Migrated the code into QA (Testing) and supported QA team and UAT (User).
Created mappings using transformations like Source Qualifier, Joiner, Aggregator, Expression, Filter,
Router, Lookup, Update Strategy, and Sequence Generator.
Environment: Informatica PowerCenter 9.1, Informatica Developer 9.1, Oracle, UNIX, Toad for Oracle, PuTTY
3. Company – Tata Consultancy Services
Duration – Sep 2010 – Feb 2012
Role – Data Warehousing Analyst
Following are the client and project I worked for:
Client: Woolworths
Project Name: Oxygen
Project Description:
This project was designed to integrate the data flowing from the satellite systems of the client to the Retail
Management System (RMS). The data from these legacy systems had to be extracted, cleansed, and transformed
before loading to RMS.
Role: ETL Lead
Responsibilities:
Gathering and understanding requirements from business analysts and preparing technical design
documents.
Developing new and modifying existing complex mappings to extract data from various sources according
to guidelines provided by the business users to populate data into target systems.
Extensively worked on various transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router,
Filter, Union, Update Strategy, Sequence Generator, Rank and Mapplets), mapping and workflow
variables.
Worked on different tasks in workflows like sessions, event raise, event wait, decision, command,
worklets, assignment, control and timer.
Used workflow manager for creating, validating, testing and running the sequential, parallel, initial and
incremental load.
Developed unit test cases to ensure successful execution of data loading process.
Worked on unit testing of developed mappings and responsible for analyzing the root cause of the
issue/bug raised by the application owner or business user.
Used Debugger in Informatica Designer tool to test the data and fix errors in the mapping.
Developed PL/SQL procedures for creating/dropping indexes in pre and post sessions for better
performance.
Worked on performance tuning - identified and fixed bottlenecks and tuned the complex Informatica
mappings for performance optimization.
Set up and ran jobs using pmcmd commands, scheduling Informatica workflows to execute in a
timely fashion.
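The pre/post-session index handling mentioned above can be sketched as a pair of Oracle PL/SQL procedures. The procedure, index, and table names here are hypothetical: the pre-session call drops the index so the bulk load avoids index maintenance, and the post-session call rebuilds it.

```sql
-- Pre-session: drop the index before the bulk load.
CREATE OR REPLACE PROCEDURE drop_sales_idx IS
BEGIN
  EXECUTE IMMEDIATE 'DROP INDEX sales_fact_cust_idx';
EXCEPTION
  WHEN OTHERS THEN
    -- Ignore ORA-01418 (index does not exist) so reruns are safe.
    IF SQLCODE != -1418 THEN
      RAISE;
    END IF;
END;
/

-- Post-session: recreate the index after the load completes.
CREATE OR REPLACE PROCEDURE create_sales_idx IS
BEGIN
  EXECUTE IMMEDIATE 'CREATE INDEX sales_fact_cust_idx ON sales_fact (customer_key)';
END;
/
```

These procedures would be invoked from the session's pre- and post-SQL (or via Stored Procedure transformations) so each load runs against an unindexed table and the index is rebuilt once, afterward.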
Environment: Informatica PowerCenter 9.1, Oracle, UNIX, Toad for Oracle, PuTTY
4. Company – UST Global
Duration – Dec 2006 – Sep 2010
Role – Senior Software Engineer
Client: Anthem Inc.
Project Name: SSB Reporting, SSB State Sponsored Business
Project Description:
State Sponsored Business (SSB) is one of the most successful business units of Anthem, managing health care
products that include Medicaid and the Children’s Health Insurance Program (CHIP) for low-income and uninsured
populations.
The SSB team was primarily responsible for building, enhancing and maintaining all the Medicaid applications for
Anthem Medicaid products. The primary support involved, but was not limited to, processing member, eligibility and
provider network data into the central claim processing and adjudication system Diamond950 (alias D950), daily
claims adjudication, Check & Remittance Advice processing, EFT, ERA, EOBs, Provider Network, Tax-1099,
Actuarial reporting, Capitation, and Encounter submission and response processing. The team was also involved in
production deployments, resolving Lights On items (production issues) and Queue items (small system change
requests), and servicing ad hoc requests.
Responsibilities:
Developed Informatica components for the SSB Reporting project. The components were developed on an
ad hoc basis.
Extensively worked on various transformations (Joiner, Sorter, Aggregator, Expression, Lookup, Router,
Filter, Union, Update Strategy, Sequence Generator, Rank and Mapplets), mapping and workflow
variables.
Involved in scope and impact analysis, requirement analysis, estimation, design, development schedule
tracking, and status reporting.
Developed the technical design document based on the business requirements and held discussions with the
business to clarify the requirements.
Maintained and generated monthly reports from the Opus Elixir tool.
Was the POC of the SSB Reporting team for migration activities and controlled all production release
activities.
Worked on unit testing of developed mappings and responsible for analyzing the root cause of the
issue/bug raised by the application owner or business user.
Managed the SSB Reporting team of five members. Responsible for task assignment and tracking of
tasks.
Performance tuned and optimized various complex SQL queries for reports
Environment: Oracle, PL/SQL, Informatica 8.6, MS Access, SQL Server, DTS, UNIX and Batch scripting, Opus
Elixir, WLM, Windows Server
EDUCATION, TRAINING & CERTIFICATIONS
Education: Bachelor of Technology (B.Tech) from Anna University, India
Trainings: Trained in Teradata, Oracle, Informatica and Oracle forms
CERTIFICATIONS
Informatica Certified Developer (8.6) – Certificate Number - 219698
OCA Certified Professional (Developer Track)
DB2 fundamentals – IBM