Tamilarasu Uthirasamy has over 10 years of experience in data warehousing, database design, ETL processes, and analytics. He has skills in technologies like Python, Spark, UNIX shell scripting, databases like Netezza and Oracle, and tools like Datastage and R. He has worked on projects in healthcare, retail, and banking domains, designing data models and warehouses and developing ETL processes.
Rajarao Marisa has over 4 years of experience in data warehousing and ETL using Informatica. He has worked on multiple projects extracting data from sources like SAP, Salesforce, and flat files and loading it into databases including Teradata, Oracle, and SQL Server. Rajarao is proficient in Informatica PowerCenter and has experience with transformations, mappings, performance tuning, and testing. He currently works as a senior software engineer for Tech Mahindra.
The document provides a summary of an ETL developer's skills and experience. It includes 3+ years of experience developing ETL processes in IBM InfoSphere Datastage 9.1. Specific experience includes developing Datastage jobs using various stages, debugging, performance tuning, implementing slowly changing dimensions, and working with databases like Oracle, SQL Server and Netezza. Project experience is provided for three projects involving reverse mortgage data warehousing, risk data warehousing, and an order tracking application. Responsibilities included developing ETL processes, testing, and supporting production environments.
Sivakumar has over 9 years of experience in data warehousing and ETL development using tools like Informatica and Teradata. He has extensive experience designing and developing ETL processes, performing testing, and collaborating with other teams on data migration projects for clients in various industries.
Ramachandran has over 9 years of experience as a Senior Informatica Developer. He has expertise in data warehousing, ETL development and implementation using Informatica PowerCenter. Some of his key skills include dimensional modeling, mapping complex transformations, tuning Informatica workflows, and working with databases such as Oracle, SQL Server and Teradata. He has worked on multiple projects in the healthcare domain for clients in the US and India.
Sudeshna Ghosh Dastidar has over 6 years of experience as an IT Analyst at TCS. She has expertise in design, development, testing, implementation, and reporting using tools like Crystal Reports, SAP BO, and SQL. She is proficient in Oracle, SQL, PL/SQL, and has experience in data warehousing and ETL using Informatica. She has worked on projects in domains like banking, financial services, insurance, and telecom.
Ratna Rao Yamani has over 9 years of experience in IT and 7 years of experience with data warehousing technologies like Informatica Power Center and Informatica MDM. They have extensive experience developing ETL code, working with databases like Oracle and DB2, and performing tasks like requirements gathering, design documentation, testing, and performance tuning for various projects involving data integration and data warehousing.
Shipra Jaiswal has over 6 years of experience in data warehousing and business intelligence solutions using tools like Informatica and Teradata. She has worked on ETL projects in various domains including healthcare, banking, e-commerce, and aviation. Her responsibilities have included requirements gathering, data modeling, mapping design, development, testing, implementation, and support.
Mani Sagar is an ETL Sr Developer and Lead with over 8 years of experience in designing, developing, and maintaining large enterprise applications. He has expert knowledge of ETL technologies like Informatica and data management processes including data migration, profiling, quality, security, and warehousing. He has led teams of up to 8 developers and delivered projects on time for clients across various industries.
This resume is for Basu K S, an SAP BODS ETL Developer with over 3 years of experience developing and maintaining data warehouses and performing data migration. He has extensive experience using tools like SAP BODS, Information Steward, SQL Server, and writing SQL stored procedures. Some of his responsibilities include providing ETL designs, developing ETL jobs, performing data cleansing, testing, and supporting production loads. He is currently working at Utopia and has previously worked at Mindtree.
Mukhtar Ahmed has over 8 years of experience in data warehousing and ETL projects. He has designed, developed, deployed, and supported large-scale ETL processes involving sources of over 100 terabytes. He specializes in IBM InfoSphere Datastage and Teradata utilities and has worked in multiple industries including healthcare, banking, and insurance.
FlexPod Select for Hadoop is a pre-validated solution from Cisco and NetApp that provides an enterprise-class architecture for deploying Apache Hadoop workloads at scale. The solution includes Cisco UCS servers and fabric interconnects for compute, NetApp storage arrays, and Cloudera's Distribution of Apache Hadoop for the software stack. It offers benefits like high performance, reliability, scalability, simplified management, and reduced risk for organizations running business-critical Hadoop workloads.
The document provides a summary of Vinoth Perumal's professional experience. It includes details about his 6+ years of experience in data warehousing technologies like Informatica and SQL. It also lists his roles and responsibilities in various projects for clients like Barclays Bank, Health Net, and UBS bank in Zurich involving tasks like ETL development, testing, support and administration.
Ralph J Padula is a senior-level database architect, engineer and administrator specializing in Oracle database management systems and enterprise storage infrastructure. He has over 15 years of experience leading teams supporting mission-critical systems. Currently, he is the Sr. IT Director at Dealertrack Technologies, where he oversees 35 staff and a $15M budget. Previously, he held leadership roles at GE Asset Management, where he managed large-scale database and storage projects, and Oracle Corporation.
Bridging the Last Mile: Getting Data to the People Who Need It (APAC) – Denodo
Watch full webinar here: https://bit.ly/34iCruM
Many organizations are embarking on strategically important journeys to embrace data and analytics. The goal can be to improve internal efficiencies, improve the customer experience, drive new business models and revenue streams, or – in the public sector – provide better services. All of these goals require empowering employees to act on data and analytics and to make data-driven decisions. However, getting data – the right data at the right time – to these employees is a huge challenge and traditional technologies and data architectures are simply not up to this task. This webinar will look at how organizations are using Data Virtualization to quickly and efficiently get data to the people that need it.
Attend this session to learn:
- The challenges organizations face when trying to get data to the business users in a timely manner
- How Data Virtualization can accelerate time-to-value for an organization’s data assets
- Examples of leading companies that used data virtualization to get the right data to the users at the right time
Bhaviya Bhagawan has over 8 years of experience as a Data Quality Architect. She has extensive experience designing and developing ETL solutions using tools like IBM Datastage and working with databases such as Teradata and Netezza. Some of her key skills include data integration, data modeling, SQL tuning, and working with clients in industries such as energy, e-commerce, and publishing.
I gave this presentation at OUGF14 in Helsinki, Finland and again for TDWI Nashville. This presentation takes a look at the Agile Manifesto and the 12 Principles of Agile Development and discusses how these apply to Data Warehousing and Business Intelligence projects. Several examples and details from my past experience are included.
The document provides a summary of an individual's experience and skills. It includes over 10 years of IT experience, with a focus on data warehousing, ETL development and implementation. Specific skills and technologies highlighted include IBM DataStage, Oracle, SQL, UNIX scripting, and experience leading teams on various projects in industries such as retail, insurance, and energy.
SAP Sybase IQ uses a technique called distributed query processing (DQP) that can improve query performance by breaking queries into pieces and distributing the pieces across multiple SAP Sybase IQ servers. DQP provides both intra-query and inter-query parallelism. It dynamically manages resources to balance workloads and avoid saturating the system. For DQP to be effective, the storage area network must have sufficient performance to support the increased parallelism.
This document contains a summary of Raj Ganesh Subramanian's work experience and qualifications. He has over 5 years of experience in data warehousing, ETL development, and database management. He has extensive experience with Informatica PowerCenter and has worked on projects for clients such as GE Transportation, GE Aviation, and IGATE Technologies. He has expertise in Oracle, SQL Server, Informatica, Unix scripting, and reporting tools such as Spotfire, Tableau, and Cognos.
Cloud Based Data Warehousing and Analytics – Seeling Cheung
This document discusses Marriott International's journey to implementing a cloud-based data warehouse and analytics platform using IBM BigSQL on Softlayer cloud infrastructure. It describes the limitations of their existing on-premises system, challenges faced in migrating data and queries to the cloud, lessons learned, and next steps to further improve the platform. The system is now in production use by an initial group of users at Marriott.
The document provides a summary of Arun Kumar's professional experience and skills. He has 8 years of experience in software development including requirements analysis, design, coding, testing and implementation. He has extensive experience with Oracle Enterprise Data Quality (OEDQ 11g) for data quality, profiling, matching, merging and master data management. He has also worked on projects involving data extraction, loading and transformation using Oracle Data Integrator (ODI 11g).
The document discusses operational data warehousing and the Data Vault model. It begins with an agenda for the presentation and introduction of the speaker. It then provides a short review of the Data Vault model. The remainder of the document discusses operational data warehousing, how the Data Vault model is well-suited for this purpose, and the benefits it provides including flexibility, scalability, and productivity. It also discusses how tools and technologies are advancing to support automation and self-service business intelligence using an operational data warehouse architecture based on the Data Vault model.
Seema Shinde has over 9 years of experience in IT with expertise in database migration, data warehousing, and BI solutions. She has strong skills in Oracle, SQL Server, Teradata, and Hadoop technologies. She has led teams for ETL, data warehouse, and BI implementations and has experience delivering projects for banking clients.
Oracle Data Integrator is an ETL tool that has three main differentiators: 1) It uses a declarative, set-based design approach which allows for shorter implementation times and reduced learning curves compared to specialized ETL skills. 2) It can transform data directly in the existing RDBMS for high performance and lower costs versus using a separate ETL server. 3) It has hot-pluggable knowledge modules that provide a library of reusable templates to standardize best practices and reduce costs.
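To make the second differentiator concrete, here is a minimal sketch of the ELT idea the summary describes: instead of pulling rows out to a separate ETL server, the transformation is pushed down as one set-based statement that runs inside the database already holding the data. The table and column names are hypothetical, and Python's built-in sqlite3 module merely stands in for the target RDBMS.

```python
import sqlite3

# Stand-in for the target RDBMS; in ODI this would be the existing database.
conn = sqlite3.connect(":memory:")
conn.executescript("""
CREATE TABLE stg_orders (order_id INTEGER, amount REAL, status TEXT);
CREATE TABLE dw_orders  (order_id INTEGER, amount REAL);
INSERT INTO stg_orders VALUES (1, 10.0, 'OK'), (2, -5.0, 'BAD'), (3, 7.5, 'OK');
""")

# ELT pushdown: one set-based INSERT ... SELECT transforms and loads the data
# where it already lives, instead of row-by-row work in an ETL engine.
conn.execute("""
INSERT INTO dw_orders (order_id, amount)
SELECT order_id, ROUND(amount, 2)
FROM stg_orders
WHERE status = 'OK' AND amount >= 0;
""")
print(conn.execute("SELECT * FROM dw_orders").fetchall())  # [(1, 10.0), (3, 7.5)]
```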
Performance advantages of Hadoop ETL offload with the Intel processor-powered... – Principled Technologies
High-level Hadoop analysis requires custom solutions to deliver the data that you need, and the faster these jobs run the better. What if ETL jobs created by an entry-level employee after only a few days of training could run even faster than the same jobs created by a Hadoop expert with 18 years of database experience?
This is exactly what we found in our testing with the Dell | Cloudera | Syncsort solution. Not only was this solution faster, easier, and less expensive to implement, but the ETL use cases our beginner created with it ran up to 60.3 percent more quickly than those our expert created with open-source tools.
Using the Dell | Cloudera | Syncsort solution means that your organization can pay a lower-level employee for half as much time as a senior engineer would spend doing less-optimized work. That is a clear path to savings.
The document contains the resume of Sandhya Chamarthi, which summarizes her work experience in IT with a focus on the Informatica and Pentaho Data Integration (KETTLE) ETL tools. She has over 4 years of experience in data warehousing, ETL processes, business intelligence, and dimensional modeling. Key projects listed include developing ETL processes for insurance, banking, and telecom clients to load data into data warehouses and data marts.
The Círculos Horizontes project aims to contribute to developing a culture of innovation in Medellín by promoting the learning of science, research, and technological development among students and teachers of public and private schools. The project seeks to develop qualified human capital for a future knowledge industry. Its methodology comprises four phases: formulating research questions, generating visions, building knowledge, and communicating results.
This document describes the fundamentals of writing. It explains that writing involves more than applying grammatical rules; it requires arguing ideas and organizing them coherently. It then details the stages of planning, drafting, and revision, and concepts such as style, coherence, cohesion, and clarity. It also covers paragraph structure and the types of texts (argumentative, descriptive, narrative). Finally, it offers definitions of the summary and the academic essay.
The document summarizes a summit held by the Mitsubishi Electric America Foundation to discuss strategies for improving employment outcomes for youth with disabilities. Over 20 organizations participated in the summit, including employers, educators, and youth. They discussed the need for cross-sector collaboration and data sharing to better prepare, provide opportunities, and support youth with disabilities in their career development. Key recommendations included focusing on strengths over deficits, increasing work-based learning, and reducing stigma through showcasing successful individuals with disabilities.
Activity of professionals and managers in the labour market in Ireland – Amy Jackson
This document summarizes the findings of a survey of 1,600 professionals and managers in Ireland conducted in November 2015. Some key findings:
- 25% were actively seeking new jobs, 53% were passively looking, and 22% were not looking to change jobs.
- Those most likely to actively look included those aged 31-40, and those in FMCG, pharmaceuticals, tourism, HR, sales, and administration.
- Those most likely to passively look included those in law, IT, engineering and finance, aged 20-40.
- Those not looking included those in law firms, transport, and automotive, aged 20-30 in logistics and IT.
This document summarizes Elaine Fischer's capstone presentation on her Hepatitis C Virus (HCV) project in Monterey County, California. The project aimed to increase awareness and screening for HCV. Key activities included training as an HCV educator, conducting workshops for healthcare professionals, holding an educational support group, and providing 50 HCV screenings. Evaluation found the project increased knowledge among over 100 participants and secured 25% more screening than anticipated. The presentation concludes by recommending ongoing screening and linkage to care services through collaboration.
A Munck is a truck fitted with a telescopic hydraulic arm used to safely lift and move industrial and civil construction equipment and materials. The main uses of a Munck include transporting heavy machinery, lifting aerial work baskets, general utility on construction sites, and transporting containers, thanks to its versatility in reaching otherwise inaccessible places.
Dark circles and bags below the eyes give a tired, unhealthy, exhausted look and make you appear older. They arise from skin problems and detract from the beauty of your face. Don't worry: you can regain youthful eyes by making a few simple changes to your lifestyle.
Best-practice-recruitment-and-selection-a-tool-kit – Amy Jackson
This document provides an overview of best practices for recruitment and selection. It discusses planning the recruitment process, performing job analysis, attracting candidates, shortlisting, interviews and assessments, reference checks, making a selection, and evaluation. The goal is to select the best candidate for the role using a standardized, fair process. Job analysis is identified as the most important initial step, since it determines the key criteria for the role. Attraction strategies market the organization and the role, and shortlisting filters candidates against the criteria. Behavioral interviews structured around the key criteria are recommended over unstructured interviews; reference checks, induction, and evaluation complete the process. Following this standardized best-practice approach helps hiring managers select the most suitable candidates and avoid poor hiring decisions.
Tools have evolved since prehistory, when humans needed objects to help them carry out tasks. The first machine tool was the lathe, invented in 1751, while the earliest machines for turning and drilling appeared in antiquity. Over the centuries, tools have become more sophisticated thanks to machines driven by energy sources such as steam and electricity, enabling greater production and the interchangeability of parts.
This document outlines a marketing strategy for the 7drinks beverage company based on analyzing over 1,000 tweets through text mining, clustering, topic modeling, and sentiment analysis. Key findings include recommending a social media campaign using the "#7drinks" hashtag and potentially partnering with celebrities mentioned in trending tweets, such as Klay Thompson. Limitations of the Twitter analysis, such as its short time horizon, are also discussed. The proposed next steps are to continue monitoring Twitter data over a longer period and to explore other channels for sentiment analysis.
Training: The General Principles of Civil Liability – Actions-Finance
Actions-Finance offers the training "The General Principles of Civil Liability".
This finance training notably enables participants to:
• Grasp the basics of insurance.
• Distinguish the different types of liability.
• Understand the guarantees granted by the insurer.
• Differentiate the insured from third parties to the contract.
For more information on the training "The General Principles of Civil Liability", feel free to contact us by phone at + 33 (0)1 47 20 37 30 or by email at contact@actions-finance.com
Sivakumar has over 9 years of experience in data warehousing and ETL development using tools like Informatica and Teradata. He has extensive experience designing and developing ETL processes for data migration and analytics projects for clients in various industries. His roles have included requirements analysis, mapping design, testing, performance tuning, and managing project timelines.
This document provides a summary of Surendranath Gandla's professional experience and qualifications. He has over 3 years of experience in ETL development using tools like Informatica, UNIX shell scripting, SQL, and PL/SQL. He currently works as an ETL developer at ADP, where he develops mappings and workflows to extract, transform, and load data from various sources into target databases. He has also worked with reporting tools like Tableau and holds a B.Tech degree in Electronics and Communication Engineering.
Sourav Giri has over 11 years of experience in software development, including expertise in Oracle, Sybase, PL/SQL, Java, and Hadoop. He has worked on projects involving data extraction, transformation, and loading (ETL) using tools like Datastage, Sqoop and Flume. Currently he is working on a project involving loading data from mainframes to MongoDB using Hadoop. He aims to utilize his skills in database management, ETL, and big data systems.
The document contains the resume of Naveen Reddy Tamma which summarizes his work experience and qualifications. He has over 7 years of experience working as an Associate at Cognizant Technology Solutions on various projects involving Informatica ETL development, data quality testing, and report generation. He holds a B.Tech in Computer Science and has experience working with technologies like Informatica, Teradata, Oracle, and Cognos.
Shanujain has over 9 years of experience as a senior software developer and production support analyst. He has extensive experience with Oracle PL/SQL, SQL Server, and Netezza databases. Currently he is a technical lead for a core prime brokerage project involving data processing and report generation using SQL Server, Unix, and Informatica.
Shivaprasada Kodoth is seeking a position as an ETL Lead/Architect with experience in data warehousing and ETL. He has over 8 years of experience in data warehousing and Informatica design and development. He is proficient in technologies like Oracle, Teradata, SQL, and PL/SQL. Some of his key projects include developing ETL mappings and workflows for integrating various systems at Boehringer Ingelheim and UBS. He is looking for opportunities in Bangalore, Mangalore, Cochin, Europe, USA, Australia, or Singapore.
Rajesh S has over 3 years of experience in developing ETL applications using IBM Datastage. He has extensive experience designing and developing Datastage jobs to extract, transform and load data from various sources such as Oracle and Teradata databases into data warehouses. Some of his key skills include Datastage, Unix scripting, Oracle, Teradata and working on projects in the healthcare and retail domains.
This document is a curriculum vitae for Rajeswari Pothala. It outlines her professional experience working for Tata Consultancy Services for over 6 years leading teams of up to 8 members on data warehousing and ETL development projects. It also lists her educational qualifications including a B.Tech in Electronics and Communication Engineering. Key projects outlined include work on the TCS Trimatrix EDW project and several projects for Aviva involving data integration, mappings development, and module lead responsibilities.
This document contains the resume of Anil Kumar Andra. It summarizes his 5 years of experience as an ETL Developer in the IT industry and 3 years of experience in non-IT work. It lists his technical skills including experience with IBM Datastage ETL tool, SQL, DB2, and relational databases. It also provides details of two projects he worked on, one for Bharti Airtel and Vodafone migrating and transforming telecom data, and another for Shell migrating sample test data. It describes his responsibilities of designing and developing ETL jobs to load large volumes of data into data warehouses. Finally, it briefly outlines his non-IT experience maintaining electrical equipment as a supervisor for HS
This document contains a summary of Amit Kumar's professional experience and qualifications. He has over 9 years of IT experience, including 8 years of data warehouse and business intelligence experience. Currently he works as a data architect at Capgemini, leading a team of 12 on an oil and gas equipment install base project. He has extensive experience designing and developing data integration solutions using tools like Informatica, Hadoop, and SQL.
Subhoshree Deo has over 4 years of experience as an ETL developer and Salesforce integrator. She has worked on projects involving Informatica, SQL Server, Oracle, and Salesforce. Her experience includes designing and developing interfaces to migrate data between various systems, load historical data, and integrate applications like Veeva and Salesforce. She is proficient in technologies like Informatica, SQL, and programming languages like C/C++ and has handled projects for clients including Astrazeneca, Merck, and Takeda.
Pradeep Kumar Pandey has over 10 years of experience as a data/systems integration specialist and ETL expert. He has extensive experience designing and implementing data warehouses using tools like IBM DataStage, Informatica, Oracle OBIEE, and Oracle OBIA. He has led teams and taken on roles such as developer, technical lead, and team lead. Pradeep has worked on projects across various industries including telecom, financial services, HR, and retail.
Alok Singh is seeking challenging assignments in Business Intelligence/Data warehousing. He has nearly 7 years of experience in BI/DW, ETL, data integration, and data warehousing solution design. He is proficient in SQL, ETL tools like Informatica and SSIS, and visualization tools like QlikView and Tableau. He has experience designing and developing ETL solutions, requirements gathering, and data analysis. His past roles include positions at Technologia, Subex, and Reliance Communications where he worked on projects involving Teradata, Oracle, billing systems, and fraud detection. He has a bachelor's degree in electronics and telecommunications.
IT professional with 9 years of data warehousing experience in ETL design and development. Excellent experience in requirements gathering and in designing, developing, documenting, and testing ETL jobs and mappings in parallel jobs using DataStage to populate tables in data warehouses and data marts.
Anusaa Vemuri is a Systems Engineer at Tata Consultancy Services with over 3 years of experience in IT. She has expertise in business intelligence tools like Tibco, Qlikview and report development. She is proficient in SQL, PL/SQL and has worked on projects involving data extraction, modeling, dashboard and report design for clients like Qualcomm and Tata Teleservices.
Rajnish Kumar has over 7 years of experience in the IT industry as a Team Lead and Senior Software Engineer. He has extensive experience with ETL tools like DataStage and databases like Teradata, Oracle, and DB2. Some of his project experiences include developing ETL processes and data warehouses for clients in telecom and banking industries. He has technical skills in areas like data warehousing, ETL, SQL, shell scripting, and UNIX.
Sonal Verma has over 9 years of experience in IT working on various projects in the banking and telecommunications domains. She has expertise in programming languages like SQL, PL/SQL, Java, C++ and tools like Oracle, UNIX and Windows. Her experience includes roles like developer, team lead, business analyst and managing offshore delivery. She currently works as a senior developer at Barclays Technology focusing on payment processing solutions.
Pallavi Gokhale Mishra has over 16 years of experience in data migration, data warehousing, project management, and software development. She currently works as a Solution Architect for IBM India on a project involving data migration from Oracle CRM to Siebel CRM for Vodafone India. Her experience includes managing teams and leading complex data migration projects involving Siebel, SAP, and other applications for clients in telecom, automotive, banking, and other industries. She has strong skills in ETL tools like IBM Datastage, databases like Oracle, and programming languages like SQL.
Similar to Tamilarasu_Uthirasamy_10Yrs_Resume
Tamilarasu Uthirasamy
Mobile : +1 847 630 7689
E-mail : arasu.tamilarasu@gmail.com
Professional Summary :
• 10+ years of experience in data warehousing, database design, and ETL processes across Dev, Assembly, QA, and production environments in business domains such as healthcare, retail market research (on what people buy), and banking.
• Skilled as a design architect, mainly involved in designing data models for various business processes.
• Very good experience in UNIX shell scripting, R, Python, Spark, and debugging.
• Good understanding of the big data ecosystem, including Pig and Hive.
• Highly skilled in the Netezza database and NZSQL, with experience in Netezza performance tuning.
• Experience with the Oracle database and with SQL and PL/SQL programming.
• Experience in TIBCO orchestration of UNIX shell script jobs.
• Knowledge of ELT data warehouse concepts and implementation.
• Knowledge and understanding of dimensional models, slowly changing dimensions, and star and snowflake schema data models (a minimal Type 2 sketch follows this list).
• Experience working in Agile environments.
• Worked very closely with the production support team to close numerous production tickets, including SEV-1 incidents.
• Very good experience with the software development life cycle.
• Attended multiple trainings on data warehousing tools and database concepts; able to grasp new tools and technologies quickly.
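The slowly-changing-dimensions bullet above refers to the Type 2 pattern: history is preserved by expiring the old row and inserting a new version. Below is a minimal, self-contained Python sketch of that pattern, assuming in-memory rows; the column names (cust_id, city, the effective dates, the current flag) are illustrative, not taken from any project described here.

```python
from datetime import date

HIGH_DATE = date(9999, 12, 31)  # open-ended "effective to" for current rows

def scd2_merge(dimension, incoming, load_date):
    """Type 2 merge: expire changed rows, append new versions, add new keys."""
    current = {r["cust_id"]: r for r in dimension if r["is_current"]}
    for rec in incoming:
        old = current.get(rec["cust_id"])
        if old is None:
            # Brand-new business key: open its first version.
            dimension.append({**rec, "eff_from": load_date,
                              "eff_to": HIGH_DATE, "is_current": True})
        elif old["city"] != rec["city"]:
            # A tracked attribute changed: close the old version, open a new one.
            old["eff_to"] = load_date
            old["is_current"] = False
            dimension.append({**rec, "eff_from": load_date,
                              "eff_to": HIGH_DATE, "is_current": True})
    return dimension

dim = [{"cust_id": 1, "city": "Austin", "eff_from": date(2020, 1, 1),
        "eff_to": HIGH_DATE, "is_current": True}]
dim = scd2_merge(dim, [{"cust_id": 1, "city": "Dallas"}], date(2021, 6, 1))
for row in dim:
    print(row)  # the expired Austin version, then the current Dallas version
```

In a warehouse the same logic runs as a set-based UPDATE plus INSERT against the dimension table; the in-memory version just makes the expire-and-append mechanics visible.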
Technical Skills
Languages : Python, Spark, UNIX Shell Scripting, NZSQL, PL/SQL
Technologies : Syncsort, ETL – Datastage (7.5 and 8.1)
Database : Netezza, Oracle
Other Tools : Rational Rose Professional
Version Manager Tools : TortoiseSVN, Visual Source Safe, PVCS
Operating Systems : Windows 2000 Professional / Server and Windows XP
Domain skills : Retail - Market Research, Banking, Healthcare
Work Experience
Jul 2015 – Till Date, Dell Inc
Designation : Technical Lead
Nov 2010 – Jul 2015, TATA Consultancy Services Ltd
Designation : Technical Lead
Dec 2008 – Nov 2010, Wipro Technologies Ltd
Designation : Sr Software Engineer
Feb 2006 – Nov 2008, Patni Computer Systems Ltd
Designation : Software Engineer
Professional Experience
Company: Dell Inc
Project Name : Data Integration
Client Name : Tenet Health
Role : Team Lead
Duration : Jul 2015 – Till Date
Technologies : Datastage 8.1, Netezza 6.2, Oracle 10g, Netezza SQL, Python & R / Shell Scripting
Description of Projects:
• Readmission risk report – predicting the probability of a patient being readmitted, using a random forest algorithm in R and shell scripting (see the sketch after this list).
• Case management and nursing reporting/dashboards – designed and built a data warehouse to host the AllScripts® and Cerner® data sources, and built a data mart for various reports on the case management and nursing applications.
• PMI (Performance Management Innovation) – design and development of application-specific data marts and integration of data sources from AllScripts, Midas, and Cerner.
• IA (Insight Analytics) – maintenance of a large reporting application sourcing data for Cognos® cubes rebuilt daily, and troubleshooting and fixing production issues arising from data problems in upstream sources.
• Design and development of a Type 2 staging data mart to host Cerner data from different hubs and perform delta processing on incremental changes and updates to the source data.
• Design and development of UDXs using IBM Netezza® C++ libraries for core DA requirements.
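The readmission-risk bullet above describes a random forest model built in R; purely as an illustration, here is a minimal Python analogue on synthetic data. The feature set and the use of scikit-learn are assumptions for the sketch, not the project's actual code.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import train_test_split

# Synthetic stand-ins for patient features (e.g. age, length of stay,
# prior admissions, diagnosis count). Purely illustrative.
rng = np.random.default_rng(0)
X = rng.normal(size=(1000, 4))
# Label: 1 = readmitted, loosely driven by the "prior admissions" column.
y = (X[:, 2] + rng.normal(scale=0.5, size=1000) > 1.0).astype(int)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = RandomForestClassifier(n_estimators=200, random_state=0).fit(X_tr, y_tr)

# The report surfaces a per-patient probability of readmission.
risk = model.predict_proba(X_te)[:, 1]
print("mean predicted readmission risk:", risk.mean().round(3))
```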
Responsibilities:
• Design, data modelling, requirements analysis, and validation for application development.
• Developing applications that leverage the Netezza AMPP architecture (use of temp tables, distribution and organization keys); a sketch follows this list.
• Creating Netezza stored procedures and functions to implement application designs.
• Performance tuning and optimization of Cognos® reports and of Netezza user queries, stored procedures, and functions.
• Periodic (weekly and monthly) refreshes of data in the Netezza database that sources the downstream applications.
• Monitoring query performance as processed data volumes grow, and tuning and optimizing queries, procedures, and functions for better performance.
• Modelling Netezza tables and databases (selection of optimal data types and of distribution and organization keys).
• Assisting various teams in setting up data models that make the best use of Netezza from a performance perspective.
• Creating materialized views to speed up routine weekly and monthly query execution and report generation.
• Assessing performance gains and bottlenecks on the new platform and mitigating performance issues through effective use of zone maps, JIT stats, and table distribution.
• Analysing Netezza plan files for processing skew, large (fact) table broadcasts, large (fact) table redistributions, disk hash joins, merge joins, expression-based joins, unused zone maps, and host-based processing, and reviewing join order.
• Preparing Netezza standards and best practices and educating developers on them, especially for queries over huge data volumes.
• Analysing intermediate data skew generated by join operations in complex queries and deriving alternatives that avoid the skew and improve processing speed.
• Building user-defined functions (UDFs) to requirements from various teams (Meaningful Use, Insight Analytics, and PMI).
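As a concrete illustration of the distribution and organization key work listed above, here is a hedged sketch in Python. The table, DSN, and credentials are hypothetical; DISTRIBUTE ON chooses the hash key that spreads rows across data slices (so joins on that key stay co-located instead of being broadcast or redistributed), and ORGANIZE ON clusters the table so zone maps can skip extents when queries filter on the date.

```python
import pyodbc  # assumes a configured Netezza ODBC driver and DSN

# Hypothetical fact table: distribute on the join key, organize on the
# date column that most queries filter on.
DDL = """
CREATE TABLE sales_fact (
    store_id   INTEGER,
    period_dt  DATE,
    upc        BIGINT,
    units      INTEGER,
    dollars    NUMERIC(12,2)
)
DISTRIBUTE ON (store_id)
ORGANIZE ON (period_dt)
"""

conn = pyodbc.connect("DSN=NZSQL;UID=etl_user;PWD=secret")  # placeholder DSN
cur = conn.cursor()
cur.execute(DDL)
cur.execute("GENERATE STATISTICS ON sales_fact")  # keep optimizer stats fresh
conn.commit()
```

A skewed distribution key (for example, a low-cardinality status column) concentrates rows on a few data slices, which is exactly the processing skew the plan-file analysis above looks for.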
Company: TATA Consultancy Services Ltd
Project Name : AOD 4.5-4.6 Release – Production Governance
Client Name : The Nielsen Company
Role : Team Lead
Duration : Sep 2014 – Till Date
Technologies : UNIX Shell Scripting, Datastage 8.1, Netezza 6.2, Oracle 10g, TIBCO, Netezza SQL, SQL and PL/SQL
Description:
With this initiative, a new team was formed to govern the metadata changes made in the production environment. The leadership team wanted tighter control over smooth weekly deliveries for all Nielsen AOD clients, so the team governs changes that affect a single client or multiple clients and comprises members from the service delivery, application development, execution, and validation teams. Its main goal is to streamline the processes currently in place and to control the way they are delivered every week with the new data.
Responsibilities:
• Change request/Requirement analysis
• Impact analysis on the existing system.
• Feasibility study on the change request.
o On the type of change / delivery date / impact areas/clients
• YSA Approvals.
• Provide the necessary steps and guidance to the execution team on each change, and govern the changes happening during the week.
• Identify potential areas of improvement across the system and address them.
o Performance improvements
o Process improvements
Company: TATA Consultancy Services Ltd
Project Name : AOD 4.0 Release – Kraft on-boarding onto the Convergence platform
Client Name : The Nielsen Company
Role : Team Lead
Duration : January 2014 – September 2014
Technologies : UNIX Shell Scripting, Datastage 8.1, Netezza 6.2, Oracle 10g, TIBCO, Netezza SQL, SQL and PL/SQL
Description:
Kraft is an existing Nielsen client whose data is managed in AOD in dimensional format to support reporting for their market research on Kraft products. The Kraft application had been running on the legacy AOD platform, which is very complex and not flexible enough to accommodate easy changes at the client's (Kraft's) request. The AOD team therefore decided to move Kraft to the Converged platform developed as part of the AOD 3.2 release, which had already successfully on-boarded the existing P&G manufacturer client. The new Convergence platform brings Kraft end users a configurable environment to maintain their metadata changes, data restrictions, technical information, and user-level security.
Responsibilities:
• Requirement analysis
• High Level Design reviews
• Impact analysis on existing system
• Low level design Preparation
• Assembly testing planning
• Onsite – Offshore coordination
• Cut over planning
• Design discussions with SQA Team to help their SQA Test scenarios
• Implementation reviews
• Implementation
• Knowledge transfer to product support team
• Production support in warranty period
Company: TATA Consultancy Services Ltd
Project Name : AOD 3.2 Release – Convergence Platform – P&G on-boarding
Client Name : The Nielsen Company
Role : Team Lead
Duration : November 2012 – December 2013
Technologies : UNIX Shell Scripting, Datastage 8.1, Netezza 5.0, Oracle 10g,
TIBCO, Netezza SQL, SQL and PL/SQL
Description:
As part of the AOD 3.2 release, we developed a Converged platform for all manufacturer clients to streamline the dimension build and fact sourcing processes. P&G was an existing Nielsen client on the legacy AOD system, holding various reports, facts, and a huge body of metadata covering numerous products, stores, markets, and facts. Whenever new product, market, or store information had to be loaded into the system, manual intervention was required, including a code deployment and release. To streamline these basic configurations, a new system called MSM (Metadata Service Management) was created to handle all configuration-related information, enabling end users to manage their own metadata. The system has since been extended to other manufacturers such as Mars, Tetra Pak, and MillerCoors.
Responsibilities:
• Requirement analysis
• Impact analysis on existing system
• Low level design preparation
• Assembly testing planning
• Onsite – Offshore coordination
• Cut over planning
• Design discussions with SQA Team to help their SQA Test case scenarios
• Implementation reviews
• Implementation / Coding
• Unit Testing / ASM Testing
• Knowledge transfer to product support team
• Production support in warranty period
Company: TATA Consultancy Services Ltd
Project Name : AOD 2.8 Release – Loyalty Platform - Safeway
Client Name : The Nielsen Company
Role : Sr Developer / Technical Lead
Duration : Nov 2010 – Nov 2012
Technologies : UNIX Shell Scripting, Datastage 8.1, Netezza 4.5, Oracle 10g, TIBCO, Netezza SQL, SQL and PL/SQL
Description:
The Nielsen Company introduced the Loyalty program in AOD (Answers on Demand) and
Safeway retailer was the first client on-board. As part of loyalty program, we have
introduced the Snowflake schema model into AOD. We have built the Product, Store,
Period, Household and Identifying Card dimensions as base dimensions and Period totals,
Product totals and Segment dimensions as Sub dimensions; all these surrounding the FACT
dimension. Based on the above dimensions, Reports/Facts created at household/ Loyalty
card level and Basket/Product segment level. These Safeway reports were created to report
18 different facts. We have provided 18 facts reports to Safeway to analyse their data at
basket, product, and store and household/identity level.
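To make the snowflake layout concrete, here is the shape a household-level report query might take over such a model. Every table and column name here is hypothetical, and the SQL is shown in a plain Python string rather than taken from the project.

```python
# Illustrative household-level report over a snowflake model: the fact table
# joins to base dimensions, with a segment sub-dimension one hop further out.
HOUSEHOLD_REPORT_SQL = """
SELECT hh.household_id,
       per.period_desc,
       SUM(f.dollars)              AS total_dollars,
       COUNT(DISTINCT f.basket_id) AS baskets
FROM   loyalty_fact  f
JOIN   household_dim hh  ON hh.household_key = f.household_key
JOIN   period_dim    per ON per.period_key   = f.period_key
JOIN   product_dim   prd ON prd.product_key  = f.product_key
JOIN   segment_dim   seg ON seg.segment_key  = prd.segment_key
WHERE  seg.segment_name = 'DAIRY'
GROUP  BY hh.household_id, per.period_desc
"""
print(HOUSEHOLD_REPORT_SQL)
```

The extra hop from product_dim to segment_dim is what distinguishes the snowflake from a star: sub-dimensions stay normalized at the cost of one additional join.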
Responsibilities:
• Requirement analysis
• Impact analysis on existing system
• Low level design preparation
• Assembly testing planning
• Implementation reviews
• Implementation / Coding
• Unit Testing / ASM Testing
• Knowledge transfer to product support team
• Production support in warranty period
• Onsite – Offshore coordination
Company: Wipro Technologies Ltd.
Project Name : Data Warehouse (auth)
Client Name : Mastercard Inc.
Role : Senior Developer
Duration : Dec 2008 – Nov 2010
Technology : UNIX Shell Scripting, Netezza, Syncsort, SQL, PL/SQL, Oracle 9i
Description:
This is the credit authorization (auth) project, one of the auth, debit, and clearance areas of the credit card industry. The authorization data warehouse provides the authorization details of all financial transactions that pass through the MasterCard network. It consists of batches that process and store data at the detail/transaction level as well as summarized and aggregated data, forming a single integrated source of authorization data to support decision making for operational, tactical, and strategic business processes.
Responsibilities
• Senior Developer
• Module & Impact Analysis
• Module Specification and Documentation
• Coding
• Unit Testing
• Implementation
• Post implementation support
Company: Patni Computer Systems Ltd.
Project Name : KRONOS
Client Name : GE Aviation.
Role : Developer
Duration : Jan 2007 – Oct 2008
Technology : Oracle PL/SQL, Kronos Workforce Connect, Unix Shell Scripting
Tool : Kronos Workforce Connect
Description:
KRONOS is a workforce management tool in its own right that helps organizations of all sizes and industries better manage their workforce in the cloud. The project involves enhancements to the Kronos application configured for various sites at GE locations. Kronos is a time and attendance system that we implement for GE employees using the Workforce Central Suite tool and configuration within the system application.
Responsibilities
• Module Analysis
• Module Specification and Documentation
• Coding
• Unit Testing
• Implementation
• Production System Implementation
• Post implementation support
Company: Patni Computer Systems Ltd.
Project Name : Middleware
Client Name : GE Money
Role : Developer
Duration : May 2006 – Nov 2007
Technology : Mercator Ascential Datastage, Unix Shell Scripting, FTD
Description:
This project deals with the Private Label Credit Card (PLCC) for GE Money. All transactions made with a PLCC of GE & Alice follow the regular credit card industry process: the online transaction is captured and processed. Middleware is a real-time application that acts as a bridge between various front ends and the backend, inside and outside the GE network. Any data request to the backend is provided to Middleware in the form of packets. My contribution to this project was maintaining, enhancing, and supporting the existing applications. The scope of work included analysis, new development, enhancements, unit testing, integration testing, and review and deployment of the applications (packets), as well as supporting the various front ends during integration testing. In this project, I worked on both the build and run aspects: the project runs in a real-time environment, where MW Build deals with the real-time transaction details and MW Run deals with batch processing and support.
Responsibilities
• Technical Member / Group Lead
• Module Analysis
• Module Specification and Documentation
• Coding
• Unit Testing
• Implementation
• Production System Implementation
• Post implementation support
Academics
Master of Computer Applications, Anna University, Tamil Nadu, India, 2005