Kent E. Schweitzer has over 20 years of experience as an Oracle developer, database administrator, and team leader/manager. He has extensive experience with Oracle database administration, PL/SQL development, data warehousing, ETL processes, and automation of batch jobs. His background includes developing and supporting large data warehouse and reporting applications, performance tuning, and managing teams. He is currently a Vice President of Enterprise Data and Analytics at Wells Fargo, where he has worked on several projects involving data integration, reporting, and analytics.
Rajeev Kumar, Apache Spark & Scala Developer
Rajeev Kumar is an experienced Apache Spark and Scala developer based in Amsterdam, NL. He has over 8 years of experience working with big data technologies like Apache Spark, Scala, Java, Hadoop, and data integration tools. He is proficient in processing large structured and unstructured datasets to identify patterns and gain insights. His experience includes designing and developing Spark applications using Scala, ETL processes, data warehousing, and working with technologies like Hive, HDFS, MapReduce, Sqoop, Kafka and more.
This document provides a summary of Maharshi Amin's professional experience and technical skills. He has over 10 years of experience developing software applications for the financial industry using technologies like Perl, Java, Sybase, and Informatica. His experience includes roles supporting trading, risk management, and regulatory reporting systems. He has strong skills in database design, application development, performance tuning, and leading development teams.
Shipra Jaiswal has over 6 years of experience in data warehousing and business intelligence solutions using tools like Informatica and Teradata. She has worked on ETL projects in various domains including healthcare, banking, e-commerce, and aviation. Her responsibilities have included requirements gathering, data modeling, mapping design, development, testing, implementation, and support.
Asadullah Khan has over 16 years of experience in database development, ETL processes, SQL, and Oracle. He has strong skills in data analysis, data mapping, and extracting data from various sources for loading into target databases. Some of his key projects include merging large datasets containing over 500 million parts and automating legacy ETL tools. He is proficient in SQL, PL/SQL, Oracle, SQL Server and has experience leading teams and managing projects.
This document contains the resume of Ramesh, who has over 5 years of experience in SAP EIM technologies like BODS, BODI, Information Steward, and HANA. He has extensive experience designing and implementing ETL processes to extract, transform, and load data from various source systems into data warehouses. Ramesh has worked on multiple projects involving SAP data integration, data governance, and BI reporting.
This document contains the resume of Ramesh, who has over 5 years of experience in SAP EIM technologies including SAP Data Services, SAP Information Steward, and SAP HANA. He has extensive experience designing and implementing ETL processes, data integration projects, data governance solutions, and reporting dashboards. Ramesh has worked on multiple projects for clients in various industries.
This document provides a summary of an individual's skills and experience in data migration projects. The key points are:
1. The individual has over 7 years of experience in data extraction, transformation and loading using tools like SAP BODS, IBM Websphere Datastage, and SQL.
2. They have worked on international projects for clients in Germany, the USA, Canada, the UK, and Australia, and have experience with technologies such as SQL Server, Oracle, SAP BO, and SAP BODS.
3. Their roles include designing ETL processes, mapping data, implementing and optimizing extract-transform-load processes, and understanding business and technical requirements for data migration projects.
The document discusses new features in SQL Server 2008 that improve data storage, analytics, performance, scalability, high availability, security, and manageability. Key highlights include:
- Storing and querying multiple data types like relational, documents, XML, and spatial data more efficiently
- Enhancements for analytics, reporting, and mixed queries using features like column sets and sparse columns
- Increased scalability through features such as resource governor, memory management improvements, and query optimization
- High availability options like database mirroring, failover clustering, and replication
- Security enhancements including encryption, auditing, and reduced attack surfaces
- Simplified administration using tools such as SQL Server Management Studio
Mukhtar Ahmed has over 8 years of experience in data warehousing and ETL projects. He has extensive experience designing, developing, and supporting complex ETL processes involving large datasets up to 100 terabytes. He is specialized in IBM InfoSphere Datastage and Teradata tools and has worked on multiple banking, insurance, and healthcare projects.
This resume is for Basu K S, an SAP BODS ETL Developer with over 3 years of experience developing and maintaining data warehouses and performing data migration. He has extensive experience using tools like SAP BODS, Information Steward, SQL Server, and writing SQL stored procedures. Some of his responsibilities include providing ETL designs, developing ETL jobs, performing data cleansing, testing, and supporting production loads. He is currently working at Utopia and has previously worked at Mindtree.
The document describes an Ab Initio developer with over 8 years of experience developing ETL applications using Ab Initio. Some key responsibilities included designing ETL processes to extract data from various sources like DB2, Oracle, SQL Server, load it into staging tables, transform the data using Ab Initio components, and load it into data warehouses and datamarts. The developer has extensive experience building complex graphs with various Ab Initio components, implementing data parallelism, and automating processes with shell scripts. Programming skills include SQL, Perl, and experience with databases like Oracle, SQL Server, and Teradata.
Carl Hathaway is a highly skilled Senior Oracle Database Analyst Developer with over 18 years of experience working with Oracle technologies such as PL/SQL, SQL, and data warehousing. He is seeking a new senior permanent position where he can utilize his strong analysis, design, problem-solving, and communication skills. He has a proven track record of optimizing systems, reducing processing times, and leading technical teams.
Rajarao Marisa has over 4 years of experience in data warehousing and ETL using Informatica. He has worked on multiple projects extracting data from sources like SAP, Salesforce, and flat files and loading it into databases including Teradata, Oracle, and SQL Server. Rajarao is proficient in Informatica PowerCenter and has experience with transformations, mappings, performance tuning, and testing. He currently works as a senior software engineer for Tech Mahindra.
S. Prabhu is a highly experienced Oracle Database Administrator with over 13 years of experience installing, configuring, maintaining and tuning Oracle databases from versions 8i through 11g on various operating systems. He has extensive expertise in areas such as Oracle RAC, Data Guard, GoldenGate, ASM, backup strategies using RMAN, performance tuning using AWR/ADDM and SQL tuning. Prabhu has worked with global clients in both Fortune 500 companies and service providers on large, complex databases and holds an MCA from Madras University.
Mopuru Babu has over 9 years of experience in software development using Java technologies and 3 years of experience in Hadoop development. He has extensive experience designing, developing, and deploying multi-tier and enterprise-level distributed applications. He has expertise in technologies like Hadoop, Hive, Pig, Spark, and frameworks like Spring and Struts. He has worked on both small and large projects for clients in various industries.
Navneet Tiwari is a highly skilled SQL Server developer with over 9 years of experience in database technologies including SQL Server 2012/2008R2/2008/2005, SSIS, and SSRS. He has extensive experience designing and developing ETL processes, writing complex T-SQL queries, performance tuning, and report creation. Some of his roles include senior SQL BI developer/lead and SQL BI developer/DBA. He is proficient in a variety of technologies including .NET, JavaScript, XML, and Oracle.
Mukhtar Ahmed has over 8 years of experience in data warehousing and ETL projects. He has designed, developed, deployed and supported large scale ETL processes involving sources over 100 terabytes. He is specialized in IBM InfoSphere Datastage and Teradata utilities. He has worked on multiple industries including healthcare, banking and insurance.
The document provides an introduction to Microsoft Business Intelligence (MSBI). It discusses how MSBI addresses the needs of users by integrating data across networks, providing summarized and historical data to help understand organizational health, and enabling 'what-if' analysis. It describes the MSBI architecture and how it uses SQL Server Integration Services, SQL Server Analysis Services, and SQL Server Reporting Services to move data between sources and destinations, perform online analytical processing to build cubes for analysis, and deliver reports, respectively. The document also compares MSBI to other BI tools and argues it provides the most reliable solution at the lowest total cost.
The document outlines the agenda for a presentation on new features in SQL Server 2008. It will cover enhancements and new capabilities in T-SQL, SQL Server Management Studio, the SQL Database Engine, SQL Reporting Services, SQL Server Integration Services, and SQL Server Analysis Services. Demonstrations will be provided for several of the new features.
This document provides a summary of Samuel Bayeta's qualifications and experience. It outlines his expertise in SQL Server, BI development, data modeling, ETL processes, and report design. It also lists his educational background in electrical engineering and over 5 years of experience as a SQL Server/BI developer for several companies, demonstrating skills in SQL, SSIS, SSRS, and data warehousing.
Kishore Chaganti provides information on Tableau products, architecture, licensing, performance, and optimization. The document discusses Tableau Desktop, Server, architecture which includes gateways, application servers, VizQL servers, data servers, background processes, data engines, repositories, and search. It also covers licensing considerations for single node, 3 node, and 5 node topologies as well as guidelines for optimizing query performance.
Tony Reid has over 30 years of experience as a software engineer and analyst with expertise in C#, ASP, Visual Basic, databases like Oracle and SQL Server, and methodologies including Agile and Waterfall. He has strong communication skills and experience managing both in-house and offshore development teams. Currently he is a Senior Systems Analyst at Eli Lilly where he has led projects involving data migration, application development, and technical support.
Kallesha has over 4 years of experience as an Informatica and PL/SQL developer. She has extensive experience developing mappings in Informatica to extract, transform and load data from various sources into data warehouses. She has worked on projects in various domains including storage, sales, banking, and finance. Kallesha is proficient in technologies like Informatica, Pentaho, Hive, HBase, Pig, Oracle, Teradata, and Shell scripting.
This document contains a professional summary and details for Satheesh Talluri. It outlines over 8 years of experience as an Oracle Database Administrator with skills in Oracle 11g, 10g, 9i, and other technologies. Recent roles include working as an Oracle DBA for AT&T in Dallas, Texas and managing critical online applications. Previous experience includes work as an Oracle DBA for IEEE.org and Mylan.
Pentaho Data Integration/Kettle is an open source ETL tool that has been used by the presenter for two years. It allows users to extract, transform and load data from various sources like databases, files and NoSQL stores into destinations like data warehouses. Some advantages of Kettle include its graphical user interface, large library of components, strong performance when processing large datasets, and ability to leverage Java libraries. The presenter demonstrates syncing and processing data between different sources using Kettle.
Pentaho Data Integration (Kettle) is an open-source extract, transform, load (ETL) tool. It allows users to visually design data transformations and jobs to extract data from source systems, transform it, and load it into data warehouses. Kettle includes components like Spoon for designing transformations and jobs, Pan for executing transformations, and Carte for remote execution. It supports various databases and file formats through flexible components and transformations.
This document demonstrates how to implement a delta mechanism when connecting a SAP BI system to a relational database like SQL or Oracle. It involves creating base and delta tables in the database, along with triggers to populate the delta table with new/updated records. An info package in SAP BI is configured to extract data from the delta table using a data transfer process, in order to synchronize new records from the database into the BI system while avoiding duplicate entries.
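The trigger-based delta approach above can be sketched in a few lines. The following is a minimal illustration using Python's built-in sqlite3 module in place of the SQL Server/Oracle source system; the table names (base_orders, delta_orders) and columns are hypothetical, not taken from the original document.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Base table holds the full dataset; delta table collects only new/changed rows.
cur.execute("CREATE TABLE base_orders (id INTEGER PRIMARY KEY, amount REAL)")
cur.execute("CREATE TABLE delta_orders (id INTEGER, amount REAL, change_type TEXT)")

# Triggers copy inserts and updates into the delta table, so the BI
# extractor only needs to read delta_orders to pick up changes.
cur.execute("""
    CREATE TRIGGER trg_base_insert AFTER INSERT ON base_orders
    BEGIN
        INSERT INTO delta_orders VALUES (NEW.id, NEW.amount, 'I');
    END
""")
cur.execute("""
    CREATE TRIGGER trg_base_update AFTER UPDATE ON base_orders
    BEGIN
        INSERT INTO delta_orders VALUES (NEW.id, NEW.amount, 'U');
    END
""")

cur.execute("INSERT INTO base_orders VALUES (1, 100.0)")
cur.execute("UPDATE base_orders SET amount = 150.0 WHERE id = 1")

# The extraction step reads the delta table, then clears it so the next
# load sees only records that changed since this run.
delta_rows = cur.execute("SELECT * FROM delta_orders").fetchall()
cur.execute("DELETE FROM delta_orders")
conn.commit()
print(delta_rows)  # [(1, 100.0, 'I'), (1, 150.0, 'U')]
```

In the SAP BI scenario the "extraction step" would be the info package's data transfer process reading the delta table, rather than an application-side query; clearing (or flagging) extracted rows is what prevents duplicate entries on subsequent loads.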
Shrikantha DM is a senior software engineer with over 3.5 years of experience developing applications using Oracle PL/SQL. He has skills in application design, development, testing and implementation using Oracle databases, PL/SQL and SQL. His experience includes developing applications for tasks such as URL filtering, managing IP addresses and domain servers, and organizing meetings using wireless networks.
The document provides a summary of Ashutosh Pandey's experience as an Oracle Database professional with over 10 years of experience. He has extensive skills in database administration, performance tuning, high availability solutions, database backups and recovery, and data replication technologies. Some of the key projects listed in his experience include Oracle 11g RAC implementations, database upgrades, data migrations involving large databases, and GoldenGate setups for active-active replication.
Veera Narayanaswamy is an Oracle PL/SQL developer with over 5 years of experience developing business applications using Oracle databases. He has expertise in all phases of development including analysis, design, development, testing and maintenance. He is proficient in SQL, PL/SQL, Unix scripting and tools like SQL Developer and Toad. Some of his project experience includes migrating legacy data to a new Oracle application for Kaiser Permanente, developing reports, procedures and packages to support daily data feeds for a pension fund client, and creating forms, reports and database objects for an HR application for AT&T. He is a skilled full-stack developer who enjoys working on diverse Oracle projects.
This document contains a professional profile summary for Pranabesh Ghosh. It outlines his work experience as a technology consultant, including strengths like client interfacing and communication skills. It also lists technical skills and certifications. Several work experience summaries are then provided detailing roles on projects for clients like HP, Ahold, Aurora Energy, CVS Caremark, and Franklin Templeton, demonstrating experience leading teams and designing/developing ETL solutions.
Raymond Wu is a senior Oracle PL/SQL developer with over 8 years of experience developing applications for various clients in industries such as investment banking, retail, and insurance. He has extensive experience with Oracle databases from 8i to 12c, data migration, ETL processes, shell scripting, and performance tuning. Raymond is proficient in PL/SQL, SQL, and tools such as Toad and has a track record of successfully managing database projects.
Larry Cross is a senior Oracle developer with over 21 years of experience working with Oracle databases, PL/SQL, Pro*C, SQL*Loader, and other Oracle technologies. He has extensive experience designing, developing, and supporting retail merchandising systems for companies like Hess, Speedway, and Performance Retail. His background demonstrates a strong ability to solve problems, meet user needs, and improve systems to increase efficiency.
Aden Bahdon has over 15 years of experience as an Oracle developer and database administrator, specializing in designing and implementing data warehouse solutions. He has extensive experience working on projects for clients such as IBM Canada, Bell Canada, and the Department of National Defence, where he developed databases, ETL processes, and reports. His skills include Oracle, SQL, PL/SQL, Java, DataStage, MicroStrategy, and he has experience in all phases of the software development lifecycle.
This document summarizes the experience of Jacob Keecheril as a Senior .NET Developer with over 24 years of experience. He has extensive experience designing, developing, and testing client-server, web, and N-tier applications. His technical skills include VB.NET, C#, ASP.NET, SQL Server, and Oracle. He has worked on projects involving compliance monitoring systems, environmental reporting systems, and human resources systems for Pennsylvania state agencies.
R. Ananth Ram Dikshit is an Oracle PL/SQL Developer with over 10 years of experience developing business applications using Oracle technologies. He has extensive experience in all phases of the SDLC, from analysis and design to development, testing, implementation and maintenance. He is proficient in Oracle databases, PL/SQL, SQL, and tools like TOAD. He has worked on several projects for clients in various domains and industries.
This document is a resume for Mark D. Andrews that summarizes his skills and experience as a data architect. He has over 25 years of experience in database design, development, and optimization using technologies like Java, SQL, Oracle, PostgreSQL, and Python. His most recent role was as a data architect at Advanti Solutions where he developed data ingestion components in Python and REST APIs to access data. Prior to that, he held roles at Thomson Reuters and BIOSIS where he performed tasks like database migrations, performance tuning, and developing applications and reporting systems that utilized Oracle databases.
Munir Muhammad has over 15 years of experience as an Oracle PL/SQL and ETL developer. He currently works as a senior database developer at JP Morgan Chase, where he supports various projects involving Oracle E-Business Suite. Previously, he held roles as a senior data warehouse developer at Barclays Bank and as a senior Oracle developer for the Department of Correction in Delaware. He has extensive experience with Oracle databases, PL/SQL, ETL processes, data modeling, and report development.
This document is a resume for Sujit Kumar Jha, an Oracle Certified Professional (OCP) and Tuning expert with 13 years of experience in PL/SQL development. He has extensive experience designing, developing and implementing solutions for clients in finance, telecom and insurance. Some of his key skills include Oracle SQL, PL/SQL, Unix shell scripting, data modeling, ETL processes, and working in Agile methodologies. He has worked as a lead developer on numerous projects involving building databases, ETL code, reports and batch processes to meet business requirements.
Milan Jain is an experienced IT professional with over 6 years of experience in Oracle database administration, performance tuning, PL/SQL, and Golden Gate. He currently works as an Oracle DBA/Developer at CVS Health in Woonsocket, RI where he performs tasks like data migration, backup and restoration, patch application, query tuning, and GoldenGate script development. Previously, he worked as an Oracle DBA/Developer at GE in Mumbai where he was responsible for database creation, monitoring, performance issue resolution, statistics analysis, and query optimization. Milan holds a B.Tech in IT and certifications in Oracle OCA 11g, Six Sigma Green Belt, and other technical training.
Ramesh Gurajana is an Oracle DBA with over 3.8 years of experience working with Oracle databases from 10g to 12c. He currently works as an Oracle DBA for SBIICM in Hyderabad, India. Some of his key skills and responsibilities include Oracle database installation, configuration, backup/recovery, performance tuning, query optimization, and high availability solutions like Oracle RAC and Data Guard. He has worked on several projects for banks and telecom companies involving large Oracle databases and e-learning portals.
The document provides a summary of Rama Prasad Owk's professional experience as an ETL/Hadoop Developer. It outlines over 6 years of experience in areas such as: designing, developing and implementing ETL jobs using IBM Data Stage; working with various Hadoop technologies such as MapReduce, Pig, Hive, HBase, Sqoop, and Spark; troubleshooting errors; importing and exporting data between HDFS and databases; and writing Spark and Python applications for Hadoop. Specific experiences are listed from roles at clients such as United Services Automobile Association, Walmart, and General Motors where responsibilities included developing ETL jobs, Hadoop programs, and SQL queries to meet business requirements.
Sivakumar has over 9 years of experience in data warehousing and ETL development using tools like Informatica and Teradata. He has extensive experience designing and developing ETL processes, performing testing, and collaborating with other teams on data migration projects for clients in various industries.
Sivakumar has over 9 years of experience in data warehousing and ETL development using tools like Informatica and Teradata. He has extensive experience designing and developing ETL processes for data migration, analytics projects for clients in various industries. His roles have included requirement analysis, mapping design, testing, performance tuning and managing project timelines.
Sandhya Rani Kallempudi has over 4 years of experience as an Oracle Database Administrator. She has experience installing, administering, upgrading, migrating, and patching Oracle databases. She also has experience with Oracle RAC, ASM, Data Guard, Golden Gate, and performance tuning. She has worked on both development and production Oracle databases.
The document provides a summary of an ETL Developer's skills and experience. The developer has 3 years of experience using IBM InfoSphere Datastage for ETL projects involving data extraction, transformation, and loading. Responsibilities include developing and debugging ETL jobs, testing and tuning performance, implementing changes, and working with databases like Oracle. The developer has worked on risk data warehousing and order tracking projects, developing jobs to move data between systems and load enterprise data warehouses.
Vineet Kurrewar is an Oracle Database Administrator with over 8 years of experience managing Oracle databases. He currently works at Brocade Communications in Bangalore as a DBA, where he focuses on performance tuning, data warehousing systems, and monitoring databases. Previously he worked at Fidelity Investments and Oracle Financials, among other companies, gaining experience with technologies like Oracle, SQL, RAC clusters, and database replication tools. He holds an MS in Systems Engineering and certifications in Oracle 10g.
Kent E. Schweitzer
704-591-1035
kstegacay@icloud.com
Objective: A challenging position with opportunities to use and expand my technical and leadership skills as a member of
an Application Development team.
Technology:
Oracle database 9i, 10g, 11g, RAC, pl/sql, dml, ddl, bulk operations, table and index compression,
dynamic sql, statistics gathering, regular expressions, analytic functions, partitioning, subpartitioning,
cursors, ref cursors, sqlldr, external tables, import/export, datapump, remote queries & database links,
indexes (b-tree, bitmap, function-based), parallel query, hints, savepoints, collections, dbms packages,
views, materialized views, roles, performance tuning (explain plan, sql tracing / tkprof), xml, system &
catalog views, dbms_job. OFSA 4.5 experience with rules, batch_ids, hierarchies, tables (ledger_stat,
idt_rollup, idt_batch, idt_sql and others).
Perl scripting for file processing, both reading and writing, and command execution like sqlplus.
Created dynamic sql script that prompts and outputs perl code to automatically generate a perl script
during run-time based on record type volume. Used arrays, hashes, regular expressions and capture
variables.
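The generate-then-run pattern described above can be sketched as follows. This is a
minimal shell stand-in for the original sql/perl pair, not the original script; the
input layout (record type code in field 1, '|' delimiter) and all file names are invented.

```shell
#!/bin/sh
# Sketch of runtime code generation: inspect record-type volume in an
# input file, emit a per-type split script, then execute it.
set -eu

# Demo input standing in for the real multi-record-type data file
cat > records.dat <<'EOF'
A|first a row
B|a b row
A|second a row
EOF

gen=split_records.sh
{
  echo '#!/bin/sh'
  echo 'set -eu'
  # Generate one extraction command per record type actually present
  cut -d'|' -f1 records.dat | sort -u | while read -r rtype; do
    echo "grep '^${rtype}|' records.dat > '${rtype}.dat'"
  done
} > "$gen"

chmod +x "$gen"
./"$gen"
wc -l A.dat B.dat
```

The generated script only contains commands for record types that exist in this run's
input, which is the point of regenerating it on every execution.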
Korn shell scripting to execute os-level tasks called from autosys jobs.
Autosys (versions 4.5, r11.3) to automate and schedule batch jobs. Used box jobs, command jobs,
file watchers, job dependencies, exit code dependencies, calendars, date and time dependencies,
sendevent and autorep commands, and load balancing.
Teradata BTEQ (minor usage)
Visual Basic 4.0, 5.0, Winbatch
COBOL, JCL, IMS, TSO, MVS and Easytrieve
OS: linux, hp/ux, solaris and aix.
Subversion and Harvest code repositories.
Experience: Oracle Developer & Application DBA, Team Leader / Manager, Vice President
Enterprise Data and Analytics
Wells Fargo, Charlotte NC
2010 – Present
Business Direct Originations Datamart project
Managed small team of employees and contractors during project lifecycle.
Design, development, testing and production installation lead for new 3+ TB datamart for originations
data. Performed the majority of both development and leadership tasks throughout the duration of the
project.
Loaded 500+ tables from data files and through database links.
Implemented concurrent processing tasks through autosys job streams.
Pl/sql ETLs perform complete refresh or maintain history of changes in partitioned tables.
Created subpartitioned staging tables (date, list) and partitioned (list) production tables.
Used perl to split large data file with multiple file types into individual files for loading. The perl script
is dynamically (re)created from a sql script every execution based on record type volume. The sql
script outputs the perl code. Other perl code creates files with dml commands that are then used in
sqlplus calls to update database tables.
Used html/css to format batch load email reports that show record counts, errors and timing of a
particular load.
Wrote Teradata BTEQ scripts to create and load Teradata tables from flat files, execute queries and
export results into a flat file to load into Oracle through sqlldr. Used openssl to encrypt the Teradata
password to prevent storing a clear-text password on the Oracle database server.
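The openssl-based password protection can be sketched roughly as below. This is an
assumed setup (key file name, AES-256-CBC cipher, modern openssl with -pbkdf2), not
the original configuration.

```shell
#!/bin/sh
# Sketch of keeping a database password off disk in clear text:
# encrypt once with openssl, decrypt only at job run time.
set -eu

umask 077                       # restrict key and ciphertext to owner
openssl rand -out td.key 32     # local key file, guarded by fs perms

# One-time: encrypt the password (demo value) into a ciphertext file
printf '%s' 'S3cretPw' |
  openssl enc -aes-256-cbc -pbkdf2 -salt \
    -pass file:td.key -out td_pw.enc

# At run time: decrypt into a variable, never into a clear-text file
TD_PW=$(openssl enc -d -aes-256-cbc -pbkdf2 \
    -pass file:td.key -in td_pw.enc)
echo "decrypted length: ${#TD_PW}"
```

The key file itself still needs filesystem protection; the gain is that the password
no longer sits readable in job scripts or parameter files.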
Business Direct BDAPPMGR project
Designed, developed and installed a custom oracle and korn shell application to handle job execution
by providing a metadata table in oracle for commands and 20+ generic korn shell functions that
accept arguments for use by any automated job.
Autosys jobs call the BDAPPMGR korn shell script, passing variables that BDAPPMGR uses to query an
oracle table for the commands to process and in what order.
BDAPPMGR korn script manages the execution of the job with options to override return codes and
skip commands.
BDAPPMGR oracle functions update execution statistics in oracle tables for run-time values and
batch process reporting.
All new applications are written to BDAPPMGR standards and older applications are being modified
or re-written to use BDAPPMGR.
BDAPPMGR allows the development and support team to automate jobs in a consistent, known and
trustworthy process.
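The metadata-driven execution described above can be sketched in miniature. A flat file
stands in for the oracle metadata table here, and the step format and return-code
override scheme are assumptions, not the BDAPPMGR internals.

```shell
#!/bin/sh
# Minimal sketch of a metadata-driven job runner in the BDAPPMGR
# spirit: ordered commands come from a metadata source, and non-zero
# return codes can be declared acceptable per step.
set -u

# seq|command|ok_codes (comma-separated acceptable return codes)
cat > job_meta.txt <<'EOF'
10|echo extract step|0
20|false|0,1
30|echo load step|0
EOF

sort -t'|' -n job_meta.txt | while IFS='|' read -r seq cmd ok; do
  sh -c "$cmd"
  rc=$?
  case ",$ok," in
    *",$rc,"*) echo "step $seq rc=$rc (accepted)" ;;
    *)         echo "step $seq rc=$rc (fatal)"; exit "$rc" ;;
  esac
done > runner.log
cat runner.log
```

Step 20 deliberately fails but is accepted because its metadata lists 1 as a valid
return code, which is the override behavior the text describes.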
Business Direct LCRE project
Provide support and new enhancements for application that retrieves and relates accounts for a
custom collections platform.
Automate business and risk models based on aggregations of data and application of rules.
Business Direct BDD_RPR project
Provide support and new enhancements for a star-schema reporting application.
Update dimension, fact and aggregate tables based on requirements.
Oracle Developer & Application DBA
Home Lending
Wells Fargo, Charlotte NC
2009 – 2010
CORE project
New development and support of datamart and reporting applications for the CORE application.
Wrote xml within pl/sql procedures to parse and load data from various data sources.
Wrote perl scripts to automate autosys reporting.
Wrote pl/sql packages to load data and track it through the application for audit requirements.
Performed performance tuning for queries in ETLs and queries used by other reporting teams.
Oracle Developer / Application DBA
Corporate Finance
Wachovia, Charlotte NC
2000 – 2009
Reconciliation project
Wrote oracle packages to migrate reconciled data from processing tables to compressed partitioned
historical tables.
Involved tables with millions of rows and numerous indexes.
Code made extensive use of dbms_job to rebuild indexes in parallel to reduce overall execution time.
Code was created with dynamic sql with calls to dictionary views to make the code generic and
flexible.
Perl scripts were written to reformat complex data files prior to loading them through SQL*Loader.
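Pre-load reformatting of this kind can be sketched as follows (awk here rather than
the original perl; the input layout with header, detail and trailer records is an
invented example, not the project's file format):

```shell
#!/bin/sh
# Sketch of reshaping a raw feed into clean delimited records that
# SQL*Loader can ingest directly.
set -eu

cat > raw_feed.txt <<'EOF'
HDR20240101
DTL 0001 Smith   100.50
DTL 0002 Jones    75.25
TRL0000002
EOF

# Keep detail rows only, squeeze whitespace, emit comma-delimited
awk '$1 == "DTL" { print $2 "," $3 "," $4 }' raw_feed.txt > load_me.csv
cat load_me.csv
```

Header and trailer records are dropped so the control file can describe one simple,
uniform record layout.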
IT Chargeback project
Designer and lead database developer on unit costing application.
The application received data in files and through database links; the data was processed and
stored in partitioned and subpartitioned tables through pl/sql packages.
External tables were used to load data files.
Cost calculations were performed and stored in subpartitioned tables (range then list).
User reports were created through korn shell scripts generating HTML around sql plus queries.
Autosys jobs utilized load-balancing to further improve performance.
Used autosys global variables to assist application scheduling at the autosys layer.
This application was written to replace an older mainframe application that was supported by several
business partners and took days to process. The new application required one business partner and
just a few hours to complete.
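The report generation pattern mentioned above (shell-generated HTML around query
output) can be sketched like this. A flat file stands in for `sqlplus -s` output, and
the column layout is an assumption.

```shell
#!/bin/sh
# Sketch of wrapping tabular query output in an HTML table for
# emailed batch reports.
set -eu

# Demo data standing in for delimited sqlplus output
cat > counts.txt <<'EOF'
COST_CENTER|RECORDS
A100|1520
B200|987
EOF

{
  echo '<html><body><table border="1">'
  awk -F'|' 'NR==1 { printf "<tr><th>%s</th><th>%s</th></tr>\n", $1, $2; next }
             { printf "<tr><td>%s</td><td>%s</td></tr>\n", $1, $2 }' counts.txt
  echo '</table></body></html>'
} > report.html

grep -c '<tr>' report.html
```

In the real job the resulting HTML body would be handed to a mailer; only the
table-building step is sketched here.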
Wachovia Client Partnership project
Lead designer and developer on large, 8-terabyte database application involving data retrieval,
processing and reporting of current and historical data.
The database houses many partitioned tables, the largest containing over two billion records that
undergo a sizeable number of dml operations within defined timeframes.
Restatement project
Designer and lead developer on large, multiple database application that involves rewriting an
existing application to utilize partitioned tables, enabling business users to concurrently perform
restatement activities more efficiently and effectively with multiple years of data. The application is
used to house and prepare the data necessary for the Oracle Financial Services Application.
Speedy Pop (OFSA) project
Lead designer and developer on a multiple database application that summarizes tables and objects
on a detail-level database and inserts the smaller population of data and objects into a
summary-level database.
This application involves concurrently executing remote operations and a large amount of data
transfer.
This application was designed for speed and enables the business team to view summary-level
results in a matter of hours in comparison to days on the detail level.
OFSA project
Support and customization of Oracle Financial Services Application (OFSA).
Designed and developed migration package to move application objects between different
databases. This package made extensive use of triggers to validate application objects and user sql.
The package extended OFSA functionality and met all SOX requirements.
Manage database and application resources and perform tuning functions at the database and
application layers for these and other applications.
Identified and corrected inefficient parameter settings, indexes, analyze methods and
poorly-written application code to reduce runtimes from five days to two days on a detail-level
database and from seven hours to less than three on a summary-level database.
Identified incorrect parallel setting to reduce 55+ hour sql execution to less than two hours.
Identified incorrectly compressed indexes and rebuilt them appropriately, resulting in smaller
segment sizes and space savings.
Analyzed bitmap and b-tree index usage and converted several bitmap indexes to smaller,
compressed b-tree indexes.
Used knowledge of explain plan analysis to resolve issues related to poor joins, incorrect index /
table access and parallel execution.
Defined tablespace sizing requirements for tables and indexes in the above applications and
others. Perform periodic review of space allocations, object usage and provide suggestions where
necessary.
Provide application support for Disaster Recovery exercises.
Project Manager for merger integration projects.
Visual Basic and Oracle Developer
Remote Banking
1997 – 2000
First Union, Charlotte NC
Designed, developed and supported Visual Basic and Oracle application to replace existing manual
process for gathering and reporting information related to First Union's branches and Automated
Teller Machines (ATMs).
Supported Visual Basic and Oracle online and batch customer enrollment application. This
application used telephony software to gather customer data used to retrieve records from various
enterprise data sources prior to routing the call to customer service agents. This application
connected to an Oracle database and performed all types of dml and queries required by the application.
Wrote Winbatch programs to encrypt and transmit extract files to external vendor through SFTP.
Mainframe Application Programmer
Consumer Loans
1995 - 1997
First Union, Charlotte NC
Support and develop new code for legacy IMS application that supported the Consumer Loan
division. Both online and batch programs were created or modified. Also created and supported
Easytrieve programs.
Education: University of North Carolina at Charlotte
BS, MIS
1993 – 1995
Charlotte, NC
GPA of 4.0
Invitations to the Honor Societies of Phi Kappa Phi and Beta Gamma Sigma.
Central Piedmont Community College
Work-related Continuing Education
1994 – 1997
Charlotte, NC
Classes related to work experience: CICS, JCL, C and Smalltalk.
Appalachian State University
BS, Health Care Management
1987 – 1991
Boone, NC