This document outlines a data readiness methodology for migrating data from legacy systems to SAP: extracting data from source systems (or collecting it manually), transforming it in a staging area, and loading it into SAP. The methodology is intended to surface data quality issues early and to deliver consistent, predictable results for data migration.
Our Recommended Solution

Summary

The planning, execution, verification, and documentation of the migration of application data from legacy or source systems to SAP are critical components of any successful SAP project implementation. SAP requires and expects master and transactional data of high quality for the intended process integration benefits to be realized.

The data readiness strategy and methodology described below is the result of an evolutionary process developed over many SAP implementations with multiple clients in various industry verticals. This methodology is intended not only to deliver repeatable, predictable, and demonstrable results, but also to bring visibility to data quality issues early enough in the project to mitigate them.

Data Readiness is, however, one of the most overlooked aspects of an implementation project. This is partly because so much emphasis is placed on re-engineering the business processes that the quality and accuracy of data often take a lesser priority. However, based on our experience, we would suggest that many SAP implementation projects simply lack the tools and methodologies to systematically identify and perform data readiness and conversion activities and to resolve data quality issues.

Data Readiness Components

Let us first introduce the distinct components that make up a data migration landscape. As illustrated in Figure 1, our recommended methodology follows the traditional Extract, Transform, and Load (ETL) data migration component model.
[Figure 1: Data Conversion Component Overview. Data input sources (source applications; manual data collection via a data construction application, via Excel and flat files, or via direct SAP manual input) feed a central Data Staging and Transformation tool, which loads the SAP systems through export destinations such as LSMW, BDC, CATT, and custom ABAP, following the Extract, Transform, Load model.]
Data Input Sources

Data for the project implementation come from sources identified in the functional specifications. The data for loading into SAP either already exist in an electronic format or are manually captured in an approved electronic format. Import programs need to be kept as simple as possible for faster implementation and easier traceability. Import data can come from the following sources:

Source Application Data – Data from source systems are either exported into a comma-delimited text file or copied as tables when ODBC database connections are available. Data are extracted out of source applications following the principle of "all data and records," without data filtering, translation, or formatting.

Manual Data Collection – Data may be manually collected in situations where source data do not exist. Based on the complexity and referential dependency of the collected data, a data construction application can be developed to help facilitate the manual data collection and validation process. These data are subsequently provided to the central Data Staging & Transformation tool.

Manual Data Collection in Excel and Flat File – In some cases the need to collect data manually that do not exist in the source system(s) is served by MS-Excel spreadsheets or flat text files. Based on the complexity of the data that are needed, the project team develops and distributes an Excel spreadsheet application to help facilitate the manual data collection process. The data are subsequently uploaded to the central Data Staging & Transformation tool.

Manual Data Collection in SAP – In certain functional areas, the project can manually collect data for SAP directly in the SAP system where the data do not exist in source systems. It is sometimes advantageous to build SAP data directly in the SAP environment and take advantage of existing pre-defined data value tables and validation logic. The data are subsequently extracted from SAP and provided to the central Data Staging & Transformation tool.
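To make the "all data and records" principle concrete, the following is a minimal sketch in Python of a source extract. It assumes a hypothetical legacy database file and a hypothetical customers table (sqlite3 is used here as a stand-in; a real project would more likely read the legacy system over an ODBC connection), and it writes every row and column to a comma-delimited file with no filtering, translation, or formatting.

```python
import csv
import sqlite3

def extract_table(conn, table, out_path):
    """Export every row and column of a source table to CSV,
    with no filtering, translation, or formatting."""
    cur = conn.execute(f"SELECT * FROM {table}")
    headers = [col[0] for col in cur.description]
    with open(out_path, "w", newline="") as f:
        writer = csv.writer(f)
        writer.writerow(headers)
        row_count = 0
        for row in cur:
            writer.writerow(row)
            row_count += 1
    return row_count

# Hypothetical legacy 'customers' table in a local database file.
conn = sqlite3.connect("legacy.db")  # stand-in for an ODBC connection
count = extract_table(conn, "customers", "customers_extract.csv")
print(f"customers: {count} records extracted")
```

Returning the record count at extraction time feeds directly into the reconciliation reporting described later in this document.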
Data Readiness Process Overview

[Figure 2: Data Readiness Process Overview. (1) Data are extracted from the source systems and (2) uploaded into the central Data Staging Application; (3) source data kickouts are reported to the Data Owners; (4) staged source data are processed into target data; (5) target data kickouts are reported to the Data Owners and the Configuration Team; (6) Data Owners supply referential and supplemental data; (7) processed target data are uploaded to the target SAP systems and (8) verified.]
Data Staging

All master and transactional data loaded into the SAP system should be staged in a central Data Staging & Transformation tool. This repository receives source data and outputs transformed target data. It contains the source data in its originally supplied form, all the rules to convert, translate, supplement, and format this data into the destination format, and the intermediate tables required for data readiness processing. The output from the central Data Staging & Transformation tool is used as the source of the data loads into SAP.

Commercial ETL tools are designed for the purpose of extracting, transforming, and loading data. These tools should be leveraged on projects where available. On projects where a commercial ETL tool is not available, native database tools such as Microsoft's DTS or Oracle's Warehouse Builder can be used as well.

Once staged in their original or approved collection format, all data are filtered, translated, and formatted in a traceable and reportable fashion via execution of individual data rules in the central Data Staging & Transformation tool. Exceptions to this rule should only be permitted for manually entered data objects.
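As an illustration of staging data in its originally supplied form, the sketch below loads an extract file into a staging table without altering any values, adding only audit columns so every record remains traceable to its source. The table layout and column names are hypothetical, and sqlite3 again stands in for whatever staging database the project uses.

```python
import csv
import sqlite3
from datetime import datetime, timezone

def stage_extract(conn, source_name, csv_path):
    """Load an extract into a staging table in its originally
    supplied form, adding only audit metadata for traceability."""
    with open(csv_path, newline="") as f:
        reader = csv.reader(f)
        headers = next(reader)
        # One staging table per source; every column stored as text "as-is".
        cols = ", ".join(f'"{h}" TEXT' for h in headers)
        table = f"stg_{source_name}"
        conn.execute(f'CREATE TABLE IF NOT EXISTS "{table}" '
                     f"({cols}, load_source TEXT, load_ts TEXT)")
        placeholders = ", ".join("?" for _ in range(len(headers) + 2))
        ts = datetime.now(timezone.utc).isoformat()
        for row in reader:
            conn.execute(f'INSERT INTO "{table}" VALUES ({placeholders})',
                         row + [source_name, ts])
    conn.commit()

conn = sqlite3.connect("staging.db")
stage_extract(conn, "customers", "customers_extract.csv")
```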
Data Export Destination Programs

Data is exported from the central Data Staging & Transformation tool into SAP via standard SAP data conversion methods and tools. Data programs must be kept as simple as possible to ensure quick development and better traceability for troubleshooting and reconciliation purposes. These conversion methods and tools are:

- LSMW – Legacy System Migration Workbench
- BDC Programs – Batch Data Communication
- CATT – Computer Aided Test Tool
- Post-Load Custom ABAP
- Post-Load Manual Input

Comprehensive Data Readiness Process

Let us now describe the steps involved in a robust and comprehensive data readiness process. The overall process is illustrated in Figure 2.

In order to ensure ongoing execution, troubleshooting, and problem resolution throughout the data conversion test cycles described in the later section "Data Readiness Approach and Methodology," the systematic data readiness process is followed for each data test run. Following is a high-level overview of the process.

Step 1: Extraction of Source Data

The conversion starts with the extraction of source data. This extraction, depending upon its source, may be a direct ODBC connection, a spreadsheet or flat file created programmatically, or a manually loaded spreadsheet. Original spreadsheets and flat files must be secured in a centralized location for audit and validation purposes. In all cases, the extract of source data must be accompanied by a report that details its contents. A Source Data Reconciliation Report should be produced for each extract and must indicate the total number of records contained in the source. Other metrics should be supplied for key data fields, such as sums, totals, or hash totals of data columns contained in the source. This information will be very important in demonstrating that the source data has been completely and accurately imported into the central Data Staging & Transformation tool.
Step 2–3: Upload, Process, and Verification of Extracted Data & Data Quality Checkpoint One
The next step in the process begins with the upload of data from source applications and manual collection repositories, in their native format, into the central Data Staging & Transformation tool. It is critical for all data to be imported into the staging tool in an "as-is" format. All source application tables and/or spreadsheet rows and columns are imported into the staging tool without any filtering or manipulation. This ensures that all data record filtering, translation, harmonization, and formatting operations are performed in the staging tool in an approved, auditable, traceable, and reportable fashion via execution of business rules at the individual source level.

Once the data has been successfully extracted into the central Data Staging & Transformation tool, the source data is modified according to data filtering rules. Data filtering refers to reducing the dataset based upon rules documented in the functional specifications and business relevancy parameters. This filtering is performed in order to ensure that only active and relevant data are loaded into SAP. Additionally, source data can now be subjected to a variety of quality and integrity checks to identify source data issues that can either be resolved in the staging tool as a transformation rule or be resolved back in the source system. Data records that do not pass key quality or integrity checks should be flagged as such, omitted from subsequent transformation and loading steps, and directed to the Data Owners for correction or clarification.

Data reconciliation activities are also performed. All results are gathered and compared to the Source Data Reconciliation Report. Results and kickouts are provided to the Data Owners for review, approval, and correction.
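The filtering-and-kickout pattern described above might be sketched as follows. Rules are expressed as named predicates (both the rules and the field names are hypothetical examples of what a functional specification would define); records failing a rule are flagged and routed to a kickout list for the Data Owners rather than silently dropped.

```python
import csv

# Hypothetical filter/quality rules from the functional specifications:
# each rule is a (name, predicate) pair; a record passing all rules
# moves on to transformation, otherwise it becomes a kickout.
RULES = [
    ("active_only", lambda r: r["status"] == "ACTIVE"),
    ("has_customer_id", lambda r: r["customer_id"].strip() != ""),
    ("valid_country", lambda r: len(r["country"]) == 2),
]

def apply_rules(records):
    """Split records into those passing all rules and flagged kickouts."""
    passed, kickouts = [], []
    for rec in records:
        failures = [name for name, ok in RULES if not ok(rec)]
        if failures:
            kickouts.append({**rec, "failed_rules": ";".join(failures)})
        else:
            passed.append(rec)
    return passed, kickouts

with open("customers_extract.csv", newline="") as f:
    passed, kickouts = apply_rules(list(csv.DictReader(f)))
print(f"{len(passed)} records staged, {len(kickouts)} kickouts for review")
```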
Step 4: Transformation of Staged Data

Once the source data has been filtered, all source data are combined into a single staged target SAP dataset, to which translation, supplementation, and formatting rules specifically designed for the target environment per the Design Specifications are applied. Data translation refers to replacing source system coding, groupings, and other source system application data characteristics with the corresponding SAP coding, groupings, and data characteristics.

Supplementation refers to supplying additional referential or required data, according to the Design Specifications, that are not available from the source data. Data formatting refers to converting the source data from its original record format to a format that can be read by the SAP data upload programs for loading into SAP. These data staging rules define the main transformation of the filtered source data into data that is coded and formatted for SAP upload purposes. All data formatting, filtering, and translation rules are based on criteria documented in the functional specifications. Data reconciliation activities are performed to verify that all required business rules defined in the functional specifications have been completely and accurately applied.
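As an illustration of translation, supplementation, and formatting, the sketch below converts one staged record into SAP-ready form using a cross-reference table. The mapping values and defaults (for example the Z-prefixed customer group codes and the default sales organization) are hypothetical placeholders for rules a real project would document in its Design Specifications; KUNNR, KDGRP, VKORG, and ERDAT are standard SAP customer master field names.

```python
# Hypothetical translation table: legacy customer type -> SAP group code.
CUSTOMER_GROUP_XREF = {"RETAIL": "Z01", "WHOLESALE": "Z02", "INTERNAL": "Z09"}

def transform(record):
    """Translate, supplement, and format one staged record for SAP upload."""
    return {
        # Formatting: pad the legacy id to a 10-character SAP-style number.
        "KUNNR": record["customer_id"].zfill(10),
        # Translation: replace legacy coding with the SAP equivalent.
        "KDGRP": CUSTOMER_GROUP_XREF[record["customer_type"]],
        # Supplementation: a required value not present in the source.
        "VKORG": "1000",  # hypothetical default sales organization
        # Formatting: ISO date reformatted to SAP's internal YYYYMMDD form.
        "ERDAT": record["created_on"].replace("-", ""),
    }

staged = {"customer_id": "4711", "customer_type": "RETAIL",
          "created_on": "2011-03-15"}
print(transform(staged))  # {'KUNNR': '0000004711', 'KDGRP': 'Z01', ...}
```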
Step 5: Data Quality Checkpoint Two

Once the data has been successfully filtered, translated, and formatted, the resulting dataset can be subjected to another set of quality and integrity checks aimed at identifying target data integrity and completeness issues. These issues can be resolved in the staging tool as a transformation rule, resolved in SAP, resolved in the data construction application, or resolved back in the source system. Data records which do not pass key quality or integrity checks should be flagged as such, omitted from subsequent loading steps, and directed to the data owners and configuration team for correction or clarification.

Data reconciliation activities are also performed from the target SAP environment perspective. All results are gathered and compared to verify that all required business rules defined in the functional specifications have been completely and accurately applied. Results and kickout reports are provided to the Data Owners and Configuration Team for review and correction.
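A second quality checkpoint of the kind described in Step 5 might look like the sketch below: transformed records are checked for required target fields and for code values that actually exist in the target configuration. The value sets and field names are hypothetical.

```python
# Hypothetical value sets taken from the target SAP configuration.
VALID_SALES_ORGS = {"1000", "2000"}
REQUIRED_FIELDS = ("KUNNR", "KDGRP", "VKORG")

def checkpoint_two(target_records):
    """Flag target-side integrity and completeness issues before load."""
    loadable, kickouts = [], []
    for rec in target_records:
        issues = [f"missing {f}" for f in REQUIRED_FIELDS if not rec.get(f)]
        if rec.get("VKORG") not in VALID_SALES_ORGS:
            issues.append("unknown sales organization")
        if issues:
            kickouts.append({**rec, "issues": "; ".join(issues)})
        else:
            loadable.append(rec)
    return loadable, kickouts

records = [{"KUNNR": "0000004711", "KDGRP": "Z01", "VKORG": "1000"},
           {"KUNNR": "0000004712", "KDGRP": "", "VKORG": "9999"}]
loadable, kickouts = checkpoint_two(records)
print(len(loadable), "loadable,", len(kickouts), "kickouts")
```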
Step 6: Data Supplementation

Following review of the target data results and kickout reports, data owners have the opportunity to inject additional data into the transformation process of staged data. Additional data refers to missing data components that are required according to functional or SAP system specifications, and to cross-reference data that maps legacy data into new SAP data per the Design Specifications. The configuration team has the opportunity to verify, validate, and correct the data values needed in the target SAP system in order to load the approved staged target data without errors.
Step 7–8: Loading of Target Data into SAP & Final Verification

Subsequent to the successful completion of the data quality checks, the translated and formatted data will be loaded into SAP via any of the mechanisms described under the "Data Export Destination Programs" section of this document and verified for accuracy and completeness. This verification will involve a combination of visual inspection and technical checks, including record counts, sums, and/or hash totals of data columns contained in the export files and SAP tables.
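The technical side of this final verification can be automated along the lines of the sketch below, which compares a record count and a column sum between the export file and the loaded target table. A generic database connection (sqlite3 here) stands in for however the project reads back the SAP-side totals; the table and column names are hypothetical.

```python
import csv
import sqlite3

def verify_load(export_csv, conn, table, sum_col):
    """Compare record count and a column sum between the export file
    and the loaded target table; any mismatch fails verification."""
    file_count, file_sum = 0, 0.0
    with open(export_csv, newline="") as f:
        for row in csv.DictReader(f):
            file_count += 1
            file_sum += float(row[sum_col] or 0)
    db_count, db_sum = conn.execute(
        f'SELECT COUNT(*), COALESCE(SUM("{sum_col}"), 0) FROM "{table}"'
    ).fetchone()
    ok = file_count == db_count and abs(file_sum - db_sum) < 1e-6
    print(f"file: {file_count} / {file_sum}  table: {db_count} / {db_sum}  "
          f"{'PASS' if ok else 'FAIL'}")
    return ok

# Hypothetical check of a loaded customer table.
verify_load("customers_load.csv", sqlite3.connect("staging.db"),
            "customers_loaded", "credit_limit")
```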
Data Readiness Approach and Methodology

Now that we have introduced both the data readiness landscape components and the process, we can finally position how this all fits into the lifecycle of an SAP implementation project.

What follows is a description of the various data readiness activities as they are executed throughout Grom's Best Practice Data Readiness Approach. Grom's Data Readiness Approach is an enhanced and refined complement to the ASAP methodology that SAP implementation projects typically follow.

Project Definition – The purpose of this phase is to understand and define the data quality baseline and a path forward with respect to data readiness for the SAP implementation. Once the data quality baseline has been defined and understood, the data migration and readiness scope can be derived and estimated in alignment with the business objectives of the SAP implementation. Toolset selection can be accomplished based on the scope of the conversion. Finally, the effort and cost of the conversion can be estimated for approval.

Project Preparation – This phase provides initial preparation and planning for the SAP implementation project. The important data readiness issues addressed during the project preparation phase are:

- Finalization of data migration scope and data readiness strategy
- On-boarding of the data team
- Installation of the ETL toolset
- Initiation of legacy system connection and extraction

Business Blueprint – This phase defines the business processes to be supported by the SAP system and the functional requirements. Data conversion and readiness activities begin with the identification of the data objects which require conversion from the source application to the SAP system. During this phase, all data and records will be extracted and profiled from source systems, business and SAP readiness requirements will be defined, and the Mapping Documentation will be completed to enable data quality report development. The quality and integrity of the source data will be assessed repeatedly during this period.

Realization (Build) – This phase builds the system based upon the requirements described in the functional specifications; it includes several data readiness process development and individual data object testing cycles. During the early part of realization, functional specifications are developed for the data conversion objects identified during requirements gathering. These design specifications serve as the basis for determining which conversion mechanisms are used, and they provide additional functional conversion program development and testing details for a given data object. The project team develops all required data conversion rules and programs. These conversion rules and programs are tested repeatedly in the Q/A or Unit Test environments, as illustrated in Figure 3.
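The repeated testing of conversion rules described above lends itself to automated unit tests that can be rerun on every cycle. A minimal sketch using Python's built-in unittest module, exercising a hypothetical formatting rule like the one sketched earlier:

```python
import unittest

def pad_customer_number(legacy_id):
    """Conversion rule under test: format a legacy id as a
    10-character SAP-style customer number."""
    return legacy_id.strip().zfill(10)

class TestConversionRules(unittest.TestCase):
    def test_short_id_is_zero_padded(self):
        self.assertEqual(pad_customer_number("4711"), "0000004711")

    def test_whitespace_is_stripped(self):
        self.assertEqual(pad_customer_number(" 4711 "), "0000004711")

if __name__ == "__main__":
    unittest.main()
```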
[Figure 3: Continual Improvement Iterative Process. Source data is pulled on demand into the Data Staging Application and exercised through the Business Blueprint, Unit Test, Integration Test, and Go-Live Cutover Rehearsal environments into SAP Production, with results, resolutions, and user reports feeding back into the staging application for refinement.]
Realization (Test) – This phase is dedicated to the testing and refinement of the conversion rules and programs of the central Data Staging and Transformation tool. As source data evolve in the course of normal business operation over the project timeline, new data issues may surface, and conversion rules may need to be updated or refined through the Continual Improvement Iterative Process. As the target SAP system in each environment continues to mature into the "To-Be" production system, data readiness will be measured and reported against each environment to confirm alignment with the design and functional specifications. Through this iterative testing and repeatable process, data quality with respect to readiness will be elevated closer toward the Transactionable Data Quality Level prior to Go-Live, as illustrated below in Figure 4. By the end of this realization test phase, the central Data Staging & Transformation tool will have been tested with full data conversions in 2 to 3 rounds of Unit Testing and 2 to 3 rounds of Integration Testing.

[Figure 4: Data Quality with Continual Improvement Process. Data quality (low to high) is plotted against the project timeline; data readiness activities across the Business Blueprint, Realization (Build), Realization (Test), Final Preparation, Go-Live, and Install/Run/Support phases raise data quality toward the Transactionable Data Quality Level by Go-Live.]
Final Preparation – Development of the central Data Staging & Transformation tool is completed, and cutover activities will be rehearsed in 2 to 3 rounds during this phase. As part of the final production cutover, final source data extractions and preparations will be performed, and all master and transactional data will be loaded into the production environment. Production data reconciliation and validation reports will be prepared to ensure all records are accounted for. Any additional manual data conversion activities and manual configuration steps in SAP will be executed according to the conversion plan. Finally, the data owners sign off on the production load and validation reports as required by the SAP implementation project.

Install / Run / Support – As the purpose of this phase is the transition from the pre-production environment to live production operation, this phase is used to closely monitor system transactions and to optimize system performance. From a data conversion perspective, any post go-live issues related to data should be investigated, resolved, and closed.

About the Author

James Chi is the Director of GROM's Business Consulting Group Enterprise Solutions Practice and has overall delivery responsibility for all GROM-led projects. James joined GROM after spending the last seventeen years delivering SAP solutions in the pharmaceutical, medical device, and consumer products industries. James' strong functional background in Supply Chain Planning and Manufacturing Execution has blended to create a well-rounded business expert with more than fifteen years of Project Management experience. James has a BE in Electrical Engineering from Stevens Institute of Technology.