This project expanded an existing business intelligence system to include accounts payable, accounts receivable, and logistics by creating new dimensional data structures, ETL processes, and OLAP cubes to meet specific requirements for each department, including handling customer balances, aging, and outstanding amounts for accounts receivable. Documentation was also generated to support the new ETL processes and OLAP cubes implemented for each department.
1. PROJECTS PORTFOLIO
The core goal of the following projects portfolio is to introduce the reader to the work I have done in
the area of Business Intelligence / Data Warehouse through my four and a half years of experience in this
field.
The following are five projects covering Microsoft SQL Server 2000, DTS, Analysis Services, Microsoft
SSIS, SSAS, SSRS 2005, SharePoint and Performance Point Server 2007.
Other skills I developed during this time include work administration and the successful execution of
projects, good communication with my team as well as with other coworkers, the ability to implement
and use new work methodologies, and adaptation to change.
Project Titles
1) Analysis, design, construction and implementation of a data warehouse for a company
dedicated to the baking industry, supported with technologies from Microsoft SQL Server, Data
Transformation Services, Analysis Services and Open Intelligence, using data mining models,
following the Business Dimensional Lifecycle proposed by Dr. Ralph Kimball ....................................... 2
2) Expand the Business Intelligence application to the departments of accounts payable,
accounts receivable and logistics, making changes to the current ETL processes and OLAP
cubes. ............................................................................................................................................. 15
3) SetFocus Business Intelligence Project using Microsoft SQL Server Integration Services 2005 .. 26
4) SetFocus Business Intelligence Project using Microsoft SQL Server Analysis Services 2005 ....... 33
5) SetFocus Business Intelligence Project using Microsoft SQL Server Reporting Services 2005,
Performance Point Server and SharePoint Server ......................................................................... 40
2. Analysis, design, construction and implementation of a data
warehouse for a company dedicated to the baking industry,
supported with technologies from Microsoft SQL Server, Data
Transformation Services, Analysis Services and Open
Intelligence, using data mining models, following the Business
Dimensional Lifecycle proposed by Dr. Ralph Kimball
Introduction
Thesis paper for a professional diploma in computer systems engineering, in which the
implementation of the data warehouse is described in detail through several chapters covering
the general description of the company, the project justification, the framework, the five steps
of an information technology project (analysis, design, construction, testing and
implementation) supported by the Business Dimensional Lifecycle proposed by Dr. Ralph
Kimball, and the conclusion of the project and future work.
This project shows how the organization carried out the implementation of a data warehouse
architecture, which allowed it to perform complex analysis, better understand the
organization, and make better decisions in the sales, accounting, warehouse, accounts
payable and accounts receivable areas.
Audience
Focused on business intelligence developers, as well as any professional in the information
technology area and any person involved in decision-making for the company.
Project Goals:
Create the necessary documentation for the analysis, design, construction, testing and
implementation processes of the data warehouse system.
Create the physical and logical hardware and software architecture for the correct
operation of the data warehouse system.
Extract the data from the current database systems, transform it according to the business
needs, and load it into dimensional database models.
Create dimensional cubes based on the business needs for sales, accounting, inventory,
accounts payable and accounts receivable.
Provide a safe source of information for the extracted data, with the ability to scale
over time.
Give end users access to the information (OLAP) through tools such as Open Intelligence
or Microsoft Excel for the exploitation of the information.
3. Current Business Diagram
Explanation
This diagram shows how the business currently carries out its information analysis at the end
of each month, and the whole process: extracting the information through standard and
complex queries against the DB2 database, manipulating it through macros in Excel, and
interpreting it to make decisions.
4. Proposed Business Diagram
Explanation
This diagram shows the workflow with the proposal and implementation of a data warehouse
system. ETL processes as well as the load on dimensional databases are present in this diagram.
5. Data Source Analysis
Command Line   Module                Files or Objects
/GL            Accountability        FLP060   Parameter files (catalogs, etc.)
                                     FLP008   Detail Transaction Document
/OE            Invoice and Orders    OEP16    Price List
                                     OEPDF    Other Price List
                                     OEP20    Customer Details
                                     OEP40    Order Header
                                     OEP55    Order Details
                                     OEP65    Sales Header
                                     OEP70    Sales Detail
/IN            Warehouse             INP10    Company Information
                                     INP13    Warehouse Descriptions
                                     INP15    File Descriptions
                                     INP35    Product Catalog
                                     INP38    Product Catalog Others
                                     INP95    Warehouse Movements History
/AP            Accounts Payable      PLP05    Supplier Catalog
                                     PLP15    Transaction Master
                                     PLP20    Detail Transaction Master
                                     PLP25    Session Control File
                                     PLP45    Supplier Payments
/AR            Accounts Receivable   SLP05    Customer Master
                                     SLP15    Transaction Master File
                                     SLP20    History Transaction Master File
                                     SLP25    Session Control File
Explanation
This table shows the data source structure of the DB2 database on the JBA/AS400 mainframe.
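In the project itself these files are read through the DTS connection to the AS400. Purely as a hedged
illustration of the same extraction expressed in T-SQL, the following sketch pulls the product catalog
file (INP35) into a SQL Server staging table through a hypothetical linked server; the linked server
name JBA400 and the staging table TEMPINP35 are assumptions, not part of the project.

-- Hypothetical linked server JBA400 pointing at the JBA/AS400 DB2 system.
-- Copy the product catalog file (INP35) into a SQL Server staging table;
-- the field names come from the source-destination table in section 8.
SELECT *
INTO   dbo.TEMPINP35
FROM   OPENQUERY(JBA400,
       'SELECT PNUM35, PDES35, PTYP35, PCLS35, DIVN35, SUNT35, PGMJ35
        FROM   INP35')
GO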
6. Data Warehouse Bus Architecture Matrix
Dimensions (columns): Transactions, Warehouse, Customer, Supplier, General Ledger,
Product, Zone, Date

Data Mart            Dimensions crossed
Accounts Payable     X X X
Sales                X X X X X
Purchases            X X X X
Production           X X X X
Warehouse            X X X X
Accountability       X X X
Taxes                X X X
Explanation
This bus architecture matrix shows how the dimensions cross with the data marts.
Logical Design of the Sales Fact Table with its Dimension Tables and Attributes
[Diagram: "Tabla de Hecho Ventas" (Sales fact table), with snapshot granularity, related to its
dimensions Cliente (Customer), Fecha (Date), Región (Zone) and Producto (Product), and carrying
the attributes Unidades (Units), Monto (Amount), Costo (Cost) and Margen de contribución*
(Contribution Margin, a derived column).]
Explanation
1) This image shows how the dimension tables Customer, Date, Zone and Product are related
to the Sales fact table, which constitutes the data mart of the sales department.
2) This image shows the attributes of the Sales fact table: Units, Amount Sold, Cost, and one
derived column, Gross Profit.
7. Product Dimension Detail with Logical Description and Diagram
Product Dimension
Attribute      Description            Changing Dimension   Example
Name           Product Name           Slowly Changing      DAWN RLL RD MANZANA 10.9
Type           Product Type           Slowly Changing      Finished, Process, Raw Material
Category       Product Category       Slowly Changing      Wet, Dry, Frozen, Other
Major Group    Product Major Group    Slowly Changing      Flavor, Chocolate
Division       Product Division       Slowly Changing      Purchased, Produced
Presentation   Product Presentation   Slowly Changing      EA, CS, BG
[Diagram: "Dimensión Producto" (Product dimension) hierarchy: División (2), Categoría (5),
Tipo (7), Grupo Mayor (n), Presentación (22) and Nombre del Producto.]
Explanation
1) The table shows the attribute names and descriptions, whether each attribute is a slowly
changing dimension attribute, and some examples of the information.
2) The diagram shows the logic and the hierarchy of the product dimension.
8. Data Source-Destination Table
Table               Column          Data Type   Length   Description            Data Source   File Source   Field Name
Product Dimension   Surrogate Key   Integer     6        Surrogate Key          N/A           N/A           N/A
Product Dimension   Code            Varchar     12       Product Code           JBA           INP35         PNUM35
Product Dimension   Name            Varchar     45       Product Name           JBA           INP35         PDES35
Product Dimension   Type            Varchar     20       Product Type           JBA           INP35         PTYP35
Product Dimension   Category        Varchar     15       Product Category       JBA           INP35         PCLS35
Product Dimension   Division        Varchar     10       Product Division       JBA           INP35         DIVN35
Product Dimension   Presentation    Varchar     2        Product Presentation   JBA           INP35         SUNT35
Product Dimension   Major Group     Varchar     10       Product Major Group    JBA           INP35         PGMJ35
Tabla de Hecho FVentas (Sales fact table)
Dimensiones (dimension keys): intDCliente_clave, intDFecha_clave, intDRegion_clave,
intDProducto_clave
Atributos (measures): intFVenta_unidad, intFVenta_venta, intFVenta_costo, intFVenta_margen *

Logical Attribute   Physical Attribute
Name                strDProducto_nombre
Type                strDProducto_tipo
Category            strDProducto_categoria
Major Group         strDProducto_gpo_mayor
Division            strDProducto_division
Presentation        strDProducto_presentacion
Explanation
1) The first table shows a more detailed description of the product dimension table: the
source fields in the DB2 database and the data types of the columns.
2) The image shows a good practice for naming the attributes in the SQL Server tables: the
prefix int for integers and str for strings.
3) The second table shows the logical and physical attributes.
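Section 7 marks all the product attributes as slowly changing, but the portfolio does not state which
SCD type was applied. As a hedged sketch only, the following T-SQL shows a Type 1 (overwrite) refresh
of DProducto from the hypothetical staging table dbo.TEMPINP35 introduced earlier; an UPDATE plus
INSERT pair is used because SQL Server 2000 has no MERGE statement.

-- Type 1 (overwrite) sketch: update the attributes of existing products...
UPDATE d
SET    d.strDProducto_nombre       = s.PDES35,
       d.strDProducto_tipo         = s.PTYP35,
       d.strDProducto_categoria    = s.PCLS35,
       d.strDProducto_division     = s.DIVN35,
       d.strDProducto_presentacion = s.SUNT35,
       d.strDProducto_gpo_mayor    = s.PGMJ35
FROM   dbo.DProducto d
       INNER JOIN dbo.TEMPINP35 s ON s.PNUM35 = d.strDProducto_codigo
GO
-- ...and insert the products seen for the first time; the IDENTITY column
-- intDProducto_clave assigns the surrogate key automatically.
INSERT INTO dbo.DProducto
       (strDProducto_codigo, strDProducto_nombre, strDProducto_tipo,
        strDProducto_categoria, strDProducto_division,
        strDProducto_presentacion, strDProducto_gpo_mayor)
SELECT s.PNUM35, s.PDES35, s.PTYP35, s.PCLS35, s.DIVN35, s.SUNT35, s.PGMJ35
FROM   dbo.TEMPINP35 s
WHERE  NOT EXISTS (SELECT 1 FROM dbo.DProducto d
                   WHERE d.strDProducto_codigo = s.PNUM35)
GO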
10. T-SQL Statements
SQL code:
CREATE TABLE [dbo].[DProducto] (
[intDProducto_clave] [bigint] IDENTITY (1, 1) NOT NULL,
[strDProducto_codigo] [varchar] (12) COLLATE Traditional_Spanish_CI_AS NOT NULL,
[strDProducto_nombre] [varchar] (50) COLLATE Traditional_Spanish_CI_AS NOT NULL,
[strDProducto_tipo] [varchar] (20) COLLATE Traditional_Spanish_CI_AS NOT NULL,
[strDProducto_categoria] [varchar] (10) COLLATE Traditional_Spanish_CI_AS NOT NULL,
[strDProducto_division] [varchar] (15) COLLATE Traditional_Spanish_CI_AS NOT NULL,
[strDProducto_presentacion] [varchar] (3) COLLATE Traditional_Spanish_CI_AS NOT NULL,
[strDProducto_gpo_mayor] [varchar] (20) COLLATE Traditional_Spanish_CI_AS NOT NULL
) ON [PRIMARY]
GO
CREATE TABLE [dbo].[FVenta] (
[intFCliente_clave] [bigint] NULL,
[intFProducto_clave] [bigint] NULL,
[intFFecha_clave] [bigint] NULL,
[intFRegion_clave] [bigint] NOT NULL,
[intFVenta_unidad] [int] NOT NULL,
[intFVenta_venta] [bigint] NOT NULL,
[intFVenta_costo] [bigint] NOT NULL,
[intFVenta_margen] [bigint] NOT NULL
) ON [PRIMARY]
GO
Explanation
1) T-SQL statements to create the product dimension table and the sales fact table.
2) ETL process for the creation of all the dimensions, reading from flat files and DB2
sources.
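The fact table load itself is not reproduced in the portfolio, so the following is only a hedged
sketch of how FVenta could be populated with surrogate key lookups: the staging table TEMPVENTAS,
its column names, the DFecha and DRegion key columns, and the margin formula (sales amount minus
cost) are all illustrative assumptions, not the project's actual DTS logic.

-- Hypothetical fact load: resolve each natural key in the staged sales data
-- to the surrogate key of its dimension, and derive the margin column.
INSERT INTO dbo.FVenta
       (intFCliente_clave, intFProducto_clave, intFFecha_clave,
        intFRegion_clave, intFVenta_unidad, intFVenta_venta,
        intFVenta_costo, intFVenta_margen)
SELECT c.intDCliente_clave,
       p.intDProducto_clave,
       f.intDFecha_clave,
       r.intDRegion_clave,
       s.Unidades,
       s.Venta,
       s.Costo,
       s.Venta - s.Costo                 -- assumed derived contribution margin (*)
FROM   dbo.TEMPVENTAS s
       INNER JOIN dbo.DCliente  c ON c.strDCliente_codigo  = s.CodigoCliente
       INNER JOIN dbo.DProducto p ON p.strDProducto_codigo = s.CodigoProducto
       INNER JOIN dbo.DFecha    f ON f.dtmDFecha_fecha     = s.FechaFactura
       INNER JOIN dbo.DRegion   r ON r.strDRegion_codigo   = s.CodigoRegion
GO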
11. Visual Basic Script
Function Main()
    ' DTS ActiveX transformation: map and trim the source columns from the
    ' JBA customer files into the Customer dimension columns.
    DTSDestination("intDCliente_limite_credito") = Trim(DTSSource("CRLM05"))
    DTSDestination("strDCliente_mkt_channel") = Trim(DTSSource("CGP105"))
    DTSDestination("strDCliente_vendedor") = Trim(DTSSource("DESC6001"))
    DTSDestination("strDCliente_grupo") = Trim(DTSSource("DESC60"))
    DTSDestination("strDCliente_tipo") = Trim(DTSSource("DSCL63"))
    DTSDestination("strDCliente_codigo") = Trim(DTSSource("CUSN05")) + Trim(DTSSource("DSEQ05"))
    DTSDestination("strDCliente_corporativo") = Trim(DTSSource("CUSN05")) + " " + Trim(DTSSource("CNAM05"))

    ' Fall back to the customer name when the first address line is empty.
    If (Trim(DTSSource("CAD105")) = "") Then
        DTSDestination("strDCliente_nombre") = Trim(DTSSource("DSEQ05")) + " " + Trim(DTSSource("CNAM05"))
    Else
        DTSDestination("strDCliente_nombre") = Trim(DTSSource("DSEQ05")) + " " + Trim(DTSSource("CAD105"))
    End If

    Main = DTSTransformStat_OK
End Function
1) Visual Basic script used in a DTS transformation for the Customer dimension table.
2) ETL process for the Sales fact table.
12. Metadata from Analysis Services
Explanation
Metadata information resulting from processing the information in Analysis Services.
13. Data Warehouse Architecture
Example Report from the OLAP Database
Explanation
1) The first image shows the general data warehouse architecture as it was built.
2) The second image represents a corporate sales report with the gross profit % between
years for the amount sold.
14. Extra information
The result of this project is my thesis documentation, which contains in-depth research on how
to implement a BI project, together with the implementation itself. The book is around five
hundred pages, and what is presented in this portfolio is 5% or less of the complete work.
The complete work was written in Spanish; anyone interested in reviewing it can request the
electronic PDF file and arrange the necessary translation.
The PDF file can be requested by email at Jorge.gomezdanes@gmail.com.
The Chapters below are available.
Chapters
1) Book cover
2) Acknowledgments, Dedication and Prologue
3) Index
4) Summary
5) Introduction, Company and Project Justification
6) Framework (Methodology)
7) Analysis and Design
8) Construction
9) Testing, Deployment, Conclusion, Future Work
10) References
11) Annexes
The PDF file provided by Jorge Arturo Gómez-Danes Mejia is copyrighted, and it is illegal to
copy, use, duplicate or distribute it without a consent letter signed by the principal author.
15. Expand the Business Intelligence application to the
departments of accounts payable, accounts receivable and
logistics, making changes to the current ETL processes and
OLAP cubes
Introduction
Once the organization has started to use the services and benefits of the Business Intelligence
system, the desire arises to replicate it in other departments. This project shows the complex
structures of the new implementation, in particular the ETL, OLAP and MDX processes. It also
presents the documentation for those activities and processes.
What happens to the outstanding balances on an accounts receivable account when the
customer has credit days, and the OLAP cube must be able to show the customer's current
balance, outstanding balance, aging days and overdue balances for periods of 30, 60 or 90
working days, depending on the company metrics? We need many transformations and
calculations in the ETL process, as well as in the MDX language.
Audience
Focused on Business Intelligence developers and any IT professional, as well as any management
level in the accounts receivable, accounts payable and logistics departments.
Project Goals:
Create dimensional data structures suitable for the ETL process.
Create new ETL processes that meet the organization's requirements for implementing the
data marts.
Create new dimensional OLAP cubes that meet the requirements of the organization.
Perform the necessary tests on the new processes.
Make changes to the current ETL processes and OLAP cubes.
Generate relevant documentation to support the work mentioned above.
17. DTS Packages
Explanation
List of the DTS packages. Here we have the dimension processes and the fact table processes
for logistics, sales, accounts payable, accounts receivable, warehouse and accountability.
18. Specific DTS for the accounts receivable process
Explanation
In this complex process we have two connection managers: the first for the AS400 DB2
database and the second for the SQL Server database. The image also shows many T-SQL tasks:
truncating tables, creating temporary tables, extracting the information, and the core
processes that load the specific accounts receivable data. At the end of the process, the image
shows the dimension and OLAP cube processing.
19. Accounts receivable T-SQL ETL code
DECLARE @Periodo As INT
DECLARE @Sig As INT
DECLARE @PeriodoIni As INT
DECLARE @PeriodoFin As INT
DECLARE @strQry AS VARCHAR(8000)
DECLARE @strUnion AS CHAR(10)
DECLARE @UltimaFecha As DATETIME

SET @PeriodoIni= 10801 --CAST(CAST(DATEPART(YY,GETDATE())-1900 AS CHAR(3))+ '01' AS INT)
SET @PeriodoFin= CAST(CAST(DATEPART(YY,GETDATE())-1900 AS CHAR(3))+ RIGHT('00'+RTRIM(CAST(DATEPART(M,GETDATE()) AS CHAR(2))),2) AS INT)

SET @strUnion=''
SET @Periodo=@PeriodoIni
SET @Sig=0
SET @strQry=''

WHILE @Periodo<=@PeriodoFin
BEGIN
    IF RIGHT(CAST(@Periodo AS CHAR(5)),2)=12
    BEGIN
        SET @Sig=@Periodo + 89
    END
    ELSE
    BEGIN
        SET @Sig=@Periodo + 1
    END

    SET @UltimaFecha = DATEADD(dd,-1,CAST(CAST((CAST(SUBSTRING(CAST(@Sig As CHAR(5)),1,3) As INT) + 1900) As CHAR(4))+'-'+SUBSTRING(CAST(@Sig As CHAR(5)),4,2)+'-01' As DATETIME))

    IF @Periodo=@PeriodoFin
    BEGIN
        SET @UltimaFecha=GETDATE()
    END

    SET @strQry=@strQry+@strUnion+'SELECT [CONO20],'
    SET @strQry=@strQry+'[CUSN20],'
    SET @strQry=@strQry+'[ETYP20],'
    SET @strQry=@strQry+'[LREF20],'
    SET @strQry=@strQry+'[TTYP20],'
    SET @strQry=@strQry+'[RCOD20],'
    SET @strQry=@strQry+'[BTMT20],'
    SET @strQry=@strQry+'[PTMT20],'
    SET @strQry=@strQry+'[CURN20],'
    SET @strQry=@strQry+'[DOCD20],'
    SET @strQry=@strQry+'[SESN20],'
    SET @strQry=@strQry+'[EVNT20],'
    SET @strQry=@strQry+'[PPER20],'
    SET @strQry=@strQry+CAST(@Periodo As CHAR(5))+' As Periodo,'
    SET @strQry=@strQry+''''+RTRIM(CAST(DATEPART(dd,@UltimaFecha) AS CHAR(2)))+'/'+RTRIM(CAST(DATEPART(mm,@UltimaFecha) As CHAR(2)))+'/'+CAST(DATEPART(YY,@UltimaFecha) AS CHAR(4))+''' As FechaActual,'
    SET @strQry=@strQry+'[CLRD20],'
    SET @strQry=@strQry+'[VTMT20] FROM TEMPSLP20'
    SET @strQry=@strQry+' WHERE LREF20+ETYP20 IN ('
    SET @strQry=@strQry+'SELECT LREF20 + ETYP20 FROM ('
    SET @strQry=@strQry+'select LREF20,ETYP20,sum(BTMT20) As BALANCE'
    SET @strQry=@strQry+' from TEMPSLP20 '
    SET @strQry=@strQry+' where pper20<='+CAST(@Periodo As CHAR(5))+' Group By LREF20, ETYP20) As A'
    SET @strQry=@strQry+' WHERE BALANCE<>0 AND pper20<='+CAST(@Periodo As CHAR(5))+')'

    SET @strUnion=' UNION '

    IF RIGHT(CAST(@Periodo AS CHAR(5)),2)=12
    BEGIN
        SET @Periodo=@Periodo + 89
    END
    ELSE
    BEGIN
        SET @Periodo=@Periodo + 1
    END
END

--print @strQry
EXEC(@strQry)
20. Explanation
Before running this query, you must remove all customers who have no outstanding balance
with the company, i.e., those for whom the sum of what they have paid and purchased is zero.
Then, for those who do carry a balance, you sum all documents with outstanding amounts for
each period of the last 12 months: for January 2010 the total amount of debt, for December
2009 the sum of their debts, and so on back through the 12 months.
Because each period is cut at the end of the month, the result shows the total debt each
customer holds at that point and the month each debt comes from.
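The cleanup step described above is not shown on the slide; a minimal sketch, assuming the
same TEMPSLP20 staging table and the LREF20+ETYP20 document key used in the code above,
could be:
    -- Remove documents whose amounts net to zero (no outstanding balance).
    DELETE T
    FROM TEMPSLP20 AS T
    WHERE T.LREF20 + T.ETYP20 IN (
        SELECT LREF20 + ETYP20
        FROM TEMPSLP20
        GROUP BY LREF20, ETYP20
        HAVING SUM(BTMT20) = 0
    );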
Accounts receivable T-SQL query process.
-- MXN balance: write each document/period total onto its earliest event row.
UPDATE TEMPSLP20V2
SET BALANCE = A.BALANCE
FROM (SELECT SUM(BTMT20) AS BALANCE, MIN(LREF20) AS DOCUMENTO, MIN(PERIODO) AS PER,
             MIN(ETYP20) AS ENTRADA, MIN(EVNT20) AS EVENTO
      FROM TEMPSLP20V2
      GROUP BY LREF20, ETYP20, PERIODO) AS A
WHERE LREF20 = A.DOCUMENTO AND PERIODO = A.PER AND ETYP20 = A.ENTRADA AND EVNT20 = A.EVENTO

-- USD balance: same pattern over the USD amounts (PTMT20).
UPDATE TEMPSLP20V2
SET BALANCEUSD = A.BALANCE
FROM (SELECT SUM(PTMT20) AS BALANCE, MIN(LREF20) AS DOCUMENTO, MIN(PERIODO) AS PER,
             MIN(ETYP20) AS ENTRADA, MIN(EVNT20) AS EVENTO
      FROM TEMPSLP20V2
      GROUP BY LREF20, ETYP20, PERIODO) AS A
WHERE LREF20 = A.DOCUMENTO AND PERIODO = A.PER AND ETYP20 = A.ENTRADA AND EVNT20 = A.EVENTO
Explanation
This specific T-SQL computes the sum of the balance, among other fields, for customers who
hold debt with the company, in both USD and MXN.
NOTE: this project does not include exchange-rate conversion; each currency (MXN, USD) was
calculated separately.
21. OLAP Structure
Explanation
This image shows the different OLAP cubes; for the Accounts Receivable cube it shows the
different dimensions, some organized in hierarchies: customer, date, transaction, and mixed
dimensions.
22. Star Schema Diagram for Accounts Receivable
Explanation
Star schema with eight dimensions (Periods, Warehouse, Entry Type, Reason Codes, Zone, Date,
Customer, and Sub Entry Type). Besides the dimension keys, the fact table holds two kinds of
fields: degenerate dimensions (order, invoice, etc.) and regular measures (debts, balance, etc.).
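The diagram itself is an image; a purely hypothetical sketch of the fact table it implies, with
every column name and type assumed from the dimensions and measures listed above, would be:
    CREATE TABLE dbo.FactAccountsReceivable (
        PeriodKey       INT NOT NULL,        -- Periods dimension
        WarehouseKey    INT NOT NULL,        -- Warehouse dimension
        EntryTypeKey    INT NOT NULL,        -- Entry Type dimension
        ReasonCodeKey   INT NOT NULL,        -- Reason Codes dimension
        ZoneKey         INT NOT NULL,        -- Zone dimension
        DateKey         INT NOT NULL,        -- Date dimension
        CustomerKey     INT NOT NULL,        -- Customer dimension
        SubEntryTypeKey INT NOT NULL,        -- Sub Entry Type dimension
        OrderNumber     VARCHAR(20) NULL,    -- degenerate dimension
        InvoiceNumber   VARCHAR(20) NULL,    -- degenerate dimension
        Debt            DECIMAL(18,2) NULL,  -- regular measure
        Balance         DECIMAL(18,2) NULL   -- regular measure
    );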
23. Measures and Calculated Members
Explanation
Natural measures and calculated measures. Many of the calculated members evaluate the
same measure but for a different period; the most common use is the previous period.
24. MDX code and Drill Through options
Explanation
This MDX expression uses several functions: IIF(), PrevMember, CurrentMember, the NOT
operator, and NULL values. An important detail is that PrevMember and CurrentMember are
used with the Date dimension, so the Date dimension must be set up with a DateTime data
type; this configuration matters.
Explanation
Columns that will appear in a drill-through execution.
25. User Roles and Restricted Dimension access
Explanation
The first image shows all the users, each with different access to the OLAP cubes; the second
image shows a specific restriction that grants access to only one level of granularity in the
customer dimension.
26. SetFocus Business Intelligence Project using Microsoft SQL
Server Integration Services 2005
Introduction
Create SSIS packages to extract information from Excel and CSV file formats, with the data source and
the connection manager configured properly. Create specific transformation tasks such as data
conversion, derived columns, lookup, conditional split, and script tasks, with the correct precedence
constraints and pipelines. Load the information for new, existing, and bad IDs. Use email notifications.
Deploy the packages and create scheduled jobs.
Audience
Focused on business intelligence developers, as well as any professional in the information
technology area.
Project Goals:
Review the source data (Excel or CSV). Make some enhancements to the existing data
sources to support more flexible business practices for customer invoicing.
Use SQL Server 2005 Integration Services to integrate these external data sources into
the SQL Server database, using the specific tasks to create the necessary
transformations.
Create the necessary tests for each package and execute it successfully.
Create the specific documentation for each package (source, destination, files to read,
database to load, process, results).
For each package, generate success and failure emails using the variables for new,
updated, or bad IDs.
Create a maintenance package to back up, shrink, and reindex the database (see the
sketch after this list).
Use proper error handling as needed: correct NULL statements, insert new rows using
the OLE DB Destination, update rows using the OLE DB Command, and use annotations,
comments, and naming best practices for objects and variables.
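The maintenance package was built from SSIS maintenance-plan tasks; roughly the T-SQL those
tasks issue is sketched below, where the database name, backup path, and table name are all
assumptions.
    -- Back up, shrink, and rebuild indexes (all names are illustrative).
    BACKUP DATABASE SetFocusDB TO DISK = 'C:\Backups\SetFocusDB.bak' WITH INIT;
    DBCC SHRINKDATABASE (SetFocusDB);
    ALTER INDEX ALL ON dbo.TimeSheet REBUILD;  -- repeated for each loaded table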
27. Read several CSV files
Explanation
Package that reads several CSV files from a given path, makes the transformations needed on
each one of them, creates the necessary calculations with the variables, and sends a success or
failure email.
28. Declaring and using variables
Explanation
The first image shows a list of the variables used in the last package, with the name, the scope,
and the right data type. The second image shows a simple Visual Basic script of how those
variables are used for each file read.
29. Using several tasks in the Data Flow tab
Explanation
There are 9 basic steps, beginning with the extraction from an Excel source file, followed by a
data conversion, two lookup tasks, and a conditional split. The first lookup task checks that the
EmployeeID in the source file exists in the database, using a JOIN between the Employee table
and the EmployeeRate table. The second lookup identifies which rows are new and which rows
have changed, based on the unique key identifiers.
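A rough T-SQL equivalent of what the first lookup validates is sketched below; the table and
column names (including HourlyRate) are assumptions based on the description.
    -- Rows whose EmployeeID resolves through this join pass the lookup; the rest fail it.
    SELECT e.EmployeeID, r.HourlyRate
    FROM dbo.Employee AS e
    JOIN dbo.EmployeeRate AS r
        ON r.EmployeeID = e.EmployeeID;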
30. Conditional Split task and destination source
Explanation
The image shows multiple conditions for proper error handling. If an EmployeeID is not present
in the source file, if the JobID is not present in the source file, or if the WorkDate is greater
than or equal to WorkClose, the row goes to a flat file destination that reports which rows are
wrong.
The OLE DB Destination receives the new rows, and the OLE DB Command handles the updated
rows, using the parameters configured as needed.
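In T-SQL terms, the error branch of the split selects roughly the rows below; the staging table
name is an assumption, while the columns come from the description above.
    -- Rows routed to the error flat file: missing keys or an invalid date range.
    SELECT *
    FROM dbo.StagingTimeSheet
    WHERE EmployeeID IS NULL
       OR JobID IS NULL
       OR WorkDate >= WorkClose;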
31. Manage all packages with a maintenance plan
Explanation
This is the master package. It executes a set of packages with the ETL processes and also
includes a simple database maintenance plan, with the associated success and failure emails.
32. Job scheduling
Explanation
The image above shows the basic configuration of a SQL Server Agent job, with a step and a
schedule configured for the master package. The connection manager and the owner of the
job were configured properly so that it runs every midnight.
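The same midnight schedule can also be scripted against msdb; a minimal sketch, where the job
and schedule names are assumptions:
    -- Create a daily schedule that fires at 00:00:00 and attach it to the master job.
    EXEC msdb.dbo.sp_add_schedule
        @schedule_name = N'EveryMidnight',
        @freq_type = 4,             -- daily
        @freq_interval = 1,         -- every 1 day
        @active_start_time = 0;     -- 00:00:00 (HHMMSS format)
    EXEC msdb.dbo.sp_attach_schedule
        @job_name = N'MasterPackageJob',
        @schedule_name = N'EveryMidnight';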
33. SetFocus Business Intelligence Project using Microsoft SQL
Server Analysis Services 2005
Introduction
Created an SSAS solution in Business Intelligence Development Studio with four fact tables and five
dimensions, one of them with multiple hierarchies, plus five KPIs. Created MDX code in SQL Server
Management Studio. Accessed the information from Microsoft Excel 2007 and created five reports.
Audience
Focused on business intelligence developers, as well as any professional in the information
technology area.
Project Goals:
Created data sources and data source views.
Created five dimensions.
Created multiple-hierarchy relationships between one dimension and other data source
tables.
Created one cube with four fact tables.
Created eight calculated members.
Created five KPIs.
Created two partitions for each fact table, with aggregations up to 50%.
Deployed to SSAS and executed from Management Studio.
Wrote 24 complex MDX queries.
Created five reports in Microsoft Excel 2007.
34. Data Source View – Cube
Explanation
The first picture shows a data source diagram from the data staging area in the
database, with the relationships between the database tables.
The second picture shows a cube diagram from the data source view. In this diagram we
can see the four fact tables and only five dimension tables. There is a peculiar difference
between the first and second pictures, the number of tables; I will show the difference
in the next picture.
35. Job Master Dimension – Multiple relationships
Explanation
This picture shows JobMaster.dim, a dimension table related to four other tables. It shows the
configuration of the relationships between tables that are not part of a dimension in the cube
but whose attributes need to show up in the cube.
36. Dimension Usage – Calculated Members
Explanation
The first picture shows how each fact table is related to each dimension in the cube
structure. We can see that Job Master and the All Works Calendar are related to all fact
tables, while Overhead, Material Types, and Employees are related to only some of the
fact tables.
The second picture is a list of some of the calculated members created for the cube
structure. In a later picture in the portfolio I will show how I handle more complex MDX
code.
37. KPI
Explanation
The image shows how the Open Receivables KPI is set up: a value expression for a calculated
member called Open Receivables, evaluated against a range of values that sets the good, mid,
and bad thresholds for the traffic-light colors. KPIs and trends are very useful for seeing
patterns in the history of the data.
38. Cube Partitions and Cube Browser
Explanation
The first image shows two partitions for each fact table, with 50% aggregations. Each
partition has a condition separating the historical years from the current year.
The second image shows the cube browser used to evaluate the results.
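Each partition is bound to a SQL query over the fact table; a sketch of the two conditions, with
the table and column names assumed, is:
    -- Current-year partition:
    SELECT * FROM dbo.FactLabor WHERE YEAR(WorkDate) = YEAR(GETDATE());
    -- Historical partition:
    SELECT * FROM dbo.FactLabor WHERE YEAR(WorkDate) < YEAR(GETDATE());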
39. MDX Code
Explanation
MDX query’s for the CUBE structure and end-user ad-hoc requirements. I can handle many of
the MDX functions and also VBA, and Excel Functions.
40. SetFocus Business Intelligence Project using Microsoft SQL
Server Reporting Services 2005, PerformancePoint Server and
SharePoint Server
Introduction
End users need tools to analyze information. This information can come from relational data sources,
dimensional data sources, OLAP cubes, Excel sheets, web information, and flat files. Reporting Services
and PerformancePoint Server allow us to create both simple and complex end-user reports. SharePoint
Server is an intranet that lets us publish many of these reports in a very structured way, with many
features. In this project I will show what I have done and how.
Audience
Focused on business intelligence developers, as well as any professional in the information
technology area, and business end users.
Project Goals:
Created reports with SQL Server Reporting Services in Business Intelligence Development
Studio, with information from relational databases and OLAP sources. Created
parameters to make those reports dynamic and gave the reports an adequate
presentation format.
Created reports on PerformancePoint Server with OLAP information. Created
dashboards, KPIs, and reports, with multiple parameters, graphics, grids, and Excel
Services reports.
Created reports with Excel: pivot tables and pivot charts with multiple filters.
Created a collaboration site on SharePoint Server, with multiple document libraries and
many subscriptions.
Deployed and published all the reports for SQL Server Reporting Services,
PerformancePoint Server, and Excel Services.
41. SQL Server Reporting Services – Data
Explanation
In SSRS you define the dataset to query and shape the result set that will feed the layout. Both
OLAP sources and relational databases can be used, and either the automatic designer or the
expert designer can build the query. In this case the expert designer was used with an MDX
query.
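For a relational source, the dataset behind a report like the one on slide 45 could be as simple
as the sketch below, where every table and column name is an assumption.
    -- Hours and labor cost per client and job, matching the grouping described later.
    SELECT c.ClientName, j.JobDescription,
           SUM(t.HoursWorked) AS Hours,
           SUM(t.LaborCost)   AS TotalLaborCost
    FROM dbo.TimeSheet AS t
    JOIN dbo.JobMaster AS j ON j.JobID = t.JobID
    JOIN dbo.Client    AS c ON c.ClientID = j.ClientID
    GROUP BY c.ClientName, j.JobDescription;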
42. SQL Server Reporting Services – Layout
Explanation
This is an important step. The developer needs to know the tool well enough to build the
report layout exactly as the end user needs to see it, and to work with the different tools SSRS
offers. In this layout we can create tables, matrices, and charts with many data sources, group
information, create summarized data, and give end users a polished design.
43. PerformancePoint Server
Explanation
PerformancePoint Server allows us to create dashboards, KPIs, scorecards, and reports from
many data sources, with different types of indicators such as lights or gauges. This image shows
many different objects, which I will show in the next images, but inside SharePoint Server.
44. SharePoint Server with Excel Services
Explanation
This image shows two important things: first, a chart with two different types of indicators, a
percentage and a bar chart, created in Excel Services; second, the SharePoint Server site
behind it, which has many document libraries with the different kinds of reports.
45. Reporting Services with SharePoint
Explanation
This image shows how SSRS looks when deployed to SharePoint Server, with time and client
parameters. It also shows the grouping by client, the hours worked per job, and the total labor
cost.
46. PerformancePoint Server on SharePoint Server
Explanation
This image shows two KPI reports with one parameter in SharePoint Server. These two KPIs
appear in an earlier image, in the PerformancePoint Server section.
47. PerformancePoint Server with SharePoint Server
Explanation
This image shows two basic reports created on PerformancePoint Server, with a particular
feature: the two reports are linked by the time parameter and show the top 10 jobs and the
top 5 workers.
48. Excel Services with SharePoint Server
Explanation
This image shows two reports created on Excel Services: a chart with two measures and the
table behind that chart. The configuration in PerformancePoint Server is important because
from there we define the layout of how the information is shown.