This document discusses the various ways that ABAP (Advanced Business Application Programming) code can be used in SAP BI (Business Intelligence). ABAP code can be used in info packages, data transfer processes, and transformations to manipulate and transform data according to business requirements. Specific examples are provided for using ABAP routines in info package extraction and data selection tabs, as well as routines like start, end, field, and expert routines in transformations. The document aims to provide insight into using ABAP in SAP BI components.
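To give a flavour of what these transformation routines look like, here is a minimal field-routine body for a BW 7.x transformation. This is an illustrative sketch, not code from the summarized document: the source field MATNR is assumed, and the surrounding method frame (including SOURCE_FIELDS and RESULT) is generated by the system.

```abap
* Minimal 7.x field-routine body: map a source field and
* normalise it to upper case. SOURCE_FIELDS and RESULT are
* supplied by the generated routine frame.
RESULT = SOURCE_FIELDS-matnr.
TRANSLATE RESULT TO UPPER CASE.
```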
This document discusses common production failures encountered with SAP BW, including transactional RFC errors due to non-updated IDocs, time stamp errors, short dumps, job cancellations in the R/3 source system, incorrect data in the PSA, ODS activation failures, missing caller profiles, failed attribute change runs, failed R/3 extraction jobs, missing files, table space issues, and restarting failed process chains. It provides explanations of each error, the impacts, and recommended actions to resolve the issues. The document serves as a quick reference guide for BW consultants to help solve complex production problems.
How to write a routine for 0CALDAY in InfoPackage selection (Valko Arbalov)
This document describes how to write an ABAP routine for 0CALDAY in an InfoPackage selection to load only data for a particular date from a source system into SAP BW/BI. It provides steps to create a flat file data source, define transformations and a data transfer process, add a routine to the InfoPackage, and write ABAP code that sets the selection date to system date minus one. Running a full load using the InfoPackage will then only import data for the day before the load date.
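The heart of such a routine is only a few lines. The following is a hedged sketch of the routine body: the l_t_range table, l_idx, and p_subrc belong to the frame code that BW generates around the routine, and the field name 'CALDAY' should be verified against the generated frame.

```abap
* Fill the 0CALDAY selection with yesterday's date.
DATA: l_idx LIKE sy-tabix.

READ TABLE l_t_range WITH KEY fieldname = 'CALDAY'.
l_idx = sy-tabix.

l_t_range-sign   = 'I'.          " include
l_t_range-option = 'EQ'.         " single-value selection
l_t_range-low    = sy-datum - 1. " system date minus one day

MODIFY l_t_range INDEX l_idx.
p_subrc = 0.                     " signal success to the framework
```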
This document discusses performance tuning in SAP BI 7.0 at the backend and frontend levels. At the backend, factors like data load sequence, PSA partition size, parallelizing uploads, export data sources, and transformation rules can impact performance. At the frontend, query performance, aggregates, OLAP cache settings, and read mode can be tuned. The document provides steps to optimize these factors through tools like transaction codes RSCUSTV6, RSCUSTV14, and RSDIPROP.
This document discusses how to write and debug start routines in SAP BW update rules and transformations. It provides an example of using a start routine in update rules to delete records from the data package based on certain customer numbers before loading data into the InfoCube. It also discusses how to set breakpoints and step through the start routine code using the debugger to test it. Similar instructions are provided for writing and debugging start routines in transformations.
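For illustration, the deletion itself is a single statement in a 7.x transformation start routine. The field name and customer numbers below are made up; in 3.x update rules the internal table is DATA_PACKAGE rather than SOURCE_PACKAGE.

```abap
* Drop unwanted customers before the data package is processed
* further. Field name and values are illustrative.
DELETE SOURCE_PACKAGE WHERE customer = '0000001000'
                         OR customer = '0000002000'.
```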
Using error stack and error DTPs in SAP BI 7.0 (gireesho)
This document discusses using error stacks and error data transfer processes (DTPs) in SAP BI 7.0 to handle erroneous records. It describes collecting erroneous records from a regular DTP load in the error stack, validating the errors, and using an error DTP to move the corrected records to the target system. The 7-step process includes enabling the error stack, defining semantic groups, executing the initial DTP, validating errors, and creating and running an error DTP.
SAP BW - Master data load via flat file (Yasmin Ashraf)
1) The document describes the steps to load master data attributes and texts for an Info Object called ZMATERIAL using two flat files.
2) It involves creating data sources for the attributes and texts files, mapping the fields, generating info packages, and executing data transfer processes to load the data into the SAP system.
3) Checking the load is done by viewing the contents of the ZMATERIAL info object tables for the attributes and texts.
This document provides steps for LO extraction from SAP R/3 to SAP BI/BW. It discusses LBWE activities like scheduling extraction jobs, different update modes including direct delta, queued delta, and unserialized V3 update. It also covers setup tables which collect required data from application tables to initialize delta loads and full loads, avoiding direct access to R/3 tables. The author is an SAP BI consultant who has experience with ABAP, custom report development, and system migrations.
Quick Viewer is a report generating tool in SAP that allows users to create simple reports without any ABAP coding knowledge. It generates basic list reports by connecting database tables, selecting fields, and setting filters. The document provides step-by-step instructions on how to use Quick Viewer, including selecting data sources, joining tables, choosing fields, setting filters and sorts, and various output options. It also compares Quick Viewer to SAP Query and discusses converting QuickViews to SAP Queries and transporting QuickViews between systems.
This document summarizes the data flow process from ECC to BW/BI. It shows the number of records in different queues like LBWQ, SMQ1, and RSA7 before and after running a V3 job. The summary steps are:
1) Records are first seen in LBWQ/SMQ1 after transactions are posted in ECC.
2) A V3 job is started which transfers records from LBWQ/SMQ1 to RSA7.
3) Before the V3 job, RSA7 shows 32 records but LBWQ/SMQ1 shows multiple records.
4) After the successful completion of the V3 job, RSA7 shows an increased number of records.
The document provides steps to configure Real-time Data Acquisition (RDA) in SAP BI 7.0 to load transactional data from a source system into the SAP BI system every minute. It involves creating a generic data source in the source system, an info package for the data source with the "Initialize Delta Process" option, a data store object, transformation, and data transfer process with type "Real Time Data Acquisition". It also requires setting up a daemon process to trigger the real-time data loading through the info package on a scheduled interval.
Line item dimension and high cardinality dimension (Praveen Kumar)
The document discusses how to identify dimensions in SAP BI cubes that should be changed to line item dimensions or high cardinality dimensions to improve query performance. It provides examples of dimensions that met criteria for these changes and describes the steps to implement the changes in a live environment. Creating line item and high cardinality dimensions avoids creating dimension IDs, reducing query time and improving data load time.
BW: Adjusting settings and monitoring data loads (Luc Vanrobays)
The document discusses various settings related to loading data into SAP BW, including:
1) Monitoring and adjusting data package settings to address performance issues during data loads. Large numbers of data packages or large individual package sizes can slow loads.
2) Checking transfer parameter settings for data loads from source systems into BW to ensure they are optimized.
3) Ways to split large initial data loads into smaller parallel loads to improve performance, such as using selection criteria to restrict the data per package.
This document discusses the LO extraction update logic in SAP. It explains that updates occur asynchronously through update work processes. The update is divided into critical V1 modules and less critical V2 modules, with V1 updates occurring first in a single transaction and V2 updates separately. An update request contains the update header, V1 modules, V2 modules, and collective runs, with the corresponding data stored in database tables VBHDR, VBMOD, and VBDATA.
An InfoCube is a data storage area that holds summarized, aggregated data in a star schema: one fact table containing the key figures, surrounded by dimension tables. Creating a standard InfoCube, which stores data physically, involves the following main steps: 1) create a DataSource, 2) create an InfoPackage to load data, 3) create the InfoCube, 4) assign InfoObjects to dimensions, 5) create a transformation, 6) create the data transfer process, 7) execute it to load the data, and 8) check the loaded data. Standard InfoCubes allow only read access, while virtual cubes access live data and real-time cubes allow read/write access.
This document explains how to enhance a LO data source by appending a new field to extract additional data. Specifically, it describes appending the material type (MTART) field from the MARA table to the 2LIS_11_VAITM data source. The steps include checking for the field in existing communication structures, appending the extract structure, writing ABAP code to populate the new field during extraction, and configuring the data source metadata. Performing an extract check confirms the field is now being extracted.
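The "ABAP code to populate the new field" typically lives in the DataSource user exit. The sketch below assumes the conventional setup for transaction data (enhancement RSAP0001, function module EXIT_SAPLRSAP_001); the extract structure name MC11VA0ITM and the append field ZZMTART are the usual choices for 2LIS_11_VAITM but should be verified in the actual system.

```abap
* Sketch of user-exit logic filling an appended material-type
* field during extraction of 2LIS_11_VAITM.
DATA: ls_data  TYPE mc11va0itm,   " assumed extract structure
      lv_mtart TYPE mara-mtart.

CASE i_datasource.
  WHEN '2LIS_11_VAITM'.
    LOOP AT c_t_data INTO ls_data.
      SELECT SINGLE mtart FROM mara INTO lv_mtart
        WHERE matnr = ls_data-matnr.
      IF sy-subrc = 0.
        ls_data-zzmtart = lv_mtart.  " assumed append field
        MODIFY c_t_data FROM ls_data.
      ENDIF.
    ENDLOOP.
ENDCASE.
```

In practice the SELECT inside the loop is often replaced by a buffered read or a FOR ALL ENTRIES lookup for performance, as the exit runs for every data package.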
This document discusses the different types of tables that are generated when activating an info object and its structures in SAP BI 7.0. It explains the master data table, text table, SID table, attribute tables (P, Q, X, Y tables), and hierarchies tables (H table) that can be created. It provides the naming conventions and key fields for each table type.
LO extraction – part 5: Sales and Distribution (SD) DataSource overview (JNTU University)
This document provides an overview of Sales and Distribution (SD) data sources that can be used for LO extraction in SAP BI. It describes the different event types that can trigger data transfers to the data warehouse and lists various SD extractors and their assigned data sources. Useful SAP notes are also referenced that provide more details on specific SD data sources.
This document discusses DSO job logs and activation parameters in SAP BW 7.x. It explains that activation is done in three steps: checking request status, checking data against archives, and activating data using parallel BIBCTL* jobs. The number of BIBCTL jobs is based on activation settings like package size. Transaction RSODSO_SETTINGS is used to set parameters that influence activation performance, like package size and parallel processes. Proper settings can reduce activation time for large datasets.
The document provides an overview and roadmap for SAP BW/4HANA. It discusses SAP's three approach strategy for data warehousing, including SAP BW/4HANA, SQL, and mixed approaches. It outlines the key principles and architecture of SAP BW/4HANA, including simplifying models and dataflows, openness to external data sources, modern user interfaces, and high performance. The roadmap discusses continuous innovation through a single codeline to deliver SAP BW/4HANA on-premise and in the public cloud.
This document provides an overview of the different update modes for logistics extraction in SAP BI: direct delta, queued delta, and unserialized V3 update. It explains the concepts, benefits, and use cases of each update mode. The direct delta mode transfers extracted data directly to the delta queue with each document posting. The queued delta mode collects extracted data in an extraction queue and transfers it to the delta queue in batches. The unserialized V3 update writes extracted data to update tables and transfers it to the delta queue without regard to sequence.
This document provides an overview of how to build, maintain, and optimize the use of aggregates in SAP BW, including how to create aggregates on specific characteristics and attributes, run aggregate roll-ups after data loads, and improve aggregate efficiency through techniques like switching aggregates on/off, deleting unused aggregates, and compressing aggregate data. The document includes step-by-step instructions for creating aggregates on an InfoCube and rolling up aggregates after data changes.
SAP Data Archiving allows organizations to remove old data from their SAP database and store it externally to reduce costs and improve performance. The archiving process involves creating archive files, running delete programs to remove data from the database, and storing the archive files externally. Archiving objects define which data to archive and how. The archive information system then allows users to search and retrieve archived data.
Beginner's guide: create a custom 'Copy' planning function type (Naveen Kumar Kotha)
This document provides a step-by-step guide to creating a custom "Copy" planning function in SAP BW. It explains how to set up the environment, create a custom class, define the planning function type in transaction RSPLF1, integrate the function into a planning application, create and execute a planning sequence, modify the function code, debug and test, and transport the custom function. The guide aims to help users avoid obstacles in customizing planning functions and provide a working example to copy actual sales data from one year to a plan for the next year.
SAP SD is one of the key modules of SAP ERP that manages customer relationships and logistics functions from order quotation to billing. It is integrated with other modules like Material Management, Finance and Accounting. The SAP SD module uses master data like customers, materials, and pricing to process transactions through the sales cycle from pre-sales activities, order processing, delivery, and billing.
An InfoCube contains integrated data from multiple sources and is optimized for analysis. It contains dimensions such as material, time, and sales organization, and key figures like sales quantity, revenue, and discount. An InfoCube allows users to analyze relationships between different data points for better business decisions.
This document provides steps to create a generic delta extractor in SAP BW to extract data from tables that do not have standard extractors. It involves creating a generic data source, specifying the delta-specific field and type, and setting safety intervals to prevent missing records. Tips are provided on determining appropriate selection intervals based on the delta field type, such as using an upper limit of 30 minutes for a timestamp field. The generic delta allows flexible data extraction from any source regardless of application.
Automatic batch determination based on shelf life (Mauricio Beltran)
This document discusses how to configure automatic batch determination in SAP based on shelf life for materials like pharmaceuticals and foods. Key steps include importing standard characteristics, creating classes linked to expiration date, setting material master fields for shelf life, receiving goods into batches, and creating a search class and sort rule to select batches with the closest expiration dates first during delivery. This ensures batches are delivered before their expiration while meeting the required minimum remaining shelf life.
This document provides instructions for configuring a company code in SAP FICO. It begins with an introduction to SAP FI and relevant terminology. It then outlines the steps to define a company, create a company code, assign the company code to a company and chart of accounts, define relevant organizational structures and assign them to the company code, and configure settings for the currency, fiscal year, posting periods, document numbers, and more. The goal is to fully configure company code 1100 for the fictional company ABC Ltd located in the US.
The document describes how to use the Analysis Process Designer (APD) in SAP BW to download data from an InfoCube into a CSV file. It provides step-by-step instructions to select specific InfoObjects from the InfoCube, transform the date format of one field, and write the data to a CSV file on the desktop. The goal is to change the format of a calendar day field from YYYYMMDD to MM/DD/YYYY when exporting the data. Thirteen steps are outlined to configure the APD process with a data source, target, routine for transformation, and execution of the process.
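The date-conversion routine at the centre of that APD process can be sketched as follows. The table and field names (it_source, et_target, calday) are illustrative; the actual routine signature is generated by the APD when the routine node is created.

```abap
* Convert 0CALDAY from YYYYMMDD to MM/DD/YYYY while copying
* each source record to the target structure.
LOOP AT it_source INTO ls_source.
  MOVE-CORRESPONDING ls_source TO ls_target.
  CONCATENATE ls_source-calday+4(2) '/'   " MM
              ls_source-calday+6(2) '/'   " DD
              ls_source-calday(4)         " YYYY
         INTO ls_target-calday.
  APPEND ls_target TO et_target.
ENDLOOP.
```

Note that the target field must be a character field long enough for the ten-character formatted date, since a DATS field cannot hold the slashes.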
This document outlines 6 activities related to formatting, creating, and enhancing reports in Tableau. Activity 1 describes formatting a query with breaks. Activity 2 involves creating a crosstab report. Activity 3 enhances a report with additional formatting and filters. Activity 4 adds sections to a report. Activity 5 introduces formulas, variables, and prompts. Activity 6 creates a complex conditional formatting rule.
The document provides instructions for completing an activity in 5 steps: log in to the Launch Pad, open a specific web service document, refresh a report and select a value from a list, save the document to favorites, and log off the Launch Pad.
SAP Business Intelligence 4.0 report basics (tovetrivel)
This 2-day course covers creating and formatting Interactive Analysis reports in Business Objects. The course consists of 5 modules: 1) Introduction to Interactive Analysis Reporting; 2) Creating Interactive Analysis documents; 3) Formatting documents; 4) Working with charts; and 5) Analyzing data. Module 2 focuses on building queries, adding filters and prompts to queries, and creating the first Interactive Analysis report. Key topics covered include writing queries, using different data sources, adding document-level filters and prompts, applying multiple prompts, and previewing data.
Day 9 & 10: Introduction to BI enterprise reporting 1 & 2 (tovetrivel)
This document provides an overview of SAP BI reporting tools. It discusses business information management practices including enterprise reporting, query and analysis. It describes SAP NetWeaver BI and its reporting capabilities. Key SAP BI reporting tools are introduced, including BEx Query Designer for designing queries, BEx Analyzer for modifying queries, and BEx Browser for organizing queries. Report design considerations like optimizing performance and global report design are also covered.
The document provides an overview of extractors and data extraction methods in SAP BW, including business content extraction from SAP R/3, LO Cockpit extraction, and generic extractors. It discusses different types of extractors such as application specific, cross-application, BW content, and customer generated extractors. The document also summarizes different update modes for LO extraction including direct delta, queued delta, and un-serialized V3 update, and how they handle updating the BW delta queue. Finally, it reviews the process for filling setup tables, initializing/simulating extractions, and scheduling V3 collective runs to transfer extraction data to the BW delta queue.
The document provides an overview of SAP BI training. It discusses that SAP stands for Systems Applications and Products in Data Processing and was founded in 1972 in Germany. It is the world's fourth largest software provider and largest provider of ERP software. The training covers topics such as the 3-tier architecture of SAP, data warehousing, ETL, the SAP BI architecture and key components, OLTP vs OLAP, business intelligence definition, and the ASAP methodology for SAP implementations.
This document provides information about SAP BW modeling training including the schedule, terminology, modeling concepts, and data flow. It outlines a 5-day training schedule from May 19-23, 2014 with different sessions each day led by various trainers. It defines common BW terminology and concepts such as InfoObjects, InfoCubes, DSOs, characteristics, and key figures. It describes the purpose of data modeling in BW and provides examples of data flow and the use of process chains.
Day 6.3: Extraction - business content and generic (tovetrivel)
This document provides an overview of extractors in SAP BW, including business content extractors, generic extractors, and data source enhancement. Business content extractors are application-specific and provided with SAP applications, while generic extractors can be used across applications and are created using views, infosets, or function modules. Data source enhancement involves adding additional fields not available in standard data sources. The document also discusses update modes, delta capability, record types, and delta tables used in SAP BW extractors.
1. The document discusses parameters in the YEDW_BWADMIN table that can be used to limit the number of extractions started from BIG when BIM is overloaded.
2. It explains factors like available batch work processes, CPU usage, I/O wait, and response times that are checked to calculate how many new extractions can be safely started.
3. Monitoring reports like YGTTC_PERF_MONITOR and YGTTC_SM66_READ are used to capture performance metrics from BIM and identify jobs consuming excessive resources.
The document provides information about LIS extraction in SAP. It begins with an introduction to LIS extraction and then outlines the 12 main steps involved in the LIS extraction process, including creating an information structure, generating a data source, transporting the data source to SAP BW, creating an infopackage, and running initial and delta loads. It also provides a diagram of the internal architecture showing how LIS extraction moves data from SAP R/3 tables to SAP BW. The document concludes with a brief discussion of information structures and logistics information in LIS extraction.
Field symbols are placeholders that point to data objects in memory after being assigned. They provide flexibility in referencing data by allowing offsets, lengths, and parts of fields to be addressed dynamically. While field symbols offer elegant solutions, errors can easily occur since field assignments are not checked until runtime. It is best to only use field symbols when other ABAP statements cannot achieve the required result.
Lo extraction part 6 implementation methodology (JNTU University)
The document provides an overview of the implementation methodology for LO extraction in SAP BI. It describes the key steps in the ECC and BI systems, including activating data sources, extract structures, and filling setup tables in ECC, and replicating, migrating, and loading data sources and cubes in the BI system. It also discusses delta loading to update cubes with changed data from the ECC system.
This document explains the logic behind SAP LO extractors, including the relevant backend tables and important function modules. It describes the design of extract structures, datasources, datasource activation, extraction structures, setup tables, and extractors. It also provides an overview of the dataflow and the LO Cockpit tool.
The document discusses the Logistics Extraction Cockpit, a new technique for extracting logistics information from SAP systems. It has standardized extract structures and data sources that improve performance over the previous LIS method. The cockpit provides a centralized tool for maintaining extractions across logistics applications like sales and distribution. It reduces data volumes and load on the system compared to LIS.
This document provides an overview of the Sales and Distribution (SD) module in SAP. It describes the key areas of sales, shipping, and billing. It also explains the organizational structure, master data, processes, tables, and transactions involved in SD. The next part will cover extractors used for LO extraction from the SD module.
This document provides steps to download data from a SAP BW/BI report into a CSV file, perform additional calculations, and sort the data using the Analysis Process Designer (APD). Specifically, it describes how to:
1. Create an analysis process in APD to read data from an existing BW/BI report.
2. Add transformations to sort the data based on an amount field in descending order and calculate a new field that multiplies the amount by 10.
3. Configure the data target to write the processed data to a CSV file on the client workstation.
The steps allow downloading report data, performing additional calculations not available in the original report, and sorting the data.
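The two transformations can be sketched in Python (the column names MATERIAL and AMOUNT are assumptions; in the APD they come from the report's InfoObjects, and the routine itself is written in ABAP):

```python
import csv

def transform_and_export(rows, path):
    """Sort report rows by AMOUNT descending, derive AMOUNT_X10 = AMOUNT * 10,
    and write the result to a CSV file, as the APD data target would."""
    rows = sorted(rows, key=lambda r: r["AMOUNT"], reverse=True)
    for r in rows:
        r["AMOUNT_X10"] = r["AMOUNT"] * 10
    with open(path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=["MATERIAL", "AMOUNT", "AMOUNT_X10"])
        writer.writeheader()
        writer.writerows(rows)
```

In the APD the sort and the derived column would be two transformation nodes between the query source and the file target.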
This presentation discusses an approach to building the business case for SAP HANA, including practical advice on getting a project live, fast.
This was first presented at the SAP Insider HANA 2015 conference in Las Vegas
Delivering digital devolution in local authorities - bluefin solutions - dece... (Bluefin Solutions)
This was presented at a Bluefin event for Local Authority Department Heads wanting to find out how to shape their digital strategies to capitalise on devolution. It includes guest presentations from Cardiff County Council, Barnsley Metropolitan Borough Council, Surrey County Council, and SAP.
The document provides instructions for setting up a replication model between SAP's CO-PA profitability analysis module and SAP BW for data analysis. It describes creating a data source in CO-PA, activating relevant business content in BW, replicating the data source, mapping fields, creating an info source, info cube, and update rules to transfer transactional CO-PA data to BW on a regular basis using a delta method for ongoing updates.
The document describes how to integrate an ABAP program into a process chain in SAP BW. It involves creating an ABAP program to implement the needed functionality. A variant is then created for any selection parameters. The program is added as a process type to a new process chain along with specifying the variant. The process chain can then be activated and scheduled to execute the ABAP program as part of the chain.
This document provides an overview of common ABAP interview questions and answers. It discusses topics such as the ABAP data dictionary, domains and data elements, foreign key relationships, data classes, indexes, transparent vs pooled tables, ABAP/4 queries, BDC programming, internal tables, ITS, DynPros, screen and menu painters, SAP script components, ALV programming, ABAP events, CTS, logical databases, batch input sessions, CATT data uploading, Smart Forms, dependent vs independent data, and the differences between macros and subroutines.
_Using Selective Deletion in Process Chains.pdf (ssuserfe1f82)
This document discusses using selective deletion in SAP BW process chains to delete specific date range data from an InfoCube before loading new data. The key steps include using transaction code DELETE_FACTS to generate a deletion program, creating a variant for the program to select the date range to delete, and adding the program and variant to a process chain to automate deleting data from the current date to 30 days in the future before each new data load.
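The variant's date range reduces to a simple calculation; a Python sketch of the selection interval the generated deletion program would receive (the generation itself happens via DELETE_FACTS in the BW system):

```python
from datetime import date, timedelta

def deletion_window(today=None):
    """Date range for the selective-deletion variant:
    current date through 30 days in the future."""
    today = today or date.today()
    return today, today + timedelta(days=30)
```

Because the variant computes the range relative to the run date, the same process chain step stays valid for every daily load without manual maintenance.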
Creating attachments to work items or to user decisions in workflows (Hicham Khallouki)
This document discusses how to attach documents like PDF files to work items or user decision steps in SAP workflows. It provides code examples for creating a PDF output from an Adobe form, converting it to binary format, and using function modules to add it as an attachment. It also shows how to design a workflow with tasks to create the attachment and send it to a user decision step. The goal is to demonstrate a full example of programmatically generating and attaching documents to workflow tasks and steps.
Beginner's Guide: Programming with ABAP on HANA (Ashish Saxena)
The focus of this blog is to present an overview of the new programming techniques in ABAP after the introduction of the HANA database. It provides a guideline on why and how an ABAP developer should start transitioning their code to the new coding techniques.
The document discusses the ABAP Query tool in SAP, including its components (InfoSets, user groups, queries), areas (standard vs global), InfoSet design strategies, and how to create InfoSets, user groups, and queries. Key points include using logical databases or table joins as an InfoSet data source, assigning additional tables/fields, and using queries to build reports with selection criteria and drill-down functionality.
Errors while sending packages from oltp to bi (one of error at the time of da... (bhaskarbi)
This document provides steps to resolve the error "Errors while Sending Packages from OLTP to BI" that can occur when using process chains for daily data loads from an OLTP system to a SAP BW/BI system. The error is caused when IDocs are stuck in the ALE outbox and do not arrive in the ALE inbox of the BW/BI system. The steps include checking the TRFC log, monitoring jobs and processes, and manually executing any stuck LUWs in the TRFC.
Reporting data in alternate unit of measure in bi 7.0 (Ashwin Kumar)
This document discusses how to report data in alternate units of measure in SAP Business Intelligence 7.0. It covers creating a quantity conversion type using transaction RSUOM, specifying the conversion type in a query definition, and creating a variable for the target unit of measure to provide flexibility in unit selection. When the query is executed, the user will be prompted to select a unit, and results will be displayed in the selected unit due to conversion calculations performed using the conversion factor from the reference InfoObject.
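The conversion itself is a multiplication by the factor maintained for the reference InfoObject; a Python sketch with hypothetical units and factors (in BI 7.0 the factors come from the conversion type created in RSUOM):

```python
# Hypothetical conversion factors maintained for the reference InfoObject
# (here: 1 carton CAR = 12 pieces PC)
factors = {("PC", "CAR"): 1 / 12, ("CAR", "PC"): 12}

def convert_quantity(qty, from_unit, to_unit):
    """Convert a quantity to the target unit the user selects at query runtime."""
    if from_unit == to_unit:
        return qty
    return qty * factors[(from_unit, to_unit)]
```

At query runtime the variable prompt supplies `to_unit`, and every key-figure value is converted before display.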
SAP PI Sheet (Xstep) integration with Weighing Machine/Scale (Ankit Sharma)
This document is related to SAP PI Sheet (XStep) integration with Weighing Machine/Scale.
Accurate weights are vital in any manufacturing process. In the pharmaceutical industry, weighing falls under GMP: even a slightly wrong weight in medicine manufacturing can cause deaths.
ERP-Scale interfaces weigh scales and other analytical devices with SAP through OPC Data Access (SAP ODA). It is used:
• In PI-Sheets and XSteps to support weighing of components for goods-issues including target weighing using a weighing bar and goods-receipts of finished or semi-finished products
• Using data access subscriptions to automatically trigger functions in SAP (e.g. order confirmation) when the weight is received from the scale
• In manufacturing cockpits
This document covers the SAP settings, the overall architecture, and the basic settings required in the third-party ERP-Scale software on the scales.
This document provides best practices for using key ABAP programming features, including data storage and retrieval, dynamic programming, and administrative issues. It recommends storing persistent data in database tables and using shared objects in shared memory instead of shared buffers. For dynamic programming, it suggests prudent use and preferring dynamic token specification over code generation. It also covers best practices for dynamic data objects, anonymous objects, field symbols, dynamic tokens, RTTI/RTTC, and program generation. Finally, it discusses testing, documenting, and using packages for programs.
Data extraction and retraction in bpc bi (vikram2355)
This document provides an overview of data integration between SAP BPC and SAP BI. Master and transactional data is stored in SAP BI objects and tables and needs to be extracted into BPC cubes for planning and reporting purposes. Planned data in BPC can also be retracted back into BI cubes as needed. Data can be extracted from BI to BPC using either a BW ETL process or the BPC Data Manager functionality, while retraction from BPC to BI currently requires loading data to a flat file first before loading into BI.
1. A logical database provides optimized access to related database tables in an ABAP program. It handles the database queries and returns the results to the calling program.
2. When a logical database is used, a selection screen is automatically displayed to filter the returned data. The program only needs to define data processing using GET events.
3. GET events are executed in tree-like order to retrieve related records from different tables, matching the relationships defined in the logical database structure. This simplifies programming multi-table reports.
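The tree-like GET processing can be pictured as nested loops over parent and child tables; a minimal Python sketch using the classic SPFLI/SFLIGHT flight-data example (the actual mechanism is ABAP GET events driven by the logical database, and the sample data here is hypothetical):

```python
# A Python picture of logical-database GET events: the runtime walks the
# node tree and fires GET for each parent record, then for its children.
spfli = [{"carrid": "LH", "connid": "0400"}]
sflight = [
    {"carrid": "LH", "connid": "0400", "fldate": "20240101"},
    {"carrid": "LH", "connid": "0400", "fldate": "20240102"},
]

events = []
for route in spfli:                       # GET spfli
    events.append(("GET spfli", route["connid"]))
    for flight in sflight:                # GET sflight (child node)
        if (flight["carrid"], flight["connid"]) == (route["carrid"], route["connid"]):
            events.append(("GET sflight", flight["fldate"]))

# events now interleaves parent and child records in tree order
```

This is why a multi-table report needs no explicit SELECTs or join logic: the program only defines what to do when each node's record arrives.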
This document provides a step-by-step guide on using Business Transaction Events (BTE) as an enhancement technique in SAP's Financial Accounting module. BTEs allow customizing business processes by triggering ABAP code at certain transaction events. The guide describes BTEs and their differences from BADIs, the two types of BTE interfaces, how to find and configure BTEs using transaction codes and the function module approach, and provides an example of using a BTE to copy a custom value to a document line item field on posting.
This document provides a step-by-step guide on using Business Transaction Events (BTEs) as an enhancement technique in SAP's Financial Accounting module. It describes what BTEs are, the difference between BTEs and BADIs, the two types of BTE interfaces, and provides an example of how to configure a BTE to copy an assignment field with a custom value when accounting documents are posted for a specific company code. The document outlines finding the relevant BTE, copying the sample function module, writing ABAP code to update the field, saving and activating the function module, and assigning it to the appropriate event, country and application.
The three stages of Power BI Deployment Pipeline (servicesNitor)
The deployment pipeline is an efficient tool for BI creators. Read our blog to discover details about the three stages of Power BI deployment pipeline.
This document provides a step-by-step process for writing Batch Data Communication (BDC) reports to transfer legacy data into an SAP system. It describes recording a transaction, creating an ABAP report from the recording, and writing ABAP code to fetch data from a legacy system and store it in SAP. The steps include creating internal tables, calling upload functions, using a loop to replace constants with table fields, and testing the data transfer.
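The record-building step of a BDC report can be sketched in Python; the actual report is ABAP, and the screen-field names below (RF02D-KUNNR, KNA1-NAME1) are illustrative assumptions:

```python
def build_bdc_data(legacy_rows):
    """Map each legacy record to the screen-field name/value pairs a
    batch-input session would post (field names are hypothetical)."""
    bdcdata = []
    for row in legacy_rows:
        bdcdata.append([
            {"fnam": "RF02D-KUNNR", "fval": row["customer"]},
            {"fnam": "KNA1-NAME1",  "fval": row["name"]},
        ])
    return bdcdata
```

This mirrors the "loop that replaces constants with table fields" mentioned above: the recording's hard-coded values are swapped for fields of the uploaded internal table, one BDC data set per legacy record.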
Sap query creation and transport procedure in ecc6 (bluechxi)
This document provides instructions for creating queries and transporting them across SAP clients. It describes creating a user group, infoset, and query. The query is then exported to create a transport request which can be imported into other clients. The process requires no technical knowledge and allows effective reports to be made available across clients.
Communicating effectively and consistently with students can help them feel at ease during their learning experience and provide the instructor with a communication trail to track the course's progress. This workshop will take you through constructing an engaging course container to facilitate effective communication.
Hindi Varnamala (alphabet) PPT presentation: Hindi vowels (swar) and consonants (vyanjan), varnamala practice for children, with illustrations, by Dr. Mulla Adam Ali; Hindi language and literature resources at https://www.drmullaadamali.com
Leveraging Generative AI to Drive Nonprofit Innovation (TechSoup)
In this webinar, participants learned how to utilize Generative AI to streamline operations and elevate member engagement. Amazon Web Services experts presented customer-specific use cases and dived into low/no-code tools that are quick and easy to deploy through Amazon Web Services (AWS).
Temple of Asclepius in Thrace. Excavation results (Krassimira Luka)
The temple and the sanctuary around it were dedicated to Asklepios Zmidrenus. This name has been known since 1875, when an inscription dedicated to him was discovered in Rome. The inscription is dated to 227 AD and was left by soldiers originating from the city of Philippopolis (modern Plovdiv).
It describes the bony anatomy of the hip, including the femoral head, acetabulum, and labrum, and discusses the capsule and ligaments. The muscles that act on the hip joint and its range of motion are outlined. Factors affecting hip joint stability and weight transmission through the joint are summarized.
How to Make a Field Mandatory in Odoo 17 (Celine George)
In Odoo, making a field required can be done through both Python code and XML views. When you set the required attribute to True in Python code, it makes the field required across all views where it's used. Conversely, when you set the required attribute in XML views, it makes the field required only in the context of that particular view.
Philippine Edukasyong Pantahanan at Pangkabuhayan (EPP) Curriculum (MJDuyan)
(TLE 100) (Lesson 1) - Prelims
Discuss the EPP Curriculum in the Philippines:
- Understand the goals and objectives of the Edukasyong Pantahanan at Pangkabuhayan (EPP) curriculum, recognizing its importance in fostering practical life skills and values among students. Students will also be able to identify the key components and subjects covered, such as agriculture, home economics, industrial arts, and information and communication technology.
Explain the Nature and Scope of an Entrepreneur:
-Define entrepreneurship, distinguishing it from general business activities by emphasizing its focus on innovation, risk-taking, and value creation. Students will describe the characteristics and traits of successful entrepreneurs, including their roles and responsibilities, and discuss the broader economic and social impacts of entrepreneurial activities on both local and global scales.
Beyond Degrees - Empowering the Workforce in the Context of Skills-First.pptx (EduSkills OECD)
Iván Bornacelly, Policy Analyst at the OECD Centre for Skills, OECD, presents at the webinar 'Tackling job market gaps with a skills-first approach' on 12 June 2024
Chapter wise All Notes of First year Basic Civil Engineering.pptx (Denish Jangid)
Chapter wise All Notes of First year Basic Civil Engineering
Syllabus
Chapter 1
Introduction to the objective, scope and outcome of the subject
Chapter 2
Introduction: Scope and Specialization of Civil Engineering, Role of civil Engineer in Society, Impact of infrastructural development on economy of country.
Chapter 3
Surveying: Object Principles & Types of Surveying; Site Plans, Plans & Maps; Scales & Unit of different Measurements.
Linear Measurements: Instruments used. Linear Measurement by Tape, Ranging out Survey Lines and overcoming Obstructions; Measurements on sloping ground; Tape corrections, conventional symbols. Angular Measurements: Instruments used; Introduction to Compass Surveying, Bearings and Longitude & Latitude of a Line, Introduction to total station.
Levelling: Instrument used Object of levelling, Methods of levelling in brief, and Contour maps.
Chapter 4
Buildings: Selection of site for Buildings, Layout of Building Plan, Types of buildings, Plinth area, carpet area, floor space index, Introduction to building byelaws, concept of sun light & ventilation. Components of Buildings & their functions, Basic concept of R.C.C., Introduction to types of foundation
Chapter 5
Transportation: Introduction to Transportation Engineering; Traffic and Road Safety: Types and Characteristics of Various Modes of Transportation; Various Road Traffic Signs, Causes of Accidents and Road Safety Measures.
Chapter 6
Environmental Engineering: Environmental Pollution, Environmental Acts and Regulations, Functional Concepts of Ecology, Basics of Species, Biodiversity, Ecosystem, Hydrological Cycle; Chemical Cycles: Carbon, Nitrogen & Phosphorus; Energy Flow in Ecosystems.
Water Pollution: Water Quality standards, Introduction to Treatment & Disposal of Waste Water. Reuse and Saving of Water, Rain Water Harvesting. Solid Waste Management: Classification of Solid Waste, Collection, Transportation and Disposal of Solid Waste. Recycling of Solid Waste: Energy Recovery, Sanitary Landfill, On-Site Sanitation. Air & Noise Pollution: Primary and Secondary air pollutants, Harmful effects of Air Pollution, Control of Air Pollution. Noise Pollution: Harmful effects of noise pollution, control of noise pollution. Global warming & Climate Change, Ozone depletion, Greenhouse effect
Text Books:
1. Palancharmy, Basic Civil Engineering, McGraw Hill publishers.
2. Satheesh Gopi, Basic Civil Engineering, Pearson Publishers.
3. Ketki Rangwala Dalal, Essentials of Civil Engineering, Charotar Publishing House.
4. BCP, Surveying volume 1