This document provides guidance on debugging and troubleshooting Demantra data loading issues. It discusses custom hooks, preventing data from loading, resolving bugs, and publishing forecasts. Specific techniques covered include setting debug mode, examining log files, modifying custom hook packages, and checking staging and Demantra tables. The document is intended as a starting point for investigating data inconsistencies.
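As a concrete starting point, one common check when loaded data fails to appear is reconciling rows between the staging table and the target Demantra table. The sketch below simulates that reconciliation with an in-memory SQLite database; the table and column names (`stg_sales`, `sales_data`, `item_code`, `qty`) are illustrative stand-ins, not actual Demantra schema objects.

```python
import sqlite3

# In-memory database standing in for the Demantra schema.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Hypothetical staging and target tables (names are illustrative only).
cur.execute("CREATE TABLE stg_sales (item_code TEXT, qty REAL)")
cur.execute("CREATE TABLE sales_data (item_code TEXT, qty REAL)")
cur.executemany("INSERT INTO stg_sales VALUES (?, ?)",
                [("A100", 10), ("A200", 5), ("A300", 7)])
# A300 never made it into the target table -- the kind of gap to look for.
cur.executemany("INSERT INTO sales_data VALUES (?, ?)",
                [("A100", 10), ("A200", 5)])

# Rows present in staging but missing from the target table.
cur.execute("""
    SELECT s.item_code
    FROM stg_sales s
    LEFT JOIN sales_data d ON d.item_code = s.item_code
    WHERE d.item_code IS NULL
""")
missing = [row[0] for row in cur.fetchall()]
print(missing)  # -> ['A300']
```

A nonempty result points at rows that a custom hook or validation rule may have filtered out during the load.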
Advanced Planning Administrator responsibilities include managing instances, organizational security, and creating forecast rules. Material planners are responsible for forecasting, master production scheduling, and material requirements planning. Key steps in the APS process include:
1. Creating forecast sets and entering forecasts.
2. Defining sourcing rules to determine how to source demand.
3. Collecting planning data from source systems.
4. Creating production plans and simulating to check supply and demand before releasing orders.
Oracle Advanced Supply Chain Planning (ASCP) is a comprehensive planning solution that decides when and where supplies should be deployed within an extended supply chain. It allows for multi-organization planning across customer, manufacturing, distribution, and supplier organizations. ASCP performs holistic planning across the entire planning horizon, supply chain, and all manufacturing methods in a single plan. It also enables constrained, finite capacity planning while respecting material and capacity constraints. Optimization allows the planning process to determine the lowest cost plan based on assigned costs and penalties.
Time Series vs Order-Based Planning in SAP IBP (Ayan Bishnu)
A comparison of time-series-based supply planning versus order-based supply planning in SAP IBP, taking into consideration some of the most common solution needs across industries.
The document provides an overview of Oracle Demantra, a demand management and sales and operations planning solution. It describes key Demantra capabilities like statistical forecasting, shape modeling, handling causal factors, new product introductions, and global forecasting. It also discusses how Demantra supports business processes such as workflow-enabled collaboration, disaggregating forecasts, using expressions, and consumption-based planning. The document is intended to help understand what Demantra is and how it can be used.
This document provides an overview of the configuration of Oracle Process Manufacturing for BAPCO's Lubricants Business Unit. It includes descriptions of the business flow, organizational structure, configuration requirements for stock transfers between plants, and preliminary setup steps. Key aspects that were assumed in the configuration are outlined, such as using actual costing, cost component classes, and inventory transfer configurations. An application setup control sheet lists the various setup responsibilities.
The document summarizes the key differences between Earliest Due Date (EDD) and Earliest Component Completion (ECC) planning methods in Advanced Supply Chain Planning (ASCP). It shows how EDD focuses on meeting demand dates without considering resource constraints, which can result in overloading resources, while ECC aims to finish components earliest to level the workload across resources. The document provides examples of output from ASCP plans using both EDD and ECC, demonstrating how ECC plans can eliminate exceptions by balancing the workload across constrained resources.
This document provides an overview of an ASCP training session on Oracle's Advanced Supply Chain Planning solution. It discusses:
1) The basics of ASCP, including planning materials, capacity, and production to map supply to demand.
2) Key terms used in ASCP like purchase orders, work orders, routings, and resources.
3) An implementation of ASCP for a machining center to improve scheduling and capacity planning across 21 machines and over 150 SKUs.
4) The steps required to properly set up the planning environment in ASCP, including cleaning up old data, maintaining accurate item masters, and entering material and resource constraints.
This document provides an overview of implementing Oracle Advanced Supply Chain Planning (ASCP). It discusses the key capabilities of ASCP and how it differs from MRP. The implementation process involves initial setups like defining instances, organizations, and item attributes. Critical steps include collecting sales order demand, defining the master demand schedule and plan, setting plan options, executing the plan, and accessing the planner's workbench. Potential issues that could arise are also reviewed.
Oracle EAM enables organizations to drive maintenance best practices, empower workers with self-service applications, and manage the full asset lifecycle with a single integrated solution. It provides features such as asset management, preventative maintenance, work order management, planning and scheduling, stores management, and cost management. Oracle EAM aims to improve asset performance while supporting compliance.
How to apply surcharges to sales orders (Subramanyam Yadav)
1. Define a manual modifier called "XXMANUAL-SPECAIL-CHARGES" with a charge name of "Administration Fees" and a lump sum value of 20.
2. Build attribute mapping rules for the modifier using Oracle Pricing Manager.
3. Apply the necessary pricing privileges to maintain the modifier.
4. Enter and save a sales order, then select "Charges" from the action menu to apply the "Administration Fees" surcharge of 20 to the order.
This document provides an overview of Oracle Advanced Supply Chain Planning (ASCP) training. It outlines the intended audience, related training materials, and how to log into the Oracle Applications. It then describes the ASCP data flow and planning cycle. The document details how to set up important inventory item attributes, planning attributes, lead times, calendars, resources, routing, and supply chain parameters in Oracle Applications. It explains data collection methods and how to define, launch, and copy ASCP plans. The remainder of the document discusses using the Planner Workbench for analysis, simulations, and reports as well as frequently asked questions and constraint-based planning.
The document discusses the benefits of a new centralized inventory system in Oracle R12 for companies that do both process and discrete manufacturing. It allows such hybrid manufacturers to operate with a single item master and inventory system. This unified system reduces redundant data and maintenance while improving inventory visibility and supply chain integration. It also enables process manufacturers to engage in available-to-promise checking while respecting quality processes through new inventory status attributes.
Oracle VCP Training - Oracle Value Chain Planning (Amit Sharma)
Oracle Value Chain Planning (VCP) is a comprehensive solution that brings together planning, optimization, and analytics capabilities. It helps organizations create flexible supply chains that can adapt to demand variations. The VCP products are modular and can be implemented incrementally, allowing organizations to first resolve key issues while building on prior work. VCP consists of various products that support advanced planning, manufacturing, inventory optimization, collaborative planning, and more.
View related videos:
Truth about Supply Demand Planning:
http://www.youtube.com/watch?v=K66q2o1ED3c
Demantra vs Oracle Demand Planning:
http://www.youtube.com/watch?v=QwAzP3T6ut4
Another SlideShare PPT:
http://www.slideshare.net/amitforu78/demantra-vs-oracle-demand-planning
Contact me at www.ezdia.com
This document summarizes a presentation about Oracle Demantra software modules for supply chain planning. It discusses Beckman Coulter's business environment with multiple forecast sources and planning teams. It then reviews the Demand Management, Advanced Forecasting, and Real Time S&OP modules, comparing their functionality for demand data, hierarchies, forecasting methods, and sales and operations planning. The selection was made based on Demand Management for basic functionality, Advanced Forecasting for advanced forecasting models, and Real Time S&OP for supply data and S&OP worksheets.
This document discusses setting up Oracle Enterprise Asset Management (eAM) in Oracle E-Business Suite R12.1. It covers required setup tasks like defining category codes, category sets, asset groups, activities, and associating activities with assets. The document provides step-by-step instructions for performing each setup task with screenshots. It explains how to create activity association templates to streamline activity and asset associations. The overall purpose is to educate users on how to configure eAM for use within their organization.
This document provides an overview of SAP PS configuration including defining project structures, time profiles, budgeting and controlling parameters, and scheduling types. It discusses configuring special characters for project coding, project coding masks, field selection for work breakdown structures, and other parameters. The configuration enables structuring projects, planning, monitoring and controlling project progress according to business needs.
This document provides an implementation and user's guide for Oracle Global Order Promising. It contains 7 chapters that describe how to set up and use ATP functionality based on collected data or planning output, including configuration, product family, and multi-level supply chain ATP. It also covers ATP inquiry, order scheduling, a diagnostic ATP tool, and an order backlog workbench for scheduling order lines.
1. The document discusses setting up quality management in Oracle applications, including defining items, specifications, tests, sampling plans, and other key quality parameters.
2. It provides instructions on configuring automatic sample creation for inventory, work in process, and suppliers to simplify sampling in high-volume environments. This includes setting up business events, validation rules, and inventory deductions.
3. Built-in reports are available to view test results for inventory, work in process, certificates of analysis, and other quality data to monitor compliance with specifications.
SAP ALV Excel Inplace with macro recording (sapignite, Aromal Raveendran)
This article explains how to use the MS Excel Inplace functionality in the SAP ALV toolbar with macro recording to avoid repetitive tasks, such as adding a custom calculation field (e.g. Order Qty - Delivery Qty) or a pivot-table chart to standard, custom, or query reports.
- The document describes billing plans, which define schedules of billing dates for items in sales documents. There are two main types of billing plans: periodic billing, which bills a total amount on set dates, and milestone billing, which distributes amounts over dates linked to project milestones.
- Billing plans are controlled through billing plan types, date categories, and date descriptions defined in customizing. These determine how dates are automatically set and what data is associated with each date.
- Periodic billing typically uses monthly or quarterly intervals between set dates. Milestone billing links dates to percentages of project completion or amounts upon reaching project milestones.
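To make the two plan types concrete, here is a small sketch that generates periodic (quarterly) billing dates and distributes a milestone-billed total by completion percentages. It is a toy model of the behavior described above, not SAP code; the interval and percentage values are assumptions for illustration.

```python
from datetime import date

def periodic_dates(start, months_between, count):
    """Billing dates at fixed monthly intervals (periodic billing)."""
    dates = []
    y, m = start.year, start.month
    for _ in range(count):
        dates.append(date(y, m, start.day))
        m += months_between
        y, m = y + (m - 1) // 12, (m - 1) % 12 + 1
    return dates

def milestone_amounts(total, percentages):
    """Distribute a contract total over milestones (milestone billing)."""
    return [round(total * p / 100.0, 2) for p in percentages]

# Quarterly billing over one year, starting January 1.
print(periodic_dates(date(2024, 1, 1), 3, 4))
# -> [2024-01-01, 2024-04-01, 2024-07-01, 2024-10-01]

# 30% / 30% / 40% of a 100,000 contract at three project milestones.
print(milestone_amounts(100000, [30, 30, 40]))  # -> [30000.0, 30000.0, 40000.0]
```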
This document provides an overview of backflush processing in Oracle Work in Process. It defines backflushing as automatically reducing component quantities from inventory when assemblies or operations are completed. It discusses the six processes that can trigger backflush transactions: 1) completing assemblies at operations, 2) moving and completing assemblies into inventory, 3) completing assemblies into inventory, 4) receiving assemblies from outside processing, 5) importing move transactions, and 6) importing inventory transactions. For each process, it explains when backflush transactions occur and which component supply types are backflushed.
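The core of backflushing described above, deducting components from inventory automatically when an assembly is completed, can be sketched in a few lines. This is an illustrative model only; the BOM and on-hand structures are hypothetical, not Oracle WIP tables.

```python
# Hypothetical bill of material: component -> quantity per assembly.
bom = {"frame": 1, "wheel": 2, "bolt": 8}

# Hypothetical on-hand component inventory.
on_hand = {"frame": 50, "wheel": 120, "bolt": 900}

def backflush(completed_qty, bom, on_hand):
    """Deduct component usage from inventory when assemblies complete."""
    for component, qty_per in bom.items():
        on_hand[component] -= qty_per * completed_qty
    return on_hand

# Completing 10 assemblies backflushes 10 frames, 20 wheels, 80 bolts.
print(backflush(10, bom, on_hand))
# -> {'frame': 40, 'wheel': 100, 'bolt': 820}
```

In the real product, which of the six trigger processes fires this deduction, and which supply types participate, is controlled by configuration rather than code.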
1) Availability check in SAP SD checks if sufficient material is available on the calculated availability date to deliver goods to customers on the requested delivery date. It is done at the plant level and considers factors like replenishment lead time and partial vs complete delivery.
2) Customization of availability check can be done by configuring transfer of requirements, planning strategies, MRP groups, checking groups, and schedule line categories to control the scope and level of checking.
3) Key determinants include requirements class, checking group, strategy group, and delivery item category, which control whether checking occurs and at what level or frequency.
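The date logic in the points above can be sketched as a toy availability check: if on-hand stock covers the requested quantity, confirm the requested delivery date; otherwise push the confirmation out by the replenishment lead time. This is a deliberate simplification of SAP SD behavior, for illustration only.

```python
from datetime import date, timedelta

def availability_check(requested_qty, on_hand, requested_date,
                       today, replenishment_lead_time_days):
    """Return (confirmed_qty, confirmed_date) for a simplified ATP check."""
    if on_hand >= requested_qty:
        # Stock covers the full quantity: confirm the requested date.
        return requested_qty, requested_date
    # Otherwise the full quantity is only available after replenishment.
    replenished = today + timedelta(days=replenishment_lead_time_days)
    return requested_qty, max(requested_date, replenished)

today = date(2024, 5, 1)
# Enough stock: the requested date is confirmed.
print(availability_check(10, 50, date(2024, 5, 10), today, 14))
# -> (10, datetime.date(2024, 5, 10))
# Short on stock: confirmation moves out to the replenishment date.
print(availability_check(10, 2, date(2024, 5, 10), today, 14))
# -> (10, datetime.date(2024, 5, 15))
```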
The document describes the process of setting up an approval hierarchy in Oracle Apps. It involves defining jobs, positions and employees, building the position hierarchy, creating approval groups, assigning approvals, and setting the default hierarchy on document types. The example sets up a three-tier approval hierarchy for purchase orders, with purchaser, department head, and branch head approval levels based on purchase amount.
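An amount-based routing rule like the one in that example can be sketched as a simple ordered lookup. The threshold amounts here are invented for illustration; the actual limits would live in Oracle approval group setup.

```python
# Hypothetical approval limits per position, in ascending hierarchy order.
APPROVAL_LIMITS = [
    ("Purchaser", 10_000),
    ("Department Head", 100_000),
    ("Branch Head", float("inf")),
]

def required_approver(po_amount):
    """Return the lowest position whose limit covers the PO amount."""
    for position, limit in APPROVAL_LIMITS:
        if po_amount <= limit:
            return position
    raise ValueError("amount exceeds all approval limits")

print(required_approver(5_000))    # -> Purchaser
print(required_approver(50_000))   # -> Department Head
print(required_approver(500_000))  # -> Branch Head
```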
- Oracle Form Builder is a component of Oracle Developer/2000 that is used to create event-driven applications to enter, access, change, or delete data from an Oracle database.
- A Forms application consists of forms, menus, and libraries. Forms uses triggers, processes, and events to control user interactions and database transactions.
- The main components of a Forms application include windows, canvases, blocks, items, triggers, alerts, lists of values (LOVs), editors, parameters, program units, libraries, and object groups.
This document provides an overview of setting up a multi-organization structure in Oracle Financials R12. It discusses defining business groups, ledgers, legal entities, operating units, and inventory organizations. It also covers multi-org access control, preferences, and validation reports. The document outlines the steps to create these elements and establish relationships between the different organization types.
DAC Notes. We provide the best training and placement in data warehousing and big data analytics. We mainly offer training on:
1) OBIEE
2) ODI
3) OBIA
4) INFORMATICA
5) HADOOP
Oracle ACE Director Dan Morgan presented these slides about migrating to Database 12c and how to get it right. For more information, visit www.perftuning.com.
Between 2015 and 2017 a large percentage of Oracle's existing customer base will be upgrading their existing databases to the new version 12cR1. Most of the time when upgrades happen, the only benefit organizations receive is the satisfaction of having survived the upgrade unscathed. In general, the new database, other than having a new version number, provides little in the way of tangible benefits.
With the re-architecture that can come with a 12cR1 upgrade it is, for the first time, possible to plan for and receive substantial measurable benefits, and possible to make costly mistakes that could create substantial liabilities that are both business and financial.
Oracle ACE Director and industry veteran Dan Morgan, in a presentation targeted at IT/IS management, explores both the benefits and the risks and provides a guideline for "getting it right."
This Performance Tuning's Lunch & Learn event focuses on management, planning, and budgeting, not features and technology, and provides you and your management teams the information they need to perform the next database upgrade or migration cycle.
SQL Server 2008 replication technical case study (Klaudiia Jacome)
The document discusses how a company implemented SQL Server database replication to improve high availability and disaster recovery capabilities across multiple data centers. It describes the company's original infrastructure with a single point of failure and key requirements. The project team then planned and designed a new topology using SQL Server 2008 replication to synchronize data between data centers and enable failover, improving availability and recovery in the event of an outage.
"It can always get worse!" – Lessons Learned in over 20 years working with Or... (Markus Michalewicz)
First presented during the DOAG 2022 Conference and Exhibition, this presentation discusses and reviews the most significant lessons learned in over 20 years of working with Oracle Maximum Availability Architecture. It explains why documentation is good, but automated checks are better, and why standardization can help increase the availability of nearly all systems, including database systems.
The document discusses PowerCenter 9.x upgrade strategies presented by Softpath at the Atlanta User Group. It introduces the presenters and provides an overview of Softpath. Various upgrade approaches - such as zero downtime, parallel, cloned, and in-place upgrades - are presented along with their benefits, risks, and time requirements. The stages of an upgrade including planning, preparation work, installation, testing, and production implementation are also outlined.
This document discusses how MySQL indexes and histograms can speed up queries. It begins with an introduction to the presenter and topic. The goal of reducing query response time is discussed. Methods for identifying inefficient queries are covered, including using the sys schema. The role of the MySQL optimizer in evaluating query plans is then explained. Different types of indexes that can be used to optimize queries are also outlined.
TECHNICAL WHITE PAPER: Symantec Backup Exec 2014 Blueprints - Optimized Duplic... (Symantec)
This Blueprint is designed to help customers who are utilising Backup Exec's deduplication option to improve back-end storage capabilities within a complex backup environment.
Challenges of Legacy Disaster Recovery Methods
Offsite data protection helps organizations plan for disaster recovery by keeping backup copies of important data at one or more additional locations other than the main office. Data is usually transported offsite using removable storage media such as magnetic tape or optical storage.
Companies relying on tape solutions to protect against disaster face several challenges, including tape transport costs, security issues, and the complexities of media management. Many companies are looking for alternatives that allow them to overcome or avoid these challenges, such as methods that copy or replicate data electronically over a WAN/LAN connection to disk storage at a disaster recovery site. Alternatives such as these enable lower costs, improved security, and improved Recovery Point and Recovery Time Objectives (RPOs/RTOs).
Backup Exec’s Optimized Duplication Technology
Backup Exec™ 2014 offers a cost-effective backup replication method known as optimized duplication. Optimized duplication combines the powerful backup and data deduplication technologies in Backup Exec™ 2014 to enable the optimized transfer of data over a LAN/WAN connection from one Backup Exec server to another Backup Exec server. Copying backup data from one Backup Exec server to another using optimized duplication makes the same backup data available for recovery at multiple locations, thereby offering a convenient and cost-effective disaster recovery solution.
The document provides an introduction and overview of Neo4j Ops Manager (NOM). It discusses the current challenges of managing large graph databases and clusters that are growing in data size and complexity. It presents NOM as a solution for monitoring, administering, and operating Neo4j installations from a single interface. The presentation includes a demonstration of NOM's key features like monitoring dashboards, alerts, security management, and upgrade capabilities. It also covers how to register an agent and add a new instance to NOM for management.
O365con14 - migrating your e-mail to the cloud (NCCOMMS)
This document provides an overview of migrating email to Office 365. It introduces the speaker and his background in IT and Exchange. The agenda covers why Office 365, how to plan, prepare and deploy an Office 365 migration, and emphasizes new features in SP1, tools, and common pitfalls. Example deployment scenarios are provided for small businesses and established companies. Planning questions are addressed around Active Directory, multi-factor authentication, mail encryption. Preparing the on-premise environment involves cleaning up Active Directory and Exchange. Tools like Exchange Best Practices Analyzer are recommended. Tips are provided for running Office 365 including the admin center and Powershell. Troubleshooting mail flow and connectivity issues are also addressed.
[EN] Building modern data pipeline with Snowflake + DBT + Airflow.pdf (Chris Hoyean Song)
The document outlines an agenda for the NFTBank x Snowflake Tech Seminar. The seminar will cover three sessions: 1) data quality and productivity with discussions of data validation, cataloging and lineage documentation, and an introduction to DBT; 2) integrating DBT with Airflow using Astronomer Cosmos; and 3) cost optimization through query optimization and cost monitoring. The seminar will be led by Chris Hoyean Song, VP of AIOps at NFTBank.
Data Con LA 2022 - Supercharge your Snowflake Data Cloud from a Snowflake Dat... (Data Con LA)
Frank Bell, Data Thought Leader and Snowflake SME at Accenture - CEO at ITS
We will cover all aspects of optimizing your Snowflake Data Cloud including:
*Dive deep into how Snowflake pay-as-you-go costs work, and how by utilizing our proven optimization tools (Snoptimizer SaaS Snowflake Optimizer - https://snoptimizer.com/), scripts, and architecture techniques you typically can save 10-40++% on your existing Snowflake Account costs.
*Explain how Snowflake Compute works and proven techniques on how to architect warehouses for both cost and performance efficiency. We cover in depth how snowflake scales BOTH out and in as well as up and down with compute resources.
*Explain how Snowflake data storage works with Replication, Time-Travel, and Cloning. We explain these awesome features as well as their downsides if they are used and configured wrongly.
*Cover Snowflake cloud services costs and features that have costs related to them, including Snowpipe, Search Optimization, Materialized Views, Auto-clustering, and other recent new cost based features that provide value at a cost.
*Finally, we will discuss how you can ensure your Snowflake Account(s) are fully optimized not just for cost but also for security and performance on Snowflake. We will show you security and performance best practices as well as pitfalls to avoid.
This document contains Deepak Jacob's resume, including his contact information, education background with an MBA in IT and BSc in Computer Science, 11+ years of experience in IT roles focused on storage administration and management, current role as a Senior Consultant and Technical Delivery Lead for storage infrastructure at Capgemini, and objective to obtain a challenging position utilizing his technical and interpersonal skills.
This document contains a practice exam for the Oracle Exadata Database Machine 2014 Implementation Essentials certification (exam 1Z0-485). It includes 21 multiple choice questions about configuring and implementing Exadata, with explanations provided for each answer. Key topics covered include Exadata networking, storage configuration, cell offloading, I/O resource management, backups, health checks, and integrating Exadata with Enterprise Manager.
Open Source 101 2022 - MySQL Indexes and Histograms (Frederic Descamps)
Nobody complains that the database is too fast. But when things slow down, the complaints come quickly. The two most popular approaches to speeding up queries are indexes and histograms. But there are so many options and types on indexes that it can get confusing. Histograms are fairly new to MySQL but they do not work for all types of data. This talk covers how indexes and histograms work and show you how to test just how effective they are so you can measure the performance of your queries.
The document summarizes the presentation "Top Ten S7 Tips and Tricks" given at the 2011 Automation Summit. The presentation covered ten tips for programming Siemens S7 PLCs more efficiently, including using modular object-oriented architecture with function blocks and data types, monitoring function block instances, reporting system errors, using RAM disks and auto-generating symbol tables, activating and deactivating network nodes, basic safety programming, parsing data in local memory, backing up data block data, and useful keyboard shortcuts. The presenter was Nick Shea from DMC Engineering.
How Skroutz S.A. utilizes Deep Learning and Machine Learning techniques to efficiently serve product categorization! Based on my talk at Athens PyData meetup!
The global PyData network promotes discussion of best practices, new approaches, and emerging technologies for data management, processing, analytics, and visualization.
The DotMac Kit version 2.0 is a framework that allows developers to build Mac applications with .Mac features like messaging and file sharing without network programming. This document provides release notes on version 2.0 developer preview 2, including new universal binary support, bug fixes, and recommendations for optimal network configurations for messaging. Developers are encouraged to submit bug reports and feedback to further improve the framework.
The document provides an overview of Oracle Enterprise Manager 12c (OEM12c) with the following key points:
1. It introduces OEM12c and its capabilities for complete cloud lifecycle management including planning, building, testing, deploying, monitoring cloud services.
2. It discusses how to install OEM12c including checking requirements, using the bundle patch, and setting the correct hostname during installation.
3. It covers some common troubleshooting steps like resolving issues with configuration requirements and changing the hostname or IP address.
4. It provides some tips for OEM12c like creating scripts for starting, stopping and checking status, and backing up the admin server configuration.
Ganesh Narayan Sonsale has over 20 years of experience as an Oracle Database and WebLogic Administrator. He currently works as a Manager DBA at Tech Mahindra, where he is responsible for managing Phoenix and other applications. Some of his key responsibilities include database administration, application server administration, performance tuning, backup/recovery strategies, and infrastructure management. He has extensive experience migrating and upgrading databases from Oracle 10g to 11g, and WebLogic 8 to 10.
Enhancing adoption of Open Source Libraries. A case study on Albumentations.AI (Vladimir Iglovikov, Ph.D.)
Presented by Vladimir Iglovikov:
- https://www.linkedin.com/in/iglovikov/
- https://x.com/viglovikov
- https://www.instagram.com/ternaus/
This presentation delves into the journey of Albumentations.ai, a highly successful open-source library for data augmentation.
Created out of a necessity for superior performance in Kaggle competitions, Albumentations has grown to become a widely used tool among data scientists and machine learning practitioners.
This case study covers various aspects, including:
People: The contributors and community that have supported Albumentations.
Metrics: The success indicators such as downloads, daily active users, GitHub stars, and financial contributions.
Challenges: The hurdles in monetizing open-source projects and measuring user engagement.
Development Practices: Best practices for creating, maintaining, and scaling open-source libraries, including code hygiene, CI/CD, and fast iteration.
Community Building: Strategies for making adoption easy, iterating quickly, and fostering a vibrant, engaged community.
Marketing: Both online and offline marketing tactics, focusing on real, impactful interactions and collaborations.
Mental Health: Maintaining balance and not feeling pressured by user demands.
Key insights include the importance of automation, making the adoption process seamless, and leveraging offline interactions for marketing. The presentation also emphasizes the need for continuous small improvements and building a friendly, inclusive community that contributes to the project's growth.
Vladimir Iglovikov brings his extensive experience as a Kaggle Grandmaster, ex-Staff ML Engineer at Lyft, sharing valuable lessons and practical advice for anyone looking to enhance the adoption of their open-source projects.
Explore more about Albumentations and join the community at:
GitHub: https://github.com/albumentations-team/albumentations
Website: https://albumentations.ai/
LinkedIn: https://www.linkedin.com/company/100504475
Twitter: https://x.com/albumentations
Let's Integrate MuleSoft RPA, COMPOSER, APM with AWS IDP along with Slack (shyamraj55)
Discover the seamless integration of RPA (Robotic Process Automation), COMPOSER, and APM with AWS IDP enhanced with Slack notifications. Explore how these technologies converge to streamline workflows, optimize performance, and ensure secure access, all while leveraging the power of AWS IDP and real-time communication via Slack notifications.
In his public lecture, Christian Timmerer provides insights into the fascinating history of video streaming, starting from its humble beginnings before YouTube to the groundbreaking technologies that now dominate platforms like Netflix and ORF ON. Timmerer also presents provocative contributions of his own that have significantly influenced the industry. He concludes by looking at future challenges and invites the audience to join in a discussion.
GraphSummit Singapore | The Future of Agility: Supercharging Digital Transfor... (Neo4j)
Leonard Jayamohan, Partner & Generative AI Lead, Deloitte
This keynote will reveal how Deloitte leverages Neo4j’s graph power for groundbreaking digital twin solutions, achieving a staggering 100x performance boost. Discover the essential role knowledge graphs play in successful generative AI implementations. Plus, get an exclusive look at an innovative Neo4j + Generative AI solution Deloitte is developing in-house.
A tale of scale & speed: How the US Navy is enabling software delivery from l... (sonjaschweigert1)
Rapid and secure feature delivery is a goal across every application team and every branch of the DoD. The Navy’s DevSecOps platform, Party Barge, has achieved:
- Reduction in onboarding time from 5 weeks to 1 day
- Improved developer experience and productivity through actionable findings and reduction of false positives
- Maintenance of superior security standards and inherent policy enforcement with Authorization to Operate (ATO)
Development teams can ship efficiently and ensure applications are cyber ready for Navy Authorizing Officials (AOs). In this webinar, Sigma Defense and Anchore will give attendees a look behind the scenes and demo secure pipeline automation and security artifacts that speed up application ATO and time to production.
We will cover:
- How to remove silos in DevSecOps
- How to build efficient development pipeline roles and component templates
- How to deliver security artifacts that matter for ATO’s (SBOMs, vulnerability reports, and policy evidence)
- How to streamline operations with automated policy checks on container images
Essentials of Automations: The Art of Triggers and Actions in FME (Safe Software)
In this second installment of our Essentials of Automations webinar series, we’ll explore the landscape of triggers and actions, guiding you through the nuances of authoring and adapting workspaces for seamless automations. Gain an understanding of the full spectrum of triggers and actions available in FME, empowering you to enhance your workspaces for efficient automation.
We’ll kick things off by showcasing the most commonly used event-based triggers, introducing you to various automation workflows like manual triggers, schedules, directory watchers, and more. Plus, see how these elements play out in real scenarios.
Whether you’re tweaking your current setup or building from the ground up, this session will arm you with the tools and insights needed to transform your FME usage into a powerhouse of productivity. Join us to discover effective strategies that simplify complex processes, enhancing your productivity and transforming your data management practices with FME. Let’s turn complexity into clarity and make your workspaces work wonders!
UiPath Test Automation using UiPath Test Suite series, part 6 (DianaGray10)
Welcome to UiPath Test Automation using UiPath Test Suite series part 6. In this session, we will cover Test Automation with generative AI and Open AI.
UiPath Test Automation with generative AI and Open AI webinar offers an in-depth exploration of leveraging cutting-edge technologies for test automation within the UiPath platform. Attendees will delve into the integration of generative AI, a test automation solution, with Open AI advanced natural language processing capabilities.
Throughout the session, participants will discover how this synergy empowers testers to automate repetitive tasks, enhance testing accuracy, and expedite the software testing life cycle. Topics covered include the seamless integration process, practical use cases, and the benefits of harnessing AI-driven automation for UiPath testing initiatives. By attending this webinar, testers, and automation professionals can gain valuable insights into harnessing the power of AI to optimize their test automation workflows within the UiPath ecosystem, ultimately driving efficiency and quality in software development processes.
What will you get from this session?
1. Insights into integrating generative AI.
2. Understanding how this integration enhances test automation within the UiPath platform
3. Practical demonstrations
4. Exploration of real-world use cases illustrating the benefits of AI-driven test automation for UiPath
Topics covered:
What is generative AI
Test Automation with generative AI and Open AI.
UiPath integration with generative AI
Speaker:
Deepak Rai, Automation Practice Lead, Boundaryless Group and UiPath MVP
How to Get CNIC Information System with Paksim Ga.pptx (danishmna97)
Pakdata Cf is a groundbreaking system designed to streamline and facilitate access to CNIC information. This innovative platform leverages advanced technology to provide users with efficient and secure access to their CNIC details.
Securing your Kubernetes cluster: a step-by-step guide to success! (KatiaHIMEUR1)
Today, after several years of existence, an extremely active community and an ultra-dynamic ecosystem, Kubernetes has established itself as the de facto standard in container orchestration. Thanks to a wide range of managed services, it has never been so easy to set up a ready-to-use Kubernetes cluster.
However, this ease of use means that the subject of security in Kubernetes is often left for later, or even neglected. This exposes companies to significant risks.
In this talk, I'll show you step-by-step how to secure your Kubernetes cluster for greater peace of mind and reliability.
What do a Lego brick and the XZ backdoor have in common? (Speck&Tech)
ABSTRACT: At first glance, a Lego brick and the XZ backdoor might seem to have in common only the fact that both are building blocks, or dependencies, of creative and software projects. In reality, a Lego brick and the XZ backdoor case have much more than that in common.
Join the presentation to immerse yourself in a story of interoperability, standards, and open formats, and then discuss the important role that contributors play in a sustainable open source community.
BIO: Advocate of free software and of standard, open formats. She has been an active member of the Fedora and openSUSE projects and co-founded the LibreItalia Association, where she has been involved in several events, migrations, and training activities related to LibreOffice. She previously worked on LibreOffice migrations and training courses for several public administrations and private companies. Since January 2020 she has worked at SUSE as a Software Release Engineer for Uyuni and SUSE Manager, and when not following her passion for computers and for Geeko she cultivates her curiosity about astronomy (hence her nickname, deneb_alpha).
Introducing Milvus Lite: Easy-to-Install, Easy-to-Use vector database for you... (Zilliz)
Join us to introduce Milvus Lite, a vector database that can run on notebooks and laptops, share the same API with Milvus, and integrate with every popular GenAI framework. This webinar is perfect for developers seeking easy-to-use, well-integrated vector databases for their GenAI apps.
“An Outlook of the Ongoing and Future Relationship between Blockchain Technologies and Process-aware Information Systems.” Invited talk at the joint workshop on Blockchain for Information Systems (BC4IS) and Blockchain for Trusted Data Sharing (B4TDS), co-located with with the 36th International Conference on Advanced Information Systems Engineering (CAiSE), 3 June 2024, Limassol, Cyprus.
Full-RAG: A modern architecture for hyper-personalization (Zilliz)
Mike Del Balso, CEO & Co-Founder at Tecton, presents "Full RAG," a novel approach to AI recommendation systems, aiming to push beyond the limitations of traditional models through a deep integration of contextual insights and real-time data, leveraging the Retrieval-Augmented Generation architecture. This talk will outline Full RAG's potential to significantly enhance personalization, address engineering challenges such as data management and model training, and introduce data enrichment with reranking as a key solution. Attendees will gain crucial insights into the importance of hyperpersonalization in AI, the capabilities of Full RAG for advanced personalization, and strategies for managing complex data integrations for deploying cutting-edge AI solutions.
Building RAG with self-deployed Milvus vector database and Snowpark Container... (Zilliz)
This talk will give hands-on advice on building RAG applications with an open-source Milvus database deployed as a docker container. We will also introduce the integration of Milvus with Snowpark Container Services.
14. file:///D|/Oracle/MFG/FY09Q1Q2_Projects/Support_Communities/Advisor_Webcasts/2009_0506_Jeff_Goulette/WC-Hook.txt
WARNING!! WARNING!! WARNING!! WARNING!! MAY CAUSE DROWSINESS!
The material presented below is of a technical nature. Little attention has been
given to functional navigation or functional demonstrations. The vast majority of
data load issues are investigated and solved using SQL. To that end, this presentation
focuses almost exclusively on problem investigation and resolution.
This material has been assembled from hundreds of bugs in which Development gave us
techniques to drill into source and destination data.
Demantra Custom Hooks and Assorted Debugging Techniques
Data Loading Flow
-----------------
There are four data flows that move data in and out of Demantra:
1. Loading data from source into collection staging tables
* These are the T_SRC_% and error tables. ep_load_main procedure.
2. Moving data from the collection staging tables into Demantra data store
3. Loading data from the Demantra data store into the Demantra engine and
downloading from Demantra back to the Demantra data store.
4. Pushing the data back to the source instance in the form of a forecast.
We will cover #3 and #4 in this presentation.
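Whichever flow you are debugging, a quick first sanity check is to compare what landed in the collection staging table against its error table. A minimal sketch (the error table name follows the usual T_SRC_% pattern but is an assumption; verify the exact name in your instance):

----
SELECT COUNT(*) FROM t_src_sales_tmpl;
-- hypothetical error-table name; check your schema:
SELECT COUNT(*) FROM t_src_sales_tmpl_err;
----

If rows are landing in the error table, examine its error columns before looking further downstream.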
next
================================
Creating a new series through a data model upgrade from the Business Modeler, bringing in
the data through custom hooks and ep_load.
Step 1 - Add new column in the interface table t_src_sales_tmpl.
Step 2 – Add new series through the data model wizard using the newly
created column in the table t_src_sales_tmpl.
Step 3 – Upgrade the data model.
Step 4 – Put the custom query in the hook provided in package msd_dem_custom_hook.
Step 5 – Run Shipment and booking history collection with auto download set to ‘No’.
Step 6 – Check the data in the interface table t_src_sales_tmpl.
Step 7 – Run Workflow EBS_Full_download.
Step 8 – See the data in the worksheet.
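Step 4's hook body is typically a small SQL update that populates the new column in the staging table. A minimal sketch, assuming a hypothetical new column MY_NEW_SERIES, a hypothetical custom source table XX_CUSTOM_HISTORY, and illustrative join columns (all of these names are assumptions, not the actual hook code):

----
-- hypothetical hook body; column and table names are illustrations only
UPDATE t_src_sales_tmpl t
SET t.my_new_series =
      (SELECT h.quantity
       FROM xx_custom_history h
       WHERE h.item_code = t.ebs_item_sr_pk
       AND h.sales_date = t.sales_date);
COMMIT;
----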
Please see:
ORACLE DEMANTRA, Customizing Hierarchies
* Please request a copy from Oracle Support
next
================================
Working with Oracle Support & Development: Debugging Custom Hooks
Please set the profile MSD_DEM: Debug Mode to Yes, then run the
shipment and booking history program again and provide the following:
1. Log files and Output Files of all the concurrent programs launched.
2. Trace file and the tkprof output of the trace file for the DB session.
Also upload the modified custom hooks package spec and body.
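If Support asks you to capture the trace in item 2 yourself, standard Oracle session tracing can be enabled from the session that runs the load (the tracefile identifier below is an arbitrary example):

----
ALTER SESSION SET tracefile_identifier = 'demantra_debug';
ALTER SESSION SET sql_trace = TRUE;
-- run the collection here, then:
ALTER SESSION SET sql_trace = FALSE;
----

The resulting .trc file in the database trace directory can then be formatted with tkprof before uploading.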
next
================================
Preventing Data from Being Loaded
- You do not want the demand class item level to be imported to Demantra.
- You want to update the demand class column in item staging table to N/A.
- To accomplish this task, you can modify the sales history custom hook package,
on the EBS side, so that it will be executed during EBS collections.
- The SQL statement inserted into the custom hook program is below.
----
UPDATE MSDEM.t_src_sales_tmpl
SET EBS_DEMAND_CLASS_SR_PK = '-777', EBS_DEMAND_CLASS_CODE = '0';
COMMIT;
----
- After running the Shipping/Booking History collections, the SQL statement will
update the T_SRC_SALES_TMPL table. Any other custom hooks in place to update
T_SRC_ITEMS_TMPL and T_SRC_LOC_TMPL will also be executed.
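To confirm the hook actually fired, a simple count after collections run, using the same columns as the UPDATE above, should return zero:

----
SELECT COUNT(*)
FROM msdem.t_src_sales_tmpl
WHERE ebs_demand_class_sr_pk <> '-777';
----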
next
================================
Discovered Bug and Customer Problem
1. Open and run worksheet zzz. CTO: My BOM view
2. Select combination date 10/06/2008
SLC:M1:Seattle Manufacturing
- Computer Service:1006:Chattanooga (OPS):Vision Operations -> CN974444
3. Enter a value for "Forecast Dependent Demand Override"
4. Press worksheet update data then rerun worksheet
5. Rerun worksheet and see that the old value is shown instead of the value entered.
6. Close and reopen worksheet - "Forecast Dependent Demand Override" still shows old value.
7. After some 5 to 10 minutes, rerun or reopen the worksheet
8. See that new value finally shows up in "Forecast Dependent Demand Override"
Developer Explanation fixed in 7.3
----------------------------------
After we update some CTO GL series in a Worksheet (e.g. 'Forecast Dependent Demand Override'),
the old data appears while rerunning the Worksheet, and the new data will appear only if we reopen
the Worksheet and then rerun.
Internal Machinations: Technical Analysis and Resolution of the problem
-----------------------------------------------------------------------
1. In the Incremental Loading Mechanism - while preparing the SQL for the
T_POPU table, we do not add the LUD column of the GL Data table (which exists
in the GL Matrix table) into the Expression in the Select part
(the expression which indicates whether a combination was changed).
2. Therefore, the solution will be: add the GL Data table LUD column (e.g.
t_ep_cto_data_lud) into the POPU SQL in the Expression which will check
if a combination was changed.
3. In the Combination Cache Mechanism - after a Worksheet rerun we clear
the Sub Combination and Combination Maps and then the next time we access
these Maps we don't find the changed Combination, and therefore, do not remove them
from the Map of Loaded Combinations, so the Client won't get the changed data.
4. The solution will be: Do not clear the Sub Comb & Combination Maps, but
only synch them with the latest requested Sub Combs & Combinations.
next
================================
I installed the latest build on my local environment, but I still have a problem with
the worksheet.
When I update a base model in the "zzz. CTO: My BOM view", the "parent demand" series
should change when the update hook runs.
Here is what I did:
1) Open the worksheet "zzz. CTO: My BOM view"
2) On the page level browser, go click on APAC, then APAC Site 1, and then D530
3) For the row item 10/27/2008 | D530 | D530, put the number 22 in the "Base
Override" Series. Press update.
4) Wait 10 seconds and hit refresh. Notice that the "parent demand" series
has not changed. Several of the values in this series should have the number 22
as a value.
5) In SQL Developer or other SQL tool, run the following query:
select * from t_ep_cto_data where cto_parent_dem_final=22
You should see several records that have the value 22 for the parent
demand series column. Note that the last_update_date column on t_ep_cto_data
has been modified.
6) Close the worksheet, then reopen it. Only now will the value 22
appear in the "parent demand" series.
Customer Solution
-----------------
- The problem is in the Update Hook code. Upon examination of the CALC_WORKSHEET
procedure, I noticed that you update the 'LAST_UPDATE_DATE' column in the T_EP_CTO_DATA table.
- While running the worksheet, we do not check the 'LAST_UPDATE_DATE' column in the
T_EP_CTO_DATA table but the 'LUD' columns in the T_EP_CTO_MATRIX table.
- This is according to a new feature named 'GL_MATRIX Support' which has been added to version 7.3.
- In this case the column 'T_EP_CTO_DATA_LUD' in the T_EP_CTO_MATRIX table should
also be updated.
- In each GL Matrix table, we have the LUD columns of all the Levels which belong to the current
GL (General Level):
* Plus the LAST_UPDATE_DATE column of this table
* Plus the LUD column of the GL Data table (T_EP_CTO_DATA).
After adjusting the update, the customer worksheet displayed correct results.
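In other words, a corrected update hook should stamp both LUD columns in the same transaction. A minimal sketch of the shape of the fix (the WHERE conditions, key column, and bind names are assumptions for illustration, not the actual customer code):

----
-- key column and binds below are hypothetical
UPDATE t_ep_cto_data
SET cto_parent_dem_final = :new_value,
    last_update_date = SYSDATE
WHERE t_ep_cto_ep_id = :comb_id;

UPDATE t_ep_cto_matrix
SET t_ep_cto_data_lud = SYSDATE
WHERE t_ep_cto_ep_id = :comb_id;
COMMIT;
----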
next
================================================
Object MSD_DEM_QUERIES
* I have little information regarding the table that stores the dynamic SQL used for Demantra.
If there is interest, we can speak with DEV and develop a white paper geared to diagnostics.
next
================================================
For Data Collection Know Your Instances
I have seen numerous errors that stem from not knowing which instance is being
collected.
select * from apps.msc_apps_instances;
Check the t_src_loc_tmpl table (or another staging table) to determine which instances and
organizations are collected.
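A quick way to list the instance codes that actually arrived in staging (the instance column name is an assumption; confirm it against a describe of the table):

----
-- column name is an assumption for illustration
SELECT DISTINCT sr_instance_code
FROM t_src_loc_tmpl;
----

Compare the result against msc_apps_instances to spot collections run against the wrong instance.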
next
================================
Publishing Forecast Results
There have been a few issues reported concerning publishing the Demantra forecast to the source. Here
are a few pointers:
The workflow has completed without error, yet there are no rows in the table
MSD_DP_SCN_ENTRIES_DENORM.
There are a number of possibilities.
1) Does BIEO_Local_Fcst have data?
2) Validate the org id in MSC_APPS_INSTANCES.
SQL> select id from transfer_query
where query_name = 'Local Forecast';
-- returns an id, e.g. 353
SQL> select * from transfer_query_levels
where id = <id from 1st script>;
3) Verify the workflow status.
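Alongside those three checks, a direct count against the target table before and after the publish workflow quickly shows whether any rows made it across:

----
SELECT COUNT(*) FROM msd_dp_scn_entries_denorm;
----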
next
================================
SNO Integration Note
We support the following demand levels in SNO integration:
1. Item-Org
2. Item - Customer-customer site
3. Item - Zone
When we publish the constrained forecast from SNO, we publish all these 3 types.
ASCP handles these 3 types of constrained forecasts and plans successfully.
next
================================
R12 and Demantra 7.2.02 Install Procedure
-----------------------------------------
Currently we have EBS-Demantra Integration Installation Overview and Diagram, note 434991.1.
- This note addresses 11.5.10.2 and Demantra 7.1.1.
- We are assembling a note that will cover install procedures for R12 and Demantra 7.2.0.2.
- Metalink Delivery date 30-May-2009
next
================================================
Wrap Up
-------
Please reference the following notes available on Metalink or email jeffery.goulette@oracle.com
if the article is being moderated. These are available in a .zip file.
** Note.789477.1 Manipulating Orgs For The Demantra Data Load
** Note.809410.1 EBS to Demantra Data Load / Import Diagnostics Investigation
** Note.815124.1 Data Loading into Demantra EP_Load
** Note.817973.1 Demantra Custom Hooks and Additional Pointers
** Note.806295.1 Demantra Solutions TIPs for April 2009
** Note.563732.1 Demantra 7.1 7.2 Pre Post and Install Cross Checks
** Note.754237.1 DEMANTRA Q/A Setup, Implementation Ideas, Behavior 7.0.2, 7.1, 7.2 R12 EBS
** Note.802395.1 Demantra Patching - Version Control at EBS and Demantra Compatibility
** Note.462321.1 Demantra Environment Manipulation and Performance Tuning Suggestions
22. file:///D|/Oracle/MFG/FY09Q1Q2_Projects/Support_Communities/Advisor_Webcasts/2009_0506_Jeff_Goulette/WC-EP-LOAD-Debug.txt
WARNING!! WARNING!! WARNING!! WARNING!! MAY CAUSE DROWSINESS!
The material presented below is of a technical nature. Little attention has been
given to functional navigation or functional demonstrations. The vast majority of
data load issues are investigated / solved using SQL. To that end, this presentation
focuses almost exclusively on problem investigation and resolution.
This material has been assembled from 100s of bugs in which DEV gave us techniques
to drill into source/destination data.
==================================
Data Loading into Demantra EP_Load
==================================
Data Loading Flow
-----------------
There are four data flows that move data in and out of Demantra:
1. Loading data from source into collection staging tables
2. Moving data from the collection staging tables into the Demantra data store
3. Loading data from the Demantra data store into the Demantra engine and
downloading from the engine back to the Demantra data store.
4. Pushing the data back to the source instance in the form of a forecast.
We will cover #2 and #3 in this presentation.
EBS to Demantra Data Load / Import Diagnostics Investigation
------------------------------------------------------------
There are several methods to load data into the Demantra staging tables. Based on the number
of problems reported, the tools seem to operate as advertised. The data loaded into the
Demantra staging tables can be an issue.
We will not focus on the tools, which appear to be intuitive, but instead discuss methods of investigation
to identify, explain, and fix the load result.
Summary of Integration Tasks
----------------------------
This section lists integration tasks in the appropriate sequence:
1. Initial Setup, See Implementation Guide.
2. Collect Data and Download to Three Staging Tables. See Implementation Guide.
3. Transfer data to Oracle Demantra schema. See Implementation Guide.
* EP_LOAD
* Import Integration Profiles
4. Generate forecasts
5. Export Output from Oracle Demantra. See Implementation Guide.
* Export Integration Profiles
6. Upload Forecast. See Implementation Guide.
next
================================
EP_LOAD download procedures are used for booking history streams and level members.
- For example, the EP_LOAD procedures are used to load booking history by
organization-site-sales channel and item-demand class into staging tables.
- If the Download Now check box was not selected during the collections
process, run EP_LOAD and Import Integration Profiles to move data from the
staging tables into the Oracle Demantra Demand Management schema.
next
================================
Launch EP LOAD.
- Historical information and level data are imported into Oracle Demantra via
the EP_LOAD procedure.
- All other series data are imported into Oracle Demantra via Import Integration Profiles.
An assumption of the EP_LOAD procedure is that the series are populated into the Oracle
Demantra staging tables before the load process begins.
- To ensure this occurs, the collection programs for all eight historical series have been
merged so that these streams are always collected simultaneously.
next
================================
Launch EP_LOAD Continued.
For members and history series, which are downloaded via the EP_LOAD mechanism,
the mode of update is:
- If there is a new member, it is added in Oracle Demantra.
- If a member has been deleted in the E-Business Suite source, it stays in Oracle
Demantra along with all series data for combinations that include the member.
* The administrative user must manually delete the member in Oracle Demantra.
- Series data in the staging area overwrites the series data in Oracle Demantra, for the
combinations that are represented in the staging area.
- Series data in Oracle Demantra for combinations that are not in the staging area are
left unchanged.
- The staging area is erased after the download.
- All series data in Oracle Demantra, for all combinations, are set to null before the
download actions take place.
* There are a total of three EP_LOAD workflows, one EP_LOAD workflow for each of the following
series:
- Item members
- Location members
- Shipment and Booking History
Caution: There is a risk that if multiple lines of business run collections
very close in time to each other, a single EP_LOAD run may pull in
data from multiple lines of business.
- See Line Of Business Configuration and Execution in the User Guide.
* We are not covering the EP_LOAD_MAIN procedure, which loads data into the data
model from staging tables or from files, according to your choice as set up in the Data Model Wizard.
next
================================
Future Date Loading
Historically, future dates could not be loaded via the EP_LOAD process. All future information
(that is, forecast data) is loaded using integration profiles or other loading mechanisms. This
mechanism controls the dates marked as end of history for the Forecasting Engine and the
Collaborator Workbench.
With the addition of the MaxSalesGen parameter, you can now use the EP_LOAD
process to load future dates into Demantra. This parameter determines how data after
the end of history is populated.
Note: When populating the MaxSalesGen parameter, it is important to
enter all dates in the MM-DD-YYYY 00:00:00 format.
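To review the current value before loading, SYS_PARAMS can be queried directly. This is a
sketch; it assumes the parameter is stored under the name shown, and the column layout may
vary by Demantra version:
select * from sys_params where pname = 'MaxSalesGen';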
next
================================
Three main staging tables:
T_SRC_ITEM_TMPL
This staging table is used by the ep_load_main procedure. Each record corresponds to a
unique item entity based on all lowest item levels in the model.
T_SRC_LOC_TMPL
This staging table is used by the ep_load_main procedure. Each record corresponds to a
unique location entity based on all lowest location levels in the model.
T_SRC_SALES_TMPL
This staging table is used by the ep_load_main procedure. Each record corresponds to
sales data for a given item and location combination, based on all lowest item levels in
the model.
next
================================
Functional Steps to load Demantra
1. Load the data into the staging tables:
t_src_sales_tmpl,
t_src_item_tmpl,
t_src_loc_tmpl
2. Login to the Workflow Manager.
3. Start the workflow 'EBS Full Download'
4. Check the status by clicking on the 'Instance' number of the workflow.
5. Check the status on the Collaborator Workbench in the My Tasks pane.
next
================================
This pre-seeded WF should include all required steps to enable a successful
download of items/location/sales data into Demantra tables. Currently the WF
runs the following processes:
EP_LOAD_ITEMS
EP_LOAD_LOCATION
EP_LOAD_SALES
However, in order to complete the loading process you may need to run the MDP_ADD
procedure to make sure new combinations are added into MDP_MATRIX.
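A hedged sketch of that call; the 'mdp_load_assist' argument mirrors the invocation made
inside DATA_LOAD.EP_LOAD_SALES (shown later in this material), so verify the signature in
your own schema before running it:
-- Add any new item/location combinations into MDP_MATRIX
exec mdp_add('mdp_load_assist');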
next
================================
Technical Investigation
-----------------------
On 7.2.0.1 in Production:
We find that after running ep_load_sales, the procedure completed without any error
message, but there are no records in the sales_data table.
EXPECTED BEHAVIOR
-----------------------------------
We expect data to be loaded
Steps To Reproduce:
1. load data into t_src_sales_tmpl
2. run ep_load_sales
3. check integ_status
4. check t_src_sales_tmpl_err
5. check sales_data
We have just one record in t_src_sales_tmpl; after running ep_load_sales, the procedure
completed without any error message and there is no error row in t_src_sales_tmpl_err.
Upon checking sales_data and integ_status, there are no rows in these tables.
Debugging Steps:
1. Check if you have any error in db_exception_log table.
select * from db_exception_log order by err_date desc;
2. Verify that the data is actually 'new'. The existing row could have simply been updated.
3. Please review all columns:
select * from t_src_sales_tmpl;
select * from sales_data;
Upon closer examination, we found invalid data in t_src_sales_tmpl table.
After we updated the column ACTUAL_QTY from NULL to '0', ep_load_sales ran successfully.
* You would need to change the value coming from the source to successfully load without
manipulating staging table data.
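A pre-load sanity check along these lines would have exposed the bad row before running
ep_load_sales; the column names come from the staging table described above:
-- Staging rows whose quantity is missing and will not load cleanly
select count(*) from t_src_sales_tmpl where actual_qty is null;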
next
================================
After launching Collection, Collect Shipment and Booking history with the
parameter 'EP_LOAD'=N and running ep_load after collection, the
actual_quantity loaded into sales_data is different from the actual_qty in
t_src_sales_tmpl.
This is what we did:
--------------------
1. Launch Collection : Collect Shipment and Booking history with the
parameter 'EP_LOAD' = N
2. Launch a custom program to populate the following columns: T_ep_i_att_9,
T_ep_p3, T_ep_p2 in the staging table t_ep_item_tmpl from the column
ebs_product_category_desc in the table t_ep_item_tmpl.
3. Launched the EP_Load as a standalone request.
4. Restarted Application Server
5. Verified the data for the items 130-0290-910 in the bucket starting 05/26/2008=18333
6. Verified from the following Query:
select sum(actual_qty) from t_src_sales_tmpl where
dm_item_code='130-0290-910'
and sales_date between '26-may-2008' and '01-jun-2008';
- Result: 121 <----
7. Verified from Sales Data table from the following query:
select sum(actual_quantity) from Sales_data where item_id in
(select item_id from msdem.mdp_matrix where t_ep_item_ep_id in
(select t_ep_item_ep_id from msdem.t_ep_item where item = '130-0290-910'))
and sales_date between '26-may-2008' and '01-jun-2008';
- Result:18333 <----
Debugging
---------
Please check the SYS_PARAMS 'accumulatedOrUpdate' setting, which is either
'update' or 'accumulate':
EP_LOAD_SALES should aggregate the sales in the T_SRC_SALES table by combination and
sales date.
- accumulatedOrUpdate = Update
If the sale already exists then the actual_quantity value should be replaced.
- accumulatedOrUpdate = accumulate
Should add the new value to any existing values (provided the series is set
as proportional)
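To inspect the current mode, SYS_PARAMS can be queried directly. This is a sketch; it
assumes the parameter name is stored as shown, and the column layout may vary by version:
select * from sys_params where lower(pname) = 'accumulatedorupdate';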
Debugging Step 2
----------------
This is one method of comparing source data to mdp_matrix. In this case we
are verifying the sums for a given date range / item.
- Check the source:
select sum(shipped_quantity), sum(cancelled_quantity), sum(ordered_quantity),
ship_from_org_id
from apps.oe_order_lines_all where actual_shipment_date
between '26-may-2008' and '01-jun-2008' and ordered_item= '130-0290-910'
group by ship_from_org_id;
- Check what was collected:
select sum(actual_qty), dm_org_code from t_src_sales_tmpl where
dm_item_code='130-0290-910'
and sales_date between '26-may-2008' and '01-jun-2008' group by dm_org_code;
- Check what was loaded into SALES_DATA (via MDP_MATRIX):
select sum(actual_quantity) from Sales_data where item_id in
(select item_id from msdem.mdp_matrix where t_ep_item_ep_id in
(select t_ep_item_ep_id from msdem.t_ep_item where item = '130-0290-910'))
and sales_date between '26-may-2008' and '01-jun-2008';
- Here we are checking a different item with a ship_from_org_id:
select sum(shipped_quantity), sum(cancelled_quantity), sum(ordered_quantity),
ship_from_org_id
from apps.oe_order_lines_all where actual_shipment_date
between '26-may-2008' and '01-jun-2008' and ordered_item= '130-0290-910' and
ship_from_org_id = 722 group by ship_from_org_id;
- Following the trail to collected data:
select sum(actual_qty), dm_org_code from t_src_sales_tmpl where
dm_item_code='130-0290-910'
and sales_date = '30-May-2008' group by dm_org_code;
- And finally in SALES_DATA (via mdp_matrix):
select sum(actual_quantity) from Sales_data where item_id in
(select item_id from msdem.mdp_matrix where t_ep_item_ep_id in
(select t_ep_item_ep_id from msdem.t_ep_item where item = '130-0290-910'))
and sales_date ='30-May-2008';
If there is a difference, the sql can be adjusted to weed out extra rows or
rows that were not collected through to mdp_matrix.
next
================================
I created an Integration Interface 'Variable Cost' and a workflow with a transfer step
to load data for that integration interface. When the staging table for the integration
interface is populated with data and the workflow is run, the errored-out records move
to the err table and the correct records vanish from the staging table, but nothing is
populated in the Demantra internal tables.
If all the records in the staging table are correct, then the data is
populated into the base tables of Demantra.
Investigation
-------------
The staging table (integ_inf_var_cost) seems to contain dirty data. By this
I mean that in addition to missing members (which are handled by the
system and can be noted in the error table), some of the combinations do not
have a valid item-location combination in the MDP_MATRIX table. Hence, the
update process failed to find any valid rows to update.
On v7.1.1, the integration process validations included the following
validations:
1. Integration profile structure
2. Staging table data dates range
3. Existence and validity of members and drop-down series values
For any error found, a row in the error table indicates and holds
information about the problem.
On v7.1.1 validation DID NOT include population validation (i.e., that the
specified combination indeed exists in MDP_MATRIX). This feature was added
in the v7.2.0 release.
Nonetheless, please find the following 2 SQL statements; either one will reveal those notorious
combinations in the staging table:
SELECT DISTINCT i.*
FROM integ_inf_var_cost i,
t_ep_e1_it_br_cat_3 e,
t_ep_region r,
t_ep_finproductgroup f
WHERE i.level1 = e.e1_it_br_cat_3
AND i.level2 = r.region
AND i.level3 = f.finproductgroup
AND NOT EXISTS (
SELECT *
FROM mdp_matrix m
WHERE m.t_ep_e1_it_br_cat_3_ep_id = e.t_ep_e1_it_br_cat_3_ep_id
AND m.t_ep_region_ep_id = r.t_ep_region_ep_id
AND m.t_ep_finproductgroup_ep_id = f.t_ep_finproductgroup_ep_id);
select distinct t6.e1_it_br_cat_3,t5.region,t4.finproductgroup
from mdp_matrix t1,
t_ep_finproductgroup t4,
t_ep_region t5,
t_ep_e1_it_br_cat_3 t6,
integ_inf_var_cost vc
where t1.t_ep_region_ep_id = t5.t_ep_region_ep_id
and t1.t_ep_finproductgroup_ep_id = t4.t_ep_finproductgroup_ep_id
and t1.t_ep_e1_it_br_cat_3_ep_id = t6.t_ep_e1_it_br_cat_3_ep_id
and t1.variable_cost is null
and t6.e1_it_br_cat_3 = vc.level1
and t5.region = vc.level2
and t4.finproductgroup = vc.level3;
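Once those combinations are identified, the offending rows can be purged from the staging
table before re-running the transfer. This is a sketch that reuses the join conditions of
the first query above; test it on a copy of the staging table first:
DELETE FROM integ_inf_var_cost i
WHERE NOT EXISTS (
SELECT 1
FROM t_ep_e1_it_br_cat_3 e,
t_ep_region r,
t_ep_finproductgroup f,
mdp_matrix m
WHERE i.level1 = e.e1_it_br_cat_3
AND i.level2 = r.region
AND i.level3 = f.finproductgroup
AND m.t_ep_e1_it_br_cat_3_ep_id = e.t_ep_e1_it_br_cat_3_ep_id
AND m.t_ep_region_ep_id = r.t_ep_region_ep_id
AND m.t_ep_finproductgroup_ep_id = f.t_ep_finproductgroup_ep_id);
COMMIT;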
next
================================
This is a sound approach to test the complete collection and presentation of
data to MDP_MATRIX and Demantra.
Actions that were taken:
1. Reset the tables, by executing the following:
-- Reset
TRUNCATE TABLE integ_inf_var_cost_err;
TRUNCATE TABLE integ_inf_var_cost;
UPDATE mdp_matrix
SET variable_cost = NULL;
COMMIT;
2. Fill staging table, by executing the following:
-- Insert data into the staging table
INSERT INTO integ_inf_var_cost
(sdate, level1, level2, level3, variable_cost)
SELECT TO_DATE (TO_CHAR (NEXT_DAY (SYSDATE, 'monday') - 7,
'DD-MM-YYYY'),
'DD-MM-YYYY'
) sdate,
shipto_commercial_org level1,
region level2,
finance_product_line level3,
variable_cost
FROM cstm_var_cost_eur;
COMMIT;
3. Run the "Variable Cost Download" workflow.
4. Checked the results:
a) Verify that the integ_inf_var_cost table was empty, using the
following:
-- Checked that all data from the staging table was handled
SELECT COUNT (*)
FROM integ_inf_var_cost;
which had returned 0, as expected.
b) Verified that the mdp_matrix table had new rows using
the following:
-- Checked that the new data was introduced
SELECT COUNT (*)
FROM mdp_matrix
WHERE variable_cost IS NOT NULL;
which had returned 247.
next
================================
Working with Oracle Support or DEV
----------------------------------
Exporting your Demantra data model for review:
An export file which contains your data model can be exported by running
the Business Modeler and selecting the menu "Data Model --> Open Data
Model". Then select the active model (it is yellow) and click on the
"Export" button. Then specify the location to save the *.dmw file.
next
================================
Data Loading Directly into Staging Tables
-----------------------------------------
1. Load a single record (for the purpose of this test case) into each of
t_src_item_tmpl, t_src_loc_tmpl and t_src_sales_tmpl
2. We then run a hybrid of the standard 'sales load' workflow, where
DL_RUN_PROC is called with various arguments one by one:
- ep_prepare_data,
- ep_load_items,
- ep_load_location,
- ep_load_sales
3. Look for errors in the %ERR tables and find that t_src_sales_tmpl_err has an
error and t_src_sales_tmpl is empty.
Determined Cause:
When loading SALES_DATA all of the levels must match existing levels in
the hierarchy, not just the item code. This is ensured by a large JOIN near
the beginning of the EP_LOAD_SALES procedure.
We have also seen this strange result as a product of bug 6520853, EP_LOAD_ITEMS DOESN'T
LOAD ITEM AND EP_CHECK_ITEMS DOESN'T GIVE ERROR. This was caused by case sensitivity:
the load failed silently when the case did not match.
This is fixed in 7.2.0.2
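To spot such mismatches before loading, the sales staging rows can be cross-checked
against the item staging rows. A sketch, assuming dm_item_code is the item code column
in both templates:
-- Sales staging rows whose item code has no exact, case-sensitive
-- match in the item staging table
SELECT s.dm_item_code
FROM t_src_sales_tmpl s
WHERE NOT EXISTS (
SELECT 1 FROM t_src_item_tmpl i
WHERE i.dm_item_code = s.dm_item_code);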
next
================================
Missing Date Values in Loading
------------------------------
After running the EP_Load procedure the worksheet does not show the data.
The TO_DATE & FROM_DATE columns are not populated.
Steps to Reproduce:
1. Run Build Model.
2. Load data to the T_SRC_ITEM_TMPL, T_SRC_LOC_TMPL & T_SRC_SALES_TMPL tables.
3. Run: ?Demand PlannerDeskTopload_data.bat
4. View that there are no errors in:
T_SRC_ITEM_TMPL_ERR,
T_SRC_LOC_TMPL_ERR
T_SRC_SALES_TMPL_ERR tables
5. Check that the data was inserted to the data base.
6. Create a worksheet that will show the data.
7. See if the data is seen in the worksheet.
- My results are that the levels, members and data exist in the data base.
- The levels and members are seen in the worksheet wizard and can be selected.
- The levels, members and data are not shown in the worksheet.
Neat Debugging Example
----------------------
Create temp table with min and max dates per mdp_matrix combination:
create table temp_matrix_dates as SELECT s.item_id, s.location_id,
MIN(s.sales_date) AS from_date, MAX(s.sales_date) AS until_date
FROM sales_data s, mdp_matrix t
WHERE s.item_id = t.item_id
AND s.location_id = t.location_id
GROUP BY s.item_id, s.location_id
Update mdp_matrix from_date and until_date based on the table created above:
UPDATE mdp_matrix m
set (from_date,until_date) = (
select from_date,until_date
from temp_matrix_dates WHERE item_id = m.item_id
AND location_id = m.location_id)
WHERE item_id = m.item_id
AND location_id = m.location_id;
This will update the date columns in mdp_matrix for all combinations not just
the ones you loaded. Use the following SQL to identify which combinations in
MDP_MATRIX are missing the date values.
select item_id, location_id, from_date, until_date from mdp_matrix where
from_date is null and until_date is null;
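After the update, a quick re-check plus cleanup of the helper table created above:
-- Combinations still missing dates (expect only combinations with
-- no sales_data rows at all)
select count(*) from mdp_matrix where from_date is null and until_date is null;
-- Drop the helper table once verified
drop table temp_matrix_dates;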
next
================================
The following investigation can be used to verify that data from the source actually
loads into the Demantra tables. The item HHM-1 is missing from the Demantra sales_data table
but there were no errors. This customer is using .ctl files to load the dmtra_template
schema. You can change the default schema name; see note 551455.1.
Step by Step Walk Through to Find the Data
Following are the steps followed and the details of the issue.
Load your data into flat files matching the .ctl and perform the following:
- exec DATA_LOAD.EP_PREPARE_DATA; ( Runs "replace_apostrophe" which removes
single quotes from the T_SRC tables)
- exec DATA_LOAD.EP_LOAD_ITEMS; (Runs; no errors found in _ERR table)
- exec DATA_LOAD.EP_LOAD_LOCATION; (Runs; no errors found in _ERR table)
- exec DATA_LOAD.EP_LOAD_SALES; (Runs; no errors found in _ERR table)
1. Performed booking and shipping history collections with the Auto download option.
2. Found that the data was not collected into Demantra. This was confirmed after
verifying the data is not present in the worksheets.
3. Performed the EBS Full Download workflow. This collected the location data into the staging table.
select *
from dmtra_template.t_src_sales_tmpl
where lower(dm_item_code) = 'hse-1';
4. Then performed booking and shipping history collections with auto download
5. SELECT DISTINCT
dm_item_code,
dm_org_code,
dm_site_code,
t_ep_lr1,
t_ep_ls1,
t_ep_p1,
ebs_demand_class_code,
ebs_sales_channel_code,
aggre_sd -- filter column
FROM ep_T_SRC_SALES_TMPL_ld
WHERE ep_T_SRC_SALES_TMPL_ld.actual_qty IS NOT NULL
and dm_item_code ='HHM-1'
ORDER BY
dm_item_code, dm_org_code, dm_site_code, t_ep_lr1, t_ep_ls1, t_ep_p1,
ebs_demand_class_code, ebs_sales_channel_code, aggre_sd;
The 2 dates shown are 05-FEB-07 and 29-JAN-07
6. SELECT ITEMS.item_id,LOCATION.location_id
FROM ITEMS, LOCATION,
t_ep_item,
t_ep_organization,
t_ep_site,
t_ep_lr1,
t_ep_ls1,
t_ep_p1,
t_ep_ebs_demand_class,
t_ep_ebs_sales_ch
WHERE 1 = 1
AND items.t_ep_item_ep_id = t_ep_item.t_ep_item_ep_id
AND t_ep_item.item = 'HHM-1'
AND location.t_ep_organization_ep_id = t_ep_organization.t_ep_organization_ep_id
AND t_ep_organization.organization = 'TST:M1'
AND location.t_ep_site_ep_id = t_ep_site.t_ep_site_ep_id
AND t_ep_site.site = 'ABC Corporation Americas:2637:7233:Vision Operations'
AND location.t_ep_lr1_ep_id = t_ep_lr1.t_ep_lr1_ep_id
AND t_ep_lr1.lr1 = 'N/A'
AND location.t_ep_ls1_ep_id = t_ep_ls1.t_ep_ls1_ep_id
AND t_ep_ls1.ls1 = 'N/A'
AND items.t_ep_p1_ep_id = t_ep_p1.t_ep_p1_ep_id
AND t_ep_p1.p1 = 'N/A'
AND items.t_ep_ebs_demand_class_ep_id = t_ep_ebs_demand_class.t_ep_ebs_demand_class_ep_id
AND t_ep_ebs_demand_class.ebs_demand_class = '0'
AND location.t_ep_ebs_sales_ch_ep_id = t_ep_ebs_sales_ch.t_ep_ebs_sales_ch_ep_id
AND t_ep_ebs_sales_ch.ebs_sales_ch = 'Direct;'
Gives the ITEM_ID 565 and LOCATION_ID 607
7. What you now see is only the 2 rows inserted / updated into SALES_DATA
because the 'aggre_sd' date is used as a filter from the actual_qty NOT NULL select.
SELECT COUNT(*)
from sales_data
where item_id = 565
and location_id = 607
and TRUNC(sales_date) = TO_DATE('05-FEB-07','DD-MON-RR')
Gives 1 row which is right.
8. These rows are also added to MDP_LOAD_ASSIST.
* When the actual_qty is NULL in the distinct list, it is inserted into a table,
MDP_LOAD_ASSIST, and is used to populate MDP_MATRIX.
* The actual_qty NOT NULL list is aggregated by the aggre_sd filter date and the rows
are inserted / updated in SALES_DATA.
* The quantity column values are averaged across all the distinct rows, including
the NOT NULL actual_qty rows, by the 'aggre_sd' filter date.
The EP_LOAD_SALES continues to load the rows from MDP_LOAD_ASSIST via:
mdp_add('mdp_load_assist') which is in DATA_LOAD.EP_LOAD_SALES
From T_SRC_SALES_TMPL:
Total rows                                              12,476
Distinct IDs with actual_qty NULL                          807  <-- These are the problem rows
Distinct IDs incl. aggre_sd and actual_qty NOT NULL     11,669
Why are there 807 problem rows? Why are they not in the err table?
9. Trying to verify success for the following item: HSE-1
select i.t_ep_item_ep_id,
i.item_id,t.item,
i.is_fictive
from dmtra_template.items i, t_ep_item t
where lower(item) ='hse-1'
and i.t_ep_item_ep_id = t.t_ep_item_ep_id
order by 1 ;
The above will deliver the item_id to be used to verify that the data is in
the sales_data table:
select * from dmtra_template.sales_data where item_id in (654,671,727,823);
Returned zero rows.
It is unclear why ep_load_sales neither loads the sales nor inserts the errored rows
into the error tables.
10. To continue the investigation
SELECT *
FROM dmtra_template.t_ep_item
WHERE item = 'HSE-1';
Output:t_ep_item_ep_id = 188
11. What are the actual_qty values and sales_date in the source table?
select dm_item_code, sales_date, actual_qty
from dmtra_template.t_src_sales_tmpl
where lower(dm_item_code) in ('hse-1')
and actual_qty IS NOT NULL;
DM_ITEM_CODE SALES_DATE ACTUAL_QTY
------------ ---------- ----------
HSE-1 12/25/2006 70
How many rows are there where the actual_qty values are NULL in the source table?
select dm_item_code, sales_date, actual_qty
from dmtra_template.t_src_sales_tmpl
where lower(dm_item_code) in ('hse-1')
and actual_qty IS NULL;
DM_ITEM_CODE SALES_DATE ACTUAL_QTY
------------ ---------- ----------
HSE-1 1/1/2007
HSE-1 1/1/2007
12. SELECT *
FROM dmtra_template.items
WHERE t_ep_item_ep_id = 188;
Output :
item_id demand_class
------- ----------------------------------
654 Unassicaoted
671 Australia Sales Region
727 East US Sales Region
823 West US Sales Region
13. What are the corresponding item_id values from the ITEMS table?
select i.t_ep_item_ep_id, i.item_id,t.item, i.is_fictive
from items i, t_ep_item t
where lower(item) ='hse-1'
and i.t_ep_item_ep_id = t.t_ep_item_ep_id
order by 1;
T_EP_ITEM_EP_ID ITEM_ID ITEM IS_FICTIVE
--------------- ------- ------- ----------
188 654 HSE-1 2
14. select * from dmtra_template.sales_data
where item_id = 654
and sales_date IN ('12/25/2006','1/1/2007','1/29/2007');
ITEM_ID LOCATION_ID SALES_DATE ACTUAL_QUANTITY
------- ----------- ---------- ---------------
671 746 1/29/2007 70
671 752 1/1/2007
671 753 1/1/2007
As you can see, where the quantity is not null there is data with the correct
corresponding values and dates in SALES_DATA.
Expected Behaviour
-------------------
We should be able to see the history data for these series as well, even
when the actual_quantity IS NULL:
Booking History - booked items - booked date,
Booking History - requested items - booked date,
Booking History - booked items - requested date,
Booking History - requested items - requested date,
Shipment History - shipped items - shipped date,
Shipment History - shipped items - requested date,
Shipment History - requested items - requested date
<< end of document >>
41. file:///D|/Oracle/MFG/FY09Q1Q2_Projects/Support_Communities/Advisor_Webcasts/2009_0506_Jeff_Goulette/WC-Staging-Tables.txt
Data Loading Flow
-----------------
There are four data flows that move data in and out of Demantra:
1. Loading data from source into collection staging tables
* These are the T_SRC_% and error tables. ep_load_main procedure.
2. Moving data from the collection staging tables into the Demantra data store
3. Loading data from the Demantra data store into the Demantra engine and
downloading from the engine back to the Demantra data store.
4. Pushing the data back to the source instance in the form of a forecast.
We will cover #1 and #2 in this presentation.
next
================================================
Summary of Integration Tasks
----------------------------
This section lists integration tasks in the appropriate sequence:
1. Initial Setup, See Implementation Guide.
2. Collect Data and Download to Three Staging Tables. See Implementation Guide.
3. Transfer data to Oracle Demantra schema. See Implementation Guide.
* EP_LOAD
* Import Integration Profiles
4. Generate forecasts
5. Export Output from Oracle Demantra. See Implementation Guide.
* Export Integration Profiles
6. Upload Forecast. See Implementation Guide.
next
================================================
EBS to Demantra Data Load / Import Diagnostics Investigation
------------------------------------------------------------
There are several methods to load data into the Demantra staging tables. Based on the number
of problems reported, the tools seem to operate as advertised; it is the data loaded into the
Demantra staging tables that can be an issue.
We will not focus on the tools, which appear to be intuitive, but instead discuss methods of
investigation to identify, explain and fix the load result.
Load Methods
------------
- Integration Interface Wizard
- Data Model Wizard
- Demantra Import Tool
- SQL*Loader
next
================================================
The following table summarizes the core Demantra import and export tools:
Data To import, use... To export, use...
------------------------------- ------------------------------- -----------------
Lowest level item and Data Model Wizard* N/A
location data; sales data
Series data at any Integration Interface Wizard* Integration Interface Wizard
aggregation level
Sales promotions Integration Interface Wizard* Integration Interface Wizard
Members and attributes of N/A Integration Interface Wizard
other levels
Other data, for example, Demantra import tool* N/A
lookup tables
*These options are in the Business Modeler.
next
================================================
The core Demantra tools allow you to do the following:
• Import lowest-level item, location, and sales data
• Import or export series data at any aggregation level, with optional filtering
• Import promotions and promotional series
• Export members of any aggregation level
• Import supplementary data into supporting tables as needed
next
================================================
Object Manipulation
-------------------
The Demantra Business Modeler contains many useful tools to perform maintenance and
development.
- Create Table
- Alter Table
- Recompile
- View the Procedure Error Log
- Cleanup Demantra temporary tables
- Oracle Sessions Monitor
- Please see the user guide for the complete list
next
================================================
Oracle Demantra Workflows
- EBS Full Download: Downloads items, locations, and shipment and booking history.
- EBS Return History Download: Downloads return history
- EBS Price List Download: Downloads Price Lists
Workflows can perform all of the following actions:
- Run integration interfaces.
- Run stored database procedures.
- Run external batch scripts and Java classes.
- Pause the workflow until a specific condition is met, possibly from a set of allowed
conditions. For example, a workflow can wait for new data in a file or in a table.
- Send tasks to users or groups; these tasks appear in the My Tasks module for those
users, within Collaborator Workbench. A typical task is a request to examine a
worksheet, make a decision, and possibly edit data. A task can also include a link to
a Web page for more information.
next
================================================
WF Technical
------------
Monitor the WF for errors:
SELECT ACTIVITY_NAME, ACTIVITY_STATUS_CODE,
ACTIVITY_RESULT_CODE, TO_CHAR(ACTIVITY_BEGIN_DATE, 'DD/MM/YYYY -
HH24:MI:SS'), TO_CHAR(ACTIVITY_END_DATE, 'DD/MM/YYYY - HH24:MI:SS'),
ERROR_NAME,
ERROR_MESSAGE
FROM WF_ITEM_ACTIVITY_STATUSES_V
WHERE ACTIVITY_STATUS_CODE = 'ERROR';
Check the collaborator log
--------------------------
I have errors in the collaborator.log file resulting from the "Download Plan Scenario Data"
workflow.
- They were caused by the "Notify" email step in the workflow. These errors were being
generated because the Demantra environment was not configured with the proper details for
sending email notifications to users.
- These errors might prevent the workflow from completing the rest of the steps in the flow.
- The User Guide explains proper setup quite well.
Restart / Debugging Steps
-------------------------
1) Shut down the application server.
2) Clear tasks from Collaborator Workbench for the user.
3) Delete the new DAILY_PROCESS workflow.
Clear the wf_process_log table. Delete all rows in the wf_scheduler and wf_process_log tables:
- delete from wf_scheduler;
- delete from wf_process_log;
4) Restart the application server/web server.
5) Schedule the WF "daily process".
Additional WF Checks
--------------------
1. Is more than a single instance of the Demantra application
running (i.e., with different context roots, on different machines, etc.)?
2. What is the name of the problematic workflow? Is there another workflow schema by that name?
3. How were schemas removed? Using Demantra application or directly from
the database (using the DELETE statement)? Use the application whenever possible.
Poor WF Performance?
--------------------
- Ask yourself, is this necessary data? While historic data is important, not all historic
data needs to be included.
- Are most of the quantities in these records valid quantities?
- Have you considered changing the plan settings to filter out quantities that
are below a threshold?
- Can older scenario revision data be deleted if it's no longer required?
- What are the output levels (including time) of the scenarios?
Size Check for Performance
--------------------------
Use the output as a high-water mark for future growth/performance analysis.
select SCENARIO_NAME, SCENARIO_OUTPUT_PERIOD_TYPE,DP_DIMENSION,
LEVEL_NAME
from MSD_DP_SCN_OUTPUT_LEVELS_V
where demand_plan_id = <plan id>
order by scenario_name, dp_dimension, level_name
select count(*), scenario_id
from msd.MSD_DP_SCN_ENTRIES_DENORM
where demand_plan_id = 6031
group by scenario_id;
select count(*), scenario_id, revision
from msd.MSD_DP_SCENARIO_ENTRIES
where demand_plan_id = 6031
group by scenario_id, revision;
- Is there data that can be purged?
- Are all of the scenarios and revisions lean or do they contain stale data?
next
================================================
Integration Profile Performance Problem?
----------------------------------------
For performance reasons, updates are not executed immediately; they are accumulated
and, once the configured limit is exceeded, written as a chunk or block.
* defined in the ImportBlockSize property in appserver.properties file
Modify the 'ImportBlockSize' parameter in appserver.properties to a number that reflects
better performance at your site after testing different settings.
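A minimal sketch of the relevant entry in appserver.properties (the value 500 is purely illustrative, not a recommended setting; tune it at your own site):

```properties
# Number of accumulated updates written per chunk during import
ImportBlockSize=500
```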
next
================================================
The Basic Input Data Review
---------------------------
When fully configured, Demantra imports the following data, at a minimum, from your
enterprise systems:
- Item data, which describes each product that you sell.
- Location data, which describes each location to which you sell or ship your products.
- Sales history, which describes each sale made at each location; specifically, the
items sold and the quantity of those items in each sale.
- For Promotion Effectiveness:
Historical information about promotional activities.
Demantra can import and use other data such as returned amounts, inventory data,
orders, and settlement data.
- From my understanding of the Shipping/Booking History Collection for Demantra, the
sales data is loaded first and then the item and loc tables are derived from it.
next
================================================
Functional Considerations
-------------------------
There are many functional setup issues relating directly to successful data load and execution.
For example, the MSC:Organization containing generic BOM for forecast explosion profile
can be set to one of the organizations at the site level. This will limit the entry of rows into
MSD_DP_SCENARIO_ENTRIES and MSD_DP_SCN_ENTRIES_DENORM.
Changing the profile to the Global Item Master Organization will make the records available for loading.
next
================================================
Imported Data
-------------
You can collect internal sales orders. They appear in the customer dimension; the customer
appears as the organization code.
Seeded collections from EBS into Oracle Demantra Demand Management include:
- Shipment history, booking history, and returns history
- Manufacturing and fiscal calendars
- Price lists, currencies, and currency conversion factors
- Dimensions, levels, hierarchies, and level values for demand analysis.
Item levels are:
- Product category: Item > Category
- Product family: Item > Product family
- Demand class
Location levels are:
- Zone: Site > Trading partner > Zone
- Customer class: Site > Account > Customer > Customer class
- Business group: Organization > Operating unit > Business group
- Sales channel
next
================================================
Integration Interface Wizard
----------------------------
The Integration Interface Wizard initializes the names of the staging tables, but you
can rename the tables within the wizard if needed. The default names start with biio_.
Make a note of the names of your tables, as displayed within the Integration Interface Wizard.
(not a complete list)
- biio_supply_plans
- biio_supply_plans_pop
- biio_other_plan_data
- biio_PURGE_PLAN
- biio_scenario_resources
Troubleshooting
---------------
Look to the logs
- <Integration Interface Table name>_ERR
- If the staging table is named Biio_My_Demand, the error table is Biio_My_Demand_Err
- The _Err table contains the rows that were not successfully added to the staging tables.
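Using the hypothetical staging table name above, the rejected rows can be inspected directly; the exact error-description columns vary by release, so describe the table first and select everything:

```sql
-- Inspect the rejected rows; DESC the table first to see its columns
select * from Biio_My_Demand_Err;
```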
- Check the Integration Log
If your URL to Demantra is http://DEMANTRAMACHINE:8080/demantra/portal/loginpage.jsp
then the logs would be accessed by pointing your browser to:
http://DEMANTRAMACHINE:8080/demantra/admin/systemLogs.jsp
NOTE: it is advised that you review the contents of the following for errored rows:
- UPDATE_BATCH_TRAIL
- UPDATE_BATCH_VALUES
- UPDATE_BATCH_TRAIL_ERR
- UPDATE_BATCH_VALUES_ERR
Please run the following script in the demantra schema to verify data
---------------------------------------------------------------------
begin
apps.msd_dem_sop.load_plan_data (:supply_plan_id);
end;
The parameter to the procedure is the column value of supply_plan_id from the
supply_plan table. The correct ID is for the supply plan for which data is being
loaded from ASCP to Demantra. Use SQL to obtain it.
For example, if for plan 'ASCP-DEM' the supply plan id is 134, the script should be run as
begin
apps.msd_dem_sop.load_plan_data (134);
end;
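To obtain the supply_plan_id for the example above, a lookup along these lines can be used; the column name plan_name is an assumption, so verify the actual column names of the supply_plan table in your schema first:

```sql
-- Hypothetical lookup; confirm column names with DESC supply_plan
select supply_plan_id
from supply_plan
where plan_name = 'ASCP-DEM';
```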
If you must log an SR, please enable trace for the session and provide the tkprof
output of the trace file after the execution.
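Enabling trace for the session can be sketched as follows; the tkprof invocation runs at the OS prompt (shown here as a comment), and the file names are placeholders:

```sql
-- In the same session, before running the script:
alter session set sql_trace = true;
-- ... execute apps.msd_dem_sop.load_plan_data here ...
alter session set sql_trace = false;
-- Then, at the OS prompt, format the raw trace file:
-- tkprof <tracefile>.trc <output>.txt
```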
Also provide the row count of the following tables in Demantra schema after
running the script:
1. biio_other_plan_data
2. biio_resource_capacity
3. biio_supply_plans
4. biio_supply_plans_pop
5. biio_scenario_resources
6. biio_scenario_resource_pop
7. biio_resources
8. t_src_item_tmpl
9. t_src_loc_tmpl
10. BIIO_PURGE_PLAN
next
================================================
Dumping your source data for review:
1. A dump file (and log file) which contain the tables
T_SRC_ITEM_TMPL,
T_SRC_LOC_TMPL,
T_SRC_SALES_TMPL,
TABLE_NAME,
WF_GROUPS_SCHEMAS,
WF_PROCESS_LOG,
WF_SCHEDULER,
WF_SCHEMAS,
WF_SCHEMA_GROUPS,
WF_SCHEMA_TEMPLATES.
The command to make such a dump file:
(You may need to adjust)
exp user@server file=resmed_oct.dmp log=resmed_oct.log tables=(
T_SRC_ITEM_TMPL, T_SRC_LOC_TMPL, T_SRC_SALES_TMPL, TABLE_NAME,
WF_GROUPS_SCHEMAS, WF_PROCESS_LOG, WF_SCHEDULER, WF_SCHEMAS,
WF_SCHEMA_GROUPS, WF_SCHEMA_TEMPLATES)
next
================================================
Parameter checks used for Source Data Load and EP Load. Be familiar
with the contents of the following tables:
select * from init_params_0 order by 1;
select * from sys_params order by 1;
select * from db_params order by 1;
select * from aps_params order by 1;
- The init_params_0 table contains the descriptions of the parameters.
- sys_params contains the parameters listed on the Database, System, and
Worksheet tabs.
- db_params contains operational entries such as check_and_drop_sleep_limit.
For example, if you should receive the following error:
ORA-20000: Cannot DROP AK because of ORA-54 resource busy and have timed
out after 255 seconds
ORA-6512: at "YOURCO_DEC.CHECK_AND_DROP", line 143
The sleep period starts at 1 second and increments exponentially until it
exceeds the sleep period limit of 128 seconds (the current default). This gives
a total attempt time of 255 seconds. This limit can be changed via the
parameter 'check_and_drop_sleep_limit' in DB_PARAMS.
Retry messages are written to the DB_EXCEPTION_LOG table.
Another Key Entry for EP_Load
There is a parameter in DB_PARAMS, ep_load_do_commits, the default is TRUE.
- TRUE causes commits to be issued in the SALES MERGE loop.
- FALSE causes only 1 commit to be issued just after the SALES MERGE loop.
- aps_params contains the password the Business Modeler should use to connect
to the database, as well as other important operational settings.
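Assuming db_params follows the same pname/pval column layout as sys_params (verify with a describe first, as the layout may differ in your release), the EP_LOAD-related entries can be reviewed like this:

```sql
-- Hypothetical column names; confirm with DESC db_params
select pname, pval
from db_params
where pname in ('check_and_drop_sleep_limit', 'ep_load_do_commits');
```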
next
================================================
Detailed Source Data Investigation
----------------------------------
Scenario: Shipment and booking history are not completely collected when
we establish a new schema. Though the collections for shipment and booking
history are successful, with no error or warning message, we could not see the
items in Demantra. If there is unclean data in the t_src_item_tmpl and
t_src_loc_tmpl tables, the EBS Full Download moves the unclean data from these
tables to the error table. This works fine, but the process is not moving
clean data to the Demantra base tables.
* You will need to find your item_id and organization_id, as well as other data required
to run the following SQL.
Steps to Produce:
1. Performed booking and shipping history collections with the Auto Download option.
2. Found that the data was not collected into Demantra. This was confirmed after
verifying the data in worksheets.
3. Verified that this issue is related to locations via simple SQL checks.
select * from dmtra_template.t_src_loc_tmpl_err;
Found that there were no errors in this table.
4. Performed the EBS Full Download workflow. This collected the location data into the
staging table, but not the sales data, since the location data had not been collected
last time; hence nothing was in the staging table.
5. Then performed booking and shipping history collections with auto download. Still
the sales data were not collected into Demantra. (not sure why the customer did this)
Now, we will dig through the tables to locate the problem
Step 1
------
The table INTEG_STATUS shows that when DMTRA_TEMPLATE runs the EP_LOAD
it succeeds, but when APPS runs it, it fails; this could be a permissions issue.
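A quick blanket look at INTEG_STATUS is enough here; its columns vary, so select everything and inspect which user ran each load and with what status:

```sql
-- Inspect load runs and their outcomes; DESC the table for exact columns
select * from dmtra_template.integ_status order by 1;
```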
Step 2
------
select * from dmtra_template.sales_data where item_id in (654,671,727,823);
Step 3
------
Have confirmed items HSE-1, HSE-2,HSE-3,HSE-4,HSE-5 are in staging tables:
select * from dmtra_template.t_src_sales_tmpl where lower(dm_item_code) in
('hse-1', 'hse-2','hse-3','hse-4','hse-5');
Step 4
------
Also confirmed items are loaded into the system:
select i.t_ep_item_ep_id,i.item_id,t.item,i.is_fictive
from dmtra_template.items i,dmtra_template.t_ep_item t where lower(item) in
('hse-1', 'hse-2','hse-3','hse-4','hse-5')
and i.t_ep_item_ep_id=t.t_ep_item_ep_id
order by 1;
Step 5
------
Launched complete refresh collections for Shipment and booking history with
Auto-download set to 'yes'.
There were no records pertaining to HSE-1 or HHM-1 (for example) in the
t_src_sales_tmpl_err table, yet there were many records for the same items
in the sales_data table. There is an inconsistency between the records in
t_src_sales_tmpl and the sales_data table.
Executed the following sqls:
For Item HHM-1:
SELECT *
FROM dmtra_template.sales_data
WHERE item_id IN(627,666);
SELECT * FROM dmtra_template.t_src_sales_tmpl where dm_item_code ='HHM-1'
order by sales_date desc;
For Item HSE-1:
SELECT *
FROM dmtra_template.sales_data
WHERE item_id IN(654, 671, 727, 823);
SELECT * FROM dmtra_template.t_src_sales_tmpl where dm_item_code ='HSE-1'
order by sales_date desc
After the above data is available you should be able to determine the missing data.
In this case the customer was missing location. Location is one of the major keys.
next
================================================
Demand Class
------------
The customer noted that the load was successful; however, there were sales orders
missing.
The collections code is bringing in demand classes from oe_order_lines_all.
But the (master table) lookup for demand classes is missing some of the demand classes.
For example, the demand class code '1-WLKLE' is available in order lines, but it
is not available in the lookup 'DEMAND_CLASS'. The following query returns zero
rows:
select * from apps.fnd_common_lookups where lookup_type= 'DEMAND_CLASS'
and lookup_code='1-WLKLE';
As a temporary workaround, you could modify the shipment history query to not bring
in demand class code, if the demand class is missing in the lookup 'DEMAND_CLASS'.
After that, run Shipment and Booking History collection followed by
ep_load. Verify the Actual_quantity for one particular item both in the
staging table t_src_sales_tmpl and internal table sales_data.
The issue is due to missing demand classes in the lookup. If the demand
classes available in the lookup 'DEMAND_CLASS' and in order lines are in sync
with each other, then the sales history quantities should be loaded correctly
into sales_data.
You must check why the demand classes are missing from the lookup
DEMAND_CLASS but present in order lines. These need to be in sync.
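A MINUS query can list the demand classes present in order lines but missing from the lookup; run it on the source instance:

```sql
-- Demand classes used on order lines that have no DEMAND_CLASS lookup entry
select distinct demand_class_code
from apps.oe_order_lines_all
where demand_class_code is not null
minus
select lookup_code
from apps.fnd_common_lookups
where lookup_type = 'DEMAND_CLASS';
```

Each code this query returns needs a matching lookup entry (or removal from the order lines) before the history quantities will load cleanly.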
next
================================================
Working with Oracle Support
---------------------------
- When there is an issue successfully collecting from EBS to Staging Tables.
1. Verify if the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host URL' are set
to the correct values. Also make sure demantra URL is up and accessible.
2. Set the profile 'MSD_DEM: Debug Mode' to 'Yes'. Launch collection with
'Launch Download' set to 'Yes'.
3. Upload log as well as output files for the 'Launch EP LOAD' stage.
4. Also provide values for the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host URL'.
5. Supply the logs found under the Demand Planner\Scheduler\bin folder.
next
================================================
Version Control
- Downgrading is not supported. Our schema upgrade mechanism only knows how to
rev forward, not backward.
next
================================================
EBS Data Load
Please note that all successfully loaded rows are loaded into the MSD_DP_SCN_ENTRIES_DENORM
table.
The MSD_DP_SCENARIO_ENTRIES table contains only the records which errored out. Errored
records of a previous load will be deleted in the next load.
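The split between clean and errored rows can be confirmed with simple counts; the plan id 6031 is carried over from the earlier size-check example, so substitute your own:

```sql
-- Successfully loaded rows
select count(*) from msd.MSD_DP_SCN_ENTRIES_DENORM where demand_plan_id = 6031;
-- Errored rows from the most recent load
select count(*) from msd.MSD_DP_SCENARIO_ENTRIES where demand_plan_id = 6031;
```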
To produce acceptable debug logs:
1. Verify if the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host URL' are set
to the correct values. Also make sure demantra URL is up and accessible.
2. Set the profile 'MSD_DEM: Debug Mode' to 'Yes'. Launch collection with
'Launch Download' set to 'Yes'.
3. Upload log as well as output files for the 'Launch EP LOAD' stage.
4. Also provide values for the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host
URL'.
5. Upload the logs.
next
================================================
Error Check
-----------
The following is a list of tables to review should an error occur.
We would suggest that these tables be empty before the load and
that you produce an automated script to count any new rows.
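The suggested automated count can be sketched as a single query; extend the UNION ALL list to whatever tables matter at your site:

```sql
-- One row per table, so before/after runs are easy to diff
select 't_src_item_tmpl' tbl, count(*) cnt from dmtra_template.t_src_item_tmpl
union all
select 't_src_item_tmpl_err', count(*) from dmtra_template.t_src_item_tmpl_err
union all
select 't_src_loc_tmpl', count(*) from dmtra_template.t_src_loc_tmpl
union all
select 't_src_loc_tmpl_err', count(*) from dmtra_template.t_src_loc_tmpl_err
union all
select 't_src_sales_tmpl', count(*) from dmtra_template.t_src_sales_tmpl
union all
select 't_src_sales_tmpl_err', count(*) from dmtra_template.t_src_sales_tmpl_err;
```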
Provide the Following to Oracle Support:
- DB_EXCEPTION_LOG table
select * from db_exception_log order by err_date;
- select * from version_details_history order by upgrade_date desc;
- select object_name, object_type from user_objects where upper(status) = 'INVALID';
- select count(*) from dmtra_template.t_src_item_tmpl;
- select count(*) from dmtra_template.t_src_item_tmpl_err;
- select count(*) from dmtra_template.t_src_loc_tmpl;
- select count(*) from dmtra_template.t_src_loc_tmpl_err;
- select count(*) from dmtra_template.t_src_sales_tmpl;
- select count(*) from dmtra_template.t_src_sales_tmpl_err;
- What was the 'Launch Download' Parameter set to?
- Please provide the Log file for the respective 'request set' collection program.
- select * from t_ep_item;
- select * from t_ep_organization;
- select * from t_ep_site;
After running ASCP -> Demand Management System Administrator -> Collect Shipment
and Booking History - Flat File, and the concurrent program completes successfully,
check the following Demantra tables:
- select * from t_src_item_tmpl
- select * from t_src_item_tmpl_err
- select * from t_src_loc_tmpl
- select * from t_src_loc_tmpl_err
- select * from t_src_sales_tmpl
- select * from t_src_sales_tmpl_err
You may also consider verifying the contents of the following:
- biio_scenario_resources / _err
- integ_inf_var_cost_err
- biio_SUPPLY_PLANS_POP / _err
- biio_supply_plans / _err
- biio_other_plan_data / _err
- BIIO_PURGE_PLAN
next
================================================
Price List Collection Verification
----------------------------------
1. Select * from MSD_DEM_PRL_FROM_SOURCE_V;
(note: This query should be run on SOURCE instance only)
2. Select * from msd_dem_entities_inuse where ebs_entity = 'PRL';
Setup issue:
1. You must select the "Planning Method" & "Forecast control" for the Master Org.
2. The customer had built a new data model, which deleted all the seeded
display units in Demantra. Because of this, price lists could not be
loaded into Demantra. You would have to recreate new display units like the
original seeded units.
Then run the pricelist collections.
Additional SQL to dig into the Price List
-----------------------------------------
1. select display_units ,display_units_id ,data_table ,data_field from
DEM.DISPLAY_UNITS
where display_units_id in
(select distinct display_units_id from DEM.DISPLAY_UNITS
minus
select distinct display_units_id
from DEM.DCM_PRODUCTS_UNITS )
and display_units like '%EBSPRICELIST%' and rownum < 2;
2. select distinct display_units_id
from DEM.DISPLAY_UNITS
minus
select distinct display_units_id
from DEM.DCM_PRODUCTS_UNITS;
3. select * from DEM.DISPLAY_UNITS;
4. select * from DEM.DCM_PRODUCTS_UNITS;
5. select distinct price_list_name from MSD_DEM_PRL_FROM_SOURCE_V;
Price List at the Source Instance
--------------------------------
Execute the following SQL to investigate at the source:
1. select count(*) from MSD_DEM_PRL_FROM_SOURCE_V ;
2. select distinct price_list_name from MSD_DEM_PRL_FROM_SOURCE_V ;
3. select text from all_views where view_name like
'MSD_DEM_PRL_FROM_SOURCE_V' ;
next
================================================
Collection Success/Error Investigation
--------------------------------------
At the Destination, Check the sales date.
select min(sales_date), max(sales_date) from dmtra_template.sales_data
where actual_quantity > 0
next
================================================
Source Views of Interest
------------------------
The view 'MSD_DEM_DEPENDENT_DEMAND_V' fetches date values from ASCP
tables, which include a time component as well.
- The procedure MSD_DEM_SOP.PUSH_TIME_DATA inserts dates corresponding to
time buckets into MSD_DEM_DATES table before supply plan data is
downloaded.
Other important objects:
- MSD_DEM_DEPENDENT_DEMAND_V
- MSD_DEM_DATES
- MSD_DEM_ENTITIES_INUSE
- MSD_DEM_ENTITY_QUERIES
- MSD_DEM_GROUP_TABLES
- MSD_DEM_ITEMS_GTT
- MSD_DEM_CONSTRAINED_FORECAST_V
- MSD_DEM_LOCATIONS_GTT
- MSD_DEM_NEW_ITEMS
- MSD_DEM_PRICE_LISTS
Problem: Legacy Collections for Shipment and Booking History errored out at Collect
Level Type stage with the following error in the log:
* Workaround: Run ERP collections for future dates, which will wipe out
the sales staging tables. Then run the legacy collections with proper
values for demand class and sales channel (or your desired data group).
next
==================================
EBS to Demantra collection with download = YES does not insert data into the Demantra
base tables. Data moves into the Demantra staging tables but not into the base tables.
There is no error in the log file.
Steps followed:
1. Create new item and sales order
2. Ran the standard collections and EP_LOAD (with download = Yes). The
concurrent programs completed successfully with no errors.
3. The item and sales data have been inserted into the Demantra staging tables.
4. Ran the workflow "EBS Full Download" manually, then the data was moved into
the Demantra base tables.
5. However, we could not see the data in worksheet. We expect EP_LOAD to put
the data into Demantra base tables.
Solution Explanation
--------------------
AppServerURL
1. The first check to make is the proper setting of the parameter 'AppServerURL' in Demantra.
This can be verified from the Business Modeler or from the back end. Use the following
query to get the value of this parameter from the demantra schema:
- You have to start the application server before running the Business Modeler wizard.
- In Most cases if the application server is up the problem is with the application server URL.
select pval from sys_params where pname like 'AppServerURL';
This should be set to 'http://dskhyd707878.yourcompany.com:80/demantra', or
wherever your demantra server is running.
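If the value is wrong, it can be corrected directly in the same table the query above reads; the URL below is a placeholder, and backing up sys_params first is prudent:

```sql
-- Point AppServerURL at the running application server (placeholder URL)
update sys_params
set pval = 'http://yourserver.yourcompany.com:8080/demantra'
where pname = 'AppServerURL';
commit;
```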
2. Check the profiles 'MSD_DEM_SCHEMA' and 'MSD_DEM_HOST_URL'. Are they set properly?
* For more information regarding MSD_DEM_HOST_URL, see note 431301.1
-------------------------------------------------------------
Profile Name - Value
-------------------------------------------------------------
Profile MSD_DEM_CATEGORY_SET_NAME -
Profile MSD_DEM_CONVERSION_TYPE -
Profile MSD_DEM_CURRENCY_CODE -
Profile MSD_DEM_MASTER_ORG - 204
Profile MSD_DEM_CUSTOMER_ATTRIBUTE - NONE
Profile MSD_DEM_TWO_LEVEL_PLANNING - 2
Profile MSD_DEM_SCHEMA - MSDEM
-------------------------------------------------------------
* Please make sure that the profiles MSD_DEM_CONVERSION_TYPE and MSD_DEM_MASTER_ORG
are set on the source instance, and that the profiles MSD_DEM_CURRENCY_CODE and
MSD_DEM_SCHEMA are set on the Planning Server.
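The profile values can also be confirmed from SQL*Plus via the standard EBS API FND_PROFILE (a sketch; run it as the APPS user on the relevant instance, and note it returns the value resolved at the current profile level):

```sql
-- Read the Demantra-related profile values at the current level.
SELECT fnd_profile.value('MSD_DEM_SCHEMA')        AS dem_schema,
       fnd_profile.value('MSD_DEM_MASTER_ORG')    AS master_org,
       fnd_profile.value('MSD_DEM_CURRENCY_CODE') AS currency_code
  FROM dual;
```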
next
==================================
After running the "Shipment and booking history" request from the EBS application, only
partial data gets loaded into the Demantra TMPL staging tables.
Here is my log:
+---------------------------------------------------------------------------+
Demand Planning: Version : 12.0.0
Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
MSDDEMARD module: Launch EP LOAD
+---------------------------------------------------------------------------+
Current system time is 22-AUG-2008 18:19:53
+---------------------------------------------------------------------------+
**Starts**22-AUG-2008 18:19:53
**Ends**22-AUG-2008 18:24:38
ORA-29273: HTTP request failed
ORA-06512: at "SYS.UTL_HTTP", line 1577
ORA-12535: TNS:operation timed out
+---------------------------------------------------------------------------+
Start of log messages from FND_FILE
+---------------------------------------------------------------------------+
Exception: msd_dem_collect_history_data.run_load - 22-AUG-2008 18:24:38
ORA-29273: HTTP request failed
ORA-06512: at "SYS.UTL_HTTP", line 1577
ORA-12535: TNS:operation timed out
+---------------------------------------------------------------------------+
End of log messages from FND_FILE
+---------------------------------------------------------------------------+
+---------------------------------------------------------------------------+
Executing request completion options...
Finished executing request completion options.
+---------------------------------------------------------------------------+
Exceptions posted by this request:
Concurrent Request for "Launch EP LOAD" has completed with error.
+---------------------------------------------------------------------------+
Concurrent request completed
Current system time is 22-AUG-2008 18:24:40
+---------------------------------------------------------------------------+
QUESTION
========
After running the "Shipment and booking history" request from the EBS application, only
partial data gets loaded into the Demantra TMPL staging tables.
ORA-29273: HTTP request failed
ORA-06512: at "SYS.UTL_HTTP", line 1577
ORA-12535: TNS:operation timed out
1. The issue might be that the parameter 'AppServerURL' is not set properly.
Please run the following query from the Demantra schema:
select pname, pval from sys_params where pname like 'AppServerURL';
2. Please check the AppServerURL in the Business Modeler.
3. Either reconfigure CONNECT_TIMEOUT to 0, which means wait indefinitely, or
reconfigure it to some higher value. Or, if waiting that long is
unacceptable, turn on SQL*Net tracing for further information.
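The CONNECT_TIMEOUT referred to above is a Net Services setting. Assuming it is the inbound connect timeout (a common culprit behind ORA-12535 on slow networks), the relevant configuration lines look like this (values are illustrative; adjust for your environment):

```
# sqlnet.ora on the database server; 0 = wait indefinitely
SQLNET.INBOUND_CONNECT_TIMEOUT = 0

# listener.ora, for a listener named LISTENER
INBOUND_CONNECT_TIMEOUT_LISTENER = 0
```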
next
================================================
Source Data Manipulation: Order Management booked_date.
Data was missing from the load because booked_date was null.
Here is what we did to discover and fix the problem.
1. Delete t_src_loc_tmpl
2. Execute request set "Standard Collection".
3. Execute request set "Shipment and Booking History".
4. Review the process log from EBS
5. Review the collaboration.log
6. Login to Workflow in Demantra environment
7. Check that EP_LOAD exists in the schema
The following tables do not contain any rows:
T_SRC_SALES_TMPL
T_SRC_ITEMS_TMPL
T_SRC_LOC_TMPL
The error tables do not contain any rows either:
T_SRC_SALES_TMPL_ERR
T_SRC_ITEMS_TMPL_ERR
T_SRC_LOC_TMPL_ERR
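A quick row-count check across the staging and error tables can be scripted in one statement (a sketch; run it in the Demantra schema):

```sql
-- Row counts for the staging tables and one of the error tables.
SELECT 'T_SRC_SALES_TMPL' AS tbl, COUNT(*) AS cnt FROM t_src_sales_tmpl
UNION ALL
SELECT 'T_SRC_ITEMS_TMPL', COUNT(*) FROM t_src_items_tmpl
UNION ALL
SELECT 'T_SRC_LOC_TMPL', COUNT(*) FROM t_src_loc_tmpl
UNION ALL
SELECT 'T_SRC_SALES_TMPL_ERR', COUNT(*) FROM t_src_sales_tmpl_err;
```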
Check the following:
1. The source for all the Booking History series - oe_order_headers_all and
oe_order_lines_all - has data as expected.
2. The source for all the Shipment History series - oe_order_headers_all and
oe_order_lines_all - has the data expected.
3. The table oe_order_headers_all has header_id populated.
4. The table oe_order_lines_all has lines, and ordered_item is populated.
5. The table oe_order_headers_all does not have booked_date populated:
SELECT booked_date
FROM oe_order_headers_all;
To implement the solution, please execute the following steps:
1. Make a backup copy of the table oe_order_headers_all.
2. Populate the column booked_date with the correct values in the table
oe_order_headers_all.
3. Execute request set "Standard Collection".
4. Execute request set "Shipment and Booking History".
5. Check the T_SRC_* tables in Demantra to confirm the data was collected.
6. Migrate the solution as appropriate to other environments.
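To find the rows behind step 2 before fixing them, a narrower query than the plain SELECT above is useful (a sketch; booked_flag is the standard Order Management booked indicator, but verify the filter matches your data):

```sql
-- Orders flagged as booked but missing booked_date; these are the rows
-- the history collections will skip.
SELECT header_id, order_number
  FROM oe_order_headers_all
 WHERE booked_flag = 'Y'
   AND booked_date IS NULL;
```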
Also:
Do you have a shipped date populated for new combinations?
Series data must be populated into the Demantra staging tables before the load
process begins. To ensure this occurs, the collection programs for all eight
historical series have been merged:
• Booking History – booked items – booked date
• Booking History – requested items – booked date
• Booking History – booked items – requested date
next
================================================
Data not loading because of worksheet date settings?
Regarding the issue of data downloaded into Demantra not showing up in a worksheet: this is
because of the date range specified in the worksheet settings.
CONFIGURE LOADING TEXT FILES DOES NOT LOAD DATA FOR MORE THAN 2000
COLUMN WIDTH.
During a data load from a text file, if the total width of the columns is more than 2000,
the load fails with the error "ORA-12899: value too large for column
"DEMANTRA"."DM_WIZ_IMPORT_FILE_DEF"."SRC_CTL_SYNTAX" (actual: 3037,
maximum: 2000)".
* This is a limitation of the functionality. The change is covered in enhancement request 6879562.
next
================================================
SQL*Loader Technical Health Check
1. Make sure that the SQLLDR.EXE utility exists under the Oracle client bin
directory and that the system path points there. Verify by opening a CMD
window and executing SQLLDR.EXE; if it is not found, add the directory to the
system path and restart.
2. Otherwise, please provide the full engine log from that run, and also provide
all the *.log/*.bad/*.txt files that should be under the engine's bin
directory.
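Step 1 can be scripted. A minimal sketch for a Unix-style shell (on Windows, run SQLLDR.EXE or `where sqlldr` in a CMD window instead; the client binary is named sqlldr):

```shell
# Report whether SQL*Loader is reachable through the current PATH.
if command -v sqlldr >/dev/null 2>&1; then
  echo "sqlldr found at: $(command -v sqlldr)"
else
  echo "sqlldr NOT on PATH - add your Oracle client bin directory to PATH"
fi
```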
< END OF DOC >