© 2009 Oracle Corporation – Proprietary and Confidential
Demantra Solutions
Wednesday, May 6, 2009
Time MDT
Teleconference Access:
1-866-627-3315
+1-706-679-4880
Passcode: 93272045
Future Demantra Solutions Advisor Webcasts!!
Don't miss out.. Register today!!
Wednesday
May 6, 2009
Troubleshooting Techniques for
Demantra Data Load Issues
791239.1
Wednesday
June 3, 2009
The Key to Optimizing Demantra
Worksheet Performance
797933.1
Wednesday
July 1, 2009
TBA TBA
For a listing of webcasts in other product areas, see Note
398884.1.
For a recording of today’s or a previous Advisor Webcast,
see Note 740297.1.
Agenda
• Presentation and Demo – approximately 45 minutes
• Q&A Session – approximately 15 minutes
• Please hold all questions to the end of the session.
• To ask a question, move your cursor to the top of the screen and select the
‘bubble’ icon next to the moderator’s name.
• A dialog box will open. Enter your question and select “Send.”
• During the Q&A session your question will be read and an answer will
follow.
ATTENTION – AUDIO INFORMATION
If you encounter any audio issues, please call
InterCall (Audio Conferencing) and mute
your phone.
1-888-259-4812
+1-706-679-4880
Conference ID: 93272045
We encourage attendees with sufficient Internet bandwidth who are
listening through VoiceStreaming to continue to use this audio source.
Thank you.
Safe Harbor Statement
The following is intended to outline our general
product direction. It is intended for information
purposes only, and may not be incorporated into
any contract. It is not a commitment to deliver any
material, code, or functionality, and should not be
relied upon in making purchasing decisions. The
development, release, and timing of any features
or functionality described for Oracle’s products
remains at the sole discretion of Oracle.
Demantra Solutions Data
Loading Troubleshooting
Jeff Goulette
Agenda
The following presentation will be delivered in
three parts:
1) EBS to Demantra Staging
• WC-Staging-Tables.txt
2) Demantra Data Store to Demantra
• WC-EP-Load-Debug.txt
3) Miscellaneous, Including Publish Forecast,
Custom Hooks, etc.
• WC-Hook.txt
Duration: approximately 45 minutes, followed by Q&A.
Please note that all three files are available in the Demantra Solutions Metalink forum.
Theory and Future
• Typically, functional issues are addressed once, never to recur. This
presentation is meant to be a guide, a point of entry for discovering data
inconsistencies.
• This presentation has gaps, as the number of possible data problems is
potentially significant. Version 2 of this webcast will include the points made
today as well as comments from the Demantra Solutions forum.
• These articles are available at the Demantra Solutions forum.
• WC-Staging-Tables.txt
• WC-EP_LOAD_Debug.txt
• WC-Hook.txt
• Specific business scenarios are always welcome.
• For example, ‘Can Oracle supply a getting started document that addresses
11.5.10.2 MRP and Demantra?’
[Diagram: data collection components, showing refresh snapshot logs, snapshots, MRP_AP views,
and MSD_DEM objects; the source modules INV, BOM, WIP, PO, OM, Forecast, MDS, OPM, Shipping,
and Booking; the MSC staging tables, MSC ODS raw data, and MSC PDS planning data; and the
MSDEM schema.]
Questions
• To ask a question, move your cursor to the top of the
screen and select the ‘bubble’ icon next to the moderator’s
name.
• A dialog box will open. Enter your question and select
“Send.”
Visit My Oracle Communities
Collaborate with a large network of your industry peers, support professionals, and Oracle
experts to exchange information, ask questions & get answers. Find out how your peers
are using Oracle technologies and services to better meet their support and business
needs.
• Exchange Knowledge
• Resolve Issues
• Gain Expertise
Visit the Demantra Solutions
Support Community now!!
1. Log into My Oracle Support.
2. Select the Community link.
3. Select the Enter Here button.
4. Select the Demantra
Solutions link under the
E-Business Suite section of the
My Communities Menu on the
left side of the window.
Feedback
Please let us know how we are doing by providing the following:
1. Based on the webcast description, how did your learning experience
compare to what you expected when you began the webcast?
2. Would you recommend the recording for this webcast to your
colleagues?
3. What topics specifically would you like to see covered in the Demantra
Advisor Webcasts?
4. Will you attend future Demantra Advisor Webcasts?
Email your feedback to jamie.binkley@oracle.com or
coremfg-news_us@oracle.com.
THANK YOU
WC-Hook.txt
WARNING!! WARNING!! WARNING!! WARNING!! MAY CAUSE DROWSINESS!
The material presented below is of a technical nature. Little attention has been
given to functional navigation or functional demonstrations. The vast majority of
data load issues are investigated and solved using SQL. To that end, this presentation
focuses almost exclusively on problem investigation and resolution.
This material has been assembled from hundreds of bugs in which DEV gave us techniques
to drill into source/destination data.
Demantra Custom Hooks and Assorted Debugging Techniques
Data Loading Flow
-----------------
There are four data flows that move data in and out of Demantra:
1. Loading data from the source into the collection staging tables
* These are the T_SRC_% and error tables; see the ep_load_main procedure.
2. Moving data from the collection staging tables into the Demantra data store
3. Loading data from the Demantra data store into the Demantra engine and
downloading from Demantra back to the Demantra data store.
4. Pushing the data back to the source instance in the form of a forecast.
We will cover #3 and #4 in this presentation.
next
================================
Creating a new series through a data model upgrade from the Business Modeler, bringing in
the data through custom hooks and ep_load.
Step 1 - Add new column in the interface table t_src_sales_tmpl.
Step 2 – Add new series through the data model wizard using the newly
created column in the table t_src_sales_tmpl.
Step 3 – Upgrade the data model.
Step 4 – Put the custom query in the hook provided in package msd_dem_custom_hook.
Step 5 – Run Shipment and booking history collection with auto download set to ‘No’.
Step 6 – Check the data in the interface table t_src_sales_tmpl.
Step 7 – Run Workflow EBS_Full_download.
Step 8 – See the data in the worksheet.
Please see:
ORACLE DEMANTRA, Customizing Hierarchies
* Please request a copy from Oracle Support
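As an illustration of the Step 4 custom query (a minimal sketch only; the column name
NEW_SERIES_COL and the expression are hypothetical placeholders, while the table and
package names come from the steps above):
-- Hypothetical body placed in the hook provided by the msd_dem_custom_hook package.
-- NEW_SERIES_COL stands for the column added to t_src_sales_tmpl in Step 1.
UPDATE t_src_sales_tmpl
SET new_series_col = actual_qty * 0.5   -- illustrative expression only
WHERE new_series_col IS NULL;
COMMIT;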
next
================================
Working with Oracle Support & Development: Debugging Custom Hooks
Please set the profile MSD_DEM: Debug Mode to Yes, then run the
shipment and booking history program again and provide the following:
1. Log files and Output Files of all the concurrent programs launched.
2. Trace file and the tkprof output of the trace file for the DB session.
Also upload the modified custom hooks package spec and body.
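A minimal sketch of producing the trace and tkprof output requested above (the 10046 event,
trace level, and file names are assumptions; adapt them to the session that actually runs
the collection):
-- In the database session that runs the shipment and booking history program
-- (level 12 captures bind values and waits):
ALTER SESSION SET EVENTS '10046 trace name context forever, level 12';
-- ... run the program ...
ALTER SESSION SET EVENTS '10046 trace name context off';
Then format the raw trace file from the operating system, for example:
tkprof <raw_trace_file>.trc hook_debug.prf sys=no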
next
================================
Preventing Data from Being Loaded
- You do not want the demand class item level to be imported to Demantra.
- You want to update the demand class column in the item staging table to N/A.
- To accomplish this task, you can modify the sales history custom hook package,
on the EBS side, so that it will be executed during EBS collections.
- The SQL statement inserted into the custom hook program is below.
----
UPDATE MSDEM.t_src_sales_tmpl
SET EBS_DEMAND_CLASS_SR_PK = '-777', EBS_DEMAND_CLASS_CODE = '0';
COMMIT;
----
- After running the Shipping/Booking History Collections, the SQL statement will
update the T_SRC_SALES_TMPL table. Any other custom hooks that are in place
to update T_SRC_ITEMS_TMPL and T_SRC_LOC_TMPL will also be executed.
next
================================
Discovered Bug and Customer Problem
1. Open and run worksheet zzz. CTO: My BOM view
2. Select combination date 10/06/2008
SLC:M1:Seattle Manufacturing
- Computer Service:1006:Chattanooga (OPS):Vision Operations -> CN974444
3. Enter a value for "Forecast Dependent Demand Override"
4. Press the worksheet Update Data button, then rerun the worksheet
5. Rerun the worksheet and see that the old value is shown instead of the value entered.
6. Close and reopen the worksheet - "Forecast Dependent Demand Override" still shows the old value.
7. After some 5 to 10 minutes, rerun or reopen the worksheet
8. See that the new value finally shows up in "Forecast Dependent Demand Override"
Developer Explanation (fixed in 7.3)
----------------------------------
After we update some CTO GL series in the Worksheet (e.g. 'Forecast Dependent Demand Override'),
the old data appears when rerunning the Worksheet, and the new data appears only if we reopen
the Worksheet and then rerun.
Internal Machinations: Technical Analysis and Resolution of the Problem
-----------------------------------------------------------------------
1. In the Incremental Loading Mechanism, while preparing the SQL for the
T_POPU table, we do not add the LUD column of the GL Data table (which exists
in the GL Matrix table) to the expression in the SELECT part
(the expression that indicates whether a combination was changed).
2. Therefore, the solution is to add the GL Data table LUD column (e.g.
t_ep_cto_data_lud) to the T_POPU SQL, in the expression that checks
whether a combination was changed.
3. In the Combination Cache Mechanism, after a Worksheet rerun we clear
the Sub Combination and Combination maps. The next time we access
these maps we do not find the changed combinations and therefore do not remove them
from the map of loaded combinations, so the client does not get the changed data.
4. The solution is: do not clear the Sub Combination and Combination maps, but
only synchronize them with the latest requested Sub Combinations and Combinations.
next
================================
I installed the latest build on my local environment, but I still have a problem with
the worksheet.
When I update a base model in the "zzz. CTO: My BOM view", the "parent demand" series
should change when the update hook runs.
Here is what I did:
1) Open the worksheet "zzz. CTO: My BOM view"
2) On the page level browser, go click on APAC, then APAC Site 1, and then D530
3) For the row item 10/27/2008 | D530 | D530, put the number 22 in the "Base
Override" Series. Press update.
4) Wait 10 seconds and hit refresh. Notice that the "parent demand" series
has not changed. Several of the values in this series should have the number 22
as a value.
5) In SQL Developer or other SQL tool, run the following query:
select * from t_ep_cto_data where cto_parent_dem_final=22
You should see several records that have the value 22 for the parent
demand series column. Note that the last_update_date column on t_ep_cto_data
has been modified.
6) Close the worksheet, then reopen it. Only now will the value 22
appear in the "parent demand" series.
Customer Solution
-----------------
- The problem here is in the Update Hook code. Upon examination of the CALC_WORKSHEET
procedure I noticed that you update the 'LAST_UPDATE_DATE' column in the T_EP_CTO_DATA table.
- While running the worksheet we do not check the 'LAST_UPDATE_DATE' column in the
T_EP_CTO_DATA table but the 'LUD' columns in the T_EP_CTO_MATRIX table.
- This is due to a new feature named 'GL_MATRIX Support' that was added in version 7.3.
- In this case the column 'T_EP_CTO_DATA_LUD' in the T_EP_CTO_MATRIX table must also be
updated.
- In each GL Matrix table, we have the LUD columns of all the Levels that belong to the current
GL (General Level):
* Plus the LAST_UPDATE_DATE column of this table
* Plus the LUD column of the GL Data table (T_EP_CTO_DATA).
After adjusting the update, the customer worksheet displayed correct results.
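A minimal sketch of the additional update described above (the WHERE clause is deliberately
left as a placeholder because it depends on the keys your update hook already uses to locate
the changed rows):
-- After setting LAST_UPDATE_DATE on T_EP_CTO_DATA, also touch the matrix LUD column
-- so the incremental worksheet load detects the change.
UPDATE t_ep_cto_matrix
SET t_ep_cto_data_lud = SYSDATE
WHERE <same combination filter used when updating t_ep_cto_data>;
COMMIT;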
next
================================================
Object MSD_DEM_QUERIES
* I have little information regarding the table that stores the dynamic SQL used for Demantra.
If there is interest, we can speak with DEV and develop a white paper geared to diagnostics.
next
================================================
For Data Collection Know Your Instances
I have seen numerous errors that stem from not knowing which instance is being
collected.
select * from apps.msc_apps_instances;
Check the t_src_loc_tmpl (or another staging table) to determine which instances and
organizations were collected.
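For example (a minimal sketch; this assumes t_src_loc_tmpl carries the dm_org_code column in
the same way the sales staging table does):
select distinct dm_org_code
from t_src_loc_tmpl;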
next
================================
Publishing Forecast Results
There have been a few issues reported concerning publishing the Demantra forecast to the source. Here
are a few pointers:
The workflow has completed without error, yet there are no rows in the table
MSD_DP_SCN_ENTRIES_DENORM.
There are a number of possibilities.
1) Does BIEO_Local_Fcst have data?
2) Validate the org_id in MSC_APPS_INSTANCES.
SQL> select id from transfer_query
where query_name = 'Local Forecast';
-- returned 353 in this example
SQL> select * from transfer_query_levels
where id = <id from 1st script>;
3) Verify the workflow status.
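As an additional check (a minimal sketch; BIEO_Local_Fcst is referenced above, but verify the
exact table name in your schema, and <your plan id> is a placeholder, following the size-check
queries later in this material):
select count(*) from bieo_local_fcst;
select count(*), scenario_id
from msd.msd_dp_scn_entries_denorm
where demand_plan_id = <your plan id>
group by scenario_id;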
next
================================
SNO Integration Note
We support the following demand levels in SNO integration:
1. Item - Org
2. Item - Customer - Customer site
3. Item - Zone
When we publish the constrained forecast from SNO, we publish all these 3 types.
ASCP handles these 3 types of constrained forecasts and plans successfully.
next
================================
Working with Oracle Support
Please set the profile MSD_DEM: Debug Mode to Yes, then run the
shipment and booking history program again and provide the following:
1. Log files and Output Files of all the concurrent programs launched.
2. Trace file and the tkprof output of the trace file for the DB session.
next
================================================
R12 and Demantra 7.2.02 Install Procedure
-----------------------------------------
Currently we have EBS-Demantra Integration Installation Overview and Diagram, note 434991.1.
- This note addresses 11.5.10.2 and Demantra 7.1.1.
- We are assembling a note that will cover install procedures for R12 and Demantra 7.2.0.2.
- Metalink delivery date: 30-May-2009
next
================================================
Wrap Up
-------
Please reference the following notes available on Metalink or email jeffery.goulette@oracle.com
if the article is being moderated. These are available in a .zip file.
** Note.789477.1 Manipulating Orgs For The Demantra Data Load
** Note.809410.1 EBS to Demantra Data Load / Import Diagnostics Investigation
** Note.815124.1 Data Loading into Demantra EP_Load
** Note.817973.1 Demantra Custom Hooks and Additional Pointers
** Note.806295.1 Demantra Solutions TIPs for April 2009
** Note.563732.1 Demantra 7.1 7.2 Pre Post and Install Cross Checks
** Note.754237.1 DEMANTRA Q/A Setup, Implementation Ideas, Behavior 7.0.2, 7.1, 7.2 R12 EBS
** Note.802395.1 Demantra Patching - Version Control at EBS and Demantra Compatibility
** Note.462321.1 Demantra Environment Manipulation and Performance Tuning Suggestions
WC-EP-LOAD-Debug.txt
WARNING!! WARNING!! WARNING!! WARNING!! MAY CAUSE DROWSINESS!
The material presented below is of a technical nature. Little attention has been
given to functional navigation or functional demonstrations. The vast majority of
data load issues are investigated and solved using SQL. To that end, this presentation
focuses almost exclusively on problem investigation and resolution.
This material has been assembled from hundreds of bugs in which DEV gave us techniques
to drill into source/destination data.
==================================
Data Loading into Demantra EP_Load
==================================
Data Loading Flow
-----------------
There are four data flows that move data in and out of Demantra:
1. Loading data from the source into the collection staging tables
2. Moving data from the collection staging tables into the Demantra data store
3. Loading data from the Demantra data store into the Demantra engine and
downloading from Demantra back to the Demantra data store.
4. Pushing the data back to the source instance in the form of a forecast.
We will cover #2 and #3 in this presentation.
EBS to Demantra Data Load / Import Diagnostics Investigation
------------------------------------------------------------
There are several methods to load data into the Demantra staging tables. Based on the number
of problems reported, the tools seem to operate as advertised. The data loaded into the
Demantra staging tables can be an issue.
We will not focus on the tools, which appear to be intuitive, but instead discuss methods of investigation
to identify, explain and fix the load result.
Summary of Integration Tasks
----------------------------
This section lists integration tasks in the appropriate sequence:
1. Initial Setup, See Implementation Guide.
2. Collect Data and Download to Three Staging Tables. See Implementation Guide.
3. Transfer data to Oracle Demantra schema. See Implementation Guide.
* EP_LOAD
* Import Integration Profiles
4. Generate forecasts
5. Export Output from Oracle Demantra. See Implementation Guide.
* Export Integration Profiles
6. Upload Forecast. See Implementation Guide.
next
================================
EP_LOAD download procedures are used for booking history streams and level members.
- For example, the EP_LOAD procedures are used to load booking history by
organization-site-sales channel and item-demand class into staging tables.
- If the Download Now check box was not selected during the collections
process, run EP_LOAD and Import Integration Profiles to move data from the
staging tables into the Oracle Demantra Demand Management schema.
next
================================
Launch EP_LOAD.
- Historical information and level data are imported into Oracle Demantra via
the EP_LOAD procedure.
- All other series data are imported into Oracle Demantra via Import Integration Profiles.
An assumption of the EP_LOAD procedure is that the series are populated into the Oracle
Demantra staging tables before the load process begins.
- To ensure this occurs, the collection programs for all eight historical series have been
merged so that these streams are always collected simultaneously.
next
================================
Launch EP_LOAD Continued.
For members and history series, which are downloaded via the EP_LOAD mechanism,
the mode of update is:
- If there is a new member, it is added in Oracle Demantra.
- If a member has been deleted in the E-Business Suite source, it stays in Oracle
Demantra along with all series data for combinations that include the member.
* The administrative user must manually delete the member in Oracle Demantra.
- Series data in the staging area overwrites the series data in Oracle Demantra, for the
combinations that are represented in the staging area.
- Series data in Oracle Demantra for combinations that are not in the staging area are
left unchanged.
- The staging area is erased after the download.
- All series data in Oracle Demantra, for all combinations, are set to null before the
download actions take place.
* There are a total of three EP_LOAD workflows, one for each of the following:
- Item members
- Location members
- Shipment and Booking History
Caution: There is a risk that if multiple lines of business run collections
very close in time to each other, a single EP_LOAD run may pull in
data from multiple lines of business.
- See Line Of Business Configuration and Execution in the User Guide.
* We are not covering the EP_LOAD_MAIN procedure, which loads data into the data
model from staging tables or from files, according to your choice as set up in the Data Model Wizard.
next
================================
Future Date Loading
Historical data is loaded into Demantra using the EP_LOAD process. All future information
(that is, forecast data) is loaded using integration profiles or other loading mechanisms.
This mechanism controls the dates marked as end of history for the Forecasting Engine
and the Collaborator Workbench.
With the addition of the MaxSalesGen parameter, you can now use the EP_LOAD
process to load future dates into Demantra. This parameter determines how data after
the end of history is populated.
Note: When populating the MaxSalesGen parameter, it is important to
enter all dates in the MM-DD-YYYY 00:00:00 format.
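For example (a hedged sketch; this assumes MaxSalesGen lives in the SYS_PARAMS table with the
usual pname/pval columns, and the date shown is only illustrative):
-- Inspect the current value
select pname, pval from sys_params where pname = 'MaxSalesGen';
-- Set it using the MM-DD-YYYY 00:00:00 format noted above
update sys_params
set pval = '12-31-2010 00:00:00'
where pname = 'MaxSalesGen';
commit;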
next
================================
Three main staging tables:
T_SRC_ITEM_TMPL
This staging table is used by the ep_load_main procedure. Each record corresponds to a
unique item entity based on all lowest item levels in the model.
T_SRC_LOC_TMPL
This staging table is used by the ep_load_main procedure. Each record corresponds to a
unique location entity based on all lowest location levels in the model.
T_SRC_SALES_TMPL
This staging table is used by the ep_load_main procedure. Each record corresponds to
sales data for a given item and location combination, based on all lowest item levels in
the model.
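A quick sanity check before the load is simply to confirm that the three staging tables
contain the rows you expect, for example:
select count(*) from t_src_item_tmpl;
select count(*) from t_src_loc_tmpl;
select count(*) from t_src_sales_tmpl;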
next
================================
Functional Steps to load Demantra
1. Load the data into the staging tables:
t_src_sales_tmpl,
t_src_item_tmpl,
t_src_loc_tmpl
2. Login to the Workflow Manager.
3. Start the workflow 'EBS Full Download'
4. Check the status by clicking on the 'Instance' number of the workflow.
5. Check the status on the Collaborator Workbench in the My Tasks pane.
next
================================
This pre-seeded WF should include all required steps to enable a successful
download of items/location/sales data into Demantra tables. Currently the WF
runs the following processes:
EP_LOAD_ITEMS
EP_LOAD_LOCATION
EP_LOAD_SALES
However, in order to complete the loading process you may need to run the MDP_ADD
procedure to make sure new combinations are added into MDP_MATRIX.
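A minimal sketch of invoking the procedure from a SQL prompt (the single-argument form mirrors
the mdp_add('mdp_load_assist') call referenced later in this material; confirm the signature in
your schema before relying on it):
exec mdp_add('mdp_load_assist');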
next
================================
Technical Investigation
-----------------------
On 7.2.0.1 in Production:
We find that after running ep_load_sales, the procedure completed without any error
message, but there are no records in the sales_data table.
EXPECTED BEHAVIOR
-----------------------------------
We expect data to be loaded
Steps To Reproduce:
1. load data into t_src_sales_tmpl
2. run ep_load_sales
3. check integ_status
4. check t_src_sales_tmpl_err
5. check sales_data
We have just one record in t_src_sales_tmpl. After running ep_load_sales, the procedure
completed without any error message and there is no error row in t_src_sales_tmpl_err.
Upon checking sales_data and integ_status, there are no rows in these tables.
Debugging Steps:
1. Check if you have any error in db_exception_log table.
select * from db_exception_log order by err_date desc;
2. Verify that there is not 'new' data. The existing row could have simply been updated.
3. Please review all columns:
select * from t_src_sales_tmpl;
select * from sales_data;
Upon closer examination, we found invalid data in the t_src_sales_tmpl table.
After we updated the column ACTUAL_QTY from NULL to '0', ep_load_sales was successful.
* You would need to change the value coming from the source to successfully load without
manipulating staging table data.
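A hedged sketch of the kind of correction applied in this case (prefer fixing the feed at the
source; if you must patch the staging table, restrict the update to the affected rows):
update t_src_sales_tmpl
set actual_qty = 0
where actual_qty is null;
commit;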
next
================================
After launching Collection, Collect Shipment and Booking history with the
parameter 'EP_LOAD'=N and running ep_load after collection, the
actual_quantity loaded into sales_data is different from the actual_qty in
t_src_sales_tmpl.
This is what we did:
--------------------
1. Launch Collection : Collect Shipment and Booking history with the
parameter 'EP_LOAD' = N
2. Launch a custom program to populate the following columns: T_ep_i_att_9,
T_ep_p3, T_ep_p2 in the staging table t_ep_item_tmpl from the column
ebs_product_category_desc in the table t_ep_item_tmpl.
3. Launched the EP_Load as a standalone request.
4. Restarted Application Server
5. Verified the data for the items 130-0290-910 in the bucket starting 05/26/2008=18333
6. Verified from the following Query:
select sum(actual_qty) from t_src_sales_tmpl where
dm_item_code='130-0290-910'
and sales_date between '26-may-2008' and '01-jun-2008';
- Result: 121 <----
7. Verified from Sales Data table from the following query:
select sum(actual_quantity) from Sales_data where item_id in
(select item_id from msdem.mdp_matrix where t_ep_item_ep_id in
(select t_ep_item_ep_id from msdem.t_ep_item where item = '130-0290-910'))
and sales_date between '26-may-2008' and '01-jun-2008';
- Result:18333 <----
Debugging
---------
Please check the SYS_PARAMS 'accumulatedOrUpdate' setting:
'update' or 'accumulate'
EP_LOAD_SALES should aggregate the sales in the T_SRC_SALES table by combination and
sales date.
- accumulatedOrUpdate = Update
If the sale already exists then the actual_quantity value should be replaced.
- accumulatedOrUpdate = accumulate
Should add the new value to any existing values (provided the series is set
as proportional)
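To check the setting (assuming the standard SYS_PARAMS pname/pval columns):
select pname, pval
from sys_params
where lower(pname) = 'accumulatedorupdate';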
Debugging Step 2
----------------
This is one method of comparing source data to mdp_matrix. In this case we
are verifying the sums for a given date range / item.
- Check the source:
select sum(shipped_quantity), sum(cancelled_quantity), sum(ordered_quantity),
ship_from_org_id
from apps.oe_order_lines_all where actual_shipment_date
between '26-may-2008' and '01-jun-2008' and ordered_item= '130-0290-910'
group by ship_from_org_id;
- Check what was collected:
select sum(actual_qty), dm_org_code from t_src_sales_tmpl where
dm_item_code='130-0290-910'
and sales_date between '26-may-2008' and '01-jun-2008' group by dm_org_code;
- Check what was loaded into MDP_Matrix:
select sum(actual_quantity) from Sales_data where item_id in
(select item_id from msdem.mdp_matrix where t_ep_item_ep_id in
(select t_ep_item_ep_id from msdem.t_ep_item where item = '130-0290-910'))
and sales_date between '26-may-2008' and '01-jun-2008';
- Here we are checking a different item with a ship_from_org_id:
select sum(shipped_quantity), sum(cancelled_quantity), sum(ordered_quantity),
ship_from_org_id
from apps.oe_order_lines_all where actual_shipment_date
between '26-may-2008' and '01-jun-2008' and ordered_item= '130-0290-910' and
ship_from_org_id = 722 group by ship_from_org_id;
- Following the trail to collected data:
select sum(actual_qty), dm_org_code from t_src_sales_tmpl where
dm_item_code='130-0290-910'
and sales_date = '30-May-2008' group by dm_org_code;
- And finally in mdp_matrix:
select sum(actual_quantity) from Sales_data where item_id in
(select item_id from msdem.mdp_matrix where t_ep_item_ep_id in
(select t_ep_item_ep_id from msdem.t_ep_item where item = '130-0290-910'))
and sales_date ='30-May-2008';
If there is a difference, the sql can be adjusted to weed out extra rows or
rows that were not collected through to mdp_matrix.
next
================================
I created an integration interface, Variable Cost, and a workflow with a transfer step
to load data for that integration interface. When the staging table for the integration
interface is populated with data and the workflow is run, the errored-out records move
to the err table and the correct records vanish from the staging table, but nothing is
populated in the Demantra internal tables.
If all the records in the staging table are correct, then the data is
populated into the base tables of Demantra.
Investigation
-------------
The staging table (integ_inf_var_cost) seems to contain dirty data. By this
I mean that in addition to missing members (which are handled by the
system and can be noted in the error table), some of the combinations do not
have a valid item-location combination in the MDP_MATRIX table. Hence, the
update process failed to find any valid rows to update.
On v7.1.1, the integration process included the following
validations:
1. Integration profile structure
2. Staging table data date range
3. Existence and validity of members and drop-down series values
For any error that is found, a row in the error table indicates and holds
information about the problem.
On v7.1.1, validation DID NOT include population validation (i.e., that the
specified combination indeed exists in MDP_MATRIX). This feature was added
in the v7.2.0 release.
Nonetheless, please find the following two SQL statements; either one will reveal those notorious
combinations in the staging table:
SELECT DISTINCT i.*
FROM integ_inf_var_cost i,
t_ep_e1_it_br_cat_3 e,
t_ep_region r,
t_ep_finproductgroup f
WHERE i.level1 = e.e1_it_br_cat_3
AND i.level2 = r.region
AND i.level3 = f.finproductgroup
AND NOT EXISTS (
SELECT *
FROM mdp_matrix m
WHERE m.t_ep_e1_it_br_cat_3_ep_id = e.t_ep_e1_it_br_cat_3_ep_id
AND m.t_ep_region_ep_id = r.t_ep_region_ep_id
AND m.t_ep_finproductgroup_ep_id = f.t_ep_finproductgroup_ep_id);
select distinct t6.e1_it_br_cat_3,t5.region,t4.finproductgroup
from mdp_matrix t1,
t_ep_finproductgroup t4,
t_ep_region t5,
t_ep_e1_it_br_cat_3 t6,
integ_inf_var_cost vc
where t1.t_ep_region_ep_id = t5.t_ep_region_ep_id
and t1.t_ep_finproductgroup_ep_id = t4.t_ep_finproductgroup_ep_id
and t1.t_ep_e1_it_br_cat_3_ep_id = t6.t_ep_e1_it_br_cat_3_ep_id
and t1.variable_cost is null
and t6.e1_it_br_cat_3 = vc.level1
and t5.region = vc.level2
and t4.finproductgroup = vc.level3;
next
================================
This is a sound approach to test the complete collection and presentation of
data to MDP_MATRIX and Demantra.
Actions that were taken:
1. Reset the tables, by executing the following:
-- Reset
TRUNCATE TABLE integ_inf_var_cost_err;
TRUNCATE TABLE integ_inf_var_cost;
UPDATE mdp_matrix
SET variable_cost = NULL;
COMMIT;
2. Fill staging table, by executing the following:
-- Insert data into the staging table
INSERT INTO integ_inf_var_cost
(sdate, level1, level2, level3, variable_cost)
SELECT TO_DATE (TO_CHAR (NEXT_DAY (SYSDATE, 'monday') - 7,
'DD-MM-YYYY'),
'DD-MM-YYYY'
) sdate,
shipto_commercial_org level1,
region level2,
finance_product_line level3,
variable_cost
FROM cstm_var_cost_eur;
COMMIT;
3. Run the "Variable Cost Download" workflow.
4. Checked the results:
a) Verify that the integ_inf_var_cost table was empty, using the
following:
-- Checked that all data from the staging table was handled
SELECT COUNT (*)
FROM integ_inf_var_cost;
which had returned 0, as expected.
b) Verified that the mdp_matrix table had new rows using
the following:
-- Checked that the new data was introduced
SELECT COUNT (*)
FROM mdp_matrix
WHERE variable_cost IS NOT NULL;
which had returned 247.
next
================================
Working with Oracle Support or DEV
----------------------------------
Exporting your Demantra data model for review:
An export file containing your data model can be created by running
the Business Modeler and selecting the menu "Data Model --> Open Data
Model". Then select the active model (it is yellow) and click on the
"Export" button. Then specify the location to save the *.dmw file.
next
================================
Data Loading Directly into Staging Tables
-----------------------------------------
1. Load a single record (for the purpose of this test case) into each of
t_src_item_tmpl,t_src_loc_tmpl and t_src_sales_tmpl
2. We then run a hybrid of the standard 'sales load' workflow, where
DL_RUN_PROC is called with various arguments one by one:
- ep_prepare_data,
- ep_load_items,
- ep_load_location,
- ep_load_sales
3. Look for errors in the %_err tables and find that t_src_sales_tmpl_err has an
error and t_src_sales_tmpl is empty
Determined Cause:
When loading SALES_DATA all of the levels must match existing levels in
the hierarchy, not just the item code. This is ensured by a large JOIN near
the beginning of the EP_LOAD_SALES procedure.
We have also seen this strange result as a product of bug 6520853, EP_LOAD_ITEMS DOESN'T
LOAD ITEM AND EP_CHECK_ITEMS DOESN'T GIVE ERROR. This was caused by item codes
whose case did not match the existing level members.
This is fixed in 7.2.0.2
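A hedged sketch for spotting case mismatches between the sales staging table and the item level
(column names are taken from queries elsewhere in this material):
select s.dm_item_code
from t_src_sales_tmpl s
where not exists (select 1 from t_ep_item t
                  where t.item = s.dm_item_code)
  and exists (select 1 from t_ep_item t
              where upper(t.item) = upper(s.dm_item_code));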
next
================================
Missing Date Values in Loading
------------------------------
After running the EP_Load procedure the worksheet does not show the data.
The TO_DATE & FROM_DATE columns are not populated.
Steps to Reproduce:
1. Run Build Model.
2. Load data to the T_SRC_ITEM_TMPL, T_SRC_LOC_TMPL & T_SRC_SALES_TMPL tables.
3. Run: Demand Planner\DeskTop\load_data.bat
4. View that there are no errors in:
T_SRC_ITEM_TMPL_ERR,
T_SRC_LOC_TMPL_ERR
T_SRC_SALES_TMPL_ERR tables
5. Check that the data was inserted into the database.
6. Create a worksheet that will show the data.
7. See if the data is seen in the worksheet.
- My results are that the levels, members and data exist in the database.
- The levels and members are seen in the worksheet wizard and can be selected.
- The levels, members and data are not shown in the worksheet.
Neat Debugging Example
----------------------
Create temp table with min and max dates per mdp_matrix combination:
create table temp_matrix_dates as SELECT s.item_id, s.location_id,
MIN(s.sales_date) AS from_date, MAX(s.sales_date) AS until_date
FROM sales_data s, mdp_matrix t
WHERE s.item_id = t.item_id
AND s.location_id = t.location_id
GROUP BY s.item_id, s.location_id;
Update mdp_matrix from_date and until_date based on the table created above:
UPDATE mdp_matrix m
set (from_date,until_date) = (
select from_date,until_date
from temp_matrix_dates WHERE item_id = m.item_id
AND location_id = m.location_id)
WHERE item_id = m.item_id
AND location_id = m.location_id;
This will update the date columns in mdp_matrix for all combinations not just
the ones you loaded. Use the following SQL to identify which combinations in
MDP_MATRIX are missing the date values.
select item_id, location_id, from_date, until_date from mdp_matrix where
from_date is null and until_date is null;
next
================================
The following investigation can be used to verify that data from the source actually
loads into the Demantra tables. The item hhm-1 is missing from the Demantra sales_data table
but there were no errors. This customer is using .ctl files to load the dmtra_template
schema. You can change the default schema name, see note 551455.1.
Step by Step Walk Through to Find the Data
Following are the steps followed and the details of the issue.
Load your data into flat files matching the .ctl and perform the following:
- exec DATA_LOAD.EP_PREPARE_DATA; (Runs "replace_apostrophe", which removes
single quotes from the T_SRC tables)
- exec DATA_LOAD.EP_LOAD_ITEMS; (Runs; no errors found in the _ERR table)
- exec DATA_LOAD.EP_LOAD_LOCATION; (Runs; no errors found in the _ERR table)
- exec DATA_LOAD.EP_LOAD_SALES; (Runs; no errors found in the _ERR table)
1. Performed booking and shipping history collections with the auto download option.
2. Found that the data is not collected into Demantra. This was confirmed after
verifying the data is not present in the worksheets.
3. Performed the EBS Full Download workflow. This collected the location data into the staging table.
select *
from dmtra_template.t_src_sales_tmpl
where lower(dm_item_code) = 'hse-1';
4. Then performed booking and shipping history collections with auto download
5. SELECT DISTINCT
dm_item_code,
dm_org_code,
dm_site_code,
t_ep_lr1,
t_ep_ls1,
t_ep_p1,
ebs_demand_class_code,
ebs_sales_channel_code,
aggre_sd -- filter column
FROM ep_T_SRC_SALES_TMPL_ld
WHERE ep_T_SRC_SALES_TMPL_ld.actual_qty IS NOT NULL
and dm_item_code ='HHM-1'
ORDER BY
dm_item_code, dm_org_code, dm_site_code, t_ep_lr1, t_ep_ls1, t_ep_p1,
ebs_demand_class_code, ebs_sales_channel_code, aggre_sd;
The 2 dates shown are 05-FEB-07 and 29-JAN-07
6. SELECT ITEMS.item_id,LOCATION.location_id
FROM ITEMS, LOCATION,
t_ep_item,
t_ep_organization,
t_ep_site,
t_ep_lr1,
t_ep_ls1,
t_ep_p1,
t_ep_ebs_demand_class,
t_ep_ebs_sales_ch
WHERE 1 = 1
AND items.t_ep_item_ep_id = t_ep_item.t_ep_item_ep_id
AND t_ep_item.item = 'HHM-1'
AND location.t_ep_organization_ep_id = t_ep_organization.t_ep_organization_ep_id
AND t_ep_organization.organization = 'TST:M1'
AND location.t_ep_site_ep_id = t_ep_site.t_ep_site_ep_id
AND t_ep_site.site = 'ABC Corporation Americas:2637:7233:Vision Operations'
AND location.t_ep_lr1_ep_id = t_ep_lr1.t_ep_lr1_ep_id
AND t_ep_lr1.lr1 = 'N/A'
AND location.t_ep_ls1_ep_id = t_ep_ls1.t_ep_ls1_ep_id
AND t_ep_ls1.ls1 = 'N/A'
AND items.t_ep_p1_ep_id = t_ep_p1.t_ep_p1_ep_id
AND t_ep_p1.p1 = 'N/A'
AND items.t_ep_ebs_demand_class_ep_id = t_ep_ebs_demand_class.t_ep_ebs_demand_class_ep_id
AND t_ep_ebs_demand_class.ebs_demand_class = '0'
AND location.t_ep_ebs_sales_ch_ep_id = t_ep_ebs_sales_ch.t_ep_ebs_sales_ch_ep_id
AND t_ep_ebs_sales_ch.ebs_sales_ch = 'Direct';
Gives the ITEM_ID 565 and LOCATION_ID 607
7. What you now see is only the 2 rows inserted / updated into SALES_DATA
because the 'aggre_sd' date is used as a filter from the actual_qty NOT NULL select.
SELECT COUNT(*)
from sales_data
where item_id = 565
and location_id = 607
and TRUNC(sales_date) = TO_DATE('05-FEB-07','DD-MON-RR')
Gives 1 row which is right.
8. These rows are also added to MDP_LOAD_ASSIST.
* When the actual_qty is NULL in the distinct list it is inserted in to a table
MDP_LOAD_ASSIST and is used to populate MDP_MATRIX.
* The actual_qty is NOT NULL list is aggregated by the aggre_sd filter date and the rows
are inserted / updated in SALES_DATA.
* The column quantity values are averaged across all the distinct rows, even
including the NOT NULL actual_qty rows, by the 'aggre_sd' date.
The EP_LOAD_SALES continues to load the rows from MDP_LOAD_ASSIST via:
mdp_add('mdp_load_assist') which is in DATA_LOAD.EP_LOAD_SALES
From T_SRC_SALES_TMPL:
Total rows                                              12,476
Distinct IDs with actual_qty NULL                          807  <-- These are the problem rows
Distinct IDs incl. aggre_sd and actual_qty NOT NULL     11,669
Why are there 807 problem rows? Why are they not in the err table?
9. Trying to verify success for the following item: HSE-1
select i.t_ep_item_ep_id,
i.item_id,t.item,
i.is_fictive
from dmtra_template.items i, t_ep_item t
where lower(item) ='hse-1'
and i.t_ep_item_ep_id = t.t_ep_item_ep_id
order by 1 ;
The above will deliver the item_id to be used to verify that the data is in
the sales_data table:
select * from dmtra_template.sales_data where item_id in (654,671,727,823);
Returned zero rows.
It is unclear why ep_load_sales does not load the sales or insert errored rows
into the error tables.
10. To continue the investigation
SELECT *
FROM dmtra_template.t_ep_item
WHERE item = 'HSE-1';
Output:t_ep_item_ep_id = 188
11. What are the actual_qty values and sales_date in the source table?
select dm_item_code, sales_date, actual_qty
from dmtra_template.t_src_sales_tmpl
where lower(dm_item_code) in ('hse-1')
and actual_qty IS NOT NULL;
DM_ITEM_CODE SALES_DATE ACTUAL_QTY
------------ ---------- ----------
HSE-1 12/25/2006 70
How many rows have NULL actual_qty values in the source table?
select dm_item_code, sales_date, actual_qty
from dmtra_template.t_src_sales_tmpl
where lower(dm_item_code) in ('hse-1')
and actual_qty IS NULL;
DM_ITEM_CODE SALES_DATE ACTUAL_QTY
------------ ---------- ----------
HSE-1 1/1/2007
HSE-1 1/1/2007
12. SELECT *
FROM dmtra_template.items
WHERE t_ep_item_ep_id = 188;
Output :
item_id demand_class
------- ----------------------------------
654 Unassicaoted
671 Australia Sales Region
727 East US Sales Region
823 West US Sales Region
13. What are the corresponding item_id values from the ITEMS table?
select i.t_ep_item_ep_id, i.item_id,t.item, i.is_fictive
from items i, t_ep_item t
where lower(item) ='hse-1'
and i.t_ep_item_ep_id = t.t_ep_item_ep_id
order by 1;
T_EP_ITEM_EP_ID ITEM_ID ITEM IS_FICTIVE
--------------- ------- ------- ----------
188 654 HSE-1 2
14. select * from dmtra_template.sales_data
where item_id = 654
and sales_date IN ('12/25/2006','1/1/2007','1/29/2007');
ITEM_ID LOCATION_ID SALES_DATE ACTUAL_QUANTITY
------- ----------- ---------- ---------------
671 746 1/29/2007 70
671 752 1/1/2007
671 753 1/1/2007
As you can see, where the quantity is not null there is data with the correct
corresponding values and dates in SALES_DATA.
Expected Behaviour
-------------------
We should be able to see the data for History for these series as well, even
when the actual_quantity IS NULL:
Booking History - booked items - booked date,
Booking History - requested items - booked date,
Booking History - booked items - requested date,
Booking History - requested items - requested date,
Shipment History - shipped items - shipped date,
Shipment History - shipped items - requested date,
Shipment History - requested items - requested date
<< end of document >>
WC-Staging-Tables.txt
WARNING!! WARNING!! WARNING!! WARNING!! MAY CAUSE DROWSINESS!
The material presented below is of a technical nature. Little attention has been
given to functional navigation or functional demonstrations. The vast majority of
data load issues are investigated and solved using SQL. To that end, this presentation
focuses almost exclusively on problem investigation and resolution.
This material has been assembled from hundreds of bugs in which DEV gave us techniques
to drill into source/destination data.
Data Loading Flow
-----------------
There are four data flows that move data in and out of Demantra:
1. Loading data from the source into the collection staging tables
* These are the T_SRC_% and error tables; see the ep_load_main procedure.
2. Moving data from the collection staging tables into the Demantra data store
3. Loading data from the Demantra data store into the Demantra engine and
downloading from Demantra back to the Demantra data store.
4. Pushing the data back to the source instance in the form of a forecast.
We will cover #1 and #2 in this presentation.
next
================================================
Summary of Integration Tasks
----------------------------
This section lists integration tasks in the appropriate sequence:
1. Initial Setup, See Implementation Guide.
2. Collect Data and Download to Three Staging Tables. See Implementation Guide.
3. Transfer data to Oracle Demantra schema. See Implementation Guide.
* EP_LOAD
* Import Integration Profiles
4. Generate forecasts
5. Export Output from Oracle Demantra. See Implementation Guide.
* Export Integration Profiles
6. Upload Forecast. See Implementation Guide.
next
================================================
EBS to Demantra Data Load / Import Diagnostics Investigation
------------------------------------------------------------
There are several methods to load data into the Demantra staging tables. Based on the number
of problems reported, the tools seem to operate as advertised. The data loaded into the
Demantra staging tables can be an issue.
We will not focus on the tools, which appear to be intuitive, but instead discuss methods of investigation
to identify, explain and fix the load result.
Load Methods
------------
- Integration Interface Wizard
- Data Model Wizard
- Demantra Import Tool
- SQL*Loader
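For the SQL*Loader method, a minimal control file sketch for the sales staging table might look
like the following (the file name, delimiter, and column list are illustrative; match them to
your extract and to the actual T_SRC_SALES_TMPL definition):
-- sales.ctl (hypothetical)
LOAD DATA
INFILE 'sales.dat'
APPEND INTO TABLE t_src_sales_tmpl
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
(dm_item_code,
dm_org_code,
dm_site_code,
sales_date DATE "MM/DD/YYYY",
actual_qty)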
next
================================================
The following table summarizes the core Demantra import and export tools:
Data                                     To import, use...               To export, use...
---------------------------------------  ------------------------------  ----------------------------
Lowest level item and location data;     Data Model Wizard*              N/A
sales data
Series data at any aggregation level     Integration Interface Wizard*   Integration Interface Wizard
Sales promotions                         Integration Interface Wizard*   Integration Interface Wizard
Members and attributes of other levels   N/A                             Integration Interface Wizard
Other data, for example, lookup tables   Demantra import tool*           N/A
*These options are in the Business Modeler.
next
================================================
The core Demantra tools allow you to do the following:
• Import lowest-level item, location, and sales data
• Import or export series data at any aggregation level, with optional filtering
• Import promotions and promotional series
• Export members of any aggregation level
• Import supplementary data into supporting tables as needed
next
================================================
Object Manipulation
-------------------
The Demantra Business Modeler contains many useful tools to perform maintenance and
development tasks:
- Create Table
- Alter Table
- Recompile
- View the Procedure Error Log
- Cleanup Demantra temporary tables
- Oracle Sessions Monitor
- Please see the user guide for the complete list
next
================================================
Oracle Demantra Workflows
- EBS Full Download: Downloads items, locations, and shipment and booking history.
- EBS Return History Download: Downloads return history
- EBS Price List Download: Downloads Price Lists
Workflows can perform all of the following actions:
- Run integration interfaces.
- Run stored database procedures.
- Run external batch scripts and Java classes.
- Pause the workflow until a specific condition is met, possibly from a set of allowed
conditions. For example, a workflow can wait for new data in a file or in a table.
- Send tasks to users or groups; these tasks appear in the My Tasks module for those
users, within Collaborator Workbench. A typical task is a request to examine a
worksheet, make a decision, and possibly edit data. A task can also include a link to
a Web page for more information.
next
================================================
WF Technical
------------
Monitor the WF for errors:
SELECT ACTIVITY_NAME, ACTIVITY_STATUS_CODE,
ACTIVITY_RESULT_CODE, TO_CHAR(ACTIVITY_BEGIN_DATE, 'DD/MM/YYYY -
HH24:MI:SS'), TO_CHAR(ACTIVITY_END_DATE, 'DD/MM/YYYY - HH24:MI:SS'),
ERROR_NAME,
ERROR_MESSAGE
FROM WF_ITEM_ACTIVITY_STATUSES_V
WHERE ACTIVITY_STATUS_CODE = 'ERROR'
Check the collaborator log
--------------------------
I have errors in the collaborator.log file resulting from the "Download Plan Scenario Data"
workflow.
- They were caused by the "Notify" email step in the workflow; these errors were being
generated because the Demantra environment was not configured with the proper details for sending
email notifications to users.
- These errors might prevent the workflow from completing the rest of the steps in the flow.
- The User Guide explains proper setup quite well.
Restart / Debugging Steps
-------------------------
1) Shut down the application server.
2) Clear tasks from Collaborator Workbench for the user.
3) Delete the new DAILY_PROCESS workflow.
4) Clear the wf_scheduler and wf_process_log tables by deleting all rows:
- delete from wf_scheduler;
- delete from wf_process_log;
5) Restart the application server/web server.
6) Schedule the WF "daily process".
Additional WF Checks
--------------------
1. Is there more than a single instance of the Demantra application
running (i.e., with a different context root, on different machines, etc.)?
2. What is the name of the problematic workflow? Is there another workflow schema by that name?
3. How were schemas removed? Using Demantra application or directly from
the database (using the DELETE statement)? Use the application whenever possible.
Poor WF Performance?
--------------------
- Ask yourself, is this necessary data? While historic data is important, not all historic
data needs to be included.
- Are most of the quantities in these records valid quantities?
- Have you considered changing the plan settings to filter out quantities that
are below a threshold?
- Can older scenario revision data be deleted if it's no longer required?
- What are the output levels (including time) of the scenarios?
Size Check for Performance
--------------------------
Use the output as a high water mark setting for future growth/performance analysis.
select SCENARIO_NAME, SCENARIO_OUTPUT_PERIOD_TYPE,DP_DIMENSION,
LEVEL_NAME
from MSD_DP_SCN_OUTPUT_LEVELS_V
where demand_plan_id = <plan id>
order by scenario_name, dp_dimension, level_name
select count(*),SCENARIO_ID from msd.MSD_DP_SCN_ENTRIES_DENORM where
DEMAND_PLAN_ID=6031 group by
SCENARIO_ID;
select count(*),SCENARIO_ID,revision from
msd.MSD_DP_SCENARIO_ENTRIES
where demand_plan_id=6031 group by SCENARIO_ID,revision;
- Is there data that can be purged?
- Are all of the scenarios and revisions lean or do they contain stale data?
next
================================================
Integration Profile Performance Problem?
----------------------------------------
For performance reasons, updates are not executed immediately; they are accumulated and,
once they exceed the configured limit, written as a chunk (block).
* The limit is defined by the ImportBlockSize property in the appserver.properties file.
Modify the 'ImportBlockSize' parameter in appserver.properties to a value that gives the
best performance at your site after testing different settings.
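For example, the entry in appserver.properties looks like the line below. The value shown is
purely illustrative; the right setting depends on your data volumes and should be confirmed by testing:
ImportBlockSize=1000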
next
================================================
The Basic Input Data Review
---------------------------
When fully configured, Demantra imports the following data, at a minimum, from your
enterprise systems:
- Item data, which describes each product that you sell.
- Location data, which describes each location to which you sell or ship your products.
- Sales history, which describes each sale made at each location. Specifically, this
includes the items sold and the quantities of those items in each sale.
- For Promotion Effectiveness:
Historical information about promotional activities.
Demantra can import and use other data such as returned amounts, inventory data,
orders, and settlement data.
- From my understanding of the Shipping/Booking History Collection for Demantra, the
sales data is loaded first and then the item and loc tables are derived from it.
next
================================================
Functional Considerations
-------------------------
There are many functional setup issues relating directly to a successful data load and execution.
For example, the profile 'MSC: Organization containing generic BOM for forecast explosion'
can be set to one of the organizations at the site level. This will limit the rows entered into
MSD_DP_SCENARIO_ENTRIES and MSD_DP_SCN_ENTRIES_DENORM.
Changing the profile to the Global Item Master Organization will make the records available for loading.
next
================================================
Imported Data
-------------
You can collect internal sales orders. They appear in the customer dimension; the customer
appears as the organization code.
Seeded collections from EBS into Oracle Demantra Demand Management include:
- Shipment history, booking history, and returns history
- Manufacturing and fiscal calendars
- Price lists, currencies, and currency conversion factors
- Dimensions, levels, hierarchies, and level values for demand analysis.
Item levels are:
- Product category: Item > Category
- Product family: Item > Product family
- Demand class
Location levels are:
- Zone: Site > Trading partner > Zone
- Customer class: Site > Account > Customer > Customer class
- Business group: Organization > Operating unit > Business group
- Sales channel
next
================================================
Integration Interface Wizard
----------------------------
The Integration Interface Wizard initializes the names of the staging tables, but you
can rename the tables within the wizard if needed. The default names start with biio_.
Make a note of the names of your tables, as displayed within the Integration Interface Wizard.
(not a complete list)
- biio_supply_plans
- biio_supply_plans_pop
- biio_other_plan_data
- biio_PURGE_PLAN
- biio_scenario_resources
Troubleshooting
---------------
Look to the logs
- <Integration Interface Table name>_ERR.
- If the staging table is named Biio_My_Demand, the error table is Biio_My_Demand_Err.
- The _Err table will contain the rows that were not successfully added to the staging tables.
- Check the Integration Log.
If your URL to Demantra is http://DEMANTRAMACHINE:8080/demantra/portal/loginpage.jsp
then the logs can be accessed by pointing your browser to:
http://DEMANTRAMACHINE:8080/demantra/admin/systemLogs.jsp
NOTE: it is advised that you review the contents of the following tables for errored rows (a sketch follows the list):
- UPDATE_BATCH_TRAIL
- UPDATE_BATCH_VALUES
- UPDATE_BATCH_TRAIL_ERR
- UPDATE_BATCH_VALUES_ERR
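A minimal sketch of that review, assuming these tables exist in your Demantra schema (column
layouts vary by version, so a row count followed by a bounded select is the safest start):
select count(*) from UPDATE_BATCH_TRAIL_ERR;
select count(*) from UPDATE_BATCH_VALUES_ERR;
select * from UPDATE_BATCH_TRAIL_ERR where rownum <= 50;
select * from UPDATE_BATCH_VALUES_ERR where rownum <= 50;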
Please run the following script in the demantra schema to verify data
---------------------------------------------------------------------
begin
apps.msd_dem_sop.load_plan_data (:supply_plan_id);
end;
The parameter to the procedure is the supply_plan_id column value from the supply_plan
table, i.e., the ID of the supply plan whose data is being loaded from ASCP to Demantra.
Use SQL to obtain it (a lookup query is sketched after the example).
For example, if the supply plan id for plan 'ASCP-DEM' is 134, the script should be run as
begin
apps.msd_dem_sop.load_plan_data (134);
end;
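To find the correct ID, a lookup such as the one below can be used (a sketch; the text above only
guarantees the supply_plan table and its supply_plan_id column, so select the whole row and
identify your plan from the output):
select * from supply_plan order by supply_plan_id;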
If you must log an SR, please enable trace for the session and provide the tkprof
output of the trace file after the execution.
Also provide the row counts of the following tables in the Demantra schema after
running the script (a single consolidated count query is sketched after this list):
1. biio_other_plan_data
2. biio_resource_capacity
3. biio_supply_plans
4. biio_supply_plans_pop
5. biio_scenario_resources
6. biio_scenario_resource_pop
7. biio_resources
8. t_src_item_tmpl
9. t_src_loc_tmpl
10. BIIO_PURGE_PLAN
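A sketch of one query that gathers all of these counts in a single pass (assuming the tables live
in the Demantra schema you are connected to; drop any line whose table does not exist in your install):
select 'biio_other_plan_data' tbl, count(*) cnt from biio_other_plan_data union all
select 'biio_resource_capacity', count(*) from biio_resource_capacity union all
select 'biio_supply_plans', count(*) from biio_supply_plans union all
select 'biio_supply_plans_pop', count(*) from biio_supply_plans_pop union all
select 'biio_scenario_resources', count(*) from biio_scenario_resources union all
select 'biio_scenario_resource_pop', count(*) from biio_scenario_resource_pop union all
select 'biio_resources', count(*) from biio_resources union all
select 't_src_item_tmpl', count(*) from t_src_item_tmpl union all
select 't_src_loc_tmpl', count(*) from t_src_loc_tmpl union all
select 'biio_purge_plan', count(*) from biio_purge_plan;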
next
================================================
Dumping your source data for review:
1. A dump file (and log file) which contain the tables
T_SRC_ITEM_TMPL,
T_SRC_LOC_TMPL,
T_SRC_SALES_TMPL,
TABLE_NAME,
WF_GROUPS_SCHEMAS,
WF_PROCESS_LOG,
WF_SCHEDULER,
WF_SCHEMAS,
WF_SCHEMA_GROUPS,
WF_SCHEMA_TEMPLATES.
The command to make such a dump file:
(You may need to adjust)
exp user@server file=resmed_oct.dmp log=resmed_oct.log tables=(
T_SRC_ITEM_TMPL, T_SRC_LOC_TMPL, T_SRC_SALES_TMPL, TABLE_NAME,
WF_GROUPS_SCHEMAS, WF_PROCESS_LOG, WF_SCHEDULER, WF_SCHEMAS,
WF_SCHEMA_GROUPS, WF_SCHEMA_TEMPLATES)
next
================================================
Parameter Checks Used for Source Data Load and EP Load. Be familiar
with the contents of the following tables:
select * from init_params_0 order by 1;
select * from sys_params order by 1;
select * from db_params order by 1;
select * from aps_params order by 1;
- init_params_0 contains the descriptions of the parameters.
- sys_params contains the parameters listed on the Database, System, and
Worksheet tabs.
- db_params contains operational entries such as check_and_drop_sleep_limit.
For example, if you should receive the following error:
ORA-20000: Cannot DROP AK because of ORA-54 resource busy and have timed
out after 255 seconds
ORA-6512: at "YOURCO_DEC.CHECK_AND_DROP", line 143
The sleep period starts at 1 second and increments exponentially until it
reaches a sleep period limit > 128 seconds (current default). This gives
a total attempt time of 255 seconds. This limit can be changed via the
new parameter in DB_PARAMS 'check_and_drop_sleep_limit'.
Retry messages are written to the DB_EXCEPTION_LOG table.
Another Key Entry for EP_Load
There is a parameter in DB_PARAMS, ep_load_do_commits; the default is TRUE.
- TRUE causes commits to be issued inside the SALES MERGE loop.
- FALSE causes only one commit to be issued, just after the SALES MERGE loop.
- aps_params contains the password the Business Modeler should use to connect
to the database, as well as other important operational settings.
(A query to inspect both DB_PARAMS entries above is sketched below.)
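A sketch of a query to inspect both DB_PARAMS entries mentioned above (this assumes db_params
uses a pname/pval-style layout like sys_params; if your version differs, fall back to the
select * shown earlier):
select * from db_params
where lower(pname) in ('check_and_drop_sleep_limit', 'ep_load_do_commits');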
next
================================================
Detailed Source Data Investigation
----------------------------------
Scenario: Shipment and booking history are not completely collected when
we establish a new schema. Though the collections for shipment and booking
history are successful with no error or warning message, we could not see the
items in Demantra. If there is unclean data in the t_src_item_tmpl and
t_src_loc_tmpl tables, the EBS full download moves the unclean data from these
tables to the error table. This works fine, but the process is not moving
clean data to the Demantra base tables.
* You will need to find your item_id and organization_id, as well as other data required
to run the following SQL.
Steps to Reproduce:
1. Performed booking and shipping history collections with the Auto download option.
2. Found that the data is not collected into Demantra. This was confirmed after
verifying the data in worksheets.
3. Verified that this issue is related to locations via simple SQL checks.
select * from dmtra_template.t_src_loc_tmpl_err;
Found that there were no errors in this table.
4. Performed the EBS full download workflow. This collected the location data into the
staging table, but not the sales data, since the previous run had not collected the location
data; hence there was nothing in the sales staging table.
5. Then performed booking and shipping history collections with auto download. The sales
data was still not collected into Demantra. (It is not clear why the customer repeated this step.)
Now, we will dig through the tables to locate the problem
Step 1
------
The table INTEG_STATUS shows that when DMTRA_TEMPLATE runs the EP_LOAD
it succeeds, but when APPS runs it, it fails; this could be a permissions issue.
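A quick look at the load history can confirm who ran the load and with what outcome
(a sketch; adjust the schema owner to your installation):
select * from dmtra_template.integ_status order by 1;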
Step 2
------
select * from dmtra_template.sales_data where item_id in (654,671,727,823);
Step 3
------
Have confirmed items HSE-1, HSE-2,HSE-3,HSE-4,HSE-5 are in staging tables:
select * from dmtra_template.t_src_sales_tmpl where lower(dm_item_code) in
('hse-1', 'hse-2','hse-3','hse-4','hse-5');
Step 4
------
Also confirmed items are loaded into the system:
select i.t_ep_item_ep_id,i.item_id,t.item,i.is_fictive
from dmtra_template.items i,dmtra_template.t_ep_item t where lower(item) in
('hse-1', 'hse-2','hse-3','hse-4','hse-5')
and i.t_ep_item_ep_id=t.t_ep_item_ep_id
order by 1;
Step 5
------
Launched complete refresh collections for Shipment and booking history with
Auto-download set to 'yes'.
There were no records pertaining to HSE-1 or HHM-1 (for example) in the
t_src_sales_tmpl_err table, yet there were many records for the same items
in the sales_data table. There is an inconsistency between the records in t_src_sales_tmpl
and the sales_data table.
Executed the following sqls:
For Item HHM-1:
SELECT *
FROM dmtra_template.sales_data
WHERE item_id IN(627,666);
SELECT * FROM dmtra_template.t_src_sales_tmpl where dm_item_code ='HHM-1'
order by sales_date desc;
For Item HSE-1:
SELECT *
FROM dmtra_template.sales_data
WHERE item_id IN(654, 671, 727, 823);
SELECT * FROM dmtra_template.t_src_sales_tmpl where dm_item_code ='HSE-1'
order by sales_date desc;
Once the above data is available, you should be able to determine what is missing.
In this case the customer was missing the location. Location is one of the major keys.
next
================================================
Demand Class
------------
The customer noted that the load was successful; however, there were sales orders
missing.
The collections code brings in demand classes from oe_order_lines_all,
but the (master table) lookup for demand classes is missing some of them.
For example, the demand class code '1-WLKLE' is available on the order lines, but it
is not available in the lookup 'DEMAND_CLASS'. The following query returns zero
rows:
select * from apps.fnd_common_lookups where lookup_type= 'DEMAND_CLASS'
and lookup_code='1-WLKLE';
As a temporary workaround, you could modify the shipment history query so that it does not
bring in the demand class code when the demand class is missing from the lookup 'DEMAND_CLASS'.
After that, run the Shipment and Booking History collection followed by
ep_load. Verify the actual_quantity for one particular item both in the
staging table t_src_sales_tmpl and in the internal table sales_data.
The issue is due to missing demand classes in the lookup. If the demand
classes available in the lookup 'DEMAND_CLASS' and on the order lines are in sync
with each other, then the sales history quantities should be loaded correctly
into sales_data.
You must check why the demand classes are missing from the lookup
DEMAND_CLASS but present on order lines. These need to be in sync. A query to list the
mismatches is sketched below.
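A sketch of such a check, assuming the standard DEMAND_CLASS_CODE column on the order lines
(run as the APPS user on the source instance):
select distinct l.demand_class_code
from apps.oe_order_lines_all l
where l.demand_class_code is not null
  and not exists
      (select 1
         from apps.fnd_common_lookups f
        where f.lookup_type = 'DEMAND_CLASS'
          and f.lookup_code = l.demand_class_code);
-- Any codes returned exist on order lines but are missing from the DEMAND_CLASS lookup.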
next
================================================
Working with Oracle Support
---------------------------
- When there is an issue successfully collecting from EBS to Staging Tables.
1. Verify if the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host URL' are set
to the correct values. Also make sure demantra URL is up and accessible.
2. Set the profile 'MSD_DEM: Debug Mode' to 'Yes'. Launch collection with
'Launch Download' set to 'Yes'.
3. Upload log as well as output files for the 'Launch EP LOAD' stage.
4. Also provide values for the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host URL'.
5. Supply the logs found under the Demand Planner\Scheduler\bin folder.
next
================================================
Version Control
- Downgrading is not supported. The schema upgrade mechanism only knows how to
rev forward, not backward.
next
================================================
EBS Data Load
Please note that all successfully loaded rows are loaded into the MSD_DP_SCN_ENTRIES_DENORM
table. The MSD_DP_SCENARIO_ENTRIES table contains only the records which errored out.
Errored records of a previous load will be deleted in the next load.
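For example, to see how many records errored out for each scenario in the current load
(a sketch mirroring the size-check queries earlier; substitute your demand_plan_id):
select count(*), scenario_id
from msd.msd_dp_scenario_entries
where demand_plan_id = <plan id>
group by scenario_id;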
To produce acceptable debug logs:
1. Verify if the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host URL' are set
to the correct values. Also make sure demantra URL is up and accessible.
2. Set the profile 'MSD_DEM: Debug Mode' to 'Yes'. Launch collection with
'Launch Download' set to 'Yes'.
3. Upload log as well as output files for the 'Launch EP LOAD' stage.
4. Also provide values for the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host
URL'.
5. Upload the logs.
next
================================================
Error Check
-----------
The following is a list of tables to review should an error occur.
We would suggest that these tables be empty before the load and
that you produce an automated script to count any new rows.
Provide the Following to Oracle Support:
- DB_EXCEPTION_LOG table
select * from db_exception_log order by err_date;
- select * from version_details_history order by upgrade_date desc;
- select object_name, object_type from user_objects where upper(status) = 'INVALID';
- select count(*) from dmtra_template.t_src_item_tmpl;
- select count(*) from dmtra_template.t_src_item_tmpl_err;
- select count(*) from dmtra_template.t_src_loc_tmpl;
- select count(*) from dmtra_template.t_src_loc_tmpl_err;
- select count(*) from dmtra_template.t_src_sales_tmpl;
- select count(*) from dmtra_template.t_src_sales_tmpl_err;
- What was the 'Launch Download' Parameter set to?
- Please provide the Log file for the respective 'request set' collection program.
- select * from t_ep_item;
- select * from t_ep_organization;
- select * from t_ep_site;
After running ASCP -> Demand Management System Administrator -> Collect Shipment
and Booking History - Flat File, and the concurrent program has completed successfully,
check the following Demantra tables:
- select * from t_src_item_tmpl
- select * from t_src_item_tmpl_err
- select * from t_src_loc_tmpl
- select * from t_src_loc_tmpl_err
- select * from t_src_sales_tmpl
- select * from t_src_sales_tmpl_err
You may also consider verifying the contents of the following:
- biio_scenario_resources / _err
- integ_inf_var_cost_err
- biio_SUPPLY_PLANS_POP / _err
- biio_supply_plans / _err
- biio_other_plan_data / _err
- BIIO_PURGE_PLAN
next
================================================
Price List Collection Verification
----------------------------------
1. Select * from MSD_DEM_PRL_FROM_SOURCE_V;
(note: This query should be run on SOURCE instance only)
2. Select * from msd_dem_entities_inuse where ebs_entity = 'PRL';
Setup issues:
1. You must select the "Planning Method" & "Forecast Control" for the Master Org.
2. The customer had built a new data model, which deleted all the seeded
display units in Demantra. Because of this, price lists could not be
loaded into Demantra. You would have to recreate new display units like the
original seeded units.
Then run the price list collections.
Additional SQL to dig into the Price List
-----------------------------------------
1. select display_units ,display_units_id ,data_table ,data_field from
DEM.DISPLAY_UNITS
where display_units_id in
(select distinct display_units_id from DEM.DISPLAY_UNITS
minus
select distinct display_units_id
from DEM.DCM_PRODUCTS_UNITS )
and display_units like '%EBSPRICELIST%' and rownum < 2;
2. select distinct display_units_id
from DEM.DISPLAY_UNITS
minus
select distinct display_units_id
from DEM.DCM_PRODUCTS_UNITS;
3. select * from DEM.DISPLAY_UNITS;
4. select * from DEM.DCM_PRODUCTS_UNITS;
5. select distinct price_list_name from MSD_DEM_PRL_FROM_SOURCE_V;
Price List at the Source Instance
--------------------------------
Execute the following SQL to investigate at the source:
1. select count(*) from MSD_DEM_PRL_FROM_SOURCE_V ;
2. select distinct price_list_name from MSD_DEM_PRL_FROM_SOURCE_V ;
3. select text from all_views where view_name like
'MSD_DEM_PRL_FROM_SOURCE_V' ;
next
================================================
Collection Success/Error Investigation
--------------------------------------
At the Destination, check the sales dates:
select min(sales_date), max(sales_date) from dmtra_template.sales_data
where actual_quantity > 0;
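Extending the same idea, a per-item breakdown can help isolate which items are missing history
(a sketch that uses only the sales_data columns already referenced above):
select item_id, min(sales_date), max(sales_date), sum(actual_quantity)
from dmtra_template.sales_data
where actual_quantity > 0
group by item_id
order by item_id;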
next
================================================
Source Views of Interest
------------------------
The view 'MSD_DEM_DEPENDENT_DEMAND_V' fetches date values from ASCP
tables, and those dates include a time component as well.
- The procedure MSD_DEM_SOP.PUSH_TIME_DATA inserts dates corresponding to
time buckets into MSD_DEM_DATES table before supply plan data is
downloaded.
Other important objects:
- MSD_DEM_DEPENDENT_DEMAND_V
- MSD_DEM_DATES
- MSD_DEM_ENTITIES_INUSE
- MSD_DEM_ENTITY_QUERIES
- MSD_DEM_GROUP_TABLES
- MSD_DEM_ITEMS_GTT
- MSD_DEM_CONSTRAINED_FORECAST_V
- MSD_DEM_LOCATIONS_GTT
- MSD_DEM_NEW_ITEMS
- MSD_DEM_PRICE_LISTS
Problem: Legacy Collections for Shipment and Booking History errored out at the Collect
Level Type stage with the following error in the log:
* Workaround: Run ERP collections for future dates, which will wipe out
the sales staging tables. Then run the legacy collections with proper
values for demand class and sales channel (or your desired data group).
next
==================================
EBS to Demantra collection with download = YES does not insert data into the Demantra
base tables. Data moves into the Demantra staging tables, but not into the base tables. There is no
error in the log file.
Steps followed:
1. Create new item and sales order
2. Ran the standard collections and EP_LOAD (with download = Yes) The
concurrent programs have been completed successfully with no errors.
3. The item and sales data have been inserted into the Demantra staging tables.
4. Ran the workflow "EBS Full Download" manually, then the data was moved into
the Demantra base tables.
5. However, we could not see the data in the worksheet. We expected EP_LOAD to put
the data into the Demantra base tables.
Solution Explanation
--------------------
AppServerURL
1. The first check to make is the proper setting of the parameter 'AppServerURL' in Demantra.
This can be verified from the Business Modeler or from the back end. Use the following
query to get the value of this parameter from the Demantra schema:
- You have to start the application server before running the Business Modeler wizard.
- In most cases, if the application server is up, the problem is with the application server URL.
select pval from sys_params where pname like 'AppServerURL';
This should be set to 'http://dskhyd707878.yourcompany.com:80/demantra', or
wherever your Demantra server is running.
2. Check the profiles 'MSD_DEM_SCHEMA' and 'MSD_DEM_HOST_URL'. Are they set properly?
* For more information regarding MSD_DEM_HOST_URL, see note 431301.1
-------------------------------------------------------------
Profile Name - Value
-------------------------------------------------------------
Profile MSD_DEM_CATEGORY_SET_NAME -
Profile MSD_DEM_CONVERSION_TYPE -
Profile MSD_DEM_CURRENCY_CODE -
Profile MSD_DEM_MASTER_ORG - 204
Profile MSD_DEM_CUSTOMER_ATTRIBUTE - NONE
Profile MSD_DEM_TWO_LEVEL_PLANNING - 2
Profile MSD_DEM_SCHEMA - MSDEM
-------------------------------------------------------------
* Please make sure that the profiles MSD_DEM_CONVERSION_TYPE and MSD_DEM_MASTER_ORG
are set on the Source instance, and that the MSD_DEM_CURRENCY_CODE and MSD_DEM_SCHEMA profiles
are set on the Planning Server.
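A sketch of a back-end check of these profile values on the EBS side (assuming the standard
FND_PROFILE API; run without an applications context it returns the site-level values only):
select fnd_profile.value('MSD_DEM_SCHEMA') as dem_schema,
       fnd_profile.value('MSD_DEM_HOST_URL') as dem_host_url,
       fnd_profile.value('MSD_DEM_CONVERSION_TYPE') as dem_conversion_type,
       fnd_profile.value('MSD_DEM_MASTER_ORG') as dem_master_org
from dual;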
next
==================================
After running "Shipment and booking history" request from EBS application, only
some partial data gets loaded in demantra TMPL tables.
Here is my log:
+---------------------------------------------------------------------------+
Demand Planning: Version : 12.0.0
Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
MSDDEMARD module: Launch EP LOAD
+---------------------------------------------------------------------------+
Current system time is 22-AUG-2008 18:19:53
+---------------------------------------------------------------------------+
**Starts**22-AUG-2008 18:19:53
**Ends**22-AUG-2008 18:24:38
ORA-29273: HTTP request failed
ORA-06512: at "SYS.UTL_HTTP", line 1577
ORA-12535: TNS:operation timed out
+---------------------------------------------------------------------------+
Start of log messages from FND_FILE
+---------------------------------------------------------------------------+
Exception: msd_dem_collect_history_data.run_load - 22-AUG-2008 18:24:38
ORA-29273: HTTP request failed
ORA-06512: at "SYS.UTL_HTTP", line 1577
ORA-12535: TNS:operation timed out
+---------------------------------------------------------------------------+
End of log messages from FND_FILE
+---------------------------------------------------------------------------+
+---------------------------------------------------------------------------+
Executing request completion options...
Finished executing request completion options.
+---------------------------------------------------------------------------+
file:///D|/Oracle/MFG/FY09Q1Q2_Projects/Support_Commu...ebcasts/2009_0506_Jeff_Goulette/WC-Staging-Tables.txt (21 of 24)5/6/2009 8:53:01 AM
file:///D|/Oracle/MFG/FY09Q1Q2_Projects/Support_Communities/Advisor_Webcasts/2009_0506_Jeff_Goulette/WC-Staging-Tables.txt
Exceptions posted by this request:
Concurrent Request for "Launch EP LOAD" has completed with error.
+---------------------------------------------------------------------------+
Concurrent request completed
Current system time is 22-AUG-2008 18:24:40
+---------------------------------------------------------------------------+
QUESTION
========
After running "Shipment and booking history" request from EBS application, only
some partial data gets loaded in demantra TMPL tables.
ORA-29273: HTTP request failed
ORA-06512: at "SYS.UTL_HTTP", line 1577
ORA-12535: TNS:operation timed out
1. The issue might be because the parameter 'AppServerURL' isn't set properly.
Please run the following query from the Demantra schema:
select pname, pval from sys_params where pname like 'AppServerURL';
2. Please check the AppServerURL in the Business Modeler.
3. Either reconfigure CONNECT_TIMEOUT to be 0, which means wait indefinitely, or
reconfigure CONNECT_TIMEOUT to be some higher value. Or, if the timeout is
unacceptably long, turn on tracing for further information.
next
================================================
Source Data Manipulation, Order Management booked_date.
There was data missing in the load because the booked_date was null.
Here is what we did to discover/fix the problem.
1. Delete t_src_loc_tmpl
2. Execute request set "Standard Collection".
3. Execute request set "Shipment and Booking History".
4. Review the process log from EBS.
5. Review the collaborator.log.
6. Log in to Workflow in the Demantra environment.
7. Check that the ep_load procedure exists in the schema.
The tables below do not contain any rows:
T_SRC_SALES_TMPL
T_SRC_ITEM_TMPL
T_SRC_LOC_TMPL
The error tables do not contain any rows either:
T_SRC_SALES_TMPL_ERR
T_SRC_ITEM_TMPL_ERR
T_SRC_LOC_TMPL_ERR
Check the following:
1. The source for all the Booking History series - oe_order_headers_all and
oe_order_lines_all - has data as expected.
2. The source for all the Shipment History series - oe_order_headers_all and
oe_order_lines_all - has the data expected.
3. The table oe_order_headers_all has header_id populated.
4. The table oe_order_lines_all has lines and the ordered_item column is populated.
5. The table oe_order_headers_all does not have the booked_date populated:
SELECT booked_date
FROM oe_order_headers_all;
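To quantify the problem, count the booked orders that have no booked_date
(a sketch; BOOKED_FLAG = 'Y' marks booked orders in Order Management):
select count(*)
from apps.oe_order_headers_all
where booked_flag = 'Y'
  and booked_date is null;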
To implement the solution, please execute the following steps:
1. Make a backup copy of the table oe_order_headers_all.
2. Populate the booked_date column with the correct values in the table
oe_order_headers_all.
3. Execute request set "Standard Collection".
4. Execute request set "Shipment and Booking History".
5. Check the T_SRC_* tables in Demantra to confirm the data was collected.
6. Migrate the solution as appropriate to other environments.
Also:
Do you have a shipped date populated for new combinations?
Series data must be populated into the Demantra staging tables before the load
process begins. To ensure this occurs, the collection programs for all eight
historical series have been merged:
• Booking History – booked items – booked date
• Booking History – requested items – booked date
• Booking History – booked items – requested date
next
================================================
Data not loading because of worksheet date settings?
Regarding the issue of data downloaded into Demantra not showing up in a worksheet: this is
because of the date range specified in the worksheet settings.
CONFIGURE LOADING TEXT FILES DOES NOT LOAD DATA FOR MORE THAN 2000
COLUMN WIDTH.
During a data load from a text file, if the total width of the columns is more than 2000,
the following error is raised: "ORA-12899: value too large for column
"DEMANTRA"."DM_WIZ_IMPORT_FILE_DEF"."SRC_CTL_SYNTAX" (actual: 3037,
maximum: 2000)".
* This is a limitation of the functionality. The change is covered in enhancement request 6879562.
next
================================================
SQL*Loader Technical Health Check
1. Make sure that the SQLLDR.EXE utility exists under the Oracle client bin
directory and that the system path points there. Verify by opening a CMD
window and executing SQLLDR.EXE; if it is not found, add it to the system path and restart.
2. Otherwise, please provide the full engine log from that run, and also provide
all the *.log/*.bad/*.txt files that should be under the engine's bin
directory.
< END OF DOC >
file:///D|/Oracle/MFG/FY09Q1Q2_Projects/Support_Commu...ebcasts/2009_0506_Jeff_Goulette/WC-Staging-Tables.txt (24 of 24)5/6/2009 8:53:01 AM

More Related Content

What's hot

Oracle Fixed assets ivas
Oracle Fixed assets ivasOracle Fixed assets ivas
Oracle Fixed assets ivasAli Ibrahim
 
1.overview of advanced pricing
1.overview of advanced pricing1.overview of advanced pricing
1.overview of advanced pricingNazmul Alam
 
Oracle eBS Overview.pptx
Oracle eBS Overview.pptxOracle eBS Overview.pptx
Oracle eBS Overview.pptxssuser9dce1e1
 
Oracle Ebiz R12.2 Features -- Ravi Sagaram
Oracle Ebiz R12.2 Features -- Ravi SagaramOracle Ebiz R12.2 Features -- Ravi Sagaram
Oracle Ebiz R12.2 Features -- Ravi Sagaramravisagaram
 
Oracle Form material
Oracle Form materialOracle Form material
Oracle Form materialRajesh Ch
 
Domains in IBM Maximo Asset Management
Domains in IBM Maximo Asset ManagementDomains in IBM Maximo Asset Management
Domains in IBM Maximo Asset ManagementRobert Zientara
 
Oracle Ebs Enterprise Asset Management.docx
Oracle Ebs Enterprise Asset Management.docxOracle Ebs Enterprise Asset Management.docx
Oracle Ebs Enterprise Asset Management.docxMina Lotfy
 
oracle enterprise asset management ppt
oracle enterprise asset management pptoracle enterprise asset management ppt
oracle enterprise asset management pptEjaz Hussain
 
Introduction to lightning Web Component
Introduction to lightning Web ComponentIntroduction to lightning Web Component
Introduction to lightning Web ComponentMohith Shrivastava
 
EBS-OPM Costing.docx
EBS-OPM Costing.docxEBS-OPM Costing.docx
EBS-OPM Costing.docxMina Lotfy
 
Case Study: Salesforce CPQ (Configure Price Quote) for Software as a Service ...
Case Study: Salesforce CPQ (Configure Price Quote) for Software as a Service ...Case Study: Salesforce CPQ (Configure Price Quote) for Software as a Service ...
Case Study: Salesforce CPQ (Configure Price Quote) for Software as a Service ...Jade Global
 
Oracle R12 Apps – SCM Functional Interview Questions & Answers – Purchasing M...
Oracle R12 Apps – SCM Functional Interview Questions & Answers – Purchasing M...Oracle R12 Apps – SCM Functional Interview Questions & Answers – Purchasing M...
Oracle R12 Apps – SCM Functional Interview Questions & Answers – Purchasing M...Boopathy CS
 

What's hot (20)

Opm costing
Opm costingOpm costing
Opm costing
 
Oracle Fixed assets ivas
Oracle Fixed assets ivasOracle Fixed assets ivas
Oracle Fixed assets ivas
 
Understanding Physical Inventory
Understanding Physical InventoryUnderstanding Physical Inventory
Understanding Physical Inventory
 
1.overview of advanced pricing
1.overview of advanced pricing1.overview of advanced pricing
1.overview of advanced pricing
 
Oracle eBS Overview.pptx
Oracle eBS Overview.pptxOracle eBS Overview.pptx
Oracle eBS Overview.pptx
 
Oracle R12 Purchasing setup
Oracle R12 Purchasing setupOracle R12 Purchasing setup
Oracle R12 Purchasing setup
 
Oracle eAM Overview And Integration With E-Business Suite
Oracle eAM Overview And Integration With E-Business SuiteOracle eAM Overview And Integration With E-Business Suite
Oracle eAM Overview And Integration With E-Business Suite
 
Oracle Ebiz R12.2 Features -- Ravi Sagaram
Oracle Ebiz R12.2 Features -- Ravi SagaramOracle Ebiz R12.2 Features -- Ravi Sagaram
Oracle Ebiz R12.2 Features -- Ravi Sagaram
 
Oracle Form material
Oracle Form materialOracle Form material
Oracle Form material
 
Domains in IBM Maximo Asset Management
Domains in IBM Maximo Asset ManagementDomains in IBM Maximo Asset Management
Domains in IBM Maximo Asset Management
 
oracle order management
oracle order managementoracle order management
oracle order management
 
Oracle Assets Period Closing
Oracle Assets Period ClosingOracle Assets Period Closing
Oracle Assets Period Closing
 
Oracle Ebs Enterprise Asset Management.docx
Oracle Ebs Enterprise Asset Management.docxOracle Ebs Enterprise Asset Management.docx
Oracle Ebs Enterprise Asset Management.docx
 
oracle enterprise asset management ppt
oracle enterprise asset management pptoracle enterprise asset management ppt
oracle enterprise asset management ppt
 
Introduction to lightning Web Component
Introduction to lightning Web ComponentIntroduction to lightning Web Component
Introduction to lightning Web Component
 
EBS-OPM Costing.docx
EBS-OPM Costing.docxEBS-OPM Costing.docx
EBS-OPM Costing.docx
 
Case Study: Salesforce CPQ (Configure Price Quote) for Software as a Service ...
Case Study: Salesforce CPQ (Configure Price Quote) for Software as a Service ...Case Study: Salesforce CPQ (Configure Price Quote) for Software as a Service ...
Case Study: Salesforce CPQ (Configure Price Quote) for Software as a Service ...
 
R12 Oracle Inventory Management, New Features
R12 Oracle Inventory Management, New FeaturesR12 Oracle Inventory Management, New Features
R12 Oracle Inventory Management, New Features
 
Oracle R12 Apps – SCM Functional Interview Questions & Answers – Purchasing M...
Oracle R12 Apps – SCM Functional Interview Questions & Answers – Purchasing M...Oracle R12 Apps – SCM Functional Interview Questions & Answers – Purchasing M...
Oracle R12 Apps – SCM Functional Interview Questions & Answers – Purchasing M...
 
WMS Overview
WMS OverviewWMS Overview
WMS Overview
 

Similar to DemantraSolutions.pdf

Migrating to Database 12c Multitenant - New Opportunities To Get It Right!
Migrating to Database 12c Multitenant - New Opportunities To Get It Right!Migrating to Database 12c Multitenant - New Opportunities To Get It Right!
Migrating to Database 12c Multitenant - New Opportunities To Get It Right!Performance Tuning Corporation
 
Sql server 2008_replication_technical_case_study
Sql server 2008_replication_technical_case_studySql server 2008_replication_technical_case_study
Sql server 2008_replication_technical_case_studyKlaudiia Jacome
 
"It can always get worse!" – Lessons Learned in over 20 years working with Or...
"It can always get worse!" – Lessons Learned in over 20 years working with Or..."It can always get worse!" – Lessons Learned in over 20 years working with Or...
"It can always get worse!" – Lessons Learned in over 20 years working with Or...Markus Michalewicz
 
RivieraJUG - MySQL Indexes and Histograms
RivieraJUG - MySQL Indexes and HistogramsRivieraJUG - MySQL Indexes and Histograms
RivieraJUG - MySQL Indexes and HistogramsFrederic Descamps
 
TECHNICAL WHITE PAPER▶Symantec Backup Exec 2014 Blueprints - Optimized Duplic...
TECHNICAL WHITE PAPER▶Symantec Backup Exec 2014 Blueprints - Optimized Duplic...TECHNICAL WHITE PAPER▶Symantec Backup Exec 2014 Blueprints - Optimized Duplic...
TECHNICAL WHITE PAPER▶Symantec Backup Exec 2014 Blueprints - Optimized Duplic...Symantec
 
Intro to Neo4j Ops Manager (NOM)
Intro to Neo4j Ops Manager (NOM)Intro to Neo4j Ops Manager (NOM)
Intro to Neo4j Ops Manager (NOM)Neo4j
 
O365con14 - migrating your e-mail to the cloud
O365con14 - migrating your e-mail to the cloudO365con14 - migrating your e-mail to the cloud
O365con14 - migrating your e-mail to the cloudNCCOMMS
 
[EN] Building modern data pipeline with Snowflake + DBT + Airflow.pdf
[EN] Building modern data pipeline with Snowflake + DBT + Airflow.pdf[EN] Building modern data pipeline with Snowflake + DBT + Airflow.pdf
[EN] Building modern data pipeline with Snowflake + DBT + Airflow.pdfChris Hoyean Song
 
Data Con LA 2022 - Supercharge your Snowflake Data Cloud from a Snowflake Dat...
Data Con LA 2022 - Supercharge your Snowflake Data Cloud from a Snowflake Dat...Data Con LA 2022 - Supercharge your Snowflake Data Cloud from a Snowflake Dat...
Data Con LA 2022 - Supercharge your Snowflake Data Cloud from a Snowflake Dat...Data Con LA
 
Oracle Exadata 1Z0-485 Certification
Oracle Exadata 1Z0-485 CertificationOracle Exadata 1Z0-485 Certification
Oracle Exadata 1Z0-485 CertificationExadatadba
 
Open Source 101 2022 - MySQL Indexes and Histograms
Open Source 101 2022 - MySQL Indexes and HistogramsOpen Source 101 2022 - MySQL Indexes and Histograms
Open Source 101 2022 - MySQL Indexes and HistogramsFrederic Descamps
 
Top Ten Siemens S7 Tips and Tricks
Top Ten Siemens S7 Tips and TricksTop Ten Siemens S7 Tips and Tricks
Top Ten Siemens S7 Tips and TricksDMC, Inc.
 
Automated product categorization
Automated product categorizationAutomated product categorization
Automated product categorizationAndreas Loupasakis
 
Automated product categorization
Automated product categorization   Automated product categorization
Automated product categorization Warply
 
DMKit_2.0_README_1
DMKit_2.0_README_1DMKit_2.0_README_1
DMKit_2.0_README_1ibtesting
 
Oracle Enterprise Manager 12c - OEM12c Presentation
Oracle Enterprise Manager 12c - OEM12c PresentationOracle Enterprise Manager 12c - OEM12c Presentation
Oracle Enterprise Manager 12c - OEM12c PresentationFrancisco Alvarez
 

Similar to DemantraSolutions.pdf (20)

DAC
DACDAC
DAC
 
Migrating to Database 12c Multitenant - New Opportunities To Get It Right!
Migrating to Database 12c Multitenant - New Opportunities To Get It Right!Migrating to Database 12c Multitenant - New Opportunities To Get It Right!
Migrating to Database 12c Multitenant - New Opportunities To Get It Right!
 
Sql server 2008_replication_technical_case_study
Sql server 2008_replication_technical_case_studySql server 2008_replication_technical_case_study
Sql server 2008_replication_technical_case_study
 
"It can always get worse!" – Lessons Learned in over 20 years working with Or...
"It can always get worse!" – Lessons Learned in over 20 years working with Or..."It can always get worse!" – Lessons Learned in over 20 years working with Or...
"It can always get worse!" – Lessons Learned in over 20 years working with Or...
 
IUG ATL PC 9.5
IUG ATL PC 9.5IUG ATL PC 9.5
IUG ATL PC 9.5
 
RivieraJUG - MySQL Indexes and Histograms
RivieraJUG - MySQL Indexes and HistogramsRivieraJUG - MySQL Indexes and Histograms
RivieraJUG - MySQL Indexes and Histograms
 
TECHNICAL WHITE PAPER▶Symantec Backup Exec 2014 Blueprints - Optimized Duplic...
TECHNICAL WHITE PAPER▶Symantec Backup Exec 2014 Blueprints - Optimized Duplic...TECHNICAL WHITE PAPER▶Symantec Backup Exec 2014 Blueprints - Optimized Duplic...
TECHNICAL WHITE PAPER▶Symantec Backup Exec 2014 Blueprints - Optimized Duplic...
 
Intro to Neo4j Ops Manager (NOM)
Intro to Neo4j Ops Manager (NOM)Intro to Neo4j Ops Manager (NOM)
Intro to Neo4j Ops Manager (NOM)
 
O365con14 - migrating your e-mail to the cloud
O365con14 - migrating your e-mail to the cloudO365con14 - migrating your e-mail to the cloud
O365con14 - migrating your e-mail to the cloud
 
[EN] Building modern data pipeline with Snowflake + DBT + Airflow.pdf
[EN] Building modern data pipeline with Snowflake + DBT + Airflow.pdf[EN] Building modern data pipeline with Snowflake + DBT + Airflow.pdf
[EN] Building modern data pipeline with Snowflake + DBT + Airflow.pdf
 
Data Con LA 2022 - Supercharge your Snowflake Data Cloud from a Snowflake Dat...
Data Con LA 2022 - Supercharge your Snowflake Data Cloud from a Snowflake Dat...Data Con LA 2022 - Supercharge your Snowflake Data Cloud from a Snowflake Dat...
Data Con LA 2022 - Supercharge your Snowflake Data Cloud from a Snowflake Dat...
 
D.J_Resume 20-01-16
D.J_Resume 20-01-16D.J_Resume 20-01-16
D.J_Resume 20-01-16
 
Oracle Exadata 1Z0-485 Certification
Oracle Exadata 1Z0-485 CertificationOracle Exadata 1Z0-485 Certification
Oracle Exadata 1Z0-485 Certification
 
Open Source 101 2022 - MySQL Indexes and Histograms
Open Source 101 2022 - MySQL Indexes and HistogramsOpen Source 101 2022 - MySQL Indexes and Histograms
Open Source 101 2022 - MySQL Indexes and Histograms
 
Top Ten Siemens S7 Tips and Tricks
Top Ten Siemens S7 Tips and TricksTop Ten Siemens S7 Tips and Tricks
Top Ten Siemens S7 Tips and Tricks
 
Automated product categorization
Automated product categorizationAutomated product categorization
Automated product categorization
 
Automated product categorization
Automated product categorization   Automated product categorization
Automated product categorization
 
DMKit_2.0_README_1
DMKit_2.0_README_1DMKit_2.0_README_1
DMKit_2.0_README_1
 
Oracle Enterprise Manager 12c - OEM12c Presentation
Oracle Enterprise Manager 12c - OEM12c PresentationOracle Enterprise Manager 12c - OEM12c Presentation
Oracle Enterprise Manager 12c - OEM12c Presentation
 
gn1595_10Mar2016
gn1595_10Mar2016gn1595_10Mar2016
gn1595_10Mar2016
 

Recently uploaded

Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Patryk Bandurski
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxOnBoard
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machinePadma Pradeep
 
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks..."LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...Fwdays
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersThousandEyes
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsMark Billinghurst
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024BookNet Canada
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Allon Mureinik
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationRidwan Fadjar
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Mattias Andersson
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):comworks
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationSlibray Presentation
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupFlorian Wilhelm
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesSinan KOZAK
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticscarlostorres15106
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubKalema Edgar
 
APIForce Zurich 5 April Automation LPDG
APIForce Zurich 5 April  Automation LPDGAPIForce Zurich 5 April  Automation LPDG
APIForce Zurich 5 April Automation LPDGMarianaLemus7
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxnull - The Open Security Community
 

Recently uploaded (20)

Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
Integration and Automation in Practice: CI/CD in Mule Integration and Automat...
 
Maximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptxMaximizing Board Effectiveness 2024 Webinar.pptx
Maximizing Board Effectiveness 2024 Webinar.pptx
 
Install Stable Diffusion in windows machine
Install Stable Diffusion in windows machineInstall Stable Diffusion in windows machine
Install Stable Diffusion in windows machine
 
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks..."LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
"LLMs for Python Engineers: Advanced Data Analysis and Semantic Kernel",Oleks...
 
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for PartnersEnhancing Worker Digital Experience: A Hands-on Workshop for Partners
Enhancing Worker Digital Experience: A Hands-on Workshop for Partners
 
Human Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR SystemsHuman Factors of XR: Using Human Factors to Design XR Systems
Human Factors of XR: Using Human Factors to Design XR Systems
 
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
New from BookNet Canada for 2024: BNC BiblioShare - Tech Forum 2024
 
Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)Injustice - Developers Among Us (SciFiDevCon 2024)
Injustice - Developers Among Us (SciFiDevCon 2024)
 
My Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 PresentationMy Hashitalk Indonesia April 2024 Presentation
My Hashitalk Indonesia April 2024 Presentation
 
DMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special EditionDMCC Future of Trade Web3 - Special Edition
DMCC Future of Trade Web3 - Special Edition
 
Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?Are Multi-Cloud and Serverless Good or Bad?
Are Multi-Cloud and Serverless Good or Bad?
 
CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):CloudStudio User manual (basic edition):
CloudStudio User manual (basic edition):
 
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptxE-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
E-Vehicle_Hacking_by_Parul Sharma_null_owasp.pptx
 
Connect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck PresentationConnect Wave/ connectwave Pitch Deck Presentation
Connect Wave/ connectwave Pitch Deck Presentation
 
Streamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project SetupStreamlining Python Development: A Guide to a Modern Project Setup
Streamlining Python Development: A Guide to a Modern Project Setup
 
Unblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen FramesUnblocking The Main Thread Solving ANRs and Frozen Frames
Unblocking The Main Thread Solving ANRs and Frozen Frames
 
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmaticsKotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
Kotlin Multiplatform & Compose Multiplatform - Starter kit for pragmatics
 
Unleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding ClubUnleash Your Potential - Namagunga Girls Coding Club
Unleash Your Potential - Namagunga Girls Coding Club
 
APIForce Zurich 5 April Automation LPDG
APIForce Zurich 5 April  Automation LPDGAPIForce Zurich 5 April  Automation LPDG
APIForce Zurich 5 April Automation LPDG
 
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptxMaking_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
Making_way_through_DLL_hollowing_inspite_of_CFG_by_Debjeet Banerjee.pptx
 

DemantraSolutions.pdf

  • 1. 1 © 2009 Oracle Corporation – Proprietary and Confidential
  • 2. 2 Demantra Solutions Day, Date, 2004 time p.m. ET Teleconference Access: North America: xxxx International: xxxx Password: Advisor Wednesday, May 6, 2009 Time MDT Teleconference Access: 1-866-627-3315 +1-706-679-4880 Passcode: 93272045 Future Demantra Solutions Advisor Webcasts!! Don't miss out.. Register today!! Wednesday May 6, 2009 Troubleshooting Techniques for Demantra Data Load Issues 791239.1 Wednesday June 3, 2009 The Key to Optimizing Demantra Worksheet Performance 797933.1 Wednesday July 1, 2009 TBA TBA For a listing of webcasts in other product areas, see Note 398884.1. For a recording of today’s or a previous Advisor Webcast, see Note 740297.1. © 2009 Oracle Corporation – Proprietary and Confidential
  • 3. 3 Agenda • Presentation and Demo – approximately 45 minutes • Q&A Session – approximately 15 minutes • Please hold all questions to the end of the session. • To ask a question, move your cursor to the top of the screen and select the ‘bubble’ icon next to the moderator’s name. • A dialog box will open. Enter your question and select “Send.” • During the Q&A session your question will be read and an answer will follow. © 2009 Oracle Corporation – Proprietary and Confidential
  • 4. 4 ATTENTION – AUDIO INFORMATION If you encounter any audio issues, please call InterCall (Audio Conferencing) and mute your phone. 1-888-259-4812 +1-706-679-4880 Conference ID: 93272045 We would like to encourage attendees with sufficient Internet bandwidth to listen through VoiceStreaming to continue to use this audio source. Thank you. © 2009 Oracle Corporation – Proprietary and Confidential
  • 5. 5 Safe Harbor Statement The following is intended to outline our general product direction. It is intended for information purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any material, code, or functionality, and should not be relied upon in making purchasing decision. The development, release, and timing of any features or functionality described for Oracle’s products remains at the sole discretion of Oracle. © 2009 Oracle Corporation – Proprietary and Confidential
  • 6. 6 <Insert Picture Here> Demantra Solutions Data Loading Troubleshooting Jeff Goulette
  • 7. 7 Agenda The following presentation will be delivered in three parts: 1)EBS to Demantra Staging • WC-Staging-Tables.txt 2) Demantra Data Store to Demantra • WC-EP-Load-Debug.txt 3) Miscellaneous Including Publish Forecast, Custom Hooks, etc. • WC-Hook.txt 4) Duration approximately 45 Minutes Followed by Q/A Please note that all three files are available in the Demantra Solutions Metalink Forum © 2009 Oracle Corporation – Proprietary and Confidential
  • 8. 8 Theory and Future • Typically functional issues are addressed one time, never to reoccur again. This presentation is meant to be a guide, a point of entry to discover data inconsistencies. • There are shortfalls in this presentation as the number of possible data problems are potentially significant. Version 2 of this web cast will include points made today as well as comments from the Demantra Solutions forum. • These articles are available at the Demantra Solutions forum. • WC-Staging-Tables.txt • WC-EP_LOAD_Debug.txt • WC-Hook.txt • Specific business scenarios are always welcome. • For example, ‘Can Oracle supply a getting started document that addresses 11.5.10.2 MRP and Demantra? © 2009 Oracle Corporation – Proprietary and Confidential
  • 9. 9 Refresh Snapshot Logs Snapshots MRP_AP Views MSD_DEM Objects INV BOM WIP PO OM Forecast MDS OPM Shipping Booking MSC_Staging Tables MSC ODS raw data MSC ODS raw data MSC PDS planning data MSDEM Schema 1 3 2 © 2009 Oracle Corporation – Proprietary and Confidential
  • 10. 10 © 2009 Oracle Corporation – Proprietary and Confidential • To ask a question, move your cursor to the top of the screen and select the ‘bubble’ icon next to the moderator’s name. • A dialog box will open. Enter your question and select “Send.” Questions
  • 11. 11 Visit My Oracle Communities Collaborate with a large network of your industry peers, support professionals, and Oracle experts to exchange information, ask questions & get answers. Find out how your peers are using Oracle technologies and services to better meet their support and business needs. • Exchange Knowledge • Resolve Issues • Gain Expertise Visit the Demantra Solutions Support Community now!! 1. Log into My Oracle Support. 2. Select the Community link. 3. Select the Enter Here button. 4. Select the Demantra Solutions link under the E- Business Suite section of the My Communities Menu on the left side of the window. © 2009 Oracle Corporation – Proprietary and Confidential
  • 12. 12 Feedback Please let us know how we are doing by providing the following: 1. Based on the webcast description, how did your learning experience compare to what you expected when you began the webcast? 2. Would you recommend the recording for this webcast to your colleagues? 3. What topics specifically would you like to see covered in the Demantra Advisor Webcasts? 4. Will you attend future Demantra Advisor Webcasts? Email your feedback to jamie.binkley@oracle.com or coremfg-news_us@oracle.com. © 2009 Oracle Corporation – Proprietary and Confidential
  • 13. 13 THANK YOU © 2009 Oracle Corporation – Proprietary and Confidential
  • 14. WARNING!! WARNING!! WARNING!! WARNING!! MAY CAUSE DROWSINESS!
The material presented below is of a technical nature. Little attention has been given to functional navigation or functional demonstrations. The vast majority of data load issues are investigated / solved using SQL. To that end, this presentation focuses almost exclusively on problem investigation and resolution. This material has been assembled from 100s of bugs in which DEV gave us techniques to drill into source/destination data.

Demantra Custom Hooks and Assorted Debugging Techniques

Data Loading Flow
-----------------
There are four data flows that move data in and out of Demantra:
1. Loading data from source into collection staging tables
   * These are the T_SRC_% and error tables. ep_load_main procedure.
2. Moving data from the collection staging tables into the Demantra data store
3. Loading data from the Demantra data store into the Demantra engine and downloading from Demantra back to the Demantra data store.
4. Pushing the data back to the source instance in the form of a forecast.
We will cover #3 and #4 in this presentation.

next ================================

Creating a new series through a data model upgrade from the Business Modeler, bringing in the data through custom hooks and ep_load.
Step 1 - Add a new column in the interface table t_src_sales_tmpl.
Step 2 - Add the new series through the data model wizard using the newly created column in the table t_src_sales_tmpl.
Step 3 - Upgrade the data model.
Step 4 - Put the custom query in the hook provided in package msd_dem_custom_hook.
Step 5 - Run Shipment and Booking history collection with auto download set to 'No'.
  • 15. Step 6 - Check the data in the interface table t_src_sales_tmpl.
Step 7 - Run the workflow EBS_Full_download.
Step 8 - See the data in the worksheet.
Please see: ORACLE DEMANTRA, Customizing Hierarchies
* Please request a copy from Oracle Support

next ================================

Working with Oracle Support & Development: Debugging Custom Hooks
Please set the profile MSD_DEM: Debug Mode to Yes. Then run the shipment and booking history program again and provide the following:
1. Log files and output files of all the concurrent programs launched.
2. Trace file and the tkprof output of the trace file for the DB session.
Also upload the modified custom hooks package spec and body.

next ================================

Preventing Data from Being Loaded
- You do not want the demand class item level to be imported to Demantra.
- You want to update the demand class column in the item staging table to N/A.
- To accomplish this task, you can modify the sales history custom hook package, on the EBS side, so that it will be executed during EBS collections.
- The SQL statement inserted into the custom hook program is below.
----
UPDATE MSDEM.t_src_sales_tmpl
   SET EBS_DEMAND_CLASS_SR_PK = '-777',
       EBS_DEMAND_CLASS_CODE = '0';
COMMIT;
  • 16. ----
- After running the Shipping/Booking History Collections, the SQL statement will update the T_SRC_SALES_TMPL table. Any other custom hooks in place for updating T_SRC_ITEMS_TMPL and T_SRC_LOC_TMPL will also run and update those tables.

next ================================

Discovered Bug and Customer Problem
1. Open and run worksheet zzz. CTO: My BOM view
2. Select combination date 10/06/2008 SLC:M1:Seattle Manufacturing - Computer Service:1006:Chattanooga (OPS):Vision Operations -> CN974444
3. Enter a value for "Forecast Dependent Demand Override"
4. Press worksheet update data, then rerun the worksheet
5. Rerun the worksheet and see that the old value is shown instead of the value entered.
6. Close and reopen the worksheet - "Forecast Dependent Demand Override" still shows the old value.
7. After some 5 to 10 minutes, rerun or reopen the worksheet
8. See that the new value finally shows up in "Forecast Dependent Demand Override"

Developer Explanation (fixed in 7.3)
------------------------------------
After we update some CTO GL series in a Worksheet (e.g. 'Forecast Dependent Demand Override'), the old data appears when rerunning the Worksheet, and the new data appears only if we reopen the Worksheet and then rerun.

Internal Machinations: Technical Analysis and Resolution of the Problem
-----------------------------------------------------------------------
1. In the Incremental Loading Mechanism - While preparing the SQL for the T_POPU table we don't add the LUD column of the GL Data table (which exists
  • 17. in the GL Matrix table) into the Expression in the Select part (the expression which indicates whether a combination was changed).
2. Therefore, the solution will be: add the GL Data table LUD column (e.g. t_ep_cto_data_lud) into the POPU SQL, in the Expression which checks if a combination was changed.
3. In the Combination Cache Mechanism - after a Worksheet rerun we clear the Sub Combination and Combination Maps, and then the next time we access these Maps we don't find the changed Combinations and therefore do not remove them from the Map of Loaded Combinations, so the Client won't get the changed data.
4. The solution will be: do not clear the Sub Comb & Combination Maps, but only synch them with the latest requested Sub Combs & Combinations.

next ================================

I installed the latest build on my local environment, but I still have a problem with the worksheet. When I update a base model in the "zzz. CTO: My BOM view", the "parent demand" series should change when the update hook runs. Here is what I did:
1) Open the worksheet "zzz. CTO: My BOM view"
2) On the page level browser, click on APAC, then APAC Site 1, and then D530
3) For the row item 10/27/2008 | D530 | D530, put the number 22 in the "Base Override" Series. Press update.
4) Wait 10 seconds and hit refresh. Notice that the "parent demand" series has not changed. Several values in this series should have the number 22 as a value.
5) In SQL Developer or another SQL tool, run the following query:
   select * from t_ep_cto_data where cto_parent_dem_final=22
   You should see several records that have the value 22 for the parent
  • 18. demand series column. Note that the last_update_date column on t_ep_cto_data has been modified.
6) Close the worksheet, then reopen it. Only now will the value 22 appear in the "parent demand" series.

Customer Solution
-----------------
- The problem here is in the Update Hook code. Upon examination of the CALC_WORKSHEET procedure I noticed that you update the 'LAST_UPDATE_DATE' column in the T_EP_CTO_DATA table.
- While running the worksheet we do not check the 'LAST_UPDATE_DATE' column in the T_EP_CTO_DATA table but the 'LUD' columns in the T_EP_CTO_MATRIX table.
- This is according to a new feature named 'GL_MATRIX Support' which has been added to version 7.3.
- In this case the column 'T_EP_CTO_DATA_LUD' in the T_EP_CTO_MATRIX table should be updated also.
- In each GL Matrix table, we have the LUD columns of all the Levels which belong to the current GL (General Level):
  * Plus the LAST_UPDATE_DATE column of this table
  * Plus the LUD column of the GL DATA table (T_EP_CTO_DATA).
After adjusting the update, the customer worksheet displayed correct results.

next ================================================

Object MSD_DEM_QUERIES
* I have little information regarding the table that stores the dynamic SQL used for Demantra. If there is interest, we can speak with DEV and develop a white paper geared to diagnostics.

next ================================================
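If you want a first look at that object yourself, a minimal, hedged sketch is below. It only assumes that MSD_DEM_QUERIES is reachable from the APPS/MSD schema (the schema prefix is an assumption; the object name comes from the note above) and makes no assumption about its column layout.

-- Hedged sketch: browse a few rows of the dynamic-query table.
-- The msd schema prefix is an assumption; adjust to your environment.
SELECT *
  FROM msd.msd_dem_queries
 WHERE ROWNUM <= 20;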
  • 19. For Data Collection, Know Your Instances
I have seen numerous errors that are related to the awareness of which instance is being collected.
select * from apps.msc_apps_instances;
Check the t_src_loc_tmpl table (or another staging table) to determine which instances and organizations are being collected.

next ================================

Publishing Forecast Results
There have been a few issues reported concerning publishing the Demantra forecast to the source. Here are a few pointers:
The workflow has completed without error, yet there are no rows in the table MSD_DP_SCN_ENTRIES_DENORM. There are a number of possibilities (see the sketch after this checklist):
1) Does BIEO_Local_Fcst have data?
2) Validate the org id against MSC_APPS_INSTANCES.
   SQL> select id from transfer_query where query_name = 'Local Forecast';
   -- 353
   SQL> select * from transfer_query_levels where id = <id from 1st script>;
3) Verify the workflow status.

next ================================
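For items 1 and 2 in the checklist above, a minimal hedged sketch is below. It assumes only the object names already mentioned in this presentation (BIEO_Local_Fcst on the Demantra side, MSC_APPS_INSTANCES on the EBS side); adjust schema prefixes to your environment.

-- 1) Does the local forecast export table have data?
SELECT COUNT(*) FROM bieo_local_fcst;

-- 2) Which instances does the destination expect?
--    Compare the instance codes here with the org codes seen in the staging tables.
SELECT instance_id, instance_code
  FROM apps.msc_apps_instances;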
  • 20. SNO Integration Note
We support the following demand levels in SNO integration:
1. Item - Org
2. Item - Customer - Customer Site
3. Item - Zone
When we publish the constrained forecast from SNO, we publish all 3 types. ASCP handles these 3 types of constrained forecasts and plans successfully.

next ================================

Working with Oracle Support
Please set the profile MSD_DEM: Debug Mode to Yes. Then run the shipment and booking history program again and provide the following:
1. Log files and output files of all the concurrent programs launched.
2. Trace file and the tkprof output of the trace file for the DB session.

next ================================================

R12 and Demantra 7.2.0.2 Install Procedure
------------------------------------------
Currently we have EBS-Demantra Integration Installation Overview and Diagram, Note 434991.1.
- This note addresses 11.5.10.2 and Demantra 7.1.1.
- We are assembling a note that will cover install procedures for R12 and Demantra 7.2.0.2.
- Metalink delivery date: 30-May-2009.

next ================================================

Wrap Up
  • 21. -------
Please reference the following notes available on Metalink, or email jeffery.goulette@oracle.com if the article is being moderated. These are available in a .zip file.
** Note.789477.1 Manipulating Orgs For The Demantra Data Load
** Note.809410.1 EBS to Demantra Data Load / Import Diagnostics Investigation
** Note.815124.1 Data Loading into Demantra EP_Load
** Note.817973.1 Demantra Custom Hooks and Additional Pointers
** Note.806295.1 Demantra Solutions TIPs for April 2009
** Note.563732.1 Demantra 7.1 7.2 Pre Post and Install Cross Checks
** Note.754237.1 DEMANTRA Q/A Setup, Implementation Ideas, Behavior 7.0.2, 7.1, 7.2 R12 EBS
** Note.802395.1 Demantra Patching - Version Control at EBS and Demantra Compatibility
** Note.462321.1 Demantra Environment Manipulation and Performance Tuning Suggestions
  • 22. WARNING!! WARNING!! WARNING!! WARNING!! MAY CAUSE DROWSINESS!
The material presented below is of a technical nature. Little attention has been given to functional navigation or functional demonstrations. The vast majority of data load issues are investigated / solved using SQL. To that end, this presentation focuses almost exclusively on problem investigation and resolution. This material has been assembled from 100s of bugs in which DEV gave us techniques to drill into source/destination data.

==================================
Data Loading into Demantra EP_Load
==================================

Data Loading Flow
-----------------
There are four data flows that move data in and out of Demantra:
1. Loading data from source into collection staging tables
2. Moving data from the collection staging tables into the Demantra data store
3. Loading data from the Demantra data store into the Demantra engine and downloading from Demantra back to the Demantra data store.
4. Pushing the data back to the source instance in the form of a forecast.
We will cover #2 and #3 in this presentation.

EBS to Demantra Data Load / Import Diagnostics Investigation
------------------------------------------------------------
There are several methods to load data into the Demantra staging tables. Based on the number of problems reported, the tools seem to operate as advertised. The data loaded into the Demantra staging tables can be an issue. We will not focus on the tools, which appear to be intuitive, but instead discuss methods of investigation to identify, explain and fix the load result.

Summary of Integration Tasks
----------------------------
This section lists integration tasks in the appropriate sequence:
  • 23. 1. Initial Setup. See Implementation Guide.
2. Collect Data and Download to Three Staging Tables. See Implementation Guide.
3. Transfer data to the Oracle Demantra schema. See Implementation Guide.
   * EP_LOAD
   * Import Integration Profiles
4. Generate forecasts
5. Export Output from Oracle Demantra. See Implementation Guide.
   * Export Integration Profiles
6. Upload Forecast. See Implementation Guide.

next ================================

EP_LOAD download procedures are used for booking history streams and level members.
- For example, the EP_LOAD procedures are used to load booking history by organization-site-sales channel and item-demand class into staging tables.
- If the Download Now check box was not selected during the collections process, run EP_LOAD and Import Integration Profiles to move data from the staging tables into the Oracle Demantra Demand Management schema.

next ================================

Launch EP_LOAD.
- Historical information and level data are imported into Oracle Demantra via the EP_LOAD procedure.
- All other series data are imported into Oracle Demantra via Import Integration Profiles. An assumption of the EP_LOAD procedure is that the series are populated into the Oracle Demantra staging tables before the load process begins.
- To ensure this occurs, the collection programs for all eight historical series have been merged so that these streams are always collected simultaneously.

next ================================
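When the automatic download is switched off, the EP_LOAD step can also be driven manually from a SQL*Plus session. The sketch below is hedged: it simply strings together the DATA_LOAD package calls that appear verbatim in the "Step by Step Walk Through" later in this presentation, then checks the exception log; run it connected as the Demantra schema owner (typically).

-- Hedged sketch: run the EP_LOAD procedures in order (calls taken from later in this document).
exec DATA_LOAD.EP_PREPARE_DATA;
exec DATA_LOAD.EP_LOAD_ITEMS;
exec DATA_LOAD.EP_LOAD_LOCATION;
exec DATA_LOAD.EP_LOAD_SALES;

-- Then look for anything the load logged.
select * from db_exception_log order by err_date desc;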
  • 24. Launch EP_LOAD, continued.
For members and history series, which are downloaded via the EP_LOAD mechanism, the mode of update is:
- If there is a new member, it is added in Oracle Demantra.
- If a member has been deleted in the E-Business Suite source, it stays in Oracle Demantra along with all series data for combinations that include the member.
  * The administrative user must manually delete the member in Oracle Demantra.
- Series data in the staging area overwrites the series data in Oracle Demantra, for the combinations that are represented in the staging area.
- Series data in Oracle Demantra for combinations that are not in the staging area are left unchanged.
- The staging area is erased after the download.
- All series data in Oracle Demantra, for all combinations, are set to null before the download actions take place.
* There are a total of three EP_LOAD workflows, one EP_LOAD workflow for each of the following series:
  - Item members
  - Location members
  - Shipment and Booking History
Caution: There is a risk that if multiple lines of business run collections very close in time to each other, a single EP_LOAD run may pull in data from multiple lines of business.
- See Line Of Business Configuration and Execution in the User Guide.
* We are not covering creating the EP_LOAD_MAIN procedure, which loads data into the data model from staging tables or from files, according to your choice as set up in the Data Model Wizard.

next ================================

Future Date Loading
  • 25. EP_LOAD process: All future information (that is, forecast data) is loaded using integration profiles or other loading mechanisms. This mechanism controls the dates marked as end of history for the Forecasting Engine and the Collaborator Workbench. With the addition of the MaxSalesGen parameter, you can now use the EP_LOAD process to load future dates into Demantra. This parameter determines how data after the end of history is populated.
Note: When populating the MaxSalesGen parameter, it is important to enter all dates in the MM-DD-YYYY 00:00:00 format.

next ================================

Three main staging tables:
T_SRC_ITEM_TMPL  - Used by the ep_load_main procedure. Each record corresponds to a unique item entity based on all lowest item levels in the model.
T_SRC_LOC_TMPL   - Used by the ep_load_main procedure. Each record corresponds to a unique location entity based on all lowest location levels in the model.
T_SRC_SALES_TMPL - Used by the ep_load_main procedure. Each record corresponds to sales data for a given item and location combination, based on all lowest item levels in the model.

next ================================

Functional Steps to Load Demantra
1. Load the data into the staging tables: t_src_sales_tmpl, t_src_item_tmpl, t_src_loc_tmpl
2. Log in to the Workflow Manager.
  • 26. 3. Start the workflow 'EBS Full Download'.
4. Check the status by clicking on the 'Instance' number of the workflow.
5. Check the status in the Collaborator Workbench, in the My Tasks pane.

next ================================

This pre-seeded WF should include all required steps to enable a successful download of items/location/sales data into the Demantra tables. Currently the WF runs the following processes:
EP_LOAD_ITEMS
EP_LOAD_LOCATION
EP_LOAD_SALES
However, in order to complete the loading process you may need to run the MDP_ADD procedure to make sure new combinations are added into MDP_MATRIX.

next ================================

Technical Investigation
-----------------------
On 7.2.0.1 in Production: We find that after running ep_load_sales, the procedure completed without any error message, but there is no record in the sales_data table.

EXPECTED BEHAVIOR
-----------------
We expect data to be loaded.

Steps To Reproduce:
1. Load data into t_src_sales_tmpl
2. Run ep_load_sales
3. Check integ_status
4. Check t_src_sales_tmpl_err
  • 27. 5. Check sales_data

We have just one record in t_src_sales_tmpl; after running ep_load_sales, the procedure completed without any error message and there is no error row in t_src_sales_tmpl_err. Upon checking sales_data and integ_status, there are no rows in these tables.

Debugging Steps:
1. Check if you have any error in the db_exception_log table.
   select * from db_exception_log order by err_date desc;
2. Verify that there is not 'new' data. The existing row could have simply been updated.
3. Please review all columns:
   select * from t_src_sales_tmpl;
   select * from sales_data;
Upon closer examination, we found invalid data in the t_src_sales_tmpl table. After we updated the column ACTUAL_QTY from NULL to '0', the ep_load_sales was successful.
* You would need to change the value coming from the source to successfully load without manipulating staging table data.

next ================================

After launching Collection: Collect Shipment and Booking History with the parameter 'EP_LOAD'=N and running ep_load after collection, the actual_quantity loaded into sales_data is different from the actual_qty in t_src_sales_tmpl.

This is what we did:
--------------------
1. Launched Collection: Collect Shipment and Booking History with the parameter 'EP_LOAD' = N
2. Launched a custom program to populate the following columns: T_ep_i_att_9, T_ep_p3, T_ep_p2 in the staging table t_ep_item_tmpl from the column ebs_product_category_desc in the table t_ep_item_tmpl.
3. Launched the EP_Load as a standalone request.
  • 28. 4. Restarted the Application Server.
5. Verified the data for item 130-0290-910 in the bucket starting 05/26/2008 = 18333.
6. Verified with the following query:
   select sum(actual_qty) from t_src_sales_tmpl
   where dm_item_code='130-0290-910'
   and sales_date between '26-may-2008' and '01-jun-2008';
   - Result: 121 <----
7. Verified from the Sales_data table with the following query:
   select sum(actual_quantity) from Sales_data
   where item_id in (select item_id from msdem.mdp_matrix
                     where t_ep_item_ep_id in (select t_ep_item_ep_id from msdem.t_ep_item
                                               where item = '130-0290-910'))
   and sales_date between '26-may-2008' and '01-jun-2008';
   - Result: 18333 <----

Debugging
---------
Please check the SYS_PARAMS 'accumulatedOrUpdate' setting: 'update' or 'accumulate'.
EP_LOAD_SALES should aggregate the sales in the T_SRC_SALES table by combination and sales date.
- accumulatedOrUpdate = update
  If the sale already exists then the actual_quantity value should be replaced.
- accumulatedOrUpdate = accumulate
  Should add the new value to any existing values (provided the series is set as proportional).

Debugging Step 2
----------------
This is one method of comparing source data to mdp_matrix. In this case we are verifying the sums for a given date range / item.
  • 29. - Check the source:
select sum(shipped_quantity), sum(cancelled_quantity), sum(ordered_quantity), ship_from_org_id
from apps.oe_order_lines_all
where actual_shipment_date between '26-may-2008' and '01-jun-2008'
and ordered_item = '130-0290-910'
group by ship_from_org_id;

- Check what was collected:
select sum(actual_qty), dm_org_code
from t_src_sales_tmpl
where dm_item_code='130-0290-910'
and sales_date between '26-may-2008' and '01-jun-2008'
group by dm_org_code;

- Check what was loaded into SALES_DATA:
select sum(actual_quantity) from Sales_data
where item_id in (select item_id from msdem.mdp_matrix
                  where t_ep_item_ep_id in (select t_ep_item_ep_id from msdem.t_ep_item
                                            where item = '130-0290-910'))
and sales_date between '26-may-2008' and '01-jun-2008';

- Here we are checking the same item, filtered to a specific ship_from_org_id:
select sum(shipped_quantity), sum(cancelled_quantity), sum(ordered_quantity), ship_from_org_id
from apps.oe_order_lines_all
where actual_shipment_date between '26-may-2008' and '01-jun-2008'
and ordered_item = '130-0290-910'
and ship_from_org_id = 722
group by ship_from_org_id;

- Following the trail to the collected data:
select sum(actual_qty), dm_org_code
from t_src_sales_tmpl
where dm_item_code='130-0290-910'
and sales_date = '30-May-2008'
group by dm_org_code;

- And finally what was loaded into SALES_DATA:
select sum(actual_quantity) from Sales_data
where item_id in (select item_id from msdem.mdp_matrix
                  where t_ep_item_ep_id in (select t_ep_item_ep_id from msdem.t_ep_item
                                            where item = '130-0290-910'))
and sales_date ='30-May-2008';

If there is a difference, the SQL can be adjusted to weed out extra rows or
  • 30. rows that were not collected through to mdp_matrix.

next ================================

I created an Integration Interface "Variable Cost" and a workflow with a transfer step to load data for that integration interface. When the staging table for the integration interface is populated with data and the workflow is run, the errored-out records move to the err table and the correct records vanish from the staging table, but nothing is populated in the Demantra internal tables. If all the records in the staging table are correct then the data is populated into the base tables of Demantra.

Investigation
-------------
The staging table (integ_inf_var_cost) seems to contain dirty data. By this I mean that in addition to missing members (which are handled by the system and can be noted in the error table), some of the combinations do not have a valid item-location combination in the MDP_MATRIX table. Hence, the update process failed to find any valid rows to update.

On v7.1.1, the integration process included the following validations:
1. Integration profile structure
2. Staging table data dates range
3. Existence and validity of members and drop-down series values
For any error that is found, a row in the error table indicates and holds information about the problem.

On v7.1.1 validation DID NOT include population validation (i.e., that the specified combination indeed exists in MDP_MATRIX). This feature was added in the v7.2.0 release. Nonetheless, please find the following 2 SQL statements; either one will reveal those notorious combinations in the staging table:

SELECT DISTINCT i.*
  FROM integ_inf_var_cost i,
       t_ep_e1_it_br_cat_3 e,
  • 31.        t_ep_region r,
       t_ep_finproductgroup f
 WHERE i.level1 = e.e1_it_br_cat_3
   AND i.level2 = r.region
   AND i.level3 = f.finproductgroup
   AND NOT EXISTS (
        SELECT *
          FROM mdp_matrix m
         WHERE m.t_ep_e1_it_br_cat_3_ep_id = e.t_ep_e1_it_br_cat_3_ep_id
           AND m.t_ep_region_ep_id = r.t_ep_region_ep_id
           AND m.t_ep_finproductgroup_ep_id = f.t_ep_finproductgroup_ep_id);

select distinct t6.e1_it_br_cat_3, t5.region, t4.finproductgroup
from mdp_matrix t1, t_ep_finproductgroup t4, t_ep_region t5, t_ep_e1_it_br_cat_3 t6, integ_inf_var_cost vc
where t1.t_ep_region_ep_id = t5.t_ep_region_ep_id
and t1.t_ep_finproductgroup_ep_id = t4.t_ep_finproductgroup_ep_id
and t1.t_ep_e1_it_br_cat_3_ep_id = t6.t_ep_e1_it_br_cat_3_ep_id
and t1.variable_cost is null
and t6.e1_it_br_cat_3 = vc.level1
and t5.region = vc.level2
and t4.finproductgroup = vc.level3;

next ================================

This is a sound approach to test the complete collection and presentation of data to MDP_MATRIX and Demantra. Actions that were taken:

1. Reset the tables, by executing the following:
-- Reset
TRUNCATE TABLE integ_inf_var_cost_err;
TRUNCATE TABLE integ_inf_var_cost;
UPDATE mdp_matrix
  • 32.    SET variable_cost = NULL;
COMMIT;

2. Fill the staging table, by executing the following:
-- Insert data into the staging table
INSERT INTO integ_inf_var_cost
            (sdate, level1, level2, level3, variable_cost)
   SELECT TO_DATE (TO_CHAR (NEXT_DAY (SYSDATE, 'monday') - 7, 'DD-MM-YYYY'), 'DD-MM-YYYY') sdate,
          shipto_commercial_org level1,
          region level2,
          finance_product_line level3,
          variable_cost
     FROM cstm_var_cost_eur;
COMMIT;

3. Run the "Variable Cost Download" workflow.

4. Check the results:
a) Verified that the integ_inf_var_cost table was empty, using the following:
-- Checked that all data from the staging table was handled
SELECT COUNT (*) FROM integ_inf_var_cost;
which had returned 0, as expected.
b) Verified that the mdp_matrix table had new rows, using the following:
-- Checked that the new data was introduced
SELECT COUNT (*) FROM mdp_matrix WHERE variable_cost IS NOT NULL;
which had returned 247.
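c) As an additional, hedged check (not part of the original walkthrough), it is also worth confirming that nothing landed in the error table during the run; the table name follows the <staging table>_ERR convention described elsewhere in this presentation.
-- Rows here would indicate members or values the transfer step rejected.
SELECT COUNT (*) FROM integ_inf_var_cost_err;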
  • 33. next ================================

Working with Oracle Support or DEV
----------------------------------
Exporting your Demantra data model for review:
An export file containing your data model can be created by running the Business Modeler and selecting the menu "Data Model --> Open Data Model". Then select the active model (it is yellow) and click on the "Export" button. Then specify the location to save the *.dmw file.

next ================================

Data Loading Directly into Staging Tables
-----------------------------------------
1. Load a single record (for the purpose of this test case) into each of t_src_item_tmpl, t_src_loc_tmpl and t_src_sales_tmpl
2. We then run a hybrid of the standard 'sales load' workflow, where DL_RUN_PROC is called with various arguments one by one:
   - ep_prepare_data,
   - ep_load_items,
   - ep_load_location,
   - ep_load_sales
3. Look for errors in the %ERR tables and find that t_src_sales_tmpl_err has an error and t_src_sales_tmpl is empty

Determined Cause:
When loading SALES_DATA all of the levels must match existing levels in the hierarchy, not just the item code. This is ensured by a large JOIN near the beginning of the EP_LOAD_SALES procedure.
We have also seen this strange result as a product of bug 6520853 EP_LOAD_ITEMS DOESN'T LOAD ITEM AND EP_CHECK_ITEMS DOESN'T GIVE ERROR. That issue came down to case sensitivity: the load misbehaved when the case of the data did not match.
  • 34. This is fixed in 7.2.0.2.

next ================================

Missing Date Values in Loading
------------------------------
After running the EP_Load procedure the worksheet does not show the data. The TO_DATE & FROM_DATE columns are not populated.

Steps to Reproduce:
1. Run Build Model.
2. Load data into the T_SRC_ITEM_TMPL, T_SRC_LOC_TMPL & T_SRC_SALES_TMPL tables.
3. Run: <Demand Planner>\DeskTop\load_data.bat
4. View that there are no errors in the T_SRC_ITEM_TMPL_ERR, T_SRC_LOC_TMPL_ERR and T_SRC_SALES_TMPL_ERR tables.
5. Check that the data was inserted into the database.
6. Create a worksheet that will show the data.
7. See if the data is seen in the worksheet.
- My results are that the levels, members and data exist in the database.
- The levels and members are seen in the worksheet wizard and can be selected.
- The levels, members and data are not shown in the worksheet.

Neat Debugging Example
----------------------
Create a temp table with min and max dates per mdp_matrix combination:
create table temp_matrix_dates as
SELECT s.item_id, s.location_id,
       MIN(s.sales_date) AS from_date,
       MAX(s.sales_date) AS until_date
  FROM sales_data s, mdp_matrix t
 WHERE s.item_id = t.item_id
   AND s.location_id = t.location_id
  • 35.  GROUP BY s.item_id, s.location_id;

Update mdp_matrix from_date and until_date based on the table created above:
UPDATE mdp_matrix m
   set (from_date, until_date) =
       (select from_date, until_date
          from temp_matrix_dates
         WHERE item_id = m.item_id
           AND location_id = m.location_id);

This will update the date columns in mdp_matrix for all combinations, not just the ones you loaded. Use the following SQL to identify which combinations in MDP_MATRIX are missing the date values:
select item_id, location_id, from_date, until_date
  from mdp_matrix
 where from_date is null and until_date is null;

next ================================

The following investigation can be used to verify that data from the source actually loads into the Demantra tables. The item hhm-1 is missing in the Demantra sales_data table but there were no errors. This customer is using .ctl files to load the dmtra_template schema. You can change the default schema name, see Note 551455.1.

Step by Step Walk Through to Find the Data
Following are the steps followed and the details of the issue.
Load your data into flat files matching the .ctl and perform the following:
- exec DATA_LOAD.EP_PREPARE_DATA;  (Runs "replace_apostrophe", which removes single quotes from the T_SRC tables)
- exec DATA_LOAD.EP_LOAD_ITEMS;    (Runs; no errors found in the _ERR table)
- exec DATA_LOAD.EP_LOAD_LOCATION; (Runs; no errors found in the _ERR table)
- exec DATA_LOAD.EP_LOAD_SALES;    (Runs; no errors found in the _ERR table)
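Before stepping through the investigation below, a quick hedged verification pass right after those DATA_LOAD calls can save time; every object it touches is already named in this presentation (the _ERR tables, integ_status, and db_exception_log).

-- Did anything error out or get logged during the load?
select count(*) from t_src_item_tmpl_err;
select count(*) from t_src_loc_tmpl_err;
select count(*) from t_src_sales_tmpl_err;
select * from integ_status;
select * from db_exception_log order by err_date desc;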
  • 36. 1. Performed booking and shipping history collections with the Auto download option.
2. Found that the data is not collected into Demantra. This was confirmed after verifying the data is not present in the worksheets.
3. Performed the EBS full download workflow. This collected the location data into the staging table.
   select * from dmtra_template.t_src_sales_tmpl where lower(dm_item_code) = 'hse-1';
4. Then performed booking and shipping history collections with auto download.
5. SELECT DISTINCT dm_item_code, dm_org_code, dm_site_code,
          t_ep_lr1, t_ep_ls1, t_ep_p1,
          ebs_demand_class_code, ebs_sales_channel_code,
          aggre_sd -- filter column
     FROM ep_T_SRC_SALES_TMPL_ld
    WHERE ep_T_SRC_SALES_TMPL_ld.actual_qty IS NOT NULL
      and dm_item_code = 'HHM-1'
    ORDER BY dm_item_code, dm_org_code, dm_site_code, t_ep_lr1, t_ep_ls1, t_ep_p1,
             ebs_demand_class_code, ebs_sales_channel_code, aggre_sd;
   The 2 dates shown are 05-FEB-07 and 29-JAN-07.
6. SELECT ITEMS.item_id, LOCATION.location_id
     FROM ITEMS, LOCATION, t_ep_item, t_ep_organization, t_ep_site,
          t_ep_lr1, t_ep_ls1, t_ep_p1, t_ep_ebs_demand_class, t_ep_ebs_sales_ch
  • 37.  WHERE 1 = 1
      AND items.t_ep_item_ep_id = t_ep_item.t_ep_item_ep_id
      AND t_ep_item.item = 'HHM-1'
      AND location.t_ep_organization_ep_id = t_ep_organization.t_ep_organization_ep_id
      AND t_ep_organization.organization = 'TST:M1'
      AND location.t_ep_site_ep_id = t_ep_site.t_ep_site_ep_id
      AND t_ep_site.site = 'ABC Corporation Americas:2637:7233:Vision Operations'
      AND location.t_ep_lr1_ep_id = t_ep_lr1.t_ep_lr1_ep_id
      AND t_ep_lr1.lr1 = 'N/A'
      AND location.t_ep_ls1_ep_id = t_ep_ls1.t_ep_ls1_ep_id
      AND t_ep_ls1.ls1 = 'N/A'
      AND items.t_ep_p1_ep_id = t_ep_p1.t_ep_p1_ep_id
      AND t_ep_p1.p1 = 'N/A'
      AND items.t_ep_ebs_demand_class_ep_id = t_ep_ebs_demand_class.t_ep_ebs_demand_class_ep_id
      AND t_ep_ebs_demand_class.ebs_demand_class = '0'
      AND location.t_ep_ebs_sales_ch_ep_id = t_ep_ebs_sales_ch.t_ep_ebs_sales_ch_ep_id
      AND t_ep_ebs_sales_ch.ebs_sales_ch = 'Direct';
   Gives ITEM_ID 565 and LOCATION_ID 607.
7. What you now see is only the 2 rows inserted / updated into SALES_DATA, because the 'aggre_sd' date is used as a filter from the actual_qty NOT NULL select.
   SELECT COUNT(*) from sales_data
    where item_id = 565 and location_id = 607
      and TRUNC(sales_date) = TO_DATE('05-FEB-07','DD-MON-RR');
   Gives 1 row, which is right.
8. These rows are also added to MDP_LOAD_ASSIST.
   * When the actual_qty is NULL in the distinct list, the row is inserted into the table MDP_LOAD_ASSIST and is used to populate MDP_MATRIX.
   * The actual_qty IS NOT NULL list is aggregated by the aggre_sd filter date and the rows are inserted / updated in SALES_DATA.
   * The quantity column values are averaged across all the distinct rows, including the NOT NULL actual_qty rows, by the 'aggre_sd'.
   The EP_LOAD_SALES continues to load the rows from MDP_LOAD_ASSIST via:
  • 38. mdp_add('mdp_load_assist'), which is in DATA_LOAD.EP_LOAD_SALES.

From T_SRC_SALES_TMPL:
  Rows Total                                            12,476
  Distinct IDs with actual_qty NULL                        807  <-- These are the problem rows
  Distinct IDs incl. aggre_sd and actual_qty NOT NULL   11,669

Why are there 807 problem rows? Why are they not in the err table?

9. Trying to verify success of the following item: HSE-1
   select i.t_ep_item_ep_id, i.item_id, t.item, i.is_fictive
     from dmtra_template.items i, t_ep_item t
    where lower(item) = 'hse-1'
      and i.t_ep_item_ep_id = t.t_ep_item_ep_id
    order by 1;
   The above will deliver the item_id to be used to verify that the data is in the sales_data table:
   select * from dmtra_template.sales_data where item_id in (654,671,727,823);
   Returned zero rows. Unclear as to why ep_load_sales does not load sales nor insert errored rows into the error tables.
10. To continue the investigation:
    SELECT * FROM dmtra_template.t_ep_item WHERE item = 'HSE-1';
    Output: t_ep_item_ep_id = 188
11. What are the actual_qty values and sales_date in the source table?
    select dm_item_code, sales_date, actual_qty
      from dmtra_template.t_src_sales_tmpl
  • 39.     where lower(dm_item_code) in ('hse-1')
       and actual_qty IS NOT NULL;

    DM_ITEM_CODE SALES_DATE ACTUAL_QTY
    ------------ ---------- ----------
    HSE-1        12/25/2006         70

    How many rows are there where the actual_qty values are NULL in the source table?
    select dm_item_code, sales_date, actual_qty
      from dmtra_template.t_src_sales_tmpl
     where lower(dm_item_code) in ('hse-1')
       and actual_qty IS NULL;

    DM_ITEM_CODE SALES_DATE ACTUAL_QTY
    ------------ ---------- ----------
    HSE-1        1/1/2007
    HSE-1        1/1/2007

12. SELECT * FROM dmtra_template.items WHERE t_ep_item_ep_id = 188;
    Output:
    item_id demand_class
    ------- ----------------------------------
    654     Unassicaoted
    671     Australia Sales Region
    727     East US Sales Region
    823     West US Sales Region

13. What are the corresponding item_id values from the ITEMS table?
    select i.t_ep_item_ep_id, i.item_id, t.item, i.is_fictive
      from items i, t_ep_item t
     where lower(item) = 'hse-1'
       and i.t_ep_item_ep_id = t.t_ep_item_ep_id
     order by 1;

    T_EP_ITEM_EP_ID ITEM_ID ITEM  IS_FICTIVE
  • 40.     --------------- ------- ----- ----------
                188     654 HSE-1          2

14. select * from dmtra_template.sales_data
     where item_id = 654
       and sales_date IN ('12/25/2006','1/1/2007','1/29/2007');

    ITEM_ID LOCATION_ID SALES_DATE ACTUAL_QUANTITY
    ------- ----------- ---------- ---------------
        671         746 1/29/2007               70
        671         752 1/1/2007
        671         753 1/1/2007

As you can see, where the quantity is not null there is data with correct corresponding values and dates in SALES_DATA.

Expected Behaviour
------------------
We should be able to see the data for History for these series as well, even when the actual_quantity IS NULL:
Booking History - booked items - booked date,
Booking History - requested items - booked date,
Booking History - booked items - requested date,
Booking History - requested items - requested date,
Shipment History - shipped items - shipped date,
Shipment History - shipped items - requested date,
Shipment History - requested items - requested date

<< end of document >>
  • 41. WARNING!! WARNING!! WARNING!! WARNING!! MAY CAUSE DROWSINESS!
The material presented below is of a technical nature. Little attention has been given to functional navigation or functional demonstrations. The vast majority of data load issues are investigated / solved using SQL. To that end, this presentation focuses almost exclusively on problem investigation and resolution. This material has been assembled from 100s of bugs in which DEV gave us techniques to drill into source/destination data.

Data Loading Flow
-----------------
There are four data flows that move data in and out of Demantra:
1. Loading data from source into collection staging tables
   * These are the T_SRC_% and error tables. ep_load_main procedure.
2. Moving data from the collection staging tables into the Demantra data store
3. Loading data from the Demantra data store into the Demantra engine and downloading from Demantra back to the Demantra data store.
4. Pushing the data back to the source instance in the form of a forecast.
We will cover #1 and #2 in this presentation.

next ================================================

Summary of Integration Tasks
----------------------------
This section lists integration tasks in the appropriate sequence:
1. Initial Setup. See Implementation Guide.
2. Collect Data and Download to Three Staging Tables. See Implementation Guide.
3. Transfer data to the Oracle Demantra schema. See Implementation Guide.
  • 42.    * EP_LOAD
   * Import Integration Profiles
4. Generate forecasts
5. Export Output from Oracle Demantra. See Implementation Guide.
   * Export Integration Profiles
6. Upload Forecast. See Implementation Guide.

next ================================================

EBS to Demantra Data Load / Import Diagnostics Investigation
------------------------------------------------------------
There are several methods to load data into the Demantra staging tables. Based on the number of problems reported, the tools seem to operate as advertised. The data loaded into the Demantra staging tables can be an issue. We will not focus on the tools, which appear to be intuitive, but instead discuss methods of investigation to identify, explain and fix the load result.

Load Methods
------------
- Integration Interface Wizard
- Data Model Wizard
- Demantra Import Tool
- SQL*Loader

next ================================================

The following table summarizes the core Demantra import and export tools:

Data                                      To import, use...              To export, use...
----------------------------------------  -----------------------------  ----------------------------
Lowest level item and location data;      Data Model Wizard*             N/A
  sales data
Series data at any aggregation levels     Integration Interface Wizard*  Integration Interface Wizard
Sales promotions                          Integration Interface Wizard*  Integration Interface Wizard
Members and attributes of other levels    N/A                            Integration Interface Wizard
Other data, for example, lookup tables    Demantra import tool*          N/A

*These options are in the Business Modeler.
  • 43. next ================================================

The core Demantra tools allow you to do the following:
• Import lowest-level item, location, and sales data
• Import or export series data at any aggregation level, with optional filtering
• Import promotions and promotional series
• Export members of any aggregation level
• Import supplementary data into supporting tables as needed

next ================================================

Object Manipulation
-------------------
The Demantra Business Modeler contains many useful tools to perform maintenance and development:
- Create Table
- Alter Table
- Recompile
- View the Procedure Error Log
- Cleanup Demantra temporary tables
- Oracle Sessions Monitor
- Please see the user guide for the complete list

next ================================================

Oracle Demantra Workflows
  • 44. - EBS Full Download: Downloads items, locations, and shipment and booking history.
- EBS Return History Download: Downloads Return History
- EBS Price List Download: Downloads Price Lists

Workflows can do all of the following actions:
- Run integration interfaces.
- Run stored database procedures.
- Run external batch scripts and Java classes.
- Pause the workflow until a specific condition is met, possibly from a set of allowed conditions. For example, a workflow can wait for new data in a file or in a table.
- Send tasks to users or groups; these tasks appear in the My Tasks module for those users, within Collaborator Workbench. A typical task is a request to examine a worksheet, make a decision, and possibly edit data. A task can also include a link to a Web page for more information.

next ================================================

WF Technical
------------
Monitor the WF for errors:
SELECT ACTIVITY_NAME, ACTIVITY_STATUS_CODE, ACTIVITY_RESULT_CODE,
       TO_CHAR(ACTIVITY_BEGIN_DATE, 'DD/MM/YYYY - HH24:MI:SS'),
       TO_CHAR(ACTIVITY_END_DATE, 'DD/MM/YYYY - HH24:MI:SS'),
       ERROR_NAME, ERROR_MESSAGE
  FROM WF_ITEM_ACTIVITY_STATUSES_V
 WHERE ACTIVITY_STATUS_CODE = 'ERROR';

Check the collaborator log
--------------------------
I have errors in the collaborator.log file resulting from the "Download Plan Scenario Data" workflow.
- They were caused by the "Notify" email step in the workflow; these errors were being generated because the Demantra environment was not configured with the proper details for sending email notifications to users.
- These errors might prevent the workflow from completing the rest of the steps in the flow.
- The User Guide explains proper setup quite well.
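When the error list from the query above is long, it helps to narrow it to recent activity and to see which workflow instance produced each error. The variant below is a hedged sketch: it assumes ITEM_TYPE and ITEM_KEY are exposed by WF_ITEM_ACTIVITY_STATUSES_V (as in standard Oracle Workflow), and the 24-hour window is only an example filter.

SELECT ITEM_TYPE, ITEM_KEY, ACTIVITY_NAME,
       TO_CHAR(ACTIVITY_BEGIN_DATE, 'DD/MM/YYYY - HH24:MI:SS') BEGIN_DATE,
       ERROR_NAME, ERROR_MESSAGE
  FROM WF_ITEM_ACTIVITY_STATUSES_V
 WHERE ACTIVITY_STATUS_CODE = 'ERROR'
   AND ACTIVITY_BEGIN_DATE > SYSDATE - 1
 ORDER BY ACTIVITY_BEGIN_DATE DESC;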
  • 45. Restart / Debugging Steps
-------------------------
1) Shut down the application server.
2) Clear tasks from the Collaborator Workbench for the user.
3) Delete the new DAILY_PROCESS workflow. Clear the wf_process_log table. Delete all rows in the wf_scheduler and wf_process_log tables:
   - delete from wf_scheduler;
   - delete from wf_process_log;
4) Restart the application server/web server.
5) Schedule the WF "daily process".

Additional WF Checks
--------------------
1. Is there more than a single instance of the Demantra application running (i.e., with a different context root, on different machines, etc.)?
2. What is the name of the problematic workflow? Is there another workflow schema by that name?
3. How were schemas removed? Using the Demantra application or directly from the database (using the DELETE statement)? Use the application whenever possible.

Poor WF Performance?
--------------------
- Ask yourself, is this necessary data? While historic data is important, not all historic data needs to be included.
- Are most of the quantities in these records valid quantities?
- Have you considered changing the plan settings to limit out quantities that are below a threshold?
- Can older scenario revision data be deleted if it is no longer required?
- What are the output levels (including time) of the scenarios?

Size Check for Performance
--------------------------
Use the output as a high water mark setting for future growth/performance analysis.
select SCENARIO_NAME, SCENARIO_OUTPUT_PERIOD_TYPE, DP_DIMENSION, LEVEL_NAME
  from MSD_DP_SCN_OUTPUT_LEVELS_V
 where demand_plan_id = <plan id>
 order by scenario_name, dp_dimension, level_name;

select count(*), SCENARIO_ID
  from msd.MSD_DP_SCN_ENTRIES_DENORM
 where
  • 46.  DEMAND_PLAN_ID = 6031
 group by SCENARIO_ID;

select count(*), SCENARIO_ID, revision
  from msd.MSD_DP_SCENARIO_ENTRIES
 where demand_plan_id = 6031
 group by SCENARIO_ID, revision;

- Is there data that can be purged?
- Are all of the scenarios and revisions lean, or do they contain stale data?

next ================================================

Integration Profile Performance Problem?
----------------------------------------
For performance reasons, updates are not executed immediately; they are accumulated and, once the configured limit is exceeded, written as a chunk or block.
* Defined in the ImportBlockSize property in the appserver.properties file.
Modify the 'ImportBlockSize' parameter in appserver.properties to a number that reflects better performance at your site, after testing different settings.

next ================================================

The Basic Input Data Review
---------------------------
When fully configured, Demantra imports the following data, at a minimum, from your enterprise systems:
- Item data, which describes each product that you sell.
- Location data, which describes each location to which you sell or ship your products.
- Sales history, which describes each sale made at each location. Specifically this includes the items sold and the quantity of those items, in each sale.
- For Promotion Effectiveness:
  • 47.   Historical information about promotional activities.
Demantra can import and use other data such as returned amounts, inventory data, orders, and settlement data.
- From my understanding of the Shipping/Booking History Collection for Demantra, the sales data is loaded first and then the item and loc tables are derived from it.

next ================================================

Functional Considerations
-------------------------
There are many functional setup issues relating directly to successful data load and execution. For example, the "MSC: Organization containing generic BOM for forecast explosion" profile can be set to one of the organizations at the site level. This will limit the entry of rows into MSD_DP_SCENARIO_ENTRIES and MSD_DP_SCN_ENTRIES_DENORM. Changing the profile to the Global Item Master Organization will make the records available for loading.

next ================================================

Imported Data
-------------
You can collect internal sales orders. They appear in the customer dimension; the customer appears as the organization code.
Seeded collections from EBS into Oracle Demantra Demand Management include:
- Shipment history, booking history, and returns history
- Manufacturing and fiscal calendars
- Price lists, currencies, and currency conversion factors
- Dimensions, levels, hierarchies, and level values for demand analysis.
Item levels are:
- Product category: Item > Category
- Product family: Item > Product family
- Demand class
Location levels are:
- Zone: Site > Trading partner > Zone
  • 48. - Customer class: Site > Account > Customer > Customer class
- Business group: Organization > Operating unit > Business group
- Sales channel

next ================================================

Integration Interface Wizard
----------------------------
The Integration Interface Wizard initializes the names of the staging tables, but you can rename the tables within the wizard if needed. The default names start with biio_. Make a note of the names of your tables, as displayed within the Integration Interface Wizard. (Not a complete list:)
- biio_supply_plans
- biio_supply_plans_pop
- biio_other_plan_data
- biio_PURGE_PLAN
- biio_scenario_resources

Troubleshooting
---------------
Look to the logs - <Integration Interface Table name>_ERR.
- If the staging table is named Biio_My_Demand, the error table is Biio_My_Demand_Err.
- The _Err table will contain the rows that were not successfully added to the staging tables.
- Check the Integration Log. If your URL to Demantra is http://DEMANTRAMACHINE:8080/demantra/portal/loginpage.jsp then the logs can be accessed by pointing your browser to: http://DEMANTRAMACHINE:8080/demantra/admin/systemLogs.jsp

NOTE: it is advised that you review the contents of the following for erred rows:
- UPDATE_BATCH_TRAIL
- UPDATE_BATCH_VALUES
- UPDATE_BATCH_TRAIL_ERR
- UPDATE_BATCH_VALUES_ERR
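For the tables just listed, a hedged sketch of the checks is below. Biio_My_Demand is the hypothetical staging table name used as the example above; substitute the staging table name of your own interface.

-- Rows rejected by the integration interface (example/hypothetical table name).
SELECT COUNT(*) FROM Biio_My_Demand_Err;

-- Update-batch error tables named in the note above.
SELECT COUNT(*) FROM UPDATE_BATCH_TRAIL_ERR;
SELECT COUNT(*) FROM UPDATE_BATCH_VALUES_ERR;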
  • 49. Please run the following script in the Demantra schema to verify data:
---------------------------------------------------------------------
begin
   apps.msd_dem_sop.load_plan_data (:supply_plan_id);
end;
The parameter to the procedure is the column value of supply_plan_id from the supply_plan table. The correct ID is the one for the supply plan whose data is being loaded from ASCP to Demantra. Use SQL to obtain it.
For example, if the supply plan id for plan 'ASCP-DEM' is 134, the script should be run as:
begin
   apps.msd_dem_sop.load_plan_data (134);
end;

If you must log an SR, please enable trace for the session and provide the tkprof output of the trace file after the execution. Also provide the row count of the following tables in the Demantra schema after running the script:
1. biio_other_plan_data
2. biio_resource_capacity
3. biio_supply_plans
4. biio_supply_plans_pop
5. biio_scenario_resources
6. biio_scenario_resource_pop
7. biio_resources
8. t_src_item_tmpl
9. t_src_loc_tmpl
10. BIIO_PURGE_PLAN
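To gather those row counts in one pass rather than ten separate queries, a hedged sketch is below; the table names are exactly the ten listed above and nothing else is assumed.

-- One result set with the row count of each table requested above.
SELECT 'biio_other_plan_data' table_name, COUNT(*) row_count FROM biio_other_plan_data
UNION ALL SELECT 'biio_resource_capacity', COUNT(*) FROM biio_resource_capacity
UNION ALL SELECT 'biio_supply_plans', COUNT(*) FROM biio_supply_plans
UNION ALL SELECT 'biio_supply_plans_pop', COUNT(*) FROM biio_supply_plans_pop
UNION ALL SELECT 'biio_scenario_resources', COUNT(*) FROM biio_scenario_resources
UNION ALL SELECT 'biio_scenario_resource_pop', COUNT(*) FROM biio_scenario_resource_pop
UNION ALL SELECT 'biio_resources', COUNT(*) FROM biio_resources
UNION ALL SELECT 't_src_item_tmpl', COUNT(*) FROM t_src_item_tmpl
UNION ALL SELECT 't_src_loc_tmpl', COUNT(*) FROM t_src_loc_tmpl
UNION ALL SELECT 'biio_purge_plan', COUNT(*) FROM biio_purge_plan;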
• 50.
next
================================================
Dumping your source data for review:
1. A dump file (and log file) which contains the tables T_SRC_ITEM_TMPL, T_SRC_LOC_TMPL, T_SRC_SALES_TMPL, TABLE_NAME, WF_GROUPS_SCHEMAS, WF_PROCESS_LOG, WF_SCHEDULER, WF_SCHEMAS, WF_SCHEMA_GROUPS, WF_SCHEMA_TEMPLATES.
The command to make such a dump file (you may need to adjust it):
exp user@server file=resmed_oct.dmp log=resmed_oct.log tables=(T_SRC_ITEM_TMPL, T_SRC_LOC_TMPL, T_SRC_SALES_TMPL, TABLE_NAME, WF_GROUPS_SCHEMAS, WF_PROCESS_LOG, WF_SCHEDULER, WF_SCHEMAS, WF_SCHEMA_GROUPS, WF_SCHEMA_TEMPLATES)
next
================================================
Parameter Checks
----------------
Used for the Source Data Load and EP Load. Be familiar with the contents of the following tables:
select * from init_params_0 order by 1;
select * from sys_params order by 1;
select * from db_params order by 1;
select * from aps_params order by 1;
- The init_params_0 table contains the descriptions of the parameters.
- sys_params contains the parameters listed on the Database, System, and Worksheet tabs.
- db_params contains operational entries such as check_and_drop_sleep_limit. For example, you might receive the following error:
  ORA-20000: Cannot DROP AK because of ORA-54 resource busy and have timed out after 255 seconds
  ORA-6512: at "YOURCO_DEC.CHECK_AND_DROP", line 143
  The sleep period starts at 1 second and increments exponentially until it
• 51.
reaches a sleep period limit > 128 seconds (the current default). This gives a total attempt time of 255 seconds. This limit can be changed via the new parameter in DB_PARAMS, 'check_and_drop_sleep_limit'. Retry messages are written to the DB_EXCEPTION_LOG table.
Another Key Entry for EP_Load
There is a parameter in DB_PARAMS, ep_load_do_commits; the default is TRUE.
- TRUE causes commits to be issued in the SALES MERGE loop.
- FALSE causes only one commit to be issued, just after the SALES MERGE loop.
- aps_params contains the password the Business Modeler should use to connect to the database, as well as other important operational settings.
next
================================================
Detailed Source Data Investigation
----------------------------------
Scenario: Shipment and booking history are not completely collected when we establish a new schema. Although the collections for shipment and booking history complete successfully with no error or warning message, we could not see the items in Demantra. If there is unclean data in the t_src_item_tmpl and t_src_loc_tmpl tables, EBS full download moves the unclean data from these tables to the error table. This works fine, but the process is not moving clean data to the Demantra base tables.
* You will need to find your item_id, organization_id, and other data required to run the following SQL.
Steps to Produce:
1. Performed booking and shipping history collections with the Auto download option.
2. Found that the data is not collected into Demantra. This was confirmed after verifying the data in worksheets.
3. Verified that this issue is related to locations via simple SQL checks:
   select * from dmtra_template.t_src_loc_tmpl_err;
• 52.
   Found that there were no errors in this table.
4. Performed the EBS full download workflow. This collected the location data into the staging table, but not the sales data, since the location data was not collected last time; hence there was nothing in the sales staging table.
5. Then performed booking and shipping history collections with auto download. The sales data was still not collected into Demantra. (It is not clear why the customer did this.)
Now, we will dig through the tables to locate the problem.
Step 1
------
The table INTEG_STATUS shows that when DMTRA_TEMPLATE runs the EP_LOAD it succeeds, but when APPS runs it, it fails; this could be a permissions issue.
Step 2
------
select * from dmtra_template.sales_data where item_id in (654, 671, 727, 823);
Step 3
------
Have confirmed items HSE-1, HSE-2, HSE-3, HSE-4, HSE-5 are in the staging tables:
select * from dmtra_template.t_src_sales_tmpl
where lower(dm_item_code) in ('hse-1','hse-2','hse-3','hse-4','hse-5');
Step 4
------
Also confirmed the items are loaded into the system:
select i.t_ep_item_ep_id, i.item_id, t.item, i.is_fictive
from dmtra_template.items i, dmtra_template.t_ep_item t
where lower(item) in ('hse-1','hse-2','hse-3','hse-4','hse-5')
and i.t_ep_item_ep_id = t.t_ep_item_ep_id
order by 1;
Step 5
------
Launched complete refresh collections for Shipment and Booking history with Auto-download set to 'yes'.
• 53.
There were no records pertaining to HSE-1 or HHM-1 (for example) in the t_src_sales_tmpl_err table, yet there were many records for the same items in the sales_data table. There is an inconsistency between the records in the t_src_sales_tmpl and sales_data tables. Executed the following SQL:
For Item HHM-1:
SELECT * FROM dmtra_template.sales_data WHERE item_id IN (627, 666);
SELECT * FROM dmtra_template.t_src_sales_tmpl where dm_item_code = 'HHM-1' order by sales_date desc;
For Item HSE-1:
SELECT * FROM dmtra_template.sales_data WHERE item_id IN (654, 671, 727, 823);
SELECT * FROM dmtra_template.t_src_sales_tmpl where dm_item_code = 'HSE-1' order by sales_date desc;
After the above data is available you should be able to determine the missing data. In this case the customer was missing location. Location is one of the major keys.
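To generalize the item-by-item comparison above, a hedged sketch that lists staging items with no corresponding rows in sales_data, reusing the item mapping shown in Step 4 (a sketch only; adjust the schema name and joins to your installation):
select distinct s.dm_item_code
from dmtra_template.t_src_sales_tmpl s
where not exists
      (select 1
       from dmtra_template.t_ep_item t,
            dmtra_template.items i,
            dmtra_template.sales_data d
       where lower(t.item) = lower(s.dm_item_code)
         and i.t_ep_item_ep_id = t.t_ep_item_ep_id
         and d.item_id = i.item_id);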
• 54.
next
================================================
Demand Class
------------
The customer noted that the load was successful; however, there were sales orders missing. The collections code brings in demand classes from oe_order_lines_all, but the (master table) lookup for demand classes is missing some of the demand classes. For example:
- The demand class code '1-WLKLE' is available in order lines, but it is not available in the lookup 'DEMAND_CLASS'. The following query returns zero rows:
select * from apps.fnd_common_lookups
where lookup_type = 'DEMAND_CLASS' and lookup_code = '1-WLKLE';
As a temporary workaround, you could modify the shipment history query to not bring in the demand class code if the demand class is missing in the lookup 'DEMAND_CLASS'. After that, run the Shipment and Booking History collection followed by ep_load. Verify the Actual_quantity for one particular item both in the staging table t_src_sales_tmpl and in the internal table sales_data.
The issue is due to missing demand classes in the lookup. If the demand classes available in the lookup 'DEMAND_CLASS' and in order lines are in synch with each other, then the sales history quantities should be loaded correctly into sales_data. You must check why the demand classes are missing from the lookup DEMAND_CLASS but present in order lines. These need to be in synch (a sketch query for listing the gap appears after the support checklist below).
next
================================================
Working with Oracle Support
---------------------------
- When there is an issue successfully collecting from EBS to the staging tables:
1. Verify that the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host URL' are set to the correct values. Also make sure the Demantra URL is up and accessible.
2. Set the profile 'MSD_DEM: Debug Mode' to 'Yes'. Launch the collection with 'Launch Download' set to 'Yes'.
3. Upload the log as well as the output files for the 'Launch EP LOAD' stage.
4. Also provide the values of the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host URL'.
5. Supply the logs found under the Demand Planner\Scheduler\bin folder.
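Returning to the demand-class gap above, a hedged sketch for listing demand classes present on order lines but missing from the lookup (the column name demand_class_code is an assumption not shown in the text above; verify it against your instance):
select distinct l.demand_class_code
from apps.oe_order_lines_all l
where l.demand_class_code is not null
  and not exists
      (select 1
       from apps.fnd_common_lookups f
       where f.lookup_type = 'DEMAND_CLASS'
         and f.lookup_code = l.demand_class_code);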
• 55.
next
================================================
Version Control
- Downgrading is not supported. Our schema upgrade mechanism only knows how to rev forward, not look backward.
next
================================================
EBS Data Load
Please note that all successfully loaded rows are loaded into the MSD_DP_SCN_ENTRIES_DENORM table. The MSD_DP_SCENARIO_ENTRIES table contains only the records that errored out. Errored records from a previous load will be deleted in the next load.
To produce acceptable debug logs:
1. Verify that the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host URL' are set to the correct values. Also make sure the Demantra URL is up and accessible.
2. Set the profile 'MSD_DEM: Debug Mode' to 'Yes'. Launch the collection with 'Launch Download' set to 'Yes'.
3. Upload the log as well as the output files for the 'Launch EP LOAD' stage.
4. Also provide the values of the profiles 'MSD_DEM: Schema' and 'MSD_DEM: Host URL'.
5. Upload the logs.
next
================================================
Error Check
-----------
The following is a list of tables to review should an error occur. We would suggest that these tables be empty before the load and that you produce an automated script to count any new rows (a sketch of such a script follows).
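A minimal sketch of such a count script, looping over the staging and error tables named in the checklist below (run in SQL*Plus with SET SERVEROUTPUT ON; the schema prefix dmtra_template matches the queries used elsewhere in this note, so adjust it if yours differs):
set serveroutput on
declare
  type tab_list is table of varchar2(61);
  l_tabs tab_list := tab_list('T_SRC_ITEM_TMPL',  'T_SRC_ITEM_TMPL_ERR',
                              'T_SRC_LOC_TMPL',   'T_SRC_LOC_TMPL_ERR',
                              'T_SRC_SALES_TMPL', 'T_SRC_SALES_TMPL_ERR');
  l_cnt  number;
begin
  for i in 1 .. l_tabs.count loop
    -- count the rows in each staging/error table and print one line per table
    execute immediate 'select count(*) from dmtra_template.' || l_tabs(i) into l_cnt;
    dbms_output.put_line(rpad(l_tabs(i), 30) || l_cnt);
  end loop;
end;
/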
• 56.
Provide the Following to Oracle Support:
- DB_EXCEPTION_LOG table: select * from db_exception_log order by err_date;
- select * from version_details_history order by upgrade_date desc;
- select object_name, object_type from user_objects where upper(status) = 'INVALID';
- select count(*) from dmtra_template.t_src_item_tmpl;
- select count(*) from dmtra_template.t_src_item_tmpl_err;
- select count(*) from dmtra_template.t_src_loc_tmpl;
- select count(*) from dmtra_template.t_src_loc_tmpl_err;
- select count(*) from dmtra_template.t_src_sales_tmpl;
- select count(*) from dmtra_template.t_src_sales_tmpl_err;
- What was the 'Launch Download' parameter set to?
- Please provide the log file for the respective 'request set' collection program.
- select * from t_ep_item;
- select * from t_ep_organization;
- select * from t_ep_site;
After running ASCP -> Demand Management System Administrator -> Collect Shipment and Booking History - Flat File, and the concurrent program has completed successfully, check the following Demantra tables:
- select * from t_src_item_tmpl
- select * from t_src_item_tmpl_err
- select * from t_src_loc_tmpl
- select * from t_src_loc_tmpl_err
- select * from t_src_sales_tmpl
- select * from t_src_sales_tmpl_err
You may also consider verifying the contents of the following:
- biio_scenario_resources / _err
• 57.
- integ_inf_var_cost_err
- biio_SUPPLY_PLANS_POP / _err
- biio_supply_plans / _err
- biio_other_plan_data / _err
- BIIO_PURGE_PLAN
next
================================================
Price List Collection Verification
----------------------------------
1. Select * from MSD_DEM_PRL_FROM_SOURCE_V; (note: this query should be run on the SOURCE instance only)
2. Select * from msd_dem_entities_inuse where ebs_entity = 'PRL';
Setup issues:
1. You must select the "Planning Method" and "Forecast Control" for the Master Org.
2. The customer built a new data model, which deleted all the seeded display units in Demantra. Because of this, price lists could not be loaded into Demantra. You would have to recreate new display units like the original seeded units, then run the price list collections.
Additional SQL to dig into the Price List
-----------------------------------------
1. select display_units, display_units_id, data_table, data_field
   from DEM.DISPLAY_UNITS
   where display_units_id in (select distinct display_units_id from DEM.DISPLAY_UNITS
                              minus
                              select distinct display_units_id from DEM.DCM_PRODUCTS_UNITS)
   and display_units like '%EBSPRICELIST%'
   and rownum < 2;
2. select distinct display_units_id from DEM.DISPLAY_UNITS
• 58.
   minus
   select distinct display_units_id from DEM.DCM_PRODUCTS_UNITS;
3. select * from DEM.DISPLAY_UNITS;
4. select * from DEM.DCM_PRODUCTS_UNITS;
5. select distinct price_list_name from MSD_DEM_PRL_FROM_SOURCE_V;
Price List at the Source Instance
---------------------------------
Execute the following SQL to investigate at the source:
1. select count(*) from MSD_DEM_PRL_FROM_SOURCE_V;
2. select distinct price_list_name from MSD_DEM_PRL_FROM_SOURCE_V;
3. select text from all_views where view_name like 'MSD_DEM_PRL_FROM_SOURCE_V';
next
================================================
Collection Success/Error Investigation
--------------------------------------
At the destination, check the sales dates:
select min(sales_date), max(sales_date)
from dmtra_template.sales_data
where actual_quantity > 0;
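To narrow a date gap down to specific items, the same check can be broken out per item (a sketch built only on the sales_data columns already used above):
select item_id, min(sales_date), max(sales_date), count(*)
from dmtra_template.sales_data
where actual_quantity > 0
group by item_id
order by item_id;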
• 59.
next
================================================
Source Views of Interest
------------------------
The view 'MSD_DEM_DEPENDENT_DEMAND_V' fetches date values from ASCP tables, and these include a time component as well.
- The procedure MSD_DEM_SOP.PUSH_TIME_DATA inserts dates corresponding to time buckets into the MSD_DEM_DATES table before supply plan data is downloaded.
Other important objects:
- MSD_DEM_DEPENDENT_DEMAND_V
- MSD_DEM_DATES
- MSD_DEM_ENTITIES_INUSE
- MSD_DEM_ENTITY_QUERIES
- MSD_DEM_GROUP_TABLES
- MSD_DEM_ITEMS_GTT
- MSD_DEM_CONSTRAINED_FORECAST_V
- MSD_DEM_LOCATIONS_GTT
- MSD_DEM_NEW_ITEMS
- MSD_DEM_PRICE_LISTS
Problem: Legacy Collections for Shipment and Booking History errored out at the Collect Level Type stage with the following error in the log:
* Workaround: Run ERP collections for future dates, which will wipe out the sales staging tables. Then run the legacy collections with proper values for demand class and sales channel (or your desired data group).
next
================================================
EBS to Demantra collection with download = YES does not insert data into the Demantra base tables. Data moves into the Demantra staging tables but not into the base tables. There is no error in the log file.
Steps followed:
1. Created a new item and sales order.
2. Ran the standard collections and EP_LOAD (with download = Yes). The concurrent programs completed successfully with no errors.
3. The item and sales data were inserted into the Demantra staging tables.
4. Ran the workflow "EBS Full Download" manually; the data was then moved into
• 60.
the Demantra base tables.
5. However, we could not see the data in a worksheet. We expect EP_LOAD to put the data into the Demantra base tables.
Solution Explanation
--------------------
AppServerURL
1. The first check to make is that the parameter 'AppServerURL' is set properly in Demantra. This can be verified from the Business Modeler or from the backend. Use the following query to get the value of this parameter from the Demantra schema.
   - You have to start the application server before running the Business Modeler wizard.
   - In most cases, if the application server is up, the problem is with the application server URL.
   select pval from sys_params where pname like 'AppServerURL';
   This should be set to 'http://dskhyd707878.yourcompany.com:80/demantra', or wherever your Demantra server is running.
2. Check the profiles 'MSD_DEM_SCHEMA' and 'MSD_DEM_HOST_URL'. Are they set properly?
   * For more information regarding MSD_DEM_HOST_URL, see Note 431301.1.
-------------------------------------------------------------
Profile Name                          - Value
-------------------------------------------------------------
Profile MSD_DEM_CATEGORY_SET_NAME     -
Profile MSD_DEM_CONVERSION_TYPE       -
Profile MSD_DEM_CURRENCY_CODE         -
Profile MSD_DEM_MASTER_ORG            - 204
Profile MSD_DEM_CUSTOMER_ATTRIBUTE    - NONE
Profile MSD_DEM_TWO_LEVEL_PLANNING    - 2
Profile MSD_DEM_SCHEMA                - MSDEM
-------------------------------------------------------------
* Please make sure that the profiles MSD_DEM_CONVERSION_TYPE and MSD_DEM_MASTER_ORG are set on the Source instance and that the MSD_DEM_CURRENCY_CODE and MSD_DEM_SCHEMA profiles are set on the Planning Server.
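To read these profile values directly from SQL, a minimal sketch using the standard fnd_profile API (run as APPS; a plain SQL session returns the values visible to its current context, typically the site level, so confirm any user or responsibility overrides in the profile screens):
select fnd_profile.value('MSD_DEM_SCHEMA')   as msd_dem_schema,
       fnd_profile.value('MSD_DEM_HOST_URL') as msd_dem_host_url
from   dual;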
• 61.
next
================================================
After running the "Shipment and booking history" request from the EBS application, only some partial data gets loaded in the Demantra TMPL tables. Here is my log:
+---------------------------------------------------------------------------+
Demand Planning: Version : 12.0.0
Copyright (c) 1979, 1999, Oracle Corporation. All rights reserved.
MSDDEMARD module: Launch EP LOAD
+---------------------------------------------------------------------------+
Current system time is 22-AUG-2008 18:19:53
+---------------------------------------------------------------------------+
**Starts**22-AUG-2008 18:19:53
**Ends**22-AUG-2008 18:24:38
ORA-29273: HTTP request failed
ORA-06512: at "SYS.UTL_HTTP", line 1577
ORA-12535: TNS:operation timed out
+---------------------------------------------------------------------------+
Start of log messages from FND_FILE
+---------------------------------------------------------------------------+
Exception: msd_dem_collect_history_data.run_load - 22-AUG-2008 18:24:38
ORA-29273: HTTP request failed
ORA-06512: at "SYS.UTL_HTTP", line 1577
ORA-12535: TNS:operation timed out
+---------------------------------------------------------------------------+
End of log messages from FND_FILE
+---------------------------------------------------------------------------+
+---------------------------------------------------------------------------+
Executing request completion options...
Finished executing request completion options.
+---------------------------------------------------------------------------+
• 62.
Exceptions posted by this request:
Concurrent Request for "Launch EP LOAD" has completed with error.
+---------------------------------------------------------------------------+
Concurrent request completed
Current system time is 22-AUG-2008 18:24:40
+---------------------------------------------------------------------------+
QUESTION
========
After running the "Shipment and booking history" request from the EBS application, only some partial data gets loaded in the Demantra TMPL tables.
ORA-29273: HTTP request failed
ORA-06512: at "SYS.UTL_HTTP", line 1577
ORA-12535: TNS:operation timed out
1. The issue might be that the parameter 'AppServerURL' is not set properly. Please run the following query from the Demantra schema:
   select pname, pval from sys_params where pname like 'AppServerURL';
2. Please check the AppServerURL in the Business Modeler.
3. Either reconfigure CONNECT_TIMEOUT to be 0, which means wait indefinitely, or reconfigure CONNECT_TIMEOUT to a higher value. Or, if the timeout is unacceptably long, turn on tracing for further information.
next
================================================
Source Data Manipulation, Order Management booked_date
There was data missing in the load because the booked_date was null. Here is what we did to discover and fix the problem:
1. Delete t_src_loc_tmpl.
2. Execute the request set "Standard Collection".
3. Execute the request set "Shipment and Booking History".
4. Review the process log from EBS.
5. Review the collaboration.log.
6. Log in to Workflow in the Demantra environment.
• 63.
7. Check whether the schema ep_load exists.
The tables below do not contain any rows:
T_SRC_SALES_TMPL
T_SRC_ITEMS_TMPL
T_SRC_LOC_TMPL
The error tables do not contain any rows:
T_SRC_SALES_TMPL_ERR
T_SRC_ITEMS_TMPL_ERR
T_SRC_LOC_TMPL_ERR
Check the following:
1. The source for all the Booking History series - oe_order_headers_all and oe_order_lines_all - has data as expected.
2. The source for all the Shipment History series - oe_order_headers_all and oe_order_lines_all - has the data expected.
3. The table oe_order_headers_all has the header_id populated.
4. The table oe_order_lines_all has the line and the ordered_item populated.
5. The table oe_order_headers_all does not have the booked_date set up:
   SELECT booked_date FROM oe_order_headers_all;
To implement the solution, please execute the following steps:
1. Make a copy of the table oe_order_headers_all.
2. Populate the column booked_date with the correct value on the table oe_order_headers_all (see the sketch after this list).
3. Execute the request set "Standard Collection".
4. Execute the request set "Shipment and Booking History".
5. Check the T_SRC_* tables in Demantra to confirm the data was collected.
6. Migrate the solution as appropriate to other environments.
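A hedged sketch of step 2, assuming the booked date can be approximated by the order date for booked orders (ordered_date and booked_flag are assumptions not stated above; validate the rule with the business and test against the copy made in step 1 first):
update oe_order_headers_all h
set    h.booked_date = h.ordered_date   -- assumed derivation; replace with the correct value for your data
where  h.booked_date is null
and    h.booked_flag = 'Y';
commit;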
• 64.
Also: Do you have a shipped date populated for new combinations?
Series data must be populated into the Demantra staging tables before the load process begins. To ensure this occurs, the collection programs for all eight historical series have been merged, for example:
• Booking History – booked items – booked date
• Booking History – requested items – booked date
• Booking History – booked items – requested date
next
================================================
Data not loading because of worksheet date settings?
Regarding the issue of data downloaded into Demantra not showing up in a worksheet: this is caused by the date range specified in the worksheet settings.
CONFIGURE LOADING TEXT FILES DOES NOT LOAD DATA FOR MORE THAN 2000 COLUMN WIDTH.
During a data load from a text file, if the total width of the columns is more than 2000, it fails with the error "ORA-12899: value too large for column "DEMANTRA"."DM_WIZ_IMPORT_FILE_DEF"."SRC_CTL_SYNTAX" (actual: 3037, maximum: 2000)".
* This is a limitation of the functionality. The change is covered in enhancement request 6879562.
next
================================================
SQL*Loader Technical Health Check
1. Make sure that the SQLLDR.EXE utility exists under the Oracle client bin directory and that the system path points there. Verify by opening a CMD window and executing SQLLDR.EXE; if it is not found, add it to the system path and restart.
2. Otherwise, please provide the full engine log from that run, and also provide all the *.log/*.bad/*.txt files that should be under the engine's bin directory.
< END OF DOC >