Really using Oracle analytic SQL functions
Presentation on Oracle analytic SQL functions as I presented it at the UKOUG 2012 conference in Birmingham


Presentation Transcript

  • Really Using Analytic Functions – Kim Berg Hansen, T. Hansen Gruppen A/S
  • Who is this Kim? • A Danish SQL and PL/SQL developer: http://dspsd.blogspot.com • Professional geek since 1996 • Oracle programmer since 2000 • Single SQL Statement mantra (© Tom Kyte) • Danish beer enthusiast (http://ale.dk) • Likes to cook • Reads sci-fi (2012-12-05 #ukoug2012 Really Using Analytic Functions)
  • What’s up? • Why analytics? • Case 1: Top selling items • Case 2: Picking by FIFO • Case 3: Efficient picking route • Case 4: Picking efficiency • Case 5: Forecasting sales • Case 6: Forecast zero firework stock • Case 7: Multi-order FIFO picking (time permitting) • Any questions?
  • Why analytics? • Normal SQL functions operate on one row • Aggregates can do more rows but lose detail • When you need details together with subtotals, ranks, ratios or comparisons, you could do: client operations (tool or code with variables/arrays), scalar subqueries (multiple accesses of the same data), or analytic functions (often much more efficient) • Analytics allow you to operate across the entire result set, not just a single row
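The "aggregates lose detail" point can be seen in a single query: a plain sum() collapses the rows, while sum() over () repeats the grand total on every detail row. A minimal sketch using Python's bundled sqlite3 module (SQLite 3.25+ window functions standing in for Oracle's analytics; the three-row table is invented for illustration):

```python
import sqlite3

# Toy data: three items with quantities (invented for illustration).
conn = sqlite3.connect(":memory:")
conn.execute("create table s (item text, qty integer)")
conn.executemany("insert into s values (?,?)",
                 [("A", 10), ("B", 30), ("C", 60)])

# sum(qty) over () puts the grand total on every detail row, so
# detail, total and ratio all come out of one pass over the data.
rows = conn.execute("""
    select item, qty,
           sum(qty) over ()                          as total,
           round(100.0 * qty / sum(qty) over (), 0) as pct
    from s
    order by item
""").fetchall()
print(rows)
```

Each row keeps its detail qty next to the grand total, which a plain group by could only deliver through a self-join or scalar subquery.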
  • Top Selling Items – Case 1
  • Top selling items • Classic task for a programmer: • Show the top three by product group • Also show how big a percentage they sold of the total – both of the total by product group and of the grand total
  • Tables – items with groups, and sales per month:
    create table items ( item varchar2(10) primary key, grp varchar2(10), name varchar2(20) ) /
    create table sales ( item varchar2(10) references items (item), mth date, qty number ) /
  • Data – items (5 auto parts, 5 mobile accessories):
    insert into items values ('101010','AUTO','Brake disc');
    insert into items values ('102020','AUTO','Snow chain');
    insert into items values ('103030','AUTO','Sparc plug');
    insert into items values ('104040','AUTO','Oil filter');
    insert into items values ('105050','AUTO','Light bulb');
    insert into items values ('201010','MOBILE','Handsfree');
    insert into items values ('202020','MOBILE','Charger');
    insert into items values ('203030','MOBILE','iGloves');
    insert into items values ('204040','MOBILE','Headset');
    insert into items values ('205050','MOBILE','Cover');
  • Data – sales AUTO (sales for various months of 2011 for the auto parts):
    insert into sales values ('101010',date '2011-04-01',10);
    insert into sales values ('101010',date '2011-05-01',11);
    insert into sales values ('101010',date '2011-06-01',12);
    insert into sales values ('102020',date '2011-03-01', 7);
    insert into sales values ('102020',date '2011-07-01', 8);
    insert into sales values ('103030',date '2011-01-01', 6);
    insert into sales values ('103030',date '2011-02-01', 9);
    insert into sales values ('103030',date '2011-11-01', 4);
    insert into sales values ('103030',date '2011-12-01',14);
    insert into sales values ('104040',date '2011-08-01',22);
    insert into sales values ('105050',date '2011-09-01',13);
    insert into sales values ('105050',date '2011-10-01',15);
  • Data – sales MOBILE (sales for various months of 2011 for the mobile accessories):
    insert into sales values ('201010',date '2011-04-01', 5);
    insert into sales values ('201010',date '2011-05-01', 6);
    insert into sales values ('201010',date '2011-06-01', 7);
    insert into sales values ('202020',date '2011-03-01',21);
    insert into sales values ('202020',date '2011-07-01',23);
    insert into sales values ('203030',date '2011-01-01', 7);
    insert into sales values ('203030',date '2011-02-01', 7);
    insert into sales values ('203030',date '2011-11-01', 6);
    insert into sales values ('203030',date '2011-12-01', 8);
    insert into sales values ('204040',date '2011-08-01',35);
    insert into sales values ('205050',date '2011-09-01',13);
    insert into sales values ('205050',date '2011-10-01',15);
  • Base select – join items and sales, take the sales for 2011, and group by to get total 2011 sales per item:
    select i.grp, i.item, max(i.name) name, sum(s.qty) qty
    from items i
    join sales s on s.item = i.item
    where s.mth between date '2011-01-01' and date '2011-12-01'
    group by i.grp, i.item
    order by i.grp, sum(s.qty) desc, i.item
  • Base select – a couple of items in each group have identical sales:
    GRP    ITEM   NAME       QTY
    AUTO   101010 Brake disc  33
    AUTO   103030 Sparc plug  33
    AUTO   105050 Light bulb  28
    AUTO   104040 Oil filter  22
    AUTO   102020 Snow chain  15
    MOBILE 202020 Charger     44
    MOBILE 204040 Headset     35
    MOBILE 203030 iGloves     28
    MOBILE 205050 Cover       28
    MOBILE 201010 Handsfree   18
  • Which TOP? – base select as an inline view:
    select g.grp, g.item, g.name, g.qty
         , dense_rank() over (partition by g.grp order by g.qty desc) drnk
         , rank() over (partition by g.grp order by g.qty desc) rnk
         , row_number() over (partition by g.grp order by g.qty desc, g.item) rnum
    from (
       select i.grp, i.item, max(i.name) name, sum(s.qty) qty
       from items i
       join sales s on s.item = i.item
       where s.mth between date '2011-01-01' and date '2011-12-01'
       group by i.grp, i.item
    ) g
    order by g.grp, g.qty desc, g.item
  • Which TOP? – the three different functions handle ties differently:
    GRP    ITEM   NAME       QTY DRNK RNK RNUM
    AUTO   101010 Brake disc  33    1   1    1
    AUTO   103030 Sparc plug  33    1   1    2
    AUTO   105050 Light bulb  28    2   3    3
    AUTO   104040 Oil filter  22    3   4    4
    AUTO   102020 Snow chain  15    4   5    5
    MOBILE 202020 Charger     44    1   1    1
    MOBILE 204040 Headset     35    2   2    2
    MOBILE 203030 iGloves     28    3   3    3
    MOBILE 205050 Cover       28    3   3    4
    MOBILE 201010 Handsfree   18    4   5    5
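The tie-handling difference between the three functions can be reproduced outside Oracle; this sketch uses Python's sqlite3 (SQLite 3.25+ has the same three window functions) with the AUTO group trimmed to the tied pair plus one more row:

```python
import sqlite3

# The AUTO group from the slides, trimmed to three rows containing a tie.
conn = sqlite3.connect(":memory:")
conn.execute("create table sales_tot (item text, qty integer)")
conn.executemany("insert into sales_tot values (?,?)",
                 [("101010", 33), ("103030", 33), ("105050", 28)])

# Same tie, three different answers: dense_rank keeps ranks consecutive,
# rank skips after a tie, row_number never ties at all.
rows = conn.execute("""
    select item, qty,
           dense_rank() over (order by qty desc)       as drnk,
           rank()       over (order by qty desc)       as rnk,
           row_number() over (order by qty desc, item) as rnum
    from sales_tot
    order by qty desc, item
""").fetchall()
print(rows)
```

The third row shows all three behaviours at once: dense_rank 2, rank 3 (the "skipped silver medal"), row_number 3.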
  • Without inline view – analytics are calculated last, so they can include group by expressions as well as aggregates:
    select i.grp, i.item, max(i.name) name, sum(s.qty) qty
         , dense_rank() over (partition by i.grp order by sum(s.qty) desc) drnk
         , rank() over (partition by i.grp order by sum(s.qty) desc) rnk
         , row_number() over (partition by i.grp order by sum(s.qty) desc, i.item) rnum
    from items i
    join sales s on s.item = i.item
    where s.mth between date '2011-01-01' and date '2011-12-01'
    group by i.grp, i.item
    order by i.grp, sum(s.qty) desc, i.item
  • Without inline view – identical results to the inline-view version.
  • TOP 3 – rank() – an analytic function cannot be used in the where clause, so use an inline view and filter on the alias:
    select g.grp, g.item, g.name, g.qty, g.rnk
    from (
       select i.grp, i.item, max(i.name) name, sum(s.qty) qty
            , rank() over (partition by i.grp order by sum(s.qty) desc) rnk
       from items i
       join sales s on s.item = i.item
       where s.mth between date '2011-01-01' and date '2011-12-01'
       group by i.grp, i.item
    ) g
    where g.rnk <= 3
    order by g.grp, g.rnk, g.item
  • TOP 3 – rank() works like the Olympics: two gold medals mean no silver medal:
    GRP    ITEM   NAME       QTY RNK
    AUTO   101010 Brake disc  33   1
    AUTO   103030 Sparc plug  33   1
    AUTO   105050 Light bulb  28   3
    MOBILE 202020 Charger     44   1
    MOBILE 204040 Headset     35   2
    MOBILE 203030 iGloves     28   3
    MOBILE 205050 Cover       28   3
  • TOP 3 – dense_rank():
    select g.grp, g.item, g.name, g.qty, g.rnk
    from (
       select i.grp, i.item, max(i.name) name, sum(s.qty) qty
            , dense_rank() over (partition by i.grp order by sum(s.qty) desc) rnk
       from items i
       join sales s on s.item = i.item
       where s.mth between date '2011-01-01' and date '2011-12-01'
       group by i.grp, i.item
    ) g
    where g.rnk <= 3
    order by g.grp, g.rnk, g.item
  • TOP 3 – dense_rank() also gives equal rank at ties, but does not skip ranks:
    GRP    ITEM   NAME       QTY RNK
    AUTO   101010 Brake disc  33   1
    AUTO   103030 Sparc plug  33   1
    AUTO   105050 Light bulb  28   2
    AUTO   104040 Oil filter  22   3
    MOBILE 202020 Charger     44   1
    MOBILE 204040 Headset     35   2
    MOBILE 203030 iGloves     28   3
    MOBILE 205050 Cover       28   3
  • TOP 3 – row_number():
    select g.grp, g.item, g.name, g.qty, g.rnk
    from (
       select i.grp, i.item, max(i.name) name, sum(s.qty) qty
            , row_number() over (partition by i.grp order by sum(s.qty) desc, i.item) rnk
       from items i
       join sales s on s.item = i.item
       where s.mth between date '2011-01-01' and date '2011-12-01'
       group by i.grp, i.item
    ) g
    where g.rnk <= 3
    order by g.grp, g.rnk, g.item
  • TOP 3 – row_number() just numbers consecutively; at ties the result is ”random”, so it is a good idea always to use a ”unique” order by:
    GRP    ITEM   NAME       QTY RNK
    AUTO   101010 Brake disc  33   1
    AUTO   103030 Sparc plug  33   2
    AUTO   105050 Light bulb  28   3
    MOBILE 202020 Charger     44   1
    MOBILE 204040 Headset     35   2
    MOBILE 203030 iGloves     28   3
  • Percent of total – ratio_to_report() returns a number between 0 and 1; multiply by 100 to get a percent:
    select g.grp, g.item, g.name, g.qty, g.rnk
         , round(g.g_pct,1) g_pct
         , round(g.t_pct,1) t_pct
    from (
       select i.grp, i.item, max(i.name) name, sum(s.qty) qty
            , rank() over (partition by i.grp order by sum(s.qty) desc) rnk
            , 100 * ratio_to_report(sum(s.qty)) over (partition by i.grp) g_pct
            , 100 * ratio_to_report(sum(s.qty)) over () t_pct
       from items i
       join sales s on s.item = i.item
       where s.mth between date '2011-01-01' and date '2011-12-01'
       group by i.grp, i.item
    ) g
    where g.rnk <= 3
    order by g.grp, g.rnk, g.item
  • Percent of total:
    GRP    ITEM   NAME       QTY RNK G_PCT T_PCT
    AUTO   101010 Brake disc  33   1  25.2  11.6
    AUTO   103030 Sparc plug  33   1  25.2  11.6
    AUTO   105050 Light bulb  28   3  21.4   9.9
    MOBILE 202020 Charger     44   1  28.8  15.5
    MOBILE 204040 Headset     35   2  22.9  12.3
    MOBILE 203030 iGloves     28   3  18.3   9.9
    MOBILE 205050 Cover       28   3  18.3   9.9
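ratio_to_report() is Oracle-specific, but the same ratio is just qty divided by a windowed sum. A sketch via Python's sqlite3, which has no ratio_to_report(), using sum() over (...) instead; note the data set is trimmed to two items per group, so these percentages differ from the slide output (which uses five items per group):

```python
import sqlite3

# Two items per group, trimmed from the slides' data.
conn = sqlite3.connect(":memory:")
conn.execute("create table t (grp text, item text, qty integer)")
conn.executemany("insert into t values (?,?,?)",
                 [("AUTO", "101010", 33), ("AUTO", "105050", 28),
                  ("MOBILE", "202020", 44), ("MOBILE", "204040", 35)])

# qty / sum(qty) over (...) is the ratio_to_report() ratio;
# multiply by 100 for percent of group and percent of grand total.
rows = conn.execute("""
    select grp, item, qty,
           round(100.0 * qty / sum(qty) over (partition by grp), 1) as g_pct,
           round(100.0 * qty / sum(qty) over (), 1)                 as t_pct
    from t
    order by grp, qty desc
""").fetchall()
print(rows)
```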
  • Top selling items • What kind of top three do you wish? – DENSE_RANK(), RANK() or ROW_NUMBER() • PARTITION BY groups of items • RATIO_TO_REPORT for percentages
  • Picking by FIFO – Case 2
  • Picking by FIFO • Items are stored in different locations in the warehouse • Pick an order by the First-In First-Out principle
  • Tables – item, location, quantity and date of purchase; order number, item and quantity ordered:
    create table inventory ( item varchar2(10), loc varchar2(10), qty number, purch date ) /
    create table orderline ( ordno number, item varchar2(10), qty number ) /
  • Data – 2 items, each in 5 locations with various purchase dates; one order with a quantity of both items:
    insert into inventory values ('A1', '1-A-20', 18, date '2004-11-01');
    insert into inventory values ('A1', '1-A-31', 12, date '2004-11-05');
    insert into inventory values ('A1', '1-C-05', 18, date '2004-11-03');
    insert into inventory values ('A1', '2-A-02', 24, date '2004-11-02');
    insert into inventory values ('A1', '2-D-07',  9, date '2004-11-04');
    insert into inventory values ('B1', '1-A-02', 18, date '2004-11-06');
    insert into inventory values ('B1', '1-B-11',  4, date '2004-11-05');
    insert into inventory values ('B1', '1-C-04', 12, date '2004-11-03');
    insert into inventory values ('B1', '1-B-15',  2, date '2004-11-02');
    insert into inventory values ('B1', '2-D-23',  1, date '2004-11-04');
    insert into orderline values (1, 'A1', 24);
    insert into orderline values (1, 'B1', 18);
  • What to pick – the picking application sets a bind variable for which order to pick (could be a sales order, batch order or shop refill order):
    variable pick_order number;
    begin :pick_order := 1; end;
    /
  • What can we pick – join orderline to inventory to see everything that potentially can be picked:
    select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
    from orderline o
    join inventory i on i.item = o.item
    where o.ordno = :pick_order
    order by o.item, i.purch, i.loc
  • What can we pick – visually we can see we need 18 A1 from the first location and 6 from the second; likewise we will empty the first 3 locations of B1 and pick 3 from the fourth:
    ITEM ORD_QTY LOC    PURCH      LOC_QTY
    A1        24 1-A-20 2004-11-01      18
    A1        24 2-A-02 2004-11-02      24
    A1        24 1-C-05 2004-11-03      18
    A1        24 2-D-07 2004-11-04       9
    A1        24 1-A-31 2004-11-05      12
    B1        18 1-B-15 2004-11-02       2
    B1        18 1-C-04 2004-11-03      12
    B1        18 2-D-23 2004-11-04       1
    B1        18 1-B-11 2004-11-05       4
    B1        18 1-A-02 2004-11-06      18
  • Accumulate – let’s try to create a rolling sum of the qty for each item:
    select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
         , sum(i.qty) over (
              partition by i.item
              order by i.purch, i.loc
              rows between unbounded preceding and current row
           ) sum_qty
    from orderline o
    join inventory i on i.item = o.item
    where o.ordno = :pick_order
    order by o.item, i.purch, i.loc
  • Accumulate – yup, when our sum is greater than the ordered qty, it looks like we have enough:
    ITEM ORD_QTY LOC    PURCH      LOC_QTY SUM_QTY
    A1        24 1-A-20 2004-11-01      18      18
    A1        24 2-A-02 2004-11-02      24      42
    A1        24 1-C-05 2004-11-03      18      60
    A1        24 2-D-07 2004-11-04       9      69
    A1        24 1-A-31 2004-11-05      12      81
    B1        18 1-B-15 2004-11-02       2       2
    B1        18 1-C-04 2004-11-03      12      14
    B1        18 2-D-23 2004-11-04       1      15
    B1        18 1-B-11 2004-11-05       4      19
    B1        18 1-A-02 2004-11-06      18      37
  • Filter accumulated – so let’s try to filter on that:
    select s.*
    from (
       select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
            , sum(i.qty) over (
                 partition by i.item
                 order by i.purch, i.loc
                 rows between unbounded preceding and current row
              ) sum_qty
       from orderline o
       join inventory i on i.item = o.item
       where o.ordno = :pick_order
    ) s
    where s.sum_qty < s.ord_qty
    order by s.item, s.purch, s.loc
  • Filter accumulated – FAIL! Missing the last location for each item:
    ITEM ORD_QTY LOC    PURCH      LOC_QTY SUM_QTY
    A1        24 1-A-20 2004-11-01      18      18
    B1        18 1-B-15 2004-11-02       2       2
    B1        18 1-C-04 2004-11-03      12      14
    B1        18 2-D-23 2004-11-04       1      15
  • Accumulate previous – one small change to our rolling sum: sum the rows up to but not including the current row:
    select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
         , sum(i.qty) over (
              partition by i.item
              order by i.purch, i.loc
              rows between unbounded preceding and 1 preceding
           ) sum_prv_qty
    from orderline o
    join inventory i on i.item = o.item
    where o.ordno = :pick_order
    order by o.item, i.purch, i.loc
  • Accumulate previous – as long as the sum of the previous rows is not enough, we continue; when the previous rows are sufficient, we stop:
    ITEM ORD_QTY LOC    PURCH      LOC_QTY SUM_PRV_QTY
    A1        24 1-A-20 2004-11-01      18
    A1        24 2-A-02 2004-11-02      24          18
    A1        24 1-C-05 2004-11-03      18          42
    A1        24 2-D-07 2004-11-04       9          60
    A1        24 1-A-31 2004-11-05      12          69
    B1        18 1-B-15 2004-11-02       2
    B1        18 1-C-04 2004-11-03      12           2
    B1        18 2-D-23 2004-11-04       1          14
    B1        18 1-B-11 2004-11-05       4          15
    B1        18 1-A-02 2004-11-06      18          19
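The "rows between unbounded preceding and 1 preceding" frame is standard SQL, so it can be tried out directly in Python's sqlite3 (SQLite 3.28+ supports this frame spec); here with just the first three A1 inventory rows from the slides, and coalesce() playing the role Oracle's nvl() plays on the next slide:

```python
import sqlite3

# The first three A1 rows from the slides' inventory, in purchase order.
conn = sqlite3.connect(":memory:")
conn.execute("create table inv (item text, loc text, qty integer, purch text)")
conn.executemany("insert into inv values (?,?,?,?)",
                 [("A1", "1-A-20", 18, "2004-11-01"),
                  ("A1", "2-A-02", 24, "2004-11-02"),
                  ("A1", "1-C-05", 18, "2004-11-03")])

# The frame sums everything BEFORE the current row; the first row has
# no preceding rows, so coalesce() turns its NULL into 0.
rows = conn.execute("""
    select loc, qty,
           coalesce(sum(qty) over (
               partition by item
               order by purch, loc
               rows between unbounded preceding and 1 preceding
           ), 0) as sum_prv_qty
    from inv
    order by purch, loc
""").fetchall()
print(rows)
```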
  • Filter previous – now we can filter correctly; nvl() for the first row, least() to get the qty to be picked at each location:
    select s.*
         , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
    from (
       select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
            , nvl(sum(i.qty) over (
                 partition by i.item
                 order by i.purch, i.loc
                 rows between unbounded preceding and 1 preceding
              ),0) sum_prv_qty
       from orderline o
       join inventory i on i.item = o.item
       where o.ordno = :pick_order
    ) s
    where s.sum_prv_qty < s.ord_qty
    order by s.item, s.purch, s.loc
  • Filter previous:
    ITEM ORD_QTY LOC    PURCH      LOC_QTY SUM_PRV_QTY PICK_QTY
    A1        24 1-A-20 2004-11-01      18           0       18
    A1        24 2-A-02 2004-11-02      24          18        6
    B1        18 1-B-15 2004-11-02       2           0        2
    B1        18 1-C-04 2004-11-03      12           2       12
    B1        18 2-D-23 2004-11-04       1          14        1
    B1        18 1-B-11 2004-11-05       4          15        3
  • Picking list FIFO – now order by location to make a pick list:
    select s.loc, s.item
         , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
    from (
       select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
            , nvl(sum(i.qty) over (
                 partition by i.item
                 order by i.purch, i.loc
                 rows between unbounded preceding and 1 preceding
              ),0) sum_prv_qty
       from orderline o
       join inventory i on i.item = o.item
       where o.ordno = :pick_order
    ) s
    where s.sum_prv_qty < s.ord_qty
    order by s.loc
  • Picking list FIFO – ready for the operator to go picking:
    LOC    ITEM PICK_QTY
    1-A-20 A1         18
    1-B-11 B1          3
    1-B-15 B1          2
    1-C-04 B1         12
    2-A-02 A1          6
    2-D-23 B1          1
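The whole FIFO pattern – rolling sum of previous rows, filter, then cap the last pick – runs end-to-end under sqlite3 as well. This sketch trims the data to item A1 only; SQLite spellings stand in for Oracle's: coalesce() for nvl() and the two-argument scalar min() for least():

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    create table inventory (item text, loc text, qty integer, purch text);
    create table orderline (ordno integer, item text, qty integer);
""")
# A1 inventory from the slides; the order wants 24 of A1.
conn.executemany("insert into inventory values (?,?,?,?)",
                 [("A1", "1-A-20", 18, "2004-11-01"),
                  ("A1", "2-A-02", 24, "2004-11-02"),
                  ("A1", "1-C-05", 18, "2004-11-03")])
conn.execute("insert into orderline values (1, 'A1', 24)")

# Keep locations while the previous rows are insufficient; the last
# location picked is capped at whatever remains of the ordered qty.
rows = conn.execute("""
    select s.loc, s.item,
           min(s.loc_qty, s.ord_qty - s.sum_prv_qty) as pick_qty
    from (
        select o.item, o.qty as ord_qty, i.loc, i.qty as loc_qty,
               coalesce(sum(i.qty) over (
                   partition by i.item
                   order by i.purch, i.loc
                   rows between unbounded preceding and 1 preceding
               ), 0) as sum_prv_qty
        from orderline o
        join inventory i on i.item = o.item
        where o.ordno = 1
    ) s
    where s.sum_prv_qty < s.ord_qty
    order by s.loc
""").fetchall()
print(rows)
```

18 from the oldest location, then only the 6 still missing from the next: the same picks the slide shows for A1.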
  • Picking small qty – change pick policy simply by changing the order; here we empty small quantities first to clean out locations:
    select s.loc, s.item
         , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
    from (
       select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
            , nvl(sum(i.qty) over (
                 partition by i.item
                 order by i.qty, i.loc   -- << only line changed
                 rows between unbounded preceding and 1 preceding
              ),0) sum_prv_qty
       from orderline o
       join inventory i on i.item = o.item
       where o.ordno = :pick_order
    ) s
    where s.sum_prv_qty < s.ord_qty
    order by s.loc
  • Picking small qty – lots of picks, but will clean out locations quickly for new incoming goods:
    LOC    ITEM PICK_QTY
    1-A-20 A1          3
    1-A-31 A1         12
    1-B-11 B1          4
    1-B-15 B1          2
    1-C-04 B1         11
    2-D-07 A1          9
    2-D-23 B1          1
  • Picking few picks – or a policy of picking as few times as possible; same query with:
    order by i.qty desc, i.loc   -- << only line changed
  • Picking few picks – only two picks, but at the expense of leaving small quantities all over the warehouse:
    LOC    ITEM PICK_QTY
    1-A-02 B1         18
    2-A-02 A1         24
  • Picking short route – or a policy of not driving to the far warehouse if possible; same query with:
    order by i.loc   -- << only line changed
  • Picking short route – everything is picked in the very first aisle of the warehouse:
    LOC    ITEM PICK_QTY
    1-A-02 B1         18
    1-A-20 A1         18
    1-A-31 A1          6
  • Picking by FIFO • SUM() by item • Ordered by purchase date • Rolling sum to find how much was picked by the ”previous rows” • Filter away rows where sufficient has already been picked
  • Efficient picking route – Case 3
  • Picking small qty – same data as the FIFO picking case, but for this case we will use the policy of picking small quantities (the query from before, with the analytic order by i.qty, i.loc)
  • Picking small qty – because that gives many picks and shows this case best. Notice anything about these data?
    LOC    ITEM PICK_QTY
    1-A-20 A1          3
    1-A-31 A1         12
    1-B-11 B1          4
    1-B-15 B1          2
    1-C-04 B1         11
    2-D-07 A1          9
    2-D-23 B1          1
  • Picking route • Is this a smart route to drive?
  • Better picking route • We need to change direction every other aisle
  • Decipher loc – in this case the location can be split into warehouse, aisle and position simply by substr():
    select to_number(substr(s.loc,1,1)) warehouse
         , substr(s.loc,3,1) aisle
         , to_number(substr(s.loc,5,2)) position
         , s.loc, s.item
         , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
    from (
       select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
            , nvl(sum(i.qty) over (
                 partition by i.item
                 order by i.qty, i.loc
                 rows between unbounded preceding and 1 preceding
              ),0) sum_prv_qty
       from orderline o
       join inventory i on i.item = o.item
       where o.ordno = :pick_order
    ) s
    where s.sum_prv_qty < s.ord_qty
    order by s.loc
  • Decipher loc – now we can use analytics on the individual parts of the location:
    WAREHOUSE AISLE POSITION LOC    ITEM PICK_QTY
            1 A           20 1-A-20 A1          3
            1 A           31 1-A-31 A1         12
            1 B           11 1-B-11 B1          4
            1 B           15 1-B-15 B1          2
            1 C            4 1-C-04 B1         11
            2 D            7 2-D-07 A1          9
            2 D           23 2-D-23 B1          1
  • Rank aisles – ordering by warehouse and aisle gives the same rank to positions in the same aisle; dense_rank() ensures consecutive ranks:
    select to_number(substr(s.loc,1,1)) warehouse
         , substr(s.loc,3,1) aisle
         , dense_rank() over (
              order by to_number(substr(s.loc,1,1)) -- warehouse
                     , substr(s.loc,3,1)            -- aisle
           ) aisle_no
         , to_number(substr(s.loc,5,2)) position
         , s.loc, s.item
         , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
    from (
       select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
            , nvl(sum(i.qty) over (
                 partition by i.item
                 order by i.qty, i.loc
                 rows between unbounded preceding and 1 preceding
              ),0) sum_prv_qty
       from orderline o
       join inventory i on i.item = o.item
       where o.ordno = :pick_order
    ) s
    where s.sum_prv_qty < s.ord_qty
    order by s.loc
  • Rank aisles – now we have numbered the aisles in the order they are to be visited:
    WAREHOUSE AISLE AISLE_NO POSITION LOC    ITEM PICK_QTY
            1 A            1       20 1-A-20 A1          3
            1 A            1       31 1-A-31 A1         12
            1 B            2       11 1-B-11 B1          4
            1 B            2       15 1-B-15 B1          2
            1 C            3        4 1-C-04 B1         11
            2 D            4        7 2-D-07 A1          9
            2 D            4       23 2-D-23 B1          1
  • Odd up, even down – mod() and case allow us to order by position ascending on odd aisles and descending on even aisles:
    select s2.warehouse, s2.aisle, s2.aisle_no, s2.position, s2.loc, s2.item, s2.pick_qty
    from (
       select to_number(substr(s.loc,1,1)) warehouse
            , substr(s.loc,3,1) aisle
            , dense_rank() over (
                 order by to_number(substr(s.loc,1,1)) -- warehouse
                        , substr(s.loc,3,1)            -- aisle
              ) aisle_no
            , to_number(substr(s.loc,5,2)) position
            , s.loc, s.item
            , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
       from (
          select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
               , nvl(sum(i.qty) over (
                    partition by i.item
                    order by i.qty, i.loc
                    rows between unbounded preceding and 1 preceding
                 ),0) sum_prv_qty
          from orderline o
          join inventory i on i.item = o.item
          where o.ordno = :pick_order
       ) s
       where s.sum_prv_qty < s.ord_qty
    ) s2
    order by s2.warehouse
           , s2.aisle_no
           , case when mod(s2.aisle_no,2) = 1 then s2.position else -s2.position end
  • Odd up, even down – and so aisle 1-A is ascending, 1-B is descending, 1-C is ascending, 2-D is descending:
    WAREHOUSE AISLE AISLE_NO POSITION LOC    ITEM PICK_QTY
            1 A            1       20 1-A-20 A1          3
            1 A            1       31 1-A-31 A1         12
            1 B            2       15 1-B-15 B1          2
            1 B            2       11 1-B-11 B1          4
            1 C            3        4 1-C-04 B1         11
            2 D            4       23 2-D-23 B1          1
            2 D            4        7 2-D-07 A1          9
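The serpentine ordering itself (dense_rank the aisles, then negate the position on even aisles) is easy to try in isolation. A sketch via Python's sqlite3 using just the warehouse-1 pick locations from the slide, with SQLite's % operator standing in for mod():

```python
import sqlite3

# The warehouse-1 pick locations from the slide.
conn = sqlite3.connect(":memory:")
conn.execute("create table picks (loc text)")
conn.executemany("insert into picks values (?)",
                 [("1-A-20",), ("1-A-31",), ("1-B-11",),
                  ("1-B-15",), ("1-C-04",)])

# dense_rank() numbers the aisles consecutively; odd aisles sort by
# position ascending, even aisles by the negated position (descending).
rows = conn.execute("""
    select loc, aisle_no, pos
    from (
        select loc,
               dense_rank() over (
                   order by substr(loc, 1, 1), substr(loc, 3, 1)
               ) as aisle_no,
               cast(substr(loc, 5, 2) as integer) as pos
        from picks
    )
    order by aisle_no,
             case when aisle_no % 2 = 1 then pos else -pos end
""").fetchall()
print([r[0] for r in rows])
```

Aisle A (odd) runs up through positions 20 and 31, aisle B (even) runs back down from 15 to 11, aisle C runs up again.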
  • Single door • Direction has to ”restart” per warehouse
  • Partition warehouse – move the warehouse part from the order by to the partition by; same query, but the dense_rank() becomes:
    dense_rank() over (
       partition by to_number(substr(s.loc,1,1)) -- warehouse
       order by substr(s.loc,3,1)                -- aisle
    ) aisle_no
  • Partition warehouse – now the aisle_no restarts for each warehouse, so the first visited aisle of a warehouse is always odd and therefore sorted ascending:
    WAREHOUSE AISLE AISLE_NO POSITION LOC    ITEM PICK_QTY
            1 A            1       20 1-A-20 A1          3
            1 A            1       31 1-A-31 A1         12
            1 B            2       15 1-B-15 B1          2
            1 B            2       11 1-B-11 B1          4
            1 C            3        4 1-C-04 B1         11
            2 D            1        7 2-D-07 A1          9
            2 D            1       23 2-D-23 B1          1
  • Efficient picking route • DENSE_RANK() to number the aisles in the order visited • Order the output ”up” on odd aisles and ”down” on even aisles • Partition by warehouse if a door is missing
  • Picking efficiency – Case 4
  • Picking efficiency • How fast can operators pick items? • How much do they wait idle for totes to arrive?
  • Table – a mission is recorded every time a tote (loadunit) goes from one position to another on the conveyor system:
    create table missions ( missionid number primary key, loadunit number, departpos varchar2(10), departtime date, arrivepos varchar2(10), arrivetime date ) /
  • Mission data:
    insert into missions values ( 35986751, 10063485, 'STORE', timestamp '2012-04-12 06:38:07', 'PLF4', timestamp '2012-04-12 08:00:03' );
    insert into missions values ( 35986752, 10016906, 'STORE', timestamp '2012-04-12 06:38:07', 'PLF4', timestamp '2012-04-12 08:01:41' );
    insert into missions values ( 35986754, 10059580, 'STORE', timestamp '2012-04-12 06:38:07', 'PLF4', timestamp '2012-04-12 08:01:09' );
    insert into missions values ( 35986755, 10056277, 'STORE', timestamp '2012-04-12 06:38:07', 'PLF4', timestamp '2012-04-12 08:01:16' );
    insert into missions values ( 35986757, 10051547, 'STORE', timestamp '2012-04-12 06:38:07', 'PLF4', timestamp '2012-04-12 08:02:40' );
    ...2690 inserts snipped out...
    insert into missions values ( 35992214, 10064588, 'PLF4', timestamp '2012-04-12 11:13:20', 'STORE', timestamp '2012-04-12 11:15:12' );
    insert into missions values ( 35992216, 10066518, 'PLF4', timestamp '2012-04-12 11:13:22', 'STORE', timestamp '2012-04-12 11:15:30' );
    insert into missions values ( 35992219, 10082114, 'PLF4', timestamp '2012-04-12 11:13:43', 'STORE', timestamp '2012-04-12 11:15:35' );
    insert into missions values ( 35992220, 10033235, 'PLF4', timestamp '2012-04-12 11:13:52', 'STORE', timestamp '2012-04-12 11:15:50' );
    insert into missions values ( 35992223, 10056459, 'PLF4', timestamp '2012-04-12 11:14:59', 'STORE', timestamp '2012-04-12 11:21:03' );
  • Arrivals – all missions arriving at picking stations PLF4 and PLF5 on April 12th after 08:00:
    select a.arrivepos pos, a.arrivetime time, a.loadunit, a.missionid
    from missions a
    where a.arrivepos in ('PLF4','PLF5')
    and a.arrivetime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
    and a.arrivetime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
    order by a.arrivepos, a.arrivetime
  • Arrivals:
    POS  TIME     LOADUNIT MISSIONID
    PLF4 08:00:03 10063485  35986751
    PLF4 08:00:11 10069588  35986762
    PLF4 08:01:09 10059580  35986754
    ...
    PLF4 12:47:51 10069370  35990243
    PLF4 12:47:58 10026743  35990248
    PLF4 12:49:06 10013439  35990250
    PLF5 08:00:00 10040198  35987250
    PLF5 08:00:07 10008351  35987251
    PLF5 08:00:14 10068629  35987225
    ...
    PLF5 11:28:47 10078376  35990936
    PLF5 11:28:56 10035491  35990918
    PLF5 11:29:07 10010287  35991015
    1453 rows selected.
  • Departures – all missions departing from picking stations PLF4 and PLF5 on April 12th after 08:00:
    select d.departpos pos, d.departtime time, d.loadunit, d.missionid
    from missions d
    where d.departpos in ('PLF4','PLF5')
    and d.departtime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
    and d.departtime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
    order by d.departpos, d.departtime
  • Departures:
    POS  TIME     LOADUNIT MISSIONID
    PLF4 08:00:00 10067235  35988299
    PLF4 08:00:08 10063485  35988300
    PLF4 08:01:07 10069588  35988307
    ...
    PLF4 11:13:43 10082114  35992219
    PLF4 11:13:52 10033235  35992220
    PLF4 11:14:59 10056459  35992223
    PLF5 08:00:06 10040198  35988296
    PLF5 08:00:13 10008351  35988302
    PLF5 08:00:35 10068629  35988303
    ...
    PLF5 11:08:36 10018796  35992157
    PLF5 11:08:45 10058221  35992158
    PLF5 11:09:00 10030575  35992159
    1247 rows selected.
  • Combined events

select pos, time, ad, loadunit, missionid
from (
  select a.arrivepos pos, a.arrivetime time, 'A' ad, a.loadunit, a.missionid
  from missions a
  where a.arrivepos in ('PLF4','PLF5')
  and a.arrivetime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
  and a.arrivetime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
  union all
  select d.departpos pos, d.departtime time, 'D' ad, d.loadunit, d.missionid
  from missions d
  where d.departpos in ('PLF4','PLF5')
  and d.departtime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
  and d.departtime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
) s1
order by pos, time

73 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Combined events POS TIME AD LOADUNIT MISSIONID ---- -------- -- --------- --------- Arrivals and PLF4 PLF4 08:00:00 08:00:03 D A 10067235 35988299 10063485 35986751 departures joined PLF4 08:00:08 D 10063485 35988300 allows us to see the PLF4 08:00:11 A 10069588 35986762 PLF4 08:01:07 D 10069588 35988307 loadunit arriving PLF4 08:01:09 A 10059580 35986754 and a little bit later PLF4 08:01:14 D 10059580 35988308 PLF4 08:01:16 A 10056277 35986755 departing PLF4 08:01:24 D 10056277 35988309 PLF4 08:01:26 A 10081310 35986764 PLF4 08:01:39 D 10081310 35988310 PLF4 08:01:41 A 10016906 35986752 ... 2700 rows selected.74 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Lead the next event

The analytic function lead() gives for each row the time of the next row

with s1 as (
  select a.arrivepos pos, a.arrivetime time, 'A' ad, a.loadunit, a.missionid
  from missions a
  where a.arrivepos in ('PLF4','PLF5')
  and a.arrivetime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
  and a.arrivetime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
  union all
  select d.departpos pos, d.departtime time, 'D' ad, d.loadunit, d.missionid
  from missions d
  where d.departpos in ('PLF4','PLF5')
  and d.departtime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
  and d.departtime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
)
select pos
     , time
     , lead(time) over (
         partition by pos
         order by time, missionid
       ) nexttime
     , ad
     , loadunit
from s1
order by pos, time

75 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Lead the next event POS TIME NEXTTIME AD LOADUNIT ---- -------- -------- -- --------- So on each ’D’ row PLF4 PLF4 08:00:00 08:00:03 08:00:03 08:00:08 D A 10067235 10063485 NEXTTIME is the PLF4 08:00:08 08:00:11 D 10063485 time of the PLF4 08:00:11 08:01:07 A 10069588 PLF4 08:01:07 08:01:09 D 10069588 following ’A’ row PLF4 08:01:09 08:01:14 A 10059580 PLF4 08:01:14 08:01:16 D 10059580 PLF4 08:01:16 08:01:24 A 10056277 PLF4 08:01:24 08:01:26 D 10056277 And on each ’A’ row PLF4 08:01:26 08:01:39 A 10081310 NEXTTIME is the PLF4 08:01:39 08:01:41 D 10081310 PLF4 08:01:41 08:01:57 A 10016906 time of the ... 2700 rows selected. following ’D’ row76 2012-12-05 #ukoug2012 Really Using Analytic Functions
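[Editor's sketch, not part of the original deck] The LEAD() behaviour on the slide above can be checked outside the database with a few lines of Python; the `lead` helper below is purely illustrative:

```python
# Minimal sketch of LEAD(expr [, offset]) within one partition:
# element i of the result is values[i + offset], or None past the end
# (Oracle returns NULL there unless a default is given).

def lead(values, offset=1):
    return [values[i + offset] if i + offset < len(values) else None
            for i in range(len(values))]

# Time-ordered events for one position (D, A, D, A ... as on the slide)
events = ["08:00:00", "08:00:03", "08:00:08", "08:00:11"]
next_time = lead(events)       # like LEAD(time)   OVER (partition ... order ...)
next2_time = lead(events, 2)   # like LEAD(time,2) OVER (partition ... order ...)
```

The partitioning in the SQL simply means this list logic runs once per value of POS.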
  • Lead on

lead() accepts a second parameter telling how many rows forward the function should "look"

with s1 as (
  select a.arrivepos pos, a.arrivetime time, 'A' ad, a.loadunit, a.missionid
  from missions a
  where a.arrivepos in ('PLF4','PLF5')
  and a.arrivetime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
  and a.arrivetime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
  union all
  select d.departpos pos, d.departtime time, 'D' ad, d.loadunit, d.missionid
  from missions d
  where d.departpos in ('PLF4','PLF5')
  and d.departtime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
  and d.departtime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
)
select pos, time
     , lead(time) over (
         partition by pos
         order by time, missionid
       ) nexttime
     , lead(time,2) over (
         partition by pos
         order by time, missionid
       ) next2time
     , ad, loadunit
from s1
order by pos, time

77 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Lead on POS TIME NEXTTIME NEXT2TIM AD LOADUNIT ---- -------- -------- -------- -- --------- The NEXT2TIME PLF4 PLF4 08:00:00 08:00:03 08:00:03 08:00:08 08:00:08 08:00:11 D A 10067235 10063485 column ”looks” 2 PLF4 08:00:08 08:00:11 08:01:07 D 10063485 rows forward PLF4 08:00:11 08:01:07 08:01:09 A 10069588 PLF4 08:01:07 08:01:09 08:01:14 D 10069588 PLF4 08:01:09 08:01:14 08:01:16 A 10059580 PLF4 08:01:14 08:01:16 08:01:24 D 10059580 PLF4 08:01:16 08:01:24 08:01:26 A 10056277 PLF4 08:01:24 08:01:26 08:01:39 D 10056277 PLF4 08:01:26 08:01:39 08:01:41 A 10081310 PLF4 08:01:39 08:01:41 08:01:57 D 10081310 PLF4 08:01:41 08:01:57 08:01:59 A 10016906 ... 2700 rows selected.78 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Filter double lead

Since we use the double lead we now have all the data necessary on the 'A' rows and do not need the 'D' rows anymore

with s1 as (
  select a.arrivepos pos, a.arrivetime time, 'A' ad, a.loadunit, a.missionid
  from missions a
  where a.arrivepos in ('PLF4','PLF5')
  and a.arrivetime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
  and a.arrivetime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
  union all
  select d.departpos pos, d.departtime time, 'D' ad, d.loadunit, d.missionid
  from missions d
  where d.departpos in ('PLF4','PLF5')
  and d.departtime >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
  and d.departtime <= to_date('2012-04-12 23:59:59','YYYY-MM-DD HH24:MI:SS')
)
select pos, time arrive, nexttime depart, next2time nextarrive, loadunit
from (
  select pos, time
       , lead(time) over (
           partition by pos
           order by time, missionid
         ) nexttime
       , lead(time,2) over (
           partition by pos
           order by time, missionid
         ) next2time
       , ad, loadunit
  from s1
) s2
where ad = 'A'
order by pos, arrive

79 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Filter double lead POS ARRIVE DEPART NEXTARRI LOADUNIT ---- -------- -------- -------- --------- We can now see a tote PLF4 08:00:03 08:00:08 08:00:11 10063485 arrives 08:00:03, leaves PLF4 08:00:11 08:01:07 08:01:09 10069588 PLF4 08:01:09 08:01:14 08:01:16 10059580 again at 08:00:08, and a PLF4 08:01:16 08:01:24 08:01:26 10056277 new tote arrives at PLF4 08:01:26 08:01:39 08:01:41 10081310 PLF4 08:01:41 08:01:57 08:01:59 10016906 08:00:11 ... PLF4 10:59:47 10:59:54 10:59:56 10076144 PLF4 10:59:56 11:00:11 11:00:12 10012882 Note the tote that PLF4 11:00:12 11:00:28 11:00:29 10035898 PLF4 11:00:29 11:00:42 11:00:44 10076793 arrived 10:59:56 leaves ... after 11:00:00 1453 rows selected.80 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Pick and wait

Calculate pick seconds and wait seconds; filter on the desired 3-hour interval

with s1 as ( ... )
select pos, arrive, depart, nextarrive
     , (depart - arrive) * 24 * 60 * 60 pickseconds
     , (nextarrive - depart) * 24 * 60 * 60 waitseconds
from (
  select pos, time arrive, nexttime depart, next2time nextarrive, loadunit
  from (
    select pos, time
         , lead(time) over (
             partition by pos
             order by time, missionid
           ) nexttime
         , lead(time,2) over (
             partition by pos
             order by time, missionid
           ) next2time
         , ad, loadunit
    from s1
  ) s2
  where ad = 'A'
) s3
where arrive >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
and arrive <= to_date('2012-04-12 10:59:59','YYYY-MM-DD HH24:MI:SS')
order by pos, arrive

81 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Pick and wait POS ARRIVE DEPART NEXTARRI PICKSECONDS WAITSECONDS ---- -------- -------- -------- ----------- ----------- How fast did the PLF4 08:00:03 08:00:08 08:00:11 5 3 PLF4 08:00:11 08:01:07 08:01:09 56 2 operator pick and PLF4 08:01:09 08:01:14 08:01:16 5 2 PLF4 08:01:16 08:01:24 08:01:26 8 2 how long time did ... PLF4 08:58:08 08:58:08 09:11:36 0 808 he wait for a new PLF4 09:11:36 09:12:55 09:12:56 79 1 ... tote to arrive PLF4 10:59:47 10:59:54 10:59:56 7 2 PLF4 10:59:56 11:00:11 11:00:12 15 1 PLF5 08:00:00 08:00:06 08:00:07 6 1 PLF5 08:00:07 08:00:13 08:00:14 6 1 ... PLF5 10:57:54 10:59:58 10:59:59 124 1 PLF5 10:59:59 11:00:09 11:00:10 10 1 1155 rows selected.82 2012-12-05 #ukoug2012 Really Using Analytic Functions
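[Editor's sketch, not part of the original deck] The `(depart - arrive) * 24 * 60 * 60` expression works because subtracting two Oracle DATEs yields a difference in days; multiplying by 24*60*60 converts it to seconds. The same arithmetic with Python's datetime, using the first PLF4 row above:

```python
from datetime import datetime

# First PLF4 row on the slide: arrive 08:00:03, depart 08:00:08
arrive = datetime(2012, 4, 12, 8, 0, 3)
depart = datetime(2012, 4, 12, 8, 0, 8)

# Oracle: (depart - arrive) * 24 * 60 * 60; Python gives seconds directly
pick_seconds = (depart - arrive).total_seconds()   # 5.0
```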
  • Hourly stats

Now we can use the previous select as basis for some plain statistics by the hour

with s1 as ( ... )
select pos
     , trunc(arrive,'HH24') hour
     , count(*) picks
     , avg(pickseconds) secondsprpick
     , sum(pickseconds)/60 minutespicked
     , 100*sum(pickseconds)/sum(pickseconds+waitseconds) pickpct
     , avg(waitseconds) secondsprwait
     , sum(waitseconds)/60 minuteswaited
     , 100*sum(waitseconds)/sum(pickseconds+waitseconds) waitpct
     , avg(pickseconds+waitseconds) secondsprcycle
     , sum(pickseconds+waitseconds)/60 minutestotal
     , 60 * count(*) / sum(pickseconds+waitseconds) cyclesprmin
from (
  select pos, arrive, depart, nextarrive
       , (depart - arrive) * 24 * 60 * 60 pickseconds
       , (nextarrive - depart) * 24 * 60 * 60 waitseconds
  from (
    select pos, time arrive, nexttime depart, next2time nextarrive, loadunit
    from (
      select pos, time
           , lead(time) over (
               partition by pos
               order by time, missionid
             ) nexttime
           , lead(time,2) over (
               partition by pos
               order by time, missionid
             ) next2time
           , ad, loadunit
      from s1
    ) s2
    where ad = 'A'
  ) s3
  where arrive >= to_date('2012-04-12 08:00:00','YYYY-MM-DD HH24:MI:SS')
  and arrive <= to_date('2012-04-12 10:59:59','YYYY-MM-DD HH24:MI:SS')
) s4
group by pos, trunc(arrive,'HH24')
order by pos, trunc(arrive,'HH24')

83 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Hourly stats sec sec sec cycle pr min pick pr min wait pr min pr POS HOUR PICKS pick pickd pct wait waitd pct cycle total min ----- -------- ------ ----- ------ ----- ----- ------ ----- ----- ------ ----- PLF4 08:00:00 156 20.3 52.9 73.9 7.2 18.7 26.1 27.5 71.6 2.2 PLF4 09:00:00 159 13.2 35.0 71.9 5.1 13.6 28.1 18.3 48.6 3.3 PLF4 10:00:00 165 19.8 54.5 90.8 2.0 5.5 9.2 21.8 60.0 2.8 PLF5 08:00:00 247 12.9 53.2 85.3 2.2 9.2 14.7 15.2 62.4 4.0 PLF5 09:00:00 179 15.9 47.4 82.3 3.4 10.2 17.7 19.3 57.6 3.1 PLF5 10:00:00 249 10.9 45.4 75.3 3.6 14.9 24.7 14.5 60.4 4.1 6 rows selected.84 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Picking efficiency • Log over tote missions arriving and departing the picking stations • LEAD() on mission log to find the departure following an arrival => picking time • LEAD(,2) on mission log to find the arrival following a departure => waiting time85 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Forecasting sales

Case 5

86 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Forecasting sales

• Forecast the sales of next year
• But follow the trend of the item

87 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Table

Simple table of monthly sales by item

create table sales (
  item varchar2(10),
  mth  date,
  qty  number
)
/

88 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Data

Item Snowchain sells well in winter and trends up:

insert into sales values ('Snowchain', date '2008-01-01', 79);
insert into sales values ('Snowchain', date '2008-02-01', 133);
insert into sales values ('Snowchain', date '2008-03-01', 24);
...
insert into sales values ('Snowchain', date '2010-10-01', 1);
insert into sales values ('Snowchain', date '2010-11-01', 73);
insert into sales values ('Snowchain', date '2010-12-01', 160);

Item Sunshade sells well in summer and trends down:

insert into sales values ('Sunshade', date '2008-01-01', 4);
insert into sales values ('Sunshade', date '2008-02-01', 6);
insert into sales values ('Sunshade', date '2008-03-01', 32);
...
insert into sales values ('Sunshade', date '2010-10-01', 11);
insert into sales values ('Sunshade', date '2010-11-01', 3);
insert into sales values ('Sunshade', date '2010-12-01', 5);

89 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Slope

Graph slope: y-axis is qty, x-axis is a number with the scale of 1 = a month. Range between gives a sliding 2-year window.

select sales.item
     , sales.mth
     , sales.qty
     , regr_slope(
         sales.qty
       , extract(year from sales.mth) * 12 + extract(month from sales.mth)
       ) over (
         partition by sales.item
         order by sales.mth
         range between interval '23' month preceding and current row
       ) slope
from sales
order by sales.item, sales.mth

90 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Slope ITEM MTH QTY SLOPE ---------- ---------- ----- -------- Slope value most Snowchain 2008-01-01 79 Snowchain 2008-02-01 133 54.000 accurate for 2010 Snowchain ... 2008-03-01 24 -27.500 data where 2 year Snowchain Snowchain 2010-10-01 2010-11-01 1 73 -2.274 -2.363 sliding window Snowchain 2010-12-01 160 -.991 contains full set of Sunshade 2008-01-01 4 Sunshade 2008-02-01 6 2.000 data Sunshade 2008-03-01 32 14.000 ... Sunshade 2010-10-01 11 .217 Sunshade 2010-11-01 3 -.200 Sunshade 2010-12-01 5 -.574 72 rows selected.91 2012-12-05 #ukoug2012 Really Using Analytic Functions
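[Editor's sketch, not part of the original deck] REGR_SLOPE(y, x) returns the least-squares regression slope, COVAR_POP(x, y) / VAR_POP(x). A few lines of Python reproduce the slide's first slope values (x is year*12 + month, exactly as in the query):

```python
# Population least-squares slope, mirroring REGR_SLOPE's (y, x) argument order
def regr_slope(pairs):
    n = len(pairs)
    avg_x = sum(x for _, x in pairs) / n
    avg_y = sum(y for y, _ in pairs) / n
    covar = sum((x - avg_x) * (y - avg_y) for y, x in pairs) / n
    var_x = sum((x - avg_x) ** 2 for _, x in pairs) / n
    return covar / var_x

# Snowchain, Jan-Mar 2008: qty 79, 133, 24; x = 2008*12 + month
two_months = [(79, 24097), (133, 24098)]
three_months = two_months + [(24, 24099)]
regr_slope(two_months)     # 54.0, as on the slide's second row
regr_slope(three_months)   # -27.5, as on the slide's third row
```

With only two points the slope is simply the month-to-month difference, which is why the 2008-02 row shows 54.000.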
  • Transpose using slope

As the x-axis had a scale of 1 = a month and the y-axis was qty, multiplying slope by 12 gives how much qty goes up or down in a year

select item, mth, qty
     , qty + 12 * slope qty_next_year
from (
  select sales.item, sales.mth, sales.qty
       , regr_slope(
           sales.qty
         , extract(year from sales.mth) * 12 + extract(month from sales.mth)
         ) over (
           partition by sales.item
           order by sales.mth
           range between interval '23' month preceding and current row
         ) slope
  from sales
)
where mth >= date '2010-01-01'
order by item, mth

92 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Transpose using slope ITEM MTH QTY QTY_NEXT_YEAR ---------- ---------- ----- ------------- Snowchain 2010-01-01 167 188,313043 Sunshade 2010-01-01 2 -11,617391 Snowchain 2010-02-01 247 304,855652 Sunshade 2010-02-01 8 -11,137391 Snowchain 2010-03-01 42 96,3913043 Sunshade 2010-03-01 28 9,11304348 Snowchain 2010-04-01 0 42,6991304 Sunshade 2010-04-01 26 8,86086957 Snowchain 2010-05-01 0 30,8869565 Sunshade 2010-05-01 23 9,66434783 Snowchain 2010-06-01 0 19,0747826 Sunshade 2010-06-01 46 39,1130435 Snowchain 2010-07-01 0 7,2626087 Sunshade 2010-07-01 73 79,4486957 Snowchain 2010-08-01 1 -3,4295652 Sunshade 2010-08-01 25 31,7147826 Snowchain 2010-09-01 0 -16,121739 Sunshade 2010-09-01 13 18,0504348 Snowchain 2010-10-01 1 -26,292174 Sunshade 2010-10-01 11 13,6086957 Snowchain 2010-11-01 73 44,6434783 Sunshade 2010-11-01 3 ,594782609 Snowchain 2010-12-01 160 148,109565 Sunshade 2010-12-01 5 -1,886956593 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Forecast

Rather than column QTY_NEXT_YEAR we add a year to the month and call it a forecast. We round the numbers and skip any negatives.

select item
     , add_months(mth, 12) mth
     , greatest(round(qty + 12 * slope), 0) forecast
from (
  select sales.item, sales.mth, sales.qty
       , regr_slope(
           sales.qty
         , extract(year from sales.mth) * 12 + extract(month from sales.mth)
         ) over (
           partition by sales.item
           order by sales.mth
           range between interval '23' month preceding and current row
         ) slope
  from sales
)
where mth >= date '2010-01-01'
order by item, mth

94 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Forecast ITEM MTH FORECAST ---------- ---------- --------- Snowchain 2011-01-01 188 Sunshade 2011-01-01 0 Snowchain 2011-02-01 305 Sunshade 2011-02-01 0 Snowchain 2011-03-01 96 Sunshade 2011-03-01 9 Snowchain 2011-04-01 43 Sunshade 2011-04-01 9 Snowchain 2011-05-01 31 Sunshade 2011-05-01 10 Snowchain 2011-06-01 19 Sunshade 2011-06-01 39 Snowchain 2011-07-01 7 Sunshade 2011-07-01 79 Snowchain 2011-08-01 0 Sunshade 2011-08-01 32 Snowchain 2011-09-01 0 Sunshade 2011-09-01 18 Snowchain 2011-10-01 0 Sunshade 2011-10-01 14 Snowchain 2011-11-01 45 Sunshade 2011-11-01 1 Snowchain 2011-12-01 148 Sunshade 2011-12-01 095 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Actual + forecast

UNION ALL of the actual data and the forecast data gives a complete set of sales data that can be shown in a graph

select item, mth, qty, type
from (
  select sales.item, sales.mth, sales.qty, 'Actual' type
  from sales
  union all
  select item
       , add_months(mth, 12) mth
       , greatest(round(qty + 12 * slope), 0) qty
       , 'Forecast' type
  from (
    select sales.item, sales.mth, sales.qty
         , regr_slope(
             sales.qty
           , extract(year from sales.mth) * 12 + extract(month from sales.mth)
           ) over (
             partition by sales.item
             order by sales.mth
             range between interval '23' month preceding and current row
           ) slope
    from sales
  )
  where mth >= date '2010-01-01'
)
order by item, mth

96 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Actual + forecast ITEM MTH QTY TYPE ---------- ---------- ----- ---------- Snowchain 2008-01-01 79 Actual Sunshade 2008-01-01 4 Actual Snowchain 2008-02-01 133 Actual Sunshade 2008-02-01 6 Actual Snowchain 2008-03-01 24 Actual Sunshade 2008-03-01 32 Actual ... ... Snowchain 2010-10-01 1 Actual Sunshade 2010-10-01 11 Actual Snowchain 2010-11-01 73 Actual Sunshade 2010-11-01 3 Actual Snowchain 2010-12-01 160 Actual Sunshade 2010-12-01 5 Actual Snowchain 2011-01-01 188 Forecast Sunshade 2011-01-01 0 Forecast Snowchain 2011-02-01 305 Forecast Sunshade 2011-02-01 0 Forecast Snowchain 2011-03-01 96 Forecast Sunshade 2011-03-01 9 Forecast ... ... Snowchain 2011-10-01 0 Forecast Sunshade 2011-10-01 14 Forecast Snowchain 2011-11-01 45 Forecast Sunshade 2011-11-01 1 Forecast Snowchain 2011-12-01 148 Forecast Sunshade 2011-12-01 0 Forecast97 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Actual + forecast

Data from the previous slide as a graph:
• "Actual" is solid lines
• "Forecast" is dashed lines

98 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Forecasting sales

• REGR_SLOPE() to calculate the trend
• RANGE window for a sliding two-year trend calculation across the three years of sales data
• "Transpose" last year's sales by the slope to get next year's forecast

99 2012-12-05 #ukoug2012 Really Using Analytic Functions
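[Editor's sketch, not part of the original deck] The "transpose" step reduces to one line of arithmetic: move this year's qty along the trend line by 12 months, round, and floor at zero (a negative sales forecast makes no sense):

```python
# forecast = GREATEST(ROUND(qty + 12 * slope), 0) from the query above
def forecast(qty, slope):
    return max(round(qty + 12 * slope), 0)

forecast(73, -2.363)   # Snowchain Nov 2010 -> 45, matching the Forecast slide
forecast(1, -2.274)    # Snowchain Oct 2010 -> 0 (negative clamped to zero)
```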
  • Forecast zero stock

Case 6

100 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Forecast zero stock

• Fireworks sell like crazy the last week of December
• What hour will a store run out of stock?

101 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Tables

Stores are defined by how many storage containers:

create table fw_store (
  shopid     varchar2(10) primary key,
  containers integer
)
/

Sales are hourly data per shop in Net Explosive Mass:

create table fw_sales (
  shopid    varchar2(10) references fw_store (shopid),
  saleshour date,
  salesnem  number
)
/

102 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Tables

Daily budget of Net Explosive Mass per shop:

create table fw_daybudget (
  shopid     varchar2(10) references fw_store (shopid),
  budgetdate date,
  budgetnem  number
)
/

Percentage of a day's budget expected to fall in each hour:

create table fw_hourbudget (
  hour    integer,
  percent number
)
/

103 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Data - store

insert into fw_store values ('AALBORG', 4);
insert into fw_store values ('GLOSTRUP', 4);
insert into fw_store values ('HADERSLEV', 3);

104 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Data - sales

insert into fw_sales
select shopid, day + numtodsinterval(hour, 'hour') saleshour, salesnem
from (
  select 'AALBORG' shopid, date '2011-12-27' day,
         4 h9, 6 h10, 5 h11, 20 h12, 19 h13, 22 h14, 27 h15, 11 h16, 16 h17, 4 h18
  from dual union all
  select 'AALBORG', date '2011-12-28', 7, 17, 18, 13, 27, 28, 20, 14, 10, 19 from dual union all
  select 'AALBORG', date '2011-12-29', 10, 14, 20, null, null, null, null, null, null, null from dual union all
  select 'GLOSTRUP', date '2011-12-27', 1, 6, 6, 14, 17, 17, 13, 15, 7, 7 from dual union all
  select 'GLOSTRUP', date '2011-12-28', 4, 14, 30, 35, 22, 21, 35, 34, 15, 25 from dual union all
  select 'GLOSTRUP', date '2011-12-29', 6, 13, 50, null, null, null, null, null, null, null from dual union all
  select 'HADERSLEV', date '2011-12-27', 4, 7, 13, 15, 17, 13, 18, 19, 10, 3 from dual union all
  select 'HADERSLEV', date '2011-12-28', 8, 5, 14, 18, 20, 18, 15, 24, 12, 1 from dual union all
  select 'HADERSLEV', date '2011-12-29', 1, 19, 33, null, null, null, null, null, null, null from dual
) s1
unpivot exclude nulls (
  salesnem for hour in (
    h9 as 9, h10 as 10, h11 as 11, h12 as 12, h13 as 13,
    h14 as 14, h15 as 15, h16 as 16, h17 as 17, h18 as 18
  )
)

105 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Data - daybudget

insert into fw_daybudget values ('AALBORG', date '2011-12-27', 150);
insert into fw_daybudget values ('AALBORG', date '2011-12-28', 200);
insert into fw_daybudget values ('AALBORG', date '2011-12-29', 300);
insert into fw_daybudget values ('AALBORG', date '2011-12-30', 500);
insert into fw_daybudget values ('AALBORG', date '2011-12-31', 400);
insert into fw_daybudget values ('GLOSTRUP', date '2011-12-27', 150);
insert into fw_daybudget values ('GLOSTRUP', date '2011-12-28', 200);
insert into fw_daybudget values ('GLOSTRUP', date '2011-12-29', 300);
insert into fw_daybudget values ('GLOSTRUP', date '2011-12-30', 500);
insert into fw_daybudget values ('GLOSTRUP', date '2011-12-31', 400);
insert into fw_daybudget values ('HADERSLEV', date '2011-12-27', 100);
insert into fw_daybudget values ('HADERSLEV', date '2011-12-28', 150);
insert into fw_daybudget values ('HADERSLEV', date '2011-12-29', 200);
insert into fw_daybudget values ('HADERSLEV', date '2011-12-30', 400);
insert into fw_daybudget values ('HADERSLEV', date '2011-12-31', 300);

106 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Data - hourbudget

insert into fw_hourbudget values ( 9, 4);
insert into fw_hourbudget values (10, 8);
insert into fw_hourbudget values (11, 10);
insert into fw_hourbudget values (12, 12);
insert into fw_hourbudget values (13, 12);
insert into fw_hourbudget values (14, 12);
insert into fw_hourbudget values (15, 14);
insert into fw_hourbudget values (16, 14);
insert into fw_hourbudget values (17, 10);
insert into fw_hourbudget values (18, 4);

107 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Starting NEM

Three stores: 2 have 4 containers (= 1000 kg NEM), 1 has 3 containers (= 750 kg NEM)

select s.shopid
     , s.containers * 250 startnem
from fw_store s
order by s.shopid

            start
SHOPID        nem
---------- ------
AALBORG      1000
GLOSTRUP     1000
HADERSLEV     750

108 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Budget per hour

Cartesian join of the daily budget with the hour percentages gives us an hourly budget

select db.shopid
     , db.budgetdate + numtodsinterval(hb.hour, 'hour') budgethour
     , db.budgetnem * hb.percent / 100 budgetnem
from fw_daybudget db
cross join fw_hourbudget hb
order by db.shopid, budgethour

109 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Budget per hour

This hourly budget data is now directly comparable to the hourly sales data

                                budgt
SHOPID     BUDGETHOUR             nem
---------- ------------------- ------
AALBORG    2011-12-27 09:00:00      6
AALBORG    2011-12-27 10:00:00     12
AALBORG    2011-12-27 11:00:00     15
...
AALBORG    2011-12-31 15:00:00     56
AALBORG    2011-12-31 16:00:00     56
AALBORG    2011-12-31 17:00:00     40
AALBORG    2011-12-31 18:00:00     16
GLOSTRUP   2011-12-27 09:00:00      6
GLOSTRUP   2011-12-27 10:00:00     12
GLOSTRUP   2011-12-27 11:00:00     15
...
HADERSLEV  2011-12-31 16:00:00     42
HADERSLEV  2011-12-31 17:00:00     30
HADERSLEV  2011-12-31 18:00:00     12

150 rows selected.

110 2012-12-05 #ukoug2012 Really Using Analytic Functions
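[Editor's sketch, not part of the original deck] The cross join is just "every day-budget row paired with every hour-percentage row"; each pair yields one hourly budget of `budgetnem * percent / 100`:

```python
# Cross join sketched as a dict comprehension over both tables
day_budget = {"2011-12-27": 150, "2011-12-28": 200}   # per-shop daily NEM budget
hour_pct = {9: 4, 10: 8, 11: 10}                      # fw_hourbudget rows

hourly = {(d, h): nem * pct / 100
          for d, nem in day_budget.items()
          for h, pct in hour_pct.items()}

hourly[("2011-12-27", 9)]   # 6.0, matching the first AALBORG row above
```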
  • WITH clauses

Use the starting NEM and hourly budget selects as WITH clauses

with shop as (
  select s.shopid
       , s.containers * 250 startnem
  from fw_store s
), budget as (
  select db.shopid
       , db.budgetdate + numtodsinterval(hb.hour, 'hour') budgethour
       , db.budgetnem * hb.percent / 100 budgetnem
  from fw_daybudget db
  cross join fw_hourbudget hb
)
...

111 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Budget + sales

Join shop and budget, outer join to sales – then we can accumulate both budget and sales

... Shop and Budget WITH clauses ...
select budget.shopid, shop.startnem, budget.budgethour hour
     , budget.budgetnem
     , sum(budget.budgetnem) over (
         partition by budget.shopid
         order by budget.budgethour
         rows between unbounded preceding and current row
       ) budgetnemacc
     , sales.salesnem
     , sum(sales.salesnem) over (
         partition by budget.shopid
         order by budget.budgethour
         rows between unbounded preceding and current row
       ) salesnemacc
from shop
join budget
  on budget.shopid = shop.shopid
left outer join fw_sales sales
  on sales.shopid = budget.shopid
  and sales.saleshour = budget.budgethour
order by budget.shopid, budget.budgethour

112 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Budget + sales budgt sales start budgt nem sales nem ”Now” is December SHOPID nem HOUR nem acc nem acc ---------- ------ ------------------- ------ ------ ------ ------ 29th exactly 12:00, AALBORG AALBORG 1000 1000 2011-12-27 09:00:00 2011-12-27 10:00:00 6 12 6 18 4 6 4 10 so sales data stops AALBORG AALBORG 1000 1000 2011-12-27 11:00:00 2011-12-27 12:00:00 15 18 33 51 5 20 15 35 there ... AALBORG 1000 2011-12-29 10:00:00 24 386 14 331 Accumulated data AALBORG AALBORG 1000 1000 2011-12-29 2011-12-29 11:00:00 12:00:00 30 36 416 452 20 351 351 show we are behind AALBORG 1000 2011-12-29 13:00:00 36 488 351 budget ... AALBORG 1000 2011-12-31 15:00:00 56 1438 351 AALBORG 1000 2011-12-31 16:00:00 56 1494 351 AALBORG 1000 2011-12-31 17:00:00 40 1534 351 AALBORG 1000 2011-12-31 18:00:00 16 1550 351113 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Yet another WITH

Real code would use SYSDATE rather than a hardcoded "now". qtynem contains actual sales for as long as we have them, and budget data after "now".

... Shop and Budget WITH clauses ...
), nem as (
  select budget.shopid, shop.startnem, budget.budgethour hour
       , case
           when budget.budgethour < to_date('2011-12-29 12:00:00',
                                            'YYYY-MM-DD HH24:MI:SS')
           then 'S'
           else 'B'
         end salesbudget
       , case
           when budget.budgethour < to_date('2011-12-29 12:00:00',
                                            'YYYY-MM-DD HH24:MI:SS')
           then nvl(sales.salesnem,0)
           else budget.budgetnem
         end qtynem
  from shop
  join budget
    on budget.shopid = shop.shopid
  left outer join fw_sales sales
    on sales.shopid = budget.shopid
    and sales.saleshour = budget.budgethour
)

114 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Stock level

Accumulate qtynem similar to the FIFO code and subtract from startnem to calculate the stock at the beginning of each hour

... Shop, Budget and Nem WITH clauses ...
select nem.shopid
     , nem.hour
     , nem.salesbudget
     , nem.qtynem
     , sum(nem.qtynem) over (
         partition by nem.shopid
         order by nem.hour
         rows between unbounded preceding and current row
       ) sumnem
     , greatest(nem.startnem - nvl(
         sum(nem.qtynem) over (
           partition by nem.shopid
           order by nem.hour
           rows between unbounded preceding and 1 preceding
         )
       ,0),0) stocknem
from nem
order by shopid, hour

115 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Stock level S qty sum stock SHOPID HOUR B nem nem nem At the beginning of hour 16 ---------- ------------------- - ------ ------ ------ AALBORG 2011-12-27 09:00:00 S 4 4 1000 on December 30th, there AALBORG 2011-12-27 10:00:00 S 6 10 996 AALBORG 2011-12-27 11:00:00 S 5 15 990 will be 55 kg NEM left ... AALBORG AALBORG 2011-12-29 2011-12-29 10:00:00 11:00:00 S S 14 20 331 351 683 669 During the hour we expect AALBORG AALBORG 2011-12-29 2011-12-29 12:00:00 13:00:00 B B 36 36 387 423 649 613 to sell 70 kg and will run ... AALBORG 2011-12-30 15:00:00 B 70 945 125 out AALBORG 2011-12-30 16:00:00 B 70 1015 55 AALBORG 2011-12-30 17:00:00 B 50 1065 0 AALBORG 2011-12-30 18:00:00 B 20 1085 0 ... AALBORG 2011-12-31 17:00:00 B 40 1469 0 AALBORG 2011-12-31 18:00:00 B 16 1485 0116 2012-12-05 #ukoug2012 Really Using Analytic Functions
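[Editor's sketch, not part of the original deck] The "UNBOUNDED PRECEDING AND 1 PRECEDING" window means: stock at the start of an hour is the starting NEM minus everything consumed in all earlier hours (not the current one), floored at zero. As a loop:

```python
# Stock at the beginning of each hour, mirroring the stocknem expression
def stock_levels(startnem, hourly_qty):
    levels, consumed = [], 0
    for qty in hourly_qty:
        levels.append(max(startnem - consumed, 0))  # window ends at 1 PRECEDING
        consumed += qty                             # current hour counts next time
    return levels

# First AALBORG hours on the slide: qty 4, 6, 5 from a 1000 kg start
stock_levels(1000, [4, 6, 5])   # [1000, 996, 990]
```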
  • The hour of zero stock

The last hour in which we still have stock. The stock we have left, divided by the qty expected sold that hour, gives the final fraction of an hour before we reach zero.

... Shop, Budget and Nem WITH clauses ...
select shopid
     , max(hour)
       + numtodsinterval(
           max(stocknem) keep (dense_rank last order by hour)
           / max(qtynem) keep (dense_rank last order by hour)
         , 'hour'
         ) zerohour
from (
  select nem.shopid, nem.hour, nem.salesbudget, nem.qtynem
       , sum(nem.qtynem) over (
           partition by nem.shopid
           order by nem.hour
           rows between unbounded preceding and current row
         ) sumnem
       , greatest(nem.startnem - nvl(
           sum(nem.qtynem) over (
             partition by nem.shopid
             order by nem.hour
             rows between unbounded preceding and 1 preceding
           )
         ,0),0) stocknem
  from nem
)
where stocknem > 0
group by shopid
order by shopid

117 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • The hour of zero stock SHOPID ZEROHOUR ---------- ------------------- And so our logistics AALBORG 2011-12-30 16:47:08 planner has a GLOSTRUP 2011-12-30 15:59:08 forecast for when HADERSLEV 2011-12-30 15:58:55 the shops will run out of fireworks118 2012-12-05 #ukoug2012 Really Using Analytic Functions
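[Editor's sketch, not part of the original deck] The interpolation behind ZEROHOUR is simple: remaining stock divided by that hour's expected sales gives the fraction of the final hour. For AALBORG, 55 kg left at 16:00 with 70 kg expected that hour:

```python
# Fraction of the last stocked hour before reaching zero
def zero_fraction(stocknem, qtynem):
    return stocknem / qtynem

frac = zero_fraction(55, 70)      # ~0.7857 of an hour
seconds = frac * 3600             # ~2828.6 s past 16:00, i.e. 16:47:08
```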
  • Model the same

... Shop, Budget and Nem WITH clauses ...
select shopid, rn, hour, startnem, salesbudget, qtynem, sumnem, stocknem, zerohour
from nem
model
  partition by (shopid)
  dimension by (rn)
  measures (
    hour, startnem, salesbudget, qtynem,
    qtynem sumnem, startnem stocknem,
    cast(null as date) zerohour
  )
  rules sequential order iterate (49) (
    sumnem[iteration_number+1] =
      sumnem[iteration_number] + qtynem[iteration_number],
    stocknem[iteration_number+1] =
      stocknem[iteration_number] - qtynem[iteration_number]
      + case
          when qtynem[iteration_number] > stocknem[iteration_number]
          then startnem[0]
          else 0
        end,
    zerohour[iteration_number+1] =
      case
        when qtynem[iteration_number+1] > stocknem[iteration_number+1]
        then hour[iteration_number+1]
           + numtodsinterval(
               stocknem[iteration_number+1] / qtynem[iteration_number+1]
             , 'hour'
             )
        else null
      end
  )
order by shopid, hour

119 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Model the same start S qty sum stock SHOPID RN HOUR nem B nem nem nem ZEROHOUR ---------- --- ------------------- ------ - ------ ------ ------ ------------------- AALBORG 0 2011-12-27 09:00:00 1000 S 4 4 1000 AALBORG 1 2011-12-27 10:00:00 1000 S 6 8 996 AALBORG 2 2011-12-27 11:00:00 1000 S 5 14 990 ... AALBORG 35 2011-12-30 14:00:00 1000 B 60 819 185 AALBORG 36 2011-12-30 15:00:00 1000 B 70 879 125 AALBORG 37 2011-12-30 16:00:00 1000 B 70 949 55 2011-12-30 16:47:08 AALBORG 38 2011-12-30 17:00:00 1000 B 50 1019 985 AALBORG 39 2011-12-30 18:00:00 1000 B 20 1069 935 ... AALBORG 48 2011-12-31 17:00:00 1000 B 40 1433 571 AALBORG 49 2011-12-31 18:00:00 1000 B 16 1473 531120 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Change the law

If politicians decide containers may no longer hold 250 kg NEM each but only 100 kg, we just change the multiplier:

with shop as (
  select s.shopid
       , s.containers * 100 startnem
  from fw_store s
), budget as (
  ...
), nem as (
  ...
)
select shopid, rn, hour, startnem, salesbudget, qtynem, sumnem, stocknem, zerohour
from nem
model
...

121 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Change the law start S qty sum stock SHOPID RN HOUR nem B nem nem nem ZEROHOUR Using MODEL ---------- --- ------------------- ------ - ------ ------ ------ ------------------- AALBORG 0 2011-12-27 09:00:00 400 S 4 4 400 clause allows AALBORG 1 2011-12-27 10:00:00 400 S 6 8 396 ... for forecast AALBORG 23 2011-12-29 12:00:00 400 B 36 355 49 AALBORG 24 2011-12-29 13:00:00 400 B 36 391 13 2011-12-29 13:21:40 repeated refill AALBORG 25 2011-12-29 14:00:00 400 B 36 427 377 ... of stock AALBORG 33 2011-12-30 12:00:00 400 B 60 699 105 AALBORG 34 2011-12-30 13:00:00 400 B 60 759 45 2011-12-30 13:45:00 AALBORG 35 2011-12-30 14:00:00 400 B 60 819 385 ... AALBORG 42 2011-12-31 11:00:00 400 B 40 1137 67 AALBORG 43 2011-12-31 12:00:00 400 B 48 1177 27 2011-12-31 12:33:45 AALBORG 44 2011-12-31 13:00:00 400 B 48 1225 379 ... AALBORG 49 2011-12-31 18:00:00 400 B 16 1473 131122 2012-12-05 #ukoug2012 Really Using Analytic Functions
  • Zero hours

... Shop, Budget and Nem WITH clauses ...
select shopid, zerohour
from (
  select shopid, rn, hour, startnem, salesbudget, qtynem, sumnem, stocknem, zerohour
  from nem
  model
    partition by (shopid)
    dimension by (rn)
    measures (
      hour, startnem, salesbudget, qtynem,
      qtynem sumnem, startnem stocknem,
      cast(null as date) zerohour
    )
    rules sequential order iterate (49) (
      sumnem[iteration_number+1] =
        sumnem[iteration_number] + qtynem[iteration_number],
      stocknem[iteration_number+1] =
        stocknem[iteration_number] - qtynem[iteration_number]
        + case
            when qtynem[iteration_number] > stocknem[iteration_number]
            then startnem[0]
            else 0
          end,
      zerohour[iteration_number+1] =
        case
          when qtynem[iteration_number+1] > stocknem[iteration_number+1]
          then hour[iteration_number+1]
             + numtodsinterval(
                 stocknem[iteration_number+1] / qtynem[iteration_number+1]
               , 'hour'
               )
          else null
        end
    )
)
where zerohour is not null
order by shopid, zerohour

123 2012-12-05 #ukoug2012 Really Using Analytic Functions
• Zero hours

With a smaller amount in the containers we need to refill the shops multiple times:

SHOPID     ZEROHOUR
---------- -------------------
AALBORG    2011-12-29 13:21:40
AALBORG    2011-12-30 13:45:00
AALBORG    2011-12-31 12:33:45
GLOSTRUP   2011-12-29 11:51:36
GLOSTRUP   2011-12-30 12:49:00
GLOSTRUP   2011-12-31 11:16:30
HADERSLEV  2011-12-29 11:47:16
HADERSLEV  2011-12-30 13:01:15
HADERSLEV  2011-12-31 11:02:00
• Forecast zero stock

• SUM() on budget sales data from "now" forward
• Identify hour when rolling sum exceeds stock
  – ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING
    (similar technique as picking by FIFO)
• More than analytics:
  – MODEL clause for repeated refill of stock
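The zero-stock logic on this slide can be sketched outside SQL as well. Below is a minimal plain-Python illustration of the same idea: keep a rolling sum of the budgeted hourly sales, find the first hour where it would exceed the stock, and interpolate within that hour. The shop data and numbers are invented for illustration; this is not the presenter's code.

```python
from datetime import datetime, timedelta

# Hypothetical hourly sales budget for one shop (hour, budgeted qty)
# and a starting stock; the figures are made up for this sketch.
stock = 100
budget = [
    (datetime(2011, 12, 29, 10), 30),
    (datetime(2011, 12, 29, 11), 40),
    (datetime(2011, 12, 29, 12), 50),
]

def zero_hour(stock, budget):
    """Return the moment the rolling budget sum exhausts the stock.

    Mirrors the slide's analytic logic: compare the sum of all
    *previous* hours to the stock, and interpolate within the first
    hour that tips over (stocknem / qtynem as a fraction of an hour).
    """
    sum_prv = 0  # rolling sum of budget up to, not including, this hour
    for hour, qty in budget:
        if sum_prv + qty > stock:
            remaining = stock - sum_prv  # pieces left at start of this hour
            return hour + timedelta(hours=remaining / qty)
        sum_prv += qty
    return None  # stock outlasts the budgeted period

print(zero_hour(stock, budget))
```

With these invented numbers the stock of 100 runs out 36 minutes into the 12:00 hour, the same style of answer the MODEL query produces per shop and refill.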
Case 7

• Multi-order FIFO picking
• Multi-order FIFO

Monty Latiolais, president of ODTUG:
• Need to pick multiple orders
• Each order by First-In-First-Out
• Second order "continues" where first order stops, and so on
• Tables

Just like Case 2:

create table inventory (
   item  varchar2(10), -- identification of the item
   loc   varchar2(10), -- identification of the location
   qty   number,       -- quantity present at that location
   purch date          -- date that quantity was purchased
)
/

create table orderline (
   ordno number,       -- id-number of the order
   item  varchar2(10), -- identification of the item
   qty   number        -- quantity ordered
)
/
• Data

Inventory data exactly like Case 2.
Orderline data this time for three orders:

insert into orderline values (51, 'A1', 24);
insert into orderline values (51, 'B1', 18);
insert into orderline values (62, 'A1', 8);
insert into orderline values (73, 'A1', 16);
insert into orderline values (73, 'B1', 6);
• Batch pick

Group by on orderline creates an orderbatch:

with orderbatch as (
   select o.item
        , sum(o.qty) qty
   from orderline o
   where o.ordno in (51, 62, 73)
   group by o.item
)
select <FIFO sql> ...
• Batch pick

We can apply the FIFO code on the orderbatch:

with orderbatch as ( ... )
select s.loc, s.item
     , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
from (
   select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.purch, i.loc
             rows between unbounded preceding and 1 preceding
          ),0) sum_prv_qty
   from orderbatch o
   join inventory i on i.item = o.item
) s
where s.sum_prv_qty < s.ord_qty
order by s.loc
• Batch pick

Works OK, but the operator cannot see how much of each pick goes to what order:

LOC    ITEM PICK_QTY
------ ---- --------
1-A-02 B1          5
1-A-20 A1         18
1-B-11 B1          4
1-B-15 B1          2
1-C-04 B1         12
1-C-05 A1          6
2-A-02 A1         24
2-D-23 B1          1
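The FIFO pick itself boils down to one rolling sum. Here is the same logic sketched in plain Python, so the role of the "sum of all previous locations" window is easy to follow; the inventory rows below are the slide's A1 example, but the function is an illustration, not the presenter's code.

```python
# Inventory for one item as (loc, purch_date, qty),
# already ordered by purch date, then loc - the FIFO order.
inventory = [
    ("1-A-20", "2004-11-01", 18),
    ("2-A-02", "2004-11-02", 24),
    ("1-C-05", "2004-11-03", 6),
]

def fifo_picks(inventory, ord_qty):
    """Pick ord_qty pieces oldest-first, mirroring the slide's SQL.

    sum_prv plays the part of the analytic
    SUM(qty) OVER (... ROWS BETWEEN UNBOUNDED PRECEDING AND 1 PRECEDING),
    and the if-test is the WHERE sum_prv_qty < ord_qty filter.
    """
    picks = []
    sum_prv = 0  # quantity available in all earlier (older) locations
    for loc, purch, loc_qty in inventory:
        if sum_prv < ord_qty:
            # least(loc_qty, ord_qty - sum_prv_qty) from the slide
            picks.append((loc, min(loc_qty, ord_qty - sum_prv)))
        sum_prv += loc_qty
    return picks

print(fifo_picks(inventory, 48))  # whole batch of A1
```

Picking the full batch of 48 empties all three locations; a smaller order quantity stops partway through a location, exactly as the SQL's LEAST() does.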
• Pick qty intervals

Let's use more analytics: two rolling sums allow us to calculate from_qty and to_qty.

with orderbatch as ( ... )
select s.loc, s.item
     , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
     , sum_prv_qty + 1 from_qty
     , least(sum_qty, ord_qty) to_qty
from (
   select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.purch, i.loc
             rows between unbounded preceding and 1 preceding
          ),0) sum_prv_qty
        , nvl(sum(i.qty) over (
             partition by i.item
             order by i.purch, i.loc
             rows between unbounded preceding and current row
          ),0) sum_qty
   from orderbatch o
   join inventory i on i.item = o.item
) s
where s.sum_prv_qty < s.ord_qty
order by s.item, s.purch, s.loc
• Pick qty intervals

So "from 1 to 18" of the 48 pieces of A1 are picked in the first location, "from 19 to 42" are picked in the second location, and so on:

LOC    ITEM PICK_QTY FROM_QTY TO_QTY
------ ---- -------- -------- ------
1-A-20 A1         18        1     18
2-A-02 A1         24       19     42
1-C-05 A1          6       43     48
1-B-15 B1          2        1      2
1-C-04 B1         12        3     14
2-D-23 B1          1       15     15
1-B-11 B1          4       16     19
1-A-02 B1          5       20     24
• Order qty intervals

Do the same with the order quantities:

select o.ordno
     , o.item
     , o.qty
     , nvl(sum(o.qty) over (
          partition by o.item
          order by o.ordno
          rows between unbounded preceding and 1 preceding
       ),0) + 1 from_qty
     , nvl(sum(o.qty) over (
          partition by o.item
          order by o.ordno
          rows between unbounded preceding and current row
       ),0) to_qty
from orderline o
where ordno in (51, 62, 73)
order by o.item, o.ordno
• Order qty intervals

"From 1 to 24" of the 48 pieces ordered of A1 is on order no 51, and so on:

ORDNO ITEM      QTY FROM_QTY TO_QTY
----- ---- -------- -------- ------
   51 A1         24        1     24
   62 A1          8       25     32
   73 A1         16       33     48
   51 B1         18        1     18
   73 B1          6       19     24
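Turning a rolling sum into from/to intervals is a small step: from_qty is the sum of all previous quantities plus one, to_qty is the sum including the current row. A plain-Python sketch of that step, using the slide's A1 orderlines as data (the helper function itself is invented for illustration):

```python
# Orderlines for one item as (ordno, qty), ordered by ordno.
orders = [(51, 24), (62, 8), (73, 16)]

def qty_intervals(orders):
    """Annotate each row with its (from_qty, to_qty) interval.

    running mirrors SUM(qty) OVER (... UNBOUNDED PRECEDING AND 1 PRECEDING):
    from_qty = running + 1, to_qty = running + qty.
    """
    out, running = [], 0
    for ordno, qty in orders:
        out.append((ordno, qty, running + 1, running + qty))
        running += qty
    return out

for row in qty_intervals(orders):
    print(row)
```

The result reproduces the slide's A1 rows: order 51 owns pieces 1-24, order 62 pieces 25-32, order 73 pieces 33-48.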
• Overlapping intervals

The orderlines with qty intervals we put in a with clause:

with orderlines as (
   select o.ordno, o.item, o.qty
        , nvl(sum(o.qty) over (
             partition by o.item
             order by o.ordno
             rows between unbounded preceding and 1 preceding
          ),0) + 1 from_qty
        , nvl(sum(o.qty) over (
             partition by o.item
             order by o.ordno
             rows between unbounded preceding and current row
          ),0) to_qty
   from orderline o
   where ordno in (51, 62, 73)
), orderbatch as (
...
• Overlapping intervals

We create the orderbatch from the orderlines as a second with clause:

...
), orderbatch as (
   select o.item
        , sum(o.qty) qty
   from orderlines o
   group by o.item
), fifo as (
...
• Overlapping intervals

And the FIFO calculation with qty intervals as a third with clause:

...
), fifo as (
   select s.loc, s.item, s.purch
        , least(s.loc_qty, s.ord_qty - s.sum_prv_qty) pick_qty
        , sum_prv_qty + 1 from_qty
        , least(sum_qty, ord_qty) to_qty
   from (
      select o.item, o.qty ord_qty, i.loc, i.purch, i.qty loc_qty
           , nvl(sum(i.qty) over (
                partition by i.item
                order by i.purch, i.loc
                rows between unbounded preceding and 1 preceding
             ),0) sum_prv_qty
           , nvl(sum(i.qty) over (
                partition by i.item
                order by i.purch, i.loc
                rows between unbounded preceding and current row
             ),0) sum_qty
      from orderbatch o
      join inventory i on i.item = o.item
   ) s
   where s.sum_prv_qty < s.ord_qty
)
...
• Overlapping intervals

Now we join the fifo and orderlines on overlapping intervals:

with orderlines as (
   ...
), orderbatch as (
   ...
), fifo as (
   ...
)
select f.loc, f.item, f.purch, f.pick_qty, f.from_qty, f.to_qty
     , o.ordno, o.qty, o.from_qty, o.to_qty
from fifo f
join orderlines o
  on o.item = f.item
 and o.to_qty >= f.from_qty
 and o.from_qty <= f.to_qty
order by f.item, f.purch, o.ordno
• Overlapping intervals

The single pick of 24 at location 2-A-02 is joined to three orderlines, all with overlapping intervals:

LOC    ITEM PURCH      PICK_QTY FROM_QTY TO_QTY ORDNO    QTY FROM_QTY TO_QTY
------ ---- ---------- -------- -------- ------ ----- ------ -------- ------
1-A-20 A1   2004-11-01       18        1     18    51     24        1     24
2-A-02 A1   2004-11-02       24       19     42    51     24        1     24
2-A-02 A1   2004-11-02       24       19     42    62      8       25     32
2-A-02 A1   2004-11-02       24       19     42    73     16       33     48
1-C-05 A1   2004-11-03        6       43     48    73     16       33     48
1-B-15 B1   2004-11-02        2        1      2    51     18        1     18
1-C-04 B1   2004-11-03       12        3     14    51     18        1     18
2-D-23 B1   2004-11-04        1       15     15    51     18        1     18
1-B-11 B1   2004-11-05        4       16     19    51     18        1     18
1-B-11 B1   2004-11-05        4       16     19    73      6       19     24
1-A-02 B1   2004-11-06        5       20     24    73      6       19     24
• Individual pick qty

The intervals can now be used for calculating how much from the location is picked for the individual order:

with orderlines as ( ... ), orderbatch as ( ... ), fifo as ( ... )
select f.loc, f.item, f.purch, f.pick_qty, f.from_qty, f.to_qty
     , o.ordno, o.qty, o.from_qty, o.to_qty
     , least(
          f.loc_qty
        , least(o.to_qty, f.to_qty) - greatest(o.from_qty, f.from_qty) + 1
       ) pick_ord_qty
from fifo f
join orderlines o
  on o.item = f.item
 and o.to_qty >= f.from_qty
 and o.from_qty <= f.to_qty
order by f.item, f.purch, o.ordno
• Individual pick qty

LOC    ITEM PURCH      PICK_QTY FROM_QTY TO_QTY ORDNO    QTY FROM_QTY TO_QTY PICK_ORD_QTY
------ ---- ---------- -------- -------- ------ ----- ------ -------- ------ ------------
1-A-20 A1   2004-11-01       18        1     18    51     24        1     24           18
2-A-02 A1   2004-11-02       24       19     42    51     24        1     24            6
2-A-02 A1   2004-11-02       24       19     42    62      8       25     32            8
2-A-02 A1   2004-11-02       24       19     42    73     16       33     48           10
1-C-05 A1   2004-11-03        6       43     48    73     16       33     48            6
1-B-15 B1   2004-11-02        2        1      2    51     18        1     18            2
1-C-04 B1   2004-11-03       12        3     14    51     18        1     18           12
2-D-23 B1   2004-11-04        1       15     15    51     18        1     18            1
1-B-11 B1   2004-11-05        4       16     19    51     18        1     18            3
1-B-11 B1   2004-11-05        4       16     19    73      6       19     24            1
1-A-02 B1   2004-11-06        5       20     24    73      6       19     24            5
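The interval-overlap join and the pick_ord_qty formula, least(loc_qty, least(o.to, f.to) - greatest(o.from, f.from) + 1), can be checked by hand. A small plain-Python sketch, using the slide's 2-A-02 pick of 24 pieces of A1 as the data (the function name is invented for illustration):

```python
# The FIFO pick for A1 at 2-A-02: (loc, loc_qty, from_qty, to_qty).
fifo = [("2-A-02", 24, 19, 42)]
# The three A1 orderlines with their intervals: (ordno, from_qty, to_qty).
orders = [(51, 1, 24), (62, 25, 32), (73, 33, 48)]

def split_pick(fifo, orders):
    """Join picks to orderlines on overlapping intervals and split the qty.

    The overlap test and the min/max arithmetic mirror the slide's
    join condition and the pick_ord_qty LEAST/GREATEST expression.
    """
    rows = []
    for loc, loc_qty, f_from, f_to in fifo:
        for ordno, o_from, o_to in orders:
            if o_to >= f_from and o_from <= f_to:  # intervals overlap
                qty = min(loc_qty, min(o_to, f_to) - max(o_from, f_from) + 1)
                rows.append((loc, ordno, qty))
    return rows

print(split_pick(fifo, orders))
```

It reproduces the slide's split of the 24-piece pick: 6 for order 51, 8 for order 62, 10 for order 73.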
• Pick list

Tidy up the select, order by location, and we have a new pick list:

with orderlines as ( ... ), orderbatch as ( ... ), fifo as ( ... )
select f.loc
     , f.item
     , f.pick_qty pick_at_loc
     , o.ordno
     , least(
          f.loc_qty
        , least(o.to_qty, f.to_qty) - greatest(o.from_qty, f.from_qty) + 1
       ) qty_for_ord
from fifo f
join orderlines o
  on o.item = f.item
 and o.to_qty >= f.from_qty
 and o.from_qty <= f.to_qty
order by f.loc, o.ordno
• Pick list

The operator now knows to pick 24 pieces of A1 at location 2-A-02 and distribute them with 6 for order 51, 8 for order 62 and 10 for order 73:

LOC    ITEM PICK_AT_LOC ORDNO QTY_FOR_ORD
------ ---- ----------- ----- -----------
1-A-02 B1             5    73           5
1-A-20 A1            18    51          18
1-B-11 B1             4    51           3
1-B-11 B1             4    73           1
1-B-15 B1             2    51           2
1-C-04 B1            12    51          12
1-C-05 A1             6    73           6
2-A-02 A1            24    51           6
2-A-02 A1            24    62           8
2-A-02 A1            24    73          10
2-D-23 B1             1    51           1
• Pick list with route

Move the pick list into a fourth with clause and add the ranking aisle_no calculation:

...(orderlines, orderbatch and fifo with clauses)...
), pick as (
   select to_number(substr(f.loc,1,1)) warehouse
        , substr(f.loc,3,1) aisle
        , dense_rank() over (
             order by to_number(substr(f.loc,1,1)) -- warehouse
                    , substr(f.loc,3,1)            -- aisle
          ) aisle_no
        , to_number(substr(f.loc,5,2)) position
        , f.loc, f.item, f.pick_qty pick_at_loc, o.ordno
        , least(
             f.loc_qty
           , least(o.to_qty, f.to_qty) - greatest(o.from_qty, f.from_qty) + 1
          ) qty_for_ord
   from fifo f
   join orderlines o
     on o.item = f.item
    and o.to_qty >= f.from_qty
    and o.from_qty <= f.to_qty
)
...
• Pick list with route

And so a big select of 4 with clauses gives the final pick list of multiple orders by FIFO with an efficient route:

with orderlines as (
   ...
), orderbatch as (
   ...
), fifo as (
   ...
), pick as (
   ...
)
select p.loc, p.item, p.pick_at_loc, p.ordno, p.qty_for_ord
from pick p
order by p.warehouse
       , p.aisle_no
       , case when mod(p.aisle_no,2) = 1 then p.position else -p.position end
• Pick list with route

All done. What more can we wish for?

LOC    ITEM PICK_AT_LOC ORDNO QTY_FOR_ORD
------ ---- ----------- ----- -----------
1-A-02 B1             5    73           5
1-A-20 A1            18    51          18
1-B-15 B1             2    51           2
1-B-11 B1             4    51           3
1-B-11 B1             4    73           1
1-C-04 B1            12    51          12
1-C-05 A1             6    73           6
2-A-02 A1            24    51           6
2-A-02 A1            24    73          10
2-A-02 A1            24    62           8
2-D-23 B1             1    51           1
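The serpentine route in that final order by, mod(p.aisle_no,2) deciding whether an aisle is walked with ascending or descending positions, is easy to illustrate outside SQL. A plain-Python sort key doing the same thing; the sample picks and tuple layout are invented for this sketch:

```python
# Picks as (warehouse, aisle_no, position, loc); aisle_no is the
# dense_rank over warehouse+aisle from the pick with clause.
picks = [
    (1, 1, 20, "1-A-20"),
    (1, 1, 2, "1-A-02"),
    (1, 2, 11, "1-B-11"),
    (1, 2, 15, "1-B-15"),
]

def route_key(p):
    """Serpentine route: odd aisles walked upward, even aisles downward."""
    warehouse, aisle_no, position, _ = p
    return (warehouse, aisle_no,
            position if aisle_no % 2 == 1 else -position)

for p in sorted(picks, key=route_key):
    print(p[3])
```

So aisle A is walked 02 then 20, and at the end of it the operator turns into aisle B and walks back, 15 then 11, instead of returning to the start of the aisle.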
• Multi-order FIFO

• Do FIFO on the sum of the orders
• Calculate From/To qty intervals of picks
• Calculate From/To qty intervals of orders
• Join overlapping intervals

"Don't you just love these kind of challenges? It's why we do what we do!"
– Monty Latiolais
• Analytics forever

…and ever and ever…
• It never ends…

• We use analytic functions all the time
• We can't imagine living without analytics

For example:
• WheelGuide®
• Replenish shop stock
• Call Center statistics
• Spare parts guide
• Customer count / work schedule / number of orders
• Booking calendar for mechanics
• Shop space management
• Discover idle hands
• Detect seasonal variations for sales
• Efficiency of Royal Danish Mail
• …
• Do It Yourself

• Just start using analytics
• The more you do, the more often you find cases
• When you start to think you need to process your data procedurally – think again!
• Use the power of SQL to let the database do the hard work of processing data
• That's what the database does best
• And you're paying for it, so why not use it
• If not – you are missing out on great functionality
• Any questions?

• Download the presentation from UKOUG
• Or get the presentation as well as the scripts at: http://goo.gl/g46b4

@kibeha
kibeha@gmail.com
http://dspsd.blogspot.com