
# 01 knapsack using backtracking


#### 1. Backtracking Technique

- Eg. a big castle with large rooms and "Sleeping Beauty": many paths lead to nothing but dead ends
- Systematic search: BFS, DFS
- Can we avoid this unnecessary labor? Yes, with a bounding function (promising function)
- Backtracking = DFS with a bounding (promising) function
- Worst case: exhaustive search
#### 2. Backtracking Technique

- Useful because it is efficient for many large instances
- Applies to NP-complete problems, e.g. the 0-1 knapsack problem
- Dynamic programming: O(min(2^n, nW))
- Backtracking can be very efficient given a good bounding (promising) function
- Naturally expressed with recursion
#### 3. Outline of the Backtracking Approach

- Backtracking is a DFS of a state-space tree, except that a node is visited only if it is promising
- DFS (depth-first search) of a tree:

```
procedure d_f_s_tree(v: node)
var u: node
{
  visit v;  // some action
  for each child u of v
    do d_f_s_tree(u);
}
```

(figure: a tree with nodes numbered in DFS visit order)
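The pseudocode above can be sketched in Python. The tree representation (an adjacency dict mapping each node to its list of children) is an assumption for illustration; the slide leaves it abstract.

```python
# Minimal sketch of the slide's d_f_s_tree procedure, assuming a tree
# stored as {node: [children]} (a hypothetical representation).

def dfs_tree(tree, v, visit):
    visit(v)                      # "visit v; // some action //"
    for u in tree.get(v, []):     # for each child u of v
        dfs_tree(tree, u, visit)  # do d_f_s_tree(u)

# Usage: record the preorder (DFS) visit sequence of a small tree.
tree = {1: [2, 3, 4], 2: [5, 6], 3: [7], 5: [8, 9]}
order = []
dfs_tree(tree, 1, order.append)
print(order)  # -> [1, 2, 5, 8, 9, 6, 3, 7, 4]
```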
#### 4. N-Queens Problem
#### 5. N-Queens Problem

- n × n chessboard, n queens
- No two queens may attack each other (same row, column, or diagonal)
- Eg. the 8-queens problem

(figure: an 8 × 8 board with queens placed)
#### 6. N-Queens Problem: DFS

- State-space tree for the 4-queens problem

(figure: full state-space tree)
#### 7. N-Queens Problem: DFS (same column/row excluded)

- State-space tree for the 4-queens problem; one solution is (x1, x2, x3, x4) = (2, 4, 1, 3)
- Number of leaf nodes: n! = 4!
- Plain DFS "backtracks" only at dead ends, so it still generates all n! leaf nodes
- Cf. backtracking, which "backtracks" as soon as a node is non-promising (pruning)

(figure: state-space tree with one queen per row and no repeated columns)
#### 8. N-Queens Problem: Backtracking

- Promising (bounding) function for queens at (i, col(i)) and (k, col(k)):
  - (i) same row: excluded by construction (one queen per row)
  - (ii) same column: require col(i) ≠ col(k)
  - (iii) same diagonal: require |col(i) - col(k)| ≠ |i - k|
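The checks above can be sketched as a small backtracking solver. This is a sketch, not the textbook's exact code: queens are placed one per row, so only the column check (ii) and diagonal check (iii) are tested.

```python
# n-queens backtracking; col[i] holds the column of the queen in row i.

def promising(col, i):
    """Check queen i against queens 0..i-1: columns and diagonals."""
    return all(col[k] != col[i] and abs(col[i] - col[k]) != i - k
               for k in range(i))

def queens(n, i=0, col=None, solutions=None):
    if col is None:
        col, solutions = [0] * n, []
    if i == n:
        solutions.append(tuple(c + 1 for c in col))  # 1-based, as on the slides
        return solutions
    for c in range(n):
        col[i] = c
        if promising(col, i):          # prune non-promising placements
            queens(n, i + 1, col, solutions)
    return solutions

print(queens(4))  # -> [(2, 4, 1, 3), (3, 1, 4, 2)]
```

The first solution found, (2, 4, 1, 3), matches the one shown in the slide's state-space tree.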
#### 9. N-Queens Problem: Backtracking

- State-space tree for the 4-queens problem, with non-promising nodes pruned

(figure: pruned state-space tree)
#### 10. N-Queens Problem

- Better (faster) algorithm: a Monte Carlo approach,
  "place almost all queens randomly, then place the remaining queens using backtracking."

| n  | Nodes checked: DFS (n^n tree) | Nodes checked: DFS, same col/row excluded (n!) | Nodes checked: backtracking |
|----|-------------------------------|------------------------------------------------|-----------------------------|
| 4  | 341                           | 24                                             | 61                          |
| 8  | 19,173,961                    | 40,320                                         | 15,721                      |
| 12 | 9.73 × 10^12                  | 4.79 × 10^8                                    | 1.01 × 10^7                 |
| 14 | 1.20 × 10^16                  | 8.72 × 10^10                                   | 3.78 × 10^8                 |
#### 11. Graph Coloring

- m-coloring problem: color the vertices of an undirected graph using at most m colors
- Constraint: two adjacent vertices must receive different colors
- Eg. a 4-vertex graph (v1, v2, v3, v4) that has no 2-coloring but does have a 3-coloring

(figure: the example graph)
#### 12. Graph Coloring

- State-space tree: m^n leaves
- Promising function: check adjacent vertices for the same color

(figure: state-space tree over colors 1-3 with pruned branches marked ×)
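A sketch of m-coloring with this promising function. The adjacency-dict representation and the example graph (a triangle plus one vertex, which has no 2-coloring but does have 3-colorings, in the spirit of the slide's example) are assumptions for illustration.

```python
# m-coloring backtracking: assign colors 1..m vertex by vertex, pruning
# whenever a vertex shares a color with an already-colored neighbor.

def promising(graph, color, i):
    # vertex i conflicts only with earlier (already-colored) neighbors
    return all(color[j] != color[i] for j in graph[i] if j < i)

def m_coloring(graph, n, m, i=0, color=None, out=None):
    if color is None:
        color, out = [0] * n, []
    if i == n:
        out.append(tuple(color))       # one complete proper coloring
        return out
    for c in range(1, m + 1):
        color[i] = c
        if promising(graph, color, i):
            m_coloring(graph, n, m, i + 1, color, out)
    return out

# Hypothetical example: triangle 0-1-2 plus vertex 3 joined to 0 and 2.
g = {0: [1, 2, 3], 1: [0, 2], 2: [0, 1, 3], 3: [0, 2]}
print(len(m_coloring(g, 4, 2)))  # -> 0 (no 2-coloring)
print(len(m_coloring(g, 4, 3)))  # -> 6
```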
#### 13. Hamiltonian Circuits Problem

- Cf. the TSP in chapter 3:
  - Brute force: n! tours (or (n-1)! with a fixed starting vertex)
  - Dynamic programming: Θ(n^2 · 2^n)
  - When n = 20: brute force takes about 3,800 years, D.P. about 45 seconds
  - A more efficient solution? See chapter 6
- Given an undirected or directed graph, find a tour (Hamiltonian circuit); it need not be shortest
#### 14. Hamiltonian Circuits Problem

- State-space tree: (n-1)! leaves in the worst case
- Promising function, for a partial path of vertices 0 through i:
  1. The i-th vertex must be adjacent to the (i+1)-th vertex (consecutive vertices share an edge)
  2. The (n-1)-th (last) vertex must be adjacent to the 0-th (starting) vertex
  3. The i-th vertex must differ from the 0-th through (i-1)-th vertices

(figures: an example graph with a Hamiltonian circuit, and one without)
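The three conditions translate directly into a backtracking search. A sketch, assuming the graph is an adjacency-set dict and fixing vertex 0 as the start; the example 5-cycle is a hypothetical graph, not the one pictured on the slide.

```python
# Hamiltonian-circuit backtracking using the slide's three conditions.

def hamiltonian(graph, n, path=None):
    if path is None:
        path = [0]                        # fix vertex 0 as the starting vertex
    i = len(path) - 1
    if i == n - 1:
        # condition 2: the last vertex must be adjacent to the start
        return list(path) if path[0] in graph[path[-1]] else None
    for v in sorted(graph[path[i]]):      # condition 1: adjacent to the previous vertex
        if v not in path:                 # condition 3: no vertex repeated
            path.append(v)
            found = hamiltonian(graph, n, path)
            if found:
                return found
            path.pop()                    # backtrack
    return None

# Hypothetical example: a 5-cycle has a Hamiltonian circuit, a star does not.
cycle5 = {0: {1, 4}, 1: {0, 2}, 2: {1, 3}, 3: {2, 4}, 4: {3, 0}}
print(hamiltonian(cycle5, 5))  # -> [0, 1, 2, 3, 4]
```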
#### 15. Sum-of-Subsets Problem

- A special case of the 0-1 knapsack problem
- Given S = {item1, item2, ..., itemn} with weights w[1..n] = (w1, w2, ..., wn) and a target W
- Find a subset A ⊆ S whose weights sum to exactly W
- Eg. n = 3, w[1..3] = (2, 4, 5), W = 6; solution: {w1, w2}
- NP-complete
- State-space tree: 2^n leaves

(figure: state-space tree for the example, with the solution leaf {w1, w2} marked)
#### 16. Sum-of-Subsets Problem

- Assumption: the weights are in nondecreasing (sorted) order
- Promising function: a node at level i is non-promising if
  - weight(so far) + w(i+1) > W, or
  - weight(so far) + total(remaining weights) < W
- Eg. n = 4, (w1, w2, w3, w4) = (3, 4, 5, 6), W = 13
  - e.g. a node with weight 0 is pruned because 0 + 5 + 6 = 11 < 13
  - e.g. the node with weight 9 is pruned because 9 + 6 = 15 > 13

(figure: pruned state-space tree for the example)
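A sketch of the algorithm with both pruning tests, assuming (as the slide does) that the weights arrive sorted in nondecreasing order. Function and variable names are illustrative, not from the slides.

```python
# Sum-of-subsets backtracking with the slide's two pruning tests.

def sum_of_subsets(w, W):
    n = len(w)
    solutions = []
    include = [False] * n

    def promising(i, weight, remaining):
        # still reachable: weight + total(remaining) >= W, and
        # not overshooting: either we already hit W, or w[i] still fits
        return (weight + remaining >= W and
                (weight == W or (i < n and weight + w[i] <= W)))

    def walk(i, weight, remaining):
        if promising(i, weight, remaining):
            if weight == W:
                solutions.append([w[k] for k in range(i) if include[k]])
            elif i < n:
                include[i] = True
                walk(i + 1, weight + w[i], remaining - w[i])  # take w[i]
                include[i] = False
                walk(i + 1, weight, remaining - w[i])         # skip w[i]

    walk(0, 0, sum(w))
    return solutions

print(sum_of_subsets([3, 4, 5, 6], 13))  # -> [[3, 4, 6]]
```

On the slide's example this prunes exactly the nodes annotated there, e.g. 9 + 6 = 15 > 13.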
#### 17. 0-1 Knapsack Problem

- Weights w[1..n] = (w1, w2, ..., wn), profits p[1..n] = (p1, p2, ..., pn)
- W: knapsack capacity
- Determine A ⊆ S maximizing the total profit, subject to the total weight not exceeding W
- State-space tree: 2^n leaves (x(i) = 1 means item i is included; x(i) = 0 means it is excluded)

(figure: state-space tree)
#### 18. Branch-and-Bound

- Backtracking
  - Non-optimization problems: n-queens, m-coloring, ...
  - Optimization problems: 0-1 knapsack
  - Computes a promising function
- Branch and bound
  - For optimization problems; also computes a promising function
  - Maximization problem: an upper bound at each node
  - Minimization problem: a lower bound at each node
  - Variants: BFS with B&B, best-first search with B&B
#### 19. Breadth-First Search

```
procedure BFS(T: tree);
{
  initialize(Q);
  v = root of T;
  visit v;
  enqueue(Q, v);
  while not empty(Q) do {
    dequeue(Q, v);
    for each child u of v do {
      visit u;
      enqueue(Q, u);
    }
  }
}
```

(figures: BFS visit order of a graph and of a tree)
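The procedure above can be written in Python using `collections.deque` as the queue (a standard stdlib choice; the slide leaves the queue operations abstract).

```python
# Level-order traversal matching the slide's BFS procedure.
from collections import deque

def bfs_tree(tree, root, visit):
    visit(root)                   # visit v
    q = deque([root])             # initialize(Q); enqueue(Q, v)
    while q:                      # while not empty(Q)
        v = q.popleft()           # dequeue(Q, v)
        for u in tree.get(v, []): # for each child u of v
            visit(u)
            q.append(u)           # enqueue(Q, u)

# Usage: level-order numbering of a small tree.
tree = {1: [2, 3, 4], 2: [5, 6], 4: [7]}
order = []
bfs_tree(tree, 1, order.append)
print(order)  # -> [1, 2, 3, 4, 5, 6, 7]
```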
#### 20. 0-1 Knapsack Problem

- BFS with B&B pruning
- Eg. n = 4, p[1..4] = (40, 30, 50, 10), w[1..4] = (2, 5, 10, 5), W = 16

(figure: state-space tree with profit, weight, and bound at each node)
#### 21. 0-1 Knapsack Problem

- Bound = current profit + the profit from filling the remaining capacity as a "fractional" knapsack
- A node is non-promising if bound ≤ max profit found so far (or if weight ≥ W)
- Eg. n = 4, p[1..4] = (40, 30, 50, 10), w[1..4] = (2, 5, 10, 5), W = 16

(figure: the same state-space tree with non-promising nodes pruned)
#### 22. 0-1 Knapsack Problem

- Best-first search with B&B pruning: always expand the promising node with the best bound next
- On the example above, this reaches the optimal profit of 90 while expanding fewer nodes than BFS

(figure: best-first expansion of the example tree)
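A sketch of best-first-search branch and bound with the fractional-knapsack bound described above. A priority queue (`heapq`, with negated bounds to simulate a max-heap) replaces the FIFO queue of plain BFS; items are assumed pre-sorted by profit/weight ratio, which holds for the slide's example (40/2 > 30/5 > 50/10 > 10/5). Function names are illustrative.

```python
# Best-first 0-1 knapsack branch and bound.
import heapq

def bound(p, w, W, i, profit, weight):
    """Current profit plus a greedy fractional fill of the remaining room."""
    b, room = profit, W - weight
    for k in range(i, len(p)):
        if w[k] <= room:
            b, room = b + p[k], room - w[k]
        else:
            return b + p[k] * room / w[k]   # a fraction of item k tops off the bag
    return b

def knapsack_bb(p, w, W):
    best = 0
    # max-heap simulated by negating the bound: (-bound, level, profit, weight)
    heap = [(-bound(p, w, W, 0, 0, 0), 0, 0, 0)]
    while heap:
        neg_b, i, profit, weight = heapq.heappop(heap)
        if -neg_b <= best or i == len(p):   # non-promising: bound <= best so far
            continue
        if weight + w[i] <= W:              # child: include item i
            best = max(best, profit + p[i])
            heapq.heappush(heap, (-bound(p, w, W, i + 1, profit + p[i], weight + w[i]),
                                  i + 1, profit + p[i], weight + w[i]))
        heapq.heappush(heap, (-bound(p, w, W, i + 1, profit, weight),
                              i + 1, profit, weight))  # child: exclude item i
    return best

print(knapsack_bb([40, 30, 50, 10], [2, 5, 10, 5], 16))  # -> 90
```

On the slide's example the root bound comes out to 40 + 30 + 50 × 9/10 = 115, matching the tree, and the search settles on profit 90 (items 1 and 3).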