Algorithm Design and Complexity - Course 10
Transcript

  • 1. Algorithm Design and Complexity Course 10
  • 2. Overview
    - Shortest Paths
    - Single-Source Shortest Paths
    - Dijkstra's Algorithm
    - DAG SSSP Algorithm
    - Bellman-Ford Algorithm
    - Sample Applications
  • 3. Shortest Paths
    - G(V, E): (un)directed, connected and weighted graph
    - The weight (cost) function w: E → R; w(u, v) = the weight of the edge (u, v)
    - Weight of a path u..v = w(u..v) = sum of the weights of all the edges on the path (the weight must be additive)
    - δ(u, v) = weight of the shortest path from u to v = min{w(p)} over all paths p = u..v
    - δ(u, v) = INF if there is no path from u to v in G
  • 4. Shortest Paths Are Not Unique
    - Example from CLRS: shortest paths from the source node s to all the other vertices
  • 5. Shortest Paths Problems
    - The shortest paths from a source vertex s are organized as a tree (the shortest-paths tree); p[u] stores the predecessor of each vertex in the tree
    - The weight can be any additive measure that we would like to minimize: distance, cost, time, penalties, etc.
    - Generalization of BFS to a weighted graph
  • 6. Types of Shortest Paths Problems
    - Single-source shortest paths (SSSP): find the shortest paths from a source vertex s to all the other vertices v∈V
    - Single-destination shortest paths: similar to the SSSP problem, so use the same algorithms; just transpose the graph and solve SSSP using the old destination vertex as the source of the new problem
    - Single-source single-destination shortest path: at first sight, it seems simpler than the two previous problems; however, in the worst case, the destination might be the most difficult vertex to reach, so we still need to compute the shortest paths to all the other intermediate vertices
    - All-pairs shortest paths (APSP): find the shortest paths between all pairs of vertices in the graph; more difficult than the previous problems
  • 7. Directed Graphs?
    - Usually, shortest-path algorithms are developed for directed graphs
    - However, they also work for undirected graphs, but they must be adapted
    - The main problem: do not go back on the same edge to the predecessor; if the last edge used to build a shortest path is (p[u], u), do not allow going back from u to p[u]!
  • 8. Optimal Substructure?
    - Minimization problem
    - Lemma: any subpath of a shortest path is itself a shortest path!
    - If p = u..v = u..x..y..v is a shortest path, then u..x, x..y and y..v are also shortest paths
    - Proof by contradiction: assume the subpath x..y is not a shortest path; then there is a better path from x to y, which means that p = u..v is not a shortest path either => contradiction!
    - This suggests looking for greedy or DP solutions
  • 9. Negative Edges?
    - Are graphs with negative edge weights allowed? Yes
    - But some algorithms (e.g. Dijkstra) do not compute the correct results if negative weight edges are reachable from the source
    - Negative edges make the problem more difficult to solve, as the greedy choice no longer works
    - However, negative weight cycles are a real problem: if they are reachable from the source, no algorithm works correctly
    - They could be detected and maybe eliminated, but that is difficult: which edge should be removed from the cycle?
  • 10. Cycles in a Shortest Path?
    - Cycles are not allowed in a shortest path between any two vertices!
    - Negative weight cycles: not allowed!
    - Positive weight cycles: should not be part of a shortest path, as removing them yields a better path
    - Zero weight cycles: prefer not to use them
    - This also makes sense because otherwise the shortest paths from a single source to all the vertices would not form a tree (and trees are not allowed to contain cycles)
    - Since a shortest path cannot contain any cycles, every shortest path has at most |V| - 1 edges
  • 11. SSSP: Single-Source Shortest Paths
    - Given G(V, E) and a source s∈V, find δ(s, v) for all v∈V
    - All the SSSP algorithms use two arrays:
    - d[u] = the cost of the best path s..u discovered by the algorithm at the current moment, an estimate of the shortest path δ(s, u); initially d[u] = INF; the algorithm iteratively reduces d[u], but always d[u] >= δ(s, u); we want d[u] = δ(s, u) when the algorithm finishes
    - p[u] = the predecessor of u on the best path s..u discovered by the algorithm at the current moment; initially p[u] = NIL
  • 12. SSSP Algorithms
    - Dijkstra: greedy algorithm; only for positive weight edges
    - Bellman-Ford: works for any weights; also detects negative cycles if they exist
    - Both use two procedures:
    - INIT-SSSP(G, s): called once, when the algorithm starts
    - RELAX(u, v): relaxes along the edge (u, v)∈E; called a number of times, depending on the algorithm
    - The algorithms differ in the order of relaxing the edges and in the number of edge relaxations performed (also counting when the same edge is relaxed more than once)
  • 13. Initialization
    - Pessimistic: assume no path exists to any vertex other than the source

      INIT-SSSP(G, s)
        FOREACH (v∈V)
          d[v] = INF
          p[v] = NIL
        d[s] = 0
  • 14. Relaxing an Edge
    - Try to improve the estimate d[v] for vertex v by relaxing the edge (u, v)
    - Maybe (u, v) is the last edge on a better path from the source (of cost d[u] + w(u, v)) than the one computed so far by the algorithm for v (of cost d[v])

      RELAX(u, v)
        IF (d[v] > d[u] + w(u, v))
          d[v] = d[u] + w(u, v)
          p[v] = u
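    A minimal Python sketch of these two helper procedures (not from the slides; the dictionary-based dist/pred "arrays" and math.inf standing in for INF are illustrative choices):

      import math

      def init_sssp(vertices, s):
          """INIT-SSSP: pessimistic initialization of the estimates."""
          dist = {v: math.inf for v in vertices}   # d[v] = INF
          pred = {v: None for v in vertices}       # p[v] = NIL
          dist[s] = 0
          return dist, pred

      def relax(u, v, w_uv, dist, pred):
          """RELAX(u, v): try to improve d[v] using the edge (u, v) of weight w_uv."""
          if dist[v] > dist[u] + w_uv:
              dist[v] = dist[u] + w_uv
              pred[v] = u
              return True   # the estimate improved
          return False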
  • 15. Relaxing an Edge – Examples
    - d[v] changes after relaxing (u, v): e.g. d[u] = 5, w(u, v) = 2, so d[v] drops from 10 to 7
    - d[v] does not change after relaxing (u, v): e.g. d[u] = 5, w(u, v) = 6, so d[v] stays 10
  • 16. Triangle Inequality
    - For all edges (u, v)∈E: δ(s, v) <= δ(s, u) + w(u, v)
    - δ(s, v) = δ(s, u) + w(u, v) if and only if s..u→v is a shortest path s..v (then s..u is also a shortest path, by optimal substructure)
    - Otherwise, δ(s, v) < δ(s, u) + w(u, v)
  • 17. Upper-Bound Property
    - After any number of edge relaxations, d[v] >= δ(s, v) for all v∈V
    - Proof by contradiction: initially this is true, as d[v] = INF for all v∈V
    - Let v be the first vertex for which d[v] < δ(s, v), and let u = p[v] be the vertex whose relaxation caused d[v] to change
    - Then d[v] < δ(s, v) <= δ(s, u) + w(u, v) <= d[u] + w(u, v) (triangle inequality, plus d[u] >= δ(s, u) since v is the first violating vertex)
    - So d[v] < d[u] + w(u, v), which contradicts the relaxation step that set d[v] = d[u] + w(u, v)
    - Once d[v] = δ(s, v), it never changes, since relaxing an edge can only decrease d[v]
  • 18. No-Path Property
    - If u is not reachable from s, then d[u] = δ(s, u) = INF
    - Follows from the previous (upper-bound) property
  • 19. Convergence Property
    - Used by all algorithms
    - If s..u→v is a shortest path and d[u] = δ(s, u) when we call RELAX(u, v), then d[v] = δ(s, v) afterwards
    - After the relaxation, d[v] <= d[u] + w(u, v) = δ(s, u) + w(u, v) = δ(s, v)
    - But d[v] >= δ(s, v) by the upper-bound property, so d[v] = δ(s, v)
  • 20. Path Relaxation Property
    - Let p = <v0 = s, v1, … , vk> be a shortest path v0..vk
    - If we relax the edges (v0, v1), (v1, v2), … , (vk-1, vk) in this order, even intermixed with other relaxations, then d[vk] = δ(s, vk) afterwards
    - Proof by induction
    - Base case: k = 0 => p = <v0>, and d[v0] = d[s] = 0 = δ(s, s)
    - Inductive step: by the induction hypothesis d[vk-1] = δ(s, vk-1), and RELAX(vk-1, vk) together with the convergence property gives d[vk] = δ(s, vk)
  • 21. Dijkstra’s Algorithm
    - Works on G(V, E) with no negative weight edges reachable from the source; it does not always compute the correct shortest paths if such edges are present
    - Greedy algorithm; a generalization of BFS, also similar to Prim’s algorithm
    - Uses a priority queue Q to extract the vertex with minimum d[v] among the “unvisited” vertices
    - S = the set of vertices for which the shortest path has already been determined
  • 22. Dijkstra – Pseudocode

      Dijkstra(G, w, s)
        INIT-SSSP(G, s)
        S = ∅
        Q = PRIORITY-QUEUE(V, d)    // priority queue indexed by the vertices V, with priorities d[u]
        WHILE (!Q.EMPTY())
          u = Q.EXTRACT-MIN()       // greedy choice
          S = S U {u}
          FOREACH (v∈Adj[u])
            RELAX(u, v)
            // if d[v] decreased, the heap must be updated: Q.DECREASE-KEY(v, d[v])
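    A runnable Python sketch of this pseudocode using the standard-library heapq (an assumption; the slides use an abstract priority queue). Since heapq has no DECREASE-KEY, the sketch pushes duplicate entries and skips stale ones, which is equivalent for correctness. The graph is assumed to be a dict mapping each vertex to a list of (neighbor, weight) pairs:

      import heapq
      import math

      def dijkstra(graph, s):
          """Dijkstra's SSSP for non-negative edge weights."""
          dist = {v: math.inf for v in graph}
          pred = {v: None for v in graph}
          dist[s] = 0
          pq = [(0, s)]                      # priority queue of (d[u], u)
          done = set()                       # the set S of finished vertices
          while pq:
              d_u, u = heapq.heappop(pq)     # greedy choice: EXTRACT-MIN
              if u in done:
                  continue                   # stale entry (heapq has no DECREASE-KEY)
              done.add(u)
              for v, w_uv in graph[u]:
                  if dist[v] > d_u + w_uv:   # RELAX(u, v)
                      dist[v] = d_u + w_uv
                      pred[v] = u
                      heapq.heappush(pq, (dist[v], v))
          return dist, pred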
  • 23. Dijkstra – Remarks
    - Dijkstra is greedy because it always chooses the vertex u not in S that has the minimum path estimate s..u among all the remaining vertices in Q
    - Any vertex is either in Q or in S
    - Loop invariant: d[v] = δ(s, v) for all v∈S
    - Termination: S = V => d[v] = δ(s, v) for all v∈V
    - We still need to prove the maintenance of the invariant!
  • 24. Dijkstra – Complexity
    - Depends on how we implement the priority queue: Θ(n * EXTRACT-MIN + m * DECREASE-KEY)
    - Simple array: EXTRACT-MIN O(n), DECREASE-KEY O(1) => Dijkstra Θ(n^2 + m), good for dense graphs
    - Binary heap: EXTRACT-MIN O(log n), DECREASE-KEY O(log n) => Dijkstra Θ(n log n + m log n) = Θ(m log n), good for sparse graphs
  • 25. Dijkstra & Fibonacci Heaps
    - Best solution: use Fibonacci heaps (http://en.wikipedia.org/wiki/Fibonacci_heap)
    - EXTRACT-MIN: O(log n) amortized, DECREASE-KEY: O(1) amortized => Dijkstra Θ(n log n + m), good for both sparse and dense graphs
  • 26. Dijkstra – Example
  • 27. Dijkstra – Example (2)
    - Initially: d[1] = 0
    - Relaxing from (1): d[2] = 1, d[3] = 2, d[6] = 3
    - Relaxing from (2): d[4] = 7, d[5] = 10
    - Relaxing from (3): d[5] = 7
  • 28. Dijkstra and Negative Edges
    - Usually, Dijkstra does not compute the correct shortest paths if there is a negative weight edge reachable from s
    - However, there are cases when it does work correctly, e.g. when all the negative weight edges leave directly from s
    - Can we determine when Dijkstra works correctly on a graph with negative edges reachable from s? There is a condition that can easily be verified when relaxing a negative edge!
    - Taking this into account, we can adapt Dijkstra to work with negative edges, but it becomes inefficient
  • 29. Bellman-Ford Algorithm
    - Works for graphs with both positive and negative edge weights
    - Also detects negative weight cycles reachable from the source
    - Uses the path relaxation property and the fact that any shortest path has at most |V| - 1 edges
    - Returns FALSE if it discovers a negative cycle, to signal that the values in d[v] are not correct!
  • 30. Bellman-Ford – Pseudocode

      Bellman-Ford(G, s)
        INIT-SSSP(G, s)
        FOR (i = 1..|V|-1)
          FOREACH ((u,v)∈E)
            RELAX(u, v)
        FOREACH ((u,v)∈E)
          IF (d[v] > d[u] + w(u,v))
            RETURN FALSE
        RETURN TRUE

    - Complexity: Θ((n-1)*m) = Θ(n*m), which is very bad for dense graphs
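    A Python sketch of the same pseudocode (the representation as a vertex list plus a list of (u, v, weight) edge triples is an illustrative choice, not from the slides):

      import math

      def bellman_ford(vertices, edges, s):
          """Bellman-Ford SSSP; returns (ok, dist, pred), where ok is False if a
          negative cycle reachable from s was detected (dist is then not trustworthy)."""
          dist = {v: math.inf for v in vertices}
          pred = {v: None for v in vertices}
          dist[s] = 0
          for _ in range(len(vertices) - 1):        # |V| - 1 passes over all edges
              for u, v, w in edges:
                  if dist[u] + w < dist[v]:         # RELAX(u, v)
                      dist[v] = dist[u] + w
                      pred[v] = u
          for u, v, w in edges:                     # negative-cycle check
              if dist[u] + w < dist[v]:
                  return False, dist, pred
          return True, dist, pred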
  • 31. Bellman-Ford – Remarks
    - The algorithm tries to relax all the possible paths with up to |V| - 1 edges
    - This is why it relaxes each edge |V| - 1 times: so that the edge can appear in any position from 1 to |V| - 1 of a possible shortest path
    - The algorithm can definitely be improved, by considering fewer edges in each pass (e.g. the first edge relaxed should always start from s; one simple improvement, stopping early when a full pass changes nothing, is sketched below)
    - Since no shortest path can have more than |V| - 1 edges, all the shortest paths are determined after the first two nested loops are over
    - If we can still improve the estimate d[v] of some vertex by relaxing an edge one more time, then we have reached a negative cycle; this is exactly what the final loop checks!
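    A sketch of the early-termination improvement mentioned above (only one of the optimizations hinted at on the slide; same assumed graph representation as before):

      import math

      def bellman_ford_early_stop(vertices, edges, s):
          """Bellman-Ford that stops as soon as a full pass relaxes nothing."""
          dist = {v: math.inf for v in vertices}
          pred = {v: None for v in vertices}
          dist[s] = 0
          for _ in range(len(vertices) - 1):
              changed = False
              for u, v, w in edges:
                  if dist[u] + w < dist[v]:
                      dist[v] = dist[u] + w
                      pred[v] = u
                      changed = True
              if not changed:            # no estimate improved: already optimal
                  return True, dist, pred
          for u, v, w in edges:          # all |V| - 1 passes done: check for a negative cycle
              if dist[u] + w < dist[v]:
                  return False, dist, pred
          return True, dist, pred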
  • 32. Bellman-Ford and Negative Cycles
    - Proof that Bellman-Ford returns FALSE if a negative weight cycle reachable from s is present in the graph
    - If BF returns TRUE, the final check guarantees that d[v] <= d[u] + w(u, v) for all edges (u, v)∈E
    - If a negative cycle exists (so BF should return FALSE): proof by contradiction! Let p = <v0, … , vk> be the negative cycle (with v0 = vk) and suppose BF returns TRUE
    - Summing d[vi] <= d[vi-1] + w(vi-1, vi) around the cycle gives 0 <= w(p), which contradicts w(p) < 0 (see blackboard for the full argument)
  • 33. Questions
    - Given a directed graph:
    - How many relaxations does Dijkstra make?
    - How many relaxations does Bellman-Ford make?
    - How can you find the negative weight cycle(s) discovered by Bellman-Ford?
    - How would you improve Bellman-Ford?
  • 34. SSSP in a DAG
    - Can we build a more efficient algorithm for a special class of graphs: DAGs? Any ideas?
  • 35. SSSP in a DAG (2)
    - Compute a topological sort of the DAG
    - Relax the edges of the DAG by taking each start vertex in topological order
    - No need for a priority queue!
    - Correctness follows from the path relaxation property, as we relax the edges of any shortest path in the right order (see the sketch below)
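    A Python sketch of the DAG algorithm, using Kahn's algorithm for the topological sort (an illustrative choice; any topological sort works). The graph is again assumed to be a dict of (neighbor, weight) adjacency lists:

      import math
      from collections import deque

      def dag_shortest_paths(graph, s):
          """SSSP in a DAG: topological sort, then relax edges in that order."""
          indeg = {v: 0 for v in graph}
          for u in graph:
              for v, _ in graph[u]:
                  indeg[v] += 1
          queue = deque(v for v in graph if indeg[v] == 0)
          topo = []
          while queue:                         # Kahn's algorithm
              u = queue.popleft()
              topo.append(u)
              for v, _ in graph[u]:
                  indeg[v] -= 1
                  if indeg[v] == 0:
                      queue.append(v)

          dist = {v: math.inf for v in graph}
          pred = {v: None for v in graph}
          dist[s] = 0
          for u in topo:                       # relax outgoing edges in topological order
              if dist[u] == math.inf:
                  continue                     # u is not reachable from s
              for v, w_uv in graph[u]:
                  if dist[v] > dist[u] + w_uv: # RELAX(u, v)
                      dist[v] = dist[u] + w_uv
                      pred[v] = u
          return dist, pred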
  • 36. SSSP in DAG – Example  From CLRS
  • 37. Problem 1: Currency Conversion
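    The slide content is an image (taken from the Sedgewick/Wayne slides, per the references), so no text survives here. As a hedged illustration of the usual reduction: model currencies as vertices and an exchange rate r on edge u→v as an edge of weight -log(r); the best conversion is then a shortest path, and a negative cycle corresponds to an arbitrage opportunity. All names below are hypothetical:

      import math

      def best_conversion(rates, src, dst):
          """rates: dict mapping (u, v) to the exchange rate r > 0.
          Returns the best achievable amount of dst per 1 unit of src,
          or None if an arbitrage (negative cycle) exists."""
          currencies = {c for pair in rates for c in pair}
          edges = [(u, v, -math.log(r)) for (u, v), r in rates.items()]
          dist = {c: math.inf for c in currencies}
          dist[src] = 0.0
          for _ in range(len(currencies) - 1):      # Bellman-Ford handles the negative weights
              for u, v, w in edges:
                  if dist[u] + w < dist[v]:
                      dist[v] = dist[u] + w
          for u, v, w in edges:
              if dist[u] + w < dist[v]:
                  return None                       # negative cycle: arbitrage opportunity
          return math.exp(-dist[dst])               # product of rates along the best path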
  • 38. Problem 2: Rolling Dice on Board
    - Given a die that has an integer value in 1..100 on each face, find the minimum sum that can be reached by rolling the die on an NxN chess board from a known start position to a known end position. The die may rotate over any of its edges!
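    The slide leaves the exact cost model implicit; below is a hedged sketch under the assumption that each roll costs the value that ends up on top, modeled as Dijkstra over (cell, die-orientation) states. The orientation encoding and names are illustrative only:

      import heapq

      def min_dice_roll_sum(n, start, end, die):
          """die = (top, bottom, north, south, east, west) face values; start/end are (row, col).
          Cost model (an assumption): each roll adds the value that lands on top."""
          def rolls(d):
              top, bot, no, so, ea, we = d
              yield (-1, 0), (so, no, top, bot, ea, we)   # roll north
              yield (1, 0),  (no, so, bot, top, ea, we)   # roll south
              yield (0, 1),  (we, ea, no, so, top, bot)   # roll east
              yield (0, -1), (ea, we, no, so, bot, top)   # roll west

          INF = float('inf')
          best = {}                                       # (row, col, orientation) -> best sum
          pq = [(0, start, die)]
          while pq:
              cost, (r, c), d = heapq.heappop(pq)
              if (r, c) == end:
                  return cost                             # any orientation at the end cell is fine
              if best.get((r, c, d), INF) < cost:
                  continue                                # stale queue entry
              for (dr, dc), nd in rolls(d):
                  nr, nc = r + dr, c + dc
                  if 0 <= nr < n and 0 <= nc < n:
                      ncost = cost + nd[0]                # add the new top face value
                      if ncost < best.get((nr, nc, nd), INF):
                          best[(nr, nc, nd)] = ncost
                          heapq.heappush(pq, (ncost, (nr, nc), nd))
          return INF                                      # end position unreachable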
  • 39. Conclusions
    - Shortest paths problems are frequently met in practice and have a lot of applications
    - There are many algorithms for solving SP problems, taking into account problem and graph constraints
    - Lots of applications in networking, e.g. routing algorithms
  • 40. References
    - CLRS – Chapter 24
    - R. Sedgewick, K. Wayne – Algorithms and Data Structures – Princeton 2007, www.cs.princeton.edu/~rs/AlgsDS07/ (Problem 1 and the corresponding images are taken from these slides!)
    - MIT OCW – Introduction to Algorithms – video lectures 17 & 18