4.3.1. Introduction
Given a connected weighted directed graph G(V,E), associated with each edge ⟨u,v⟩ ∈ E there is a weight w(u,v). The single source shortest paths (SSSP) problem is to find a shortest path from a given source r to every other vertex v ∈ V - {r}. The weight (length) of a path p = {v0, v1, …, vk} is the sum of the weights of its constituent edges:
w(p) = w(v0,v1) + w(v1,v2) + … + w(vk-1,vk)
Single Source Shortest Path Problem
Given a directed graph G(V,E) with weighted edges w(u,v), define the path weight w(p) of a path p as the sum of its edge weights, as above. For a given source vertex s, find the minimum weight paths to every vertex reachable from s, denoted δ(s,v) = min{ w(p) : p is a path from s to v }.
The final solution will satisfy certain caveats:
• The graph cannot contain any negative weight cycles (otherwise there would be no minimum path, since we could simply continue to follow the negative weight cycle, producing a path weight of -∞).
• The solution cannot contain any positive weight cycles (since the cycle could simply be removed, giving a lower weight path).
• The solution can be assumed to have no zero weight cycles (since they would not affect the minimum value).
Therefore, given these caveats, we know the shortest paths must be acyclic (with ≤ |V| distinct vertices) ⇒ ≤ |V| - 1 edges in each path.
Generic Algorithm
The single source shortest path algorithms use the notation of a predecessor field π and a distance field d for each vertex. The optimal solution will have v.d = δ(s,v) for all v ∈ V.
The solutions utilize the concept of edge relaxation, which is a test to determine whether going through edge (u,v) reduces the distance to v and, if so, updates v.π and v.d. This is accomplished using the condition
if v.d > u.d + w(u,v), then set v.d = u.d + w(u,v) and v.π = u.
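As a concrete illustration, here is a minimal Python sketch of the relaxation test, assuming the d and π fields are kept in dictionaries d and pi keyed by vertex (the names are illustrative, not part of the pseudocode that follows):

def relax(u, v, w_uv, d, pi):
    # Test whether going through edge (u, v) shortens the path to v;
    # if so, update v's distance estimate and predecessor.
    if d[v] > d[u] + w_uv:
        d[v] = d[u] + w_uv
        pi[v] = u

d = {'s': 0, 'v': float('inf')}
pi = {'s': None, 'v': None}
relax('s', 'v', 3, d, pi)   # now d['v'] == 3 and pi['v'] == 's'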
4.3.2. Bellman-Ford Algorithm
The Bellman-Ford algorithm uses relaxation to find single source shortest paths on directed graphs that may contain negative weight edges. The algorithm also detects whether there are any negative weight cycles (in which case no solution exists).
BELLMAN-FORD(G,w,s)
1. INITIALIZE-SINGLE-SOURCE(G,s)
2. for i = 1 to |G.V|-1
3.     for each edge (u,v) ∈ G.E
4.         RELAX(u,v,w)
5. for each edge (u,v) ∈ G.E
6.     if v.d > u.d + w(u,v)
7.         return FALSE
8. return TRUE
INITIALIZE-SINGLE-SOURCE(G,s)
1. for each vertex v ∈ G.V
2.     v.d = ∞
3.     v.π = NIL
4. s.d = 0
RELAX(u,v,w)
1. if v.d > u.d + w(u,v)
2.     v.d = u.d + w(u,v)
3.     v.π = u
Basically the algorithm works as follows:
1. Initialize the d's and π's, and set s.d = 0 ⇒ O(V)
2. Loop |V|-1 times through all edges, checking the relaxation condition to compute minimum distances ⇒ (|V|-1) O(E) = O(VE)
3. Loop through all edges checking for negative weight cycles, which exist if any edge can still be relaxed ⇒ O(E)
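The following is a minimal Python sketch of this procedure, under the assumption that the graph is given as a vertex list and an edge list of (u, v, w) tuples (the function and variable names are illustrative only):

import math

def bellman_ford(vertices, edges, s):
    # INITIALIZE-SINGLE-SOURCE
    d = {v: math.inf for v in vertices}
    pi = {v: None for v in vertices}
    d[s] = 0
    # |V|-1 passes of relaxation over every edge
    for _ in range(len(vertices) - 1):
        for u, v, w in edges:
            if d[v] > d[u] + w:        # RELAX(u, v, w)
                d[v] = d[u] + w
                pi[v] = u
    # Negative weight cycle check: can any edge still be relaxed?
    for u, v, w in edges:
        if d[v] > d[u] + w:
            return d, pi, False
    return d, pi, True

# The example graph used below (source vertex 5):
V = [1, 2, 3, 4, 5]
E = [(1, 3, 6), (1, 4, 3), (2, 1, 3), (3, 4, 2),
     (4, 2, 1), (4, 3, 1), (5, 2, 4), (5, 4, 2)]
d, pi, ok = bellman_ford(V, E, 5)      # d == {1: 6, 2: 3, 3: 3, 4: 2, 5: 0}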
Example
Given the following directed graph, with edges and weights (1,3) w=6, (1,4) w=3, (2,1) w=3, (3,4) w=2, (4,2) w=1, (4,3) w=1, (5,2) w=4, (5,4) w=2.
Using vertex 5 as the source (setting its distance to 0), we initialize all the other distances to ∞.
Iteration 1: Edges (u5,u2) and (u5,u4) relax, updating the distances of vertices 2 and 4 to 4 and 2 respectively.
Iteration 2: Edges (u2,u1), (u4,u2) and (u4,u3) relax, updating the distances of vertices 1, 2, and 3 to 7, 3, and 3 respectively. Note edge (u4,u2) finds a shorter path to vertex 2 by going through vertex 4.
Iteration 3: Edge (u2,u1) relaxes again (since a shorter path to vertex 2 was found in the previous iteration), updating the distance of vertex 1 to 6.
Iteration 4: No edges relax.
The final shortest path distances from vertex 5 are u1.d = 6, u2.d = 3, u3.d = 3, u4.d = 2 (and u5.d = 0).
Negative cycle checks: We now check the relaxation condition one additional time for each edge. If any of the conditions holds (i.e. some edge can still be relaxed), then there exists a negative weight cycle in the graph.
v3.d > u1.d + w(1,3) ⇒ 3 ≯ 6 + 6 = 12 ✓
v4.d > u1.d + w(1,4) ⇒ 2 ≯ 6 + 3 = 9 ✓
v1.d > u2.d + w(2,1) ⇒ 6 ≯ 3 + 3 = 6 ✓
v4.d > u3.d + w(3,4) ⇒ 2 ≯ 3 + 2 = 5 ✓
v2.d > u4.d + w(4,2) ⇒ 3 ≯ 2 + 1 = 3 ✓
v3.d > u4.d + w(4,3) ⇒ 3 ≯ 2 + 1 = 3 ✓
v2.d > u5.d + w(5,2) ⇒ 3 ≯ 0 + 4 = 4 ✓
v4.d > u5.d + w(5,4) ⇒ 2 ≯ 0 + 2 = 2 ✓
Note that for the edges on the shortest paths, the relaxation criterion gives equalities.
Additionally, the path to any reachable vertex can be found by starting at the vertex
and following the π's back to the source.
For example, starting at vertex 1, u1.π = 2, u2.π = 4, u4.π = 5 ⇒ the shortest path
to vertex 1 is {5,4,2,1}.
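As a small illustrative sketch (assuming pi is the predecessor dictionary returned by the Bellman-Ford sketch above), the path can be recovered in Python as:

def path_to(v, pi, s):
    # Walk the predecessor pointers from v back toward the source s.
    path = []
    while v is not None:
        path.append(v)
        v = pi[v]
    path.reverse()
    return path if path and path[0] == s else None   # None if v is unreachable

path_to(1, pi, 5)   # -> [5, 4, 2, 1]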
Just as Prim's algorithm improved on Kruskal's algorithm for MSTs through the use of a priority queue, Dijkstra's algorithm improves on Bellman-Ford for single source shortest paths, also through the use of a priority queue. Unlike Bellman-Ford, however, Dijkstra's algorithm requires that all the weights are non-negative; otherwise the algorithm may fail.
4.3.3. Single-Source Shortest Paths in Directed Acyclic Graphs
By relaxing the edges of a DAG in the order given by a topological sort of its vertices, we can compute single-source shortest paths in Θ(n + m) time, where n = |V| and m = |E|.
DAG-SHORTEST(G,w,s)
1. Topologically sort the vertices of G
2. INITIALIZE-SINGLE-SOURCE(G,s)
3. for each vertex u, taken in topologically sorted (increasing) order
4.     for each v ∈ Adj[u]
5.         RELAX(u,v,w)
Figure shows an example execution of DAG-SHORTEST on a DAG.
Figure: Execution of DAG-SHORTEST. Step (a) shows the state of the graph after initialisation. Following this, the vertices are considered in topological order: c, s, i, and then t. As each vertex u is considered, the vertices in Adj[u] are relaxed.
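A possible Python sketch of this procedure is shown below; it uses Kahn's algorithm for the topological sort and assumes the DAG is given as an edge list of (u, v, w) tuples (the names are illustrative):

import math
from collections import defaultdict, deque

def dag_shortest_paths(vertices, edges, s):
    # Build adjacency lists and in-degrees.
    adj = defaultdict(list)
    indeg = {v: 0 for v in vertices}
    for u, v, w in edges:
        adj[u].append((v, w))
        indeg[v] += 1
    # Step 1: topologically sort the vertices (Kahn's algorithm).
    order = []
    queue = deque(v for v in vertices if indeg[v] == 0)
    while queue:
        u = queue.popleft()
        order.append(u)
        for v, _ in adj[u]:
            indeg[v] -= 1
            if indeg[v] == 0:
                queue.append(v)
    # Step 2: INITIALIZE-SINGLE-SOURCE.
    d = {v: math.inf for v in vertices}
    pi = {v: None for v in vertices}
    d[s] = 0
    # Step 3: one pass of relaxations in topological order ⇒ Θ(V + E).
    for u in order:
        for v, w in adj[u]:
            if d[v] > d[u] + w:
                d[v] = d[u] + w
                pi[v] = u
    return d, pi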
4.3.4. Dijkstra's Algorithm
Dijkstra's algorithm maintains a set S of vertices whose final shortest path distances have already been found, and a priority queue Q of the remaining vertices under discovery, ordered by increasing d values.
DIJKSTRA(G,w,s)
1. INITIALIZE-SINGLE-SOURCE(G,s)
2. S = ∅
3. Q = G.V
4. while Q ≠ ∅
5.     u = EXTRACT-MIN(Q)
6.     S = S ∪ {u}
7.     for each vertex v ∈ G.Adj[u]
8.         RELAX(u,v,w)
INITIALIZE-SINGLE-SOURCE(G,s)
1. for each vertex v ∈ G.V
2.     v.d = ∞
3.     v.π = NIL
4. s.d = 0
RELAX(u,v,w)
1. if v.d > u.d + w(u,v)
2.     v.d = u.d + w(u,v)
3.     v.π = u
Basically the algorithm works as follows:
1. Initialize the d's and π's, set s.d = 0, set S = ∅, and Q = G.V (i.e. put all the vertices into the queue, with the source vertex having the smallest distance).
2. While the queue is not empty, extract the minimum vertex (whose distance is the final shortest path distance at this point), add this vertex to S, and relax (using the same condition as Bellman-Ford) all the edges in the vertex's adjacency list for vertices still in Q, reprioritizing the queue if necessary.
The run time of Dijkstra's algorithm depends on how Q is implemented:
• Simple array with linear search ⇒ O(V² + E) = O(V²)
• Binary min-heap (if G is sparse) ⇒ O((V + E) lg V) = O(E lg V)
• Fibonacci heap ⇒ O(V lg V + E)
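Below is a minimal Python sketch of the binary min-heap variant using heapq. Since heapq has no decrease-key operation, the sketch pushes a new entry on every successful relaxation and skips stale entries when they are popped (a common substitute for reprioritizing the queue); the names are illustrative:

import heapq
import math

def dijkstra(adj, s):
    # adj: dict mapping each vertex to a list of (neighbour, weight) pairs;
    # all weights are assumed to be non-negative.
    d = {v: math.inf for v in adj}
    pi = {v: None for v in adj}
    d[s] = 0
    pq = [(0, s)]          # the priority queue Q, keyed by current d values
    done = set()           # the set S of finished vertices
    while pq:
        du, u = heapq.heappop(pq)      # EXTRACT-MIN(Q)
        if u in done:
            continue                   # stale queue entry, ignore
        done.add(u)
        for v, w in adj[u]:
            if d[v] > du + w:          # RELAX(u, v, w)
                d[v] = du + w
                pi[v] = u
                heapq.heappush(pq, (d[v], v))
    return d, pi

# The example graph below (source vertex 5):
adj = {1: [(3, 6), (4, 3)], 2: [(1, 3)], 3: [(4, 2)],
       4: [(2, 1), (3, 1)], 5: [(2, 4), (4, 2)]}
d, pi = dijkstra(adj, 5)   # d == {1: 6, 2: 3, 3: 3, 4: 2, 5: 0}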
Example
Using the same directed graph as in the Bellman-Ford example:
Using vertex 5 as the source (setting its distance to 0), we initialize all the other
distances to ∞, set S = ∅, and place all the vertices in the queue (resolving ties by
lowest vertex number)
Iteration 1: Dequeue vertex 5 placing it in S (with a distance 0) and relaxing edges
(u5,u2) and (u5,u4) then reprioritizing the queue
Iteration 2: Dequeue vertex 4 placing it in S (with a distance 2) and relaxing edges
(u4,u2) and (u4,u3) then reprioritizing the queue. Note edge (u4,u2) finds a shorter
path to vertex 2 by going through vertex 4
Iteration 3: Dequeue vertex 2 placing it in S (with a distance 3) and relaxing edge
(u2,u1) then reprioritizing the queue.
Iteration 4: Dequeue vertex 3 placing it in S (with a distance 3) and relaxing no
edges then reprioritizing the queue.
Iteration 5: Dequeue vertex 1 placing it in S (with a distance 6) and relaxing no
edges.
The final shortest path distances from vertex 5 are u1.d = 6, u2.d = 3, u3.d = 3, u4.d = 2 (and u5.d = 0).
Just like Bellman-Ford, the path to any reachable vertex can be found by
starting at the vertex and following the π's back to the source.
For example, starting at vertex 1, u1.π = 2, u2.π = 4, u4.π = 5 ⇒ the
shortest path to vertex 1 is {5,4,2,1}.
4.3.5. Difference constraints and shortest paths
Linear Programming: Given an m × n matrix A, an m-vector b, and an n-vector c, we wish to find a vector x of n elements that maximises the objective function c1x1 + c2x2 + … + cnxn subject to the m constraints given by Ax ≤ b.
Approaches to solving linear programming:
• Simplex algorithm
• Ellipsoid algorithm
• Karmarkar's algorithm
Systems of difference constraints
This is a special case of linear programming: each row of the matrix A contains one 1 and one -1, and all other entries of A are 0s. Thus, each constraint can be expressed as
xj - xi ≤ bk
where 1 ≤ i, j ≤ n and 1 ≤ k ≤ m.
For example, a set of four unknowns {x1, x2, x3, x4} and six difference constraints:
x1 - x2 ≤ 6
x1 - x3 ≤ -4
x2 - x3 ≤ -10
x3 - x4 ≤ 14
x4 - x2 ≤ -4
x4 - x1 ≤ -10
... can be represented in the matrix form Ax ≤ b as
A = [  1 -1  0  0 ]      b = [   6 ]
    [  1  0 -1  0 ]          [  -4 ]
    [  0  1 -1  0 ]          [ -10 ]
    [  0  0  1 -1 ]          [  14 ]
    [  0 -1  0  1 ]          [  -4 ]
    [ -1  0  0  1 ]          [ -10 ]
Given a system Ax ≤ b of difference constraints, the corresponding constraint graph is a weighted directed graph G=(V,E) where
V = {v0, v1, v2, …, vn}
... where n is the number of unknowns, v0 is a special source vertex that is added, and vi corresponds to xi for 1 ≤ i ≤ n. The set of edges is
E = { ⟨vi, vj⟩ : xj - xi ≤ bk is a constraint } ∪ { ⟨v0, vi⟩ : i = 1, …, n }
where edge ⟨vi, vj⟩ has weight bk and each edge ⟨v0, vi⟩ has weight 0.
Theorem: Given a system Ax ≤ b of difference constraints, let G be the corresponding constraint graph. If G contains no negative weight cycles, then
x = (δ(v0, v1), δ(v0, v2), …, δ(v0, vn))
is a feasible solution for the system. The figure below shows how this is done for the example with four unknowns above.
Figure: A constraint graph constructed from our ongoing example using simple rules:
(1) Each unknown xi maps onto a vertex vi.
(2) For each constraint xj - xi ≤ bk, there is an edge (vi, vj) with weight bk.
(3) Add a vertex v0 and edges of weight 0 from it to all other vertices.
Next, use an SSSP algorithm that handles negative edge weights (such as Bellman-Ford) to solve for d[vi] for 0 ≤ i ≤ n, using v0 as the source.
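As an illustrative sketch, the ongoing four-unknown example can be solved in Python by building the constraint graph and running Bellman-Ford (reusing the bellman_ford sketch from Section 4.3.2 above; the variable names are illustrative):

# Each constraint xj - xi <= b becomes an edge (vi, vj) with weight b.
constraints = [(1, 2, 6), (1, 3, -4), (2, 3, -10),
               (3, 4, 14), (4, 2, -4), (4, 1, -10)]   # (j, i, b) means xj - xi <= b
n = 4
vertices = list(range(n + 1))                          # vertex 0 plays the role of v0
edges = [(i, j, b) for (j, i, b) in constraints]       # edge vi -> vj with weight b
edges += [(0, i, 0) for i in range(1, n + 1)]          # 0-weight edges from v0
d, pi, ok = bellman_ford(vertices, edges, 0)
# If ok is True, xi = d[i] is a feasible assignment; here it gives
# x = (x1, x2, x3, x4) = (-4, -10, 0, -14), which satisfies all six constraints.
# If ok is False, the constraint graph contains a negative weight cycle and
# the system has no feasible solution.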
