Lecture 16: Data Structures and Algorithms

Transcript

  • 1. Spanning Tree
  • 2. What is a Spanning Tree? • A spanning tree for an undirected graph G = (V, E) is a subgraph of G that is a tree and contains all the vertices of G. • Can a graph have more than one spanning tree? • Can an unconnected graph have a spanning tree? [Diagram: an example graph with vertices a, b, c, d, e, f, u, v.]
  • 3. Minimum Spanning Tree • The weight of a subgraph is the sum of the weights of its edges. • A minimum spanning tree for a weighted graph is a spanning tree T of minimum weight, i.e. one that minimizes w(T) = Σ_{(u,v) ∈ T} w(u,v). [Diagram: a weighted graph with an example MST T of weight w(T) = 15.] • Can a graph have more than one minimum spanning tree?
  • 4. Example of a Problem that Translates into an MST The Problem • Several pins of an electronic circuit must be connected using the least amount of wire. Modeling the Problem • The graph is a complete, undirected graph G = (V, E, w), where V is the set of pins, E is the set of all possible interconnections between pairs of pins, and w(e) is the length of wire needed to connect the pair of pins joined by e. • Find a minimum spanning tree.
  • 5. Greedy Choice We will show two ways to build a minimum spanning tree. • An MST can be grown from the current spanning tree by adding the nearest vertex and the edge connecting that vertex to the MST (Prim's algorithm). • An MST can be grown from a forest of spanning trees by adding the smallest edge connecting two of the trees (Kruskal's algorithm).
  • 6. Notation • Tree vertices: the vertices in the tree constructed so far. • Non-tree vertices: the remaining vertices. Prim's selection rule • Select the minimum-weight edge between a tree vertex and a non-tree vertex and add it to the tree.
  • 7. The Prim Algorithm: Main Idea (a Python sketch of this procedure follows the walkthrough below)
    Select a vertex to be a tree node
    while (there are non-tree vertices) {
        if there is no edge connecting a tree node with a non-tree node
            return "no spanning tree"
        select an edge of minimum weight between a tree node and a non-tree node
        add the selected edge and its new vertex to the tree
    }
    return tree
  • 8.–16. [Diagrams: step-by-step run of Prim's algorithm on a weighted graph with vertices A–F and edge weights between 1 and 6; the final slide highlights the resulting minimum spanning tree.]
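The Prim procedure from slide 7 can be sketched in Python roughly as follows. This is an illustrative sketch, not code from the lecture: the function name prim_mst and the representation of the graph as a dictionary {vertex: {neighbor: weight}} are assumptions made here.

```python
import heapq

def prim_mst(graph, start):
    """Grow an MST from `start` by repeatedly adding the minimum-weight
    edge between a tree vertex and a non-tree vertex (Prim's algorithm).
    `graph` is assumed to map each vertex to {neighbor: weight} for an
    undirected, weighted graph (hypothetical representation)."""
    tree = {start}                    # tree vertices
    mst_edges = []                    # edges selected so far
    # candidate edges leaving the tree, kept in a min-heap ordered by weight
    heap = [(w, start, v) for v, w in graph[start].items()]
    heapq.heapify(heap)
    while len(tree) < len(graph):
        if not heap:
            return None               # no edge joins the tree to the rest: no spanning tree
        w, u, v = heapq.heappop(heap)
        if v in tree:
            continue                  # both endpoints already in the tree
        tree.add(v)
        mst_edges.append((u, v, w))
        for x, wx in graph[v].items():
            if x not in tree:
                heapq.heappush(heap, (wx, v, x))
    return mst_edges

# Usage on a small hypothetical graph (not the one drawn on the slides):
g = {'A': {'B': 5, 'C': 4}, 'B': {'A': 5, 'C': 2}, 'C': {'A': 4, 'B': 2}}
print(prim_mst(g, 'A'))               # [('A', 'C', 4), ('C', 'B', 2)]
```

Using a heap of candidate edges is one common way to realize "select an edge of minimum weight between a tree node and a non-tree node"; a plain scan over all such edges would also match the pseudocode, just more slowly.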
  • 17. Kruskal's Algorithm 1. Each vertex starts in its own cluster. 2. Take the edge e with the smallest weight: if e connects two vertices in different clusters, add e to the MST and merge the two clusters connected by e into a single cluster; if e connects two vertices that are already in the same cluster, ignore it. 3. Continue until n - 1 edges have been selected. (A Python sketch using a union-find structure follows the walkthrough below.)
  • 18.–26. [Diagrams: step-by-step run of Kruskal's algorithm on the same weighted graph with vertices A–F; one step shows an edge being rejected because it would create a cycle, and the final slide highlights the resulting minimum spanning tree.]
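Kruskal's procedure from slide 17 can be sketched with a small union-find (disjoint-set) structure to keep track of the clusters. Again, this is an illustrative sketch rather than code from the lecture; the names kruskal_mst and find and the (weight, u, v) edge format are assumptions.

```python
def kruskal_mst(vertices, edges):
    """Scan edges in order of increasing weight and keep an edge only if
    its endpoints lie in different clusters (Kruskal's algorithm).
    `edges` is assumed to be a list of (weight, u, v) tuples for an
    undirected, weighted graph."""
    parent = {v: v for v in vertices}   # each vertex starts in its own cluster

    def find(v):
        # follow parent links to the cluster representative (with path compression)
        while parent[v] != v:
            parent[v] = parent[parent[v]]
            v = parent[v]
        return v

    mst_edges = []
    for w, u, v in sorted(edges):
        ru, rv = find(u), find(v)
        if ru != rv:                    # different clusters, so no cycle is created
            parent[ru] = rv             # merge the two clusters
            mst_edges.append((u, v, w))
        # otherwise the edge would close a cycle and is ignored
        if len(mst_edges) == len(vertices) - 1:
            break                       # n - 1 edges selected: the tree is complete
    return mst_edges

# Usage on a small hypothetical graph:
vs = ['A', 'B', 'C', 'D']
es = [(1, 'A', 'B'), (3, 'B', 'C'), (2, 'A', 'C'), (4, 'C', 'D')]
print(kruskal_mst(vs, es))              # [('A', 'B', 1), ('A', 'C', 2), ('C', 'D', 4)]
```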
  • 27. Graph Traversal Traversing a graph means visiting all the vertices in the graph exactly once. Breadth First Search (BFS) Depth First Search (DFS)
  • 28. DFS Similar to a preorder (depth-first) traversal of a tree: starting from a given node, the traversal goes as deep as possible along each path before backtracking.
  • 29.–30. [Diagrams: DFS on a graph with vertices V1–V8 and the resulting DFS tree. Example visiting orders shown: V1 - V2 - V5 - V7 - V4 - V8 - V6 - V3 and V1 - V2 - V5 - V7 - V4 - V8 - V3 - V6.]
  • 31. DFS Traversal Visit the vertex v, then visit all the vertices along a path that begins at v: visit v, then a vertex vx adjacent to v; if vx has an adjacent vertex vy, visit it, and so on until a dead end is reached. Dead end: a vertex that has no adjacent vertex, or all of whose adjacent vertices have already been visited.
  • 32. After reaching a dead end, we backtrack to v to see whether it has another adjacent vertex other than vx and, if so, continue the traversal from it; otherwise we continue from an unvisited adjacent vertex of an adjacent vertex, and so on.
  • 33.
    PUSH the starting vertex onto the STACK
    while STACK is not empty do
        POP a vertex V
        if V is not visited
            visit the vertex V
            store V in VISIT
            PUSH all adjacent vertices of V onto the STACK
        end of if
    end of while
    STOP
    (A runnable Python sketch of this procedure follows the worked example below.)
  • 34. [Diagram: a graph G with vertices A, B, C, D, E, F, G, J, K.] Adjacency list: A: F, C, B; B: G, C; C: F; D: C; E: D, C, J; F: D; G: C, E; J: D, K; K: E, G
  • 35. DFS of G starting at J [1] Initially PUSH J onto the STACK. STACK: J VISIT: Ø [2] POP J from the STACK, add it to VISIT and PUSH all neighbors of J onto the STACK. STACK: D, K VISIT: J
  • 36. [3] POP the top element K, add it to VISIT and PUSH all neighbors of K onto the STACK. STACK: D, E, G VISIT: J, K [4] POP the top element G, add it to VISIT and PUSH all neighbors of G onto the STACK. STACK: D, E, E, C VISIT: J, K, G
  • 37. [5] POP the top element C, add it to VISIT and PUSH all neighbors of C onto the STACK. STACK: D, E, E, F VISIT: J, K, G, C [6] POP the top element F, add it to VISIT and PUSH all neighbors of F onto the STACK. STACK: D, E, E, D VISIT: J, K, G, C, F
  • 38. [7] POP the top element D, add it to VISIT and PUSH all neighbors of D onto the STACK. STACK: D, E, E, C VISIT: J, K, G, C, F, D [8] POP the top element C, which is already in VISIT. STACK: D, E, E VISIT: J, K, G, C, F, D
  • 39. [9] POP the top element E, add it to VISIT and PUSH its neighbors onto the STACK. STACK: D, E, D, C, J VISIT: J, K, G, C, F, D, E [10] POP the remaining elements J, C, D, E, D, which are all already in VISIT. STACK: (empty) VISIT: J, K, G, C, F, D, E
  • 40. [Diagram: the same graph and adjacency list as slide 34.] DFS order starting at J: J, K, G, C, F, D, E
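The stack-based DFS from slide 33 can be sketched in Python and run on the adjacency list of slide 34. This is an illustrative sketch: the function name dfs and the dictionary representation of the adjacency list are choices made here, and because the visiting order depends on the order in which neighbors are pushed, the output below differs slightly from the order J, K, G, C, F, D, E traced on the slides.

```python
def dfs(adj, start):
    """Depth-first search with an explicit stack (slide 33): POP a vertex,
    visit it if it is not yet visited, then PUSH all of its adjacent
    vertices onto the stack."""
    stack = [start]                  # push the starting vertex
    visit = []                       # the VISIT list (visiting order)
    while stack:                     # while STACK is not empty
        v = stack.pop()              # POP a vertex V
        if v not in visit:
            visit.append(v)          # visit V and store it in VISIT
            stack.extend(adj[v])     # PUSH all adjacent vertices of V
    return visit

# Adjacency list from slide 34:
adj = {
    'A': ['F', 'C', 'B'], 'B': ['G', 'C'], 'C': ['F'],
    'D': ['C'], 'E': ['D', 'C', 'J'], 'F': ['D'],
    'G': ['C', 'E'], 'J': ['D', 'K'], 'K': ['E', 'G'],
}
print(dfs(adj, 'J'))                 # ['J', 'K', 'G', 'E', 'C', 'F', 'D']
```

Note that A and B are never reached from J with this adjacency list, which matches the VISIT list obtained on the slides.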
  • 41. BFS Traversal Any vertex at level i is visited only after all the vertices at the preceding level, i.e. level i - 1, have been visited.
  • 42. BFS Traversal
    [1] Enter the starting vertex v into a queue Q
    [2] While Q is not empty do
        Delete an item from Q, say u
        If u is not in VISIT
            store u in VISIT
            enter all adjacent vertices of u into Q
    [3] Stop
    (A runnable Python sketch of this procedure follows the worked example below.)
  • 43. [Diagram: the example graph with vertices V1–V8.]
  • 44. [1] Insert the starting vertex V1 into Q. Q = V1 VISIT = Ø [2] Delete an item from Q, let it be u = V1. u is not in VISIT: store u in VISIT and enter its adjacent vertices into Q. Q = V2, V3 VISIT = V1
  • 45. [3] Delete an item from Q, let it be u = V2. u is not in VISIT: store u in VISIT and enter its adjacent vertices into Q. Q = V2, V3, V4, V5 VISIT = V1, V2 [4] Delete an item from Q, let it be u = V3. u is not in VISIT: store u in VISIT and enter its adjacent vertices into Q. Q = V3, V4, V5, V4, V6 VISIT = V1, V2, V3
  • 46. [5] Delete an item from Q, let it be u = V4. u is not in VISIT: store u in VISIT and enter its adjacent vertices into Q. Q = V4, V5, V4, V6, V8 VISIT = V1, V2, V3, V4 [6] Delete an item from Q, let it be u = V5. u is not in VISIT: store u in VISIT and enter its adjacent vertices into Q. Q = V5, V4, V6, V8, V7 VISIT = V1, V2, V3, V4, V5
  • 47. [7] Delete an item from Q, let it be u = V4. u is already in VISIT. Q = V4, V6, V8, V7 VISIT = V1, V2, V3, V4, V5 [8] Delete an item from Q, let it be u = V6. u is not in VISIT: store u in VISIT and enter its adjacent vertices into Q. Q = V6, V8, V7 VISIT = V1, V2, V3, V4, V5, V6
  • 48. [9] Delete an item from Q, let it be u = V8. u is not in VISIT: store u in VISIT and enter its adjacent vertices into Q. Q = V8, V7, V1 VISIT = V1, V2, V3, V4, V5, V6, V8 [10] Delete an item from Q, let it be u = V7. u is not in VISIT: store u in VISIT and enter its adjacent vertices into Q. Q = V7, V1 VISIT = V1, V2, V3, V4, V5, V6, V8, V7
  • 49. [11] Delete an item from Q, let it be u = V1. u is already in VISIT. Q = V1 VISIT = V1, V2, V3, V4, V5, V6, V8, V7 [12] Q is empty: stop. Q = Ø VISIT = V1, V2, V3, V4, V5, V6, V8, V7
  • 50. [Diagram: the original graph with vertices V1–V8 and, beside it, the BFS traversal of the graph.]
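Finally, a Python sketch of the queue-based BFS from slide 42. As with the DFS sketch, the function name bfs and the dictionary adjacency list are illustrative choices; the adjacency list of slide 34 is reused here because the full adjacency structure of the V1–V8 example graph is not listed on the slides.

```python
from collections import deque

def bfs(adj, start):
    """Breadth-first search following slide 42: repeatedly delete a vertex
    from the queue, visit it if it is not yet visited, and enter all of
    its adjacent vertices into the queue."""
    queue = deque([start])           # [1] enter the starting vertex into Q
    visit = []                       # the VISIT list (visiting order)
    while queue:                     # [2] while Q is not empty
        u = queue.popleft()          # delete an item from Q
        if u not in visit:
            visit.append(u)          # store u in VISIT
            queue.extend(adj[u])     # enter all adjacent vertices of u into Q
    return visit                     # [3] stop

# Reusing the adjacency list from slide 34, starting at J:
adj = {
    'A': ['F', 'C', 'B'], 'B': ['G', 'C'], 'C': ['F'],
    'D': ['C'], 'E': ['D', 'C', 'J'], 'F': ['D'],
    'G': ['C', 'E'], 'J': ['D', 'K'], 'K': ['E', 'G'],
}
print(bfs(adj, 'J'))                 # ['J', 'D', 'K', 'C', 'E', 'G', 'F']
```

Unlike DFS, vertices are taken from the front of the queue, so they come out level by level: J first, then its neighbors D and K, then their neighbors, and so on.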
