Graph Exploration Algorithm - Term Paper Presentation

Term paper presentation for the distributed computing course.
Topic: "Graph Exploration"


  1. Parallelizing Depth-First Search for Robotic Graph Exploration. PRESENTED BY VARUN NARANG, B.TECH MATHEMATICS AND COMPUTING, IIT GUWAHATI
  2. Background: The collective behavior of robot swarms fits them for comprehensive search of an environment, which is useful for terrain mapping, crop pollination, and related applications. We study efficient mapping of an obstacle-ridden environment by a collection of robots.
  3. Abstract: We model the environment as an undirected, connected graph and require robots to explore the graph, visiting all nodes and traversing all edges. Here we present a graph exploration algorithm that parallelizes depth-first search, so that each robot completes a roughly equal part of the exploration.
  4. Motivation: We consider the task of exploring a field of crops that will subsequently be pollinated. Conducting exploration first is useful because once the robot swarm has mapped the field entirely, it can determine the best paths to crops. The Harvard RoboBees project aims to build a colony of small, autonomously flying robotic bees that can accomplish pollination.
  5. Assumptions: The environment is an undirected, connected graph whose nodes represent locations in the environment. Edges represent the ability to move directly from one location to another. An edge does not exist between adjacent locations if an obstacle separates them, and does exist otherwise. A colony of homogeneous robots resides at a hive, located at some node.
  6. An unlimited number of robots may occupy a node simultaneously. At each time step, each robot starts at a node and traverses an edge as instructed by the hive. We have implicitly assumed robots have perfect odometry: they know their exact change in position at every step. This implies robots always know their coordinates relative to the hive. Exploration should terminate when the hive knows all nodes and all edges. To this end, at each step all robots transmit their locations back to the hive and notify the hive of any crops detected.
  7. Formal Model: The environment is an undirected graph G = (V, E). An edge {v1, v2} ∈ E exists if no obstacle separates v1 and v2. We require that G is connected so that all nodes are reachable from the hive node vh. Based on the edges robots traverse and the nodes where robots end up, the hive gradually builds a map of G, denoted Gm = (Vm, Em). Every node visited for the first time is added to Vm; every edge traversed for the first time is added to Em.
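     The map-building bookkeeping can be made concrete with a minimal Python sketch; the class and method names (HiveMap, record_traversal) are illustrative assumptions, not code from the paper.

         # Sketch (assumed names): the hive grows Gm = (Vm, Em) from robot reports.
         class HiveMap:
             def __init__(self, hive_node):
                 self.Vm = {hive_node}   # nodes visited so far
                 self.Em = set()         # edges traversed so far, stored as frozensets

             def record_traversal(self, u, v):
                 # A robot reports that it moved from node u to node v.
                 self.Vm.add(v)                   # a first visit adds the node to Vm
                 self.Em.add(frozenset((u, v)))   # a first traversal adds the edge to Em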
  8. Some Key Terms: Nodes are labeled if each node contains information that uniquely identifies it, and unlabeled otherwise. A graph has labeled nodes if a robot can recognize a node it has visited previously. Undirected edges: an edge is undirected if the robot can move along it in either direction.
  9. DFS Algorithm: Depth-first search, or DFS, is a way to traverse a graph. By itself it only visits the vertices of the graph, but many graph algorithms are built on top of DFS. The principle of the algorithm is simple: go forward (in depth) while it is possible, otherwise backtrack.
  10. Complexity Analysis: Assume that the graph is connected. Depth-first search visits every vertex in the graph and checks every edge. Therefore, DFS complexity is O(V + E). If an adjacency matrix is used to represent the graph, the edges adjacent to a vertex cannot be enumerated efficiently, which results in O(V^2) complexity. The DFS algorithm traverses undirected graphs with labeled nodes, making 2|E| edge traversals: two along every edge.
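     To make the representation point concrete, here is a small sketch (mine, not from the slides) of neighbor enumeration under the two representations; with an adjacency list, listing the neighbors of u costs O(deg(u)), while an adjacency matrix forces a scan of a full row of |V| entries.

         # Sketch (hypothetical helpers): neighbor enumeration cost is what separates
         # O(V + E) with an adjacency list from O(V^2) with an adjacency matrix.
         def neighbors_list(adj_list, u):
             # adj_list maps u -> [v1, v2, ...]; O(deg(u)) per call
             return adj_list[u]

         def neighbors_matrix(adj_matrix, u):
             # adj_matrix is a V x V table of 0/1; O(V) per call, even for isolated u
             return [v for v, flag in enumerate(adj_matrix[u]) if flag]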
  11. Since any graph traversal requires at least |E| edge traversals, and DFS requires 2|E| edge traversals, the overhead of DFS is 2. For general graphs, even if the robot is given an unlabeled map, an overhead of 2 is optimal; thus DFS is a generally optimal algorithm with respect to the authors' metric.
  12. How to Reduce Overhead? Multiple robots should be able to divide the work of exploration and reduce runtime more dramatically. Thus, our approach is to parallelize DFS.
  13. Basic Algorithm: We initially provide the robot with three routines, namely assign, expand, and backtrack.
  14. DFS Algorithm: Initially all vertices are white (unvisited). DFS starts at an arbitrary vertex u and runs as follows: mark vertex u as gray (visited); for each edge (u, v) where v is white, run depth-first search from v recursively; finally, mark vertex u as black and backtrack to the parent.
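     The coloring scheme just described translates directly into a short recursive sketch in Python (the function and variable names are mine, not the paper's); note that every edge is crossed exactly twice, once going forward and once while backtracking.

         # Sketch (assumed names): recursive DFS with white/gray/black coloring.
         def dfs(adj, u, color=None):
             if color is None:
                 color = {v: "white" for v in adj}   # every vertex starts unvisited
             color[u] = "gray"                       # u is visited but not yet finished
             for v in adj[u]:                        # traverse edge (u, v)
                 if color[v] == "white":
                     dfs(adj, v, color)              # recurse, then backtrack over (u, v)
             color[u] = "black"                      # u is finished
             return color

     For example, dfs({0: [1, 2], 1: [0], 2: [0]}, 0) returns with every vertex colored black after crossing each of the two edges twice.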
  15. Each iteration of the while loop represents one time step. At each time step, each robot traverses at most one edge. A robot at a finished node cannot expand, so it backtracks. A robot at a started but unfinished node should expand an edge. A robot at an unstarted node should expand an edge if one exists; otherwise it should backtrack, because the node is already finished (the robot has reached a dead end).
  16. The hive waits until the end of the iteration to direct expanding robots where to go. It groups robots based on their current node; then, for each node with a robot, the unexplored edges are divided evenly among all the robots there. Robots expand their assigned edges, and the hive adds these newly explored edges to Em.
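     Putting slides 15 and 16 together, one PDFS time step might look like the following hedged Python sketch; the data structures (per-robot stacks, per-node sets of unexplored edges) and all names are assumptions made for illustration, not the paper's code.

         # Sketch (assumed structures): one time step of parallel DFS.
         # robots: list of dicts, each with a current "node" and a "stack" of nodes
         #         on its DFS path back toward the hive.
         # unexplored: dict mapping each known node to its set of unexplored edges (u, v).
         def pdfs_step(robots, unexplored, map_Em):
             groups = {}
             for r in robots:
                 if unexplored.get(r["node"]):      # unfinished node: candidate to expand
                     groups.setdefault(r["node"], []).append(r)
                 elif r["stack"]:                   # finished node: backtrack one edge
                     r["node"] = r["stack"].pop()
                 # else: empty stack at a finished node, so the robot idles at the hive
             # the hive divides each node's unexplored edges among the robots grouped there
             for node, group in groups.items():
                 edges = list(unexplored[node])
                 for i, r in enumerate(group):
                     if i < len(edges):             # each robot expands at most one edge per step
                         u, v = edges[i]
                         unexplored[node].discard((u, v))
                         map_Em.add(frozenset((u, v)))   # newly explored edge joins Em
                         r["stack"].append(u)            # remember where to backtrack to
                         r["node"] = v
             # (adding v to Vm and registering v's own unexplored edges is omitted here)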
  17. Correctness. Claim 1: PDFS terminates when G is completely explored. Proof: Suppose G is completely explored. All nodes have been finished, so the robots backtrack until their stacks are empty; this is the termination condition of PDFS. Conversely, suppose PDFS terminates at time T. Then every robot's stack is empty, which can only happen once every node has been finished, so all nodes have been finished by time T.
  18. Efficiency Analysis: The runtime is the number of times the outer while loop of PDFS iterates. We investigate the worst-case runtime of the algorithm. For every number of robots k ≥ 1 and every ε < 1, we can construct a graph such that PDFS has runtime at least 2|E|(1 - ε).
  19. Proof (Efficiency): Consider the graph in Figure 3.1, consisting of a "stem" of length r and a "trap" with B edges, where B is large. The hive is the leftmost node of degree 3. The idea is to use up all robots except one in exploring the stem, leaving the remaining robot to explore the trap alone.
  20. We have two cases. Case 1: k = 3^m. At the i-th iteration, 3^(m-i) robots make it i nodes down the stem, where there is still unfinished work. In the first iteration, PDFS assigns 3^(m-1) robots to each of the branches above and below the hive; these robots arrive at nodes that are immediately finished, backtrack to the hive, and then do nothing. At iteration m, a single robot makes it m nodes down the stem and enters the trap. All other robots fall into two categories: those with empty stacks, which are doing nothing, and those whose stacks contain only finished nodes, which are backtracking to the hive. This graph contains three edges per unit length of the stem, so |E| = 3r + B. The runtime is the time taken by the robot that enters the trap: 2r to travel across the stem and back, and 2B to explore the trap by single-robot DFS. We want 2r + 2B ≥ 2|E|(1 - ε) for our claim to hold.
  21. We simply construct a graph with r = m and an appropriate B. Case 2: k is not a power of 3. We choose r = ⌈log_3(k)⌉, making the stem long enough to ensure that only one robot enters the trap. We choose B so that the equations above are satisfied.
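     As a sanity check (my derivation, not spelled out on the slides), "appropriate B" can be made explicit by solving the runtime inequality with |E| = 3r + B and 0 < ε < 1:

         % Derivation sketch: how large B must be for the lower bound to hold.
         2r + 2B \ge 2(3r + B)(1 - \varepsilon)
           \iff r + B \ge 3r + B - \varepsilon(3r + B)
           \iff \varepsilon(3r + B) \ge 2r
           \iff B \ge \frac{r(2 - 3\varepsilon)}{\varepsilon}.

     So, for any fixed 0 < ε < 1, taking B at least r(2 - 3ε)/ε (and nonnegative) makes the runtime 2r + 2B at least 2|E|(1 - ε).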
  22. Worst-Case Runtime: The worst-case runtime of PDFS is 2|E|.
  23. Remarks: PDFS performs no better than single-robot DFS in the worst case. PDFS cannot take advantage of multiple robots when, in a perverse graph, many robots finish their work and some single robot becomes the bottleneck. We cannot allocate finished robots to unfinished work, because our requirement that there be no wasteful edge traversals prevents them from leaving the hive after finishing DFS.
  24. Acknowledgements: Parallelizing Depth-First Search for Robotic Graph Exploration, Chelsea Zhang, Harvard College, Cambridge, Massachusetts, April 1, 2010. Distributed Graph Exploration, Gerard Tel, April 1997. Wikipedia.
