100519 ver final

This paper presents a simulation of relative location recognition for multi-robots using MRDS (Microsoft Robotics Developer Studio). Multiple robots allow more robust recognition of their relative locations, in both accuracy and performance, than a single robot client. However, experimenting with a cooperative robot system that uses multiple robot units is constrained by the large initial production cost of the robot units and by the need to secure a sufficiently large experimental space. This paper resolves those issues by performing the multi-robot relative location recognition experiment in a virtual space simulated with MRDS, a robot application tool. Moreover, the relative coordinates of the robots are recognized by using omni-directional panoramic vision sensors in the simulation program.


  1. Simulation on the Relative Location Recognition of Multi-Robots Using MRDS<br />ROBOMEC 2010 in ASAHIKAWA<br />Kyushu Institute of Technology<br />Computer Science and Systems Engineering<br />Kazuaki Tanaka Laboratory<br />WONJAE CHO<br />
  2. Index<br />1 Introduction<br />2 Simulation System Configuration<br />3 Relative Location Recognition<br />4 Experiment Results<br />5 Conclusion<br />
  3. 1. Introduction<br />1.1 Objective<br />It is important for a mobile robot to identify its current location for the autonomous execution of its given tasks in an arbitrary space.<br />Sorry.. Where am I?<br />
  4. 1. Introduction<br />1.2 Cooperative Robot System<br />However, a multi-robot system requires a large initial production cost and a certain minimum amount of space for experiments.<br />Single Robot System: faster processing time than a multi-robot system.<br />Cooperative Robot System: accomplishes more complex tasks than a single robot working alone; reliable and robust; completes tasks much faster than a single robot (by working in parallel).<br />http://www.ornl.gov/info/ornlreview/v33_2_00/robots.htm<br />
  5. 1. Introduction<br />1.3 Simulation Applications for Robots<br />While there may be various solutions to these issues, a simulation experiment in a 3D-simulated space can be one of the answers.<br />Platform comparison (Microsoft / Evolution / OROCOS / Skilligent / URBI / Webots):<br />Open source: No / No / No / No / Yes / No<br />Free of charge: No / Academic, Hobby / No / No / No / Yes<br />Distributed environment: No / Yes / No / Yes / Yes / No<br />Real-time: No / No / No / No / No / No<br />Range of supported hardware: Large / Small / Medium / Medium / Large / Large<br />Simulation environment: Yes / No / No / No / Yes / Yes<br />Behavior coordination: Yes / Yes / No / Yes / Yes / No<br />Reusable service building blocks: Yes / Yes / Yes / Yes / No / Not applicable<br />http://www.windowsfordevices.com/c/a/Windows-For-Devices-Articles/A-review-of-robotics-software-platforms/<br />
  6. 1. Introduction<br />1.4 About MRDS<br />MRDS 2008 R2 is a Windows-based environment that lets academic users easily create robotics applications across a wide variety of hardware.<br />http://www.microsoft.com/robotics<br />
  7. 2. Simulation System Configuration<br />2.1 System Configuration<br />Client: recognizes the locations of the clients by using sensors.<br />Communication Service: I/O service for the data (using UDP).<br />Simulation Service: 3D modeling.<br />Orchestration Service: runtime environment services, the Concurrency and Coordination Runtime (CCR) and Decentralized Software Services (DSS).<br />Fig. 1 System block diagram<br />
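The slide only states that the communication service performs data I/O over UDP; the message layout below (a small JSON payload carrying a robot id and position) is purely illustrative, not the format used by the original system. A minimal sketch in Python:

```python
import json
import socket

def send_position(sock, addr, robot_id, x, y):
    """Send one robot's position as a small JSON datagram.
    The {"id", "x", "y"} message layout is a hypothetical format;
    the original system only specifies that data I/O uses UDP."""
    payload = json.dumps({"id": robot_id, "x": x, "y": y}).encode("utf-8")
    sock.sendto(payload, addr)

def recv_position(sock):
    """Receive one position datagram and unpack it."""
    data, _sender = sock.recvfrom(1024)
    msg = json.loads(data.decode("utf-8"))
    return msg["id"], msg["x"], msg["y"]
```

A receiver simply binds a UDP socket and calls `recv_position` in a loop; no connection setup is needed, which suits a loosely coupled multi-robot service model.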
  8. 2. Simulation System Configuration<br />2.2 Simulation Robot Configuration<br />Pioneer 3-DX (produced by MobileRobots)<br />Using the SimpleDashboard service<br />Motor service (control of 2 wheels)<br />Camera service (omni-directional vision)<br />Communication service (UDP)<br />
  9. 2. System Configuration<br />2.3 Omni-Directional Vision on the Simulation<br />We use a hardware-based method that acquires panoramic images from multiple cameras.<br />Step 1: Take a picture (Front, Right, Left, Rear; field of view = 90° each)<br />Step 2: Integrate the images (sequence: Front -> Right -> Rear -> Left)<br />Step 3: Display on the simulation<br />Appendix A<br />
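Because each simulated camera covers exactly a 90° field of view, the integration step can be sketched as a simple horizontal concatenation of the four views in the capture order Front -> Right -> Rear -> Left (assuming equal-sized NumPy image arrays; stitching real camera images would additionally require lens-distortion correction):

```python
import numpy as np

def build_panorama(front, right, rear, left):
    """Concatenate four 90-degree views side by side into one
    360-degree panoramic image, in the order used on the slide:
    Front -> Right -> Rear -> Left."""
    views = (front, right, rear, left)
    height = views[0].shape[0]
    if any(v.shape[0] != height for v in views):
        raise ValueError("all views must share the same height")
    return np.hstack(views)
```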
  10. 2. System Configuration<br />2.3 Appendix A: Omni-Directional Image on the Simulation<br />Fig. 2 Panoramic image on the simulation<br />
  11. 3. Relative Location Recognition<br />3.0 Image Processing Configuration<br />Preprocessing: 1. Binarization (adaptive method); 2. Labeling (sequential labeling)<br />Template Matching: Normalized Gray-level Correlation (NGC)<br />Calculation of Position: using the relative coordinate system<br />Tracking of Relative Position: using a particle filter<br />Fig. 3 Block diagram depicting the relative location recognition, from client image data to action<br />
  12. 3. Relative Location Recognition<br />3.1 Extracting the Region of Interest<br />Step 1: Histogram analysis<br />Step 2: Binarization using the adaptive method<br />Step 3: Sequential labeling<br />
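Steps 2 and 3 can be sketched as a mean-based adaptive threshold followed by classic two-pass sequential labeling with union-find label merging. The block size and offset below are illustrative parameters, not values from the paper:

```python
import numpy as np

def adaptive_binarize(gray, block=15, offset=10):
    """Mark a pixel as foreground when it is brighter than the mean of
    its local block by more than `offset` (adaptive-mean thresholding)."""
    H, W = gray.shape
    out = np.zeros((H, W), dtype=np.uint8)
    r = block // 2
    for i in range(H):
        for j in range(W):
            win = gray[max(0, i - r):i + r + 1, max(0, j - r):j + r + 1]
            out[i, j] = 1 if gray[i, j] > win.mean() + offset else 0
    return out

def sequential_label(binary):
    """Two-pass sequential connected-component labeling (4-connectivity).
    The first pass assigns provisional labels and records merges in a
    union-find table; the second pass resolves the final labels."""
    H, W = binary.shape
    labels = np.zeros((H, W), dtype=int)
    parent = [0]  # parent[k] == k means k is a root; 0 is background

    def find(x):
        while parent[x] != x:
            x = parent[x]
        return x

    for i in range(H):
        for j in range(W):
            if not binary[i, j]:
                continue
            up = labels[i - 1, j] if i > 0 else 0
            left = labels[i, j - 1] if j > 0 else 0
            if up == 0 and left == 0:          # start a new component
                parent.append(len(parent))
                labels[i, j] = len(parent) - 1
            elif up and left:                  # merge the two neighbors
                a, b = find(up), find(left)
                labels[i, j] = min(a, b)
                parent[max(a, b)] = min(a, b)
            else:                              # continue one neighbor
                labels[i, j] = find(up or left)
    for i in range(H):                         # second pass
        for j in range(W):
            if labels[i, j]:
                labels[i, j] = find(labels[i, j])
    return labels
```

Each resulting label then marks one candidate robot region whose centroid can be handed to the template-matching stage.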
  13. 3. Relative Location Recognition<br />3.2 Template Matching (Normalized Gray-level Correlation: NGC)<br />'i', 'j': pixel indices<br />a(i, j): brightness of the compared section of the region m<br />b(i, j): brightness of the template pattern t<br />Template size: M × N<br />
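With these definitions, the NGC score is the zero-mean normalized cross-correlation between the compared region and the template, NGC = Σ(a − ā)(b − b̄) / √(Σ(a − ā)² · Σ(b − b̄)²), which equals 1 for a perfect match and is insensitive to uniform brightness changes. A brute-force sketch (a production implementation would use an integral-image or FFT speedup):

```python
import numpy as np

def ngc(region, template):
    """Normalized gray-level correlation between an image region and a
    template of the same size.  Returns a value in [-1, 1]."""
    a = region.astype(float) - region.mean()
    b = template.astype(float) - template.mean()
    denom = np.sqrt((a * a).sum() * (b * b).sum())
    if denom == 0:          # flat region or flat template
        return 0.0
    return float((a * b).sum() / denom)

def match_template(image, template):
    """Slide the M x N template over the image and return the top-left
    corner of the window with the highest NGC score."""
    M, N = template.shape
    H, W = image.shape
    best_score, best_pos = -1.0, (0, 0)
    for i in range(H - M + 1):
        for j in range(W - N + 1):
            score = ngc(image[i:i + M, j:j + N], template)
            if score > best_score:
                best_score, best_pos = score, (i, j)
    return best_pos, best_score
```

`match_template` returns the position of the best-matching M × N window together with its score, so a minimum-score threshold can reject regions that contain no robot.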
  14. 3. Relative Location Recognition<br />3.3 Calculating the Relative Locations of Robots<br />R1(x1, y1), R2(x2, y2), R3(x3, y3); pairwise distances d12, d13, d23<br />Fig. 4 Relative coordinate system among robots (relative coordinate frame drawn over the world coordinate frame)<br />
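Fixing R1 at the origin of the relative frame and R2 on its positive x-axis, the three pairwise distances of Fig. 4 determine R3 by trilateration, up to a reflection about the x-axis: x3 = (d12² + d13² − d23²) / (2·d12) and y3 = √(d13² − x3²). A sketch of this construction (the slide shows the coordinate frame but not the formulas, so this derivation is supplied here):

```python
import math

def relative_coordinates(d12, d13, d23):
    """Recover robot positions in the relative frame from the three
    pairwise distances: R1 is pinned to the origin, R2 to the positive
    x-axis, and R3 follows by trilateration.  The sign of y3 is fixed
    to the positive half-plane, since distances alone cannot resolve
    the mirror ambiguity."""
    r1 = (0.0, 0.0)
    r2 = (d12, 0.0)
    x3 = (d12 ** 2 + d13 ** 2 - d23 ** 2) / (2.0 * d12)
    y3 = math.sqrt(max(d13 ** 2 - x3 ** 2, 0.0))
    return r1, r2, (x3, y3)
```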
  15. 4. Experiment Results<br />4.1 Experimental Environment<br />The robots were placed at 1 m intervals and their locations were measured by simulated sensors.<br />Virtual environment: 1) space: 10 m × 10 m × 1 m; 2) number of robots: 3 units; 3) no obstacles.<br />Experimental platform: 1.86 GHz CPU with 1.5 GB RAM<br />
  16. 4. Experiment Results<br />4.2 Simulation Result UI<br />The UI displays the images collected for location recognition and the objects identified in each image.<br />A. Drawing the relative positions of the robots on the map<br />B. Extracting regions of interest on the simulation<br />C. Resulting positions of the recognition process<br />Appendix B<br />
  17. 4. Experiment Results<br />4.2 Appendix B: Simulation Video<br />
  18. 4. Experiment Results<br />4.3 Location Recognition Results and Error Rates<br />The average error rate is 5.26.<br />
  19. 5. Conclusion<br />The simulation system has shown that the relative locations of robots can be recognized by using simulated panoramic vision cameras.<br />However, since the current system recognizes only the locations of stationary robots, it cannot track the locations of moving robots.<br />Therefore !!<br />
  20. 5. Conclusion<br />A simulation system that can track the locations of robots in motion, using a particle filter, may be developed in the future.<br />Fig. 5 Tracking an object using a particle filter<br />
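The tracking extension proposed here can be sketched as a bootstrap particle filter over a 2-D robot position. The motion and measurement noise levels below are illustrative assumptions, not values from the paper:

```python
import numpy as np

def particle_filter_step(particles, measurement,
                         motion_std=0.1, meas_std=0.5, rng=None):
    """One predict/update/resample cycle of a bootstrap particle filter
    tracking a 2-D position.  `particles` is an (N, 2) array."""
    if rng is None:
        rng = np.random.default_rng()
    # Predict: diffuse particles with Gaussian motion noise.
    particles = particles + rng.normal(0.0, motion_std, particles.shape)
    # Update: weight each particle by the likelihood of the measurement.
    sq_dist = ((particles - measurement) ** 2).sum(axis=1)
    weights = np.exp(-sq_dist / (2.0 * meas_std ** 2))
    weights /= weights.sum()
    # Resample: draw particles in proportion to their weights.
    idx = rng.choice(len(particles), size=len(particles), p=weights)
    return particles[idx]
```

Calling the step repeatedly with each new vision measurement keeps the particle cloud concentrated around the moving robot, and the cloud mean serves as the position estimate.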
