100519 ver final

This paper presents a simulation on the relative location recognition of multi-robots using MRDS (Microsoft Robotics Developer Studio). Multiple robots allow more robust recognition of their relative locations, in both accuracy and performance, compared to a single robot client. However, experimenting with a cooperative robot system that uses multiple robot units is constrained by the large initial production costs of the robot units and by the need to secure a certain amount of space for the experiment. This paper resolves those issues by performing the multi-robot relative location recognition experiment in a simulated virtual space using MRDS, a robot application tool. Moreover, the relative coordinates of the robots were recognized by using omni-directional panoramic vision sensors in the simulation program.

Transcript

  • 1. Simulation on the relative location recognition of multi-robots using MRDS
    ROBOMEC 2010 in ASAHIKAWA
    Kyushu Institute of Technology
    Computer Science and System Engineering
    Kazuaki Tanaka Laboratory
    WONJAE CHO
  • 2. Index
    1. Introduction
    2. Simulation System Configuration
    3. Relative Location Recognition
    4. Experiment Results
    5. Conclusion
  • 3. 1
    1. Introduction
    1.1 Objective
    It is important for a mobile robot to identify its current location so that it can autonomously execute its given tasks in an arbitrary space.
    Sorry..
    Where am I ?
  • 4. 2
    1. Introduction
    1.2 Cooperative Robot System
    However, a multi-robot system requires large initial production costs and a certain minimum amount of experimental space.
    Single Robot System
    Faster processing time than multi-robots
    Cooperative Robot System
    Accomplishes more complex tasks than a single robot working alone
    Reliable and robust
    Completes tasks much faster than a single robot (by working in parallel)
    http://www.ornl.gov/info/ornlreview/v33_2_00/robots.htm
  • 5. 3
    1. Introduction
    1.3 Simulation Application For Robot
    While there may be various solutions to these issues, a simulation experiment in a 3D virtual space can be one of the answers.
    Criterion                        | Microsoft | Evolution       | OROCOS | Skilligent | URBI  | Webots
    Open source                      | No        | No              | No     | No         | Yes   | No
    Free of charge                   | No        | Academic, Hobby | No     | No         | No    | Yes
    Distributed environment          | No        | Yes             | No     | Yes        | Yes   | No
    Real-time                        | No        | No              | No     | No         | No    | No
    Range of supported hardware      | Large     | Small           | Medium | Medium     | Large | Large
    Simulation environment           | Yes       | No              | No     | No         | Yes   | Yes
    Behavior coordination            | Yes       | Yes             | No     | Yes        | Yes   | No
    Reusable service building blocks | Yes       | Yes             | Yes    | Yes        | No    | Not applicable
    http://www.windowsfordevices.com/c/a/Windows-For-Devices-Articles/A-review-of-robotics-software-platforms/
  • 6. 4
    1. Introduction
    1.4 About MRDS
    MRDS 2008 R2 is a Windows-based environment that allows academic users to easily create robotics applications across a wide variety of hardware.
    http://www.microsoft.com/robotics
  • 7. 5
    2. Simulation System Configuration
    2.1 System Configuration
    The system is composed of the following services (Fig. 1):
    Communication Service: I/O service for the data exchanged with the client, using UDP (a sketch of this kind of exchange follows this slide)
    Orchestration Service: recognizes the location of the client by using the sensors
    Simulation Service: 3D modeling
    Client
    Runtime Environment Service: Concurrency and Coordination Runtime (CCR) and Decentralized Software Services (DSS)
    Fig. 1 System block diagram
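    A minimal sketch of the kind of UDP data exchange the Communication Service performs. The port number and JSON payload below are assumptions made for illustration; the actual wire format is not described in the slides.
    ```python
    import json
    import socket

    # Hypothetical address and message layout, not taken from the slides.
    SIM_ADDR = ("127.0.0.1", 50007)

    def send_positions(positions):
        """Client side: push recognized robot positions to the simulation over UDP."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.sendto(json.dumps(positions).encode("utf-8"), SIM_ADDR)

    def receive_positions():
        """Simulation side: block until one position message arrives."""
        with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
            sock.bind(SIM_ADDR)
            data, _ = sock.recvfrom(4096)
            return json.loads(data.decode("utf-8"))
    ```
    For example, the client might call send_positions({"R2": [1.0, 0.0], "R3": [0.5, 0.9]}) after each recognition cycle.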
  • 8. 6
    2. Simulation System Configuration
    2.2 Simulation Robot Configuration
    Pioneer3DX (produced by MobileRobots)
    Using the SimpleDashboard service
    Motor Service (control of the 2 wheels; a kinematics sketch follows this slide)
    Camera Service (omni-directional vision)
    Communication Service (UDP)
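    As a rough illustration of what the motor service controls, here is a minimal differential-drive sketch for a two-wheel robot such as the Pioneer3DX. The wheel base and wheel radius are assumed values, not figures from the slides.
    ```python
    WHEEL_BASE = 0.38      # metres between the two drive wheels (assumed)
    WHEEL_RADIUS = 0.095   # wheel radius in metres (assumed)

    def wheel_speeds(linear_v, angular_w):
        """Convert a desired body velocity (m/s, rad/s) into left/right
        wheel angular speeds (rad/s) for a two-wheel differential drive."""
        v_left = linear_v - angular_w * WHEEL_BASE / 2.0
        v_right = linear_v + angular_w * WHEEL_BASE / 2.0
        return v_left / WHEEL_RADIUS, v_right / WHEEL_RADIUS
    ```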
  • 9. 7
    2. System Configuration
    2.3 Omni-directional Vision on Simulation
    A hardware-based method is used, which acquires panoramic images by combining multiple cameras.
    Step 1: Take a picture with each camera (Front, Right, Rear, Left; field of view = 90° each)
    Step 2: Integrate the images in the sequence Front -> Right -> Rear -> Left (a sketch of this step follows this slide)
    Step 3: Display on the simulation
    Appendix A
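    A minimal sketch of the image-integration step, assuming each of the four simulated cameras returns a same-height RGB frame as a NumPy array and that the 90° views are simply concatenated in the capture sequence.
    ```python
    import numpy as np

    def build_panorama(front, right, rear, left):
        """Join four 90-degree views into one panoramic image by horizontal
        concatenation in the capture sequence front -> right -> rear -> left."""
        views = [front, right, rear, left]
        if len({v.shape[0] for v in views}) != 1:
            raise ValueError("all views must share the same height")
        return np.hstack(views)
    ```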
  • 10. 8
    2. System Configuration
    2.3 Appendix A, Omni-Directional Image on the simulation
    Fig. 2 Panoramic Image on the Simulation
  • 11. 9
    3. Relative Location Recognition
    3.0 Image Processing Configuration
    Image data from the client is processed in the following stages (Fig. 3):
    Preprocessing: 1. binarization (adaptive method), 2. labeling (sequential labeling)
    Template matching: Normalized Gray-level Correlation (NGC)
    Calculation of position: using the relative coordinate system
    Action: tracking of the relative position using a particle filter
    Fig. 3 Block diagram depicting the relative location recognition
  • 12. 10
    3. Relative Location Recognition
    3.1 Extracting the Region of Interest
    Step 1: Histogram analysis
    Step 2: Binarization using an adaptive method
    Step 3: Sequential labeling (a sketch of steps 2 and 3 follows this slide)
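    A minimal sketch of steps 2 and 3 using OpenCV. The block size, offset and minimum area are assumed values chosen for illustration, not the parameters used in the paper.
    ```python
    import cv2

    def extract_regions_of_interest(gray, min_area=50):
        """Binarize a grayscale frame with an adaptive threshold, then label
        connected components and keep those covering at least min_area pixels."""
        binary = cv2.adaptiveThreshold(
            gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
            cv2.THRESH_BINARY_INV, blockSize=31, C=10)
        num, labels, stats, centroids = cv2.connectedComponentsWithStats(binary)
        rois = []
        for i in range(1, num):            # label 0 is the background
            x, y, w, h, area = stats[i]
            if area >= min_area:
                rois.append((x, y, w, h))  # bounding box of one candidate robot
        return rois
    ```
    The returned bounding boxes are the regions that the template-matching step on the next slide searches.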
  • 13. 11
    3. Relative Location Recognition
    3.2 Template Matching (Normalized Gray-level Correlation: NGC)
    i, j: pixel indices
    a(i, j): brightness of the compared image region m at pixel (i, j)
    b(i, j): brightness of the template pattern t at pixel (i, j)
    Template size: M × N
    (One standard form of the NGC score is sketched after this slide.)
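    One standard form of the NGC score, which may differ in detail from the exact definition used on the slide, is sketched below together with a brute-force template search over the image.
    ```python
    import numpy as np

    def ngc_score(region, template):
        """Normalized gray-level correlation between an image region and a
        template of the same M x N size.  Returns a value in [-1, 1]; 1 means
        a perfect match.  This uses the common zero-mean normalized form."""
        a = region.astype(np.float64) - region.mean()
        b = template.astype(np.float64) - template.mean()
        denom = np.sqrt((a * a).sum() * (b * b).sum())
        if denom == 0:
            return 0.0
        return float((a * b).sum() / denom)

    def match_template(image, template):
        """Brute-force search: slide the template over the image and return the
        top-left corner of the window with the highest NGC score."""
        H, W = image.shape
        M, N = template.shape
        best, best_pos = -1.0, (0, 0)
        for y in range(H - M + 1):
            for x in range(W - N + 1):
                s = ngc_score(image[y:y + M, x:x + N], template)
                if s > best:
                    best, best_pos = s, (x, y)
        return best_pos, best
    ```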
  • 14. 12
    3. Relative Location Recognition
    3.3 Calculating the Relative Locations of Robots
    In the world coordinate system, three robots R1(x1, y1), R2(x2, y2) and R3(x3, y3) define a relative coordinate system among themselves; d12, d13 and d23 denote the pairwise distances between the robots (a coordinate sketch follows this slide).
    Fig. 4 Relative coordinate system among the robots
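    A minimal sketch of one standard way to recover relative coordinates from the pairwise distances d12, d13 and d23: fix R1 at the origin and R2 on the x-axis, then locate R3 by trilateration. This illustrates the geometry of Fig. 4 and is not necessarily the exact calculation used in the paper; note that the sign of y3 is ambiguous from distances alone.
    ```python
    import math

    def relative_coordinates(d12, d13, d23):
        """Place R1 at the origin and R2 on the x-axis, then recover R3 by
        trilateration from the pairwise distances.
        Returns ((x1, y1), (x2, y2), (x3, y3)) in the relative frame."""
        r1 = (0.0, 0.0)
        r2 = (d12, 0.0)
        x3 = (d13 ** 2 + d12 ** 2 - d23 ** 2) / (2.0 * d12)
        y3_sq = d13 ** 2 - x3 ** 2
        y3 = math.sqrt(max(y3_sq, 0.0))   # clamp small negatives caused by noise
        return r1, r2, (x3, y3)
    ```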
  • 15. 13
    4. Experiment Results
    4.1 Experimental Environment
    The robots were placed at 1 m intervals and their locations were measured by the simulated sensors.
    Virtual environment: 1) space: 10 m × 10 m × 1 m; 2) number of robots: 3 units; 3) no obstacles
    Experimental platform: 1.86 GHz CPU and 1.5 GB RAM
  • 16. 14
    4. Experiment Results
    4.2 Simulation Result UI
    The images collected for location recognition and the objects identified in each image are displayed.
    A. Drawing the relative positions of the robots on the map
    B. Extracting regions of interest in the simulation
    C. Resulting positions of the recognition process
    Appendix B
  • 17. 15
    4. Experiment Results
    4.2 Appendix B, Simulation video
  • 18. 16
    4. Experiment Results
    4.3 Location Recognition Result and Error rates
    The average error rate is 5.26.
  • 19. 17
    5. Conclusion
    The simulation system has shown that the relative locations of robots can be recognized by using simulated panoramic vision cameras.
    However, since the current system recognizes only the locations of stationary robots, it cannot track the locations of robots in motion.
    Therefore !!
  • 20. 18
    5. Conclusion
    A simulation system that can track the locations of robots in motion using a particle filter may be developed in the future (a sketch of such a filter follows this slide).
    Fig. 5 Tracking an object using a particle filter
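    A minimal sketch of a bootstrap particle filter for tracking one robot's 2-D relative position, assuming a random-walk motion model and a Gaussian measurement likelihood; the noise parameters are illustrative assumptions, not values from the paper.
    ```python
    import numpy as np

    def particle_filter_step(particles, weights, measurement,
                             motion_noise=0.05, meas_noise=0.1):
        """One predict/update/resample cycle of a bootstrap particle filter.

        particles   : (N, 2) array of candidate (x, y) positions
        weights     : (N,) importance weights, summing to 1
        measurement : observed (x, y) position from the vision system
        """
        n = len(particles)
        # Predict: random-walk motion model (assumed)
        particles = particles + np.random.normal(0.0, motion_noise, particles.shape)
        # Update: weight each particle by the Gaussian likelihood of the measurement
        d2 = ((particles - measurement) ** 2).sum(axis=1)
        weights = weights * np.exp(-d2 / (2.0 * meas_noise ** 2))
        weights += 1e-300                       # avoid an all-zero weight vector
        weights /= weights.sum()
        # Resample when the effective sample size drops below half the particles
        if 1.0 / (weights ** 2).sum() < n / 2:
            idx = np.random.choice(n, size=n, p=weights)
            particles, weights = particles[idx], np.full(n, 1.0 / n)
        estimate = (particles * weights[:, None]).sum(axis=0)
        return particles, weights, estimate
    ```
    Each new measurement from the vision pipeline would be passed to particle_filter_step, and the weighted mean of the particles gives the tracked relative position.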
