Omni-directional Vision and 3D Robot Animation Based Teleoperation of Hydraulically Actuated Hexapod Robot COMET-IV Hiroshi OHROKU Control and Robotics Lab. CHIBA University
Background http://www.yomiuri.co.jp/feature/20080614-2892868/news/20080616-OYT1T00506.htm http://headlines.yahoo.co.jp/hl Expectations for robots are increasing! Assistance during hazardous operations: disaster-relief work, construction, mine detection and clearance, etc.
Background Utility of multi-legged robots: disaster sites, construction sites, mine fields, rough terrain. Multi-legged robots have high stability and mobility in rough terrain. Sites of hazardous operations: multi-legged robots can enter areas that are difficult for wheeled and crawler-type robots to reach.
Development of COMET-IV: COMET-I → COMET-II → COMET-III → COMET-IV. Development goal: fully autonomous locomotion on outdoor rough terrain; assistance during various hazardous operations.
Large-scale Legged Robots (Past): Walking Forest Machine, MECANT, ASV, TITAN-XI
About autonomous systems. Fully autonomous system / Half-autonomous system. < Intelligent system > - Task management & planning - Environment recognition - Self-localization (Upper-layer system). < Robust controller > - Force & position control (Lower-layer system). < Robust controller > - Force & position control (Lower-layer system). Teleoperation system - Image of real environment - Sensor data - Operation - Supervision. It is very difficult to implement a fully autonomous robot in a disaster-stricken area.
Tele-robotics research (1) http://pc.watch.impress.co.jp/docs/2002/1219/hrp.htm Master-slave system + 3D Animation + Open Dynamics Engine (ODE) http://const.tokyu.com/topics/1998/topics_03.html Master-slave system + FES (Functional Electrical Stimulation) based bilateral control
Tele-robotics research (2) K. Saitoh, T. Machida, K. Kiyokawa, H. Takemura: A Mobile Robot Control Interface Using Omnidirectional Images and 3D Geometric Models, Technical Report of IEICE, Multimedia and Virtual Environment, 105-256, 7/12 (2005)
Focus of my research There is little research on teleoperation systems that assist the operator in controlling a large-scale legged robot. In rough-terrain operation, the body height, attitude, and leg movement all change. A teleoperation assistant system is therefore indispensable for controlling a legged robot in an outdoor environment. Development of the teleoperation assistant system: ■ Omni-directional vision ■ 3D Animation of robot movement
Development Tasks: Autonomous Navigation System; Teleoperation Assistant System; Control System (Gait Planning, Foot Trajectory Tracking, Force Control, Attitude Control). Design specification in locomotion: walking speed max 1 km/h, vertical step max 1 m, gradient max 20 deg, omni-directional walking.
Hardware Specifications Leg 1 Leg 2 Leg 3 Leg 4 Leg 5 Leg 6 L×W×H: 2.5×3.3×2.8 (m) Weight: 2200 (kg) Power Source (Max. Output): Gasoline Engine ×2 (25.0 ps / 3600 rpm) Supply Pressure: 22 (MPa) Supply Flow: 78 × 2 (L/min) W: 3.3 m H: 2.8 m L: 2.5 m
Hardware Specifications LRF ( Laser Range Finder ) Stereo vision Omni-directional Vision sensor
Coordinate System of Leg L1 L2 L3 L4 Link / Length (m) / Range of Motion (deg): Shoulder L1, 0, -180° ≦ θ1 ≦ 180°; Thigh L2, 1.13, 50° ≦ θ2 ≦ 142°; Shank L3, 0.77, 36.6° ≦ θ3 ≦ 150°; Foot L4, 0.39, -47.9° ≦ θ4 ≦ 103°
Basic configuration of  hydraulic control system
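The block diagram itself is not reproduced here. As a rough sketch of the lower-layer loop described in the speaker notes (joint angle read via the A/D board, control voltage written to the D/A board), assuming a simple per-joint PID position controller of the kind used in the walking test; gains and the voltage range are assumed values:

```python
class JointPID:
    """Minimal PID position controller for one hydraulic joint.
    Gains and the +/-10 V output range are assumed values."""
    def __init__(self, kp, ki, kd, dt, v_limit=10.0):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.v_limit = v_limit
        self.integral = 0.0
        self.prev_error = 0.0

    def update(self, target_angle, measured_angle):
        # Position error between commanded and measured joint angle.
        error = target_angle - measured_angle
        self.integral += error * self.dt
        derivative = (error - self.prev_error) / self.dt
        self.prev_error = error
        voltage = self.kp * error + self.ki * self.integral + self.kd * derivative
        # Command voltage for the servo valve via the D/A board, saturated.
        return max(-self.v_limit, min(self.v_limit, voltage))
```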
Omni-directional gait In this system, we applied the following trajectories: "The Standard Circular Gait" [*] and the "Low impact foot trajectory" [**]. If it is possible to change the body trajectory arbitrarily, we can apply various navigation strategies. The ability to change the body trajectory arbitrarily at any step during walking is needed. [*] S. Hirose, H. Kikuchi, Y. Umetani: The Standard Circular Gait of the Quadruped Walking Vehicle, Journal of the Robotics Society of Japan, 2-6, 41/52 (1984) [**] Y. Sakakibara, K. Kan, Y. Hosoda, M. Hattori, M. Fujie: Low Impact Foot Trajectory for a Quadruped Walking Machine, Journal of the Robotics Society of Japan, 8-6, 22/31 (1990)
Omni-directional gait Circular gait Rotation center Crab gait Crab gait Circular gait The circular gait includes all gaits [*]
Definition of coordinate system X_c Y_c X_s1 Y_s1 <FRONT> <REAR> <LEFT> <RIGHT> Leg1 Leg2 Leg3 Leg4 Leg5 Leg6 O_s1, Z_s1 O_c, Z_c X_t Y_t O_t Rotation-center coordinate system, body coordinate system, shoulder coordinate system. R_ct; θ_ct: crab angle
Condition of circular angle setting Condition: walking at an arbitrary crab angle is achieved by setting the rotation radius R_ct → ∞, so that Circular gait ≒ Crab gait.
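The equation image is not reproduced here; the following is our reading of this limit (an assumption), keeping the arc length s = R_ct θ_tb of one cycle fixed while R_ct grows:

```latex
% Body-center displacement over one gait cycle in the circular gait, written in a
% frame whose y axis points along the initial walking direction (assumed
% parameterization; R_ct is the rotation radius, \theta_tb the traverse angle).
\Delta x = R_{ct}\bigl(1-\cos\theta_{tb}\bigr), \qquad
\Delta y = R_{ct}\sin\theta_{tb}
% Hold the arc length s = R_{ct}\,\theta_{tb} constant and let R_{ct}\to\infty:
\Delta x = \frac{s^{2}}{2R_{ct}} + O\!\left(R_{ct}^{-3}\right) \to 0, \qquad
\Delta y \to s
% The arc degenerates into a straight step of length s,
% i.e. the crab gait at crab angle \theta_{ct}.
```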
Teleoperation assistant system Instruction  information Log Omni-directional  image  Generated  image  Received sensor data  Map  3D Animation of robot
Peripheral Device ■ Omni-directional vision sensor    - Digital Video Camera : DCR-HC48 ( SONY )    -  Hyperbolic Mirror   ■ Joystick: Cyborg Evo Force (  Saitek  ) Interface USB1.1 Size  ( mm ) 210 × 199 × 240  Buttons 11 Axes 3 Diameter of mirror 82 mm Angle of elevation 15° Angle of depression 50°
Instruction information The Action value selects the basic movement: ■ Stand up ■ Sit down ■ Walking start ■ Walking stop. Teleoperation Computer → Locomotion Computer via UDP socket (User Datagram Protocol). Action: action value. Cycle_Time: cycle time. θ_tb: traverse angle of the body center in one cycle [O_t]. R_ct, θ_ct: position of the rotation center [polar coordinates in O_c].
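A minimal sketch of sending one instruction packet. The slides only state that these fields travel over a UDP socket, so the byte layout, action codes, address and port below are assumptions:

```python
import socket
import struct

LOCOMOTION_PC = ("192.168.0.10", 5000)   # hypothetical address and port

# Assumed codes for the four basic movements listed on the slide.
ACTIONS = {"stand_up": 1, "sit_down": 2, "walk_start": 3, "walk_stop": 4}

def send_instruction(action, cycle_time, theta_tb, r_ct, theta_ct):
    """Pack one instruction (assumed layout: uint32 action + four float64 fields)
    and send it to the locomotion computer over UDP."""
    packet = struct.pack("<Idddd", ACTIONS[action],
                         cycle_time, theta_tb, r_ct, theta_ct)
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(packet, LOCOMOTION_PC)

# Example: start walking with a 16 s cycle and the rotation center far away
# (crab-gait-like walking at crab angle 0).
send_instruction("walk_start", 16.0, 0.05, 1.0e6, 0.0)
```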
Coordinate system of Joystick Eight basic locomotion directions (angle β) can be selected using the joystick.
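A minimal sketch of the eight-direction selection, assuming the joystick X/Y axes are read as values in [-1, 1] and a small dead zone; the actual Cyborg Evo Force axis mapping is not specified on the slide:

```python
import math

DEAD_ZONE = 0.3   # assumed stick deflection below which no direction is selected

def basic_locomotion_angle(stick_x, stick_y):
    """Quantize the stick deflection into one of the eight basic locomotion
    angles beta (0, 45, ..., 315 deg); return None inside the dead zone."""
    if math.hypot(stick_x, stick_y) < DEAD_ZONE:
        return None
    angle = math.degrees(math.atan2(stick_y, stick_x)) % 360.0
    beta = (round(angle / 45.0) % 8) * 45.0
    return beta

print(basic_locomotion_angle(0.7, 0.8))   # -> 45.0
```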
Gait parameter setting: the range of the traverse angle. < Crab gait > R_ct → ∞ and the traverse angle is set to the minimal value. < Circular gait > The circular angle of the body center is derived from the joystick's rot_z axis.
Configuration of omni-directional vision sensor a 26.2 [mm] b 35.4 [mm] c 44.1 [mm] Diameter of mirror 55.0 [mm] Angle of elevation 15.0 [deg] Angle of depression 50.0 [deg]
Performance of Ambient Environmental Image (1) It is assumed that the central axis of the mirror coincides with the optical axis of the camera. The X axis is taken as the direction in which the pan and tilt angles are 0 degrees, and the origin of the projection-center coordinate system is (0, 0). ( 7 ) The spherical coordinates are derived using the pan angle, tilt angle, and focal length, which give the direction of the projection plane with respect to the sphere. ( 8 ) ( 9 ) ( 10 )
Performance of Ambient Environmental Image (2) The coordinates derived from Eq. (7) to Eq. (10) are transformed into rectangular coordinates. ( 11 ) Mirror parameter a is the semi-major axis and mirror parameter b is the semi-minor axis. The semi-latus rectum is derived by Eq. (12). ( 12 ) ( 13 )
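For reference (our check, not shown on the slide), the mirror parameters listed in the sensor table are consistent with the usual hyperboloid relation between the semi-axes and the focal parameter c:

```latex
% a, b: hyperbola semi-axes; c: focal parameter of the hyperboloidal mirror.
c = \sqrt{a^{2} + b^{2}}
  = \sqrt{26.2^{2} + 35.4^{2}}\ \text{mm} \approx 44.0\ \text{mm}
% which matches the listed value c = 44.1 mm. In the standard single-viewpoint
% hyperboloidal-mirror configuration, the camera's optical center sits at one
% focus and the mirror focal point at the other, separated by 2c along the axis.
```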
Performance of Ambient Environmental Image (3) The parameters l and m used in the spherical projection are derived from Eq. (14) and Eq. (15). ( 14 ) ( 15 ) ( 16 ) On the other hand, the projection coordinates on the hyperboloid are calculated by Eq. (16) and Eq. (17). ( 17 )
Performance of Ambient Environmental Image (4) Finally, the pixel coordinate system is determined using the matrix K via Eq. (18), and the projection coordinates are calculated by Eq. (19). The values (358.1, 215.1) in the matrix K are the XY coordinates of the center point of the omni-directional image. ( 18 ) ( 19 )
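Since the equation images (7)-(19) are not reproduced here, the following is a minimal Python sketch of the standard single-viewpoint hyperboloidal-mirror projection (in the style of Yamazawa et al.), which we believe corresponds to this pipeline but is not confirmed by the slides. The function names, the focal length in pixels, and the ray convention are assumptions; the mirror parameters and the image-center values (358.1, 215.1) are taken from the slides.

```python
import numpy as np

# Mirror parameters from the slide (mm) and an assumed camera focal length (pixels).
A, B = 26.2, 35.4
C = np.sqrt(A**2 + B**2)          # ~44.0 mm, matches the slide's c
F_PIX = 360.0                      # focal length in pixels (assumed value)
CX, CY = 358.1, 215.1              # image center from matrix K on the slide

def ray_from_pan_tilt(pan, tilt):
    """Unit viewing ray of a virtual perspective view (pan/tilt in radians,
    tilt measured from the horizontal plane; assumed convention)."""
    x = np.cos(tilt) * np.cos(pan)
    y = np.cos(tilt) * np.sin(pan)
    z = np.sin(tilt)
    return np.array([x, y, z])

def hyperboloid_project(ray):
    """Map a 3D ray (mirror-focus frame, Z along the optical axis) to
    omni-directional image pixel coordinates; assumed reconstruction of
    the standard hyperboloidal projection."""
    X, Y, Z = ray
    rho = np.sqrt(X**2 + Y**2 + Z**2)
    denom = (B**2 + C**2) * Z - 2.0 * B * C * rho
    u = F_PIX * X * (B**2 - C**2) / denom + CX
    v = F_PIX * Y * (B**2 - C**2) / denom + CY
    return u, v

# Example: pixel corresponding to pan = 30 deg, tilt = -10 deg.
u, v = hyperboloid_project(ray_from_pan_tilt(np.radians(30), np.radians(-10)))
print(f"({u:.1f}, {v:.1f})")
```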
Texture mapping Processing time on the system becomes an issue because the calculation method described above performs a similar computation for every image pixel. Consequently, it is difficult to present a smooth image to the operator. Therefore, the computational load is reduced by using texture mapping.
Texture mapping Fig.11-(a) Fig.12 Fig.11-(b) The points in the omni-directional image corresponding to the lattice points are derived from Eq. (1) to Eq. (3). ② ① Texture mapping vertex[0] = P[0] texcoor[0] = T[0] vertex[1] = P[1] texcoor[1] = T[1] vertex[2] = P[2] texcoor[2] = T[2] vertex[3] = P[3] texcoor[3] = T[3] ③
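A minimal sketch of this precomputation, assuming the 16×12 lattice mentioned in the walking-test slide and reusing ray_from_pan_tilt() and hyperboloid_project() from the previous sketch; the image resolution, field of view and tilt range below are assumptions (the tilt range follows the mirror's elevation/depression angles). In the real system the resulting quads would then be drawn with OpenGL-style vertex/texcoor arrays as on the slide, so only the lattice corners, not every pixel, require the expensive projection each frame.

```python
import numpy as np

IMG_W, IMG_H = 720, 480            # assumed omni-image resolution
LAT_W, LAT_H = 16, 12              # lattice size from the walking-test slide

def lattice_texcoords(pan_center, fov=np.radians(60)):
    """Precompute normalized texture coordinates (positions in the omni image)
    for the lattice points of one virtual perspective view.
    Assumes hyperboloid_project() and ray_from_pan_tilt() from the earlier sketch."""
    pans = pan_center + np.linspace(-fov / 2, fov / 2, LAT_W + 1)
    tilts = np.linspace(np.radians(-50), np.radians(15), LAT_H + 1)
    tex = np.zeros((LAT_H + 1, LAT_W + 1, 2))
    for i, tilt in enumerate(tilts):
        for j, pan in enumerate(pans):
            u, v = hyperboloid_project(ray_from_pan_tilt(pan, tilt))
            tex[i, j] = (u / IMG_W, v / IMG_H)   # normalized texture coordinates
    return tex

# Only (16+1) x (12+1) projections per view instead of one per pixel;
# the GPU interpolates the texture inside each quad.
tex = lattice_texcoords(pan_center=0.0)
```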
Mapping image Fig.12 Calculated points on texture coordinate system
Robot animation using 3D geometric models and sensor data The online 3D model of COMET-IV is designed to predict the real-time movement of COMET-IV in the real environment. The sensor data used as the reference for the 3D robot animation are transmitted from the target computer unit on COMET-IV to the teleoperation computer via a 2.4 GHz wireless serial modem at a data rate of 57600 bps, updated at 10 Hz. Locomotion Computer Wireless serial modem (for sensor data) Wireless serial modem (for sensor data) 2.4 GHz
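A minimal sketch of the receiving side, assuming pyserial and a hypothetical fixed-length little-endian frame of 24 joint angles (6 legs × 4 DOF, per the speaker notes), body attitude and GPS position; the actual frame layout, framing bytes and port name are not given on the slides, only the 57600 bps rate and the 10 Hz update:

```python
import struct
import serial  # pyserial

PORT = "/dev/ttyUSB0"        # hypothetical port name
BAUD = 57600                 # rate stated on the slide
# Hypothetical frame: 24 joint angles + roll/pitch + azimuth + GPS x/y, float32 each.
FRAME_FMT = "<29f"
FRAME_SIZE = struct.calcsize(FRAME_FMT)

def read_sensor_frames():
    """Yield decoded sensor frames arriving at roughly 10 Hz."""
    with serial.Serial(PORT, BAUD, timeout=0.2) as link:
        while True:
            raw = link.read(FRAME_SIZE)
            if len(raw) != FRAME_SIZE:
                continue                       # timeout or partial frame
            values = struct.unpack(FRAME_FMT, raw)
            joints = values[:24]
            roll, pitch, azimuth, gps_x, gps_y = values[24:]
            yield joints, (roll, pitch, azimuth), (gps_x, gps_y)
```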
Coordinate system of 3D robot Table 6 and Fig. 13 show the parameters used for the 3D robot animation in the virtual environment and the coordinate system of the 3D robot, respectively.
Initialization of 3D Animation For initialization, the received azimuth angle, the XY values from GPS, and the roll and pitch angles of the body are set as default values. The robot is rotated about the Y axis by the azimuth angle ψ including the magnetic variation Δd.
The homogeneous transformation [1] Each leg (Foot - Shank - Thigh - Shoulder): after initialization, each part is expressed as follows. ( 20 ) ( 21 ) ( 22 ) ( 23 ) [2] Robot body ( 24 ) ( 25 )
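Since Eqs. (20)-(25) are not reproduced here, the following is a minimal sketch of how such a transform chain can be composed for the animation, using the link lengths from the leg-coordinate slide and the measured joint angles and body pose. The joint-axis assignment, rotation order, and the Z-up convention are assumptions (the slide's virtual environment appears to be Y-up).

```python
import numpy as np

L1, L2, L3, L4 = 0.0, 1.13, 0.77, 0.39   # link lengths (m) from the leg table

def rot_x(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[1, 0, 0, 0], [0, c, -s, 0], [0, s, c, 0], [0, 0, 0, 1]])

def rot_y(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, 0, s, 0], [0, 1, 0, 0], [-s, 0, c, 0], [0, 0, 0, 1]])

def rot_z(t):
    c, s = np.cos(t), np.sin(t)
    return np.array([[c, -s, 0, 0], [s, c, 0, 0], [0, 0, 1, 0], [0, 0, 0, 1]])

def trans(x, y, z):
    T = np.eye(4)
    T[:3, 3] = (x, y, z)
    return T

def leg_transforms(theta1, theta2, theta3, theta4):
    """Homogeneous transforms from the shoulder frame to each leg part
    (Shoulder -> Thigh -> Shank -> Foot). Assumed axis convention:
    shoulder yaw about Z, the remaining pitch joints about Y."""
    T_shoulder = rot_z(theta1) @ trans(L1, 0, 0)
    T_thigh = T_shoulder @ rot_y(theta2) @ trans(L2, 0, 0)
    T_shank = T_thigh @ rot_y(theta3) @ trans(L3, 0, 0)
    T_foot = T_shank @ rot_y(theta4) @ trans(L4, 0, 0)
    return T_shoulder, T_thigh, T_shank, T_foot

def body_transform(gps_x, gps_y, azimuth, mag_var, roll, pitch):
    """World pose of the body: translate by the GPS XY position, then yaw by
    the azimuth psi plus magnetic variation delta_d, then apply pitch and roll."""
    return (trans(gps_x, gps_y, 0.0) @ rot_z(azimuth + mag_var)
            @ rot_y(pitch) @ rot_x(roll))
```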
Obstacle avoidance walking test Operator, Goal, Obstacle (1), Obstacle (2). Robot setting: ■ Controller: PID position control ■ Cycle time: 16 s ■ Omni-directional gait (crab gait) ■ Tripod walking. System setting: ■ The number of lattices was set to 16×12 ■ Virtual obstacle objects displayed on the screen.
Experimental Results  Fig.15 Walking trajectory acquired by GPS  Fig.16 Foot trajectory of walking test (Leg1)
Experimental Photos and Images(1)  Fig.17 Photos of the obstacle avoidance walking test Fig.18 Images of 3D animation
Experimental Photos and Images(2)  (a)’ (a) (b)’ (b) (c)’ (c) Fig.19 Omni-directional images and generated images
Conclusion Application of the omni-directional gait. Implementation of the teleoperation assistant system. - The outdoor obstacle avoidance walking experiment indicates the effectiveness of the proposed system. - The teleoperation assistant system, which applies the omni-directional vision sensor and 3D robot animation, was successfully implemented on the COMET-IV system. However, a problem remains when the network enters a high-load state during video data transfer, which affects the communication between the teleoperation computer and the locomotion computer.
Future Work The network for video data transfer and the network for command communication will be separated. We will improve the accuracy of self-localization and apply force control to COMET-IV in order to realize steady walking in rough terrain.


Editor's Notes

  • #5 Figure 1 is COMET-III. Figure 2 is COMET-IV. Our research group has developed a series of mine detection and clearance robots, namely COMET-I, II and III. COMET-IV was developed as a robot capable of supporting various kinds of work. We proposed a new walking robot platform and aim to test its validity.
  • #11 These are the development tasks in our project. We need to develop three systems: autonomous navigation, tele-operation, and control. And we set four design specifications for locomotion. Today's presentation is about the tele-operation system.
  • #12 I will give the COMET-IV specifications. COMET-IV's length is 2.5 m, the width is 3.3 m, and the height is 2.8 m. It weighs 2200 kg. The robot has two gasoline engines which drive two hydraulic pumps. The generator supply power is 1200 watts. The supply pressure and flow are as follows.
  • #14 Next, I will explain the leg coordinate system. Link 1 is the shoulder; its length is 0 m and its range of motion is from -180° to 180°. Link 2 is the thigh, Link 3 is the shank, and Link 4 is the foot.
  • #15 The right side is the COMET-IV system and the left side is the remote control PC system. The video data streaming PC and the omni-directional camera are connected via IEEE 1394. We use DVTS to transmit and receive the digital video stream; DVTS stands for Digital Video Transfer System. Panorama and perspective images are generated from the received omni-directional image, and those images are displayed on the screen. The purple box shows the operator instruction flow: we read the joystick signal and set gait parameters, which include stride length, gait cycle time, angle, and so on. Those gait parameters are transmitted to the robot as instruction information via the wireless network. Next, the gray box shows the COMET-IV system. Each leg has 4 degrees of freedom: the shoulder is actuated by a hydraulic motor, and the thigh, shank and foot are actuated by hydraulic cylinders. Each sensor's data is obtained from the A/D board. The control voltage is calculated in the control computer and transmitted to the D/A board. Each leg is actuated in this manner and the robot walks.
  • #21 I will explain the peripheral devices we use. This is the omni-directional camera, which is composed of the video camera and the hyperbolic mirror. The diameter of the mirror is 82 mm, the angle of elevation is 15°, and the angle of depression is 50°. The joystick is connected to the remote control PC via a USB interface.