Laser Range Finder (LRF) Assisted Force-based Walking for Hexapod Robot COMET-IV
by
Mohd Razali Daud and Kenzo Nonami
Department of Mechanical Engineering, Division of Artificial System Science,
Graduate School of Engineering, Chiba University
1-33 Yayoi-cho, Inage-ku, Chiba 263-8522, Japan
E-mail: mrazali@graduate.chiba-u.jp, nonami@faculty.chiba-u.jp
1. Introduction (1)
Surgery robot [1], Walking Forest Machine [2]
Efficient assistance and convenience for human life
1. Introduction (2)
Efficient assistance and convenience for human life
Cleaning robot, iRobot [3]
Service robot, PatrolBot [4]
1. Introduction (3)
Ground robots: Wheeled type robot [5], Crawler type robot [5], Legged type robot [4]
Legged robots: more statically stable
1. Introduction (5)
If no vision sensor!!!
1. Introduction (6)
With vision sensor
2. COMET-IV   Configuration (1)
Omni-directional camera
Stereo-vision camera
2. COMET-IV   Configuration (2)
Overall Height [m]: 2.80
Body Height [m]: 0.8
Width [m]: 3.50
Length [m] (omni-directional gait): 3.0
Max. Walking Speed [m/h]: 1000
Weight [kg]: 2120
Gasoline Engine [cc]: 750 (2 units)
2. COMET-IV   Configuration (3)
[Block diagram: Autonomous Navigation System and Control System.
Navigation Computer (Path Planner, Gait Planner) with inputs from the Laser Range Finder, Stereo-vision Camera, and Omni-directional Camera; Target Computer; Host Computer (program transmit).
Leg Controller (×6 legs: Shoulder, Thigh, Shank, Foot) with A/D board, D/A board, Valve Controller, Proportional Solenoid Valves, Pressure Sensors, Potentiometers, and Hydraulic Motor; Attitude Sensor mounted on the body.]
2. COMET-IV   Configuration (4) – Why 3D
[Figure: mapping and motion plan from Start to Goal using the 3-D LRF.]
2. COMET-IV   Configuration (5) – Why 3D
[Figure: with the LRF in a normal 2-D mount on the moving robot, the reflected laser beam spots fall only on the wall ahead.]
3. Walking Trajectory and Patterns
1. Movement based on center of body and shoulder rotational angle.
2. Using tripod gait pattern.

$\theta_{nc} = \theta_{oc} - 0.5\,\theta_{ct}$
$\theta_{cm} = \{\theta_{c2},\ \theta_{c5}\}$
$\theta_{sm} = \{\theta_{c1},\ \theta_{c3},\ \theta_{c4},\ \theta_{c6}\}$

$\theta_{oc}$ : center of body (CoB) angle
$\theta_{ct}$ : initial SCS-based angle of each leg
$\theta_{nc}$ : SCS control input trajectory for each leg
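As a rough illustration of how the relation above can be applied per leg, the following Python sketch uses the reconstructed relation θ_nc = θ_oc − 0.5 θ_ct and the tripod grouping from this slide. The per-leg application and all names (scs_control_angle, tripod_control_angles, theta_ct_init) are illustrative assumptions, not the authors' code.

```python
# Tripod grouping assumed from the slide: legs 2 and 5 form one set,
# legs 1, 3, 4 and 6 the other.
CM_LEGS = (2, 5)
SM_LEGS = (1, 3, 4, 6)

def scs_control_angle(theta_oc: float, theta_ct: float) -> float:
    """Reconstructed relation: theta_nc = theta_oc - 0.5 * theta_ct."""
    return theta_oc - 0.5 * theta_ct

def tripod_control_angles(theta_oc: float, theta_ct_init: dict) -> tuple:
    """Return the SCS control angles for the two tripod sets.

    theta_ct_init maps leg index -> initial SCS-based angle of that leg.
    """
    theta_cm = {leg: scs_control_angle(theta_oc, theta_ct_init[leg]) for leg in CM_LEGS}
    theta_sm = {leg: scs_control_angle(theta_oc, theta_ct_init[leg]) for leg in SM_LEGS}
    return theta_cm, theta_sm
```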
4. Force threshold-based Trajectory
$F_z^n = F_{foot}(z_0) = I_{1,3}\,F_{foot}$
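The equation above is only partially recoverable from the slide, so the sketch below only illustrates the general idea of a force threshold-based trajectory under stated assumptions: the leg follows its planned z set-points and the trajectory is frozen once the measured vertical foot force exceeds a threshold. All names (lower_leg_until_contact, read_foot_force, f_threshold) are hypothetical.

```python
def lower_leg_until_contact(planned_z, read_foot_force, f_threshold: float):
    """Illustrative force-threshold landing check (assumed behaviour, not the
    authors' controller).

    planned_z       -- iterable of z set-points for the landing phase
    read_foot_force -- callable returning the current vertical foot force F_z
    f_threshold     -- contact threshold F_0
    Returns the z set-point at which contact was detected (or the last one).
    """
    z_cmd = None
    for z_cmd in planned_z:
        # send_z_setpoint(z_cmd) would be issued here on the real robot
        if read_foot_force() >= f_threshold:
            break  # foot contact detected; freeze the trajectory here
    return z_cmd
```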
5. 3D Position of Obstacle Based on LRF
$X_i = R_c \cos\varphi \sin\theta$
$Y_i = R_c \cos\varphi \cos\theta$
$Z_i = H + R_c \sin\varphi$
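A minimal Python sketch of the conversion above, assuming R_c is the measured range, θ the horizontal scan angle, φ the vertical tilt angle, and H the LRF mounting height. The function name and argument conventions are illustrative.

```python
import math

def lrf_point_to_xyz(r_c: float, theta: float, phi: float, h: float) -> tuple:
    """Convert one LRF range reading to a 3-D point (sensor-centred frame).

    r_c   -- measured range R_c [m]
    theta -- horizontal scan angle [rad]
    phi   -- vertical (tilt) angle [rad]
    h     -- mounting height H of the LRF above the ground [m]
    Follows the relations on this slide:
        X = R_c cos(phi) sin(theta), Y = R_c cos(phi) cos(theta), Z = H + R_c sin(phi)
    """
    x = r_c * math.cos(phi) * math.sin(theta)
    y = r_c * math.cos(phi) * math.cos(theta)
    z = h + r_c * math.sin(phi)
    return x, y, z
```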
6. Grid Occupancy Registration (1)
$\begin{bmatrix} P_{GD} \\ 1 \end{bmatrix} = \begin{bmatrix} R_C(\psi_{GC})\,R_C(\theta_{GC})\,R_C(\varphi_{GC}) & T_{GC} \\ 0 & 1 \end{bmatrix} \begin{bmatrix} P_{CD} \\ 1 \end{bmatrix}$
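The notation in the transform above is only partially recoverable, so the sketch below assumes a standard homogeneous transform: a sensor-frame point P_CD is rotated by yaw ψ, pitch θ, and roll φ and translated by T_GC to obtain the grid-frame point P_GD. The rotation order and the NumPy-based helper names are assumptions.

```python
import numpy as np

def rot_z(psi):    # yaw
    c, s = np.cos(psi), np.sin(psi)
    return np.array([[c, -s, 0], [s, c, 0], [0, 0, 1]])

def rot_y(theta):  # pitch
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, 0, s], [0, 1, 0], [-s, 0, c]])

def rot_x(phi):    # roll
    c, s = np.cos(phi), np.sin(phi)
    return np.array([[1, 0, 0], [0, c, -s], [0, s, c]])

def register_point(p_cd, psi, theta, phi, t_gc):
    """Map a sensor-frame point P_CD into the grid frame P_GD with the
    homogeneous transform [R(psi)R(theta)R(phi) | T_GC; 0 0 0 1]
    (assumed rotation order)."""
    h = np.eye(4)
    h[:3, :3] = rot_z(psi) @ rot_y(theta) @ rot_x(phi)
    h[:3, 3] = t_gc
    p = h @ np.append(np.asarray(p_cd, dtype=float), 1.0)
    return p[:3]
```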
6. Grid Occupancy Registration (2)
Obstacles with height less than 0.75 m
Obstacles with height more than 0.75 m
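As an illustration of how occupied cells could be split by the 0.75 m height criterion above, here is a Python sketch that registers 3-D points into a 2-D grid, keeps the maximum height per cell, and classifies each occupied cell as a low or high obstacle. Only the 0.75 m threshold comes from the slide; the grid extent, cell size, and function name are illustrative assumptions.

```python
import numpy as np

HEIGHT_THRESHOLD = 0.75  # [m], from the slide
CELL_SIZE = 0.1          # [m] per grid cell -- illustrative value

def classify_grid(points_xyz, x_max=20.0, y_max=20.0):
    """Return boolean masks (low, high) over the grid cells.

    points_xyz -- iterable of (x, y, z) obstacle points in the grid frame,
                  assumed to lie in [0, x_max) x [0, y_max).
    """
    nx, ny = int(x_max / CELL_SIZE), int(y_max / CELL_SIZE)
    height = np.full((nx, ny), np.nan)          # max height per cell
    for x, y, z in points_xyz:
        i, j = int(x / CELL_SIZE), int(y / CELL_SIZE)
        if 0 <= i < nx and 0 <= j < ny:
            if np.isnan(height[i, j]) or z > height[i, j]:
                height[i, j] = z
    occupied = ~np.isnan(height)
    h_filled = np.where(occupied, height, -np.inf)  # avoid NaN comparisons
    high = occupied & (h_filled >= HEIGHT_THRESHOLD)
    low = occupied & ~high
    return low, high
```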
7. Obstacle’s Geometric Requirement (1)
7. Obstacle’s Geometric Requirement (2)
Edges of obstacles
8. Dynamic Stable Range for Force Control (1)
$S_b = \begin{cases} L_e(t); & S_{b\min} \\ L_e(t) + l_x; & S_{b\max} \end{cases}$

$L_e(s) = \sum_{k=(x_{wb},\,y_{wb})}^{(x_{wa},\,y_{wa})} G_{sk}\, l$

Without vision / With vision (comparison)
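Because the stable-range definition above is only partially recoverable, the following sketch just illustrates one plausible reading: L_e is the cumulative length of occupied grid cells (each of length l) between (x_wb, y_wb) and (x_wa, y_wa) along the walking direction, and the force-control stable range switches between a fixed minimum without vision and a vision-derived value L_e + l_x clamped to [S_b,min, S_b,max]. The switching rule and all names are assumptions, not the authors' exact formula.

```python
def occupied_length(grid_occupancy, cells_ahead, cell_length):
    """L_e: cumulative length of occupied cells along the walking direction
    (reconstructed as L_e = sum_k G_sk * l over the cells between
    (x_wb, y_wb) and (x_wa, y_wa); the indexing scheme is an assumption)."""
    return sum(cell_length for (i, j) in cells_ahead if grid_occupancy[i][j])

def dynamic_stable_range(l_e, s_min, s_max, l_x, vision_available=True):
    """Switch the force-control stable range S_b between a fixed minimum
    (no vision) and a vision-derived value L_e + l_x, clamped to [s_min, s_max].
    Illustrative switching rule only."""
    if not vision_available:
        return s_min
    return min(max(l_e + l_x, s_min), s_max)
```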
8. Dynamic Stable Range for Force Control (2)
8. Dynamic Stable Range for Force Control (3)
9. Experiments and Results (1)
9. Experiments and Results (2)
Matched with vision input
9. Experiments and Results (3)
Z-axis position of Leg 1
9. Experiments and Results (4)
Vision-based PPF gives a more stable result
9. Experiments and Results (5)
Vision-based PPF gives a more stable result
9. Experiments and Results (6)
Vision-based PPF gives a better result
10. Conclusion and future work
1. The proposed GWALR with the customized Edge Detection method has successfully determined the geometry of the obstacles in the environment, and they are correctly represented by the grid cell map.
2. The control algorithm effectively and dynamically switches the stable range according to the performance of the Pull-back action, and has successfully kept the robot's body horizontal at all times, even when walking on uneven ground. However, although the leg opening size is fixed at 0.8 m per half cycle, when the robot walks long distances the opening sometimes changes randomly, which may cause the leg to step in a different place from the one instructed by the controller unit.
3. In future research, using the vision output to control the leg opening size should be considered, while still keeping the robot's body horizontal at all times.
4. To ensure the robot can operate on all surface conditions, the LRF needs to be integrated with the other sensors.