Presented at the Int. Conf. on Advanced Mechatronics 2015.
It presents a series of methods to detect scissors with a vision system and to generate robotic caging grasp motions with Choreonoid, a motion planner.
ICAM 2015 presentation poster - Object Recognition and Planning of Ring-type Caging for Scissors
1. Object Recognition and Planning of Ring-type Caging for Scissors
S. Makita*1, S. Tsuji*2, T. Tsuji*3, K. Harada*4
*1 NIT Sasebo College, *2 Omron Corp., *3 Kyushu Univ., *4 AIST
Keywords
Caging, Grasping, Manipulation, Object recognition, Motion planning, Object with holes
Abstract
● Scissors, an object with holes, can be caged by a two-fingered hand
● The features necessary for caging scissors are:
  ● Position of the holes (the handle)
  ● Orientation of the scissors (especially the blades)
● Object recognition is based on OpenCV and SURF
● Choreonoid*[1], a motion planner, and graspPlugin*[2], a grasp planner, are used to generate caging motions
Motivation
● Caging, a geometrical constraint imposed by robots, is a substitute for grasping, a force-control-based approach
● Ring-type caging can be achieved easily from simple visual features (loop shapes)
● A vision (RGB) system has an advantage over RGB-D systems in terms of size restrictions
Our proposal
● Obtain visual features for ring-type caging with webcams
  ● Ellipse approximation of the scissors' handle
  ● Majority decision on the blades' orientation using a Hough transform of line segments
● Motion planning with Choreonoid and graspPlugin, modified for caging with its sufficient conditions
Sufficient condition for ring-type caging
● Capture a loop shape of the object with the robot fingers so that the fingers and the loop form a Hopf link
RGB vs. RGB-D
● RGB-D images carry richer object information than RGB images alone.
● RGB-D sensors are still larger than RGB cameras, which is a disadvantage when mounting them on robot hands.
Caging planning by Choreonoid*[1], a motion planner
● A motion planner using PRM (Probabilistic Roadmap Method)
graspPlugin*[2]
● A grasp planner for Choreonoid, covering grasping, trajectory, and task planning
Our modification
References: *[1] Choreonoid, http://choreonoid.org/en/; *[2] graspPlugin, http://choreonoid.org/GraspPlugin/i/?q=en
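A minimal 2-D sketch of the PRM idea mentioned above (this is an illustrative toy, not Choreonoid's implementation; sample counts, the unit-square configuration space, and the obstacle test are assumptions):

```python
import math
import random

def prm_plan(start, goal, collides, n_samples=500, k=12, seed=0):
    """Toy PRM: sample collision-free configurations in the unit square,
    connect each node to its k nearest neighbours with collision-free
    straight edges, then search the roadmap from start to goal."""
    rng = random.Random(seed)
    nodes = [start, goal]                       # indices 0 (start), 1 (goal)
    while len(nodes) < n_samples:
        q = (rng.uniform(0, 1), rng.uniform(0, 1))
        if not collides(q):
            nodes.append(q)

    def free_edge(a, b, steps=20):
        # Check intermediate points along the straight segment a-b
        return all(not collides((a[0] + (b[0] - a[0]) * t / steps,
                                 a[1] + (b[1] - a[1]) * t / steps))
                   for t in range(steps + 1))

    # Build the k-nearest-neighbour roadmap
    edges = {i: [] for i in range(len(nodes))}
    for i, a in enumerate(nodes):
        near = sorted(range(len(nodes)),
                      key=lambda j: math.dist(a, nodes[j]))[1:k + 1]
        for j in near:
            if free_edge(a, nodes[j]):
                edges[i].append(j)
                edges[j].append(i)

    # Breadth-first search over the roadmap
    frontier, parent = [0], {0: None}
    while frontier:
        i = frontier.pop(0)
        if i == 1:                              # reached the goal node
            path = []
            while i is not None:
                path.append(nodes[i])
                i = parent[i]
            return path[::-1]
        for j in edges[i]:
            if j not in parent:
                parent[j] = i
                frontier.append(j)
    return None                                 # roadmap not connected
```

The planner must route over a wall-like obstacle, e.g. `collides = lambda q: 0.45 < q[0] < 0.55 and q[1] < 0.8`.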
2. Object recognition
Position of handle of scissors
1) Detect contours from the luminance gradient, and apply a closing operation to reconnect separated contours
2) Recognize loop contours in the binary image; the loops form a nested structure
3) Approximate each loop surrounded by the outermost loop with an ellipse, using the least-squares method
4) Estimate the major and minor axes of each ellipse and compare the axis lengths over all pairs of ellipses
5) Take as the handle the pair of ellipses whose summed axis-length differences is the smallest
Orientation of scissors
1) Define a direction vector between the centers of the two ellipses approximating the hollows of the handle
2) Detect the blades of the scissors as several line segments recognized by a Hough transform applied to the contours
3) Define a direction vector from the handle to each line segment, and examine their relative position with the cross product
4) Estimate the orientation of the scissors by majority decision
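The cross-product majority vote in steps 3)-4) can be sketched as follows (a minimal sketch; in practice the segment midpoints would come from the Hough transform, and the function names are illustrative):

```python
import numpy as np

def estimate_orientation(c1, c2, segment_midpoints):
    """Vote on which side of the handle axis the blade segments lie,
    and return a unit vector pointing from the handle toward the blades."""
    c1, c2 = np.asarray(c1, float), np.asarray(c2, float)
    handle_dir = c2 - c1                  # vector between the hollow centers
    mid = (c1 + c2) / 2.0
    votes = 0
    for p in segment_midpoints:
        to_seg = np.asarray(p, float) - mid   # handle -> blade segment
        # z-component of the 2-D cross product tells which side p is on
        cross = handle_dir[0] * to_seg[1] - handle_dir[1] * to_seg[0]
        votes += 1 if cross > 0 else -1
    # The blades are assumed perpendicular to the hollow-to-hollow axis,
    # on the majority side.
    normal = np.array([-handle_dir[1], handle_dir[0]])
    normal /= np.linalg.norm(normal)
    return normal if votes > 0 else -normal
```

With hollow centers at (0, 0) and (2, 0) and most segment midpoints above the axis, the vote points "up" even if one outlier segment lies below.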
Distance estimation by stereo vision
A widely used method based on SURF features, epipolar geometry, and stereo rectification
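Once SURF keypoints are matched and the image pair rectified, depth follows from the standard pinhole-stereo relation Z = fB/d; a minimal sketch (focal length, baseline, and pixel coordinates below are illustrative values):

```python
def depth_from_disparity(f_px, baseline_m, x_left, x_right):
    """Pinhole-stereo depth after rectification: Z = f * B / d,
    where d = x_left - x_right is the disparity of a matched feature."""
    d = x_left - x_right
    if d <= 0:
        raise ValueError("non-positive disparity: match is invalid")
    return f_px * baseline_m / d
```

For example, a 700 px focal length, a 0.1 m baseline, and a 7 px disparity give a depth of 1.0 m.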
Motion planning by Choreonoid
1) Move the robot hand directly above the midpoint of the handle and rotate it so that the gripper is vertical
2) Align the line segment between the two gripper fingers parallel to the line segment between the handle's loops
3) Move the gripper toward the object and close it
Note: the robot fingers and the table cooperatively form a "loop" that cages the scissors
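The goal posture behind steps 1)-2) reduces to a simple geometric computation; a minimal sketch (the pose representation and the `approach_height` parameter are illustrative assumptions, not the graspPlugin API):

```python
import math

def gripper_goal(c1, c2, table_z, approach_height=0.10):
    """Place the gripper above the midpoint of the two handle hollows,
    pointing straight down, with the finger line aligned to the
    hollow-to-hollow axis. Poses are (x, y, z, yaw) tuples."""
    mx = (c1[0] + c2[0]) / 2.0
    my = (c1[1] + c2[1]) / 2.0
    # Yaw that makes the finger-to-finger segment parallel to the
    # segment between the handle's loops.
    yaw = math.atan2(c2[1] - c1[1], c2[0] - c1[0])
    pre_grasp = (mx, my, table_z + approach_height, yaw)  # hover pose
    grasp = (mx, my, table_z, yaw)                        # descended pose
    return pre_grasp, grasp
```

The planner would then connect the current hand pose to `pre_grasp`, descend to `grasp`, and close the gripper.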
Figure: success rates [%] of handle detection and orientation estimation for Scissors A-D.
Acknowledgment: This work was supported by JSPS KAKENHI Grant #23760248
Average planning time
● Goal posture generation: 10 ms
● Path planning: 10 ms
Experimental results
● Scissors with decorated textures have a lower success rate
● Orientation estimation is highly accurate when handle detection succeeds
● Some obstacles are removed by a labelling process
● Note: the scissors lie on a flat white table
A failure case