
RSJ 2017 - Grasp Adaptation Control with Finger Vision


  1. Grasp Adaptation Control with Finger Vision: Verification with Deformable and Fragile Objects. Akihiko Yamaguchi (*1, *2), Chris G. Atkeson (*1). *1 Robotics Institute, Carnegie Mellon University; *2 Graduate School of Information Science, Tohoku University
  2. Tactile Sensing for AI-based Manipulation. For the next generation of robot manipulation (deformable objects, fragile objects, cooking, …), an AI-based approach is necessary: robot learning, reinforcement learning, machine learning, deep learning, deep RL, planning, optimization. Tactile sensing improves AI-based manipulation, but we do not yet know a good strategy for using tactile sensing in learning manipulation.
  3. Is Tactile Sensing Really Necessary? E.g. learning grasping with deep learning. (L) Learning to grasp from 50K tries, Pinto et al. 2016, https://youtu.be/oSqHc0nLkm8. (R) Learning hand-eye coordination for robotic grasping, Levine et al. 2017, https://youtu.be/l8zKZLqkfII. No tactile sensing was used.
  4. Tactile Sensing is Useful in Many Scenarios. What if an uncertain external force is applied? What if grasping containers that look identical but have different weights? Imagine wearing heavy gloves, or having frozen hands.
  5. Using Tactile Sensing in Learning Manipulation. Goal: make manipulation robust and learning faster. In this study: using FingerVision for tactile sensing (+α), and learning a general policy for grasping deformable and fragile objects.
  6. FingerVision: Vision-based Tactile Sensing. Multimodal tactile sensing: force distribution; proximity vision (slip / deformation; object pose, texture, shape). Low-cost and easy to manufacture; physically robust. Cameras are becoming smaller and cheaper thanks to smartphones and IoT, e.g. NanEye (1 mm x 1 mm x 1 mm): http://www.bapimgsys.com/area-camera/ac62kusb-color-area-camera-based-on-naneye-sensor.html
  7. How to Use FingerVision in Learning Grasping? External or head vision is also necessary. Representation of the policy to be learned: no tactile: (Vision) → (Command); with tactile: (Vision, Tactile) → (Command).
  8. How to Use FingerVision in Learning Grasping? Grasping (picking up) structure: deciding a grasp pose (vision); reaching the gripper to the grasp pose (vision); grasping the object (tactile [force], (vision)); lifting up the object (tactile [slip, force], (vision)); evaluating the grasp (vision, tactile).
  9. How to Use FingerVision in Learning Grasping? Grasping (picking up) structure: Grasp Pose Estimator (vision-based): deciding a grasp pose (vision), and reaching the gripper to the grasp pose (vision). Grasp Adaptation Controller (tactile-based): grasping the object (tactile [force], (vision)), and lifting up the object (tactile [slip, force], (vision)). Finally: evaluating the grasp (vision, tactile).
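The two-module decomposition above can be sketched as a simple orchestration function. This is a minimal illustration, not the authors' implementation; all five stage callbacks (`estimate_pose`, `reach`, `grasp`, `lift`, `evaluate`) are hypothetical names for the robot's actual primitives.

```python
def pick_up(estimate_pose, reach, grasp, lift, evaluate):
    """Run the vision-then-tactile picking pipeline from the slide.

    estimate_pose and reach form the vision-based Grasp Pose Estimator;
    grasp and lift form the tactile-based Grasp Adaptation Controller.
    """
    pose = estimate_pose()   # deciding a grasp pose (vision)
    reach(pose)              # reaching the gripper to the grasp pose (vision)
    grasp()                  # grasping the object (tactile[force])
    lift()                   # lifting up the object (tactile[slip, force])
    return evaluate()        # evaluating the grasp (vision, tactile)
```

The point of the decomposition is that only the last three stages consume FingerVision's tactile signals, so the tactile controller can be designed (or learned) separately from the vision-based pose estimator.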
  10. Grasp Adaptation Controller: grasping and lifting up. Idea: control to avoid slip. Simple state machine: [1] Grasp test: move the object upward slightly; if slip is detected, move the object back to the initial height and close the gripper slightly; if no slip, go to [2]. [2] Lift-up: move the object upward to a target height; if slip is detected, go back to [1]. A slip-avoidance feedback controller is always active.
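The two-state machine described on this slide can be sketched as follows. This is a hedged reconstruction from the slide text, not the authors' code: the `robot` interface (`move_up`, `move_down`, `close_gripper`, `slip_detected`) and the step sizes are hypothetical, and the always-on slip-avoidance feedback controller is assumed to run separately.

```python
def grasp_adaptation(robot, target_height, test_height=0.02,
                     close_step=0.002, max_steps=100):
    """Two-state controller from the slide: [1] grasp test, [2] lift-up.

    `robot` is a hypothetical interface; heights are in metres and the
    default step sizes are illustrative. Returns True on a stable lift,
    False if no stable grasp is found within max_steps iterations.
    """
    state = "GRASP_TEST"
    for _ in range(max_steps):
        if state == "GRASP_TEST":
            robot.move_up(test_height)           # move object upward slightly
            if robot.slip_detected():
                robot.move_down(test_height)     # back to the initial height
                robot.close_gripper(close_step)  # close the gripper slightly
            else:
                state = "LIFT_UP"                # no slip: go to [2]
        else:  # LIFT_UP
            robot.move_up(target_height)         # move to the target height
            if robot.slip_detected():
                state = "GRASP_TEST"             # slip: go back to [1]
            else:
                return True                      # lifted without slip
    return False  # gave up: grip never stabilized
```

The `max_steps` bound is an added safeguard the slide does not mention; without it, an object that keeps slipping would loop forever.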
  11. Experiments
  12. Drop during lifting
  13. Computer vision failure (1)
  14. Computer vision failure (2)
  15. Future Work. Learning the Grasp Pose Estimator: how to parameterize the Grasp Adaptation Controller? Improving the Grasp Adaptation Controller: failure detection and recovery. Improving FingerVision: computer vision (object detection, movement classification) and hardware (sensitivity to normal force). Learning how to break objects (and how to avoid it). A new gripper: agility is necessary.
  16. Academic Crowdfunding for FingerVision. Demo at: Building 1, Room 1104 (4-2). http://akihikoy.net/p/fv.html
