
Run OSGi on your robot and teach it new tricks - T Verbelen

OSGi Community Event 2017 Presentation by Tim Verbelen [imec]

Recent and upcoming OSGi specifications such as Promises and PushStreams provide great tools for asynchronous programming in Java. This is particularly useful for programming robots, where issuing a command is inherently asynchronous from observing its effect. In this talk we will present an asynchronous OSGi service for controlling a Kuka robot. Moreover, OSGi modularity allows us to easily integrate our robot with other systems. For example, in our research we connect several sensors to the robot and use deep learning techniques to let the robot learn new behaviors from this sensor information.



  1. 1. PUBLIC CONTROL YOUR ROBOT WITH OSGI AND TEACH IT NEW TRICKS TIM VERBELEN
  2. 2. ROBOTS ARE TAKING OVER THE WORLD...
  3. 3. ...ARE THEY?
  4. 4. ROBOT PROGRAMMING COMPLEXITY (chart: environment complexity, from controlled to uncontrolled, vs. behavior complexity, from simple to hard)
  5. 5. SELF-LEARNING ROBOTS? (same chart as before, annotated with a "WE ARE HERE" marker)
  6. 6. CONTROL YOUR ROBOT
  7. 7. OUR ROBOT: KUKA YOUBOT
      • two-finger gripper
      • 5 DOF arm
      • LIDAR sensor
      • omnidirectional base platform
      • NVIDIA Jetson TX1 embedded GPU (256 CUDA cores, quad-core ARM CPU, 4 GB RAM)
  8. 8. ROS: ROBOT OPERATING SYSTEM
      The Robot Operating System (ROS) is a set of software libraries and tools that help you build robot applications.
      • Generic message passing framework
      • Robot geometry and description
      • Advanced robotics features (e.g. forward kinematics, inverse kinematics, odometry, ...)
      • Bindings for C, C++, Python and Java
      • Interfaces to many robot simulators
      • Supports many robots and sensors
  9. 9. FROM ROS TO OSGI
      (diagram: a ROS core with additional ROS nodes connects via rosjava to a Youbot OSGi bundle exposing Arm and OmniDirectional services; OSGi ROS launch bundles start either the Youbot ROS controller or a simulator ROS controller)
      Pub-sub to topics:
      /youbot/arm/arm_controller/position_command
      /youbot/cmd_vel
      /youbot/joint_states
  10. 10. ROBOT AS A (OSGI) SERVICE

      public interface Arm {
          // set position for an arm joint
          void setPosition(int joint, float position);
          // set positions for all joints
          void setPositions(float... positions);
          // move arm tip to a point in cartesian space
          void moveTo(float x, float y, float z);
          ...
      }

      WHEN TO RETURN?
  11. 11. ROBOT AS A (OSGI) SERVICE

      public interface Arm {
          // set position for an arm joint
          Promise<Arm> setPosition(int joint, float position);
          // set positions for all joints
          Promise<Arm> setPositions(float... positions);
          // move arm tip to a point in cartesian space
          Promise<Arm> moveTo(float x, float y, float z);
          ...
      }

      A PROMISING API
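      A minimal sketch (not from the deck) of how such a Promise-returning method could be implemented with the OSGi Promises API: a Deferred is created per command and resolved from the feedback callback that signals the arm reached its target. YoubotArm, sendJointCommand and onTargetReached are hypothetical names used only for illustration.

      import org.osgi.util.promise.Deferred;
      import org.osgi.util.promise.Promise;

      public class YoubotArm implements Arm {

          @Override
          public Promise<Arm> setPosition(int joint, float position) {
              Deferred<Arm> deferred = new Deferred<>();
              // publish the position command to the underlying controller (hypothetical helper)
              sendJointCommand(joint, position);
              // resolve only once joint feedback reports the target position was reached
              onTargetReached(joint, position, () -> deferred.resolve(this));
              return deferred.getPromise();
          }

          // remaining Arm methods and the hypothetical ROS plumbing are omitted
          private void sendJointCommand(int joint, float position) { /* ... */ }
          private void onTargetReached(int joint, float position, Runnable callback) { /* ... */ }
      }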
  12. 12. ROBOT AS A (OSGI) SERVICE

      @Component
      public class MyController {
          @Reference
          private Arm arm;

          public void doSomething() {
              arm.openGripper()
                 .then(p -> p.getValue().moveTo(0.3f, 0.0f, 0.25f))
                 .then(p -> p.getValue().setPosition(4, 1.57f))
                 .then(p -> p.getValue().moveTo(0.3f, 0.3f, 0.09f))
                 .then(p -> p.getValue().closeGripper());
              // immediately returns a Promise
          }
      }

      PROMISES IN ACTION
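      A possible follow-up (not shown in the talk): because every step returns a Promise, a single failure callback at the end of the chain catches an error raised in any step, e.g. an unreachable pose. The logging below is purely illustrative.

      arm.openGripper()
         .then(p -> p.getValue().moveTo(0.3f, 0.0f, 0.25f))
         .then(p -> p.getValue().closeGripper())
         .then(
             p -> { System.out.println("sequence done"); return null; },                      // success callback
             failed -> System.err.println("arm sequence failed: " + failed.getFailure()));    // failure callback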
  13. 13. ROBOT AS A (OSGI) SERVICE

      public interface OmniDirectional {
          // This promise resolves "immediately" once the robot starts moving
          Promise<OmniDirectional> move(float vx, float vy, float va);
          // Convenience method to wait until something happens
          Promise<OmniDirectional> until(Promise<?> condition);
          ...
      }

      WHAT ABOUT OMNIDIRECTIONAL?
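      One way until() could be realized (an assumption, not the speaker's code): inside the OmniDirectional implementation, create an OSGi Deferred and resolve it with the service itself once the condition promise resolves, so the caller can chain the next action such as stop().

      // sketch of a method inside a hypothetical OmniDirectional implementation class
      // (uses org.osgi.util.promise.Deferred and org.osgi.util.promise.Promise)
      public Promise<OmniDirectional> until(Promise<?> condition) {
          Deferred<OmniDirectional> deferred = new Deferred<>();
          // when the condition promise resolves (e.g. the LIDAR detects an object),
          // resolve with this service so the caller can chain the next action
          condition.onResolve(() -> deferred.resolve(this));
          return deferred.getPromise();
      }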
  14. 14. ROBOT AS A (OSGI) SERVICE

      @Component
      public class MyBaseController {
          @Reference
          private OmniDirectional base;

          public void doSomething() {
              base.move(0.0f, 0.0f, 1.0f)
                  .until(lidarDetectsObject())
                  .then(p -> p.getValue().stop());
              // turn around until the lidar detects something
          }

          private Promise<Object> lidarDetectsObject() { ... }
      }

      PROMISES IN ACTION (2)
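      lidarDetectsObject() is left abstract on the slide; below is a hedged sketch of how it might look if LIDAR scans arrive on an OSGi PushStream, as the talk abstract suggests. LidarScan, lidarScans() and nearestDistance() are hypothetical names, and 0.5 m is an arbitrary threshold.

      private Promise<Object> lidarDetectsObject() {
          return lidarScans()                                    // hypothetical PushStream<LidarScan>
                  .filter(scan -> scan.nearestDistance() < 0.5f) // an object closer than 0.5 m
                  .findFirst()                                   // Promise<Optional<LidarScan>>
                  .map(optional -> (Object) optional.get());     // resolve once something is detected
      }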
  15. 15. TEACH IT NEW TRICKS
  16. 16. THE TRICK: FETCH OBJECTS 16
  17. 17. REINFORCEMENT LEARNING (diagram: the Environment provides an Observation to the Agent)
  18. 18. REINFORCEMENT LEARNING (diagram: the Agent performs an Action on the Environment)
  19. 19. REINFORCEMENT LEARNING (diagram: the Environment returns a Reward to the Agent)
  20. 20. DEEP REINFORCEMENT LEARNING: USING DEEP NEURAL NETWORKS AS FUNCTION APPROXIMATORS (diagram: the Agent receives Observations from and sends Actions to the Environment, and is trained using the Reward)
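      To make the loop on these diagrams concrete, here is a schematic Java sketch of one observe-act-reward-train interaction; the Environment and Agent interfaces below are illustrative only, not the DIANNE or youbot API.

      interface Environment {
          float[] observe();                // current observation (e.g. camera or LIDAR input)
          float performAction(int action);  // execute an action and return the resulting reward
      }

      interface Agent {
          int selectAction(float[] observation);  // forward pass through the neural network
          void train(float[] observation, int action, float reward, float[] next);  // update from experience
      }

      class Loop {
          static void run(Environment env, Agent agent, int steps) {
              float[] obs = env.observe();
              for (int i = 0; i < steps; i++) {
                  int action = agent.selectAction(obs);
                  float reward = env.performAction(action);
                  float[] next = env.observe();
                  agent.train(obs, action, reward, next);  // train using the reward signal
                  obs = next;
              }
          }
      }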
  21. 21. LEARNING ACTIONS FROM ROBOT INPUT (diagram: a neural network maps robot input to actions and is trained with the reward signal: negative distance to the can)
  22. 22. SCALING DEEP REINFORCEMENT LEARNING (diagram: multiple Simulation instances feed an Experience Pool, from which a Learner trains networks stored in a Neural Network Repository)
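      A hedged sketch of the experience-pool idea on this slide: simulation instances add (observation, action, reward, next observation) samples, and the learner draws random minibatches to train the network. This is an illustrative replay buffer, not DIANNE's actual API.

      import java.util.ArrayList;
      import java.util.List;
      import java.util.Random;

      class Experience {
          final float[] observation;
          final int action;
          final float reward;
          final float[] nextObservation;

          Experience(float[] o, int a, float r, float[] n) {
              this.observation = o; this.action = a; this.reward = r; this.nextObservation = n;
          }
      }

      class ExperiencePool {
          private final List<Experience> samples = new ArrayList<>();
          private final int capacity;
          private final Random random = new Random();

          ExperiencePool(int capacity) { this.capacity = capacity; }

          // called by the (possibly many) simulation instances
          synchronized void add(Experience e) {
              if (samples.size() == capacity) {
                  samples.remove(0);  // evict the oldest sample
              }
              samples.add(e);
          }

          // called by the learner to draw a random minibatch
          synchronized List<Experience> sample(int batchSize) {
              List<Experience> batch = new ArrayList<>(batchSize);
              for (int i = 0; i < batchSize; i++) {
                  batch.add(samples.get(random.nextInt(samples.size())));
              }
              return batch;
          }
      }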
  23. 23. DIANNE: DISTRIBUTED ARTIFICIAL NEURAL NETWORKS
      • Modular, distributed deep learning framework in OSGi
      • Builds on top of Torch, exploiting highly optimized CPU and GPU operations
      • Transparent distributed neural network inference and training
      • Web UI to quickly prototype and experiment
  24. 24. EXPLOIT ADDITIONAL SENSORS IN THE ENVIRONMENT: TRAIN A NEURAL NETWORK TO FUSE ADDITIONAL SENSOR INPUTS (diagram: a fusing layer, deployed on the "edge")
  25. 25. IMEC TECHNOLOGY FORUM: Evaluate the Sessions - Sign in and vote at eclipsecon.org (-1 / 0 / +1)
  26. 26. PUBLIC THANK YOU
      http://dianne.intec.ugent.be/
      https://github.com/ibcn-cloudlet/dianne
      tim.verbelen@ugent.be
      "How deep is your learning?" - 12:00-12:35 - Seminarraum 5
  27. 27. PUBLIC
