
Robotics Powered by Machine Learning


This presentation is from AI Dev Days conference, Bangalore on 9th March 2018. URL: www.aidevdays.com

Abstract: This session will cover the application of machine learning and how its application can control and alter the behavior of Robots. A use case will be showcased covering the Robot acting as an inference agent that can interpret and apply analytics and machine learning to financial data and answer questions based on this.

Presenters: Sharath Kumar & Shikha Maheshwari (IBM)

Published in: Software


  1. AI Dev Days: Robotics Powered by Machine Learning. R K Sharath Kumar (sharrkum@in.ibm.com), Shikha Maheshwari (shikha.mah@in.ibm.com). #aidevdays
  2.
  3. There are different possible models to address each problem. Each model has its own learning algorithm(s) and its specific pros and cons. Discrete features.
  4. Robotics: technology push and demand pull. Technology push: raw curiosity, more computational power, advances in AI algorithms and architectures, better/smaller sensors and actuators, better batteries, 3D printing, advances in neural implants. Demand pull: new military doctrine, demographic changes, economic need, raw curiosity.
  5. Dependent / Autonomous
  6. Fixed place / Mobile
  7. Thing-facing / Human-facing
  8. Predictable / Responsive / Social
  9. Use Cases for AI. Concierge (Robot, Avatar, Space): "Where is the elevator?"; Retail (Robot, Avatar, Space): "Do you wanna build a snowman?"; ElderCare (Robot, Avatar, Space): "I've fallen and I can't get up!"; Cobot (Robot): "Get me a screwdriver."; Manufacturing (Robot): "Watch me do this."; Transportation (Robot, Avatar, Space): "Open the pod bay doors, Watson."; Boardroom (Avatar, Space): "Help me decide."; Companion (Avatar, Device): "Let's play a game."
  10. An Agent May Be Embodied In A Robot…
  11. Text + Audio + Vision + Emotion + Motion + Physical interaction: a socially-intelligent cooperative robot
  12. Cognitive systems are creating a new partnership between humans and technology. Humans excel at: dilemmas, compassion, dreaming, abstraction, imagination, morals, generalization, common sense (but with many biases). Cognitive systems excel at: eliminating biases, locating knowledge, pattern identification, machine learning, natural language processing at scale, providing endless capacity.
  13. A cognitive business has systems that can enhance digital intelligence exponentially. They can reason, grasp underlying concepts, form hypotheses, and infer and extract ideas. Understand / Reason / Learn / Interact: cognitive systems understand imagery, language and other unstructured data like humans do; with abilities to see, talk and hear, they interact with humans in a natural way; with each data point, interaction and outcome, they develop and sharpen expertise, so they never stop learning.
  14. Who is Nao (pronounced 'Now')? • Nao is a humanoid robot • Developed by SoftBank Robotics • Originated at Aldebaran in France • Nao can play the roles of a 'hospitality' robot (attract & welcome, inform & assist customers) or a companion robot (personal care use cases: children, elderly, entertainment, …) • It is not intended (nor capable) to replace a human; it will complement humans for some chores
  15. Nao built-in capabilities • The Nao robot has a set of sophisticated built-in capabilities that can operate stand-alone: • Sensors & actuators: LEDs (eyes, ears, shoulder blades); bumpers, head and hands (tactile switches); proximity (laser ranging); accelerometer, gyroscope • Animatronics: robot animation, robot pose sensing • Sound: sound tracking, the ability to track the origin of a sound through directional microphones • Voice: Say (Text-To-Speech); voice recognition (some onboard, some off-robot) • Vision: 2-D cameras; face learning, recognition and tracking; object learning & recognition; emotion detection • Communications: WiFi connectivity (plus wired Ethernet); tablet (connected to the robot head and WiFi)
  16. Nao system environment • Main system ('head'): quad-core CPU, 4GB RAM, 8GB SRAM, 16GB SSD; runs a Linux-flavor OS (NaoQi); many peripheral controllers to drive robotics (14 motors, 30 sensors); large battery, 12-18h autonomy • Connectivity: WiFi 802.11 a/b/g/n, wired Ethernet (head) • Tablet add-on: 10.1", 1280x800 TFT; Android, separate system, 1GHz Cortex A5 + VPU + Mali 400 GPU, 1GB RAM, 4GB flash; USB-connected internally to the head, plus WiFi
  17. Nao software environment • NaoQi operating system, a Linux-based OS; SSH/FTP/… available to administer the robot; distributed objects system (NaoQi), extensible • Software and programming • Choregraphe: 'wiring boxes', code written in Python; virtual robot emulation; robot animation (recording of positions, playback, sync with actions); Dialog (QiChat), to drive request-response with voice; automation, no magic • Native C++: distributed objects platform (QiMessage), remotely accessible from outside the robot (e.g. a gateway) • Android tablet can interact with the robot through JavaScript APIs
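The Python route mentioned above can be sketched as a remote NaoQi call. This is a minimal illustration only, assuming the `naoqi` Python SDK that ships with Choregraphe; the robot IP address and the greeting text are placeholders, not part of the presentation.

```python
# Sketch: drive Nao's Text-To-Speech remotely over the NaoQi distributed
# objects system. The `naoqi` package comes with the SoftBank SDK (not PyPI).

def make_greeting(visitor_name):
    """Build the sentence the robot will speak (pure logic, testable offline)."""
    return "Hello %s, welcome to AI Dev Days!" % visitor_name

if __name__ == "__main__":
    from naoqi import ALProxy          # requires the NaoQi SDK on the path
    ROBOT_IP, NAOQI_PORT = "192.168.1.10", 9559  # placeholder robot address
    tts = ALProxy("ALTextToSpeech", ROBOT_IP, NAOQI_PORT)
    tts.say(make_greeting("everyone"))
```

The same proxy pattern reaches the other modules listed on the capabilities slide (e.g. `ALMotion`, `ALLeds`), which is what "remotely accessible from outside the robot" refers to.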
  18. Implementing robot applications • 'Autonomous Life' mode: behaviors are loaded and actable; behaviors are activated by triggers • The robot can sense when there are people around; 3 engagement (proximity) zones drive behavior; it starts to engage in human interaction through a specific application • Behaviors have to be coded and scripted as responses to a stimulation (sound, image, timer, engagement) • The native dialog system is used to code human-robot interaction flows, based on a question & answer paradigm; the robot does not take initiatives by itself! • Robot movements (attitudes, animations) are coded and synchronized with dialog and interactions; emotional behavior can be played automatically
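The question & answer paradigm can be illustrated with a toy rule table: a scripted trigger fires a scripted answer, and nothing happens without a stimulus. This is only a sketch of the idea, not the NaoQi dialog engine; the triggers and answers are invented for illustration.

```python
# Illustrative question-&-answer dispatch: the robot never takes the
# initiative, it only maps an incoming utterance to a scripted response.

RULES = {  # trigger fragment -> scripted response (hypothetical content)
    "elevator": "The elevator is to your left.",
    "your name": "I am Nao, a humanoid robot.",
}

def respond(utterance):
    """Return the scripted answer for the first matching trigger, else a fallback."""
    text = utterance.lower()
    for trigger, answer in RULES.items():
        if trigger in text:
            return answer
    return "Sorry, I did not understand that."
```

In the real system the trigger patterns live in QiChat topic files and the answers are synchronized with animations, but the control flow is the same request-response shape.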
  19. How to make the NAO robot smarter
  20. Robot as an inference agent • Create a cognitive agent in the NAO robot, using the Watson Conversation service and Data Science Experience • Enable the exchange of information between the different systems • Process natural language to derive insights from the data
  21. Architecture • Establish communication between the NAO robot and IBM Data Science Experience (DSX) by using the Watson Conversation API & Node-RED • Create the Watson Conversation chat bot application • Perform statistical analysis on a financial data set by using a Jupyter (Python) notebook on IBM DSX
  22. Watson Conversation • Watson Conversation Service (https://console.bluemix.net/catalog/services/conversation) • Add a natural language interface to your application to automate interactions with your end users • Combines machine learning, natural language understanding, and integrated dialog tools to create conversation flows
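As a rough sketch, a client can reach the Conversation v1 `message` endpoint over plain HTTP. The workspace id, service credentials, and version date below are placeholders from the Bluemix-era API; check the current service documentation before relying on any of them.

```python
# Sketch of posting user text to the Watson Conversation v1 message endpoint.
# Endpoint shape, version date, and credentials are placeholder assumptions.
import requests

CONVERSATION_URL = ("https://gateway.watsonplatform.net/conversation/api"
                    "/v1/workspaces/{workspace_id}/message")

def build_message(text, context=None):
    """Assemble the JSON body: the user's input plus the running dialog context."""
    return {"input": {"text": text}, "context": context or {}}

def send_message(workspace_id, username, password, text, context=None):
    """POST one turn of the conversation and return the service's JSON reply."""
    resp = requests.post(
        CONVERSATION_URL.format(workspace_id=workspace_id),
        params={"version": "2017-05-26"},   # API version date (assumption)
        auth=(username, password),          # service credentials from Bluemix
        json=build_message(text, context))
    resp.raise_for_status()
    return resp.json()
```

Passing the returned `context` back into the next `send_message` call is what keeps the dialog state across turns; in the demo architecture that loop is wired up inside Node-RED rather than in client code.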
  23. IBM Data Science Experience • IBM Data Science Experience (https://datascience.ibm.com/) • An interactive, collaborative, cloud-based environment where data scientists can use multiple tools to activate insights • Various tools available to analyze data: Jupyter notebooks in Python, Scala, or R; the Flow Editor to create models that use machine learning; RStudio within DSX to run R notebooks; Data Refinery to prepare data for analysis; Streams Designer to design stream flows to collect and analyze large amounts of streaming data
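The statistical step of such a notebook might look like the following pandas sketch. The price series here is synthetic, invented for illustration; it is not the financial data set used in the demo.

```python
# A small taste of notebook-style statistical analysis on price data.
import pandas as pd

# Synthetic closing prices (placeholder data, not the demo's data set).
prices = pd.Series([100.0, 102.0, 101.0, 105.0, 104.0], name="closing_price")

returns = prices.pct_change().dropna()      # simple day-over-day returns
summary = {
    "mean_return": returns.mean(),
    "volatility": returns.std(),            # sample standard deviation
    "worst_day": returns.idxmin(),          # index of the largest daily drop
}
```

Wrapped behind the Node-RED/Conversation plumbing, numbers like these become the answers the robot speaks back when asked about the data.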
  24. Node-RED • A browser-based editor to wire together flows using the wide range of nodes in the palette • Flows can be deployed to the runtime in a single click
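Under the hood a Node-RED flow is stored as JSON, so the wiring is easy to version and share. A minimal sketch of the shape the editor produces is below; the node ids, names, and function body are arbitrary, and in the demo the middle hop would instead be a Watson Conversation node from the installed IBM palette.

```json
[
  {"id": "in1", "type": "inject", "name": "ask a question", "wires": [["fn1"]]},
  {"id": "fn1", "type": "function", "name": "build payload",
   "func": "msg.payload = {input: {text: msg.payload}}; return msg;",
   "wires": [["out1"]]},
  {"id": "out1", "type": "debug", "name": "conversation response", "wires": []}
]
```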
  25. Demo
  26. Recap
  27. https://developer.ibm.com/code/patterns/robotic-calculations-and-inference-agent/
  28. https://developer.ibm.com/code
  29. Q & A
  30. Thank You !!
