
Arpan pal roboticsensing_sw2015

Robotic Sensing, Rome, Italy


  1. Five Senses Computing in Robots for Remote Monitoring Applications
     19th May 2015
     Arpan Pal, Principal Scientist, Innovation Lab, Kolkata, with Ranjan Dasgupta
     Copyright © 2014 Tata Consultancy Services Limited
  2. 5 Senses Computing
     • Touch – Feel Remotely: online shopping, remote healthcare
     • Sight – 3D Vision: remote identification, recognition and measurement
     • Hearing – 3D Hearing: remote surveillance
     • Taste – Ingredient Analyzer: remote monitoring, virtual taste buds
     • Smell – Gas Analyzers: remote healthcare, remote surveillance
     http://readwrite.com/2012/12/18/ibms-cognitive-computing-plans-giving-smartphones-5-senses
     http://www.extremetech.com/extreme/143478-ibm-predicts-computers-will-have-the-five-human-senses-within-five-years
  3. Why It Is Important in Robotic Sensing
     • Robots can carry a whole lot of sensors – but so can human beings
     • The key difference between robots and human beings is the five senses
     • To give a robot the ability of cognition, it must have the five senses
     • Robots are useful in hazardous areas, or for cost-effective sensing
     • Advanced machine learning and deep learning on five-senses data – cognitive computing
  4. Robotic Sensing – State of the Art
     Current state of the art in robots:
     • 2D vision
     • Normal acoustic sensing via microphone
     • Ranging / obstacle detection
     Sensing technology that is available but not predominantly deployed on robots:
     • Real-time 3D vision
     • Acoustic 3D
     • Thermal 3D
     • Smell
     • Gas
     • Touch
     • Taste
  5. Use Cases – Oil Refineries / Underground Mines
     • Checking for discrepancy / quality control in factory assembly lines
     • Tank gauging (sludge heel evaluation) in oil refineries – hazardous gases are generated inside the tank
     • Underground coal mines – zero visibility and a dangerous environment (high temperature, gas, damps) leading to mine disasters; possible acoustic / thermal / gas sources
     • Manual inspection in high-risk and inaccessible areas is unsafe (an operational and occupational hazard) – needs robotic sensing
  6. Discrepancy Checking in Factory
  7. Checking Discrepancy using Camera
     • Capture multiple 2D images from various positions around the object
     • Create a 3D model (geometry model)
     • Measure reference points and places to check for discrepancies (see the reconstruction sketch below)
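One way to realise the pipeline on this slide is classical two-view structure-from-motion. A minimal sketch, assuming OpenCV, a calibrated camera matrix K, and an illustrative function name (`reconstruct_pair`) and ratio-test threshold that are not taken from the deck:

```python
# Minimal two-view reconstruction sketch: the core step of building a 3D model
# from multiple 2D images captured around the object.
import cv2
import numpy as np

def reconstruct_pair(img1, img2, K):
    """Triangulate 3D points from two overlapping views of the object."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(img1, None)
    kp2, des2 = sift.detectAndCompute(img2, None)

    # Match features between the two views (Lowe ratio test prunes bad matches)
    matcher = cv2.BFMatcher()
    matches = matcher.knnMatch(des1, des2, k=2)
    good = [m for m, n in matches if m.distance < 0.7 * n.distance]

    pts1 = np.float32([kp1[m.queryIdx].pt for m in good])
    pts2 = np.float32([kp2[m.trainIdx].pt for m in good])

    # Relative camera pose from the essential matrix
    E, mask = cv2.findEssentialMat(pts1, pts2, K, method=cv2.RANSAC)
    _, R, t, mask = cv2.recoverPose(E, pts1, pts2, K, mask=mask)

    # Triangulate matched points into 3D (homogeneous -> Euclidean)
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    P2 = K @ np.hstack([R, t])
    pts4d = cv2.triangulatePoints(P1, P2, pts1.T, pts2.T)
    return (pts4d[:3] / pts4d[3]).T   # N x 3 point cloud
```

Repeating this over all image pairs and merging the results gives the geometry model against which reference points can be measured.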
  8. Camera-based 3D Reconstruction from 2D Images
     Figure: input images → sparse reconstruction using mobile inertial sensors for camera position estimation → dense reconstruction (120 images / 20 images)
     • Low-cost solution for 3D reconstruction from multiple 2D images
     • Motion information from the in-built inertial sensors is used for camera position estimation (see the sketch below)
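A minimal sketch of how on-board inertial data can seed the camera pose estimate, assuming gyroscope samples are available between two image captures and SciPy is used; the integration scheme and names are illustrative, not the deck's implementation:

```python
# Integrate gyroscope readings between two image captures to obtain a
# relative-rotation prior, so pose estimation only refines (rather than
# searches for) the camera orientation.
import numpy as np
from scipy.spatial.transform import Rotation

def rotation_prior(gyro_samples, dt):
    """gyro_samples: N x 3 angular velocities (rad/s) sampled every dt seconds."""
    R = Rotation.identity()
    for omega in gyro_samples:
        # First-order integration: rotate by omega * dt at each IMU sample
        R = R * Rotation.from_rotvec(np.asarray(omega) * dt)
    return R.as_matrix()   # 3x3 prior for the relative camera rotation
```

Feeding such a prior as the initial guess during pose estimation is where the compute-time reductions reported later in the deck come from.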
  9. Tank Sludge Measurement
  10. Thermal Imagery
      • Thermal imaging of the environment, mapped into 3D optical space
      • 3D opto-thermal representation of objects and quantitative thermography (a fusion sketch follows below)
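A minimal sketch of the opto-thermal fusion idea, assuming the thermal camera's intrinsics K_th and its extrinsics R, t relative to the optical reconstruction are known from a prior calibration step; all names are illustrative:

```python
# Project each 3D point of the optical reconstruction into the thermal image
# and attach a temperature value, giving an opto-thermal point cloud.
import numpy as np

def fuse_thermal(points_3d, thermal_img, K_th, R, t):
    """points_3d: N x 3 point cloud; thermal_img: H x W temperature map."""
    cam = R @ points_3d.T + t.reshape(3, 1)     # points in thermal-camera frame
    uv = K_th @ cam
    u = np.round(uv[0] / uv[2]).astype(int)     # pixel column
    v = np.round(uv[1] / uv[2]).astype(int)     # pixel row

    h, w = thermal_img.shape
    valid = (cam[2] > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)

    temps = np.full(len(points_3d), np.nan)     # NaN where no thermal reading
    temps[valid] = thermal_img[v[valid], u[valid]]
    return temps   # per-point temperature for quantitative thermography
```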
  11. (image-only slide)
  12. Acoustic Source Localization and Acoustic Imaging
  13. Acoustic Source Localization
      • Localize a sound source using an array of microphones
      • Detect sound sources (pumping system, motors/compressors, water drop) other than voice frequencies (a TDOA sketch follows below)
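A minimal sketch of one common way to localize a source with a microphone pair: GCC-PHAT time-difference-of-arrival followed by a bearing estimate. The deck does not name its algorithm, and the sample rate fs, mic spacing d and function names here are assumptions:

```python
# Estimate the time difference of arrival (TDOA) between two microphones with
# GCC-PHAT, then convert it to an angle of arrival.
import numpy as np

def gcc_phat_tdoa(sig, ref, fs):
    """Delay (seconds) of sig relative to ref using GCC-PHAT."""
    n = len(sig) + len(ref)
    SIG = np.fft.rfft(sig, n=n)
    REF = np.fft.rfft(ref, n=n)
    cross = SIG * np.conj(REF)
    cross /= np.abs(cross) + 1e-12        # PHAT weighting: keep only the phase
    cc = np.fft.irfft(cross, n=n)
    # Re-center the cross-correlation so index 0 corresponds to zero delay
    shift = np.argmax(np.concatenate((cc[-n // 2:], cc[:n // 2]))) - n // 2
    return shift / fs

def bearing(tdoa, d, c=343.0):
    """Angle of arrival (degrees) from TDOA, mic spacing d (m), speed of sound c."""
    return np.degrees(np.arcsin(np.clip(tdoa * c / d, -1.0, 1.0)))
```

With a linear array, pairwise bearings from adjacent microphones can be combined to localize the source.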
  14. Acoustic Sensor Array (ASA) – Imaging Theory
      • Ultrasonic imaging of objects (~40 kHz) at 5-10 m range
        – Employed especially in dark and smoky environments
        – Augments optical / thermal vision for improved perception
      • A 2D planar, fully populated array (half-wavelength spacing) of microphones and transmitters, approx. 4 x 4
      • Time duration of the pulse: 1 ms
      • Frequency of the sinusoid in the pulsed-CW signal: 40 kHz
      • Directional array elements: 4 x 4
      • Element spacing: 0.5 λ
      • Distance of the target from the array: 5.0 m
      • Target: 1.75 m x 2.0 m x 0.3 m
      • Maximum steer angle, horizontal: ±10 degrees
      • Maximum steer angle, vertical: ±10 degrees
      (a beamforming sketch follows below)
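A minimal delay-and-sum beamforming sketch using the slide's parameters (4 x 4 array, half-wavelength spacing, 40 kHz pulsed CW, 250 ksps per element, ±10° steering). The deck elsewhere mentions SAR-based beamforming, so this is only an illustrative simplification; the variable names and azimuth/elevation parameterization are assumptions:

```python
# Delay-and-sum beamforming: for each steering angle, delay the 16 channels so
# a wavefront arriving from that direction adds coherently.
import numpy as np

C = 343.0               # speed of sound (m/s)
FS = 250e3              # sample rate per element (slide 17: 250 ksps)
F0 = 40e3               # pulsed-CW carrier frequency (Hz)
SPACING = 0.5 * C / F0  # half-wavelength element spacing (m)

# Element positions of the 4 x 4 planar array (x = horizontal, y = vertical)
ix, iy = np.meshgrid(np.arange(4), np.arange(4))
positions = np.stack([ix.ravel(), iy.ravel()], axis=1) * SPACING

def beam_power(channels, az_deg, el_deg):
    """channels: 16 x T received samples; returns power for one steering angle."""
    az, el = np.radians(az_deg), np.radians(el_deg)
    # Plane-wave delay of each element for a source at (azimuth, elevation)
    direction = np.array([np.sin(az) * np.cos(el), np.sin(el)])
    delays = positions @ direction / C                 # seconds per element
    shifts = np.round(delays * FS).astype(int)

    aligned = np.zeros_like(channels, dtype=float)
    for k, s in enumerate(shifts):
        aligned[k] = np.roll(channels[k], -s)          # undo the propagation delay
    return np.sum(np.sum(aligned, axis=0) ** 2)        # coherent-sum power
```

Scanning azimuth and elevation over a ±10° grid and recording the peak power per direction gives the pixel intensities of the acoustic image of a target at ~5 m.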
  15. System Requirements
  16. Next Generation Multisensory AGV (Firebird VI from NextRobotics)
      • Map thermal profiles of objects, captured using a thermal camera, onto optical vision
      • Acoustic array for imaging objects (planned): transmission of ultrasonic waves, reception of backscattered acoustic waves, SAR-based beamforming techniques using directional microphones
      • Linear microphone array for audio source localization – currently done via Kinect
      • Standard webcam for optical imaging
  17. Network Throughput Requirement
      Operation | Image / Data Size | Frame or Sample Rate | Network Throughput
      3D Reconstruction (SD-SFR camera) | 640 x 480 x 24 bits, compressed | 30 FPS | 27.6 Mbps
      3D Reconstruction (SD-HFR camera) | 640 x 480 x 24 bits, compressed | 60 FPS | 55.2 Mbps
      3D Opto-Thermal Mapping (thermal camera) | 640 x 480 x 24 bits, uncompressed | 6.5 FPS | 48 Mbps
      3D Opto-Thermal Mapping (camera decoupled from thermal sensor) | 640 x 480 x 24 bits compressed optical + 8-bit uncompressed thermal | 30 FPS | 31.2 Mbps
      Acoustic Source Localization | 24 bit | 40 ksps | 960 kbps
      Active Acoustic Imaging | 16 bit, 4 x 4 array | 250 ksps | 64 Mbps
      (a quick arithmetic check of the uncompressed rows follows below)
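The uncompressed rows of the table follow directly from raw bit-rate arithmetic (width x height x bits x rate); a quick sanity check in Python. The compressed rows additionally depend on a codec compression ratio that the deck does not state:

```python
# Raw bit rate of an uncompressed video or audio stream.
def mbps(width, height, bits_per_pixel, fps):
    return width * height * bits_per_pixel * fps / 1e6

print(mbps(640, 480, 24, 6.5))   # ~47.9 Mbps -> "48 Mbps" thermal-camera row
print(24 * 40e3 / 1e3)           # 960 kbps  -> acoustic source localization (24 bit, 40 ksps)
print(16 * 16 * 250e3 / 1e6)     # 64 Mbps   -> active acoustic imaging (16 bit, 16 elements, 250 ksps)
```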
  18. From Grid to Cloud, and then from Cloud to Edge
      Cognitive analytics:
      • Computing over a huge data set, with real-time or near-real-time requirements
      • Requires a huge cloud infrastructure
      • Or, it may be possible to leverage the edge devices (robots, routers and gateways)
      Edge device computing:
      • Computing power at the edge remains unused most of the time
      • Energy cost is typically at consumer rates, far less than the cost in the cloud, which is at enterprise rates
      • Reduction in the data size that needs to be sent to the cloud – a direct saving in edge energy and communication cost
      • Reduction in network congestion
      • Reduction in bandwidth requirement
      "Cloud computing is simply a buzzword used to repackage grid computing and utility computing, both of which have existed for decades" – whatis.com
  19. Fog Computing (source: Flavio Bonomi et al., MCC 2012, Helsinki, Finland)
      Dense reconstruction:
      • 120 images, compute time (4 cores, 1 GPU) ~ 20 min without inertial sensors
      • 120 images, compute time (4 cores, 1 GPU) ~ 1 min with inertial sensors
      • Bandwidth saving ~ 8x if done on the edge
      Sparse reconstruction:
      • 20 images, compute time (4 cores, 1 GPU) ~ 3 min without inertial sensors
      • 20 images, compute time (4 cores, 1 GPU) ~ 10 s with inertial sensors
      • Bandwidth saving ~ 200x if done on the edge
      TCS Connected Universe Platform (TCUP) for IoT:
      • Seamless, lightweight connectivity from sensor to gateway to cloud
      • OGC-SOS based sensor data storage
      • Analytics support
      • Remote device management
      • Edge processing support at the gateway
  20. Summary
      • 3D reconstruction is an extremely compute- and network-heavy operation
      • Using the robot position from on-board inertial sensors (accelerometer and gyroscope) can considerably reduce the compute load
      • Creating the point cloud in the robot edge gateway can yield an 8x to 200x bandwidth saving
      • Audio and the other three senses have similar or smaller data sizes and compute power requirements
  21. Patents and Papers
      Publications:
      o Ramu Vempada, Parijat Deshpande, Karthikeyan Vaiapury, Arindam Saha, Keshaw Dewangan, Ranjan Das Gupta, and Arpan Pal, "Sound Source Localization with 3D Optical Fusion for Hazardous Area Surveillance using Autonomous Ground Vehicles," Proceedings of the International Conference on Robotics and Automation, Developing Countries Forum, Seattle, Washington, May 26-30, 2015
      o Parijat Deshpande, V. Ramu Reddy, Arindam Saha, Karthikeyan Vaiapury, Keshaw Dewangan and Ranjan Dasgupta, "A Next Generation Mobile Robot with Multi-Mode Sense of 3D Perception," Proceedings of the 17th International Conference on Advanced Robotics, Istanbul, Turkey, July 27-31, 2015
      o V. Ramu Reddy, Parijat Deshpande and R. Dasgupta, "Robotics Audition using Kinect," Proceedings of the 6th International Conference on Automation, Robotics and Applications, Queenstown, New Zealand, February 17-19, 2015
      o A. Banerjee, A. Mukherjee, H. S. Paul and S. Dey, "Offloading Work to Mobile Devices: An Availability-Aware Data Partitioning Approach," MCS 2013
      o S. Dey, A. Mukherjee, H. S. Paul and A. Pal, "Challenges of Using Edge Devices in IoT Computation Grids," ICPADS 2013
      o A. Mukherjee, H. S. Paul, S. Dey and A. Banerjee, "ANGELS for Distributed Analytics in IoT," WF-IoT 2013
      o A. Mukherjee, S. Dey, H. S. Paul and B. Das, "Utilising Condor for Data Parallel Analytics in an IoT Context – An Experience Report," 9th IEEE International Conference on Wireless and Mobile Computing, Networking and Communications, IoT 2013 workshop
  22. Tata Consultancy Services (TCS) at a Glance
      • Pioneer and leader in Indian IT – TCS was established in 1968
      • One of the top-ranked global software service providers; largest software service provider in Asia
      • 300,000+ associates; USD 15 billion+ annual revenue
      • Global presence – 55+ countries, 119 nationalities
      • First software R&D center in India
      Innovation @ TCS:
      • 10 corporate innovation labs – Bangalore, Chennai, Cincinnati, Delhi, Hyderabad, Kolkata, Mumbai, Peterborough, Pune, Singapore
      • 2000+ associates in research, development and asset creation
      • Co-Innovation Network (COIN) with academia and industry
      • Three-stage innovation process – Explore, Enable, Exploit
      Internet-of-Things research – TCS Connected Universe Platform (TCUP):
      • M2M communication, distributed computing, sensor integration and management, analytics services
      Context-aware applications:
      • Healthcare, insurance, retail, manufacturing, smart building / campus, smart villages / cities
  23. Thank You – arpan.pal@tcs.com
      Copyright © 2014 Tata Consultancy Services Limited
