The age of 
empathic devices
Vincent Spruyt
Head of Deep Learning, Argus Labs
http://www.slideshare.net/ArgusLabs
When smart-phones sense how you feel: The era of intelligent mobile devices - Vincent Spruyt, Argus Labs
Published in: Mobile, Business, Technology
Mobile Theatre - June 17th, 12:30-13:00

Argus Labs uses deep learning algorithms to sense, understand and predict human behaviour and emotions, based on a smart-phone's sensors and general usage. The presentation demonstrates how smart-phones will start to behave as intelligent entities that know how a user feels and improve our lives.
  1. The age of empathic devices Vincent Spruyt Head of Deep Learning, Argus Labs http://www.slideshare.net/ArgusLabs
  2. Let’s look at the world in 2014
  3. A perfect storm is coming @arguslabs IoT AI Mobile Sensors User acceptance
  4. The Mobile & Internet of Things Everything storm Wearable 37B by 2018 Smart Appliances XYZ B by 2019 Apps 12B by 2013 Cars @arguslabs The coming era of smart-phone intelligence will allow anyone to interact with the physical world in completely new ways
  5. The sensor storm: sensors are everywhere Sensors are a device’s equivalent to human-like sensing capabilities and allow it to sense and understand contextual cues. Automatic gender detection, through walking patterns Mood detection, through mobile handling Activity detection (walking, running, sitting, …) … Automatic in- versus outdoor detection Mood detection, through heart rate and oxygen level detection … @arguslabs
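The deck does not describe how these detectors work; as a rough illustration of what activity detection from a phone's accelerometer can look like, here is a minimal sketch. The thresholds and function names are invented for illustration, not the Argus Labs implementation:

```python
import math

def magnitude(sample):
    """Euclidean norm of one (x, y, z) accelerometer reading, in m/s^2."""
    x, y, z = sample
    return math.sqrt(x * x + y * y + z * z)

def classify_activity(window):
    """Classify a window of (x, y, z) accelerometer samples.

    Uses the standard deviation of the acceleration magnitude as a
    crude motion-energy feature; thresholds are illustrative, not tuned.
    """
    mags = [magnitude(s) for s in window]
    mean = sum(mags) / len(mags)
    std = math.sqrt(sum((m - mean) ** 2 for m in mags) / len(mags))
    if std < 0.5:
        return "sitting"
    elif std < 3.0:
        return "walking"
    return "running"

# A still phone reads roughly gravity (9.81 m/s^2) on one axis.
still = [(0.0, 0.0, 9.81)] * 50
print(classify_activity(still))  # sitting
```

Real systems extract many more features (frequency content, orientation, step cadence) and feed them to a trained classifier, but the window-plus-features pattern is the same.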
  6. The data and AI storm Activity of a spiny stellate neuron, part of the primary visual cortex, with a developed horizontally-oriented receptive field. Recent Artificial Intelligence breakthroughs are based on computational models of the human brain. With the worldwide adoption of smart-phones, data generation has exploded, while processing capabilities have never been as powerful as today. 500+ TB, 200+ TB, 700+ TB @arguslabs
  7. Let’s evaluate how we work with smart-phones
  8. A phone or an interconnected sensory device? 2000: verbal. 2014: non-verbal.
  9. User experiences are changing Intelligent capabilities need to interface with the end user as an overall service layer. Good privacy controls are required, but most importantly, added value must be shown through the use of intelligent and sensing apps. This might feel strange today, but remember the last time you didn’t allow this?
  10. Apps are changing Service layers are emerging as background apps that only interface with a mobile user when it’s really relevant.
  11. How will we deal with mobile by 2020?
  12. A phone is only a form factor The functionality of a smart-phone will come in many shapes. Already, a first generation of smart wearable devices illustrates how this might go.
  13. Your smart-phone is a docking station Due to increasing processing and connectivity capabilities, your smart-phone will be a hub and controller used to gather and control data from other connected devices, implants, or wearables. Emerging fog computing architectures are only a prelude. An important channel for handling, analyzing and processing data, it controls the user experience.
  14. Your smart-phone controls your vitals
  15. Your smart-phone is a smart agent Capable of interpreting the context of a mobile user, a smart-phone can respond to that user’s needs in a proactive manner.
  16. Bringing 2020 into the present
  17. What if we told you your phone could already be empathic?
  18. Turning smart-phones into sentient, cognitive and empathic devices. Make a better future by augmenting the cognitive capacities of humans by means of an exocortex. If we want machines to start working for us, they need to understand us in order to become truly engaging.
  19. We did it! For the first time ever, people can start making machines and software cognitively and emotionally aware of their human users
  20. It’s a service layer Cognitive platform through SDK and API Handling > 1 million requests per second Ambient sensing technology (no input needed) Android, iOS*, Microsoft Mobile** & Tizen** Built for mobile and embedded devices Proprietary IP and patented Now, anyone can use the human channel * Being ported/tested ** Porting started
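The deck does not show the SDK itself; a hypothetical sketch of the subscribe-to-context pattern such a service layer implies is below. `SenseLayer`, its event names and handlers are invented for illustration and are not the actual Argus Labs API:

```python
class SenseLayer:
    """Toy stand-in for an ambient-sensing service layer: apps subscribe
    to high-level context events instead of polling raw sensors.
    Illustrative only; not the real Argus Labs SDK.
    """

    def __init__(self):
        self._handlers = {}

    def on(self, event, handler):
        """Register a callback for a high-level context event."""
        self._handlers.setdefault(event, []).append(handler)

    def emit(self, event, **context):
        """Deliver an event to all subscribers.

        In a real platform this would be driven by on-device models;
        here we trigger it by hand.
        """
        for handler in self._handlers.get(event, []):
            handler(context)

layer = SenseLayer()
layer.on("mood_changed", lambda ctx: print(f"user seems {ctx['mood']}"))
layer.emit("mood_changed", mood="stressed")  # prints: user seems stressed
```

The point of the pattern is the "no input needed" claim above: the app never asks the user anything, it only reacts when the platform pushes a relevant context change.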
  21. Self-learning biking – running – depression – stress – happy – excited – petrol head – soccer mom – car – party – music – meeting room – busy day – and much, much more
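One way such self-learning labels can emerge, sketched under the assumption of unsupervised clustering (the actual Argus Labs method is not described in the deck): recurring sensor patterns are grouped without supervision, and human-readable labels like "meeting room" are attached to the groups afterwards:

```python
import random

def kmeans(points, k, iters=20, seed=0):
    """Minimal k-means on 2-D feature vectors (e.g. motion energy, noise level)."""
    random.seed(seed)
    centers = random.sample(points, k)
    for _ in range(iters):
        # Assign each point to its nearest center.
        groups = [[] for _ in range(k)]
        for p in points:
            i = min(range(k),
                    key=lambda i: sum((a - b) ** 2 for a, b in zip(p, centers[i])))
            groups[i].append(p)
        # Move each center to the mean of its group (keep it if the group is empty).
        centers = [tuple(sum(c) / len(g) for c in zip(*g)) if g else centers[i]
                   for i, g in enumerate(groups)]
    return centers, groups

# Toy features: (motion energy, ambient noise). Two recurring contexts emerge,
# which a person might later label "meeting room" and "running".
samples = [(0.1, 0.2), (0.15, 0.25), (0.12, 0.18),
           (0.9, 0.7), (0.95, 0.8), (0.85, 0.75)]
centers, groups = kmeans(samples, k=2)
```

The label vocabulary on the slide suggests exactly this shape: open-ended categories discovered from behaviour, rather than a fixed list baked into the app.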
  22. Not only smart-phones can be sentient!
  23. The future of mobile Is fueled by artificial intelligence and empathic algorithms, and will transform your future in ways you could never have imagined!
  24. Argus Labs http://www.arguslabs.be @arguslabs http://www.slideshare.net/ArgusLabs
