Social (Assistive) Robots
Would you let a robot mentor you?
Glen Koskela
Fujitsu Fellow
CTO Nordics
Copyright 2018 FUJITSU
Artificial Intelligence
Machine learning
Deep learning
Neural networks
Automated reasoning
Object recognition
Conversational intelligence
Language understanding
Semantic search
Motion video analytics
A core technology that accelerates
the move to digital and pushes
transformations even deeper
Machine translation
Emotion recognition
Motion-based analysis
Time series analytics
Knowledge discovery
Pattern discovery
Predictive analytics
Fraud detection
Face recognition
Social robots
Must be able to understand
and communicate using the
social cues that people use.
A robot capable of social behaviors? With the ability to respond and interact with humans?
An autonomous robot that interacts and
communicates with humans or other
autonomous physical agents by following
the social behaviors and rules attached to its role.
Robot features as a panelist at a UN meeting
October 11th, 2017
The Future of Everything – Sustainable Development in the Age of Rapid Technological Change.
Joint meeting of the United Nations General Assembly Second Committee and UN Economic and Social Council.
The UN Deputy Secretary-General held a brief dialogue with Sophia
A business environment and society where
robots understand, work with and collaborate
with people, and express a variety of emotions
using gestures, body movements, voice and
tone controls, and facial expressions.
Service robots lead to deeper engagement
Building attractive and friendly future customer service points
Human robot interaction
Machine vision
Speech recognition
Artificial intelligence
Machine learning
Cloud robotics
Friendly mediator robots to deliver
relevant services with an expressive
range of voice tones via speech
synthesis across wide-ranging
customer service applications in
retail, hospitality, tourism, finance,
healthcare and education.
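As a rough illustration of how such an expressive range of voice tones could be driven from application code, the sketch below maps an emotion label to SSML prosody markup. The emotion table, the attribute values and the to_ssml helper are illustrative assumptions rather than a Fujitsu or robot-specific API; most commercial text-to-speech engines accept SSML of this general shape.

# Illustrative sketch: choosing SSML prosody settings from an emotion label.
# The emotion-to-prosody table below is an assumption, for illustration only.
PROSODY = {
    "greeting": {"pitch": "+10%", "rate": "medium", "volume": "loud"},
    "apology":  {"pitch": "-10%", "rate": "slow",   "volume": "soft"},
    "delight":  {"pitch": "+20%", "rate": "fast",   "volume": "loud"},
}

def to_ssml(text: str, emotion: str) -> str:
    """Wrap text in an SSML <prosody> element picked from the emotion label."""
    p = PROSODY.get(emotion, {"pitch": "medium", "rate": "medium", "volume": "medium"})
    return (f'<speak><prosody pitch="{p["pitch"]}" rate="{p["rate"]}" '
            f'volume="{p["volume"]}">{text}</prosody></speak>')

# A retail greeting and an apology get noticeably different voice tones.
print(to_ssml("Welcome, how can I help you today?", "greeting"))
print(to_ssml("I am sorry about the wait.", "apology"))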
Expresses delight, gets mad, laughs, speaks, nods?
New partners in several fields of human activity, such as health, education and customer service
Experiments conducted at a care facility showed that residents
who spent time with the robot were more talkative, and those
who ordinarily showed little emotion began smiling more often.
Enabling robots to actively interact
Pre-defined instruction set (call and response model),
e.g. ‘put the lights on’.
Matching simple circumstances (sense and recommend),
e.g. location -> ‘there’s congestion in the area’
Learning from the history of operations, preferences and
activities (matching interests and concerns)
Sensing reactions when choosing options (mediation)
Amending the conversation with postures (motion)
Responding with an appropriate attitude (voice pitch)
Most communication robots
work only in response to
clear user instructions (the first two levels are sketched below).
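A minimal sketch of the first two interaction levels, assuming a hypothetical command table and context dictionary; it only illustrates the difference between reacting to an explicit instruction and volunteering a recommendation, not Fujitsu's implementation.

# Level 1 (call and response) versus level 2 (sense and recommend).
# The command table and context keys are illustrative assumptions.
COMMANDS = {
    "put the lights on": "lights:on",    # fixed, pre-defined instruction set
    "put the lights off": "lights:off",
}

def call_and_response(utterance):
    """Level 1: act only on a clear, pre-defined user instruction."""
    return COMMANDS.get(utterance.strip().lower())

def sense_and_recommend(context):
    """Level 2: match simple circumstances and volunteer a remark."""
    if context.get("location") == "city_centre" and context.get("traffic") == "congested":
        return "There's congestion in the area."
    return None

print(call_and_response("Put the lights on"))    # -> 'lights:on'
print(sense_and_recommend({"location": "city_centre", "traffic": "congested"}))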
Perceptual technologies to extract behavioral cues
Encoding and decoding non-verbal behavioral cues
(= computationally modeling people's non-verbal behaviors)
Computer vision
Signal processing
Statistical learning
Modality-specific effects
Sub-modality-specific effects
Their integration addresses
questions connected to the
inference of various social
variables in increasingly
diverse situations.
Head pose estimation, visual focus of
attention (VFOA) from head pose,
wandering VFOA, speaking style,
contextual VFOA, role recognition,
Kinect gaze sensing,
emergent leadership, dominance,
group characterization, group
cohesion, personality perception,
conflict detection, effectiveness of
delivery, interpersonal attraction,…
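As one concrete example from this list, head pose estimation (the usual starting point for visual focus of attention) can be sketched with OpenCV's solvePnP. The generic 3D face model points and the crude camera model are common approximations and are assumptions here; the 2D landmark positions would come from a separate face landmark detector.

# Sketch: head pose from 2D facial landmarks with OpenCV solvePnP.
# The 3D model points and the focal-length approximation are assumptions.
import cv2
import numpy as np

# Generic 3D facial landmarks (nose tip, chin, eye corners, mouth corners), in mm.
MODEL_POINTS = np.array([
    (0.0,     0.0,    0.0),    # nose tip
    (0.0,  -330.0,  -65.0),    # chin
    (-225.0, 170.0, -135.0),   # left eye outer corner
    (225.0,  170.0, -135.0),   # right eye outer corner
    (-150.0, -150.0, -125.0),  # left mouth corner
    (150.0,  -150.0, -125.0),  # right mouth corner
])

def head_pose(image_points, frame_w, frame_h):
    """Return rotation and translation of the head relative to the camera."""
    focal = frame_w  # crude focal-length approximation
    camera_matrix = np.array([[focal, 0, frame_w / 2],
                              [0, focal, frame_h / 2],
                              [0,     0,           1]], dtype="double")
    dist_coeffs = np.zeros((4, 1))  # assume no lens distortion
    ok, rvec, tvec = cv2.solvePnP(MODEL_POINTS,
                                  np.asarray(image_points, dtype="double"),
                                  camera_matrix, dist_coeffs)
    # rvec/tvec can then be thresholded to infer the visual focus of attention (VFOA).
    return rvec, tvec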
Imitation of human communication behavior
CC BY 2.0 photo credit: https://www.flickr.com/photos/quinnanya/3820640681/
A mechanism is needed that supports various conversations and questions.
The appearance of embodied robots is very important. The robot must adjust
the time it starts speaking to match the user’s receptivity, imitate human
communication behavior such as active glancing and making eye contact,
elicit reactions by responding with utterances or nodding, alternate
turn-taking and facial expressions, and fill silent periods during turn-taking.
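A toy sketch of the turn-taking part of this behavior: the robot waits for a short silence gap before it takes the turn, and fills longer pauses with a back-channel cue such as a nod. The thresholds and the returned action labels are assumptions, not measured values.

# Turn-taking sketch: decide whether to listen, take the turn, or fill silence.
import time

SILENCE_BEFORE_REPLY = 0.6   # seconds of silence before the robot takes the turn
FILLER_AFTER = 2.0           # seconds of silence before a filler/nod is produced

def take_turn(last_user_speech_time, now=None):
    """Return 'listen', 'speak', or 'filler' based on the silence gap."""
    now = time.monotonic() if now is None else now
    gap = now - last_user_speech_time
    if gap >= FILLER_AFTER:
        return "filler"   # nod or utter a short back-channel to keep the interaction alive
    if gap >= SILENCE_BEFORE_REPLY:
        return "speak"    # the user has yielded the turn; start the reply
    return "listen"       # the user may still be speaking

# Example: 0.8 s after the user's last utterance the robot starts speaking.
print(take_turn(last_user_speech_time=0.0, now=0.8))   # -> 'speak'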
Emotional expression
Expressing itself and feelings using its whole body and continuous voice tone control
Example expressions: ‘Welcome’, ‘Wow!’, ‘I’m sorry’, ‘This way, please’, ‘Yes, Sir’ (222 movements, eye contact)
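The table below sketches how such expressive phrases could be bound to whole-body gestures and voice-tone settings; the gesture names and the pitch scale are purely illustrative stand-ins for the robot's own library of movements.

# Illustrative mapping from an expressive phrase to motion and voice parameters.
# Gesture names and pitch values are assumptions, not the robot's actual API.
EXPRESSIONS = {
    "Welcome":          {"gesture": "bow",        "pitch": 1.1, "eye_contact": True},
    "Wow!":             {"gesture": "raise_arms", "pitch": 1.3, "eye_contact": True},
    "I'm sorry":        {"gesture": "lower_head", "pitch": 0.8, "eye_contact": False},
    "This way, please": {"gesture": "point_left", "pitch": 1.0, "eye_contact": True},
}

def express(phrase):
    """Return the motion and voice parameters used to act out a phrase."""
    return EXPRESSIONS.get(phrase, {"gesture": "idle", "pitch": 1.0, "eye_contact": True})

print(express("I'm sorry"))   # lowered head, softer pitch, gaze averted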
Conversation applications
Technologies to handle user needs, to make conversation, and to change non-verbal behavior
Execution control
Dialog service platform
Development support
Turn-taking control
Conversation starting time control
“It’s raining heavily, traffic is congested”
“They are talking about it in the news”
“Would you like to hear the forecast?”
“Your train-line is already 20 minutes late”
“Would you like me to call you a taxi instead?”
“I’ll send a message to the office that you’ll be in time”
“Remember to take your umbrella”
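The utterances above could be triggered by simple context rules, roughly as sketched below; the context keys and thresholds are assumptions standing in for whatever live weather, traffic and transit feeds the dialog service platform actually uses.

# Sketch: proactive conversation openers driven by simple context rules.
# The context keys and thresholds are illustrative assumptions.
def proactive_utterances(ctx):
    """Return conversation openers that match the current context."""
    out = []
    if ctx.get("rain_mm_per_h", 0) > 5:
        out.append("It's raining heavily, traffic is congested.")
        out.append("Would you like to hear the forecast?")
        out.append("Remember to take your umbrella.")
    if ctx.get("train_delay_min", 0) >= 20:
        out.append("Your train line is already 20 minutes late.")
        out.append("Would you like me to call you a taxi instead?")
        out.append("I'll send a message to the office that you'll be in time.")
    return out

for line in proactive_utterances({"rain_mm_per_h": 8, "train_delay_min": 20}):
    print(line)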
Robot AI Platform – more Women in Tech needed!
Data collection, Machine Learning analysis
Robot Interaction Device Controller (Dev. Kit)
Vocal emotion analysis:
calm, anger, joy, sorrow, energy level…
Facial expression recognition:
anger, disgust, fear, happiness, sadness,
surprise, gender, age,…
Conversational natural-language assistant:
knowledge to handle specific content domains,
goal-oriented dialogs and content-drifting dialogs
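One plausible way the two perception channels feed the assistant is a simple late fusion of per-label scores from the vocal and facial analysers, sketched below; the weights and the example scores are assumptions, not Fujitsu's actual fusion method.

# Sketch: weighted late fusion of vocal-emotion and facial-expression scores.
# Weights and example scores are illustrative assumptions.
VOICE_WEIGHT, FACE_WEIGHT = 0.4, 0.6

def fuse_emotion(voice_scores, face_scores):
    """Combine per-label scores from the two analysers into one estimate."""
    labels = set(voice_scores) | set(face_scores)
    fused = {label: VOICE_WEIGHT * voice_scores.get(label, 0.0)
                    + FACE_WEIGHT * face_scores.get(label, 0.0)
             for label in labels}
    return max(fused, key=fused.get), fused

# Example: a fairly calm voice but a sad face tips the estimate towards sadness.
top, scores = fuse_emotion({"calm": 0.7, "sorrow": 0.3},
                           {"sadness": 0.6, "happiness": 0.1})
print(top, scores)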
Human Centric Intelligent Society