DEMO: https://www.youtube.com/watch?v=8c8PuASMdFk
Assistive robotics solutions help people recover their lost mobility and autonomy in daily life. This work presents a comparison between two Human Machine Interfaces (HMIs), based on head postures and facial expressions, for controlling a robotized wheelchair. In the comparison, JoyFace proved to be the safer and easier-to-use strategy; RealSense demanded less physical effort and may be the appropriate solution for people who have suffered severe trauma, since many of them cannot move their heads at all. Although both HMIs still need improvement, they have shown to be promising technologies for enabling people paralyzed from the neck down to control a robotized wheelchair.
Comparison of HMIs to Control a Robot Wheelchair
1. Comparison of Human Machine Interfaces to Control a Robotized Wheelchair
Guilherme Pereira, Suzana Mota, Dandara Andrade, Eric Rohmer
Computer Engineering and Industrial Automation Department - DCA
School of Electrical and Computer Engineering - FEEC
University of Campinas - UNICAMP
{gpereira, suzanavm, dandrade, eric}@dca.fee.unicamp.br
13º Simpósio Brasileiro de Automação Inteligente
Porto Alegre, RS, October 2017
2. Summary
1 Introduction
2 System Overview
3 Methodology
4 Results
5 Conclusions
4. Introduction
Problem
People paralyzed from the neck down want to recover their mobility and autonomy in their daily lives.
General objective
Compare two Human Machine Interfaces (HMIs) for controlling a robotized wheelchair using head displacement and facial expressions.
5. Literature Review
HMI approaches from the literature: a brain-actuated wheelchair (ITURRATE et al., 2009), eye-movement control (GAUTAM et al., 2014), tongue control (KIM et al., 2013), multimodal control using contextual information (ESCOBEDO et al., 2013), and a platform supporting multiple control strategies for assistive robots (ROHMER et al., 2015).
Figure 1: Different HMI solutions to control robotized wheelchairs.
6. Proposal and Contributions
Proposal
Friendly HMIs based on computer vision techniques to control a
robotized wheelchair.
Contributions
Development of JoyFace: a new low-cost approach utilizing
computer vision to control a robotized wheelchair.
Comparison of the JoyFace and RealSense HMIs to identify the upsides and downsides of each approach.
8. System Overview
Wheelchair movement | JoyFace    | RealSense
Go forward          | head up    | kiss
Turn right          | head right | eyebrows up
Turn left           | head left  | mouth open
Stop                | head down  | smile
[Figure 2 diagram: on the computer, the JoyFace or RealSense HMI feeds a MATLAB application over UDP; the application reaches the robotized wheelchair's embedded control over HTTP through an access point. The wheelchair also carries an emergency push button and a laser rangefinder.]
Figure 2: The HMIs control the robotized wheelchair using high-level commands.
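The deck gives only the command table and the transport (UDP between the HMI and the MATLAB application); as a hedged illustration of that path, the Python sketch below forwards a recognized head posture or facial expression as a high-level command over UDP. The command strings, host, and port are assumptions for illustration, not the paper's actual protocol.

import socket

# Hypothetical mapping from the recognized inputs in the table above to
# high-level commands (the command names are assumptions).
COMMANDS = {
    "head up": "GO_FORWARD",    "kiss": "GO_FORWARD",
    "head right": "TURN_RIGHT", "eyebrows up": "TURN_RIGHT",
    "head left": "TURN_LEFT",   "mouth open": "TURN_LEFT",
    "head down": "STOP",        "smile": "STOP",
}

def send_command(gesture, host="127.0.0.1", port=5005):
    """Forward one recognized input as a high-level UDP command."""
    command = COMMANDS.get(gesture, "STOP")  # fail safe: unknown input stops
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.sendto(command.encode("utf-8"), (host, port))

send_command("head up")  # e.g. JoyFace detected the head tilted up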
9. JoyFace
Figure 3: JoyFace head-posture commands (head up, head down, head left, head right) used to control the robotized wheelchair.
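JoyFace effectively turns the face into a joystick; since the deck does not include its code, the sketch below is a hedged reconstruction using OpenCV's stock Viola-Jones face cascade (Viola and Jones appear in the references). The dead zone, detector parameters, and the assumption of a mirrored camera preview are all illustrative, not the authors' implementation.

import cv2

# Minimal sketch of a JoyFace-style "face as joystick": the face-center
# offset from the image center selects one of the four commands.
face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")

def head_command(frame, dead_zone=0.15):
    """Map the face-center offset from the image center to a command."""
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.3, minNeighbors=5)
    if len(faces) == 0:
        return None                                 # no face: issue no command
    x, y, w, h = faces[0]
    img_h, img_w = gray.shape
    dx = (x + w / 2) / img_w - 0.5                  # horizontal offset in [-0.5, 0.5]
    dy = (y + h / 2) / img_h - 0.5                  # vertical offset in [-0.5, 0.5]
    if abs(dx) < dead_zone and abs(dy) < dead_zone:
        return None                                 # neutral zone: no new command
    if abs(dy) >= abs(dx):
        return "GO_FORWARD" if dy < 0 else "STOP"   # head up / head down
    return "TURN_RIGHT" if dx > 0 else "TURN_LEFT"  # head right / left (mirrored view)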
10. RealSense
Figure 4: Intel RealSense SDK sample used to detect the facial expressions (kiss = front, eyebrows up = right, mouth open = left, smile = stop) and control the wheelchair.
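A hedged sketch of the expression-to-command mapping in Figure 4 follows. The actual Intel RealSense SDK calls are deliberately not reproduced here; the `Expressions` dataclass is a hypothetical stand-in for whatever per-frame boolean flags the SDK's face-tracking sample exposes.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Expressions:           # hypothetical detection result for one frame
    kiss: bool = False
    eyebrows_up: bool = False
    mouth_open: bool = False
    smile: bool = False

def realsense_command(e: Expressions) -> Optional[str]:
    """Translate detected expressions into the deck's command set."""
    if e.smile:              # stop first: the safety-critical command wins
        return "STOP"
    if e.kiss:
        return "GO_FORWARD"
    if e.eyebrows_up:
        return "TURN_RIGHT"
    if e.mouth_open:
        return "TURN_LEFT"
    return None              # nothing detected: keep the current state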
12. Experimental Procedure
Test JoyFace and RealSense with ten healthy subjects (9 men and 1 woman).
Each subject navigates an indoor corridor with obstacles using one HMI at a time.
Experiment steps with each volunteer:
1 Training navigation with the first HMI (JoyFace or RealSense);
2 Navigation to collect data;
3 Training navigation with the other HMI;
4 Navigation to collect data;
5 Answer the questionnaire.
Figure 5: Corridor with obstacles (start and end points marked).
13. Questionnaire
Inspired by the NASA-TLX methodology.
Based on your experience with JoyFace and RealSense:
Which HMI demanded more mental effort?
Which HMI demanded more physical effort?
Which HMI offered more security?
Which HMI was easier to use?
15. Results
[Figure 6 bar chart: percentage (%) of subjects per scale (mental demand, physical demand, security, facility), comparing JoyFace and RealSense.]
Figure 6: The chart shows the subjects' impressions comparing JoyFace and RealSense regarding the four aspects of the questionnaire.
16. Results
Table 1: Lap times (in minutes) and number of emergency stops the subjects had to make due to collisions, imminent collisions, or even panic during navigation with the JoyFace and RealSense HMIs.

Subject | JoyFace time | JoyFace stops | RealSense time | RealSense stops
1       | 01:37        | 0             | 03:25          | 3
2       | 03:17        | 1             | 03:06          | 0
3       | 01:44        | 0             | 06:07          | 2
4       | 04:17        | 4             | 03:58          | 4
5       | 01:34        | 0             | 02:35          | 1
6       | 02:07        | 0             | 05:10          | 0
7       | 04:56        | 0             | 02:50          | 0
8       | 02:03        | 0             | 04:21          | 2
9       | 03:42        | 0             | 02:47          | 0
10      | 01:59        | 0             | 03:30          | 3
Avg.    | 02:43        | 0.5           | 03:46          | 1.5
17. Results
[Figures 7 and 8: trajectory plots, X (m) from 0 to 8 and Y (m) from -4 to 4, each marking the start point and the trajectory.]
Figure 7: Trajectory performed by a trained subject using JoyFace (1'22").
Figure 8: Trajectory performed by a trained subject using RealSense (1'36").
19. Conclusions
JoyFace
Safer
Easier to use
Requires less mental effort
Demands more physical effort than RealSense
RealSense
Does not rely on a feedback screen
Requires less physical effort than JoyFace
Suitable for people with limited head movements
20. Future Work
The Midas Touch problem affects both HMIs
Use an extra command to lock/unlock the system (see the sketch after this list)
JoyFace: implement long-smile detection with OpenCV
RealSense: use its long-smile detection
RealSense: adaptive facial-expression detection
Improve the experimental procedure using the full NASA-TLX methodology
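The list proposes a long smile as the lock/unlock command; as a hedged sketch of how that could work with OpenCV (as the JoyFace item suggests), the code below toggles a lock flag only after a smile has been held continuously for a threshold time. The cascades are OpenCV's stock ones, while the 2-second hold and the detector parameters are assumptions, not the authors' implementation.

import time
import cv2

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
smile_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_smile.xml")

HOLD_SECONDS = 2.0    # how long the smile must be held to toggle the lock
locked = True         # start locked so stray expressions cannot move the chair
smile_since = None    # timestamp of the first frame of the current smile

def update_lock(frame):
    """Toggle the interface lock after a smile held for HOLD_SECONDS."""
    global locked, smile_since
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    smiling = False
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        lower_face = gray[y + h // 2:y + h, x:x + w]   # smiles live in the lower half
        if len(smile_cascade.detectMultiScale(lower_face, 1.7, 20)) > 0:
            smiling = True
            break
    if not smiling:
        smile_since = None                    # smile interrupted: restart the timer
    elif smile_since is None:
        smile_since = time.monotonic()        # first smiling frame: start the timer
    elif time.monotonic() - smile_since >= HOLD_SECONDS:
        locked = not locked                   # long smile completed: toggle the lock
        smile_since = None                    # require a fresh smile for the next toggle
    return locked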
22. References I
NASA, “Nasa-tlx.” http://www.nasatlx.com/, 2011. Accessed 2017-03-20.
“Intel RealSense SDK.” https://software.intel.com/en-us/intel-realsense-sdk. Accessed 2017-04-18.
A. Júnior, “Robotização de uma cadeira de rodas motorizada: arquitetura, modelos, controle e aplicações” [Robotization of a motorized wheelchair: architecture, models, control and applications], Master's thesis, School of Electrical and Computer Engineering, FEEC, UNICAMP, 2016.
World Health Organization, World report on disability.
World Health Organization, 2011.
23. References II
IBGE, “Cartilha do Censo 2010: Pessoas com deficiência” [2010 Census booklet: people with disabilities], Brasília: Secretaria de Direitos Humanos da Presidência da República (SDH)/Secretaria Nacional de Promoção dos Direitos da Pessoa com Deficiência (SNPD), 2010.
P. Viola and M. Jones, “Robust real-time object detection,”
International Journal of Computer Vision, vol. 4, no. 34–47, 2001.
R. Chauhan, Y. Jain, H. Agarwal, and A. Patil, “Study of
implementation of voice controlled wheelchair,” in Advanced
Computing and Communication Systems (ICACCS), 2016 3rd
International Conference on, vol. 1, pp. 1–4, IEEE, 2016.
24. References III
C. P. Papageorgiou, M. Oren, and T. Poggio, “A general framework for object detection,” in Sixth International Conference on Computer Vision, pp. 555–562, IEEE, 1998.
J. Kim, H. Park, J. Bruce, E. Sutton, D. Rowles, D. Pucci,
J. Holbrook, J. Minocha, B. Nardone, D. West, et al., “The tongue
enables computer and wheelchair control for people with spinal cord
injury,” Science translational medicine, vol. 5, no. 213,
p. 213ra166, 2013.
E. Rohmer, P. Pinheiro, K. Raizer, et al., “A novel platform
supporting multiple control strategies for assistive robots,” in Robot
and Human Interactive Communication (RO-MAN), 2015 24th
IEEE International Symposium on, pp. 763–769, IEEE, 2015.
25. References IV
E. Rohmer, P. Pinheiro, E. Cardozo, et al., “Laser based driving
assistance for smart robotic wheelchairs,” in Emerging Technologies
& Factory Automation (ETFA), 2015 IEEE 20th Conference on,
pp. 1–4, IEEE, 2015.
R. C. Simpson, “Smart wheelchairs: A literature review,” Journal of
rehabilitation research and development, vol. 42, no. 4, p. 423, 2005.
R. E. Cowan, B. J. Fregly, M. L. Boninger, L. Chan, M. M.
Rodgers, and D. J. Reinkensmeyer, “Recent trends in assistive
technology for mobility,” Journal of neuroengineering and
rehabilitation, vol. 9, no. 1, p. 20, 2012.
26. References V
C. Escolano, J. M. Antelis, and J. Minguez, “A telepresence mobile
robot controlled with a noninvasive brain–computer interface,”
IEEE Transactions on Systems, Man, and Cybernetics, Part B
(Cybernetics), vol. 42, no. 3, pp. 793–804, 2012.
I. Iturrate, J. M. Antelis, A. Kubler, and J. Minguez, “A
noninvasive brain-actuated wheelchair based on a p300
neurophysiological protocol and automated navigation,” IEEE
Transactions on Robotics, vol. 25, no. 3, pp. 614–627, 2009.
R. Souza, F. Pinho, L. Olivi, and E. Cardozo, “A restful platform
for networked robotics,” in Ubiquitous Robots and Ambient
Intelligence (URAI), 2013 10th International Conference on,
pp. 423–428, IEEE, 2013.
27. References VI
R. J. Jacob, “Eye tracking in advanced interface design,” Virtual
environments and advanced interface design, pp. 258–288, 1995.
X. Huo and M. Ghovanloo, “Evaluation of a wireless wearable
tongue–computer interface by individuals with high-level spinal cord
injuries,” Journal of neural engineering, vol. 7, no. 2, p. 026008,
2010.
G. Gautam, G. Sumanth, K. Karthikeyan, S. Sundar, and
D. Venkataraman, “Eye movement based electronic wheel chair for
physically challenged persons,” Int. J. Sci. Technol. Res, vol. 3,
no. 2, 2014.
28. References VII
A. Escobedo, A. Spalanzani, and C. Laugier, “Multimodal control of
a robotic wheelchair: Using contextual information for usability
improvement,” in Intelligent Robots and Systems (IROS), 2013
IEEE/RSJ International Conference on, pp. 4262–4267, IEEE, 2013.