Survey on Human-Computer Interaction for Disabled Persons

Muhammad Bilal
Department of Computer Science
Air University, E9, Islamabad
140666@students.au.edu.pk

Muhammad Usman Ghani
Department of Computer Science
Air University, E9, Islamabad
140681@students.au.edu.pk
Abstract - At present, many solutions facilitate the interaction of handicapped people with computers. In recent years there has been increased interest in human-computer interaction systems that allow more natural communication with machines. Such systems are especially important for elderly and disabled persons. In this paper we survey ten different techniques, analyze how each of them works, and compare them on the basis of a number of features that give insight into their effectiveness.
Index Terms - HCI (Human-Computer Interaction), cursor control, disabled people, facial recognition
I. INTRODUCTION
HCI can be described as the point of communication between the human user and the computer. Typical input devices used nowadays for communication with the machine are the mouse, keyboard, trackball, touchpad and touch screen. All of these interfaces require manual control and cannot be used by persons with impaired movement. This creates the need for alternative methods of communication between human and computer that are suitable for disabled persons, a need which has attracted the attention of researchers all over the world. For severely disabled persons whose movement is limited to the muscles around the eyes, the most suitable systems are those that can be controlled by eye blinks, since blinking is the last voluntary action over which such a person loses control [1]. Eye blinks can be classified into three types: voluntary, reflexive and spontaneous. Spontaneous blinking occurs without external stimuli or internal effort. A reflex blink occurs in response to an external stimulus, such as an object that appears rapidly in front of the eye. A voluntary blink has a larger amplitude than a reflex blink and involves all three divisions of the orbicularis oculi muscle. Eye-movement or eye-blink controlled HCI systems are very useful for persons who cannot speak or use their hands to communicate.
These systems mostly use the technique of electrooculography (EOG). A number of eye-blink and face detection techniques have been developed. They can be divided into contact and non-contact methods: contact methods are those in which there is a physical connection between user and machine. Another classification divides eye-blink techniques into vision-based and non-vision-based. Non-vision methods include electrooculography (EOG) and high-frequency transceivers, while vision-based systems are designed for disabled users who are capable of blinking voluntarily. Most techniques use two hardware components, a web camera and a microphone. The web camera is pointed at the face of the person using the computer. The microphone captures the user's voice and detects voice commands, which are used to issue mouse actions such as click, double click and right click; drag and scroll events can also be performed. Finally, mouse events are passed to the operating system through a system API, which on Windows is the Win32 API.
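As an illustration of the last step, the mapping from recognized voice commands to low-level mouse events can be sketched as follows. The command names and the `send_to_os` stub are illustrative; a real Windows implementation would forward the events through the Win32 API.

```python
def send_to_os(event):
    """Placeholder for the platform call (the Win32 API on Windows)."""
    print(f"OS event: {event}")

# Map recognized voice commands to the mouse events listed above.
COMMANDS = {
    "click":        ["left_down", "left_up"],
    "double click": ["left_down", "left_up", "left_down", "left_up"],
    "right click":  ["right_down", "right_up"],
    "drag":         ["left_down"],   # held until a "drop" command
    "drop":         ["left_up"],
}

def dispatch(command):
    """Translate one recognized voice command into low-level mouse events."""
    events = COMMANDS.get(command.lower())
    if events is None:
        return []          # unknown phrase: ignore rather than mis-click
    for e in events:
        send_to_os(e)
    return events

dispatch("click")        # issues left_down, left_up
dispatch("right click")  # issues right_down, right_up
```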
The purpose of this survey is to gain insight into the techniques already in use by physically disabled persons: how these techniques work, how they are used, and how they contribute to making man-machine interaction easy and effective.
In this paper, we discuss ten different techniques that use methods designed specifically to improve the interaction experience between a user and a machine. The techniques highlighted in this paper serve all those physically disabled persons whose disability prevents them from using a computer. Furthermore, we have highlighted a few parameters in the comparison table to illustrate the differences between the techniques.
The paper is divided into the following sections: Section 2 presents the literature review, Section 3 presents the comparison parameters and analysis, and Section 4 draws conclusions.
II. LITERATURE REVIEW
In [2], the authors present an alternative to the conventional mouse using computer vision and voice recognition as key technologies. The user moves the mouse pointer with the relative movements of his or her nose and performs other mouse actions by voice commands. Enhanced template matching is used to track the nose position, and background lighting changes are handled using adaptive thresholds. Voice interaction is achieved through Speech API calls, and mouse events are passed to the system using system API calls. The only additional hardware components used are a web camera and a microphone, which are built into most laptop computers; external microphones and web cameras are also available at very affordable prices.
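The template matching step in [2] can be illustrated on a tiny grayscale image: the template is slid over the image and the position with the smallest sum of squared differences (SSD) wins. Real systems operate on webcam frames and add adaptive thresholds for lighting; the image and template below are made-up values that keep the idea visible.

```python
def ssd(image, template, top, left):
    """Sum of squared differences between the template and an image patch."""
    return sum(
        (image[top + i][left + j] - template[i][j]) ** 2
        for i in range(len(template))
        for j in range(len(template[0]))
    )

def match_template(image, template):
    """Return the (row, col) of the best-matching patch position."""
    h, w = len(template), len(template[0])
    positions = [
        (r, c)
        for r in range(len(image) - h + 1)
        for c in range(len(image[0]) - w + 1)
    ]
    return min(positions, key=lambda p: ssd(image, template, *p))

image = [
    [0, 0, 0, 0],
    [0, 9, 8, 0],
    [0, 7, 9, 0],
    [0, 0, 0, 0],
]
template = [[9, 8], [7, 9]]
print(match_template(image, template))  # -> (1, 1)
```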
In [3], the authors present a vision- and feature-based system for detecting long voluntary eye blinks and interpreting blink patterns for communication between man and machine. Supplemented by a mechanism for detecting multiple eye blinks, the paper provides a complete solution for building intelligent hands-free input devices. Due to the recent increase in computing power and decrease in camera cost, it has become very common to see a camera on top of a computer monitor. The described technique uses off-the-shelf cameras that allow nose features, eyebrows and head position to be tracked robustly and precisely in both 2D and 3D coordinates. This tracking allows the user to give input to the computer and access the entire system in a hands-free manner.
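The blink-pattern interpretation in [3] can be sketched as a two-stage filter: discard short (assumed spontaneous) blinks, then map the count of long voluntary blinks to a command. The duration threshold and the command names below are illustrative assumptions, not values from the paper.

```python
LONG_MS = 300          # blinks shorter than this are treated as spontaneous

def voluntary(blinks_ms):
    """Keep only blinks long enough to be deliberate."""
    return [d for d in blinks_ms if d >= LONG_MS]

PATTERNS = {
    1: "select",       # one long blink
    2: "open_menu",    # two long blinks in a row
    3: "scroll",       # three long blinks
}

def decode(blinks_ms):
    """Map a burst of measured blink durations (ms) to a command."""
    n = len(voluntary(blinks_ms))
    return PATTERNS.get(n, "none")

print(decode([120, 450, 90]))    # one long blink  -> select
print(decode([400, 350, 100]))   # two long blinks -> open_menu
```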
In [4], the authors present a novel wearable single-channel electrooculography (EOG) based human-computer interface (HCI) with a simple system design and robust performance. In this system, EOG control signals are generated from double eye blinks, collected by a commercial wearable device (the NeuroSky MindWave headset), and then converted into a sequence of commands that control cursor navigation and actions. The EOG-based cursor control system was tested on 8 subjects in indoor and outdoor environments; the average accuracy is 84.42% for indoor use and 71.50% for outdoor use. Compared with other existing EOG-based HCI systems, this system is highly user-friendly and does not require any training. It therefore has the potential to provide an easy-to-use and cheap assistive technique for locked-in patients who have lost most of their muscular abilities but retain proper eye condition.
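A toy version of the double-blink detection used in [4] can be written as threshold crossing on a sampled EOG-like signal: each blink shows up as a peak, and two peaks within a short window count as one "double blink" command. The threshold, window and synthetic signal are illustrative values.

```python
THRESHOLD = 50.0       # amplitude above baseline taken as a blink
WINDOW = 10            # max samples between the two blinks of a pair

def blink_onsets(signal):
    """Indices where the signal rises through the threshold."""
    return [
        i for i in range(1, len(signal))
        if signal[i - 1] < THRESHOLD <= signal[i]
    ]

def double_blinks(signal):
    """Pairs of consecutive blink onsets close enough to form one command."""
    onsets = blink_onsets(signal)
    return [(a, b) for a, b in zip(onsets, onsets[1:]) if b - a <= WINDOW]

# Two blinks 6 samples apart, then an isolated blink much later.
sig = [0, 0, 80, 0, 0, 0, 0, 0, 90, 0] + [0] * 20 + [85, 0]
print(double_blinks(sig))  # -> [(2, 8)]
```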
In [5], the authors present a system called Vocal Mouse (VM). It allows users to continuously control the mouse pointer using words as well as sounds, by varying vocal parameters such as vowel quality, loudness and pitch. The traditional method of using only standard spoken words is inefficient for continuous tasks, and such words are often recognized poorly by automatic speech recognizers. VM supports both continuous and discrete motion control, with commands given as words or as regular sounds consisting of vowels and consonants. Low-level acoustic features are extracted in real time using LPC (Linear Predictive Coding). Pattern recognition is performed using a newly proposed "minimum feature distance" technique, based on calculating distances between the spoken word and each word stored in the library during the training process. Features from the pattern recognition module are processed to produce output in the form of 2-D cursor movement. VM can be used by novice users without extensive training and presents a viable alternative to existing speech-based cursor control methods.
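The "minimum feature distance" idea in [5] amounts to nearest-neighbour matching: each stored word has a feature vector (LPC coefficients in the paper), and a spoken input is assigned to the stored word whose features are closest. The library contents and feature values below are made up for illustration.

```python
import math

# Hypothetical training library: word -> feature vector.
LIBRARY = {
    "up":    [0.9, 0.1, 0.3],
    "down":  [0.1, 0.8, 0.2],
    "click": [0.4, 0.4, 0.9],
}

def distance(a, b):
    """Euclidean distance between two feature vectors."""
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def recognize(features):
    """Return the stored word with minimum feature distance to the input."""
    return min(LIBRARY, key=lambda w: distance(LIBRARY[w], features))

print(recognize([0.85, 0.15, 0.25]))  # -> up
```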
In [6], the authors present a vision-based human-computer interface that detects eye movements and interprets them as cursor control commands. The image processing pipeline uses a webcam for face detection and template matching for eye region detection; Haar features are used for eye feature extraction, and an SVM classifier distinguishes the eye movements. The classified movements (eye open, eye close, eyeball left and eyeball right) are mapped to cursor movement up, down, left and right respectively, and the circular Hough transform is used to control the cursor movements. The interface runs on a notebook equipped with a typical web camera and requires no extra light sources, allowing physically challenged persons to operate computers effectively with their eye movements.
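The class-to-action mapping in [6] pairs four eye states with four cursor directions. A minimal dispatch of that mapping might look as follows; the step size and label spellings are assumptions, and the paper's SVM would supply the labels.

```python
STEP = 10  # pixels moved per recognized eye movement (assumed value)

DIRECTIONS = {
    "eye_open":      (0, -STEP),   # cursor up ("top")
    "eye_close":     (0,  STEP),   # cursor down ("bottom")
    "eyeball_left":  (-STEP, 0),   # cursor left
    "eyeball_right": (STEP, 0),    # cursor right
}

def move_cursor(pos, label):
    """Apply one classified eye movement to an (x, y) cursor position."""
    dx, dy = DIRECTIONS.get(label, (0, 0))
    return (pos[0] + dx, pos[1] + dy)

pos = (100, 100)
for label in ["eyeball_right", "eyeball_right", "eye_open"]:
    pos = move_cursor(pos, label)
print(pos)  # -> (120, 90)
```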
In [7], the author presents a real-time human-computer interaction system based on hand gestures. The system consists of three components: hand detection, gesture recognition, and human-computer interaction (HCI) based on the recognized gestures, and it achieves robust control of mouse and keyboard events with high gesture recognition accuracy. Specifically, a convolutional neural network (CNN) is used to recognize gestures, making it possible to identify relatively complex gestures using only one cheap monocular camera. A Kalman filter is introduced to estimate the hand position, on which the mouse cursor control is based, so that the cursor moves in a stable and smooth way. During the HCI stage, a simple strategy avoids false recognition caused by noise (mostly transient, false gestures) and thus improves the reliability of interaction. The system is highly extendable and can be used in human-robot or other human-machine interaction scenarios with command formats more complex than mouse and keyboard events.
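The cursor-smoothing role of the Kalman filter in [7] can be sketched per axis with a scalar filter: each noisy position measurement nudges the estimate by an amount governed by the Kalman gain. The noise variances q and r below are illustrative tuning values, not the paper's.

```python
def kalman_1d(measurements, q=1e-3, r=0.5):
    """Smooth a sequence of noisy 1-D position measurements."""
    x, p = measurements[0], 1.0   # state estimate and its variance
    smoothed = [x]
    for z in measurements[1:]:
        p = p + q                 # predict: uncertainty grows over time
        k = p / (p + r)           # Kalman gain: trust in the new measurement
        x = x + k * (z - x)       # update the estimate toward the measurement
        p = (1 - k) * p           # uncertainty shrinks after the update
        smoothed.append(x)
    return smoothed

noisy = [100, 104, 98, 103, 99, 101]
out = kalman_1d(noisy)
print([round(v, 1) for v in out])  # jitter is damped relative to the input
```

In the paper's setting one such filter runs for each of the x and y hand coordinates before the cursor is moved.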
In [8], the authors present a framework for computer control without the need for special PC software or drivers. The framework is based on a tongue control system recently developed at the Centre for Sensory-Motor Interaction (SMI), Aalborg University. It emulates a standard USB keyboard and mouse, allowing tongue control of any computer using the standard USB drivers available in all modern operating systems.
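Emulation of this kind works because the OS already understands standard USB HID mouse reports. A boot-protocol mouse report is 3 bytes (a button bitmap plus two signed 8-bit deltas), so building one is plain byte packing; the helper below is a sketch of that format, not code from [8].

```python
import struct

def mouse_report(left=False, right=False, middle=False, dx=0, dy=0):
    """Build a 3-byte HID boot-protocol mouse report."""
    buttons = (left << 0) | (right << 1) | (middle << 2)
    return struct.pack("<Bbb", buttons, dx, dy)  # buttons, signed dx, signed dy

# Left button held while moving 5 right and 3 up (negative y is up in HID).
report = mouse_report(left=True, dx=5, dy=-3)
print(report.hex())  # -> 0105fd
```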
In [9], the authors present IPFM (intelligent pressure foot-mouse), a shoe-based device for controlling a computer. It is composed of five parts: a pressure sensor (FSR) embedded in the shoe, a 3-axis acceleration sensor at the front of the shoe to measure foot motion, a computing module (FPGA), a wireless communication module, and a visualization module (LabVIEW). Together these form a new input device that can be used by people who have difficulty using their hands to operate computers or other devices.
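A toy mapping from the shoe's pressure readings to mouse actions conveys the idea behind [9]. The sensor names, threshold and action assignments are illustrative assumptions; the real device fuses FSR and accelerometer data on an FPGA.

```python
PRESS_THRESHOLD = 600   # ADC counts taken as a deliberate press (assumed)

def foot_action(toe, heel):
    """Interpret one pair of pressure readings as a mouse action."""
    toe_down = toe > PRESS_THRESHOLD
    heel_down = heel > PRESS_THRESHOLD
    if toe_down and heel_down:
        return "double_click"
    if toe_down:
        return "left_click"
    if heel_down:
        return "right_click"
    return "idle"

print(foot_action(toe=800, heel=100))   # -> left_click
print(foot_action(toe=700, heel=900))   # -> double_click
```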
In [10], the authors present a system that moves the mouse pointer and performs mouse operations such as left click, right click, double click and drag using gesture recognition. Recognizing gestures is a complex task involving motion modelling, motion analysis, pattern recognition and machine learning. Keeping these factors in mind, a system has been created that recognizes the movement of the fingers and the various patterns they form. Colour caps are worn on the fingers to distinguish them from background colours such as skin colour.
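The colour-cap trick in [10] reduces fingertip detection to colour segmentation: pixels matching the cap colour are collected and their centroid gives the fingertip position. The toy frame and the crude RGB test below are illustrative (real systems usually threshold in HSV on webcam images).

```python
def is_cap_colour(pixel):
    """A crude test for a bright-red cap on an (r, g, b) pixel."""
    r, g, b = pixel
    return r > 200 and g < 80 and b < 80

def fingertip(frame):
    """Centroid (row, col) of cap-coloured pixels, or None if none found."""
    hits = [
        (r, c)
        for r, row in enumerate(frame)
        for c, pixel in enumerate(row)
        if is_cap_colour(pixel)
    ]
    if not hits:
        return None
    return (
        sum(r for r, _ in hits) / len(hits),
        sum(c for _, c in hits) / len(hits),
    )

skin = (210, 160, 120)   # rejected: green and blue channels too high
red = (250, 30, 40)      # accepted as the cap colour
frame = [
    [skin, skin, skin],
    [skin, red, red],
    [skin, skin, skin],
]
print(fingertip(frame))  # -> (1.0, 1.5)
```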
In [11], the authors present a new approach to controlling personal computers with high efficiency. The three main features of this interface are nose-tracking cursor control, automatic brightness control, and display control based on the presence and detection of a valid human face. The proposed system is low cost and exhibits inherent security and power-saving capabilities.
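The display-control feature of [11], which blanks the screen when no valid face is present, can be sketched as a debounced state machine over per-frame face-detection results. The frame count and state names are assumptions for illustration.

```python
ABSENT_FRAMES = 3   # consecutive no-face frames before blanking the display

def display_states(face_seen_per_frame):
    """Return the display state ("on"/"off") after each camera frame."""
    missing, states = 0, []
    for seen in face_seen_per_frame:
        missing = 0 if seen else missing + 1
        states.append("off" if missing >= ABSENT_FRAMES else "on")
    return states

frames = [True, True, False, False, False, True]
print(display_states(frames))  # -> ['on', 'on', 'on', 'on', 'off', 'on']
```

Debouncing over several frames keeps a single missed detection from flickering the display, which is why the state machine counts consecutive absences rather than reacting to one frame.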
III. COMPARISON PARAMETERS
We have compared the surveyed techniques on the basis of the following parameters:
Efficiency: A comparison of what is actually produced or performed with what can be achieved with the same consumption of resources; an important factor in determining productivity.
Accuracy: The quality or state of being correct or precise.
Sensor: A device which detects or measures a physical property and records, indicates, or otherwise responds to it. Sensors can be of many types:
- Temperature sensor
- Proximity sensor
- IR sensor
- Pressure sensor (used in the IPFM technique)
- Light sensor (to control illumination in many techniques)
- Touch sensor
Cost: This parameter indicates which techniques are costly and which are cheap. For example, IPFM (intelligent pressure foot-mouse) is costly because of the sensors fitted in the shoe, while a facial recognition technique, in which a camera placed in front of the user records eye movement, is cheap.
Facial Recognition: A facial recognition system is a technology capable of identifying or verifying a person from a digital image. This parameter indicates which techniques include facial recognition and which do not.
Gesture Recognition: Gesture recognition is the mathematical interpretation of human motion by a computing device. This parameter indicates which techniques include gesture recognition and which do not.
Eye-Blink Detection: Many techniques treat the eye blink as an input function and trigger mouse click events on it. This parameter compares techniques on the basis of eye-blink detection.
Speech Detection: In some techniques, mouse click events are triggered by voice commands. This parameter identifies the techniques that use speech detection.
IV. CONCLUSION
A survey was conducted in this paper to analyze solutions that facilitate the interaction of handicapped people with computers. An important advantage of almost all of the techniques is that the system needs no prior knowledge of face location or skin colour, nor any special lighting. Many human-computer interfaces for the disabled require additional hardware to be worn by the user, such as special headgear, sensors, or markers; by contrast, systems that only use cameras are cheap, easy to use and completely non-intrusive, and are therefore more user-friendly and easier to configure. In the case of face recognition, it is better to have the light source behind the computer rather than behind the user. Systems that can be used in outdoor environments are preferable, and some techniques need to consider flexibility in order to further improve their performance.
Comparison of the ten techniques against parameters 1-8 (numbered in the order defined in Section III: 1 Efficiency, 2 Accuracy, 3 Sensor, 4 Cost, 5 Facial Recognition, 6 Gesture Recognition, 7 Eye-Blink Detection, 8 Speech Detection):

Technique   1    2    3    4    5    6    7    8
1.          Yes  Yes  Yes  No   No   No   Yes  No
2.          Yes  Yes  Yes  Yes  No   Yes  Yes  No
3.          Yes  Yes  Yes  Yes  Yes  Yes  No   No
4.          No   Yes  Yes  Yes  Yes  Yes  No   Yes
5.          No   Yes  Yes  No   No   No   No   No
6.          Yes  No   Yes  No   No   No   Yes  No
7.          Yes  No   No   No   No   Yes  Yes  No
8.          No   Yes  Yes  No   Yes  No   No   No
9.          Yes  Yes  Yes  Yes  No   Yes  Yes  Yes
10.         No   No   Yes  No   No   Yes  No   No
REFERENCES
[1] R. Ruddarraju, et al, Perceptual user interfaces using vision-based eye
tracking, in: the 5th international conference on Multimodal interfaces,
Vancouver, British Columbia, Canada, 2003, pp. 227 – 233.
[2] S. K. Chathuranga, K. C. Samarawickrama, H. M. L. Chandima, K. G. T. D. Chathuranga and A. M. H. S. Abeykoon, "Hands free interface for Human Computer Interaction," 2010 Fifth International Conference on Information and Automation for Sustainability, Colombo, 2010, pp. 359-364. doi: 10.1109/ICIAFS.2010.5715687.
[3] K. Parmar, B. Mehta and R. Sawant, "Facial-feature based Human-Computer Interface for disabled people," 2012 International Conference on Communication, Information & Computing Technology (ICCICT), Mumbai, 2012, pp. 1-5. doi: 10.1109/ICCICT.2012.6398171.
[4] A. M. S. Ang, Z. G. Zhang, Y. S. Hung and J. N. F. Mak, "A user-
friendly wearable single-channel EOG-based human-computer interface
for cursor control," 2015 7th International IEEE/EMBS Conference on
Neural Engineering (NER), Montpellier, 2015, pp. 565-568. doi:
10.1109/NER.2015.7146685.
[5] Nibha, Vikram Nandal “Mouse control- a new era in human computer
interaction”, ISSN 2320–088X, IJCSMC, Vol. 3, Issue. 7, July 2014.
[6] M.Mangaiyarkarasi and A.Geetha “Cursor control system using facial
expressions for human computer interactions”, ISSN: 0976-1353 Volume
8 Issue 1 - APRIL 2014.
[7] Pei Xu, “A real-time hand gesture recognition and human computer
interaction systems”, arXiv: 1704.07296v1 [cs.CV] 24 Apr 2017.
[8] Morten Enemark Lund, et al, “A framework for mouse and keyboard
emulation in a tongue control system”, 3-6 Sept. 2009.
doi: 10.1109/IEMBS.2009.5334055
[9] Hyun-Min Choi, et al, “IPFM: Intelligent pressure foot-mouse”, Vol.8,
No.5 (2013)
[10] Rachit Puri, et al, “Gesture recognition based mouse events”, Samsung
Research India-Bangalore – 560037, India
[11] Shadman Sakib Khan, et al, “Nose Tracking Cursor Control for the
People with Disabilities: An Improved HCI” 2017 3rd International
Conference on Electrical Information and Communication Technology
(EICT), 7-9 December 2017, Khulna, Bangladesh
Planning a health career 4th Quarter.pptx
 
9953330565 Low Rate Call Girls In Rohini Delhi NCR
9953330565 Low Rate Call Girls In Rohini  Delhi NCR9953330565 Low Rate Call Girls In Rohini  Delhi NCR
9953330565 Low Rate Call Girls In Rohini Delhi NCR
 
Romantic Opera MUSIC FOR GRADE NINE pptx
Romantic Opera MUSIC FOR GRADE NINE pptxRomantic Opera MUSIC FOR GRADE NINE pptx
Romantic Opera MUSIC FOR GRADE NINE pptx
 
Proudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptxProudly South Africa powerpoint Thorisha.pptx
Proudly South Africa powerpoint Thorisha.pptx
 
Earth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice greatEarth Day Presentation wow hello nice great
Earth Day Presentation wow hello nice great
 
How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17How to Configure Email Server in Odoo 17
How to Configure Email Server in Odoo 17
 
What is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERPWhat is Model Inheritance in Odoo 17 ERP
What is Model Inheritance in Odoo 17 ERP
 
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptxECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
ECONOMIC CONTEXT - PAPER 1 Q3: NEWSPAPERS.pptx
 
Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17Computed Fields and api Depends in the Odoo 17
Computed Fields and api Depends in the Odoo 17
 
Introduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptxIntroduction to AI in Higher Education_draft.pptx
Introduction to AI in Higher Education_draft.pptx
 
Roles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in PharmacovigilanceRoles & Responsibilities in Pharmacovigilance
Roles & Responsibilities in Pharmacovigilance
 
Grade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptxGrade 9 Q4-MELC1-Active and Passive Voice.pptx
Grade 9 Q4-MELC1-Active and Passive Voice.pptx
 
Judging the Relevance and worth of ideas part 2.pptx
Judging the Relevance  and worth of ideas part 2.pptxJudging the Relevance  and worth of ideas part 2.pptx
Judging the Relevance and worth of ideas part 2.pptx
 

Survey on Human Computer Interaction for Disabled Persons

Muhammad Bilal, Department of Computer Science, Air University, E9, Islamabad, 140666@students.au.edu.pk
Muhammad Usman Ghani, Department of Computer Science, Air University, E9, Islamabad, 140681@students.au.edu.pk

Abstract - At present there are many solutions that facilitate the interaction of disabled people with computers. In recent years there has been increased interest in human-computer interaction systems that allow more natural communication with machines. Such systems are especially important for elderly and disabled persons. In this paper we survey ten different techniques and analyze how they work. The techniques are also compared on the basis of several features that give insight into their effectiveness.

Index Terms - HCI (Human Computer Interaction), cursor control, disabled people, facial recognition

I. INTRODUCTION

HCI can be described as the point of communication between the human user and the computer. Typical input devices used nowadays for communication with the machine are the mouse, keyboard, trackball, touchpad and touch screen. All of these interfaces require manual control and cannot be used by persons with impaired movement capacity. This creates the need for alternative methods of communication between human and computer that are suitable for disabled persons, and it has therefore attracted the attention of researchers all over the world. For severely disabled persons whose movement is limited to the muscles around the eyes, the most suitable systems are those controlled by eye blinks, since blinking is the last voluntary action a disabled person loses control of [1]. Eye blinks can be classified into three types: voluntary, reflexive and spontaneous. Spontaneous blinking is done without external stimuli or internal effort.
A reflex blink occurs in response to an external stimulus, such as an object that appears rapidly in front of the eye. A voluntary blink has a larger amplitude than a reflex blink and uses all three divisions of the orbicularis oculi muscle. Eye-movement or eye-blink controlled HCI systems are very useful for persons who cannot speak or use their hands to communicate; such systems mostly use the technique of electrooculography (EOG). A number of eye-blink and face detection techniques have been developed. They can be divided into contact and non-contact methods: contact methods are those in which there is a physical connection between user and machine. Another classification divides eye-blink techniques into vision-based and non-vision-based. Non-vision methods include EOG and high-frequency transceivers, while vision-based systems are designed for disabled users who are capable of blinking voluntarily.

Most techniques use two hardware components, a web camera and a microphone. The web camera is pointed at the face of the person using the computer. The microphone captures the user's voice and detects the voice commands that are used to issue mouse actions such as click, double click and right click; drag and scroll events can also be performed. Finally, mouse events are passed to the operating system through a system API, which on Windows is the Win32 API.

The purpose of this survey is to gain insight into the techniques already in use by physically disabled persons: how these techniques work, how they are used, and how they contribute to making man-machine interaction easy and effective. In this paper we discuss ten different techniques that use methods designed specifically to improve the interaction experience between a user and a machine.
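As a concrete illustration of the command-dispatch step just described, the sketch below maps recognized voice commands to mouse events. The command set and the send_mouse_event stub are hypothetical stand-ins; a real system would forward the events through the OS API (for example Win32 SendInput).

```python
# Minimal sketch of mapping recognized voice commands to mouse events.
# The command names and send_mouse_event are illustrative assumptions;
# a real system would hand the event to the operating system instead.
def send_mouse_event(event, events_log):
    events_log.append(event)  # stand-in for a Win32 API call

COMMANDS = {
    "click": "LEFT_CLICK",
    "double click": "DOUBLE_CLICK",
    "right click": "RIGHT_CLICK",
    "drag": "DRAG",
    "scroll": "SCROLL",
}

def handle_voice_command(text, events_log):
    """Map a recognized utterance to a mouse event; ignore unknown input."""
    event = COMMANDS.get(text.strip().lower())
    if event is not None:
        send_mouse_event(event, events_log)
    return event

log = []
handle_voice_command("Right Click", log)
handle_voice_command("hello", log)  # unrecognized: no event emitted
```

The dictionary lookup keeps the recognizer and the event layer decoupled, so new voice commands can be added without touching the OS-specific code.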
The techniques highlighted in this paper serve physically disabled persons who cannot use a computer because their disability prevents them from interacting with it. Furthermore, we highlight a few parameters in the tables below to illustrate the comparison between techniques. The paper is divided into the following sections: Section II presents the literature review, Section III presents the comparison parameters, and Section IV draws conclusions.

II. LITERATURE REVIEW

In [2], the authors present an alternative to the conventional mouse that uses computer vision and voice recognition as its key technologies. The user moves the mouse pointer with relative movements of the nose and issues other mouse actions by voice commands. Enhanced template matching is used to track the nose position, and changes in background light are handled with adaptive thresholds. Voice interaction is achieved using Speech API calls, and mouse events are passed to the system using system API calls. The only additional hardware components are a web camera and a microphone, which are built into most laptop computers; external microphones and web cameras are available at very affordable prices.
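The template-matching tracking used in [2] can be sketched as an exhaustive sum-of-squared-differences search. This minimal NumPy version is illustrative only, without the enhanced matching or adaptive thresholding the paper adds: it simply finds where a small template best matches a frame.

```python
import numpy as np

# Sketch of template matching for nose tracking: slide the template over
# the frame and return the location with the smallest sum of squared
# differences (SSD). A real tracker would search only near the last
# known position and update the template over time.
def match_template(frame, template):
    fh, fw = frame.shape
    th, tw = template.shape
    best_ssd, best_pos = None, None
    for y in range(fh - th + 1):
        for x in range(fw - tw + 1):
            patch = frame[y:y + th, x:x + tw]
            ssd = float(np.sum((patch - template) ** 2))
            if best_ssd is None or ssd < best_ssd:
                best_ssd, best_pos = ssd, (y, x)
    return best_pos

frame = np.zeros((10, 10))
frame[4:6, 7:9] = 1.0       # bright 2x2 "feature" patch at row 4, col 7
template = np.ones((2, 2))
pos = match_template(frame, template)
```

Restricting the search window around the previous match, as tracking systems do, turns this brute-force scan into a real-time operation.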
In [3], the authors present a vision- and feature-based system for detecting long voluntary eye blinks and interpreting blink patterns for communication between man and machine. Supplemented by a mechanism for detecting multiple eye blinks, the paper provides a complete solution for building intelligent hands-free input devices. Due to the recent increase in computing power and decrease in camera cost, it has become very common to see a camera on top of a computer monitor. The described technique uses off-the-shelf cameras that allow nose features, eyebrows and head position to be tracked robustly and precisely in both 2D and 3D coordinates. This tracking allows the user to give input to the machine and access the entire system in a hands-free manner.

In [4], the authors present a novel wearable single-channel electrooculography (EOG) based human-computer interface with a simple system design and robust performance. EOG control signals are generated from double eye blinks, collected by a commercial wearable device (the NeuroSky MindWave headset), and then converted into a sequence of commands that control cursor navigation and actions. The EOG-based cursor control system was tested on 8 subjects in indoor and outdoor environments, with an average accuracy of 84.42% indoors and 71.50% outdoors. Compared with other existing EOG-based HCI systems, this system is highly user-friendly and does not require any training. It therefore has the potential to provide an easy-to-use and cheap assistive technique for locked-in patients who have lost their main muscular abilities but retain proper eye condition.

In [5], the authors present a system called the Vocal Mouse (VM). This device allows users to continuously control the mouse pointer using words as well as sounds, by varying vocal parameters such as vowel quality, loudness and pitch.
The traditional method of using only standard spoken words was inefficient for continuous tasks, and such words are often recognized poorly by automatic speech recognizers. VM supports both continuous and discrete motion control, with commands given as words or as regular sounds consisting of vowels and consonants. Low-level acoustic features are extracted in real time using linear predictive coding (LPC). Pattern recognition is performed using a newly proposed "minimum feature distance" technique, based on calculating the distance between the spoken word and each word stored in the library during training. The output of the pattern recognition module is processed to produce the cursor's 2-D movement. VM can be used by novice users without extensive training and presents a viable alternative to existing speech-based cursor control methods.

In [6], the authors present a vision-based human-computer interface that detects eye movements and interprets them as cursor control commands. The employed image processing methods include a webcam for detecting the face and template-matching-based eye region detection. The Haar feature technique is used for eye feature extraction, and an SVM classifier categorizes the eye movements: eye open, eye close, eyeball left and eyeball right are mapped to cursor top, bottom, left and right movement respectively. The interface runs on a notebook equipped with a typical web camera and requires no extra light sources. The circular Hough transform is used to control the cursor movements. This method lets physically challenged persons operate computers effectively with their eye movements.
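The minimum feature distance technique of [5] amounts to nearest-template classification in feature space. The sketch below is a minimal illustration under that reading; the three-element feature vectors and the word library are hypothetical stand-ins for the LPC features the paper extracts during training.

```python
import math

# Sketch of minimum-feature-distance classification: compare a spoken
# word's feature vector against a library built at training time and
# return the library entry with the smallest Euclidean distance.
def classify(features, library):
    def dist(a, b):
        return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))
    return min(library, key=lambda word: dist(features, library[word]))

# Illustrative training library: word -> stored feature vector.
library = {
    "left":  [0.9, 0.1, 0.2],
    "right": [0.1, 0.8, 0.3],
    "click": [0.2, 0.2, 0.9],
}
word = classify([0.15, 0.75, 0.35], library)
```

A practical system would also reject inputs whose minimum distance exceeds a threshold, so that background noise does not trigger a command.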
The whole system consists of three components: hand detection, gesture recognition, and human-computer interaction based on the recognition results, and it realizes robust control of mouse and keyboard events with high gesture recognition accuracy. Specifically, a convolutional neural network (CNN) is used to recognize gestures, making it possible to identify relatively complex gestures using only one cheap monocular camera. A Kalman filter estimates the hand position, so that mouse cursor control is realized in a stable and smooth way. During the HCI stage, a simple strategy avoids the false recognition caused by noise, mostly transient false gestures, and thus improves the reliability of interaction. The developed system is highly extendable and can be used in human-robot or other human-machine interaction scenarios with more complex command formats than just mouse and keyboard events.

In [8], the authors present a framework for computer control without the need for special PC software or drivers. The framework is based on a tongue control system recently developed at the Centre for Sensory-Motor Interaction (SMI), Aalborg University. It emulates a standard USB keyboard and mouse, and allows tongue control of any computer using the standard USB drivers available in all modern operating systems.

In [9], the authors present IPFM, an intelligent pressure foot-mouse: a shoe used to control a computer. It is composed of five parts: a pressure sensor (FSR) embedded in the shoe, a 3-axis acceleration sensor at the front of the shoe to measure foot motion, a computing module (FPGA), a wireless communication module, and a visualization module (LabVIEW). From these five parts the authors developed a new input device that can be used by people who have difficulty using their hands to operate computers or other devices.
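The Kalman-filter cursor smoothing described for [7] can be sketched with a one-dimensional constant-velocity model over the state [position, velocity]. The process and measurement noise levels below are illustrative assumptions, not values from the paper.

```python
import numpy as np

# Sketch of Kalman smoothing for cursor positions: a constant-velocity
# model where only the position is observed. q and r (process and
# measurement noise) are illustrative; tuning them trades smoothness
# against responsiveness.
def kalman_smooth(measurements, q=1e-3, r=1.0):
    F = np.array([[1.0, 1.0], [0.0, 1.0]])   # state transition
    H = np.array([[1.0, 0.0]])               # observe position only
    Q = q * np.eye(2)                        # process noise covariance
    R = np.array([[r]])                      # measurement noise covariance
    x = np.array([[measurements[0]], [0.0]])
    P = np.eye(2)
    out = []
    for z in measurements:
        # predict
        x = F @ x
        P = F @ P @ F.T + Q
        # update with the new measurement
        K = P @ H.T @ np.linalg.inv(H @ P @ H.T + R)
        x = x + K @ (np.array([[z]]) - H @ x)
        P = (np.eye(2) - K @ H) @ P
        out.append(float(x[0, 0]))
    return out

noisy = [0.0, 1.2, 1.8, 3.1, 3.9, 5.2]   # jittery detected hand positions
smooth = kalman_smooth(noisy)
```

The same update extends to 2-D cursor control by running one filter per axis or by enlarging the state to [x, y, vx, vy].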
In [10], the authors control the mouse pointer and perform various mouse operations such as left click, right click, double click and drag using a gesture recognition technique.
Recognizing gestures is a complex task involving many aspects such as motion modelling, motion analysis, pattern recognition and machine learning. Keeping these factors in mind, a system has been created that recognizes the movement of the fingers and the various patterns they form. Colour caps are worn on the fingers to distinguish them from background colours such as skin colour.

In [11], the authors present a new approach to controlling personal computers with high efficiency. The three main features of this interface are nose-tracking cursor control, automatic brightness control, and display control based on the presence and detection of a valid human face. The proposed system is low cost and exhibits inherent security and power-saving capabilities.

III. COMPARISON PARAMETERS

We have compared the techniques on the basis of the following parameters:

Efficiency: A comparison of what is actually produced or performed with what could be achieved with the same consumption of resources; an important factor in determining productivity.

Accuracy: The quality or state of being correct or precise.

Sensor: A device that detects or measures a physical property and records, indicates, or otherwise responds to it. Sensors can be of many types:
- Temperature sensor
- Proximity sensor
- IR sensor
- Pressure sensor (used in the IPFM technique)
- Light sensor (to control illumination in many techniques)
- Touch sensor

Cost: This parameter indicates which techniques are costly and which are cheap. For example, IPFM (intelligent pressure foot-mouse) is expensive because of the sensors fitted in the shoe, whereas a facial recognition technique in which a camera in front of the user records eye movement is cheap.

Facial Recognition: A facial recognition system is a technology capable of identifying or verifying a person from a digital image.
This parameter indicates which techniques include facial recognition and which do not.

Gesture Recognition: Gesture recognition is the mathematical interpretation of human motion by a computing device. This parameter indicates which techniques include gesture recognition and which do not.

Eye-Blink Detection: Many techniques treat an eye blink as an input and trigger mouse click events on it. This parameter indicates which techniques use eye-blink detection.

Speech Detection: In some techniques, mouse click events are issued through voice commands. This parameter indicates which techniques use speech detection.

The two tables below compare techniques 1-10 on these parameters, numbered in the order they are introduced above (1: Efficiency, 2: Accuracy, 3: Sensor, 4: Cost, 5: Facial Recognition, 6: Gesture Recognition, 7: Eye-Blink Detection, 8: Speech Detection).

Technique   1    2    3    4
1.          Yes  Yes  Yes  No
2.          Yes  Yes  Yes  Yes
3.          Yes  Yes  Yes  Yes
4.          No   Yes  Yes  Yes
5.          No   Yes  Yes  No
6.          Yes  No   Yes  No
7.          Yes  No   No   No
8.          No   Yes  Yes  No
9.          Yes  Yes  Yes  Yes
10.         No   No   Yes  No

Technique   5    6    7    8
1.          No   No   Yes  No
2.          No   Yes  Yes  No
3.          Yes  Yes  No   No
4.          Yes  Yes  No   Yes
5.          No   No   No   No
6.          No   No   Yes  No
7.          No   Yes  Yes  No
8.          Yes  No   No   No
9.          No   Yes  Yes  Yes
10.         No   Yes  No   No

IV. CONCLUSION

This paper surveyed solutions that facilitate the interaction of disabled people with computers. An important advantage of almost all the techniques is that the system needs no prior knowledge of face location or skin colour, nor any special lighting. Many human-computer interfaces for the disabled require additional hardware to be worn by the user, such as special headgear, sensors or markers, but systems that use only cameras are cheap, easy to use, and completely non-intrusive, and are therefore more user-friendly and easier to configure. In the case of face recognition, it is better to have the light source behind the computer rather than behind the user. Systems that can be used in outdoor environments are preferable. Some techniques need to consider flexibility in order to further improve their performance.
REFERENCES

[1] R. Ruddarraju et al., "Perceptual user interfaces using vision-based eye tracking," in Proc. 5th International Conference on Multimodal Interfaces, Vancouver, British Columbia, Canada, 2003, pp. 227-233.
[2] S. K. Chathuranga, K. C. Samarawickrama, H. M. L. Chandima, K. G. T. D. Chathuranga and A. M. H. S. Abeykoon, "Hands free interface for Human Computer Interaction," 2010 Fifth International Conference on Information and Automation for Sustainability, Colombo, 2010, pp. 359-364. doi: 10.1109/ICIAFS.2010.5715687
[3] K. Parmar, B. Mehta and R. Sawant, "Facial-feature based human-computer interface for disabled people," 2012 International Conference on Communication, Information & Computing Technology (ICCICT), Mumbai, 2012, pp. 1-5. doi: 10.1109/ICCICT.2012.6398171
[4] A. M. S. Ang, Z. G. Zhang, Y. S. Hung and J. N. F. Mak, "A user-friendly wearable single-channel EOG-based human-computer interface for cursor control," 2015 7th International IEEE/EMBS Conference on Neural Engineering (NER), Montpellier, 2015, pp. 565-568. doi: 10.1109/NER.2015.7146685
[5] Nibha and V. Nandal, "Mouse control: a new era in human computer interaction," IJCSMC, vol. 3, issue 7, July 2014. ISSN 2320-088X.
[6] M. Mangaiyarkarasi and A. Geetha, "Cursor control system using facial expressions for human computer interactions," vol. 8, issue 1, April 2014. ISSN 0976-1353.
[7] P. Xu, "A real-time hand gesture recognition and human computer interaction system," arXiv:1704.07296v1 [cs.CV], 24 Apr 2017.
[8] M. E. Lund et al., "A framework for mouse and keyboard emulation in a tongue control system," 3-6 Sept. 2009. doi: 10.1109/IEMBS.2009.5334055
[9] H.-M. Choi et al., "IPFM: Intelligent pressure foot-mouse," vol. 8, no. 5, 2013.
[10] R. Puri et al., "Gesture recognition based mouse events," Samsung Research India, Bangalore 560037, India.
[11] S. S. Khan et al., "Nose tracking cursor control for the people with disabilities: an improved HCI," 2017 3rd International Conference on Electrical Information and Communication Technology (EICT), Khulna, Bangladesh, 7-9 December 2017.