International Research Journal of Engineering and Technology (IRJET) | e-ISSN: 2395-0056 | p-ISSN: 2395-0072
Volume: 09 Issue: 07 | July 2022 | www.irjet.net
© 2022, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal
Implementing Deep Learning Model in Human Computer Interaction
with Face Recognition using Convolutional Neural Network
Sharath B1, Harshith R2, Lankesh J3, Sanjay G R4, Chandru A S5
1Sharath B: Student, Dept. of ISE, NIE-IT, Mysore, Karnataka, India
2Harshith R: Student, Dept. of ISE, NIE-IT, Mysore, Karnataka, India
3Lankesh J: Student, Dept. of ISE, NIE-IT, Mysore, Karnataka, India
4Sanjay G R: Student, Dept. of ISE, NIE-IT, Mysore, Karnataka, India
5Chandru A S (co-author): Assistant Professor, Dept. of ISE, NIE-IT, Mysore, Karnataka, India
---------------------------------------------------------------------***---------------------------------------------------------------------
Abstract – With the advent of technology in this digital age, there is always room for improvement in the field of computing. Hands-free computing is needed today to meet the needs of quadriplegics. This paper introduces a Human Computer Interaction (HCI) system that is very important for amputees and for those who have problems using their hands. The system developed is an eye-based interface that acts as a computer mouse, translating eye and facial movements such as blinking, gazing and winking into the actions of a mouse cursor. The system uses a simple webcam, and its software requirements are Python (3.6), OpenCV, NumPy and a few other packages needed for face recognition. The face detector is constructed using the HOG (Histogram of Oriented Gradients) feature with a linear classifier, together with a sliding-window scheme. No hands and no external computer hardware or sensors are needed.
Key Words: Python (3.6), OpenCV, Human computer interaction, NumPy, face recognition, Histogram of Oriented Gradients (HOG).
1. INTRODUCTION
People who are quadriplegic and unable to talk (for example, due to cerebral palsy, traumatic brain injury, or stroke) often have difficulty expressing their desires, thoughts, and needs. They use their limited voluntary movements to communicate with family, friends, and other care providers. Some people may shake their heads. Some may blink voluntarily. Others may move their eyes or tongue. Assistive technical devices have been developed to help them use these voluntary movements to control computers. People with disabilities can communicate through spelling aids or other add-ons. Some disabled people cannot move anything except their eyes. For these people, eye movement and blinking are the only way to communicate with the outside world through a computer. This study aims to develop a program that can help people with physical disabilities by allowing them to operate a computer system using only their eyes. Human interaction with computers has become an integral part of our daily lives. With the eyes as input, user eye movement can provide a simple, natural and high-bandwidth input source.
Moving a computer mouse or a finger on a trackpad has become the most common way to move the cursor on the screen in modern technology. The system detects mouse or finger movement and maps it to the movement of the cursor. People who do not have functioning arms, such as amputees, are not able to use a mouse with the current technology. However, if the movement of the eyeball is tracked and the direction of the gaze can be determined, that movement can be mapped so that an amputee is able to move the cursor at will. An 'eye tracking mouse' would be very useful for such users. Currently, eye tracking mice are not widely available, and only a few companies have developed this technology and made it available. We aim to build an eye-tracking mouse that provides most of the usual mouse functions, so that the user can move the cursor using his or her eyes. We try to estimate the user's gaze direction and move the cursor to where his or her eyes are trying to focus. People who are blind cannot see the screen, so by adding audio output they can listen to the feedback and act accordingly. Mouse pointing and clicking have remained the common interaction style for a long time. However, a person may find the mouse uncomfortable for some reason, or may be unable to move their hands; such users need a hands-free mouse.
In general, eye movements and facial expressions are the basis of a hands-free mouse.
2. RELATED WORK
This can be achieved in various ways. A camera mouse was suggested by Margrit Betke et al. [1] for people who are quadriplegic and non-verbal. User movements are tracked with a camera and mapped to the movement of the mouse cursor on the screen. Another approach was proposed by Robert Gabriel Lupu et al. [2]: a personal-computer interface that uses a head-mounted device to track eye movements and interpret them on screen. As a further option, Prof. Prashant Salunkhe et al. [3] introduced eye tracking techniques using the Hough transform. A lot of work is being done to improve the features of HCI. A paper by Muhammad Usman Ghani et al. [4] suggests
that eye movements can be read as input and used to help the user access visual cues without using any other hardware device such as a mouse or keyboard [5]. This can be achieved by using image processing algorithms and computer vision. Another way to detect the eyes is by using Haar cascade features.
The eyes can also be located by matching them with pre-stored templates, as suggested by Vaibhav Nangare et al. [6]. For an accurate picture of the iris, an IR sensor can be used. A gyroscope can be used to track the head, as suggested by Anamika Mali et al. [7]. The clicking function can be performed by 'dwelling', that is, by staring at the screen. Also, by looking at a particular part of the screen (top or bottom), the scrolling function can be performed, as suggested by Zhang et al. [8].
Beyond eye movements alone, the task becomes easier when subtle movements of the face and its parts are combined as well. Real-time eye-blink detection using facial landmarks, as proposed by Tereza Soukupová and Jan Čech [9], shows how blinking can be detected from facial landmarks. This is a valuable feature, as blinking actions are needed to translate click actions. Localization of the eyes and other parts of the face can be done using OpenCV and Python with dlib [10], and from this a blink can be detected [11]. Akshay Chandra [13] proposes the same idea, driving a mouse cursor with facial movements.
3. METHODOLOGY
The procedure proposed in this paper works based on the following actions: a) blinking, b) closing both eyes, c) head movement (pitch and yaw), d) opening the mouth.
3.1 Requirements:
The requirements for this work were listed, reviewed and analysed. Since the core of this project is software, we discuss only the software requirements.
Software requirements –
1) Python:
Python is the language we used, with Jupyter Notebook as the interface. It is a very common tool, requires only some basic mathematics, and is one of the easiest tools to use. Some of the libraries used are:
- NumPy – 1.13.3
- OpenCV – 3.2.0
- PyAutoGUI – 0.9.36
- Dlib – 19.4.0
- Imutils – 0.4.6
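As a minimal sketch of how such an environment might be initialised (the landmark-model filename is an assumption, taken from dlib's publicly distributed 68-point predictor, which is not named explicitly in this paper):

```python
# Shared setup assumed by the later sketches in this section.
import cv2
import dlib
import numpy as np
import pyautogui
from imutils import face_utils

# HOG + linear-SVM face detector built into dlib (cf. the HOG detector in the abstract)
detector = dlib.get_frontal_face_detector()
# 68-point facial landmark predictor; the filename is dlib's public model (assumed)
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")
```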
The procedure is as follows:
1) Since the project is based on detecting facial features and driving a cursor, the webcam needs to be accessed first, which means it is turned on and opened for capture. Once the webcam is on, the system extracts individual frames from the video. The frame rate of the video is usually 30 frames per second, so a frame is taken every 1/30 second for processing. Each frame goes through a set of processing steps before its features are detected and mapped to the cursor, and this process repeats for every frame as part of the main loop.
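A minimal sketch of this capture loop, continuing the setup above (camera index 0 and the 'q' quit key are our assumptions, not stated in the paper):

```python
cap = cv2.VideoCapture(0)                  # open the default webcam
while True:
    ok, frame = cap.read()                 # one frame roughly every 1/30 s at 30 fps
    if not ok:
        break
    # ... preprocessing, landmark detection and cursor mapping go here ...
    cv2.imshow("frame", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):  # press 'q' to stop (our convention)
        break
cap.release()
cv2.destroyAllWindows()
```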
2) Once a frame has been extracted, the facial features need to be detected. The frame therefore passes through a set of image processing functions that prepare it properly, making it easier for the system to detect features such as the eyes, mouth, nose, etc.
The processing techniques are:
i) Resizing:
The image is first flipped about the y-axis, i.e. mirrored. Next, the image needs to be resized. Resizing here means setting the image to a new resolution as required. For this project, the new resolution is 640 x 480.
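For illustration, the mirroring and resizing step could look like this (imutils.resize preserves the aspect ratio, so a 4:3 camera yields roughly 640 x 480):

```python
import imutils

frame = cv2.flip(frame, 1)                # mirror about the y-axis so motion feels natural
frame = imutils.resize(frame, width=640)  # rescale; ~640 x 480 for a 4:3 webcam
```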
ii) BGR to gray:
The model we use to detect the different parts of the face requires a grayscale image to give more accurate results. Therefore, the image, i.e. the video frame from the webcam, needs to be converted from OpenCV's BGR format to grayscale.
Once the image has been converted to grayscale, it can be used to detect the face and identify the facial features.
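In OpenCV this conversion is a single call (OpenCV delivers frames in BGR channel order, which is why this subsection is titled 'BGR to gray'):

```python
gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)  # BGR frame -> single-channel grayscale
```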
iii) Detection and prediction of facial features:
For face detection and landmarking, a pre-built model is used in the project, whose outputs can be interpreted in Python to establish where the face is located in the image. A function called 'detector()', provided with the model, helps us find faces.
After face detection, the facial features can be detected using the 'predictor' function. This function places 68 points on any 2D image of a face. These points lie at characteristic facial locations around the required parts, such as the eyes, mouth, etc.
The predicted values are 2D coordinates. Each of the 68 points is an (x, y) coordinate pair, and when the points are connected they trace out the visible face.
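Continuing the setup sketch, detection and landmark prediction with dlib might look as follows; face_utils.shape_to_np is the imutils helper that converts dlib's result into the (68, 2) coordinate array described above:

```python
rects = detector(gray, 0)                   # bounding boxes of all detected faces
for rect in rects:
    shape = predictor(gray, rect)           # dlib landmark object for this face
    points = face_utils.shape_to_np(shape)  # (68, 2) array of (x, y) coordinates
```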
The points are then saved as a list of values so that they can be sliced and used in the next step to connect the relevant points and draw a border representing the required regions of the face. Four subsets of these ordered points are kept separately as coordinates representing the required regions, namely: left eye, right eye, nose and mouth.
Once the four subsets have been prepared, borders or 'contours' are drawn around the points of three of them by connecting the points with the 'drawContours' function, so that a shape is formed around the two eyes and the mouth.
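A sketch of this region slicing and outlining, using the standard 68-point index ranges that imutils ships:

```python
# Index ranges of each region within the 68-point layout.
(l_start, l_end) = face_utils.FACIAL_LANDMARKS_IDXS["left_eye"]
(r_start, r_end) = face_utils.FACIAL_LANDMARKS_IDXS["right_eye"]
(m_start, m_end) = face_utils.FACIAL_LANDMARKS_IDXS["mouth"]
(n_start, n_end) = face_utils.FACIAL_LANDMARKS_IDXS["nose"]

left_eye  = points[l_start:l_end]
right_eye = points[r_start:r_end]
mouth     = points[m_start:m_end]
nose      = points[n_start:n_end]

for region in (left_eye, right_eye, mouth):          # nose is tracked but not outlined
    hull = cv2.convexHull(region)
    cv2.drawContours(frame, [hull], -1, (0, 255, 0), 1)
```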
iv) Mouth and eye aspect ratios: Once the contours have been drawn, a reference state is needed which, by comparison, tells the program whether any action has been performed by these regions, such as blinking, yawning, etc.
These references are expressed as measurements between the 2D coordinates; a change in a measurement tells us that part of the face region has moved away from its normal position and an action has been performed.
The system is built to predict local facial features. The pre-built Dlib model provides fast and accurate face detection and the 68-point 2D facial landmarks already mentioned. Here, the eye aspect ratio (EAR) and the mouth aspect ratio (MAR) are used to detect blinking and yawning respectively. These actions are translated into mouse actions.
Eye-Aspect-Ratio (EAR)
Fig.1: Eye-Aspect-Ratio
The value of the EAR decreases significantly when the eye is closed. This property is used to detect a click action.
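The EAR can be computed from the six landmarks of one eye exactly as defined by Soukupová and Čech [9]; a sketch:

```python
def eye_aspect_ratio(eye):
    """EAR = (|p2-p6| + |p3-p5|) / (2 |p1-p4|) over one eye's six landmarks."""
    a = np.linalg.norm(eye[1] - eye[5])   # vertical distance p2-p6
    b = np.linalg.norm(eye[2] - eye[4])   # vertical distance p3-p5
    c = np.linalg.norm(eye[0] - eye[3])   # horizontal distance p1-p4
    return (a + b) / (2.0 * c)
```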
Mouth-Aspect-Ratio (MAR)
Fig.2: Mouth-Aspect-Ratio
The MAR rises when the mouth opens. This is used as the start and stop action for the mouse. In general, if a ratio increases, it means that the distances between the points representing that facial region have changed and an action has been performed deliberately. Such an action should be understood as the person trying to perform a task with the mouse.
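The MAR can be defined analogously over the mouth landmarks; the paper does not specify the exact point pairing, so the inner-lip variant below (points 60-67 of the 68-point layout) is an assumption:

```python
def mouth_aspect_ratio(mouth):
    """MAR over the inner-lip landmarks; the pairing below is one common choice."""
    a = np.linalg.norm(mouth[13] - mouth[19])  # vertical inner-lip distances
    b = np.linalg.norm(mouth[14] - mouth[18])
    c = np.linalg.norm(mouth[15] - mouth[17])
    d = np.linalg.norm(mouth[12] - mouth[16])  # horizontal width (inner corners)
    return (a + b + c) / (3.0 * d)
```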
Therefore, in order to enable these functions, a threshold on each aspect ratio needs to be defined; when the ratio crosses the specified limit, the corresponding action is performed.
v) Detection of actions performed by the face: After defining the thresholds, the framework can compare the aspect ratios of the facial features in the current frame with the thresholds defined for the different actions. This is done using 'if' statements. The actions recognised by the program are:
1) Activating the mouse: The user needs to 'yawn', i.e. open the mouth vertically, which increases the distance between the corresponding 2D mouth points. The algorithm detects the change by computing the MAR, and if this value exceeds a certain limit, the system is activated and the cursor can be moved. The user then positions the nose above, below, left or right of a reference rectangle in order to move the cursor in the corresponding direction. The farther the nose is from the rectangle, the faster the cursor moves.
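A sketch of this activation and movement logic, continuing the previous sketches; every threshold, the nose-tip index and the scaling factor are illustrative assumptions, not values from the paper:

```python
# Assumes input_mode = False and anchor = (0, 0) are initialised before the main loop;
# a real implementation would also debounce the toggle over several consecutive frames.
MAR_THRESHOLD = 0.6    # mouth counted as open above this value (assumed)
RADIUS = 25            # half-size of the neutral rectangle around the nose (assumed)

if mouth_aspect_ratio(mouth) > MAR_THRESHOLD:
    input_mode = not input_mode                  # a yawn toggles input mode on/off
    anchor = (int(nose[6][0]), int(nose[6][1]))  # landmark 33, the nose tip

if input_mode:
    dx = int(nose[6][0]) - anchor[0]
    dy = int(nose[6][1]) - anchor[1]
    if abs(dx) > RADIUS or abs(dy) > RADIUS:
        pyautogui.moveRel(dx // 4, dy // 4)      # farther from the rectangle -> faster
```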
2) Left/right click: To click, the user needs to close one eye and keep the other open. The system first checks whether the difference between the two eye aspect ratios exceeds a limit, to make sure that the user wants to perform a left or right click and does not want to scroll (scrolling requires both eyes to be closed).
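A sketch of this wink test under the same assumptions (both thresholds are illustrative):

```python
EAR_THRESHOLD = 0.20   # an eye below this is treated as closed (assumed)
WINK_DIFF = 0.04       # minimum left/right EAR gap that counts as a wink (assumed)

left_ear = eye_aspect_ratio(left_eye)
right_ear = eye_aspect_ratio(right_eye)

if abs(left_ear - right_ear) > WINK_DIFF:        # one eye open, one closed
    if left_ear < EAR_THRESHOLD <= right_ear:
        pyautogui.click(button="left")           # left wink -> left click
    elif right_ear < EAR_THRESHOLD <= left_ear:
        pyautogui.click(button="right")          # right wink -> right click
```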
3) Scroll: The user can scroll the mouse up or down. To enter this mode, the eyes must be closed so that the aspect ratio of both eyes falls below the set value. In this state, when the user moves the nose out of the reference rectangle, the mouse scrolls rather than moving the cursor: moving the nose above the rectangle scrolls up, and moving it below the rectangle scrolls down.
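And a sketch of scroll mode, reusing the names and assumed thresholds from the previous sketches (the scroll step of 40 units is likewise assumed):

```python
if left_ear < EAR_THRESHOLD and right_ear < EAR_THRESHOLD:
    scroll_mode = not scroll_mode      # closing both eyes toggles scroll mode

if scroll_mode:
    if int(nose[6][1]) < anchor[1] - RADIUS:
        pyautogui.scroll(40)           # nose above the rectangle: scroll up
    elif int(nose[6][1]) > anchor[1] + RADIUS:
        pyautogui.scroll(-40)          # nose below the rectangle: scroll down
```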
vi) Multiple points: Although the current system does not track multiple points, some initial tests of such tracking were performed. In the future, multi-point tracking may be used, for example, to calculate the distances between the nostrils and the pupils for gaze estimation. This would result in a more specialised tracking system. The strength of the current version is its generality: any feature can be selected for tracking.
vii) Voice output: In this project we added voice output for people who cannot see the screen. The spoken messages are: right click, left click, right, left, up, down, scroll mode is on, scroll mode is off, input mode is on, and input mode is off.
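The paper does not name the speech library used; as one possibility, an offline engine such as pyttsx3 could announce these events:

```python
import pyttsx3   # assumed choice of text-to-speech engine, not specified in the paper

engine = pyttsx3.init()

def announce(event):
    """Speak a status message such as 'scroll mode is on' or 'left click'."""
    engine.say(event)
    engine.runAndWait()
```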
viii) Distance from screen to user: This distance needs to be set correctly so that the different regions of the eye can be distinguished accurately and the centre can be tracked. If the distance is too large, the movement of the eye will be very small relative to the width of the screen, so it will be difficult to track the centre region accurately. If the distance is too small, the movement of the eyes will appear too large, and the curvature of the eye further distorts the measurement.
4. RESULT
The mouse control system works, and the user can move the cursor or click at will. The amount by which the cursor position changes on each axis can be adjusted according to user requirements. Mouse control is activated by opening the mouth, when the MAR value crosses a certain limit.
Fig.3: Opening the mouth to enable 'input mode'
The cursor is moved by moving the head right, left, up and down as required.
Scroll mode is activated by squinting, i.e. closing both eyes for a few seconds. Scrolling is then done by moving the head up and down, called pitching, or to the sides, called yawing. Scroll mode is deactivated by squinting again. Clicking is performed with the blink of an eye: a right-eye wink triggers a right click and a left-eye wink triggers a left click.
Fig.4: Close your eyes for a few seconds to go to scroll
mode
Fig.5: Moving the face left to move the cursor left
Fig.6: Moving the face right to move the cursor right
Mouse sensitivity can be adjusted according to the user's needs. Overall, the project works as intended. Although the comfort is not the same as with a hand-controlled mouse, the system can be used effectively with some practice. We carried out a testing process to evaluate the usability, accuracy and reliability of the eye-tracking mouse (ETM) system. Twenty participants, ranging in age from 19 to 27, took part in the study. All had normal or corrected vision, had no prior eye-tracking experience, and needed only a little training to use the system.
5. CONCLUSIONS
This work can be extended to improve system speed using better-trained models. The system can also be made more flexible by making the change in cursor position proportional to the amount of the user's head movement, so that the user can determine how much the pointer position changes. We have designed the project in such a way that blind people can use a computer by listening to the audio output produced by their camera-detected actions. Future research can also make the measurements more accurate, since the range of values produced by the feature measurements is usually small; to let the algorithm detect actions more accurately, the formulas of the aspect ratios used may therefore need some modification. In addition, to make face detection easier, some image processing techniques can be applied before the model receives the face and facial features.
REFERENCES
[1] Margrit Betke, James Gips, "The Camera Mouse: Visual Tracking of Body Features to Provide Computer Access for People With Severe Disabilities", in IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 10, no. 1, March 2002.
[2] Robert Gabriel Lupu, Florina Ungureanu, Valentin Siriteanu, "Eye Tracking Mouse for Human Computer Interaction", in The 4th IEEE International Conference on E-Health and Bioengineering (EHB), 2013.
[3] Prof. Prashant Salunkhe, Miss. Ashwini R. Patil, "A Device Controlled Using Eye Movement", in International Conference on Electrical, Electronics, and Optimization Techniques (ICEEOT), 2016.
[4] Muhammad Usman Ghani, Sarah Chaudhry, Maryam Sohail, Muhammad Nafees Geelani, "GazePointer: A Real Time Mouse Pointer Control Implementation Based on Eye Gaze Tracking", in INMIC, 19-20 Dec. 2013.
[5] Alex Poole and Linden J. Ball, "Eye Tracking in Human-Computer Interaction and Usability Research: Current Status and Future Prospects", in Encyclopedia of Human Computer Interaction (30 December 2005), 2006, pp. 211-219.
[6] Vaibhav Nangare, Utkarsha Samant, Sonit Rabha, "Controlling Mouse Motions Using Eye Blinks, Head Movements and Voice Recognition", in International Journal of Scientific and Research Publications, Volume 6, Issue 3, March 2016.
[7] Anamika Mali, Ritisha Chavan, Priyanka Dhanawade, "Optimal System for Manipulating Mouse Pointer through Eyes", in International Research Journal of Engineering and Technology (IRJET), March 2016.
[8] Xuebai Zhang, Xiaolong Liu, Shyan-Ming Yuan, Shu-Fan Lin, "Eye Tracking Based Control System for Natural Human-Computer Interaction", in Hindawi Computational Intelligence and Neuroscience, Volume 2017, Article ID 5739301, 9 pages.
[9] Tereza Soukupová and Jan Čech, "Real-Time Eye Blink Detection using Facial Landmarks", in 21st Computer Vision Winter Workshop, February 2016.
[10] Adrian Rosebrock, "Detect eyes, nose, lips, and jaw with dlib, OpenCV, and Python".
[11] Adrian Rosebrock, "Eye blink detection with OpenCV, Python, and dlib".
[12] C. Sagonas, G. Tzimiropoulos, S. Zafeiriou, M. Pantic, "300 Faces in-the-Wild Challenge: The First Facial Landmark Localization Challenge", Proceedings of IEEE Int'l Conf. on Computer Vision (ICCV-W), 300 Faces in-the-Wild Challenge (300-W), Sydney, Australia, December 2013.

More Related Content

Similar to Implementing Deep Learning Model in Human Computer Interaction with Face Recognition using Convolutional Neural Network

Assistance Application for Visually Impaired - VISION
Assistance Application for Visually  Impaired - VISIONAssistance Application for Visually  Impaired - VISION
Assistance Application for Visually Impaired - VISIONIJSRED
 
Mouse Cursor Control Hands Free Using Deep Learning
Mouse Cursor Control Hands Free Using Deep LearningMouse Cursor Control Hands Free Using Deep Learning
Mouse Cursor Control Hands Free Using Deep LearningIRJET Journal
 
Mouse Cursor Control Hands Free Using Deep Learning
Mouse Cursor Control Hands Free Using Deep LearningMouse Cursor Control Hands Free Using Deep Learning
Mouse Cursor Control Hands Free Using Deep LearningIRJET Journal
 
IRJET - For(E)Sight :A Perceptive Device to Assist Blind People
IRJET -  	  For(E)Sight :A Perceptive Device to Assist Blind PeopleIRJET -  	  For(E)Sight :A Perceptive Device to Assist Blind People
IRJET - For(E)Sight :A Perceptive Device to Assist Blind PeopleIRJET Journal
 
INDOOR AND OUTDOOR NAVIGATION ASSISTANCE SYSTEM FOR VISUALLY IMPAIRED PEOPLE ...
INDOOR AND OUTDOOR NAVIGATION ASSISTANCE SYSTEM FOR VISUALLY IMPAIRED PEOPLE ...INDOOR AND OUTDOOR NAVIGATION ASSISTANCE SYSTEM FOR VISUALLY IMPAIRED PEOPLE ...
INDOOR AND OUTDOOR NAVIGATION ASSISTANCE SYSTEM FOR VISUALLY IMPAIRED PEOPLE ...IRJET Journal
 
IRJET- Sign Language Interpreter
IRJET- Sign Language InterpreterIRJET- Sign Language Interpreter
IRJET- Sign Language InterpreterIRJET Journal
 
Sign Language Recognition using Machine Learning
Sign Language Recognition using Machine LearningSign Language Recognition using Machine Learning
Sign Language Recognition using Machine LearningIRJET Journal
 
IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...
IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...
IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...IRJET Journal
 
IRJET - Chatbot with Gesture based User Input
IRJET -  	  Chatbot with Gesture based User InputIRJET -  	  Chatbot with Gesture based User Input
IRJET - Chatbot with Gesture based User InputIRJET Journal
 
Eye(I) Still Know! – An App for the Blind Built using Web and AI
Eye(I) Still Know! – An App for the Blind Built using Web and AIEye(I) Still Know! – An App for the Blind Built using Web and AI
Eye(I) Still Know! – An App for the Blind Built using Web and AIDr. Amarjeet Singh
 
Drishyam - Virtual Eye for Blind
Drishyam - Virtual Eye for BlindDrishyam - Virtual Eye for Blind
Drishyam - Virtual Eye for BlindIRJET Journal
 
IRJET- Object Detection and Recognition for Blind Assistance
IRJET- Object Detection and Recognition for Blind AssistanceIRJET- Object Detection and Recognition for Blind Assistance
IRJET- Object Detection and Recognition for Blind AssistanceIRJET Journal
 
IRJET - Blind Guidance using Smart Cap
IRJET - Blind Guidance using Smart CapIRJET - Blind Guidance using Smart Cap
IRJET - Blind Guidance using Smart CapIRJET Journal
 
IRJET - Smart E – Cane for the Visually Challenged and Blind using ML Con...
IRJET -  	  Smart E – Cane for the Visually Challenged and Blind using ML Con...IRJET -  	  Smart E – Cane for the Visually Challenged and Blind using ML Con...
IRJET - Smart E – Cane for the Visually Challenged and Blind using ML Con...IRJET Journal
 
IRJET- Vision Based Sign Language by using Matlab
IRJET- Vision Based Sign Language by using MatlabIRJET- Vision Based Sign Language by using Matlab
IRJET- Vision Based Sign Language by using MatlabIRJET Journal
 
Sign Language Detection using Action Recognition
Sign Language Detection using Action RecognitionSign Language Detection using Action Recognition
Sign Language Detection using Action RecognitionIRJET Journal
 
Eye-Blink Detection System for Virtual Keyboard
Eye-Blink Detection System for Virtual KeyboardEye-Blink Detection System for Virtual Keyboard
Eye-Blink Detection System for Virtual KeyboardIRJET Journal
 
IRJET- Smart Interactive Mirror Interface
IRJET-  	  Smart Interactive Mirror Interface IRJET-  	  Smart Interactive Mirror Interface
IRJET- Smart Interactive Mirror Interface IRJET Journal
 
IRJET- Development of a Face Recognition System with Deep Learning and Py...
IRJET-  	  Development of a Face Recognition System with Deep Learning and Py...IRJET-  	  Development of a Face Recognition System with Deep Learning and Py...
IRJET- Development of a Face Recognition System with Deep Learning and Py...IRJET Journal
 

Similar to Implementing Deep Learning Model in Human Computer Interaction with Face Recognition using Convolutional Neural Network (20)

Assistance Application for Visually Impaired - VISION
Assistance Application for Visually  Impaired - VISIONAssistance Application for Visually  Impaired - VISION
Assistance Application for Visually Impaired - VISION
 
Mouse Cursor Control Hands Free Using Deep Learning
Mouse Cursor Control Hands Free Using Deep LearningMouse Cursor Control Hands Free Using Deep Learning
Mouse Cursor Control Hands Free Using Deep Learning
 
Mouse Cursor Control Hands Free Using Deep Learning
Mouse Cursor Control Hands Free Using Deep LearningMouse Cursor Control Hands Free Using Deep Learning
Mouse Cursor Control Hands Free Using Deep Learning
 
V4 n2 139
V4 n2 139V4 n2 139
V4 n2 139
 
IRJET - For(E)Sight :A Perceptive Device to Assist Blind People
IRJET -  	  For(E)Sight :A Perceptive Device to Assist Blind PeopleIRJET -  	  For(E)Sight :A Perceptive Device to Assist Blind People
IRJET - For(E)Sight :A Perceptive Device to Assist Blind People
 
INDOOR AND OUTDOOR NAVIGATION ASSISTANCE SYSTEM FOR VISUALLY IMPAIRED PEOPLE ...
INDOOR AND OUTDOOR NAVIGATION ASSISTANCE SYSTEM FOR VISUALLY IMPAIRED PEOPLE ...INDOOR AND OUTDOOR NAVIGATION ASSISTANCE SYSTEM FOR VISUALLY IMPAIRED PEOPLE ...
INDOOR AND OUTDOOR NAVIGATION ASSISTANCE SYSTEM FOR VISUALLY IMPAIRED PEOPLE ...
 
IRJET- Sign Language Interpreter
IRJET- Sign Language InterpreterIRJET- Sign Language Interpreter
IRJET- Sign Language Interpreter
 
Sign Language Recognition using Machine Learning
Sign Language Recognition using Machine LearningSign Language Recognition using Machine Learning
Sign Language Recognition using Machine Learning
 
IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...
IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...
IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...
 
IRJET - Chatbot with Gesture based User Input
IRJET -  	  Chatbot with Gesture based User InputIRJET -  	  Chatbot with Gesture based User Input
IRJET - Chatbot with Gesture based User Input
 
Eye(I) Still Know! – An App for the Blind Built using Web and AI
Eye(I) Still Know! – An App for the Blind Built using Web and AIEye(I) Still Know! – An App for the Blind Built using Web and AI
Eye(I) Still Know! – An App for the Blind Built using Web and AI
 
Drishyam - Virtual Eye for Blind
Drishyam - Virtual Eye for BlindDrishyam - Virtual Eye for Blind
Drishyam - Virtual Eye for Blind
 
IRJET- Object Detection and Recognition for Blind Assistance
IRJET- Object Detection and Recognition for Blind AssistanceIRJET- Object Detection and Recognition for Blind Assistance
IRJET- Object Detection and Recognition for Blind Assistance
 
IRJET - Blind Guidance using Smart Cap
IRJET - Blind Guidance using Smart CapIRJET - Blind Guidance using Smart Cap
IRJET - Blind Guidance using Smart Cap
 
IRJET - Smart E – Cane for the Visually Challenged and Blind using ML Con...
IRJET -  	  Smart E – Cane for the Visually Challenged and Blind using ML Con...IRJET -  	  Smart E – Cane for the Visually Challenged and Blind using ML Con...
IRJET - Smart E – Cane for the Visually Challenged and Blind using ML Con...
 
IRJET- Vision Based Sign Language by using Matlab
IRJET- Vision Based Sign Language by using MatlabIRJET- Vision Based Sign Language by using Matlab
IRJET- Vision Based Sign Language by using Matlab
 
Sign Language Detection using Action Recognition
Sign Language Detection using Action RecognitionSign Language Detection using Action Recognition
Sign Language Detection using Action Recognition
 
Eye-Blink Detection System for Virtual Keyboard
Eye-Blink Detection System for Virtual KeyboardEye-Blink Detection System for Virtual Keyboard
Eye-Blink Detection System for Virtual Keyboard
 
IRJET- Smart Interactive Mirror Interface
IRJET-  	  Smart Interactive Mirror Interface IRJET-  	  Smart Interactive Mirror Interface
IRJET- Smart Interactive Mirror Interface
 
IRJET- Development of a Face Recognition System with Deep Learning and Py...
IRJET-  	  Development of a Face Recognition System with Deep Learning and Py...IRJET-  	  Development of a Face Recognition System with Deep Learning and Py...
IRJET- Development of a Face Recognition System with Deep Learning and Py...
 

More from IRJET Journal

TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...
TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...
TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...IRJET Journal
 
STUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURE
STUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURESTUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURE
STUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTUREIRJET Journal
 
A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...
A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...
A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...IRJET Journal
 
Effect of Camber and Angles of Attack on Airfoil Characteristics
Effect of Camber and Angles of Attack on Airfoil CharacteristicsEffect of Camber and Angles of Attack on Airfoil Characteristics
Effect of Camber and Angles of Attack on Airfoil CharacteristicsIRJET Journal
 
A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...
A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...
A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...IRJET Journal
 
Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...
Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...
Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...IRJET Journal
 
Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...
Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...
Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...IRJET Journal
 
A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...
A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...
A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...IRJET Journal
 
A REVIEW ON MACHINE LEARNING IN ADAS
A REVIEW ON MACHINE LEARNING IN ADASA REVIEW ON MACHINE LEARNING IN ADAS
A REVIEW ON MACHINE LEARNING IN ADASIRJET Journal
 
Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...
Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...
Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...IRJET Journal
 
P.E.B. Framed Structure Design and Analysis Using STAAD Pro
P.E.B. Framed Structure Design and Analysis Using STAAD ProP.E.B. Framed Structure Design and Analysis Using STAAD Pro
P.E.B. Framed Structure Design and Analysis Using STAAD ProIRJET Journal
 
A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...
A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...
A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...IRJET Journal
 
Survey Paper on Cloud-Based Secured Healthcare System
Survey Paper on Cloud-Based Secured Healthcare SystemSurvey Paper on Cloud-Based Secured Healthcare System
Survey Paper on Cloud-Based Secured Healthcare SystemIRJET Journal
 
Review on studies and research on widening of existing concrete bridges
Review on studies and research on widening of existing concrete bridgesReview on studies and research on widening of existing concrete bridges
Review on studies and research on widening of existing concrete bridgesIRJET Journal
 
React based fullstack edtech web application
React based fullstack edtech web applicationReact based fullstack edtech web application
React based fullstack edtech web applicationIRJET Journal
 
A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...
A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...
A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...IRJET Journal
 
A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.
A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.
A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.IRJET Journal
 
Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...
Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...
Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...IRJET Journal
 
Multistoried and Multi Bay Steel Building Frame by using Seismic Design
Multistoried and Multi Bay Steel Building Frame by using Seismic DesignMultistoried and Multi Bay Steel Building Frame by using Seismic Design
Multistoried and Multi Bay Steel Building Frame by using Seismic DesignIRJET Journal
 
Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...
Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...
Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...IRJET Journal
 

More from IRJET Journal (20)

TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...
TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...
TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...
 
STUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURE
STUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURESTUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURE
STUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURE
 
A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...
A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...
A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...
 
Effect of Camber and Angles of Attack on Airfoil Characteristics
Effect of Camber and Angles of Attack on Airfoil CharacteristicsEffect of Camber and Angles of Attack on Airfoil Characteristics
Effect of Camber and Angles of Attack on Airfoil Characteristics
 
A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...
A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...
A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...
 
Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...
Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...
Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...
 
Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...
Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...
Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...
 
A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...
A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...
A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...
 
A REVIEW ON MACHINE LEARNING IN ADAS
A REVIEW ON MACHINE LEARNING IN ADASA REVIEW ON MACHINE LEARNING IN ADAS
A REVIEW ON MACHINE LEARNING IN ADAS
 
Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...
Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...
Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...
 
P.E.B. Framed Structure Design and Analysis Using STAAD Pro
P.E.B. Framed Structure Design and Analysis Using STAAD ProP.E.B. Framed Structure Design and Analysis Using STAAD Pro
P.E.B. Framed Structure Design and Analysis Using STAAD Pro
 
A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...
A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...
A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...
 
Survey Paper on Cloud-Based Secured Healthcare System
Survey Paper on Cloud-Based Secured Healthcare SystemSurvey Paper on Cloud-Based Secured Healthcare System
Survey Paper on Cloud-Based Secured Healthcare System
 
Review on studies and research on widening of existing concrete bridges
Review on studies and research on widening of existing concrete bridgesReview on studies and research on widening of existing concrete bridges
Review on studies and research on widening of existing concrete bridges
 
React based fullstack edtech web application
React based fullstack edtech web applicationReact based fullstack edtech web application
React based fullstack edtech web application
 
A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...
A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...
A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...
 
A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.
A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.
A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.
 
Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...
Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...
Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...
 
Multistoried and Multi Bay Steel Building Frame by using Seismic Design
Multistoried and Multi Bay Steel Building Frame by using Seismic DesignMultistoried and Multi Bay Steel Building Frame by using Seismic Design
Multistoried and Multi Bay Steel Building Frame by using Seismic Design
 
Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...
Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...
Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...
 

Recently uploaded

Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerAnamika Sarkar
 
Past, Present and Future of Generative AI
Past, Present and Future of Generative AIPast, Present and Future of Generative AI
Past, Present and Future of Generative AIabhishek36461
 
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130Suhani Kapoor
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024Mark Billinghurst
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024hassan khalil
 
VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...
VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...
VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...VICTOR MAESTRE RAMIREZ
 
Call Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call GirlsCall Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call Girlsssuser7cb4ff
 
main PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidmain PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidNikhilNagaraju
 
Introduction to Microprocesso programming and interfacing.pptx
Introduction to Microprocesso programming and interfacing.pptxIntroduction to Microprocesso programming and interfacing.pptx
Introduction to Microprocesso programming and interfacing.pptxvipinkmenon1
 
What are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptxWhat are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptxwendy cai
 
GDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSCAESB
 
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdfCCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdfAsst.prof M.Gokilavani
 
HARMONY IN THE HUMAN BEING - Unit-II UHV-2
HARMONY IN THE HUMAN BEING - Unit-II UHV-2HARMONY IN THE HUMAN BEING - Unit-II UHV-2
HARMONY IN THE HUMAN BEING - Unit-II UHV-2RajaP95
 
Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile servicerehmti665
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxJoão Esperancinha
 
complete construction, environmental and economics information of biomass com...
complete construction, environmental and economics information of biomass com...complete construction, environmental and economics information of biomass com...
complete construction, environmental and economics information of biomass com...asadnawaz62
 
Heart Disease Prediction using machine learning.pptx
Heart Disease Prediction using machine learning.pptxHeart Disease Prediction using machine learning.pptx
Heart Disease Prediction using machine learning.pptxPoojaBan
 
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVHARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVRajaP95
 

Recently uploaded (20)

Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube ExchangerStudy on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
Study on Air-Water & Water-Water Heat Exchange in a Finned Tube Exchanger
 
Past, Present and Future of Generative AI
Past, Present and Future of Generative AIPast, Present and Future of Generative AI
Past, Present and Future of Generative AI
 
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
 
IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024IVE Industry Focused Event - Defence Sector 2024
IVE Industry Focused Event - Defence Sector 2024
 
Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024Architect Hassan Khalil Portfolio for 2024
Architect Hassan Khalil Portfolio for 2024
 
VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...
VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...
VICTOR MAESTRE RAMIREZ - Planetary Defender on NASA's Double Asteroid Redirec...
 
Call Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call GirlsCall Girls Narol 7397865700 Independent Call Girls
Call Girls Narol 7397865700 Independent Call Girls
 
main PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfidmain PPT.pptx of girls hostel security using rfid
main PPT.pptx of girls hostel security using rfid
 
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Serviceyoung call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
young call girls in Rajiv Chowk🔝 9953056974 🔝 Delhi escort Service
 
Introduction to Microprocesso programming and interfacing.pptx
Introduction to Microprocesso programming and interfacing.pptxIntroduction to Microprocesso programming and interfacing.pptx
Introduction to Microprocesso programming and interfacing.pptx
 
What are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptxWhat are the advantages and disadvantages of membrane structures.pptx
What are the advantages and disadvantages of membrane structures.pptx
 
GDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentationGDSC ASEB Gen AI study jams presentation
GDSC ASEB Gen AI study jams presentation
 
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdfCCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
CCS355 Neural Network & Deep Learning Unit II Notes with Question bank .pdf
 
HARMONY IN THE HUMAN BEING - Unit-II UHV-2
HARMONY IN THE HUMAN BEING - Unit-II UHV-2HARMONY IN THE HUMAN BEING - Unit-II UHV-2
HARMONY IN THE HUMAN BEING - Unit-II UHV-2
 
Call Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile serviceCall Girls Delhi {Jodhpur} 9711199012 high profile service
Call Girls Delhi {Jodhpur} 9711199012 high profile service
 
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCRCall Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
Call Us -/9953056974- Call Girls In Vikaspuri-/- Delhi NCR
 
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptxDecoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
Decoding Kotlin - Your guide to solving the mysterious in Kotlin.pptx
 
complete construction, environmental and economics information of biomass com...
complete construction, environmental and economics information of biomass com...complete construction, environmental and economics information of biomass com...
complete construction, environmental and economics information of biomass com...
 
Heart Disease Prediction using machine learning.pptx
Heart Disease Prediction using machine learning.pptxHeart Disease Prediction using machine learning.pptx
Heart Disease Prediction using machine learning.pptx
 
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IVHARMONY IN THE NATURE AND EXISTENCE - Unit-IV
HARMONY IN THE NATURE AND EXISTENCE - Unit-IV
 

Implementing Deep Learning Model in Human Computer Interaction with Face Recognition using Convolutional Neural Network

  • 1. International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056 Volume: 09 Issue: 07 | July 2022 www.irjet.net p-ISSN: 2395-0072 © 2022, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal | Page 130 Implementing Deep Learning Model in Human Computer Interaction with Face Recognition using Convolutional Neural Network Sharath B1, Harshith R2, Lankesh J3, Sanjay G R4, Chandru A S5 1Sharath B: Student, Dept. of ISE, NIE-IT, Mysore, Karnataka, India 2Harshith R: Student, Dept. of ISE, NIE-IT, Mysore, Karnataka, India 3Lankesh J: Student, Dept. of ISE, NIE-IT, Mysore, Karnataka, India 4Sanjay G R: Student, Dept. of ISE, NIE-IT, Mysore, Karnataka, India 5Chandru A S (co-author): Assistant Professor, Dept. of ISE, NIE-IT, Mysore, Karnataka, India ---------------------------------------------------------------------***--------------------------------------------------------------------- Abstract –With the advent of technology inthisdigitalage, there is always room for improvement in the computer field. Hand free computing is needed from today to meet the needs of quadriplegics. This paper introduces the Human Computer Interaction (HCI) system which is very important for amputators and those who have problems using their hands. The system developed is an eye-based communication that acts as a computer mouse to translate eye movements such as blinking, staring and teasing at the actions of a mouse cursor. The program in question uses a simple webcam and its software requirements are Python (3.6), OpenCv, numpyanda few other packages needed for face recognition. The face detector can be constructed using the HOG (Histogram of oriented Gradients) feature and the line filter, as well as the sliding window path. No hands and no external computer hardware or sensors needed. Key Words: Python(3.6), OpenCv, Human computer interaction, numpy, face recognition, Histogram of Oriented Gradients (HOG). 1. INTRODUCTION PEOPLE who are quadriplegic and unable to talk — for example, with cerebral palsy, traumatic brain injury, or stroke — often have difficulty expressing their desires, thoughts, and needs. They use their limited voluntary movements to communicate with family, friends, and other care providers. Some people may shake their heads. Some may voluntarily blink or blink. Others may move their eyes or tongue. Assisted technical deviceshave beendeveloped to help them use their movements voluntarily to control computers. People with disabilities can communicate through spelling or add-ons. Disable people who cannot move anything without their eyes. For these people eye movement and blinking are the only way to communicate with an external voice through a computer. This study aims to develop a program that can help people with physical disabilities by allowing them to connect to a computer system using only their eyes. Human interactions with computers have become an integral part of our daily lives. Here, eyes as input, user eye movement can provide a simple, natural and high bandwidth input source. This Computer mouse or finger movement has become the most common way to move the cursor on the screen in modern technology. The system detects any mouse orfinger movement so that the map matches the movement of the cursor. Some people who do not have working arms, so- called ‘amputated’ will not be able to use the current technology to use the mouse. 
Therefore, if the movement of the eyeball is not tracked and if the direction of the eye towards it can be determined, the movement of the eyeball can be done on a map and the amputated leg will be able to move the cursor at will. ‘Eye tracking mouse’ will be very useful for the target person. Currently, mouse trackingis not widely available, and only a few companies have developed this technology and made it available. We aim to fix the eye tracking mouse where most of the mouse functions will be found, so that the user can move the cursor using his or her eye. We try to measure the user’s ‘viewing’ direction and move the cursor to where his or her eye is trying to focus. people who are blind actually cannot use a computer so by adding audio output they can listen tothecommandsand act accordingly. Mouse pointing and clicking action remains common for a long time. However, for some reason a person may find it uncomfortable or if those who are unable to move their hands, there is a need to use these mouse free hands. In general, eye movements and facial expressions are the basis of a handless mouse. 2. RELATED WORK This can be achieved in various ways. Camera mouse suggested by Margrit Betke et. al. [1] for people with quadriplegic and non-speech. User movements are tracked using the camera and this canbemappedtothemousecursor movement that appears on the screen. However, another approach was proposed by Robert Gabriel Lupu, et. al. [2] with the interaction ofapersonalcomputerthatusesadevice mounted on the head to track eye movements and interpret on-screen. Another option for Prof. Prashantsalunkeet.al[3] introduced eye tracking techniques using Hough transform. A lot of work is being done to improve the features of the HCI.A paper by Muhammad Usman Ghani, et. Al [4] suggests
  • 2. International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056 Volume: 09 Issue: 07 | July 2022 www.irjet.net p-ISSN: 2395-0072 © 2022, IRJET | Impact Factor value: 7.529 | ISO 9001:2008 Certified Journal | Page 131 that eye movements can be read as inputandusedtohelpthe user access visual cues without using any other hardware device such as a mouse or keyboard [5]. Thiscan be achieved by using image processing algorithms and computer vision. Another way to get eyes is to, by using the Haar cascade feature. Eyes can be obtained by pairing them with pre-stored templates as suggested by Vaibhav Nangare et. al [6]. For an accurate picture of the iris an IR sensor can be used. A gyroscope can be used to identify the head as suggested by Anamika Mali et. al [7]. Clicking function can be done by 'viewing' or by staring at the screen. Also, by looking at a portion of any part of the screen(toporbottom),thescrolling function can be done as suggested by Zhang et. al [8]. As well as eye movements, it becomes easier when we combine subtle face movements with their parts as well. Real-timeeye detection using facialexpressionsassuggested by Teresa Soukupova and Jan ´ Cech [9] shows how blinking can be detected using facial expressions. This is a great feature as blinking actions are needed to translate click actions. Visualization of the eyes and other parts of the face can be done using openCv and Python with dlib [10]. Like a blink can be detected. [11]. Akshay Chandra [13] proposes the same by attaching a mouse cursor using facial movements. 3. METHODOLOGY The proposed procedure in this paper works based on the followingactions: a) Blinkingb) closingboththeeyesc)Head movement (pitch and yawning) d) Opening the mouth. 3.1 Requirements: The requirements for this job are listed and reviewedand analysed. Since the core of this project is part of the software, we only talked about the need for software. Software requirement – 1) Python: Python is the software we used, wheretheinterfaceis Jupyter Notebook. It is a very common tool, in some basic mathematics, and is one of the easiest tools to use. Some of the libraries used are:  Numpy – 1.13.3  OpenCV – 3.2.0  PyAutoGUI – 0.9.36  Dlib – 19.4.0  Imutils – 0.4.6 The procedure is as follows: 1) Since the project is based on finding facial features and drawing a cursor, the webcam needs to be accessed first, which means it will be turned on by the webcam. Once the webcam is turned on, the system needs to extract the entire frame from the video. The frame rate of the video is usually equal to 30 frames per second, so the frame every 1/30 second will be used for processing. This framework contains a set of processes before the elements of the framework are detected and mapped in the cursor. And this process continuously occursthroughouttheframework aspartof the loop. 2) Once the frame has been removed, face circuitsneedto be detected. Therefore, the frames will face a set of image processing functions to process the frame properly, making it easier for the system to detect features such as eyes, mouth, nose, etc. The processing techniques are: i) Resizing: The image is rotated first over the y axis of y. Next, theimage needs to be resized. Resize function refers to setting a new image adjustment to any value as required. For this project, the new resolution is 640 X 480. 
ii) BGR to grayscale: The model used to detect the different parts of the face requires a grayscale image to give more accurate results. Therefore the image, i.e., the video frame from the webcam, needs to be converted from BGR to grayscale. Once the image has been converted, it can be used to detect faces and identify facial features.

iii) Detection and prediction of facial features: For face detection and feature localisation, a pre-built model is used in the project, whose outputs can be read in Python to locate the face in the image. A function called 'detector()', made available with the model, helps us find faces. After face detection, facial features can be detected using the 'predictor' function, which places 68 points on any 2D face image. These points lie at facial locations around the required parts, such as the eyes and mouth. The returned values are 2D coordinates: each of the 68 points is an (x, y) pair which, when connected, outlines the visible face.
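A minimal sketch of the grayscale conversion and the detector()/predictor calls described above, assuming dlib's standard 68-point landmark model file (the file name is the one distributed with dlib, not stated in the paper):

```python
import cv2
import dlib
from imutils import face_utils

# dlib's frontal face detector and its pre-trained 68-point
# landmark predictor (assumed model file name).
detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def facial_landmarks(frame):
    # The detector and predictor work best on a grayscale image.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = detector(gray, 0)                 # locate faces in the frame
    for rect in faces:
        shape = predictor(gray, rect)         # 68 landmark points
        return face_utils.shape_to_np(shape)  # as a (68, 2) array of (x, y)
    return None                               # no face found
```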
The points are then saved as a list of values so that they can be indexed and used in the next step to connect landmarks and draw boundaries representing the required regions of the face. Four ordered subsets of these points are kept separately to represent the required regions, namely: left eye, right eye, nose, and mouth. Once the four subsets have been prepared, boundaries or 'contours' are drawn around three of them, the two eyes and the mouth, by connecting their points using the 'drawContours' function.

iv) Mouth and eye aspect ratios: Once the contours have been drawn, a reference state is needed against which the program can compare, giving it information about any action performed by these regions, such as blinking or yawning. These references are distances measured between 2D landmark coordinates; a change in them tells us that part of the face has moved away from its resting position and an action has been performed.

The system is built to predict local facial features. The pre-built dlib model provides fast and accurate face detection along with the 68 2D facial landmarks already mentioned. Here, the eye aspect ratio (EAR) and the mouth aspect ratio (MAR) are used to detect blinking and yawning respectively, and these actions are translated into mouse actions.

Eye Aspect Ratio (EAR)

Fig.1: Eye-Aspect-Ratio

The value of EAR decreases significantly when the eye is closed. This feature is used for the click action.

Mouth Aspect Ratio (MAR)

Fig.2: Mouth-Aspect-Ratio

MAR rises when the mouth opens. This is used as the action that starts and stops mouse control. In general, if a ratio changes, the distances between the points representing that facial region have changed and a deliberate action has been made; the system interprets this as the user trying to perform a mouse task. Therefore, for these functions to be enabled, aspect-ratio thresholds need to be defined: when a ratio crosses its threshold, the corresponding action is triggered.
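The EAR follows the definition in [9]: EAR = (||p2 - p6|| + ||p3 - p5||) / (2 ||p1 - p4||) over the six landmarks p1..p6 of one eye. The paper does not spell out its exact MAR formula, so the version below, computed over the inner-lip points, is one plausible analogue and should be read as an assumption:

```python
import numpy as np

def eye_aspect_ratio(eye):
    # eye: six (x, y) points p1..p6 of one eye from the 68-point model.
    A = np.linalg.norm(eye[1] - eye[5])   # ||p2 - p6||
    B = np.linalg.norm(eye[2] - eye[4])   # ||p3 - p5||
    C = np.linalg.norm(eye[0] - eye[3])   # ||p1 - p4||
    return (A + B) / (2.0 * C)            # drops sharply when the eye closes

def mouth_aspect_ratio(mouth):
    # mouth: the 20 points with indices 48-67 of the 68-point model;
    # indices 12-19 of this slice are the inner-lip points 60-67.
    A = np.linalg.norm(mouth[13] - mouth[19])  # inner-lip vertical gaps
    B = np.linalg.norm(mouth[14] - mouth[18])
    C = np.linalg.norm(mouth[15] - mouth[17])
    D = np.linalg.norm(mouth[12] - mouth[16])  # inner-lip horizontal width
    return (A + B + C) / (3.0 * D)             # rises when the mouth opens
```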
v) Detection of actions performed by the face: After defining the thresholds, the framework compares the measured aspect ratios of the facial features against the thresholds defined for the different actions. This is done using 'if' statements. The actions identified by the program are:

1) To activate the mouse: The user needs to yawn, i.e., open the mouth vertically, increasing the distance between the corresponding 2D mouth points. The algorithm detects this change by computing the ratio, and if it exceeds a certain limit, the system is activated and the cursor can be moved. The user then positions the nose above, below, left, or right of an on-screen rectangle to move the cursor in the corresponding direction; the farther the nose is from the rectangle, the faster the cursor moves.

2) Left/right click: To click, the user needs to close one eye while keeping the other open. The system first checks whether the difference between the two eyes' aspect ratios exceeds a limit, to ensure that the user intends a left or right click and not a scroll (scrolling requires both eyes to be closed).

3) Scroll: The user can scroll up or down. Both eyes must be closed so that both eye aspect ratios fall below the set value. In this mode, when the user moves the nose out of the rectangle, the mouse scrolls instead of moving the cursor: moving the nose above the rectangle scrolls up, and moving it below the rectangle scrolls down.
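Condensing items 1)-3) into code, the decision logic might look like the following sketch using PyAutoGUI. Every threshold value and helper name here is an illustrative assumption, not taken from the paper, and real values must be tuned per user:

```python
import pyautogui

# Illustrative thresholds, to be tuned per user (not from the paper).
EAR_THRESH = 0.19   # below this, an eye counts as closed
WINK_DIFF = 0.04    # EAR gap between the eyes that signals a wink
MAR_THRESH = 0.60   # above this, the mouth counts as open

def handle_frame(left_ear, right_ear, mar, nose, anchor, state):
    # Opening the mouth toggles input mode on or off.
    if mar > MAR_THRESH:
        state["input"] = not state["input"]
        return
    if not state["input"]:
        return

    # A large EAR difference means one eye is closed: a wink, i.e., a click.
    if abs(left_ear - right_ear) > WINK_DIFF:
        pyautogui.click(button="left" if left_ear < right_ear else "right")
    # Both eyes closed toggles scroll mode.
    elif left_ear < EAR_THRESH and right_ear < EAR_THRESH:
        state["scroll"] = not state["scroll"]
    else:
        # Nose displacement from the on-screen anchor rectangle drives
        # scrolling or cursor movement.
        dx, dy = nose[0] - anchor[0], nose[1] - anchor[1]
        if state["scroll"]:
            pyautogui.scroll(30 if dy < 0 else -30)
        else:
            pyautogui.moveRel(dx, dy)  # farther from the anchor = faster
```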
vi) Multiple points: Although the current system does not track multiple points, some initial tests of such tracking were performed. In the future, multi-point tracking may be used, for example, to calculate distances between the nostrils and the pupils for gaze estimation, resulting in a more specialised tracking system. The strength of the current version is its generality: any feature can be selected for tracking.

vii) Voice output: In this project we added voice output for users who cannot see the screen. The spoken cues are: right click, left click, right, left, up, down, scroll mode is on, scroll mode is off, input mode is on, and input mode is off.

viii) Distance from screen to user: This distance needs to be set correctly so that the different areas of the eye can be distinguished accurately and the centre tracked. If the distance is too large, the eye movement will be much smaller than the width of the screen, making it difficult to track the centre area accurately. If the distance is too small, the eye movement will be too large relative to the screen, and accuracy suffers because of the curvature of the eye.
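The paper does not name the text-to-speech package behind the spoken cues in item vii); as one hedged possibility, they could be generated offline with pyttsx3:

```python
import pyttsx3

# Offline text-to-speech engine; pyttsx3 is our assumption,
# since the paper does not name the TTS package it used.
engine = pyttsx3.init()

def speak(cue):
    # cue is one of the feedback strings, e.g. "scroll mode is on".
    engine.say(cue)
    engine.runAndWait()

speak("input mode is on")
```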
4. RESULT

The mouse control system works, and the user can move the cursor or click at will. The amount by which the cursor position changes on each axis can be adjusted according to user requirements. Mouse control is activated by opening the mouth, when the MAR crosses a certain limit.

Fig.3: Opening the mouth to turn on 'input mode'

The cursor is moved by moving the head right, left, up, and down as required. Scroll mode is toggled by closing both eyes for a few seconds. Scrolling is then done by moving the head up and down (pitch) or to the sides (yaw); closing both eyes again stops scroll mode. Clicking occurs with the wink of an eye: a right-eye wink performs a right click and a left-eye wink performs a left click.

Fig.4: Closing both eyes for a few seconds to enter scroll mode

Fig.5: Moving the face left to move the cursor left

Fig.6: Moving the face right to move the cursor right

Mouse sensitivity can be adjusted according to user needs. Overall, the project works as intended. Although it is not as comfortable as a hand-controlled mouse, it can be used effectively with some practice. We implemented a testing process to evaluate the usability, accuracy, and reliability of the eye-tracking mouse (ETM) system. Twenty participants, ranging in age from 19 to 27, took part in the study. All had normal or corrected vision, had no prior eye-tracking experience, and needed only a little training to use the system.

5. CONCLUSIONS

This work can be extended to improve system speed using better-trained models. The system can also be made more flexible by scaling the cursor displacement in proportion to the magnitude of the user's head movement, i.e., letting the user decide how much the pointer position should change. We have designed the project so that blind users can operate a computer by listening to the audio output produced by their camera-detected actions. Future work can also make the measurements more accurate, since the range of values produced by the feature aspect ratios is usually small; the aspect-ratio formulas may therefore be modified so that the algorithm detects actions more accurately. In addition, image-processing techniques can be applied before the model extracts the face and facial features, to make face detection easier.

REFERENCES

[1] Margrit Betke, James Gips, "The Camera Mouse: Visual Tracking of Body Features to Provide Computer Access for People With Severe Disabilities," IEEE Transactions on Neural Systems and Rehabilitation Engineering, vol. 10, no. 1, March 2002.

[2] Robert Gabriel Lupu, Florina Ungureanu, Valentin Siriteanu, "Eye Tracking Mouse for Human Computer Interaction," in The 4th IEEE International Conference on E-Health and Bioengineering (EHB), 2013.

[3] Prof. Prashant Salunkhe, Miss. Ashwini R. Patil, "A Device Controlled Using Eye Movement," in International Conference on Electrical, Electronics, and Optimization Techniques (ICEEOT), 2016.

[4] Muhammad Usman Ghani, Sarah Chaudhry, Maryam Sohail, Muhammad Nafees Geelani, "GazePointer: A Real Time Mouse Pointer Control Implementation Based On Eye Gaze Tracking," in INMIC, 19-20 Dec. 2013.

[5] Alex Poole and Linden J. Ball, "Eye Tracking in Human-Computer Interaction and Usability Research: Current Status and Future Prospects," in Encyclopedia of Human Computer Interaction, 2006, pp. 211-219.

[6] Vaibhav Nangare, Utkarsha Samant, Sonit Rabha, "Controlling Mouse Motions Using Eye Blinks, Head Movements and Voice Recognition," International Journal of Scientific and Research Publications, vol. 6, issue 3, March 2016.

[7] Anamika Mali, Ritisha Chavan, Priyanka Dhanawade, "Optimal System for Manipulating Mouse Pointer through Eyes," International Research Journal of Engineering and Technology (IRJET), March 2016.

[8] Xuebai Zhang, Xiaolong Liu, Shyan-Ming Yuan, Shu-Fan Lin, "Eye Tracking Based Control System for Natural Human-Computer Interaction," Computational Intelligence and Neuroscience, vol. 2017, Article ID 5739301, 9 pages.

[9] Tereza Soukupová and Jan Čech,
"Real-Time Eye Blink Detection using Facial Landmarks," in 21st Computer Vision Winter Workshop, February 2016.

[10] Adrian Rosebrock, "Detect eyes, nose, lips, and jaw with dlib, OpenCV, and Python."

[11] Adrian Rosebrock, "Eye blink detection with OpenCV, Python, and dlib."

[12] C. Sagonas, G. Tzimiropoulos, S. Zafeiriou, M. Pantic, "300 Faces in-the-Wild Challenge: The First Facial Landmark Localization Challenge," in Proceedings of IEEE Int'l Conf. on Computer Vision Workshops (ICCV-W), 300 Faces in-the-Wild Challenge (300-W), Sydney, Australia, December 2013.