International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056 | p-ISSN: 2395-0072
Volume: 10 Issue: 03 | Mar 2023 | www.irjet.net
© 2023, IRJET | Impact Factor value: 8.226 | ISO 9001:2008 Certified Journal
MESSAGE CONVEYOR FOR LOCKED SYNDROME PATIENTS
BY VIRTUAL KEYBOARD
Dr. R. V. Shalini, Dept. of Biomedical Engineering, Sri Shakthi Institute of Engineering and Technology, Tamil Nadu, India
Abhiram K, Sri Shakthi Institute of Engineering and Technology
Abimannan A, Sri Shakthi Institute of Engineering and Technology
Anandhu Raji, Sri Shakthi Institute of Engineering and Technology
Nishad S, Sri Shakthi Institute of Engineering and Technology
--------------------------------------------------------------------------***--------------------------------------------------------------------------
Abstract — The Message Conveyor for Locked-in Syndrome Patients is an assistive technology designed to help individuals with locked-in syndrome, a rare disorder of the nervous system in which people are paralyzed except for the muscles that control eye movement. Such patients are conscious and can think and reason, but cannot move or speak, although they may be able to communicate through eye movements and blinking. The system consists of a camera that detects the user's face and eyes and tracks eye movement to control a virtual keyboard on the screen. It uses a Histogram of Oriented Gradients (HOG) descriptor to detect the user's face and the Shape Predictor 68 Face Landmark algorithm to locate the user's eyes. Eye blinking is detected by drawing two reference lines across the eye and checking for the disappearance of the vertical line. Eye gaze detection is performed by locating the eyeballs, applying a threshold to the eye region, and counting the white part of the eye to determine the direction of the gaze. The virtual keyboard is displayed on the screen and is controlled by eye movements and blinking: the user selects letters and symbols by blinking, and additional controls are provided to adjust settings such as the language and speed of the speech output. The output of the system can be in the form of text or speech; it is displayed on a 16x2 LCD or pronounced through a speaker, and is determined by the user's selections on the virtual keyboard. In summary, the Eye Gaze Controlled Typing System is an innovative technology that enables individuals with motor impairments or disabilities to communicate using eye movements and blinking. The system uses computer vision algorithms to detect the user's face, eyes, and gaze direction, and provides a simple and intuitive interface for controlling the virtual keyboard and adjusting the system settings.
Keywords: Assistive technology, Eye gaze tracking, Computer vision, Virtual keyboard, Eye blink detection, NodeMCU, LCD, Voice playback, Disabilities.
I. INTRODUCTION
The Message Conveyor for Locked-in Syndrome Patients is an innovative assistive technology designed to help individuals with motor impairments or disabilities communicate using eye movements and blinking. This
system uses advanced computer vision algorithms to
detect the user's face, eyes, and gaze direction, and
provides a simple and intuitive interface for controlling the
virtual keyboard and adjusting the system settings. The
system works by detecting the user's face using a Histogram of Oriented Gradients (HOG) descriptor and tracking the movement of the user's eyes using the Shape Predictor 68 Face Landmark algorithm. Eye blinking is
detected by drawing two lines in the eye and checking for
the disappearance of the vertical line. Eye gaze detection is
done by detecting eyeballs and applying a threshold on
them. The system also counts the white part of the eye to
determine the direction of the gaze. The virtual keyboard
is displayed on the screen and is controlled by eye
movements and blinking. The user can select letters and
symbols by blinking, and additional controls are provided
to adjust the settings such as language and speed of the
speech output. The output of the system can be in the form
of text or speech, and is displayed on a 16x2 LCD or
pronounced using a speaker. The text or speech output is
determined by the user's selections on the virtual
keyboard. Overall, the Eye Gaze Controlled Typing System
is an innovative technology that provides a new way for
individuals with motor impairments or disabilities to
communicate and express themselves.
II. BLOCK DIAGRAM
Fig 2.1. Block Diagram
III. PROPOSED METHODOLOGY
The camera or video input device captures the user's face
and eye movements. The face detection component uses
the Histogram of Oriented Gradients (HOG) feature
descriptor to identify the user's face in the image. The
Shape Predictor 68 Face Landmark model uses 68 specific points to identify features such as the eyes, nose, lips, eyebrows, and face outline. The eye detection component uses these
landmarks to identify the user's eyes. The eye blink
detection component tracks vertical and horizontal lines in
the eyes to detect blinking. The eye gaze detection
component tracks the user's eyeballs and applies a
threshold to identify the white part of the eye. This allows
the system to determine which part of the virtual
keyboard the user is looking at. The virtual keyboard is
displayed on the screen, and the user selects letters and
other characters by blinking their eyes. The NodeMCU, with its built-in Wi-Fi, receives the user's text selection and sends it to the display (16x2 LCD) and voice playback components for user feedback. The user can interact with the system and control its functions.
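To make the flow above concrete, the following is a minimal, illustrative Python sketch of the main processing loop, assuming the dlib HOG face detector, the shape_predictor_68_face_landmarks.dat model, and OpenCV for capture and display; the blink/gaze handling steps are placeholders filled in by the sketches in Section IV, not the authors' actual code.

```python
# Illustrative end-to-end loop (a sketch, not the authors' implementation).
import cv2
import dlib

detector = dlib.get_frontal_face_detector()          # HOG-based face detector
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

cap = cv2.VideoCapture(0)                             # camera / video input device
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for face in detector(gray):                       # face detection (HOG)
        landmarks = predictor(gray, face)             # 68 facial landmarks
        # Blink and gaze handling would go here (see the sketches in Section IV).
    cv2.imshow("Eye tracking", frame)
    if cv2.waitKey(1) & 0xFF == 27:                   # Esc to quit
        break
cap.release()
cv2.destroyAllWindows()
```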
IV. MODULE DESCRIPTION
A. Camera or Video Input Device
The camera used in the eye gaze-controlled typing system
can vary depending on the specific implementation and
requirements of the system. However, generally, a camera
with high resolution and a good frame rate is preferred for
accurate tracking of the user's eye movements. In general,
a camera with a resolution of 720p or higher is suitable for
eye tracking. The frame rate should be at least 30 frames
per second (fps) to ensure smooth and accurate tracking.
The camera should also have good low-light performance,
as the system may be used in different lighting conditions.
In terms of the camera type, both webcams and smartphone cameras can be used for eye tracking. However, webcams are typically preferred for their ease of use and better stability, while smartphone cameras may be more portable and flexible in terms of placement. It's
important to note that the camera used in the system
should be compatible with the software used for face
detection, eye detection, and eye tracking. The specific
software may have recommended camera requirements,
so it's important to check with the software
documentation or manufacturer for any specific camera
recommendations.
Fig.4.1. Web Camera
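As a concrete illustration, the capture settings discussed above (720p at 30 fps) can be requested from a webcam through OpenCV; whether the camera honors them depends on the hardware, so the values below are assumptions rather than guarantees.

```python
import cv2

cap = cv2.VideoCapture(0)                      # default webcam
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)        # request 720p resolution
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)
cap.set(cv2.CAP_PROP_FPS, 30)                  # request 30 frames per second

# Report what the camera actually agreed to provide.
print("resolution:", cap.get(cv2.CAP_PROP_FRAME_WIDTH),
      "x", cap.get(cv2.CAP_PROP_FRAME_HEIGHT))
print("fps:", cap.get(cv2.CAP_PROP_FPS))
cap.release()
```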
B. Face Detection (using HOG)
The face detection block in the eye gaze-controlled typing
system uses the Histogram of Oriented Gradients (HOG)
feature descriptor to detect faces in the captured image.
HOG is a popular feature descriptor used in computer
vision and image processing for object detection, including
face detection. Here's a brief overview of how to face
detection using HOG works: Image pre-processing: The
input image is pre-processed to improve the performance
of the face detection algorithm. This may include steps
such as greyscale conversion, contrast normalization, and
histogram equalization. Sliding window approach: The
HOG descriptor works by dividing the input image into
small, overlapping sub-windows or regions. Each sub-
window is represented as a feature vector using the HOG
descriptor. Gradient orientation calculation: The HOG
descriptor counts the number of occurrences of gradient
orientation within each sub-window. The gradient
orientation is calculated by computing the gradient
magnitude and direction of the pixels in the sub-window.
Histogram of oriented gradients: The gradient orientation
values are grouped into bins to form a histogram of
oriented gradients for each sub-window. The HOG
descriptor is a concatenation of these histograms. Support
Vector Machine (SVM) classification: The HOG descriptor
is used as input to a binary SVM classifier, which is trained to distinguish between face and non-face sub-windows. The classifier is trained using a large number of positive (face) and negative (non-face) examples.
Non-maximum suppression: The output of the classifier is a set of bounding boxes around potential face regions in the input image. To reduce false positives, non-maximum suppression is applied to eliminate overlapping bounding boxes and keep only the most likely face region.
The face detection block in the eye gaze-controlled typing system uses the dlib library, together with OpenCV, to implement the HOG-based face detection algorithm. This approach has proven accurate for face detection, making it a popular choice for many computer vision applications.
Fig.4.2. Face Detection using HOG
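A minimal sketch of this block, assuming dlib's built-in HOG + linear SVM face detector (the detector dlib actually ships); run() exposes the SVM detection scores, and the detector applies non-maximum suppression internally. The image path is an illustrative placeholder.

```python
import cv2
import dlib

detector = dlib.get_frontal_face_detector()      # HOG + linear SVM face detector

image = cv2.imread("frame.jpg")                   # a captured camera frame (example path)
gray = cv2.cvtColor(image, cv2.COLOR_BGR2GRAY)    # pre-processing: greyscale conversion

# run() returns bounding boxes, SVM confidence scores, and sub-detector indices;
# the second argument upsamples the image once to find smaller faces.
boxes, scores, _ = detector.run(gray, 1, 0.0)
for box, score in zip(boxes, scores):
    print(f"face at ({box.left()}, {box.top()})-({box.right()}, {box.bottom()}), score={score:.2f}")
    cv2.rectangle(image, (box.left(), box.top()), (box.right(), box.bottom()), (0, 255, 0), 2)
```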
C. Shape Predictor 68 Face Landmark
The Shape Predictor 68 Face Landmark model is a machine learning-based algorithm used to locate 68 specific landmarks, or key points, on the face. It is commonly used in computer vision applications such as face tracking, face recognition, and facial expression analysis. Here's an overview of how it works:
Training data: The shape predictor is trained on a large dataset of face images labeled with the locations of 68 specific facial landmarks. The training process uses machine learning algorithms to learn the relationship between facial appearance and landmark locations.
Feature extraction and regression: The algorithm extracts local image features from the detected face region and feeds them to a cascade of regression models that predict the coordinates of the 68 landmarks.
Facial landmark detection: Once trained, the shape predictor can detect the 68 facial landmarks in new images by applying the trained regression models to the face region returned by the face detector.
Landmark tracking: The landmark locations can be used to track the face as it moves, which is useful for applications such as facial expression analysis or eye gaze tracking.
In the eye gaze-controlled typing system, the shape predictor 68 face landmarks are used to locate the user's eyes. The algorithm identifies specific landmarks, such as the corners of the eyes and the eyelid contours, that bound the eye regions. Once the eye regions are located, the system applies additional image processing to detect eye blinks and track the user's eye gaze to select keys on the virtual keyboard.
Fig.4.3. Shape Predictor 68 Face
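A brief sketch of how the eye landmarks can be pulled out with dlib's 68-point shape predictor; in dlib's standard numbering, indices 36-41 and 42-47 outline the two eyes, and the model file name is the predictor dlib distributes.

```python
import dlib

detector = dlib.get_frontal_face_detector()
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

RIGHT_EYE = list(range(36, 42))   # landmark indices outlining one eye
LEFT_EYE = list(range(42, 48))    # landmark indices outlining the other eye

def eye_points(gray_frame, face_rect):
    """Return the (x, y) landmark coordinates of both eye regions."""
    landmarks = predictor(gray_frame, face_rect)
    right = [(landmarks.part(i).x, landmarks.part(i).y) for i in RIGHT_EYE]
    left = [(landmarks.part(i).x, landmarks.part(i).y) for i in LEFT_EYE]
    return right, left
```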
D. Eye Gaze Detection (using eyeball tracking)
To detect eye gaze, the eye gaze-controlled typing system uses image processing techniques to analyze the video feed from the camera and monitor changes in the eye regions over time. The basic principle behind eye gaze detection is that the position of the iris in the eye can be used to infer the direction of the person's gaze. The eye gaze detection algorithm first isolates the eye regions using the shape predictor 68 face landmarks. Once the eye regions are identified, the system applies a threshold to the grayscale image to convert it into a binary image, which is then used to detect the iris and estimate the position of the pupil center. Once the position of the pupil center is estimated, the system uses algorithms such as Hough transforms and circular pattern matching to estimate the position and direction of the gaze. This estimation is typically done by comparing the position of the pupil center with the positions of the eye corners and landmarks,
and using this information to infer the direction of the gaze. By monitoring gaze direction, the system allows users to select keys on the virtual keyboard simply by moving their eyes.
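The thresholding step described above can be illustrated as follows: the eye region is binarized, the white (sclera) pixels are counted on each half, and the ratio indicates whether the user is looking towards one side or roughly centre. The threshold value and ratio cut-offs below are illustrative assumptions that would need tuning.

```python
import cv2
import numpy as np

def gaze_direction(gray_frame, eye_pts, thresh=70):
    """Classify gaze as 'left', 'right' or 'center' from one eye region."""
    pts = np.array(eye_pts, dtype=np.int32)
    x, y, w, h = cv2.boundingRect(pts)
    eye = gray_frame[y:y + h, x:x + w]

    # Binarize: dark pupil/iris -> 0, white sclera -> 255.
    _, binary = cv2.threshold(eye, thresh, 255, cv2.THRESH_BINARY)

    left_white = cv2.countNonZero(binary[:, : w // 2])
    right_white = cv2.countNonZero(binary[:, w // 2 :])

    ratio = (left_white + 1) / (right_white + 1)   # +1 avoids division by zero
    if ratio < 0.7:        # little white on the left half -> iris sits on that side
        return "left"
    if ratio > 1.5:
        return "right"
    return "center"
```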
E. Eye Blink Detection using vertical and horizontal lines
To detect eye blinks, the eye gaze-controlled typing system
uses image processing techniques to analyze the video
feed from the camera and monitor changes in the eye
regions over time. The basic principle behind eye blink
detection is that the eyelids close when a person blinks,
causing a temporary occlusion of the eye regions. This
change in the eye regions can be detected using image
processing techniques, such as edge detection,
thresholding, and blob analysis. In the eye gaze-controlled
typing system, the eye blink detection algorithm first
isolates the eye regions using the shape predictor 68 face
algorithm. Once the eye regions are identified, the system
applies a threshold to the grayscale image to convert it into
a binary image. This binary image is then used to detect
changes in the eye regions over time. To detect eye blinks,
the system uses two lines that are drawn over the eye
regions, one vertical and one horizontal. When the eye is
open, both lines are visible. However, when the eye blinks,
the vertical line disappears temporarily due to the
occlusion of the eyelid. The system tracks the
disappearance of the vertical line and uses it to detect eye
blinks.
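One way to realize the vertical/horizontal line idea is to measure the lengths of the two lines directly from the six eye landmarks and treat a collapsed vertical line as a blink; the ratio below is essentially the reciprocal of the widely used eye aspect ratio. The 5.0 threshold is an assumption, not a value reported in this work.

```python
from math import hypot

def midpoint(p, q):
    return ((p[0] + q[0]) / 2.0, (p[1] + q[1]) / 2.0)

def blink_ratio(eye):
    """Horizontal-to-vertical line ratio for one eye.

    `eye` is the list of six landmark points (corner, two upper-lid points,
    corner, two lower-lid points) in dlib's ordering.
    """
    horizontal = hypot(eye[0][0] - eye[3][0], eye[0][1] - eye[3][1])
    top = midpoint(eye[1], eye[2])
    bottom = midpoint(eye[4], eye[5])
    vertical = hypot(top[0] - bottom[0], top[1] - bottom[1])
    return horizontal / (vertical + 1e-6)   # large ratio => the vertical line has "disappeared"

def is_blinking(left_eye, right_eye, threshold=5.0):
    return (blink_ratio(left_eye) + blink_ratio(right_eye)) / 2.0 > threshold
```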
F. Virtual Keyboard Displayed on Screen
The virtual keyboard in the eye gaze-controlled typing
system is a software-based user interface that allows users
to input text using eye movements. It is displayed on the
screen and consists of a grid of keys, similar to a standard
physical keyboard. The keys can include letters, numbers,
and symbols, depending on the language and preferences
of the user. The virtual keyboard in this system is designed
to be used with eye movements, specifically eye blinks and
gaze direction. Each key on the virtual keyboard is selected
by a combination of eye blinks and gaze direction. To
select a key, the user first gazes at the key they want to
select. Once the desired key is in focus, the user blinks
their eyes to activate the selection. The system then
registers the selected key and begins to build the text by
adding the character to a buffer. The virtual keyboard output is received by the NodeMCU, a microcontroller with built-in Wi-Fi. The NodeMCU communicates with a voice playback module and a 16x2 LCD, which displays the received text and pronounces it through a speaker once the text has been stored. By combining the virtual keyboard with these hardware components, the system provides a complete solution for text input and output using only eye movements.
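A stripped-down sketch of how such a keyboard can be drawn and driven: keys are laid out in a grid, the key currently under the gaze is highlighted, and a detected blink appends that key's character to the text buffer. The layout, key size, and the way the active key is chosen from gaze are illustrative assumptions, not the authors' design.

```python
import cv2
import numpy as np

KEYS = ["QWERTYUIOP", "ASDFGHJKL", "ZXCVBNM_<"]   # '_' = space, '<' = backspace
KEY_SIZE = 60                                      # pixels per key (assumed)

def draw_keyboard(active_row, active_col, buffer_text):
    """Render the key grid, highlighting the key the user is gazing at."""
    board = np.zeros((len(KEYS) * KEY_SIZE + 40, 10 * KEY_SIZE, 3), dtype=np.uint8)
    for r, row in enumerate(KEYS):
        for c, ch in enumerate(row):
            x, y = c * KEY_SIZE, r * KEY_SIZE
            color = (0, 255, 0) if (r, c) == (active_row, active_col) else (255, 255, 255)
            cv2.rectangle(board, (x, y), (x + KEY_SIZE, y + KEY_SIZE), color, 2)
            cv2.putText(board, ch, (x + 20, y + 40),
                        cv2.FONT_HERSHEY_SIMPLEX, 1, color, 2)
    cv2.putText(board, buffer_text, (10, len(KEYS) * KEY_SIZE + 30),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (0, 255, 255), 2)
    return board

def select_key(active_row, active_col, buffer_text):
    """Append the highlighted key to the text buffer when a blink is confirmed."""
    ch = KEYS[active_row][active_col]
    if ch == "<":
        return buffer_text[:-1]
    return buffer_text + (" " if ch == "_" else ch)
```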
G. Selecting Letters by Blinking Eyes
In the eye gaze-controlled typing system, each letter on the
virtual keyboard is selected by a combination of eye gaze
and eye blink. Here's how it works: The user gazes at the
letter they want to select on the virtual keyboard. Once the
desired letter is in focus, the user blinks their eyes to select
the letter. The system detects the eye blink and registers
the selection. The selected letter is then added to a buffer,
which holds the current text being input by the user. The
process is repeated for each letter the user wants to select.
The system uses a computer vision algorithm to detect eye
blinks and distinguish them from other eye movements.
The algorithm detects when the upper and lower eyelids
come together and then separate, which signifies a blink.
By setting a threshold for the duration of this action, the
system can distinguish between a blink and other eye
movements, such as saccades or fixations. The
combination of eye gaze and eye blink allows the user to
select letters on the virtual keyboard with a high degree of
accuracy and precision. It can take some practice to get
used to this method of text input, but it can be an effective
way for individuals with motor disabilities to communicate
using only their eyes.
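The duration threshold mentioned above can be implemented by requiring the eye to stay closed for a minimum number of consecutive frames before a selection is registered; the frame count below is an assumed value (about 0.33 s at 30 fps), not one reported by the authors, and should be tuned to separate deliberate blinks from brief spontaneous ones.

```python
class BlinkSelector:
    """Register a selection only after a sustained, deliberate blink."""

    def __init__(self, min_closed_frames=10):
        # 10 frames is roughly 0.33 s at 30 fps (assumed threshold).
        self.min_closed_frames = min_closed_frames
        self.closed_frames = 0

    def update(self, eye_is_closed: bool) -> bool:
        """Feed one frame's blink state; return True when a selection fires."""
        if eye_is_closed:
            self.closed_frames += 1
            return False
        fired = self.closed_frames >= self.min_closed_frames
        self.closed_frames = 0          # eye reopened: reset the counter
        return fired
```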
H. NodeMCU with Built-In Wi-Fi
NodeMCU is an open-source development board designed for the Internet of Things (IoT). It is based on the ESP8266 microcontroller and includes built-in Wi-Fi, making it easy to connect to the internet and communicate with other devices. The NodeMCU board is similar to other development boards such as the Arduino, but it is specifically designed for IoT applications and is commonly programmed using the Lua scripting language, which is easy to learn and use. NodeMCU includes several features that make it a popular choice for IoT projects:
Wi-Fi connectivity: built-in Wi-Fi allows the board to connect to the internet and communicate with other devices.
GPIO pins: several general-purpose input/output (GPIO) pins can be used to connect sensors, actuators, and other devices.
Analog input: an analog input pin can be used to read values from sensors that output analog signals.
USB interface: the board can be connected to a computer via USB, which allows it to be programmed and powered.
Low cost: NodeMCU is a relatively low-cost development board, making it accessible to hobbyists and
students. In the eye gaze-controlled typing system,
NodeMCU is used to receive the text selected by the user
on the virtual keyboard. This text is then displayed on a
16x2 LCD display and pronounced using a speaker
connected to the NodeMCU board. NodeMCU
communicates with the other components of the system
using Wi-Fi, which allows for wireless communication and
eliminates the need for additional wires or connections.
Fig.4.4. NodeMCU with Built-In Wi-Fi
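The paper does not specify the wire protocol between the PC and the NodeMCU, so the following is only a hypothetical sketch in which the NodeMCU is assumed to expose a simple HTTP endpoint on the local network; the IP address and the /message path are invented placeholders.

```python
import requests   # pip install requests

NODEMCU_URL = "http://192.168.1.50/message"   # hypothetical NodeMCU address and endpoint

def send_to_nodemcu(text: str) -> bool:
    """Send the text composed on the virtual keyboard to the NodeMCU over Wi-Fi."""
    try:
        resp = requests.post(NODEMCU_URL, data={"text": text}, timeout=2)
        return resp.status_code == 200
    except requests.RequestException:
        return False   # NodeMCU unreachable; the on-screen keyboard still shows the text

# Example: forward the current buffer once the user confirms it.
if send_to_nodemcu("HELLO"):
    print("Text delivered to the LCD / voice playback module.")
```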
I. Display (16x2 LCD) and Voice Playback for User Feedback
In the eye gaze-controlled typing system, a 16x2 LCD
display and a speaker are used to provide visual and audio
feedback to the user. The 16x2 LCD display is a type of
alphanumeric display that can show up to 32 characters at
a time in two lines of 16 characters each. It is a commonly
used display in embedded systems and is easy to interface
with using digital input/output pins. In the eye gaze-
controlled typing system, the LCD display is used to show
the text selected by the user on the virtual keyboard. This
allows the user to confirm that the correct text has been
selected before it is pronounced by the speaker. The
speaker is used to provide audio feedback to the user in
the form of spoken words. In the eye gaze-controlled
typing system, the speaker is connected to the NodeMCU
board and is used to pronounce the text selected by the
user on the virtual keyboard. The speaker can be any type of audio playback device, such as a simple piezoelectric buzzer or a more advanced speaker with built-in amplification. Both the display and the speaker are
connected to the NodeMCU board using digital
input/output pins. The NodeMCU board communicates
with these components using software libraries that allow
for easy interfacing with the display and speaker. This
allows the user to receive both visual and audio feedback
from the system, making it more user-friendly.
Fig.4.5. Display (16x2 LCD)
J. Output Text or Speech
The output of the eye gaze-controlled typing system can be
in the form of text or speech, depending on the user's
preference. If the user selects text as the output, the
system will display the selected text on the 16x2 LCD
display. The user can then read the text and use it as
needed. If the user selects speech as the output, the system
will pronounce the selected text using the speaker. The
speaker can be programmed to produce speech in a variety
of languages and accents, allowing for greater flexibility
and accessibility for users from different backgrounds. In
either case, the output text or speech is determined by the
user's selections on the virtual keyboard, which is
controlled by eye movements and blinking. This allows
users with motor impairments or disabilities to interact
with the system and produce text or speech output in a
way that is comfortable and natural for them.
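The paper does not name a speech engine, so as an assumption the sketch below uses the offline pyttsx3 library to show how the speech rate and voice (language/accent) settings mentioned above could be adjusted.

```python
import pyttsx3   # offline text-to-speech engine (assumed choice; not named in the paper)

engine = pyttsx3.init()
engine.setProperty("rate", 150)            # speech speed in words per minute

# Pick a voice; the available voices (and hence languages/accents) depend on the OS.
voices = engine.getProperty("voices")
if voices:
    engine.setProperty("voice", voices[0].id)

def speak(text: str) -> None:
    """Pronounce the text selected on the virtual keyboard."""
    engine.say(text)
    engine.runAndWait()

speak("HELLO")
```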
V. RESULT AND DISCUSSION
After implementing the complete setup, the message conveyor was run successfully several times to obtain results. This device is designed to help people who are physically disabled and unable to speak to convey their messages to others. The system detects eye blinking, and the letter on which the cursor is placed at the moment of the blink is shown on the screen, thus building up a sentence. The system is designed at a very low cost so that it is affordable to everyone and is accessible even to small children. The proposed system provides a robust writing technique for physically disabled people, although it is not a quick way of writing for able-bodied users. It offers a new dimension in the lives of disabled people who have no physical abilities other than eye movement.
VI. CONCLUSION
The Message conveyor for locked syndrome patients is a
significant advancement in assistive technology that
provides an intuitive and efficient means for individuals
with motor impairments or disabilities to communicate.
The system uses advanced computer vision algorithms to
detect the user's face, eyes, and gaze direction and
provides a simple and easy-to-use interface for controlling
the virtual keyboard. The system is designed to be
customizable, with various settings that can be adjusted to
suit the user's needs and preferences. The output can be in
the form of text or speech, and is displayed on an LCD
display or pronounced using a speaker. The Eye Gaze
Controlled Typing System has the potential to greatly
improve the quality of life for individuals with motor
impairments or disabilities, enabling them to
communicate more effectively and express themselves
more freely. It represents a significant step forward in
assistive technology and has the potential to become an
essential tool for individuals with disabilities.
More Related Content

Similar to MESSAGE CONVEYOR FOR LOCKED SYNDROME PATIENTS BY VIRTUAL KEYBOARD

An Automatic Car Engine Startup and Brake Oil Monitoring Using Sensors
An Automatic Car Engine Startup and Brake Oil Monitoring Using SensorsAn Automatic Car Engine Startup and Brake Oil Monitoring Using Sensors
An Automatic Car Engine Startup and Brake Oil Monitoring Using SensorsIJRES Journal
 
IRJET - Examination Forgery Avoidance System using Image Processing and IoT
IRJET - Examination Forgery Avoidance System using Image Processing and IoTIRJET - Examination Forgery Avoidance System using Image Processing and IoT
IRJET - Examination Forgery Avoidance System using Image Processing and IoTIRJET Journal
 
IRJET- Improvising Reliability of Autonomous Car using Risk Detection
IRJET-  	  Improvising Reliability of Autonomous Car using Risk DetectionIRJET-  	  Improvising Reliability of Autonomous Car using Risk Detection
IRJET- Improvising Reliability of Autonomous Car using Risk DetectionIRJET Journal
 
IRJET- Biometric based Bank Locker System
IRJET- Biometric based Bank Locker SystemIRJET- Biometric based Bank Locker System
IRJET- Biometric based Bank Locker SystemIRJET Journal
 
IRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance System
IRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance SystemIRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance System
IRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance SystemIRJET Journal
 
IRJET- Real-time Eye Tracking for Password Authentication
IRJET- Real-time Eye Tracking for Password AuthenticationIRJET- Real-time Eye Tracking for Password Authentication
IRJET- Real-time Eye Tracking for Password AuthenticationIRJET Journal
 
IRJET- Drive Assistance an Android Application for Drowsiness Detection
IRJET- Drive Assistance an Android Application for Drowsiness DetectionIRJET- Drive Assistance an Android Application for Drowsiness Detection
IRJET- Drive Assistance an Android Application for Drowsiness DetectionIRJET Journal
 
IRJET- Drivers Stupor Scrutinizing System
IRJET-  	  Drivers Stupor Scrutinizing SystemIRJET-  	  Drivers Stupor Scrutinizing System
IRJET- Drivers Stupor Scrutinizing SystemIRJET Journal
 
SMART MEDIA PLAYER USING AI
SMART MEDIA PLAYER USING AISMART MEDIA PLAYER USING AI
SMART MEDIA PLAYER USING AIIRJET Journal
 
Cursor Movement with Eyeball
Cursor Movement with EyeballCursor Movement with Eyeball
Cursor Movement with EyeballIRJET Journal
 
Hand Gesture Identification
Hand Gesture IdentificationHand Gesture Identification
Hand Gesture IdentificationIRJET Journal
 
IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...
IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...
IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...IRJET Journal
 
IRJET-Drunk and Drowsy Detection of Drivers using Artificial Intelligence
IRJET-Drunk and Drowsy Detection of Drivers using Artificial IntelligenceIRJET-Drunk and Drowsy Detection of Drivers using Artificial Intelligence
IRJET-Drunk and Drowsy Detection of Drivers using Artificial IntelligenceIRJET Journal
 
Human Driver’s Drowsiness Detection System
Human Driver’s Drowsiness Detection SystemHuman Driver’s Drowsiness Detection System
Human Driver’s Drowsiness Detection SystemIRJET Journal
 
A VISUAL ATTENDANCE SYSTEM USING FACE RECOGNITION
A VISUAL ATTENDANCE SYSTEM USING FACE RECOGNITIONA VISUAL ATTENDANCE SYSTEM USING FACE RECOGNITION
A VISUAL ATTENDANCE SYSTEM USING FACE RECOGNITIONIRJET Journal
 
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNINGSLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNINGIRJET Journal
 
IRJET- Smart Helmet for Visually Impaired
IRJET- Smart Helmet for Visually ImpairedIRJET- Smart Helmet for Visually Impaired
IRJET- Smart Helmet for Visually ImpairedIRJET Journal
 
Person Acquisition and Identification Tool
Person Acquisition and Identification ToolPerson Acquisition and Identification Tool
Person Acquisition and Identification ToolIRJET Journal
 
Stay Awake Alert: A Driver Drowsiness Detection System with Location Tracking...
Stay Awake Alert: A Driver Drowsiness Detection System with Location Tracking...Stay Awake Alert: A Driver Drowsiness Detection System with Location Tracking...
Stay Awake Alert: A Driver Drowsiness Detection System with Location Tracking...IRJET Journal
 

Similar to MESSAGE CONVEYOR FOR LOCKED SYNDROME PATIENTS BY VIRTUAL KEYBOARD (20)

Blue Eye Technology
Blue Eye TechnologyBlue Eye Technology
Blue Eye Technology
 
An Automatic Car Engine Startup and Brake Oil Monitoring Using Sensors
An Automatic Car Engine Startup and Brake Oil Monitoring Using SensorsAn Automatic Car Engine Startup and Brake Oil Monitoring Using Sensors
An Automatic Car Engine Startup and Brake Oil Monitoring Using Sensors
 
IRJET - Examination Forgery Avoidance System using Image Processing and IoT
IRJET - Examination Forgery Avoidance System using Image Processing and IoTIRJET - Examination Forgery Avoidance System using Image Processing and IoT
IRJET - Examination Forgery Avoidance System using Image Processing and IoT
 
IRJET- Improvising Reliability of Autonomous Car using Risk Detection
IRJET-  	  Improvising Reliability of Autonomous Car using Risk DetectionIRJET-  	  Improvising Reliability of Autonomous Car using Risk Detection
IRJET- Improvising Reliability of Autonomous Car using Risk Detection
 
IRJET- Biometric based Bank Locker System
IRJET- Biometric based Bank Locker SystemIRJET- Biometric based Bank Locker System
IRJET- Biometric based Bank Locker System
 
IRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance System
IRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance SystemIRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance System
IRJET- Vehicular Data Acquisition & Driver Behaviours Surveillance System
 
IRJET- Real-time Eye Tracking for Password Authentication
IRJET- Real-time Eye Tracking for Password AuthenticationIRJET- Real-time Eye Tracking for Password Authentication
IRJET- Real-time Eye Tracking for Password Authentication
 
IRJET- Drive Assistance an Android Application for Drowsiness Detection
IRJET- Drive Assistance an Android Application for Drowsiness DetectionIRJET- Drive Assistance an Android Application for Drowsiness Detection
IRJET- Drive Assistance an Android Application for Drowsiness Detection
 
IRJET- Drivers Stupor Scrutinizing System
IRJET-  	  Drivers Stupor Scrutinizing SystemIRJET-  	  Drivers Stupor Scrutinizing System
IRJET- Drivers Stupor Scrutinizing System
 
SMART MEDIA PLAYER USING AI
SMART MEDIA PLAYER USING AISMART MEDIA PLAYER USING AI
SMART MEDIA PLAYER USING AI
 
Cursor Movement with Eyeball
Cursor Movement with EyeballCursor Movement with Eyeball
Cursor Movement with Eyeball
 
Hand Gesture Identification
Hand Gesture IdentificationHand Gesture Identification
Hand Gesture Identification
 
IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...
IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...
IRJET- Igaze-Eye Gaze Direction Evaluation to Operate a Virtual Keyboard for ...
 
IRJET-Drunk and Drowsy Detection of Drivers using Artificial Intelligence
IRJET-Drunk and Drowsy Detection of Drivers using Artificial IntelligenceIRJET-Drunk and Drowsy Detection of Drivers using Artificial Intelligence
IRJET-Drunk and Drowsy Detection of Drivers using Artificial Intelligence
 
Human Driver’s Drowsiness Detection System
Human Driver’s Drowsiness Detection SystemHuman Driver’s Drowsiness Detection System
Human Driver’s Drowsiness Detection System
 
A VISUAL ATTENDANCE SYSTEM USING FACE RECOGNITION
A VISUAL ATTENDANCE SYSTEM USING FACE RECOGNITIONA VISUAL ATTENDANCE SYSTEM USING FACE RECOGNITION
A VISUAL ATTENDANCE SYSTEM USING FACE RECOGNITION
 
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNINGSLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
 
IRJET- Smart Helmet for Visually Impaired
IRJET- Smart Helmet for Visually ImpairedIRJET- Smart Helmet for Visually Impaired
IRJET- Smart Helmet for Visually Impaired
 
Person Acquisition and Identification Tool
Person Acquisition and Identification ToolPerson Acquisition and Identification Tool
Person Acquisition and Identification Tool
 
Stay Awake Alert: A Driver Drowsiness Detection System with Location Tracking...
Stay Awake Alert: A Driver Drowsiness Detection System with Location Tracking...Stay Awake Alert: A Driver Drowsiness Detection System with Location Tracking...
Stay Awake Alert: A Driver Drowsiness Detection System with Location Tracking...
 

More from IRJET Journal

TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...
TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...
TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...IRJET Journal
 
STUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURE
STUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURESTUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURE
STUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTUREIRJET Journal
 
A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...
A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...
A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...IRJET Journal
 
Effect of Camber and Angles of Attack on Airfoil Characteristics
Effect of Camber and Angles of Attack on Airfoil CharacteristicsEffect of Camber and Angles of Attack on Airfoil Characteristics
Effect of Camber and Angles of Attack on Airfoil CharacteristicsIRJET Journal
 
A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...
A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...
A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...IRJET Journal
 
Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...
Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...
Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...IRJET Journal
 
Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...
Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...
Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...IRJET Journal
 
A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...
A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...
A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...IRJET Journal
 
A REVIEW ON MACHINE LEARNING IN ADAS
A REVIEW ON MACHINE LEARNING IN ADASA REVIEW ON MACHINE LEARNING IN ADAS
A REVIEW ON MACHINE LEARNING IN ADASIRJET Journal
 
Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...
Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...
Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...IRJET Journal
 
P.E.B. Framed Structure Design and Analysis Using STAAD Pro
P.E.B. Framed Structure Design and Analysis Using STAAD ProP.E.B. Framed Structure Design and Analysis Using STAAD Pro
P.E.B. Framed Structure Design and Analysis Using STAAD ProIRJET Journal
 
A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...
A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...
A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...IRJET Journal
 
Survey Paper on Cloud-Based Secured Healthcare System
Survey Paper on Cloud-Based Secured Healthcare SystemSurvey Paper on Cloud-Based Secured Healthcare System
Survey Paper on Cloud-Based Secured Healthcare SystemIRJET Journal
 
Review on studies and research on widening of existing concrete bridges
Review on studies and research on widening of existing concrete bridgesReview on studies and research on widening of existing concrete bridges
Review on studies and research on widening of existing concrete bridgesIRJET Journal
 
React based fullstack edtech web application
React based fullstack edtech web applicationReact based fullstack edtech web application
React based fullstack edtech web applicationIRJET Journal
 
A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...
A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...
A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...IRJET Journal
 
A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.
A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.
A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.IRJET Journal
 
Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...
Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...
Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...IRJET Journal
 
Multistoried and Multi Bay Steel Building Frame by using Seismic Design
Multistoried and Multi Bay Steel Building Frame by using Seismic DesignMultistoried and Multi Bay Steel Building Frame by using Seismic Design
Multistoried and Multi Bay Steel Building Frame by using Seismic DesignIRJET Journal
 
Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...
Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...
Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...IRJET Journal
 

More from IRJET Journal (20)

TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...
TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...
TUNNELING IN HIMALAYAS WITH NATM METHOD: A SPECIAL REFERENCES TO SUNGAL TUNNE...
 
STUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURE
STUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURESTUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURE
STUDY THE EFFECT OF RESPONSE REDUCTION FACTOR ON RC FRAMED STRUCTURE
 
A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...
A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...
A COMPARATIVE ANALYSIS OF RCC ELEMENT OF SLAB WITH STARK STEEL (HYSD STEEL) A...
 
Effect of Camber and Angles of Attack on Airfoil Characteristics
Effect of Camber and Angles of Attack on Airfoil CharacteristicsEffect of Camber and Angles of Attack on Airfoil Characteristics
Effect of Camber and Angles of Attack on Airfoil Characteristics
 
A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...
A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...
A Review on the Progress and Challenges of Aluminum-Based Metal Matrix Compos...
 
Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...
Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...
Dynamic Urban Transit Optimization: A Graph Neural Network Approach for Real-...
 
Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...
Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...
Structural Analysis and Design of Multi-Storey Symmetric and Asymmetric Shape...
 
A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...
A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...
A Review of “Seismic Response of RC Structures Having Plan and Vertical Irreg...
 
A REVIEW ON MACHINE LEARNING IN ADAS
A REVIEW ON MACHINE LEARNING IN ADASA REVIEW ON MACHINE LEARNING IN ADAS
A REVIEW ON MACHINE LEARNING IN ADAS
 
Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...
Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...
Long Term Trend Analysis of Precipitation and Temperature for Asosa district,...
 
P.E.B. Framed Structure Design and Analysis Using STAAD Pro
P.E.B. Framed Structure Design and Analysis Using STAAD ProP.E.B. Framed Structure Design and Analysis Using STAAD Pro
P.E.B. Framed Structure Design and Analysis Using STAAD Pro
 
A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...
A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...
A Review on Innovative Fiber Integration for Enhanced Reinforcement of Concre...
 
Survey Paper on Cloud-Based Secured Healthcare System
Survey Paper on Cloud-Based Secured Healthcare SystemSurvey Paper on Cloud-Based Secured Healthcare System
Survey Paper on Cloud-Based Secured Healthcare System
 
Review on studies and research on widening of existing concrete bridges
Review on studies and research on widening of existing concrete bridgesReview on studies and research on widening of existing concrete bridges
Review on studies and research on widening of existing concrete bridges
 
React based fullstack edtech web application
React based fullstack edtech web applicationReact based fullstack edtech web application
React based fullstack edtech web application
 
A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...
A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...
A Comprehensive Review of Integrating IoT and Blockchain Technologies in the ...
 
A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.
A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.
A REVIEW ON THE PERFORMANCE OF COCONUT FIBRE REINFORCED CONCRETE.
 
Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...
Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...
Optimizing Business Management Process Workflows: The Dynamic Influence of Mi...
 
Multistoried and Multi Bay Steel Building Frame by using Seismic Design
Multistoried and Multi Bay Steel Building Frame by using Seismic DesignMultistoried and Multi Bay Steel Building Frame by using Seismic Design
Multistoried and Multi Bay Steel Building Frame by using Seismic Design
 
Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...
Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...
Cost Optimization of Construction Using Plastic Waste as a Sustainable Constr...
 

Recently uploaded

Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service Nashik
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service NashikCall Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service Nashik
Call Girls Service Nashik Vaishnavi 7001305949 Independent Escort Service NashikCall Girls in Nagpur High Profile
 
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSAPPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICS
APPLICATIONS-AC/DC DRIVES-OPERATING CHARACTERISTICSKurinjimalarL3
 
Introduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptxIntroduction and different types of Ethernet.pptx
Introduction and different types of Ethernet.pptxupamatechverse
 
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...Booking open Available Pune Call Girls Koregaon Park  6297143586 Call Hot Ind...
Booking open Available Pune Call Girls Koregaon Park 6297143586 Call Hot Ind...Call Girls in Nagpur High Profile
 
AKTU Computer Networks notes --- Unit 3.pdf
AKTU Computer Networks notes ---  Unit 3.pdfAKTU Computer Networks notes ---  Unit 3.pdf
AKTU Computer Networks notes --- Unit 3.pdfankushspencer015
 
Processing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxProcessing & Properties of Floor and Wall Tiles.pptx
Processing & Properties of Floor and Wall Tiles.pptxpranjaldaimarysona
 
UNIT-II FMM-Flow Through Circular Conduits
UNIT-II FMM-Flow Through Circular ConduitsUNIT-II FMM-Flow Through Circular Conduits
UNIT-II FMM-Flow Through Circular Conduitsrknatarajan
 
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur EscortsHigh Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escorts
High Profile Call Girls Nagpur Isha Call 7001035870 Meet With Nagpur Escortsranjana rawat
 
Introduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxIntroduction to IEEE STANDARDS and its different types.pptx
Introduction to IEEE STANDARDS and its different types.pptxupamatechverse
 
UNIT-III FMM. DIMENSIONAL ANALYSIS
UNIT-III FMM.        DIMENSIONAL ANALYSISUNIT-III FMM.        DIMENSIONAL ANALYSIS
UNIT-III FMM. DIMENSIONAL ANALYSISrknatarajan
 
Microscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxMicroscopic Analysis of Ceramic Materials.pptx
Microscopic Analysis of Ceramic Materials.pptxpurnimasatapathy1234
 
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur Suman Call 7001035870 Meet With Nagpur Escorts
Call Girls in Nagpur Suman Call 7001035870 Meet With Nagpur EscortsCall Girls in Nagpur High Profile
 
UNIT - IV - Air Compressors and its Performance
UNIT - IV - Air Compressors and its PerformanceUNIT - IV - Air Compressors and its Performance
UNIT - IV - Air Compressors and its Performancesivaprakash250
 
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINEMANUFACTURING PROCESS-II UNIT-2 LATHE MACHINE
MANUFACTURING PROCESS-II UNIT-2 LATHE MACHINESIVASHANKAR N
 
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escorts
(MEERA) Dapodi Call Girls Just Call 7001035870 [ Cash on Delivery ] Pune Escortsranjana rawat
 
Introduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxIntroduction to Multiple Access Protocol.pptx
Introduction to Multiple Access Protocol.pptxupamatechverse
 
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSHARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICS
HARDNESS, FRACTURE TOUGHNESS AND STRENGTH OF CERAMICSRajkumarAkumalla
 
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...
OSVC_Meta-Data based Simulation Automation to overcome Verification Challenge...Soham Mondal
 
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130
VIP Call Girls Service Hitech City Hyderabad Call +91-8250192130Suhani Kapoor
 
Software Development Life Cycle By Team Orange (Dept. of Pharmacy)
Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)Software Development Life Cycle By  Team Orange (Dept. of Pharmacy)
Predictor 68 Face Landmark model. Eye blinking is detected by drawing two reference lines across each eye and checking for the disappearance of the vertical line, while gaze direction is estimated by thresholding the detected eyeballs and counting the white part of the eye on either side of the pupil. The virtual keyboard is displayed on the screen and is controlled by these eye movements and blinks: the user selects letters and symbols by blinking, and additional controls allow settings such as the language and the speed of the speech output to be adjusted. The output can take the form of text or speech and is shown on a 16x2 LCD or pronounced through a speaker, according to the user's selections on the virtual keyboard. Overall, the Eye Gaze Controlled Typing System provides a new way for individuals with motor impairments or disabilities to communicate and express themselves.
II. BLOCK DIAGRAM

Fig. 2.1. Block Diagram

III. PROPOSED METHODOLOGY

The camera or video input device captures the user's face and eye movements. The face detection component uses the Histogram of Oriented Gradients (HOG) feature descriptor to locate the user's face in each frame. The Shape Predictor 68 Face Landmarks model uses 68 specific points to identify features such as the eyes, nose, lips, eyebrows, and face area, and the eye detection component uses these landmarks to isolate the user's eyes. The eye blink detection component tracks a vertical and a horizontal line drawn across each eye to detect blinking, while the eye gaze detection component tracks the user's eyeballs and applies a threshold to isolate the white part of the eye, which allows the system to determine which part of the virtual keyboard the user is looking at. The virtual keyboard is displayed on the screen, and the user selects letters and other characters by blinking. The NodeMCU, with its built-in Wi-Fi, receives the user's text selection and sends it to the display (16x2 LCD) and voice playback components for user feedback, so the user can interact with the system and control its functions.

IV. MODULE DESCRIPTION

A. Camera or Video Input Device

The camera used in the eye gaze-controlled typing system can vary depending on the specific implementation and requirements, but a camera with high resolution and a good frame rate is preferred for accurate tracking of the user's eye movements. In general, a resolution of 720p or higher is suitable for eye tracking, and the frame rate should be at least 30 frames per second (fps) to ensure smooth and accurate tracking. The camera should also have good low-light performance, as the system may be used under different lighting conditions. Both webcams and smartphone cameras can be used for eye tracking; webcams are typically preferred for their ease of use and stability, while smartphone cameras may be more portable and flexible in terms of placement. The camera must also be compatible with the software used for face detection, eye detection, and eye tracking; the specific software may have recommended camera requirements, so it is important to check the software documentation or the manufacturer's guidance before choosing a device.

Fig. 4.1. Web Camera
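As an illustration, a minimal OpenCV capture setup along the following lines could be used to request the resolution and frame rate discussed above; the device index and the exact values are assumptions that would depend on the actual hardware:

```python
import cv2

# Open the default camera (index 0 is an assumption; adjust for the actual device).
cap = cv2.VideoCapture(0)

# Request 720p at 30 fps, as recommended above; the driver may silently
# fall back to the nearest supported mode.
cap.set(cv2.CAP_PROP_FRAME_WIDTH, 1280)
cap.set(cv2.CAP_PROP_FRAME_HEIGHT, 720)
cap.set(cv2.CAP_PROP_FPS, 30)

ret, frame = cap.read()
if ret:
    # Grayscale frames are what the later face and eye processing stages use.
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    print("Captured frame:", gray.shape)
cap.release()
```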
B. Face Detection (using HOG)

The face detection block in the eye gaze-controlled typing system uses the Histogram of Oriented Gradients (HOG) feature descriptor to detect faces in the captured image. HOG is a popular feature descriptor in computer vision and image processing for object detection, including face detection. In brief, HOG-based face detection proceeds through the following steps (an illustrative descriptor computation is sketched after the list):

Image pre-processing: The input image is pre-processed to improve the performance of the face detection algorithm. This may include steps such as grayscale conversion, contrast normalization, and histogram equalization.

Sliding window approach: The input image is divided into small, overlapping sub-windows or regions, and each sub-window is represented as a feature vector using the HOG descriptor.

Gradient orientation calculation: The HOG descriptor counts the occurrences of gradient orientations within each sub-window. The gradient orientation is obtained by computing the gradient magnitude and direction of the pixels in the sub-window.

Histogram of oriented gradients: The gradient orientation values are grouped into bins to form a histogram of oriented gradients for each sub-window; the HOG descriptor is the concatenation of these histograms.

Support Vector Machine (SVM) classification: The HOG descriptor is used as input to a binary SVM classifier, which is trained to distinguish between face and non-face sub-windows using a large number of positive (face) and negative (non-face) examples.

Non-maximum suppression: The output of the classifier is a set of bounding boxes around potential face regions in the input image. To reduce false positives, non-maximum suppression eliminates overlapping bounding boxes and keeps only the most likely face region.
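For illustration only, the sketch below computes a HOG feature vector for a single window using OpenCV's HOGDescriptor; the window size and parameters are the descriptor's defaults rather than values taken from the paper, and the actual face detection in this system is performed by dlib's pre-trained HOG detector described next:

```python
import cv2

# Load a sample frame and convert to grayscale (the file path is a placeholder).
img = cv2.imread("sample_frame.jpg", cv2.IMREAD_GRAYSCALE)

# OpenCV's default HOGDescriptor uses a 64x128 window, 16x16 blocks,
# 8x8 cells and 9 orientation bins.
hog = cv2.HOGDescriptor()

# Resize the region of interest to the descriptor's window size so that
# compute() returns a single 3780-dimensional feature vector.
window = cv2.resize(img, (64, 128))
features = hog.compute(window)
print("HOG feature vector length:", features.shape)
```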
The face detection block in the eye gaze-controlled typing system uses the dlib library, alongside OpenCV for image handling, to implement the HOG-based face detection algorithm. This algorithm has been shown to be more accurate than many other classical face detection methods, making it a popular choice for computer vision applications.

Fig. 4.2. Face Detection using HOG
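A minimal sketch of this step, assuming dlib is installed and a grayscale frame from the camera module is available, might look as follows:

```python
import cv2
import dlib

# dlib's frontal face detector is a pre-trained HOG + linear SVM detector.
detector = dlib.get_frontal_face_detector()

cap = cv2.VideoCapture(0)          # camera index is an assumption
ret, frame = cap.read()
cap.release()

gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)

# The second argument (0) means the image is not upsampled before detection.
faces = detector(gray, 0)
for face in faces:
    print("Face found at:", face.left(), face.top(), face.right(), face.bottom())
```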
C. Shape Predictor 68 Face Landmark

The shape predictor 68 face landmark model is a machine learning-based algorithm used to locate 68 specific landmarks, or key points, on the face. It is commonly used in computer vision applications such as face tracking, face recognition, and facial expression analysis. In overview, it works as follows:

Training data: The model is trained on a large dataset of face images labeled with the locations of the 68 facial landmarks. The training process uses machine learning algorithms to learn the relationship between facial features and landmark locations.

Feature extraction: The algorithm uses feature extraction techniques, in particular HOG features computed from the image, which are then used to train a regression model that predicts the locations of the 68 landmarks.

Regression model: The trained regression model takes the extracted HOG features as input and outputs the landmark coordinates.

Facial landmark detection: Once trained, the model can detect the 68 facial landmarks in new images by applying the regression model to the input image.

Landmark tracking: The landmark locations can be used to track the face as it moves, which is useful for applications such as facial expression analysis or eye gaze tracking.

In the eye gaze-controlled typing system, the shape predictor 68 face landmark model is used to locate the user's eyes. The algorithm identifies specific landmarks, such as the corners of the eyes and the iris region, and from these the eye regions are extracted. Once the eye regions are located, the system applies additional image processing techniques to detect eye blinks and track the user's gaze so that keys on the virtual keyboard can be selected.

Fig. 4.3. Shape Predictor 68 Face Landmarks

D. Eye Gaze Detection (using eyeball tracking)

To detect eye gaze, the system uses image processing techniques to analyze the video feed from the camera and monitor changes in the eye regions over time. The basic principle is that the position of the iris within the eye can be used to infer the direction of the person's gaze. The gaze detection algorithm first isolates the eye regions using the shape predictor 68 face landmark model. Once the eye regions are identified, the system applies a threshold to the grayscale image to convert it into a binary image, which is then used to detect the iris and estimate the position of the pupil center. From the estimated pupil center, the system uses techniques such as Hough transforms and circular pattern matching to estimate the position and direction of the gaze, typically by comparing the pupil center with the positions of the eye corners and landmarks and using this information to infer the gaze direction. By monitoring the gaze direction, the system allows users to select keys on the virtual keyboard simply by moving their eyes; a sketch of one way to realize this is given below.
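The sketch below combines the landmark model with the thresholding step: it uses dlib's publicly available shape_predictor_68_face_landmarks.dat model and estimates gaze by comparing the amount of white (sclera) on each side of the eye, which is one simple realization of the approach described above. The threshold value and the landmark indices for one eye (36 to 41 in the standard 68-point layout) are assumptions, and the Hough-transform refinement mentioned above is omitted for brevity:

```python
import cv2
import dlib
import numpy as np

detector = dlib.get_frontal_face_detector()
# Standard pre-trained 68-point landmark model (file name assumed).
predictor = dlib.shape_predictor("shape_predictor_68_face_landmarks.dat")

def gaze_direction(gray, face, eye_indices=range(36, 42), thresh=70):
    """Return 'LEFT', 'RIGHT' or 'CENTER' for one eye region."""
    landmarks = predictor(gray, face)
    pts = np.array([(landmarks.part(i).x, landmarks.part(i).y) for i in eye_indices],
                   dtype=np.int32)

    # Crop a bounding box around the eye landmarks.
    x, y, w, h = cv2.boundingRect(pts)
    eye = gray[y:y + h, x:x + w]

    # Threshold so the dark iris/pupil becomes black and the sclera white.
    _, binary = cv2.threshold(eye, thresh, 255, cv2.THRESH_BINARY)

    # Count white pixels in the left and right halves of the eye crop.
    half = binary.shape[1] // 2
    left_white = cv2.countNonZero(binary[:, :half])
    right_white = cv2.countNonZero(binary[:, half:])

    ratio = (left_white + 1) / (right_white + 1)
    if ratio > 1.5:
        return "RIGHT"   # more sclera visible on the left half of the crop
    if ratio < 0.67:
        return "LEFT"
    return "CENTER"
```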
E. Eye Blink Detection (using vertical and horizontal lines)

To detect eye blinks, the system analyzes the video feed from the camera and monitors changes in the eye regions over time. The basic principle is that the eyelids close when a person blinks, causing a temporary occlusion of the eye region; this change can be detected using image processing techniques such as edge detection, thresholding, and blob analysis. The blink detection algorithm first isolates the eye regions using the shape predictor 68 face landmark model, then applies a threshold to the grayscale image to convert it into a binary image, which is used to detect changes in the eye regions over time. Two lines are drawn over each eye region, one vertical and one horizontal. When the eye is open, both lines are visible; when the eye blinks, the vertical line disappears temporarily because the eyelid occludes the eye. The system tracks this disappearance of the vertical line and uses it to detect blinks, as sketched below.
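One way to implement this, under the same landmark layout assumed earlier, is to measure the vertical line between the upper and lower eyelid landmarks and the horizontal line between the eye corners; when their ratio exceeds a threshold (the value 5.0 below is an assumption that would need tuning), the eye is treated as closed:

```python
import math

def blink_ratio(landmarks, eye_indices=(36, 37, 38, 39, 40, 41)):
    """Ratio of the horizontal to the vertical eye line; it grows large when the eye closes."""
    p = [(landmarks.part(i).x, landmarks.part(i).y) for i in eye_indices]

    # Horizontal line between the two eye corners.
    hor_len = math.dist(p[0], p[3])

    # Vertical line between the midpoints of the upper and lower eyelid landmarks.
    top_mid = ((p[1][0] + p[2][0]) / 2, (p[1][1] + p[2][1]) / 2)
    bottom_mid = ((p[4][0] + p[5][0]) / 2, (p[4][1] + p[5][1]) / 2)
    ver_len = math.dist(top_mid, bottom_mid)

    return hor_len / (ver_len + 1e-6)

# Example usage inside the main loop (the threshold value is an assumption):
# if blink_ratio(predictor(gray, face)) > 5.0:
#     register_blink()   # hypothetical handler that confirms the current key
```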
F. Virtual Keyboard Displayed on Screen

The virtual keyboard in the eye gaze-controlled typing system is a software-based user interface that allows users to input text using eye movements. It is displayed on the screen and consists of a grid of keys, similar to a standard physical keyboard; the keys can include letters, numbers, and symbols, depending on the language and preferences of the user. The keyboard is designed to be operated with eye movements alone, specifically eye blinks and gaze direction. To select a key, the user first gazes at the key they want; once the desired key is in focus, the user blinks to activate the selection. The system then registers the selected key and builds up the message by adding the character to a buffer. The keyboard output is received by the NodeMCU, a microcontroller with built-in Wi-Fi, which drives the voice playback module and the 16x2 LCD: the LCD shows the received text, and the speaker pronounces it once the text is stored. By combining the virtual keyboard with these hardware components, the system provides a complete solution for text input and output using only eye movements.

G. Selecting Letters by Blinking Eyes

Each letter on the virtual keyboard is selected by a combination of eye gaze and eye blink. The user gazes at the letter they want to select; once the desired letter is in focus, the user blinks to select it. The system detects the blink, registers the selection, and adds the letter to a buffer that holds the text currently being composed. The process is repeated for each letter the user wants to select. A computer vision algorithm distinguishes deliberate blinks from other eye movements: it detects when the upper and lower eyelids come together and then separate, and by setting a threshold on the duration of this action it can distinguish a blink from saccades or fixations. The combination of eye gaze and eye blink allows letters to be selected with a high degree of accuracy and precision. It can take some practice to get used to this method of text input, but it is an effective way for individuals with motor disabilities to communicate using only their eyes. A simplified selection loop is sketched below.
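The following loop, which reuses the hypothetical gaze_direction and blink_ratio helpers sketched earlier, illustrates how gaze could move a highlight across a row of keys and a blink could confirm the highlighted key. The key layout, dwell behaviour, and thresholds are assumptions rather than details taken from the paper:

```python
import cv2

KEYS = list("ABCDEFGHIJKLMNOPQRSTUVWXYZ ")   # simplified single-row layout (assumption)

def run_keyboard(cap, detector, predictor):
    index, buffer = 0, ""
    while True:
        ret, frame = cap.read()
        if not ret:
            break
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        faces = detector(gray, 0)
        if faces:
            face = faces[0]
            landmarks = predictor(gray, face)

            # Gaze moves the highlighted key left or right.
            direction = gaze_direction(gray, face)
            if direction == "RIGHT":
                index = (index + 1) % len(KEYS)
            elif direction == "LEFT":
                index = (index - 1) % len(KEYS)

            # A blink confirms the highlighted key (threshold is an assumption).
            if blink_ratio(landmarks) > 5.0:
                buffer += KEYS[index]

        # Draw the key row and the current buffer as on-screen feedback.
        cv2.putText(frame, " ".join(KEYS), (10, 40),
                    cv2.FONT_HERSHEY_SIMPLEX, 0.5, (255, 255, 255), 1)
        cv2.putText(frame, "Selected: " + KEYS[index] + "   Text: " + buffer,
                    (10, 80), cv2.FONT_HERSHEY_SIMPLEX, 0.6, (0, 255, 0), 2)
        cv2.imshow("Virtual Keyboard", frame)
        if cv2.waitKey(30) & 0xFF == 27:      # Esc to exit
            break
    return buffer
```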
H. NodeMCU with Built-in Wi-Fi

NodeMCU is an open-source development board designed for Internet of Things (IoT) applications. It is based on the ESP8266 microcontroller and includes built-in Wi-Fi, which makes it easy to connect to the internet and communicate with other devices. The board is similar to other development boards such as the Arduino, but it is targeted specifically at IoT applications and is commonly programmed using the Lua scripting language, which is easy to learn and use. Several features make NodeMCU a popular choice for IoT projects: built-in Wi-Fi for connecting to the internet and to other devices; general-purpose input/output (GPIO) pins for connecting sensors, actuators, and other peripherals; analog input pins for reading sensors that output analog signals; a USB interface through which the board can be programmed and powered; and a relatively low cost that makes it accessible to hobbyists and students. In the eye gaze-controlled typing system, the NodeMCU receives the text selected by the user on the virtual keyboard, displays it on the 16x2 LCD, and pronounces it through the speaker connected to the board. Because the NodeMCU communicates with the other components over Wi-Fi, no additional wires or connections are needed between the computer running the vision software and the output hardware.

Fig. 4.4. NodeMCU with Built-in Wi-Fi
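The paper does not specify the wire protocol between the computer and the NodeMCU. As one plausible sketch, the vision script could push the composed text to the board over a plain HTTP request, assuming the NodeMCU firmware exposes a simple endpoint; the IP address, port, path, and parameter names below are all assumptions:

```python
import urllib.parse
import urllib.request

NODEMCU_URL = "http://192.168.4.1/message"   # assumed address of the NodeMCU endpoint

def send_to_nodemcu(text, mode="speech"):
    """Send the composed text to the NodeMCU, which shows it on the LCD
    and, if mode is 'speech', plays it back through the speaker."""
    data = urllib.parse.urlencode({"text": text, "mode": mode}).encode("utf-8")
    with urllib.request.urlopen(NODEMCU_URL, data=data, timeout=5) as resp:
        return resp.status == 200

# Example: forward the buffer produced by the virtual keyboard loop.
# send_to_nodemcu("HELLO", mode="text")
```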
I. Display (16x2 LCD) and Voice Playback for User Feedback

A 16x2 LCD display and a speaker provide visual and audio feedback to the user. The 16x2 LCD is an alphanumeric display that can show up to 32 characters at a time, in two lines of 16 characters each; it is commonly used in embedded systems and is easy to interface with through digital input/output pins. In this system, the LCD shows the text selected by the user on the virtual keyboard, allowing the user to confirm that the correct text has been selected before it is pronounced by the speaker. The speaker provides audio feedback in the form of spoken words: it is connected to the NodeMCU board and pronounces the text selected on the virtual keyboard. The speaker can be any audio playback device, from a simple piezoelectric buzzer to a more advanced speaker with built-in amplification. Both the display and the speaker are connected to the NodeMCU through digital input/output pins, and the NodeMCU drives them using software libraries that make interfacing straightforward. The user therefore receives both visual and audio feedback from the system, which makes it more user-friendly.

Fig. 4.5. Display (16x2 LCD)

J. Output Text or Speech

The output of the eye gaze-controlled typing system can be in the form of text or speech, depending on the user's preference. If text output is selected, the system displays the selected text on the 16x2 LCD, where the user can read it and use it as needed. If speech output is selected, the system pronounces the text through the speaker, which can be programmed to produce speech in a variety of languages and accents, giving greater flexibility and accessibility to users from different backgrounds. In either case, the output is determined by the user's selections on the virtual keyboard, which is controlled entirely by eye movements and blinking. This allows users with motor impairments or disabilities to interact with the system and produce text or speech output in a way that is comfortable and natural for them.

V. RESULT AND DISCUSSION

After completing the setup, the message conveyor was run several times to obtain results. The device is designed to help people who are physically disabled and unable to speak to convey their messages to others. The system detects eye blinks, and the letter on which the cursor is placed at the moment of the blink is shown on the screen, so that a sentence is built up letter by letter. The system is designed at a very low cost so that it is affordable to everyone and accessible even to small children. It provides a robust writing technique for physically disabled people, although it is not a quick way of writing for able-bodied users. The proposed system thus offers a new dimension in the lives of disabled people whose only remaining physical ability is eye movement.
VI. CONCLUSION

The message conveyor for locked-in syndrome patients is a significant advancement in assistive technology that provides an intuitive and efficient means of communication for individuals with motor impairments or disabilities. The system uses computer vision algorithms to detect the user's face, eyes, and gaze direction, and provides a simple, easy-to-use interface for controlling the virtual keyboard. It is designed to be customizable, with settings that can be adjusted to suit the user's needs and preferences, and its output can be produced as text on an LCD display or as speech through a speaker. The Eye Gaze Controlled Typing System has the potential to greatly improve the quality of life of individuals with motor impairments or disabilities, enabling them to communicate more effectively and express themselves more freely. It represents a significant step forward in assistive technology and could become an essential tool for individuals with disabilities.