International Research Journal of Engineering and Technology (IRJET) e-ISSN: 2395-0056
Volume: 06 Issue: 03 | Mar 2019 www.irjet.net p-ISSN: 2395-0072
© 2019, IRJET | Impact Factor value: 7.211 | ISO 9001:2008 Certified Journal | Page 6951
Survey on Sign Language and Gesture Recognition System
Hema B N1, Sania Anjum2, Umme Hani3, Vanaja P4, Akshatha M5
1,2,3,4 Students, Dept. of Computer Science and Engineering, Vidya vikas Institute of Engineering and Technology,
Mysore, Karnataka, India
5 Professor, Dept. of Computer Science and Engineering, Vidya vikas Institute of Engineering and Technology,
Mysore, Karnataka, India
---------------------------------------------------------------------***---------------------------------------------------------------------
Abstract - Every day we see many disabled people, such as the deaf, mute and blind, who face difficulty interacting with others. Around 4 million people in India are deaf and around 10 million are mute. They use sign language to communicate with each other and with hearing people, but hearing people find it difficult to understand the signs and gestures made by deaf and mute people. There are many techniques that can be used to convert the sign language made by the disabled into a form (such as text or voice) that hearing people can understand. Previously developed techniques were all sensor based and did not give a general solution. Recently, a number of vision based techniques have become available to accomplish this. This paper explains the various available techniques that can be used to convert sign language and gestures into an understandable format.
Key Words: Sign Language Recognition (SLR), Human Computer Interaction (HCI), 7Hu (7 Moments), K-Nearest Neighbour (KNN), Probabilistic Neural Network (PNN), Histogram of Oriented Gradients (HOG).
1. INTRODUCTION
Hand gesture recognition provides an intelligent, natural, and convenient way of human–computer interaction (HCI). Sign language recognition (SLR) and gesture-based control are two major applications of hand gesture recognition technologies. SLR aims to interpret sign languages automatically by computer in order to help the deaf communicate with the hearing society conveniently. Since sign language is a highly structured and largely symbolic set of human gestures, SLR also serves as a good basis for the development of general gesture-based HCI [1]. In this paper we discuss work done in the area of hand gesture recognition and analyze the methods for recognizing hand gestures. Mute people use hand signs to communicate, and hearing people have difficulty recognizing the language of these signs; hence there is a need for systems that recognize the different signs and convey the information to hearing people. No single form of sign language is universal: it varies from region to region and country to country, and a single gesture can carry a different meaning in a different part of the world. Available sign languages include American Sign Language (ASL), British Sign Language (BSL), Turkish Sign Language (TSL), Indian Sign Language (ISL) and many more.
2. SIGN LANGUAGE
Sign language is a means of communication among the deaf and mute community. It emerges and evolves naturally within hearing-impaired communities, and it differs from people to people and from region to region; there is no single standardized sign language that everyone follows. Since sign language is the communicating language of deaf and mute people, it can make it easier for them to communicate with hearing people too. Gesture recognition is an efficient way through which the movements of the hand can be recognized easily. According to surveys made by several organizations, there are many deaf and mute people who find it difficult to communicate, and gestures are an easy way of communication for them.
2.1 Sensor Based Approach
This approach collects data on the performed gesture using different sensors. The data are then analyzed and conclusions drawn in accordance with the recognition model. For hand gesture recognition, different types of sensors were used and placed on the hand; when the hand performs any gesture, the data are recorded and then further analyzed. The first sensor used was the data glove; later, LEDs came into use. The first data glove was invented in 1977. The sensor based approach hampers the natural motion of the hand because of the external hardware. Its major disadvantage is that complex gestures cannot be performed using this method [1].
2.2 Vision Based Approach
This approach takes images from a camera as gesture data. The vision based method mainly concentrates on the captured image of the gesture, extracts the main features and recognizes it. Colour bands were used at the start of the vision based approach; the main disadvantage of this method was that standard colours had to be worn on the fingertips. Later, the use of bare hands was preferred over colour bands [1].
3. LITERATURE SURVEY
In this paper [2] the signs and hand gestures are captured, processed with the help of MATLAB, and then converted into
speech and text form. Feature extraction from the image values is based on the 7Hu (7 moments) invariant moment technique, and for classification the K-Nearest Neighbour (KNN) technique is compared with a Probabilistic Neural Network (PNN) for accuracy rate. The performance of the proposed KNN classifier is decided based on various parameters, which can be calculated by formulas. Feature extraction is done using the 7Hu moments technique: the 7Hu moments are a vector of algebraic invariants derived from the regular moments. They are invariant under changes of translation, size and rotation, and have been widely used in classification [2]. The KNN (k-nearest neighbours) algorithm classifies objects on the basis of a majority vote of their neighbours: an object is assigned to the class most common among its k nearest neighbours (k is a small positive integer).
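The majority-vote rule above can be sketched in plain Python. The 7-element "Hu moment" vectors and the gesture labels below are invented for illustration; they are not taken from the paper:

```python
from collections import Counter
import math

def knn_predict(train, query, k=3):
    """Classify `query` by a majority vote among its k nearest
    neighbours (Euclidean distance over the feature vectors)."""
    nearest = sorted(train, key=lambda item: math.dist(item[0], query))[:k]
    votes = Counter(label for _, label in nearest)
    return votes.most_common(1)[0][0]

# Invented 7-element "Hu moment" vectors for two gesture classes.
train = [
    ([0.20, 0.010, 0.0030, 0.001, 0.0, 0.0, 0.0], "A"),
    ([0.21, 0.012, 0.0020, 0.001, 0.0, 0.0, 0.0], "A"),
    ([0.60, 0.050, 0.0200, 0.010, 0.0, 0.0, 0.0], "B"),
    ([0.58, 0.055, 0.0180, 0.011, 0.0, 0.0, 0.0], "B"),
]
query = [0.22, 0.011, 0.0025, 0.001, 0.0, 0.0, 0.0]
print(knn_predict(train, query))  # prints "A": both class-A vectors are nearest
```

With k = 3, two of the three nearest training vectors belong to class "A", so the vote resolves to "A".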
Fig-1: Architecture of the system
Using KNN classification, an accuracy rate of approximately 82% is achieved. The limitations of this system are that the speech result is audio-only output, the gestures or signs must appear only in the left side of the upper half of the captured image, and images must be captured carefully with a high-quality webcam.
In this paper [3], hand gesture recognition is done by wearing a glove; the proposed system works for real-time translation of Taiwanese Sign Language. Different hand gestures are identified using the ten flex sensors and the inertial sensors embedded in the glove, covering three major parameters:
1. The posture of the fingers
2. The orientation of the palm
3. The motion of the hand
as defined in Taiwanese Sign Language, so that gestures can be recognized without ambiguity. The finger flexion postures are collected from the flex sensors, the palm orientation is acquired by a G-sensor, and the motion trajectory is acquired by a gyroscope; these are used as the input signals of the proposed system. The input signals are gathered and checked periodically to see whether they form a valid sign language gesture. Each flex sensor is attached to a fixed resistor, and the voltage between the flex sensor (RF) and the fixed resistor (RB) is calculated using the formula
Vo = Vi × RF / (RF + RB)
Depending on the values of RF and RB, the bending angle of the finger is obtained. Palm orientations such as up, down, right and left are identified using a 3D data sequence along the x-, y- and z-axes. Once a sampled signal lasts longer than a predefined number of clock cycles, it is regarded as a valid hand sign language gesture and is sent to a smartphone via Bluetooth for gesture identification and speech translation. The accuracy of gesture recognition in this proposed system is up to 94%.
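The voltage-divider relation can be checked numerically. The supply voltage and resistance values below are assumptions for illustration; the paper does not give component values:

```python
def flex_voltage(v_in, r_flex, r_fixed):
    """Voltage-divider output Vo = Vi * RF / (RF + RB): as the flex
    sensor's resistance rises with bending, Vo rises toward Vi."""
    return v_in * r_flex / (r_flex + r_fixed)

# Hypothetical values: a flex sensor near 25 kOhm when flat and
# 100 kOhm when fully bent, against a 47 kOhm fixed resistor.
flat = flex_voltage(3.3, 25_000, 47_000)
bent = flex_voltage(3.3, 100_000, 47_000)
print(round(flat, 2), round(bent, 2))
```

The controller can map this voltage range back to a bend angle per finger.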
In this paper [4], a vision based interface system is developed for controlling and performing various computer functions, with the aim of human–computer interaction. Human–computer interaction aims at an easy way of interacting with the system; image segmentation and feature extraction are done using vision based technology.
Human–computer interaction is a multidisciplinary field of study focusing on the design of computer technology and the interaction between humans and computers. Research in HCI deals with the design, evaluation, implementation and assessment of new interfaces for improving the interaction between humans and machines. Here, convex hull and convexity defect algorithms are programmed in OpenCV and manipulated into various parameters which can be used for computer control. The steps involved are:
 Derive the binary image from the coloured image frames
 Find the contour of the binary image and draw it on another blank image
 Find the centre of mass of the hand
 Find and draw the convex hull and convexity defects on a blank image
 Define and manipulate certain points which can be used for gesture control
From the obtained colour image, the binary image is derived using an image segmentation technique. This is followed by extracting the contour of the image, which is required for drawing the convex hull and convexity defects. The centre of mass of the hand is computed using algorithms available in OpenCV. The convex hull of the hand contour is drawn; the regions between the fingers form the convexity defects. The number of fingers present in the hand gesture is determined by counting the number of convexity defects, and the vertices of the convex polygon are found to identify them. A few parameters are considered:
 Point start: point of the contour where the convexity defect begins
 Point end: point of the contour where the convexity defect ends
 Point far: point within the defect which is farthest from the convex hull
 Depth: distance between the convex hull (the outermost points) and the farthest point within the contour.
These points obtained can be used as parameters for
designing hand gestures for computer control.
In this paper [5], a vision based sign language recognition system using LabVIEW for automatic sign language translation is presented. The goal of this project is to develop a new vision based technology for recognizing and translating continuous sign language into text. Although deaf, hard-of-hearing and hearing signers can communicate without problems among themselves, there is a serious challenge for the deaf community trying to integrate into educational, social and work environments.
Methodology:
Vision based sign language translation is a real-time system broken down into three main parts: image acquisition, followed by image processing, and finally image recognition. The program starts with image acquisition, i.e., sign images captured by the camera. The acquired images are pre-processed to differentiate static and dynamic signs, and to detect the start and end of each sign. Features are extracted from the images for use in the recognition stage, where they are compared with the available database of pattern-matching templates. A threshold is set for the maximum difference between the input sign and the database; if the difference is below this limit, a match is found and the sign is recognized.
Environmental setup:
The image acquisition process is subject to many environmental concerns, such as the position of the camera, lighting sensitivity and background conditions.
Result:
This sign language translator is able to translate alphabets (A-Z/a-z) and numbers (0-9), and all signs can be translated in real time. The current system has only been trained on a very small database. As a wearable phone system, it provides the greatest utility as an automatic sign language to spoken English translator: it can be worn by the signer whenever communication with a non-signer might be necessary.
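The threshold-based template matching of the recognition stage can be sketched as follows. The feature vectors, template names and threshold value are hypothetical:

```python
def match_sign(features, templates, max_diff):
    """Return the nearest template's sign, or None if even the best
    match differs from the input by more than `max_diff`."""
    best, best_diff = None, float("inf")
    for sign, tmpl in templates.items():
        diff = sum(abs(a - b) for a, b in zip(features, tmpl))
        if diff < best_diff:
            best, best_diff = sign, diff
    return best if best_diff <= max_diff else None

# Hypothetical two-sign database of 3-element feature vectors.
templates = {"A": [1.0, 0.2, 0.5], "B": [0.1, 0.9, 0.4]}
print(match_sign([0.95, 0.25, 0.5], templates, max_diff=0.3))  # prints A
print(match_sign([5.0, 5.0, 5.0], templates, max_diff=0.3))    # prints None
```

Rejecting inputs above the threshold is what keeps non-sign frames from producing spurious translations.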
In this paper [6], two feature descriptors are used: Histograms of Oriented Gradients (HOG) and the Scale-Invariant Feature Transform (SIFT). The implemented algorithm uses the HOG features of an image and classifies them with an SVM technique. In the proposed system, the algorithm recognizes gestures of Indian Sign Language from static images.
 Categorization into single-handed or double-handed: sub-categorization is the main aim here, implemented using a HOG-SVM classifier. The images are converted from colour to binary; SVM is quite accurate in separating single-handed from double-handed gestures.
 Feature extraction: for both single- and double-handed gestures, HOG and SIFT features are extracted and these two descriptors are used. HOG uses a histogram to depict the intensity distribution, dividing the image into cells and creating a 1-D histogram of edge orientations in each cell. In SIFT, points of interest called key points, representing local maxima and minima, are located across the whole image; the key points are then filtered by eliminating low-contrast points with a threshold.
 Feature fusion using the K-nearest correlated neighbours algorithm: the matrices of HOG and SIFT features for the training images are constructed separately for single-handed and double-handed gestures. The correlation of the HOG and SIFT features of a test image is then calculated with all other features of the dataset.
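The per-cell orientation histogram at the heart of the HOG descriptor can be sketched in a few lines. The toy gradients below stand in for one cell of a real image; they are not data from the paper:

```python
import math

def orientation_histogram(gradients, bins=9):
    """Bin unsigned gradient angles (0-180 degrees) into a 1-D histogram,
    weighting each vote by gradient magnitude: the core of one HOG cell."""
    hist = [0.0] * bins
    width = 180.0 / bins
    for gx, gy in gradients:
        mag = math.hypot(gx, gy)
        angle = math.degrees(math.atan2(gy, gx)) % 180.0
        hist[min(int(angle // width), bins - 1)] += mag
    return hist

# Hypothetical gradients for one cell: mostly vertical edges (90 degrees).
cell = [(0.0, 1.0)] * 3 + [(1.0, 0.0)]
hist = orientation_histogram(cell)
print(hist.index(max(hist)))  # prints 4: the bin covering 80-100 degrees
```

Concatenating such per-cell histograms over all cells (with block normalization) yields the full HOG vector fed to the classifier.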
Result:
Without categorizing into single- and double-handed gestures:
 Accuracy using the KNN algorithm on HOG features of images is 78.84%, and on SIFT features 80%.
After classifying into single- and double-handed gestures:
 HOG gave 100% for single-handed and 82.77% for double-handed gestures.
 SIFT gave 92.50% for single-handed and 75.55% for double-handed gestures.
 Fusion of both gave 97.50% for single-handed and 91.11% for double-handed gestures.
 Validation of the algorithm's accuracy: the proposed method correctly identified 59 out of 60 test images for single-handed gestures, and 179 out of 200 test images for double-handed gestures.
In today's world there are many disabled people (deaf, mute, blind, etc.) who face a lot of problems when they try to communicate with others, and previously developed devices did not implement any general solution. This paper [7] describes a new method of developing wearable sensor gloves for detecting hand gestures, using the British and Indian sign language systems. Output is produced in text form on an LCD and in audio form using an APR9600 module. The hand gestures or hand signs are converted to electrical signals using flex sensors, and these electrical signals are processed to produce the appropriate audio and text output.
Previously designed devices were not accurate in tracing hand gestures; this paper employs a tuning method in order to improve the accuracy of hand gesture detection. The paper also gives a statistical overview of the population of India, in both urban and rural areas, suffering from at least one disability, and briefly discusses the types of gesture recognition available: gesture recognition can be done in three ways, i.e., vision based, sensor based, or a hybrid of the first two. The authors explain how the processing is divided into three stages: sensors, processing unit and output.
A glove with flex sensors is used to detect the signs made by the person. The gestures made are converted to equivalent electrical signals by the sensors, and these signals are sent to the input pins of the MSP430F149; for different gestures the signals are different. A range of ADC values (a 12-bit ADC is assumed) is assigned to each gesture, and the corresponding word is stored in memory. When the controller receives these signals it compares them with the stored values, the corresponding word is selected, and the word is spoken out through a speaker. The corresponding word is simultaneously displayed on the LCD. The complete block diagram is shown in the figure below.
Fig-2: Block diagram of working model
This paper [7] also explains the various hardware devices, such as the gloves, MSP430F149 and APR9600, and their functionality in the proposed system, as well as how the software converts a gesture into an electrical signal and forwards it to the processing unit. The advantage of using gloves for non-verbal communication is that they are cheap and accurate; the drawback is the processing delay.
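The controller's range comparison can be sketched as a lookup from ADC reading to stored word. The 12-bit ranges and the vocabulary below are hypothetical, since the paper does not list them:

```python
def lookup_word(adc_value, gesture_table):
    """Map a 12-bit ADC reading (0-4095) to a stored word by range
    comparison, mirroring the MSP430 flow described above."""
    for (low, high), word in gesture_table.items():
        if low <= adc_value <= high:
            return word
    return None  # no gesture matches this reading

# Hypothetical gesture table: ADC range -> word to speak and display.
table = {(0, 1000): "hello", (1001, 2500): "thanks", (2501, 4095): "yes"}
print(lookup_word(1800, table))  # prints thanks
```

The selected word would then be routed both to the APR9600 audio module and to the LCD.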
This paper [8] presents the first data-driven automatic sign-language-to-speech translation system. The authors combine a sign language (SL) recognition framework with a state-of-the-art phrase-based machine translation (MT) system, using corpora of both American Sign Language and Irish Sign Language data. They also describe a vision-based knowledge source and the challenges that arise when a deaf and a hearing person communicate, and they note that the data-driven recognizer's output is not easily intelligible because of differing grammar and annotation formats.
Sign Language: Sign language is local to specific countries. Sign languages are grammatically distinct from the spoken language; they make use of the visual/gestural modality, with the hands as the information carrier, and also make use of facial expressions, tilts of the head and shoulders, as well as the velocity of signing.
Sign Language Transcription: There is a lack of a formally adopted writing system for sign language. Some attempts have been made to describe hand shape, location and articulation movement, but they fall short for computational use; hence annotation is adopted, in which a manual transcription of the sign language is taken from video data.
The Corpus: A data-driven MT approach requires a bilingual (parallel) data set.
Sign Language Recognition: The automatic sign language recognition (ASLR) system is based on an automatic speech recognition (ASR) system. To model image variability, various approaches are known and have been applied to gesture recognition. To extract features, the dominant hand is tracked in each image sequence, so a robust tracking algorithm is required.
Data-driven SL MT: This area is new; four methods have been employed in this approach:
1. Example-based MT for English and Sign Language of the Netherlands
2. SMT (statistical machine translation) for German and German Sign Language
3. Chiu's system for Chinese and Taiwanese Sign Language
4. San-Segundo's system for Spanish and Spanish Sign Language
SMT: SMT uses a state-of-the-art phrase-based system to automatically transfer the meaning of a source language sentence into a target language sentence.
Result:
The size of RWTH-Boston-104 is far too small to make reliable assumptions; at the very least, the authors show that SMT is capable of working as an intermediate step in a complete sign-to-speech system. For extremely small training data the resulting translation quality is reasonable. As no sign language parser exists for annotation, they propose stemming of the glosses (leaving out the flexion) during recognition. Moreover, sign language produces quite a large effect known as co-articulation (the movement between two signs, which cannot be easily trained).
This paper [9] aims to design a system that can understand sign language and gestures accurately, so that deaf and mute people may communicate easily with the outside world without the need of an interpreter to help them.
Here the work is done for American Sign Language (ASL), which has its own unique gestures to represent words and the English alphabet, i.e., A-Z. The process of recognizing the sign language and gestures is shown in the figure below.
Fig-3: System block diagram
The image is captured through a web camera and stored in the system. The acquired image then needs to be passed to the MATLAB software automatically to continue the further processing; this is done by creating an object for the image, with the help of the system's high-speed processors. The image-acquisition info function is used here to determine whether the device supports device configuration files. If the input is an RGB image, it can be of class uint8, uint16, single, or double; the output image is of the same class as the input image. The captured colour (RGB) image is first converted into a grayscale or binary image, since some of the pre-processing techniques can be applied only to grayscale and binary images. These binary images then undergo the Canny edge detection technique. The Edge Detection block finds the edges in an input image by approximating the gradient magnitude of the image: the block convolves the input matrix with the Sobel, Prewitt, or Roberts kernel and outputs two gradient components of the image, which are the result of this convolution operation, together with a binary image, a matrix of Boolean values (0s and 1s). If a pixel value is 1, it is an edge; otherwise it is not an edge pixel. A KNN (k-nearest neighbours) algorithm is used for classification; alternatively, a watershed transform can be used instead of KNN, which computes a label matrix identifying the watershed regions of the input matrix (which can have any dimension). The elements of the label matrix L are integer values greater than or equal to 0; elements labelled 0 do not belong to a unique watershed region. At the end, the finger tip is detected. Finally, the last step is recognition, which matches the gesture against the predefined gesture database and produces the matched meaning of the gesture in speech and text form.
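The gradient-magnitude step of the edge detection block described above can be sketched in pure Python using the 3x3 Sobel kernels. The toy 4x4 image and the threshold are illustrative assumptions, not the paper's data:

```python
def sobel_edges(img, threshold):
    """Approximate the gradient magnitude with 3x3 Sobel kernels and
    binarise: output 1 where an edge exceeds `threshold`, else 0.
    Border pixels are left at 0."""
    h, w = len(img), len(img[0])
    out = [[0] * w for _ in range(h)]
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            gx = (img[y-1][x+1] + 2*img[y][x+1] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y][x-1] - img[y+1][x-1])
            gy = (img[y+1][x-1] + 2*img[y+1][x] + img[y+1][x+1]
                  - img[y-1][x-1] - 2*img[y-1][x] - img[y-1][x+1])
            out[y][x] = 1 if (gx * gx + gy * gy) ** 0.5 > threshold else 0
    return out

# A vertical step edge: dark left half, bright right half.
img = [[0, 0, 255, 255]] * 4
edges = sobel_edges(img, threshold=100)
print(edges[1])  # prints [0, 1, 1, 0]: the step is detected in the interior
```

The full Canny detector adds Gaussian smoothing, non-maximum suppression and hysteresis thresholding on top of this gradient step.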
4. CONCLUSIONS
This paper has discussed the different algorithms and techniques that can be used for recognizing sign language and the hand gestures made by different deaf and mute people. A hand gesture recognition system is considered a more intuitive and proficient human–computer interaction tool. The range of applications includes virtual prototyping, sign language analysis and medical training. Sign language is one of the tools of communication for physically impaired, deaf and mute people. From the above considerations it is clear that vision based hand gesture recognition has made remarkable progress in this field. Software tools used for implementing gesture recognition systems include the C, C++ and Java languages; to simplify the work, especially when image processing operations are needed, MATLAB with the Image Processing Toolbox is used.
REFERENCES
[1] "Survey on Hand Gesture Techniques for Sign Language Recognition," IJTRS, Vol. 2, Issue VII, August 2017 (IJTRS-V2-I7-005).
[2] Umang Patel and Aarti G. Ambekar, "Moment Based Sign Language Recognition for Indian Languages," 2017 Third International Conference on Computing, Communication, Control and Automation (ICCUBEA), 2017.
[3] Lih-Jen Kau, Wan-Lin Su, Pei-Ju Yu and Sin-Jhan Wei, "A Real-time Portable Sign Language Translation System," Department of Electronic Engineering, National Taipei University of Technology, Taipei, Taiwan, R.O.C.
[4] Alisha Pradhan and B.B.V.L. Deepak, "Obtaining Hand Gesture Parameters using Image Processing," 2015 International Conference on Smart Technologies and Management (ICSTM), 2015.
[5] Yellapu Madhuri and Anburajan Mariamichael, "Vision Based Sign Language Translation Device," conference paper, February 2013.
[6] Bhumika Gupta, Pushkar Shukla and Ankush Mittal, "K-Nearest Correlated Neighbour Classification for Indian Sign Language Gesture Recognition using Feature Extraction."
[7] Vikram Sharma M., "Virtual Talk for Deaf, Mute, Blind and Normal Humans," Texas Instruments India Educators' Conference, 2013.
[8] Daniel Stein, Philippe Dreuw, Hermann Ney, Sara Morrissey and Andy Way, "Hand in Hand: Automatic Sign Language to English Translation."
[9] Aditi Kalsh and N.S. Garewal, "Sign Language Recognition System," International Journal of Computational Engineering Research, 2013.
[10] https://en.wikipedia.org/wiki/Sign_language

More Related Content

What's hot

sign language recognition using HMM
sign language recognition using HMMsign language recognition using HMM
sign language recognition using HMMveegrrl
 
A Dynamic hand gesture recognition for human computer interaction
A Dynamic hand gesture recognition for human computer interactionA Dynamic hand gesture recognition for human computer interaction
A Dynamic hand gesture recognition for human computer interaction
Kunika Barai
 
Review on Hand Gesture Recognition
Review on Hand Gesture RecognitionReview on Hand Gesture Recognition
Review on Hand Gesture Recognition
dbpublications
 
IRJET - Paint using Hand Gesture
IRJET - Paint using Hand GestureIRJET - Paint using Hand Gesture
IRJET - Paint using Hand Gesture
IRJET Journal
 
Gesturerecognition
GesturerecognitionGesturerecognition
Gesturerecognition
Mariya Khan
 
IRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET- Survey Paper on Vision based Hand Gesture RecognitionIRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET Journal
 
IRJET- Hand Gesture Recognition and Voice Conversion for Deaf and Dumb
IRJET- Hand Gesture Recognition and Voice Conversion for Deaf and DumbIRJET- Hand Gesture Recognition and Voice Conversion for Deaf and Dumb
IRJET- Hand Gesture Recognition and Voice Conversion for Deaf and Dumb
IRJET Journal
 
Password Based Hand Gesture Controlled Robot
Password Based Hand Gesture Controlled Robot Password Based Hand Gesture Controlled Robot
Password Based Hand Gesture Controlled Robot
IJERA Editor
 
Ppt final
Ppt finalPpt final
Ppt final
julietrincy
 
Mems Sensor Based Approach for Gesture Recognition to Control Media in Computer
Mems Sensor Based Approach for Gesture Recognition to Control Media in ComputerMems Sensor Based Approach for Gesture Recognition to Control Media in Computer
Mems Sensor Based Approach for Gesture Recognition to Control Media in Computer
IJARIIT
 
IRJET - Sign Language Text to Speech Converter using Image Processing and...
IRJET -  	  Sign Language Text to Speech Converter using Image Processing and...IRJET -  	  Sign Language Text to Speech Converter using Image Processing and...
IRJET - Sign Language Text to Speech Converter using Image Processing and...
IRJET Journal
 
IRJET - Chatbot with Gesture based User Input
IRJET -  	  Chatbot with Gesture based User InputIRJET -  	  Chatbot with Gesture based User Input
IRJET - Chatbot with Gesture based User Input
IRJET Journal
 
Appearance based static hand gesture alphabet recognition
Appearance based static hand gesture alphabet recognitionAppearance based static hand gesture alphabet recognition
Appearance based static hand gesture alphabet recognitionIAEME Publication
 
A Deep Neural Framework for Continuous Sign Language Recognition by Iterative...
A Deep Neural Framework for Continuous Sign Language Recognition by Iterative...A Deep Neural Framework for Continuous Sign Language Recognition by Iterative...
A Deep Neural Framework for Continuous Sign Language Recognition by Iterative...
ijtsrd
 
Hand Motion Gestures For Mobile Communication Based On Inertial Sensors For O...
Hand Motion Gestures For Mobile Communication Based On Inertial Sensors For O...Hand Motion Gestures For Mobile Communication Based On Inertial Sensors For O...
Hand Motion Gestures For Mobile Communication Based On Inertial Sensors For O...
IJERA Editor
 
Automatic Isolated word sign language recognition
Automatic Isolated word sign language recognitionAutomatic Isolated word sign language recognition
Automatic Isolated word sign language recognition
Sana Fakhfakh
 
Hand and wrist localization approach: sign language recognition
Hand and wrist localization approach: sign language recognition Hand and wrist localization approach: sign language recognition
Hand and wrist localization approach: sign language recognition
Sana Fakhfakh
 
IRJET- Tamil Sign Language Recognition Using Machine Learning to Aid Deaf and...
IRJET- Tamil Sign Language Recognition Using Machine Learning to Aid Deaf and...IRJET- Tamil Sign Language Recognition Using Machine Learning to Aid Deaf and...
IRJET- Tamil Sign Language Recognition Using Machine Learning to Aid Deaf and...
IRJET Journal
 

What's hot (20)

sign language recognition using HMM
sign language recognition using HMMsign language recognition using HMM
sign language recognition using HMM
 
A Dynamic hand gesture recognition for human computer interaction
A Dynamic hand gesture recognition for human computer interactionA Dynamic hand gesture recognition for human computer interaction
A Dynamic hand gesture recognition for human computer interaction
 
Review on Hand Gesture Recognition
Review on Hand Gesture RecognitionReview on Hand Gesture Recognition
Review on Hand Gesture Recognition
 
IRJET - Paint using Hand Gesture
IRJET - Paint using Hand GestureIRJET - Paint using Hand Gesture
IRJET - Paint using Hand Gesture
 
Recently uploaded

Forklift Classes Overview by Intella Parts
Forklift Classes Overview by Intella PartsForklift Classes Overview by Intella Parts
Forklift Classes Overview by Intella Parts
Intella Parts
 
Heap Sort (SS).ppt FOR ENGINEERING GRADUATES, BCA, MCA, MTECH, BSC STUDENTS
Heap Sort (SS).ppt FOR ENGINEERING GRADUATES, BCA, MCA, MTECH, BSC STUDENTSHeap Sort (SS).ppt FOR ENGINEERING GRADUATES, BCA, MCA, MTECH, BSC STUDENTS
Heap Sort (SS).ppt FOR ENGINEERING GRADUATES, BCA, MCA, MTECH, BSC STUDENTS
Soumen Santra
 
Top 10 Oil and Gas Projects in Saudi Arabia 2024.pdf
Top 10 Oil and Gas Projects in Saudi Arabia 2024.pdfTop 10 Oil and Gas Projects in Saudi Arabia 2024.pdf
Top 10 Oil and Gas Projects in Saudi Arabia 2024.pdf
Teleport Manpower Consultant
 
Harnessing WebAssembly for Real-time Stateless Streaming Pipelines
Harnessing WebAssembly for Real-time Stateless Streaming PipelinesHarnessing WebAssembly for Real-time Stateless Streaming Pipelines
Harnessing WebAssembly for Real-time Stateless Streaming Pipelines
Christina Lin
 
14 Template Contractual Notice - EOT Application
14 Template Contractual Notice - EOT Application14 Template Contractual Notice - EOT Application
14 Template Contractual Notice - EOT Application
SyedAbiiAzazi1
 
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdfAKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
SamSarthak3
 
Understanding Inductive Bias in Machine Learning
Understanding Inductive Bias in Machine LearningUnderstanding Inductive Bias in Machine Learning
Understanding Inductive Bias in Machine Learning
SUTEJAS
 
Governing Equations for Fundamental Aerodynamics_Anderson2010.pdf
Governing Equations for Fundamental Aerodynamics_Anderson2010.pdfGoverning Equations for Fundamental Aerodynamics_Anderson2010.pdf
Governing Equations for Fundamental Aerodynamics_Anderson2010.pdf
WENKENLI1
 
Nuclear Power Economics and Structuring 2024
Nuclear Power Economics and Structuring 2024Nuclear Power Economics and Structuring 2024
Nuclear Power Economics and Structuring 2024
Massimo Talia
 
在线办理(ANU毕业证书)澳洲国立大学毕业证录取通知书一模一样
在线办理(ANU毕业证书)澳洲国立大学毕业证录取通知书一模一样在线办理(ANU毕业证书)澳洲国立大学毕业证录取通知书一模一样
在线办理(ANU毕业证书)澳洲国立大学毕业证录取通知书一模一样
obonagu
 
DESIGN AND ANALYSIS OF A CAR SHOWROOM USING E TABS
DESIGN AND ANALYSIS OF A CAR SHOWROOM USING E TABSDESIGN AND ANALYSIS OF A CAR SHOWROOM USING E TABS
DESIGN AND ANALYSIS OF A CAR SHOWROOM USING E TABS
itech2017
 
Water billing management system project report.pdf
Water billing management system project report.pdfWater billing management system project report.pdf
Water billing management system project report.pdf
Kamal Acharya
 
Swimming pool mechanical components design.pptx
Swimming pool  mechanical components design.pptxSwimming pool  mechanical components design.pptx
Swimming pool mechanical components design.pptx
yokeleetan1
 
Recycled Concrete Aggregate in Construction Part III
Recycled Concrete Aggregate in Construction Part IIIRecycled Concrete Aggregate in Construction Part III
Recycled Concrete Aggregate in Construction Part III
Aditya Rajan Patra
 
Planning Of Procurement o different goods and services
Planning Of Procurement o different goods and servicesPlanning Of Procurement o different goods and services
Planning Of Procurement o different goods and services
JoytuBarua2
 
MCQ Soil mechanics questions (Soil shear strength).pdf
MCQ Soil mechanics questions (Soil shear strength).pdfMCQ Soil mechanics questions (Soil shear strength).pdf
MCQ Soil mechanics questions (Soil shear strength).pdf
Osamah Alsalih
 
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressions
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressionsKuberTENes Birthday Bash Guadalajara - K8sGPT first impressions
KuberTENes Birthday Bash Guadalajara - K8sGPT first impressions
Victor Morales
 
Fundamentals of Induction Motor Drives.pptx
Fundamentals of Induction Motor Drives.pptxFundamentals of Induction Motor Drives.pptx
Fundamentals of Induction Motor Drives.pptx
manasideore6
 
NUMERICAL SIMULATIONS OF HEAT AND MASS TRANSFER IN CONDENSING HEAT EXCHANGERS...
NUMERICAL SIMULATIONS OF HEAT AND MASS TRANSFER IN CONDENSING HEAT EXCHANGERS...NUMERICAL SIMULATIONS OF HEAT AND MASS TRANSFER IN CONDENSING HEAT EXCHANGERS...
NUMERICAL SIMULATIONS OF HEAT AND MASS TRANSFER IN CONDENSING HEAT EXCHANGERS...
ssuser7dcef0
 
digital fundamental by Thomas L.floydl.pdf
digital fundamental by Thomas L.floydl.pdfdigital fundamental by Thomas L.floydl.pdf
digital fundamental by Thomas L.floydl.pdf
drwaing
 

Recently uploaded (20)

Forklift Classes Overview by Intella Parts
Forklift Classes Overview by Intella PartsForklift Classes Overview by Intella Parts
Forklift Classes Overview by Intella Parts
 
Heap Sort (SS).ppt FOR ENGINEERING GRADUATES, BCA, MCA, MTECH, BSC STUDENTS
Heap Sort (SS).ppt FOR ENGINEERING GRADUATES, BCA, MCA, MTECH, BSC STUDENTSHeap Sort (SS).ppt FOR ENGINEERING GRADUATES, BCA, MCA, MTECH, BSC STUDENTS
Heap Sort (SS).ppt FOR ENGINEERING GRADUATES, BCA, MCA, MTECH, BSC STUDENTS
 
Top 10 Oil and Gas Projects in Saudi Arabia 2024.pdf
Top 10 Oil and Gas Projects in Saudi Arabia 2024.pdfTop 10 Oil and Gas Projects in Saudi Arabia 2024.pdf
Top 10 Oil and Gas Projects in Saudi Arabia 2024.pdf
 
Harnessing WebAssembly for Real-time Stateless Streaming Pipelines
Harnessing WebAssembly for Real-time Stateless Streaming PipelinesHarnessing WebAssembly for Real-time Stateless Streaming Pipelines
Harnessing WebAssembly for Real-time Stateless Streaming Pipelines
 
14 Template Contractual Notice - EOT Application
14 Template Contractual Notice - EOT Application14 Template Contractual Notice - EOT Application
14 Template Contractual Notice - EOT Application
 
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdfAKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
AKS UNIVERSITY Satna Final Year Project By OM Hardaha.pdf
 
Understanding Inductive Bias in Machine Learning
Understanding Inductive Bias in Machine LearningUnderstanding Inductive Bias in Machine Learning
Understanding Inductive Bias in Machine Learning
 
Governing Equations for Fundamental Aerodynamics_Anderson2010.pdf
Governing Equations for Fundamental Aerodynamics_Anderson2010.pdfGoverning Equations for Fundamental Aerodynamics_Anderson2010.pdf
Governing Equations for Fundamental Aerodynamics_Anderson2010.pdf
 
Sign language recognition (SLR) and gesture-based control are the two major applications of hand gesture recognition technology. SLR aims to interpret sign languages automatically by computer, so that deaf people can communicate conveniently with the hearing society. Since sign language is a highly structured and largely symbolic set of human gestures, SLR also serves as a good basis for the development of general gesture-based HCI [1]. In this paper we discuss the work done in the area of hand gesture recognition and analyse the methods used to recognise hand gestures. Deaf and mute people use hand signs to communicate, and hearing people often have difficulty understanding the signs they make, so there is a need for systems that recognise the different signs and convey their meaning. No one form of sign language is universal: it varies from region to region and country to country, and a single gesture can carry a different meaning in different parts of the world. Among the many sign languages in use are American Sign Language (ASL), British Sign Language (BSL), Turkish Sign Language (TSL) and Indian Sign Language (ISL).

2. SIGN LANGUAGE

Sign language is a means of communication within the deaf and mute community. It emerges and evolves naturally within hearing-impaired communities, and it differs from people to people and from region to region; there is no single standardised sign language. Because sign language is the everyday communicating language of deaf and mute people, making it machine-readable can also ease their communication with hearing people. Gesture recognition is an efficient way of recognising the movements of the hand automatically. According to surveys made by several organisations, many people who are deaf and mute find it difficult to communicate.
Gestures are their easiest way of communicating.

2.1 Sensor Based Approach

This approach collects data about a performed gesture from different sensors; the data are then analysed and conclusions drawn in accordance with the recognition model. For hand gesture recognition, sensors of various types are placed on the hand, and when the hand performs a gesture the readings are recorded and further analysed. The first sensor used was the data glove, invented in 1977, followed later by LED-based devices. Because it relies on external hardware, the sensor-based approach constrains the natural motion of the hand, and its major disadvantage is that complex gestures cannot be performed [1].

2.2 Vision Based Approach

This approach takes images from a camera as gesture data. A vision-based method concentrates on the captured image of the gesture, extracts its main features and recognises it. Colour bands were used at the start of the vision-based approach; their main disadvantage was that standard colours had to be worn on the fingertips, so the use of bare hands later became preferred [1].

3. LITERATURE SURVEY

In paper [2] the signs and hand gestures are captured, processed with the help of MATLAB and then converted into
speech and text form. Feature values of the images are extracted using the 7Hu (7 moments) invariant-moment technique, and the K-Nearest Neighbour (KNN) classifier is compared with the Probabilistic Neural Network (PNN) for accuracy. The performance of the proposed KNN classifier is judged on several parameters that can be calculated by formula. The 7Hu moments are a vector of algebraic invariants built from regular moments; they are invariant under translation, scaling and rotation, and have been widely used in classification [2]. The KNN algorithm classifies an object by a majority vote of its k nearest neighbours (k a small positive integer): the object is assigned to the class most common among them.

Fig-1: Architecture of the system

Using KNN classification the system achieves approximately 82% accuracy. Its limitations are that the speech result is audio output only, that the gesture or sign must appear in the left side of the upper half of the captured image, and that images must be captured carefully with a high-quality webcam.

In paper [3], hand gestures are recognised by wearing a glove; the proposed system performs real-time translation of Taiwanese Sign Language. Different hand gestures are identified using the ten flex sensors and the inertial sensors embedded in the glove, based on three major parameters:

1. the posture of the fingers
2. the orientation of the palm
3. the motion of the hand

as defined in Taiwanese Sign Language, so that any gesture can be recognised without ambiguity.
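As a rough illustration of the pipeline in [2] (not the paper's actual implementation), the sketch below computes a translation-invariant normalised central-moment feature (the ingredient from which the 7Hu moments are built) for small binary images, and classifies a test feature by KNN majority vote. The toy images, the use of only the first Hu moment as the feature, and the value k=3 are all assumptions for illustration.

```python
from collections import Counter

def central_moment(img, p, q):
    """Central moment mu_pq of a binary image given as a list of rows."""
    m00 = m10 = m01 = 0.0
    for y, row in enumerate(img):
        for x, v in enumerate(row):
            m00 += v; m10 += x * v; m01 += y * v
    cx, cy = m10 / m00, m01 / m00  # centroid makes the moment translation-invariant
    return sum(((x - cx) ** p) * ((y - cy) ** q) * v
               for y, row in enumerate(img) for x, v in enumerate(row))

def hu1(img):
    """First Hu moment, eta20 + eta02, from scale-normalised central moments."""
    mu00 = central_moment(img, 0, 0)
    eta = lambda p, q: central_moment(img, p, q) / mu00 ** (1 + (p + q) / 2)
    return eta(2, 0) + eta(0, 2)

def knn_classify(train, test_feat, k=3):
    """Majority vote among the k training samples nearest to test_feat."""
    dists = sorted((abs(f - test_feat), label) for f, label in train)
    votes = Counter(label for _, label in dists[:k])
    return votes.most_common(1)[0][0]
```

Because central moments are taken about the shape's centroid, the same shape drawn at two different positions yields identical features, which is exactly why moment-based features suit gesture images whose hand position varies.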
The finger-flexion postures are collected from the flex sensors, the palm orientation is acquired by the G-sensor, and the motion trajectory is acquired by the gyroscope; these are the input signals of the proposed system. The input signals are gathered and checked periodically to see whether they form a valid sign-language gesture. Each flex sensor is wired with a fixed resistor, and the output voltage of the divider formed by the flex sensor (RF) and the fixed resistor (RB) is calculated with the formula

Vo = Vi * RF / (RF + RB)

Depending on the values of RF and RB, the bending angle of the finger is obtained. Palm orientations such as up, down, left and right are identified using the 3D data sequence along the x-, y- and z-axes. Once a sampled signal lasts longer than a predefined number of clock cycles, it is regarded as a valid sign-language gesture and is sent to a smartphone via Bluetooth for gesture identification and speech translation. The gesture-recognition accuracy of this proposed system is up to 94%.

In paper [4] a vision-based interface is developed for controlling and performing various computer functions, with the aim of easy human-computer interaction. Image segmentation and feature extraction are done using vision-based technology. Human-computer interaction is a multidisciplinary field of study focusing on the design of computer technology and the interaction between humans and computers; HCI research deals with the design, evaluation, implementation and assessment of new interfaces that improve the interaction between humans and machines. Here the convex hull and convexity-defect algorithms are programmed in OpenCV, and the parameters they yield are manipulated for computer control.
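The divider formula above can be sketched directly, together with its inverse, which is what the glove firmware effectively needs: recover RF from the measured output voltage and map it to a bend angle. The supply voltage and resistor values below are illustrative assumptions, not the paper's component values.

```python
def divider_vout(vin, r_flex, r_fixed):
    """Output of the flex-sensor voltage divider: Vo = Vi * RF / (RF + RB)."""
    return vin * r_flex / (r_flex + r_fixed)

def flex_resistance(vin, vout, r_fixed):
    """Invert the divider to recover the flex sensor's resistance RF
    from a measured Vo; the firmware then maps RF to a bend angle."""
    return r_fixed * vout / (vin - vout)
```

For example, with an assumed 5 V supply and a 10 kΩ fixed resistor, an unbent sensor of 10 kΩ gives Vo = 2.5 V, and any measured Vo can be converted back to RF with the second function.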
The steps involved are:

• derive the binary image from the coloured image frames
• find the contour of the binary image and draw it on another blank image
• find the centre of mass of the hand
• find and draw the convex hull and convexity defects on a blank image
• define and manipulate certain points which can be used for gesture control

From the obtained colour image, a binary image is derived using an image-segmentation technique. This is followed by extracting the contour of the image, which is required for drawing the convex hull and convexity defects. The centre of mass of the hand is computed using algorithms available in OpenCV, and the convex hull of the hand contour is drawn. The regions between the fingers form the convexity defects, so the number of fingers present in the hand gesture is determined by counting the convexity defects; the vertices of the convex polygon are found in order to identify them. A few parameters are considered:

• Point start: point of the contour where the convexity defect begins
• Point end: point of the contour where the convexity defect ends
• Point far: point within the defect which is farthest from the convex hull
• Depth: distance between the convex hull (the outermost points) and the farthest point within the contour

These points can be used as parameters for designing hand gestures for computer control.

In paper [5], a vision-based sign-language recognition system using LabVIEW for automatic sign language is presented. The goal of the project is to develop a new vision-based technology for recognising and translating continuous sign language into text. Although deaf, hard-of-hearing and hearing signers can communicate without problems among themselves, the deaf community faces a serious challenge when trying to integrate into educational, social and work environments.

Methodology: vision-based sign-language translation is a real-time system broken into three main parts: image acquisition, followed by image processing, and finally image recognition. The program starts with image acquisition, i.e. sign images captured by the camera. The acquired images are pre-processed to differentiate static from dynamic signs and to detect the start and end of each sign, and features are extracted for use in the recognition stage. In the recognition stage, the extracted features are compared with the available database of pattern-matching templates. A threshold is set on the maximum difference between the input sign and the database: if the difference is below the limit, a match is found and the sign is recognised.

Environmental setup: the image-acquisition process is subject to many environmental concerns, such as the position of the camera, lighting sensitivity and background conditions.

Result: this sign-language translator is able to translate the alphabet (A-Z/a-z) and numbers (0-9).
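The finger-counting idea described for paper [4] can be sketched without OpenCV: build the convex hull of an ordered hand contour (Andrew's monotone chain), measure each defect's depth as the distance from the farthest in-between contour point to the hull chord, and count fingers as deep defects plus one. The toy "three-finger" contour and the depth threshold used below are illustrative assumptions; a real system would get the contour from the segmented binary image.

```python
import math

def cross(o, a, b):
    """2D cross product of vectors OA and OB."""
    return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

def convex_hull(points):
    """Andrew's monotone chain; returns hull vertices counter-clockwise."""
    pts = sorted(set(points))
    if len(pts) <= 2:
        return pts
    lower, upper = [], []
    for p in pts:
        while len(lower) >= 2 and cross(lower[-2], lower[-1], p) <= 0:
            lower.pop()
        lower.append(p)
    for p in reversed(pts):
        while len(upper) >= 2 and cross(upper[-2], upper[-1], p) <= 0:
            upper.pop()
        upper.append(p)
    return lower[:-1] + upper[:-1]

def count_fingers(contour, depth_thresh):
    """Count convexity defects deeper than depth_thresh along an ordered
    contour; fingers = deep defects + 1."""
    hull = set(convex_hull(contour))
    idx = [i for i, p in enumerate(contour) if p in hull]
    defects = 0
    for k in range(len(idx)):
        i, j = idx[k], idx[(k + 1) % len(idx)]
        # contour points lying between two consecutive hull vertices
        between = contour[i + 1:j] if j > i else contour[i + 1:] + contour[:j]
        if not between:
            continue
        a, b = contour[i], contour[j]
        chord = math.hypot(b[0] - a[0], b[1] - a[1])
        # defect depth = farthest point-to-chord distance in this gap
        depth = max(abs(cross(a, b, p)) / chord for p in between)
        if depth > depth_thresh:
            defects += 1
    return defects + 1
```

A contour with three fingertip peaks and two deep valleys yields two defects, hence three fingers; a convex contour (no valleys) yields one.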
All signs can be translated in real time, although the current system has been trained only on a very small database. It is a wearable phone system that provides the greatest utility as an automatic sign-language-to-spoken-English translator: it can be worn by the signer whenever communication with a non-signer might be necessary.

In paper [6] two feature descriptors are used: Histograms of Oriented Gradients (HOG) and the Scale-Invariant Feature Transform (SIFT). The implemented algorithm extracts HOG features for an image and classifies them using an SVM; the proposed system recognises gestures of Indian Sign Language from static images.

• Categorisation into single-handed or double-handed: this sub-categorisation is the main aim here and is implemented using a HOG-SVM classifier. The images are converted from colour to binary, and the SVM is quite accurate at separating single-handed from double-handed gestures.
• Feature extraction: HOG and SIFT descriptors are extracted for both single- and double-handed gestures. HOG uses histograms to depict the intensity distribution, dividing the image into cells and creating a 1D histogram of edge orientations in each cell. In SIFT, points of interest called key points, representing local maxima and minima, are first located over the whole image; the key points are then filtered by eliminating low-contrast points with a threshold.
• Feature fusion using the k-nearest correlated neighbours algorithm: the matrices of HOG and SIFT features for the training images are constructed separately for single-handed and double-handed gestures, and the correlation of a test image's features is calculated with all the features of the dataset.

Result, without categorising into single- and double-handed gestures:

• accuracy using the KNN algorithm is 78.84% with HOG features and 80% with SIFT.
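As a rough sketch of the HOG building block used in [6], the function below computes a gradient-orientation histogram for one cell of a grayscale image: central-difference gradients, unsigned orientation binned over 0-180 degrees, and votes weighted by gradient magnitude. In a full HOG descriptor such per-cell histograms are block-normalised and concatenated; the toy image and bin count here are illustrative assumptions.

```python
import math

def orientation_histogram(img, bins=9):
    """Magnitude-weighted histogram of gradient orientations for the
    interior pixels of a grayscale image (one HOG cell)."""
    h = [0.0] * bins
    rows, cols = len(img), len(img[0])
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            gx = img[y][x + 1] - img[y][x - 1]  # horizontal gradient
            gy = img[y + 1][x] - img[y - 1][x]  # vertical gradient
            mag = math.hypot(gx, gy)
            if mag == 0:
                continue  # flat region contributes no vote
            ang = math.degrees(math.atan2(gy, gx)) % 180.0  # unsigned orientation
            h[int(ang // (180.0 / bins)) % bins] += mag
    return h
```

A vertical step edge produces purely horizontal gradients, so all of the histogram mass lands in the 0-degree bin; this orientation signature is what lets HOG separate differently shaped hand gestures.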
After classifying into single- and double-handed gestures:

• HOG gave 100% accuracy for single-handed and 82.77% for double-handed gestures
• SIFT gave 92.50% for single-handed and 75.55% for double-handed gestures
• fusion of both gave 97.50% for single-handed and 91.11% for double-handed gestures
• validating the accuracy of the algorithm, the proposed method correctly identified 59 out of 60 single-handed test images, and 179 out of 200 double-handed test images.

In today's world many disabled people (deaf, mute, blind, etc.) face a lot of problems when they try to communicate with others, and previously developed devices did not implement any general solution. Paper [7] describes a new method of developing wearable sensor gloves for detecting hand gestures, using the British and Indian sign-language systems. Output is produced in text form on an LCD and in audio form through an APR9600 module. The hand gestures or signs are converted to electrical signals by flex sensors, and these signals are processed to produce the appropriate audio and text output.
Previously designed devices were not accurate in tracing hand gestures; this paper employs a method of tuning in order to improve the accuracy of gesture detection. It also gives a statistical overview of the population of India, in both urban and rural areas, suffering from at least one disability, and briefly discusses the three types of gesture recognition available: vision based, sensor based, and a hybrid of the first two. The authors explain how processing is divided into three stages: sensors, processing unit and output.

A glove with flex sensors is used to detect the signs made by the person. The gestures made are converted to equivalent electrical signals by the sensors and sent to the input pins of the MSP430F149; the signals differ for different gestures. A range of ADC values (a 12-bit ADC is assumed) is assigned to each gesture, and the corresponding word is stored in memory. When the controller receives a signal, it compares it with the stored values, selects the corresponding word, and the word is spoken through the speaker and simultaneously displayed on the LCD. The complete block diagram is shown in the figure below.

Fig-2: Block diagram of working model

Paper [7] also explains the various hardware devices, such as the gloves, MSP430F149 and APR9600, and their function in the proposed system, along with how the software converts a gesture into an electrical signal and forwards it to the processing unit. The advantage of using gloves for non-verbal communication is that they are cheap and accurate; the drawback is the processing delay.
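The controller's lookup step described above can be sketched as a table of ADC bands: each gesture owns a band of 12-bit ADC codes (0-4095), and the firmware picks the word whose band contains the sampled value. The bands and words below are assumptions for illustration, not the paper's actual calibration table.

```python
# Hypothetical calibration table: (low code, high code, stored word).
GESTURE_BANDS = [
    (0,    1000, "hello"),
    (1001, 2000, "yes"),
    (2001, 3000, "no"),
    (3001, 4095, "thank you"),
]

def word_for_sample(adc_value):
    """Return the stored word whose ADC band contains the 12-bit sample,
    or None if the sample matches no gesture."""
    for lo, hi, word in GESTURE_BANDS:
        if lo <= adc_value <= hi:
            return word
    return None
```

On the real MSP430F149 this comparison would run in the interrupt handler after each conversion, with the matched word routed to both the APR9600 audio module and the LCD.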
Paper [8] presents the first data-driven automatic sign-language-to-speech translation system. The authors combine a sign-language (SL) recognition framework with a state-of-the-art phrase-based machine translation (MT) system, using corpora of both American Sign Language and Irish Sign Language data. They also describe a vision-based knowledge source and discuss the challenges that arise when a deaf and a hearing person communicate, suggesting that the data-driven recogniser's output is not easily intelligible because of differing grammar and annotation formats.

Sign language: sign language is local to specific countries. Sign languages are grammatically distinct from the corresponding spoken languages: they use the visual/gestural modality, the hands act as the information carriers, and facial expressions, tilts of the head and shoulders, and the velocity of signing are also used.

Sign-language transcription: there is no formally adopted writing system for sign languages. Some attempts have been made to describe hand shape, location and articulation movement, but they fall short of computational use, so annotation is adopted instead: manual transcription of sign language taken from video data.

The corpus: a data-driven MT approach requires a bilingual data set.

Sign-language recognition: the automatic sign-language recognition (ASLR) system is based on an automatic speech recognition (ASR) system. Various approaches to modelling image variability are known and have been applied to gesture recognition. To extract features, the dominant hand is tracked in each image sequence, so a robust tracking algorithm is required.

Data-driven SL MT: this area is new; four methods employ this approach:
1. example-based MT for English and Sign Language of the Netherlands
2. statistical machine translation (SMT) for German and German Sign Language
3.
(Chiu) a system for Chinese and Taiwanese Sign Language
4. (San-Segundo) a system for Spanish and Spanish Sign Language

SMT: SMT uses a state-of-the-art phrase-based system to transfer the meaning of a source-language sentence into a target-language sentence automatically.

Result: the RWTH-Boston-104 corpus is far too small to support reliable conclusions, but at the very least it shows that SMT can work as an intermediate step in a complete sign-to-speech system, and that for extremely small training data the resulting translation quality is reasonable. As no sign-language parser exists for annotation, the authors propose stemming the glosses (leaving out the flexion) during recognition. Moreover, sign language exhibits a quite large co-articulation effect: the movement between two signs cannot be easily trained.
Paper [9] aims to design a system that understands sign language and gestures accurately, so that deaf and mute people may communicate with the outside world easily, without needing an interpreter to help them. The work covers American Sign Language (ASL), which has its own unique gestures to represent words and the English alphabet A-Z. The process of recognising the sign language and gestures is shown in the figure below.

Fig-3: System block diagram

The image is captured through the web camera and stored on the system. The acquired image then needs to be passed to the MATLAB software automatically to continue the process, which is done by creating an object for the images, aided by the system's high-speed processors. The imaqhwinfo function is used here to determine whether the device supports device-configuration files. If the input is an RGB image, it can be of class uint8, uint16, single or double, and the output image is of the same class as the input. The captured colour (RGB) image is first converted into a grayscale or binary image, since some pre-processing techniques can be applied only to grayscale and binary images. The binary images then undergo Canny edge detection. The edge-detection block finds the edges in an input image by approximating the gradient magnitude of the image: the block convolves the input matrix with the Sobel, Prewitt or Roberts kernel, outputs the two gradient components of the image that result from this convolution, and produces a binary image, a matrix of Boolean values (0s and 1s).
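The gradient-approximation step just described can be sketched in pure Python: convolve a small grayscale image with the two Sobel kernels to get the gradient components, approximate the gradient magnitude, and threshold it into a Boolean edge map. The toy image and threshold are illustrative assumptions.

```python
import math

# The standard 3x3 Sobel kernels for the horizontal and vertical
# gradient components.
SOBEL_X = [[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]]
SOBEL_Y = [[-1, -2, -1], [0, 0, 0], [1, 2, 1]]

def sobel_edges(img, thresh):
    """Binary edge map: 1 where the Sobel gradient magnitude of an
    interior pixel reaches thresh, 0 elsewhere (borders stay 0)."""
    rows, cols = len(img), len(img[0])
    edges = [[0] * cols for _ in range(rows)]
    for y in range(1, rows - 1):
        for x in range(1, cols - 1):
            gx = gy = 0
            for dy in (-1, 0, 1):
                for dx in (-1, 0, 1):
                    v = img[y + dy][x + dx]
                    gx += SOBEL_X[dy + 1][dx + 1] * v
                    gy += SOBEL_Y[dy + 1][dx + 1] * v
            if math.hypot(gx, gy) >= thresh:
                edges[y][x] = 1
    return edges
```

For a vertical step edge the gx response dominates and the pixels along the step are marked 1, while a uniform image produces no edges at all, matching the 0/1 Boolean output described above.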
If a pixel value is 1 it is an edge; otherwise it is not part of an edge. A KNN (k-nearest neighbours) algorithm is used for classification; alternatively, a watershed transform is used instead of KNN, computing a label matrix that identifies the watershed regions of the input matrix, which may have any dimension. The elements of the label matrix L are integer values greater than or equal to 0, with elements labelled 0 belonging to no unique watershed region; at the end the fingertip is detected. The final step is recognition, which matches the gesture against the predefined gesture database and produces the matched meaning of the gesture in speech and text form.

4. CONCLUSIONS

This paper has dealt with the different algorithms and techniques that can be used for recognising the sign language and hand gestures made by deaf and mute people. A hand-gesture recognition system is considered a more intuitive and proficient human-computer interaction tool, with applications ranging from virtual prototyping to sign-language analysis and medical training. Sign language is one of the tools of communication for physically impaired, deaf and mute people. From the above survey it is clear that vision-based hand-gesture recognition has made remarkable progress. Software tools used for implementing gesture-recognition systems include C, C++ and Java; to simplify the work, especially where image-processing operations are needed, MATLAB with the Image Processing Toolbox is used.

REFERENCES

[1] IJTRS-V2-I7-005, Volume 2, Issue VII, August 2017, "Survey on hand gesture techniques for sign language recognition".

[2] Umang Patel and Aarti G.
Ambekar, "Moment based sign language recognition for Indian languages", 2017 Third International Conference on Computing, Communication, Control and Automation (ICCUBEA).

[3] Lih-Jen Kau, Wan-Lin Su, Pei-Ju Yu and Sin-Jhan Wei, "A real-time portable sign language translation system", Department of Electronic Engineering, National Taipei University of Technology, No. 1, Sec. 3, Chung-Hsiao E. Rd., Taipei 10608, Taiwan, R.O.C.

[4] Alisha Pradhan and B.B.V.L. Deepak, "Obtaining hand gesture parameters using image processing", 2015 International Conference on Smart Technology and Management (ICSTM).

[5] Yellapu Madhuri and Anburajan Mariamichael, "Vision based sign language translation device", conference paper, February 2013.

[6] Bhumika Gupta, Pushkar Shukla and Ankush Mittal, "K-nearest correlated neighbour classification for Indian sign language gesture recognition using feature extraction".

[7] Vikram Sharma M, "Virtual talk for deaf, mute, blind and normal humans", Texas Instruments India Educators' Conference, 2013.

[8] Daniel Stein, Philippe Dreuw, Hermann Ney, Sara Morrissey and Andy Way, "Hand in hand: automatic sign language to English translation".

[9] Er. Aditi Kalsh and Dr. N. S. Garewal, "Sign language recognition system", International Journal of Computational Engineering Research, 2013.

[10] https://en.wikipedia.org/wiki/Sign_language