International Journal of Research in Advent Technology, Vol.2, No.5, May 2014 
E-ISSN: 2321-9637 
Gesture Based Computing as an Alternative to Mouse 
by Calibrating Principal Contour Process Actions 
Chinnu Thomas1, D.Lakshmi2 
PG Student of CSE1, Associate Professor of CSE2 
Adithya Institute of Technology1, 2 
Chinnuthomas13@gmail.com1, lakshmi.lifefordivine@gmail.com2 
Abstract- Computing is no longer a discrete activity bound to a desktop; network computing and mobile computing are fast becoming a part of everyday life, and so is the Internet. In the present-day framework of interactive and intelligent computing, efficient human computer interaction is assuming utmost importance. The topic of user experience and interaction has become increasingly popular and widespread. Unlike decades ago, when people placed most of their attention on the quality, functionality, or brand of a product, today the user interaction experience and usability seem to be the vital elements when people are considering and selecting a product. Gesture based computing enables humans to interface with the machine (HMI) and interact naturally without any dedicated devices, building a richer bridge between machines and humans than primitive text user interfaces or even graphical user interfaces (GUIs), which still limit the majority of input to keyboard and mouse. In effect, we bridge this gap by bringing intangible, digital information out into the tangible world, allowing us to interact with this information via natural hand gestures. Gesture based computing thus provides an attractive alternative for human computer interaction (HCI). A novel approach is proposed for implementing a real-time human interaction system capable of understanding commands, with emphasis on hand gestures, by analyzing the principal contour and fingertips.
Index Terms- Human Machine Interface; Human Computer interaction; Hand Gesture; Gesture based 
Computing. 
1. INTRODUCTION 
No doubt, the present framework of technology is seeking an engaging interface for Human Computer Interaction (HCI) that can provide an environment where a user can interact with a computer, or more generally a machine, in a more user friendly way. We are bridging the gap of interaction by bringing immaterial, digital information out into the material world, and allowing us to interact with this information via natural hand gestures. Gestures are considered a natural way of communication among humans, especially for the hearing-impaired, since they are physical movements of the hands, arms, or body that convey meaningful information by delivering an expressive message. Gesture recognition, then, is the interpretation of such movement as a semantically meaningful command.
The vision based approach to hand gesture recognition has received much attention from academia and industry in recent years, largely due to the development of human-computer interaction (HCI) technologies and the growing popularity of smart devices such as smartphones. The essential intention of building a hand gesture recognition system is to create a natural interaction between human and computer in which the recognized gestures can be used for controlling a robot [15] or conveying meaningful information [1]. How hand gestures should be formed so that they are understood and correctly interpreted by the computer constitutes the problem of gesture interaction.
Hand gesture recognition (HGR) is one of the major innovative areas of research for engineers, scientists and bioinformaticians. Today many researchers in academia and industry are working on applications that require no extra or dedicated wearable device, to make interaction easier, more natural and more convenient. However, human-robot interaction using hand gestures poses a remarkable challenge. On the vision side, complex and cluttered backgrounds, dynamic lighting conditions, the deformable shape of the human hand, and extraction of the wrong object can all cause a machine to misunderstand
the gesture. If the machine is mobile, the hand gesture 
recognition system also needs to satisfy other 
constraints, such as the size of the gesture in the 
image and adaptation to motion. Moreover, to be 
natural, the machine must be person-independent and 
give feedback in real time. 
The main focus of this paper is to present a hand gesture recognition system that emphasizes detecting the motion of the hand in a real-time scenario, tracking it, identifying its features such as fingertips, and using those to create a simple application; this is considered a challenging yet promising problem in the human-computer interaction context. Although recent reviews [1,6-8,15,19,20] of computer vision based systems have explained the importance of gesture recognition for human computer interaction (HCI), this work concentrates on up-to-date vision based techniques, with the aim of pointing out various research developments and providing a good starting point for persons interested in the hand gesture recognition area. Finally, we discuss the current challenges and open questions in this area and point out possible directions for future work.
The rest of the paper is organized as follows. Section 2 explains the basic concepts of and approaches to vision based hand gesture recognition. Section 3 covers several related works. The proposed work is detailed in Section 4. Finally, the work is concluded in Section 5.
2. HAND GESTURE TECHNOLOGY 
To enable hand gesture recognition, numerous approaches have been proposed, which can be classified into various categories. For any system, the first step is to collect the data necessary to accomplish the task, and different technologies are used to acquire input data for hand posture and gesture recognition systems. A common taxonomy is based on whether extra devices are required for raw data collection; on this basis, approaches are categorized into data glove based hand gesture recognition, vision based hand gesture recognition [5], and color glove based hand gesture recognition [6]. Figure 1 gives an example of these technologies.
(a) Data-Glove based. (b) Colored marker (c) Vision based. 
Fig 1: Examples of hand gesture recognition input technologies. 
2.1 Data glove based approaches 
Data glove based approaches require the user to 
wear a cumbersome glove-like device, which is 
equipped with sensors that can sense the movements 
of hand(s) and fingers, and pass the information to the 
computer. Hence it can be referred to as the instrumented glove approach. These approaches can easily provide the exact coordinates of the palm and fingers' locations and orientations, as well as hand configurations. Their advantages are high accuracy and fast reaction speed. However, they require the user to be physically connected to the computer, which hinders the ease of interaction between user and computer; in addition, these devices are quite expensive.
2.2 Colored Markers based approaches 
Color glove based approaches represent a compromise between data glove based approaches and vision based approaches. Marked gloves or colored markers are gloves worn on the human hand [6], with colors that direct the process of tracking the hand and locating the palm and fingers [6], providing the ability to extract the geometric features necessary to form the hand shape [6]. Compared with instrumented data gloves, the appeal of this technology is its simplicity of use and low cost. Intrinsically, these approaches are similar to vision based ones, except that, with the help of colored gloves, the image preprocessing phase (e.g., segmentation, localization and detection of hands) can be greatly simplified. The disadvantages are similar to those of data glove based approaches: they are unnatural and not suitable for applications with multiple users due to hygiene issues.
2.3 Vision based approaches
Vision based approaches do not require the user to 
wear anything (naked hands). Instead, video 
camera(s) are used to capture the images of hands, 
which are then processed and analyzed using 
computer vision techniques. This type of hand gesture 
recognition is simple, natural and convenient for users 
and at present they are the most popular approaches to 
gesture recognition. Although these approaches are simple, they raise many challenges, such as complex and cluttered backgrounds, lighting variation, and other skin-colored patches near the hand object, besides system requirements such as robustness, velocity, recognition time, throughput, and computational efficiency [1][10].
3. RELATED WORK 
William F. and Michal R. applied orientation histograms in [3], a method for recognizing gestures based on pattern recognition. They mention several issues: similar gestures might have different orientation histograms and different gestures could have similar orientation histograms; besides that, the proposed method performed well for any object that dominates the image, even if it is not the hand gesture. Xingyan L. [5] presented a fuzzy c-means clustering algorithm to recognize hand gestures for remote control of a mobile robot. There are some restrictions with the current system: the recognition accuracy drops quickly when the distance between the user and the camera exceeds 1.5 meters or when the lighting is too strong, and the system cannot deal with an image that has two or more patches of skin of similar size.
Stergiopoulou E. [1] recognized static hand gestures using a Self-Growing and Self-Organized Neural Gas (SGONG) network. Three geometric features were extracted: two angles based on the hand slope and the distance from the palm center, and these features were used to determine the class of the raised fingers. The system recognized 31 predefined gestures with a recognition rate of 90.45% and a processing time of 1.5 seconds, but it is time consuming, and as the amount of training data increases, the time needed for classification increases too. Hasan [4] applied a multivariate Gaussian distribution to recognize hand gestures using nongeometric features. The input hand image is segmented using two different methods: skin color based segmentation applying the HSV color model, and clustering based thresholding techniques. Lamberti [6] presents a real-time hand gesture recognizer based on a color glove; color markers are employed to extract as features five distances from the palm to the fingers and four angles between those distances. Daeho Lee and SeungGwan Lee [11] present a novel vision based method in which fingertips are detected by a scale-invariant angle detection based on a variable k-cosine. Table 1 shows the comparison between the different hand gesture recognition techniques.
Table 1: Comparison between various methods of hand gesture recognition systems

METHOD | MERITS | LIMITATIONS | APPLICATIONS
Finger-Earth Mover's Distance (FEMD), a novel distance metric; thresholding decomposition | Robust to hand articulations, distortions and orientation or scale changes; can work in uncontrolled environments | False and confusing gesture orientations | Rock-Paper-Scissors game, arithmetic computation
Learning Vector Quantization; HSI color space | Cluttered backgrounds can be managed; accuracy of capturing | Color variations upon lighting changes | Multimedia presentation command
Fuzzy C-Means algorithm; threshold | Sufficient speed and reliability; recognition rate: 85.83% | Wrong object extraction; environmental lighting changes; distance variations; cannot distinguish two or more patches of skin | Mobile robot
Self-Growing and Self-Organized Neural Gas (SGONG) network; YCbCr color space; Gaussian distribution | Reliable, being relatively immune to changing lighting conditions and providing good coverage of the human skin color; fast, requiring no post-processing of the hand image; recognition rate: 90.45% | Time consuming; complex | 3D modeling, number recognition
Laplacian filter; Euclidean distance metric; HSV color model | Promising solution for illumination changes; recognition rate: 91% | Differing orientation histograms; background must be plain and uniform; computational complexity | Helicopter signaller for marshaling operations, television control
Orientation histogram; Euclidean distance metric | Simple and fast to compute; offers some robustness to scene illumination changes | Similar gestures might have different orientation histograms, and different gestures could have similar orientation histograms; wrong object detection | Pattern recognition, graphic editor control
4. PROPOSED WORK 
A gesture based computing interface makes the user more interactive with the PC; the power of controlling the PC is placed in our hands. With the bare hand and its gestures alone we gain the power of a pointing device such as a mouse, which makes the interface more advanced than a conventional pointing device and gives us full access to our personal data at our fingertips.
The proposed system, illustrated in Fig 2, consists of three main components: Image Capturing, Image Processing and the Application Handler. Image capturing initiates the webcam to capture the image. The captured image is then processed; this includes configuring skin color values and detecting the concerned skin color bands. Next, the different contours in the image are found and stored in an array. From the detected contours we must identify the principal contour, which reduces wrong object detection. The principal contour helps us detect the fingertips, track the gesture actions and find the current positions. The identified gesture actions are sent to the application handler to control different applications.

Fig 2: Block diagram of the proposed work
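The paper does not name a contour extraction library. As an illustration only, the following is a minimal sketch of the principal contour selection step using OpenCV's Java bindings, under the assumption that the principal contour is simply the skin contour with the largest area:

```java
import org.opencv.core.Core;
import org.opencv.core.Mat;
import org.opencv.core.MatOfPoint;
import org.opencv.imgproc.Imgproc;
import java.util.ArrayList;
import java.util.List;

public class PrincipalContour {
    static { System.loadLibrary(Core.NATIVE_LIBRARY_NAME); }

    /** Returns the largest-area contour in a binary skin mask, or null if none. */
    public static MatOfPoint findPrincipalContour(Mat skinMask) {
        List<MatOfPoint> contours = new ArrayList<>();
        // Collect all outer contours of the skin-colored regions.
        Imgproc.findContours(skinMask, contours, new Mat(),
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);
        MatOfPoint principal = null;
        double maxArea = 0;
        for (MatOfPoint c : contours) {
            double area = Imgproc.contourArea(c);
            // Assumption: the principal contour is the one with the largest area,
            // which suppresses smaller, spurious skin-colored objects.
            if (area > maxArea) { maxArea = area; principal = c; }
        }
        return principal;
    }
}
```

Picking the largest contour is one plausible reading of "reduce the wrong object detection"; the paper itself does not spell out the selection criterion.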
4.1 Module Description 
4.1.1 Image Capture Module 
We need to capture the gestures made by the hands. To do that, we use an image capturing device (a webcam) to picture every movement of the hands, and this serves as the input of our program to control the system. The Java Media Framework (JMF) is a Java library that enables audio, video and other time-based media to be added to Java applications and applets. This optional package, which can capture, play, stream, and transcode multiple media formats, extends the Java Platform, Standard Edition (Java SE) and allows development of cross-platform multimedia applications. Using this capability of JMF, the user's hand movements are detected and directed to the computing device.
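As an illustration, here is a minimal sketch of grabbing a single webcam frame with JMF's FrameGrabbingControl; device selection and error handling are simplified, and the class and method names are our own:

```java
import javax.media.Buffer;
import javax.media.CaptureDeviceInfo;
import javax.media.CaptureDeviceManager;
import javax.media.Manager;
import javax.media.Player;
import javax.media.control.FrameGrabbingControl;
import javax.media.format.VideoFormat;
import javax.media.util.BufferToImage;
import java.awt.Image;
import java.util.Vector;

public class WebcamCapture {
    public static Image grabFrame() throws Exception {
        // Pick the first capture device that delivers RGB video.
        Vector<?> devices = CaptureDeviceManager.getDeviceList(
                new VideoFormat(VideoFormat.RGB));
        CaptureDeviceInfo info = (CaptureDeviceInfo) devices.get(0);

        // Realize and start a player on the webcam's media locator.
        Player player = Manager.createRealizedPlayer(info.getLocator());
        player.start();

        // Grab one frame and convert the raw buffer to an AWT Image.
        FrameGrabbingControl fgc = (FrameGrabbingControl)
                player.getControl("javax.media.control.FrameGrabbingControl");
        Buffer buf = fgc.grabFrame();
        return new BufferToImage((VideoFormat) buf.getFormat()).createImage(buf);
    }
}
```

In a real loop, grabFrame would be called once per frame and the resulting Image handed to the color detection module described next.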
4.1.2 Color Detection Module 
The captured image must be processed so that the required colors are identified by analyzing each pixel. To identify a color, the pixels in the image are arranged in a one-dimensional array, which makes the color analysis much easier. The RGB components of each pixel are calculated; each component value ranges from 0 to 255. Depending on the values of red, green and blue we can identify the pixel color; for example, for a pixel to count as red, the R value should be greater than 150 and the other two components must be less than 40.
This is the method we use here for processing the image. In Java, the class PixelGrabber is used for this purpose. The obtained RGB value is compared with the predetermined color values; if an RGB value within the prescribed limits is found, the analysis terminates and the detected color is taken as the input for the next stage. Hence this module is considered the learning phase, in which we must detect the skin color and make it recognizable by the HMI.
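A minimal sketch of this per-pixel test with java.awt.image.PixelGrabber, using the red thresholds quoted above; the surrounding class and method names are illustrative:

```java
import java.awt.Image;
import java.awt.image.PixelGrabber;

public class ColorDetector {
    // Thresholds from the text: "red" means R > 150 while G and B are below 40.
    static final int R_MIN = 150, OTHER_MAX = 40;

    /** Returns {x, y} of the first pixel matching the red threshold, or null. */
    public static int[] findRed(Image img, int w, int h) throws InterruptedException {
        int[] pixels = new int[w * h];
        // Flatten the image into a one-dimensional ARGB array, as the paper describes.
        PixelGrabber grabber = new PixelGrabber(img, 0, 0, w, h, pixels, 0, w);
        grabber.grabPixels();
        for (int i = 0; i < pixels.length; i++) {
            int r = (pixels[i] >> 16) & 0xff;   // extract the red component
            int g = (pixels[i] >> 8) & 0xff;    // extract the green component
            int b = pixels[i] & 0xff;           // extract the blue component
            if (r > R_MIN && g < OTHER_MAX && b < OTHER_MAX) {
                return new int[] { i % w, i / w }; // column, row of the match
            }
        }
        return null;
    }
}
```

The same loop with different threshold constants serves for the skin color band; the paper does not state its skin thresholds, so only the red example from the text is reproduced here.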
4.1.3 Image Comparison Module 
From the image, the skin color is detected and the coordinate value of that color pixel is obtained. This is done for each frame. From the coordinate values thus obtained, the position of the mouse is calculated as the difference between the coordinate values of that color in consecutive frames. Depending on this value, the different mouse operations such as mouse pointer movement, left click, right click and double click are performed.
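A minimal sketch of this frame-to-frame displacement computation; the class and field names are illustrative:

```java
public class GestureTracker {
    private int prevX = -1, prevY = -1;

    /**
     * Called once per frame with the detected skin-color coordinate.
     * Returns the {dx, dy} displacement since the previous frame.
     */
    public int[] update(int currentX, int currentY) {
        int dx = (prevX < 0) ? 0 : currentX - prevX;
        int dy = (prevY < 0) ? 0 : currentY - prevY;
        prevX = currentX;  // current values become "previous" for the next frame
        prevY = currentY;
        return new int[] { dx, dy };
    }
}
```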
4.1.4 ADD-ONS 
GBC has four applications. They are:
(1) Mouse activity handler module. 
(2) Image viewer module. 
(3) File transfer module 
(4) Mail dispatcher. 
(1) Mouse Activity Handler Module 
A skin color band and an associated gesture are assigned to each basic mouse action: Gesture1 for movement, Gesture232 for right click, and Gesture121 for left click. The coordinates of the mouse pointer change as the projected gesture moves over the screen. This is done by calculating the coordinates frame by frame. The captured position of the skin color is stored in variables (current x and current y). If the same color is detected in the next frame at a different position, the present values of current x and current y are moved to previous x and previous y respectively, and the new coordinate values are moved to current x and current y. The movement of the mouse pointer is derived from the displacement between these two coordinate pairs.
The remaining mouse actions are triggered by the other respective colors and are managed by the Robot class provided by Java. Using this class to generate input events differs from posting events to the AWT event queue or to AWT components in that the events are generated in the platform's native input queue. For example, Robot.mouseMove will actually move the mouse cursor instead of just generating mouse move events.
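A minimal sketch of driving these actions with java.awt.Robot; the mapping of gestures to methods is illustrative, not the paper's exact code:

```java
import java.awt.AWTException;
import java.awt.Robot;
import java.awt.event.InputEvent;

public class MouseActions {
    private final Robot robot;

    public MouseActions() throws AWTException {
        robot = new Robot(); // injects events into the platform's native input queue
    }

    /** Move the cursor to the position computed from consecutive frames. */
    public void move(int x, int y) {
        robot.mouseMove(x, y);
    }

    /** A left click (Gesture121) is a press/release pair on button 1. */
    public void leftClick() {
        robot.mousePress(InputEvent.BUTTON1_DOWN_MASK);
        robot.mouseRelease(InputEvent.BUTTON1_DOWN_MASK);
    }

    /** A right click (Gesture232) uses button 3. */
    public void rightClick() {
        robot.mousePress(InputEvent.BUTTON3_DOWN_MASK);
        robot.mouseRelease(InputEvent.BUTTON3_DOWN_MASK);
    }
}
```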
(2) Image Viewer 
Image viewer is an application that helps us view digital images stored on our computer. To change the magnification level and zoom in on the current picture, one normally clicks the magnifier and drags the slider; this gives a close-up view of the image, and zooming out works the same way. To view the next or previous image in the same folder as the current picture, one clicks the next or previous button.
In GBC, instead of these buttons we use Gesture2 for viewing the previous and next images: if the skin color is moved from left to right, the previous image is shown; for the next image, the skin color is moved from right to left. For zooming in and out, Gesture5 is used with two color bands at a time: as the distance between these colors increases the picture is enlarged, and as the distance decreases the picture shrinks. GBC thus lets us view images in a different way; instead of sitting close to a system and clicking buttons to change images and to zoom in and out, we need only show the specified skin color band and gesture (count of fingertips).
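A minimal sketch of how the distance between the two detected color markers could be mapped to a zoom factor; the reference distance and the linear mapping are assumptions, since the paper does not state its formula:

```java
public class ZoomController {
    /**
     * Maps the separation between two detected color markers to a zoom factor.
     * referenceDistance is the marker separation that corresponds to 1x zoom.
     */
    public static double zoomFactor(int x1, int y1, int x2, int y2,
                                    double referenceDistance) {
        double distance = Math.hypot(x2 - x1, y2 - y1);
        // Larger separation between the markers -> larger zoom factor.
        return distance / referenceDistance;
    }
}
```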
(3) File Transfer 
File transfer is the process of transferring a file from one system to another. Here we use gestures and a skin color band for this purpose, and a server is needed to control and manage these activities; more than two clients can take part. Two Java classes, FileInputStream and FileOutputStream, are used. The systems participating must be connected locally via an Ethernet cable or through a switch.
The server initiates the application. The client program is run on the computers where the actual file is present and on those where the file needs to be copied. The file to be transferred between computers is selected first. When the skin color band along with Gesture121 is shown on the system holding the selected file, the file is copied. We then show the skin color band and Gesture5 on the computer to which the file is to be transferred, and the file is pasted in the specified destination folder.
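The paper names FileInputStream and FileOutputStream but gives no transfer code. A minimal sketch of the sending side over a TCP socket follows; the class name and buffer size are assumptions, and the receiving side would mirror this with a ServerSocket and a FileOutputStream:

```java
import java.io.File;
import java.io.FileInputStream;
import java.io.IOException;
import java.io.OutputStream;
import java.net.Socket;

public class FileSender {
    /** Streams a file to a receiving client over a TCP socket. */
    public static void send(File file, String host, int port) throws IOException {
        try (Socket socket = new Socket(host, port);
             FileInputStream in = new FileInputStream(file);
             OutputStream out = socket.getOutputStream()) {
            byte[] buffer = new byte[4096];
            int n;
            while ((n = in.read(buffer)) != -1) {
                out.write(buffer, 0, n); // copy the file byte-for-byte
            }
        }
    }
}
```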
(4) Mail Dispatcher
Just like Gmail and Yahoo Mail, it gives complete mail access: we can compose, store, read and send mail. We can also store email addresses and other confidential data here. Since this data must not be read by others, we protect such documents with a password. Usually passwords are a set of characters or numbers or a combination of both; for a hacker these can be cracked easily and the data read, so the data are no longer confidential.
Nowadays patterns are used instead of passwords, as is common on Android phones: the user sets a pattern by drawing it, and only that pattern unlocks the phone. In the same way, we use a pattern based locking technique for login to the mail dispatcher. When the user first opens the mail dispatcher, he is asked to log in by drawing a pattern on 9 rectangular boxes arranged in a 3*3 matrix. Using a red color glow he can move from one block to another, and using a green glow he can select a box of his pattern. Only if the pattern is correct can he log in to his account. Thus it ensures security.
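A minimal sketch of the pattern check, assuming the drawn pattern is collected as the ordered sequence of selected boxes (0-8 on the 3*3 grid); the class and method names are illustrative:

```java
import java.util.List;

public class PatternLock {
    private final List<Integer> storedPattern; // boxes 0-8 of the 3x3 grid, in order

    public PatternLock(List<Integer> storedPattern) {
        this.storedPattern = storedPattern;
    }

    /** Login succeeds only if the drawn sequence matches the stored one exactly. */
    public boolean verify(List<Integer> drawnPattern) {
        return storedPattern.equals(drawnPattern);
    }
}
```

For example, new PatternLock(java.util.Arrays.asList(0, 4, 8)).verify(java.util.Arrays.asList(0, 4, 8)) returns true, while any other drawn sequence is rejected.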
5. CONCLUSIONS 
Hand gesture recognition is considered very challenging for real-life applications because of its requirements on robustness, accuracy and efficiency. In this paper, we described a real-time hand gesture recognition system based on principal contour action analysis. First, a comprehensive review of the tools and techniques of gesture recognition systems was provided, with emphasis on hand gesture expressions. The major tools surveyed, including orientation histograms, fuzzy clustering, HMMs, ANNs, HSV-based methods and geometrical methods, were reviewed and analyzed; nearly all researchers use colored images to achieve better results. A comparison between diverse gesture recognition systems was presented, explaining the important parameters needed for any recognition system: the segmentation process, feature extraction, and the classification algorithm. Although recognition rates are gradually increasing, the algorithms concerned still face several issues, including wrong object extraction, complex and nonuniform backgrounds, partial occlusion, background disturbance, object reappearance, and illumination change. The choice of a specific recognition algorithm depends on the intended application. In this work, we proposed a cost effective and user friendly hand gesture recognition system based on principal contour action analysis, which separates the principal contour from the background and analyzes the position and contour of the detected fingertips; various interface actions such as clicking, moving, zooming and pointing are recognized, and a threshold value on capture time is employed to differentiate between the concerned gestures. Two-handed dynamic-gesture multimodal interaction is a promising area for future research, and customization of gestures could also be employed to make the system more user friendly.
REFERENCES 
[1] Stergiopoulou, E., & Papamarkos, N. (2009). 'Hand gesture recognition using a neural network shape fitting technique'. Elsevier Engineering Applications of Artificial Intelligence, 22, 1141-1158. http://dx.doi.org/10.1016/j.engappai.2009.03.008
[2] Malima, A., Özgür, E., & Çetin, M. (2006). 'A fast algorithm for vision-based hand gesture recognition for robot control'. IEEE 14th Conference on Signal Processing and Communications Applications, pp. 1-4. http://dx.doi.org/10.1109/SIU.2006.1659822
[3] W. T. Freeman and Michal R. (1995). 'Orientation Histograms for Hand Gesture Recognition', IEEE International Workshop on Automatic Face and Gesture Recognition.
[4] M. M. Hasan, P. K. Mishra (2011). 'HSV Brightness Factor Matching for Gesture Recognition System', International Journal of Image Processing (IJIP), Vol. 4(5).
[5] Xingyan Li (2003). 'Gesture Recognition Based on Fuzzy C-Means Clustering Algorithm', Department of Computer Science, The University of Tennessee, Knoxville.
[6] Luigi Lamberti & Francesco Camastra (2011). 'Real-Time Hand Gesture Recognition Using a Color Glove', Springer 16th International Conference on Image Analysis and Processing: Part I (ICIAP'11), pp. 365-373.
[7] C. Keskin, F. Kirac, Y. Kara, and L. Akarun (2011). 'Real time hand pose estimation using depth sensors', in Proc. IEEE Int. Conf. Comput. Vis. Workshops, Nov. 2011, pp. 1228-1234.
[8] Z. Pan, Y. Li, M. Zhang, C. Sun, K. Guo, X. Tang, and S. Zhou (2010). 'A realtime multi-cue hand tracking algorithm based on computer vision', in IEEE Virt. Real. Conf., 2010, pp. 219-222.
[9] G. Hackenberg, R. McCall, and W. Broll (2011). 'Lightweight palm and finger tracking for real-time 3-D gesture control', in IEEE Virtual Reality Conf., Mar. 2011, pp. 19-26.
[10] A. Chaudhary, J. Raheja, and S. Raheja (2012). 'A vision based geometrical method to find fingers positions in real time hand gesture recognition', J. Software, vol. 7, no. 4, pp. 861-869.
[11] D. Lee and S. Lee (2011). 'Vision-based finger action recognition by angle detection and contour analysis', ETRI J., vol. 33, no. 3, pp. 415-422.
[12] J. Raheja, A. Chaudhary, and K. Singal (2011). 'Tracking of fingertips and centers of palm using Kinect', in Proc. Int. Conf. Computat. Intell., Modell. Simulat., pp. 248-252.
[13] V. Frati and D. Prattichizzo (2011). 'Using Kinect for hand tracking and rendering in wearable haptics', in IEEE World Haptics Conf. (WHC), Jun. 2011, pp. 317-321.
[14] H. Liang, J. Yuan, and D. Thalmann (2011). '3-D fingertip and palm tracking in depth image sequences', in Proc. 20th ACM Int. Conf. Multimedia, pp. 785-788.
[15] Ghobadi, S. E., Loepprich, O. E., Ahmadov, F., Bernshausen, J., Hartmann, K., & Loffeld, O. (2008). 'Real time hand based robot control using multimodal images'. International Journal of Computer Science (IJCS), Vol. 35(4). Retrieved from www.iaeng.org/IJCS/issues_v35/issue_4/IJCS_35_4_08.pdf
[16] Garg, P., Aggarwal, N., & Sofat, S. (2009). 'Vision based hand gesture recognition'. World Academy of Science, Engineering and Technology. Retrieved from www.waset.org/journals/waset/v49/v49-173.pdf
[17] K. Grauman and T. Darrell (2004). 'Fast contour matching using approximate earth mover's distance', in Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition, Washington DC.
[18] Zhou Ren, Junsong Yuan, Wenyu Liu (2013). 'Minimum Near-Convex Shape Decomposition', IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 35, no. 10, October.
[19] Marco Maisto, Massimo Panella, Luca Liparulo, and Andrea Proietti (2013). 'An Accurate Algorithm for the Identification of Fingertips Using an RGB-D Camera', IEEE Journal on Emerging and Selected Topics in Circuits and Systems, vol. 3, no. 2, June.
[20] Zhou Ren, Junsong Yuan (2013). 'Robust Part-Based Hand Gesture Recognition Using Kinect Sensor', IEEE Transactions on Multimedia, vol. 15, no. 5, August.
[21] Chinnu Thomas, S. Pradeepa (2014). 'A Comprehensive Review on Vision Based Hand Gesture Technology', International Journal of Research in Advent Technology, Volume 2, Issue 1, January.

More Related Content

What's hot

Accessing Operating System using Finger Gesture
Accessing Operating System using Finger GestureAccessing Operating System using Finger Gesture
Accessing Operating System using Finger Gesture
IRJET Journal
 
IRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET- Survey Paper on Vision based Hand Gesture RecognitionIRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET Journal
 
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
IJMER
 
IRJET - Chatbot with Gesture based User Input
IRJET -  	  Chatbot with Gesture based User InputIRJET -  	  Chatbot with Gesture based User Input
IRJET - Chatbot with Gesture based User Input
IRJET Journal
 
183
183183
IRJET- Navigation and Camera Reading System for Visually Impaired
IRJET- Navigation and Camera Reading System for Visually ImpairedIRJET- Navigation and Camera Reading System for Visually Impaired
IRJET- Navigation and Camera Reading System for Visually Impaired
IRJET Journal
 
Computer vision approaches for offline signature verification & forgery detec...
Computer vision approaches for offline signature verification & forgery detec...Computer vision approaches for offline signature verification & forgery detec...
Computer vision approaches for offline signature verification & forgery detec...Editor Jacotech
 
IRJET - Paint using Hand Gesture
IRJET - Paint using Hand GestureIRJET - Paint using Hand Gesture
IRJET - Paint using Hand Gesture
IRJET Journal
 
An HCI Principles based Framework to Support Deaf Community
An HCI Principles based Framework to Support Deaf CommunityAn HCI Principles based Framework to Support Deaf Community
An HCI Principles based Framework to Support Deaf Community
IJEACS
 
2 4-10
2 4-102 4-10
2 4-10
engrode
 
Hand Segmentation for Hand Gesture Recognition
Hand Segmentation for Hand Gesture RecognitionHand Segmentation for Hand Gesture Recognition
Hand Segmentation for Hand Gesture Recognition
AM Publications,India
 
Haptic Technology- Interaction with Virtuality
Haptic Technology- Interaction with VirtualityHaptic Technology- Interaction with Virtuality
Haptic Technology- Interaction with Virtuality
vivatechijri
 
G0342039042
G0342039042G0342039042
G0342039042ijceronline
 
Vertical Fragmentation of Location Information to Enable Location Privacy in ...
Vertical Fragmentation of Location Information to Enable Location Privacy in ...Vertical Fragmentation of Location Information to Enable Location Privacy in ...
Vertical Fragmentation of Location Information to Enable Location Privacy in ...
ijasa
 
Designing in Context
Designing in ContextDesigning in Context
Designing in Context
Thomas Grill
 
IRJET - Sign Language Text to Speech Converter using Image Processing and...
IRJET -  	  Sign Language Text to Speech Converter using Image Processing and...IRJET -  	  Sign Language Text to Speech Converter using Image Processing and...
IRJET - Sign Language Text to Speech Converter using Image Processing and...
IRJET Journal
 
Palm Authentication Using Biometric System
Palm Authentication Using Biometric SystemPalm Authentication Using Biometric System
Palm Authentication Using Biometric System
IJERD Editor
 

What's hot (20)

Niknewppt
NiknewpptNiknewppt
Niknewppt
 
Accessing Operating System using Finger Gesture
Accessing Operating System using Finger GestureAccessing Operating System using Finger Gesture
Accessing Operating System using Finger Gesture
 
Paper
PaperPaper
Paper
 
IRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET- Survey Paper on Vision based Hand Gesture RecognitionIRJET- Survey Paper on Vision based Hand Gesture Recognition
IRJET- Survey Paper on Vision based Hand Gesture Recognition
 
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
Real Time Vision Hand Gesture Recognition Based Media Control via LAN & Wirel...
 
IRJET - Chatbot with Gesture based User Input
IRJET -  	  Chatbot with Gesture based User InputIRJET -  	  Chatbot with Gesture based User Input
IRJET - Chatbot with Gesture based User Input
 
Nikppt
NikpptNikppt
Nikppt
 
183
183183
183
 
IRJET- Navigation and Camera Reading System for Visually Impaired
IRJET- Navigation and Camera Reading System for Visually ImpairedIRJET- Navigation and Camera Reading System for Visually Impaired
IRJET- Navigation and Camera Reading System for Visually Impaired
 
Computer vision approaches for offline signature verification & forgery detec...
Computer vision approaches for offline signature verification & forgery detec...Computer vision approaches for offline signature verification & forgery detec...
Computer vision approaches for offline signature verification & forgery detec...
 
IRJET - Paint using Hand Gesture
IRJET - Paint using Hand GestureIRJET - Paint using Hand Gesture
IRJET - Paint using Hand Gesture
 
An HCI Principles based Framework to Support Deaf Community
An HCI Principles based Framework to Support Deaf CommunityAn HCI Principles based Framework to Support Deaf Community
An HCI Principles based Framework to Support Deaf Community
 
2 4-10
2 4-102 4-10
2 4-10
 
Hand Segmentation for Hand Gesture Recognition
Hand Segmentation for Hand Gesture RecognitionHand Segmentation for Hand Gesture Recognition
Hand Segmentation for Hand Gesture Recognition
 
Haptic Technology- Interaction with Virtuality
Haptic Technology- Interaction with VirtualityHaptic Technology- Interaction with Virtuality
Haptic Technology- Interaction with Virtuality
 
G0342039042
G0342039042G0342039042
G0342039042
 
Vertical Fragmentation of Location Information to Enable Location Privacy in ...
Vertical Fragmentation of Location Information to Enable Location Privacy in ...Vertical Fragmentation of Location Information to Enable Location Privacy in ...
Vertical Fragmentation of Location Information to Enable Location Privacy in ...
 
Designing in Context
Designing in ContextDesigning in Context
Designing in Context
 
IRJET - Sign Language Text to Speech Converter using Image Processing and...
IRJET -  	  Sign Language Text to Speech Converter using Image Processing and...IRJET -  	  Sign Language Text to Speech Converter using Image Processing and...
IRJET - Sign Language Text to Speech Converter using Image Processing and...
 
Palm Authentication Using Biometric System
Palm Authentication Using Biometric SystemPalm Authentication Using Biometric System
Palm Authentication Using Biometric System
 

Viewers also liked

Paper id 212014104
Paper id 212014104Paper id 212014104
Paper id 212014104IJRAT
 
Assistive technology powerpoint
Assistive technology powerpointAssistive technology powerpoint
Assistive technology powerpoint
alpayne2
 
Paper id 24201448
Paper id 24201448Paper id 24201448
Paper id 24201448IJRAT
 
Paper id 26201481
Paper id 26201481Paper id 26201481
Paper id 26201481
IJRAT
 
Paper id 27201419
Paper id 27201419Paper id 27201419
Paper id 27201419
IJRAT
 
Paper id 252014130
Paper id 252014130Paper id 252014130
Paper id 252014130IJRAT
 
Paper id 24201469
Paper id 24201469Paper id 24201469
Paper id 24201469IJRAT
 
Paper id 27201434
Paper id 27201434Paper id 27201434
Paper id 27201434
IJRAT
 
Paper id 23201429
Paper id 23201429Paper id 23201429
Paper id 23201429IJRAT
 
Paper id 21201422
Paper id 21201422Paper id 21201422
Paper id 21201422IJRAT
 
Paper id 21201485
Paper id 21201485Paper id 21201485
Paper id 21201485IJRAT
 
Paper id 2720149
Paper id 2720149Paper id 2720149
Paper id 2720149
IJRAT
 
Presentasi agama2
Presentasi agama2Presentasi agama2
Presentasi agama2
Dimas Saputra
 
Paper id 2520148
Paper id 2520148Paper id 2520148
Paper id 2520148IJRAT
 
Paper id 25201476
Paper id 25201476Paper id 25201476
Paper id 25201476IJRAT
 
Paper id 2320143
Paper id 2320143Paper id 2320143
Paper id 2320143IJRAT
 
140814 texto-ten reasons not to write
140814 texto-ten reasons not to write140814 texto-ten reasons not to write
140814 texto-ten reasons not to write
Alejandro Munévar Salazar
 
Paper id 2520148 (2)
Paper id 2520148 (2)Paper id 2520148 (2)
Paper id 2520148 (2)IJRAT
 
Air air bushings
Air air bushingsAir air bushings
Air air bushings
jtcool74
 
2d work slide show
2d work slide show2d work slide show
2d work slide show
Libby Lynch
 

Viewers also liked (20)

Paper id 212014104
Paper id 212014104Paper id 212014104
Paper id 212014104
 
Assistive technology powerpoint
Assistive technology powerpointAssistive technology powerpoint
Assistive technology powerpoint
 
Paper id 24201448
Paper id 24201448Paper id 24201448
Paper id 24201448
 
Paper id 26201481
Paper id 26201481Paper id 26201481
Paper id 26201481
 
Paper id 27201419
Paper id 27201419Paper id 27201419
Paper id 27201419
 
Paper id 252014130
Paper id 252014130Paper id 252014130
Paper id 252014130
 
Paper id 24201469
Paper id 24201469Paper id 24201469
Paper id 24201469
 
Paper id 27201434
Paper id 27201434Paper id 27201434
Paper id 27201434
 
Paper id 23201429
Paper id 23201429Paper id 23201429
Paper id 23201429
 
Paper id 21201422
Paper id 21201422Paper id 21201422
Paper id 21201422
 
Paper id 21201485
Paper id 21201485Paper id 21201485
Paper id 21201485
 
Paper id 2720149
Paper id 2720149Paper id 2720149
Paper id 2720149
 
Presentasi agama2
Presentasi agama2Presentasi agama2
Presentasi agama2
 
Paper id 2520148
Paper id 2520148Paper id 2520148
Paper id 2520148
 
Paper id 25201476
Paper id 25201476Paper id 25201476
Paper id 25201476
 
Paper id 2320143
Paper id 2320143Paper id 2320143
Paper id 2320143
 
140814 texto-ten reasons not to write
140814 texto-ten reasons not to write140814 texto-ten reasons not to write
140814 texto-ten reasons not to write
 
Paper id 2520148 (2)
Paper id 2520148 (2)Paper id 2520148 (2)
Paper id 2520148 (2)
 
Air air bushings
Air air bushingsAir air bushings
Air air bushings
 
2d work slide show
2d work slide show2d work slide show
2d work slide show
 

Similar to Paper id 25201413

Indian Sign Language Recognition using Vision Transformer based Convolutional...
Indian Sign Language Recognition using Vision Transformer based Convolutional...Indian Sign Language Recognition using Vision Transformer based Convolutional...
Indian Sign Language Recognition using Vision Transformer based Convolutional...
IRJET Journal
 
Hand Gesture Recognition using OpenCV and Python
Hand Gesture Recognition using OpenCV and PythonHand Gesture Recognition using OpenCV and Python
Hand Gesture Recognition using OpenCV and Python
ijtsrd
 
Gesture recognition document
Gesture recognition documentGesture recognition document
Gesture recognition document
Nikhil Jha
 
HAND GESTURE RECOGNITION FOR HCI (HUMANCOMPUTER INTERACTION) USING ARTIFICIAL...
HAND GESTURE RECOGNITION FOR HCI (HUMANCOMPUTER INTERACTION) USING ARTIFICIAL...HAND GESTURE RECOGNITION FOR HCI (HUMANCOMPUTER INTERACTION) USING ARTIFICIAL...
HAND GESTURE RECOGNITION FOR HCI (HUMANCOMPUTER INTERACTION) USING ARTIFICIAL...
International Journal of Technical Research & Application
 
Real time human-computer interaction
Real time human-computer interactionReal time human-computer interaction
Real time human-computer interaction
ijfcstjournal
 
Gesture recognition
Gesture recognitionGesture recognition
Gesture recognition
Mariya Khan
 
Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...
Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...
Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...
IRJET Journal
 
Natural Hand Gestures Recognition System for Intelligent HCI: A Survey
Natural Hand Gestures Recognition System for Intelligent HCI: A SurveyNatural Hand Gestures Recognition System for Intelligent HCI: A Survey
Natural Hand Gestures Recognition System for Intelligent HCI: A Survey
Editor IJCATR
 
A Survey Paper on Controlling Computer using Hand Gestures
A Survey Paper on Controlling Computer using Hand GesturesA Survey Paper on Controlling Computer using Hand Gestures
A Survey Paper on Controlling Computer using Hand Gestures
IRJET Journal
 
Controlling Computer using Hand Gestures
Controlling Computer using Hand GesturesControlling Computer using Hand Gestures
Controlling Computer using Hand Gestures
IRJET Journal
 
Ay4103315317
Ay4103315317Ay4103315317
Ay4103315317
IJERA Editor
 
Essay On Recognition
Essay On RecognitionEssay On Recognition
IRJET- Finger Gesture Recognition Using Linear Camera
IRJET-  	  Finger Gesture Recognition Using Linear CameraIRJET-  	  Finger Gesture Recognition Using Linear Camera
IRJET- Finger Gesture Recognition Using Linear Camera
IRJET Journal
 
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNINGSLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
IRJET Journal
 
Design a System for Hand Gesture Recognition with Neural Network
Design a System for Hand Gesture Recognition with Neural NetworkDesign a System for Hand Gesture Recognition with Neural Network
Design a System for Hand Gesture Recognition with Neural Network
IRJET Journal
 
Virtual Smart Phones
Virtual Smart PhonesVirtual Smart Phones
Virtual Smart Phones
IRJET Journal
 
Sign Language Identification based on Hand Gestures
Sign Language Identification based on Hand GesturesSign Language Identification based on Hand Gestures
Sign Language Identification based on Hand Gestures
IRJET Journal
 
Virtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand GesturesVirtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand Gestures
IRJET Journal
 
Mems Sensor Based Approach for Gesture Recognition to Control Media in Computer
Mems Sensor Based Approach for Gesture Recognition to Control Media in ComputerMems Sensor Based Approach for Gesture Recognition to Control Media in Computer
Mems Sensor Based Approach for Gesture Recognition to Control Media in Computer
IJARIIT
 
TOUCHLESS ECOSYSTEM USING HAND GESTURES
TOUCHLESS ECOSYSTEM USING HAND GESTURESTOUCHLESS ECOSYSTEM USING HAND GESTURES
TOUCHLESS ECOSYSTEM USING HAND GESTURES
IRJET Journal
 

Similar to Paper id 25201413 (20)

Indian Sign Language Recognition using Vision Transformer based Convolutional...
Indian Sign Language Recognition using Vision Transformer based Convolutional...Indian Sign Language Recognition using Vision Transformer based Convolutional...
Indian Sign Language Recognition using Vision Transformer based Convolutional...
 
Hand Gesture Recognition using OpenCV and Python
Hand Gesture Recognition using OpenCV and PythonHand Gesture Recognition using OpenCV and Python
Hand Gesture Recognition using OpenCV and Python
 
Gesture recognition document
Gesture recognition documentGesture recognition document
Gesture recognition document
 
HAND GESTURE RECOGNITION FOR HCI (HUMANCOMPUTER INTERACTION) USING ARTIFICIAL...
HAND GESTURE RECOGNITION FOR HCI (HUMANCOMPUTER INTERACTION) USING ARTIFICIAL...HAND GESTURE RECOGNITION FOR HCI (HUMANCOMPUTER INTERACTION) USING ARTIFICIAL...
HAND GESTURE RECOGNITION FOR HCI (HUMANCOMPUTER INTERACTION) USING ARTIFICIAL...
 
Real time human-computer interaction
Real time human-computer interactionReal time human-computer interaction
Real time human-computer interaction
 
Gesture recognition
Gesture recognitionGesture recognition
Gesture recognition
 
Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...
Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...
Smart Presentation Control by Hand Gestures Using Computer Vision and Google’...
 
Natural Hand Gestures Recognition System for Intelligent HCI: A Survey
Natural Hand Gestures Recognition System for Intelligent HCI: A SurveyNatural Hand Gestures Recognition System for Intelligent HCI: A Survey
Natural Hand Gestures Recognition System for Intelligent HCI: A Survey
 
A Survey Paper on Controlling Computer using Hand Gestures
A Survey Paper on Controlling Computer using Hand GesturesA Survey Paper on Controlling Computer using Hand Gestures
A Survey Paper on Controlling Computer using Hand Gestures
 
Controlling Computer using Hand Gestures
Controlling Computer using Hand GesturesControlling Computer using Hand Gestures
Controlling Computer using Hand Gestures
 
Ay4103315317
Ay4103315317Ay4103315317
Ay4103315317
 
Essay On Recognition
Essay On RecognitionEssay On Recognition
Essay On Recognition
 
IRJET- Finger Gesture Recognition Using Linear Camera
IRJET-  	  Finger Gesture Recognition Using Linear CameraIRJET-  	  Finger Gesture Recognition Using Linear Camera
IRJET- Finger Gesture Recognition Using Linear Camera
 
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNINGSLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
SLIDE PRESENTATION BY HAND GESTURE RECOGNITION USING MACHINE LEARNING
 
Design a System for Hand Gesture Recognition with Neural Network
Design a System for Hand Gesture Recognition with Neural NetworkDesign a System for Hand Gesture Recognition with Neural Network
Design a System for Hand Gesture Recognition with Neural Network
 
Virtual Smart Phones
Virtual Smart PhonesVirtual Smart Phones
Virtual Smart Phones
 
Sign Language Identification based on Hand Gestures
Sign Language Identification based on Hand GesturesSign Language Identification based on Hand Gestures
Sign Language Identification based on Hand Gestures
 
Virtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand GesturesVirtual Mouse Control Using Hand Gestures
Virtual Mouse Control Using Hand Gestures
 
Mems Sensor Based Approach for Gesture Recognition to Control Media in Computer
Mems Sensor Based Approach for Gesture Recognition to Control Media in ComputerMems Sensor Based Approach for Gesture Recognition to Control Media in Computer
Mems Sensor Based Approach for Gesture Recognition to Control Media in Computer
 
TOUCHLESS ECOSYSTEM USING HAND GESTURES
TOUCHLESS ECOSYSTEM USING HAND GESTURESTOUCHLESS ECOSYSTEM USING HAND GESTURES
TOUCHLESS ECOSYSTEM USING HAND GESTURES
 

More from IJRAT

96202108
9620210896202108
96202108
IJRAT
 
97202107
9720210797202107
97202107
IJRAT
 
93202101
9320210193202101
93202101
IJRAT
 
92202102
9220210292202102
92202102
IJRAT
 
91202104
9120210491202104
91202104
IJRAT
 
87202003
8720200387202003
87202003
IJRAT
 
87202001
8720200187202001
87202001
IJRAT
 
86202013
8620201386202013
86202013
IJRAT
 
86202008
8620200886202008
86202008
IJRAT
 
86202005
8620200586202005
86202005
IJRAT
 
86202004
8620200486202004
86202004
IJRAT
 
85202026
8520202685202026
85202026
IJRAT
 
711201940
711201940711201940
711201940
IJRAT
 
711201939
711201939711201939
711201939
IJRAT
 
711201935
711201935711201935
711201935
IJRAT
 
711201927
711201927711201927
711201927
IJRAT
 
711201905
711201905711201905
711201905
IJRAT
 
710201947
710201947710201947
710201947
IJRAT
 
712201907
712201907712201907
712201907
IJRAT
 
712201903
712201903712201903
712201903
IJRAT
 

More from IJRAT (20)

96202108
9620210896202108
96202108
 
97202107
9720210797202107
97202107
 
93202101
9320210193202101
93202101
 
92202102
9220210292202102
92202102
 
91202104
9120210491202104
91202104
 
87202003
8720200387202003
87202003
 
87202001
8720200187202001
87202001
 
86202013
8620201386202013
86202013
 
86202008
8620200886202008
86202008
 
86202005
8620200586202005
86202005
 
86202004
8620200486202004
86202004
 
85202026
8520202685202026
85202026
 
711201940
711201940711201940
711201940
 
711201939
711201939711201939
711201939
 
711201935
711201935711201935
711201935
 
711201927
711201927711201927
711201927
 
711201905
711201905711201905
711201905
 
710201947
710201947710201947
710201947
 
712201907
712201907712201907
712201907
 
712201903
712201903712201903
712201903
 

Paper id 25201413

  • 1. International Journal of Research in Advent Technology, Vol.2, No.5, May 2014 E-ISSN: 2321-9637 Gesture Based Computing as an Alternative to Mouse by Calibrating Principal Contour Process Actions 198 Chinnu Thomas1, D.Lakshmi2 PG Student of CSE1, Associate Professor of CSE2 Adithya Institute of Technology1, 2 Chinnuthomas13@gmail.com1, lakshmi.lifefordivine@gmail.com2 Abstract- Computing is no longer a discrete activity bound to a desktop; network computing and mobile computing are fast becoming a part of everyday life and so is the Internet. In the present day framework of interactive and intelligent computing, an efficient human computer interaction is assuming utmost importance. The topic of user experience and interaction has been increasingly popular and widespread lately. Unlike decades ago, when people place most of their attentions on the quality and functionality, or brand of a product, at the present time, the user interaction experience and usability seems to be the vital element when people are considering and selecting a product. Gesture based computing enables humans to interface with the machine (HMI) and interact naturally without any dedicated devices. Building a richer bridge between machines and humans than primitive text user interface or even (graphical user interfaces) GUIs, which still limit the majority of input to keyboard and mouse. In fact we are bridging this gap by bringing intangible, digital information out into the tangible world, and allowing us to interact with this information via natural hand gestures. Gesture Based Computing provides an attractive alternative for human computer interaction (HCI). A novel approach is proposed for implementing a real-time Human interaction system capable of understanding commands with emphasis on hand gestures by analyzing the principal contour and fingertips. Index Terms- Human Machine Interface; Human Computer interaction; Hand Gesture; Gesture based Computing. 1. INTRODUCTION No doubt, Present world framework of technology is seeking for an interesting interface for Human Computer Interaction (HCI) which can provide an environment where a user can interact with a computer or roughly speaking a machine in a more user friendly way. We are bridging the gap of interaction by bringing immaterial, digital information out into the material world, and allowing us to interact with this information via natural hand gestures. Gestures considered as a natural way of communication among human especially for hear-impaired, since it is a physical movement of hands ,arms, or body which conveying meaningful information by delivering an expressive message. Gesture recognition then, is the interpretation of that movement as semantically meanings command. Vision Based approach of hand gesture recognition has received much attention from academia and industry in recent years, largely due to the development of human-computer interaction (HCI) technologies and the growing popularity of smart devices such as smart phones. The essential intention of building hand gesture recognition system is to create a natural interaction between human and computer where the recognized gestures can be used for controlling a robot [15] or conveying meaningful information [1]. How to form the resulted hand gestures to be understood and well interpreted by the computer considered as the problem of gesture interaction. Hand gestures recognition (HGR) is one of the major innovative areas of research for the engineers,scientists and bioinformatics. 
Today many researchers in the academia and industry are working on different application without wearing any extra and dedicated device to make interactions more easy, natural and convenient. However, human-robot interaction using hand gestures provides a remarkable challenge. For the vision part, the complex and cluttered backgrounds, dynamic lighting conditions and a deformable human hand shape, extraction of wrong object can cause a machine to misunderstand the gesture. If the machine is mobile, the hand gesture recognition system also needs to satisfy other constraints, such as the size of the gesture in the image and adaptation to motion. Moreover, to be natural, the machine must be person-independent and give feedback in real time. The main focusing point of the paper is to presents a Hand gesture recognition system emphasizing the detection of the motion of the hand in a real-time scenario, track it, identify its features such as fingertips and use those to create a simple application, which is measured to be challenging problem in the human-computer interaction context and promising as well. Although recent reviews [1,6-8,15,19,20] in computer vision based have explained the importance of gesture recognition system for human computer interaction (HCI), this work concentrates on vision based techniques method and it’s up-to-date. With expecting to point out various research developments as well as it emphasis good beginning for interested
  • 2. International Journal of Research in Advent Technology, Vol.2, No.5, May 2014 E-ISSN: 2321-9637 199 persons in hand gesture recognition area. Conclusively, we give some discussion on the current challenges and open questions in this area and point out a list of possible directions for future work. The rest of the paper is organized as follows. Section 2 explains the basic concepts and approaches of the vision based hand gesture approaches. Section 3 includes several related works. The proposed work is detailed in Section 4. Finally the work is settled in Section 5. 2. HAND GESTURE TECHNOLOGY To enable hand gesture recognition, numerous approaches have been proposed, which can be classified into various categories. For any system the first step is to collect the data necessary to accomplish a specific task. In order to acquire input data for hand posture and gesture recognition system different technologies are used. A common taxonomy is based on whether extra devices are required for raw data collecting. In this way, they are categorized into data glove based hand gesture recognition, vision based hand gesture recognition [5], and color glove based hand gesture recognition [6]. Figure 1gives an example of these technologies. (a) Data-Glove based. (b) Colored marker (c) Vision based. Fig 1: Examples of hand gesture recognition input technologies. 2.1 Data glove based approaches Data glove based approaches require the user to wear a cumbersome glove-like device, which is equipped with sensors that can sense the movements of hand(s) and fingers, and pass the information to the computer. Hence it can be referred as Instrumented glove approach. These approaches can easily provide exact coordinates of palm and finger’s location and orientation, and hand configurations, The advantages of these approaches are high accuracy and fast reaction speed. However these approaches require the user to be connected with the computer physically which obstacle the ease of interaction between users and computers, besides the cost of these devices are quite expensive. 2.2 Colored Markers based approaches Color glove based approaches represent a compromise between data glove based approaches and vision based approaches. Marked gloves or colored markers are gloves that worn by the human hand [6] with some colors to direct the process of tracking the hand and locating the palm and fingers [6], which provide the ability to extract geometric features necessary to form hand shape [6]. Compared with instrumented data glove the amenity of this technology is its simplicity in use, and it costs low price. Intrinsically, they are similar to the latter, except that, with the help of colored gloves, the image preprocessing phase (e.g., segmentation, localization and detection of hands) can be greatly simplified. The disadvantages are similar to data glove based approaches: they are unnatural and not suitable for applications with multiple users due to hygiene issues. 2.3 Vision Based approaches: Vision based approaches do not require the user to wear anything (naked hands). Instead, video camera(s) are used to capture the images of hands, which are then processed and analyzed using computer vision techniques. This type of hand gesture recognition is simple, natural and convenient for users and at present they are the most popular approaches to gesture recognition. 
Although these approaches are simple, they raise many challenges, such as complex and cluttered backgrounds, lighting variation, and other skin patches besides the hand object, in addition to system requirements such as robustness, velocity, recognition time, throughput and computational efficiency [1][10].

3. RELATED WORK

Freeman and Roth [3] applied orientation histograms in a method for recognizing gestures based on pattern recognition. Reported issues include that similar gestures might have different orientation histograms while different gestures could have similar orientation histograms; besides that, the proposed method responded to any object that dominates the image, even if it is not the hand gesture. Xingyan Li [5] presented a fuzzy c-means clustering algorithm to recognize hand gestures in a mobile robot remote-control setting. The system has some restrictions: the recognition accuracy drops quickly when the distance between the user and the camera exceeds 1.5 meters or when the lighting is too strong, and the system cannot deal with an image that has two or more patches of skin of similar size.
Stergiopoulou E. [1] recognized static hand gestures using a Self-Growing and Self-Organized Neural Gas (SGONG) network. Three geometric features were extracted: two angles based on the hand slope and the distance from the palm center, and these features were used to determine the number of raised fingers and the gesture class. The system recognized 31 predefined gestures with a recognition rate of 90.45% and a processing time of 1.5 seconds, but it is time consuming, and as the amount of training data increases, the time needed for classification increases too. Hasan [4] applied a multivariate Gaussian distribution to recognize hand gestures using non-geometric features; the input hand image is segmented using two different methods, skin color based segmentation by applying the HSV color model and clustering based thresholding techniques. Lamberti [6] presents a real-time hand gesture recognizer based on a color glove; color markers are employed to obtain as features the five distances from the palm to the fingers and the four angles between those distances. Daeho Lee and SeungGwan Lee [11] present a novel vision based method in which fingertips are detected by a novel scale-invariant angle detection based on a variable k-cosine. Table 1 compares the different hand gesture recognition techniques.

Table 1: Comparison between various methods of hand gesture recognition systems

(1) Method: Finger-Earth Mover's Distance (FEMD), thresholding decomposition.
    Merits: Robust to hand articulations, distortions and orientation or scale changes; can work in uncontrolled environments.
    Limitations: False and confusing gesture orientations.
    Applications: Rock-Paper-Scissors game, arithmetic computation.

(2) Method: Learning Vector Quantization, HSI color space.
    Merits: Cluttered backgrounds can be managed; accuracy of capturing.
    Limitations: Color variations upon lighting changes.
    Applications: Multimedia presentation command.

(3) Method: Fuzzy C-Means algorithm, thresholding.
    Merits: Enough speed and sufficient reliability; recognition rate 85.83%.
    Limitations: Wrong object extraction; environment lighting changes; distance variations; cannot distinguish two or more patches of skin.
    Applications: Mobile robot control.

(4) Method: Self-Growing and Self-Organized Neural Gas (SGONG) network, YCbCr color space, Gaussian distribution.
    Merits: Reliable, since it is relatively immune to changing lighting conditions and provides good coverage of the human skin color; fast, as it does not require post-processing of the hand image; recognition rate 90.45%.
    Limitations: Time consuming; complex.
    Applications: 3D modeling, numbers recognition.

(5) Method: Laplacian filter, Euclidean distance metric, HSV color model.
    Merits: Promising solution for illumination changes; recognition rate 91%.
    Limitations: Background must be plane and uniform; computational complexity.
    Applications: Helicopter signaller for marshalling operations, television control.

(6) Method: Orientation histogram, Euclidean distance metric.
    Merits: Simple and fast to compute; offers some robustness to scene illumination changes.
    Limitations: Similar gestures might have different orientation histograms; different gestures could have similar orientation histograms; wrong object detection.
    Applications: Pattern recognition, graphic editor control.
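As a concrete illustration of the simplest entry in Table 1, the following is a rough Java sketch of the orientation histogram descriptor of Freeman and Roth [3]. It is not their implementation: the 36-bin resolution, the simple pixel-difference gradient operator and the class name are our own assumptions. Gradient angles are accumulated into bins weighted by gradient magnitude, and two gestures are compared by the Euclidean distance between their normalized histograms, as the table's merits and limitations suggest.

```java
// Sketch of an orientation histogram gesture descriptor [3];
// bin count and gradient operator are illustrative assumptions.
public class OrientationHistogram {

    public static final int BINS = 36;

    // gray is a row-major grayscale image with values 0-255.
    public static double[] compute(int[][] gray) {
        int h = gray.length, w = gray[0].length;
        double[] hist = new double[BINS];
        for (int y = 1; y < h - 1; y++) {
            for (int x = 1; x < w - 1; x++) {
                double dx = gray[y][x + 1] - gray[y][x - 1];
                double dy = gray[y + 1][x] - gray[y - 1][x];
                double angle = Math.atan2(dy, dx);                 // -pi..pi
                int bin = (int) ((angle + Math.PI) / (2 * Math.PI) * BINS) % BINS;
                hist[bin] += Math.hypot(dx, dy);                   // weight by magnitude
            }
        }
        double sum = 0;
        for (double v : hist) sum += v;
        if (sum > 0) for (int i = 0; i < BINS; i++) hist[i] /= sum; // size-invariant
        return hist;
    }

    // Gestures are compared by Euclidean distance between histograms.
    public static double distance(double[] a, double[] b) {
        double d = 0;
        for (int i = 0; i < a.length; i++) d += (a[i] - b[i]) * (a[i] - b[i]);
        return Math.sqrt(d);
    }
}
```

Because the descriptor discards spatial layout, it is fast but exhibits exactly the confusions listed in the table: two different hand shapes can produce nearly identical histograms.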
4. PROPOSED WORK

A gesture based computing interface makes the user more interactive with the PC. The power of controlling the PC is placed in our hands: with the bare hand and its gestures we gain the capabilities of a pointing device such as a mouse, which makes the interface more advanced than a conventional pointing device and gives us full access to our personal data at our fingertips. The proposed system, illustrated in Fig 2, consists of three main components: image capturing, image processing and the application handler. Image capturing initiates the webcam to capture the image. Processing of the captured image follows; this includes configuring the skin color values and detecting the concerned skin color bands. The next step finds the different contours in the image and stores them in an array. From the detected contours we identify the principal contour, which reduces wrong object detection. The principal contour helps us detect the fingertips, track the gesture actions and find their current positions. The identified gesture actions are sent to the application handler to control different applications.

Fig 2: Block diagram of the proposed work

4.1 Module Description

4.1.1 Image Capture Module

We need to capture the gestures provided by the hands. To do this we use an image capturing device (webcam) that pictures every movement of the hands, and this is taken as the input of our program to control the system. The Java Media Framework (JMF) is a Java library that enables audio, video and other time-based media to be added to Java applications and applets. This optional package, which can capture, play, stream and transcode multiple media formats, extends the Java Platform, Standard Edition (Java SE) and allows development of cross-platform multimedia applications. Using this key feature of JMF, the user's hand movements are detected and directed to the computing device.

4.1.2 Color Detection Module

The image must be processed so that the required colors are identified by analyzing each pixel. To identify a color, the pixels of the image are arranged in a one-dimensional array, which makes the color analysis much easier. The RGB components of each pixel are calculated; each component value can range from 0-255. Depending on the values of red, green and blue we can identify the pixel color; e.g., to detect red, the R value should be greater than 150 and the other two components must be less than 40. This is the method used here for processing the image; in Java, the PixelGrabber class is used for this purpose. The obtained RGB value is compared with the predetermined color values, and if an RGB value within the prescribed limits is found, the analysis terminates and the detected color is taken as the input for the next process. Hence this module serves as the learning phase, in which the system learns to detect the skin color and make it recognizable to the HMI. A combined sketch of the capture and color detection steps is given below.
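The following minimal Java sketch illustrates how these two modules might fit together. It is not the authors' implementation: it assumes JMF is installed with a webcam registered in the JMF registry, grabs a single frame through FrameGrabbingControl, and applies the red-band rule quoted above (R > 150, the other two components below 40) to every pixel fetched by PixelGrabber. The class name and the first-match return value are illustrative simplifications; a real system would scan for each configured color band.

```java
import javax.media.*;
import javax.media.control.FrameGrabbingControl;
import javax.media.format.VideoFormat;
import javax.media.util.BufferToImage;
import java.awt.Image;
import java.awt.image.PixelGrabber;
import java.util.Vector;

public class SkinColorDetector {

    // Grab one frame from the first RGB capture device found by JMF.
    static Image grabFrame() throws Exception {
        Vector devices = CaptureDeviceManager.getDeviceList(
                new VideoFormat(VideoFormat.RGB));
        CaptureDeviceInfo webcam = (CaptureDeviceInfo) devices.get(0);
        Player player = Manager.createRealizedPlayer(webcam.getLocator());
        player.start();
        FrameGrabbingControl fgc = (FrameGrabbingControl)
                player.getControl("javax.media.control.FrameGrabbingControl");
        Buffer frame = fgc.grabFrame();
        return new BufferToImage((VideoFormat) frame.getFormat())
                .createImage(frame);
    }

    // Scan the frame with PixelGrabber and return the first pixel that
    // satisfies the red-band rule (R > 150, G and B < 40), or null.
    static int[] findRedPixel(Image img, int width, int height)
            throws InterruptedException {
        int[] pixels = new int[width * height];       // one-dimensional array
        new PixelGrabber(img, 0, 0, width, height,
                pixels, 0, width).grabPixels();
        for (int i = 0; i < pixels.length; i++) {
            int r = (pixels[i] >> 16) & 0xff;
            int g = (pixels[i] >> 8) & 0xff;
            int b = pixels[i] & 0xff;
            if (r > 150 && g < 40 && b < 40) {
                return new int[] { i % width, i / width };   // (x, y)
            }
        }
        return null;                                  // color band not found
    }
}
```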
4.1.3 Image Comparison Module

From each frame, the skin color is detected and the coordinate value of that color pixel is obtained. From the coordinate values thus obtained, the mouse position is calculated as the difference in the coordinate value of that color across consecutive frames. Depending on this value, the different mouse operations such as mouse pointer movement, left click, right click and double click are performed.

4.1.4 Add-ons

GBC has four applications: (1) the mouse activity handler module, (2) the image viewer module, (3) the file transfer module, and (4) the mail dispatcher.

(1) Mouse Activity Handler Module

A skin color band and a concerned gesture are assigned to each basic mouse action: Gesture1 for movement, Gesture232 for right click, Gesture121 for left click. The coordinates of the mouse pointer change as the projected gesture moves over the screen. This is done by calculating the coordinates frame by frame. The captured position of the skin color is stored in variables (current x and current y); if the same color is detected in the next frame at a different position, the present values of current x and current y are moved to previous x and previous y respectively, and the new coordinate values are moved to current x and current y. The mouse pointer is moved by calculating the displacement along these two axes. The remaining mouse actions are triggered by the other respective colors and are managed by the Robot class provided by Java, which generates native system input events. Using this class to generate input events differs from posting events to the AWT event queue or to AWT components, in that the events are generated in the platform's native input queue; for example, Robot.mouseMove will actually move the mouse cursor instead of just generating mouse move events. A sketch of this displacement logic is given below.
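A minimal sketch of the displacement logic, assuming the detector calls it once per processed frame; the class and method names are illustrative, not the paper's. Robot.mouseMove takes absolute screen coordinates, so the frame-to-frame displacement of the color band is added to the pointer's current location.

```java
import java.awt.AWTException;
import java.awt.MouseInfo;
import java.awt.Point;
import java.awt.Robot;
import java.awt.Toolkit;
import java.awt.event.InputEvent;

public class MouseActivityHandler {

    private final Robot robot;
    private int previousX = -1, previousY = -1;   // marker position in last frame

    public MouseActivityHandler() throws AWTException {
        this.robot = new Robot();                 // posts native input events
    }

    // Called once per frame with the detected skin-color coordinates.
    public void onMarkerDetected(int currentX, int currentY) {
        if (previousX >= 0) {
            // Displacement between consecutive frames drives the pointer.
            int dx = currentX - previousX;
            int dy = currentY - previousY;
            Point p = MouseInfo.getPointerInfo().getLocation();
            robot.mouseMove(p.x + dx, p.y + dy);
        }
        previousX = currentX;
        previousY = currentY;
    }

    // Clicks are a press/release pair in the native input queue.
    public void leftClick() {
        robot.mousePress(InputEvent.BUTTON1_MASK);
        robot.mouseRelease(InputEvent.BUTTON1_MASK);
    }

    public void rightClick() {
        robot.mousePress(InputEvent.BUTTON3_MASK);
        robot.mouseRelease(InputEvent.BUTTON3_MASK);
    }
}
```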
(2) Image Viewer Module

The image viewer is an application that helps us view digital images stored on the computer. Normally, to change the magnification level and zoom in on the current picture, one clicks the magnifier and drags the slider, which gives a close-up view of the image; zooming out works the same way. To view the next or previous image in the same folder as the current picture, one clicks the next or previous button. In GBC, instead of these buttons, Gesture2 is used for viewing the previous and next images: if the skin color band is moved from left to right, the previous image is shown; for the next image, the skin color band is moved from right to left. For zooming in and out, Gesture5 is used: as the distance between the color bands increases, the picture is enlarged, and as the distance decreases, the picture shrinks. GBC thus lets us view images in a different way: instead of sitting close to the system and repeatedly clicking buttons to move between images or to zoom, we need only show the specified skin color band and gesture (count of fingertips).

(3) File Transfer Module

File transfer is the process of transferring a file from one system to another. Here we use gestures and the skin color band for this purpose, with a server to control and manage the activity; more than two clients can take part. Two Java classes, FileInputStream and FileOutputStream, are used. The participating systems must be connected locally, via an Ethernet cable or a switch. The server initiates the application, and the client program runs both on the computer where the actual file is present and on the computers to which the file is to be copied. The file to be transferred is selected first. When the skin color band along with Gesture121 is shown on the system holding the selected file, the file is copied. We then show the skin color band and Gesture5 on the computers to which the file is to be transferred, and the file is pasted in the specified destination folder. A sketch of the underlying stream transfer is given below.
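The paper names only FileInputStream and FileOutputStream; the socket plumbing below is our assumption about how the copy between the two gesture-triggered endpoints might look. Method names, the port and the path parameters are illustrative.

```java
import java.io.FileInputStream;
import java.io.FileOutputStream;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.ServerSocket;
import java.net.Socket;

public class FileTransfer {

    // Sender side: triggered when Gesture121 is recognized on the
    // machine holding the file. Streams the file over a TCP socket.
    public static void send(String path, String host, int port)
            throws Exception {
        try (Socket socket = new Socket(host, port);
             FileInputStream in = new FileInputStream(path);
             OutputStream out = socket.getOutputStream()) {
            copy(in, out);
        }
    }

    // Receiver side: triggered when Gesture5 is recognized on the
    // destination machine. Writes the incoming bytes to the chosen folder.
    public static void receive(int port, String destPath) throws Exception {
        try (ServerSocket server = new ServerSocket(port);
             Socket socket = server.accept();
             InputStream in = socket.getInputStream();
             FileOutputStream out = new FileOutputStream(destPath)) {
            copy(in, out);
        }
    }

    private static void copy(InputStream in, OutputStream out)
            throws Exception {
        byte[] buffer = new byte[4096];
        int n;
        while ((n = in.read(buffer)) != -1) {
            out.write(buffer, 0, n);
        }
    }
}
```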
(4) Mail Dispatcher

Just like Gmail and Yahoo Mail, this gives complete access to mail: we can compose, store, read and send messages, and we can store email addresses and other confidential data. Since such data must not be read by others, we set a password for these documents. Usually passwords are sets of characters or numbers or a combination of both, which a hacker can break easily, leaving the data no longer confidential. Nowadays, patterns are often used instead of passwords, as is common on Android phones: one sets a pattern by drawing it, and only that pattern unlocks the phone. In the same way, the mail dispatcher uses a pattern based locking technique for login. When the user first opens the mail dispatcher, he is asked to log in by drawing his pattern on a grid of 9 rectangular boxes arranged as a 3x3 matrix. Using a red color glow he moves from one box to another, and using a green glow he selects a box of his pattern. Only if the pattern is correct can he log into his account; this ensures security. A sketch of the pattern check is given below.
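The paper does not state the matching rule, so the following is a minimal sketch under the natural assumption that a pattern is the ordered sequence of box indices selected with the green glow, and that login requires an exact match with the stored sequence. The grid geometry and all names are illustrative.

```java
import java.util.Arrays;
import java.util.List;

// Sketch of the 3x3 pattern lock: boxes are indexed 0-8 row-major,
// and a pattern is the ordered list of boxes the user selects with
// the green glow while moving between them with the red glow.
public class PatternLock {

    private final List<Integer> stored;   // pattern set at registration

    public PatternLock(List<Integer> stored) {
        this.stored = stored;
    }

    // Map the marker's screen position to a box index, assuming the
    // grid occupies a square region at (gridX, gridY) of side gridSize.
    public static int boxAt(int x, int y, int gridX, int gridY, int gridSize) {
        int cell = gridSize / 3;
        int col = (x - gridX) / cell;
        int row = (y - gridY) / cell;
        if (col < 0 || col > 2 || row < 0 || row > 2) return -1; // outside grid
        return row * 3 + col;
    }

    // Login succeeds only when the drawn sequence matches the stored one.
    public boolean verify(List<Integer> drawn) {
        return stored.equals(drawn);
    }

    public static void main(String[] args) {
        PatternLock lock = new PatternLock(Arrays.asList(0, 4, 8, 5));
        System.out.println(lock.verify(Arrays.asList(0, 4, 8, 5))); // true
        System.out.println(lock.verify(Arrays.asList(0, 4, 8)));    // false
    }
}
```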
5. CONCLUSIONS

Hand gesture recognition is considered very challenging for real-life applications because of its requirements on robustness, accuracy and efficiency. In this paper, we described a real-time hand gesture recognition system based on principal contour action analysis. First, a comprehensive review of the tools and techniques of gesture recognition systems was provided, with emphasis on hand gesture expressions; the major tools surveyed and analyzed include orientation histograms, fuzzy clustering, HMMs, ANNs, HSV based and geometrical methods. Nearly all researchers use color images for achieving better results. A comparison between diverse gesture recognition systems was presented, explaining the important parameters needed for any recognition system: the segmentation process, feature extraction and the classification algorithm. Although recognition rates are gradually increasing, the algorithms concerned still face several issues, including wrong object extraction, complex and non-uniform backgrounds, partial occlusion, background disturbance, object reappearance and illumination change. The choice of a specific recognition algorithm depends on the intended application. In this work, we proposed a cost-effective and user-friendly hand gesture recognition system based on principal contour action analysis, which finds the principal contour against the background and analyses the position and contour of the detected fingertips; various interface actions such as clicking, moving, zooming and pointing are recognized, and a threshold on the gesture capture time is employed to differentiate the concerned gestures. Two-handed, dynamic-gesture multimodal interaction is a promising area for future research, and customization of gestures could make the system even more user friendly.

REFERENCES

[1] Stergiopoulou, E., & Papamarkos, N. (2009). 'Hand gesture recognition using a neural network shape fitting technique'. Engineering Applications of Artificial Intelligence, 22, pp. 1141-1158. http://dx.doi.org/10.1016/j.engappai.2009.03.008
[2] Malima, A., Özgür, E., & Çetin, M. (2006). 'A fast algorithm for vision-based hand gesture recognition for robot control'. IEEE 14th Conference on Signal Processing and Communications Applications, pp. 1-4. http://dx.doi.org/10.1109/SIU.2006.1659822
[3] Freeman, W. T., & Roth, M. (1995). 'Orientation histograms for hand gesture recognition'. IEEE International Workshop on Automatic Face and Gesture Recognition.
[4] Hasan, M. M., & Mishra, P. K. (2011). 'HSV brightness factor matching for gesture recognition system'. International Journal of Image Processing (IJIP), 4(5).
[5] Li, Xingyan. (2003). 'Gesture recognition based on fuzzy C-means clustering algorithm'. Department of Computer Science, The University of Tennessee, Knoxville.
[6] Lamberti, L., & Camastra, F. (2011). 'Real-time hand gesture recognition using a color glove'. 16th International Conference on Image Analysis and Processing (ICIAP'11), Part I, Springer, pp. 365-373.
[7] Keskin, C., Kirac, F., Kara, Y., & Akarun, L. (2011). 'Real time hand pose estimation using depth sensors'. Proc. IEEE Int. Conf. Computer Vision Workshops, Nov. 2011, pp. 1228-1234.
[8] Pan, Z., Li, Y., Zhang, M., Sun, C., Guo, K., Tang, X., & Zhou, S. (2010). 'A real-time multi-cue hand tracking algorithm based on computer vision'. IEEE Virtual Reality Conference, pp. 219-222.
[9] Hackenberg, G., McCall, R., & Broll, W. (2011). 'Lightweight palm and finger tracking for real-time 3-D gesture control'. IEEE Virtual Reality Conference, Mar. 2011, pp. 19-26.
[10] Chaudhary, A., Raheja, J., & Raheja, S. (2012). 'A vision based geometrical method to find fingers positions in real time hand gesture recognition'. Journal of Software, 7(4), pp. 861-869.
[11] Lee, D., & Lee, S. (2011). 'Vision-based finger action recognition by angle detection and contour analysis'. ETRI Journal, 33(3), pp. 415-422.
[12] Raheja, J., Chaudhary, A., & Singal, K. (2011). 'Tracking of fingertips and centers of palm using Kinect'. Proc. Int. Conf. Computational Intelligence, Modelling and Simulation, pp. 248-252.
[13] Frati, V., & Prattichizzo, D. (2011). 'Using Kinect for hand tracking and rendering in wearable haptics'. IEEE World Haptics Conference (WHC), Jun. 2011, pp. 317-321.
[14] Liang, H., Yuan, J., & Thalmann, D. (2011). '3-D fingertip and palm tracking in depth image sequences'. Proc. 20th ACM Int. Conf. Multimedia, pp. 785-788.
[15] Ghobadi, S. E., Loepprich, O. E., Ahmadov, F., Bernshausen, J., Hartmann, K., & Loffeld, O. (2008). 'Real time hand based robot control using multimodal images'. IAENG International Journal of Computer Science, 35(4). Retrieved from www.iaeng.org/IJCS/issues_v35/issue_4/IJCS_35_4_08.pdf
[16] Garg, P., Aggarwal, N., & Sofat, S. (2009). 'Vision based hand gesture recognition'. World Academy of Science, Engineering and Technology. Retrieved from www.waset.org/journals/waset/v49/v49-173.pdf
[17] Grauman, K., & Darrell, T. (2004). 'Fast contour matching using approximate earth mover's distance'. Proc. IEEE Conference on Computer Vision and Pattern Recognition, Washington, DC.
[18] Ren, Z., Yuan, J., & Liu, W. (2013). 'Minimum near-convex shape decomposition'. IEEE Transactions on Pattern Analysis and Machine Intelligence, 35(10), October.
[19] Maisto, M., Panella, M., Liparulo, L., & Proietti, A. (2013). 'An accurate algorithm for the identification of fingertips using an RGB-D camera'. IEEE Journal on Emerging and Selected Topics in Circuits and Systems, 3(2), June.
[20] Ren, Z., & Yuan, J. (2013). 'Robust part-based hand gesture recognition using Kinect sensor'. IEEE Transactions on Multimedia, 15(5), August.
[21] Thomas, C., & Pradeepa, S. (2014). 'A comprehensive review on vision based hand gesture technology'. International Journal of Research in Advent Technology, 2(1), January.